Backbone Jobs

Data Engineer

Ensono

Job Description

At Ensono, our Purpose is to be a relentless ally, disrupting the status quo and unleashing our clients to Do Great Things! We enable our clients to achieve key business outcomes that reshape how our world runs. As an expert technology adviser and managed service provider with cross-platform certifications, Ensono empowers our clients to keep up with continuous change and embrace innovation.

We can Do Great Things because we have great Associates. The Ensono Core Values unify our diverse talents and are woven into how we do business. We achieve our purpose by living five core values: Honesty, Reliability, Collaboration, Curiosity, and Passion!

At Ensono, we are evolving into a software-first Managed Services Provider—a place where AI, automation, and human expertise work together to deliver 10x productivity for our clients. Our Envision Operating System is the backbone of this transformation, orchestrating operations across mainframe, distributed, and cloud environments.


Data Engineer (Operational Data & AI Enablement)

The Data Engineer plays a pivotal role in Ensono’s evolution toward predictive, zero‑touch managed services. This is not a traditional analytics or back‑office data role. As an operational data and AI enablement engineer, you will design and build production‑grade data pipelines and platforms that power predictive services, anomaly detection, and intelligent automation.

From ServiceNow tickets to mainframe and cloud telemetry, you’ll turn raw, noisy operational signals into high‑quality, AI/ML‑ready datasets that enable real‑time insights and proactive operations. The work you do directly impacts uptime, cost optimization, and Ensono’s ability to move from manual, reactive support to a predictive, automated model.

We are looking for engineers who don’t just architect pipelines but get things done—builders who deliver working solutions, iterate quickly, and collaborate closely with data scientists, ML engineers, and operations teams to ensure models don’t just run in notebooks but meaningfully change how work gets done in production.

If you want to be part of the team rewiring managed services for the AI era, this is your role.

What You Will Do

Data Pipeline Development
Build, optimize, and maintain ELT/ETL pipelines that ingest, clean, and organize operational data from ServiceNow, mainframe environments, distributed systems, and cloud platforms.

ServiceNow Data Integration
Develop robust extraction, transformation, and ingestion patterns for ServiceNow operational data (incidents, alerts, changes, requests), ensuring it is reliable, well‑modeled, and ready for AI/ML use cases.
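As an illustrative sketch only (not part of the posting's requirements), the transformation work described here often starts with flattening raw ServiceNow Table API records, which return every field as a string and render reference fields as `{"value": ..., "link": ...}` objects. The helper below is hypothetical; field names follow ServiceNow's standard incident table:

```python
from datetime import datetime, timezone

def normalize_incident(raw: dict) -> dict:
    """Flatten a raw ServiceNow incident record into a clean, typed row."""
    def scalar(v):
        # Unwrap reference fields ({"value": ..., "link": ...}) to their sys_id.
        return v.get("value") if isinstance(v, dict) else v

    def parse_ts(v):
        # ServiceNow timestamps look like "2024-05-01 13:45:00" (UTC).
        return (datetime.strptime(v, "%Y-%m-%d %H:%M:%S")
                .replace(tzinfo=timezone.utc) if v else None)

    return {
        "number": scalar(raw.get("number")),
        "priority": int(scalar(raw.get("priority")) or 0),
        "assignment_group": scalar(raw.get("assignment_group")) or None,
        "opened_at": parse_ts(scalar(raw.get("opened_at"))),
        "short_description": (scalar(raw.get("short_description")) or "").strip(),
    }
```

In a production pipeline this kind of normalization would sit between extraction and loading, so that the warehouse receives typed, consistently shaped rows rather than raw API payloads.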

Data Infrastructure & Architecture
Design scalable data models, storage frameworks, and integration layers in Snowflake and related modern data platforms, with an emphasis on performance, reliability, and operational relevance.

Data Quality & Governance
Implement data quality standards, monitoring, validation, and lineage to ensure pipelines produce clean, trustworthy, and auditable datasets.

Collaboration with AI/ML Teams
Partner with Data Scientists, ML Engineers, and MLOps to deliver and maintain production‑grade feature pipelines, training datasets, and inference‑ready data, supporting predictive models, anomaly detection, and intelligent runbooks.

ML Deployment Enablement
Support ML production workflows by enabling model registration, versioning, and lifecycle management within Snowflake, working alongside ML and MLOps teams (model development itself is not the primary responsibility of this role).

Snowflake AI & LLM Integration
Integrate Snowflake AI components, including Cortex LLMs, into data workflows for use cases such as enrichment, summarization, and operational intelligence.

Automation & Optimization
Identify opportunities to streamline data workflows, reduce manual intervention, and lower operational costs while improving reliability and scalability.

Cross‑Functional Enablement
Work with Finance, Procurement, Cloud Operations, Mainframe Operations, and Service Operations teams to ensure data products align with high‑value business outcomes.

We want all new Associates to succeed in their roles at Ensono. That’s why we’ve outlined the job requirements below. To be considered for this role, it’s important that you meet all Required Qualifications.

  • Strong SQL skills and solid data modeling fundamentals

  • Expertise in ELT/ETL pipeline development and orchestration

  • Python (required) plus experience with at least one of Java, Scala, or C++

  • Hands‑on experience with Snowflake or equivalent cloud data warehouse platforms

  • Minimum of 2 years of Snowflake experience required

  • Proven experience extracting, transforming, and operationalizing data from ServiceNow and/or other enterprise operational systems (e.g., monitoring platforms, ITSM, finance, or HR systems such as Workday or Concur)

  • Familiarity with observability tooling and distributed data systems

  • Knowledge of enterprise data governance, compliance, and data lineage practices

  • Experience supporting AI/ML feature pipelines in production environments

  • 7+ years of relevant experience required

Mindset & Values

  • Get Stuff Done – Biased toward execution and results over prolonged design cycles

  • Business Impact Driven – You build pipelines that directly improve uptime, cost efficiency, and operational predictability

  • Collaborative Partner – Comfortable working at the intersection of Operations, AI/ML, and business stakeholders

  • Continuous Learner – Actively explores new tools and techniques to accelerate delivery and improve outcomes

Success Looks Like

  • Reliable pipelines that pull ServiceNow data into Snowflake to support real‑time incident prediction

  • Faster transitions of AI/ML proof‑of‑concepts into production‑ready data pipelines

  • Demonstrated cost savings through automated workload optimization and capacity forecasting

  • Predictive services that scale seamlessly across mainframe, distributed, and cloud environments

This Role Is a Strong Fit If You…

  • Have built data pipelines for operational or event‑driven systems, not just analytics or reporting

  • Enjoy working close to reliability, uptime, and cost outcomes

  • Have supported ML systems in production by delivering dependable, high‑quality data

  • Prefer shipping working solutions over perfect architectures

  • Are comfortable owning pipelines end‑to‑end, from source systems through production consumption
