Senior Data Engineer - AI & Data

India, Andhra Pradesh, Anantapur

Full-time

Posted on: 16 hours ago

About Delta Tech Hub: Delta Air Lines (NYSE: DAL) is the U.S. global airline leader in safety, innovation, reliability and customer experience. Delta has fast emerged as a customer-oriented, innovation-led, technology-driven business. The Delta Technology Hub will contribute directly to these objectives. It will sustain our long-term aspirations of delivering niche, IP-intensive, high-value, and innovative solutions. It supports various teams and functions across Delta and is an integral part of our transformation agenda, working seamlessly with a global team to create memorable experiences for customers.

The Senior Data Engineer designs, builds, and maintains the data infrastructure that powers Delta's Health Intelligence Platform. This role owns the pipelines that transform raw healthcare data from vendors (UMR, UHC, CVS, Spring Health) into clean, trusted, AI-ready data assets. You'll work with modern tools — dbt, Redshift Serverless, AWS SageMaker — and collaborate with data quality, visualization, and data science teams to deliver enterprise-scale healthcare analytics.
  • Build and optimize dbt models across staging, intermediate, domain, and marts layers, following healthcare data best practices.
  • Design and implement data pipelines for healthcare vendor data ingestion, transformation, and quality validation.
  • Develop automated monitoring and alerting to catch data issues before they impact downstream analytics.
  • Create ML feature engineering pipelines that prepare data for predictive models in SageMaker and Bedrock.
  • Optimize Redshift performance through query tuning, table design, and workload management.
  • Build CI/CD pipelines for reliable, automated deployments across environments.
  • Document data models and pipelines to enable team knowledge sharing and platform maintainability.

Bachelor's degree in a relevant field.
  • 4+ years of relevant work experience, including 3+ years of data engineering experience with SQL and Python (mandatory).
  • Hands-on experience with dbt (mandatory) or similar transformation frameworks.
  • Experience with cloud data warehouses on AWS / GCP / Azure (Glue, Lambda, Redshift, Snowflake, or BigQuery).
  • Strong understanding of data modeling, ETL/ELT patterns, and data quality.
  • Ability to produce high-quality results and to work in a collaborative environment, embracing diverse perspectives and taking a solution-oriented approach.

  • Healthcare data experience (claims, eligibility, pharmacy, clinical).
  • AWS experience (Redshift, SageMaker, Lambda, Step Functions).
  • ML feature engineering or model deployment experience.
  • Cloud platform experience (AWS / GCP / Azure); AWS preferred.
  • Databricks on AWS (nice to have).