Data Engineer - R01556585

All levels
Data Analysis

Job Description

Brillio is a rapidly expanding digital technology service provider. This role is for a Contingent Worker Data Engineer, focusing on building and maintaining Kafka pipelines for claims data ingestion and routing. Key responsibilities include developing ETL/ELT processes for integrating various systems, implementing schema validation, and ensuring data quality. The position requires proficiency in Kafka, Python, and SQL, along with experience in cloud-native data platforms.
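
The core loop described here (consume claims from Kafka, validate each record against a schema, route valid records onward, and dead-letter the rest) might look roughly like the Python sketch below. Everything specific in it is hypothetical: the posting names no broker address, topic names, claim schema, or libraries, so kafka-python and jsonschema stand in as plausible choices.

```python
# Illustrative sketch only. Broker address, topic names, and the claim
# schema are hypothetical; kafka-python and jsonschema are stand-ins.
import json

from jsonschema import ValidationError, validate  # pip install jsonschema
from kafka import KafkaConsumer, KafkaProducer    # pip install kafka-python

# Hypothetical JSON Schema for an inbound claim record.
CLAIM_SCHEMA = {
    "type": "object",
    "required": ["claim_id", "member_id", "source_system", "amount"],
    "properties": {
        "claim_id": {"type": "string"},
        "member_id": {"type": "string"},
        "source_system": {"enum": ["Amisys", "Facets", "ABS", "Excelys"]},
        "amount": {"type": "number", "minimum": 0},
    },
}

consumer = KafkaConsumer(
    "claims.raw",                            # hypothetical inbound topic
    bootstrap_servers="localhost:9092",
    group_id="claims-router",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for record in consumer:
    claim = record.value
    try:
        validate(instance=claim, schema=CLAIM_SCHEMA)
    except ValidationError as err:
        # Send invalid records to a dead-letter topic for triage rather
        # than dropping them silently.
        producer.send("claims.deadletter", {"claim": claim, "error": err.message})
        continue
    # Route valid claims by source system, one topic per downstream feed.
    producer.send(f"claims.validated.{claim['source_system'].lower()}", claim)
```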

About Brillio:

Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into competitive advantage through innovative digital adoption. Brillio is renowned for its world-class professionals, known as "Brillians", who combine cutting-edge digital and design-thinking skills with an unwavering dedication to client satisfaction.

Brillio takes pride in being an employer of choice, attracting exceptional talent through its emphasis on contemporary technologies and exclusive digital projects. Its commitment to giving Brillians an outstanding experience and nurturing their full potential has earned it the Great Place to Work® certification year after year.

Consultant

Primary Skills

  • Kafka, Python, SQL (for ETL and data validation).
  • Experience with cloud-native data platforms (AWS Glue, Azure Data Factory, GCP Dataflow).

Specialization

  • Data Engineer with Python, Kafka, SQL

Job requirements

  • Location: Coppell, TX and NY (hybrid, with 2-3 days work-from-office per week)

Responsibilities

  • Build and maintain Kafka pipelines for claims data ingestion and routing.
  • Develop ETL/ELT processes for integrating Amisys, Facets, ABS, and Excelys into Pisces.
  • Implement schema validation and ensure data quality across multiple sources (see the data-quality sketch after this list).
  • Collaborate with BSAs and QA to deliver accurate edits and exclusions.
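
A minimal sketch of what the data-quality side of these responsibilities can look like: a set of SQL checks where an empty result means the check passes. SQLite from the Python standard library keeps it self-contained; the table, columns, and checks are hypothetical, and in practice the same queries would run against the actual staging layer feeding Pisces.

```python
# Hypothetical data-quality checks over a staging table. SQLite keeps the
# sketch runnable standalone; table and column names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE staging_claims (
        claim_id TEXT, member_id TEXT, source_system TEXT, amount REAL
    );
    INSERT INTO staging_claims VALUES
        ('C1', 'M1', 'Facets', 120.0),
        ('C1', 'M1', 'Facets', 120.0),  -- duplicate claim
        ('C2', NULL, 'Amisys', -5.0);   -- missing member, negative amount
    """
)

# Each check selects offending rows; an empty result means it passes.
CHECKS = {
    "duplicate_claim_ids": """
        SELECT claim_id, COUNT(*) FROM staging_claims
        GROUP BY claim_id HAVING COUNT(*) > 1
    """,
    "missing_member_id": """
        SELECT claim_id FROM staging_claims WHERE member_id IS NULL
    """,
    "negative_amount": """
        SELECT claim_id, amount FROM staging_claims WHERE amount < 0
    """,
}

for name, sql in CHECKS.items():
    offenders = conn.execute(sql).fetchall()
    print(f"{name}: {'FAIL ' + str(offenders) if offenders else 'PASS'}")
```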

Skills

  • Proficiency in Kafka, Python, SQL (for ETL and data validation).
  • Experience with cloud-native data platforms (AWS Glue, Azure Data Factory, GCP Dataflow).
  • Familiarity with MongoDB, Talend, or other integration tools (see the MongoDB sketch after this list).
  • Strong data modeling, schema design, and performance optimization knowledge.
  • Ability to debug data pipeline issues in large-scale environments.
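
For the MongoDB familiarity above, a common integration pattern is an idempotent upsert keyed on the claim ID, so that Kafka replays do not create duplicate documents. Below is a minimal sketch with pymongo; the connection string, database, and collection names are hypothetical.

```python
# Hypothetical sketch: upsert validated claims into MongoDB keyed on
# claim_id, so replays of the same Kafka message do not create duplicates.
from pymongo import MongoClient  # pip install pymongo

client = MongoClient("mongodb://localhost:27017")   # hypothetical connection
claims = client["claims_db"]["validated_claims"]    # hypothetical names

def upsert_claim(claim: dict) -> None:
    """Insert the claim, or overwrite the existing record with the same ID."""
    claims.update_one(
        {"claim_id": claim["claim_id"]},
        {"$set": claim},
        upsert=True,
    )

upsert_claim({"claim_id": "C1", "member_id": "M1",
              "source_system": "Facets", "amount": 120.0})
```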
