About Brillio:
Brillio is one of the fastest-growing digital technology service providers and a partner of choice for many Fortune 1000 companies seeking to turn disruption into competitive advantage through innovative digital adoption. Brillio is renowned for its world-class professionals, known as "Brillians", who combine cutting-edge digital and design thinking skills with an unwavering dedication to client satisfaction.
Brillio takes pride in its status as an employer of choice, consistently attracting exceptional talent through its emphasis on contemporary, groundbreaking technologies and exclusive digital projects. Its commitment to providing an outstanding experience for Brillians and nurturing their full potential earns it the Great Place to Work® certification year after year.
Consultant
Primary Skills
- Kafka, Python, SQL (for ETL and data validation).
- Experience with cloud-native data platforms (AWS Glue, Azure Data Factory, GCP Dataflow).
Specialization
- Data Engineer with Python, Kafka, SQL
Job requirements
- Location: Coppell, TX and NY (Hybrid, with 2-3 days work-from-office per week)
Responsibilities
- Build and maintain Kafka pipelines for claims data ingestion and routing.
- Develop ETL/ELT processes for integrating Amisys, Facets, ABS, and Excelys into Pisces.
- Implement schema validation and ensure data quality across multiple sources.
- Collaborate with BSAs and QA to deliver accurate edits and exclusions.
Skills
- Proficiency in Kafka, Python, SQL (for ETL and data validation).
- Experience with cloud-native data platforms (AWS Glue, Azure Data Factory, GCP Dataflow).
- Familiarity with MongoDB, Talend, or other integration tools.
- Strong data modeling, schema design, and performance optimization knowledge.
- Ability to debug data pipeline issues in large-scale environments.