ETL QA Automation Engineer
Synechron
Job Description
Job Summary
Synechron is seeking an experienced and detail-oriented ETL QA Automation Engineer to develop, enhance, and execute automated testing solutions for data integration and data flow processes. In this role, you will collaborate closely with data engineers, QA teams, and business stakeholders to validate data transformations, ensure data quality, and support continuous deployment of scalable ETL solutions. Your efforts will improve testing efficiency, data integrity, and system reliability, supporting the organization’s data-driven initiatives.
Software Requirements
Required Skills:
- Strong experience in ETL testing, data validation, and automation scripting
- Proficiency in SQL scripting and data validation for testing data transformations and flow (see the illustrative sketch after this section)
- Hands-on experience with Python for automation and scripting tasks
- Familiarity with data warehousing concepts and ETL workflows (Informatica, Talend, etc.)
- Knowledge of API testing and validation techniques
- Experience working with version control systems like Git and CI/CD pipelines (Jenkins, Azure DevOps)
Preferred Skills:
- Knowledge of cloud data platforms (AWS, Azure, GCP) and their data services
- Experience with data modeling and data quality tools
- Familiarity with automation frameworks such as Robot Framework or Apache Airflow
- Experience with big data testing technologies (Hadoop, Spark)
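For illustration, the sketch below shows the kind of SQL-based reconciliation checks referenced in the required skills above; the table names, columns, and in-memory SQLite setup are placeholders rather than a prescribed implementation.

```python
# Minimal sketch of SQL reconciliation checks between a source table and its
# ETL target. The tables, columns, and in-memory SQLite database are
# illustrative assumptions; real checks would run against the actual source
# and warehouse connections.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

checks = {
    "row_count_matches":
        "SELECT (SELECT COUNT(*) FROM src_orders) = (SELECT COUNT(*) FROM dw_orders)",
    "amount_totals_match":
        "SELECT (SELECT ROUND(SUM(amount), 2) FROM src_orders)"
        " = (SELECT ROUND(SUM(amount), 2) FROM dw_orders)",
    "no_orphan_target_rows":
        "SELECT NOT EXISTS (SELECT 1 FROM dw_orders d"
        " LEFT JOIN src_orders s ON s.order_id = d.order_id"
        " WHERE s.order_id IS NULL)",
}

for name, sql in checks.items():
    passed = bool(conn.execute(sql).fetchone()[0])
    print(f"{name}: {'PASS' if passed else 'FAIL'}")
```

Checks of this shape (row counts, aggregate totals, orphan detection) are typically parameterized per table mapping so new ETL flows can be covered without writing new code.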
Overall Responsibilities
- Design and develop automated test scripts to validate ETL data flow and transformations (see the sketch after this list)
- Validate data quality, consistency, and integrity across multiple systems and platforms
- Perform root cause analysis and troubleshoot issues related to data discrepancies and process failures
- Maintain and enhance automation frameworks to support ongoing testing efforts
- Work closely with data engineers, QA, and product teams to align testing activities with project delivery timelines
- Generate detailed test reports, defect logs, and documentation to support ongoing quality assurance efforts
- Support and execute data migration validation, system upgrades, and process optimizations
- Contribute to continuous improvement initiatives in testing processes and tools
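The sketch below illustrates how such validation checks are commonly packaged as automated, repeatable tests; pytest is assumed as the test runner, and the fixture, helper, and table names are hypothetical.

```python
# Minimal pytest sketch for automated ETL validation. The in-memory SQLite
# fixture and the table names stand in for real staging/warehouse connections.
import sqlite3
import pytest

@pytest.fixture(scope="module")
def conn():
    # Stand-in connection; a real suite would connect to the staging warehouse.
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE src_customers (id INTEGER, email TEXT);
        CREATE TABLE dw_customers  (id INTEGER, email TEXT);
        INSERT INTO src_customers VALUES (1, 'a@example.com'), (2, 'b@example.com');
        INSERT INTO dw_customers  VALUES (1, 'a@example.com'), (2, 'b@example.com');
    """)
    yield db
    db.close()

def fetch_scalar(db, sql):
    # Run a query expected to return a single value and unwrap it.
    return db.execute(sql).fetchone()[0]

def test_row_counts_match(conn):
    assert fetch_scalar(conn, "SELECT COUNT(*) FROM src_customers") == \
           fetch_scalar(conn, "SELECT COUNT(*) FROM dw_customers")

def test_no_null_or_blank_emails_after_load(conn):
    bad = fetch_scalar(
        conn, "SELECT COUNT(*) FROM dw_customers WHERE email IS NULL OR email = ''")
    assert bad == 0
```

Keeping each data-quality rule in its own test keeps failures granular and lets the same suite run locally and inside a CI pipeline.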
Technical Skills (By Category)
Languages & Tools:
- Required: SQL, Python (preferred language for scripting and test automation)
- Preferred: Shell scripting, Java, or other scripting languages for automation tasks
Data & Data Management:
- Required: Deep understanding of data warehouse concepts, data validation, and data integrity testing
- Preferred: Working knowledge of big data tools and platforms such as Hadoop, Spark, or Amazon Redshift
ETL & Data Integration:
- Required: Experience with ETL platforms such as Informatica, Talend, or similar tools
- Preferred: Experience supporting complex workflows involving multiple data sources and destinations
Testing & Automation Frameworks:
- Required: Experience developing and maintaining automation scripts for data testing using Python or Robot Framework
- Preferred: Familiarity with API testing tools like Postman, RestAssured, or similar
DevOps & Continuous Testing:
- Required: Experience integrating automated tests into CI/CD pipelines with Jenkins, Azure DevOps, or GitLab CI (illustrated in the sketch after this section)
- Preferred: Knowledge of containerization (Docker) and orchestration (Kubernetes) in testing environments
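To illustrate the API testing and CI/CD integration items above, the sketch below performs a simple endpoint validation and exits non-zero on failure so that a Jenkins or Azure DevOps stage is marked failed. The endpoint URL, expected fields, and use of the requests library are assumptions for illustration only.

```python
# Illustrative API validation step intended to run inside a CI/CD pipeline:
# the non-zero exit code on failure is what lets Jenkins or Azure DevOps fail
# the stage. The URL, payload shape, and field names are hypothetical.
import sys
import requests

def check_orders_endpoint(base_url: str) -> list:
    """Return a list of human-readable failures (an empty list means pass)."""
    failures = []
    resp = requests.get(f"{base_url}/orders", params={"limit": 5}, timeout=10)
    if resp.status_code != 200:
        return [f"unexpected status {resp.status_code}"]
    for order in resp.json():
        if "order_id" not in order or "amount" not in order:
            failures.append(f"missing required fields in record: {order}")
        elif float(order["amount"]) < 0:
            failures.append(f"negative amount for order {order['order_id']}")
    return failures

if __name__ == "__main__":
    problems = check_orders_endpoint("https://example.internal/api")  # hypothetical URL
    for problem in problems:
        print(f"FAILED: {problem}")
    sys.exit(1 if problems else 0)
```

Relying on the process exit code keeps the check tool-agnostic: any CI system can gate a deployment on it.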
Experience Requirements
- At least 5 years of experience in data validation, ETL testing, or automation testing within data environments
- Proven track record in designing and executing automated testing strategies for ETL workflows
- Experience working with cross-functional data and QA teams in Agile projects
- Demonstrated ability to troubleshoot data and process issues effectively and deliver quality results
- Prior experience supporting data migration, integration, or big data projects is advantageous
Day-to-Day Activities
- Develop and automate data validation scripts for ETL testing covering data transformations and flow
- Investigate and resolve data discrepancies and failures in data pipelines or staging environments (see the sketch after this list)
- Collaborate with data engineers, developers, and QA teams to optimize test coverage and efficiency
- Support system upgrades, patches, and continuous deployment activities for data platforms
- Review existing automation frameworks, refactor scripts, and implement process improvements
- Generate detailed test reports, defect logs, and documentation for audit and compliance purposes
- Support data migration and data quality assurance activities during project rollouts
- Stay updated on the latest automation tools, scripting techniques, and industry best practices
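As an example of the discrepancy investigation mentioned in the list above, the sketch below uses a set-difference (EXCEPT) query to list source rows that are missing from, or altered in, the target; the tables and data are hypothetical.

```python
# Illustrative root-cause drill-down for a data discrepancy: EXCEPT returns
# source rows that have no exact match in the target, pointing directly at the
# records to investigate. Tables, columns, and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 99.0);  -- row 3 missing, row 2 altered
""")

mismatches = conn.execute("""
    SELECT order_id, amount FROM src_orders
    EXCEPT
    SELECT order_id, amount FROM dw_orders
    ORDER BY order_id
""").fetchall()

for order_id, amount in mismatches:
    print(f"order {order_id}: source amount {amount} has no matching target row")
```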
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Science, Information Technology, or related field
- Certifications in Data Testing, ETL Tools (Informatica, Talend), or Cloud Data Services are a plus
- Proven hands-on experience in automating ETL testing and data validation workflows for enterprise-scale systems
Professional Competencies
- Strong analytical and troubleshooting skills for data discrepancies
- Effective communication skills for working with technical and business stakeholders
- Ability to work independently and as part of a collaborative team in a fast-paced environment
- Strong attention to detail and adherence to quality standards
- Continuous learner, eager to stay current with emerging data and automation technologies
- Process-oriented with a focus on efficiency, accuracy, and scalability