Data Engineer III
GHX
Job Summary
The Data Engineer is responsible for developing and executing data solutions that support product and technology initiatives. This role involves backend and ETL development using Python and SQL, integrating data flows between AWS and Snowflake, and implementing data quality and security frameworks. The engineer will also contribute to DevOps practices, provide architectural guidance, troubleshoot issues, and analyze business requirements to formulate designs.
Job Description
The Data Engineer is responsible for developing and executing data solutions that support product and technology initiatives, including general application development activities such as unit testing, code review, code deployment, and technical documentation. This role also collaborates with Product and Engineering teams to design solutions and enable new data capabilities.
Key Responsibilities
- Lead and contribute to the backend and ETL development effort of our data platform using Python and SQL
- Integrate and optimize data flows between AWS and Snowflake for application, analytics, and reporting use cases
- Implement and manage data quality, security, and monitoring frameworks
- Develop and maintain infrastructure-as-code using tools such as CloudFormation or CDK
- Contribute to DevOps practices for CI/CD pipeline automation, version control, and deployment
- Provide architectural guidance and development/build standards for the team
- Troubleshoot and resolve issues in APIs, data pipelines, and infrastructure
- Analyze business requirements and work with teammates to formulate supporting designs and design documentation
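As purely illustrative context for the Python- and SQL-based ETL and data-quality responsibilities above (this is not GHX code; the field names and function are hypothetical), a minimal data-quality gate of the kind an ETL step might apply before loading rows into a warehouse table could look like:

```python
# Illustrative sketch only -- not GHX's actual pipeline code.
# A minimal data-quality gate: rows missing required fields are
# diverted to a reject list instead of being loaded downstream.

def validate_rows(rows, required_fields):
    """Split rows into (valid, rejected) based on required, non-null fields."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejected.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejected

if __name__ == "__main__":
    rows = [
        {"order_id": "A1", "facility": "Denver", "amount": 120.0},
        {"order_id": "A2", "facility": "", "amount": 75.5},
    ]
    valid, rejected = validate_rows(rows, ["order_id", "facility", "amount"])
    print(len(valid), len(rejected))  # 1 1
```

In a production pipeline the reject list would typically feed a monitoring or dead-letter mechanism rather than be silently dropped.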
Key Competencies
- Thorough understanding of, and support for, Agile development methodologies
- Ability to design, collect, and analyze large datasets
- Ability to communicate technical concepts and designs to cross-functional and offshore teams who have varying levels of technical experience
- Proven data engineering, problem-solving, and analysis skills
- Strong demonstrable SQL and Python skills
- Ability to adapt to changing conditions and lead others through change
- Demonstrated organizational, prioritization, and time management skills
- Attention to detail
- Ability and willingness to travel nationally to remote offices and partners approximately 10% of the time
Required Education, Certifications, and Experience
- Bachelor’s degree in Computer Science, Mathematics, or related fields
- 5+ years of data engineering experience building business intelligence applications with exceptional SQL, PL/SQL, and/or Python skills
- 5+ years of experience in ETL development in a big data environment
- 5+ years working in an agile development environment
- Technical writing experience in relevant areas, including queries, reports, and presentations
Preferred Qualifications
- Experience with a diverse set of AWS services such as SNS/SQS, S3, Glue, Lambda, and API Gateway
- Strong development experience in Python, PySpark, and SQL
- Experience developing in Snowflake
- Knowledge of data governance, API security, and best practices for cloud-based systems
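As a toy illustration of the reporting-style SQL work the qualifications above emphasize (SQLite is used here only as a self-contained stand-in; Snowflake syntax matches for a simple aggregation like this, and the table and values are invented for the example):

```python
# Illustrative only -- a toy stand-in for warehouse-style SQL aggregation.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (facility TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("Denver", 120.0), ("Denver", 30.0), ("Boulder", 75.5)],
)
# Total spend per facility, highest first -- a typical reporting query.
rows = conn.execute(
    "SELECT facility, SUM(amount) AS total "
    "FROM orders GROUP BY facility ORDER BY total DESC"
).fetchall()
print(rows)  # [('Denver', 150.0), ('Boulder', 75.5)]
```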
Key Differentiators
- Application, system, or data architecture experience
- Machine learning experience
Estimated Salary: $98,000 - $130,500
The base salary range represents the anticipated low and high end of GHX's salary range for this position. The base salary is one component of GHX's total compensation package for employees. Other rewards and benefits include: health, vision, and dental insurance, accident and life insurance, 401k matching, paid time off, and education reimbursement, to name a few. To view more details of our benefits, visit us here: https://www.ghx.com/about/careers/
No 3rd party agencies or C2C allowed. #LI-SR