Middle Data Engineer
N-iX
Job Summary
As a Middle Data Engineer, you will support the migration of a legacy on-premises SQL Server data warehouse to a cloud-based Snowflake environment. This project involves integrating various data sources, refactoring ETL processes, and ensuring efficient, reliable data movement for analytics and reporting. Responsibilities include building and modifying data pipelines, refactoring legacy ETL jobs, implementing data transformations with dbt, optimizing cloud data processing, collaborating with engineers, and participating in QA and deployment.
Job Description
Responsibilities:
- Build and modify data pipelines to migrate data from SQL Server and other databases (Azure SQL, Oracle, Aurora) to AWS S3 and Snowflake.
- Refactor legacy ETL jobs (SSIS, Transact-SQL) for use in cloud environments.
- Implement data transformations and staging using dbt and/or similar tools.
- Optimize the performance and costs of cloud data processing and storage.
- Collaborate with the client's and N-iX engineers, ensuring smooth handoffs and clear documentation.
- Participate in QA, testing, deployment, and post-migration validation.
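To make the first two responsibilities concrete, a single step of such a migration often looks like this in Python: an extract from SQL Server is staged in S3, then loaded into Snowflake with a COPY INTO statement. The sketch below is illustrative only: all names (stage, schema, tables, bucket layout) are hypothetical, and the actual connections (e.g. via pyodbc and the Snowflake connector) are omitted.

```python
# Minimal sketch of one pipeline step in a SQL Server -> S3 -> Snowflake
# migration. All identifiers (stage, schema, table, key layout) are
# hypothetical; real connection handling is deliberately left out.

def s3_extract_key(table: str, batch_date: str) -> str:
    """Build the S3 key under which a table extract is staged."""
    return f"extracts/{table.lower()}/{batch_date}/{table.lower()}.csv.gz"

def build_copy_into(table: str, stage: str, key: str) -> str:
    """Render a Snowflake COPY INTO statement for a staged CSV extract."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{key} "
        "FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '\"' SKIP_HEADER = 1)"
    )

if __name__ == "__main__":
    # Example: stage the Orders extract for one batch date, then load it.
    key = s3_extract_key("Orders", "2024-05-01")
    print(build_copy_into("staging.orders", "migration_stage", key))
```

Keeping the SQL generation in a pure function like this makes the load step easy to unit-test and reuse across tables, which matters when dozens of legacy SSIS jobs are being refactored into one parameterized pipeline.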
Must-Have Technologies:
- Experience with cloud data warehouse technologies, including Snowflake and AWS S3.
- Solid SQL skills; experience migrating legacy environments to cloud platforms.
- Familiarity with ETL frameworks and tools (preferably SSIS, dbt).
- Proficiency in Python for data pipeline development and automation.
- Strong documentation and communication skills; able to work in cross-functional teams.
Nice to Have:
- Prior experience in large-scale data migration projects.
- Background working with varied database platforms (SQL Server, Oracle, PostgreSQL, MySQL).
- Knowledge of reporting platforms such as Looker is a plus.
We offer*:
- Flexible working format: remote, office-based, or hybrid
- A competitive salary and compensation package
- Personalized career growth
- Professional development tools (mentorship program, tech talks and training sessions, centers of excellence, and more)
- Active tech communities with regular knowledge sharing
- Education reimbursement
- Memorable anniversary presents
- Corporate events and team-building activities
- Other location-specific benefits
*not applicable for freelancers