Data Engineer (Snowflake + DBT (Data Build Tool))

Synechron

Job Summary

Synechron is seeking an experienced Data Engineer with strong expertise in Snowflake and DBT (Data Build Tool) to join our team. This role involves designing, developing, and maintaining scalable data pipelines, transforming raw data into actionable insights, and supporting our modern data platform used for NAV, accounting, reporting, and risk management systems. The successful candidate will contribute to building a secure, high-performance data environment that helps drive business decision-making and operational efficiency. You will work collaboratively across teams, ensuring data integrity and enabling scalable analytics solutions while supporting post-production operations.

Must Have

  • Design, build, and optimize data pipelines within Snowflake.
  • Develop and maintain modular, tested data models using DBT.
  • Write and optimize complex SQL queries.
  • Apply ETL/ELT processes and data warehousing best practices.
  • Use Git for version control and work within CI/CD pipelines.
  • Work comfortably with the Unix/Linux command line and shell scripting.
  • Use Excel for data analysis and validation.
  • Collaborate with stakeholders to understand data requirements.
  • Implement best practices for data security, compliance, and performance tuning.
  • Provide ongoing support and troubleshoot post-production issues.

Good to Have

  • Experience with Python or orchestration tools such as Airflow.
  • Knowledge of data governance, data catalog, or data quality frameworks.
  • Exposure to Azure Cloud and cloud-native data architecture.
  • Financial, accounting, or risk systems experience.
  • Prior experience in a virtual or global team environment.
  • Relevant industry-recognized certifications (e.g., Snowflake).
  • Ongoing learning and professional development in data architectures.

Job Description

Software Requirements

  • Required:
      • Proficiency in Snowflake, including data modeling, performance optimization, and warehouse management
      • Extensive hands-on experience with DBT (Data Build Tool), including development and testing of data models
      • SQL expertise with the ability to optimize complex queries
      • Experience with ETL/ELT processes and data warehousing best practices
      • Version control workflows using Git (GitHub, GitLab, etc.)
      • Familiarity with CI/CD tools such as GitHub Actions, GitLab CI, or similar
      • Familiarity with the Unix/Linux command line and shell scripting
      • Proficiency in Excel for data analysis and validation
  • Preferred:
      • Experience with Python or orchestration tools such as Airflow
      • Knowledge of data governance, data catalog, or data quality frameworks

Overall Responsibilities

  • Design, build, and optimize data pipelines within Snowflake, ensuring high performance, reliability, and scalability
  • Develop and maintain modular, tested, and well-documented data models using DBT across business domains
  • Collaborate with stakeholders to understand data requirements, ensuring data accuracy and completeness for reporting and analytics
  • Implement best practices for data security, compliance, and performance tuning
  • Participate in developing and maintaining data workflows, pipelines, and automation processes
  • Provide ongoing support and troubleshoot post-production issues on rotation, maintaining data system health
  • Foster a culture of best practices around data management, quality, and governance within the team
  • Work alongside data governance and analytical teams to implement data standards and ensure regulatory compliance

Technical Skills (By Category)

  • Programming Languages:
      • Required: SQL (expertise in complex query optimization), shell scripting
      • Preferred: Python for orchestration and data manipulation
  • Databases/Data Management:
      • Required: Snowflake (modeling, performance tuning, warehouse management)
      • Preferred: Exposure to data catalog, data governance, or data quality frameworks
  • Cloud Technologies:
      • Preferred: Azure Cloud and experience with cloud-native data architecture
  • Frameworks and Libraries:
      • Primary: DBT; Git for version control
      • Secondary: Airflow (preferred) for orchestration
  • Development Tools and Methodologies:
      • CI/CD pipelines (GitHub Actions, GitLab CI)
      • Agile development practices
      • Data pipeline design and automation
  • Security & Compliance:
      • Knowledge of data security best practices and compliance regulations applicable to large data environments

Experience Requirements

  • 3 to 7 years of professional experience in data engineering, analytics engineering, or equivalent roles
  • Proven experience working with Snowflake environments, including data modeling, performance management, and warehouse tuning
  • Hands-on experience with DBT, including writing, testing, and deploying data models
  • Solid SQL skills, with a proven track record of optimizing complex queries
  • Familiarity with ETL/ELT concepts, data pipeline design, and best practices
  • Experience working with version-controlled workflows (Git) and CI/CD pipelines
  • Prior experience in a virtual or global team environment is preferred

Domain-specific experience:

  • Financial, accounting, or risk systems experience is advantageous but not mandatory

Alternative pathways:

  • Demonstrable success managing scalable data platforms using relevant tools and techniques

Day-to-Day Activities

  • Design and develop scalable data pipelines and transformation models in Snowflake using DBT
  • Collaborate with data analysts, data scientists, and business analysts to gather requirements and deliver data solutions
  • Optimize and tune data workflows and queries to meet performance and capacity requirements
  • Conduct code reviews, testing, and deployments within CI/CD frameworks
  • Troubleshoot data pipeline issues, investigate root causes, and resolve operational problems in production environments
  • Support post-production activities through scheduled rotation, addressing issues promptly and maintaining data accuracy
  • Document data models, pipelines, and processes for team knowledge sharing
  • Participate in agile ceremonies such as sprint planning, reviews, and retrospectives

Qualifications

  • Bachelor’s degree or higher in Computer Science, Data Science, Information Technology, or a related field
  • Relevant industry-recognized certifications in data engineering or cloud platforms (e.g., Snowflake certification) are preferred
  • Ongoing learning and professional development in data architectures, cloud data platforms, or related fields

Professional Competencies

  • Strong analytical and problem-solving aptitude for complex data challenges
  • Ability to work independently and manage multiple priorities efficiently
  • Excellent communication skills—verbal and written—with the ability to clearly articulate technical concepts
  • Strong interpersonal skills to collaborate effectively with cross-functional teams
  • Self-motivated with a proactive approach to continuous improvement and learning
  • Detail-oriented, organized, and committed to adhering to data governance and security standards
  • Adaptability to evolving technologies and project requirements
  • Capacity to influence and mentor team members to adopt best practices
