Python Data Engineer

Capgemini


Job Description

Your Role

As a Snowflake + Python Data Engineer, you will be instrumental in designing and implementing data solutions that support business intelligence, analytics, and data-driven decision-making. You’ll work closely with cross-functional teams to build robust data pipelines, optimize data models, and deliver high-impact visualizations.

In this role, you will:

  • Develop and maintain scalable data pipelines using Snowflake and Python (Pandas).
  • Utilize SnowSQL and Snowpipe for efficient data ingestion and transformation.
  • Design and implement data models and ETL workflows using DBT or similar tools.
  • Create dashboards and reports using Power BI or Tableau.
  • Collaborate using Git for version control and team development.
  • Write optimized SQL queries for data extraction, transformation, and analysis.
  • Work flexible hours to meet project and client needs.
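As a rough illustration of the Pandas side of such a pipeline — the table and column names below are hypothetical, not part of any actual Capgemini project — a typical pre-load transformation might look like:

```python
import pandas as pd

# Hypothetical raw feed; in practice this would come from a staged file or API.
raw = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "region": ["EMEA", "EMEA", "APAC", "APAC"],
    "amount": [120.0, 80.0, 200.0, 50.0],
})

# Aggregate to the grain of the target reporting model.
regional_sales = (
    raw.groupby("region", as_index=False)
       .agg(order_count=("order_id", "count"),
            total_amount=("amount", "sum"))
)

print(regional_sales)

# In a real pipeline the result would then be loaded into Snowflake, e.g. via
# snowflake.connector.pandas_tools.write_pandas(conn, regional_sales, "REGIONAL_SALES")
# (connection setup and credentials omitted here).
```

This sketches only the in-memory transformation step; ingestion (Snowpipe) and modeling (DBT) would sit on either side of it in a production workflow.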

Your Profile

  • Minimum 7 years of professional experience in data engineering.
  • Strong expertise in Snowflake, SnowSQL, and Snowpipe.
  • Proficient in Python, especially with Pandas in a Snowflake environment.
  • Hands-on experience with DBT or other ETL tools.
  • Skilled in Power BI or Tableau for reporting and visualization.
  • Solid understanding of SQL and data modeling principles.
  • Experience with Git for version control and collaboration.
  • Excellent problem-solving and communication skills.

What You’ll Love About Working Here

  • Flexible work hours and remote work options.
  • Inclusive culture that values innovation, collaboration, and continuous learning.
  • Opportunities to work on cutting-edge technologies and impactful projects.

About Us

Capgemini is a leader in data-driven transformation, helping organizations harness the power of cloud, data, and AI. With a global team of experts, we deliver innovative solutions that drive business growth and operational excellence.
