GCP Data Engineer | Cloud Data Pipelines, Python, BigQuery, Agile Methodology

2-4 Years
Data Analysis

Job Description

Synechron is seeking a highly skilled GCP Data Engineer to join their dynamic team. This role involves designing, developing, and maintaining data solutions using Google Cloud Platform (GCP) technologies. The engineer will support the organization's data infrastructure, enabling scalable analytics, data processing, and integration. This position is crucial for ensuring data quality, security, and performance, contributing to strategic data initiatives.
Must Have:
  • 2-4 years of professional experience in data engineering or cloud data solutions.
  • Design, develop, and maintain data solutions on Google Cloud Platform (GCP).
  • Proficiency in GCP services: BigQuery, Dataflow, Cloud Storage, Composer, Pub/Sub, and Cloud SQL (2+ years).
  • Strong programming skills in Python and/or Java (2+ years of experience).
  • Experience with Apache Beam, Apache Airflow, and Data Studio.
  • Familiarity with Git, JIRA, Confluence, and Agile methodologies.
  • Design and optimize GCP data pipelines and workflows.
  • Create and maintain technical documentation.
  • Conduct code reviews and troubleshoot data workflows.
  • Basic understanding of cloud security principles, IAM, and data encryption.
Good To Have:
  • Knowledge of other cloud platforms (AWS, Azure).
  • Experience with containerization (Docker, Kubernetes).
  • Familiarity with IoT, mobile, or blockchain data integrations.
  • Prior experience in finance, healthcare, or technology industries.



Software Requirements

Required:
  • Proficiency with Google Cloud Platform (GCP) services such as BigQuery, Cloud Dataflow, Cloud Storage, Cloud Composer, Pub/Sub, and Cloud SQL (minimum of 2 years' GCP experience).
  • Programming skills in Python and/or Java (2+ years of experience).
  • Familiarity with data engineering tools such as Apache Beam, Apache Airflow, and Data Studio (or other visualization tools).
  • Experience with version control systems such as Git and project management tools such as JIRA and Confluence.
  • Understanding of the software development life cycle (SDLC) and Agile methodologies.

Preferred:
  • Knowledge of other cloud platforms (AWS, Azure).
  • Experience with containerization (Docker, Kubernetes).
  • Familiarity with IoT, mobile, or blockchain data integrations.

Overall Responsibilities

  • Collaborate with cross-functional teams to gather and analyze data requirements, translating them into scalable cloud-based solutions.
  • Design, develop, and optimize data pipelines and workflows within GCP to support analytics, reporting, and data science initiatives.
  • Create and maintain detailed technical documentation, including architecture diagrams, data models, and processes.
  • Conduct code reviews to uphold code quality, security, and maintainability standards.
  • Monitor and troubleshoot data workflows, resolving technical issues promptly to ensure system stability.
  • Stay updated on emerging cloud and data engineering trends, recommending improvements and innovative solutions.
  • Collaborate with stakeholders to ensure data solutions align with business objectives and compliance standards.
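The pipeline and data-quality responsibilities above can be illustrated with a minimal, library-free sketch (field names such as `user_id` and `amount` are hypothetical; a production pipeline would typically run on Dataflow/Beam rather than plain Python):

```python
import json
from datetime import datetime, timezone

def validate_record(record):
    """Return True if a raw event record passes basic quality checks."""
    return (
        isinstance(record.get("user_id"), str) and record["user_id"] != ""
        and isinstance(record.get("amount"), (int, float)) and record["amount"] >= 0
    )

def transform_record(record):
    """Normalize a valid record for loading (e.g. into a BigQuery table)."""
    return {
        "user_id": record["user_id"],
        "amount": round(float(record["amount"]), 2),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def run_pipeline(raw_lines):
    """Parse, validate, and transform newline-delimited JSON events.

    Returns (clean_rows, rejected_count) so data-quality metrics can be
    monitored alongside the load itself.
    """
    good, bad = [], 0
    for line in raw_lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            bad += 1
            continue
        if validate_record(record):
            good.append(transform_record(record))
        else:
            bad += 1
    return good, bad
```

Tracking the rejected count separately, as here, is what makes the quality-monitoring responsibility concrete: bad input is quarantined and counted rather than silently dropped.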

Technical Skills (By Category)

Programming Languages (Required):

  • Python, Java, or Node.js (minimum 2 years hands-on experience)

Databases/Data Management:

  • Experience with Google BigQuery, Cloud SQL, or similar RDBMS and NoSQL databases

Cloud Technologies:

  • Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer) — required
  • Experience with other cloud providers is advantageous

Frameworks and Libraries:

  • Apache Beam, Apache Airflow (used within GCP), Data Studio or similar visualization tools
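Apache Beam pipelines compose transforms with the `|` operator. As a rough, non-authoritative sketch of that style using only the standard library (real Beam code would use `apache_beam.Pipeline` with a runner such as DirectRunner or DataflowRunner; `Pipe`, `Map`, and `Filter` here are toy stand-ins):

```python
class Pipe:
    """Tiny stand-in for Beam's PCollection-pipe-PTransform chaining.

    Only mimics the composition style; it holds a plain list and applies
    each stage eagerly, unlike Beam's deferred execution model.
    """
    def __init__(self, values):
        self.values = list(values)

    def __or__(self, fn):
        return Pipe(fn(self.values))

def Map(fn):
    """Apply fn to every element (analogous to beam.Map)."""
    return lambda values: [fn(v) for v in values]

def Filter(pred):
    """Keep elements where pred is True (analogous to beam.Filter)."""
    return lambda values: [v for v in values if pred(v)]

# A Beam-like chain: read -> parse -> filter -> format
result = (
    Pipe(["3", "x", "10", "7"])
    | Map(lambda s: int(s) if s.isdigit() else None)
    | Filter(lambda n: n is not None and n > 5)
    | Map(lambda n: n * 2)
)
print(result.values)  # → [20, 14]
```

The point of the sketch is the shape of the code: each stage is a small, testable transform, which is also how Beam pipelines are reviewed and maintained in practice.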

Development Tools & Methodologies:

  • Git, JIRA, and Confluence
  • Familiarity with Agile/Scrum workflows

Security Protocols:

  • Basic understanding of cloud security principles, IAM policies, and data encryption practices
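As a small illustration of the data-protection practices mentioned above: one common technique is deterministic pseudonymization of sensitive columns before they reach analytics tables. The sketch below uses salted SHA-256 hashing (not encryption; at-rest encryption on GCP is handled by the platform, optionally with customer-managed keys via Cloud KMS), and the field names are hypothetical:

```python
import hashlib

def pseudonymize(value: str, salt: str) -> str:
    """Replace a sensitive identifier with a salted SHA-256 digest.

    Deterministic, so joins on the masked column still work across tables;
    irreversible, unlike encryption, so the raw value cannot be recovered.
    """
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

def mask_row(row: dict, sensitive_fields: set, salt: str) -> dict:
    """Return a copy of `row` with the listed sensitive fields pseudonymized."""
    return {
        key: pseudonymize(value, salt) if key in sensitive_fields else value
        for key, value in row.items()
    }
```

For example, `mask_row({"email": "a@b.com", "amount": 5}, {"email"}, "s3cret")` leaves `amount` untouched while replacing `email` with a 64-character hex digest.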

Experience Requirements

  • 2-4 years of professional experience in data engineering or cloud-based data solutions.
  • Proven experience designing and implementing data pipelines using GCP or comparable cloud platforms.
  • Demonstrated ability to work with cross-disciplinary teams to deliver complex data solutions.
  • Hands-on experience with Agile methodologies and modern development tools (Git, JIRA).
  • Prior experience supporting data-driven decision-making in industries such as finance, healthcare, or technology is desirable but not mandatory.

Alternative Pathways:

Candidates with extensive experience in alternative cloud environments (AWS, Azure) and proven ability to adapt to GCP are encouraged to apply.

Day-to-Day Activities

  • Participate in daily stand-up meetings and sprint planning sessions.
  • Analyze business data requirements and design cloud-based data pipelines.
  • Develop, test, and deploy data processing workflows ensuring efficiency and reliability.
  • Conduct peer code reviews, offering constructive feedback to team members.
  • Monitor pipeline performance, troubleshoot issues, and optimize workflows.
  • Document technical specifications, system architectures, and operational procedures.
  • Collaborate with data analysts, data scientists, and business stakeholders to refine data solutions.
  • Keep abreast of new GCP features, tools, and best practices, evaluating their applicability.
  • Provide technical support and mentorship within the team when needed.

Qualifications

  • Bachelor's or Master’s degree in Computer Science, Information Technology, Data Science, or a related discipline.
  • Relevant industry certifications such as Google Professional Data Engineer, GCP Associate Cloud Engineer, or similar are preferred.
  • Continuous learning through professional training and certification programs is expected.

Professional Competencies

  • Strong analytical and problem-solving skills with attention to detail.
  • Effective communication abilities for articulating technical concepts to non-technical stakeholders.
  • Proven ability to work collaboratively within diverse teams.
  • Adaptability to evolving technologies and project requirements.
  • Proactive approach to learning and innovation.
  • Excellent organizational skills to manage multiple priorities and meet deadlines.

SYNECHRON’S DIVERSITY & INCLUSION STATEMENT

Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, ‘Same Difference’, is committed to fostering an inclusive culture that promotes equality, diversity, and an environment respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants of all backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and abilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Sustainability and Health Safety Commitment

At Synechron, we are committed to integrating sustainability into our business strategy, ensuring responsible growth while minimizing environmental impact. Employees play a key role in driving our sustainability initiatives, from reducing our carbon footprint to fostering ethical and sustainable business practices across global operations. All positions are required to adhere to our Sustainability and Health Safety standards, demonstrating a commitment to environmental stewardship, workplace safety, and sustainable practices.

