Must have:
Experience with GCP, BigQuery, and Dataform
Proficiency in SQL
Knowledge of data warehouse concepts
Experience with Airflow/Cloud Composer
Good to have:
Google Cloud certification
Description
About Us
Saama develops life science solutions that accelerate the delivery of therapies to patients. With innovative AI technologies, Saama drives breakthrough intelligence into clinical and commercial operations. The Saama platform powered the clinical trial that led to the world’s first COVID-19 vaccine. Headquartered in Campbell, CA, with employees around the globe, Saama is committed to helping customers save and improve lives. Discover more at saama.com.
Job Title: GCP Data Engineer
Location: Pune, Chennai, or Coimbatore
Type: Full-time
Experience Level: 4+ years
We are seeking a skilled and experienced GCP Data Engineer to join our dynamic team. The ideal candidate will have a strong background in Google Cloud Platform (GCP), BigQuery, Dataform, and data warehouse concepts. Experience with Airflow/Cloud Composer and cloud computing knowledge will be a significant advantage.
Responsibilities:
Designing, developing, and maintaining data pipelines and workflows on the Google Cloud Platform.
Building and optimizing BigQuery data models and datasets for performance and scalability.
Implementing Dataform for managing and documenting data transformations.
Collaborating with data analysts, data scientists, and other stakeholders to understand data requirements and build appropriate solutions.
Ensuring data integrity, security, and compliance with best practices and standards.
Monitoring and optimizing data processing and storage resources on GCP.
Troubleshooting and resolving data pipeline issues and performance bottlenecks.
Documenting data engineering processes, best practices, and technical specifications.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field.
Proven experience working as a data engineer, specifically with GCP.
Strong proficiency in SQL and experience with database technologies.
Hands-on experience with BigQuery, Cloud Storage, Dataform, and other GCP data engineering tools.
Familiarity with data warehouse concepts and methodologies.
Knowledge of Airflow/Cloud Composer for orchestrating data workflows.
Experience with cloud computing platforms and services.
Excellent communication and collaboration skills.
Ability to work effectively in a fast-paced and dynamic environment.
Google Cloud certification is a plus.
If you are passionate about leveraging GCP to build robust and efficient data solutions, and you meet the above qualifications, we encourage you to apply for this exciting opportunity.