Principal Engineer (Data DevOps)


Job Description

We are seeking an experienced Principal Engineer (Data DevOps) to lead our Data DevOps team in building, managing, and optimizing high-scale, secure, and reliable big data platforms. The ideal candidate will combine strong technical expertise with proven leadership skills, driving best practices in cloud infrastructure, automation, CI/CD, and big data technologies. This role involves managing cross-functional priorities, mentoring engineers, and ensuring delivery of high-quality, scalable data solutions.

Job Details


Key Responsibilities:

  • Lead, mentor, and grow a high-performing Data DevOps team, fostering technical excellence and ownership.
  • Drive architecture, design, and implementation of large-scale cloud and data infrastructure, ensuring scalability, performance, and security.
  • Collaborate closely with Data Engineering, Data Science, Analytics, and Product teams to deliver efficient and reliable data platforms.
  • Oversee operations and optimization of AWS-based infrastructure, including VPC, EC2, S3, EMR, EKS, SageMaker, Lambda, CloudFront, CloudWatch, and IAM.
  • Manage and scale big data platforms leveraging Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Looker, and Jupyter Notebooks.
  • Implement and maintain CI/CD pipelines and infrastructure automation using Terraform, Ansible, and CloudFormation.
  • Ensure system observability, proactive incident management, and SLA adherence.
  • Champion cloud security best practices, including API security, TLS/HTTPS, and access control policies.
  • Partner with stakeholders to prioritize initiatives, manage budgets, and optimize cloud and operational costs.

Required Qualifications

  • Experience: 8+ years in DevOps/Data DevOps or related fields, including 4+ years in a leadership role.
  • Proven track record in managing high-scale big data infrastructure and leading engineering teams.
  • Strong hands-on experience with AWS services and infrastructure automation tools (Terraform, Ansible, CloudFormation).
  • Deep knowledge and hands-on experience with Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Jupyter Notebooks, and Looker.
  • Proficiency in Kubernetes/EKS, Docker, ECS, and CI/CD tools.
  • Strong understanding of networking, cloud security, and compliance requirements.
  • Excellent communication, stakeholder management, and decision-making skills.
  • Exposure to SQL and data query optimization is an advantage.


About The Company

This role is posted by Paytm, which is hiring in Noida, Uttar Pradesh, India (On-Site), and at other locations across India, including Bengaluru, Hyderabad, Visakhapatnam, and Kochi.
