DataOps Engineer

1 Day ago • 4-8 Years • Software Development & Engineering

Job Description

Velotio Technologies is seeking a skilled DataOps Engineer with a strong foundation in DevOps and Data Engineering principles. The role involves ensuring the smooth deployment, observability, and performance optimization of data pipelines and platforms. Responsibilities include designing and managing CI/CD pipelines using Jenkins, Git, and Terraform, maintaining Kubernetes clusters, building observability dashboards with tools such as Prometheus and Grafana, and automating infrastructure provisioning. The engineer will collaborate with data teams to debug and optimize data pipelines (e.g., Airflow, Airbyte) and manage Snowflake environments. This position requires strong Python scripting skills, Kubernetes management experience, and familiarity with Infrastructure as Code tools, along with excellent problem-solving and communication skills.
Must have:
  • 4-8 years of experience in DevOps/DataOps/Platform Engineering
  • Proficient in Kubernetes cluster management
  • Hands-on experience with CI/CD pipelines (Jenkins, GitOps)
  • Strong Python scripting and automation skills
  • Experience with workflow orchestration (Airflow)
  • Experience with data ingestion tools (Airbyte)
  • Solid experience with IaC tools (Terraform)
  • Familiarity with monitoring tools (Prometheus, Grafana)
  • Working knowledge of Snowflake
  • Strong debugging and problem-solving skills
Good to have:
  • Experience with cloud platforms (AWS, GCP)
  • Familiarity with data cataloging and lineage tools
  • Exposure to container security and data governance tools
  • Background in data modeling or SQL optimization
Perks:
  • Autonomous and empowered work culture
  • Flat hierarchy with fast decision making
  • Startup-oriented culture
  • Fun & positive environment with regular celebrations
  • Inclusive, diverse & authentic environment

Job Details

Velotio Technologies is a product engineering company working with innovative startups and enterprises. We are a certified Great Place to Work® and recognized as one of the best companies to work for in India. We have provided full-stack product development for 110+ startups across the globe building products in the cloud-native, data engineering, B2B SaaS, IoT & Machine Learning space. Our team of 400+ elite software engineers solves hard technical problems while transforming customer ideas into successful products.

Job Overview:

We are seeking a skilled DataOps Engineer with a strong foundation in DevOps practices and Data Engineering principles. The ideal candidate will be responsible for ensuring smooth deployment, observability, and performance optimization of data pipelines and platforms. You will work at the intersection of software engineering, DevOps, and data engineering—bridging gaps between development, operations, and data teams.

Key Responsibilities:

  • Design, implement, and manage CI/CD pipelines using tools such as Jenkins, Git, and Terraform.
  • Manage and maintain Kubernetes (K8s) clusters for scalable and resilient data infrastructure.
  • Develop and maintain observability tools and dashboards (e.g., Prometheus, Grafana, ELK stack) for monitoring pipeline and platform health.
  • Automate infrastructure provisioning and deployments using Infrastructure as Code (IaC) tools, preferably Terraform.
  • Collaborate with data engineers to debug, optimize, and track performance of data pipelines (e.g., Airflow, Airbyte).
  • Implement and monitor data quality, lineage, and orchestration workflows.
  • Develop custom scripts and tools in Python to enhance pipeline reliability and automation.
  • Work closely with data teams to manage and optimize Snowflake environments, focusing on performance tuning and cost efficiency.
  • Ensure compliance with security, scalability, and operational best practices across the data platform.
  • Act as a liaison between development and operations to maintain SLAs for data availability and reliability.

Required Skills & Experience:

  • 4–8 years of experience in DevOps / DataOps / Platform Engineering roles.
  • Proficient in managing Kubernetes clusters and associated tooling (Helm, Kustomize, etc.).
  • Hands-on experience with CI/CD pipelines, especially using Jenkins, GitOps, and automated testing frameworks.
  • Strong scripting and automation skills in Python.
  • Experience with workflow orchestration tools like Apache Airflow and data ingestion tools like Airbyte.
  • Solid experience with Infrastructure as Code tools, preferably Terraform.
  • Familiarity with observability and monitoring tools such as Prometheus, Grafana, Datadog, or New Relic.
  • Working knowledge of data platforms, particularly Snowflake, including query performance tuning and monitoring.
  • Strong debugging and problem-solving skills, especially in production data pipeline scenarios.
  • Excellent communication skills and ability to collaborate across engineering, operations, and analytics teams.

Preferred Qualifications:

  • Experience with cloud platforms (AWS and/or GCP) and cloud-native DevOps practices.
  • Familiarity with data cataloging and lineage tools.
  • Exposure to container security, policy management, and data governance tools.
  • Background in data modeling, SQL optimization, or data warehousing concepts is a plus.

Our Culture:

  • We have an autonomous and empowered work culture encouraging individuals to take ownership and grow quickly
  • Flat hierarchy with fast decision making and a startup-oriented “get things done” culture
  • A strong, fun & positive environment with regular celebrations of our success. We pride ourselves on creating an inclusive, diverse & authentic environment

At Velotio, we embrace diversity. Inclusion is a priority for us, and we are eager to foster an environment where everyone feels valued. We welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation.


About The Company

Velotio Technologies is a leading product engineering and digital solutions company working with innovative startups and enterprises across the globe. We specialize in Full-Stack development, Web & Mobile App Development, Cloud & DevOps, Data Engineering, AI/ML, UI/UX, and Quality Assurance. Since our inception in 2016, we have worked with over 110 global customers including NASDAQ-listed enterprises, unicorn startups, Y Combinator and Sequoia funded companies, and cutting-edge product companies.
