Senior Data Scheduling Infra Developer

At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts.

Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet.

Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Job Description: As a Senior Data Development Infrastructure Developer, you will play a pivotal role in designing, developing, and operating a high-performance distributed Data+AI job scheduling and execution engine based on Apache Airflow. You will be responsible for customizing, enhancing, and extending Apache Airflow's capabilities to meet eBay's specific needs. You will work closely with the open-source community, potentially contributing to the Airflow codebase and influencing the direction of the project.

Key Responsibilities:

  • Analyze, customize, and enhance Apache Airflow's source code to develop new features and improve existing functionalities and performance.
  • Collaborate with the Apache Airflow community to contribute patches, propose enhancements, and resolve issues.
  • Implement best practices for Airflow architecture, including security, scalability, and performance optimization.
  • Develop and optimize complex Airflow DAGs and Operators for data processing and orchestration (a brief illustrative sketch follows this list).
  • Stay updated with the latest trends and advancements in Airflow and related technologies.
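
For context on the DAG and Operator work mentioned above, here is a minimal sketch of a custom Operator wired into a daily DAG. It assumes Apache Airflow 2.4+ and uses placeholder names (PartitionCheckOperator, example_partition_check, an "orders" table) that are purely illustrative, not taken from eBay's codebase.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.models.baseoperator import BaseOperator


class PartitionCheckOperator(BaseOperator):
    """Illustrative custom Operator: reports the partition it would validate."""

    def __init__(self, table: str, **kwargs):
        super().__init__(**kwargs)
        self.table = table

    def execute(self, context):
        # "ds" is Airflow's built-in logical-date string (YYYY-MM-DD) for this run.
        ds = context["ds"]
        self.log.info("Checking partition %s/dt=%s", self.table, ds)
        return f"{self.table}/dt={ds}"  # return value is pushed to XCom for downstream tasks


with DAG(
    dag_id="example_partition_check",  # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # "schedule" is the Airflow 2.4+ argument name
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    PartitionCheckOperator(task_id="check_orders", table="orders")
```

In practice, an Operator like this would also carry unit tests and configuration (connections, Variables) rather than hard-coded names, but the shape above is the gist of the development work involved.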

Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, or related field. Advanced degree preferred.
  • 5+ years of experience in software development, with a strong focus on Python.
  • 2+ years of hands-on experience with Apache Airflow, including DAG and Operator development.
  • Strong understanding of cloud platforms and containerization technologies (Docker, Kubernetes).
  • Experience with CI/CD tools and practices.
  • Excellent problem-solving skills and ability to work independently on complex technical challenges.
  • Strong communication skills and ability to work collaboratively in a global team environment.

Preferred Qualifications:

  • Deep understanding of Apache Airflow architecture and internals, with proven experience in customizing its source code.
  • Experience as a contributor or committer to the Apache Airflow project or similar open-source projects.
  • Excellent communication skills and ability to collaborate effectively with both internal teams and the open-source community.
  • Experience with big data technologies such as Hadoop, Spark, Kafka, and Flink.
