At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts.
Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet.
Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.
Job Description: As a Senior Data Development Infrastructure Developer, you will play a pivotal role in designing, developing, and operating a high-performance, distributed Data+AI job scheduling and execution engine based on Apache Airflow. You will be responsible for customizing, enhancing, and extending the capabilities of Apache Airflow to meet our specific needs at eBay. You will work closely with the open-source community, potentially contributing to the Airflow codebase and influencing the direction of the project.
Key Responsibilities:
- Analyze, customize, and enhance Apache Airflow's source code to develop new features and improve existing functionalities and performance.
- Collaborate with the Apache Airflow community to contribute patches, propose enhancements, and resolve issues.
- Implement best practices for Airflow architecture, including security, scalability, and performance optimization.
- Develop and optimize complex Airflow DAGs and Operators for data processing and orchestration.
- Stay updated with the latest trends and advancements in Airflow and related technologies.
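To give a flavor of the orchestration work described above: at its core, an Airflow DAG resolves task dependencies into an execution order. The sketch below illustrates that concept with Python's standard library alone (the task names and dependency graph are hypothetical, for illustration only, and do not use the Airflow API itself):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each key is a task, each value the set of
# tasks it depends on — the same dependency model an Airflow DAG uses.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

# Resolve a valid execution order, as a scheduler would.
order = list(TopologicalSorter(deps).static_order())
print(order)  # extract runs first; load runs last
```

In production Airflow, the same dependency structure is declared with Operators and the `>>` operator inside a `DAG` context, and the scheduler handles ordering, retries, and parallelism.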
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degree preferred.
- 5+ years of experience in software development, with a strong focus on Python.
- 2+ years of hands-on experience with Apache Airflow, including DAG and Operator development.
- Strong understanding of cloud platforms and containerization technologies (Docker, Kubernetes).
- Experience with CI/CD tools and practices.
- Excellent problem-solving skills and ability to work independently on complex technical challenges.
- Strong communication skills and ability to work collaboratively in a global team environment.
Preferred Qualifications:
- Deep understanding of Apache Airflow architecture and internals, with proven experience in customizing its source code.
- Experience as a contributor or committer to the Apache Airflow project or similar open-source projects.
- Ability to collaborate effectively with the open-source community as well as internal teams.
- Experience with big data technologies such as Hadoop, Spark, Kafka, or Flink.