Senior Data Engineer, Trust & Safety
Match Group
Job Summary
As a Senior Data Engineer at Hinge, you will play a crucial role in developing essential data processes and contributing to a modern data pipeline that is fundamental to Hinge's decision-making. Your work will directly impact the Trust and Safety team, enabling data-driven improvements that affect millions of users. Responsibilities include ensuring data flows accurately from creation to the presentation layers; improving the Data Engineering stack through containerization, data modeling, and ETL pipelines; designing, building, and maintaining the analytics data warehouse for reporting, analytics, and machine learning; translating stakeholder needs into action items and deliverables; continuing to learn and improve the pipelines; and participating in the on-call rotation. The role involves working on a big data architecture while concentrating on real-world problems.
Must Have
- 5+ years of professional experience.
- Proficient in Python, SQL, DevOps, and databases.
- Experience delivering data products from conception to delivery.
- Experience modeling data sets for different types of sources.
- Passionate about designing elegant data infrastructure.
Good to Have
- Familiarity with Kubernetes, Docker, Terraform, Kafka, Airflow, dbt, Looker, AWS technologies, GCP technologies, and CI/CD technologies.
- Experience with Spark/Databricks.
Perks & Benefits
- 401(k) Matching.
- Professional Growth with a $3,000 annual Learning & Development stipend.
- Free access to Udemy.
- Parental Leave & Planning with 100% paid parental leave.
- Fertility Support with access to fertility care through Carrot and $10,000 toward fertility preservation.
- Date Stipend of $100 per month.
- Employee Resource Groups (ERGs).
Job Description
Hinge is the dating app designed to be deleted
In today's digital world, finding genuine relationships is tougher than ever. At Hinge, we’re on a mission to inspire intimate connection to create a less lonely world. We’re obsessed with understanding our users’ behaviors to help them find love, and our success is defined by one simple metric: setting up great dates. With tens of millions of users across the globe, we’ve become the most trusted way to find a relationship, for all.
About the Role:
As a Senior Data Engineer at Hinge, you will create essential data processes and contribute to components of a modern data pipeline that will be the foundation of Hinge’s decision-making ability. The systems you help create, the problems you help solve, and the support you provide to our analytical minds will be pivotal to the success of Hinge.
You will be making a real impact on the Trust and Safety team. Your work will enable the organization to make data-driven decisions and drive improvements, which will affect the love lives of millions of people.
As a Senior Data Engineer on the team, you will be improving core functionality and implementing critical pipelines that will aid not only the Data Engineering team but also the rest of the organization. You will be working on a real big data architecture while concentrating on real-world problems.
Responsibilities:
- Work with our data teams to ensure data is flowing accurately from data creation to our presentation layers.
- Improve our Data Engineering stack through containerization, data modeling, developing our ETL pipelines, and more.
- Use your expertise in data modeling to design, build, and maintain our analytics data warehouse, which provides clean, accurate, and robust data sets that can be leveraged for reporting, analytics, and machine learning initiatives.
- Work with stakeholders and translate their needs and expectations into action items and deliverables.
- Continue to learn more about the Data Engineering discipline, utilize that knowledge in your deliverables, and identify opportunities to enhance our pipelines.
- Participate in our on-call rotation to ensure system reliability, promptly addressing issues to keep operations running smoothly.
What We're Looking For:
- 5+ years of professional/industry experience.
- Proficient in Python, SQL, DevOps, and databases.
- Experience delivering data products from conception to delivery and with the infrastructure that supports their underlying processes.
- Experience modeling data sets for different types of sources and business processes.
- Passionate about designing elegant data infrastructure, tooling, and pipelines.
- Familiarity with our stack: Kubernetes, Docker, Terraform, Kafka, Airflow, dbt, Looker, AWS technologies (S3, Redshift), GCP technologies (Dataflow, BigQuery), and CI/CD technologies (CircleCI). Spark/Databricks a plus!