Vimeo’s mission is to help every business and video professional grow their brand with video. Through our ever-expanding suite of tools for creating, managing, sharing, and selling video, we have seen incredible growth and are proud of our community of 200 million members. We are looking for an experienced Senior Data Engineer to join our team. The ideal candidate has a strong background in building and maintaining robust data pipelines and proven experience leading data engineering projects from conception to production. Your technical expertise will be essential in helping us build scalable and reliable data solutions as we migrate from Snowflake to Databricks.
What you'll do:
- Technical Expertise: Leverage your strong knowledge of data engineering principles and technologies to build and maintain robust data solutions.
- Data Architecture: Design, build, and maintain scalable and reliable data pipelines and data models within our data warehouse solutions using tools like Databricks and Snowflake.
- Resiliency: Create robust, self-healing data pipelines through automation that effectively support the needs of the business, reducing outages and recovery time from incidents.
- Collaboration: Collaborate with data analysts, product managers, and other stakeholders to understand data requirements and provide progress updates within an Agile environment.
- Quality Assurance: Enforce best practices and coding standards, and ensure the delivery of high-quality, maintainable code and data.
- Continuous Learning: Stay abreast of industry trends and emerging technologies in data engineering, and encourage the team to do the same, fostering a culture of continuous learning.
Skills you'll need:
- Bachelor’s degree (preferably in Business, Economics, Statistics, or Computer Science) and 5+ years of experience in a data analytics or data engineering role (preferably in a SaaS business).
- High proficiency in SQL is a must.
- Python experience (especially data processing libraries like Pandas/Polars) and/or data streaming technologies (Kafka, Spark Streaming) will set you apart.
- Hands-on experience with Databricks is a must; knowledge of Snowflake is a nice-to-have.
- Experience with data modeling frameworks, especially dbt Cloud.
- Experience with documentation and version control tools such as GitHub, Atlan, and/or Confluence (preferred).
- Proficient in Excel and Google Sheets (pivot tables, keyboard shortcuts, etc.).
- Expertise in data visualization tools, preferably Looker with LookML.
- Strategic thinker with a proactive approach to identifying and solving business challenges.
- A sense of ownership, capable of independently initiating and driving projects from beginning to end.
- You truly believe that "getting it right" is just as important as "getting it done."