Sr. Data Engineer
Highspot
Job Summary
As a Senior Data Engineer at Highspot, you will be responsible for the entire data pipeline, ensuring the reliability, efficiency, and scalability of data systems. You will work with data scientists, analysts, and software engineers to develop robust data solutions. Responsibilities include designing data pipeline architecture, developing and maintaining scalable data infrastructure, automating manual processes, supporting stakeholders with data-related issues, building data tools, collaborating with data experts, designing and implementing data models, optimizing performance, implementing data governance, staying current on emerging technologies, documenting processes, driving the team's data strategy, and empowering the team to self-serve. Highspot is a global leader in sales enablement, leveraging AI and GenAI.
Must Have
- 7+ years of SQL experience in query authoring and tuning.
- 5 years of Python experience in data wrangling and object-oriented programming.
- Expertise in designing, creating, managing, and utilizing large datasets.
- Ability to build efficient ETL and reporting solutions.
- Experience with root cause analysis of data and processes.
- Experience with cloud-based database platforms, preferably Snowflake.
- 3+ years of git experience for version control and collaboration.
- 3+ years of experience with AWS cloud technology.
- 3+ years of experience with workflow management platforms such as Airflow or Dagster.
- 2+ years of experience using Tableau to build reports and dashboards.
Good to Have
- Experience working with dbt.
Job Description
- Create optimal data pipeline architecture.
- Develop and maintain end-to-end, scalable data infrastructure and pipelines.
- Identify, design, and implement process improvements, automating manual processes and optimizing data delivery.
- Assist internal stakeholders with data-related technical issues and support their data infrastructure needs.
- Develop data tools for analytics and data scientists, contributing to product innovation.
- Collaborate with data and analytics experts to enhance functionality in data systems.
- Design and implement data models, ensuring integrity and consistency.
- Identify and resolve performance bottlenecks, optimizing queries and processing.
- Implement data governance best practices for quality, security, and compliance.
- Stay informed about emerging technologies, contribute to tool selection, and enhance data infrastructure.
- Create and maintain comprehensive documentation for data engineering processes and workflows.
- Drive the team's data strategy, technical roadmap, and data storage solutions.
- Empower the team for self-servicing and efficient troubleshooting.
Qualifications
- Bachelor’s degree or equivalent experience.
- 7+ years of experience using SQL in an advanced capacity, including query authoring, tuning, and identifying useful abstractions.
- 5 years of hands-on Python experience, demonstrating proficiency in data wrangling and object-oriented programming.
- Expertise in designing, creating, managing, and utilizing large datasets.
- Ability to build efficient, flexible, extensible, and scalable ETL and reporting solutions.
- Experience performing root cause analysis on internal and external data and processes.
- Experience with cloud-based database platforms, preferably Snowflake.
- 3+ years of git experience for version control and collaboration.
- 3+ years of experience working with AWS cloud technology.
- 3+ years of experience working with workflow management platforms such as Airflow or Dagster.
- 2+ years of experience using Tableau to build impactful reports and dashboards.
- Experience working with dbt is a plus.
- Strong analytical and problem-solving skills.
- Cross-functional team collaboration experience in a dynamic environment.
- Proven track record of navigating ambiguity, prioritizing needs, and solving impactful business problems.
- Excellent written and verbal communication skills for technical and non-technical audiences.
- Empathy-driven mindset that supports team success.
- Remote work experience with a U.S.-based team is preferred.