As an Analytics Engineer, you will lead data pipeline, data strategy, and data visualization efforts for the Data & Analytics organization at STAGE. You’re an engineer who understands not only how to use big data to answer complex business questions but also how to design semantic layers that best support self-service tools. You will manage projects from requirements gathering to planning to implementation of full-stack data solutions (pipelines to data tables to visualizations). You will work closely with cross-functional partners to ensure that business logic is properly represented in the semantic layer and production environments, where it can be used by the wider Product Analytics team to drive business insights and strategy.
What Will You Do?
Design and implement data models that support flexible querying and data visualization.
Partner with Product stakeholders to understand business questions and build out advanced analytical solutions.
Advance automation efforts that help the team spend less time manipulating and validating data and more time analyzing it.
Build frameworks that multiply the productivity of the team and are intuitive for other data teams to leverage.
Participate in the creation and support of analytics development standards and best practices.
Create systematic solutions for handling data anomalies: identification, alerting, and root-cause analysis.
Work proactively with stakeholders to ready data solutions for new product and/or feature releases, with a keen eye for uncovering and troubleshooting any data quality issues or nuances.
Identify and explore new opportunities through creative analytical and engineering methods.
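As a concrete illustration of the anomaly-handling responsibility above, a minimal volume check can be sketched in Python. The function name, the window size, and the z-score threshold are all hypothetical choices for this sketch, not a prescribed implementation:

```python
from statistics import mean, stdev

def detect_count_anomalies(daily_counts, window=7, threshold=3.0):
    """Flag days whose row count deviates sharply from the trailing window.

    daily_counts: list of (date_str, row_count) tuples, oldest first.
    Returns a list of (date_str, row_count, z_score) alerts.
    """
    alerts = []
    for i in range(window, len(daily_counts)):
        trailing = [count for _, count in daily_counts[i - window:i]]
        mu, sigma = mean(trailing), stdev(trailing)
        if sigma == 0:
            continue  # no variation in the window; skip to avoid dividing by zero
        day, count = daily_counts[i]
        z = (count - mu) / sigma
        if abs(z) > threshold:
            alerts.append((day, count, round(z, 2)))
    return alerts

# A week of ~1,000 rows/day followed by a day with only 100 rows raises an alert.
counts = list(zip(
    [f"2024-01-0{d}" for d in range(1, 9)],
    [1000, 1010, 990, 1005, 995, 1000, 1002, 100],
))
print(detect_count_anomalies(counts))
```

In practice a check like this would run in the orchestration layer after each batch load and feed an alerting channel; the point is that identification, alerting, and root-cause analysis start from a systematic, automated signal rather than ad-hoc inspection.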
What To Bring?
Bachelor's degree in Engineering
4-8 years of relevant experience in business intelligence/data engineering
Expertise in writing SQL (clean, fast code is a must) and in data-warehousing concepts such as star schemas, slowly changing dimensions, ELT/ETL, and MPP databases
Experience in using dbt to build transformations
Experience in transforming flawed/changing data into consistent, trustworthy datasets, and in developing DAGs to batch-process millions of records
Experience with general-purpose programming (e.g. Python, Java, Go) and with a variety of data structures, algorithms, and serialization formats
Advanced ability to build reports and dashboards with BI tools (such as Looker and Tableau)
Experience with analytics tools such as Athena, Redshift/BigQuery, Splunk, etc.
Proficiency with Git (or similar version control) and CI/CD best practices
Experience in managing workflows using Agile practices
Ability to write clear, concise documentation and to communicate generally with a high degree of precision
Ability to solve ambiguous problems independently
Ability to manage multiple projects and time constraints simultaneously
Care for the quality of the input data and how the processed data is ultimately interpreted and used
Experience with digital products, streaming services, or subscription products is preferred
Strong written and verbal communication skills
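To illustrate the slowly changing dimension concept listed among the qualifications, here is a minimal Type 2 merge sketched in pure Python. The record shape and field names are hypothetical; in a real warehouse this logic would typically live in SQL or a dbt snapshot:

```python
from datetime import date

def scd2_merge(dimension, incoming, today=None):
    """Apply a Type 2 slowly changing dimension merge in place.

    dimension: list of dicts with keys id, value, valid_from, valid_to, is_current.
    incoming:  dict of {id: value} representing the latest source snapshot.
    Changed rows are closed out and a new current version is appended,
    preserving full history instead of overwriting (as Type 1 would).
    """
    today = today or date.today().isoformat()
    current = {row["id"]: row for row in dimension if row["is_current"]}
    for key, value in incoming.items():
        row = current.get(key)
        if row is not None and row["value"] == value:
            continue  # unchanged: keep the existing current row
        if row is not None:
            row["valid_to"] = today      # close out the superseded version
            row["is_current"] = False
        dimension.append({
            "id": key, "value": value,
            "valid_from": today, "valid_to": None, "is_current": True,
        })
    return dimension

# Customer 1 changes from "A" to "B"; customer 2 is new.
dim = [{"id": 1, "value": "A", "valid_from": "2024-01-01",
        "valid_to": None, "is_current": True}]
scd2_merge(dim, {1: "B", 2: "C"}, today="2024-06-01")
```

After the merge, the dimension holds three rows: the closed-out history row for customer 1, its new current row, and the new row for customer 2 — the same shape a dbt snapshot or a `MERGE`-based SCD2 model would produce.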