JOB DESCRIPTION – DATA ENGINEER II
We are looking for an experienced Data Engineer with broad technical skills and the ability to work with large amounts of data. You will collaborate with the Game teams to implement data strategies and develop complex ETL pipelines that support dashboards, promoting a deeper understanding of our games. You will have experience developing and establishing scalable, efficient, automated processes for large-scale data analyses, and you will stay informed of the latest trends and research across all aspects of data engineering and analytics. You will work with leaders from an internal Game Studio, providing them with data for game and player insights, and report to the Technical Lead for this group.
Key Responsibilities:
- As a Data Engineer, you will be involved in the entire development life cycle, from brainstorming ideas to implementing elegant solutions that yield data insights.
- You will gather requirements, then model and design solutions to support reporting, analytics, and exploratory analysis.
- You will implement efficient, scalable, and reliable data pipelines to move and transform data.
- You will work with analysts to understand requirements and develop technical specifications for ETLs, including documentation.
- You will support production code to produce comprehensive and accurate datasets.
- You will guide communications between our users and studio engineers to provide scalable end-to-end solutions.
- You will promote strategies to improve our data modelling, quality, and architecture.
- You will work with big data solutions and data modelling, and understand ETL pipelines and dashboard tools.
- You will explore data and suggest new opportunities to measure and assess the performance of our marketing and commercial actions.
Skill sets:
Primary: Python/Java, DBT, Airflow, Snowflake
Secondary: Other data technologies
Required qualifications:
- 4+ years of relevant industry experience in a data engineering role and a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field
- Proficiency in writing SQL queries and knowledge of cloud-based databases such as Snowflake, Redshift, BigQuery, or other big data solutions
- Experience in data modelling, ETL processes, and data warehousing
- Experience with at least one programming language such as Python or Java
- Knowledge of modern data pipeline tools such as Airflow
- Comfort working with a multi-functional team, both local and remote, and understanding the perspectives of each partner
- Experience working in an Agile development environment and familiarity with process management tools such as JIRA, Targetprocess, Trello, or similar