We’re looking for a self-starting and enthusiastic Python Developer with an interest in specialising in data platforms and the surrounding tooling ecosystem.
With over 65 petabytes of data and 130 billion events generated every day, data at King is at the heart of the business. The Game Platform Data organisation at King, split across Barcelona, London and Stockholm, is responsible for ingesting, processing and exposing all the data we receive each day, and for using it to drive King's data, analytics and machine learning needs.
You will be part of the Data Foundations team, in charge of providing data platforms and services used by the whole company. Our mission is to enable data professionals across King to work efficiently, safely, and at scale by building reliable, integrated data platforms and tools. You’ll take ownership of projects and accountability for their outcomes, spot opportunities for platform improvements, and drive our data systems to the next level. Your work will directly support hundreds of internal data practitioners, enabling them to build, experiment, and deliver insights faster.
Your Role Within Our Kingdom
- Work in a highly collaborative team to devise solutions to data-systems, tooling, and engineering problems, bringing your skills and ideas to every discussion.
- Build, operate, and maintain components of our data platform, including our Airflow-based orchestration framework, data catalog, Looker, interactive data-science notebook environment, and a proprietary data notification system built with Python, GCP Pub/Sub, and Datastore.
- Use your expertise in Python to build and enhance services, frameworks, and developer tooling — improving reliability, observability, and developer productivity.
- Empower data engineers and scientists across King by delivering high-quality, developer-friendly tools and services that simplify workflows and promote best practices.
- Collaborate with teams across the organisation to ensure consistent, coherent growth of our collective data capabilities.
- Design and maintain reusable Python libraries and frameworks for internal use, ensuring consistency, reliability, and ease of integration.
- Take ownership of and responsibility for the quality of your solutions: test them thoroughly and ensure best practices are followed throughout the development process, with a focus on maintainability, cost efficiency, and scalability.
- Drive improvements in developer workflows, including automation, CI/CD pipelines, and infrastructure as code.
- Participate in code reviews, documentation, and knowledge-sharing sessions, contributing to a strong engineering community and culture of continuous learning.
- Partner with stakeholders and user groups to understand pain points and design solutions that enhance their workflows and productivity.
- Stay curious — explore and learn new technologies to help the team achieve success.
Skills to Create Thrills
- Expertise in Python, including application design patterns, frameworks, TDD, API design, and best practices.
- Proven experience with GCP, AWS, or Azure.
- Experience working with Terraform and CI/CD pipelines.
- Exposure to Kubernetes, Helm, and Docker.
- Experience building internal tools, SDKs, or reusable libraries that improve developer productivity.
- Ability to learn quickly in a rapidly changing environment.
- Strong collaboration and communication skills, with a willingness to share knowledge and contribute to code reviews.
- Curious, proactive mindset with a drive to improve systems and workflows.
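As a flavour of the reusable internal tooling this role involves, here is a minimal sketch of the kind of small library utility an internal Python toolkit might provide: a retry decorator with exponential backoff. All names here are hypothetical, not an actual King API.

```python
import functools
import time


def retry(max_attempts=3, base_delay=0.1, exceptions=(Exception,)):
    """Retry a function with exponential backoff.

    Hypothetical example of a reusable helper an internal
    Python tooling library might expose to simplify workflows.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == max_attempts:
                        raise  # out of attempts: surface the error
                    # back off: base_delay, 2*base_delay, 4*base_delay, ...
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator
```

A caller would simply decorate a flaky operation, e.g. `@retry(max_attempts=5)` on a function that calls an external service.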
Bonus Skills
- Contributions to open source projects.
- SQL and data modelling.
- In-depth knowledge of Airflow components, architecture, and best practices in DAG development (including advanced topics like DAG factories).
- Familiarity with BigQuery or similar cloud data warehouse platforms (e.g. Snowflake, Redshift).
- Experience with GCP-based services such as Cloud Run, Datastore, Dataflow, Cloud Functions, and Pub/Sub.
- Understanding of data science workflows.
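To illustrate the "DAG factory" idea mentioned above: instead of hand-writing one Airflow DAG per pipeline, DAGs are generated programmatically from configuration. The sketch below keeps the pattern in plain Python (table names and DAG IDs are hypothetical); the Airflow-specific step of materialising each spec into a real `airflow.DAG` is shown only in the comments.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class DagSpec:
    """Configuration for one generated ingestion DAG (illustrative only)."""
    dag_id: str
    schedule: str
    source_table: str


def build_ingestion_specs(tables, schedule="@daily"):
    """Expand one DagSpec per source table.

    In an actual Airflow DAG-factory module, each spec would then be
    materialised along the lines of:

        with DAG(spec.dag_id, schedule=spec.schedule) as dag:
            ...  # define ingestion tasks for spec.source_table
        globals()[spec.dag_id] = dag  # so the scheduler discovers it
    """
    return [
        DagSpec(dag_id=f"ingest_{table}", schedule=schedule, source_table=table)
        for table in tables
    ]
```

Adding a pipeline then becomes a one-line config change rather than a new DAG file, which is the main appeal of the factory pattern.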