Staff Data Engineer

Game District

Job Summary

Game District is seeking an experienced Staff Data Engineer in Lahore. This senior individual-contributor role involves working on complex, large-scale data systems and shaping the technical direction of a data platform used by game teams and data scientists. The role focuses on designing, building, and operating scalable, reliable, and cost-efficient data systems for live operations, experimentation, and monetisation, while also mentoring other data engineers.

Job Description

About the Role

We are looking for an experienced Staff Data Engineer to join our team in Lahore. This is a senior individual-contributor role for someone who enjoys working on complex, large-scale data systems and helping shape the technical direction of a data platform used by game teams and data scientists.

You will work closely with engineers, data scientists, and product partners to build scalable, reliable, and cost-efficient data systems that support live operations, experimentation, and monetisation. You will also play an important role in mentoring others and raising the overall engineering bar.

What You Will Do

  • Design, build, and operate large-scale event ingestion and processing pipelines that handle billions of events.
  • Help define and evolve the architecture of our data platform, including lakehouse, warehouse, and OLAP systems.
  • Improve performance, reliability, and cost efficiency across storage, compute, and query layers.
  • Design and maintain high-quality data models, aggregations, and materialised views for analytics and experimentation.
  • Set best practices for data quality, testing, observability, and operational excellence.
  • Work closely with game teams to understand their data needs and turn them into practical technical solutions.
  • Provide technical guidance and mentorship to other data engineers.
  • Collaborate with data science and backend teams to enable advanced analytics and machine learning use cases.
  • Communicate clearly and openly with teammates and stakeholders in English.
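
As a miniature illustration of the aggregation work described above, the sketch below rolls raw purchase events up into daily per-game revenue totals. The event shape and field names are hypothetical, and a production version of this would run in Spark over billions of events rather than in plain Python:

```python
from collections import defaultdict
from datetime import datetime, timezone

def daily_revenue_rollup(events):
    """Aggregate raw purchase events into (day, game_id) -> revenue totals."""
    totals = defaultdict(float)
    for e in events:
        # Bucket each event by its UTC calendar day.
        day = datetime.fromtimestamp(e["ts"], tz=timezone.utc).date().isoformat()
        totals[(day, e["game_id"])] += e["amount_usd"]
    return dict(totals)

# Toy events: two purchases on one day for game g1, one the next day for g2.
events = [
    {"ts": 1700000000, "game_id": "g1", "amount_usd": 4.99},
    {"ts": 1700000500, "game_id": "g1", "amount_usd": 0.99},
    {"ts": 1700090000, "game_id": "g2", "amount_usd": 9.99},
]
rollup = daily_revenue_rollup(events)
```

The same shape of computation, expressed as a grouped aggregation, is what would back a materialised daily-metrics table.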

What We Are Looking For

Experience

  • More than ten years of experience in data engineering or a closely related field.
  • Strong hands-on experience building and running very large-scale event processing pipelines, both batch and streaming.
  • A track record of owning and operating production data platforms in high-scale environments.

Technical Skills

  • Expert-level experience with Apache Spark, including performance tuning, query execution, memory management, cluster sizing, and both structured streaming and batch workloads.
  • Strong experience with Apache Iceberg, including table design, partitioning strategies, compaction, snapshot management, and schema evolution.
  • Solid experience with Apache Airflow, including DAG design at scale, backfills, retries, and operational reliability.
  • Deep experience with one or more OLAP databases such as ClickHouse, BigQuery, Redshift, Snowflake, or similar.
  • Strong knowledge of query optimisation, indexing and partitioning strategies, and materialised views.
  • Excellent SQL skills for complex analytical workloads.
  • Good understanding of modern data warehouse and lakehouse architectures.
  • Deep understanding of AWS and its data-related services, including S3, EC2, EMR, EKS, Glue, Athena, IAM, networking, security, and cost optimisation.
  • AWS Certified Solutions Architect certification is required.
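
The Airflow points above (DAG design, backfills, idempotent reruns) come down to treating each date partition as an independent unit of work. A minimal, Airflow-free sketch of that idea, using only the standard library:

```python
from datetime import date, timedelta

def backfill_partitions(start, end):
    """Yield ISO date keys for every daily partition in [start, end], inclusive.

    A backfill then reprocesses each partition independently, so retries and
    reruns overwrite a single day's output rather than duplicating it.
    """
    d = start
    while d <= end:
        yield d.isoformat()
        d += timedelta(days=1)

parts = list(backfill_partitions(date(2024, 1, 30), date(2024, 2, 2)))
# parts == ["2024-01-30", "2024-01-31", "2024-02-01", "2024-02-02"]
```

In Airflow itself the scheduler supplies these logical dates; the sketch only shows the partition-per-run contract that makes backfills safe.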

Domain Knowledge

  • Comfort working with game teams to understand and model player events, telemetry, live operations, experimentation, and monetisation data.
  • Experience working with high-volume, low-latency event data is a strong plus.

Communication and Collaboration

  • Comfortable communicating clearly in English, both in writing and conversation.
  • Enjoys collaborating across teams and explaining complex ideas in a clear, practical way.
  • Experience mentoring others and contributing to a supportive engineering culture.

Nice to Have

  • Experience with experimentation frameworks and A/B testing analytics.
  • Experience with near-real-time or real-time analytics systems.
  • Experience with large-scale cost optimisation.
  • Exposure to machine learning feature pipelines or data science platforms.
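
The kind of statistic an A/B testing pipeline ultimately serves can be sketched with a two-proportion z-test in pure Python; the conversion counts below are made up for illustration:

```python
from math import sqrt, erf

def ab_conversion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing conversion rates of variants A and B.

    Returns (z, two_sided_p) using the pooled-proportion standard error.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical experiment: 5.0% vs 6.9% conversion on 2,400 players each.
z, p = ab_conversion_z(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
```

In practice the counts would come from the event pipeline's per-variant aggregates, and the test itself from an experimentation framework rather than hand-rolled code.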

What We Offer

  • A chance to work on challenging data problems at scale.
  • A high-impact role with real ownership and influence over technical direction.
  • A collaborative, supportive team environment.
  • Competitive compensation and benefits.

8 Skills Required For This Role

Game Texts, Live Operations, Networking, AWS, Spark, Data Science, SQL, Machine Learning
