Experienced Data Engineer - Streaming Platform


Job Description

Voodoo's Ad-Network Team seeks an experienced Data Engineer to build, maintain, and optimize real-time data pipelines that process bid requests, impressions, clicks, and user engagement data. The role involves developing scalable solutions with Apache Flink, Spark Structured Streaming, or similar frameworks; integrating OpenRTB signals; and ensuring high throughput, low latency, and fault tolerance. Responsibilities include writing clean code (Java, Scala, or Python), working with cloud messaging platforms (GCP Pub/Sub, AWS Kinesis, etc.), managing event schemas (Protobuf, Avro), implementing monitoring and alerting, and improving data infrastructure. The position requires on-site presence 3 days/week in Paris.
Must Have:
  • 3-5 years real-time streaming data engineering experience
  • Apache Flink/Spark Structured Streaming expertise
  • Java/Scala/Python programming skills (distributed systems)
  • Experience with event streaming platforms (Kafka, etc.)
  • Event schema management (Avro, Protobuf)
  • Kubernetes and cloud deployment experience
  • CI/CD and infrastructure-as-code (Terraform, Docker, Helm)


Founded in 2013, Voodoo is a tech company that creates mobile games and apps with a mission to entertain the world. With 800 employees, 7 billion downloads, and over 200 million active users, Voodoo is the #3 mobile publisher worldwide by downloads, after Google and Meta. Our portfolio includes chart-topping games like Mob Control and Block Jam, alongside popular apps such as BeReal and Wizz.

Team

The Engineering & Data team builds innovative tech products and platforms to support the impressive growth of Voodoo's gaming and consumer apps, keeping the company at the forefront of the mobile industry.
Within the Data team, you'll join the Ad-Network Team, an autonomous squad of around 30 people. The team is composed of top-tier software engineers, infrastructure engineers, data engineers, mobile engineers, and data scientists (including 3 Kaggle Masters). Its goal is to enable Voodoo to monetize its inventory directly with advertising partners, relying on advanced technological solutions to optimize advertising in a real-time bidding environment. It is a strategic topic with significant impact on the business.

This role requires being on-site 3 days/week and is based in Paris.

Role

    • Build, maintain, and optimize real-time data pipelines to process bid requests, impressions, clicks, and user engagement data.
    • Develop scalable solutions using tools like Apache Flink, Spark Structured Streaming, or similar stream processing frameworks.
    • Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines and ensure smooth data flow across systems.
    • Ensure data pipelines handle high-throughput, low-latency, and fault-tolerant processing in real-time.
    • Write clean, well-documented code in Java, Scala, or Python for distributed systems.
    • Work with cloud-native messaging and event platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka to ensure reliable message delivery.
    • Assist in the management and evolution of event schemas (Protobuf, Avro), including data consistency and versioning.
    • Implement monitoring, logging, and alerting for streaming workloads to ensure data integrity and system health.
    • Continuously improve data infrastructure for better performance, cost-efficiency, and scalability.

Profile (Must have)

    • 3-5+ years of experience in data engineering, with a strong focus on real-time streaming systems.
    • Familiarity with stream processing tools like Apache Flink, Spark Structured Streaming, Beam, or similar frameworks.
    • Solid programming experience in Java, Scala, or Python, especially in distributed or event-driven systems.
    • Experience working with event streaming and messaging platforms like GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka.
    • Hands-on knowledge of event schema management, including tools like Avro or Protobuf.
    • Understanding of real-time data pipelines, with experience handling large volumes of event-driven data.
    • Comfortable working in Kubernetes for deploying and managing data processing workloads in cloud environments (AWS, GCP, etc.).
    • Exposure to CI/CD workflows and infrastructure-as-code tools such as Terraform, Docker, and Helm.

