Senior Data Engineer

3 days ago • 5+ years

Job Summary

Job Description

As a Senior Data Engineer specializing in Databricks, you will set up and maintain the Databricks platform, design, build, and maintain robust data pipelines, and collaborate with solution architects and data scientists. You will debug complex pipeline issues, enhance system performance, set up Databricks environments on cloud platforms, automate processes, create documentation, and develop integration tests. The role requires a deep understanding of data governance and management strategies, and the team is looking for someone who can collaborate with international teams to develop scalable data solutions while ensuring data quality and reliability.
Must have:
  • 5+ years of experience as a Data Engineer.
  • Advanced proficiency in Python for data pipelines.
  • Hands-on experience with Databricks platform.
  • Strong knowledge of Apache Spark.
  • Familiarity with Azure or AWS cloud environments.
  • Proficiency with SQL and relational databases.
  • Experience with Airflow or similar tools.
  • Understanding of CI/CD pipelines.
  • Solid skills in debugging data pipeline issues.
  • Proficiency in structured documentation practices.
  • B2 level or higher proficiency in English.
  • Strong collaboration skills.
Good to have:
  • Experience with Docker and Kubernetes.
  • Familiarity with Elasticsearch or other vector databases.
  • Understanding of DBT (data build tool).
  • Ability to travel abroad twice a year.
Perks:
  • Flexible working format (remote, office-based, or flexible).
  • Competitive salary and good compensation package.
  • Personalized career growth.
  • Professional development tools.
  • Active tech communities with regular knowledge sharing.
  • Education reimbursement.
  • Memorable anniversary presents.
  • Corporate events and team buildings.
  • Other location-specific benefits.

Job Details

We are seeking a Senior Data Engineer specializing in Databricks to join our global team. You will be instrumental in setting up and maintaining our Databricks platform, building robust data pipelines, and collaborating closely with our solution architects and data scientists. Your expertise will directly support our mission to leverage data and AI effectively within a cutting-edge automotive claims management environment.

Key Responsibilities:

  • Design, build, and maintain robust data pipelines within Databricks.
  • Collaborate closely with international teams, including data scientists and architects, to develop scalable data solutions.
  • Debug complex issues in data pipelines and proactively enhance system performance and reliability.
  • Set up Databricks environments on cloud platforms (Azure/AWS).
  • Automate processes using CI/CD practices and infrastructure tools such as Terraform.
  • Create and maintain detailed documentation, including workflows and operational checklists.
  • Develop integration and unit tests to ensure data quality and reliability.
  • Migrate legacy data systems to Databricks, ensuring minimal disruption.
  • Participate actively in defining data governance and management strategies.

What We Expect from You (Requirements):

  • 5+ years of proven experience as a Data Engineer.
  • Advanced proficiency in Python for developing production-grade data pipelines.
  • Extensive hands-on experience with Databricks platform.
  • Strong knowledge of Apache Spark for big data processing.
  • Familiarity with cloud environments, specifically Azure or AWS.
  • Proficiency with SQL and experience managing relational databases (MS SQL preferred).
  • Practical experience with Airflow or similar data orchestration tools.
  • Strong understanding of CI/CD pipelines and experience with tools like GitLab.
  • Solid skills in debugging complex data pipeline issues.
  • Proficiency in structured documentation practices.
  • B2 level or higher proficiency in English.
  • Strong collaboration skills, ability to adapt, and eagerness to learn in an international team environment.

Nice to have:

  • Experience with Docker and Kubernetes.
  • Familiarity with Elasticsearch or other vector databases.
  • Understanding of DBT (data build tool).
  • Ability to travel abroad twice a year for on-site workshops.

Why Join Us

  • Work on impactful projects with cross-functional teams.
  • Opportunity to grow your BI and analytics career in a data-driven organization.
  • Flexible working hours and remote work options.
  • Competitive compensation and benefits.
  • Opportunity to work on presales.

We offer*:

  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits

*not applicable for freelancers
