Data Engineer (with Databricks)

6 months ago • 3-4 years
Data Analysis

Job Description

Enhance a robust data pipeline for a SaaS product, ensuring data contextualization, validation, and ingestion. Collaborate on building data quality solutions and self-service tools for data onboarding. Design, build, and maintain data pipelines using Python; conduct in-depth analysis and debugging; develop smart documentation; set up and configure new tenants; write integration tests; and use GitLab and Databricks. The role requires advanced Python skills, experience with data platforms (Databricks preferred), relational databases, and SQL.
Good To Have:
  • Docker and Kubernetes
  • Document/graph databases
  • Azure familiarity
Must Have:
  • 3-4 years Data Engineering experience
  • Advanced Python proficiency
  • Data pipeline design & maintenance
  • Databricks experience
  • SQL proficiency
  • Testing frameworks
  • GitLab experience
Perks:
  • Flexible working format
  • Competitive salary
  • Personalized career growth
  • Professional development tools
  • Education reimbursement
  • Corporate events

Join our team to work on enhancing a robust data pipeline that powers our SaaS product, ensuring seamless contextualization, validation, and ingestion of customer data. Collaborate with product teams to unlock new user experiences by leveraging data insights. Engage with domain experts to analyze real-world engineering data and build data quality solutions that inspire customer confidence. Additionally, identify opportunities to develop self-service tools that streamline data onboarding and make it more accessible for our users.

Our Client was established with the mission to fundamentally transform the execution of capital projects and operations. Designed by industry experts for industry experts, Client’s platform empowers users to digitally search, visualize, navigate, and collaborate on assets. Drawing on 30 years of software expertise and 180 years of industrial legacy as part of the renowned Scandinavian business group, Client plays an active role in advancing the global energy transition. The company operates from Norway, the UK, and the U.S.

Key Responsibilities:

  • Design, build, and maintain data pipelines using Python
  • Collaborate with an international team to develop scalable data solutions
  • Conduct in-depth analysis and debugging of system bugs (Tier 2) 
  • Develop and maintain smart documentation for process consistency, including the creation and refinement of checklists and workflows
  • Set up and configure new tenants, collaborating closely with team members to ensure smooth onboarding
  • Write integration tests to ensure the quality and reliability of data services
  • Work with GitLab to manage code and collaborate with team members
  • Utilize Databricks for data processing and management (a brief illustrative sketch follows this list)
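
To give a concrete flavor of the pipeline work described above, here is a minimal sketch of an ingest-validate-write step, assuming PySpark with Delta Lake on Databricks; the paths, column names, and tenant identifiers are illustrative placeholders, not details from the role.

```python
# Minimal sketch only: paths, columns, and tenant names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already available as `spark`;
# getOrCreate() keeps the sketch runnable elsewhere too.
spark = SparkSession.builder.appName("tenant-ingestion-sketch").getOrCreate()

# Ingest raw tenant data from a landing zone (placeholder path).
raw = spark.read.json("/mnt/landing/tenant_a/events/")

# Contextualize and validate: stamp ingestion time, flag well-formed rows.
validated = (
    raw.withColumn("ingested_at", F.current_timestamp())
       .withColumn("is_valid", F.col("asset_id").isNotNull() & (F.col("value") >= 0))
)

# Valid rows continue down the pipeline; the rest are quarantined for Tier 2 analysis.
validated.filter("is_valid").write.mode("append").format("delta").save("/mnt/silver/tenant_a/events")
validated.filter("NOT is_valid").write.mode("append").format("delta").save("/mnt/quarantine/tenant_a/events")
```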

Requirements:

  • Experience: Minimum of 3-4 years as a data engineer or in a related field
  • Python Proficiency: Advanced experience with Python, particularly in delivering production-grade data pipelines and troubleshooting code-level bugs
  • Data Skills: A structured approach to deriving insights from data
  • Cloud: Familiarity with cloud platforms (preferably Azure)
  • Data Platforms: Experience with Databricks, Snowflake, or similar data platforms
  • Database Skills: Knowledge of relational databases, with proficiency in SQL
  • Big Data: Experience using Apache Spark
  • Documentation: Experience creating and maintaining structured documentation
  • Testing: Proficiency with testing frameworks to ensure code reliability and maintainability (a brief example follows this list)
  • Version Control: Experience with GitLab or equivalent tools
  • English Proficiency: B2 level or higher
  • Interpersonal Skills: Strong collaboration abilities, experience working in an international team, a willingness to learn new skills and tools, and an adaptive, exploratory mindset
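
As an example of the testing expectation above, here is a minimal pytest sketch that exercises a pipeline step end to end against a local SparkSession; `add_validity_flag` is a hypothetical stand-in for a real pipeline function, not something named in the posting.

```python
# Hypothetical example: `add_validity_flag` stands in for a real pipeline step.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def add_validity_flag(df):
    # Flag rows with a non-null asset_id and a non-negative value.
    return df.withColumn(
        "is_valid", F.col("asset_id").isNotNull() & (F.col("value") >= 0)
    )

@pytest.fixture(scope="session")
def spark():
    # A local single-threaded session keeps the test self-contained.
    return SparkSession.builder.master("local[1]").appName("pipeline-tests").getOrCreate()

def test_add_validity_flag(spark):
    df = spark.createDataFrame(
        [("a1", 10.0), (None, 5.0), ("a2", -1.0)],
        ["asset_id", "value"],
    )
    flags = [row.is_valid for row in add_validity_flag(df).collect()]
    assert flags == [True, False, False]
```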

Nice to have:

  • Experience with Docker and Kubernetes
  • Experience with document and graph databases
  • Ability to travel abroad twice a year for on-site workshops

We offer:

  • Flexible working format - remote, office-based or flexible
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits
