Senior Big Data Engineer (#2447)


Job Description

N-iX seeks a Senior Big Data Engineer to design and develop data pipelines and ETL processes using tools like Apache Airflow, NiFi, and dbt. Responsibilities include managing large-scale data processing frameworks, collaborating with data scientists, optimizing data storage (SQL/NoSQL), working with streaming technologies (Kafka, Debezium), and participating in all SDLC stages using CI/CD principles (Docker, Kubernetes, GitLab). The role also involves setting up data warehousing solutions and providing technical expertise in data governance and security. The client requires a strategic partner to support the development of robust and scalable applications.


N-iX is looking for a Senior Big Data Engineer to join our team!

Our client is a leading provider of technical services, delivering both standard and custom intranet and internet-based software and applications systems. Due to increasing demand for in-house digital projects, the client is seeking to outsource certain development tasks to strategic partners.

The client is looking to build strategic long-term relationships with leading development partners to accelerate business growth through high-quality and cost-efficient software development. The selected partner(s) will support the development of robust and scalable Consumer and Enterprise applications.

Responsibilities:

  • Design and develop data pipelines and ETL processes using tools such as Apache Airflow, NiFi, and dbt.
  • Implement and manage large-scale, distributed data processing frameworks.
  • Collaborate closely with data scientists to ensure data availability and quality for machine learning models.
  • Optimize performance of data storage solutions, including SQL/Relational (PostgreSQL, Oracle) and NoSQL (MongoDB, Redis) databases.
  • Work with streaming and CDC technologies like Kafka and Debezium to facilitate real-time data processing.
  • Participate in all stages of SDLC from conceptualization to deployment, focusing on CI/CD principles using Docker, Kubernetes, and GitLab.
  • Set up and manage data warehousing solutions (PostgreSQL, Oracle, ClickHouse) to support analytical requirements.
  • Provide technical expertise and best practices for data engineering solutions, including data governance and security.
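As a rough illustration of the pipeline work listed above, the extract-transform-load pattern that tools like Airflow and dbt orchestrate can be sketched in plain Python. This is a minimal sketch, not part of the role's actual stack: the table and column names are invented, and SQLite stands in for a production warehouse.

```python
import sqlite3

def extract(conn):
    # Extract: read raw event rows from a source table
    return conn.execute("SELECT user_id, amount FROM raw_events").fetchall()

def transform(rows):
    # Transform: aggregate amounts per user, dropping negative (invalid) rows
    totals = {}
    for user_id, amount in rows:
        if amount >= 0:
            totals[user_id] = totals.get(user_id, 0) + amount
    return totals

def load(conn, totals):
    # Load: write aggregates into an analytics table
    conn.execute("CREATE TABLE IF NOT EXISTS user_totals "
                 "(user_id TEXT PRIMARY KEY, total REAL)")
    conn.executemany("INSERT OR REPLACE INTO user_totals VALUES (?, ?)",
                     totals.items())
    conn.commit()

# Demo: an in-memory database stands in for both source and warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_events VALUES (?, ?)",
                 [("a", 10.0), ("a", 5.0), ("b", -1.0), ("b", 3.0)])
load(conn, transform(extract(conn)))
print(dict(conn.execute("SELECT user_id, total FROM user_totals")))
# → {'a': 15.0, 'b': 3.0}
```

In production the three stages would typically run as separate, scheduled, retryable tasks (e.g. Airflow operators) rather than one in-process call chain.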

Requirements:

  • Proven experience (5+ years) as a Big Data Engineer or similar role.
  • Strong expertise in ETL tools: Apache Airflow, NiFi, Huawei ETL, Oracle Data Integrator (ODI), and dbt.
  • Experience with streaming and change data capture tools: Kafka, Debezium.
  • Proficiency in SQL and NoSQL databases, including PostgreSQL, Oracle, MongoDB, and Redis.
  • Experience with data warehousing solutions like ClickHouse, PostgreSQL, and Oracle.
  • Familiarity with cloud-based data processing environments and containerization tools such as Docker and Kubernetes.
  • Excellent problem-solving and analytical skills, with attention to detail.
  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • Upper-Intermediate level of English.
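The streaming and change-data-capture requirement refers to processing change events such as those Debezium publishes to Kafka. As a hedged sketch, the envelope fields below (`op`, `before`, `after` inside `payload`) follow Debezium's event format, but the payload values and the in-memory "replica" are invented for illustration:

```python
import json

# A Debezium-style change event: "op" is "c" (create), "u" (update),
# "d" (delete), or "r" (snapshot read); "before"/"after" hold row state.
# The row values here are made up for this sketch.
event_json = """
{"payload": {"op": "u",
             "before": {"id": 1, "email": "old@example.com"},
             "after":  {"id": 1, "email": "new@example.com"}}}
"""

def apply_change(table, event):
    # Apply one change event to an in-memory replica keyed by primary key
    payload = event["payload"]
    if payload["op"] == "d":
        table.pop(payload["before"]["id"], None)
    else:  # create, update, or snapshot read
        row = payload["after"]
        table[row["id"]] = row
    return table

replica = {1: {"id": 1, "email": "old@example.com"}}
apply_change(replica, json.loads(event_json))
print(replica[1]["email"])  # → new@example.com
```

In a real deployment these events would arrive via a Kafka consumer and be applied to a downstream store, with attention to ordering and idempotency.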

Nice to Have:

  • Experience with private cloud setups.
  • Familiarity with data science and machine learning frameworks like TensorFlow, PyTorch.
  • Knowledge of data visualization tools: QlikView, Tableau, Apache Superset, Grafana.

We offer:

  • Flexible working format: remote, office-based, or a mix of both
  • A competitive salary and good compensation package
  • Personalized career growth
  • Professional development tools (mentorship program, tech talks and trainings, centers of excellence, and more)
  • Active tech communities with regular knowledge sharing
  • Education reimbursement
  • Memorable anniversary presents
  • Corporate events and team buildings
  • Other location-specific benefits
