As a Data Engineer, you will take ownership of building, maintaining, and optimizing scalable data products, pipelines, and platforms that support TVH's business objectives. You will collaborate closely with cross-functional teams to design and implement data solutions that meet both functional and technical needs, with a strong focus on data quality, performance, and governance. This role demands hands-on technical expertise combined with effective stakeholder communication to ensure seamless data flow and integration across systems.
Data Pipeline Development & Maintenance
Independently design, build, and optimize scalable, fault-tolerant data pipelines for batch and real-time data processing.
Develop and manage data assets as data products: curated, documented, reusable, and governed for business, analytical, and operational consumption.
Automate ETL/ELT workflows using orchestration tools (e.g., Apache Airflow).
Contribute to CI/CD pipeline development, infrastructure-as-code, and platform automation for robust and scalable deployments.
Collaborate with ML Engineers and Data Scientists to ensure data pipelines and features meet ML use-case requirements.
Data Modeling
Develop and maintain data models aligned with business requirements, collaborating with data architects and analysts to ensure usability and consistency.
Continuous Improvement
Identify bottlenecks or inefficiencies in data processes and pipelines; assist in drafting and implementing improvement plans.
Contribute to improving data engineering standards, tooling, and best practices.
Data Governance
Create and maintain high-quality technical documentation covering data pipelines, integration flows, data models, and operational procedures.
Provide and collect metadata to support data lineage and classification.
Contribute to organizational standards and guidelines documentation.
Stakeholder Management
Collaborate with data owners, data maintenance teams, business users, and other stakeholders to ensure data quality and alignment with business needs.
Facilitate communication and feedback loops to promote shared understanding of data solutions and issues.
Project Participation
Contribute actively to the planning, coordination, and execution of assigned data engineering projects, supporting timely delivery within scope and quality standards.
Mentoring & Collaboration
Support junior colleagues in task execution and provide mentorship on complex topics.
Participate in team knowledge sharing sessions to disseminate learnings and insights.
Information Security & Compliance
Implement and monitor data security measures including row-level security and access controls.
Ensure all data solutions comply with data protection, privacy (GDPR), and retention requirements, collaborating with the Governance and Security teams.
Technical Design & Quality
Assist in technical administration of data engineering tools and platforms.
Support optimization of data quality checks and pipeline performance.
Integrations & Reporting
Design, develop, and maintain system integrations using APIs, middleware, or other methods.
Ensure data flow accuracy, reliability, and performance with error handling and monitoring mechanisms.
Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
Proven hands-on experience building and maintaining scalable data pipelines and ETL/ELT workflows.
Proficiency with SQL and at least one programming language such as Python, Java, or Scala.
Experience with data transformation and workflow orchestration tools (e.g., dbt and Airflow).
Experience with data storage solutions (preferably BigQuery, Cloud Storage, Cloud SQL).
Familiarity with cloud platforms (preferably GCP) and their data services.
Strong analytical and problem-solving skills.
Excellent communication and stakeholder collaboration skills.
Knowledge of data modeling concepts and data warehousing principles.
Understanding of data governance, security, and quality best practices.
Experience with real-time streaming technologies (e.g., Kafka, Pub/Sub).
Familiarity with DataOps and DevOps practices.
Knowledge of containerization (Docker) and orchestration (Kubernetes).
Experience with scripting languages for automation (e.g., Bash, Python).
Agile/Scrum experience.
Fluent in English.
You will become part of a people-oriented company where your well-being really matters. We offer flexible working hours, work-from-home options, and 20 days of holiday plus 12 WTR days within a 40-hour week. At our headquarters you will also discover TVH Park, a green area where you can exercise, relax, meet, or have lunch. Furthermore, we also offer:
An attractive salary package with extra-legal benefits such as group and hospitalization insurance, luncheon vouchers, corporate restaurant, company car, ...
An exciting position in an international company with a family atmosphere where people are at the center.
Membership of a dynamic, entrepreneurial, fast-growing team at the center of the company's transformation.
An innovative, progressive, and technology-driven working environment.
Numerous opportunities for personal development, including continuous coaching and professional (internal and external) training courses.
Fun after-work gatherings and other optional events (e.g., TVH Kaffee).
TVH is a global business with a family atmosphere, where people are at the center. We value clarity, mutual respect, kindness, and open communication. Our people are down-to-earth and easy to work and engage with. We welcome differences and celebrate new ideas.
TVH is a parts specialist for quality parts and accessories for material handling, industrial vehicles, and construction and agricultural equipment. Working at TVH is opting for a company that excels as an international market leader and is well-known for its unstoppable craving for innovation.