Data Engineer (Azure) | KD Pharma
GT HQ
Job Summary
GT is seeking a Data Engineer (Azure) for KD Pharma, a CDMO specializing in ultra-pure Omega-3 concentrates. This role involves building and scaling a modern Azure-based data platform for BI, supporting Finance, Operations/Supply Chain, and Quality/Manufacturing. The project focuses on improving data quality, lineage, reliability, and time-to-insight, with a long-term opportunity to own and shape the data platform and grow as a data leader in a global organization.
Must Have
- 4–6 years of data engineering experience
- Own the Azure data platform architecture & roadmap
- Design, build, and operate ETL/ELT pipelines into ADLS/Warehouse
- Model data for Power BI
- Implement data quality, lineage, governance & security
- Partner with BI analysts to deliver reusable, trusted semantic models and dashboards
- Drive reliability & cost optimization
- Support immediate projects: Business Central (ERP + MES), TrackWise (QMS), SAP ECC6 extracts
Good to Have
- Experience with Synapse, Databricks, and Delta Lake
- Knowledge of Microsoft Purview, IaC (Bicep/Terraform)
- Familiarity with ML basics
- Background in regulated manufacturing/pharma (GxP)
Job Description
GT was founded in 2019 by a former Apple, Nest, and Google executive. GT’s mission is to connect the world’s best talent with product careers offered by high-growth companies in the UK, USA, Canada, Germany, and the Netherlands.
On behalf of KD Pharma, GT is looking for a Data Engineer (Azure) interested in building and scaling a modern data platform to support Finance, Operations/Supply Chain, and Quality/Manufacturing functions.
About the Client & the Project
Founded in 1988, KD Pharma is a pure-play, technology-driven CDMO (Contract Development & Manufacturing Organization) dedicated to revolutionizing pharmaceutical and nutraceutical production. They are uniquely recognized for offering ultra-pure Omega-3 concentrates at commercial scale through patented supercritical fluid chromatography technologies.
With sites in Germany, Norway, the UK, the USA, Canada, and Peru, and a sales presence across Asia, they deliver end-to-end solutions from custom synthesis to finished dosage forms while adhering to cGMP and global certifications.
The project is to establish a robust Azure-based data platform for business intelligence (BI). It includes assessing Microsoft Fabric vs. Azure Data Factory (ADF) and, if needed, re-platforming to a scalable ADF-led architecture.
About the Role
This role will focus on improving data quality, lineage, reliability, and time-to-insight, while helping reassess Fabric vs. an Azure Data Factory architecture. This is a high-impact role offering the long-term opportunity to own and shape the Azure data platform and grow as a trusted data leader in a global organization.
Success Measures:
- 0–6 months: Audit current estate, define migration plan, build ADF pipelines for priority sources (Business Central, TrackWise), achieve >98% dataset refresh success, establish baseline data quality checks & lineage.
- 6–24 months: Deliver a consolidated Lakehouse/Warehouse with governed semantic models, optimized for cost & performance, with documented controls and stakeholder CSAT/NPS ≥8/10.
Responsibilities
- Own the Azure data platform architecture & roadmap (ADF vs Fabric; Synapse/Databricks evaluation)
- Design, build, and operate ETL/ELT pipelines into ADLS/Warehouse
- Model data for Power BI (DAX/Tabular)
- Implement data quality, lineage, governance & security (Purview, RBAC, CI/CD)
- Partner with BI analysts to deliver reusable, trusted semantic models and dashboards
- Drive reliability & cost optimization (monitoring, alerting, SLAs)
- Support immediate projects: Business Central (ERP + MES), TrackWise (QMS), SAP ECC6 extracts
Essential knowledge, skills & experience
- Experience Level: 4–6 years
- Strong expertise in Azure Data Factory & Azure Data Lake Gen2
- Advanced SQL/T-SQL
- Power BI (DAX, Tabular modeling, deployment pipelines)
- Python or PySpark
- Git & Azure DevOps (CI/CD pipelines)
- Dimensional modeling
- Security & RBAC
Nice-to-have
- Experience with Synapse, Databricks, and Delta Lake
- Knowledge of Microsoft Purview, IaC (Bicep/Terraform)
- Familiarity with ML basics
- Background in regulated manufacturing/pharma (GxP) — can be learned
Soft Skills
- Strong communication & collaboration abilities
- Pragmatic “architect-builder” mindset — able to balance strategy with hands-on delivery
- Comfort in leading technology choices and engaging stakeholders
- Results-driven with focus on data reliability, governance, and business value
Interview Steps
1. GT interview with Recruiter
2. Technical interview
3. Final interview
4. Reference Check
5. Offer