Description
Exusia, a cutting-edge digital transformation consultancy, is looking for top talent in the Data Engineering space with specific skills in Ab Initio / Azure Data Engineering services to join our global delivery team's Industry Analytics practice.
What’s the Role?
This is a full-time role working with Exusia's clients to design, develop, and maintain large-scale data engineering solutions. The right candidates will also have the opportunity to work across the entire data landscape, including Data Governance and Metadata Management, and will work closely with client stakeholders to capture requirements and to design and implement Analytical Reporting, Compliance, and Data Governance solutions.
Qualifications & Role Responsibilities
- Master of Science degree (preferably in Computer and Information Sciences or Business Information Technology) or an Engineering degree in a related field.
- A minimum of 4 years' experience in the Data Management, Data Engineering, and Data Governance space, with hands-on project experience using Ab Initio, PySpark, Databricks, and SAS
- Experience working on large data initiatives, with exposure to a range of ETL / data engineering tools
- Work with business stakeholders to gather and analyze business requirements, building a solid understanding of the Data Analytics and Data Governance domain
- Document, discuss and resolve business, data and reporting issues within the team, across functional teams, and with business stakeholders
- Ability to work independently and produce solution designs
- Build optimized data processing and data governance solutions using the given toolset
- Collaborate with delivery leadership to deliver projects on time while adhering to quality standards
Requirements
Mandatory Skills:
- Must have strong Data Warehousing / Data Engineering foundational skills with exposure to different types of data architecture
- Strong conceptual understanding of Data Management & Data Governance principles
- Hands-on experience with:
  - Ab Initio, including GDE, Express>IT, Conduct>IT, and Metadata Hub
  - Databricks, with fluency in PySpark and Spark SQL
  - Multiple databases such as Oracle, SQL Server, and Netezza, as well as cloud-hosted data warehouses such as Snowflake, Redshift, Synapse, and BigQuery
  - Azure services relevant to data engineering: ADF, Databricks, Synapse Analytics
- Experience working in an agile software delivery model is required.
- Prior data modelling experience is mandatory, preferably for DWH / data marts / Lakehouse architectures
- Discuss and document data and analytics requirements with cross-functional teams and business stakeholders
- Analyze requirements and produce technical specifications, source-to-target mappings, and data models
- Manage changing priorities during the software development lifecycle (SDLC)
- Transform business/functional requirements into technical specifications
- Azure certification relevant to Data Engineering/Analytics
- Experience and knowledge of one or more domains within Banking and Financial Services
Nice-to-Have Skills:
- Exposure to tools like Talend, Informatica, or SAS for data processing
- Prior experience in converting Talend/Informatica/Mainframe based data pipelines to Ab Initio will be a big plus
- Data validation and testing using SQL or any tool-based testing methods
- Experience with reporting/visualization tools such as Power BI
- Exposure to Data Governance projects including Metadata Management, Data Dictionary, Data Glossary, Data Lineage and Data Quality aspects.