IN_Senior Associate_Azure DE + Databricks Developer_Data & Analytics_Advisory_PAN India

About the job

Summary

This role requires a Senior Associate with 5-10 years of experience in Azure Data Engineering and Databricks development. Must-have skills include PySpark, Azure Data Lake Services, Databricks, and Unity Catalog. The role focuses on Data Lake and Lakehouse implementation, data processing, management, integration, and storage.
Must have:
  • PySpark & Python
  • Azure Data Lake
  • Databricks Platform
  • Unity Catalog
Good to have:
  • Delta Live Tables
  • Azure Fabric
  • SAP & Dynamics
  • Databricks Engineer
Perks:
  • Vibrant Community
  • Inclusive Benefits

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.



Responsibilities:

1. Data Lake and Lakehouse Implementation:
o Design, implement, and manage Data Lake and Lakehouse architectures. (Must have)
o Develop and maintain scalable data pipelines and workflows. (Must have)
o Utilize Azure Data Lake Services (ADLS) for data storage and management. (Must have)
o Knowledge of the Medallion architecture and the Delta format. (Must have)
2. Data Processing and Transformation:
o Use PySpark for data processing and transformations. (Must have)
o Implement Delta Live Tables for real-time data processing and analytics. (Good to have)
o Ensure data quality and consistency across all stages of the data lifecycle. (Must have)
3. Data Management and Governance:
o Employ Unity Catalog for data governance and metadata management. (Good to have)
o Ensure robust data security and compliance with industry standards. (Must have)
4. Data Integration:
o Extract, transform, and load (ETL) data from multiple sources (Must have) including SAP (Good to have), Dynamics 365 (Good to have), and other systems.
o Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration. (Must have)
o Optimize the performance of data processing jobs. (Must have)
5. Data Storage and Access:
o Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage. (Must have)
o Optimize data storage and retrieval processes for performance and cost-efficiency. (Must have)
6. Collaboration and Communication:
o Work closely with data scientists, analysts, and other stakeholders to understand data requirements. (Must have)
o Provide technical guidance and mentorship to junior team members. (Good to have)
7. Continuous Improvement:

o Stay updated with the latest industry trends and technologies in data engineering and cloud computing. (Good to have)
o Continuously improve data processes and infrastructure for efficiency and scalability. (Must have)


Required Skills and Qualifications:
· Technical Skills:
o Proficient in PySpark and Python for data processing and analysis.
o Strong experience with Azure Data Lake Services (ADLS) and Data Lake architecture.
o Hands-on experience with Databricks for data engineering and analytics.
o Knowledge of Unity Catalog for data governance.
o Expertise in Delta Live Tables for real-time data processing.
o Familiarity with Azure Fabric for data integration and orchestration.
o Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing.
o Experience in pulling data from multiple sources like SAP, Dynamics 365, and others.
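The Delta Live Tables item above refers to declarative pipeline definitions. A minimal sketch of one follows; it runs only inside a Databricks DLT pipeline (where the `dlt` module is provided), and the source table name `bronze_orders` and the expectation rule are illustrative assumptions:

```python
import dlt  # available only inside a Databricks Delta Live Tables pipeline

@dlt.table(comment="Silver-layer orders, deduplicated with basic quality rules")
@dlt.expect_or_drop("positive_amount", "amount > 0")  # drop rows failing the check
def silver_orders():
    return dlt.read("bronze_orders").dropDuplicates(["order_id"])
```

Expectations such as `positive_amount` are tracked in the pipeline event log, which is one way the data-quality responsibilities listed earlier are typically met.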


· Soft Skills:
o Excellent problem-solving and analytical skills.
o Strong communication and collaboration abilities.
o Ability to work independently and as part of a team.
o Attention to detail and commitment to data accuracy and quality.
Certifications required:
· Certification in Azure Data Engineering or relevant Azure certifications.
o DP-203 (Must have)
· Certification in Databricks.
o Databricks Certified Data Engineer Associate (Must have)
o Databricks Certified Data Engineer Professional (Good to have)

Mandatory skill sets: Azure DE, PySpark, Databricks


Preferred skill sets: Azure DE, PySpark, Databricks


Years of experience required: 5-10 Years


Educational Qualification: BE, B.Tech, MCA, M.Tech

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Databricks Platform, Microsoft Azure

Optional Skills

Desired Languages (If blank, desired languages not specified)

Travel Requirements

Available for Work Visa Sponsorship?

Government Clearance Required?

Job Posting End Date

About The Company

At PwC, our purpose is to build trust in society and solve important problems. We’re a network of firms in 152 countries with over 327,000 people who are committed to delivering quality in assurance, advisory and tax services. Find out more and tell us what matters to you by visiting us at www.pwc.com. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity.


Content on this page has been prepared for general information only and is not intended to be relied upon as accounting, tax or professional advice. Please reach out to your advisors for specific advice.
