Data Engineer - AWS Snowflake - Senior Associate

4-8 Years • DevOps

About the job

Summary

This Senior Associate Data Engineer role at PwC's Advisory Acceleration Centre requires 4+ years of hands-on experience in architecting and delivering scalable, cloud-based enterprise data solutions. Responsibilities include end-to-end implementation of cloud data engineering solutions (data lake, data hub) on AWS, leveraging Lambda or Kappa architectures. Proficiency in AWS services (EMR, Glue, S3, Redshift, DynamoDB, Kinesis, SQS, MSK), big data frameworks (Hadoop, Spark), and DBT is crucial. Experience with DevOps tools (Git, Jenkins, CI/CD), cloud security, and data migration processes is also required. The role demands strong analytical, problem-solving, and communication skills.
Must have:
  • AWS Cloud expertise
  • Big Data (Hadoop, Spark)
  • Data Lake/Hub implementation
  • DBT ELT Tool
  • Python/Scala programming
  • Data modeling & management
  • DevOps (Git, CI/CD)
  • AWS Security & Management
Good to have:
  • Java/JavaScript
  • SQL, XML, Linux
  • Web services experience

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

Data, Analytics & AI

Management Level

Senior Associate

Job Description & Summary

A career in our Advisory Acceleration Centre is the natural extension of PwC’s best-in-class global delivery capabilities. We provide premium, cost-effective, high-quality services that support process quality and delivery capability for client engagements.

Years of Experience: Candidates with 4+ years of hands-on experience

Position Requirements:

  • Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions

  • Strong expertise in end-to-end implementation of cloud data engineering solutions such as enterprise data lakes and data hubs on AWS

  • Proficient in Lambda or Kappa architectures

  • Should be aware of Data Management concepts and Data Modelling

  • Strong AWS hands-on expertise with a programming background, preferably Python/Scala

  • Good knowledge of Big Data frameworks and related technologies - experience in Hadoop and Spark is mandatory

  • Strong experience in AWS compute services like AWS EMR and Glue, and storage services like S3, Redshift & DynamoDB

  • Good experience with at least one of the AWS streaming/messaging services: AWS Kinesis, AWS SQS or AWS MSK

  • Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL and Spark Streaming

  • Strong understanding of the dbt ELT tool, including the use of dbt macros

  • Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, plus rich experience with AWS developer tools such as CodePipeline, CodeBuild and CodeCommit

  • Experience with AWS CloudWatch, AWS CloudTrail, AWS Config and AWS Config Rules

  • Good knowledge of AWS security and AWS Key Management Service (KMS)

  • Strong understanding of Cloud data migration processes, methods and project lifecycle

  • Good analytical & problem-solving skills

  • Good communication and presentation skills

Desired Knowledge / Skills:

  • Excellent interpersonal, communication and team-facilitation skills

  • Strong analytical and problem-solving skills

  • Expert technical skills in software development (Java and/or JavaScript), data and database design, SQL, XML, Linux and web services

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required:

Degrees/Field of Study preferred:

Certifications (if blank, certifications not specified)

Required Skills

Optional Skills

Accepting Feedback, Active Listening, Analytical Thinking, Automation, Communication, Creative Solutions Development, Creativity, Data Governance Training, Data Infrastructure, Data Integrity, Data Management Plan (DMP), Data Monetization, Data Quality Improvement Plans (DQIP), Data Stewardship Frameworks, Data Strategy, Data Warehouse Governance, Embracing Change, Emotional Regulation, Empathy, Inclusion, Information Security, Intellectual Curiosity, Knowledge Management, Learning Agility {+ 10 more}

Desired Languages (If blank, desired languages not specified)

Travel Requirements

0%

Available for Work Visa Sponsorship?

No

Government Clearance Required?

No

Job Posting End Date


About The Company

At PwC, our purpose is to build trust in society and solve important problems. We’re a network of firms in 152 countries with over 327,000 people who are committed to delivering quality in assurance, advisory and tax services. Find out more and tell us what matters to you by visiting us at www.pwc.com. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity.


Content on this page has been prepared for general information only and is not intended to be relied upon as accounting, tax or professional advice. Please reach out to your advisors for specific advice.
