Senior Associate_Hadoop Developer_Advisory Corporate_Advisory_Bangalore Millenia

1 Month ago • 4-8 Years

About the job

Summary (by Outscal)

Seeking a Senior Hadoop Developer with 4+ years of experience in Hadoop and Spark, proficient in Python and PySpark. Proven experience in data transformation pipelines and Apache Airflow is crucial. Exposure to cloud technologies (GCP preferred) is a plus.

Line of Service

Advisory

Industry/Sector

Not Applicable

Specialism

SAP

Management Level

Senior Associate

Job Description & Summary

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Requirements:

 
  • 6-7+ years of experience in Hadoop or equivalent cloud big data components (specific to the Data Engineering role); hands-on Hadoop experience (Airflow, Oozie, Hive, HDFS, Sqoop, Pig, MapReduce).

  • 4+ years of experience in Spark (Spark batch, Spark Streaming, MLlib, etc.); candidates should be proficient with the Apache Spark framework.

  • 6-7+ years of experience with the Python programming language.

  • 4+ years of experience with PySpark data transformation pipelines (JSON, CSV, RDBMS, and streaming sources): design, development, and deployment on a Kubernetes/on-premises platform (not cloud based). An illustrative sketch of such a pipeline follows this list.

  • 2+ years of experience designing and implementing data workflows with Apache Airflow (see the second sketch after this list).

  • Kafka or equivalent cloud big data components (specific to the Data Engineering role).

  • Exposure to Oracle, MySQL, SQL Server, DB2, Teradata, PostgreSQL, and Spark SQL.

  • Unix/shell scripting experience.

  • Cloud technologies, GCP preferred.

Additional Requirements:

  • Exposure to large enterprise data.

  • Experience in application support and maintenance of Spark applications.

  • Experience optimizing and tuning Spark to handle medium- and large-scale data volumes.

  • Experience with performance tuning techniques for large-scale data processing.

  • Experience working with Continuous Integration/Continuous Deployment (CI/CD) tools.

  • Experience working on projects that implement solutions following the software development life cycle (SDLC).

  • Adherence to clean coding principles: candidates should be capable of producing code that is free of bugs and can be easily understood and maintained by other developers.

  • Strong teamwork abilities: developers typically collaborate closely with data scientists and other backend developers, so candidates should exhibit excellent communication and collaboration skills.

Good to have: NoSQL, Druid, Elasticsearch, Google BigQuery.
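
For illustration only, here is a minimal sketch of the kind of PySpark batch transformation pipeline described above; the file paths, column names, and JDBC connection details are hypothetical and not part of the role description.

# Minimal, hypothetical PySpark batch transformation: read CSV and JSON inputs,
# join and aggregate them, then write the result to an RDBMS table over JDBC.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Illustrative inputs: an orders CSV and a customers JSON file.
orders = spark.read.option("header", True).csv("/data/in/orders.csv")
customers = spark.read.json("/data/in/customers.json")

# Cast, join, and aggregate daily revenue per country.
daily_revenue = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .join(customers, on="customer_id", how="inner")
    .groupBy("country", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write to a relational database over JDBC (connection details are placeholders).
(daily_revenue.write
    .format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/analytics")
    .option("dbtable", "daily_revenue")
    .option("user", "etl_user")
    .option("password", "etl_password")
    .mode("overwrite")
    .save())

spark.stop()

A streaming variant of the same pipeline would use spark.readStream and writeStream in place of the batch reader and writer, which is why the posting distinguishes batch and streaming experience.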

 
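Likewise, a minimal sketch of an Apache Airflow workflow of the kind mentioned above; the DAG id, schedule, and spark-submit command are assumptions chosen for illustration.

# Minimal, hypothetical Airflow 2.x DAG that submits a PySpark job once a day.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # The spark-submit flags and script path are placeholders, not a prescribed setup.
    run_transformation = BashOperator(
        task_id="run_orders_transformation",
        bash_command="spark-submit --master yarn /opt/jobs/orders_pipeline.py",
    )

BashOperator is used here only to keep the sketch dependency-free; a provider operator such as SparkSubmitOperator is a common alternative.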

Mandatory skill sets:

Hadoop, PySpark, Python

 

Preferred skill sets:

Hadoop, PySpark, Python

 

Years of experience required:

4 - 8

 

Education qualification:

B.Tech / M.Tech / MBA / MCA

Education (if blank, degree and/or field of study not specified)

Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering


Required Skills

Apache Hadoop, PySpark, Python (Programming Language)


About The Company

At PwC, our purpose is to build trust in society and solve important problems. We’re a network of firms in 152 countries with over 327,000 people who are committed to delivering quality in assurance, advisory and tax services. Find out more and tell us what matters to you by visiting us at www.pwc.com. PwC refers to the PwC network and/or one or more of its member firms, each of which is a separate legal entity.


Content on this page has been prepared for general information only and is not intended to be relied upon as accounting, tax or professional advice. Please reach out to your advisors for specific advice.
