Manager, Data Engineering - Big Data (GCP)


About the job


Company Description

Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting, and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of the next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.

Job Description

As Manager, Data Engineering, you will be responsible for translating client requirements into design, architecting, and implementing GCP Cloud-based big data solutions for clients. Your role will be focused on delivering high-quality solutions by independently driving design discussions related to Data Ingestion, Transformation & Consumption, Data Storage and Computation Frameworks, Performance Optimizations, Infrastructure, Automation & Cloud Computing, and Data Governance & Security. The role requires a hands-on technologist with expertise in Big Data solution architecture and with a strong programming background in Java / Scala / Python.

Your Impact:

  • Provide technical leadership and play a hands-on implementation role in the areas of data engineering, including data ingestion, data access, modeling, data processing, visualization, design, and implementation.
  • Lead a team to deliver high-quality Big Data solutions on GCP Cloud; manage functional and non-functional scope and quality.
  • Help establish standard data practices such as governance, and address non-functional concerns such as data security, privacy, and quality.
  • Manage and provide technical leadership to a data program implementation, based on requirements, using agile methodologies.
  • Participate in workshops with clients and align client stakeholders on optimal solutions.
  • Provide consulting, thought leadership, and mentorship.
  • Manage people, and contribute to hiring and capability building.

Qualifications

Your Skills & Experience:

  • Overall 8+ years of IT experience, with 3+ years in data-related technologies and 1+ years of expertise in data-related GCP Cloud services, having delivered at least one project as an architect.
  • Mandatory knowledge of Big Data architecture patterns and experience delivering end-to-end Big Data solutions on GCP Cloud.
  • Expert in programming languages such as Java or Scala; Python is good to have.
  • Expert in at least one distributed data processing framework: Spark (Core, Streaming, SQL), Storm, or Flink.
  • Expert in the Hadoop ecosystem with a GCP cloud distribution; hands-on experience with one or more big data ingestion tools (Sqoop, Flume, NiFi, etc.) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.); familiarity with traditional tools such as Informatica or Talend is good to have.
  • Should have worked on NoSQL solutions such as MongoDB, Cassandra, or HBase, or cloud-based NoSQL offerings such as DynamoDB or Bigtable.
  • Good exposure to development with CI/CD pipelines; knowledge of containerization, orchestration, and Kubernetes Engine would be an added advantage.

Set Yourself Apart With:

  • Certification on GCP cloud platform or big data technologies.
  • Strong analytical and problem-solving skills.
  • Excellent understanding of data technologies landscape/ecosystem.

A Tip from the Hiring Manager:

Join the team to sharpen your skills and expand your collaborative methods. Make an impact on our clients and their businesses directly through your work.

Additional Information

  • Gender Neutral Policy
  • 18 paid holidays throughout the year
  • Generous parental leave and new parent transition program
  • Flexible work arrangements 
  • Employee Assistance Programs to support your wellness and well-being

Virginia, United States (On-Site)
