Principal Engineer (R-18436)

Posted 10 minutes ago • 10+ years experience • Software Development & Engineering • Hyderabad, Telangana, India (Hybrid)

Job Description

Dun & Bradstreet is seeking a Principal Engineer to design, build, and deploy scalable data pipelines within its Big Data ecosystem. The role involves architecting distributed data processing solutions with technologies such as Apache Spark and Hadoop, with a strong focus on Python programming and cloud infrastructure (AWS/GCP). The engineer will manage workflows with Apache Airflow, optimize resource allocation, and drive innovation, ensuring efficient data storage and retrieval while collaborating with data science teams.
Must have:
  • Design, build, and deploy scalable and efficient data pipelines using Apache Spark and Apache Airflow.
  • Familiarity with data pipelines, data lakes, and modern data warehousing practices.
  • Design and implement distributed data processing solutions using Apache Spark and Hadoop.
  • Expert-level programming skills in Python.
  • Utilize cloud-based infrastructures (AWS/GCP) and their services.
  • Develop and manage workflows using Apache Airflow.
  • Strong knowledge of Big Data architecture.
  • Minimum of 10 years of hands-on experience in Big Data technologies.
  • Minimum of 3 years of experience with Spark/PySpark.
  • At least 6 years of experience in cloud environments (AWS/GCP).
  • Hands-on experience in managing cloud-deployed solutions (preferably AWS), NoSQL, and Graph databases.
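As a conceptual illustration of the pipeline skills listed above, the sketch below models the map/shuffle/reduce flow that Spark's RDD and DataFrame APIs parallelize across a cluster. It is plain Python with hypothetical data, not PySpark; a real pipeline would express the same aggregation with Spark operations such as `reduceByKey`.

```python
from collections import defaultdict
from functools import reduce

# Hypothetical input: (country, amount) event records.
records = [("US", 3), ("IN", 5), ("US", 2), ("DE", 1), ("IN", 4)]

def map_phase(records):
    # In Spark this would run per-partition, e.g. rdd.map(...).
    return [(country, amount) for country, amount in records]

def shuffle_phase(pairs):
    # Group values by key, as Spark's shuffle does across executors.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Per-key aggregation, analogous to reduceByKey(lambda a, b: a + b).
    return {key: reduce(lambda a, b: a + b, values)
            for key, values in groups.items()}

totals = reduce_phase(shuffle_phase(map_phase(records)))
print(totals)  # {'US': 5, 'IN': 9, 'DE': 1}
```

In PySpark the whole flow collapses to a single `rdd.reduceByKey(lambda a, b: a + b)`, with the shuffle handled by the cluster.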
Good to have:
  • Experience with Google Cloud Platform (GCP) particularly with Dataproc.
  • Prior experience working in a global organization.
  • Prior experience working within a DevOps model.

Job Details

Why We Work at Dun & Bradstreet

Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers.

Our global community of colleagues brings a diverse range of experiences and perspectives to our work. You'll find us working from a corporate office or plugging in from a home desk, listening to our customers and collaborating on solutions. Our products and solutions are vital to businesses of every size, scope and industry. And at the heart of our work, you'll find our core values: to be data inspired, relentlessly curious and inherently generous. Our values are the constant touchstone of our community; they guide our behavior and anchor our decisions.

Key Responsibilities:

  • Design and Develop Data Pipelines: Architect, build, and deploy scalable and efficient data pipelines within our Big Data ecosystem using Apache Spark and Apache Airflow. Document new and existing pipelines and datasets to ensure clarity and maintainability.
  • Data Architecture and Management: Demonstrate familiarity with data pipelines, data lakes, and modern data warehousing practices, including virtual data warehouses and push-down analytics. Design and implement distributed data processing solutions using technologies like Apache Spark and Hadoop.
  • Programming and Scripting: Exhibit expert-level programming skills in Python, with the ability to write clean, efficient, and maintainable code.
  • Cloud Infrastructure: Utilize cloud-based infrastructures (AWS/GCP) and their various services, including compute resources, databases, and data warehouses. Manage and optimize cloud-based data infrastructure, ensuring efficient data storage and retrieval.
  • Workflow Orchestration: Develop and manage workflows using Apache Airflow for scheduling and orchestrating data processing jobs. Create and maintain Apache Airflow DAGs for workflow orchestration.
  • Big Data Architecture: Possess strong knowledge of Big Data architecture, including cluster installation, configuration, monitoring, security, resource management, maintenance, and performance tuning.
  • Innovation and Optimization: Create detailed designs and proof-of-concepts (POCs) to enable new workloads and technical capabilities on the platform. Collaborate with platform and infrastructure engineers to implement these capabilities in production. Manage workloads and optimize resource allocation and scheduling across multiple tenants to fulfill service level agreements (SLAs).
  • Continuous Learning and Collaboration: Participate in planning activities and collaborate with data science teams to enhance platform skills and capabilities.
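The workflow-orchestration responsibilities above follow Airflow's model: a directed acyclic graph (DAG) of tasks executed in dependency order. The pure-Python sketch below (task names hypothetical) shows the core idea using the standard library's topological sorter; in a real deployment each node would be an Airflow operator wired with `upstream >> downstream` arrows inside a DAG definition.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each task maps to its set of upstream
# dependencies, mirroring Airflow's task-dependency arrows.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_warehouse": {"transform"},
    "refresh_marts": {"load_warehouse"},
}

# A scheduler runs tasks only after all their upstreams complete;
# static_order() yields one such valid execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)
# ['extract', 'validate', 'transform', 'load_warehouse', 'refresh_marts']
```

Airflow layers scheduling intervals, retries, and backfills on top of this same dependency-ordering idea.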

Key Skills:

  • Minimum of 10 years of hands-on experience in Big Data technologies, including at least 3 years' experience working with Spark/PySpark.
  • Experience with Google Cloud Platform (GCP) is preferred, particularly with Dataproc, and at least 6 years of experience in cloud environments is required.
  • Must have hands-on experience in managing cloud-deployed solutions, preferably on AWS, along with NoSQL and Graph databases.
  • Prior experience working in a global organization and within a DevOps model is considered a strong plus.
