Software Engineer II

1 Month ago • 6+ Years • Software Development & Engineering

Job Summary

Job Description

Dun & Bradstreet is seeking a Software Engineer II to design, build, and deploy new data pipelines within its Big Data ecosystems using tools such as StreamSets, Talend, or Informatica BDM. The role involves designing ETL/ELT data pipelines and working with data lakes and modern data warehousing practices. Key responsibilities include expert-level programming in Python and Spark, using cloud-based infrastructure on GCP, and experience with ETL tools for complex loads and cluster batch execution. The engineer will work with databases such as DB2, Oracle, and SQL Server, handle web service integrations, and gain exposure to Apache Airflow for job scheduling. A strong understanding of Big Data architecture (HDFS), cluster management, and performance tuning is essential. The role also includes creating proofs of concept (POCs), implementing capabilities in production, managing and optimizing workloads, and participating in planning and skill-development activities.
Must have:
  • Minimum 6 years of experience in ETL/ELT Technologies (StreamSets/Informatica/Talend)
  • Minimum 6 years hands-on experience with Big Data technologies (Hadoop, Spark, Hive)
  • Minimum 3 years of experience with Spark
  • Minimum 3 years of experience in Cloud environments (GCP)
  • Minimum 2 years of experience in Big Data service delivery
  • Experience with Informatica or StreamSets Data integration (ETL/ELT)
  • Hands-on experience managing solutions deployed in the Cloud (GCP)
  • Expert-level programming skills in Python
  • Expert-level programming skills in Spark
  • Experience with cloud-based infrastructure on GCP
Good to have:
  • Familiarity with Data Lakes and modern Data Warehousing practices
  • Strong exposure working with web service origins/targets/processors/executors, XML/JSON sources, and RESTful APIs
  • Strong exposure working with relational databases (DB2, Oracle, and SQL Server), including complex SQL constructs and DDL generation
  • Exposure to Apache Airflow for scheduling jobs
  • Strong knowledge of Big Data architecture (HDFS): cluster installation, configuration, monitoring, security, resource management, maintenance, and performance tuning
  • Any experience with NoSQL and Graph databases
  • Exposure to role- and attribute-based access controls
  • Experience working in a Global company
  • Working in a DevOps model is a plus

Job Details

Why We Work at Dun & Bradstreet
Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers.


Key Responsibilities:

  • Design, build, and deploy new data pipelines within our Big Data ecosystems using StreamSets, Talend, Informatica BDM, etc. Document new and existing pipelines and datasets.
  • Design ETL/ELT data pipelines using StreamSets, Informatica, or any other ETL processing engine. Familiarity with data pipelines, data lakes, and modern data warehousing practices (virtual data warehouse, push-down analytics, etc.).
  • Apply expert-level programming skills in Python and Spark.
  • Work with cloud-based infrastructure on GCP.
  • Experience with an ETL tool (Informatica or StreamSets) in creating complex parallel loads and cluster batch execution, and in creating dependencies using jobs/topologies/workflows.
  • Experience in SQL and in converting SQL stored procedures into Informatica/StreamSets. Strong exposure working with web service origins/targets/processors/executors, XML/JSON sources, and RESTful APIs.
  • Strong exposure working with relational databases (DB2, Oracle, and SQL Server), including complex SQL constructs and DDL generation.
  • Exposure to Apache Airflow for scheduling jobs
  • Strong knowledge of Big Data architecture (HDFS): cluster installation, configuration, monitoring, security, resource management, maintenance, and performance tuning.
  • Create POCs to enable new workloads and technical capabilities on the Platform.  
  • Work with the platform and infrastructure engineers to implement these capabilities in production.
  • Manage workloads and enable workload optimization including managing resource allocation and scheduling across multiple tenants to fulfill SLAs.
  • Participate in planning and Data Science activities, and perform activities to increase platform skills.
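
The ETL/ELT work described above can be sketched in miniature. The following is a hedged, tool-agnostic illustration in plain Python standing in for a StreamSets/Informatica pipeline: extract records from a JSON (web service-style) origin, transform them, and load them into a warehouse table. The payload, table name, and field names are invented for illustration only.

```python
import json
import sqlite3

# Hypothetical JSON payload, standing in for a web-service (REST/JSON) origin.
RAW_RECORDS = json.loads("""[
    {"duns": "123456789", "name": "Acme Corp ",  "revenue": "1500000"},
    {"duns": "987654321", "name": "Globex LLC",  "revenue": null}
]""")

def transform(record):
    """Cast types and drop incomplete records (the 'T' in ETL)."""
    if record["revenue"] is None:
        return None
    return (record["duns"], record["name"].strip(), int(record["revenue"]))

def load(rows, conn):
    """Load transformed rows into a destination table (the 'L')."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS company "
        "(duns TEXT PRIMARY KEY, name TEXT, revenue INTEGER)"
    )
    conn.executemany("INSERT OR REPLACE INTO company VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
rows = [t for r in RAW_RECORDS if (t := transform(r)) is not None]
load(rows, conn)
print(conn.execute("SELECT COUNT(*) FROM company").fetchone()[0])
```

A production pipeline would replace the in-memory source and SQLite sink with the actual origins (DB2, Oracle, REST APIs) and destinations (data lake, warehouse), but the extract/transform/load shape is the same.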

Key Requirements:

  • Minimum 6 years of experience in ETL/ELT technologies, preferably StreamSets, Informatica, or Talend.
  • Minimum 6 years of hands-on experience with Big Data technologies, e.g. Hadoop, Spark, Hive.
  • Minimum 3 years of experience with Spark.
  • Minimum 3 years of experience in cloud environments, preferably GCP.
  • Minimum 2 years of experience working in Big Data service delivery (or equivalent) roles focusing on the following disciplines:
      • Any experience with NoSQL and graph databases
      • Informatica or StreamSets data integration (ETL/ELT)
      • Exposure to role- and attribute-based access controls
  • Hands on experience with managing solutions deployed in the Cloud, preferably on GCP.
  • Experience working in a global company; working in a DevOps model is a plus.
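
As an aside on the dependency-creation and Apache Airflow scheduling skills listed above, the core idea — run jobs in topological order of their declared dependencies — can be sketched with the Python standard library. The task names below are invented; a real deployment would declare these as an Airflow DAG or a StreamSets topology.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline dependencies: each task maps to the tasks it waits on,
# mirroring how an Airflow DAG declares upstream/downstream ordering.
DEPENDENCIES = {
    "extract_oracle": set(),
    "extract_api":    set(),
    "transform":      {"extract_oracle", "extract_api"},
    "load_warehouse": {"transform"},
}

# static_order() yields a valid execution order: both extracts before
# transform, transform before the warehouse load.
order = list(TopologicalSorter(DEPENDENCIES).static_order())
print(order)
```

A scheduler like Airflow adds triggers, retries, and SLA monitoring on top of this ordering, and can run independent tasks (the two extracts here) in parallel.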
All Dun & Bradstreet job postings can be found at https://www.dnb.com/about-us/careers-and-people/joblistings.html and https://jobs.lever.co/dnb. Official communication from Dun & Bradstreet will come from an email address ending in @dnb.com.

Notice to Applicants: Please be advised that this job posting page is hosted and powered by Lever. Your use of this page is subject to Lever's Privacy Notice and Cookie Policy, which governs the processing of visitor data on this platform.


