Senior Data Engineer

Posted 1 month ago • 4+ years of experience
Data Analysis

Job Description

SailPoint is seeking a Senior Data Engineer to join its Data Platform team. This role involves designing and implementing robust data ingestion and processing systems, focusing on ELT processes for endpoints such as RDBMS, NoSQL databases, and data warehouses. The engineer will develop scalable data pipelines using JVM-based languages, leverage AWS services for storage and integration, and maintain workflow orchestration with tools like Apache Airflow. The position requires adaptability and problem-solving skills in an ambiguous environment, contributing to innovative data solutions for SailPoint's identity security platform.


Responsibilities

  • Spearhead the design and implementation of ELT processes, especially extracting data from and loading data into various endpoints, including RDBMS, NoSQL databases, and data warehouses.
  • Develop and maintain scalable data pipelines for both stream and batch processing, leveraging JVM-based languages and frameworks.
  • Collaborate with cross-functional teams to understand diverse data sources and environment contexts, ensuring seamless integration into our data ecosystem.
  • Utilize the AWS service stack wherever possible to implement lean solutions for data storage, data integration, and data streaming problems.
  • Develop and maintain workflow orchestration using tools like Apache Airflow.
  • Stay abreast of emerging technologies in the data engineering space, proactively incorporating them into our ELT processes.
  • Thrive in an environment with ambiguity, demonstrating adaptability and problem-solving skills.
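The ELT pattern named above (load raw data into the target first, then transform it there with SQL) can be sketched in a few lines. This is a minimal illustration only: it uses Python's built-in sqlite3 as a stand-in for both the source RDBMS and the warehouse, and every table and column name is hypothetical.

```python
import sqlite3

# Minimal ELT sketch: extract rows from a source database, load them
# unchanged into a warehouse staging table, then transform inside the
# warehouse. The "T" happening after the "L" is what distinguishes ELT
# from ETL. sqlite3 stands in for both endpoints purely for illustration.

source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
source.executemany("INSERT INTO events VALUES (?, ?)",
                   [(1, 10.0), (1, 5.0), (2, 7.5)])

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE raw_events (user_id INTEGER, amount REAL)")

# Extract + Load: copy rows as-is into the warehouse staging table.
rows = source.execute("SELECT user_id, amount FROM events").fetchall()
warehouse.executemany("INSERT INTO raw_events VALUES (?, ?)", rows)

# Transform: aggregate inside the warehouse with SQL.
warehouse.execute("""
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM raw_events GROUP BY user_id
""")

totals = dict(warehouse.execute("SELECT user_id, total FROM user_totals"))
print(totals)  # {1: 15.0, 2: 7.5}
```

In a production pipeline the same shape holds, with the extract/load step handled by an ingestion job and the transform step expressed in the warehouse (e.g. as SQL models), so the raw data remains available for reprocessing.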

Qualifications

  • BS in computer science or a related field.
  • 4+ years of experience in data engineering or a related field.
  • Demonstrated system-design experience orchestrating ELT processes targeting various data endpoints.
  • Hands-on experience with at least one streaming or batch processing framework, such as Flink or Spark.
  • Hands-on experience with containerization platforms such as Docker and container orchestration tools like Kubernetes.
  • Proficiency in AWS service stack.
  • Familiarity with workflow orchestration tools such as Airflow.
  • Experience with dbt, Kafka, Jenkins, and Snowflake.
  • Experience leveraging tools such as Kustomize, Helm and Terraform for implementing infrastructure as code.
  • Strong interest in staying ahead of new technologies in the data engineering space.
  • Comfortable working in ambiguous team situations, showcasing adaptability and drive in solving novel problems in the data engineering space.
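The core idea behind the stream-processing experience asked for above (Flink, Spark) is windowed aggregation over an event stream. A minimal pure-Python sketch of a tumbling window follows; it is not a real streaming runtime, and the function and event names are hypothetical.

```python
from collections import defaultdict

def tumbling_window_sums(events, window_size):
    """Group (timestamp, key, value) events into fixed, non-overlapping
    windows of length window_size and sum the values per (window, key).
    This mirrors what a tumbling-window aggregation does in Flink or
    Spark Structured Streaming, minus state, watermarks, and fault
    tolerance."""
    windows = defaultdict(float)
    for ts, key, value in events:
        # Align each event to the start of its window.
        window_start = (ts // window_size) * window_size
        windows[(window_start, key)] += value
    return dict(windows)

events = [(0, "a", 1.0), (3, "a", 2.0), (7, "b", 4.0), (12, "a", 8.0)]
print(tumbling_window_sums(events, 5))
# {(0, 'a'): 3.0, (5, 'b'): 4.0, (10, 'a'): 8.0}
```

Real frameworks add what this sketch omits: out-of-order event handling via watermarks, checkpointed state, and distributed execution, which is where the hands-on experience matters.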

What success looks like in the role

Within the first 30 days you will:

  • Onboard into your new role, get familiar with our product offering and technology, proactively meet peers and stakeholders, set up your test and development environment.
  • Seek to deeply understand business problems or common engineering challenges and propose software architecture designs to solve them elegantly by abstracting useful common patterns.

By 90 days:

  • Proactively collaborate on, discuss, debate and refine ideas, problem statements, and software designs with different (sometimes many) stakeholders, architects and members of your team.
  • Take a committed approach to prototyping and co-implementing systems alongside less experienced engineers on your team—there’s no room for ivory towers here.

By 6 months:

  • Collaborate with Product Management and the Engineering Lead to estimate and deliver small- to medium-complexity features more independently.
  • Occasionally serve as a debugging and implementation expert during escalations of system issues that less experienced engineers have been unable to solve in a timely manner.
  • Share support of critical team systems by participating in calls with customers, learning the characteristics of currently running systems, and participating in improvements.
