Senior Data Engineer

Sumo Logic

Job Summary

As a Senior Data Engineer at Sumo Logic, you will design and develop data engineering solutions, focusing on complex data integrity, security, process, and sanitization problems across Sumo Logic’s cloud services and business applications. You will work with a team to deliver high-impact engagements, improving existing data warehouse systems and ensuring compliance, security, and cost optimization. The role calls for a passion for scalable, resilient, secure, and cost-efficient data solutions; clean, maintainable code; and experienced guidance.

Must Have

  • Design and implement complex enterprise data solutions.
  • Improve existing data warehouse systems.
  • Experience with DevOps methodologies and practices.
  • Ensure data compliance, security, and cost-optimization.
  • Strong understanding of AWS data infrastructure and compute services.
  • Design and manage data schemas and data flow.
  • Deliver well-architected, end-to-end data solutions.
  • Extensive experience in Databricks, Spark, and AWS (EC2, RDS, Aurora, DynamoDB, S3, Kinesis).
  • 6+ years of industry experience.
  • Experience with Python scripting, PySpark, SQL, and data schemas.
  • Experience with API calls and API-based ingestion.
  • Hands-on and deep experience with Git and GitHub.
  • Proven experience in AWS infrastructure management and deployment for data platforms.
  • Experience working with data ingestion, storage, and consumption layers.
  • Experience with both structured and unstructured data.
  • Agile development experience and familiarity with Jira, sprints, and pointing.
  • Experience building robust and well-architected designs for enterprise scale data architectures and workflows.
  • Experience and comfort with an on-call schedule for enterprise systems.

Good to Have

  • Experience in AI/ML, Data Science, LLMs, Contextualization.
  • Experience with Amazon Bedrock, Amazon Nova, SageMaker, Iceberg.
  • Experience in Terraform.
  • Experience with Tableau, Looker, and AWS QuickSight.
  • Experience with Big Data services such as HDFS, Spark, Hive, HBase, YARN, and Oozie.

Job Description

Senior Data Engineer

The proliferation of machine log data has the potential to give organizations unprecedented real-time visibility into their infrastructure and operations. With this opportunity comes tremendous technical challenges around ingesting, managing, and understanding high-volume streams of heterogeneous data.

As a Senior Data Engineer, you will actively contribute to and lead the design and development of data engineering solutions within IT. You will be instrumental in helping us solve complex data integrity, security, process, and sanitization problems across Sumo Logic’s robust inventory of cloud services, data tooling, and business applications. You will work closely with a team of highly skilled individuals to deliver high-impact engagements, rolling up to the Global Head of IT and reporting to our Senior Manager of Data Engineering.

You are a strong data engineer who is passionate about scalable, resilient, secure, and cost-efficient data solutions. You care about producing clean, elegant, maintainable, robust, well-tested code; you do this as a member of a team, helping the group come up with a better solution than you would as individuals and providing experienced guidance and insight. Ideally, you have experience with performance, scalability, and reliability issues of 24x7 uptime systems and solutions.

Responsibilities:

  • Design and implement complex, well-architected enterprise data solutions that support mid-to-large organizations
  • Improve existing data warehouse systems, solutions, and processes through architectural reviews and design enhancement initiatives
  • Apply DevOps methodologies and practices to infrastructure, cloud, and business application data
  • Improve systems to provide strict data compliance adherence, security guardrails, and cost optimization
  • Build a strong understanding of, and hands-on experience with, the AWS data infrastructure and compute services our data platform is deployed on
  • Design and manage data schemas and the flow of data through corporate systems and applications to ensure compliance, data integrity, and data security
  • Deliver well-architected, end-to-end data solutions to a growing enterprise organization across multiple infrastructure environments, data sources, and business applications
  • Build strong partnerships and engage with other teams to create solutions that benefit all parties and increase derived value
  • Ensure global delivery and alignment on all data initiatives while maintaining 24/7 uptime

Requirements:

  • Extensive experience in Databricks, Spark, and AWS (EC2, RDS, Aurora, DynamoDB, S3 and Kinesis primarily)
  • Experience developing scalable, secure, and resilient data architectures and implementations
  • 6+ years of industry experience with a proven track record of ownership and delivery
  • Experience with Python scripting, PySpark, and other data frameworks or tools
  • Experience with SQL and data schemas
  • Experience with API calls and API-based ingestion
  • Hands-on, deep experience with Git and GitHub
  • Proven experience and success in AWS infrastructure management and deployment for data platforms (EC2, S3, RDS, VPC, and basic network adjustments; KMS and PrivateLink a bonus)
  • Experience working with the data ingestion, data storage, and data consumption layers
  • Experience with both structured and unstructured data
  • Agile development experience and a familiarity with Jira, sprints, and pointing
  • Experience building robust and well-architected designs for enterprise scale data architectures and workflows
  • A passion for continuous learning and deep technological curiosity
  • Excellent verbal and written communication skills
  • Experience and comfort with an on-call schedule for enterprise systems

Desirable:

  • Experience in AI/ML, Data Science, LLMs, and contextualization; Amazon Bedrock, Amazon Nova, SageMaker, and Iceberg
  • Experience in Terraform is a major positive
  • Experience in the following specific technologies is a plus:
      ◦ Tableau, Looker, and AWS QuickSight
      ◦ Big Data services such as HDFS, Spark, Hive, HBase, YARN, and Oozie

About Us

Sumo Logic, Inc. helps make the digital world secure, fast, and reliable by unifying critical security and operational data through its Intelligent Operations Platform. Built to address the increasing complexity of modern cybersecurity and cloud operations challenges, we empower digital teams to move from reaction to readiness—combining agentic AI-powered SIEM and log analytics into a single platform to detect, investigate, and resolve modern challenges. Customers around the world rely on Sumo Logic for trusted insights to protect against security threats, ensure reliability, and gain powerful insights into their digital environments. For more information, visit www.sumologic.com.

Sumo Logic Privacy Policy

Employees will be responsible for complying with applicable federal privacy laws and regulations, as well as organizational policies related to data protection.

