Senior Data Engineer


Job Summary

Must have:
  • Lead complex projects autonomously.
  • Demonstrate profound technical knowledge of AWS data services and engineering practices.
  • Provide strategic guidance on design, development, and implementation best practices.
  • Write high-quality, efficient code and develop necessary tools and applications.
  • Lead the development of innovative data engineering tools and frameworks.
  • Collaborate with architects, Product Owners, and Dev team members to decompose solutions.
  • Drive the migration of existing data processing workflows to the Lakehouse architecture using Iceberg.
  • Establish and enforce best practices for coding standards, design patterns, and system architecture.
  • Build and maintain strong relationships with internal and external stakeholders.
  • Serve as an internal subject matter expert in software development.
  • Mentor team members and foster a culture of continuous learning and innovation.
  • Bachelor’s degree in Computer Science, Software Engineering, or a related field.
  • Proficient in Python and data pipeline implementation using AWS Glue, Lambda, Step Functions.
  • Extensive experience in software architecture and cloud-native design.
Good to have:
  • A master’s degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics).
  • Knowledge of additional programming languages and development tools.
  • Financial Services expertise, working with Equity and Fixed Income asset classes and a working knowledge of indices.
  • Experienced in implementing and optimizing CI/CD pipelines.
Perks:
  • Continuous learning, mentoring and career growth opportunities.
  • Culture of inclusion for all employees.
  • Healthcare.
  • Retirement planning.
  • Paid volunteering days.
  • Wellbeing initiatives.

Job Details

Company Overview:

FTSE Russell, part of the London Stock Exchange Group, is an essential index partner for a changing world, providing category-defining indices across asset classes and investment objectives to create new possibilities for the global investment community. FTSE Russell’s expertise and products are used extensively by institutional and retail investors globally.

Job Summary:

In this key leadership role, you will lead the development of foundational components for a lakehouse architecture on AWS and drive the migration of existing data processing workflows to the new lakehouse solution. You will work across the Data Engineering organisation to design and implement scalable data infrastructure and processes using technologies such as Python, PySpark, EMR Serverless, Iceberg, Glue and Glue Data Catalog. The main goal of this position is to ensure successful migration and establish robust data quality governance across the new platform, enabling reliable and efficient data processing. Success in this role requires deep technical expertise, exceptional problem-solving skills, and the ability to lead and mentor within an agile team.
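
As a rough, purely illustrative sketch of what "data quality governance" can mean in practice, a completeness check of this kind might run before records land in the lakehouse. All field names and sample records below are hypothetical assumptions, not part of the role description.

```python
# Hedged illustration only: a minimal completeness check of the sort that
# data quality governance might enforce. Column names and sample records
# are hypothetical.

def find_incomplete_rows(rows, required_fields):
    """Return the rows that are missing one or more required fields."""
    return [row for row in rows if not required_fields.issubset(row)]

sample = [
    {"id": 1, "index_name": "FTSE 100", "close": 8200.5},
    {"id": 2, "index_name": "Russell 2000"},  # missing "close"
]
incomplete = find_incomplete_rows(sample, {"id", "index_name", "close"})
```

In a real pipeline a check like this would typically run as a PySpark or Glue job step, with failing rows quarantined rather than silently dropped.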

Key Accountabilities:

Project Leadership and Culture Building:

  • Leads complex projects autonomously, fostering an inclusive and open culture within development teams.
  • Sets a high standard for technical contributions while promoting a collaborative environment that encourages knowledge sharing and innovation.

Technical Expertise and Advisory:

  • Demonstrates profound technical knowledge of AWS data services and engineering practices.
  • Provides strategic guidance on best practices in design, development, and implementation, ensuring solutions meet business requirements and technical standards.

Data Development and Tool Advancement:

  • Writes high-quality, efficient code and develops necessary tools and applications to address complex business needs.
  • Leads the development of innovative tools and frameworks to enhance data engineering capabilities.

Solution Decomposition and Design Leadership:

  • Collaborates closely with architects, Product Owners, and Dev team members to decompose solutions into Epics, leading the design and planning of these components.
  • Drives the migration of existing data processing workflows to the lakehouse architecture, leveraging Iceberg capabilities.
  • Establishes and enforces best practices for coding standards, design patterns, and system architecture.
  • Utilizes existing design patterns to develop reliable solutions, while recognizing when to adapt or avoid patterns to prevent anti-patterns.

Stakeholder Relationship Building and Communication:

  • Builds and maintains strong relationships with internal and external stakeholders, collaborating across teams.
  • Serves as an internal subject matter expert in software development, advising stakeholders on technical issues and best practices.
  • Applies deep technical expertise to assess complex challenges and propose strategic solutions.
  • Leads technical discussions, mentors team members, and fosters a culture of continuous learning and innovation.

Qualifications and Experience:

  • Bachelor’s degree in Computer Science, Software Engineering, or a related field is essential. A master’s degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) are advantageous.
  • Advanced Programming Proficiency: Deep technical knowledge of data engineering solutions and practices, including implementation of data pipelines using tools such as AWS Glue, AWS Lambda, and AWS Step Functions. Proficient in Python and familiar with a variety of development technologies. This knowledge enables the engineer to adapt solutions to project-specific needs, apply best practices, and identify when patterns are appropriate or should be avoided.
  • System Architecture and Solution Design: Extensive experience in software architecture and solution design, including microservices, distributed systems, and cloud-native architectures. Capable of breaking down large-scale projects into manageable components, ensuring scalability, security, and alignment with strategic objectives.
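
The orchestration services named above are commonly wired together with Amazon States Language. As a minimal, purely illustrative sketch (the state and job names are hypothetical), a Step Functions state machine that runs a Glue job synchronously might look like:

```json
{
  "Comment": "Illustrative sketch only; the job name is hypothetical.",
  "StartAt": "RunIngestJob",
  "States": {
    "RunIngestJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "example-ingest-job" },
      "End": true
    }
  }
}
```

The `.sync` integration pattern makes Step Functions wait for the Glue job to finish before the state machine proceeds, which suits sequential pipeline stages.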

Key Skills:

  • Advanced Software Development Practices: Demonstrates mastery of best practices in software development, including knowledge of object-oriented programming, functional programming, and design patterns. Skilled at implementing complex coding structures and promoting efficient, maintainable code across projects.
  • Advanced expertise in Python and Spark: Specialized expertise in Python and Spark, with a deep focus on ETL data processing and data engineering practices. Ability to provide technical direction, set high standards for code quality and optimize performance in data-intensive environments. Knowledge of additional programming languages and development tools provides flexibility and adaptability across varied data engineering projects.
  • Automation and CI/CD Pipelines: Skilled in leveraging automation tools and Continuous Integration/Continuous Deployment (CI/CD) pipelines to streamline development, testing, and deployment. Experienced in setting up and optimising CI/CD processes to ensure rapid, high-quality releases and minimise manual intervention.
  • Cross-Functional Collaboration and Communication: Exceptional communicator who can translate complex technical concepts for diverse stakeholders, including engineers, product managers, and senior executives. Skilled in building alignment and driving consensus, ensuring that technical decisions support broader business goals.
  • Technical Leadership and Mentorship: Provides thought leadership within the engineering team, setting high standards for quality, efficiency, and collaboration. Experienced in mentoring engineers, guiding them in advanced coding practices, architecture, and strategic problem-solving to enhance team capabilities.
  • Domain Expertise in AWS Cloud Services: Solid understanding of AWS services and cloud solutions, particularly as they pertain to data engineering practices. Familiar with AWS services including IAM, Step Functions, Glue, Lambda, DynamoDB, RDS (e.g., Aurora PostgreSQL), SQS, API Gateway, and Athena.
  • Quality Assurance and Continuous Improvement: Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation. Committed to continuous improvement, implementing methods that enhance data reliability, performance, and user satisfaction.
  • Bonus Skills: Financial Services expertise preferred, including experience with Equity and Fixed Income asset classes and a working knowledge of indices.

Other:

  • LSEG champions a culture committed to the growth of individuals through continuous learning, mentoring and career growth opportunities.
  • LSEG champions a culture of inclusion for all employees that respects their individual strengths, views, and experiences. We believe that our differences enable us to be a better team – one that makes better decisions, drives innovation, and delivers better business results.
  • Diversity is a core value at LSEG. We are passionate about building and sustaining an inclusive and equitable working and learning environment for all. We believe every member on our team enriches our diversity by exposing us to a broad range of ways to understand and engage with the world, identify challenges, and to discover, craft and deliver solutions.

Join us and be part of a team that values innovation, quality, and continuous improvement. If you're ready to take your career to the next level and make a significant impact, we'd love to hear from you.


About The Company

LSEG (London Stock Exchange Group) is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our culture of connecting, creating opportunity and delivering excellence shapes how we think, how we do things and how we help our people fulfil their potential.
