Senior Data Engineer, Risk Technology
Ion
Job Summary
We are seeking a skilled and experienced Data Engineer to join our innovative team. The ideal candidate will have expertise in data engineering technologies, experience with market and credit counterparty risk platforms, and a solid understanding of the financial services sector. You will analyze, design, code, test, configure, and modify software to deliver platforms and solutions; build and manage automated delivery pipelines; and implement monitoring, alerting, logging, and tracing. You will also collaborate with the Data Warehouse Architect, design and optimize scalable data pipelines using technologies such as Airflow, Snowflake, and AWS cloud services, work closely with stakeholders to ensure platforms meet both business and technical requirements, and produce technical documentation.
Must Have
- 8+ years of experience delivering data-centric platforms
- Experience with market and credit counterparty risk platforms
- Advanced proficiency in Python
- Strong experience with AWS, Airflow, and Snowflake
- Comfortable working in an agile delivery environment
- Self-sufficient in a CI/CD environment
- Ability to contribute as an individual and review code
- Troubleshooting and debugging skills
- Strong interpersonal and organizational skills
Good to Have
- Experience building scalable, high-performance data platforms
- Experience with infrastructure as code
- Experience with continuous integration practices
Job Description
- Analyze, design, code, test, configure, and modify software for the functional delivery of platforms and solutions using programming languages and development methodologies.
- Design, develop, test, debug, and implement platforms, solutions, software tools, and utilities to ensure acceptable performance and service levels.
- Build and manage automated delivery pipelines for platforms and solutions using source control, infrastructure as code, and continuous integration practices.
- Implement monitoring, alerting, logging, and tracing to ensure the durability, availability, and performance of platforms and solutions.
- Collaborate with the Data Warehouse Architect to define and deliver successful platform strategies.
- Design and optimize scalable data pipelines using technologies like Airflow, Snowflake, and AWS cloud services.
- Work closely with stakeholders to ensure platforms meet both business and technical requirements.
- Produce technical documentation, including testing, training, and delivery artifacts.
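As a rough illustration of the pipeline responsibilities above, the sketch below shows a data-quality gate and a staged Snowflake load of the kind an Airflow task might run. All names here (the `trade_id` key, the stage and table names, the check thresholds) are hypothetical examples, not part of Ion's actual stack; a real implementation would be orchestrated by Airflow and parameterized per environment.

```python
from dataclasses import dataclass


@dataclass
class QualityReport:
    """Summary of basic checks run before a load is allowed to proceed."""
    rows: int
    null_ids: int

    @property
    def passed(self) -> bool:
        # Hypothetical gate: at least one row, and no records missing a key.
        return self.rows > 0 and self.null_ids == 0


def run_quality_checks(records: list[dict]) -> QualityReport:
    """Count rows and null join keys in a batch of extracted records."""
    return QualityReport(
        rows=len(records),
        null_ids=sum(1 for r in records if r.get("trade_id") is None),
    )


def build_copy_statement(stage: str, table: str) -> str:
    """Render a Snowflake COPY INTO for files in a named stage.

    Stage and table names are illustrative placeholders.
    """
    return f"COPY INTO {table} FROM @{stage} FILE_FORMAT = (TYPE = PARQUET)"
```

In practice a task like this would fail fast when `passed` is false, so bad batches never reach the warehouse and the failure is surfaced through the pipeline's alerting.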
Requirements
- 8+ years of experience delivering data-centric platforms with large datasets, tight SLAs, and high data quality standards.
- Proven experience with market and credit counterparty risk platforms (mandatory).
- Advanced proficiency in Python.
- Strong experience with AWS, Airflow, and Snowflake.
- Comfortable working in an agile delivery environment.
- Self-sufficient in a CI/CD environment, with hands-on experience automating deployments.
- Proven ability to contribute as an individual, including reviewing pull requests and ensuring quality code.
- Experience troubleshooting and debugging issues ranging from simple to complex.
- Strong interpersonal and organizational skills, with the ability to work collaboratively.
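As one small, hedged example of the deployment automation mentioned above, a CI/CD pipeline might include a pre-deploy guard against accidental version downgrades. The `vMAJOR.MINOR.PATCH` tag format and function names below are assumptions for illustration only.

```python
def parse_version(tag: str) -> tuple[int, int, int]:
    """Parse a 'vMAJOR.MINOR.PATCH' tag (format assumed for this sketch)."""
    major, minor, patch = tag.lstrip("v").split(".")
    return int(major), int(minor), int(patch)


def is_safe_to_deploy(deployed: str, candidate: str) -> bool:
    """Allow a deployment only if it does not roll the version backwards."""
    # Tuple comparison gives correct ordering: (1, 10, 0) > (1, 9, 9).
    return parse_version(candidate) >= parse_version(deployed)
```

A check like this would typically run as an early pipeline stage, failing the build before any infrastructure is touched.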