Senior Data Engineer

4-8 Years • Data Analyst

About the job

Summary

As a Senior Data Engineer at Auros, you'll be responsible for stewarding and championing the firm's market and trading data archives and internal data products. This involves working with existing data pipelines and databases while designing and implementing next-generation data and analytic capabilities. You'll develop, test, and maintain high-throughput, high-volume distributed data architectures; analyze and automate data quality improvements; build real-time data collectors; and create tools for automating pipeline configuration, deployment, and troubleshooting. You'll collaborate with traders and developers to improve data quality and access. The role demands expertise in Python for data analysis, experience with large-scale data pipelines, and proficiency with SQL and NoSQL databases.
Must have:
  • Extensive Python experience for data analysis
  • Real-time large-scale data pipeline development
  • Distributed, high-performance SQL & NoSQL databases
  • High-throughput, high-volume data architecture development
Good to have:
  • Experience with data lakes (Amazon S3)
  • C++ development on Linux
  • Protocol-level network analysis
  • Terraform, ClickHouse, Hive, Hadoop, Snowflake, Presto

We are Auros! 

Auros is a leading algorithmic trading and market-making firm specialising in digital asset liquidity provision. We trade across 10+ global locations, facilitating 3-4% of global daily volumes, and have connectivity to over 50 venues.

We’re proud of the strong reputation we’ve built by combining our systematic approach, sophisticated pricing models, and state-of-the-art execution capabilities to provide robust, reliable trading performance and bring liquidity to crypto markets worldwide.

What sets us apart, though, is our culture. Our flat structure means you’ll have autonomy and plenty of opportunity to bring your ideas to life and help shape the systems that will power our business into the future.

 

The Role

This is a rare opportunity for an experienced Data Engineer to become both steward and champion of the firm's market and trading data archives and internal data products.

You will work with our existing data pipelines and databases while designing and implementing the next generation of Auros data and analytic capabilities. You’ll enjoy taking on responsibilities where you’ll have the opportunity to make a substantial impact on the business outcomes through the work you do every day.

You’ll learn from our experienced trading team and help develop and support systems that execute millions of trades on crypto exchanges across the globe.  

What You'll Do

  • Develop, test and maintain high-throughput, high-volume distributed data architectures
  • Analyze, define and automate data quality improvements
  • Develop and maintain real-time data collectors for time series databases
  • Build and improve trading analytics systems
  • Create tools to automate the configuration, deployment and troubleshooting of the data pipeline
  • Develop strategies to make our data pipeline efficient, timely and robust in a 24/7 trading environment
  • Implement monitoring that measures the completeness and accuracy of captured data
  • Manage the impact that changes to trading systems and upstream protocols have on the data pipeline
  • Back-populate and clean historical datasets
  • Collaborate with traders and trading system developers to understand our data analysis requirements, and to continue to improve the quality of our stored data
  • Develop tools, APIs and screens to provide easy access to the archived data

 

What You'll Bring

  • Extensive experience using Python for data analysis and ad hoc tooling to analyse time series and other large datasets
  • Ideally, experience developing real-time, large-scale data pipelines handling petabytes of data
  • Experience with distributed, high-performance SQL and NoSQL database systems
  • A bachelor's degree (or above) in Computer Science, Software Engineering or similar, with excellent results.

Highly Desirable Skills

  • Experience with data lakes, Amazon S3 or similar
  • Experience developing in C++ on Linux
  • Protocol level network analysis experience
  • Experience with Terraform
  • Experience with ClickHouse
  • Experience with technologies such as Hive, Hadoop, Snowflake, Presto or similar.