Binance Accelerator Program - Research Data Analyst
Job Summary
Binance, a global blockchain company, is seeking a Research Data Analyst for its Accelerator Program. The role involves working with large (PB-scale) volumes of data to engineer data pipelines, build visualizations, and develop machine learning models. Responsibilities include analyzing transactional, operational, and customer data with a range of tools, translating complex findings into actionable recommendations, processing confidential information, managing the reporting environment, and troubleshooting database issues. The ideal candidate holds a Bachelor's degree in Computer Science, Math, or Statistics and is proficient in data engineering, data modeling, ETL, SQL, GraphQL, and Python. Experience with Dune Analytics and Nansen, along with an understanding of DeFi and Web 3.0 infrastructure, is preferred. The position requires a commitment of at least 3 days per week for 6 months.
Must Have
- Bachelor's degree in CS, Math, or Statistics
- Proficient in data engineering, modeling, ETL
- Experience with SQL, GraphQL, Python
- Minimum 3 days/week commitment for 6 months
- Understanding of project tokenomics
- Knowledge of DeFi and Web 3.0
- High-level written and verbal communication skills
Good to Have
- Experience with data sourcing and APIs
- Experience with Dune Analytics and Nansen
- Understanding of addressing and metadata standards
Job Description
Responsibilities
- Work across all aspects of data from engineering to building sophisticated visualizations, machine learning models and experiments
- Analyze and interpret large (PB-scale) volumes of transactional, operational and customer data using proprietary and open source data tools, platforms and analytical tool kits
- Translate complex findings into simple visualizations and recommendations for execution by operational teams and executives
- Process confidential data and information according to guidelines
- Manage and design the reporting environment, including data sources, security, and metadata
- Troubleshoot the reporting database environment and reports
Requirements
- Bachelor’s degree from an accredited university or college in Computer Science, Math, or Statistics
- Proficient in data engineering, modeling, and ETL; experience with data sourcing and working with APIs preferred
- Experience with data querying using languages such as SQL, GraphQL, and Python
- Able to commit a minimum of 3 days per week for at least 6 months
- Understanding of project tokenomics and good knowledge of the DeFi and Web 3.0 infrastructure landscape
- Experience with tools such as Dune Analytics and Nansen
- Understanding of addressing and metadata standards
- High-level written and verbal communication skills