Job Description Summary:
We are looking for a versatile Data Engineer with expertise in both ETL development and Business Intelligence to join our team. The ideal candidate will have strong hands-on experience in building scalable data ingestion pipelines using Python, integrating with Snowflake, and orchestrating workflows using Azure services. Additionally, the candidate should be proficient in reporting tools such as MicroStrategy, Power BI, or Tableau, and have a solid understanding of SQL and financial domain data.
Data Engineering Skills:
- Adept at writing efficient SQL, with hands-on experience across relational databases such as Snowflake, SQL Server, PostgreSQL, and MySQL.
- Demonstrated expertise in ETL development and in using an ETL framework to build pipelines.
- Proficiency in Python, with exposure to libraries such as Pandas and NumPy for data manipulation and transformation (illustrated in the sketch after this list).
- Data modeling skills to design efficient schemas and data objects, with an understanding of normalization vs. denormalization trade-offs.
- Familiarity with data governance and security practices, including role-based access control, data encryption, and masking.
- Experience with automation and orchestration tools to schedule and manage data pipelines.
- Working exposure to Azure cloud services, particularly Azure Blob Storage, Azure Key Vault, and Azure Data Factory (ADF) pipelines.
- Soft skills: communication, problem-solving, collaboration, and adaptability.
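
For illustration, a minimal sketch of the kind of Pandas-based cleansing and transformation step this role involves; the source file and column names (trade_date, account_id, quantity, price) are hypothetical placeholders, not part of the requirements.

```python
# Hypothetical illustration only: file names and columns (trade_date,
# account_id, quantity, price) are placeholders, not role requirements.
import pandas as pd


def transform_trades(source_csv: str) -> pd.DataFrame:
    """Read a raw extract, cleanse it, and derive a reporting column."""
    df = pd.read_csv(source_csv, parse_dates=["trade_date"])

    # Basic cleansing: drop exact duplicates and rows missing the key field.
    df = df.drop_duplicates().dropna(subset=["account_id"])

    # Normalise the text key and derive a notional value for reporting.
    df["account_id"] = df["account_id"].str.strip().str.upper()
    df["notional"] = df["quantity"] * df["price"]
    return df


if __name__ == "__main__":
    cleaned = transform_trades("raw_trades.csv")
    # In practice this frame would be staged to Snowflake (for example via the
    # Snowflake Python connector) rather than written to a local file.
    cleaned.to_csv("staged_trades.csv", index=False)
```
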
Power BI & Reporting Skills:
- In-depth knowledge of Power BI Desktop, Power BI Service, and Microsoft Report Builder for paginated reports.
- Familiarity with Power Query for data transformation and Power Pivot for data modeling.
- Expertise in data modeling, DAX (Data Analysis Expressions), and SQL for data manipulation and ETL tasks.
- Demonstrated ability to translate business requirements into effective visualizations, with a solid grasp of data analysis concepts and proficiency in creating charts, graphs, and dashboards that communicate data insights.