A premier end-to-end digital transformation consultancy dedicated to partnering with ambitious brands to create digital solutions for today’s complex challenges and tomorrow’s opportunities. With uncompromising standards for technical and domain expertise, we deliver innovative and strategic solutions in Strategy, Analytics, Digital Engineering, Cloud, Data & AI, Experience Design, and Marketing. Our Co-Innovation methodology is a unique engagement model designed to align interests and accelerate value creation. Our clients worldwide benefit from the skills and expertise of over 4,000 expert team members across the Americas, APAC, and EMEA. By partnering with leading technology providers, we craft transformative digital experiences that enhance customer engagement and drive business success.

We are looking for a skilled Data Engineer with strong experience in Snowflake and a deep understanding of data migration, transformation, and semi-structured and unstructured data handling. The ideal candidate will have hands-on experience building complex SQL queries, integrating APIs, and working with modern ETL tools to support robust and scalable data pipelines. This is a hybrid role with 3 days in the office.
Information Security Responsibilities:
- Promote and enforce awareness of key information security practices, including acceptable use of information assets, malware protection, and password security protocols
- Identify, assess, and report security risks, focusing on how these risks impact the confidentiality, integrity, and availability of information assets
- Understand and evaluate how data is stored, processed, or transmitted, ensuring compliance with data privacy and protection standards (GDPR, CCPA, etc.)
- Ensure data protection measures are integrated throughout the information lifecycle to safeguard sensitive information
Responsibilities:
- Design, develop, and maintain scalable data pipelines using Snowflake.
- Write and optimize complex SQL queries for data transformation and analysis.
- Migrate and ingest data from REST APIs into Snowflake.
- Work with various ETL tools to orchestrate and automate data workflows.
- Manage and process semi-structured and unstructured data (e.g., JSON, XML, log files, free text).
- Collaborate with data analysts, data scientists, and business teams to ensure data availability and accuracy.
- Ensure data quality, performance, and governance across the data pipeline.
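To illustrate the API-ingestion work described above, the sketch below flattens nested JSON records from a REST payload into tabular rows suitable for bulk loading into a Snowflake staging table. This is a minimal, illustrative example: the payload shape, field names, and the `flatten_record` helper are assumptions, not part of any specific client pipeline.

```python
import json

def flatten_record(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON record into a single-level
    dict, so each key can map to one column in a staging table."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten_record(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# Example payload as it might arrive from a REST API (illustrative).
payload = json.loads("""
[
  {"id": 1, "user": {"name": "Ada", "region": "EMEA"}, "amount": 42.5},
  {"id": 2, "user": {"name": "Lin", "region": "APAC"}, "amount": 17.0}
]
""")

rows = [flatten_record(r) for r in payload]
# Each row is now flat, e.g. {"id": 1, "user_name": "Ada",
# "user_region": "EMEA", "amount": 42.5}, ready for a bulk load
# (e.g. COPY INTO) into a Snowflake staging table.
```

In practice, deeply nested or variable-schema payloads are often landed as-is into a Snowflake VARIANT column and flattened in SQL instead; which approach fits depends on the pipeline.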
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience as a Data Engineer or in a similar role.
- Snowflake expertise is a must, including hands-on experience with Snowflake SQL and architecture.
- Strong knowledge of ETL processes and hands-on experience with tools such as Informatica, Talend, Apache NiFi, or dbt.
- Proven ability to write and debug complex SQL queries.
- Experience in data migration from REST APIs to Snowflake.
- Familiarity with managing and analyzing semi-structured and unstructured data formats.
- Strong problem-solving and communication skills.
Nice to have:
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Exposure to CI/CD pipelines for data engineering workflows.
- Knowledge of Python or other scripting languages for data manipulation.