Senior Data Engineer

Job Description

The Senior Data Engineer will be responsible for delivering end-to-end data and analytics capabilities, including data ingestion, transformation, data science, and data visualization. This involves designing and deploying databases and data pipelines to support analytics projects. The engineer will develop scalable and fault-tolerant workflows, clearly document issues and solutions, and proficiently apply tools and technologies such as Python, PySpark, SQL, Spark, Snowflake, Airflow, and AWS. They will also be expected to optimize query and dashboard performance, develop briefings for stakeholders, analyze client data, and provide support to other data engineers and analysts. The candidate should be able to lead the team, communicate with the business, and gather and interpret business requirements.
Good To Have:
  • Exposure to Snowflake and Airflow.
  • Other programming languages (R, Scala, SAS, Java, etc.)
  • AWS Solutions Architect / Developer / Data Analytics certifications.
Must Have:
  • Expert experience in SQL, Python, and PySpark.
  • Experience with data and analytics technologies like SQL/NoSQL databases.
  • Knowledge of CI/CD tools (GitLab, AWS CodeCommit).
  • Experience with AWS services (EMR, Glue, Athena, etc.).
  • Solid scripting skills (bash/shell scripts, Python).
  • Experience with data streaming technologies.
  • Experience with Big Data technologies like Hadoop, Spark, Hive, etc.

Data Engineer Responsibilities:
  • Deliver end-to-end data and analytics capabilities, including data ingestion, data transformation, data science, and data visualization, in collaboration with Data and Analytics stakeholder groups
  • Design and deploy databases and data pipelines to support analytics projects 
  • Develop scalable and fault-tolerant workflows 
  • Clearly document issues, solutions, findings and recommendations to be shared internally & externally 
  • Learn and apply tools and technologies proficiently, including: 
    • Languages: Python, PySpark, ANSI SQL, Python ML libraries
    • Frameworks/Platforms: Spark, Snowflake, Airflow, Hadoop, Kafka
    • Cloud Computing: AWS
    • Tools/Products: PyCharm, Jupyter, Tableau, Power BI
  • Optimize performance of queries and dashboards
  • Develop and deliver clear, compelling briefings to internal and external stakeholders on findings, recommendations, and solutions 
  • Analyze client data & systems to determine whether requirements can be met 
  • Test and validate data pipelines, transformations, datasets, reports, and dashboards built by the team
  • Develop and communicate solutions architectures and present solutions to both business and technical stakeholders
  • Provide end user support to other data engineers and analysts
Candidate Requirements:
  • Expert experience in the following (should have / good to have):
    • SQL, Python, PySpark, Python ML libraries. Other programming languages (R, Scala, SAS, Java, etc.) are a plus
    • Data and analytics technologies including SQL/NoSQL/Graph databases, ETL, and BI
    • Knowledge of CI/CD and related tools such as GitLab, AWS CodeCommit, etc.
    • AWS services including EMR, Glue, Athena, Batch, Lambda, CloudWatch, DynamoDB, EC2, CloudFormation, IAM, and EDS
    • Exposure to Snowflake and Airflow.
  • Solid scripting skills (e.g., bash/shell scripts, Python) 
  • Proven work experience in the following:
    • Data streaming technologies
    • Big Data technologies including Hadoop, Spark, Hive, Teradata, etc.
    • Linux command-line operations
    • Networking knowledge (OSI network layers, TCP/IP, virtualization) 
  • The candidate should be able to lead the team, communicate with the business, and gather and interpret business requirements
  • Experience with agile delivery methodologies using Jira or similar tools 
  • Experience working with remote teams 
  • AWS Solutions Architect / Developer / Data Analytics Specialty certifications; Professional-level certification is a plus
  • Bachelor's degree in Computer Science or a relevant field; a Master's degree is a plus
