Sr Cloud Big Data Engineer, SMAI

Micron Technology

Job Summary

Micron Technology is seeking a Sr Cloud Big Data Engineer, SMAI, to help innovate memory and storage solutions. The role involves understanding business problems and their data, architecting data management systems, developing and orchestrating ETL processes, and preparing data for analysis. Candidates should have 5+ years of experience delivering data engineering, advanced analytics, or business intelligence solutions, along with expertise in cloud platforms, big data processing, and a range of programming languages and databases.

Must Have

  • Understand business problems and data.
  • Architect and implement data management systems.
  • Develop and orchestrate ETL processes.
  • Prepare data for analysis.
  • 5+ years in data engineering/analytics/BI.
  • Proficiency with MS Office and Unix/Linux environments.
  • Experience with AWS, Azure, GCP, Snowflake.
  • Experience with big data processing and Spark.
  • Knowledge of distributed systems and software architecture.
  • Strong database skills (SQL, NoSQL, Oracle, MSSQL).
  • Strong analytical and communication skills.
  • Experience in global, cross-functional teams.
  • 2+ years in object-oriented languages (C#, Java, Python, Perl).
  • 2+ years in web programming (PHP, MySQL, JavaScript, ASP).
  • Experience with data extraction tools (SSIS, Informatica).
  • Software development experience.
  • Ability to travel.

Perks & Benefits

  • Paid Time Off
  • Continuous Learning
  • Wellness Programs
  • Stock Purchase Program
  • Health Insurance
  • Recognition

Job Description

Our vision is to transform how the world uses information to enrich life for all.

Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate and advance faster than ever.

Responsibilities and Tasks

Understand the Business Problem and the Relevant Data

  • Maintain an intimate understanding of company and department strategy
  • Translate analysis requirements into data requirements
  • Identify and understand the data sources that are relevant to the business problem
  • Develop conceptual models that capture the relationships within the data
  • Define the data-quality objectives for the solution
  • Be a subject matter expert in data sources and reporting options

Architect Data Management Systems

  • Leverage understanding of the business problem and the nature of the data to select the appropriate data management system (Big Data, OLTP, OLAP, etc.)
  • Design and implement optimum data structures in the appropriate data management system (GCP, Snowflake, Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements
  • Plan methods for archiving/deletion of information

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data

  • Identify and select the optimum methods of access for each data source (real-time/streaming, delayed, static)
  • Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model
  • Develop processes to efficiently load the transformed data into the data management system
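The extract-transform-load cycle described above can be sketched, purely for illustration, with Python's standard library. The CSV source, table name, and schema here are hypothetical, not part of the posting:

```python
import csv
import io
import sqlite3

def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV text into records."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and drop rows that fail validation."""
    out = []
    for r in rows:
        try:
            out.append((r["sensor_id"], float(r["reading"])))
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than failing the load
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: bulk-insert transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS readings (sensor_id TEXT, reading REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]

raw = "sensor_id,reading\nA1,3.5\nA2,bad\nA3,7.0\n"
conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(raw)), conn)
print(loaded)  # → 2 (the malformed A2 row is dropped during transform)
```

In practice the role would implement this pattern at scale with tools named in the posting (Apache NiFi, Spark, Snowflake) rather than sqlite3; the sketch only shows the shape of the pipeline.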

Prepare Data to Meet Analysis Requirements

  • Work with the data scientist to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.)
  • Develop and code data extracts
  • Follow best practices to ensure data quality and data integrity
  • Ensure that the data is fit to use for data science applications
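A minimal, illustrative sketch of the cleaning step described above: median imputation for missing values plus a robust outlier fence based on the median absolute deviation. The data and the 3-MAD threshold are hypothetical choices, not prescribed by the posting:

```python
import statistics

def clean(values: list) -> list:
    """Impute missing readings with the median, then clip outliers
    to a fence of median +/- 3 scaled MADs."""
    present = [v for v in values if v is not None]
    med = statistics.median(present)
    imputed = [med if v is None else v for v in values]
    # 1.4826 scales MAD to be comparable to a standard deviation
    mad = statistics.median(abs(v - med) for v in imputed)
    lo, hi = med - 3 * 1.4826 * mad, med + 3 * 1.4826 * mad
    return [min(max(v, lo), hi) for v in imputed]

raw = [10.0, 12.0, None, 11.0, 500.0, 9.0]
cleaned = clean(raw)  # the None is imputed; the 500.0 spike is clipped
```

The actual strategy (drop vs. impute, which outlier rule) would be agreed with the data scientist per analysis, as the bullet above notes.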

Qualifications and Experience

  • 5+ years developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions
  • Ability to work across multiple operating systems and environments (e.g., MS Office, Unix, Linux, etc.)
  • Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake
  • Experienced in cloud-based solutions using AWS/Azure/GCP/Snowflake
  • Significant experience with big data processing and/or developing applications and data sources via Spark, etc.
  • Understanding of how distributed systems work
  • Familiarity with software architecture (data structures, data schemas, etc.)
  • Strong working knowledge of databases (Oracle, MSSQL, etc.), including SQL and NoSQL
  • Strong mathematics background, analytical, problem solving, and organizational skills
  • Strong communication skills (written, verbal and presentation)
  • Experience working in a global, cross-functional environment
  • Minimum of 2 years’ experience in any of the following: at least one high-level, object-oriented language (e.g., C#, C++, Java, Python, Perl); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.)
  • Software development experience
  • Ability to travel as needed

Skills Required For This Role

MS Office, Cross-Functional Communication, Data Analytics, Oracle, C++, Data Structures, MySQL, C#, Linux, AWS, NoSQL, Unix, Azure, Business Intelligence, Hadoop, Spark, Data Science, Python, SQL, Perl, PHP, JavaScript, Java
