In this role you'll architect, implement, and optimize data solutions for a small, highly effective engineering team while providing technical leadership on data strategy, identifying new data opportunities within the Reliability department, proposing organization-wide data platforms, and building new analytics:

- Design and build scalable data pipelines and ETL processes to ingest, transform, and manage hardware test data from multiple sources
- Lead the migration of legacy data warehouse components to modern cloud-native solutions while ensuring minimal downtime, data integrity, and a seamless transition for end users
- Build and integrate cutting-edge AI/ML solutions that drive decision-making, automate workflows, and surface novel insights from complex datasets
- Design and optimize data warehouse schemas, dimensional models, and indexing strategies for large-scale hardware test datasets, ensuring efficient storage and high-performance query execution
- Develop and maintain robust, fault-tolerant data processing workflows to handle high-volume test data ingestion, transformation, and validation with appropriate error handling and recovery mechanisms
- Implement comprehensive data quality frameworks, validation rules, and monitoring systems to ensure accuracy, completeness, and reliability of critical metrics and analytics
- Continuously analyze and optimize data warehouse performance through query tuning, resource allocation, cost management, and capacity planning to support growing test data volumes
- Partner with reliability engineers, cross-functional software teams, and business stakeholders to understand data requirements and deliver analytics-ready datasets that enable data-driven insights and decision-making