The role requires a Data Architect with 10 years of IT experience and deep, hands-on expertise in AWS services, including S3, Redshift, Aurora, Glue, and Lambda. Key responsibilities and required skills:

- Design and build end-to-end data platforms on AWS.
- Hands-on Python programming, including writing unit tests.
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions.
- Proficiency in Amazon Redshift, including writing stored procedures, using the Redshift Data API, and executing federated queries.
- Redshift performance tuning.
- Strong communication, problem-solving, and stakeholder-management skills.
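As an illustration of the Python and unit-testing expectation in a Redshift context, here is a minimal sketch. The function and test names are hypothetical, invented for this example; it composes a standard Redshift `UNLOAD` statement (export query results to S3 as Parquet) and tests it with the standard-library `unittest` module.

```python
import unittest


def build_unload_statement(query: str, s3_path: str, iam_role: str) -> str:
    """Compose a Redshift UNLOAD statement that exports query results to S3 as Parquet."""
    if not s3_path.startswith("s3://"):
        raise ValueError("s3_path must be an S3 URI")
    # Single quotes inside the query must be doubled to survive the SQL string literal.
    escaped = query.replace("'", "''")
    return (
        f"UNLOAD ('{escaped}') TO '{s3_path}' "
        f"IAM_ROLE '{iam_role}' FORMAT AS PARQUET"
    )


class BuildUnloadStatementTest(unittest.TestCase):
    def test_builds_valid_statement(self):
        sql = build_unload_statement(
            "SELECT * FROM sales",
            "s3://example-bucket/exports/",
            "arn:aws:iam::123456789012:role/redshift-unload",  # hypothetical role ARN
        )
        self.assertIn("UNLOAD ('SELECT * FROM sales')", sql)
        self.assertIn("FORMAT AS PARQUET", sql)

    def test_rejects_non_s3_path(self):
        with self.assertRaises(ValueError):
            build_unload_statement("SELECT 1", "/tmp/out", "some-role")
```

The tests can be run with `python -m unittest <module>`. The same pattern extends naturally to statements submitted through the Redshift Data API or orchestrated from an Airflow task.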