Technical Program Manager, Data Lake

Ansira

Job Summary

Ansira is consolidating and scaling a unified, enterprise Data Lake to integrate product, media, and business data, standardize reporting, and accelerate decision-making. The Technical Program Manager will lead this cross-functional program end-to-end: aligning roadmaps, driving ingestion and migration, maturing data governance and quality, and ensuring business adoption of self-serve analytics. This is a high-visibility role with a mandate to unify Ansira’s data foundation and improve insights across products and services through a modern, governed data platform.

Must Have

  • Lead the end-to-end Data Lake program, including roadmap, budget, and risks.
  • Drive migration from legacy data systems to a modern cloud data stack.
  • Define and manage Data Lake program backlog and data contracts.
  • Establish and monitor data governance, quality, and security practices.
  • Orchestrate work across Product, Engineering, Data Science, and Business teams.
  • Manage change adoption for standardized reporting and data usage.
  • Possess technical fluency in modern data stacks (Snowflake, Azure, dbt).
  • 8+ years in Program/Project/Product Management, 5+ years in data platforms.
  • Strong experience with Agile methodologies and project tools (Jira, Confluence).
  • Competent in SQL for data validation and triage.

Good to Have

  • Background in marketing/media data and standardized performance reporting.
  • Experience migrating from legacy ETL/BI ecosystems (Alteryx/Insighter/Tableau).
  • Experience establishing data domains and productizing data (SLAs, contracts, versioning).
  • Familiarity with privacy, security, and compliance standards (RBAC/ABAC, PII governance).
  • FinOps mindset: cost observability, unit economics, right-sizing compute/storage.

Job Description

Role Summary

Ansira is consolidating and scaling a unified, enterprise Data Lake to integrate product, media, and business data, standardize reporting, and accelerate decision-making across the organization. We are seeking a Technical Program Manager (TPM) to lead this cross-functional program end-to-end — aligning product and engineering roadmaps, driving ingestion and migration from legacy systems, maturing data governance and quality, and ensuring business adoption of standardized, self-serve analytics.

This leader will orchestrate work across Ansira’s Product Solutions, Data Engineering, Data Science/BI, Media, and Client Partnership teams, with a clear mandate: deliver consistent, governed, and performant data to downstream products and reporting while deprecating redundant systems and minimizing operational cost.

What You’ll Do

Program Leadership & Delivery

  • Own the multi-quarter program plan for the unified Data Lake: scope, roadmap, milestones, budgets (OPEX/CAPEX), risks, and dependencies.
  • Stand up and run the operating model: weekly workstream standups, cross-functional syncs, monthly steering committee, and a transparent executive status rhythm.
  • Build and maintain a single source of truth for delivery: program charter, RACI, RAID log, decision log, intake/triage process, and dashboards for progress/risks.
  • Drive the migration plan from legacy pipelines and tools (e.g., Alteryx, Insighter) to the target stack (e.g., Snowflake, Power BI embedded via platform connectors).
  • Coordinate parallel workstreams (ingestion, modeling, governance, reporting cutover) to hit time-bound deliverables with predictable quality.

Product Management & Roadmap

  • Define and maintain the Data Lake program backlog, translating business use cases into technical epics, data contracts, and acceptance criteria.
  • Partner with Product and Data Science teams to standardize media and product reporting packages and ensure they’re backed by governed, contract-driven data.
  • Prioritize sources and domains for ingestion based on business value, client impact, and technical feasibility; establish clear go/no-go gates.
  • Align with platform architecture to ensure scalable patterns for batch/stream ELT/CDC, cost control, observability, and reusability across domains.
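
Where helpful, here is a minimal sketch of the batch CDC upsert pattern referenced above, in Snowflake-style SQL. The tables and columns (raw.orders_cdc, analytics.orders, op, change_ts) are hypothetical stand-ins, not Ansira’s actual schema:

    -- Hypothetical batch CDC upsert: apply the latest change per key from a
    -- raw CDC feed into the governed target table.
    MERGE INTO analytics.orders AS tgt
    USING (
        -- Keep only the most recent change record per order.
        SELECT *
        FROM raw.orders_cdc
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY order_id ORDER BY change_ts DESC
        ) = 1
    ) AS src
    ON tgt.order_id = src.order_id
    WHEN MATCHED AND src.op = 'D' THEN DELETE
    WHEN MATCHED THEN UPDATE SET
        status = src.status,
        amount = src.amount,
        updated_at = src.change_ts
    WHEN NOT MATCHED AND src.op <> 'D' THEN INSERT
        (order_id, status, amount, updated_at)
        VALUES (src.order_id, src.status, src.amount, src.change_ts);

The same shape generalizes to streaming ingestion (e.g., via streams/tasks) or to an orchestrated dbt incremental model, which is why it is worth standardizing as a reusable template.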

Data Governance, Quality, and Security

  • Establish practical data contracts with upstream product and business owners; define schema, SLAs, lineage, and DQ checks at ingestion (illustrated in the sketch after this list).
  • Stand up governance ceremonies and roles (data owners, stewards) and implement data catalog/lineage practices to improve discoverability and trust.
  • Define and monitor quality KPIs (completeness, timeliness, accuracy) and drive remediation plans with accountable teams.
  • Ensure data privacy, compliance, and security best practices (e.g., PII handling, role-based access, data masking) across environments.
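
To make the ingestion-time DQ checks concrete, a minimal Snowflake-style sketch covering completeness, timeliness, and accuracy; the source table, columns, SLA, and tolerance (raw.media_spend, loaded_at, 6 hours, 0.5%) are illustrative assumptions:

    -- Completeness: contracted key columns must not be null.
    SELECT COUNT(*) AS null_key_rows
    FROM raw.media_spend
    WHERE campaign_id IS NULL;

    -- Timeliness: flag the feed if it is staler than the contracted 6-hour SLA.
    SELECT DATEDIFF('hour', MAX(loaded_at), CURRENT_TIMESTAMP()) AS hours_stale
    FROM raw.media_spend;

    -- Accuracy: total spend should reconcile with the upstream control total
    -- (assumes a one-row control table) within an agreed 0.5% tolerance.
    SELECT ABS(f.total_spend - c.control_total) / c.control_total AS pct_variance
    FROM (SELECT SUM(spend) AS total_spend FROM raw.media_spend) AS f
    CROSS JOIN raw.media_spend_control AS c;

In practice, checks like these would typically run as dbt tests or orchestrated gates that block promotion of a source until its contract is met.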

Stakeholder Management & Change Adoption

  • Serve as the connective tissue across Product, Engineering, Data Science, Media, Finance, and Client Partnership — communicating decisions, trade-offs, and timelines.
  • Lead change management for reporting standardization (e.g., standard Media (AdTech/LBN) reports), business onboarding to the lake, and client-facing cutovers.
  • Create enablement assets (runbooks, playbooks, onboarding guides) and training plans to accelerate adoption and reduce support burden.

Technical Fluency

  • Partner effectively with architects and data engineers on Snowflake/BigQuery/Databricks, Azure/AWS/GCP services, orchestration (ADF/Airflow), and transformation (dbt).
  • Understand ELT/CDC patterns, API/file ingestion, schema design for analytics, and BI tooling (Power BI, Looker). Write and review basic SQL for validation.
  • Apply FinOps and performance/cost optimization practices (storage tiers, compute sizing, job scheduling, caching strategies).
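
For the cost-observability side of FinOps, one plausible starting point in Snowflake, assuming access to the ACCOUNT_USAGE share (the 90-day window is arbitrary):

    -- Weekly compute credits by warehouse over the last 90 days; feeds
    -- right-sizing and unit-economics discussions per workload.
    SELECT
        warehouse_name,
        DATE_TRUNC('week', start_time) AS usage_week,
        SUM(credits_used)              AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -90, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name, usage_week
    ORDER BY usage_week, credits DESC;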

Minimum Qualifications

  • 8+ years in Program/Project/Product Management, with 5+ years leading complex data platform initiatives in a cloud environment.
  • Proven delivery of cross-functional data programs involving multiple product lines and business stakeholders; strong executive communication.
  • Hands-on experience with modern data stacks: one or more of Snowflake/BigQuery/Databricks; Azure Data Factory/Airflow; dbt; Kafka/Kinesis; Git/Terraform; REST/SFTP integrations.
  • Strong grounding in data governance and quality practices, data contracts, catalog/lineage, and secure data access.
  • Demonstrated expertise in Agile at scale (Scrum/Kanban), Jira/Confluence, dependency/risk management, and budget tracking (including CAPEX/OPEX).
  • Competent SQL skills for validation/triage; fluency in reading pipeline/log artifacts and interpreting BI/semantic model requirements.
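
As an example of the validation/triage SQL this implies, a parity check that compares daily row counts between a legacy output and its lake replacement during a cutover (both table names are hypothetical):

    -- Surface any date where the legacy feed and the lake model disagree
    -- on row counts; non-empty results are triage items before cutover.
    SELECT
        COALESCE(l.report_date, n.report_date) AS report_date,
        l.row_count AS legacy_rows,
        n.row_count AS lake_rows,
        COALESCE(n.row_count, 0) - COALESCE(l.row_count, 0) AS delta
    FROM (SELECT report_date, COUNT(*) AS row_count
          FROM legacy.daily_performance
          GROUP BY report_date) AS l
    FULL OUTER JOIN
         (SELECT report_date, COUNT(*) AS row_count
          FROM analytics.daily_performance
          GROUP BY report_date) AS n
      ON l.report_date = n.report_date
    WHERE COALESCE(l.row_count, 0) <> COALESCE(n.row_count, 0)
    ORDER BY report_date;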

Preferred Qualifications

  • Background in marketing/media data and standardized performance reporting (e.g., Media (AdTech/LBN), campaign hierarchies, Power BI embedded).
  • Prior experience migrating from legacy ETL/BI ecosystems (e.g., Alteryx/Insighter/Tableau) to a lakehouse with standardized semantic layers.
  • Experience establishing data domains and productizing data (SLAs, contracts, versioning, lifecycle) to accelerate downstream analytics.
  • Familiarity with privacy, security, and compliance standards (e.g., RBAC/ABAC, PII governance) and enterprise SSO/permissions models for embedded analytics (a masking sketch follows this list).
  • FinOps mindset: cost observability, unit economics, and right-sizing compute/storage.
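
For the RBAC/PII point above, a minimal sketch of column-level masking plus a read-only grant in Snowflake; the role, schema, and column names are placeholders:

    -- Mask email addresses for everyone except an approved PII role.
    CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
        CASE
            WHEN CURRENT_ROLE() IN ('PII_ANALYST') THEN val
            ELSE '***MASKED***'
        END;

    ALTER TABLE analytics.customers
        MODIFY COLUMN email SET MASKING POLICY email_mask;

    -- RBAC: reporting users read governed tables but never raw PII.
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics TO ROLE reporting_reader;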

Success Metrics (What Great Looks Like)

  • Sources Onboarded: number of prioritized sources/domains ingested to the lake with production-grade contracts and SLAs.
  • Time-to-Data: cycle time from intake approval to governed data available for downstream consumption.
  • Data Quality & Reliability: sustained improvement in DQ scorecards; incident MTTR reduction; SLA adherence for freshness and availability.
  • Migration Progress: percent of targeted legacy pipelines and reports decommissioned; client reporting cutovers delivered on schedule.
  • Adoption & Reuse: growth in standardized reporting/package usage; reduction in ad hoc one-off pipelines.
  • Cost & Performance: measurable storage/compute cost per use case; query performance improvements aligned to agreed SLOs.

30/60/90-Day Plan

  • 30 Days: Confirm program charter, governance model, and delivery rhythm. Baseline current-state architecture, sources, and reporting dependencies. Publish the first integrated roadmap/milestone plan and RAID.
  • 60 Days: Land 1–2 ingestion patterns as reusable templates (contracts, lineage, observability). Execute the first cutover to standardized reporting for a targeted use case. Stand up DQ scorecards and weekly KPI review.
  • 90 Days: Complete a tranche of source onboardings and at least one significant reporting migration. Retire targeted legacy jobs. Publish quarterly executive readout with outcomes, cost/performance improvements, and next-wave priorities.

Tools You’ll Use

  • Jira, Confluence, Power BI; Snowflake/BigQuery/Databricks; Azure/AWS/GCP services (e.g., ADF, IAM); Airflow; dbt; Git; catalog/lineage tools.

Why This Role

This is a high-visibility opportunity to unify Ansira’s data foundation and materially improve the speed, quality, and consistency of insights across products and services. You will own the operating model, drive cross-functional alignment, and deliver tangible business outcomes through a modern, governed data platform.

Skills Required For This Role

Team Management, Cross-Functional Collaboration, Risk Management, Budget Management, GitHub, Kanban, Agile Development, AWS, Azure, Terraform, Power BI, Looker, Tableau, Data Science, Confluence, Git, SQL, Jira
