Data Engineering

Job Description

Supabase is seeking an experienced data engineer to join their growing team. This role involves building and maintaining data infrastructure for analytics, machine learning, and business intelligence. The successful candidate will work with a stack including BigQuery, Airflow, dbt, and GCP, impacting internal operations and the developer community through data-driven insights and products. The position is fully remote and emphasizes autonomy and open-source commitment.

Supabase is the Postgres development platform, built by developers for developers. We provide a complete backend solution including Database, Auth, Storage, Edge Functions, Realtime, and Vector Search. All services are deeply integrated and designed for growth.

Supabase is looking for an experienced data engineer to join our growing data engineering team. You'll be responsible for building and maintaining our data infrastructure that powers analytics, machine learning, and business intelligence across the company. Your work will directly impact both our internal operations and our developer community through data-driven insights and products.

We believe in giving engineers the autonomy to work efficiently while maintaining high performance standards. As we scale rapidly, we're seeking teammates who share our commitment to open source, know how to ship impactful features, and will contribute to our developer-focused culture.

The Stack

  • Data Warehouse: BigQuery (primary), PostgreSQL
  • Orchestration: Apache Airflow (Cloud Composer), Meltano
  • Data Modeling: dbt
  • Infrastructure: Google Cloud Platform, Pulumi
  • Analytics: Hex, PostHog
  • Reverse ETL: Hightouch
  • Languages: Python, SQL

The Role

Core Responsibilities

Data Pipeline Development & Maintenance

  • Design, build, and maintain scalable ETL/ELT pipelines using Airflow and Meltano
  • Develop and optimize dbt models following our established data warehouse architecture
  • Implement data quality monitoring, testing, and alerting across all pipelines
  • Manage data ingestion from 15+ sources including GitHub, HubSpot, Stripe, PostHog, Sentry, and internal PostgreSQL databases
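
To give a flavor of the data-quality work above, here is a minimal sketch of row-level validation that could run as a pipeline step. The check names and field names are illustrative, not Supabase's actual checks:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def check_not_null(rows, column):
    """Fail if any row has a None/NULL in the given column."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return CheckResult(f"not_null:{column}", nulls == 0, f"{nulls} null values")

def check_row_count(rows, minimum):
    """Fail if the batch is suspiciously small (e.g. a broken extract)."""
    return CheckResult("row_count", len(rows) >= minimum, f"{len(rows)} rows")

def run_checks(rows, checks):
    """Run every check; return all results plus the failures to alert on."""
    results = [check(rows) for check in checks]
    failures = [r for r in results if not r.passed]
    return results, failures

# Example batch, e.g. events extracted from a source like Stripe or PostHog.
batch = [{"id": 1, "amount": 10}, {"id": 2, "amount": None}]
results, failures = run_checks(batch, [
    lambda rows: check_not_null(rows, "amount"),
    lambda rows: check_row_count(rows, 1),
])
# The not_null check on "amount" trips; a real pipeline would page or alert here.
```

In production this pattern is usually expressed as dbt tests or Airflow task-level checks rather than hand-rolled functions, but the shape of the work is the same.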

Infrastructure & Operations

  • Manage BigQuery datasets, reservations, and slot allocation across dev/staging/prod environments
  • Deploy and maintain Airflow DAGs using Cloud Composer with custom Docker images
  • Implement infrastructure as code using Pulumi for GCP resources
  • Monitor pipeline performance and optimize for cost and efficiency
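
The Airflow work above comes down to declaring task dependencies as a DAG and letting the scheduler order them. A stdlib sketch of that idea, using `graphlib` rather than Airflow itself, with hypothetical task names:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract from sources, load into the warehouse,
# transform with dbt, then run quality tests. Each key lists the tasks
# it depends on, just as Airflow operators declare upstream tasks.
dag = {
    "extract_github": set(),
    "extract_stripe": set(),
    "load_bigquery": {"extract_github", "extract_stripe"},
    "dbt_run": {"load_bigquery"},
    "dbt_test": {"dbt_run"},
}

# A valid execution order: extracts first (in either order),
# then load, then model, then test.
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and backfills on top, but the dependency resolution it performs is exactly this topological sort.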

Data Architecture & Modeling

  • Build and maintain our multi-layered data warehouse architecture with standardized naming conventions
  • Design and implement data governance policies and documentation standards
  • Optimize BigQuery performance through partitioning, clustering, and materialization strategies
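
As a sketch of the partitioning and clustering work above, the snippet below builds a BigQuery DDL statement for a date-partitioned, clustered table. Table and column names are illustrative, not Supabase's real schema:

```python
def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Build BigQuery DDL for a date-partitioned, clustered table.

    Partitioning on a DATE column prunes scanned bytes for date-filtered
    queries; clustering sorts data within each partition by the given columns.
    """
    clustering = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  event_id STRING,\n"
        f"  user_id STRING,\n"
        f"  event_date DATE\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clustering}"
    )

ddl = partitioned_table_ddl("analytics.events", "event_date", ["user_id", "event_id"])
```

In practice these settings usually live in dbt model configs (`partition_by`, `cluster_by`) rather than hand-written DDL, but the generated SQL takes this form.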

Reverse ETL & Data Activation

  • Develop and maintain reverse ETL pipelines to sync data to HubSpot, Customer.io, and other downstream systems
  • Build attribution models and customer journey analytics
  • Create automated triggers for sales and marketing outreach based on data milestones
  • Implement data quality checks and monitoring for all reverse ETL processes
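
A minimal sketch of the reverse-ETL loop described above: push warehouse rows downstream while gating each row on basic quality checks. The `send` callable stands in for a real destination client (e.g. a HubSpot or Customer.io API call via Hightouch); field names are illustrative:

```python
def sync_to_crm(rows, send, required_fields=("email",)):
    """Push warehouse rows to a downstream system, quarantining bad rows.

    `send` is a placeholder for a real destination client; rows missing any
    required field are held back and surfaced in monitoring instead of synced.
    """
    synced, rejected = [], []
    for row in rows:
        if all(row.get(f) for f in required_fields):
            send(row)
            synced.append(row)
        else:
            rejected.append(row)  # report these via data-quality alerting
    return synced, rejected

sent = []  # collects rows "delivered" by our stand-in client
rows = [
    {"email": "dev@example.com", "plan": "pro"},
    {"email": None, "plan": "free"},  # fails the quality gate
]
synced, rejected = sync_to_crm(rows, sent.append)
```

Real reverse-ETL tools add batching, retries, and field mapping, but the sync-plus-quality-gate shape is the core of it.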

Collaboration & Documentation

  • Work closely with Analytics Engineers, Data Scientists, and business stakeholders
  • Maintain comprehensive documentation for all data models and pipelines
  • Participate in code reviews and establish best practices for the team
  • Support other team members in using new datasets and analytical tools

Your Experience

Required Skills

  • 3+ years of production experience with Python and SQL
  • Strong experience with dbt for data modeling and transformation
  • Experience with Apache Airflow for workflow orchestration
  • Proficiency with cloud data warehouses (BigQuery preferred, but Snowflake/Redshift acceptable)
  • Experience with infrastructure as code (Terraform, Pulumi, or similar)
  • Strong understanding of data warehouse design patterns and best practices
  • Experience with Git, CI/CD pipelines, and collaborative development workflows

Bonus Points

  • Experience with Pulumi for infrastructure management
  • Knowledge of PostHog, HubSpot, or other marketing/sales tools
  • Experience with real-time data processing and streaming
  • Background in developer tools or B2B SaaS companies
  • Contributions to open source data engineering projects

What We Offer

  • Fully Remote

We hire globally. We believe you can do your best work from anywhere. There are no Supabase offices, but we provide a WeWork membership or co-working allowance you can use anywhere in the world.

  • ESOP

Every team member receives ESOP (equity ownership) in the company. We want everyone to share in the upside of what we’re building together.

  • Tech Allowance

Use this budget to set up your ideal work environment—laptop, monitor, headphones, or whatever helps you do your best work.

  • Health Benefits

Supabase covers 100% of health insurance for employees and 80% for dependents, wherever you are. Your wellbeing and your family’s health are important to us.

  • Annual Off-Sites

Once a year, the entire company gathers in a new city for a week of connection, collaboration, and fun. It’s a highlight of our year.

  • Flexible Work

We operate asynchronously and trust you to manage your own time. You know what needs to be done and when.

  • Professional Development

Every team member receives an annual education allowance to spend on learning—courses, books, conferences, or anything that supports your growth.

About the Team

Supabase was born remote and open-source first. We believe our globally distributed team is our secret weapon in building tools developers love.

  • 120+ team members
  • 35+ countries
  • 15+ languages spoken
  • $396M raised
  • 350,000+ community members
  • 20,000+ memes posted (and counting)

We move fast, build in public, and use what we ship. If it’s in your project, we probably use it in ours too. We believe deeply in the open-source ecosystem and strive to support—not replace—existing tools and communities.

Hiring Process

We keep things simple, async-friendly, and respectful of your time:

1. Apply – Our team will review your application.

2. Intro Call – A short video chat to get to know each other.

3. Interviews – Up to four calls with:

  • Founders
  • Future teammates
  • Someone cross-functional from product, growth, or engineering (depending on the role)

4. Decision – We may follow up with a final question or go straight to offer.

All communication is remote and we aim to move fast.
