Senior Data Operations Engineer


Job Description

Oportun (Nasdaq: OPRT) is a mission-driven fintech company focused on empowering members with financial tools. As a Senior Data Operations Engineer, you will design, implement, and maintain optimal database schemas for relational (MariaDB) and NoSQL (MongoDB) databases. Key responsibilities include performance monitoring and tuning, ensuring security and compliance, managing backup and recovery solutions, and supporting ETL pipelines. You will also administer MariaDB, MongoDB, and Databricks environments, collaborate with various engineering teams, and maintain documentation.
Must Have:
  • Design, implement, and maintain optimal database schemas for relational (MariaDB) and NoSQL (MongoDB) databases.
  • Monitor and tune performance across all platforms.
  • Implement access controls, encryption, and database hardening techniques.
  • Implement and maintain backup/recovery solutions for all database platforms.
  • Support and optimize ETL pipelines between MongoDB, MariaDB, and Databricks.
  • Set up and monitor database alerts; troubleshoot incidents and resolve outages.
  • Administer MariaDB instances, optimize SQL queries, and manage schema migrations.
  • Manage MongoDB replica sets and sharded clusters, perform capacity planning.
  • Manage Databricks workspace permissions and clusters, optimize Spark jobs and Delta Lake usage.
  • Collaborate with developers, data scientists, and DevOps engineers.
  • Proficiency with MariaDB Tools: mysqldump, mysqladmin, Percona Toolkit.
  • Proficiency with MongoDB Tools: mongodump, mongotop, mongoexport, Atlas UI.
  • Proficiency with Databricks Tools: Jobs UI, Databricks CLI, REST API, SQL Analytics.
  • Scripting skills: Bash, Python, PowerShell.
  • Experience with Monitoring tools: Prometheus, Grafana, CloudWatch, DataDog.
  • Experience with Version Control & CI/CD: Git, Jenkins, Terraform.
  • Preferred cloud provider: AWS.


ABOUT OPORTUN

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $19.7 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

WORKING AT OPORTUN

Working at Oportun means enjoying a differentiated experience as part of a team that fosters a diverse, equitable, and inclusive culture, where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and our ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.

RESPONSIBILITIES

Database Design & Architecture

  • Design, implement, and maintain optimal database schemas for relational (MariaDB) and NoSQL (MongoDB) databases.
  • Participate in data modeling efforts to support analytics in Databricks.

Performance Monitoring & Tuning

  • Monitor and tune all platforms to ensure optimal performance.
  • Use profiling tools (e.g., query plans, system logs) to identify and resolve bottlenecks.
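For illustration, a first pass at this kind of triage might aggregate captured query latencies to pick tuning candidates. This is a minimal sketch, assuming a simple (query, duration_ms) log sample; the query texts are hypothetical.

```python
# Rank captured queries by total time to surface tuning candidates.
from collections import defaultdict

def slowest_queries(samples, top_n=3):
    """Aggregate total and mean latency per normalized query text."""
    totals = defaultdict(lambda: [0.0, 0])  # query -> [total_ms, count]
    for query, duration_ms in samples:
        agg = totals[query.strip().lower()]
        agg[0] += duration_ms
        agg[1] += 1
    # Sort by total time descending; return (query, total_ms, mean_ms).
    ranked = sorted(
        ((q, t, t / n) for q, (t, n) in totals.items()),
        key=lambda row: row[1],
        reverse=True,
    )
    return ranked[:top_n]

samples = [
    ("SELECT * FROM loans WHERE member_id = ?", 120.0),
    ("SELECT * FROM loans WHERE member_id = ?", 95.0),
    ("SELECT COUNT(*) FROM payments", 40.0),
]
```

Queries surfaced this way would then be examined with `EXPLAIN` output and system logs, as the bullet above describes.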

Security & Compliance

  • Implement access controls, encryption, and database hardening techniques.
  • Manage user roles and privileges across MariaDB, MongoDB, and Databricks.
  • Ensure compliance with data governance policies (e.g., GDPR, HIPAA).

Backup & Recovery

  • Implement and maintain backup/recovery solutions for all database platforms.
  • Periodically test restore procedures for business continuity.
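As a sketch of the backup side of this duty, the snippet below builds a `mysqldump` command (using standard mysqldump options) and prunes backups past a retention window. Host, database, and path names are placeholders, and the retention policy is illustrative.

```python
import datetime

def dump_command(host, database, out_dir):
    """Build a consistent logical-backup command for one database."""
    stamp = datetime.date.today().isoformat()
    outfile = f"{out_dir}/{database}-{stamp}.sql.gz"
    cmd = [
        "mysqldump",
        f"--host={host}",
        "--single-transaction",  # consistent snapshot for InnoDB
        "--routines",
        "--triggers",
        database,
    ]
    return cmd, outfile

def expired(backup_dates, today, retention_days=14):
    """Return backup dates older than the retention window."""
    cutoff = today - datetime.timedelta(days=retention_days)
    return [d for d in backup_dates if d < cutoff]
```

The restore-test half of the bullet would replay such a dump into a scratch instance and verify row counts, which is what "periodically test restore procedures" means in practice.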

Data Integration & ETL Support

  • Support and optimize ETL pipelines between MongoDB, MariaDB, and Databricks.
  • Work with data engineers to integrate data sources for analytics.
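A common step in pipelines between MongoDB and relational targets is flattening nested documents into flat rows. The helper below is a hypothetical sketch of that transform; the field names are illustrative, not from any Oportun schema.

```python
def flatten(doc, prefix=""):
    """Flatten a nested document into a single-level row dict,
    joining nested keys with underscores (e.g. addr.city -> addr_city)."""
    row = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten(value, prefix=f"{name}_"))
        else:
            row[name] = value
    return row
```

Rows produced this way can be bulk-loaded into MariaDB or landed as a table for Databricks to pick up.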

Monitoring & Incident Response

  • Set up and monitor database alerts.
  • Troubleshoot incidents, resolve outages, and perform root cause analysis.
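The alerting logic behind this duty usually lives in Prometheus or DataDog, but the core check can be sketched in a few lines. Metric names and thresholds below are hypothetical.

```python
def evaluate_alerts(metrics, thresholds):
    """Return (metric, value, limit) for every breached threshold."""
    return [
        (name, metrics[name], limit)
        for name, limit in thresholds.items()
        if metrics.get(name, 0) > limit
    ]

current = {"replication_lag_s": 45, "cpu_pct": 60}
limits = {"replication_lag_s": 30, "cpu_pct": 90}
```

In production the same comparison would be expressed as a Prometheus alerting rule or a DataDog monitor rather than inline Python.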

MariaDB-Specific Responsibilities

  • Administer MariaDB instances (standalone, replication, Galera Cluster).
  • Optimize SQL queries and indexing strategies.
  • Maintain stored procedures, functions, and triggers.
  • Manage schema migrations and upgrades with minimal downtime.
  • Ensure ACID compliance and transaction management.
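The transaction-management bullet above can be sketched as an explicit commit/rollback pattern. This example uses Python's built-in sqlite3 as a stand-in so it runs anywhere; against MariaDB the same shape applies through a connector such as mysql-connector or PyMySQL. Table and account IDs are illustrative.

```python
import sqlite3

def transfer(conn, src, dst, amount):
    """Move funds atomically: both updates commit together or neither does."""
    try:
        conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src)
        )
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst)
        )
        conn.commit()
    except Exception:
        conn.rollback()  # leave the database in its pre-transaction state
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
conn.commit()
transfer(conn, 1, 2, 25.0)
```

The rollback branch is what makes partial failures safe: if the second update raises, the first is undone as well.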

MongoDB-Specific Responsibilities

  • Manage replica sets and sharded clusters.
  • Perform capacity planning for large document collections.
  • Tune document models and access patterns for performance.
  • Set up and monitor MongoDB Ops Manager / Atlas (if used).
  • Automate backup and archival strategies for NoSQL data.
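Capacity planning for sharded clusters often starts with a back-of-the-envelope shard estimate. The sketch below is illustrative: the per-shard ceiling and headroom factor are hypothetical planning inputs, not MongoDB limits.

```python
import math

def shards_needed(doc_count, avg_doc_bytes, max_shard_gb=512, headroom=0.3):
    """Estimate shard count from collection size plus growth headroom."""
    total_gb = doc_count * avg_doc_bytes / 1024**3
    return max(1, math.ceil(total_gb * (1 + headroom) / max_shard_gb))
```

Real planning would also weigh working-set size against RAM, chunk balance, and the choice of shard key.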

Databricks-Specific Responsibilities

  • Manage Databricks workspace permissions and clusters.
  • Collaborate with data engineers to optimize Spark jobs and Delta Lake usage.
  • Ensure proper data ingestion, storage, and transformation in Databricks.
  • Support CI/CD deployment of notebooks and jobs.
  • Integrate Databricks with external data sources (MariaDB, MongoDB, S3, ADLS).
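Much of the job-orchestration work above goes through the Databricks Jobs REST API. The helper below only builds the `run-now` request payload (no network call), so the shape is easy to see; the job ID and notebook parameter names are placeholders, and in practice the payload would be POSTed to the Jobs API endpoint with a bearer token.

```python
def run_now_payload(job_id, notebook_params=None):
    """Build a Jobs API run-now request body for an existing job."""
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return payload
```

The same payload can be submitted via the Databricks CLI, which wraps this REST API.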

Collaboration & Documentation

  • Collaborate with developers, data scientists, and DevOps engineers.
  • Maintain up-to-date documentation on data architecture, procedures, and standards.
  • Provide training or onboarding support for other teams on database tools.

REQUIREMENTS

  • MariaDB Tools: mysqldump, mysqladmin, Percona Toolkit
  • MongoDB Tools: mongodump, mongotop, mongoexport, Atlas UI
  • Databricks Tools: Jobs UI, Databricks CLI, REST API, SQL Analytics
  • Scripting: Bash, Python, PowerShell
  • Monitoring: Prometheus, Grafana, CloudWatch, DataDog
  • Version Control & CI/CD: Git, Jenkins, Terraform (for infrastructure-as-code)
  • Preferred cloud provider: AWS
