ComboCurve

via Indeed

All our jobs are verified from trusted employers and sources. We connect to legitimate platforms only.

Database Administrator, MongoDB

Anywhere
Full-time
Posted 12/3/2025
Verified Source
Key Skills:
MongoDB
Replica sets and clusters
Performance tuning
Backup and disaster recovery
Observability tools (Datadog, Atlas)
Relational DB support (PostgreSQL, Snowflake)
ETL/ELT jobs
JavaScript/Node.js
SQL
Python

Compensation

Salary Range

$60K–$90K a year

Responsibilities

Manage and optimize MongoDB and relational database infrastructure, develop ETL pipelines, collaborate with engineering teams, and participate in incident response.

Requirements

1-3 years database experience with MongoDB and relational databases, familiarity with JavaScript/Node.js and Python, strong communication, and eagerness to learn.

Full Description

ComboCurve is an industry-leading, cloud-based software solution for A&D, reservoir management, and forecasting in the energy sector. Our platform empowers professionals to evaluate assets, optimize workflows, and manage reserves efficiently, all in one integrated environment. By streamlining data integration and enhancing collaboration, we help operators, engineers, and financial teams make informed decisions faster. Trusted by top energy companies, ComboCurve delivers real-time analytics and exceptional user support, with a world-class customer experience team that responds to inquiries in under 5 minutes.

We're looking for a motivated and detail-oriented Database Administrator to help build and maintain the data systems that power our applications. In this role, you'll support our database infrastructure (primarily MongoDB, with exposure to relational databases and data engineering workflows) while learning from senior engineers and gaining experience in real-world production environments.

Responsibilities

MongoDB ownership
• Design and administer replica sets and clusters (capacity planning, scaling, automation).
• Performance tuning: indexing strategy, query profiling, schema design, aggregation pipelines, concurrency, and connection management.
• Backup/restore, point-in-time recovery, and disaster recovery runbooks.
• Observability: create dashboards/alerts (e.g., Atlas/Cloud Manager, Datadog) and capacity/latency SLOs.
• Run and verify backups/restores; maintain runbooks and checklists.

Relational DB support (secondary)
• Administer an RDBMS (e.g., PostgreSQL or Snowflake): backups, replication, query tuning, indexing, vacuum/maintenance.
• Data modeling, normalization vs. denormalization, and query plan analysis.

Data engineering
• Build and maintain ETL/ELT jobs to move data between MongoDB and relational systems.
• Implement change data capture, data validation, and quality checks.
Development & collaboration
• Write utilities and automation in JavaScript/Node.js; craft complex SQL and MQL queries.
• Review designs with application engineers; advise on data access patterns and schema evolution.
• Participate in an on-call rotation and incident response; drive root-cause analysis and preventive fixes.

Collaboration & communication
• Partner with dev teams to understand feature needs and data access patterns; translate requirements into DB tasks.
• Communicate clearly in standups, PRs, and docs; escalate risks early with actionable proposals.
• Contribute to incident response as a learner; participate in postmortems and document follow-ups.

Requirements
• 1–3 years of experience working with databases.
• Basic understanding of MongoDB concepts such as documents, collections, and indexes.
• Exposure to SQL and relational databases (PostgreSQL or Snowflake preferred).
• Familiarity with JavaScript/Node.js and Python.
• Detail-oriented, organized, and comfortable working in a collaborative engineering environment.
• Excellent communication skills and eagerness to learn from feedback.

Bonus Points
• Hands-on experience with MongoDB Atlas, Datadog, or similar monitoring tools.
• Familiarity with ETL pipelines, data validation, or automation frameworks.
• Exposure to cloud platforms (GCP or Azure).
• Experience writing small automation or data tools using APIs or scripting.

Why You'll Love This Role
• Mentorship from experienced engineers who will help you grow into a database specialist.
• Opportunities to work on real production systems and learn modern data infrastructure from the inside out.
• A collaborative, growth-oriented culture that encourages learning and continuous improvement.
• The chance to make a meaningful impact on the reliability and performance of our platform.
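To give a concrete feel for the day-to-day work described above (indexing strategy, aggregation pipelines, and ETL data-quality checks), here is a rough JavaScript/Node.js sketch. The collection, field names, and `validateDoc` helper are hypothetical illustrations, not taken from ComboCurve's actual schema; with the official `mongodb` driver, the index spec and pipeline below would be passed to `createIndex()` and `aggregate()` respectively.

```javascript
// Hypothetical example: the fields (company, status, postedAt, salary)
// are illustrative only, not ComboCurve's real data model.

// A compound index supporting queries that filter by status and sort by
// recency; with the Node.js driver: collection.createIndex(indexSpec).
const indexSpec = { status: 1, postedAt: -1 };

// An aggregation pipeline: filter to open records, group by company,
// and compute a count and average salary per company.
const pipeline = [
  { $match: { status: "open" } },
  {
    $group: {
      _id: "$company",
      openings: { $sum: 1 },
      avgSalary: { $avg: "$salary" },
    },
  },
  { $sort: { openings: -1 } },
];

// A minimal data-quality check of the kind an ETL/ELT job might run
// before loading documents into a relational target.
function validateDoc(doc) {
  return (
    typeof doc.company === "string" &&
    typeof doc.salary === "number" &&
    doc.salary >= 0
  );
}

console.log(validateDoc({ company: "Acme", salary: 75000 })); // true
console.log(validateDoc({ company: "Acme", salary: "75k" })); // false
```

The same pipeline stages run unchanged against Atlas or a self-hosted cluster, which is why MQL fluency carries across both environments.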

This job posting was last updated on 12/6/2025
