AirOps

via Ashby


Data Engineer

Anywhere
Full-time
Posted 8/12/2025
Direct Apply
Key Skills:
Python
SQL
ETL/ELT pipelines
dbt
Data warehouses (Snowflake, Redshift, BigQuery)
Cloud environments (AWS, GCP)
Orchestration frameworks (Airflow, Dagster, Prefect)

Compensation

Salary Range

$90K - $130K a year

Responsibilities

Design and maintain scalable data ingestion and transformation pipelines, ensure data quality and reliability, collaborate with data scientists and product teams, and optimize cloud data infrastructure.

Requirements

4+ years of data engineering experience with Python, SQL, modern data modeling tools, cloud data warehouses, and orchestration frameworks, plus the ability to work in fast-paced environments.

Full Description

About AirOps

Today thousands of leading brands and agencies use AirOps to win the battle for attention with content that both humans and agents love. We're building the platform and profession that will empower a million marketers to become modern leaders, not spectators, as AI reshapes how brands reach their audiences. We're backed by awesome investors, including Unusual Ventures, Wing VC, Founder Collective, XFund, Village Global, and Alt Capital, and we're building a world-class team with in-person hubs in San Francisco, New York, and Montevideo, Uruguay.

What You'll Own

We're hiring a Data Engineer to design and maintain the high-scale data infrastructure that powers the AirOps platform. You will build robust ingestion, cleanup, and integration pipelines, ensuring that the data our customers rely on for brand visibility insights is accurate, reliable, and ready for analysis.

Responsibilities

Design, build, and maintain scalable ETL/ELT pipelines for ingesting and transforming large volumes of data
Implement automated data validation, monitoring, and alerting to ensure quality and reliability
Integrate diverse internal and external data sources into unified, queryable datasets
Optimize storage and query performance for analytical workloads
Collaborate with data scientists to productionize ML models and ensure they run reliably at scale
Work with product and engineering teams to meet data needs for new features and insights
Maintain cost efficiency and operational excellence in cloud environments

Your Experience

4+ years of experience in data engineering, ideally in AI, SaaS, or data-intensive products
Strong fluency in Python and SQL
Experience with modern data modeling tools such as dbt
Experience with data warehouses and OLAP databases (e.g., Redshift, Snowflake, BigQuery, ClickHouse)
Proven ability to design and maintain production-grade data pipelines in cloud environments (AWS, GCP, or similar)
Familiarity with orchestration frameworks (Airflow, Dagster, Prefect)
Comfort operating in fast-paced, ambiguous environments where you ship quickly and iterate

About You

You love building systems that make data accurate, reliable, and accessible at scale
You think in terms of automation and scalability, not manual workarounds
You collaborate well with data scientists, product managers, and engineers
You enjoy working with large, complex datasets and solving performance challenges
You take pride in operational excellence and care about the quality of the data you deliver

Our Guiding Principles

Extreme Ownership
Quality
Curiosity and Play
Make Our Customers Heroes
Respectful Candor

Benefits

Equity in a fast-growing startup
Competitive benefits package tailored to your location
Flexible time off policy
Generous parental leave
A fun-loving and (just a bit) nerdy team that loves to move fast!

This job posting was last updated on 8/12/2025
