Hilbert's AI

via Ashby


Data Engineer (Core Platform & Optimization)

Anywhere
Full-time
Posted 2/26/2026
Direct Apply
Key Skills:
ClickHouse
Dagster
OLAP
Data Orchestration
Python

Compensation

Salary Range

Not specified

Responsibilities

Optimize ClickHouse clusters, maintain shared code libraries, and architect monitoring systems for data integrity.

Requirements

Expertise in OLAP systems, query optimization, software engineering rigor including CI/CD and testing, and experience with orchestration tools.
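To illustrate the "software engineering rigor" requirement (unit testing for data), here is a minimal pure-Python sketch of a row-level data-quality check. The function name, field names, and rules are hypothetical examples, not Hilbert's actual code:

```python
# Minimal data-quality gate: validate rows before they enter a pipeline.
# All identifiers here are illustrative; the posting only asks for
# "unit testing for data", not this specific design.

def validate_rows(rows):
    """Split rows into (valid, errors) using basic integrity rules."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        if not row.get("user_id"):
            errors.append((i, "missing user_id"))
        elif row.get("revenue", 0) < 0:
            errors.append((i, "negative revenue"))
        else:
            valid.append(row)
    return valid, errors


if __name__ == "__main__":
    sample = [
        {"user_id": "u1", "revenue": 9.99},
        {"user_id": "", "revenue": 5.00},
        {"user_id": "u2", "revenue": -1.00},
    ]
    ok, bad = validate_rows(sample)
    print(f"{len(ok)} valid, {len(bad)} rejected")
```

Checks like this are easy to run both as unit tests in CI/CD and as runtime guards inside a pipeline, which is the kind of dual use the requirement implies.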

Full Description

Hilbert is a scalable, data-science-first growth engine that gives B2C teams predictive clarity into user behavior, revenue drivers, and the actions that drive sustainable growth. Fully agentic by design, Hilbert shrinks months-long decision cycles to minutes. From Fortune 10 enterprises to beloved brands like FreshDirect, Blank Street, and Levain Bakery, operators run their growth on Hilbert. We're also co-building alongside leading AI companies.

We're looking for a Core Data Engineer to build the foundation that allows our integration team to move 10x faster. While others focus on connecting the pipes, you focus on the integrity and flow of the entire refinery. You will be responsible for our internal "shared brain": the common code, the optimization of our ClickHouse clusters, and the monitoring systems that tell us a pipeline failed before the customer even notices.

THE ROLE

- Optimize our ClickHouse performance.
- Create configuration schemas that allow data ingestion pipelines to be controlled rather than custom built.
- Build and maintain the shared Python libraries used by the entire engineering team for data syncs.
- Enhance our canonical data models to ensure they stay "generic" enough to scale but powerful enough to perform.
- Architect our monitoring, alerting, and observability stack so we have "five-nines" confidence in our data.
- Design new integration pipelines by investigating new data sources and different data sync mechanisms.

WHO THRIVES IN THIS ROLE

- The "Optimizer": you enjoy shaving seconds off a query and megabytes off a memory footprint. You think in terms of scale and reliability.
- Dagster Ninja: you have experience with (or a deep desire to master) Dagster for complex orchestration.
- OLAP Expert: you understand the internals of ClickHouse (or similar systems like Druid or Pinot) and how to structure data for maximum analytical performance.
- Software Rigor: you treat data code like production software. CI/CD, unit testing for data, and modular code aren't optional for you.
- Overlap: you can provide at least 5 hours of overlap with the global team.

Bonus Points

- Experience in the e-commerce or retail sectors (understanding what a "SKU" or "attribution window" is without being told).
- Experience with product event usage data.
- Contributions to open-source data tools.
- Experience working with data scientists or ML engineers.
- Experience building a data orchestration and integration engine/tool from scratch.

Location

- San Francisco or Istanbul
- At least 5 hours of overlap with the PST timezone (7am-5pm)

Compensation

- Competitive salary + equity package, commensurate with experience.
- Performance-based bonuses tied to project milestones and customer impact.

The Hiring Journey

Short form → Intro call → Technical working session → Team conversations → Offer

Fast, human, no bureaucracy.
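One responsibility in the description, configuration schemas that let ingestion pipelines be controlled rather than custom built, can be sketched as a small declarative config object. The schema fields below are assumptions for illustration, not Hilbert's actual design:

```python
from dataclasses import dataclass

# Hypothetical declarative sync config: instead of writing a new custom
# pipeline per data source, an engineer describes the sync in a small
# schema and a shared runner interprets it. Field names are illustrative.

@dataclass(frozen=True)
class SyncConfig:
    source: str                       # e.g. "shopify" (hypothetical)
    destination_table: str            # target analytics table
    cursor_field: str = "updated_at"  # incremental-sync watermark column
    batch_size: int = 10_000
    columns: tuple = ()               # empty tuple = sync all columns

    def __post_init__(self):
        # Validate at construction time so a bad config fails loudly
        # before any pipeline runs.
        if self.batch_size <= 0:
            raise ValueError("batch_size must be positive")


if __name__ == "__main__":
    cfg = SyncConfig(source="shopify", destination_table="orders")
    print(cfg.cursor_field, cfg.batch_size)
```

A frozen dataclass keeps configs immutable and hashable, so the same object can safely be shared across a shared library's pipeline runners; in practice such schemas are often loaded from YAML or a database rather than constructed inline.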

This job posting was last updated on 2/26/2026
