$100K - 120K a year
Design and maintain scalable ETL pipelines and data solutions, perform SQL tuning, and collaborate with teams to meet data and reporting requirements.
5+ years of data engineering experience with advanced SQL, Python, Spark, Snowflake, Databricks, Airflow, and cloud platform expertise.
Key Responsibilities:
• Partner with technical and non-technical colleagues to understand data and reporting requirements
• Work with engineering teams to collect required data from internal and external systems
• Design table structures and define ETL pipelines to build performant, reliable, and scalable data solutions in a fast-growing data ecosystem
• Develop automated data quality checks
• Develop and maintain ETL routines using ETL and orchestration tools such as Airflow
• Implement database deployments using tools like SchemaChange
• Perform ad hoc analysis as necessary
• Perform SQL and ETL tuning as necessary

Required Skills:
• Data engineering skills using:
  • Complex or advanced SQL queries
  • Python
  • Spark
  • Snowflake
  • Databricks
  • Airflow or Prefect
• Experience with at least one cloud platform (AWS / Azure / GCP)
• Media and Entertainment, Solution Selling, Practice Development
• 5+ years of data engineering experience (core technical foundation)
• Strong data modeling expertise, including dimensional modeling and normalization principles
• Advanced SQL performance tuning skills (critical for optimization)
• Strategic analytical thinking with the ability to interpret market and consumer data

Preferred (Nice to Have):
• Experience working with Kafka
• Familiarity with tools like Datorama / Improvado / FiveTran for integrating, harmonizing, and visualizing data across platforms
• Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and Docker containers
• Exposure to monitoring tools like Datadog

Job Type: Full-time
Pay: From $100,000.00 per year
Location: Santa Monica, CA 90403 (Preferred)
Work Location: In person
This job posting was last updated on 10/14/2025