via LinkedIn
$90K - $130K a year
Design and maintain data ingestion pipelines and integrate data from multiple sources to support analytics and reporting.
Requires 4+ years in data engineering, with hands-on experience in Databricks and Lakeflow Connect, experience on a major cloud platform, and working knowledge of data governance and CI/CD.
Key Responsibilities
- Design, develop, and maintain data ingestion pipelines using Databricks and Lakeflow Connect
- Integrate data from various structured and unstructured sources into Delta Lake and other data storage systems
- Implement real-time and batch ingestion workflows to support analytics and reporting needs (a minimal sketch of such a pipeline follows the lists below)
- Optimize data ingestion performance, ensuring scalability, reliability, and cost efficiency
- Collaborate with data architects, analysts, and business stakeholders to define data requirements and ingestion strategies
- Ensure data quality, lineage, and governance compliance across the ingestion process
- Automate data ingestion monitoring, alerting, and error-handling mechanisms
- Stay up to date with emerging Databricks, Lakehouse, and data integration technologies and best practices

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field
- 4+ years of experience in data engineering or ETL development
- Hands-on experience with Databricks (SQL, PySpark, Delta Lake)
- Proficiency with Lakeflow Connect for building and managing data ingestion workflows
- Strong understanding of data integration patterns, data modeling, and data lakehouse architectures
- Experience with cloud platforms (Azure, AWS, or GCP) and associated data services
- Knowledge of CI/CD, version control (Git), and infrastructure-as-code practices
- Familiarity with data governance, security, and compliance standards

Preferred Skills
- Experience with streaming technologies (Kafka, Event Hubs, etc.)
- Knowledge of REST APIs and connector-based ingestion
- Exposure to machine learning data pipelines in Databricks
- Strong problem-solving, communication, and collaboration skills
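For context on the day-to-day work described above, here is a minimal PySpark sketch of a batch ingestion step into Delta Lake. It is illustrative only: the source path, column names, and target table are hypothetical placeholders, and Lakeflow Connect ingestion is configured within Databricks rather than through this API. The real-time path mentioned above would use spark.readStream in place of the batch read.

```python
# Minimal batch-ingestion sketch: land raw JSON files in a Delta table.
# All paths and names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists; this makes the sketch
# self-contained elsewhere (requires delta-spark configured locally).
spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Read a batch of semi-structured source files. Schema is inferred here;
# a production pipeline would pin an explicit schema for data quality.
raw = spark.read.json("/mnt/raw/orders/")

# Light standardization plus an ingestion timestamp to support lineage.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Write to Delta Lake. Append suits one-shot batches; incremental loads
# that may replay source files typically use a MERGE to stay idempotent.
(cleaned.write
    .format("delta")
    .mode("append")
    .saveAsTable("bronze.orders"))
```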
This job posting was last updated on 2/23/2026