Find your dream job faster with JobLogr
AI-powered job search, resume help, and more.

Sierra Business Solution LLC

via Dice

All our jobs are verified from trusted employers and sources. We connect to legitimate platforms only.

AZ Databricks Technical Lead (Remote)

Anywhere
Contract
Posted 12/15/2025
Verified Source
Key Skills:
Databricks SQL
Azure Delta Lake
Spark (PySpark/Scala)
Azure Data Factory
Data pipelines
SQL and data transformation
ETL tools (Informatica)

Compensation

Salary Range

$120K - $200K a year

Responsibilities

Designing, developing, and maintaining scalable data pipelines and workflows in a cloud environment.

Requirements

Experience with Databricks SQL, Azure Delta Lake, Spark (PySpark/Scala), complex SQL and data transformation logic, and cloud data platforms (preferably Azure); experience with ETL tools such as Informatica is a plus.

Full Description

About the Role: We are looking for a skilled Data Engineer with strong experience in Databricks SQL and Azure Delta Lake to join our data team. The ideal candidate will be responsible for designing, developing, and maintaining scalable data pipelines and optimizing data workflows in a cloud environment. Experience with ETL tools like Informatica is a plus.

Key Responsibilities:
• Develop, test, and maintain data pipelines and workflows using Databricks and Azure Delta Lake.
• Design and implement data models, tables, and views to support analytics and business intelligence use cases.
• Optimize data ingestion and transformation processes for performance and cost-effectiveness.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
• Ensure data quality, data governance, and security best practices are followed.
• Troubleshoot data issues and provide timely support to data consumers.
• Assist in migrating and integrating data from on-premises and other cloud sources into Azure Delta Lake.
• (Good to have) Develop and maintain ETL processes using Informatica to support data integration requirements.

Required Skills & Qualifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
• Proven experience working with Databricks Notebooks; knowledge of Spark (PySpark/Scala) within Databricks environments.
• Understanding of Azure Data Factory or other orchestration frameworks.
• Strong knowledge of Azure Delta Lake architecture, workflows, and best practices.
• Hands-on experience with cloud data platforms, preferably Azure.
• Proficient in writing complex SQL queries and data transformation logic.
• Familiarity with data warehousing concepts and cloud-based data lakehouse architecture.
• Strong problem-solving and communication skills.
• Experience with version control and collaboration tools like Git.
Good to Have:
• Experience with ETL tools, especially Informatica, for data integration and workflow orchestration.
• Experience with CI/CD pipelines for data projects.
• Familiarity with data governance, metadata management, and data catalog tools.
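To give a concrete sense of the "complex SQL queries and data transformation logic" the role calls for, here is a minimal sketch of the bronze-to-silver-to-gold cleanup pattern common in lakehouse pipelines. It uses Python's built-in sqlite3 purely for illustration — the actual role would express this in Databricks SQL or PySpark against Delta Lake tables — and all table and column names are hypothetical.

```python
import sqlite3

# In-memory database standing in for a lakehouse; illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# "Bronze" layer: raw ingested records, with duplicates and missing values.
cur.execute("CREATE TABLE bronze_orders (order_id INTEGER, customer TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO bronze_orders VALUES (?, ?, ?)",
    [(1, "acme", 100.0), (1, "acme", 100.0), (2, "globex", None), (3, "acme", 50.0)],
)

# "Silver" layer: deduplicated, cleaned table suitable for analytics.
cur.execute(
    """
    CREATE TABLE silver_orders AS
    SELECT DISTINCT order_id, customer, amount
    FROM bronze_orders
    WHERE amount IS NOT NULL
    """
)

# "Gold" aggregate: per-customer revenue for BI consumers.
rows = cur.execute(
    "SELECT customer, SUM(amount) FROM silver_orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('acme', 150.0)]
```

In Databricks the same layering would typically be Delta tables orchestrated by a workflow (e.g. Azure Data Factory), with the dedup/quality rules enforced at the silver stage.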

This job posting was last updated on 12/17/2025
