Amtex Systems Inc

via LinkedIn

Data Platform Engineer with Databricks & Python Experience

Anywhere
Contract
Posted 2/13/2026
Key Skills:
Databricks
Apache Spark
Python

Compensation

Salary Range

$120K - $200K a year

Responsibilities

Lead and develop data ingestion and extraction strategies, build scalable data pipelines, and ensure data quality and governance.

Requirements

7+ years in data engineering roles with expertise in Databricks, Python, ELT/ETL pipelines, data lakehouse design, and streaming architectures.

Full Description

We have an urgent opening for a Data Platform Engineer (Lead). The role places a heavy emphasis on Databricks and on building data pipelines with Python, and experience with the medallion architecture is required. The position is remote.

As a Data Platform Engineer – Lead, you will lead and execute the data ingestion and extraction strategies for core enterprise datasets, primarily large-scale compensation survey data used by consulting teams to deliver guidance and insights to customers. You will own the technical design and hands-on delivery of data pipelines that ensure accuracy, reliability, and accessibility of this data across the organization.

This role requires deep Databricks experience and strong Python expertise, along with the ability to partner closely with business stakeholders. You will serve as a bridge between technical teams and the business, translating requirements into scalable data solutions while setting engineering standards and mentoring others. Curiosity, ownership, and self-direction are critical to success in this role.

Required Qualifications

• 7+ years of experience in data platform or data engineering roles
• Strong, hands-on experience with Databricks and Apache Spark
• Advanced Python programming skills (data processing, pipeline development, testing)
• Experience building ELT/ETL pipelines in distributed environments
• Deep understanding of data lake and lakehouse design principles
• Experience with data quality frameworks, observability tooling, and governance patterns
• Experience building and consuming RESTful and/or gRPC services
• Experience with event-driven and streaming architectures
• Strong understanding of SDLC best practices, including CI/CD, testing, and Git workflows
• Excellent communication skills with the ability to partner directly with business and consulting stakeholders
• Self-starter mindset with a strong sense of ownership, curiosity, and motivation
• Experience in regulated or data-sensitive industries is a plus
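The posting emphasizes Databricks, Python pipeline development, and the medallion (bronze/silver/gold) lakehouse pattern. For context on what that pattern looks like in practice, below is a minimal, hypothetical PySpark sketch; the table names, paths, and columns (e.g. bronze.comp_survey, base_salary) are illustrative assumptions, not details from the posting, and it assumes a Databricks-style environment where Delta tables and the target schemas already exist.

```python
# Minimal medallion (bronze/silver/gold) pipeline sketch in PySpark.
# All table names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("comp-survey-medallion").getOrCreate()

# Bronze: land raw compensation survey files as-is, tagged with load metadata.
bronze = (
    spark.read.option("header", True).csv("/mnt/raw/comp_survey/")  # hypothetical path
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.comp_survey")

# Silver: conform types, drop incomplete or duplicate records.
silver = (
    spark.table("bronze.comp_survey")
    .withColumn("base_salary", F.col("base_salary").cast("double"))
    .dropna(subset=["job_code", "base_salary"])
    .dropDuplicates(["survey_id", "job_code"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.comp_survey")

# Gold: business-ready aggregate, e.g. median base salary per job code.
gold = (
    spark.table("silver.comp_survey")
    .groupBy("job_code")
    .agg(F.expr("percentile_approx(base_salary, 0.5)").alias("median_base_salary"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.comp_survey_summary")
```

In this sketch the bronze layer appends raw data untouched, while the silver and gold layers are rebuilt from it, which is one common way to keep curated tables reproducible from the original source.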

This job posting was last updated on 2/19/2026
