Find your dream job faster with JobLogr
AI-powered job search, resume help, and more.
Connect Search

via Built In


Master - Data Engineer - Python/AWS/Dataframes

Anywhere
contractor
Posted 8/13/2025
Verified Source
Key Skills:
Python
SQL
AWS (S3, RDS, DynamoDB, Kinesis)
Data Pipeline Development
Data Streaming
Pandas
Agile Delivery
CI/CD

Compensation

Salary Range

$120K - $160K a year

Responsibilities

Develop and maintain back-end data engineering pipelines using Python and AWS services, manage data ingestion and streaming, and deliver features in an Agile environment.
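The ingestion side of this role centers on shaping events for Kinesis. As a minimal sketch (the stream name `iot-events`, the `device_id` partition key, and the helper itself are hypothetical, not from the posting), a record can be prepared for `put_record` like this:

```python
import json

def to_kinesis_record(event: dict, partition_key_field: str) -> dict:
    """Shape an event into the Data/PartitionKey pair Kinesis put_record expects."""
    return {
        "Data": json.dumps(event).encode("utf-8"),
        "PartitionKey": str(event[partition_key_field]),
    }

record = to_kinesis_record({"device_id": 42, "temp_c": 21.5}, "device_id")

# With boto3 (assumed, not shown running here), the send would look like:
#   kinesis = boto3.client("kinesis")
#   kinesis.put_record(StreamName="iot-events", **record)
```

Keeping the serialization in a small pure function makes it easy to unit-test without touching AWS.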

Requirements

8+ years of data engineering experience, 2+ years with AWS services including S3, RDS, DynamoDB, and Kinesis, proficiency in SQL and Python, and strong understanding of data streaming and ingestion.

Full Description

Our client is going through a digital transformation of their IoT equipment and is seeking to hire multiple engineers. This is a 24-month, fully remote contract. The client can hire only W2 workers, cannot work with H1B holders, and cannot sponsor visas.

Overview: We are seeking multiple experienced Data Engineers with 8+ years of experience, focusing on back-end pipeline development and AWS technologies. Proficiency in SQL and Python is essential, along with the ability to thrive in a fast-paced Agile environment.

Key Responsibilities:
• Python Development: Employ Python and Pandas for data transformations, converting SQL operations into Python code.
• Pipeline Development: Design and implement back-end data engineering pipelines for large-scale processing.
• Data Ingestion: Manage data ingestion from various sources into AWS, utilizing Kinesis and S3 for frequent streaming.
• SQL Proficiency: Write complex SQL queries, including multi-table joins to update records.
• Agile Delivery: Work within Agile teams to deliver features in 2-week cycles.
• CI/CD Implementation: Use CI/CD practices to streamline data workflows.

Qualifications:
• 8+ years of data engineering experience in large-scale systems.
• 2+ years with AWS services (S3, RDS, DynamoDB, Kinesis).
• Proficient in SQL, preferably advanced.
• Experience with Python and Pandas.
• Strong understanding of data streaming and ingestion.

Benefits can include: Medical, Dental, Vision, Life Insurance, Short-Term Disability, Long-Term Disability, 401(k), Vacation, PTO, Sick/Personal Time, Holidays, etc.
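The first responsibility above, converting SQL operations into Pandas, can be sketched as follows. The tables and columns are hypothetical stand-ins for RDS query results, not anything named in the posting:

```python
import pandas as pd

# Hypothetical tables standing in for RDS query results.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_id": [10, 10, 20],
    "amount": [50.0, 75.0, 120.0],
})
customers = pd.DataFrame({
    "customer_id": [10, 20],
    "region": ["east", "west"],
})

# SQL equivalent:
#   SELECT c.region, SUM(o.amount) AS total
#   FROM orders o JOIN customers c USING (customer_id)
#   GROUP BY c.region;
totals = (
    orders.merge(customers, on="customer_id", how="inner")
          .groupby("region", as_index=False)["amount"]
          .sum()
          .rename(columns={"amount": "total"})
)
print(totals)
```

Here `merge` plays the role of the SQL join and `groupby` plus `sum` plays the role of the aggregation.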

This job posting was last updated on 8/18/2025
