
Jobgether (via Workable)


Data Engineer (Remote - US)

Anywhere
full-time
Posted 10/22/2025
Direct Apply
Key Skills:
Data Engineering
ETL
ELT
Python
SQL
Kafka
NiFi
Data Security
Cloud Architectures
Data Lakes
Collaboration
Technical Documentation
Problem-solving
Continuous Improvement
Mentoring
Distributed Data Systems

Compensation

Salary Range

Not specified

Responsibilities

The Data Engineer will design, implement, and maintain high-performance ETL/ELT pipelines for real-time and batch processing. They will work with data architects, data scientists, integration teams, and product stakeholders to migrate legacy systems to scalable, cloud-based architectures while meeting security and compliance standards.

Requirements

Candidates should have a bachelor's degree in a related field and at least 3 years of experience implementing enterprise-grade data pipelines. Hands-on expertise with technologies such as NiFi, Kafka, Python, and SQL is required, along with a strong understanding of data security and compliance.

Full Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer in the United States. The Data Engineer will play a pivotal role in modernizing and optimizing enterprise data pipelines, enabling the secure and efficient movement of critical data across systems. You will design, implement, and maintain high-performance ETL/ELT pipelines for both real-time and batch processing, while ensuring compliance with strict security standards. Collaborating closely with data architects, data scientists, integration teams, and product stakeholders, you will help migrate legacy systems to scalable, cloud-based architectures. This role requires hands-on technical expertise, a deep understanding of distributed data systems, and the ability to translate complex challenges into reliable, production-ready solutions. You will also mentor team members, drive continuous improvement, and contribute to mission-driven outcomes with measurable impact.

Accountabilities
Design, implement, and operate secure, high-performance data pipelines for real-time and batch workflows.
Architect scalable solutions to reliably move sensitive data while meeting security and compliance standards.
Build and optimize ETL/ELT pipelines using NiFi, Kafka, Python, and SQL.
Collaborate with integration, product, and data science teams to migrate legacy pipelines to modern architectures.
Maintain and optimize data lakes, ensuring smooth data exchange across multiple systems.
Define monitoring and alerting strategies, create runbooks, and troubleshoot production issues.
Drive continuous improvement and innovation across the data ecosystem, while mentoring and guiding team members.

Requirements
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
3+ years of experience implementing enterprise-grade data pipelines, or 7+ years of equivalent professional experience.
Hands-on expertise with NiFi, Kafka, Python, SQL, and distributed data systems.
Familiarity with OpenSearch/Elasticsearch, CI/CD, and DevOps practices.
Strong understanding of data security and compliance requirements for sensitive data.
Proven ability to design and operate robust, production-grade pipelines.
Experience with relational databases, existing data models, and schema extensions.
Excellent collaboration, communication, and technical documentation skills.
Growth mindset and proactive problem-solving in complex, mission-driven environments.

Preferred / Standout Skills
Experience with public health data standards (HL7 v2.x, FHIR, LOINC, CVX, SNOMED, ICD-10).
Event-driven architectures and data de-identification strategies.
Hands-on experience with integration engines (e.g., Rhapsody, Mirth) and containerized deployments (Docker, Kubernetes).
Familiarity with Master Patient Index (MPI) or data deduplication frameworks.

Benefits
Remote-first, virtual work environment with flexibility.
Paid Time Off (PTO) and holidays, with flexible scheduling.
401(k) retirement plan with corporate matching.
Comprehensive medical, prescription, vision, and dental coverage.
Short-term and long-term disability insurance, plus life insurance.
Support for home office setup for new team members.
Opportunity to work on impactful, mission-driven projects improving public health.

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching.
When you apply, your profile goes through our AI-powered screening process designed to identify top talent efficiently and fairly.
🔍 Our AI evaluates your CV and LinkedIn profile thoroughly, analyzing your skills, experience, and achievements.
📊 It compares your profile to the job's core requirements and past success factors to determine your match score.
🎯 Based on this analysis, we automatically shortlist the three candidates with the highest match to the role.
🧠 When necessary, our human team may perform an additional manual review to ensure no strong profile is missed.
This process is transparent, skills-based, and free of bias, focusing solely on your fit for the role. Once the shortlist is completed, we share it directly with the company offering the position. The final decision and next steps (such as interviews or assessments) are made by their internal hiring team. Thank you for your interest! #LI-CL1

This job posting was last updated on 10/23/2025
