Activesoft, Inc.

via Dice


Senior Data Platform Engineer (AI/ML) - Remote but reside in DMV - Full Time (Only W2)

Anywhere
Full-time
Posted 2/13/2026
Verified Source
Key Skills:
Apache Spark
Kafka
Snowflake
Databricks
ML feature engineering
Blockchain & FinTech Systems

Compensation

Salary Range

$120K - $200K a year

Responsibilities

Build and optimize data infrastructure for AI/ML at scale, including designing high-performance pipelines and cloud-native services.

Requirements

Proficiency with relational databases, cloud services (AWS), containerization (Docker, Kubernetes), and experience with AI/ML data processing systems.

Full Description

The Opportunity

As AI/ML adoption accelerates, the need for robust data management has never been more critical. We're building a new-to-market platform that orchestrates the full data lifecycle, from ingestion and enrichment to discovery and dissemination. Join us as a Software Engineer to help create the metadata hub that will enable AI/ML at scale, streamlining how organizations discover, access, use, and share data across their entire ecosystem.

What You'll Do

Work alongside experienced engineers leveraging a modern tech stack. You'll develop, test, and productize solution components that integrate relational databases, microservices, cloud services, and containerized applications.

Responsibilities

• Build the data infrastructure that powers next-generation AI/ML at scale. You'll design and optimize high-performance pipelines that ingest, transform, and deliver data across enterprise ecosystems, working with relational databases including Oracle, PostgreSQL, and MySQL.
• Develop cloud-native services using NestJS and Python while leveraging AWS infrastructure (Glue, S3, Lambda, ECS/EKS). Deploy production systems using Docker, Kubernetes, and Terraform, ensuring performance, reliability, and scalability at every layer.
• Ship quality code through comprehensive testing and CI/CD workflows. Architect data pipelines that fuel LLM training and AI workloads, enabling seamless integration across diverse datasets.
• Stay current with emerging technologies, including modern data lake/lakehouse frameworks, and help define the future of enterprise data management.

Technical Foundation

• 4+ years of progressive software development experience.
• Proficiency with relational databases (Oracle, PostgreSQL, MySQL, etc.).
• Experience with modern data lake/lakehouse technologies (Apache Iceberg, Spark, etc.).
• Exposure to AWS services (Glue, S3, Lambda, EKS/ECS).
• Background with Terraform for infrastructure as code.
• Familiarity with workflow orchestration tools (Airflow, Step Functions, Argo, etc.).
• Working knowledge of Docker and Kubernetes for containerization.
• Proficiency with GitLab or similar tools for CI/CD and version control.
• Experience in designing, building, and scaling AI/ML data processing systems, including integrating and operationalizing LLMs in production environments.

This job posting was last updated on 2/16/2026
