via Greenhouse
$130K - $270K a year
Create and maintain data workflows and automation pipelines using Apache Airflow, ensuring reliability, scalability, and observability.
Requires experience with Python, Java, Linux, Apache Airflow, Docker, and Git, plus a security clearance and 5+ years of experience.
Build something to be proud of. Captivation has built a reputation on providing customers exactly what is needed in a timely manner. Our team of engineers takes pride in what they develop and constantly innovates to provide the best solution. Captivation is looking for software developers who can get stuff done while making a difference in support of the mission to protect our country.

Description
Captivation Software is looking for a mid-level software engineer who will be responsible for creating and maintaining data workflows and automation pipelines using Apache Airflow. This role focuses on building reliable, scalable, and observable workflow orchestration solutions that support data engineering, analytics, and operational use cases. The engineer will collaborate closely with data engineers, platform teams, and stakeholders to ensure workflows are efficient, secure, and production-ready.

Requirements
Security Clearance: Must currently hold a Top Secret/SCI U.S. Government security clearance with a favorable polygraph; all candidates must therefore be U.S. citizens.

Minimum Qualifications (one of the following):
- Master's degree in Computer Science or a related discipline from an accredited college or university, plus three (3) years of experience as a SWE on programs and contracts of similar scope, type, and complexity.
- Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus five (5) years of experience as a SWE on programs and contracts of similar scope, type, and complexity.
- Seven (7) years of experience as a SWE on programs and contracts of similar scope, type, and complexity.
Required Skills:
- Experience using the Linux CLI and Linux tools
- Experience developing Bash scripts to automate manual processes
- Recent software development experience using Python and Java
- Experience using Apache Airflow (DAG design, scheduling, operators, sensors) to orchestrate, schedule, and monitor complex workflows
- Experience using distributed big data processing engines, including Apache Spark
- Experience with containerization technologies such as Docker and Podman
- Experience with the Git source control system

Desired Skills:
- Experience using the Atlassian tool suite (JIRA, Confluence)
- Familiarity with AWS cloud services and infrastructure

This position is open for direct hires only. We will not consider candidates from third-party staffing/recruiting firms.

Benefits
- Annual Salary: $130,000 - $270,000 (depends on years of experience)
- Up to 20% 401(k) contribution (no matching required and vested from day 1)
- Above-market hourly rates
- $3,600 HSA contribution
- 6 weeks paid time off
- Company-paid employee medical/dental/vision insurance, life insurance, short-term & long-term disability, and AD&D
This job posting was last updated on 2/18/2026