$120K - $160K a year
Develop and architect cloud-hosted backend systems and APIs for data warehouse and processing pipelines supporting autonomous systems.
Extensive Node.js backend development, database and API design experience, Linux/open-source stack proficiency, cloud platform and Kubernetes knowledge, and familiarity with robotics/autonomy data.
Full Stack/Backend Developer needed to support the development of cloud-hosted back-end elements of a data warehouse and processing pipeline.

Our client is seeking a software engineer to develop cloud-hosted back-end elements of a data warehouse and processing pipeline for robotics and autonomous systems data. Robotic systems generate a large volume of data from sensors and internal processing. Storing and reusing that data to validate and develop new autonomy algorithms is a crucial element of building innovative and robust autonomy capability. The data engineer will work in a team of autonomy-, front-end-, and devops-focused engineers to conceive, architect, and implement data and processing systems that accelerate and enhance autonomous systems prototyping.

Required Qualifications:
• Extensive hands-on experience building web application back-end systems and APIs using Node.js (preferred) or Flask.
• Strong experience interfacing with document and relational databases as well as search tools.
• Demonstrated ability to design and architect back-end systems and APIs, and to communicate and iterate on designs with stakeholders using effective diagrams and verbal discussions.
• Experience working in small teams and owning responsibility for many elements across the back-end.
• Professional approach to software development with habitual application of software engineering best practices. Passion for producing high-quality artifacts.
• Expertise with Linux-based development environments using open-source technology stacks. Able to create and use containers or virtual machines as needed.
• Good knowledge of security best practices and common pitfalls.
• Experience with robotics or autonomous systems data and/or the Robot Operating System (ROS).
• Experience with cloud platforms and ecosystems, such as AWS.
• Experience with infrastructure-as-code (IaC), especially Terraform.
• Experience with container orchestration in Kubernetes for deployment of cloud applications.

Nice-to-have Qualifications:
• Bachelor's degree in Computer Science. Recent graduates or candidates without a bachelor's degree will be considered with clear evidence of significant outside-of-classroom experience.
• Experience with the Apache Maven or Gradle build systems.
• Ability to understand front-end source code written in React or similar frameworks and to provide guidance to less experienced front-end engineers.
• General knowledge of machine learning and reinforcement learning concepts, frameworks, and environments, such as Pandas, TensorFlow, and Jupyter Notebook.
• Broad knowledge of the general features, capabilities, and trade-offs of common data warehouse (e.g., Apache Hadoop), workflow orchestration (e.g., Apache Beam), data extract, transform, and load (ETL), and stream processing (e.g., Kafka) technologies. Hands-on experience with several of these technologies.

This position can be fully remote with a few trips per year to Lexington, MA. Hours can be somewhat flexible, but the preference is for someone working EST hours or close to them.
This job posting was last updated on 9/9/2025