Cooper

via Remote Tech Jobs

[Remote] Senior/Principal Software & Data Engineer

Anywhere
Full-time
Posted 10/16/2025
Verified Source
Key Skills:
Python
Data Engineering
ETL/ELT Pipelines
Airbyte
Postgres
Snowflake
Terraform
AWS
SOC2 Compliance
ISO 27001
Security Best Practices
FastAPI
Java (Spring Boot)
CI/CD
Distributed Systems
AI/ML Infrastructure
LangChain
Pydantic
Kafka

Compensation

Salary Range

$140K - $180K a year

Responsibilities

Lead the design and implementation of data infrastructure and backend services, ensuring compliance with SOC2 and ISO 27001; build scalable data pipelines and APIs; drive security and infrastructure automation; and collaborate on technical strategy.
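
To make the pipeline work concrete, here is a minimal sketch of the kind of ELT job described: batch-copying rows from Postgres into a Snowflake staging table. All connection parameters, table names, and columns are hypothetical placeholders, not details from the posting.

"""Minimal ELT sketch: replicate a Postgres table into Snowflake.

Illustrative only -- credentials, table names, and schema below are
hypothetical placeholders, not details from the posting.
"""
import os

import psycopg2                     # pip install psycopg2-binary
import snowflake.connector          # pip install snowflake-connector-python


def extract_rows(batch_size: int = 1000):
    """Stream rows from a source Postgres table in batches."""
    conn = psycopg2.connect(os.environ["POSTGRES_DSN"])
    try:
        # Named (server-side) cursor so large tables are not pulled
        # into memory all at once.
        with conn.cursor(name="extract") as cur:
            cur.execute("SELECT id, payload, updated_at FROM events")
            while True:
                rows = cur.fetchmany(batch_size)
                if not rows:
                    break
                yield rows
    finally:
        conn.close()


def load_batches():
    """Insert each extracted batch into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        for batch in extract_rows():
            cur.executemany(
                "INSERT INTO events_raw (id, payload, updated_at) "
                "VALUES (%s, %s, %s)",
                batch,
            )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    load_batches()

A production version of this pattern would add incremental watermarks, retries, and data-quality checks; tools named in the posting (Airbyte, dbt) typically replace the hand-rolled parts.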

Requirements

7+ years of software engineering experience, including 3+ years in data engineering; expert Python and SQL skills; experience with Airbyte, Terraform, AWS, and SOC2/ISO 27001 compliance; strong leadership and communication skills; and familiarity with AI/ML infrastructure and security frameworks.
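
As an illustration of the FastAPI and Pydantic skills the posting calls for, here is a minimal, self-contained sketch of a validated prediction-style endpoint. The route, model fields, and scoring logic are hypothetical examples, not part of Cooper's actual API.

"""Minimal FastAPI + Pydantic sketch of the API work described.

The endpoint, fields, and scoring are made-up illustrations.
"""
from fastapi import FastAPI
from pydantic import BaseModel, Field

app = FastAPI()


class DecisionRequest(BaseModel):
    """Validated input payload for a decision-scoring call."""
    decision_id: str
    metric: str = Field(min_length=1)
    value: float


class DecisionResponse(BaseModel):
    decision_id: str
    score: float


@app.post("/decisions/score", response_model=DecisionResponse)
async def score_decision(req: DecisionRequest) -> DecisionResponse:
    # Placeholder scoring; a real service would call a model here.
    return DecisionResponse(decision_id=req.decision_id, score=min(req.value, 1.0))

Run locally with "uvicorn app:app --reload" (assuming the file is saved as app.py); FastAPI rejects malformed payloads with a 422 before the handler runs, which is the Pydantic validation the posting emphasizes.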

Full Description

Note: This is a remote job open to candidates in the USA. Cooper is an AI-enabled enterprise decision-making platform designed to support executives in making critical daily decisions. They are seeking a highly experienced Senior/Principal Engineer to drive the design and implementation of their data infrastructure and backend services while ensuring compliance with security standards.

Responsibilities
• Collaborate with other Principal Engineers to define and execute the technical roadmap
• Co-lead architectural decisions and technology selection across the stack
• Establish and maintain SOC2 Type I and II compliance, including control implementation and audit coordination
• Drive ISO 27001 certification and maintain an Information Security Management System (ISMS)
• Develop and enforce security policies, procedures, and technical controls
• Own vendor relationships and technology partnerships
• Work closely with the CEO on technical strategy and customer-facing technical discussions
• Mentor team members and drive technical excellence across the engineering organization
• Design and implement end-to-end data pipelines using modern ETL/ELT patterns and tools
• Build and maintain RESTful APIs and microservices that power our AI platform
• Architect and optimize our data infrastructure across Postgres and Snowflake environments
• Lead infrastructure automation initiatives using Infrastructure as Code principles
• Own the entire data lifecycle from ingestion through transformation and serving
• Establish data engineering best practices for reliability, scalability, and maintainability
• Design schemas, optimize queries, and ensure data quality across multiple data stores

In practice, the role involves:
• Building robust data pipelines that handle millions of records daily
• Creating APIs that serve AI model predictions with low-latency requirements
• Automating infrastructure deployment across development, staging, and production environments
• Designing data architectures that balance cost, performance, and maintainability
• Implementing monitoring and alerting for data quality and system health
• Scaling our platform to support 10x growth in data volume and API traffic
• Developing agentic systems that drive automated data analysis (Pydantic AI, LangChain, custom agent architectures)
• Establishing and maintaining SOC2 controls across all engineering systems
• Leading ISO 27001 certification efforts and maintaining ongoing compliance
• Building security-first architecture that enables rapid growth while maintaining enterprise trust
• Collaborating with our Principal Engineer on major architectural decisions and technical strategy

Skills
• 7+ years of software engineering experience, with 3+ years focused on data engineering
• Proven track record of designing and implementing complex data systems at scale
• Experience with enterprise compliance frameworks (SOC2, ISO 27001, or similar)
• Demonstrated ability to balance technical excellence with business requirements
• Strong collaborative skills and the ability to work effectively with other senior engineers
• Experience leading technical initiatives in a team environment
• Strong problem-solving skills and the ability to debug complex distributed systems
• Excellent communication skills, with the ability to interface with executives, customers, auditors, and both technical and business stakeholders
• Entrepreneurial mindset and excitement about AI-powered business intelligence
• Expert-level Python programming for data engineering applications
• Deep experience building and maintaining production ETL/ELT pipelines
• Strong proficiency with Airbyte for data integration and replication
• Advanced SQL skills with proven experience in both Postgres and Snowflake
• Experience with data modeling, warehouse design, and query optimization
• Understanding of batch and streaming data processing patterns
• Extensive experience building RESTful web services in Python (FastAPI) and Java (Spring Boot)
• Strong understanding of API design principles, authentication, and security best practices
• Experience with async Python programming and performance optimization
• Proficiency in building scalable, fault-tolerant distributed systems
• Hands-on expertise with Terraform for infrastructure automation, specifically the AWS provider for cloud infrastructure management, the Airbyte provider for connector and connection management, and the Snowflake provider for database and warehouse configuration
• Experience with AWS CDK as infrastructure as code for managing serverless application lifecycle and infrastructure
• Strong understanding of AWS services (e.g., EC2, RDS, S3, Lambda, ECS)
• Experience with CI/CD pipelines and DevOps practices (e.g., GitHub Actions)
• Proven experience achieving and maintaining SOC2 Type II compliance
• Experience with ISO 27001 implementation and certification processes
• Deep understanding of security best practices, vulnerability management, and incident response
• Experience implementing technical controls for data privacy regulations (GDPR, CCPA)
• Ability to work with auditors and manage compliance documentation
• Experience with security tools and practices (SIEM, vulnerability scanning, penetration testing)
• Experience with AI agent frameworks (LangChain, LangGraph, or similar)
• Experience working with LLMs and prompt engineering
• Experience with Pydantic for data validation and AI model integration
• Understanding of RAG (Retrieval-Augmented Generation) patterns (a toy sketch follows this description)
• Previous startup experience in a technical leadership role
• Experience with dbt for data transformation
• Familiarity with data orchestration tools (Airflow, Dagster, Prefect)
• Experience with real-time data processing (Kafka, Kinesis)
• Background in AI/ML infrastructure and MLOps
• CISSP, CISA, or other security certifications
• Experience with HIPAA or other regulatory frameworks
• Contributions to open-source data engineering tools

Education Requirements
• Bachelor's degree in Computer Science or equivalent experience

Company Overview
• Cooper is an AI-enabled enterprise decision-making platform that guides executives to their most important daily decisions, tracks the performance of decisions over time, and transforms organizations into nimble market leaders. It was founded in 2016 and is headquartered in Los Angeles, California, USA, with a workforce of 11-50 employees. Its website is https://www.cooper.ai.

Company H1B Sponsorship
• Cooper has a track record of offering H1B sponsorships, with 1 in 2023. Please note that this does not guarantee sponsorship for this specific role.
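
Since the description calls out RAG (Retrieval-Augmented Generation) patterns, here is a toy sketch of the retrieval step. It uses bag-of-words cosine similarity in place of learned embeddings and stops short of the generation step (no LLM call); the corpus and query are made-up examples.

"""Toy retrieval step for a RAG pipeline.

Bag-of-words cosine similarity stands in for learned embeddings;
a real system would embed documents and call an LLM afterwards.
"""
import math
from collections import Counter


def vectorize(text: str) -> Counter:
    """Crude term-frequency vector: lowercase, whitespace-split."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(
        sum(v * v for v in b.values())
    )
    return dot / norm if norm else 0.0


def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents most similar to the query."""
    q = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]


if __name__ == "__main__":
    corpus = [
        "Q3 revenue grew 12 percent on strong enterprise demand",
        "The data pipeline ingests events from Postgres into Snowflake",
        "Headcount plan for the engineering organization in 2025",
    ]
    context = retrieve("how is revenue growing", corpus)
    # In a full RAG system, `context` would be inserted into the
    # LLM prompt so the model can ground its answer in these documents.
    print(context)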

This job posting was last updated on 10/21/2025
