via Remote Rocketship
$120K–$160K a year
Design, implement, optimize, and maintain Kafka-based data streaming architectures and clusters for cybersecurity data processing, ensuring high availability, security, and performance.
5+ years of experience with Kafka and distributed data streaming technologies, strong troubleshooting and communication skills, cloud deployment experience, and U.S. citizenship.
Job Description:
• Design, implement, and optimize Kafka-based data streaming architectures for cybersecurity data collection and processing.
• Develop and maintain Kafka clusters to ensure high availability, fault tolerance, and scalability.
• Configure and tune Kafka for optimal performance, including partitioning, replication, and consumer group strategies.
• Collaborate with integration engineers to design and implement efficient data pipelines from data sources through Kafka into downstream platforms.
• Participate in Agile ceremonies, including backlog grooming, demos, and retrospectives.
• Provide expertise on Kafka security features, including encryption, authentication, and authorization.
• Conduct capacity planning and performance testing for Kafka deployments.
• Troubleshoot complex issues in Kafka systems.
• Develop and maintain documentation for Kafka configurations, best practices, and troubleshooting procedures.

Requirements:
• 5+ years of relevant experience
• Strong experience with Kafka and other distributed, big data, or data streaming technologies
• In-depth knowledge of Kafka functionality and operational workflows
• Ability to install, maintain, and troubleshoot Kafka clusters
• Understanding of data serialization formats and schema management
• Ability to design secure configurations and access controls for shared Kafka deployments
• Excellent troubleshooting, communication, and interpersonal skills
• Proven ability to analyze complex requirements and translate them into clear, actionable tasks and processes through critical thinking
• Ability to design, build, and maintain message configurations and flows in high-throughput, low-latency environments
• Strong problem-solving skills, with the ability to analyze issues in Kafka and other complex distributed systems
• Experience documenting tests and presenting findings
• Demonstrated ability to apply critical thinking to translate undefined tasks into actionable work streams
• Experience deploying Kafka in cloud-based environments (AWS preferred; Azure and GCP also acceptable)
• Proven ability to write documentation and communicate effectively with cross-functional teams
• Familiarity with containerization and orchestration technologies such as Docker and Kubernetes
• Experience operating and monitoring large-scale production clusters
• Applicants must be U.S. citizens, in compliance with federal contract requirements.

Benefits:
• 18 days of PTO
• 11 holidays
• 85% of insurance premiums covered
• 401(k)
• Continued education
• Certification maintenance and reimbursement
• And more
This job posting was last updated on 11/26/2025