$130K–$180K a year
Lead and maintain large-scale Big Data infrastructure, coordinate cross-team initiatives, design architecture, and provide on-call production support.
7+ years with Hadoop, 3+ years leading DevOps or Big Data teams, experience with large Hadoop clusters, NoSQL DBs, Kafka, infrastructure automation tools, and shell scripting.
Summary: As a Senior or Lead Big Data DevOps Engineer, you will work with a team responsible for setting up, scaling, and maintaining Big Data infrastructure and tools in private and public cloud environments.

Main Responsibilities:
• Driving improvements in the efficiency of Big Data infrastructure.
• Coordinating cross-team infrastructure and Big Data initiatives.
• Leading Big Data-related architecture and design efforts.
• Ensuring the availability, efficiency, and reliability of the Big Data infrastructure.
• Building and supporting tools for operational tasks.
• Evaluating, designing, and deploying monitoring tools.
• Designing and implementing DR/BC (disaster recovery/business continuity) practices and procedures.
• Providing on-call support for production systems.

Requirements:
• 7+ years of experience working with Hadoop, preferably open source.
• 3+ years leading a Big Data, DevOps, SRE, DBA, or development team.
• Experience setting up and running Hadoop clusters of 1,000+ nodes.
• Solid knowledge of NoSQL databases, preferably Cassandra or ScyllaDB.
• Experience running and troubleshooting Kafka.
• Working knowledge of at least one of: Terraform, Ansible, SaltStack, Puppet.
• Proficiency in shell scripting.

Nice to have:
• Experience with Prometheus.
• Experience managing Snowflake.
• Solid knowledge of Graphite and Grafana.
• Python or Perl scripting skills.
• Experience installing and managing Aerospike.
• DBA experience with one of: PostgreSQL, MySQL, MariaDB.
This job posting was last updated on 8/13/2025