Salary: Not specified
Design and maintain scalable, secure data pipelines and warehouses supporting analytics and BI.
5+ years in data engineering, strong SQL, experience with AWS Glue, Spark, Redshift, Python/Scala, cloud security, and compliance.
This is a remote position.

Job Description

At Luxer One, we're on a mission to relentlessly improve the way the world receives goods! We design and deploy package locker solutions for retail, offices, universities, and apartment complexes throughout the US and Canada. We are committed to making life simpler by automating package acceptance and solving the complete package problem. We integrate best-in-class hardware with state-of-the-art software to create the best user experience. We are growing rapidly and seeking a talented Senior Data Engineer to help drive our product and data platform to technical excellence.

ABOUT THE POSITION

We are the Luxer One Data Engineering Team, an Agile group responsible for building, maintaining, and securing our data platform and pipelines across multiple cloud environments. This position plays a critical role in scaling the company's analytics and insights capabilities while safeguarding sensitive data. If you are a Senior Data Engineer with a passion for building robust, secure, and compliant data solutions, strong SQL and distributed data processing experience, and a solid grasp of cloud infrastructure, we would love to talk with you. Come join the Luxer One team!

Our stack is built on Amazon Web Services (AWS) and Google Cloud Platform (GCP), using technologies such as AWS Glue, Apache Spark, Redshift, and BigQuery with Python and Scala to deliver scalable, secure, high-performance data solutions. As a Senior Data Engineer, you will apply your knowledge of core data engineering principles, distributed data processing, and cloud infrastructure to design and implement secure, robust data pipelines, support analytics workloads, and ensure the reliability, scalability, and compliance of our data ecosystem.

WHAT YOU'LL DO

- Design, develop, and maintain scalable, secure data pipelines and ETL processes using AWS Glue, Spark, and other modern data tools.
- Architect, implement, and optimize data warehouse solutions in Amazon Redshift and Google BigQuery to support analytics and business intelligence.
- Collaborate with software engineering, analytics, and product teams to ensure data models meet business requirements and security standards.
- Author and review technical documentation for data pipelines, schemas, workflows, and security controls.
- Implement data quality, validation, and monitoring processes to ensure reliable and accurate data.
- Apply data security and privacy best practices (encryption at rest and in transit, IAM roles, access controls, and data masking) across all data storage and movement.
- Work closely with security teams to ensure pipelines adhere to regulatory and compliance requirements (SOC 2, GDPR, CCPA, etc.).
- Support and mentor junior data engineers in best practices, including secure coding and data handling.
- Participate in on-call and incident response activities for critical data pipelines.

WHO YOU ARE

- A team player, highly collaborative, with the ability to work cross-functionally across teams.
- A strong communicator, both verbally and in writing.
- A self-starter, motivated to learn and innovate in the data engineering space.
- Quick to learn new software, frameworks, and cloud technologies.
- Detail-oriented and well organized.
- Security-minded, with a strong sense of responsibility for protecting sensitive data.

Requirements

EXPERIENCE YOU BRING

- 5+ years of experience in data engineering or a related field.
- Strong SQL skills with demonstrable experience building complex queries and optimizing performance on large datasets.
- 3+ years of hands-on experience building data pipelines with AWS Glue, Apache Spark, or similar ETL frameworks.
- 3+ years of production experience with Amazon Redshift, Google BigQuery, or other large-scale data warehouses.
- Proficiency in Python and Scala for data processing and automation.
- Strong understanding of cloud infrastructure and security concepts (AWS and/or GCP), including encryption, storage, networking, IAM, and access control.
- Experience with data modeling, schema design, and best practices for secure data warehousing.
- Experience implementing data security controls such as encryption, key management, access auditing, and data masking.
- Experience with data privacy regulations (GDPR, CCPA, or SOC 2 controls) and aligning pipelines to compliance requirements.
- Advanced troubleshooting skills, including log analysis, application monitoring, and driving problem resolution.
- Excellent verbal and written communication skills.
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related field, or an equivalent combination of education and experience.

Equal Employment Opportunity Statement

[Company Name] is an Equal Opportunity Employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, veteran status, or any other legally protected status.

Perks & Benefits

We believe that taking care of our team means more than just a paycheck. That's why we offer a well-rounded benefits package designed to support your health, growth, and work-life balance. Our offerings include comprehensive medical, dental, and vision coverage, a 401(k) plan with employer match to help you invest in your future, and tuition reimbursement to keep your career moving forward. You'll also enjoy paid vacation and sick time, giving you the flexibility to recharge and take care of what matters most.
This job posting was last updated on 12/19/2025