via LinkedIn
$120K - $180K a year
Lead design and delivery of scalable data platforms and data engineering standards to support consumer goods operations.
Proven leadership in data platform architecture, expertise in modern data lakehouse technologies, and proficiency with Python, SQL, and cloud data engineering tools.
Job Title: Principal Data Engineer
Fortune 500 Consumer Goods Business
$10 Billion in Revenue / 10k Employees
Stock prices rising

Summary
The Principal Data Engineer shapes the data backbone that drives innovation in the consumer goods industry. This role leads the design of scalable data platforms that power everything from supply chain optimisation to demand forecasting, digital commerce, and consumer insights. As a strategic technical leader, you’ll enable the organisation to unlock smarter decision-making, faster product innovation, and a deeper understanding of consumer behaviour.

Key Responsibilities
• Lead the architecture and delivery of modern data platforms that support end-to-end consumer goods operations.
• Establish high-impact engineering standards for data quality, governance, performance, and experimentation.
• Collaborate with supply chain, marketing, sales, R&D, and analytics teams to translate business goals into scalable data solutions.
• Inspire, mentor, and elevate data engineering teams, fostering a culture of curiosity, craftsmanship, and continuous improvement.
• Drive adoption of cutting-edge data technologies enabling real-time insights, demand forecasting, route optimisation, and omnichannel analytics.
• Ensure secure, reliable, and high-performing data ecosystems critical to manufacturing, logistics, and consumer engagement.
• Champion data as a strategic asset and influence long-term data strategy across the organisation.

Role Functions
• Provide visionary technical leadership in data architecture, modern lakehouse design, and data platform strategy.
• Design and oversee complex data models spanning products, inventory, marketing, consumers, e-commerce, and retail partners.
• Evaluate, select, and roll out tools that accelerate experimentation and insight generation across the business.
• Guide engineering decisions, code reviews, and solution design across multiple squads.
• Partner with cross-functional leaders to build resilient, automated, and insight-rich data flows.
• Own standards for CI/CD, observability, and high-scale data pipeline reliability across global operations.

Typical Tech Stack
• Cloud: AWS / Azure / GCP
• Compute/ETL: Spark, Databricks
• Orchestration: Airflow, Prefect
• Data Storage: Delta Lake, Snowflake, Lakehouse architectures
• Languages: Python, SQL, Scala
• DevOps: Docker, Kubernetes, Terraform, CI/CD (GitHub Actions, GitLab CI)
• Monitoring: Prometheus, Grafana, CloudWatch, Datadog
This job posting was last updated on 12/8/2025