via DailyRemote
$120K - $200K a year
Own the multi-quarter program plan for the Data Lake, build and maintain a single source of truth for delivery, and drive the migration plan from legacy pipelines and tools.
Requires 8+ years in program/project/product management, 5+ years leading data platform initiatives in the cloud, experience with modern data stacks, and expertise in data governance and Agile practices.
Job Description:
• Own the multi-quarter program plan for the unified Data Lake: scope, roadmap, milestones.
• Build and maintain a single source of truth for delivery.
• Drive the migration plan from legacy pipelines and tools to the target stack.
• Define and maintain the Data Lake program backlog, translating business use cases into technical epics.
• Ensure data privacy, compliance, and security best practices across environments.

Requirements:
• 8+ years in Program/Project/Product Management
• 5+ years leading complex data platform initiatives in a cloud environment
• Hands-on experience with modern data stacks: one or more of Snowflake/BigQuery/Databricks; Azure Data Factory/Airflow; dbt; Kafka/Kinesis; Git/Terraform; REST/SFTP integrations
• Strong grounding in data governance and quality practices
• Demonstrated expertise in Agile at scale (Scrum/Kanban), Jira/Confluence, dependency/risk management, and budget tracking (including CAPEX/OPEX)

Benefits:
• Professional development opportunities
• Flexible working hours
• Health insurance
This job posting was last updated on 1/15/2026