via Teamtailor
$90K - 130K a year
Lead data engineering execution focusing on data migrations, transformations, validation, and building production-grade data pipelines and APIs.
Requires 7+ years in data engineering with strong Python, SQL/NoSQL, and Azure data tooling skills, plus leadership experience on data migration projects.
We are seeking an experienced Data Engineer Lead to support complex application and product development initiatives. This role focuses on designing, building, and leading data migrations, pipelines, and models that power web and mobile software experiences. You will work closely with product and engineering teams in a fast-paced, collaborative environment. This is a hands-on technical leadership role with responsibility across the full data lifecycle, from design through deployment and ongoing support.

What You'll Do
- Lead data engineering execution for complex application initiatives
- Design and deliver data migrations between systems, including:
  - Data mapping and transformation
  - Validation and reconciliation
  - Ensuring accuracy and completeness across environments
- Build and maintain production-grade data pipelines, APIs, and data management tools
- Model data and analyze source and target schemas
- Write complex SQL and work with both relational and NoSQL databases
- Resolve data quality issues during migrations
- Support continuous delivery across the lifecycle: design, build, deploy, test, monitor, and support
- Contribute to discussions around data quality, testing strategy, and automation
- Collaborate closely with engineers and stakeholders in an agile environment

How You'll Succeed
- Deliver reliable, production-ready data solutions
- Ensure migrations are accurate, secure, and well-tested
- Maintain high standards for performance, simplicity, and quality
- Communicate clearly with both technical and non-technical partners
- Move quickly while maintaining strong attention to detail

Who You Are
- 7+ years of experience in a data-focused engineering role
- Strong experience leading data migration initiatives
- Proficient in Python with a track record of shipping production pipelines
- Strong experience with SQL and NoSQL databases (SQL Server and MongoDB preferred)
- Comfortable building scripts or services for data movement and validation
- Experience with Azure data tools (such as Azure Data Factory, Azure Data Lake, Microsoft Fabric) or comparable technologies
- Experience with Git and distributed version control
- Familiar with automated deployments and continuous delivery practices
- Strong understanding of REST APIs, JSON, testing, security, and performance best practices
- Bachelor's degree in Computer Science or related field, or equivalent experience
- Strong verbal and written communication skills
- Thrive in collaborative, service-oriented environments

The successful candidate will be required to work PST hours.
This job posting was last updated on 2/26/2026