$90K - $130K a year
Develop, optimize, and maintain Python-based ETL pipelines in Azure Synapse and coordinate with teams for releases and audits.
Requires 5+ years data engineering experience, 2+ years Python for data processing, 1+ year Microsoft Azure experience, and familiarity with BI and DevOps.
Who we are

At the heart of our outsourcing organization, the Data & Intelligence Competence Center serves as a dedicated hub for advanced data-driven solutions. We specialize in data engineering, analytics, and AI-powered insights, helping businesses turn raw information into actionable intelligence. By combining deep technical expertise with industry best practices, we enable smarter decision-making, optimize processes, and foster innovation across diverse sectors.

To deliver on this mission, we rely on talented professionals who can transform complex data challenges into robust, scalable solutions. This is where you come in.

We are seeking an experienced and motivated Data Engineer to join our team. The role focuses on the development, optimization, and maintenance of Python-based ETL pipelines within an Azure Synapse environment. You will also be responsible for ensuring the reliability, scalability, and performance of our data systems. The ideal candidate will have a strong technical background, excellent communication skills, and the ability to work collaboratively with diverse stakeholders.

What you'll be doing

Develop and optimize Python-based ETL pipelines using Azure Synapse
Maintain and improve existing pipelines for reliability and performance
Plan and manage capacity and sizing of cloud service components
Oversee operations and maintenance of software, infrastructure, and databases
Manage interactions with PowerBI APIs
Monitor system performance and ensure SLA compliance
Create and maintain service-related documentation
Optimize processes and configurations across environments
Coordinate with development, architecture, rollout, and operations teams, as well as customers, for releases and upgrades
Support risk assessments and audits for service-related components

What you'll bring along

Bachelor’s degree in Computer Science, Information Technology, or a related field
Minimum 5 years of experience in data engineering
Minimum 2 years of experience with Python for data processing (Polars, Pandas, PySpark, etc.)
At least 1 year of hands-on experience with Microsoft Azure
Solid understanding of cloud architecture and ITIL principles
Familiarity with the software-as-a-service (SaaS) model
Excellent communication skills with a high degree of customer focus
Experience with Azure Synapse, Delta Lake, and/or PowerBI
Exposure to Business Intelligence (BI) projects
Familiarity with DevOps practices and tools
Knowledge of DataVault 2.0 methodologies
Excellent command of spoken and written English
This job posting was last updated on 12/8/2025