via SimplyHired
$90K - 130K a year
Design, develop, optimize, and maintain ETL workflows and data pipelines to ensure data quality and alignment with business requirements.
Bachelor's degree, 5+ years ETL or data engineering experience, proficiency with ETL tools and SQL, knowledge of data warehousing and cloud platforms, scripting skills, and relevant certifications preferred.
Job Description:
• Design & Develop ETL Workflows
• Build robust, scalable data pipelines using CloverDX to extract, transform, and load data across systems
• Handle complex data sources (e.g., derivatives trade data, financial systems) and ensure alignment with business and regulatory requirements
• Transform raw data into structured formats using business rules and logic
• Load data into target databases, data lakes, or warehouses
• Optimize performance of ETL processes for scalability and efficiency
• Conduct data quality checks and validation to ensure accuracy and consistency
• Collaborate with data analysts, engineers, and business stakeholders
• Maintain and troubleshoot ETL pipelines and resolve data-related issues
• Document ETL processes and maintain technical specifications

Requirements:
• Bachelor's degree in Computer Science, Information Systems, or a related field
• Over 5 years of experience in ETL development or data engineering
• Proficiency in SQL and ETL tools (e.g., Informatica, Talend, Apache NiFi, SSIS)
• Experience with data warehousing concepts and data modeling
• Strong understanding of database systems (e.g., Oracle, MySQL, PostgreSQL)
• Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud)
• Knowledge of scripting languages like Python or Shell
• Certifications in ETL tools or cloud platforms are a plus

Benefits:
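The extract/transform/load cycle with data-quality validation described in the responsibilities above can be sketched minimally in Python (the posting's actual pipelines use CloverDX; the field names, business rules, and in-memory "warehouse" here are purely hypothetical illustrations):

```python
# Minimal ETL sketch: extract, transform with business rules,
# validate for data quality, then load. All field names and
# rules are hypothetical examples, not the employer's schema.

def extract(source_rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(source_rows)

def transform(raw_rows):
    """Transform: apply business rules — normalize types and formats."""
    out = []
    for r in raw_rows:
        out.append({
            "trade_id": str(r["trade_id"]).strip(),
            "notional": float(r["notional"]),
            "currency": str(r["currency"]).upper(),
        })
    return out

def validate(rows):
    """Data-quality check: separate records that pass basic rules."""
    good, bad = [], []
    for r in rows:
        if r["trade_id"] and r["notional"] > 0 and len(r["currency"]) == 3:
            good.append(r)
        else:
            bad.append(r)
    return good, bad

def load(rows, target):
    """Load: write validated records into a target store (dict keyed by id)."""
    for r in rows:
        target[r["trade_id"]] = r
    return target

raw = [
    {"trade_id": " T1 ", "notional": "1000000", "currency": "usd"},
    {"trade_id": "T2", "notional": "-5", "currency": "EUR"},  # fails quality check
]
warehouse = {}
good, bad = validate(transform(extract(raw)))
load(good, warehouse)
# warehouse now holds the one clean record; bad holds the rejected one
```

In a production pipeline, each stage would read from and write to real systems (databases, data lakes, message queues), and rejected records would be logged or quarantined rather than silently dropped.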
This job posting was last updated on 11/27/2025