$120K - $200K a year
Build and maintain robust data pipelines, evaluate data lakehouse platforms, and support financial data analysis for clients.
Requires proficiency in SQL, Python, and the SDLC, experience with Databricks and Snowflake, and strong client collaboration skills.
Data Engineering: Utilize your ETL/data engineering expertise in Databricks, Snowflake, and cloud data services to build and maintain robust data solutions. Lead the evaluation of catalogs and query engines for the Data Lakehouse platform, documenting findings and reviewing them with the architecture teams. SQL, Python, and strong knowledge of the SDLC are required. Build and manage dozens of data pipelines to source and transform data based on business requirements.
Financial Data Analysis: Apply your knowledge of financial data analysis, risk, and compliance data management to support our financial services customers.
Data Analysis and Discovery: Leverage Databricks and Snowflake for data analysis and discovery, ensuring data is accessible and actionable. Draw on 10+ data sources to derive insights.
Innovation and Learning: Quickly learn new technologies by applying your current skills and staying ahead of industry trends and advancements. Self-identify skills that need to be developed and adopt new technologies into your skill set within a month.
Client Collaboration: Work closely with financial services clients to build modern data solutions that transform how they leverage data for key business decisions, investment portfolio performance analysis, and risk and compliance management. Manage multiple stakeholder groups and their requirements.
This job posting was last updated on 2/16/2026