Find your dream job faster with JobLogr
AI-powered job search, resume help, and more.
Streamline
via Remote Rocketship

All our jobs are verified and come from trusted employers and sources. We connect to legitimate platforms only.

Senior Data Engineer – Snowflake, Azure, SaaS, Python

Anywhere
Full-time
Posted 2/11/2026
Verified Source
Key Skills:
Azure Functions
Snowflake
Python
SQL
Data Pipelines

Compensation

Salary Range

$120K - $200K a year

Responsibilities

Design, develop, and optimize data pipelines and workflows using Azure and Snowflake to support analytics and reporting.

Requirements

Strong expertise in Azure data services, Snowflake, SQL, Python, and experience with streaming and batch data ingestion.

Full Description

Job Description:

• Design, develop, and deploy Azure Functions and broader Azure data services to extract, transform, and load data into Snowflake data models and marts.
• Implement automated data quality checks, monitoring, and alerting to ensure accuracy, completeness, and timeliness across all pipelines.
• Optimize workloads to reduce cloud hosting costs, including right-sizing compute, tuning queries, and leveraging efficient storage and caching patterns.
• Build and maintain ELT/ETL workflows and orchestration to integrate multiple internal and external data sources at scale.
• Design data pipelines that support both near real-time streaming data ingestion and scheduled batch processing to meet diverse business requirements.
• Collaborate with engineering and product teams to translate requirements into robust, secure, and highly available data solutions.

Requirements:

• Strong expertise with the Azure data stack (e.g., Azure Functions, Azure Data Factory, Event/Service Bus, storage) and Snowflake for analytical workloads.
• Proven experience designing and operating production data pipelines, including CI/CD, observability, and incident response for data systems.
• Advanced SQL and performance tuning skills, with experience optimizing transformations and Snowflake queries for cost and speed.
• Solid programming experience in Python or a similar language for building reusable ETL components, libraries, and automation.
• Experience with streaming and batch ingestion patterns (e.g., Kafka, Spark, Databricks) feeding Snowflake.
• Familiarity with BI and analytics tools (e.g., Power BI, Grafana) consuming Snowflake data models.
• Background in DevOps practices, including containerization, CI/CD pipelines, and infrastructure-as-code for data platforms.
• Experience with modern data transformation tools (e.g., dbt) and data observability platforms for monitoring data quality, lineage, and pipeline health.
Benefits:

• A challenging and rewarding role in a dynamic and international environment.
• Opportunity to be part of a growing company with a strong commitment to innovation and excellence.
• A supportive and collaborative team culture that values personal growth and development.
• Competitive compensation and benefits package.

This job posting was last updated on 2/16/2026

Ready to have AI work for you in your job search?

Sign up for free and start using JobLogr today!
