The Data Engineer will design and implement scalable data pipelines using Snowflake and other cloud technologies. They will also develop and maintain ETL/ELT processes while ensuring data quality and security across all platforms.
Candidates should have 3+ years of experience in data engineering, with hands-on Snowflake experience and proficiency in SQL and scripting languages such as Python. Familiarity with cloud platforms and ETL tools is also required.
Description

Location: 3 days onsite in Boston
Long-term contract to hire

Key Responsibilities:
- Design and implement scalable data pipelines using Snowflake and other cloud-based data platform technologies.
- Develop and maintain ETL/ELT processes to ingest data from various sources.
- Optimize Snowflake performance through clustering, partitioning, query tuning, and materialized views.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions.
- Ensure data quality, integrity, and security across all data platforms.
- Automate data workflows and implement monitoring and alerting systems.
- Maintain documentation for data architecture, processes, and best practices.
- Stay current with Snowflake features and industry trends to continuously improve data infrastructure.

Requirements

Required Qualifications:
- 3+ years of experience in data engineering or related roles.
- Hands-on experience with Snowflake, including data modeling, performance tuning, and security.
- Proficiency in SQL and scripting languages (e.g., Python).
- Experience with ETL tools (e.g., dbt, Apache Airflow, Talend).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Strong understanding of data warehousing concepts and best practices.

Preferred Qualifications:
- Snowflake certification(s).
- Experience with CI/CD pipelines and DevOps practices.
- Knowledge of data governance and compliance standards.
- Excellent problem-solving and communication skills.
This job posting was last updated on 9/25/2025