1 open position available
Summary
Design, develop, optimize, and maintain scalable Snowflake data warehouse solutions and ETL pipelines while collaborating with cross-functional teams. Requires 3+ years of Snowflake experience, strong SQL and data modeling skills, ETL tool proficiency, cloud platform familiarity, and knowledge of data governance and security.

About the Role
We're currently supporting a leading client network in identifying talented Snowflake Cloud Data Engineers for upcoming contract-based roles. As part of this opportunity, you'll be matched with innovative organizations that are leveraging modern cloud data infrastructure to solve real-world business challenges.

In this role, you will be responsible for designing, developing, and optimizing scalable data pipelines and warehouse solutions using Snowflake. You'll work closely with data analysts, architects, and business stakeholders to ensure data availability, quality, and performance across the organization.

Responsibilities
• Design and implement scalable Snowflake data warehouse solutions
• Develop ELT/ETL pipelines using Snowflake and data integration tools (e.g., dbt, Talend, Informatica, Fivetran, Matillion)
• Optimize the performance of Snowflake queries and manage data storage and compute costs
• Implement security, data governance, and access controls within Snowflake
• Collaborate with cross-functional teams to understand data needs and deliver solutions
• Monitor, troubleshoot, and ensure the reliability of data pipelines and environments
• Maintain documentation related to data architecture and processes

Requirements
• 3+ years of experience working with Snowflake in a production environment
• Strong SQL and data modeling skills (star schema, snowflake schema)
• Hands-on experience with ELT/ETL tools and scripting languages (Python preferred)
• Familiarity with cloud platforms (AWS, Azure, or GCP)
• Experience with CI/CD and version control tools (e.g., Git, GitHub)
• Skilled in Snowflake performance tuning and optimization
• Practical knowledge of advanced Snowflake features (e.g., Streams, Tasks, Time Travel)
• Experience with data orchestration tools (e.g., Airflow, dbt Cloud, Dagster)
• Knowledge of data governance, security, and compliance best practices
• Exposure to data validation frameworks and automated testing tools
• Comfortable working in agile development environments and collaborating across teams
• Excellent problem-solving, documentation, and communication skills

Preferred Qualifications
• Snowflake SnowPro Certification
• Experience with data visualization tools (e.g., Power BI, Tableau)
• Background in analytics or data science

Eligibility Requirements
• No visa sponsorship is available at this time
• Open to U.S. citizens and Green Card holders only
• You must currently be located in the United States

Job Type: Contract
Pay: $65.00 - $90.00 per hour
Expected Hours: No fewer than 35 per week
Compensation Package: 1099 contract
Schedule: 8-hour shift
Work Location: Remote