$120K - $160K a year
Design, develop, and optimize data pipelines and data warehouse solutions using AWS Glue, PySpark, and Snowflake in a 12-month contract remote role.
8+ years of data engineering experience with strong expertise in PySpark, AWS Glue, DynamoDB, Snowflake, and familiarity with AWS data ecosystem and CI/CD practices.
Our direct client is looking for a Sr. AWS Data Engineer for a 12-month remote contract opportunity.

Primary Skills: Spark (PySpark), AWS Glue, AWS DynamoDB, Snowflake

Experience / Minimum Requirements:
• 8+ years of experience as a Data Engineer, with strong hands-on expertise in AWS Glue, PySpark, AWS DynamoDB, and Snowflake.
• Deep understanding of Spark architecture, distributed processing, and performance tuning techniques.
• Strong grasp of data modeling, schema design, and data warehouse concepts.
• Experience with the AWS data ecosystem, including S3, Lambda, and the Glue Catalog.
• Proficiency in Python (PySpark) for data transformation and automation tasks.
• Familiarity with CI/CD practices and infrastructure-as-code tools such as Terraform is a plus.
• Excellent communication and problem-solving skills, with the ability to work independently and in a team environment.
This job posting was last updated on 8/28/2025