$156K - $177K a year
Lead the migration of legacy data pipelines to Snowflake on AWS, architect enterprise data solutions, and provide technical leadership and client communication.
5+ years in data engineering with strong AWS and Snowflake expertise, ETL/ELT pipeline design skills, cloud architecture experience, and client-facing abilities.
Data Engineer: AWS & Snowflake (Contract) 2-3 Month Contract | Remote | High potential of extension & expansion About PartnerMax PartnerMax is a fast-growing data analytics consultancy that helps non-technical businesses become technology powerhouses. We specialize in enterprise-grade data solutions with startup agility, serving global brands and focusing on meaningful business impacts. We are headquartered in Miami, Florida and are seeing strong growth momentum since our inception. The Role We're seeking an AWS / Snowflake Data Engineer for a 2-3 month engagement to lead a pipeline migration project for an enterprise client. You'll be working closely with our Technical Director and the client’s internal team to help them migrate off of legacy tech into Snowflake. This is a high-impact contract role where you'll help architect enterprise data solutions, lead technical implementations, and deliver measurable business outcomes for a multi-phase project. What You'll Do: Lead Complex Technical Projects • Architect and implement data pipelines and cloud migrations (AWS DMS, Snowflake, ETL/ELT) • Design multi-tenant data architectures with enterprise-grade security • Migrate legacy systems to modern cloud platforms such as Snowflake. • Use Snowflake’s functionality to build tables, transformation logic and star schemas • Build real-time data ingestion pipelines using AWS services and Snowflake Client-Facing Technical Leadership • Lead technical discovery sessions with client engineering teams • Work with PartnerMax leadership to present recommendations to non-technical stakeholders • Provide technical guidance for working with structured and unstructured data. Technology Expertise • Core Stack: AWS (DMS, S3, DynamoDB, Lambda), Snowflake, data visualization tools (PowerBI, Tableau, Sigma Computing) • Specializations: ETL/ELT pipeline design, cloud data architecture, API integrations • Bonus: Experience in medallion architectures and embedded analytics What We're Looking For Required Experience • 5+ years in data engineering, cloud migrations, or similar technical roles • Strong AWS expertise: DMS, S3, DynamoDB, IAM, data pipeline services • Snowflake proficiency: Tasks, Streams, Snowpipe, performance optimization • ETL/ELT experience: Pipeline design, data transformation logic, scheduling and orchestration • Client-facing skills: Ability to communicate technical concepts to business stakeholders Technical Must-Haves • SQL, AWS & Snowflake mastery: Complex queries, stored procedures, performance tuning • Cloud architecture experience: Multi-tenant systems, security, scalability, • Data migration expertise: Legacy system modernization, parallel run strategies • Problem-solving mindset: Debug complex data issues, design robust error handling Preferred Background • Familiarity with ELT tools like Hevo, Fivetran, or similar ETL tools • Experience with PowerBI, Tableau, or Sigma Computing Contract-Specific Requirements • Availability: Full-time commitment for 2-3 months • Remote collaboration: Work closely with PartnerMax team and client technical staff Deliverable-driven: Success measured by project completion and client satisfaction Job Type: Contract Pay: $75.00 - $85.00 per hour Expected hours: 40 per week Work Location: Remote
This job posting was last updated on 8/11/2025