ExecutivePlacements.com
via LinkedIn

Snowflake Data Engineer (with AWS)

Nashville, TN
Full-time
Posted 11/26/2025
Verified Source
Key Skills:
Java
Kubernetes
AWS
Snowflake
Data Pipelines
Apache NiFi
Apache Flink
Prometheus
GitHub
Jenkins
Terraform
Kafka

Compensation

Salary Range

$120K - $160K a year

Responsibilities

Build and maintain data pipelines and cloud applications using Java, Kubernetes, and AWS, collaborating with a team to deliver data movement and transformation services.

Requirements

3+ years in data and cloud application engineering with strong Java, Kubernetes, AWS skills, and experience in data warehousing and pipeline tools like NiFi and Flink.

Full Description

Software and Data Engineering: Utilize your expertise in Java and Kubernetes to build and maintain robust data solutions. DevOps and monitoring experience, as well as strong knowledge of the SDLC, are required. Build and manage dozens of data pipelines to source and transform data based on business requirements.

Innovation and Learning: Quickly learn new technologies by applying your current skills, staying ahead of industry trends and advancements. Self-identify the need for new skills and adopt new technologies into your skill set within a month.

Team Collaboration: Collaborate within a Pod of 4+ data engineers, working toward common objectives in a consultative fashion with clients.

Data Movement and Transformation: Use Apache NiFi and Apache Flink for data movement, streaming, and transformation services, ensuring efficient and reliable data workflows.

Experience:
3+ years of experience in data and cloud application engineering
2+ years of experience working with Snowflake and AWS cloud services
3+ years of experience in a data warehousing environment
3+ years of experience building data pipelines (NiFi, DBT, Apache Airflow, Matillion, etc.)

Technical Skills:
Java: Advanced skills in Java programming
Kubernetes: Proficiency in using Kubernetes for container orchestration
AWS: Strong knowledge of AWS services
Prometheus: Expertise in using Prometheus for monitoring
GitHub: Proficiency in using GitHub for version control
Jenkins: Experience in using Jenkins for continuous integration and delivery
Terraform: Strong knowledge of Terraform for infrastructure as code
Kafka: Experience in using Kafka for streaming data

Nice to Have:
Domain Expertise: Experience in financial data analysis, risk, and compliance data management
Learning Agility: Ability to quickly learn new technologies by applying current skills
Adaptability: Ability to adapt to new technologies quickly and efficiently

This job posting was last updated on 11/27/2025