Keylent

via Dice


Google Cloud Platform Data Engineer - Onsite (Atlanta/NJ) - 12+ years of experience - local or nearby candidates only; F2F interview required for the client round; Google Cloud Platform certification mandatory.

Anywhere
Contract
Posted 2/10/2026
Verified Source
Key Skills:
Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Run)
ETL/ELT development
Data modeling and warehousing
SQL and Python scripting
Data pipeline optimization

Compensation

Salary Range

Not specified

Responsibilities

Design, build, and optimize data pipelines and analytics solutions on Google Cloud Platform, including data extraction, transformation, loading, and modeling.

Requirements

Minimum 8 years in data engineering, strong GCP service expertise, proficiency in SQL and Python, experience with data modeling, and familiarity with CI/CD and IaC tools.

Full Description

Google Cloud Platform Data Engineer - Onsite (Atlanta/NJ) - 12+ years of experience - local or nearby candidates only; F2F interview required for the client round; Google Cloud Platform certification mandatory.

Kindly share profiles with 12+ years of experience. Locals preferred.

We are seeking a skilled Google Cloud Platform Data Engineer to design, build, and optimize data pipelines and analytics solutions in the cloud. The ideal candidate must have hands-on experience with Google Cloud Platform data services, strong ETL/ELT development skills, and a solid understanding of data architecture, data modeling, data warehousing, and performance optimization.

Key Responsibilities:
• Develop ETL/ELT processes to extract data from various sources, transform it, and load it into BigQuery or other target systems.
• Build and maintain data models, data warehouses, and data lakes for analytics and reporting.
• Design and implement scalable, secure, and efficient data pipelines on Google Cloud Platform using tools such as Dataflow, Pub/Sub, Cloud Run, Python, and Linux scripting.
• Optimize BigQuery queries, manage partitioning and clustering, and handle cost optimization (see the illustrative sketch after this description).
• Integrate data from on-premise and cloud systems using Cloud Storage and APIs.
• Work closely with DevOps teams to automate deployments using Terraform, Cloud Build, or CI/CD pipelines.
• Ensure security and compliance by applying IAM roles, encryption, and network controls.
• Collaborate with data analysts, data scientists, and application teams to deliver high-quality data solutions.
• Implement best practices for data quality, monitoring, and governance.

Required Skills and Experience:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Minimum 8 years of experience in data engineering, preferably in a cloud environment.
• Minimum 3 years of hands-on, strong expertise in Google Cloud Platform services: BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Balancer, Pub/Sub, IAM, Logging, and Monitoring.
• Proficiency in SQL, Python, and Linux scripting.
• Prior experience with ETL tools such as DataStage, Informatica, or SSIS.
• Familiarity with data modeling (star/snowflake) and data warehouse concepts.
• Understanding of CI/CD, version control (Git), and Infrastructure as Code (Terraform).
• Strong problem-solving and analytical mindset.
• Effective communication and collaboration skills.
• Ability to work in an agile, fast-paced environment.
• Google Cloud Platform Professional Data Engineer or Cloud Architect certification is a plus.
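
Illustrative sketch: to make the BigQuery partitioning, clustering, and cost-optimization duties listed above concrete, the snippet below uses the google-cloud-bigquery Python client. The project, dataset, table, and column names (my-project, analytics, orders, event_ts, customer_id, amount) are hypothetical placeholders, not details from this posting.

from google.cloud import bigquery

# Hypothetical project/dataset/table names, for illustration only.
client = bigquery.Client(project="my-project")

schema = [
    bigquery.SchemaField("event_ts", "TIMESTAMP"),
    bigquery.SchemaField("customer_id", "STRING"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

# Day-partition on the event timestamp and cluster on customer_id so that
# date-bounded queries scan fewer bytes (the usual BigQuery cost lever).
table = bigquery.Table("my-project.analytics.orders", schema=schema)
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="event_ts"
)
table.clustering_fields = ["customer_id"]
client.create_table(table, exists_ok=True)

# Filtering on the partitioning column lets BigQuery prune partitions,
# which keeps scanned bytes (and therefore query cost) down.
sql = """
    SELECT customer_id, SUM(amount) AS total_amount
    FROM `my-project.analytics.orders`
    WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    GROUP BY customer_id
"""
for row in client.query(sql).result():
    print(row.customer_id, row.total_amount)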

This job posting was last updated on 2/16/2026
