
Jobgether

via Workable


Big Data Engineer with Google Cloud (GCP)

Anywhere
full-time
Posted 10/18/2025
Direct Apply
Key Skills:
Big Data
Google Cloud
Data Engineering
ETL
Cloud Dataflow
BigQuery
Cloud Dataproc
Cloud Pub/Sub
Cloud Composer
Cloud Storage
NoSQL Databases
Relational Databases
AI
Machine Learning
Data Visualization
DevOps

Compensation

Salary Range

Not specified

Responsibilities

Architect, design, and implement data pipelines and enterprise infrastructure on Google Cloud Platform (GCP). Lead cloud transformation and migration projects, ensuring robust, secure, and high-performing data infrastructure.

Requirements

A Bachelor’s degree in Computer Science, Engineering, or a related field is required, along with the Google Cloud Certified Professional Cloud Architect certification. Candidates should have 8+ years of experience in data engineering, including 3-5+ years specifically on GCP.

Full Description

This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Big Data Engineer with Google Cloud (GCP) in the United States.

As a Big Data Engineer, you will play a pivotal role in architecting and implementing scalable data solutions on Google Cloud Platform. You will lead the migration of on-premises data systems to the cloud, build enterprise-level data pipelines, and optimize big data workloads for performance and cost-efficiency. This role offers the opportunity to work with cutting-edge cloud technologies such as BigQuery, Cloud Dataflow, Cloud Composer, and Kubernetes Engine, while influencing enterprise-wide data strategies. You will collaborate with cross-functional teams, providing technical guidance and ensuring robust, secure, and high-performing data infrastructure. The position combines technical depth with leadership, allowing you to shape cloud transformation initiatives and deliver impactful solutions at scale.

Accountabilities

Architect, design, and implement data pipelines and enterprise infrastructure on Google Cloud Platform (GCP).
Lead cloud transformation and migration projects, including strategy, design, and implementation of private and public cloud solutions.
Utilize GCP services such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, and Cloud Storage to build scalable data platforms.
Design and manage relational and NoSQL databases including Cloud Spanner, Cloud SQL, Cloud Bigtable, and Cloud Firestore.
Implement advanced analytics, AI, and machine learning pipelines to support business intelligence and data-driven decision-making.
Monitor and optimize cloud workloads for cost, performance, and security efficiency.
Provide technical guidance and subject matter expertise to cross-functional teams and senior management.
Ensure proper documentation, best practices, and continuous improvement in cloud architecture and governance.
Requirements

Bachelor’s degree in Computer Science, Engineering, or a related field.
Google Cloud Certified Professional Cloud Architect certification required.
8+ years of experience in data engineering, with 3-5+ years on GCP and deep hands-on experience designing and deploying cloud data solutions.
Expertise in ETL and Big Data tools such as BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, and Google Data Studio.
Experience with NoSQL databases, search technologies (Lucene, Elasticsearch), and relational databases (Cloud Spanner, Cloud SQL).
Strong software development background, with knowledge of SDLC, DevOps, CI/CD, and Agile methodologies.
Proficiency in data visualization tools such as Kibana, Grafana, and Tableau.
Excellent communication and leadership skills, with experience advising stakeholders and collaborating with cross-functional teams.
Desired: knowledge of GCP storage lifecycle management, BigQuery slots management, cost optimization, and Hadoop ecosystem tools (HDFS, Hive, Spark, Kafka, NiFi, Oozie, Splunk).

Benefits

Competitive long-term contract compensation.
Remote work flexibility (location-dependent within the U.S.).
Opportunity to lead large-scale cloud migration and data transformation projects.
Exposure to cutting-edge GCP technologies and enterprise-scale big data infrastructure.
Collaborative environment with mentorship opportunities and influence on enterprise architecture.
Professional growth through complex, high-impact data engineering initiatives.

About Jobgether

Jobgether is a Talent Matching Platform that partners with companies worldwide to efficiently connect top talent with the right opportunities through AI-driven job matching. When you apply, your profile goes through our AI-powered screening process, designed to identify top talent efficiently and fairly.

🔍 Our AI evaluates your CV and LinkedIn profile thoroughly, analyzing your skills, experience, and achievements.
📊 It compares your profile to the job’s core requirements and past success factors to determine your match score.
🎯 Based on this analysis, we automatically shortlist the 3 candidates with the highest match to the role.
🧠 When necessary, our human team may perform an additional manual review to ensure no strong profile is missed.

The process is transparent, skills-based, and free of bias, focusing solely on your fit for the role. Once the shortlist is completed, we share it directly with the company that owns the job opening. The final decision and next steps (such as interviews or additional assessments) are then made by their internal hiring team. Thank you for your interest!

#LI-CL1

This job posting was last updated on 10/19/2025
