DMV IT Service

via Workable


Data Engineer

Anywhere
Contract
Posted 10/8/2025
Direct Apply
Key Skills:
Google Cloud Platform (Pub/Sub, BigQuery, Dataform, Cloud Storage, Cloud Composer)
SQL
Data ingestion
Data pipeline development
GitHub
Data validation
Testing (unit, system, UAT)
RESTful APIs

Compensation

Salary Range

$80K - $110K a year

Responsibilities

Design, develop, and maintain data ingestion and transformation pipelines on GCP, ensuring data quality, performance, and security.

Requirements

Bachelor's degree, 3+ years in data engineering, proficiency with GCP components, strong SQL skills, and experience with GitHub.

Full Description

Job Title: Data Engineer
Location: Freeport, ME
Employment Type: Contract

About Us

DMV IT Service LLC, founded in 2020, is a trusted IT consulting firm specializing in IT infrastructure optimization, cybersecurity, networking, and staffing solutions. We partner with clients to achieve their technology goals through expert guidance, workforce support, and innovative solutions. With a client-focused approach, we also provide online training and job placements, ensuring long-term IT success.

Job Purpose

The Data Engineer will design, develop, and maintain robust data ingestion and pipeline processes on the Google Cloud Platform (GCP). The role focuses on transforming raw data into structured, high-quality datasets that support business analytics, reporting, and decision-making. This position ensures the reliability, scalability, and security of data workflows while continuously optimizing performance and supporting production systems.

Key Responsibilities

Design, build, and optimize data ingestion and transformation pipelines within GCP.
Translate business requirements into detailed technical designs and specifications.
Develop solutions using a combination of code development and configuration tools.
Perform data validation and troubleshooting through advanced SQL queries.
Conduct unit, system, and user acceptance testing, and document results.
Manage project deliverables independently or as part of a team, ensuring timely completion.
Participate in code and design reviews, maintaining compliance with security and coding standards.
Provide technical expertise and production support during on-call rotations.
Track project timelines and communicate progress to stakeholders.
Identify and recommend process enhancements to improve data efficiency and reliability.

Required Skills & Experience

Education: Bachelor's degree in Computer Science, Information Technology, or a related discipline.
Experience: At least 3 years of professional experience in data engineering, data pipeline development, or related fields.

Technical Skills (Must Have):
Proficiency with Google Cloud Platform (GCP) components, including Google Pub/Sub, BigQuery, Google Dataform, Google Cloud Storage, and Cloud Composer
Strong knowledge of SQL for querying, analysis, and debugging
Experience with data ingestion into BigQuery
Familiarity with GitHub for version control

Preferred Skills:
Experience using Cloud Data Fusion
Working knowledge of relational databases such as SQL Server, DB2, or Oracle
Understanding of RESTful APIs and SOAP Web Services integration
Exposure to Jira and Confluence for project collaboration
Experience connecting to various Google Cloud services and external databases

This job posting was last updated on 10/11/2025
