$120K - $200K a year
Designing and optimizing data pipelines and analytics solutions on GCP, supporting marketing and healthcare data needs.
Requires advanced GCP, data engineering, and API integration skills, plus experience with HIPAA compliance and marketing attribution.
Job Description

We are seeking a senior, part-time Data Generalist to support analytics, data engineering, and MarTech initiatives over a 4-month contract (March–June); the engagement may extend beyond that, but that is our current expectation. This role is ideal for a highly autonomous engineer who can operate with minimal handholding in a fast-moving environment. The role is remote, based in North America, and averages ~15 hours per week, with a workload that may be uneven or “spiky” depending on project needs. Strong communication and comfort in client-facing discussions are essential.

Responsibilities
• Design, build, and optimize data pipelines and analytics solutions on Google Cloud Platform (GCP).
• Develop and optimize BigQuery datasets using partitioning, clustering, and query optimization.
• Build and maintain ETL/ELT pipelines using Python, SQL, and tools such as Fivetran and dbt.
• Develop and support custom APIs using Cloud Functions or Cloud Run.
• Implement event-driven architectures using Pub/Sub and workflow orchestration via Cloud Composer (Airflow) or Cloud Workflows.
• Apply IAM and security best practices, including handling PHI/PII in HIPAA-compliant environments.
• Support marketing data use cases, including attribution, Google Ads, Meta Ads, and API-based integrations.
• Partner closely with the MarTech Architect to translate business and marketing requirements into scalable technical solutions.
• Communicate clearly with internal teams and clients, explaining technical concepts and tradeoffs.

Requirements

Must-Have
• Strong expertise in Google Cloud Platform, including:
  • BigQuery (partitioning, clustering, optimization)
  • Cloud Functions and/or Cloud Run
  • Pub/Sub
  • IAM and security best practices
  • Cloud Composer (Airflow) or Cloud Workflows
• Solid data engineering background, including:
  • Python (pandas, requests, google-cloud libraries)
  • Advanced SQL (complex queries, window functions, performance tuning)
  • ETL/ELT pipeline design
  • Data modeling (star and snowflake schemas)
  • Familiarity with dbt
  • Experience with data clean rooms
• Experience in healthcare and/or marketing technology, including:
  • HIPAA compliance requirements
  • PHI/PII handling
  • Marketing attribution concepts
  • API integrations (REST, OAuth)
• Hands-on experience with:
  • Fivetran
  • Git version control
  • Terraform (infrastructure as code)
  • Monitoring tools (Datadog, Cloud Monitoring)
  • Analytics platforms such as Tableau or Looker
• Strong written and verbal communication skills; comfortable working directly with clients.
• Ability to work independently with limited oversight; senior-level judgment and execution.

Nice-to-Have
• Experience with LiveRamp or IQVIA
• Prior healthcare data background
• Backend development experience using JavaScript/TypeScript
This job posting was last updated on 2/12/2026