via LinkedIn
Lead data migration from InterSystems to cloud platforms like Snowflake and Databricks, building scalable ETL pipelines, ensuring HIPAA compliance, and collaborating with global teams.
Requires 5+ years as a Data Engineer with hands-on experience in healthcare data migration, cloud data warehousing, and large-scale ETL pipeline development, plus knowledge of HIPAA and healthcare data standards.
Data Engineer – InterSystems to Cloud Migration (Snowflake/Databricks)
Location: Remote (NYC)
Contract: Ongoing, part-time

About BigRio
BigRio is a Boston-headquartered, remote-first technology consulting firm specializing in AI/ML, Cloud Transformation, Data Engineering, Healthcare Modernization, and Digital Innovation. We partner with leading healthcare, life sciences, and enterprise organizations to modernize legacy systems, build intelligent data platforms, and accelerate their path to AI-driven operations. As part of our expanding Cloud & Data Engineering practice, we are looking for a talented Data Engineer to support a high-impact healthcare data migration initiative. At BigRio, you’ll work with a collaborative, cross-functional global team while leveraging cutting-edge technologies to solve complex real-world challenges.

About the Role
We are seeking a highly skilled Data Engineer with hands-on experience migrating data from InterSystems to modern cloud data platforms, specifically Snowflake and Databricks. This role focuses on building secure, scalable pipelines to migrate large-volume datasets (5M+ records), including accurate de-identification of PHI to ensure compliance with healthcare privacy standards. You will be a core contributor to BigRio’s cloud modernization solutions, working closely with internal stakeholders and client teams.

Key Responsibilities
• Lead end-to-end migration of datasets from InterSystems (Caché/IRIS) to Snowflake and Databricks within BigRio’s cloud modernization framework.
• Build, optimize, and maintain large-scale ETL pipelines supporting 5M+ records with high performance and reliability.
• Implement PHI de-identification and masking processes aligned with HIPAA and BigRio security best practices.
• Develop scalable data processing jobs using Databricks (Spark, PySpark, Delta Lake).
• Integrate data workflows with AWS RDS, Snowflake, and Databricks.
• Work with BigRio’s architecture and AI engineering teams to support analytics, reporting, and ML/GenAI initiatives.
• Ensure data quality, monitoring, and validation throughout migration cycles.
• Maintain documentation of architecture, data lineage, transformation logic, and operational procedures.
• Collaborate with distributed BigRio and client teams across the U.S. and India.

Required Skills & Experience
• 5+ years as a Data Engineer, with strong cloud engineering experience.
• Hands-on expertise with InterSystems (Caché/IRIS): extraction, schema mapping, and transformations.
• Strong experience building solutions with Snowflake and cloud-native data warehousing.
• Expertise with Databricks, Spark, and PySpark for large-scale processing.
• Proven ability to build and optimize ETL pipelines handling 5M+ record volumes.
• Strong understanding of PHI de-identification, HIPAA compliance, and healthcare data workflows.
• Experience with AWS RDS, SQL, Python, and cloud orchestration tools.
• Strong foundation in data modeling, performance tuning, and scalable architecture.
• Excellent communication skills; comfortable working with global remote teams.

Preferred Qualifications
• Experience with HL7, FHIR, CCD, or other healthcare data interoperability standards.
• Exposure to dbt, Airflow, Step Functions, or similar orchestration tools.
• Experience in healthcare data modernization or cloud migrations.
This job posting was last updated on 12/11/2025