RBA, Inc.

1 open position available
Actively hiring · Contract

Latest Positions


Senior Data Engineer - Contract

RBA, Inc. · Anywhere · Contract
Compensation: $120K - 150K a year

Summary: Designing, building, and optimizing data pipelines and platforms for analytics and AI/ML initiatives; collaborating with cross-functional teams; and implementing best practices for data governance. Requires 5+ years of data engineering experience, proficiency in Python and SQL, hands-on work with cloud platforms and data integration tools, and familiarity with distributed computing frameworks.

RBA is an established leader and trusted partner for enterprise and mid-size organizations seeking to transform their business through technology solutions. As a Digital and Technology consultancy, we combine strategic insight with technical expertise to deliver impactful, scalable solutions that align with business goals. We take pride in working with some of the most recognized companies in our market while fostering a culture that blends challenging career opportunities with a collaborative, fun work environment. We welcome talent across the U.S. and support remote, hybrid, and in-office work. Our office is located in the vibrant downtown of Wayzata, MN.

We are seeking a Senior Data Engineer to join our growing Data & Analytics practice. In this role, you'll be instrumental in architecting, building, and optimizing data pipelines and modern data platforms that power analytics, AI/ML, and business intelligence solutions for our clients. You'll partner with business stakeholders, data scientists, and software engineers to design scalable, secure, and high-performance data ecosystems. The ideal candidate is equally comfortable digging into complex data problems, enabling AI-driven applications, and setting technical direction for the team.

Responsibilities
• Design, build, and optimize end-to-end data pipelines across structured, semi-structured, and unstructured data sources.
• Architect and implement modern cloud-based data platforms leveraging tools such as Databricks, Snowflake, or equivalent.
• Collaborate with cross-functional teams to understand requirements and translate them into robust, scalable technical solutions.
• Implement and enforce best practices for data governance, quality, integrity, and security throughout the data lifecycle.
• Develop and optimize ETL/ELT processes using tools such as dbt, Airflow, or Azure Data Factory.
• Work with real-time streaming technologies (Kafka, Kinesis, or Event Hubs) for event-driven architectures.
• Enable AI and machine learning initiatives by designing data pipelines that support feature engineering, semantic search, and retrieval-augmented generation (RAG).
• Evaluate and integrate emerging architectures such as Microsoft Fabric and the Medallion Architecture to support both analytics and AI-driven applications.
• Mentor junior engineers and contribute to technical and thought leadership within the team and on client projects.
• Stay current with emerging trends in big data, AI/ML, and cloud-native architectures, introducing new tools and frameworks where appropriate.

Requirements
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 5+ years of proven experience in data engineering or data platform development.
• Strong programming skills in Python, Scala, or Java for data engineering tasks.
• Deep expertise with SQL and NoSQL databases, data modeling, and data warehousing.
• Hands-on experience with at least one major cloud platform (AWS, Azure, GCP).
• Proficiency with ETL/ELT, workflow orchestration, and data integration tools (e.g., Airflow, dbt, Azure Data Factory, AWS Glue).
• Familiarity with distributed computing frameworks (Spark, Hadoop, or similar).
• Strong understanding of CI/CD practices, DevOps for data, and infrastructure-as-code (Terraform, CloudFormation).
• Excellent communication skills and the ability to thrive in client-facing, collaborative environments.

Preferred Qualifications
• Hands-on experience with Microsoft Fabric and the Medallion Architecture for implementing scalable, structured lakehouse solutions.
• Knowledge of vector databases (e.g., Pinecone, Milvus, Weaviate, FAISS) and vector search for powering semantic search and AI-driven applications.
• Familiarity with AI/ML workflows: supporting data scientists, preparing features, and enabling retrieval-augmented generation (RAG) use cases.
• Certifications in cloud computing (AWS, Azure, or GCP).
• Experience with modern data platforms such as Databricks, Snowflake, BigQuery, or Synapse.
• Experience with data governance and cataloging tools (Collibra, Alation, Microsoft Purview).
• Exposure to machine learning pipelines and integration with cloud-based ML services.
• Experience with data visualization tools (Power BI, Tableau, Looker).

Why RBA?
At RBA, you'll work with some of the most innovative data and cloud solutions in the market, shaping how organizations harness data to drive real business outcomes. You'll enjoy:
- Impactful client projects across industries
- A culture of learning, collaboration, and growth
- Opportunities to influence technology direction and mentor others
- Flexibility to work remote, hybrid, or onsite

If you're passionate about building modern data solutions and driving digital transformation, we'd love to hear from you.

Python
SQL
Spark
Snowflake
Databricks
ETL/ELT
Cloud Platforms (AWS, Azure, GCP)
Data Modeling
Posted 11 days ago
