Databricks

via WantRemote

Resident Solutions Architect – Public Sector

Anywhere
Full-time
Posted 1/8/2026
Verified Source
Key Skills:
distributed computing
Apache Spark
cloud ecosystems (AWS, Azure, GCP)
data architecture
CI/CD
MLOps

Compensation

Salary Range

$200K a year

Responsibilities

Designing and building data architectures, guiding customer projects, and providing technical support for big data and AI applications.

Requirements

6+ years in data engineering, experience with Spark, cloud platforms, Python or Scala, and project delivery skills.

Full Description

Job Description:
• Handle a variety of impactful customer technical projects, which may include designing and building reference architectures, creating how-to guides, and productionizing customer use cases
• Work with engagement managers to scope a variety of professional services work with input from the customer
• Guide strategic customers as they implement transformational big data projects and third-party migrations, including end-to-end design, build, and deployment of industry-leading big data and AI applications
• Consult on architecture and design; bootstrap or implement customer projects that lead to the customer's successful understanding, evaluation, and adoption of Databricks
• Provide an escalated level of support for customer operational issues
• Collaborate with the Databricks Technical, Project Manager, Architect, and Customer teams to ensure the technical components of the engagement are delivered to meet the customer's needs
• Work with Engineering and Databricks Customer Support to provide product and implementation feedback and to guide rapid resolution of engagement-specific product and support issues

Requirements:
• US Top Secret clearance required for this position
• 6+ years of experience in data engineering, data platforms, and analytics
• Comfortable writing code in either Python or Scala
• Working knowledge of two or more common cloud ecosystems (AWS, Azure, GCP), with expertise in at least one
• Deep experience with distributed computing using Apache Spark™ and knowledge of Spark runtime internals
• Familiarity with CI/CD for production deployments
• Working knowledge of MLOps
• Experience designing and deploying performant end-to-end data architectures
• Experience with technical project delivery, including managing scope and timelines
• Documentation and whiteboarding skills
• Experience working with clients and managing conflicts
• Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions for customer projects
• Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent experience gained through work
• Ability to travel up to 30% when needed

Benefits:
• Comprehensive benefits and perks that meet the needs of all employees

This job posting was last updated on 1/9/2026
