Find your dream job faster with JobLogr
AI-powered job search, resume help, and more.
Try for Free
Ajna Infotech

via Indeed

All our jobs are verified from trusted employers and sources. We connect to legitimate platforms only.

Solution Architect (W2 Role, Local Candidates Only)

Anywhere
Contract
Posted 12/16/2025
Verified Source
Key Skills:
Data architecture
Cloud platforms (AWS, Azure)
Databricks
Python
SQL
ETL/ELT pipelines
Data governance

Compensation

Salary Range

Not specified

Responsibilities

Design and implement enterprise-scale data systems, optimize data workloads, and lead cloud migration initiatives.

Requirements

Extensive experience with cloud data platforms, Databricks, Python, SQL, and data governance frameworks.

Full Description

Job Description

Role: Solution Architect
Location: Phoenix, AZ (Remote, but local candidates only for the in-person interview)
Type: Contract

• This role is responsible for contributing to the design, modernization, and optimization of enterprise-scale data systems, as well as the maintenance and operations strategy for the client.
• The role involves designing and implementing data systems that organize, store, and manage data within the client's cloud data platform.
• The architect will perform continuous maintenance and operations work for the client in the cloud environment.
• They will review and analyze the client's data infrastructure, plan future database solutions, and implement systems to support data management for client users.
• Additionally, this role is accountable for ensuring data integrity, making sure the client team adheres to data governance standards to maintain accuracy, consistency, and reliability across all systems.
• The architect will identify data discrepancies and quality issues and work to resolve them.
• This position requires a strong blend of architectural leadership, technical depth, and the ability to collaborate with business stakeholders, data engineers, machine learning practitioners, and domain experts to deliver scalable, secure, and reliable AI-driven solutions.
• The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.

Key Responsibilities

• Design scalable data lake and data architectures using Databricks and cloud-native services.
• Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
• Optimize data workloads and performance.
• Define data governance frameworks for the client.
• Design and develop robust data pipelines.
• Architect AI systems, including RAG workflows and prompt engineering.
• Lead cloud migration initiatives from legacy systems to modern data platforms.
• Provide architectural guidance, best practices, and technical leadership across teams.
• Build documentation, reusable modules, and standardized patterns.

Required Skills and Experience

• Strong expertise in cloud platforms, primarily Azure or AWS.
• Hands-on experience with Databricks.
• Deep proficiency in Python and SQL.
• Expertise in building ETL/ELT pipelines and ADF workflows.
• Experience architecting data lakes and implementing data governance frameworks.
• Hands-on experience with CI/CD, DevOps, and Git-based development.
• Ability to translate business requirements into technical architecture.

Additional Information

All your information will be kept confidential according to EEO guidelines.

This job posting was last updated on 12/18/2025

Ready to have AI work for you in your job search?

Sign up for free and start using JobLogr today!

Get Started »