3 open positions available
About Dynatron

Dynatron is transforming the automotive service industry with intelligent SaaS solutions that drive measurable results for thousands of dealerships and service departments. Our proprietary analytics and workflow tools empower service leaders to boost profitability, enhance customer satisfaction, and unlock operational excellence. With accelerating growth, strong customer traction, and increasing market demand, we’re scaling fast, and we’re just getting started.

The Opportunity

We’re looking for an experienced, visionary Data Architect to join our expanding data organization. This critical role is responsible for designing, governing, and optimizing the enterprise data architecture that powers scalable analytics, real-time data processing, AI/ML workflows, and secure data operations across the business. You will architect end-to-end data ecosystems, spanning streaming, warehousing, lakehouse, governance, and ML enablement, to ensure high performance, extensibility, and long-term sustainability.

The ideal candidate is deeply technical, hands-on with modern cloud data platforms, and highly skilled in data modeling, security, streaming architectures, and enterprise guardrails. If you thrive in complexity, think strategically, and deliver data foundations that scale, this role offers the opportunity to shape Dynatron’s data future.

What You’ll Do

Data Architecture & Modeling
- Design scalable conceptual, logical, and physical data models supporting OLTP, OLAP, real-time analytics, and ML workloads.
- Architect modular, domain-driven data structures for multi-domain analytics.
- Apply modern modeling techniques, including 3NF, Dimensional Modeling, Data Vault, Medallion Architecture, and Data Mesh principles.
- Define canonical models, conformed dimensions, and enterprise reference datasets.
- Ensure performance, usability, and long-term maintainability of data schemas.

Real-Time & Streaming Architecture
- Architect real-time ingestion and event-driven pipelines using Kafka, Kinesis, Pulsar, or Azure Event Hubs.
- Implement CDC frameworks such as Debezium, Fivetran, or StreamSets.
- Design low-latency, high-throughput streaming architectures for operational and analytical use cases.
- Build real-time data models supporting live analytics and data-driven decision-making.

ML/AI Data Architecture
- Design ML-ready datasets, feature stores, and reproducible data pipelines.
- Partner with ML and Data Science teams to enable production-grade model workflows.
- Integrate modern AI/ML platform capabilities (e.g., Snowflake Cortex, Databricks Feature Store, AWS Bedrock).
- Architect for drift detection, data quality monitoring, lineage visibility, retraining workflows, and model governance.

Cloud Data Platform Architecture
- Design scalable architectures using Snowflake, Databricks, or other cloud-native platforms.
- Build data pipelines using ADF, Databricks Workflows, AWS Glue, Step Functions, or equivalent technologies.
- Optimize compute and storage performance leveraging Delta, Iceberg, Parquet, and lakehouse patterns.
- Implement governance controls, including RBAC, masking, tokenization, and secure data sharing.

Data Security, Privacy & PII Protection
- Architect secure data environments aligned with GDPR, CCPA, PCI, SOC 2, and other regulatory frameworks.
- Implement encryption, masking, hashing, and IAM/RBAC policies.
- Design retention, lineage, and access governance for sensitive data.
- Collaborate with Compliance to ensure proper handling of PII/PHI and protected datasets.

Enterprise Governance & Guardrails
- Define enterprise-wide modeling standards, data contracts, and schema evolution guidelines.
- Establish reference architectures and curated “golden” datasets.
- Create SLAs/SLOs across data domains to ensure reliability and quality.
- Enforce adherence to governance, quality, and architectural frameworks.

Leadership, Mentorship & Collaboration
- Mentor data engineers and guide architectural best practices.
- Lead design reviews and cross-functional architectural discussions.
- Partner closely with product, engineering, ML, and analytics teams to ensure alignment on data strategy.
- Communicate risks, trade-offs, and long-term architectural impact with clarity.

Delivery, Scalability & Operational Excellence
- Ensure data systems meet SLAs and scale with business demand.
- Drive observability, monitoring, and alerting across data platforms.
- Reduce technical debt through proactive governance and architectural discipline.
- Support the full data lifecycle: design → build → deployment → governance.

What You Bring

Technical Expertise
- 7-10+ years of experience as a Data Architect or Senior Data Engineer in enterprise-scale environments.
- Deep hands-on experience with Snowflake, Databricks, Azure Data Factory, AWS Glue, Bedrock, Redshift, BigQuery, or Teradata.
- Strong SQL and Python/Scala skills, with expertise in schema design and metadata management.
- Experience building streaming architectures with Kafka, Kinesis, or Event Hubs.
- Knowledge of ML/AI pipelines, feature stores, vector databases, and modern AI platform tooling.

Security & Privacy
- Expertise in encryption, masking, tokenization, IAM, and RBAC.
- Understanding of PII/PHI requirements and regulatory standards.
- Experience implementing secure patterns across cloud platforms.

Cloud Architecture
- Experience designing distributed systems across AWS, Azure, or GCP.
- Strong understanding of compute scaling, storage layers, and cloud-native services.

Soft Skills & Leadership
- Strong documentation skills, including architectural diagrams, ADRs, and playbooks.
- Excellent communication skills, with the ability to influence at all levels.
- Proven mentorship, leadership, and cross-functional collaboration.
- Strategic thinker with a high degree of ownership and accountability.

Nice to Have
- Experience with data mesh and domain-driven design.
- Experience with Snowflake Cortex, Databricks AI, or AWS Bedrock.
- Expertise in lakehouse architectures (Delta, Iceberg, Hudi).
- Background in large-scale modernization or cloud migration initiatives.

Why Dynatron
- Opportunity to architect the data foundation of a rapidly growing SaaS organization.
- High-impact role with visibility across engineering, product, ML, analytics, and executive teams.
- Values-driven culture built on accountability, urgency, positivity, and delivering results.
- Remote-first environment offering flexibility and autonomy.

Compensation
- Base Salary: $140,000 - $180,000/yr
- Equity: Participation in Dynatron’s Equity Incentive Plan

Benefits Summary
- Comprehensive health, vision, and dental insurance
- Employer-paid short- and long-term disability and life insurance
- 401(k) with competitive company match
- Flexible vacation policy and 11 paid holidays
- Remote-first culture

Ready to shape the future of Dynatron’s data architecture and build scalable, intelligent systems that power our next stage of growth? Join us as we build the data foundation for extraordinary outcomes.
About Dynatron

Dynatron is transforming the automotive service industry with intelligent SaaS solutions that deliver measurable results for thousands of dealership service departments. Our proprietary analytics, automation capabilities, and AI-powered workflows empower service leaders to increase profitability, elevate customer satisfaction, and operate with greater efficiency. With accelerating demand and a rapidly expanding product ecosystem, we’re scaling fast, and we’re just getting started.

The Opportunity

We’re looking for a hands-on and forward-thinking AI Data Engineer to build and operationalize agentic AI systems that combine modern data engineering, LLM orchestration, and automation frameworks. In this high-impact role, you will design the pipelines, retrieval systems, and infrastructure that allow AI agents to reason, take action, and deliver real-time intelligence using enterprise data.

As a core member of our AI & Data Engineering team, you’ll work with cutting-edge platforms, including AWS Bedrock, LangChain, Snowflake, and vector databases, to develop AI-driven workflows that enhance automation across the business. This is a rare opportunity to help shape the foundation of Dynatron’s AI strategy while contributing directly to product innovation and operational excellence.

What You’ll Do

AI-Driven Data Engineering
- Design and maintain scalable data pipelines and ingestion frameworks powering advanced AI and agent workflows using AWS Glue, Lambda, Step Functions, and Kinesis.
- Prepare and optimize structured and unstructured data for retrieval-augmented generation (RAG) and LLM-enabled use cases.
- Integrate vector databases such as Pinecone, Chroma, Amazon OpenSearch, or FAISS for semantic retrieval.
- Automate context curation, data transformations, and memory persistence to support dynamic prompt construction and autonomous agent behavior.

Agentic AI & Orchestration
- Build and deploy autonomous AI agents using LangChain, LlamaIndex, or AWS Agents for Bedrock.
- Develop multi-agent workflows capable of reasoning, tool usage, and decision-making through secure function calling.
- Integrate AI agents with enterprise systems including Snowflake, Redshift, Databricks, Salesforce, JIRA, and ServiceNow.
- Create pipelines that enable agents to execute SQL, APIs, and document searches with full governance and security.

AI Infrastructure & MLOps
- Operationalize Bedrock-hosted models (Anthropic Claude, Amazon Titan, Llama 3, and others) into data engineering workflows.
- Build embedding pipelines and feature stores supporting intelligent retrieval and semantic search.
- Implement CI/CD for AI services using GitHub Actions, Airflow, or AWS CodePipeline.
- Monitor model accuracy, hallucination rates, and system reliability through automated evaluation.

Data Governance & Compliance
- Apply data security, masking, and lineage controls to all AI-enabled pipelines and retrieval systems.
- Ensure Responsible AI practices, such as transparency, fairness, and auditability, are embedded throughout the AI stack.
- Maintain metadata, logging, and observability for all AI and data engineering workflows.

What You Bring
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 5+ years of data engineering experience, including 2+ years building AI-integrated data systems.
- Hands-on expertise with:
  - AWS Bedrock, SageMaker, and Lambda
  - LangChain or LlamaIndex
  - Snowflake, Redshift, or Databricks
  - Python, SQL, and API integrations
  - Vector databases (Pinecone, FAISS, Chroma)
- Familiarity with RAG pipelines, LLM function calling, and prompt optimization techniques.
- Experience integrating enterprise data with LLMs from Anthropic, OpenAI, or Meta.
- Strong understanding of data modeling, ETL orchestration, and MLOps best practices.

Preferred Qualifications
- Experience with multi-agent frameworks such as CrewAI, AutoGen, or Bedrock Agents.
- Knowledge of data observability tools like Monte Carlo, DataHub, or Marquez.
- Familiarity with Docker, Kubernetes, and CI/CD automation.
- Relevant industry or academic certifications, including:
  - AWS Certified Machine Learning – Specialty
  - Google Cloud Generative AI Engineer
  - MIT/Stanford AI & ML Certifications
  - DeepLearning.AI LLM Applications Certificate

Success Metrics
- Deployment of agentic AI solutions that significantly reduce manual workloads and improve efficiency.
- Faster data-to-AI latency and increased retrieval accuracy across AI experiences.
- Reduction in data quality issues and model hallucination rates.
- Expansion of automation coverage and measurable improvements in real-time AI-enabled insights.

Why Dynatron
- Opportunity to help define the future of AI and automation at a category-defining SaaS company.
- Build production-grade agentic AI systems that have immediate and measurable operational impact.
- High-performance culture grounded in innovation, collaboration, and continuous learning.
- Remote-first workplace offering autonomy, flexibility, and deep cross-functional partnership.
- Competitive compensation and comprehensive benefits package.

Benefits Summary
- Competitive base salary
- Participation in Dynatron’s Equity Incentive Plan
- Comprehensive health, vision, and dental insurance
- Employer-paid short- and long-term disability and life insurance
- 401(k) with competitive company match
- Flexible vacation policy and 9 paid holidays
- Remote-first culture

Compensation
- Base Salary: $140,000 - $180,000
- Equity: Participation in Dynatron’s Equity Incentive Plan

If you're excited to build the AI foundations that will power Dynatron’s next wave of innovation, we’d love to meet you.
Dynatron is an Equal Opportunity Employer and encourages all qualified individuals to apply.
At Dynatron Software, we help automotive service departments increase revenue and profitability with our suite of services. We strive to be a people-first company where employees enjoy coming to work, enjoy the people they work with, and are given the autonomy to succeed. Our company culture is built on a foundation of teamwork, accountability, integrity, clear communication, and positive attitudes. We are currently looking to add new talent to our growing team!

About the Role

The Marketing Operations Manager is responsible for designing, building, and maintaining the marketing lead engine. This role will spearhead the optimization of our marketing operations, from technology implementation to lead management and conversion strategies. This person will be responsible for calling balls and strikes: reporting on the effectiveness of marketing campaigns and programs. This person has a passion for data, automation, and analytics.

What You’ll Be Accountable For
- Building, maintaining, and reporting on the marketing lead funnel
- Building and maintaining marketing automation programs and campaigns
- Reporting on campaign ROI and recommending areas for optimization
- Working with the CRM team to build out new functionality and improvements
- Managing the SDR pipeline funnel, lists, and prioritization views
- Managing incoming leads and the data cleanup process
- Collaborating with internal teams to optimize landing pages and conversion funnels
- Evaluating customer experience across multiple channels and touchpoints
- Identifying and implementing new marketing technologies
- Evaluating emerging technologies for potential adoption
Key Success Indicators
- Landing page conversion rates
- Speed from marketing lead to sales lead
- New CRM functionality project completed on time

Your Work DNA
- Experience in B2B software marketing, including working with strategic partners to develop co-marketing initiatives
- Strong analytical skills; loves spreadsheets and reports
- Understanding of lead funnels; comfortable using the language of ROI, ROAS, and CAC
- Track record of delivering strong results and innovation
- Willingness to roll up their sleeves and get stuff done
- Exceptional verbal and written communication skills
- Ability to work and thrive in an autonomous, fast-paced, and changing environment
- Love for diverse work assignments and the opportunity to try new approaches

Your Background
- 3+ years of experience in marketing operations in the B2B software space
- Experience building and maintaining landing pages and forms
- Familiarity with traditional and digital marketing channels, content marketing and distribution platforms, and consumer analytics
- Experience with HubSpot required
- Experience with SFDC required; experience with SugarCRM preferred
- Experience with WordPress, basic HTML, Zapier, and Typeform preferred
- Knowledge of the automotive industry a plus
- Some travel may be required

In Return for Your Expertise, You Will Receive
- Excellent benefits, including health, dental, and vision insurance, stock options, work from home and flexible scheduling depending on job requirements, professional development opportunities, 9 paid holidays, and 15 days PTO
- Home office setup support for remote employees
- A welcome “swag bag” with branded clothing as an official welcome to the team
- The chance to work for an organization that puts people first and fosters a culture of teamwork, integrity, communication, accountability, and positive attitude!

Dynatron Software is an Equal Opportunity Employer and encourages all qualified individuals to apply.

Compensation Range: $120,000 - $140,000/yr