Smash CR

3 open positions available
1 location
1 employment type: Full-time
Actively hiring

Latest Positions

Showing 3 most recent jobs

Senior ERP Manager P-143

Smash CR · Anywhere · Full-time
Compensation: $120K - $200K a year

Lead ERP needs assessments and software selection engagements, facilitating executive decision-making and advising manufacturing clients. Requires proven experience in ERP consulting, end-to-end engagement leadership, and manufacturing or operations-driven environments.

Who we are
At SMASH, we believe in long-lasting relationships with our talent. We invest time getting to know candidates and understanding what they seek as their next professional step, and we aim to find the perfect match. As agents, we pair our talent with our US clients not only by their technical skills but also by cultural fit. Our core competency is finding the right talent fast.

This position is remote within the United States. You must have U.S. citizenship or a valid U.S. work permit to apply for this role.

Role summary
You will act as a trusted, technology-agnostic advisor leading ERP needs assessments and software selection engagements for operations-driven organizations. This role emphasizes executive advisory, end-to-end engagement leadership, and a deep understanding of manufacturing and cost accounting to guide clients toward the right ERP decisions.

Responsibilities
- Lead ERP needs assessment and software selection engagements from discovery through executive recommendation.
- Define business, functional, and technical requirements in collaboration with client stakeholders.
- Develop vendor shortlists, manage RFP/RFI processes, and coordinate ERP evaluations and demos.
- Facilitate executive-level workshops and decision-making sessions with client leadership teams.
- Serve as engagement or program lead with full accountability for scope, timeline, quality, and outcomes.
- Advise clients in manufacturing and operations-driven environments on ERP strategy and fit.
- Translate accounting, costing, and operational requirements into ERP selection criteria.
- Provide objective, vendor-agnostic guidance aligned to client business goals.
- Partner with internal teams to deliver high-quality advisory outputs and recommendations.
- Support transition planning from selection into implementation readiness, when applicable.

Requirements – Must-haves
- Proven experience leading ERP needs assessment and software selection engagements.
- Demonstrated end-to-end engagement or program leadership (Project Manager, Engagement Lead, or Program Lead).
- Strong ability to lead executive-level meetings and facilitate client decision-making.
- Experience advising manufacturing clients (discrete, process, or industrial manufacturing).
- Solid accounting and costing knowledge relevant to ERP advisory work.
- Experience working with private-sector, operations-driven organizations.
- Strong analytical, problem-solving, and stakeholder management skills.
- Ability to operate as a trusted, technology-agnostic advisor.

Nice-to-haves (optional)
- Advisory experience in distribution, supply chain–intensive businesses, or consumer products.
- Experience supporting industrial services organizations, local government, or tribal entities.
- Exposure to multiple ERP platforms across mid-market and enterprise environments.
- Experience supporting ERP implementation readiness or post-selection transitions.

ERP needs assessment
Software selection
Manufacturing and cost accounting
Direct Apply
Posted 7 days ago

Senior Data Engineer P-133

Smash CR · Anywhere · Full-time
Compensation: not listed

Design and implement scalable, GCP-native data solutions to support machine learning and analytics initiatives. Requires 8+ years in data engineering with expertise in Python, SQL, GCP data services, and data architecture.

Who we are
At SMASH, we believe in long-lasting relationships with our talent. We invest time getting to know candidates and understanding what they seek as their next professional step, and we aim to find the perfect match. As agents, we pair our talent with our US clients not only by their technical skills but also by cultural fit. Our core competency is finding the right talent fast.

This position is remote within the United States. You must have U.S. citizenship or a valid U.S. work permit to apply for this role.

Role summary
You will design and deliver scalable, GCP-native data solutions that power machine learning and analytics initiatives. This role focuses on building high-quality, domain-driven data products and decentralized data infrastructure that enable rapid iteration, measurable outcomes, and long-term value creation.

Responsibilities
- Design and implement a scalable, GCP-native data strategy aligned with machine learning and analytics initiatives.
- Build, operate, and evolve reusable data products that deliver compounding business value.
- Architect and govern squad-owned data storage strategies using BigQuery, AlloyDB, ODS, and transactional systems.
- Develop high-performance data transformations and analytical workflows using Python and SQL.
- Lead ingestion and streaming strategies using Pub/Sub, Datastream (CDC), and Cloud Dataflow (Apache Beam); see the pipeline sketch after this listing.
- Orchestrate data workflows using Cloud Composer (Airflow) and manage transformations with Dataform.
- Modernize legacy data assets and decouple procedural logic from operational databases into analytical platforms.
- Apply Dataplex capabilities to enforce data governance, quality, lineage, and discoverability.
- Collaborate closely with engineering, product, and data science teams in an iterative, squad-based environment.
- Drive technical decision-making, resolve ambiguity, and influence data architecture direction.
- Ensure data solutions are secure, scalable, observable, and aligned with best practices.

Requirements – Must-haves
- 8+ years of professional experience in data engineering or a related discipline.
- Expert-level proficiency in Python and SQL for scalable data transformation and analysis.
- Deep expertise with Google Cloud Platform data services, especially BigQuery.
- Hands-on experience with AlloyDB (PostgreSQL) and Cloud SQL (PostgreSQL).
- Strong understanding of domain-driven data design and data product thinking.
- Proven experience architecting ingestion pipelines using Pub/Sub and Datastream (CDC).
- Expertise with Dataform, Cloud Composer (Airflow), and Cloud Dataflow (Apache Beam).
- Experience modernizing legacy data systems and optimizing complex SQL/procedural logic.
- Ability to work independently and lead initiatives with minimal guidance.
- Strong critical thinking, problem-solving, and decision-making skills.

Nice-to-haves (optional)
- Experience applying Dataplex for data governance and quality management.
- Exposure to proprietary SQL dialects (T-SQL, PL/pgSQL).
- Experience supporting machine learning or advanced analytics workloads.
- Background working in decentralized, squad-based or product-oriented data teams.
- Experience influencing technical direction across multiple teams or domains.
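For context on the streaming stack this role names (Pub/Sub, Cloud Dataflow/Apache Beam, BigQuery), here is a minimal pipeline sketch. It is illustrative only: the project, topic, and table names are invented placeholders, not details from this posting.

```python
"""Minimal sketch: streaming Pub/Sub -> BigQuery pipeline with Apache Beam.

All resource names below (project, topic, table) are hypothetical
placeholders for illustration; they are not taken from the job posting.
"""
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    # streaming=True because Pub/Sub is an unbounded source.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Read raw message bytes from a (hypothetical) Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Decode and parse each message as a JSON object.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append parsed rows into an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```

Deploying this on Dataflow would additionally require runner, project, region, and temp-location pipeline options; for local testing it would run on Beam's DirectRunner.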

Python
SQL
Google Cloud Platform (BigQuery, AlloyDB, Cloud SQL)
Data pipelines (Pub/Sub, Datastream, Dataflow)
Data governance (Dataplex)
Direct Apply
Posted 19 days ago

Senior Data Operations (DataOps) Engineer P-134

Smash CR · Anywhere · Full-time
Compensation: $200K - $250K a year

Lead the design and implementation of enterprise-scale DataOps platforms and automation frameworks on GCP. Requires 8+ years in DataOps, Data Engineering, or Platform Engineering with expertise in data warehousing, distributed processing, GCP, microservices, and CI/CD.

Who we are
At SMASH, we believe in long-lasting relationships with our talent. We invest time getting to know candidates and understanding what they seek as their next professional step, and we aim to find the perfect match. As agents, we pair our talent with our US clients not only by their technical skills but also by cultural fit. Our core competency is finding the right talent fast.

This position is remote within the United States. You must have U.S. citizenship or a valid U.S. work permit to apply for this role.

Role summary
You will lead the evolution of DataOps practices at a global scale, designing highly automated, resilient, and scalable data platforms. This role focuses on building self-service, microservices-based data infrastructure on GCP, enabling rapid deployment, strong data reliability, and continuous delivery through advanced automation and observability.

Responsibilities
- Lead the design and implementation of enterprise-scale DataOps platforms and automation frameworks.
- Architect and evolve GCP-native data platforms supporting high-throughput batch and real-time workloads.
- Design and implement microservices-based data architectures using containerization technologies.
- Build and maintain CI/CD pipelines for data workflows, including automated testing and deployment; see the data-quality gate sketch after this listing.
- Develop Infrastructure as Code (IaC) solutions to standardize and automate platform provisioning.
- Implement robust data orchestration, monitoring, and observability capabilities.
- Establish and enforce data quality frameworks to ensure reliability and trust in data products.
- Support real-time data platforms operating at extreme scale.
- Partner with platform squads to deliver self-service data infrastructure products.
- Drive best practices for automation, resiliency, scalability, and operational excellence.
- Influence technical direction, mentor senior engineers, and lead through ambiguity.

Requirements – Must-haves
- 8+ years of progressive experience in DataOps, Data Engineering, or Platform Engineering roles.
- Strong expertise in data warehousing, data lakes, and distributed processing technologies (Spark, Hadoop, Kafka).
- Advanced proficiency in SQL and Python; working knowledge of Java or Scala.
- Deep experience with Google Cloud Platform (GCP) data and infrastructure services.
- Expert understanding of microservices architecture and containerization (Docker, Kubernetes).
- Proven hands-on experience with Infrastructure as Code tools (Terraform preferred).
- Strong background in CI/CD methodologies applied to data pipelines.
- Experience designing and implementing data automation frameworks.
- Advanced knowledge of data orchestration, monitoring, and observability tooling.
- Ability to architect highly scalable, resilient, and fault-tolerant data systems.
- Strong problem-solving skills and ability to operate independently in ambiguous environments.

Nice-to-haves (optional)
- Experience with real-time streaming systems at very large scale.
- Exposure to AWS or Azure data platforms (in addition to GCP).
- Experience with data quality tooling and governance frameworks.
- Background building internal developer platforms or self-service infrastructure.
- Experience influencing technical strategy across multiple teams or domains.
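To make the CI/CD-for-data idea above concrete, here is a minimal sketch of a data-quality gate that could run as an automated pipeline step, assuming the google-cloud-bigquery client library. The table, column, and thresholds are hypothetical, invented for the example.

```python
"""Minimal sketch: a data-quality gate as a CI/CD step against BigQuery.

The table, column, and thresholds below are hypothetical placeholders;
they are not taken from the job posting.
"""
import sys

from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical table and thresholds for illustration only.
TABLE = "example-project.analytics.orders"
MIN_ROWS = 1_000
MAX_NULL_RATE = 0.01


def main() -> int:
    client = bigquery.Client()
    sql = f"""
        SELECT
          COUNT(*) AS row_count,
          SAFE_DIVIDE(COUNTIF(customer_id IS NULL), COUNT(*)) AS null_rate
        FROM `{TABLE}`
    """
    # The query returns exactly one summary row.
    row = next(iter(client.query(sql).result()))

    failures = []
    if row.row_count < MIN_ROWS:
        failures.append(f"row_count {row.row_count} < {MIN_ROWS}")
    if (row.null_rate or 0.0) > MAX_NULL_RATE:
        failures.append(f"null_rate {row.null_rate:.4f} > {MAX_NULL_RATE}")

    for failure in failures:
        print(f"DATA QUALITY FAILURE: {failure}", file=sys.stderr)
    # Non-zero exit code fails the CI stage when any check regresses.
    return 1 if failures else 0


if __name__ == "__main__":
    sys.exit(main())
```

The non-zero exit code is what lets a CI stage block a deployment when a check regresses; real data-quality frameworks layer scheduling, alerting, and lineage on top of checks like this one.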

DataOps
Cloud Infrastructure (GCP)
Data Engineering
Direct Apply
Posted 19 days ago
