2 open positions available
Job Title: Business Process Automation Expert
Location: Open (manager is in Iowa, corporate office in Indiana)
Start Date: 10/1
Duration: 1 year initially (likely to extend 2-3 years)

Summary: Drive workflow automation initiatives, transition workflows to SAP-supported solutions, and promote BPA adoption across business teams. Requires strong BPA technical expertise; hands-on experience with UFT, Tosca ALM, or MicroFocus; the ability to communicate technical concepts to non-technical stakeholders; and experience enabling team adoption of automation solutions.

Job Description:
We are seeking a Business Process Automation (BPA) Expert to support SAP S/4 and ECC platforms. The ideal candidate will have strong technical expertise in workflow automation and the ability to collaborate closely with both technical teams and business stakeholders.

Responsibilities:
• Work as part of the development team, reporting to the development lead
• Drive workflow automation initiatives across business groups
• Transition existing workflows (currently in UFT, Tosca ALM, and MicroFocus) to SAP-supported solutions
• Guide business teams to understand and maintain the workflows built
• Promote adoption of BPA within internal teams and enable automation of new requests
• Leverage automation, AI, and streamlined functionality to enhance business efficiency
• Act as a bridge between technology and business, ensuring alignment and sustainability of solutions

Qualifications:
• Strong technical background in Business Process Automation, preferably within SAP environments
• Hands-on experience with workflow automation tools (UFT, Tosca ALM, MicroFocus, etc.)
• Ability to communicate technical concepts effectively to non-technical stakeholders
• Experience enabling teams to adopt and maintain automation solutions
• Knowledge of AI and automation trends is a plus
Job Title: Software Engineer, Big Data

Summary: Design, develop, optimize, and maintain big data pipelines and cloud applications using Python/PySpark within a Hadoop ecosystem, ensuring performance tuning and adherence to Agile methodologies.

Requirements / Qualifications:
• 5+ years in Python/PySpark
• 5+ years optimizing Python/PySpark jobs in a Hadoop ecosystem
• 5+ years working with large data sets and pipelines using Hadoop ecosystem tools and libraries such as Spark, HDFS, YARN, Hive, and Oozie
• 5+ years designing and developing cloud applications: AWS, OCI, or similar
• 5+ years with distributed/cluster computing concepts
• 5+ years with relational databases: MS SQL Server or similar
• 3+ years with NoSQL databases: HBase (preferred)
• 3+ years creating and consuming RESTful web services
• 5+ years developing multi-threaded applications: concurrency, parallelism, locking strategies, and merging datasets
• 5+ years in memory management, garbage collection, and performance tuning
• Strong knowledge of shell scripting and file systems

Preferred:
• Knowledge of CI tools such as Git, Maven, SBT, Jenkins, and Artifactory/Nexus
• Knowledge of building microservices and a thorough understanding of service-oriented architecture
• Knowledge of container orchestration platforms and related technologies such as Docker, Kubernetes, and OpenShift
• Understanding of prevalent software development lifecycle methodologies, with specific exposure to or participation in Agile/Scrum techniques
• Strong knowledge and application of SAFe agile practices
• Flexible work schedule
• Experience with project management tools such as JIRA
• Strong analytical skills
• Excellent verbal, listening, and written communication skills
• Ability to multitask and prioritize projects to meet scheduled deadlines