7 open positions available
Manage and optimize Oracle Fusion Cloud applications, lead testing and rollouts, and ensure system security and performance. | Minimum 10+ years in ERP, 5+ years in Oracle Fusion Cloud, experience with configurations, custom fields, formulas, and project leadership.

Job Title: Oracle Cloud Solutions Architect (EPM)
Location: Lawrenceville, GA 30046 (5 Days Onsite)
Type: Long-Term Contract

Full JD:
We are looking for a candidate with an in-depth understanding of Oracle Fusion Cloud applications and experience in ERP implementations and their long-term maintenance; a team player who is reliable, enthusiastic, committed, creative, and customer-focused. The role also requires strong collaboration and leadership skills to manage technical and operational work and to proactively engage with business teams in a continuously evolving environment.

Minimum Qualifications:
• Bachelor's degree in computer science or equivalent.
• 10+ years of overall ERP administration experience.
• 5+ years of experience as a solutions architect for Oracle Fusion Cloud ERP, EPM, and/or HCM.
• Well-versed in the configurations, custom fields, and fast formulas used in Oracle Fusion Cloud, with a strong understanding of integration with other Fusion applications and third-party systems.
• Experience leading full projects through requirements, design, testing, and rollout.
• Comfortable with the Oracle Fusion data dictionary and the relationships between its objects, in order to quickly understand requirements and design solutions.

Responsibilities:
• Own the technical configurations, fast formulas, custom fields, and streamlined performance of the Oracle Fusion Cloud applications.
• Review the impact of Oracle Fusion quarterly releases and other patch deployments, and coordinate the needed testing with the business teams.
• Lead end-to-end testing of fixes and enhancements, including the creation of test plans and test scripts.
• Design and create custom security roles so that they do not grant excessive privileges; conduct regular security audits to identify and fix vulnerabilities.
• Proactively monitor Oracle Fusion Cloud application performance and architecture to ensure system reliability, integrity, and recoverability.
• Work with internal IT staff, third-party vendors, and Oracle to update and communicate environment maintenance schedules, refresh schedules, and outages.
• Design and implement best practices to administer and improve the functionality, reliability, and security of Oracle Fusion Cloud applications.
• Troubleshoot issues related to Oracle Fusion Cloud applications.
• Partner with internal IT teams (Security, Network, Service Desk) and external vendor teams to proactively monitor, identify issues, and drive them to resolution.
Design, deploy, and optimize large-scale event streaming environments and automate infrastructure provisioning. | Over 10 years in DevOps or similar roles, with hands-on experience in Confluent Kafka, AWS, Terraform, and CI/CD tools.

Role: DevOps Architect / Lead Engineer with Confluent Kafka
Location: Remote
Mode of Hire: Contract-to-Hire

We are seeking a highly skilled DevOps Architect / Lead Engineer with expertise in Confluent Kafka, AWS, and modern automation frameworks. The ideal candidate will lead the design, deployment, and optimization of large-scale, distributed event-streaming environments across on-premise and cloud infrastructures.

Key Responsibilities:
• Design, deploy, and configure Confluent Kafka clusters, topics, partitions, replication strategies, and security configurations across on-prem and cloud environments (including Confluent Cloud).
• Automate provisioning, deployment, scaling, and maintenance using tools such as Terraform, Chef, Ansible, Jenkins, and other CI/CD technologies.
• Build automated, self-service capabilities for topic creation, schema governance, ACLs, and resource provisioning to streamline engineering workflows.
• Build and maintain CI/CD pipelines integrating Kafka components and infrastructure changes using Jenkins, Git, and other DevOps toolchains.
• Perform root-cause analysis, optimize throughput, tune brokers/producers/consumers, and manage high-severity production incidents.
• Provide architectural leadership for event-driven solutions, real-time data processing, and streaming ecosystems across cloud and hybrid environments.

Skill Set Requirements:
• 10+ years of experience in DevOps, Cloud Engineering, or Platform Engineering roles.
• Hands-on expertise with Confluent Kafka (Cloud and on-prem).
• Advanced knowledge of AWS (VPC, IAM, networking, security; EKS nice to have).
• Strong proficiency in Terraform.
• Strong CI/CD experience with Jenkins, Git, and automated deployment pipelines.
• Experience managing and monitoring large-scale distributed systems.
• Strong understanding of event-driven architecture, data streaming patterns, and real-time integration.
• Excellent communication and leadership skills, with the ability to collaborate across teams.
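Self-service topic creation of the kind described above usually front-loads governance checks before a request ever reaches the Kafka Admin API or a Terraform plan. A minimal sketch in Python; the naming convention, partition ceiling, and replication minimum below are illustrative assumptions, not values from this posting:

```python
# Illustrative governance check for self-service Kafka topic requests.
# The rules (name pattern, partition/replication bounds) are hypothetical
# examples of the boundaries such a framework would enforce.
import re

NAME_PATTERN = re.compile(r"^[a-z]+\.[a-z0-9-]+\.[a-z0-9-]+$")  # e.g. domain.app.event

def validate_topic_request(name: str, partitions: int, replication_factor: int) -> list:
    """Return a list of governance violations; an empty list means the request may proceed."""
    errors = []
    if not NAME_PATTERN.match(name):
        errors.append(f"topic name {name!r} must follow <domain>.<app>.<event>")
    if not 1 <= partitions <= 48:
        errors.append("partitions must be between 1 and 48")
    if replication_factor < 3:
        errors.append("replication factor below 3 risks data loss on broker failure")
    return errors

print(validate_topic_request("orders.billing.invoice-created", 12, 3))  # []
print(validate_topic_request("Orders", 100, 1))                         # three violations
```

A request that passes would then be handed to the actual provisioning step (e.g. a Terraform module or the Admin API), keeping the approval logic in one reviewable place.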
Gathering and documenting business requirements, creating process diagrams, facilitating requirement sessions, and coordinating testing and deployment. | 7+ years of business analysis experience in the financial domain, strong communication skills, experience with Agile, and proficiency with tools like MS Azure DevOps and Visio.

Job Title: IT Business Analyst (Financial Domain)
Location: Remote
Duration: 6+ Months

MAIN RESPONSIBILITIES & DUTIES
• Elicits key inputs from project sponsors and identifies project interdependencies.
• Clearly articulates business requirements to IT teams in verbal and written form, using user-story methodologies with keen attention to detail.
• Organizes, creates, and maintains story maps for one or more product lines and can describe and discuss, in detail, the universe of features that make up a given product.
• Engages with employees to incorporate usability and user-interface needs when designing and enhancing systems.
• Finalizes solution design by creating requirements documentation and process flow diagrams.
• Nurtures ongoing relationships with business partners to drive satisfaction, quality, and project outcomes.
• Works with business partners to prioritize work in order to maximize overall business value, then guides the technical teams so features and functionality are worked in the appropriate order. The Senior IT Business Analyst internalizes and owns this process.
• Initiates and schedules the deployment of product releases into a production environment (per audit requirements).
• Provides IT leadership with regular productivity reports from Scrum and Kanban boards using Azure DevOps (or an equivalent tool).
• Provides guidance to the business analyst team, giving input on processes and general direction on their work throughout the day and elevating the quality of their work.
• Coordinates user acceptance testing, training, and external communication: responsible for creating a framework for the testing, organizing and scheduling user acceptance testing, and tracking the results (driving open issues to closure).
• Facilitates and documents meetings and disseminates action items and status to the appropriate teams.
• Please note: this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities required of the employee for this job. Duties, responsibilities, and activities may change at any time, with or without notice.

PROFESSIONAL KNOWLEDGE, SKILLS & ABILITIES
• Bachelor's degree or equivalent experience.
• 7+ years of business analyst work in the banking or financial domain (preferably with BofA, Citi, JPMC, etc.).
• 5+ years of writing detailed business requirements, creating process flow diagrams, and working closely with business and technical teams.
• 5+ years of being part of a technical team that has successfully delivered software from lower environments up through a production environment.
• 3+ years of planning, hosting, and running requirement sessions (both gathering and reviewing requirements).
• 2+ years of experience as a Subject Matter Expert (SME) for a given product, system, or feature group, providing guidance to others who are less familiar.
• A solid understanding of accounting systems and a proven track record of successfully implementing positive changes within the accounting systems of an organization.
• A track record of excellent written and oral communication skills; the majority of this job involves writing and speaking with people.
• Experience in Agile methodologies (Scrum, Kanban) is preferred.
• Tools: MS Azure DevOps, MS Visio, MS Office.

QUALITIES & CHARACTERISTICS
• Ability to think and act strategically based on the information gathered, while still working independently and tactfully on project-specific items.
• Outstanding leadership skills, including the ability to thoughtfully listen to, partner with, and technically execute on the needs of the business; able to work holistically to prioritize items that provide the most value to the business over items of lesser value.
• Strong analytical skills, including a thorough understanding of how to interpret stakeholder needs and translate them into operational and project requirements.
• Strong understanding of business complexity and project interdependencies.
• Strong technical understanding of how software is created, tested, and deployed from lower to upper environments. Programming skills are not required for this position, but you should understand what is meant by front end, middleware, back end, database, and so on.
• Solid understanding of various project management methodologies (Agile, Waterfall).
• Committed to driving change and continuous improvement and to bringing projects to fruition on time and within budget.
• Ability, forethought, and inclination to mentor and/or coach less experienced business analysts on best practices and the nuances of the business analyst role.

COMMUNICATION SKILLS
• Outstanding verbal communication skills, with an ability to adapt communication style to suit various audience types and sizes.
• Ability to understand customer business functionality and translate it into clearly written project requirements.
• Ability to collaborate with peers, project teams, and quality analysts to identify and communicate courses of resolution.
• Committed to using human-centered design strategies when conducting discovery and solution-ideation sessions with stakeholders.
Design, develop, and optimize scalable data pipelines and workflows on Databricks, collaborate with AI/ML teams, and ensure data quality and compliance in cloud environments. | 5+ years software/data engineering experience including 2+ years with Databricks and Spark, strong Python, SQL, cloud expertise, infrastructure-as-code knowledge, and leadership skills.

Title: Sr. Data Engineer (Databricks)
Location: USA / Remote
Job Type: 6-12 Months Contract

About the Role
We're looking for a hands-on Data Engineer to build reliable, scalable data pipelines on Databricks. You'll turn requirements into production-grade ELT/ETL jobs, Delta Lake tables, and reusable components that speed up our teams. In this role, you'll implement and improve reference patterns, optimize Spark for performance and cost, apply best practices with Unity Catalog and workflow orchestration, and ship high-quality code others can build on. If you love solving data problems at scale and empowering teammates through clean, well-documented solutions, you'll thrive here.
Core Qualifications:
• Bachelor's degree in computer science, engineering, or a related field.
• 5+ years of experience in software/data engineering, including at least 2 years working with Databricks and Apache Spark.
• Strong proficiency in Python, SQL, and PySpark.
• Deep understanding of AWS and Azure cloud services.
• Experience with the Databricks lakehouse, Databricks Workflows, Databricks SQL, and dbt.
• Solid grasp of data lakehouse and warehousing architecture.
• Prior experience supporting AI/ML workflows, including training-data pipelines and model deployment support.
• Familiarity with infrastructure-as-code tools like Terraform or CloudFormation.
• Strong analytical and troubleshooting skills in a fast-paced, agile environment.
• Excellent collaboration skills for interfacing with both technical and non-technical customer stakeholders.
• Clear communicator with strong documentation habits.
• Comfortable leading discussions, offering strategic input, and mentoring others.

Key Responsibilities:
The ideal candidate will have a strong background in building scalable data pipelines, optimizing big data workflows, and integrating Databricks with cloud services. This role will play a pivotal part in enabling the customer's data engineering and analytics initiatives, especially those tied to AI-driven solutions and projects, by implementing cloud-native architectures that fuel innovation and sustainability.
• Partner directly with the customer's data engineering team to design and deliver scalable, cloud-based data solutions.
• Execute complex ad-hoc queries using Databricks SQL to explore large lakehouse datasets and uncover actionable insights.
• Leverage Databricks notebooks to develop robust data transformation workflows using PySpark and SQL.
• Design, develop, and maintain scalable data pipelines using Apache Spark on Databricks.
• Build ETL/ELT workflows with AWS and Azure services.
• Optimize Spark jobs for both performance and cost within the customer's cloud infrastructure.
• Collaborate with data scientists, ML engineers, and business analysts to support AI and machine learning use cases, including data preparation, feature engineering, and model operationalization.
• Contribute to the development of AI-powered solutions that improve operational efficiency, route optimization, and predictive maintenance in the waste management domain.
• Implement CI/CD pipelines for Databricks jobs using GitHub Actions, Azure DevOps, or Jenkins.
• Ensure data quality, lineage, and compliance through tools like Unity Catalog, Delta Lake, and AWS Lake Formation.
• Troubleshoot and maintain production data pipelines.
• Provide mentorship and share best practices with both internal and customer teams.
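The data-quality duty mentioned above is commonly implemented as a split between rows that pass validation and a quarantine table that preserves rejects for review. A sketch of that pattern in plain Python for readability; on Databricks the same split would typically be two PySpark filters writing to separate Delta tables, and the column names ("route_id", "tonnage") are hypothetical, not from this posting:

```python
# Sketch of a validate-and-quarantine step from an ELT pipeline.
# Column names are hypothetical; failed rows carry their error list so
# reviewers can see why each record was quarantined.

def split_valid_and_quarantine(rows):
    """Route rows that pass basic quality checks to 'valid', the rest to 'quarantine'."""
    valid, quarantine = [], []
    for row in rows:
        problems = []
        if not row.get("route_id"):
            problems.append("missing route_id")
        if not isinstance(row.get("tonnage"), (int, float)) or row["tonnage"] < 0:
            problems.append("tonnage must be a non-negative number")
        if problems:
            quarantine.append({**row, "_quality_errors": problems})
        else:
            valid.append(row)
    return valid, quarantine

rows = [
    {"route_id": "R-17", "tonnage": 4.2},
    {"route_id": "", "tonnage": -1},
]
valid, quarantine = split_valid_and_quarantine(rows)
print(len(valid), len(quarantine))  # 1 1
```

Keeping rejects instead of dropping them is what makes lineage and compliance auditable downstream.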
Collaborate with business and tech teams to analyze requirements, maintain and troubleshoot data pipelines, conduct testing, and support enhancements for Microsoft BI and Power Platform solutions. | 5-7+ years as a systems or technical analyst with strong Microsoft BI and Power Platform experience, SQL knowledge, testing skills, and excellent communication.

Job Title: Technical Systems Analyst (Microsoft BI & Power Platform)
Job Location: USA / Remote
Job Type: Contract

Job Summary:
We are seeking a versatile and hands-on Technical Systems Analyst with strong experience in Microsoft technologies, particularly within BI and the Power Platform. This role blends systems analysis, requirements gathering, workflow documentation, pipeline operations, troubleshooting, and testing. It is ideal for someone with a programming background who thrives in a small team environment where everyone rolls up their sleeves. You will work closely with business users, developers, and data engineers to describe integrated workflows, document requirements, run and maintain pipelines, troubleshoot and fix issues, and conduct and document testing to ensure reliable business solutions.

Key Responsibilities:

Business & Systems Analysis:
• Collaborate with business stakeholders to gather, refine, and document functional and non-functional requirements.
• Describe workflows and integrations across multiple systems.
• Translate business needs into system specifications and process flows.
• Maintain clear documentation for requirements and workflows.

Data Pipeline Operations & Troubleshooting:
• Run and maintain pipelines for business users to ensure accurate and timely delivery of data.
• Troubleshoot and fix technical issues in data pipelines.
• Ensure data integrity and reliability across staging, transformation, and reporting layers.
• Document recurring issues and propose process or system improvements.

QA & Testing:
• Develop and manage test plans, test cases, and execution strategies.
• Personally conduct and document testing results.
• Ensure traceability of requirements to test outcomes.

Support & Enhancements:
• Provide escalation support for custom applications, dashboards, and workflows, including triage, root-cause analysis, and coordination with dev teams.
• Scope and assist with minor enhancements or configuration changes.
• Maintain support documentation and user guides.

Ideal Skills & Experience:
• 5 to 7+ years in a systems analyst or technical analyst role.
• Strong understanding of business processes, data workflows, and integrated systems.
• Hands-on experience running, maintaining, troubleshooting, and fixing data pipelines.
• Hands-on experience conducting and documenting testing.
• Familiarity with Microsoft Power Platform (Power Apps, Power Automate).
• Foundational understanding of SQL, data structures, and relational data.
• Excellent communication and collaboration skills.

Bonus (Nice to Have):
• Proficiency with Microsoft Power BI (data modeling, DAX, visual design).
• Experience designing dashboards and reports in Power BI.
• Exposure to CI/CD processes or working in Git-based environments.
• Knowledge of Azure Data Factory or other ETL tools.
• Programming background with Microsoft technologies.

This Role Is Ideal for Someone Who:
• Is a "translator" between business and tech.
• Has a programming background but now thrives as a systems analyst.
• Is hands-on and thrives in a small, collaborative team.
• Enjoys running, maintaining, troubleshooting, and fixing data pipelines.
• Is organized, quality-focused, and not afraid to roll up their sleeves.
• Describes workflows and documents requirements clearly.
Configure and customize Dynamics 365 modules, manage user access and security, oversee data integration, monitor system performance, support testing and deployment, maintain documentation, and provide first-line technical support. | Experience with Dynamics 365 F&O and CRM system configuration, security administration, data migration, system monitoring, testing, deployment, documentation, and user training.

Title: MS Dynamics 365 F&O & CRM System Analyst
Location: Remote
Mode of Hire: Contract

Job Details:
• Dynamics 365 F&O & CRM System Configuration & Customization: Configure modules, entities, workflows, and business rules to align with organizational requirements.
• User Access & Security Management: Administer user roles, permissions, and data-access levels to ensure proper security and compliance.
• Data Integration & Migration: Oversee import/export of data.
• Monitoring & Performance Optimization: Track system health, troubleshoot performance issues, and ensure platform uptime and responsiveness.
• Testing & Deployment Support: Participate in testing new features and upgrades; assist in deploying solutions into production.
• Documentation & Training: Maintain documentation on system processes and configurations, and train users or support teams on Dynamics functionality.
• Issue Resolution & Technical Support: Serve as first-line support for Dynamics-related issues and coordinate with Microsoft or internal tech teams for resolution.
Design and implement event-driven architecture solutions using Kafka, including governance, cluster management, and disaster recovery. | Experience architecting Kafka-based event streaming solutions, managing Kafka clusters, schema registries, replication, and disaster recovery in high-availability environments.

Role: Event-Driven Architecture (EDA) Architect
Mode of Hire: Contract
Preferred Location: Miami, FL (Remote)
Skills: Confluent Kafka, Kafka Connect, Schema Registry, Kafka Admin APIs, Kubernetes, DevOps, AWS EKS / ELK

• Experience in architecture and solutioning around application event streaming: defining data models, message schemas, and communication protocols for efficient and reliable data transmission across diverse systems and environments.
• Must have designed and implemented robust governance frameworks that create clear boundaries and guidelines for the usage of Kafka, Schema Registry, MirrorMaker for disaster recovery (DR), and migration replication, ensuring data integrity and compliance with organizational standards.
• Administration, upgrades, ACL management, topic creation, rebalancing, and performance tuning of clusters.
• Has designed and enforced boundaries around the usage of Kafka, Schema Registry, Cluster Linking, and replication.
• Infrastructure planning and implementation of Confluent Kafka in high-availability and DR scenarios.
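The schema-governance work described above centers on compatibility rules of the kind Confluent Schema Registry enforces between schema versions. A deliberately simplified Python illustration of the backward-compatibility idea (a consumer on the new schema must still be able to read old records, so newly added fields must be optional); real registries evaluate full Avro/Protobuf/JSON Schema resolution rules, and the field names here are hypothetical:

```python
# Simplified backward-compatibility check between two message schema versions.
# A schema here is just {field_name: {"required": bool}}; actual Schema
# Registry compatibility is computed over full schema semantics.

def is_backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """Can a consumer using new_schema read records written with old_schema?"""
    for name, spec in new_schema.items():
        # A field added in the new version must be optional (i.e. have a
        # default), otherwise records written under the old schema cannot
        # be decoded by the new reader.
        if name not in old_schema and spec.get("required", False):
            return False
    return True

v1 = {"order_id": {"required": True}, "amount": {"required": True}}
v2_ok = {**v1, "currency": {"required": False}}   # added optional field
v2_bad = {**v1, "currency": {"required": True}}   # added required field
print(is_backward_compatible(v1, v2_ok), is_backward_compatible(v1, v2_bad))  # True False
```

Encoding rules like this in a governance framework is what lets producers evolve schemas without breaking downstream consumers or DR replicas.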