Find your dream job faster with JobLogr
AI-powered job search, resume help, and more.

Latest Jobs

These are the latest job openings our job search agents have found.


Data Engineer II

Samsara · Anywhere · Full-time
Compensation: $70K - 110K a year

Build and maintain ETL/ELT data pipelines to support analytics and AI use cases. | 2-3 years of industry experience in data engineering with proficiency in Python, SQL, and modern data platforms like Databricks.

Who we are: Samsara (NYSE: IOT) is the pioneer of the Connected Operations™ Cloud, a platform that enables organizations that depend on physical operations to harness Internet of Things (IoT) data to develop actionable insights and improve their operations. At Samsara, we are helping improve the safety, efficiency, and sustainability of the physical operations that power our global economy. Representing more than 40% of global GDP, these industries are the infrastructure of our planet, including agriculture, construction, field services, transportation, and manufacturing, and we are excited to help digitally transform their operations at scale. Working at Samsara means you’ll help define the future of physical operations and be on a team that’s shaping an exciting array of product solutions, including Video-Based Safety, Vehicle Telematics, Apps and Driver Workflows, and Equipment Monitoring. As part of a recently public company, you’ll have the autonomy and support to make an impact as we build for the long term.

About the role: Samsara’s Revenue Operations AI & Data Team is building the future of how we go to market, with intelligence, personalization, and speed. We’re a high-impact team of builders, scientists, and strategists focused on transforming sales operations through AI. Our mission is to help sellers reach the right customer at the right time with the right message, and to put everything they need at their fingertips, whether that’s data from Salesforce, context from a past call, or content that wins deals. As a Data Engineer II, you’ll own the data platforms that power Samsara’s GTM AI engine.
You’ll be responsible for building, scaling, and optimizing our Databricks data store, visualization store, and AI store, while also enabling large-scale generative AI jobs in Databricks. Your work will ensure that our AI applications, from CRM pipelines and CS systems to GenAI-powered copilots, are grounded in clean, reliable, and well-structured data. You’ll partner closely with data scientists, AI engineers, and business stakeholders to deliver the infrastructure that fuels innovation at scale.

This role is open to candidates residing in the US except the San Francisco Bay Metro Area, NYC Metro Area, and Washington, D.C. Metro Area.

You should apply if:
* You want to impact the industries that run our world: Your efforts will result in real-world impact, helping to keep the lights on, get food into grocery stores, reduce emissions, and most importantly, ensure workers return home safely.
* You are the architect of your own career: If you put in the work, this role won’t be your last at Samsara. We set up our employees for success and have built a culture that encourages rapid career development and countless opportunities to experiment and master your craft in a hyper-growth environment.
* You’re energized by our opportunity: The vision we have to digitize large sectors of the global economy requires your full focus and best efforts to bring forth creative, ambitious ideas for our customers.
* You want to be with the best: At Samsara, we win together, celebrate together, and support each other. You will be surrounded by a high-calibre team that will encourage you to do your best.

In this role, you will:
* Build and maintain ETL/ELT data pipelines in Databricks and Spark, ensuring data is ingested, transformed, and delivered reliably for analytics and AI use cases.
* Develop and evolve logical and physical data models to support reporting, experimentation, and advanced workflows (e.g., scoring models, signal generation).
* Implement monitoring, alerts, and testing for data quality, timeliness, and lineage to ensure trustworthy data delivery.
* Support workflow orchestration with Databricks Jobs, DBT, or equivalent scheduling tools to operate at scale.
* Contribute to data pipelines and tooling that support retrieval-augmented generation (RAG), vector integrations, or embedding workflows.
* Design and optimize bulk GenAI data pipelines in Databricks to support generative AI applications at scale.
* Partner with AI engineers and data scientists to enable experimentation, model training, and production-grade deployments.
* Develop frameworks for data ingestion, transformation, governance, and monitoring across CRM, sales, and revenue systems.
* Work with RevOps, sales, and customer success stakeholders to translate business needs into data requirements and stable technical implementations.

Minimum requirements for the role:
* 2-3 years of industry experience in data engineering, with significant experience building large-scale data platforms.
* Hands-on experience with a modern data technology stack, such as Databricks, DBT, Redshift, RDS, Snowflake, or similar solutions.
* Proficiency in Python and SQL, with experience designing robust ETL/ELT pipelines.
* Experience orchestrating data workflows at scale and enabling machine learning or AI use cases.
* Strong understanding of data modeling, performance optimization, and cost-efficient infrastructure design.
* Located in and authorized to work in the United States (this is a fully remote role).

An ideal candidate also has:
* Experience enabling generative AI workflows in Databricks or similar platforms.
* Familiarity with vector databases, embeddings, and retrieval systems.
* Experience with Salesforce, Gainsight, Gong, Outreach, or other CRM/enablement tools as data sources.
* Proven ability to automate repetitive tasks, improve data hygiene, and enable experimentation across GTM data use cases, aligning with the emerging responsibilities of GTM engineering, where clean, reliable GTM data foundations enable high-leverage automation and insight generation.
* Exposure to observability, monitoring, and governance best practices for data and AI systems.
* Ability to collaborate closely with AI/ML teams while driving technical excellence in data engineering.

The annual on-target earnings (OTE) range for full-time employees in this position is below and depends on your city of residence. Learn more about our total rewards and benefits below.

Annual OTE Salary: $101,745 - $153,900 USD

Total Rewards: At Samsara, we build for the people who keep the global economy moving. We want owners, not passengers, which is why our rewards are designed to fuel high-impact builders. Our compensation program delivers above-market total compensation through a combination of base salary, performance-based bonus/variable pay, and equity (for eligible roles) in a high-growth public company. We meaningfully differentiate pay for our top performers, who have the opportunity to earn above-market compensation that can outpace the broader market over time. Beyond compensation, we provide the foundations that enable long-term success: a flexible, employee-led remote model, a professional development stipend, comprehensive health and parental leave plans, and more. If you’re ready to build for the long term and own the outcome, your journey starts here.

Flexible Working: At Samsara, we embrace a flexible working model that caters to the diverse needs of our teams. Our offices are open for those who prefer to work in person, and we also support remote work where it aligns with our operational requirements.
For certain positions, being close to one of our offices or within a specific geographic area is important to facilitate collaboration, access to resources, or alignment with our service regions. In these cases, the job description will clearly indicate any working location requirements. Our goal is to ensure that all members of our team can contribute effectively, whether they are working on-site, in a hybrid model, or fully remotely. All offers of employment are contingent upon an individual’s ability to secure and maintain the legal right to work at the company and in the specified work location, if applicable.

Belonging at Samsara: At Samsara, we welcome everyone regardless of their background. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, gender, gender identity, sexual orientation, protected veteran status, disability, age, and other characteristics protected by law. We depend on the unique approaches of our team members to help us solve complex problems and want to ensure that Samsara is a place where people from all backgrounds can make an impact.

Accommodations: Samsara is an inclusive work environment, and we are committed to ensuring equal opportunity in employment for qualified persons with disabilities. Please email accessibleinterviewing@samsara.com if you require any reasonable accommodations throughout the recruiting process.

Our Commitment to Authenticity: We use Tofu, a fraud detection tool, to validate the authenticity of applications and protect against identity fraud. This ensures we are connecting with real people and allows us to prioritize genuine candidates. Please see Samsara’s Candidate Privacy Notice for more information.

Fraudulent Employment Offers: Samsara is aware of scams involving fake job interviews and offers. Please know we do not charge fees to applicants at any stage of the hiring process.
Official communication about your application will only come from emails ending in @samsara.com, @us-greenhouse-mail.io or @mail3.guide.co. For more information regarding fraudulent employment offers, please visit our blog post here.
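To make the posting's pipeline responsibilities concrete for readers new to data engineering, here is a minimal extract-transform-load sketch with a data-quality gate, in plain Python. Everything in it, from the `Opportunity` record to the field names, is invented for illustration; a production pipeline at this scale would run on Databricks/Spark as the posting describes.

```python
# Minimal ETL sketch with a data-quality gate. All names are hypothetical.
from dataclasses import dataclass


@dataclass
class Opportunity:
    account_id: str
    amount_usd: float


def extract(raw_rows):
    """Parse raw CRM-style records, skipping malformed ones."""
    rows = []
    for r in raw_rows:
        try:
            rows.append(Opportunity(r["account_id"], float(r["amount_usd"])))
        except (KeyError, ValueError):
            continue  # malformed record: drop and (in practice) log it
    return rows


def transform(rows):
    """Keep positive amounts and normalize account ids."""
    return [
        Opportunity(r.account_id.strip().upper(), r.amount_usd)
        for r in rows
        if r.amount_usd > 0
    ]


def quality_check(rows, min_rows=1):
    """Fail the job loudly if the batch is implausibly small."""
    if len(rows) < min_rows:
        raise ValueError(f"quality gate failed: only {len(rows)} rows")
    return rows


raw = [
    {"account_id": " acme ", "amount_usd": "1200.50"},
    {"account_id": "globex", "amount_usd": "-5"},  # filtered: non-positive
    {"account_id": "initech"},                     # malformed: missing amount
]
loaded = quality_check(transform(extract(raw)))
```

The quality gate mirrors the posting's emphasis on monitoring and testing for data quality: a batch that shrinks unexpectedly should fail the job rather than silently load bad data.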

Python
SQL
Data Engineering
Machine Learning
ETL
Data Pipelines
Direct Apply
Posted about 15 hours ago

Software Engineer

Pearson · Anywhere · Full-time
Compensation: $55K - 90K a year

Develop and maintain Ruby-based services, APIs, and JavaScript frontends for a global platform. | Experience with Ruby, Ruby on Rails, modern JavaScript, SQL, RESTful APIs, testing, code reviews, and agile development.

Software Engineer (Junior / Associate)
Location: Remote (US preferred)
Level: (Associate to Advanced Associate)
Team: Engineering – Enterprise Learning & Skills

About the Role: We are looking for a Junior Full-Stack Software Engineer to join the Enterprise Learning & Skills (ELS) engineering team and help build, maintain, and evolve a large-scale, API-driven platform used globally to issue and verify digital credentials. This role is ideal for an early-career engineer who is comfortable working in a modern JavaScript/Ruby on Rails environment, eager to grow their technical depth, and excited to use AI-assisted development tools as part of their daily workflow.

What You'll Do
* Contribute to the development and maintenance of Ruby-based services and APIs and JavaScript frontends
* Implement well-scoped features, bug fixes, and refactors under guidance from senior engineers
* Write clear, maintainable code with appropriate test coverage
* Participate in code reviews and incorporate feedback
* Use AI-assisted development tools for coding, testing, and debugging
* Collaborate with Product and Customer Success teams on platform questions
* Follow best practices for secure and scalable web application development

Required Qualifications
* Experience with Ruby and Ruby on Rails
* Front-end experience with modern JavaScript frameworks
* Understanding of relational databases and SQL
* Familiarity with RESTful APIs
* Familiarity with AI/LLM-related tooling and workflows
* Experience writing tests and participating in code reviews
* Comfort working in an agile development environment
* Strong communication skills and a growth mindset

AI-Native Expectations: Candidates should be comfortable using AI tools to assist with coding, testing, documentation, and learning. This includes validating AI-generated outputs, applying sound engineering judgment, and remaining curious about how AI can responsibly improve developer productivity.

Nice to Have
* Cloud experience (AWS, Azure, or GCP)
* CI/CD and automated deployment exposure
* Knowledge of web application security best practices
* Experience with design systems, component libraries, or advanced frontend tooling

Applications will be accepted through 3/16/2026. This window may be extended depending on business needs.

Compensation at Pearson is influenced by a wide array of factors including, but not limited to, skill set, level of experience, and specific location. As required by the California, Colorado, Hawaii, Illinois, Maryland, Minnesota, New Jersey, New York State, New York City, Vermont, Washington State, and Washington DC laws, the pay range for this position is as follows: the full-time salary range is between $60,000 - $90,000. This position is eligible to participate in an annual incentive program, and information on benefits offered is here: https://pearsonbenefitsus.com/

Ruby
JavaScript
SQL
Direct Apply
Posted about 18 hours ago

Financial Analyst (Clinical Trials Research/ Revenue Cycle)

Northwestern Memorial Healthcare · Chicago, Illinois · Full-time
Compensation: $55K - 85K a year

Support research billing compliance workflows by reviewing charges accurately and resolving billing discrepancies within Epic to maintain revenue cycle performance. | Requires a high school or associate degree, basic medical billing knowledge, Epic experience, and the ability to manage high volume independently.

At Northwestern Medicine, every patient interaction makes a difference in cultivating a positive workplace. This patient-first approach is what sets us apart as a leader in the healthcare industry. As an integral part of our team, you'll have the opportunity to join our quest for better health care, no matter where you work within the Northwestern Medicine system. We pride ourselves on providing competitive benefits: from tuition reimbursement and loan forgiveness to 401(k) matching and lifecycle benefits, our goal is to take care of our employees. Ready to join our quest for better?

Job Description: The Financial Analyst (Clinical Trials Research/Revenue Cycle) reflects the mission, vision, and values of Northwestern Medicine, adheres to the organization’s Code of Ethics and Corporate Compliance Program, and complies with all relevant policies, procedures, guidelines, and all other regulatory and accreditation standards. The Research Financial Analyst in this role will support both hospital and professional research billing compliance workflows, ensuring research-related charges are reviewed accurately and within required timelines to maintain timely billing and revenue cycle performance. The ideal candidate will have hands-on Epic experience with a strong understanding of charge review processes, research billing regulations, and revenue cycle dependencies, high attention to detail, and the ability to manage high volumes independently while quickly onboarding to support operational continuity. Familiarity with both hospital and professional billing workflows is expected.

Responsibilities:
* Ensure timely review of charges within 10 days of service to meet internal KPIs.
* Work daily within Epic charge review queues/reports to process and monitor charges.
* Review and validate research-related hospital and professional charges for billing compliance, distinguishing between billable and non-billable research services.
* Identify and resolve billing discrepancies and late charge drops.
* Collaborate with research operations and revenue cycle stakeholders to support compliance.
* Maintain accurate documentation and ensure audit readiness.
* Review and reference study documents (protocol, consent form, budget, clinical trial agreement, coverage analysis) to identify research-related charges impacting NMHC and to ensure and support compliant billing.
* Maintain a detailed understanding of, and adhere to, the rules and regulations of different types of payors, such as Medicare, Medicare Advantage, Medicaid, and commercial payors, inclusive of Medicare's Clinical Trial Policy (NCD 310.1) and related guidance.
* Act as a resource to research teams to aid in the identification of items and services related to the research and the assignment of the appropriate payor.
* Maintain knowledge and proficiency in Epic, the clinical trial management system, the research fee schedule, and relevant government and non-government websites.

Qualifications

Required:
* High school diploma
* One year of related work experience or a college degree
* Ability to perform mathematical calculations
* Basic knowledge of medical terminology and billing practices
* Extensive experience and knowledge of PC applications, including Microsoft Office and Excel
* Ability to learn quickly and meet continuous timelines
* Behaviors consistent with principles of excellent service

Preferred:
* Two or more years of college or a college degree
* Associate’s degree

Additional Information: Northwestern Medicine is an equal opportunity employer (disability, VETS) and does not discriminate in hiring or employment on the basis of age, sex, race, color, religion, national origin, gender identity, veteran status, disability, sexual orientation, or any other protected status.

Background Check: Northwestern Medicine conducts a background check that includes criminal history on newly hired team members and, at times, internal transfers. If you are offered a position with us, you will be required to complete an authorization and disclosure form that gives Northwestern Medicine permission to run the background check. Results are evaluated on a case-by-case basis, and we follow all local, state, and federal laws, including the Illinois Health Care Worker Background Check Act.

Artificial Intelligence Disclosure: Artificial Intelligence (AI) tools may be used in some portions of the candidate review process for this position; however, all employment decisions will be made by a person.
Benefits: We offer a wide range of benefits that provide employees with tools and resources to improve their physical, emotional, and financial well-being while providing protection for unexpected life events. Please visit our Benefits section to learn more.

Sign-on Bonus Eligibility (if a sign-on bonus is offered for the position): Internal employees and rehires who left Northwestern Medicine within 1 year are not eligible for the sign-on bonus. Exception: new graduate internal employees seeking their first licensed clinical position at NM may be eligible depending upon the job family.

Job Shift: Day Job (1st)
Salary Range Minimum: $30.02
Salary Range Maximum: $43.53

Epic Systems
Revenue Cycle Management
Data Analysis
Direct Apply
Posted about 18 hours ago

Information Security Analyst (Remote)

Businessolver · Anywhere · Full-time
Compensation: $55K - 85K a year

Safeguard company networks by implementing defense solutions, monitoring security alerts, and managing security configurations. | Bachelor's degree in a related field and 1-2+ years of experience in IT security, including knowledge of monitoring, incident response, and endpoint security.

Since 1998, Businessolver has delivered market-changing benefits technology and services supported by an intrinsic responsiveness to client needs. The company creates client programs that maximize benefits program investment, minimize risk exposure, and engage employees with easy-to-use solutions and communication tools to assist them in making wise and cost-efficient benefits selections. Founded by HR professionals, Businessolver's unwavering service-oriented culture and secure SaaS platform provide measurable success in its mission to provide complete client delight.

A Brief Overview: This role is responsible for safeguarding company networks and computer systems from security threats or attacks by establishing and implementing security solutions that defend our networking assets. The role will develop security standards and best practices for the organization and recommend security enhancements as needed.
What you will do
* Monitor, maintain, and respond to security alerts for our infrastructure
* Identify potential, successful, and unsuccessful intrusion attempts
* Participate in the vulnerability assessment program
* Configure, maintain, and troubleshoot single sign-on solutions, anti-virus, web filtering, and web application firewalls
* Respond to security incidents, assist with troubleshooting, and provide on-call support as needed
* Propose creative solutions to grow our business by delighting our clients
* May perform other duties as assigned

Skills and Abilities
* BS in Computer Science, CIS, Software Engineering, or a related degree, completed or in progress
* 1-2+ years of experience securing IT systems
* Knowledge of security monitoring and incident response activities
* Experience with Windows or Linux environments
* Knowledge of vulnerability assessment tools such as Qualys or similar
* Experience with endpoint security, including anti-virus, HIPS, and endpoint hardening
* Knowledge of methods to investigate security incidents
* Knowledge of modern scripting languages
* Highly motivated, innovative, self-directed thinker with an eagerness to stay up to date with current trends and a desire to impress
* Excellent written and verbal communication skills
* Ability to thrive in a fast-paced, innovative environment

The pay range for this position is $65K to $75K per year (pay to be determined by the applicant’s education, experience, knowledge, skills, and abilities, as well as internal equity and alignment with market data). This role is eligible to participate in the incentive plan. Other Compensation: If this position is full-time or part-time benefit eligible, you will receive a comprehensive benefits package, which can be viewed here: https://businessolver.foleon.com/bsc/job-board-businessolver-virtual-benefits-guide/

Dear Applicant: At Businessolver, we take our responsibility to protect our clients, employees, and company seriously, and that begins with the hiring process.
Our approach is thoughtful and thorough. We’ve built a multi-layered screening process designed to identify top talent and ensure the integrity of every hire. This includes quickly filtering out individuals who may attempt to misrepresent themselves or act in bad faith. We also partner with trusted, best-in-class providers to conduct background checks, verify identities, and confirm references. These steps aren’t just about compliance; they’re about ensuring fairness, safety, and trust for everyone involved. Put simply: we will always confirm that you are who you say you are. It's just one of the many ways we uphold the standards that matter most, to you, to us, and to the people we serve. With heart, The Businessolver Recruiting Team

Businessolver is committed to maintaining an environment that protects client data. We train our employees to maintain leading-class security practices and expect all employees to adhere to policy, procedures, and controls. (Applicable to all roles at an AVP, DIR, VP, Head Of, or SVP and above level): Serve as a security contact for the business unit. Responsible for driving adoption of and compliance with information security and privacy practices. Serve as a liaison with the information security team on security and privacy matters.

Equal Opportunity at Businessolver: Businessolver is an Affirmative Action and Equal Opportunity Employer and is proud to offer equal employment opportunity to everyone regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, veteran status, and more. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. #LI-Remote

Security Monitoring
Incident Response
Vulnerability Assessment
Scripting Languages
Windows
Linux
Direct Apply
Posted about 18 hours ago

Senior Customer Success Operations Manager

CompanyCam · Anywhere · Full-time
Compensation: $85K - 120K a year

Manage and optimize customer success technology platforms and automate lifecycle workflows to improve customer retention and growth. | 5–7 years in customer success or revenue operations in B2B SaaS with strong Salesforce and automation experience.

Hi, we’re CompanyCam. We’re a simple-to-use photo documentation and productivity app for contractors across commercial and home services industries. Packed with intuitive functionality, CompanyCam facilitates unparalleled communication and accountability across a contractor’s entire business. We’re committed to providing a consumer-grade, game-changing experience that helps our users build trust within their company and with their customers. But don’t let that corporate description fool you: the people behind our buttoned-up product are laid-back (but hardworking), genuine, and kickass, and you could be one of them!

The Role: We’re looking for a Senior Customer Success Operations Manager to lead the strategy, systems, and operational processes that power customer retention and expansion. In this role, you’ll design and optimize the frameworks that support onboarding, product adoption, renewal, and growth across the customer lifecycle. You’ll partner closely with Customer Success leadership as well as teams across Sales, Product, Finance, and Revenue Operations to ensure alignment on lifecycle processes, customer health metrics, and execution standards. This role plays a key part in enabling a high-performing Customer Success organization through scalable systems, data-driven insights, and operational excellence. With an AI-forward mindset, you’ll leverage automation and predictive insights to proactively manage churn risk, improve customer outcomes, and expand the capacity of the Customer Success team in a high-growth SaaS environment.

Location: This is a remote position. You must live and work permanently in the U.S. to be considered.
What You'll Do
* Own and optimize the Customer Success technology ecosystem, ensuring systems are integrated, scalable, and aligned with lifecycle processes
* Administer and enhance CRM and Customer Success platforms while maintaining strong data governance and process integrity
* Design and implement automation that improves efficiency, reduces manual work, and enhances the customer experience
* Define and operationalize lifecycle workflows across onboarding, adoption, customer health monitoring, renewal, and expansion
* Partner with Customer Success leadership to translate strategy into scalable operational frameworks and playbooks
* Implement predictive customer health scoring and risk detection models using behavioral, engagement, and product usage data
* Develop reporting frameworks and dashboards that provide visibility into key metrics such as GRR, NRR, churn risk, product adoption, and time-to-value
* Deliver insights that inform renewal forecasting, expansion strategy, and resource allocation decisions
* Align cross-functional processes and SLAs with Sales, Product, Support, and Finance to ensure seamless customer handoffs
* Lead cross-functional projects that improve lifecycle execution, remove operational bottlenecks, and scale Customer Success processes
* Evaluate and implement new tools, automation, and AI-driven capabilities that enhance customer lifecycle management
* Establish governance, documentation, and best practices that support long-term operational scalability

The Impact You'll Have: At CompanyCam, your work makes a real impact. Whether you're writing code, supporting customers, or designing experiences, your contributions directly shape the product we deliver and the people we serve. We're building something that helps real people solve real problems, and we believe that kind of work is best done by a team that reflects the world around us.
In this role, you’ll drive impact by:
* Building scalable operational systems that strengthen customer retention and expansion
* Improving visibility into customer health, lifecycle performance, and renewal risk
* Enabling Customer Success teams to operate more efficiently through automation and better tools
* Strengthening cross-functional alignment around the customer lifecycle
* Helping CompanyCam deliver stronger customer outcomes while supporting long-term revenue growth

What You'll Bring
* 5–7 years of experience in Customer Success Operations, Revenue Operations, Sales Operations, or similar roles within a B2B SaaS or high-growth technology company
* Strong experience designing and optimizing customer lifecycle processes in subscription-based businesses
* Hands-on expertise with Salesforce and Customer Success platforms such as Gainsight, ChurnZero, Totango, Catalyst, or similar tools
* Experience implementing automation, workflow optimization, and scalable operational frameworks
* Strong analytical skills with the ability to define KPIs, build dashboards, and translate data into strategic insights
* Experience partnering cross-functionally with Customer Success, Sales, Product, Finance, and RevOps teams
* Proven ability to lead complex operational initiatives from concept through execution
* Bachelor’s degree in Business, Finance, Analytics, or a related field
* A continuous growth mindset, with a focus on learning, embracing challenges, and continuously improving
* A knack for creativity and innovation, bringing fresh ideas to the table and solving complex problems

Benefits & Compensation: This is a salaried position at CompanyCam. Our starting salary range is $123,000-$143,000 per year and is based on experience. We also offer meaningful equity and other benefits. CompanyCam is an equal-opportunity employer committed to respect, inclusion, and growth. We work hard, take responsibility, and support each other.
Great ideas come from all backgrounds, and we carefully consider every applicant without regard to personal characteristics or traits. Even if your work experience doesn’t align perfectly, we encourage you to apply. What really matters to us is your potential, your passion, and your commitment to learning, innovation, and contributing meaningfully to our team. For any accommodations or technical issues related to the online application or interview process, please email jobs@companycam.com and we’ll respond promptly. Please do not include any medical or health information in your message. Note: Resumes sent to this email will not be reviewed or responded to. To be considered for a position, you must apply directly through our careers page.
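As a rough illustration of the predictive customer health scoring this posting mentions, here is a weighted-score sketch in Python. The signals, weights, and risk thresholds are entirely invented for illustration; a real model would be fit to behavioral, engagement, and product usage data as the posting describes.

```python
# Illustrative weighted customer-health score; weights and thresholds are made up.
WEIGHTS = {"logins_per_week": 0.4, "feature_adoption": 0.35, "support_tickets": 0.25}


def health_score(signals):
    """Combine normalized signals (each in [0, 1]) into a 0-100 score.

    'support_tickets' is inverted: more tickets suggests lower health.
    """
    score = (
        WEIGHTS["logins_per_week"] * signals["logins_per_week"]
        + WEIGHTS["feature_adoption"] * signals["feature_adoption"]
        + WEIGHTS["support_tickets"] * (1.0 - signals["support_tickets"])
    )
    return round(100 * score, 1)


def risk_band(score):
    """Map a score to a churn-risk band that CS workflows can act on."""
    if score >= 70:
        return "healthy"
    if score >= 40:
        return "watch"
    return "at-risk"


engaged = health_score(
    {"logins_per_week": 0.9, "feature_adoption": 0.8, "support_tickets": 0.1}
)
disengaged = health_score(
    {"logins_per_week": 0.1, "feature_adoption": 0.2, "support_tickets": 0.9}
)
```

The banding step is what makes a score operational: it turns a continuous number into a trigger for playbooks (renewal outreach, onboarding follow-up) of the kind the role would design.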

Customer Success Operations
Salesforce Administration
Data Governance
Direct Apply
Posted about 20 hours ago
WhyHireWrong?

Product Data Scientist: Agentic AI

WhyHireWrong?AnywhereFull-time
View Job
Compensation$70K - 120K a year

Develop and scale an internal Agentic AI framework, collaborating with product teams and global stakeholders. | Master's degree, or Bachelor's with strong data science experience; 2+ years of production-grade data science including GenAI; solid Python skills. |

The Role
This is a product-focused data science position sitting within a dedicated Agentic AI team. The core responsibility is co-owning the development and direction of an internal Agentic AI framework: ensuring it scales to a growing list of use cases and delivers a strong developer experience for the data scientists building on top of it. This is not a pure research role. It combines hands-on engineering, product thinking, and close collaboration with AI Engineers to build something that other data scientists rely on daily.

What the Work Looks Like Day to Day
Partner with product teams and business leaders to understand and define Agentic AI use cases
Collaborate with data scientists across global teams to gather feedback on the agent-building experience and translate it into framework improvements
Shape and drive the evolution roadmap for the Agentic AI framework
Apply GenAI and Agentic AI techniques to solve real business problems
Build and maintain resilient, production-grade algorithmic and agentic pipelines
Write clean, well-structured code following engineering best practices
Deepen applied knowledge across machine learning, optimization, statistical modeling, and GenAI

Technical Stack
Cloud: Microsoft Azure, Google Cloud Platform, Kubernetes
Languages: Python, Spark (preferred); SQL for analytical work
Big data ecosystem: Databricks, BigQuery, Spark
Dev tools: GitHub, Jira, Confluence (Agile DevOps environment)
BI tools: PowerBI or Tableau (basic familiarity useful)

What Is Required
Master's degree in a quantitative field (Statistics, Operations Research, Computer Science, Applied Mathematics, Systems Engineering, Economics), OR a Bachelor's or Engineering degree with strong, consecutive data science experience
At least 2 years of delivering production-grade data science or algorithmically enabled applications, with at least some of that experience involving GenAI-based solutions
Solid Python skills; Spark experience is a plus
Experience with, or genuine interest in, building tools and frameworks used by other data scientists
Strong analytical thinking across optimization, simulation, predictive modeling, and experimentation
Comfortable taking ownership, navigating ambiguity, and working across distributed global teams

What Strengthens an Application
Prior experience building developer tooling, internal platforms, or frameworks for data science teams is a genuine differentiator here. This role sits at the intersection of engineering and product, and candidates who have thought about developer experience, not just model performance, will stand out.

Working Model and Location
This role is based in Warsaw, Poland, on a hybrid working arrangement. Regular on-site presence in Warsaw is required. Full remote is not available for this position.
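The framework role above centers on developer experience for data scientists building agents on a shared internal platform. One common building block of such frameworks is a tool registry that agents dispatch against by name. The sketch below is a hypothetical illustration of that pattern only — all names are invented, and this is not the posting's actual API:

```python
# Hypothetical sketch of a tool-registry pattern an internal Agentic AI
# framework might expose to data scientists; names are illustrative.
from typing import Callable, Dict


class ToolRegistry:
    """Maps tool names to callables so an agent can dispatch by name."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., object]] = {}

    def register(self, name: str) -> Callable:
        """Decorator that records a function under a tool name."""
        def decorator(fn: Callable) -> Callable:
            self._tools[name] = fn
            return fn
        return decorator

    def dispatch(self, name: str, **kwargs) -> object:
        """Invoke a registered tool; fail loudly on unknown names."""
        if name not in self._tools:
            raise KeyError(f"unknown tool: {name}")
        return self._tools[name](**kwargs)


registry = ToolRegistry()


@registry.register("add")
def add(a: int, b: int) -> int:
    return a + b
```

A registry like this keeps the agent loop decoupled from individual tools, which is one way a framework can scale to "a growing list of use cases" without per-use-case changes; for example, `registry.dispatch("add", a=2, b=3)` returns 5.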

Python
SQL
Machine Learning
Federated Learning
Distributed Systems
Direct Apply
Posted about 23 hours ago
WhyHireWrong?

Data Scientist: Machine Learning and GenAI

WhyHireWrong?AnywhereFull-time
View Job
Compensation$55K - 120K a year

Develop and integrate machine learning models to solve business problems using large datasets. | Master's degree, or Bachelor's with experience; strong Python and SQL skills; and 2+ years of production data science experience. |

The Role
This is an ownership-driven data science position within a scaled, globally distributed hub focused on bringing algorithms to production. The work spans traditional machine learning, deep learning, GenAI, optimization, and statistical modeling. Methods are chosen based on the problem, not the trend. The scope covers high-impact business domains including retail, media, digital commerce, supply chain, R&D, and productivity. This is not a research-only role. The expectation is to understand the business problem deeply, build the right model, and see it through to reliable production deployment.

What the Work Looks Like Day to Day
Take ownership of a defined business domain and its algorithmic needs, from problem framing through to deployed solution
Partner with product, business, and AI engineering teams to automate and integrate models into live applications
Analyze large-scale datasets (think: processing billions of behavioral signals daily) and translate findings into actionable recommendations
Define and evolve the algorithmic roadmap for your area of ownership
Apply machine learning, statistical, optimization, and GenAI techniques to real business problems
Write production-grade code following engineering best practices
Build resilient, maintainable algorithmic pipelines that hold up over time

Technical Stack
Cloud: Microsoft Azure, Google Cloud Platform, Kubernetes
Languages: Python, Spark (preferred); SQL for analytical work
Big data ecosystem: Databricks, BigQuery, Spark
Dev tools: GitHub, Jira, Confluence (Agile DevOps environment)
BI tools: PowerBI or Tableau (basic familiarity useful)

What Is Required
Master's degree in a quantitative field (Statistics, Operations Research, Computer Science, Applied Mathematics, Systems Engineering, Economics), OR a Bachelor's or Engineering degree with solid, consecutive data science experience
At least 2 years of experience delivering production-grade data science or algorithmically enabled applications
Strong Python skills with hands-on experience in machine learning, statistical modeling, and optimization
Solid SQL and analytical skills
Demonstrated ability to lead problem solving and prioritize across competing demands
Comfortable working across cross-functional teams in a fast-moving environment

What Strengthens an Application
Experience with the full lifecycle of an algorithmic product: not just model building, but deployment, monitoring, and iteration. Familiarity with big data tooling (Databricks, BigQuery, Spark) and exposure to GenAI or optimization methods are genuine advantages, not box-ticking requirements.

Working Model and Location
This role is based in Warsaw, Poland, on a hybrid working arrangement. Regular on-site presence in Warsaw is expected; full remote is not available for this position.
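The posting stresses resilient, maintainable algorithmic pipelines that hold up over time. One minimal way to keep pipelines maintainable is to compose small, independently testable steps into a single callable; the sketch below illustrates that idea in plain Python (step names and logic are purely illustrative, not anything from the role):

```python
# Minimal sketch: compose processing steps into one pipeline callable.
# Step names (clean, scale) are illustrative placeholders.
from typing import Callable, List


def make_pipeline(*steps: Callable[[list], list]) -> Callable[[list], list]:
    """Compose steps left to right: output of one feeds the next."""
    def run(data: list) -> list:
        for step in steps:
            data = step(data)
        return data
    return run


def clean(xs: List) -> List:
    """Drop missing values before downstream steps see them."""
    return [x for x in xs if x is not None]


def scale(xs: List) -> List:
    """Toy transform standing in for a real feature step."""
    return [x * 2 for x in xs]


pipeline = make_pipeline(clean, scale)
```

Because each step is a plain function, steps can be unit-tested and swapped individually; `pipeline([1, None, 3])` yields `[2, 6]`.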

Machine Learning
Python
SQL
Direct Apply
Posted about 23 hours ago
WhyHireWrong?

Senior Data Engineer - all genders - Google Cloud Platform

WhyHireWrong?AnywhereFull-time
View Job
Compensation$90K - 140K a year

Design and implement scalable data pipelines and collaborate with architects and managers to improve processes. | Bachelor's degree with proven data engineering experience, strong Python skills, and familiarity with GCP big data services. |

The Role
This is a hands-on data engineering position embedded in a product-focused environment. The work spans the full data lifecycle: gathering requirements from stakeholders, designing technical solutions, and shipping reliable, scalable pipelines. Expect close collaboration with data architects, asset managers, and product managers. This role sits at the intersection of engineering and business outcomes.

What the Work Looks Like Day to Day
Translate product and business requirements into technical data solutions
Design and implement data pipelines and capabilities across product offerings
Work with data architects to ensure solutions are aligned with the broader technical strategy
Identify gaps in internal processes and lead improvements
Write clean, reusable code that adheres to established engineering standards
Communicate technical decisions clearly to both technical and non-technical audiences

Technical Stack
The primary environment is Google Cloud Platform. Day-to-day tooling includes:
BigQuery for data analysis and processing
Cloud Composer / Airflow for workflow orchestration
Dataproc / PySpark for large-scale data processing
Vertex AI for machine learning-adjacent workloads
Cloud Spanner and Cloud Run for additional platform needs
Python as the primary programming language, with GitHub Copilot integrated into the workflow

What's Required
Degree in Computer Science, Engineering, or a related field, or equivalent demonstrated experience
Proven background in data engineering and architecture, including ownership of strategic technical initiatives
Strong Python skills with hands-on experience across the GCP services listed above
Familiarity with PySpark and big data processing patterns
Ability to explain complex technical concepts to varied audiences
Comfortable working in fast-moving environments where priorities shift

What Sets a Strong Candidate Apart
A track record of not just building pipelines, but improving how a team builds them: process thinking alongside technical depth. There is no need to be an expert in every tool listed; intellectual curiosity and a structured approach to learning matter more than a perfect checklist match.
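The stack above names Cloud Composer / Airflow for workflow orchestration. The core idea — declaring tasks with upstream dependencies and running them in dependency order — can be sketched without the Airflow API, using the standard library's `graphlib`; task names here are illustrative, not a real DAG from this role:

```python
# Sketch of dependency-ordered task execution, the concept behind a
# Composer/Airflow DAG. Task names are illustrative placeholders.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must complete before it runs,
# like setting upstream dependencies between Airflow operators.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields tasks so every dependency precedes its dependents.
order = list(TopologicalSorter(dag).static_order())
```

For this linear chain the order is deterministic: extract, transform, load, report. A real Composer DAG adds scheduling, retries, and operators on top of this same dependency model.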

Python
Data Engineering
Machine Learning
Direct Apply
Posted about 23 hours ago
CI

Senior Data Scientist

CiklumAnywhereFull-time
View Job
Compensation$90K - 130K a year

Perform statistical analysis and develop ML models for healthcare risk stratification and integration of LLMs in AI solutions. | Strong Python skills, experience with ML frameworks like PyTorch and TensorFlow, NLP expertise, and hands-on LLM development experience required. |

Ciklum is looking for a Senior Data Scientist to join our team full-time in the US. We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role:
As a Senior Data Scientist, you will become part of a cross-functional development team engineering the experiences of tomorrow. The client is building an Agentic AI health platform focused on risk stratification for chronic diseases, creating a value-based PMPM model to increase longevity by attacking these conditions. As their main goal is lifestyle-based pathways, they are starting with weight-loss management and diabetes. The need is to establish a US-based team that can join customer meetings to lay out data analytics, population health risk stratification, and related strategy.

Responsibilities:
* Serve as a senior health informatics data scientist: deeply technical in statistical analysis, data science, and ML algorithms, with hands-on Python experience building data models for risk stratification and risk-tier migration in US healthcare, and familiarity with reimbursement codes for Medicare, Medicaid, etc.
* Act as the health informatics leader across data analytics, architecture, population health, risk stratification, and risk modeling (part-time is acceptable if full-time is not available); this is the key person on the engagement and needs to be present in customer meetings
* Collaborate with engineers, data scientists, and BAs to understand requirements, refine models, and integrate LLMs into AI solutions
* Embed generative AI solutions into consolidation, reconciliation, and reporting processes
* Develop and implement deep learning algorithms for AI solutions
* Stay updated on recent trends in GenAI and apply the latest research and techniques
* Preprocess raw data, including text normalization, tokenization, and other techniques, to make it suitable for use with NLP models
* Set up and train LLMs and other state-of-the-art neural networks
* Conduct thorough testing and validation to ensure accuracy and reliability of model implementations
* Perform statistical analysis of results and optimize model performance for various computational environments, including cloud and edge computing platforms
* Perform model audits to identify and mitigate risks
* Monitor and optimize generative models for performance and scalability

Requirements:
* Solid understanding of object-oriented design patterns, concurrency/multithreading, and scalable AI and GenAI model deployment
* Strong programming skills in Python, PyTorch, TensorFlow, and related libraries
* Proficiency in RegEx, spaCy, NLTK, and NLP techniques for text representation and semantic extraction
* Hands-on experience in developing, training, and fine-tuning LLMs and AI models
* Practical understanding and experience in implementing techniques like CNNs, RNNs, GANs, RAG, LangChain, and Transformers
* Expertise in prompt engineering techniques and various vector databases
* Familiarity with the Azure cloud computing platform
* Experience with Docker, Kubernetes, and CI/CD pipelines
* Experience with deep learning, computer vision, CNNs, RNNs, and LSTMs
* Experience with vector databases (Milvus, Postgres, etc.) and database technologies

What's in it for you?
* Strong community: Work alongside top professionals in a friendly, open-door environment
* Growth focus: Take on large-scale projects with a global impact and expand your expertise
* Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications
* Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies
* Care: Healthcare, basic life insurance, and short- and long-term disability insurance according to the Company's benefit plans

About us:
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress. In the US, Ciklum is growing fast, inviting experienced professionals to lead digital transformation alongside Fortune 500 clients. Be part of a company where innovation and impact go hand in hand. Want to learn more about us? Follow us on Instagram [https://www.instagram.com/ciklum/], Facebook [https://www.facebook.com/Ciklum/], LinkedIn [https://www.linkedin.com/company/ciklum/]. Explore, empower, engineer with Ciklum! Interested already? We would love to get to know you! Submit your application. We can't wait to see you at Ciklum.
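The Ciklum posting above lists text normalization and tokenization as preprocessing steps before data reaches NLP models. A minimal, hedged sketch of those two steps in plain Python — standing in for the spaCy/NLTK tooling the posting actually names, with the regexes as simplifying assumptions:

```python
# Minimal text preprocessing sketch: normalization then tokenization.
# Real pipelines (spaCy, NLTK) handle far more cases; this illustrates
# the shape of the step, not production behavior.
import re


def normalize(text: str) -> str:
    """Lowercase, replace punctuation with spaces, collapse whitespace."""
    text = text.lower()
    text = re.sub(r"[^\w\s]", " ", text)   # strip non-word punctuation
    return re.sub(r"\s+", " ", text).strip()


def tokenize(text: str) -> list:
    """Whitespace tokenization over the normalized text."""
    return normalize(text).split()
```

For example, `tokenize("Risk-Tier: HIGH!")` returns `["risk", "tier", "high"]`; a production pipeline would additionally handle stemming/lemmatization, stop words, and model-specific subword tokenization.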

Python
PyTorch
TensorFlow
NLP
Machine Learning
Data Science
Direct Apply
Posted 1 day ago
Qode

Sr. Data Engineer

QodeAnywhereFull-time
View Job
Compensation$70K - 120K a year

Design and optimize scalable batch and streaming data pipelines using AWS and PySpark. | Strong hands-on experience with Python, PySpark, AWS Glue, Lambda, and the Kafka ecosystem, with data quality framework expertise required. |

Job Title: Sr. Data Engineer (Mid–Senior Level) – AWS & Streaming
Experience Level: 13–15+ years
Location: Fort Mill, SC (3 days hybrid)

Role Summary:
We are seeking a Mid–Senior Data Engineer with strong expertise in AWS-based data engineering, real-time streaming technologies, and enterprise-grade data quality frameworks. The ideal candidate will design, build, and optimize scalable batch and streaming data pipelines, implement robust data validation and monitoring processes, and support mission-critical analytics platforms.

Key Responsibilities:
Develop and maintain scalable ETL/ELT pipelines using AWS Glue, PySpark, and Python
Build event-driven workflows using AWS Lambda
Design and manage real-time streaming solutions using Kafka, KSQL, and Apache Flink
Implement and enforce comprehensive data quality frameworks, including validation, profiling, monitoring, and reconciliation
Optimize data processing performance, scalability, reliability, and cost in cloud environments
Collaborate with cross-functional teams to deliver reliable, production-grade data platforms and ensure data integrity across the pipeline

Must-Have Skills:
Strong hands-on experience with Python and PySpark
Proven expertise in AWS Glue, Lambda, and other cloud-native data services
Solid experience with the Kafka ecosystem (topics, partitions, consumer groups, streaming patterns)
Demonstrated experience building and supporting data quality frameworks (validation rules, reconciliation checks, profiling, anomaly detection)
Strong understanding of distributed data processing and scalable architecture patterns

Good-to-Have Skills:
Experience with Apache Flink for real-time stream processing and stateful computations
Knowledge of KSQL or other streaming SQL engines
Exposure to CI/CD pipelines, IaC (Terraform/CloudFormation), and DevOps practices
Familiarity with data lake/lakehouse architectures and table formats such as Iceberg, Delta, or Hudi
Experience working in enterprise or financial data environments
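The posting asks for data quality frameworks built around validation rules and reconciliation checks. A minimal sketch of the rule-evaluation pattern, in plain Python rather than the Glue/PySpark stack the role actually uses (rule names and data are purely illustrative):

```python
# Sketch of a validation-rule framework: each rule is a predicate over a
# row; failures are counted per rule so a batch can be profiled.
# Rule names and fields are illustrative, not from any real pipeline.
from typing import Callable, Dict, List

Rule = Callable[[dict], bool]


def run_rules(rows: List[dict], rules: Dict[str, Rule]) -> Dict[str, int]:
    """Evaluate every rule against every row; return failure counts."""
    failures = {name: 0 for name in rules}
    for row in rows:
        for name, rule in rules.items():
            if not rule(row):
                failures[name] += 1
    return failures


rules: Dict[str, Rule] = {
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
    "id_present": lambda r: r.get("id") is not None,
}
```

For example, running `run_rules` over `[{"id": 1, "amount": 10}, {"id": None, "amount": -5}]` reports one failure per rule. In a Glue/PySpark setting the same idea typically becomes column expressions evaluated over a DataFrame, with failure counts feeding monitoring and reconciliation dashboards.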

Python
PySpark
AWS Glue
AWS Lambda
Kafka
Data Quality Frameworks
Direct Apply
Posted 1 day ago
Showing 1-10 of 82,609 jobs

Ready to have AI work for you in your job search?

Sign up for free and start using JobLogr today!
