$120,000 - $180,000 a year
At Godela, we're building the first Physics Foundation Model: an AI system that learns from simulation, experiment, and equations to instantly predict and simulate physical behavior. Our mission is to give every engineer an R&D lab at their fingertips, cutting months of simulation and experimentation down to minutes. We are looking for people who get excited about pushing the limits of science and engineering, and who want to create models that open up completely new ways to discover, design, and build.

We are seeking a Founding ML Engineer to help us build and scale the world's first Physics Foundation Model. This role will focus on developing and productionizing large-scale, physics-informed ML systems.

What you will be doing
- Developing scalable approaches to train high-accuracy, large-scale physics-informed models.
- Designing and testing new architectures for multi-physics and multi-scale modeling.
- Optimizing training efficiency through distributed computing, GPU optimization, and performance tuning.

Must-have criteria:
- Strong Python + PyTorch (or JAX/TF) skills.
- Proven experience delivering ML systems into production.
- Experience with multi-GPU / multi-node training.
- Experience training and deploying large-scale ML models with GPU acceleration and distributed workloads.
- Hands-on AWS experience (compute, storage, IAM, CUDA) or experience with another cloud infrastructure provider.
- Background in building or scaling training pipelines, APIs, or ML infrastructure that supports real-world products.

Nice-to-have criteria:
- Exposure to engineering simulations (CFD, FEM, PDEs) or physics-informed ML (PINNs, neural operators).
- Familiarity with Docker/Kubernetes, CI/CD, Slurm, or other workload managers.
- Experience with Graph Neural Networks, Physics-Informed Neural Networks, Neural Operators, and Transformer architectures.
- Experience working with messy, high-dimensional data.
This job posting was last updated on 9/29/2025