About
Getting robots to reliably interact with tools, materials, and unpredictable environments is still unsolved. This role leads that effort.
This is an early-stage robotics company building the intelligence layer for industrial machines – forklifts, cranes, excavators, and other heavy equipment. The approach is pragmatic: start with remote teleoperation to generate real-world operator data at scale, then train Vision-Language-Action models toward increasing autonomy.
You’ll own manipulation across learning and control, turning real-world data into systems that execute tasks reliably on physical machines.
What you’ll do
- Set the direction for manipulation autonomy across task execution and tool interaction
- Build model and data strategies for learning from teleoperation and field data
- Develop transformer-based or VLA-style systems for embodied decision-making
- Improve training pipelines, evaluation loops, and failure analysis for manipulation tasks
- Work closely with robotics engineers to turn model outputs into real machine behaviour
- Help define the team’s technical bar, roadmap, and research priorities
- Drive experiments from hypothesis to on-machine validation
What you’ll need
- Deep hands-on experience in robotics autonomy, embodied AI, or manipulation
- Strong background in modern ML methods, ideally transformers or multimodal models
- Experience building data pipelines for training and improving robot behaviour
- Strong Python and production-quality engineering habits
- Ability to move between model design, evaluation, and real-world deployment constraints
- Staff-level judgment and comfort setting direction in an early-stage team
Bonus
- Experience with imitation learning, RL, or VLA systems
- Background in industrial machines, mobile manipulation, or field robotics
Shortlisted candidates will be contacted within 48 hours.