Advanced vision and sensing solutions for mobile robots
ams OSRAM offers innovative vision and sensing solutions, including VCSEL-based illuminators, global shutter image sensors, and multi-zone time-of-flight modules, enhancing the precision, efficiency, and safety of mobile robots in collaborative environments.
Mobile robots - automated guided vehicles (AGVs)
Mobile robots, also known as automated guided vehicles (AGVs) or autonomous mobile robots (AMRs), are increasingly popular in automation thanks to vision-based technologies that enable reliable 3D imaging for object recognition and collision avoidance. These technologies use specially designed cameras, either with stereovision (spatially separated image sensors with synchronized illumination) or structured light (a single sensor with a laser dot projection module). Combining both is known as "active stereovision."
We offer a range of 3D imaging components, including VCSEL-based flood illuminators, dot-pattern projectors, and image sensors for the near-infrared or visible light spectrum. Our integrated VCSEL illuminator and projection modules provide high optical and power efficiency. Our small global shutter image sensors enable compact camera designs with excellent optical performance and energy efficiency. For scenarios where camera-based solutions are impractical, our multi-zone direct time-of-flight 3D scanning modules offer a scalable alternative, providing pre-processed 3D data for navigation.
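As a brief illustration of the stereovision principle mentioned above: with two spatially separated sensors, depth follows from triangulation as Z = f·B/d, where f is the focal length in pixels, B the baseline between the sensors, and d the measured disparity. The sketch below is a minimal, generic example of this conversion; the numbers are hypothetical and not tied to any ams OSRAM component.

```python
# Minimal stereo depth sketch: Z = f * B / d (pinhole camera model).
# f: focal length in pixels, B: baseline in meters, d: disparity in pixels.
# All values here are illustrative, not specs of any particular sensor.
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (meters)."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)   # zero disparity -> point at infinity
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# Example: 2 px disparity, 500 px focal length, 10 cm baseline -> 25 m depth
depth_map = disparity_to_depth([[2.0, 0.0]], focal_px=500.0, baseline_m=0.10)
```

The same relation explains why a wider baseline or higher-resolution sensor improves depth accuracy at long range: both increase the disparity measured for a given distance.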
Enhanced human-machine interfaces can further benefit mobile robots in collaborative environments. Our innovative projection solutions improve interaction between robots and humans, reducing collisions and minimizing 'safe stops.'
The following mobile robot application block diagram illustrates the key functions needed to build a mobile robot system, including obstacle detection, the human-machine interface, traction motor control, and battery management:
Live Webinar: High-Resolution Multi-Zone Sensing with the TMF8829 dToF Sensor
Discover how precise 3D depth sensing can improve presence detection and spatial awareness in real-world designs, from robotics and logistics to mobile, wearables and other smart appliances.
What you’ll gain:
- A clear introduction to direct Time-of-Flight sensing technology
- A deep dive into multi-zone dToF 3D depth sensing with insight into the TMF8829 sensor: up to 48×32 pixels, 11 m range, and 80° field of view
- Understanding of on-chip processing for distance, confidence, and ambient light data
- Examples of multi-object detection and AI-ready depth data in practice
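To make the multi-zone output concrete: a sensor of this class reports a grid of per-zone distance and confidence values (here 48×32, per the figures above). The sketch below shows one plausible way an application might filter such a frame to find the nearest trusted obstacle. The function name, data layout, and confidence threshold are illustrative assumptions, not the actual TMF8829 driver API.

```python
# Hypothetical multi-zone dToF frame processing: a 48x32 grid of
# per-zone distance (mm) and confidence values, as a multi-zone sensor
# might report after on-chip processing. Names and thresholds are
# illustrative only, not the real TMF8829 interface.
import numpy as np

def nearest_obstacle(distance_mm, confidence, min_conf=128, max_range_mm=11000):
    """Return (row, col, distance_mm) of the closest trusted zone, or None."""
    dist = np.asarray(distance_mm, dtype=float)
    conf = np.asarray(confidence)
    # Keep only zones with sufficient confidence and an in-range reading
    valid = (conf >= min_conf) & (dist > 0) & (dist <= max_range_mm)
    if not valid.any():
        return None
    masked = np.where(valid, dist, np.inf)
    row, col = np.unravel_index(np.argmin(masked), masked.shape)
    return row, col, dist[row, col]

# Example frame: everything at 5 m except one zone at 1.2 m
dist = np.full((32, 48), 5000.0)
dist[3, 7] = 1200.0
conf = np.full((32, 48), 200)
hit = nearest_obstacle(dist, conf)   # -> (3, 7, 1200.0)
```

In a mobile robot, a result like this would typically feed a collision-avoidance layer that slows or reroutes the vehicle when the nearest zone falls below a safety distance.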
Where it’s used:
- Depth-based object detection and scene mapping
- Gesture control for headphones and wearables
- Smart appliances such as lawn mowers or coffee machines
- Industrial robotics, logistics, and camera autofocus
Subscribers will get access to additional technical resources. Join our expert David Smith in our live webinar on January 14, 2026.