Bringing artificial intelligence (AI) to smart products

How Mira Family image sensors empower intelligent machine learning

Smart products with artificial intelligence (AI) are essential to our future lives – from consumer and industrial robots to augmented reality glasses, smart access to buildings and data, autonomous vehicles and more.

Imagine a last-mile delivery robot (AGV – automated guided vehicle) finding the quickest, most efficient route through the city without colliding with people or objects.
 

Where does the intelligence come in?


Before products like robots can become ‘intelligent’, they need a way to capture information about their continuously changing environment – sensors and cameras – plus fast and precise processing of the data they capture. This is much like how we use our eyes, ears and other senses to gather information before engaging our brains to determine the safest, quickest way to cross a busy street without colliding with anything or anyone.

In intelligent products, sensors capture environmental data and machine-learning algorithms process that data to analyze and predict the surroundings. This is true for applications as varied as AGVs reliably finding their way around a warehouse or along a street, and large industrial robots safely interacting with human workers – and much more.

Machine-learning algorithms are only as good as the data they receive from the sensors that capture their environment or human interactions. Machine-learning applications can only improve with use, and only if they receive enough of the right kinds of timely, accurate and relevant data to process.

This makes the Mira Family image sensors from ams OSRAM the perfect choice for intelligent designs and future product development. High frame rates and the ability to provide images with mono, color and even infrared data mean they can deliver the fast, high-quality data needed for superior performance. Image sensors from ams OSRAM in conjunction with AI processors make consumer and industrial products operate better, more efficiently and more safely – thanks to AI-enhanced imaging.
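
To make the capture-and-process loop described above concrete, here is a minimal Python sketch of how frames from an image sensor could be passed to a machine-learning step. The sensor stub and the detection function are hypothetical placeholders for illustration, not part of any ams OSRAM software.

    # Minimal capture-and-infer loop; SensorStub and detect_obstacle are hypothetical stand-ins.
    import numpy as np

    class SensorStub:
        """Stand-in for an image sensor driver that returns 8-bit mono frames."""
        def read_frame(self, height=800, width=600):
            return np.random.randint(0, 256, (height, width), dtype=np.uint8)

    def detect_obstacle(frame):
        """Placeholder for a machine-learning model; flags frames with large bright regions."""
        return (frame > 200).mean() > 0.05

    sensor = SensorStub()
    for _ in range(10):                       # a real robot would run this loop continuously
        frame = sensor.read_frame()
        if detect_obstacle(frame):
            print("Possible obstacle ahead, slowing down and replanning the route")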

The global shutter image sensor of the Mira Family advances intelligent 2D and 3D sensing with high quantum efficiency at visible and NIR wavelengths

ams OSRAM Mira Family image sensor


The Mira Family global shutter visible and near infrared (NIR) image sensors offer the low-power characteristics and small size required in the latest 2D and 3D sensing systems. These are in demand for augmented reality and virtual reality (AR/VR) products, in industrial applications such as drones, robots and automated vehicles, as well as in consumer devices such as smart door locks.

Growing demand in emerging markets for AR and VR equipment depends on manufacturers’ ability to make products like smart glasses smaller, lighter, less obtrusive and more comfortable to wear. This is where the Mira Family image sensors add value, reducing the size of the sensor itself and giving manufacturers the option to shrink the battery thanks to the sensor’s extremely low power consumption.

A further Mira Family innovation is the ability to process mono, color (RGB) and even color combined with infrared information (RGB-IR) in the same small space, adding a wealth of design flexibility – along with benefits for human sensing and recognition.

ams OSRAM uses back side illumination (BSI) technology in the Mira Family image sensors to implement a stacked chip design, with the sensor layer on top of the digital/readout layer. This allows it to produce the sensor in a chip-scale package; the Mira050, for example, measures just 2.3mm x 2.8mm. The ultra-small footprint gives manufacturers greater freedom to optimize the design of space-constrained products such as smart glasses and VR headsets.

Mira Family image sensors combine excellent optical performance with extremely low power consumption. The Mira220 image sensor consumes only 4mW in sleep mode, 40mW in idle mode and 350mW at full resolution and 90 fps. Mira Family image sensors also offer a high signal-to-noise ratio and high quantum efficiency, allowing device manufacturers to reduce the output power of the NIR illuminators used alongside the image sensor in 2D and 3D sensing systems, further reducing total power consumption.
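
As a rough illustration of what those mode-level figures mean for a system power budget, the short Python sketch below estimates average sensor power for an assumed duty cycle. The 5% active / 15% idle / 80% sleep split is purely a hypothetical example, not a characterized use case.

    # Average-power estimate based on the Mira220 figures quoted above.
    # The duty-cycle split is a hypothetical assumption for illustration.
    P_ACTIVE_MW = 350.0   # full resolution at 90 fps
    P_IDLE_MW = 40.0
    P_SLEEP_MW = 4.0

    def average_power_mw(active=0.05, idle=0.15, sleep=0.80):
        assert abs(active + idle + sleep - 1.0) < 1e-9, "duty-cycle fractions must sum to 1"
        return active * P_ACTIVE_MW + idle * P_IDLE_MW + sleep * P_SLEEP_MW

    print(f"Estimated average sensor power: {average_power_mw():.1f} mW")
    # 0.05 * 350 + 0.15 * 40 + 0.80 * 4 = 17.5 + 6.0 + 3.2 = 26.7 mW
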
3D sensing technologies such as structured light or active stereo vision require NIR image sensors, which enable functions like eye and hand tracking, object detection and depth mapping. In addition, the RGB version of the image sensor delivers color information that machine-learning algorithms can use to improve face recognition and object identification.
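
As a sketch of how depth data from NIR-based 3D sensing might be combined with the color information mentioned above before being handed to a machine-learning model, consider the following Python fragment. The resolution, the 5 m normalization range and the fused tensor layout are assumptions for illustration only.

    # Hypothetical fusion of an RGB frame and a depth map into one model input.
    import numpy as np

    H, W = 800, 600                                  # example resolution, not a Mira specification
    rgb = np.zeros((H, W, 3), dtype=np.float32)      # color frame, already normalized to [0, 1]
    depth = np.zeros((H, W, 1), dtype=np.float32)    # depth map in meters from structured light or stereo

    depth_norm = np.clip(depth / 5.0, 0.0, 1.0)      # assume a 5 m working range for normalization
    fused = np.concatenate([rgb, depth_norm], axis=-1)   # shape (H, W, 4): RGB plus a depth channel
    # 'fused' would then be fed to a face-recognition or object-identification model.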
 

A world of applications enabling a smart capture/process environment


A key use of the Mira Family image sensors combined with an AI processor running integrated machine-learning algorithms is to improve safety for humans when they interact or collaborate with intelligent products – especially free-moving robots. By using high-quality image sensors in conjunction with AI processors to recognize that a human is present and to interpret the shared environment, different operational rules can be applied to ensure safe interaction.
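
What such operational rules could look like in software is sketched below, assuming a hypothetical person-detection flag and distance estimate coming from the vision pipeline. The speed limits and zone thresholds are illustrative values only, not figures from any safety standard or ams OSRAM product.

    # Hypothetical selection of a robot speed limit from person detection and estimated distance.
    def select_speed_limit(person_detected: bool, distance_m: float) -> float:
        """Return the maximum permitted robot speed in m/s for the current scene."""
        if not person_detected:
            return 1.5        # full speed when no person is in view
        if distance_m < 0.5:
            return 0.0        # stop inside the close-proximity zone
        if distance_m < 2.0:
            return 0.3        # creep speed when a person is nearby
        return 0.8            # reduced speed whenever a person is visible

    print(select_speed_limit(person_detected=True, distance_m=1.2))   # -> 0.3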
 

ams OSRAM CMOS image sensors

The Mira Family of image sensors is well suited for authentication, scanning and capture in smart home appliances and mobile or wearable devices, as well as for the industrial demands of robotic vision. These sensors deliver the precision and quality required for AGVs, enabling object detection and avoidance so the vehicles can move autonomously even outside controlled environments.
 

Partner network offers customers turnkey solutions based on ams OSRAM technology


The ams OSRAM partner network is designed to give customers multiple options for procuring the support that best fits their business model when integrating advanced optical and sensor systems into their end-product designs.

Our preferred external module & solution providers, independent design houses and manufacturers of complementary offerings and recommended components can be found here: