Embedded vision

Drones, self-driving cars, cleaning robots – what do these increasingly familiar products have in common? All of them have eyes. Equipped with cameras and sensors, they analyze images and location information and take the appropriate action in the right place.

The concept of a camera-equipped machine supporting humans is not new. In some areas it was a necessity, and an obvious one, from the very beginning. For example, the technology has long been used in factories to provide imaging-based automatic inspection. Machine vision, the technology and methods used for inspection and location detection, is used in many fields including semiconductor manufacturing, automotive, healthcare and logistics.

Traditionally, these cameras and sensors were mostly used in factories for industrial purposes, but thanks to the emergence of powerful, low-cost and energy-efficient processors, it has become possible to incorporate machine vision capabilities into embedded systems for commercial as well as household use – the IIoT Times calls this ‘Embedded Vision’.

Expanding applications of embedded vision

Embedded vision enables drones, self-driving cars and cleaning robots to understand their environment through visual means. A more recent example that made the news was the launch of the first Amazon Go store in Seattle in January 2018. Multiple cameras in the ceiling track the movement of customers, and the store shelves have weight sensors that detect which item(s) a customer picks up. The tracking system, supported by a deep learning algorithm, lets customers purchase products without being checked out by a cashier or using a self-checkout station.
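Amazon has not published how its system works, but the basic idea of fusing shelf weight-sensor events with camera-based customer tracking can be illustrated with a minimal sketch. All class names, fields and the attribution rule below are assumptions for illustration, not Amazon's actual design.

```python
# Hypothetical sketch: fusing weight-sensor events with camera-based
# customer tracking to maintain a "virtual cart" per shopper.
from dataclasses import dataclass, field


@dataclass
class ShelfEvent:
    shelf_id: str
    product_id: str
    weight_delta_g: float  # negative = an item was removed from the shelf
    timestamp: float


@dataclass
class CustomerTrack:
    customer_id: str
    nearby_shelves: set[str] = field(default_factory=set)  # from ceiling cameras
    cart: list[str] = field(default_factory=list)


def attribute_event(event: ShelfEvent, tracks: list[CustomerTrack]) -> None:
    """Attribute a pick-up event to the customer the vision system places
    in front of that shelf at the time of the event."""
    candidates = [t for t in tracks if event.shelf_id in t.nearby_shelves]
    if len(candidates) == 1 and event.weight_delta_g < 0:
        candidates[0].cart.append(event.product_id)
    # Ambiguous cases (several customers near one shelf) would be resolved
    # by the deep learning tracker, e.g. by re-identifying hands or poses.
```

On exit, each customer's `cart` would simply be charged to their account, which is what removes the need for a checkout station.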

Bloomberg reported that Amazon plans to open as many as 3,000 new Amazon Go stores by 2021.

Embedded vision has also had an impact in healthcare. A fundus examination, which used to be available only in large hospitals such as university hospitals, is now available in local clinics thanks to improved device capabilities and lower costs. In the United States, more and more drug stores offer fundus examinations, and it is expected that the examination will eventually be done at home. Embedded vision is also used in packaging medicine. In most drug stores today, pharmacists manually scan barcodes to select the type and amount of medicine for each patient, or use a camera to verify that no mistakes have been made. It is expected that automatic pill packaging machines, currently used mainly in university hospitals, will become more widely used in the future.
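As an illustration of the camera check described above, the sketch below decodes the barcodes visible in a photo of a dispensed tray and compares them with a prescription record. It assumes the open-source OpenCV and pyzbar libraries; the prescription lookup, patient IDs and file names are placeholders.

```python
# Minimal sketch of camera-based dispensing verification.
import cv2
from pyzbar.pyzbar import decode

# Hypothetical prescription record: patient -> expected product barcodes
prescription = {"patient_042": {"4987123456789", "4987987654321"}}


def verify_dispensed(image_path: str, patient_id: str) -> bool:
    """Decode every barcode visible in a photo of the dispensed items and
    check that each one belongs to the patient's prescription."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    codes = {item.data.decode("utf-8") for item in decode(gray)}
    expected = prescription[patient_id]
    return bool(codes) and codes <= expected  # no missing scan, no unexpected item


# Example: verify_dispensed("tray_photo.jpg", "patient_042")
```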

In Japan, brands of rice are ranked based on quality. The evaluation is conducted by an official agency, but research is underway to develop a smaller evaluation machine that allows individual farmers to check the quality of their rice on site. This has become possible because the cameras and image processing systems used for industrial applications have become smaller, cheaper and more energy efficient. AI and deep learning are expected to further enhance the quality of these machines.
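The kind of measurement such an on-site grader might make can be sketched with basic image processing: segment individual grains and compute simple size and whiteness statistics. The thresholds and criteria below are invented placeholders, not the official grading standard.

```python
# Illustrative sketch only: a simplified grain-quality check using OpenCV.
import cv2
import numpy as np


def grade_grains(image_path: str, min_area: int = 150) -> dict:
    """Segment individual grains on a dark background and report the share
    of undersized (broken) and overly white (chalky) grains."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    broken = chalky = 0
    for c in contours:
        if cv2.contourArea(c) < min_area:
            broken += 1
            continue
        grain_mask = np.zeros_like(gray)
        cv2.drawContours(grain_mask, [c], -1, 255, thickness=-1)
        if cv2.mean(gray, mask=grain_mask)[0] > 200:  # placeholder whiteness cutoff
            chalky += 1
    total = max(len(contours), 1)
    return {"grains": total,
            "broken_pct": 100 * broken / total,
            "chalky_pct": 100 * chalky / total}
```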

The future of embedded vision

Embedded vision is already changing the way we live and work. For example, Walmart uses robots to check whether its shelves are running out of stock. Equipped with 2D and 3D cameras, these robots photograph the store shelves as they move around and identify which shelves need replenishing, taking over a task that humans don’t “enjoy” doing and allowing staff to spend more time serving customers face-to-face.
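Walmart's shelf-scanning system is proprietary, but one simple way a robot could flag empty facings is to compare the current shelf photo with a reference image of the fully stocked shelf over known product regions. The planogram coordinates, file names and threshold below are hypothetical.

```python
# Rough sketch of gap detection against a fully stocked reference image.
import cv2

# Hypothetical planogram: product -> (x, y, w, h) region in the shelf image
planogram = {"cereal_a": (40, 60, 120, 200), "cereal_b": (180, 60, 120, 200)}


def find_gaps(current_path: str, stocked_path: str, threshold: float = 40.0):
    """Return products whose shelf region differs strongly from the fully
    stocked reference image, i.e. likely out-of-stock facings."""
    current = cv2.imread(current_path, cv2.IMREAD_GRAYSCALE)
    stocked = cv2.imread(stocked_path, cv2.IMREAD_GRAYSCALE)
    diff = cv2.absdiff(current, stocked)  # assumes aligned, same-size images
    gaps = []
    for product, (x, y, w, h) in planogram.items():
        if diff[y:y + h, x:x + w].mean() > threshold:
            gaps.append(product)
    return gaps


# Example: find_gaps("aisle3_now.jpg", "aisle3_stocked.jpg")
```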

If you are looking for an apartment to rent, the normal practice is to study the floor plan first and then visit the apartment in person. Thanks to 3D cameras and scanning technology, you can now tour an apartment virtually without visiting the site.

At DHL, picking staff are equipped with wearable Augmented Reality (AR) devices that provide digital navigation to the right route and item, improving efficiency and reducing training time. Google has just announced the launch of Google Glass Enterprise Edition 2, designed specifically for businesses. And users of the Vuzix Blade, the world’s first consumer AR smart glasses with Amazon Alexa built in, can see information projected on a display inside the lens, read text messages, see incoming calls and more.

As technological innovations in cameras, sensors and IC chips continue, embedded vision systems will become even smaller, cheaper and lighter, further expanding their range of applications and transforming society.