Ultrasound is perhaps best known as the technology that lets us non-invasively scan bodies, communicate underwater, and park our cars. A young startup from Norway called Sonair wants to use it for something else: 3D computer vision for autonomous hardware applications.
Sonair's founder and chief executive Knut Sandven thinks that his company's application of ultrasound technology, which reads sound waves to detect people and things in 3D with minimal energy and computational requirements, can be the basis of more useful and considerably less expensive solutions than today's more standard approach using LIDAR.
Sonair has now raised $6 million from early-stage specialists Skyfall and RunwayFBU, and is opening up early access to its tech. Initially, this will be with groups developing autonomous mobile robots, but the vision (heh) is to see it being used in other applications.
"Our go-to-market strategy is to start with robots, specifically autonomous mobile robots moving stuff from A to B," said Sandven. "We are starting indoors, a limitation to give us focus, but we will, of course, expand into other robotic categories and automotive in the long term."
The name Sonair is a play on sonar: the 3D sensing capabilities of sonar in water, applied instead to sensors reading signals in the air, which is a neat summary of what the startup has produced.
Sandven, an engineer and entrepreneur, had an earlier company, GasSecure, which manufactured gas sensors built on MEMS technology: devices that combine mechanical and electronic elements at micro scale. Those sensors found a natural market in the petrochemical sector, a huge segment of Norway's national economy.
After GasSecure was acquired by a German industrial safety specialist, Sandven began thinking of other uses for MEMS and sought out research coming out of SINTEF, a group that works with Norway's top science and technology university to take research to market. Dozens of companies have spun out of the group's work over the years.
He explained that SINTEF "had developed a novel MEMS-related ultrasonic sensor, ready for the market." Sandven acquired the IP for the technology and recruited the researchers who developed it, and Sonair was born.
In recent years, LIDAR has been part and parcel of the development of autonomous systems, but there is still room for complementary or alternative approaches. LIDAR remains costly, suffers from range issues, and raises concerns about interference from light sources and from some surfaces and materials.
While companies like Mobileye are focused on other alternatives such as radar, Sandven believes Sonair has a realistic shot, as its technology cuts the overall cost of a sensor package by 50% to 80%.
"Our mission is to replace LIDAR," he said.
Sonair's ultrasound sensors, and the software it has developed to "read" their data, do not work in a vacuum. Like LIDAR, they work in concert with cameras to triangulate and provide a more accurate picture to the autonomous system in question.
Sonair's ultrasound tech is based on a beamforming method, which is also used in radar. The firm collects the sensor data and combines it with AI and object-recognition algorithms to produce spatial information from sound waves. The first hardware to use the technology is mobile robots, which get a 180-degree field of view with a range of five meters and can potentially get by with fewer sensors, addressing some of LIDAR's shortcomings.

Other interesting ideas are still to be explored. The company's focus at present is on improving how well autonomous systems can perceive objects in front of them, but the small sensor also has potential in other form factors: it could complement wearables, or even serve as an alternative to pressure-based haptic feedback.
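Sonair has not published its implementation, but the general idea behind acoustic beamforming can be illustrated with a classic delay-and-sum approach: echoes recorded by an array of receivers are time-shifted to align a candidate arrival direction, and the direction that sums most coherently points toward the object. The Python sketch below is a generic, hedged illustration only; the array geometry, sample rate, and function names are assumptions, not details of Sonair's design.

```python
# Minimal delay-and-sum beamforming sketch (not Sonair's actual code):
# given recordings from a small array of ultrasonic receivers, estimate
# which direction an echo came from by compensating each channel's
# arrival delay and summing the aligned signals.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def delay_and_sum(signals, mic_positions, angles_deg, fs):
    """signals: (n_mics, n_samples) array of received waveforms.
    mic_positions: (n_mics,) x-coordinates of a linear array, in meters.
    angles_deg: candidate arrival angles to test (0 = broadside).
    fs: sample rate in Hz.
    Returns the beamformed energy for each candidate angle."""
    n_mics, n_samples = signals.shape
    energies = []
    for angle in np.deg2rad(angles_deg):
        # A wavefront arriving from `angle` reaches each receiver at a
        # slightly different time; compute that delay in samples.
        delays = mic_positions * np.sin(angle) / SPEED_OF_SOUND
        delay_samples = np.round(delays * fs).astype(int)
        # Shift each channel to undo its delay, then sum coherently.
        aligned = np.zeros(n_samples)
        for ch in range(n_mics):
            aligned += np.roll(signals[ch], -delay_samples[ch])
        energies.append(np.sum(aligned ** 2))
    return np.array(energies)

# The angle with the highest summed energy is the most likely direction
# of the reflecting object; range then follows from the echo's
# round-trip time (distance = time * SPEED_OF_SOUND / 2).
```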
"What's being done today by other companies is [focused on] touch sensors," Sandven said.
"After you touch something, the device will measure the pressure or how hard or soft you're touching. Where we come in is the moment before you touch. So if you have a hand approaching something, you can respond already with our technology. You can move very fast towards the objects.". But then you get precision distance measurements, so you can be very soft in the touch. It's not what we're doing right now, but it's something we could do." Sagar Chandna from RunwayFBU projects that 2024 will see 200,000 autonomous mobile robots produced, working out to a market of $1.4 billion. That gives Sonair an immediate market opportunity as a less-expensive alternative for computer vision components.
Sagar Chandna of RunwayFBU projects that 200,000 autonomous mobile robots will be produced in 2024, working out to a market of $1.4 billion. That gives Sonair an immediate market opportunity as a less expensive alternative for computer vision components.

Cheaper sensor technology will bring down the price of advances in perception and decision-making, with the benefits trickling down to industries from manufacturing to healthcare.