Autonomous vehicles all need sensors that help them detect and avoid objects in their environment. And to operate at high speeds, these sophisticated machines need to identify objects ahead precisely and early enough to avoid crashing into them.
Existing systems have relied primarily on cameras and light detection and ranging (LiDAR) sensors. But the effectiveness of both can hinge on the weather: LiDAR and cameras typically don't "see" well in fog, dust or other inclement conditions. Some aren't capable of long-range sensing. Many are also clunky enough that they can't even be considered for use on drones.
Now, investors are betting $29 million that Echodyne and its lightweight radar systems will bring true autonomy to vehicles of every kind, starting with drones but eventually extending to cars, boats and mobile robots.
The new round of funding brings Echodyne's total capital raised to $44 million. New Enterprise Associates led the Series B investment in Echodyne, joined by Bill Gates, Madrona Venture Group, Vulcan Capital, Lux Capital, The Kresge Foundation and others.
Echodyne's radar system is compact and lightweight enough to be flown on board the kinds of drones commonly used for commercial purposes like inspecting power lines or surveying farmland. TechCrunch was on hand for a demonstration of the startup's "pocket-sized" radar sensors earlier this month, and you can read more here about how its radars work versus LiDAR and conventional radar systems.
Echodyne CEO Eben Frankenberg said that so far, his company has sold all of its production radars, which were built specifically for use in drones. He did not have permission to name the companies currently working to integrate the radars with their own software and hardware. Echodyne is still developing its radar systems for cars.
The company plans to use its funding for research and development, and to ramp up from production of hundreds of radars per year to thousands, the CEO said.
Much of Echodyne’s funding will go towards software that works with its radars, the CEO explained:
“Our hardware is more advanced than commercial radars that have existed. It really does operate like a phased array radar, the kind that would be in the nose cone of a fighter jet… But we are also building an entire platform around radar vision, the computer vision-like software that sits on top of it.”
The company’s radars create a point cloud and images that, as with computer vision, can be processed using neural nets or AI to classify and recognize what’s happening in any environment, the CEO said.
Featured Image: Bryce Durbin