Last week at a media event at its Endeavor headquarters in Silicon Valley, NVIDIA announced the general availability of its Jetson AGX Xavier
module specifically designed for autonomous robots. While there are many GPUs available today, NVIDIA's approach is unique in that it's delivering a complete system to companies that want to build self-driving robots. The Jetson AGX Xavier module is basically a turnkey artificial intelligence (AI) computer that has been optimized for this particular use case.
The module offers server-class AI performance, combining a 512-core Volta GPU with Tensor Cores, dual NVIDIA deep learning accelerators, and an eight-core CPU to deliver 32 trillion operations per second (TOPS). The module is also very power efficient: depending on configuration, it can consume as little as 10W, roughly what a standard clock radio draws. It supports applications developed with NVIDIA's JetPack and DeepStream software.
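To put those two headline numbers in perspective, a quick back-of-the-envelope calculation (a sketch only; the 32 TOPS and 10W figures come from NVIDIA's announcement, and real-world throughput depends on workload and power mode):

```python
# Rough efficiency estimate for Jetson AGX Xavier, using the
# headline figures from NVIDIA's announcement (peak numbers).
peak_tops = 32        # trillion operations per second (peak)
min_power_watts = 10  # lowest advertised power configuration

efficiency = peak_tops / min_power_watts  # TOPS per watt
print(f"{efficiency:.1f} TOPS/W")  # → 3.2 TOPS/W
```

That ratio, not raw TOPS, is what matters for a battery-powered drone or delivery robot.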
JetPack is NVIDIA's SDK specifically designed for autonomous machines. It includes support for AI, computer vision, multimedia, and other functions. The DeepStream SDK is used for streaming analytics in initiatives such as smart cities. Developers can use DeepStream to create multi-camera, multi-sensor applications that detect and identify objects such as cars, people, and cyclists.
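Conceptually, a DeepStream application is a GStreamer pipeline: several camera streams are batched, run through an inference engine, and the annotated results are tiled into a single view. The sketch below only assembles an illustrative gst-launch-style pipeline string to show that flow; the element names (nvstreammux, nvinfer, nvmultistreamtiler, nvdsosd) are real DeepStream plugins, but the camera URIs and detector config path are hypothetical placeholders, not a working configuration.

```python
def deepstream_pipeline(camera_uris, config="detector_config.txt"):
    """Compose an illustrative gst-launch-style DeepStream pipeline
    that batches multiple camera streams, runs a detector on the
    batch, and tiles the annotated frames into one display."""
    # One decoded source per camera, all feeding the stream muxer.
    sources = " ".join(
        f"uridecodebin uri={uri} ! mux.sink_{i}"
        for i, uri in enumerate(camera_uris)
    )
    return (
        f"{sources} "
        f"nvstreammux name=mux batch-size={len(camera_uris)} ! "  # batch frames
        f"nvinfer config-file-path={config} ! "                   # run detector
        f"nvmultistreamtiler ! "                                  # tile per-camera views
        f"nvdsosd ! nveglglessink"                                # draw boxes, display
    )

# Hypothetical camera feeds, for illustration only.
cams = ["rtsp://cam0/stream", "rtsp://cam1/stream"]
print(deepstream_pipeline(cams))
```

The key idea is the batching stage: one inference pass serves every camera, which is what makes multi-camera analytics feasible on a single low-power module.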
An alternative to using the Jetson module is to put a standard GPU-enabled server, such as a Cisco UCS or NVIDIA DGX, on the device and then configure it with the necessary software and hardware. But general-purpose servers are very power hungry and usually too big to mount inside a drone or other type of vehicle. They also aren't optimized for autonomous vehicle operations, so they can be under- or over-powered depending on the workload.
The value add that NVIDIA brings is that it’s a complete solution including silicon, hardware, software, developer tools, and community. The goal is to reduce the barrier to entry into autonomous operations so anything that can move, can do so without human intervention. This would include trains, drones, delivery vehicles, medical equipment, and anything else one can think of.
There is precedent for this approach: NVIDIA earlier rolled out its Drive AGX platform, which targets one specific use case, self-driving cars, within the broader category of autonomous machines. A good way to think about the difference is that Jetson AGX Xavier is a general-purpose version of Drive AGX. Over time, I envision NVIDIA rolling out a drone version or other high-volume use cases.
At its event, NVIDIA announced a number of companies that will be using Jetson AGX Xavier as part of their autonomous vehicle programs. These include Cainiao (shipping), Yamaha (drones), Komatsu (construction), JD.com (e-commerce), Fanuc (industrial), and Nanopore (healthcare). NVIDIA also has a number of ecosystem partners in areas such as video analytics, robotics, virtual reality, and sensors.
Countering the Heebie Jeebies
Unlike what's portrayed in science fiction, autonomous machines aren't dangerous, and they aren't here to kill us and take over the world. I understand the heebie jeebies, but they are here to assist humanity and make our lives better by doing things that we can't or don't want to do. In fact, many of us already use autonomous vehicles without really knowing it. For example, the train that takes people between the terminals and the rental car center at San Francisco airport has no driver, but people don't think twice about hopping on board. Many subway systems are now driverless and operate accident-free.
In the city of Dubai, the police force is using robots to patrol the streets. Through video and voice analytics, the robots can predict whether a situation might lead to a crime so officers can be dispatched. The World Expo comes to Dubai in 2020, bringing more than 20 million visitors; the police department doesn't have enough officers to keep the public safe everywhere, but it can with the help of drones and robots.
As another example, self-driving trucks can dramatically reduce accident rates on construction sites, since they can sense people and stop faster than a human driver can. Drones can photograph a pipeline under dangerous conditions instead of sending people. Portable medical imaging equipment can be built and transported to areas of the world with insufficient healthcare. Drones are even being used to feed crickets more efficiently than people can, creating another source of protein to help end world hunger.
All Coming Together
Autonomous machines have been the stuff of science fiction for decades, but we're now sitting on the precipice of an explosion in this area. I've always found robots interesting, as they require a number of technologies to come together. For example, the police robots the City of Dubai is building use GPUs, AI, sensors, sentiment analytics, omnichannel communications, natural language processing, video analytics, 4G and soon 5G connectivity, cloud computing, edge computing, facial recognition, and a whole lot more to make them work.
Anything vendors can do to simplify the development of autonomous machines will accelerate innovation and time to market. On TV and in the movies, robots like Nomad, Cylons, and Terminators come back and kill us. And sure, that might still happen, but it's not likely for thousands of years. In the meantime, we should use them to make our world safer, help overcome some of society's biggest challenges, and do things we can't or don't want to do.