NVIDIA’s DRIVE AGX AI platform for autonomous driving

Adopted by more than 450 companies, including Toyota, Isuzu, Subaru, Audi, and Daimler/Bosch

2018/10/24

Summary

Jensen Huang, CEO of NVIDIA, announces that computing performance has increased by 1,000 times over the past ten years (photographed at GTC Japan 2018)

 The GPU Technology Conference (GTC) Japan 2018, hosted by NVIDIA, was held in September 2018 in Tokyo. This report primarily focuses on the keynote speech given by Jensen Huang, CEO of NVIDIA, and the lecture entitled “FROM THE CAR TO THE CLOUD” given by Toru Baji, NVIDIA’s Technology Adviser and GPU Evangelist, as well as an overview of the “NVIDIA DRIVE AGX” AI platform for autonomous driving and its adoption status among various companies.

 DRIVE AGX is a platform comprising the "Xavier" processor for autonomous machines, an AI that recognizes and makes decisions about the external environment surrounding the car, and an AI that recognizes the vehicle’s internal environment and the driver’s status.

 However, the realization of autonomous driving requires support from a data center or a cloud server, which handles AI deep learning training and simulations.

 Currently, more than 450 companies, including automakers, suppliers, sensor companies, and map companies, have adopted NVIDIA DRIVE AGX, of which 55 have their head offices in Japan. This report also covers the recently announced adoption by Toyota, Subaru, Isuzu, Audi, Daimler/Bosch, and Yamaha Motor.


Related report:
NVIDIA: The rapid evolution of Deep Learning and its applications (August 2018)



Announcement of the NVIDIA DRIVE AGX for autonomous driving systems

 At the beginning of his keynote address at GTC Japan 2018, Jensen Huang, CEO of NVIDIA, stated that accelerated computing performance has increased by 1,000 times over ten years in many critical areas. Thanks to the introduction of NVIDIA’s GPUs, deep learning has developed rapidly, causing AI performance to skyrocket. Furthermore, NVIDIA is researching computer technology and innovating across architecture, systems, data centers, system software, algorithms, and applications. These elements are all integrated to work together like an orchestra. NVIDIA designs hardware and software simultaneously, with half of its engineers engaged in software development.

Xavier
The in-vehicle Xavier SOC that controls autonomous driving (Source: NVIDIA; the same applies to the images below, with some exceptions)

 Next, Huang introduced Xavier and the new NVIDIA DRIVE AGX system line. Xavier is the world’s first processor developed for autonomous machines and the most complex system-on-a-chip (SOC) ever manufactured, handling 100 trillion or more operations per second with only 30 W of power. Some 8,000 man-years of engineering effort were invested in its development. NVIDIA has also taken numerous measures for functional safety based on the Dual Execution (redundancy) concept.

 The NVIDIA AGX product family is a new, unified brand that includes existing systems and is the world’s first family of systems for autonomous machines, which require maximum computational power for AI at the edge.

   There are three systems:

  • NVIDIA DRIVE AGX for autonomous driving
  • NVIDIA Jetson AGX for various robots
  • NVIDIA Clara AGX for medical imaging

   DRIVE and Jetson are combined with a simulator as a set:

  • DRIVE Constellation and DRIVE Sim are for autonomous driving
  • Jetson Isaac is designed for various robot applications


NVIDIA DRIVE’s composition

DRIVE AGX’s composition (created from NVIDIA materials)

  NVIDIA DRIVE is an open platform, built with the highest levels of functional safety technology and methodology, that will bring autonomous driving vehicles to the transportation and transport industries. The system is scalable to the level of the technology to which it is applied, and can handle multiple deep learning models and algorithms simultaneously.

 The image below shows the composition of NVIDIA’s end-to-end DRIVE platform. Autonomous driving vehicles run DRIVE OS. In addition to the Xavier (SOC), the on-board system includes DRIVE AV (the AI that handles external environment recognition, perception, self-location estimation, and route creation) and DRIVE IX (the AI that handles recognition of the vehicle’s internal environment and the driver’s status), as well as client applications.

 The data center side of the system features the deep-learning supercomputer NVIDIA DGX/NGC (NVIDIA GPU Cloud), the Deep Learning Framework, AI TRAINING, and Drive Sim & Constellation (see the later section) for simulations. After deep-learning training is conducted with the DGX supercomputer, software testing and verification are conducted via Drive Sim & Constellation.

 Because the DRIVE platform is an end-to-end platform, it can handle everything from AI training and simulations to actual driving, covering all autonomous driving tasks from map creation and self-location confirmation to environment recognition and route setting. Moreover, the system can be applied to autonomous driving systems from Level 2 through Level 5.
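
 As a rough illustration of how this end-to-end workflow could be strung together (training in the data center, verification in simulation, then deployment to the vehicle), here is a minimal Python sketch; all function and stage names are hypothetical placeholders, not NVIDIA APIs.

# Minimal sketch of the end-to-end development loop described above:
# train in the data center, verify in simulation, then deploy to the vehicle.
# All names here are hypothetical placeholders, not NVIDIA APIs.

def train_models(dataset):
    """Deep-learning training step (the DGX/NGC side in the article)."""
    return {"perception": "weights-v1", "planning": "weights-v1"}

def run_scenario(models, scenario):
    """Placeholder: a real harness would replay sensor data and score behavior."""
    return True

def validate_in_simulation(models, scenarios):
    """Software testing/verification step (Drive Sim & Constellation in the article)."""
    return all(run_scenario(models, s) for s in scenarios)

def deploy_to_vehicle(models):
    """Hand validated models to the in-vehicle stack (DRIVE OS / DRIVE AV / DRIVE IX)."""
    print("Deploying models:", ", ".join(models))

if __name__ == "__main__":
    models = train_models(dataset="recorded_drives")
    if validate_in_simulation(models, scenarios=["rain", "night_merge", "construction"]):
        deploy_to_vehicle(models)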

NVIDIA DRIVE End to End Platform


NVIDIA DRIVE Constellation and DRIVE Sim conduct simulations

 The introduction of autonomous driving vehicles requires test driving over incredibly long distances. According to one trial calculation, “the development of autonomous driving vehicles that improve on the driving capabilities of human drivers by 20% requires 11 billion miles of test driving, equivalent to 100 test vehicles driven at an average speed of 25 mph, 24 hours a day, 365 days a year, for 518 years,” making it impossible to conduct test driving using actual vehicles alone. Therefore, with the introduction of autonomous driving vehicles nearing reality, the importance of simulations has been re-acknowledged.
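
 A quick back-of-the-envelope check in Python confirms that the quoted figures are internally consistent:

# Back-of-the-envelope check of the trial calculation quoted above.
vehicles = 100              # test vehicles
avg_speed_mph = 25          # average speed
hours_per_year = 24 * 365   # driving 24 hours a day, 365 days a year
years = 518

total_miles = vehicles * avg_speed_mph * hours_per_year * years
print(f"{total_miles / 1e9:.2f} billion miles")  # -> 11.34 billion, i.e. roughly 11 billion miles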

 With NVIDIA DRIVE, two systems, the DRIVE Sim and DRIVE Constellation, handle the simulations.

 DRIVE Sim is software that generates photo-realistic data streams, creating enormous amounts of data for various testing environments. For example, the system can replicate extreme weather conditions such as storms and blizzards, bright sunlight at different times of the day, limited visibility at night, varying road conditions and landscapes, as well as extremely rare driving conditions. Dangerous conditions can be staged within the simulations, enabling developers to test the response of autonomous driving vehicles without endangering humans.

 DRIVE Constellation is a simulator for autonomous driving vehicles that runs the DRIVE Sim software and conducts several billion miles of test drives in virtual reality. The Constellation’s supercomputer processes the simulated sensor data as if the vehicle were being driven on actual roads, controlling the accelerator, brake pedal, and steering wheel to drive virtually.

 In the next moment, new sensor information is received and the same process is repeated. This cycle runs 30 times per second, verifying whether the algorithm and software being executed are properly operating the simulated vehicle. This sort of feedback loop is called a “hardware-in-the-loop.”
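
 The 30-times-per-second feedback cycle can be sketched roughly as below; the simulator and vehicle_computer objects are hypothetical stand-ins for the two sides of the loop (sensor simulation and the in-vehicle AV stack), not actual DRIVE Constellation interfaces.

import time

# Rough sketch of the "hardware-in-the-loop" cycle described above, running at
# about 30 iterations per second. The simulator renders sensor data, the
# vehicle computer turns it into driving commands, and the simulator advances
# the virtual world with those commands. The objects passed in are
# hypothetical stand-ins, not NVIDIA DRIVE Constellation interfaces.

CYCLE_HZ = 30
CYCLE_PERIOD_S = 1.0 / CYCLE_HZ

def hardware_in_the_loop(simulator, vehicle_computer, steps):
    for _ in range(steps):
        start = time.monotonic()
        sensor_frame = simulator.render_sensors()         # simulated camera/radar/lidar frame
        commands = vehicle_computer.drive(sensor_frame)   # accelerator, brake, steering
        simulator.apply_commands(commands)                # advance the virtual vehicle and world
        # Sleep off the remainder of the 1/30 s budget to hold the cycle rate.
        time.sleep(max(0.0, CYCLE_PERIOD_S - (time.monotonic() - start)))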

 When the software is corrected based on the simulation results, subsequent simulations are conducted using the corrected software. Such repetitions allow the AI in autonomous driving vehicles to become smarter.

 The test vehicle developed by Waymo, the company believed to have the most advanced development in autonomous driving technology, has conducted 8 million miles of test driving in the roughly 10 years since its Google days. However, Waymo is currently conducting 8 million miles of simulated test driving on a daily basis (Source: NVIDIA documents).

 

Scenario/Vehicle/World Model for simulations

Unveiling of DRIVE Constellation to partner companies, collaboration efforts to enhance the systems

 In September 2018, NVIDIA announced it would unveil DRIVE Constellation and DRIVE Sim to simulation partner companies. The aim is that, by unveiling the platform and collaborating with its partners, NVIDIA can improve its depth and flexibility as well as the efficiency of testing and verification.

 The background to this is the recognition that, for simulations to become an effective tool for the development of safe autonomous driving, they must accurately reflect the characteristics of the real world, with its multitude of changes and unpredictability.

 Virtual environments must not only look identical to reality, but also be based on the laws of physics.

 Virtual vehicles must also replicate the same movements as vehicles in the real world. The vehicle’s dynamics must be replicated exactly as they would occur with an actual vehicle when executing actions such as stepping on the brake, accelerating onto a highway, or driving on bumpy roads.

 When creating realistic situations that may lead to an accident, developers must observe real-world examples, organize them, and then adjust weather conditions, brightness, and road conditions in the simulator to replicate various scenarios. Traffic patterns in the test area must also be faithfully replicated. Thanks to its open platform, NVIDIA can collect a multitude of real-world examples.
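
 A scenario built this way might be parameterized roughly as follows; the field names and value ranges are assumptions for illustration, not NVIDIA’s actual scenario format.

from dataclasses import dataclass
from itertools import product

# Illustrative parameterization of a test scenario along the axes named above
# (weather, brightness/time of day, road surface, local traffic pattern).
# Field names and values are assumptions, not NVIDIA's scenario schema.

@dataclass
class Scenario:
    weather: str            # e.g. "clear", "storm", "blizzard"
    time_of_day_h: float    # hour of day (0-24), drives sun position and glare
    road_surface: str       # e.g. "dry", "wet", "icy"
    traffic_density: float  # vehicles per km, matched to the test area

# Sweep one observed real-world near-miss across harsher conditions.
variants = [
    Scenario(w, t, s, traffic_density=40.0)
    for w, t, s in product(("clear", "storm", "blizzard"),
                           (6.0, 13.0, 22.0),
                           ("dry", "wet", "icy"))
]
print(len(variants), "scenario variants")  # 27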

 In constructing DRIVE Constellation, NVIDIA has worked with various simulation companies as well as experts in environmental models, vehicle and sensor models, and traffic and scenario models to bolster its library, as shown in the Scenario/Vehicle/World Model image above.



Ensuring the safety of autonomous driving vehicles through redundancy and diversity (CEO Jensen Huang)

 Following CEO Jensen Huang’s keynote speech, a Q&A session for the press was held. Huang responded to questions such as whether, considering the fatal accident involving Uber’s vehicle, he believes autonomous driving development will advance according to plan. Below are some of Mr. Huang’s responses:

  • The safety of autonomous driving vehicles is one of the most important and difficult issues. However, safety can be ensured through the simple principle of redundancy and diversity. For computers, mutual checks are necessary so that no area becomes a weak point. With NVIDIA’s system, all features check each other, and if any anomaly is detected, the system is programmed to decelerate or stop the vehicle (see the sketch after this list).
  • Transportation is a massive industry, and many companies are cooperating to increase safety. By pooling their efforts, they can become a bigger force. Safety issues must be resolved, and are indeed improving daily.
  • Today’s EVs already carry advanced autonomous driving features, although only partially. EVs released in the future will contain advanced autonomous driving features. Moreover, Mr. Huang believes that within the next two years robot taxis will be realized in many regions around the world. Long-haul trucks will also be equipped with autonomous driving features in the next three to four years, assisting drivers. Long-distance transportation is exhausting work for drivers, and there is also the issue of labor shortages. Even if fully driverless vehicles are not realized, vehicles may be equipped with systems that help reduce driver fatigue.
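
 As a minimal illustration of the redundancy-and-diversity principle Huang describes in the first point above, the sketch below cross-checks two independently computed results and falls back to decelerating and stopping when they disagree; the structure and thresholds are illustrative assumptions, not NVIDIA’s implementation.

# Minimal illustration of redundancy and diversity: two independent (diverse)
# computation paths check each other, and any failure or disagreement triggers
# the decelerate-and-stop fallback mentioned above. Illustrative only; this is
# not NVIDIA's implementation.

def cross_checked_command(primary, secondary, tolerance=0.1):
    """Return a drive command only when both independent paths agree."""
    if primary is None or secondary is None:
        return "DECELERATE_AND_STOP"        # one path failed -> fall back to safety
    if abs(primary - secondary) > tolerance:
        return "DECELERATE_AND_STOP"        # paths disagree -> fall back to safety
    return f"PROCEED (agreed steering value {primary:.2f})"

print(cross_checked_command(0.42, 0.44))    # agreement -> proceed
print(cross_checked_command(0.42, 0.90))    # disagreement -> decelerate and stop
print(cross_checked_command(0.42, None))    # path failure -> decelerate and stop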


Toyota, Audi, and Daimler adopt NVIDIA DRIVE

 According to NVIDIA, it has partnered with major automakers such as Audi, VW, Daimler, Tesla, and Volvo. More than 450 passenger vehicle and truck OEMs, tier-1 suppliers, map companies, and sensor manufacturers around the world have adopted the system, of which 55 have their head offices in Japan.

 As for Japanese companies, OEMs such as Toyota, Isuzu, and Subaru, in addition to companies like TIER IV and ZMP, have adopted NVIDIA DRIVE.

 Currently, almost all advanced autonomous driving projects around the world utilize some NVIDIA products.

Adopted by Japanese automakers and IT companies
Adopted widely across the globe

Toyota: Equipping its vehicles scheduled to be released in 2020 with NVIDIA DRIVE

 In May 2017, Toyota and NVIDIA announced their collaboration. Toyota is currently advancing projects to utilize the NVIDIA DRIVE AGX Xavier as the AI brain of vehicles to be released from 2020 (believed to be limited to autonomous driving on highways and roads for the exclusive use of automobiles).

 As for autonomous driving on general roads, Toyota is simultaneously developing autonomous driving vehicles for both transportation services and for consumer use. Transportation service vehicles such as the e-Palette will aim for Level 4 autonomous driving capability, with consumer vehicles stepping up from Level 2 to Level 3, eventually aiming for fully autonomous driving capability in the future.

Subaru: Advanced development with NVIDIA’s technology

 Subaru aims to realize Level 2 autonomous driving on highways by 2020, and Level 2 or higher on highways by 2024. These technologies will then be extended from highways to roads for the exclusive use of automobiles, and finally to general roads.

 To develop its vehicles, the automaker has adopted NVIDIA products as an advanced development platform. Subaru will apply NVIDIA AI to areas of autonomous driving technology such as peripheral recognition, path planning, multiplexing, robustness, and intelligence.

Isuzu: Starting with lane keeping and ACC, and aiming for fully autonomous driving

 In September 2018, NVIDIA announced that Isuzu had begun adopting the NVIDIA DRIVE AGX platform. Through the implementation of DRIVE AGX, Isuzu will begin by developing features such as 360-degree situational awareness, lane keeping, and adaptive cruise control (ACC), with the aim of eventually realizing advanced autonomous driving vehicles.

 In its FY 2018-2020 mid-term plan, Isuzu has made the implementation of advanced emergency braking systems and the development of autonomous vehicle platooning the focal points of its development efforts.

Audi: Development of the Audi AI Traffic Jam Pilot, capable of Level 3 autonomous driving

 NVIDIA and Audi have been partners for more than 10 years. Audi utilizes NVIDIA’s technologies, featuring the Audi AI Traffic Jam Pilot system on the Audi A8 and A6 announced in 2017-2018. The Audi AI Traffic Jam Pilot is the world’s first Level 3 autonomous driving system (usage on actual roads requires the company to resolve compliance issues with various regulations).

  Furthermore, the Audi A8, Audi A6, and the Audi e-tron SUV, the EV exhibited at the Paris Motor Show 2018, are equipped with NVIDIA graphics technology in their driver-assistance systems and infotainment displays.

Daimler and Bosch: Adoption of the NVIDIA DRIVE AGX (Pegasus)

 Daimler and Bosch have been cooperating to develop Level 4 and Level 5 autonomous driving vehicles for urban areas. In July 2018, the two companies announced they would adopt NVIDIA’s AI technology. NVIDIA provides the DRIVE AGX (Pegasus) platform with its high-performance AI processor developed specifically for autonomous driving, while Daimler and Bosch provide the system software that processes the driving algorithms created by machine learning. Daimler and Bosch will also utilize NVIDIA’s expertise in developing the platform.

Yamaha: Development of a last-mile vehicle

 At GTC Japan in September 2018, Yamaha Motor and NVIDIA announced that Yamaha will feature the NVIDIA Jetson AGX Xavier on a wide range of product lines. Jetson is a product targeted at robots and shares its architecture with DRIVE, which targets autonomous driving. Jetson has been adopted by many Japanese manufacturers, such as Denso (automobile components), Komatsu (construction machinery), Fanuc (FA robots), and Musashi Seimitsu Industry (factory automation).

  • Yamaha will develop a last-mile vehicle based on a golf cart. The company aims for the social implementation of a new transportation method for depopulated areas in mountainous regions, as well as tourist destinations and urban areas, through the advancement of autonomous driving technology and improved compatibility with MaaS solutions.
  • Through the advancement of the automation of unmanned ground vehicles (UGVs) such as agricultural vehicles, Yamaha will help contribute to the resolution of issues in the agricultural industry, such as labor shortages, automation of farm work, and the advancement of precision agriculture. In 2019, Yamaha plans to begin verification testing of an unmanned agricultural vehicle featuring AI technology.
  • Additionally, Yamaha intends to implement the NVIDIA Jetson AGX Xavier in industrial robots, industrial drones, maritime products, and other product lines.

Yamaha’s last-mile vehicle, currently under development
(Photographed at GTC Japan 2018)
Japanese companies that utilize the Jetson AGX Xavier


------------------
Keywords
NVIDIA, GPU, GTC Japan 2018, DRIVE AGX, Jetson AGX, Xavier, DRIVE Sim, DRIVE Constellation, Toyota, Isuzu, Subaru, Yamaha Motor, Daimler, Audi

<Automobile Industry Portal MarkLines>