Fast INT8 Inference for Autonomous Vehicles with TensorRT 3

Autonomous driving demands safety and a high-performance computing solution to process sensor data with extreme accuracy. Researchers and developers creating deep neural networks (DNNs) for self-driving cars must optimize their networks to ensure low-latency inference and energy efficiency. Thanks to a new Python API in NVIDIA TensorRT, this process just became easier … Read more
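The post above refers to building INT8 engines through the TensorRT Python API. As a rough, hedged sketch only (it uses a more recent ONNX-based TensorRT Python API rather than the TensorRT 3 interface the article covers, and `model.onnx` and `my_calibrator` are hypothetical placeholders), the INT8 build step might look like:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Parse a (hypothetical) ONNX model into a TensorRT network definition.
builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, TRT_LOGGER)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError("Failed to parse ONNX model")

# Enable INT8 precision. Calibration needs a user-supplied calibrator,
# e.g. a subclass of trt.IInt8EntropyCalibrator2 fed with representative
# input batches; my_calibrator is assumed to be defined elsewhere.
config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)
config.int8_calibrator = my_calibrator

# Build and serialize the optimized INT8 engine.
engine_bytes = builder.build_serialized_network(network, config)
with open("model_int8.engine", "wb") as f:
    f.write(engine_bytes)
```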

RESTful Inference with the TensorRT Container and NVIDIA GPU Cloud

Once you have built, trained, tweaked, and tuned your deep learning model, you need an inference solution to deploy to a data center or to the cloud, and you need the maximum possible performance. You may have heard that NVIDIA TensorRT can maximize inference performance on NVIDIA GPUs, but … Read more
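The post above describes serving TensorRT-accelerated models behind a REST endpoint. As a generic illustration only (the URL, port, and JSON schema below are hypothetical and depend entirely on the inference server actually deployed), a client request could look like:

```python
import requests

# Hypothetical REST inference endpoint exposed by a TensorRT-based server.
SERVER_URL = "http://localhost:8000/v1/inference"

# Hypothetical request payload; the real schema depends on the server's API.
payload = {
    "model": "resnet50",
    "inputs": [[0.0] * (224 * 224 * 3)],  # flattened image tensor, as an example
}

response = requests.post(SERVER_URL, json=payload, timeout=10.0)
response.raise_for_status()

print(response.json())
```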

Speed to Safety: Autonomous RC Car Aids Emergency Evacuation

By Abhinav Ayalur, Isaac Wilcove, Lynn Dang, and Ricky Avina. The alarm is ringing. You smell smoke and see people running for the exit, but you don’t do the same. Why? Because you’re the fire marshal. As the fire circles, you have the responsibility of making sure everyone gets out safely before you can save yourself … Read more

Top AI Researchers Receive First NVIDIA Tesla V100s

At this week’s Computer Vision and Pattern Recognition (CVPR) conference in Honolulu, NVIDIA CEO Jensen Huang surprised a group of elite deep learning researchers by unveiling the NVIDIA Tesla V100, our latest GPU based on our Volta architecture, and presenting it to 15 participants in our NVIDIA AI Labs program … Read more