TensorRT 5 RC Now Available

At GTC Japan, NVIDIA announced the latest version of TensorRT, its high-performance deep learning inference optimizer and runtime. Today we are releasing the TensorRT 5 Release Candidate.

Video Tutorial: Introduction to Recurrent Neural Networks in TensorRT

NVIDIA TensorRT is a high-performance deep learning inference optimizer and runtime that delivers low latency and high throughput. TensorRT can import trained models from every major deep learning framework to easily create highly efficient inference engines that can be incorporated into larger applications and services.

NVIDIA Releases TensorRT 4

Today we are releasing TensorRT 4 with capabilities for accelerating popular inference applications such as neural machine translation, recommender systems, and speech …