The cloud-based service is available immediately to users of the just-announced Amazon Elastic Compute Cloud (Amazon EC2) P3 instances featuring NVIDIA Tesla V100 GPUs. NVIDIA plans to expand support to other cloud platforms soon.
“The NVIDIA GPU Cloud democratizes AI for a rapidly expanding global base of users,” said Jim McHugh, vice president and general manager of Enterprise Systems at NVIDIA. “NGC frees developers from the complexity of integration, allowing them to move quickly to create sophisticated neural networks that deliver the transformative powers of AI.”
After signing up for an NGC account, developers can download a containerized software stack that integrates and optimizes a wide range of deep learning frameworks, NVIDIA libraries and CUDA runtime versions — which are kept up to date and run seamlessly in the cloud or on NVIDIA DGX systems.
Key benefits of the NGC container registry include:
- Instant access to the most widely used GPU-accelerated frameworks: Containerized software includes NVCaffe, Caffe2, Microsoft Cognitive Toolkit (CNTK), DIGITS, MXNet, PyTorch, TensorFlow, Theano and Torch, as well as CUDA for application development.
- Maximum performance: Tuned, tested and certified by NVIDIA, the NGC container registry enables developers to achieve optimal performance on NVIDIA GPUs running in the cloud.
- Pre-integration: Easy-to-use containers allow users to begin deep learning jobs immediately, eliminating time-consuming and difficult do-it-yourself software integration.
- Up to date: Containers available on the NGC container registry benefit from continuous NVIDIA development, ensuring each deep learning framework is tuned for the fastest training possible on the latest NVIDIA GPUs. NVIDIA engineers continually optimize libraries, drivers and containers, delivering monthly updates.
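As a rough illustration of the workflow described above, pulling and running one of these framework containers might look like the following. The registry host `nvcr.io` is NGC's; the specific image tag and the `nvidia-docker` invocation are illustrative and depend on your account setup and Docker tooling:

```shell
# Authenticate against the NGC registry using the API key
# generated from your NGC account (credentials are placeholders).
docker login nvcr.io

# Pull a GPU-optimized framework container; the tag shown here
# is illustrative -- check the registry for current versions.
docker pull nvcr.io/nvidia/tensorflow:17.10

# Launch an interactive session with GPU access
# (requires the nvidia-docker runtime on the host).
nvidia-docker run -it --rm nvcr.io/nvidia/tensorflow:17.10
```

Because the framework, its dependent libraries and the CUDA runtime ship together inside the container, no further integration is needed on the host beyond the NVIDIA driver and Docker runtime.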