NVIDIA announced immediate availability of the NVIDIA® GPU Cloud (NGC) container registry for AI developers worldwide. In just a few steps, NGC helps developers get started with deep learning development through no-cost access to a comprehensive, easy-to-use, fully optimized deep learning software stack. The cloud-based service is available immediately to users of the just-announced Amazon Elastic Compute Cloud (Amazon EC2) P3 instances featuring NVIDIA Tesla® V100 GPUs. NVIDIA plans to expand support to other cloud platforms soon.
After signing up for an NGC account, developers can download a containerized software stack that integrates and optimizes a wide range of deep learning frameworks, NVIDIA libraries and CUDA® runtime versions — which are kept up to date and run seamlessly in the cloud or on NVIDIA DGX™ systems.
“The NVIDIA GPU Cloud democratizes AI for a rapidly expanding global base of users,” said Jim McHugh, vice president and general manager of Enterprise Systems at NVIDIA. “NGC frees developers from the complexity of integration, allowing them to move quickly to create sophisticated neural networks that deliver the transformative powers of AI.”
Three Steps to Accelerated AI Computing
Developers who want to get started with deep learning right away using the NGC container registry can follow a three-step process:
- Sign up for a no-cost NGC account at www.nvidia.com/ngcsignup.
- Run an optimized NVIDIA image on a cloud service provider's platform.
- Pull containers from NGC and get started (a minimal sketch of this step follows the list).
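As a rough illustration of steps 2 and 3, the sketch below scripts the pull-and-run workflow from Python using the subprocess module. The image tag (17.10) and the use of the nvidia-docker wrapper are assumptions for illustration only; consult the NGC registry for the actual image paths and current run instructions.

```python
# Minimal sketch: pull an NGC framework container and start an interactive session.
# Image tag and the nvidia-docker wrapper are illustrative assumptions, not confirmed values.
import subprocess

IMAGE = "nvcr.io/nvidia/tensorflow:17.10"  # hypothetical tag; check NGC for current releases

# Pull the container from the NGC registry (requires docker login with NGC credentials).
subprocess.run(["docker", "pull", IMAGE], check=True)

# Launch it interactively with GPU access via the nvidia-docker wrapper.
subprocess.run(["nvidia-docker", "run", "-it", "--rm", IMAGE], check=True)
```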
Key benefits of the NGC container registry include:
- Instant access to the most widely used GPU-accelerated frameworks: Containerized software includes NVCaffe, Caffe2, Microsoft Cognitive Toolkit (CNTK), DIGITS, MXNet, PyTorch, TensorFlow, Theano and Torch, as well as CUDA for application development (see the sketch after this list for a quick check inside a framework container).
- Maximum performance: Tuned, tested and certified by NVIDIA, the containers in the NGC registry deliver optimal performance on NVIDIA GPUs in the cloud.
- Pre-integration: Easy-to-use containers allow users to begin deep learning jobs immediately, eliminating time-consuming and difficult do-it-yourself software integration.
- Up to date: Containers available on the NGC container registry benefit from continuous NVIDIA development, ensuring each deep learning framework is tuned for the fastest training possible on the latest NVIDIA GPUs. NVIDIA engineers continually optimize libraries, drivers and containers, delivering monthly updates.
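To make the pre-integration concrete, the following minimal sketch could be run inside an NGC TensorFlow container to confirm that the framework sees the GPU and that the bundled CUDA stack works. It assumes the TensorFlow 1.x Session API shipped in containers of this era; adjust for later releases.

```python
# Minimal sketch: run inside an NGC TensorFlow container to verify GPU access.
import tensorflow as tf
from tensorflow.python.client import device_lib

# List every device TensorFlow can use; a Tesla V100 should appear as a GPU entry.
print([d.name for d in device_lib.list_local_devices()])

# A trivial op placed on the GPU exercises the CUDA/cuDNN libraries bundled in the container.
with tf.device("/gpu:0"):
    a = tf.constant([1.0, 2.0, 3.0])
    b = tf.constant([4.0, 5.0, 6.0])
    c = a * b

with tf.Session() as sess:
    print(sess.run(c))  # expect [ 4. 10. 18.]
```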