NVIDIA Is Unlocking the Potential of Deep Learning

Companies across nearly all industries are exploring how to use GPU-powered deep learning to extract insights from big data. From self-driving cars to disease-detecting mirrors, the use cases for deep learning are expanding by the day. Since computer scientist Geoff Hinton started using GPUs to train his neural networks, researchers have been applying the technology to tough modeling problems in the real world.

Alex Woodie of Datanami recently interviewed Will Ramey, senior product manager for Accelerated Computing at NVIDIA, to get insight into how NVIDIA is unlocking the potential of GPU-powered deep learning applications.

The article also notes that NVIDIA's new software for helping data scientists build GPU-powered deep learning systems ships this month, including version 3 of the cuDNN library and DIGITS 2.

Read more on Datanami >>
