Facebook researchers developed a reinforcement learning model that can outmatch human competitors in heads-up, no-limit Texas hold’em and in turn endgame hold’em poker. … Read more
There’s a new computational workhorse in town. For decades, general matrix-matrix multiply—known as GEMM in Basic Linear Algebra Subprograms (BLAS) libraries—has been a standard benchmark for computational performance. … Read more
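For readers unfamiliar with the operation, a minimal sketch of what BLAS means by GEMM follows: it computes C = alpha·A·B + beta·C, not just a plain matrix product. The naive triple loop below is illustrative only; real BLAS implementations are heavily blocked and vectorized.

```python
def gemm(alpha, A, B, beta, C):
    """Naive GEMM: returns alpha * A @ B + beta * C for list-of-lists matrices.

    Illustrative sketch of the BLAS GEMM contract, not a tuned implementation.
    """
    m, k = len(A), len(A[0])
    n = len(B[0])
    # Start from the scaled accumulator term beta * C.
    out = [[beta * C[i][j] for j in range(n)] for i in range(m)]
    for i in range(m):
        for p in range(k):
            a = alpha * A[i][p]
            for j in range(n):
                out[i][j] += a * B[p][j]
    return out

# Example: with alpha=1, beta=0 this reduces to an ordinary matrix product.
result = gemm(1.0, [[1, 2], [3, 4]], [[5, 6], [7, 8]],
              0.0, [[0, 0], [0, 0]])
```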
Google recently announced the release of version 1.0 of its TensorFlow deep learning framework at its inaugural TensorFlow Developer Summit. In just its first year, the popular framework has helped researchers make progress with everything from language translation to early detection of skin cancer and preventing blindness in diabetics. … Read more
Russian scientists from Lomonosov Moscow State University used an ordinary GPU-accelerated desktop computer to solve, in just 15 minutes, complex quantum mechanics equations that would typically take two to three days on a … Read more
Adam McLaughlin, a PhD student at Georgia Tech, shares how he is using NVIDIA Tesla GPUs for his research on betweenness centrality – a graph analytics algorithm that identifies the most important vertices within a network. This can be applied to a broad range of applications, such as finding the head of a crime ring or … Read more
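As a point of reference for the metric itself, betweenness centrality scores a vertex by how many shortest paths between other vertex pairs pass through it. The sketch below is a compact sequential version of Brandes' well-known algorithm (the standard CPU baseline, not McLaughlin's GPU implementation) for unweighted directed graphs given as adjacency lists; for undirected graphs each score is conventionally halved.

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for unweighted graphs; adj maps vertex -> neighbors."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        # BFS from s, counting shortest paths (sigma) and predecessors.
        stack, q = [], deque([s])
        pred = {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1
        dist = {v: -1 for v in adj}; dist[s] = 0
        while q:
            v = q.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    pred[w].append(v)
        # Accumulate dependencies in reverse BFS order.
        delta = {v: 0.0 for v in adj}
        while stack:
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

# On the path a - b - c, only b lies between other vertices.
scores = betweenness({'a': ['b'], 'b': ['a', 'c'], 'c': ['b']})
```

In a crime-ring network of the kind the excerpt mentions, the vertex with the highest score is the one brokering the most communication between otherwise distant members.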
Daniel Ambrosi, Artist and Photographer, is using NVIDIA GPUs in the Amazon cloud and CUDA to create giant 2D-stitched HDR panoramas called “Dreamscapes.” Ambrosi applies a modified version of Google’s DeepDream neural … Read more
Columbia University researchers have created a robotic system that detects wrinkles and then irons the piece of cloth autonomously. Their paper highlights that ironing is the final step in their “pipeline” of a robot picking up a wrinkled shirt, then … Read more