We love seeing all of the NVIDIA GPU-related tweets – here are some that we came across this week:
Now I hate Apple for not including a CUDA-enabled GPU on the MacBook Pro 2016
— Jussi Kujala (@jukujala) May 19, 2017
tfw @AMD launches a new GPU "for deep learning", but then you remember no framework currently works w/ OpenCL
Attending NVIDIA Deep Learning Day ^^ pic.twitter.com/nTnsBsIrlD
— 김진만 (@DM_SMU) May 25, 2017
may all your neural networks be GPU rich & threadbare! A massive structure of threads hang over NN like a great wanton beast of computation!
— Paul Tulloch (@ptullochott) May 24, 2017
Installed a GTX 1080ti in my #deeplearning rig. From unboxing to training in under 1hr. Surprised that pass through to VM was so easy.
— Grant Beyleveld (@grantbey) May 19, 2017
On Twitter? Follow @GPUComputing and @mention us so we can keep track of what you’re up to.