The augmented reality game Pokémon Go is by far the most popular mobile game ever created – but what if its virtual characters could interact with the real world?
Researchers at MIT recently published a paper introducing a technique called “interactive dynamic video,” which lets people reach in and “touch” objects in videos. Using an NVIDIA GeForce GTX 460M GPU and conventional cameras, interactive dynamic video captures tiny, almost imperceptible object vibrations from just five seconds of footage to create real-time simulations that users can interact with on their devices.
“The technique works by reducing the dimensionality of simulations to a small number of vibration modes — so instead of simulating thousands or millions of dimensions, we only have to simulate tens,” said Abe Davis of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). “Each of these new dimensions is an image, though, and we need to add different combinations of these images every frame to compute the new shape of objects. Fortunately, the GPU does this super-fast — and when it’s done we just render one final pass to warp the original image or video into a new shape (also on the GPU). The GPU does all the heavy lifting. On the CPU, we only keep track of about 5 to 100 single degree-of-freedom simulations — which is a piece of cake.”
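The modal approach Davis describes can be sketched in a few lines: each vibration mode is a single degree-of-freedom damped oscillator stepped on the CPU, and the object's deformation each frame is a weighted sum of per-mode displacement images (the combination step the GPU would handle). This is a minimal illustrative sketch, not the paper's implementation; all function names, parameters, and the synthetic data below are assumptions.

```python
import numpy as np

def step_modes(q, v, omega, zeta, force, dt):
    """Advance each 1-DOF modal oscillator one timestep (semi-implicit Euler).

    q, v:   modal displacement and velocity, shape (n_modes,)
    omega:  modal angular frequencies (rad/s), shape (n_modes,)
    zeta:   modal damping ratios, shape (n_modes,)
    force:  modal forces projected from a user interaction, shape (n_modes,)
    """
    a = force - 2.0 * zeta * omega * v - omega**2 * q  # modal acceleration
    v = v + dt * a
    q = q + dt * v
    return q, v

def deformation_field(q, mode_shapes):
    """Weighted sum of per-mode displacement images for the current frame.

    mode_shapes: (n_modes, H, W, 2) per-pixel displacement image per mode
    returns:     (H, W, 2) warp field to apply to the original frame
    """
    return np.tensordot(q, mode_shapes, axes=1)

# Toy example: ten modes acting on a 64x64 image (synthetic mode shapes).
rng = np.random.default_rng(0)
n_modes, H, W = 10, 64, 64
omega = rng.uniform(5.0, 50.0, n_modes)
zeta = np.full(n_modes, 0.05)                 # light damping
mode_shapes = rng.standard_normal((n_modes, H, W, 2)) * 0.5

q = np.zeros(n_modes)
v = np.zeros(n_modes)
impulse = rng.standard_normal(n_modes)        # a user's "poke", in modal coords

# Simulate two seconds at 60 fps; the force acts only on the first frame.
dt = 1.0 / 60.0
for frame in range(120):
    f = impulse if frame == 0 else np.zeros(n_modes)
    q, v = step_modes(q, v, omega, zeta, f, dt)

field = deformation_field(q, mode_shapes)
print(field.shape)  # (64, 64, 2): per-pixel warp handed to the renderer
```

The key point is the dimensionality reduction: the loop only integrates ten scalar oscillators, while the expensive per-pixel work is a single weighted image sum plus one warp pass, which maps naturally onto a GPU.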
The team created a demo in which virtual Pokémon characters jump onto different real-world objects, and the objects react in realistic ways.
Davis notes that interactive dynamic video has many potential applications, from filmmakers producing new kinds of visual effects to architects determining whether buildings are structurally sound.