Microsoft's KinectFusion research project offers real-time 3D reconstruction, wild AR



wraggster
August 9th, 2011, 22:15
http://www.blogcdn.com/www.engadget.com/media/2011/08/fusionkinect-demo-siggraph-2011.jpg (http://www.engadget.com/2011/08/09/microsofts-kinectfusion-research-project-offers-real-time-3d-re/)
It's a little shocking to think about the impact that Microsoft's Kinect camera has had on the gaming industry at large, let alone the 3D modeling industry (http://www.engadget.com/all/kinect+Hack). Here at SIGGRAPH 2011, we attended a KinectFusion research talk hosted by Microsoft, where a fascinating new look at real-time 3D reconstruction was detailed. To better appreciate what's happening here, we'd actually encourage you to hop back and have a gander at our hands-on (http://www.engadget.com/2010/03/11/primesense-talks-full-body-motion-control-at-gdc-the-possibilit/) with PrimeSense's raw motion-sensing hardware from GDC 2010 -- for those who've forgotten, that very hardware was finally outed (http://www.engadget.com/2010/03/31/primesense-fesses-up-its-the-magic-behind-microsofts-project/) as the guts behind what consumers simply know as "Kinect." The breakthrough wasn't how it allowed gamers to control software titles sans a joystick -- the breakthrough was the price. The Kinect took 3D sensing mainstream and, moreover, allowed researchers to pick up a commodity product and go absolutely nuts. Turns out, that's precisely what a smattering of highly intelligent blokes in the UK have done: they've built a new method for reconstructing real-life 3D scenes in real time using a simple Xbox 360 peripheral.

The actual technobabble ran deep -- not shocking given the academic nature of the conference -- but the demos shown were nothing short of jaw-dropping. There's no question that this methodology could be used to spark the next generation of gaming interaction and augmented reality, taking a user's surroundings and making them a live part of the experience. Moreover, game design could be significantly impacted, with live scenes acted out and captured in real time rather than built frame by frame within an application. According to the presenter, the tech that's been created here can "extract surface geometry in real-time," right down to the millimeter level. Of course, the Kinect's camera and abilities are relatively limited when it comes to resolution; you won't be building 1080p scenes with a $150 camera, but as CPUs and GPUs become more powerful, there's nothing stopping this from scaling with the future. Have a peek at the links below if you're interested in diving deeper -- don't be shocked if you can't find the exit, though.

http://www.engadget.com/2011/08/09/microsofts-kinectfusion-research-project-offers-real-time-3d-re/
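
The write-up doesn't get into the method itself, but the published KinectFusion research describes tracking the camera pose with ICP against the growing model and folding every incoming depth frame into a volumetric truncated signed distance function (TSDF) on the GPU, averaging measurements so the surface estimate sharpens with repeated views. The snippet below is only a rough, simplified CPU sketch of that fusion step, not Microsoft's code: the volume size, the FX/FY/CX/CY intrinsics and the integrate() helper are illustrative assumptions.

# Illustrative sketch (assumptions, not the actual KinectFusion implementation):
# integrate one tracked 640x480 depth frame into a small TSDF volume.
import numpy as np

VOXELS = 128            # voxels per axis (toy size)
VOXEL_SIZE = 0.01       # metres per voxel -> roughly a 1.3 m cube
TRUNC = 0.03            # SDF truncation distance in metres
FX = FY = 525.0         # approximate Kinect-style pinhole intrinsics (assumed)
CX, CY = 319.5, 239.5

tsdf = np.ones((VOXELS, VOXELS, VOXELS), dtype=np.float32)   # signed distances
weight = np.zeros_like(tsdf)                                 # per-voxel weights

def integrate(depth, cam_to_world):
    """Fold one depth frame (metres, shape 480x640) into the TSDF volume."""
    # World-space centre of every voxel.
    idx = np.indices((VOXELS, VOXELS, VOXELS)).reshape(3, -1).T
    world = (idx + 0.5) * VOXEL_SIZE

    # Move voxel centres into the camera frame using the tracked 4x4 pose.
    world_to_cam = np.linalg.inv(cam_to_world)
    cam = world @ world_to_cam[:3, :3].T + world_to_cam[:3, 3]

    # Project voxels in front of the camera onto the depth image.
    front = cam[:, 2] > 0.1
    z = cam[front, 2]
    u = np.round(cam[front, 0] * FX / z + CX).astype(int)
    v = np.round(cam[front, 1] * FY / z + CY).astype(int)
    inside = (u >= 0) & (u < 640) & (v >= 0) & (v < 480)

    # Truncated signed distance between the measured surface and each voxel.
    d = depth[v[inside], u[inside]]
    sdf = np.clip((d - z[inside]) / TRUNC, -1.0, 1.0)
    keep = (d > 0) & (sdf > -1.0)          # skip missing depth and far-behind voxels
    vox = np.where(front)[0][inside][keep]
    sdf = sdf[keep]

    # Running weighted average: each new view refines the surface estimate.
    t, w = tsdf.reshape(-1), weight.reshape(-1)
    t[vox] = (t[vox] * w[vox] + sdf) / (w[vox] + 1.0)
    w[vox] += 1.0

# For each frame the tracker produces: integrate(depth_frame, pose_from_icp)

In the real system the volume is also ray-cast every frame, both to render the reconstruction and to give ICP a predicted surface to track against, which is what lets the demos keep rebuilding the scene live as the camera moves.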