MIT Surround Vision spills TV programs off the screen straight into the living room

MIT Surround Vision

Audio enthusiasts have long appreciated surround sound, which typically uses multiple speakers to extend the TV experience well beyond the screen. Now Surround Vision, a new system developed at MIT's Media Lab, promises to do something similar with images, using ordinary handheld devices.

Surround sound lets viewers clearly hear what is happening off camera, expanding TV programs and movies beyond the edges of the screen and, quite literally, into the living room. MIT's latest development aims to do the same with images, and to work on standard, Internet-enabled handheld devices.

“If you’re watching TV and you hear a helicopter in your surround sound,” remarks Santiago Alfaro, a graduate student at the Media Lab who is leading the project, “wouldn’t it be cool to just turn around and be able to see that helicopter as it goes into the screen?”

If viewers want to see what is happening off the left edge of the television screen, for example, all they have to do is point their cell phone in that direction, and the corresponding image pops up on the phone's screen, as in the sketch below. The technology could also let individual viewers call up several different camera angles without affecting what others see on the TV.
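To make the pointing behavior concrete, here is a minimal sketch of how a viewer's pointing direction might be mapped to a camera view. The angle thresholds, view names, and function are illustrative assumptions, not details taken from the MIT prototype.

```python
# Hypothetical sketch: map the handheld's compass heading (relative to the TV)
# to one of three camera views. Thresholds and view names are assumptions.

def select_view(device_heading_deg: float, tv_bearing_deg: float) -> str:
    """Return which camera view to show for the current pointing direction."""
    # Signed angle between where the phone points and where the TV sits,
    # normalized to the range (-180, 180].
    offset = (device_heading_deg - tv_bearing_deg + 180.0) % 360.0 - 180.0
    if offset < -30.0:       # pointing well to the left of the TV
        return "left"
    if offset > 30.0:        # pointing well to the right of the TV
        return "right"
    return "center"          # roughly facing the TV itself

# Example: the TV sits at compass bearing 90 degrees; the phone points at 140.
print(select_view(140.0, 90.0))  # -> "right"
```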

To get his prototype running, Alfaro attached a magnetometer to an existing handheld device and wrote software that integrated its data with readings from the device's other sensors. He notes, however, that most present-day devices, including the latest iPhone, already have magnetometers built in.
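The article does not describe Alfaro's software, but a common way to combine a magnetometer with a device's other sensors is tilt compensation: the gravity vector from the accelerometer is used to correct the compass reading when the phone is not held level. A rough sketch of that idea follows; the axis conventions are assumptions and vary between devices.

```python
import math

def tilt_compensated_heading(ax, ay, az, mx, my, mz):
    """Estimate compass heading in degrees from raw accelerometer (ax, ay, az)
    and magnetometer (mx, my, mz) readings. Axis conventions are assumed;
    real devices differ and usually need calibration as well."""
    # Roll and pitch from the gravity vector reported by the accelerometer.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Rotate the magnetic field vector into the horizontal plane.
    bx = (mx * math.cos(pitch)
          + my * math.sin(pitch) * math.sin(roll)
          + mz * math.sin(pitch) * math.cos(roll))
    by = my * math.cos(roll) - mz * math.sin(roll)
    # Heading relative to magnetic north, normalized to [0, 360).
    return (math.degrees(math.atan2(-by, bx)) + 360.0) % 360.0
```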

Alfaro and his advisor, Media Lab research scientist Michael Bove, envision that in a commercial version of the system, the video shown on the handheld device would stream over the Internet. On the TV service provider's end, neither broadcasts nor set-top boxes would need to be modified.
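On the viewer's side, the plumbing could be as simple as pulling each off-screen angle as an ordinary Internet video stream while the broadcast itself is left untouched. A toy sketch, with an entirely hypothetical URL scheme:

```python
import urllib.request

# Hypothetical endpoint: each off-screen view is just another video stream
# served over plain HTTP; the TV broadcast itself is never modified.
STREAM_URLS = {
    "left":   "http://example.com/show/left.ts",
    "center": "http://example.com/show/center.ts",
    "right":  "http://example.com/show/right.ts",
}

def open_view_stream(view: str):
    """Open the network stream for the requested camera view."""
    return urllib.request.urlopen(STREAM_URLS[view])

# Usage (would require a real server behind these URLs):
#   with open_view_stream("left") as stream:
#       hand stream data to the device's video decoder
```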

“In the Media Lab, and even my group, there’s a combination of far-off-in-the-future stuff and very, very near-term stuff, and this is an example of the latter,” Bove adds. “This could be in your home next year if a network decided to do it.”

Once he had outfitted a handheld device with the necessary motion sensors, Alfaro shot video of the street in front of the Media Lab simultaneously from three angles. With a TV set playing the footage from the center camera, a viewer pointing the motion-sensitive handheld device straight at the TV sees that same footage on its screen.

Swinging the device to the right or left shifts it to one of the other views, so a viewer can, for instance, see an approaching bus on the small screen before it appears on the large one. And because many commercial DVDs include bonus footage of scenes shot from different angles, Alfaro was also able to build demos that let users switch between the final cut of a film and alternate takes.

A series of user studies, using content developed with partners, is planned for the spring and summer. It will be interesting to see this prototype technology in real-world action.