Jun Rekimoto, who studies augmented reality at Sony Computer Science Labs, is using cameras, drones and sensors to capture and share what others are seeing and feeling.
Jun Rekimoto envisions a future in which sports spectators will be able to watch games through the eyes of the players, experiencing the sights and sensations of the field even while sitting at home on the couch.
This concept of "jacking in," popularized by William Gibson's seminal 1984 novel "Neuromancer," has existed in the realm of science fiction for decades, but Rekimoto is trying to bring it closer to reality.
Rekimoto, deputy director of research at Sony Computer Science Laboratories in Tokyo, outlined some of his work in this area at the lab's first-ever research symposium in the US. The event was held this week at Manhattan's Museum of Modern Art, where researchers discussed their work in music, art and prosthetics, among other areas.
Rekimoto's focus has been on augmented reality, taking the ideas from Gibson's novel and trying to apply them to current technology. "I wanted to extend this concept, that we can immersively connect to other humans or drones," he said during an interview after the symposium.
Augmented reality technologies have existed for years, but virtual reality is suddenly accelerating, from Oculus' gaming-oriented headsets to Sony's own Project Morpheus. Even Samsung, working with Oculus, is set to sell a virtual reality headset powered by its Galaxy Note 4 smartphone. These products could spark huge changes in how people watch movies, play games and communicate with each other.
"When you combine that viewing experience, it really creates a new way of communicating," said Brian Blau, a consumer-technology analyst for research firm Gartner. "Seeing someone else's viewpoint, I think, is going to be a powerful paradigm."
Rekimoto sits further out on the developmental curve, with some of the ideas he's working on not expected to reach the market for years. In one project, he and a team are developing a headset called LiveSphere, which includes six cameras and can capture a full 360-degree view around the wearer. Rekimoto said the headset, along with audio instructions from someone watching remotely, could help users get guidance on tasks like cooking or medical procedures. Athletes could wear a similar device to provide a fresh view for spectators.
"The resulting image is quite exciting," Rekimoto said when describing a test with a gymnast swinging around on a high bar while wearing LiveSphere.
He added that Sony researchers are also working on adding the sense of touch into LiveSphere by using a "tactile device," such as actuators mounted on fingers.
In another project, Rekimoto said the lab is developing something called Flying Head, in which a drone follows a person's movements. That technology could be used to help athletes gauge their form and style when practicing. And jacking into a personalized drone could let a user inspect disaster areas too dangerous for people to visit.
"I think the more important, or more promising, practice is a human augmenting other humans," Rekimoto said, adding that he expects that such technology could birth "a huge industry" with human abilities transmitted from one person to another.