New sensor technology will let machines monitor players and adjust games based on their emotions.
When Microsoft introduced the Kinect sensor in 2010, the company said the motion-capture system would transform gaming. That was only partially true; gamers could do novel things like swing an imaginary golf club or dance, but the Kinect wasn't sensitive enough to register intricate maneuvers. The system, however, has become most popular among hackers, who have used it to build smart shopping carts and gesture-controlled quadcopters. In November, the company will launch an upgraded Kinect with the Xbox One console. With that release, Microsoft could finally disrupt gaming at the level it had originally intended, changing not only how we interact with games but also how games interact with us.
Successful videogames have one thing in common: immersion. When drawn in, players lose track of time, their pulse rises, they become unaware of their surroundings, and, according to a recent study at University College London, they have difficulty returning to reality. In short, their point of view shifts from the real world to the virtual world. But while it's easy to identify an immersive game (or scene within a game) after the fact, developers have never had feedback on a player's engagement in real time.
With the new Kinect, reams of information will flow from the gamer, and that data will be granular enough to register extremely subtle signals. A high-speed 1080p camera can detect minute movements, including eye blinks, wrist twists, and muscle flexes. Using a combination of the camera's color feed and its active infrared, the Kinect can also pick up fluctuations in a gamer's facial blood flow to estimate heart rate.
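For the technically curious, here is a minimal sketch of the general technique at work, known as remote photoplethysmography: a pulse faintly modulates the brightness of facial skin from frame to frame, and the dominant frequency of that flicker is the heart rate. Microsoft has not published the Kinect's actual algorithm, so the frame rate, filtering choices, and function names below are illustrative assumptions, not the console's method.

```python
# Minimal sketch of camera-based heart-rate estimation (remote
# photoplethysmography). The 30 fps frame rate and the assumption
# that a face region has already been extracted are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

FPS = 30  # assumed camera frame rate

def estimate_heart_rate(green_means, fps=FPS):
    """Estimate beats per minute from the mean green-channel
    intensity of a face region, sampled once per frame.
    Blood flow under the skin modulates this signal slightly."""
    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()  # remove the constant brightness offset

    # Keep only frequencies plausible for a human pulse
    # (0.7-4.0 Hz, roughly 42-240 beats per minute).
    nyquist = fps / 2.0
    b, a = butter(3, [0.7 / nyquist, 4.0 / nyquist], btype="band")
    filtered = filtfilt(b, a, signal)

    # The dominant frequency of the filtered signal is the pulse.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    peak_hz = freqs[np.argmax(spectrum)]
    return peak_hz * 60.0  # convert Hz to beats per minute
```

In practice, a sketch like this needs several seconds of frames, a few hundred samples, before the pulse stands out from noise, lighting changes, and head motion.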
Developers could mine that data to change the way games unfold. Along with a player's skills (response time, shooting accuracy), his reactions could factor into gameplay. For example, the intensity of a game could ratchet up as a player leaned forward or his heart began to race. Games could even respond to facial expressions. Granted, precise emotions are hard to nail down (intense fear and intense joy both raise the heart rate). For that reason, applications may be basic at first: adjusting difficulty based on a player's posture, for instance.
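A difficulty-adjustment loop of that basic kind might look something like the sketch below. The thresholds, the 0-to-1 intensity scale, and all the names are hypothetical; this shows the idea, not any real Kinect or Xbox One API.

```python
# Illustrative sketch of the feedback loop the article describes:
# nudge game intensity up when the player leans in or the pulse
# climbs, ease off otherwise. Thresholds and names are assumptions.
from dataclasses import dataclass

@dataclass
class PlayerState:
    heart_rate_bpm: float    # e.g. from the camera-based estimate above
    lean_forward_deg: float  # torso pitch from skeletal tracking

def adjust_intensity(intensity, state, resting_bpm=70.0):
    """Return a new game-intensity value clamped to [0, 1]."""
    engaged = (state.heart_rate_bpm > resting_bpm * 1.15
               or state.lean_forward_deg > 10.0)
    step = 0.05 if engaged else -0.05
    return min(1.0, max(0.0, intensity + step))

# Example: a leaned-in player with an elevated pulse ramps the game up.
level = adjust_intensity(0.5, PlayerState(heart_rate_bpm=95,
                                          lean_forward_deg=15))
```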
That probably won't be the case for long, as sensors become more powerful, affordable, and easily integrated into devices. Already, Israeli company Umoove has created compact head- and eye-tracking systems that could adjust a player's viewpoint based on head movements. And Irish start-up Galvanic has developed a prototype skin-conductivity sensor that can better correlate a player's stress level and in-game performance. Consoles with such heightened senses will allow for games that are progressively more immersive, blurring the once-stark line between the real world and the virtual one.
This article originally appeared in the September 2013 issue of Popular Science.