New smartphone reads callers' neural signals

by MICHAEL RIORDAN | 10/17/11 10:00pm

Dartmouth computer science professor Andrew Campbell was jogging on a frigid day two winters ago when he realized that he needed to call his wife. As he struggled to remove his gloves to operate his cellphone, he asked himself a question that would later fuel his research: "Why can't a thought just drive a phone?"

This idea may soon become a reality for cellphone users thanks to research conducted by Dartmouth professors and students. The new "NeuralPhone," a smartphone that reads callers' thoughts, was inspired by a senior thesis project by Matthew Mukerjee '10. The phone, which has been featured in The New York Times Magazine and on CBS News, is the first of its kind to rely on a "DialTime" interface that functions as a contact list, Campbell said.

The "DialTime" feature flashes pictures of contacts at random. When a user sees the image of the person he intends to call, his brain sends P300 waves, or neural signals, to a headset fitted with electrodes, Campbell said. P300 waves represent a response to "an anticipation of something happening but not knowing when it's going to happen," Campbell said.

The headset then converts and transmits these waves to the phone, where an internal learning algorithm searches for matching wave patterns, he said. If the wave patterns match those of a contact, the phone calls the designated person, Campbell said.
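The flash-and-match process described above can be illustrated with a minimal sketch. All of the names, the amplitude threshold, and the three-vote confirmation rule here are hypothetical stand-ins for the phone's actual learning algorithm, which the article does not detail:

```python
import random

# Illustrative sketch of a DialTime-style selection loop, NOT the
# published NeuralPhone implementation. detect_p300, the 5.0 amplitude
# threshold, and the 3-vote rule are all assumptions for demonstration.

CONTACTS = ["Alice", "Bob", "Carol"]
VOTES_NEEDED = 3  # confirmations required before dialing, to filter noise

def detect_p300(signal_window):
    """Stand-in classifier: True if the EEG window shows a P300-like
    deflection. A real system would use a trained pattern matcher."""
    return max(signal_window) > 5.0  # illustrative amplitude threshold

def dialtime_loop(read_eeg_window, dial, max_flashes=100):
    """Flash contact photos at random, tally P300 detections per
    contact, and dial once one contact accumulates enough votes."""
    votes = {name: 0 for name in CONTACTS}
    for _ in range(max_flashes):
        flashed = random.choice(CONTACTS)  # show this contact's photo
        window = read_eeg_window()         # EEG sampled after the flash
        if detect_p300(window):
            votes[flashed] += 1
            if votes[flashed] >= VOTES_NEEDED:
                dial(flashed)
                return flashed
    return None  # no contact reached the confirmation threshold
```

The random flashing matters: because the user cannot predict when the intended face will appear, the P300 "anticipation" response fires specifically for that face, which is what lets the tally single out one contact.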

The potential for this technology is "endless," Campbell said, adding that he has spearheaded the creation of three separate phone applications that incorporate smartphone sensors.

BeWell, an application that uses sensors to measure individuals' health and well-being by observing how often they walk, talk and sleep, remains in the development stage.

To make the application more accessible to users, Campbell and his team designed a "fish-themed display" for BeWell, he said.

"There is an animated aqua-ecosystem on the wallpaper of your phone, where a fish represents you," Campbell said.

The fish's speed, color and school size correlate with users' health, according to Campbell.

CenceMe, which relies on sensors to infer a user's location, was the team's first application and was created after the release of the first iPhone in 2007, according to Campbell. Through CenceMe, the user can post location information to social networking sites like Facebook, he said.

"When the app store first opened up, CenceMe was one of 40 social networking applications, and now of course there's 400,000 applications out there," Campbell said.

Another application, WalkSafe, employs the phone's camera as a sensor and vibrates to alert the user of approaching vehicles, Campbell said.

"It can detect cars from 50 meters away traveling at 30 mph," he said.

These new technologies provide researchers with an opportunity to improve conditions in a variety of fields ranging from education to health care, according to Campbell. A teacher with a network of neural phones, for example, would be able to view students' responses to questions instantaneously on a phone or computer monitor, according to Campbell. Teaching methods could then be modified based on the responses, he said.

"I think it's a technology that is going to explode," Campbell said.

Cornell University computer science professor Tanzeem Choudhury, who left her full-time position at the College in June, and Rajeev Raizada, a former researcher at the College's Neukom Institute for Computational Science who also relocated to Cornell, contributed to the NeuralPhone's research and development, Campbell said.