The Dartmouth

Prof.'s research inspires ‘EyePhone'

Cell phone users may eventually be able to control their phones with a thought or the blink of an eye, based on research recently conducted by professors at the College. In their work on the "EyePhone" and "NeuroPhone," computer science professor Andrew Campbell and his colleagues used neural signals and gaze tracking to expand hands-free smartphone use.

Campbell's EyePhone was reviewed in Popular Science and the MIT Technology Review on May 24, and he and his colleagues will present their research in India in August.

As the functionality and popularity of mobile phones increase, their sensors can be used to improve utility and build better applications, according to Campbell.

The development of the EyePhone, which uses a phone's camera to start applications by tracking a user's gaze, was prompted by the evolution of multi-functional smartphones, according to Campbell.

"For many years, the PC has been the center of people's lives," he said. "But for the last five years, the mobile phone has almost replaced the PC. The smartphone is going to be the central computer and communication device."

Research and experimentation with eye tracking on mobile devices has been facilitated by the recent accessibility of programmable phones, Campbell said. The ability to configure phone applications made it possible to create algorithms that take into account a phone user's surroundings and movements, he added.

"We've come to a point that it's so easy to program these devices," Campbell said. "Three years ago, you couldn't do this there was no platform. Now I can develop an app and release it to a million people through the App Store."

EyePhone software, which runs on the Nokia N810 smartphone, partitions the phone's screen into a matrix of nine squares and then attempts to detect the eye's gaze within one of these areas, according to Campbell.
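
The screen-partition step can be illustrated in a few lines of Python. This is a rough sketch rather than the researchers' code, and the display resolution is an assumption; it simply maps an estimated eye position to one of the nine squares.

    SCREEN_W, SCREEN_H = 480, 800   # assumed display resolution in pixels
    GRID = 3                        # 3 x 3 matrix of selectable regions

    def region_for_gaze(x, y):
        """Return the (row, col) of the grid square containing a gaze estimate."""
        col = min(int(x // (SCREEN_W / GRID)), GRID - 1)
        row = min(int(y // (SCREEN_H / GRID)), GRID - 1)
        return row, col

    # Example: a gaze estimate near the center of the screen selects the
    # middle square, which could be bound to a particular application.
    print(region_for_gaze(240, 400))   # -> (1, 1)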

"Most smartphones now come with these large screens," Campell said. "When you're not using the phone in a conversation, you're looking directly at the phone, at the applications."

Because the technology cannot track movements of the iris, it uses the eye's relative position to the phone to determine the user's gaze, according to Campbell. Based on this information, the software highlights the application in the segment of the screen associated with the location of the eye. The user can then launch the application by blinking, Campbell said.
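
The highlight-then-blink interaction can likewise be sketched as a simple control loop. This is hypothetical, not the EyePhone implementation: the camera frames, eye detector, blink detector and app launchers are all passed in as placeholders, and region_for_gaze is the grid mapping from the sketch above.

    def eyephone_loop(frames, estimate_eye_position, detect_blink, apps):
        """apps is a 3 x 3 grid of zero-argument launch functions."""
        highlighted = None
        for frame in frames:
            gaze = estimate_eye_position(frame)       # (x, y) in pixels, or None
            if gaze is not None:
                highlighted = region_for_gaze(*gaze)  # grid square being looked at
            if highlighted is not None and detect_blink(frame):
                row, col = highlighted
                apps[row][col]()                      # launch the selected app
                break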

The accuracy of eye tracking to activate phone applications is subject to a user's movements and changes in external conditions, according to Campbell.

"When you're walking around with a cell phone, being able to track eyes is difficult because of the mobility," Campbell said. "I might hold the phone closer or further away, or be exposed to changing light conditions."

In order to optimize precision, the EyePhone employs an algorithm that allows the user to train the system using pictures of the eye, according to Campbell. The software can "learn" to recognize the eye at different distances and in different lighting conditions, he said.
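
A much simplified analogue of that training step (not the actual EyePhone algorithm) is to store normalized example images of the eye and match each new camera frame to the closest stored template; the normalization here is a crude stand-in for handling different lighting levels, and the images are assumed to share one size.

    import numpy as np

    class EyeTemplateMatcher:
        """Stores example eye images and matches new frames to the closest one."""

        def __init__(self):
            self.templates = []   # list of (normalized image, label) pairs

        @staticmethod
        def _normalize(img):
            img = img.astype(float)
            return (img - img.mean()) / (img.std() + 1e-8)   # soften lighting differences

        def train(self, eye_image, label):
            """Store an example captured at a particular distance and lighting."""
            self.templates.append((self._normalize(eye_image), label))

        def match(self, frame):
            """Return the label of the closest stored template, or None."""
            if not self.templates:
                return None
            frame = self._normalize(frame)
            distances = [np.linalg.norm(frame - t) for t, _ in self.templates]
            return self.templates[int(np.argmin(distances))][1]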

Tests of the EyePhone returned results of 60 to 80 percent accuracy, depending on whether or not the user was in motion, according to Campbell. Routine eye movement also influenced the results, as the EyePhone currently cannot distinguish between intentional and natural blinks.

"[In testing it], we used the phone normally," Campbell said. "People may have blinked naturally and that may have been registered as a false positive."

The accuracy of similar software in the future is likely to improve as developers create algorithms that can detect specific iris movements, according to Campbell.

"If you can get it to that point, it becomes a very natural interface," he said. "Smartphones in the end may not be something you carry in your pocket they might be projected onto the wall or hand, and it will make more sense to just gesture or use your eyes."

In addition to gaze tracking, neural signals may be used to increase smartphone functionality. NeuroPhone software, programmed largely by Matthew Mukerjee '10, uses information picked up by electrodes on a wireless EEG headset to make phone calls on an iPhone, according to Campbell.

"Neural signals are probably the most ubiquitous signals in the world," Campbell said. "If you could see them, they'd be everywhere."

The EEG headset, normally used to facilitate communication for disabled individuals and allow video game players to play hands-free, registers brain activity when iPhone users see the image of those they want to call, according to Campbell.
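
A bare-bones sketch of that interaction, assuming a simple thresholded response detector in place of whatever signal processing the real system performs, might flash each contact's photo, average the EEG samples recorded just afterward, and dial the first contact whose response stands out.

    import numpy as np

    RESPONSE_THRESHOLD = 2.5   # assumed z-score cutoff for a recognizable response

    def is_target_response(window, baseline_mean, baseline_std):
        """Return True if the post-flash EEG window is unusually strong."""
        z = (np.mean(window) - baseline_mean) / (baseline_std + 1e-8)
        return z > RESPONSE_THRESHOLD

    def neurophone_dial(contacts, flash_and_record, baseline_mean, baseline_std, dial):
        """Flash each contact's photo, record EEG, and dial the first clear response."""
        for contact in contacts:
            window = flash_and_record(contact)    # EEG samples captured after the flash
            if is_target_response(window, baseline_mean, baseline_std):
                dial(contact)                     # place the call
                return contact
        return None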

Future versions of the software will likely be more practical, using smaller devices to pick up neural signals and applying them to more of the phone's functions, he said.

"No one's gong to be caught dead wearing [the headset] across the Green," Campbell said. "But the earphones of the future will support this functionality, and you'll have two or three electrode centers rather than a clunky headset. Also, the NeuroPhone of the future will be much smarter if you thought of Jay-Z, then the smartphone would download Jay-Z and play it for you."

The project, based on research in machine learning, computer science and neuroscience, was also led by computer science professor Tanzeem Choudhury and Rajeev Raizada, research assistant professor at the Neukom Institute for Computational Science.