As eReading devices and the software that runs them become more advanced in an increasingly competitive market, researchers are creating applications that could take reading to a whole new level, with tools such as Text 2.0—a reading technology that personalizes the user’s experience by tracking eye movements.
Text 2.0 uses infrared light and a camera to track eye movement across a screen, and it uses this information to infer a user’s intentions during the course of reading.
For example, taking more time to read certain words, phrases, or names could trigger the appearance of sound effects, footnotes, translations, biographies, definitions, or animations. If the user begins skimming the text, the tracker will begin fading out words it deems less important to the text. If the reader glances away, a bookmark automatically appears, pointing to where the user stopped reading.
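The behaviors described above (dwell-triggered annotations, fading while skimming, and an automatic bookmark) amount to a simple dispatch on gaze events. The sketch below is a hypothetical illustration, not DFKI’s actual code; the thresholds and names (`DWELL_TRIGGER_MS`, `react_to_fixation`) are invented:

```python
from dataclasses import dataclass

# Invented thresholds -- real eye-tracking software would tune these per user.
DWELL_TRIGGER_MS = 600      # a long fixation on a word triggers an annotation
SKIM_WORDS_PER_SEC = 8.0    # faster than this, treat the reader as skimming

@dataclass
class Fixation:
    word: str
    duration_ms: float      # how long the gaze rested on the word
    on_screen: bool         # False if the reader glanced away

def react_to_fixation(fix: Fixation, words_per_sec: float, last_word: str) -> str:
    """Map one gaze sample to an action, mimicking the behaviors above."""
    if not fix.on_screen:
        return f"bookmark:{last_word}"      # reader looked away: mark the spot
    if fix.duration_ms >= DWELL_TRIGGER_MS:
        return f"annotate:{fix.word}"       # show footnote/definition/animation
    if words_per_sec >= SKIM_WORDS_PER_SEC:
        return "fade-unimportant-words"     # reader is skimming
    return "none"
```

A real system would add smoothing over many samples, since raw gaze data is noisy, but the decision logic can stay this small.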
Many new technologies now surfacing experiment with hands-free control of software and devices. One example is electrocorticography (ECoG), in which a sheet of electrodes is laid directly on the surface of the brain to allow “mind typing” and other computer activities driven solely by brain signals. Compared with those, some observers say Text 2.0 strikes a just-right balance of futuristic and feasible, and it is already generating interest from major companies.
One of these is reported to be Apple Inc., a company known for betting early on technologies that became highly profitable, such as the mouse, which it popularized with the Macintosh in 1984, and the touch screen, which it brought to the mass market with the iPhone in 2007.
A 2007 patent filing by Wayne Westerman and John Elias, co-founders of Fingerworks, the firm Apple acquired during development of the original iPhone, details a handful of these newly proposed inputs under the title “Multi-Touch Data Fusion.”
The two engineers note in the filing that while touch technology gives users more control, fusing in information from other “sensing modalities” can enhance a device or improve its overall ease of use.
These sensing modalities can include voice fusion, finger ID fusion, facial expression fusion, biometrics fusion, and Gaze Vector fusion—a technology from Tobii Technology.
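To see what such fusion might buy, here is a toy sketch, not drawn from the patent, in which a gaze point disambiguates a touch that lands between two targets. The function name, weighting scheme, and coordinates are all assumptions:

```python
import math

def fuse_touch_with_gaze(touch, gaze, targets, gaze_weight=0.5):
    """Pick the target whose center is closest to a weighted blend of the
    touch point and the gaze point. gaze_weight=0 is touch-only selection."""
    bx = touch[0] * (1 - gaze_weight) + gaze[0] * gaze_weight
    by = touch[1] * (1 - gaze_weight) + gaze[1] * gaze_weight
    return min(targets, key=lambda name: math.dist(targets[name], (bx, by)))

targets = {"Save": (100, 40), "Delete": (140, 40)}
# An ambiguous press midway between the buttons; gaze near "Save" breaks the tie.
fuse_touch_with_gaze(touch=(120, 40), gaze=(100, 42), targets=targets)
```

The design choice here is that touch remains primary and gaze only nudges the selection, which matches the filing’s framing of gaze as a supplementary modality rather than a replacement for touch.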
Microsoft Corp. has also recently been backing Tobii’s technology.
At this year’s Computex, an information technology trade show held in Taipei, Microsoft demonstrated Tobii’s eye control. Participants were seated in front of a screen and told simply to “interact by using your eyes.”
“Windows 7 opens a world of opportunity for hardware manufacturers, software developers, and service providers, and Computex gives us an opportunity to demonstrate what is possible,” said Murray Vince, general manager of the Original Equipment Manufacturer division for Microsoft. “Tobii’s eye-tracking and eye-control products provide a natural user interface and enable rich new computing scenarios; we are thrilled to be collaborating with them.”
According to Amadeus Capital investor Jeppe Zink, Tobii also has sold its technology to internet portal companies.
Tobii’s eye-tracking products cost roughly $7,000 to $35,000, depending on the sophistication of the product’s software. Besides Text 2.0, which was developed at DFKI (the German Research Center for Artificial Intelligence), the company says its technology can be used in online advertising, gaming, car safety, and 3-D displays, as well as to help people with disabilities.
“Computer manufacturers are working intensively to integrate new and intuitive interaction interfaces,” said John Elvesjo, founder and chief technology officer of Tobii Technology, in a statement. “Eye control is one such technology. Tobii’s eye-controlled computers are already used by thousands of people with physical and speech impairments around the globe and will, in a near future, become a natural part of a regular PC environment. To reach this point, it is essential that we collaborate with major players.”
Although Text 2.0 might sound like an intriguing technology, critics are questioning how helpful it might really be.
Some wonder whether automatic visual or special effects will be too distracting. They also question whether the software can really determine which words are “less important” when a reader skims. And will eye-tracking software open the door to advertisements tied to the topics a reader is perusing? For example, if a user reads about Julia Child, will cutlery ads appear when the reader’s eyes reach the word “knife”?
Many say it’s too soon to tell how the technology will be implemented.
And while eye-tracking technology is still in its infancy, DFKI recently put its Processing Easy Eye-tracker Plug-in (PEEP) to practical use, combining it with WebKit’s 3-D capabilities to create a window-manipulation system called “gaze-controlled tab expose.”
In other words, computer users can use their eyes to pull up internet tabs in 3-D.
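At its core, such a system must repeatedly map the current gaze point to the nearest tab thumbnail so it knows which tab to bring forward. A minimal sketch of that lookup, with an invented layout and function name (PEEP’s real API may differ):

```python
def nearest_tab(gaze_xy, tab_centers):
    """Return the index of the tab thumbnail closest to the gaze point:
    the lookup a gaze-controlled tab expose needs on every gaze sample."""
    gx, gy = gaze_xy
    return min(
        range(len(tab_centers)),
        key=lambda i: (tab_centers[i][0] - gx) ** 2 + (tab_centers[i][1] - gy) ** 2,
    )

# Four thumbnails laid out in a 2x2 grid; gaze falls in the lower-right quadrant.
centers = [(200, 150), (600, 150), (200, 450), (600, 450)]
nearest_tab((580, 430), centers)  # selects the lower-right tab (index 3)
```

In practice the selection would be debounced (requiring the gaze to dwell on a thumbnail for a moment) so that a stray glance does not switch tabs.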
PEEP is free to download and can be used in any eye-tracking project.