The Apple (i)Phone has been presented as a very sexy gadget, but I suspect its real significance lies beneath the flash, in its multi-touch interface. Like many people, I assumed that this was based on the work of Jeff Han and his team at New York University’s Courant Institute, which he demonstrated at a recent TED conference. However, a bit of digging around suggests that the basis of this UI is much more solid, resting on work that has already resulted in real products (iGesture and TouchStream), much loved by their users.
The seeds of these products began their life in a PhD thesis by Wayne Westerman at the University of Delaware, supervised by Professor John Elias. The two went on to found Fingerworks, which was later bought by Apple, though the purchase is shrouded in a certain amount of mystery.
The significance of their work is summed up in this extract from a press release in September 2002:
“Elias said the communication power of their system is ‘thousands of times greater’ than that of a mouse, which uses just a single moving point as the main input. Using this new technology, two human hands provide 10 points of contact, with a wide range of motion for each, thus providing thousands of different patterns, each of which can mean something different to the computer.
“While much about the computer has changed over the last three decades (greater power, faster speeds, more memory), what has not changed is the user interface.
“‘For what it was invented for, the mouse does a good job,’ Elias said. ‘People accept the mouse and the mechanical keyboard because that’s the way it is. But there are limitations in terms of information flow. There is so much power in the computer, and so much power in the human, but the present situation results in a communications bottleneck between the two.’
“Elias and Westerman have a better idea. ‘I believe we are on the verge of changing the way people interact with computers,’ Elias said. ‘Imagine trying to communicate with another human being using just a mouse and a keyboard. It works, but it is slow and tedious.’
“Elias said he could envision in the next 10 years ‘a very complex gestural language between man and machine.’”
This would seem to answer a complaint about computer technology raised by Brian Eno in an interview in Wired twelve years ago:
“What is pissing me off about this thing? What’s pissing me off is that it uses so little of my body. You’re just sitting there, and it’s quite boring. You’ve got this stupid little mouse that requires one hand, and your eyes. That’s it. What about the rest of you?”
If Apple pushes forward with this technology, as I think it must, we may come to see its phone as even more significant than the Mac in changing the way we interact with computers. Just as the Mac, regarded by many as just a toy when it was introduced, brought the insights of the team at Xerox PARC to the mass market, the iPhone, or whatever it will be called, will shift in perception from being just a sexy, desirable gadget to being seen as the forebear of a range of devices that engage much more of our body and more of our mind.