I’ve been waiting for a full-screen touch UI with haptic response. That is, if an application displays a button on the screen, I want to feel it click when I push it with my finger. Similarly, when I nudge an on-screen object, I want to feel its edge.
(Video embedded here; requires Flash Player.)
Imagine the challenges in designing for the kind of hardware depicted in the video, above! It doesn’t exist yet, but I’m ready. I can also imagine haptic icons on mobile-phone handsets, because I heard researchers present their work at the University of British Columbia a few years ago. I can imagine the responsibility and the pressure of being the first to market with haptic icons. The market leader will get to define what OK feels like and what Cancel feels like, for years to come.

The idea is that mobile phones should touch you back, with haptic icons, at the place where your thumb typically rests on the handset. A phone could signal—through touch—that you have a call waiting. Similarly, a camera could signal—through touch—that it’s focused and ready to shoot. Outside the world of portable electronics, a doorknob could signal—through touch—that there are already three people in the room, and a steering wheel could alert you—through touch—that your car needs refuelling.
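To make the idea of a haptic-icon vocabulary concrete, here’s a minimal sketch in TypeScript, assuming a vendor encodes each icon as a vibration pattern of alternating pulse and pause durations in milliseconds (the same shape the web Vibration API uses). Every name and pattern below is hypothetical, not drawn from any real standard.

```typescript
// A haptic icon is a pattern of alternating pulse/pause durations, in
// milliseconds. This mirrors the argument shape of navigator.vibrate().
type HapticIcon = number[];

// A hypothetical vendor-defined vocabulary of haptic icons. Whoever
// ships first effectively gets to define these for everyone.
const hapticIcons: Record<string, HapticIcon> = {
  ok: [40],                               // one short, crisp tap
  cancel: [40, 60, 40],                   // two quick taps: "no"
  callWaiting: [200, 100, 200, 100, 200], // insistent triple buzz
};

// Look up the pattern for a named event, falling back to silence.
function patternFor(name: string): HapticIcon {
  return hapticIcons[name] ?? [];
}
```

The interesting design question is the table itself, not the code: users would learn these patterns by feel, the way we once learned distinctive ringtones.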
Most of us will only get to decide which haptic cue a screen should convey during a particular physical interaction with the user’s fingers. But I’m eagerly awaiting that technology. There are bound to be many common computing tasks where a finger can outperform a mouse.