Touchscreen Touch Feedback

March 16th, 2012
ideas, tech
The tablet and smartphone are incredibly flexible because they are just rectangles of multitouch display glass. By getting rid of the physical buttons of earlier designs, they give the application designer full control over the interface. We made the same jump with the switch to the mouse and GUI: instead of simply repurposing the buttons on the keyboard, we could place labeled buttons in any arrangement we wanted.

When we added the mouse we kept the keyboard for its typing speed, but with touchscreens, eliminating the keyboard is now a serious option: phone keyboards have never been that good, and multitouch screens aren't that bad for typing. Still, the pictures-under-glass model suffers from a lack of feedback. When you type on a physical keyboard you can feel the edges of the keys, and your fingers calibrate until the keyboard becomes an automatic extension of your body. This doesn't happen on a screen.

It's especially apparent with digital instruments: while there are many great musical iPhone apps [1], and the touchpad on a MacBook makes a nice MIDI controller, you need to be constantly looking at the screen to keep from getting lost. With a guitar or piano your fingers quickly learn to do most of the fine-tuning of position on their own, letting you play much faster than if you had to keep your eyes in the loop.

How do we get the eyes out of the loop? In particular, how do you give feedback richer than a simple right-or-wrong, so that as the fingers start to stray they can adjust back unconsciously? The best we have so far is a click sound or short vibration after a successful button press, but that can't hint at what you need to fix or where you're in danger of going wrong. Instead we fall back on visual feedback, but the fingers get in the way, and it doesn't combine well with muscle memory.
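To make that concrete, here's a minimal sketch in Python of what graded feedback could look like: instead of one click on success, vibration strength grows as the finger drifts from the center of the key it started on. The vibrate() callback and the key geometry are hypothetical; this is the feedback policy, not the hardware.

    import math

    KEY_RADIUS = 30.0  # half a key's width, in pixels (made up)

    def drift_feedback(touch_x, touch_y, key_cx, key_cy, vibrate):
        """Map distance from the key's center to a 0..1 vibration strength."""
        dist = math.hypot(touch_x - key_cx, touch_y - key_cy)
        # Silent while well inside the key; ramps up toward and past the edge.
        strength = max(0.0, min(1.0, (dist - KEY_RADIUS / 2) / KEY_RADIUS))
        vibrate(strength)

    # A finger 40px off-center gets a strong warning buzz:
    drift_feedback(140, 200, 100, 200, lambda s: print("vibrate", round(s, 2)))

The particular ramp doesn't matter; what matters is that the signal is continuous rather than binary, so the fingers get a warning while there's still time to adjust.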

I want something better: a precise way to get a feeling into the fingers from the surface of the device. It should be immediate, but it's probably fine if it doesn't feel much like anything we're used to: we don't need to imitate the feel of a button, just get some of its advantages. Either we need to trigger the finger's natural sense of touch by deforming the surface, or we need some way to act on the finger at a distance. An array of raisable pins, like in a braille display? Strong charges of static electricity just below the screen, tingling in the fingers? [2] Direct electrical stimulation through tiny contacts? A major trade-off here is whether you get to keep the surface as a simple pane of sturdy glass: tiny pins probably make for a much worse image. On the other hand, I don't know enough of the physics to say whether a finger-tingling voltage in close proximity to the rest of a touchscreen is practical. Are there other ways to trigger the nerves of the finger remotely?

One possibility is not to worry about any of this and instead write smart software that takes a noisy, imprecise input stream and sorts it out: Siri, Swype, gestures with the mouse or finger. Either we give people better feedback so we can be more precise, or we teach devices enough that we don't have to.
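As a toy sketch of that approach, here's roughly how a Swype-style decoder could work, assuming each touch is the intended key's center plus Gaussian noise: score every candidate word by how well its letters explain the touch sequence, weighted by a word-frequency prior, and pick the winner. The key layout, wordlist, and noise model here are all made up for illustration.

    import math

    KEY_CENTERS = {  # (x, y) centers of a few keys, in key-width units
        'a': (0, 1), 'e': (2, 0), 'r': (3, 0), 't': (4, 0), 'c': (3, 2),
    }
    WORD_FREQ = {'car': 0.6, 'cat': 0.3, 'care': 0.1}  # toy prior
    SIGMA = 1.0  # assumed touch noise, in key widths

    def touch_log_likelihood(touch, key):
        """Log-probability of a touch given an intended key, Gaussian noise."""
        kx, ky = KEY_CENTERS[key]
        return -((touch[0] - kx) ** 2 + (touch[1] - ky) ** 2) / (2 * SIGMA ** 2)

    def decode(touches):
        """Pick the word that best explains a sequence of touch points."""
        best_word, best_score = None, -math.inf
        for word, freq in WORD_FREQ.items():
            if len(word) != len(touches):
                continue
            score = math.log(freq) + sum(
                touch_log_likelihood(t, ch) for t, ch in zip(touches, word))
            if score > best_score:
                best_word, best_score = word, score
        return best_word

    # Three sloppy touches, nearest to c, a, r:
    print(decode([(2.8, 1.7), (0.4, 0.9), (3.4, 0.2)]))  # -> car

With a real dictionary and language model, this is essentially how gesture keyboards cope with the missing tactile feedback: the software absorbs the imprecision so the fingers don't have to.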


[1] I say iPhone because you can't make them for Android: the audio latency is too high for real-time sound.

[2] You could put magnets in the fingers and then have the device wiggle them, but we'd like something that just works for everyone without requiring biomods.
