Durrell Bishop's 1992 grad project for his design program at the Royal College of Art was a brilliantly conceived riff on the answering machine: a physical, legible interface that made a point of exposing the conceptual workings of the device to its users.
Durrell Bishop is a partner in Luckybite with Tom Hulbert, working on physical interfaces, product design and interactive media. Prior to this he was a senior interaction designer at IDEO Europe. He co-founded Itch, which won a D&AD Gold award for large-scale work on the Science Museum Welcome Wing, and he was a partner in Dancing Dog, working on camera-based interfaces to computer games.
Durrell Bishop’s Marble Answering Machine
(via Timo Arnall)
Timo Arnall from the design studio BERG makes several great and provocative points in his essay "No to NoUI" -- a well-argued piece that opposes the idea of "interfaces that disappear" and "seamless computer interfaces," arguing that by hiding the workings of computers from their users, designers make it harder for those users to figure out what the computers are really doing and to solve the problems that inevitably arise.
Interfaces are the dominant cultural form of our time. So much of contemporary culture takes place through interfaces and inside UI. Interfaces are part of cultural expression and participation, skeuomorphism is evidence that interfaces are more than chrome around content, and more than tools to solve problems. To declare interfaces ‘invisible’ is to deny them a cultural form or medium. Could we say ‘the best TV is no TV’, the ‘best typography is no typography’ or ‘the best buildings are no architecture’?
...We might be better off instead taking our language from typography, and for instance talk about legibility and readability without denying that typography can call attention to itself in beautiful and spectacular ways. Our goal should be to ‘place as much control as possible in the hands of the end-user by making interfaces evident’.
Of course the interfaces we design may become normalised in use, effectively invisible over time, but that will only happen if we design them to be legible, readable, understandable and to foreground culture over technology. To build trust and confidence in an interface in the first place, enough that it can comfortably recede into the background.
No to NoUI
(via Dan Hon)
A joint Disney Research and CMU team has produced a demo showing gesture controls on a variety of everyday, non-computer objects. The system, called Touché, uses capacitive coupling to infer things about what your hands are doing. It can determine which utensil you're eating your food with, or how you're grasping a doorknob, or even whether you're touching one finger to another or clasping your hands together. It's a pretty exciting demo, and the user interface possibilities are certainly provocative. Here's some commentary from Wired UK's Mark Brown:
Some of the proof-of-concept applications in the lab include a smart doorknob that knows whether it has been grasped, touched, or pinched; a chair that dims the lights when you recline into it; a table that knows if you're resting one hand, two hands, or your elbows on it; and a tablet that can be pinched from back to front to open an on-screen menu.
The technology can also be shoved in wristbands, so you can make sign-language-style gestures to control the phone in your pocket—two fingers on your palm to change a song, say, or a clap to stop the music. It can also go in liquids, to detect when fingers and hands are submerged in water.
"In our laboratory experiments, Touché demonstrated recognition rates approaching 100 percent," claims Ivan Poupyrev, senior research scientist at Disney Research in Pittsburgh. "That suggests it could immediately be used to create new and exciting ways for people to interact with objects and the world at large."
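The capacitive-coupling idea described above can be thought of as a classification problem: excite the object with a signal swept across many frequencies, measure the response profile, and match it against profiles recorded for known touch types. Here's a minimal toy sketch of that matching step -- the gesture labels, centroid values, and nearest-centroid comparison are illustrative assumptions, not the actual Touché pipeline, which trained machine-learning classifiers on real sensor data:

```python
# Toy sketch: classify a swept-frequency capacitance profile by
# finding the nearest known gesture profile (Euclidean distance).
# All numbers and labels below are made up for illustration.

def classify(profile, centroids):
    """Return the gesture label whose stored profile is closest
    to the measured capacitance profile."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(profile, centroids[label]))

# Hypothetical per-frequency response profiles for a doorknob sensor,
# one stored profile per known way of touching the knob.
centroids = {
    "no_touch": [0.1, 0.1, 0.1, 0.1],
    "grasp":    [0.9, 0.8, 0.7, 0.6],
    "pinch":    [0.5, 0.6, 0.4, 0.3],
}

reading = [0.85, 0.82, 0.65, 0.58]   # a new swept-frequency measurement
print(classify(reading, centroids))  # matches the "grasp" profile
```

The near-100-percent recognition rates Poupyrev cites come from richer classifiers and real training data, but the shape of the problem is the same: a frequency sweep yields a distinctive profile per grip, and recognition reduces to matching profiles.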
Disney researchers put gesture recognition in door knobs, chairs, fish tanks
Bret Victor's "Brief Rant on the Future of Interaction Design" is an eye-opening look at the poverty of the current options for computer-human interaction. Victor argues that our hands have enormous expressive range and sensitivity, but our devices accept only pokes and swipes from them, and only provide feedback in the form of a little bit of haptic buzzery. He persuasively argues that expanding the repertoire of I/O using hands will produce interfaces that are richer and better:
Notice how you know where you are in the book by the distribution of weight in each hand, and the thickness of the page stacks between your fingers. Turn a page, and notice how you would know if you grabbed two pages together, by how they would slip apart when you rub them against each other.
Go ahead and pick up a glass of water. Take a sip.
Notice how you know how much water is left, by how the weight shifts in response to you tipping it.
Almost every object in the world offers this sort of feedback. It's so taken for granted that we're usually not even aware of it. Take a moment to pick up the objects around you. Use them as you normally would, and sense their tactile response — their texture, pliability, temperature; their distribution of weight; their edges, curves, and ridges; how they respond in your hand as you use them.
There's a reason that our fingertips have some of the densest areas of nerve endings on the body. This is how we experience the world close-up. This is how our tools talk to us. The sense of touch is essential to everything that humans have called "work" for millions of years.
A Brief Rant on the Future of Interaction Design
(via Making Light)