Following on from yesterday’s post about Joe Clark’s frustrations with various aspects of iPhone interface design (and smartphone design more broadly, I think), here are a few more.
First, Craig Mod on the new iPads — amazing hardware, infuriating software.
Getting the iPad to Pro
The problems begin when you need multiple contexts. For example, you can’t open two documents in the same program side-by-side, allowing you to reference one set of edits while applying them to a new document. Similarly, it’s frustrating that you can’t open the same document side-by-side. This is a weird use case, but until I couldn’t do it, I didn’t realize how often I did do it on my laptop. The best solution I’ve found is to use two writing apps, copy-and-paste, and open the two apps in split-screen mode.
Daily iPad use is riddled with these sorts of kludgey solutions.
Switching contexts is also cumbersome. If you’re researching in a browser and frequently jumping back and forth between, say, (the actually quite wonderful) Notes.app and Safari, you’ll sometimes find your cursor position lost, the Notes.app document you were just editing occasionally resetting to the top of itself. For a long document, this is infuriating and makes every CMD-TAB feel dangerous. It doesn’t always happen; the behavior is unpredictable, which makes things worse. This interface “brittleness” makes you feel like you’re using an OS in the wrong way.
How we use the OS, the user interface, is key. Here’s Bret Victor on why future visions of interface design are missing a huge trick – our hands are more than just pointy fingers.
A brief rant on the future of interaction design
Go ahead and pick up a book. Open it up to some page. Notice how you know where you are in the book by the distribution of weight in each hand, and the thickness of the page stacks between your fingers. Turn a page, and notice how you would know if you grabbed two pages together, by how they would slip apart when you rub them against each other.
Go ahead and pick up a glass of water. Take a sip. Notice how you know how much water is left, by how the weight shifts in response to you tipping it.
Almost every object in the world offers this sort of feedback. It’s so taken for granted that we’re usually not even aware of it. Take a moment to pick up the objects around you. Use them as you normally would, and sense their tactile response — their texture, pliability, temperature; their distribution of weight; their edges, curves, and ridges; how they respond in your hand as you use them.
There’s a reason that our fingertips have some of the densest areas of nerve endings on the body. This is how we experience the world close-up. This is how our tools talk to us. The sense of touch is essential to everything that humans have called “work” for millions of years.
Now, take out your favorite Magical And Revolutionary Technology Device. Use it for a bit. What did you feel? Did it feel glassy? Did it have no connection whatsoever with the task you were performing?
I call this technology Pictures Under Glass. Pictures Under Glass sacrifice all the tactile richness of working with our hands, offering instead a hokey visual facade.
And that was written in 2011. We’ve not got any further.
The YouTube video he links to isn’t there anymore, but this one from Microsoft works just as well.