As Donald Norman said in 1990, “The real problem with the interface is that it is an interface. Interfaces get in the way. I don’t want to focus my energies on an interface. I want to focus on the job…I don’t want to think of myself as using a computer, I want to think of myself as doing my job.”
It’s time for us to move beyond screen-based thinking. When we think in screens, we design based upon a model that is inherently unnatural and inhumane, and that has diminishing returns. It requires a great deal of talent, money and time to make these systems somewhat usable, and after all that effort the software can, sadly, only truly improve with a major overhaul.
The best interface is no interface
A confusing UI element in Apple’s iTunes.
- Clicking the button works as advertised: it shuffles the playlist. But the button bundles two other actions, which may confuse many. It also starts playing the song list, with no indication that playback is part of the action. The other hidden action is skip: while the list is playing, clicking the button again skips the currently playing song.
- There is no indication that the shuffle function is active, and no way to turn it off (no stop), except through a nested action inside the menu bar.
These little details matter when delivering a coherent user experience.
I gave this keyboard, a temporary replacement for my Apple wireless, to my wife to use with her iPad – it’s a neat keyboard in theory, but I was never comfortable typing on it. She now seems pretty happy with it, at least compared to typing on glass.
But for normal people this keyboard, and Apple’s for that matter, presents a UI problem. My wife wants to connect the keyboard to the iPad – at first glance, how does she accomplish this task? There is nothing in this picture that supports that task, that supports her mental model of connecting two devices together.
She knows it’s a bluetooth keyboard, so she looks through the Settings app on the iPad and finds the Bluetooth settings, which show which devices are connected. No luck there.
So she asks.
I tell her she has to pair the devices. There is no UI to support that label. Because I have previously learned the UI from numerous other bluetooth devices, I come over, try long-pressing the bluetooth icon, and eventually we are in action. None of that is at all obvious or learnable without outside guidance. Most bluetooth devices – especially the Apple BT keyboard I have, which has you long-press the power button (!) – allow users to fail at this basic task.
If we aim for a minimalistic or very simple UI, the task should be automatic, like with Apple’s new EarPods. If we are unable to accomplish that, then we need more obvious UI that directs the user to task completion – like a button labelled connect/pair, or an icon with a universally accepted connect/pair meaning. The bluetooth icon is slowly becoming that symbol, but I bet most people would not recognize it as such in tests.
Because every person knows what he likes, every person thinks he is an expert on user interfaces.
Something found in my notes. How much closer are we today (this was from 2008)?
By 2020 the terms “interface” and “user” will be obsolete as computers merge ever closer with humans. It is one prediction in a Microsoft-backed report drawn from the discussions of 45 academics from the fields of computing, science, sociology and psychology.
It predicts fundamental changes in the field of so-called Human-Computer Interaction (HCI).
By 2020 humans will increasingly interrogate machines, the report said.
In turn computers will be able to anticipate what we want from them, which will require new rules about our relationship with machines.
Computers to merge with humans
Perhaps due to the influence of poor UI design in Windows-based software, I often come across a common mistake: superfluous and poorly-thought-out dialog boxes. In addition to the maxim below, I believe we should avoid creating error dialogs when an undo will do. Unfortunately, the essential undo function is still often forgotten.
Dialog boxes should be action-oriented; they should help guide users towards what their next step is likely to be, and provide them with the information that they need in order to accomplish that next step.
He doesn’t state it strongly enough. Unfortunately, no matter how strongly or loudly you state this fact, often it goes unheard.
Typically, the burden is on the user to learn how a software application works. The burden should be increasingly on the system designers to analyze and capture the user’s expectations and build that into the system design. Norman, 1988
So often in my recent experience, complexity is the selling point, the starting point, and/or the proof of your value. People (customers) don’t share this vision. People are intelligent, but must be set free to construct the level of complexity they are comfortable with, or need.
Complexity isn’t designed but rather rises spontaneously through self-organisation. Start with basic or simple interactions and allow more complex behaviours or patterns to emerge.
From an old project proposal, source is likely from theory of emergence.
No medium has managed to reach the status of genuine artistry without offending some of its audience some of the time. Even under the user-friendly dictates of interface design, you can’t make art without a good measure of alienation.
Steven Johnson, Interface Culture (HarperEdge, 1997)
My current projects all involve dealing with issues of featureitis: software with simple uses, but with a monstrous amount of controls and options. It’s well-designed software created by brilliant, nice people, but many have fallen into the belief that more UI controls, more options and more visible data somehow make software more desirable. This is of course a long-held problem, rooted not just in software (à la Microsoft Windows) but in Western society itself. As far as interface design is concerned, I know from experience that more choice as a feature seldom works: complexity leads to more complexity, and more choice leads to dissatisfaction.
Is lots of choice a good thing?
You see, it turns out that whilst people will invariably ask for more choice, lots of choice is not really a good thing, for the following reasons:
- More choice means more options for people to consider, and a greater cognitive workload to do so, as all the different options are weighed up and evaluated.
- With lots of choice the burden of responsibility is placed on the person making the choice, rather than those drawing up the choices. If a bad choice is made it’s because someone chose the wrong option, not because a poor set of options were made available.
- More choice means greater expectations, and a greater probability of not meeting those expectations. With so many options available, people will expect there to be one that is exactly what is needed, and will no doubt be disappointed when they fail to find it.
- More choice means less engagement. Sometimes people would rather not take part than have to go through a million and one different options. For example, an interesting study showed that for every 10 additional investment funds an employer offered in their pension scheme (e.g. 10, 20, 30, 40 different funds, and so on), uptake fell by 2%. Employees were put off participating because they didn’t want to have to select from so many different options.
Ideally, we would focus entirely on those features or controls that users need to accomplish their goals, and, perhaps through progressive disclosure, keep all the complexity hidden from all but the most advanced users. My experience is that this is far more difficult than it ought to be, but it’s a challenge worth engaging in.
Below is a TED talk where psychologist Barry Schwartz takes aim at a central tenet of western societies: freedom of choice. In Schwartz’s estimation, choice has made us not freer but more paralyzed, not happier but more dissatisfied.
Dino Ignacio’s supercut of all the moments in A New Hope where characters interacted with machines, doors, screens, levers, knobs and buttons.
… graphic interfaces are more about telling a good story than conveying real information. Our ultimate goal is to create screens that feel credible and authentic to the spirit of the story, and if they achieve that, we’ve done our job well.
Some nice ideas, but in practice I doubt I would use them. People prefer to do most of their interaction within the app itself. Notifications are simply alerts and as such don’t need to be interacted with. Launching the app that sent the alert is only a swipe away; that should be efficient enough.
Ultimate Guide to Website Wireframing
A Beginner’s Guide to Wireframing
Beyond Wireframing: The Real-Life UX Design Process
The Importance of Tying Personas to Wireframes
Top Wireframing Tools Every Designer Needs to Consider
Sketchboards: Discover Better + Faster UX Solutions
Why Design Documentation Matters
A video detailing how Omni designs its own apps, “starting with a quick iPad sketch and ending up at a pixel-perfect, interactive design”. I may revisit using OmniGraffle but for the short term at least I’m invested in using Sketch.
This is not seat-of-your-pants exciting, but I always enjoy listening to how other teams approach interface design.
An interface is humane if it is responsive to human needs and considerate of human frailties. We make mistakes. No matter how hard we try to concentrate and prevent errors, errors will happen when our concentration wanes or when we are forced to do something that is beyond our cognitive abilities, like multi-tasking: the act of consciously thinking about two things at once. And, with the use of Queueing Theory and Little’s Law, we learn that multi-tasking leads to lower productivity.
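Little’s Law says that the average number of items in a system equals the arrival rate times the average time each item spends in it (L = λ × W). Below is a minimal sketch of how it applies to multi-tasking; the context-switching penalty here is a hypothetical figure I’ve chosen for illustration, not something from the text:

```python
# Little's Law: average time in system W = L / lambda, where L is the number
# of items in progress and lambda is throughput (items completed per unit time).

def cycle_time(wip, throughput):
    """How long each task takes on average, given work-in-progress and throughput."""
    return wip / throughput

def effective_throughput(base, wip, switch_cost=0.2):
    """Hypothetical model: each concurrent task beyond the first loses
    `switch_cost` of our capacity to context switching."""
    return base * max(0.0, 1 - switch_cost * (wip - 1))

# One task at a time: full capacity, shortest cycle time.
single = cycle_time(1, effective_throughput(1.0, 1))
# Three tasks "at once": more work in progress AND less capacity,
# so every task takes disproportionately longer to finish.
multi = cycle_time(3, effective_throughput(1.0, 3))

print(single)  # 1.0
print(multi)   # 5.0  (3 tasks / 0.6 effective throughput)
```

Even with a modest switching penalty, tripling the work in progress makes each task take five times as long in this toy model, which is the productivity loss the paragraph above alludes to.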
A humane interface design philosophy:
1. It’s the fault of the interface, not you.
The main thing you have to remember—and please remember this, because it could be vital to your sanity—is that any problems you have with an interface are not your fault. If you have trouble using your microwave, it’s not because you’re “not good with technology”, it’s because the people in charge of designing the interface for that microwave didn’t do their job right. User interface design is incredibly hard, and carries with it a great deal of responsibility; this is something that’s taken quite seriously when it comes to life-critical systems such as flight control software. But in today’s consumer culture, what should be blamed on bad interface design is instead blamed on the “incompetence” of users. Just remember that it’s not your fault.
2. Don’t take something simple and make it complex.
Some tasks—for instance, teaching a child arithmetic—are intrinsically pretty complicated. But some aren’t. Setting the time on a wristwatch, for instance, shouldn’t be that hard; on old analog wristwatches, it basically involved pulling out a knob, twisting it until the watch showed the correct time, and pushing the knob back in again. But on newer digital wristwatches—ones that claim to be more powerful and feature-loaded than their analog counterparts—it involves pressing a series of buttons in a hard-to-remember, often unforgiving order. Most people dread setting the time on their digital watches, and for good reason.
It’s right and proper for complicated tasks to take time and expertise to accomplish. But something that is fundamentally simple—like changing the time on a wristwatch—should stay simple.
3. Fewer choices are better than many.
People love having choices, because having choices means having freedom. Well, we don’t think this is necessarily a good thing when it comes to usability. We believe that when someone wants to do something on their computer, they want to spend their time doing it, not deciding how to do it. For instance, Microsoft Windows provides you with at least three different ways to launch applications and services on your computer: desktop icons, a quick-launch bar, and a Start Menu. Each one of these mechanisms is useful in one or two situations but horrible in others, and each has completely different instructions for operation. Microsoft even gives you a wealth of choices to configure them the way you want, which makes the situation that much more complex.
When we can, we try to avoid burdening our users with choices like this: we’d rather just take the time to make one simple mechanism that the user can use for all their purposes. Because the less burdened a user’s mind is with irrelevant decisions, the more clear their mind is to accomplish what they need to get done.
4. Reliability is sacred.
It’s that simple, really. When one ensures that a machine can’t lose a user’s work, interfaces become a lot simpler; no more dialog boxes asking questions like “Are you sure you want to delete that entry?”; no more remembering to click a “Save” button like it’s a nervous twitch. You never need to regret any action you take, because any action you take can instantly be undone. Not to mention your complete lack of terror when you’re in the middle of working on your computer and the power goes out.
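The principle above can be sketched in a few lines: instead of interrupting the user with “Are you sure?”, every destructive action records its own inverse so it can be reversed instantly. The `Document` class and its methods here are purely illustrative, not from any real API:

```python
# A minimal sketch of the undo-over-dialog idea: record an inverse action
# for every change, so no confirmation dialog is ever needed.

class Document:
    def __init__(self):
        self.entries = []
        self._undo_stack = []  # each item is a function that reverses one action

    def add(self, entry):
        self.entries.append(entry)
        self._undo_stack.append(lambda: self.entries.remove(entry))

    def delete(self, entry):
        index = self.entries.index(entry)
        self.entries.pop(index)
        # Deletion is safe without a dialog, because it can be undone instantly.
        self._undo_stack.append(lambda: self.entries.insert(index, entry))

    def undo(self):
        if self._undo_stack:
            self._undo_stack.pop()()  # pop the last inverse action and run it

doc = Document()
doc.add("draft")
doc.delete("draft")   # no "Are you sure you want to delete?" prompt
doc.undo()            # the entry is instantly restored
print(doc.entries)    # ['draft']
```

The design choice is the point: once every action is reversible, the interface no longer needs to make the user hesitate before acting.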
5. Your train of thought is sacred. Don’t break the flow.
You can only really think about one thing at a time. If you’re thinking about paying your taxes, you can’t be thinking about your vacation in Tahiti. Indeed, thinking about that vacation in Tahiti will actively prevent you from thinking about your taxes. That’s why when you want to get something done, you want to get everything out of your head except the task at hand.
Quite simply, you need to preserve your train of thought. And that means that the interface you’re using can’t derail it. No talking paper clips bothering you from the sidelines, no fiddling with windows to find your work, no distractions.
6. Good interfaces create good habits.
When you’re first learning how to use even the best of interfaces, preserving your train of thought can be hard because so much of your mind is focused on how to use the interface, rather than on what you need to do. But as you become more proficient at using a good interface, it eventually becomes second nature—it becomes a habit, like walking or breathing. You don’t need to think about what sequence of motions you need to perform an action because it’s like your hands have memorized them as a single continuous gesture, saving you the trouble of having to think about them.
Bad interfaces, on the other hand, prevent habits from forming—but they can also make you form bad habits. Have you ever closed a window and hit “Do Not Save”, only to realize a split second too late that it was exactly what you didn’t want to do? That’s a bad habit from a bad interface.
Good interfaces make forming good habits really easy, and they make forming bad habits nearly impossible.
7. Modes cause misery.
There exists a mortal enemy to your habits and your train of thought: it’s called a mode. If an interface has modes, then the same gesture that you’ve habituated performs completely different actions depending on which mode the system is in. For instance, take your Caps Lock key; have you ever accidentally pressed it unknowingly, only to find that everything you type LOOKS LIKE THIS?
When that happens, all that habituation you’ve built up about how to type on a keyboard gets subverted: it’s like your computer has suddenly turned into a completely different interface with a different set of behaviors. And that derails your train of thought, because you’re suddenly confused about why your habits aren’t producing what you expect them to.
When you think about it, almost everything that frustrates us about interfaces is due to a mode. That’s why good interfaces have as few as possible.
8. It’s easy to learn.
Good interfaces aren’t just effortless to use once you know them—they’re also easy to learn to use. This doesn’t necessarily mean that someone should be able to use it without any instruction, though—it just means that knowing how to use any feature of the interface involves learning and retaining as little information as possible. Keep it simple, and keep it consistent.
9. An interface should be attractive and pleasant in tone.
How messages are phrased is important; how the interface looks is also important. But these are of secondary importance in terms of task completion. It used to be said that the Mac OS X interface looked so good it was lickable.
They are road signs for your daily rituals — the instantly recognised symbols and icons you press, click and ogle countless times a day when you interact with your computer. But how much do you know about their origins?
A short video by Luke Wroblewski on how to deliver the best experience possible across multiple devices. More excellent videos featuring Luke, in collaboration with Intel UX, can be seen here.
From Donald Norman’s Affordances and Design essay comes an excellent set of conventions to help guide the design of screen-based products in a way that will help users understand what actions are possible. Like most design conventions, each has both virtues and drawbacks:
1. Follow conventional usage, both in the choice of images and the allowable interactions.
Convention severely constrains creativity. Following convention may also violate intellectual property laws (hello Samsung et al). Sometimes we wish to introduce a new kind of action for which there are, as yet, no accepted conventions. On the whole, however, unless we follow the major conventions, we are doomed to fail.
2. Use words to describe the desired action.
This is, of course, why menus can be relatively easy to understand: the resulting action is described verbally. Words alone cannot solve the problem, for there still must be some way of knowing what action and where it is to be done. Words can also cause problems with international adoption. It is also the case that words are understood more quickly than graphics — even a well known, understood graphic. Words plus graphics are even more readily understood.
3. Use metaphor.
Metaphor is both useful and harmful. The problem with metaphor is that not all users may understand the point. Worse, they may take the metaphor too literally and try to do actions that were not intended. Still, this is one way of training users.
4. Follow a coherent conceptual model so that once part of the interface is learned, the same principles apply to other parts.
Coherent conceptual models are valuable and, in my opinion, necessary, but there still remains the bootstrapping problem; how does one learn the model in the first place?
Though written over twenty years ago, the logic is still valid today; many of my chief complaints with iOS 7 could be solved by following the above. For complete detail, refer to the original article.
Showreel covering Territory’s UI concepts, design and animation for both on-set playback and VFX shots for Guardians of The Galaxy.
Beautiful work, but like all UI created for movies, overly complex. Complexity lends itself well to delivering an idea of technological advancement: what we don’t understand is inherently advanced (and few understand on-screen UI displays).
Luke Wroblewski shares some very clever design tricks that will make your app feel much quicker.