Some things require no explanation. If you are this close to a crocodile you don’t move or, if you do, you back away sloooooowly. No signs or shouted instructions are necessary.
An ideally “intuitive” user interface requires an equal lack of training. Users see the UI and know what to do (and, hopefully, the consequences of getting it wrong are less drastic than with the crocodile). The primary resource users have when working with a new UI is what they already know about working with software programs (just as your primary resource with the crocodile is your, or others’, previous experience with nasty animals). As I’ve said elsewhere, most of your users are UI experts, at least in the sense that they have used lots of software products and absorbed their UI conventions (I’ll call what your users have learned from working with other applications “virtual affordances”). What you shouldn’t count on is users translating what they know from the real world into how to interact with your UI (I’ll refer to that knowledge as “physical affordances”).
One example of the difference between virtual affordances and physical affordances is demonstrated by the diagonal lines that appear in the lower right-hand corner of some windows in the Windows operating system. I doubt very much that anyone saw those lines as something that could be “gripped” and pulled…and even if they did, they didn’t try to grasp that corner between their fingers and pull it. That kind of physical affordance wasn’t invoked.
What did happen is that users noticed that one corner of the window looked different from the rest. Generally speaking, users know that different visual features in a window’s border signal some kind of different functionality. They also know that useful information is often displayed when the mouse hovers over something. With those two virtual affordances in mind, users move their mouse to that part of the window to see what happens. Once the mouse arrives at the window’s corner, the pointer changes to a double-headed arrow, which users instantly recognize and know how to use. Users then test their assumption by clicking and dragging to change the size of the window. In Windows 10, those lines have been reduced to a triangular collection of six dots and still work equally well with new users: the virtual affordances function equally well with any distinguishing mark for the “special” corner of the window.
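The mechanics behind that interaction are simple enough to sketch. The following TypeScript is a minimal, hypothetical illustration (the `Win` interface, `cursorAt` function, and `GRIP_SIZE` constant are my inventions, not any real windowing API): the UI hit-tests the pointer against the marked corner region and swaps in the resize cursor, which is the feed-forward signal the user acts on.

```typescript
// Hypothetical sketch: hit-testing the "special" corner of a window.
// All names here are illustrative, not from a real windowing API.
interface Win {
  x: number;
  y: number;
  width: number;
  height: number;
}

// The size, in pixels, of the marked corner region (the diagonal
// lines, or the six dots in Windows 10).
const GRIP_SIZE = 16;

// Returns the cursor the UI should show at pointer position (px, py):
// a double-headed resize arrow over the grip, the default arrow elsewhere.
function cursorAt(win: Win, px: number, py: number): string {
  const inGrip =
    px >= win.x + win.width - GRIP_SIZE && px <= win.x + win.width &&
    py >= win.y + win.height - GRIP_SIZE && py <= win.y + win.height;
  return inGrip ? "nwse-resize" : "default";
}
```

The point of the sketch is that the affordance costs almost nothing to implement: one small hit-test region and a cursor change are enough to tell the user what dragging will do before they commit to it.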
But even that description conceals an enormous number of virtual affordances on the part of these hypothetical users. In an early paper on UI design, Jef Raskin described finding an “intelligent, computer-literate, university-trained teacher” who had never seen a mouse before. Raskin asked the teacher to use a mouse in conjunction with a children’s program called Manhole. The user, drawing on experience with joysticks, fought with the mouse for a couple of minutes but, in the end, had to ask how the mouse was to be used. After a quick demonstration of the mouse, its button, and its relationship with the on-screen cursor, the user was able to work with the program, drawing on virtual affordances learned from using joysticks. Similarly, in my earlier example of dragging the corner of a window, our users already knew about mice, pointers, and a vast variety of other virtual affordances.
Obviously, as a UX designer, your job is to leverage those virtual affordances in your UIs. But rather than thinking about “affordances,” a more useful concept is “feed forward design patterns,” a critical part of Learning Tree’s User Experience (UX) Design for Successful Software course. These patterns are the parts of a UI that tell us what will happen next if we do the right thing. They so quickly become ingrained that we’re often unaware they even exist. I suspect, for example, that the users who discovered how to drag their window didn’t have any more conscious thought than an initial “What’s that?” when they noticed the diagonal lines. Faced with a crocodile, you may think many things, but you freeze automatically.
My favourite example of “feed forward design patterns” in action occurred when I was being trained on a sales application. My trainer (a senior salesperson) showed me how to enter an order and then said, “Once you know everything with the order is OK, you can go on to the next screen,” and moved her mouse to click the Next button. At that point, I interrupted her and asked, “How do I know that everything is OK?” She had to stop and consider for a moment before she said, “Oh, because nothing is highlighted in red and there are no messages in the status bar.” Not only did the UI communicate “everything’s OK,” but it did it so well that my trainer took in the message almost subconsciously.
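That “nothing red, no status message” signal can be sketched in a few lines. This TypeScript is an invented illustration, not the sales application’s actual code: the `Order` fields and validation rules are assumptions. The key design point is that the UI derives both signals (red highlighting and the status bar) from one list of errors, so an empty list silently means “everything’s OK.”

```typescript
// Hypothetical sketch of the "everything's OK" signal.
// The Order fields and rules are invented for illustration.
interface Order {
  customer: string;
  quantity: number;
}

// Returns the names of invalid fields. The UI would highlight each
// returned field in red and show a message in the status bar; an empty
// list is the "nothing red, no messages" state the trainer read
// almost subconsciously.
function validate(order: Order): string[] {
  const errors: string[] = [];
  if (order.customer.trim() === "") errors.push("customer");
  if (order.quantity <= 0) errors.push("quantity");
  return errors;
}
```

Because both feedback channels flow from the same error list, the user never has to ask whether the order is OK: the absence of warnings is itself the feed-forward message that clicking Next is safe.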
Which brings me to the topic of my next post: Integrating “feed forward design patterns” into your own applications.