User’s brain vs your UI design

In the context of UI design, I’ve come across numerous references to gestalt principles in the past few years, but not to 2D-design principles. When you design a user interface, you can apply both sets of principles to ensure users “intuitively” figure things out without any mental, or cognitive, effort.

Gestalt principles

There are plenty of websites that list and illustrate gestalt principles, and explain how the brain’s precognitive processing makes assumptions and fills in the blanks. Here are some examples:

Do you see a triangle? Are the six circles in one group or two?

Do you see six circles? Do you see any white triangles? There are no circles (just irregular shapes), and there are no triangles drawn on the screen. Your brain completes the triangle’s edges, and your brain closes or completes the circles. This gestalt principle is called the law of closure.

How many groups do you see? One?

Compare the three illustrations—the coloured and the grey ones—above and below. In each illustration, how many groups of circles do you see: one, two, or three? Isn’t it interesting that your brain is so prepared to see groups? This gestalt principle is called the law of similarity. Note how colour changes your perception:

How many groups do you see? Three?

You can see the law of similarity before and after a quick redesign of this web-application’s main screen:

A colour change improves user perception

Before the redesign, the viewer’s brain applies the law of similarity and intuitively sees the five boxes as related. That perception is wrong: the right-most box has a different function from the others. After a low-cost design change, recolouring the fifth box, the grouping that users intuitively see matches the functional model.
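As a hypothetical sketch (the labels, roles, and class names here are invented for illustration, not taken from the application above), the redesign amounts to breaking visual similarity for the one control whose function differs:

```typescript
// Hypothetical sketch: five boxes on a main screen, where the fifth
// has a different function from the other four.
type Box = { label: string; role: "view" | "command" };

const boxes: Box[] = [
  { label: "Summary", role: "view" },
  { label: "Details", role: "view" },
  { label: "History", role: "view" },
  { label: "Notes", role: "view" },
  { label: "Delete", role: "command" }, // functionally different
];

// Law of similarity: same colour for same function, and a different
// colour for the odd one out, so perceived grouping matches function.
function cssClass(box: Box): string {
  return box.role === "view" ? "box box-grey" : "box box-accent";
}

for (const box of boxes) {
  console.log(`${box.label}: ${cssClass(box)}`);
}
```

The point is not the specific colours but that the styling rule is driven by function, so the brain’s precognitive grouping and the application’s functional model agree.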

2D-design principles

I haven’t yet come across a website that explicitly ties the principles of 2D design to UI design. But these principles, which I learned from Marlene Cox-Bishop and the classic Wucius Wong textbook on 2D design, do apply to UI design. Consider these examples:

Does your brain see distance?

When comparing two figures, the brain perceives as more distant the figure that is (1) raised, (2) smaller, (3) lighter, (4) greyer, or (5) overlapped. The last pair (6) combines all five of these cues. You’ve likely seen these principles applied in your operating system:

Perception of distance in an operating-system UI
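As a rough sketch (the numeric values are invented for illustration; a real window manager would tune them carefully), the five distance cues map naturally onto style properties such as position, scale, lightness, saturation, and stacking order:

```typescript
// Hypothetical sketch: style properties for foreground vs background
// windows, one property per distance cue. Values are invented.
type DepthStyle = {
  offsetY: number;    // raised: background windows sit higher on screen
  scale: number;      // smaller: background windows shrink
  opacity: number;    // lighter: background windows fade
  saturation: number; // greyer: background windows desaturate
  zIndex: number;     // overlapped: foreground windows stack on top
};

function depthStyle(inBackground: boolean): DepthStyle {
  return inBackground
    ? { offsetY: -12, scale: 0.92, opacity: 0.75, saturation: 0.5, zIndex: 0 }
    : { offsetY: 0, scale: 1.0, opacity: 1.0, saturation: 1.0, zIndex: 1 };
}

console.log(depthStyle(true));
console.log(depthStyle(false));
```

Combining several weak cues, as in pair (6) above, is exactly what this sketch does: no single property signals distance on its own, but together they do.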

When you apply the 2D-design principles incorrectly, users’ perceptions will be incorrect, just as with misapplied gestalt principles.

The future is haptic, right?

I’ve been waiting for a full-screen touch UI with haptic response. That is, when an application displays a button on the screen and I push it with my finger, I want to feel it click. Similarly, when I nudge an object, I want to feel its edge on the screen.

A video about haptic feedback – technology that touches back.

Imagine the challenges in designing for the kind of hardware depicted in the video above! It doesn’t exist yet, but I’m ready. I can also imagine haptic icons on mobile-phone handsets, because I heard researchers present their work at the University of British Columbia a few years ago. I can imagine the responsibility and the pressure of being first to market with haptic icons. The market leader will get to define what OK feels like and what Cancel feels like, for years to come.

Click to read about piezo-based skin-stretch display

The idea is that mobile phones should touch you back, with haptic icons, at the place where your thumb typically touches the handset. Phones could signal, through touch, that you have a call waiting. Similarly, a camera could recommend, through touch, that it’s focused and ready to shoot. Outside the world of portable electronics, a doorknob could signal that there are already three people in the room, and a steering wheel could alert you that your car needs refuelling.

Most of us will only get to decide which haptic cue a screen should convey during a particular physical interaction with the user’s fingers. But I’m eagerly awaiting that technology. There are bound to be many common computing tasks where a finger can outperform a mouse.