The Disappearing User Interface
I recently read that "becoming a UX leader means knowing the principles of good design and how to identify and create good design in your own work."
I like this statement because it acknowledges that there are defined principles involved in design, that it’s not simply a matter of personal taste. Most people think they’re a better-than-average driver. Many also believe that their opinion about what constitutes good design is as valuable as a trained designer’s. It’s not...though it’s often hard to explain why without sounding snooty.
Good design is non-obvious
Stated another way, good design stays out of the user’s way; it doesn’t call attention to itself by yelling “look at me, I’m a clever design!” With the perspective of time, it’s now clear why so many early web interfaces were poorly received. I remember all of the circular menus and skeuomorphic designs that tried to mimic familiar real-world objects. Early web designs seemed to go out of their way to pack as much gibberish as they could into a viewport (perhaps satisfying the science-fiction dreams of their builders). Remember all of those interfaces designed to look like warp-drive control panels? Buttons made to look like raised glass beads? Those elements all drew attention to themselves. They screamed at the user. They were so obvious that they came between the user and the content they were trying to access.
Modern designs strive to pare away everything but the bare essentials. Show only one or two main actions per screen. Limit the color palette. Allow plenty of white space between screen elements. Don’t force the user to shoulder an overwhelming cognitive load. Reduce. Reduce.
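Those heuristics can even be made concrete. The sketch below is purely illustrative TypeScript, not tied to any real design system; the token names, values, and the lint-style check are hypothetical, but they show how a team might encode a limited palette, a small spacing scale, and the "one or two main actions per screen" rule.

```typescript
// Hypothetical design tokens for a pared-down interface.
// Names and values are illustrative only.
const palette = {
  text: "#1a1a1a",    // one text color
  surface: "#ffffff", // one background
  accent: "#0a66c2",  // a single accent, reserved for the primary action
} as const;

// A short, generous spacing scale keeps white space consistent.
const spacing = { sm: 8, md: 16, lg: 32 } as const;

interface Screen {
  title: string;
  primaryActions: string[]; // labels of the main actions offered on the screen
}

// Enforce the "one or two main actions per screen" heuristic.
function checkCognitiveLoad(screen: Screen): string[] {
  const warnings: string[] = [];
  if (screen.primaryActions.length > 2) {
    warnings.push(
      `"${screen.title}" offers ${screen.primaryActions.length} primary actions; consider cutting back to one or two.`
    );
  }
  return warnings;
}

// Example: a checkout screen that tries to do too much at once.
console.log(
  checkCognitiveLoad({
    title: "Checkout",
    primaryActions: ["Pay now", "Save for later", "Apply coupon", "Start a chat"],
  })
);
```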
In the 1960s and 70s, industrial designer Dieter Rams took complex devices like console stereos and removed all visual flourishes in favor of flat surfaces and clearly labeled controls. Successful modern web apps follow the same approach. With so little screen area to work with, on displays scarcely larger than the user’s hand, no other approach is viable.
Early web designers thought that a new visual language had to be invented for the web. A new visual paradigm did emerge, but it wasn’t as dramatically different from print design as everyone expected. As web design gelled, the conventions that took hold were remarkably similar to those found in books for hundreds of years: here’s an interactive table of contents; here’s a heading at the top of the page; here’s a pull quote; here’s a bulleted list. Even online, people still read left to right and top to bottom (at least in Western cultures).
Functionality is necessary but not sufficient
How is it that some designers and developers who rely so heavily on the good design of their iPhones, Androids, and other consumer technologies still deliver solutions that aren't easily understood without a manual, special training, or a call to the help desk?
What I’ve seen in my own career is that the software developers who handle the heavy lifting of coding and databases see an application as successful if it works. Period. If the controls are there to let the user complete the tasks written in the specification, their mission is accomplished. This is how IT products were allowed to leave the delivery floor for decades. It was assumed that technical people would be the primary users. The software worked; what more do you want? Only when IT products fell into the hands of the broader public did this change.
If you want wide adoption, an IT product has to offer the same kind of emotional appeal as any other consumer product: it must be functional, it must appear simple, it must be attractive, it must be pleasing (good design creates emotion). These aren’t qualities that software developers alone can deliver; they require a different skill set. When this began to sink in at technology companies, visual designers were brought in to help. Later, the user experience field developed. User experience principles had long been applied to physical products, but it took decades for anyone to realize that they could, and should, be applied to IT products as well.