We communicate with each other using a common language, and we obviously become more effective when we all understand that language. Technology, however, complicates our lives: each piece of technology we interact with requires us to learn a new (proprietary) language, with its own set of rules, its own grammar, and a unique user-interface experience.
Think about it: when Larry King stumbles over his own URL on national TV (yes, a language) and messes up HTTP, semicolon, and slash (or was it backslash?), I can't help but think about the hell we put users through to use the internet. Only when you understand the language do you get to benefit from its capabilities. That's like forcing anyone who wants to vacation in Mexico to speak Spanish first. The Mexican tourist industry would grind to a halt.
Take photo editing, for example. To make photographs look better, Photoshop (and now Photoshop Express), along with many other photo-editing applications, deploys a language that requires users to understand the intricacies of color and light, and to apply that language in the right order.
Here is a synopsis of what my mother-in-law would need to master to make her photographs look better: first increase the dynamic range using a histogram, then use curves to change the tonal values to your liking, then apply the right white balance and improve saturation and vibrance. Indeed, I just described the introduction of yet another language to solve a pretty mundane problem.
To create a web page, we introduce another language, a compilation of HTML, Perl, Ajax, and Flash, usually contained within a desktop product with its own proprietary language. To write a book, we wrestle with the 90 percent of Microsoft Word's functionality and language we seldom use, just to figure out how to create a table of contents. In Excel, we use yet another language of non-intuitive formulas (like SUM()) to derive values from other cells. Should I go on?
So why do we seem to get away with this, or do we? For one, lots of people make money by understanding a computing language that fewer others do. Web designers don't always create better designs, but they understand the language of design and can implement it. So web designers don't want you to know there are better ways to do this. Adobe is probably not in a hurry to remove the language and erode its premium market, even though it could have democratized the website-creation process far more than it has. Many times have designers, with corporate marketers in tow, objected to the use of RapidWeaver, a tool that attempts to democratize web design (this site is built with it).
But we are fooling ourselves. The democratization of the internet requires that we make technology more accessible and easier to understand and implement. Only then will it reach real mass adoption.
We could easily build technology that figures out how to make the majority of images look better, lets people design a web page by drawing it rather than programming it, or has Word recommend a table of contents when it detects the makings of one.
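To show this is not wishful thinking, here is a minimal sketch of the "figure it out for the user" idea applied to images: automatic contrast stretching, a one-click stand-in for the manual histogram-and-curves ritual described above. The function name and the sample pixel values are illustrative, not taken from any particular product.

```python
def auto_stretch(pixels):
    """Linearly remap 8-bit grayscale values so the darkest pixel
    becomes 0 and the brightest becomes 255 -- an automatic version
    of the histogram adjustment a user would otherwise do by hand."""
    lo, hi = min(pixels), max(pixels)
    if lo == hi:  # flat image: nothing to stretch
        return list(pixels)
    scale = 255 / (hi - lo)
    return [round((p - lo) * scale) for p in pixels]

# A dull, low-contrast image is spread across the full tonal range.
print(auto_stretch([60, 80, 100, 120]))  # [0, 85, 170, 255]
```

The user never sees a histogram, a curve, or a language; the software simply does the mundane part on their behalf.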
The iPhone is a great example of how packaging existing technologies in a different way can make people feel they don't need to learn a new language to communicate with it. My 3-year-old daughter uses it. Each of the individual technologies in the iPhone had been around for a while; Apple "just" packaged them so that the language became intuitive.
But Apple is not the only vendor that can remove the computing language from the equation. Others need to pay attention to it.
So when you design products, pay attention to removing the language: offer fewer, more intuitive options rather than more of them. After all, for thousands of years we have communicated in many ways other than the verbal; the majority of our communication remains behavioral.
Innovation has become the art of packaging a flawless user experience rather than a race to add features; features quickly become commoditized anyway.