Tuesday, July 16, 2013

Recognition Rather Than Recall

Have you ever forgotten to take your keys when you left the house?  Or your wallet?  Or your phone?

Let me guess why.  I bet it was because they weren't in the place where you normally leave them.  Right?

Because you didn't see them, you didn't think to take them.  "Out of sight, out of mind," as they say.

So some people get in the habit of always putting these things in a place where they will see them, preferably as they walk out the door.  Then those things are in their sight, and thus, in their mind.

Software user interface designers call this principle Recognition Rather Than Recall.  The basic idea is: don't make users remember to do (or how to do) something.  Instead, provide them with a cue they will recognize to help them along.

http://www.nngroup.com/articles/ten-usability-heuristics/

For example, today basically all computer programs have a menu bar or toolbar somewhere.  They are a constant reminder to the user that, "hey, you can save this!" or, "hey, you can make this text bold!" or, "I hope you don't want to close me, but you can click this red X here if you really want to."  The user recognizes the button and knows both that the operation exists and how to perform it.
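
To make the contrast concrete, here is a minimal sketch in Python's tkinter.  This is my own illustration, not code from any particular application; the window title, button label, and Ctrl+S shortcut are just placeholders.  The labeled toolbar button works by recognition, while the bare key binding works by recall.

import tkinter as tk

def save():
    status.set("Saved!")

root = tk.Tk()
root.title("Recognition vs. Recall")

status = tk.StringVar(value="Unsaved changes")

# Recognition: a toolbar button that is always visible.  It tells the user
# both that saving exists and exactly how to do it.
toolbar = tk.Frame(root, bd=1, relief=tk.RAISED)
tk.Button(toolbar, text="Save (Ctrl+S)", command=save).pack(side=tk.LEFT, padx=2, pady=2)
toolbar.pack(side=tk.TOP, fill=tk.X)

tk.Label(root, textvariable=status, width=40, height=5).pack()

# Recall: the same command bound only to a keystroke.  Without the labeled
# button above, nothing on screen hints that this exists.
root.bind("<Control-s>", lambda event: save())

root.mainloop()

Strip out the button and the feature still works, but only for a user who already remembers the keystroke.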

Compare that with, say, WordPerfect back in the old days.  (And by 'old days', I mean the early 90s.)  Way back then, WordPerfect was the premier word processing application.  But it didn't have the fancy-schmancy toolbars of today.  The user had to remember that F5 was Save and F12 was Quit.  Actually, those are probably not the correct keys... but that just further illustrates how hard it is for the user to remember.

With the rise of touchscreen interfaces, I worry that we will backtrack here.  A trendy feature of such interfaces is gesture-based commands.  The problem is, if the user sees no visual cues that a command is available, he has to... ugh... remember the commands.  How exhausting!

I faced a really glaring example of this just this week.  I logged into a server running Windows Server 2012.  I needed to find a particular program.  But the normal Windows entry point to find programs, the Start button, was nowhere to be found.  There were a few icons for some applications, all of which were useless to me at the time.  I just needed to open this specific app.

Now, I'm a software engineer.  I have been using Windows for almost 20 years.  And here I was, unable to figure out how to open the application I wanted.  The visual cues I am accustomed to had vanished.  I had to do a Google search to find it.  A Google search!  It turns out, you have to move the mouse to the lower right of the screen and that opens the "Charms Bar."  Then you can access the Start screen from there.  But how is anyone supposed to know that?

But even when I got to the Start screen, I couldn't see all my programs.  Again, there was no button or visual cue about how to access them.  Again I had to consult the wisdom of the Internet.  It turns out that on the Start screen, you have to right-click and then you can access your apps.  Again, how is the user supposed to figure that out?

My point is, there is now nothing to recognize.  No button.  No visual cue of any kind.  It all relies on recall.  It's even worse when you've never done it before... as in my case with Windows Server 2012.  How can you recall what you never knew in the first place?  

Monday, July 8, 2013

Excellence Answers to No One

I admit: I am mediocre.  But I like to think about excellence and individuals who have reached some level of excellence.  There are, of course, many ways to approach this, but one thing that keeps running through my head is: excellence cannot exist under the direction of mediocrity.

This may seem obvious.  Maybe it is.  In any case, here's why I'm thinking it:

Let's say you own a company and you want to build a new headquarters.  Now, presumably, you have some idea how you would like the building to look, but you're no architect.  So you hire an architect.  He, presumably, has some experience designing buildings.  The question is: who designs the building?  You or the architect?

Really, that depends on how you perceive your abilities vs his.  If you perceive yourself as having an excellent eye for building design, you will give instructions to the architect and he will basically follow them.  After all, you're paying him.  However, if you perceive yourself as mediocre (or worse) in this regard, you will defer to his experience because you perceive him as excellent, at least in comparison to yourself.

Now, which is the better course?  I think it's safe to say the latter.  After all, he is the architect.  Sure, you give him some themes you'd like to see.  But then you let him work his excellence.  The building gets built, and it is admired by all passers-by.

Of course, this latter approach depends on two critical things: the architect must actually be excellent, and you must recognize your own mediocrity in this respect.

Granted, this is a contrived example.  But, I daresay, this principle affects the rise of companies today.  There are at least two models of product design that I can think of: designing products based on input from users (or stakeholders, financiers, etc.) and designing products based on what you think is cool.

If you have mediocre product designers, definitely go with the former.  But if you have excellent product designers, you better go with the latter.

Most companies and/or products fall into the first bucket.  They need to build their product to satisfy their customers, so they do whatever they can to meet the customers' needs.  By definition, most of anything is mediocre.  So for the general case, this is a good model to follow.

But the companies that fall into the second bucket are the game-changing companies.  Google, Apple, Tesla.  Did Google do market research when they decided to build Loon?  Self-driving cars?  Gmail?  No!  They made those products because they thought they were cool.  Likewise with the iPod, iPhone, and the Model S.  People who are excellent in their field built something they thought was cool.  More often than not, other people then also thought those things were cool.

If Apple had built the iPod based on feedback about music players of the day, it would simply have been a CD-player with some more bells and whistles.  People mediocre in the field wouldn't have had the creativity or technical background to envision something game-changing.  But the excellent people did!

Now, what would have happened if their boss (or stakeholder, or financier) didn't recognize the genius of their vision - if they were mediocre?  They might have squashed the project.  The best-case scenario would have been the forgoing of huge profits.  The worst case would have been those excellent people quitting, starting their own competing company, and putting Apple out of business.

So I guess what I'm saying is that if you want your company to do something ground-breaking, hire excellent people and then get out of their way.

And now, the giant caveat.  What happens when mediocre people think that they are excellent?  That's clearly the worst of both worlds.  They would refuse to listen to input from those who know more even though they don't have the skills to justify such hubris.  Those are the people you fire.

The last question, then, is: how does one recognize excellence, especially if he is mediocre himself?  That's a tough one.  I may not be able to recognize excellence, but I can recognize when someone is better than me.  Those are the people I hire.  And I tell them to hire people better than themselves.  After enough such iterations, hopefully we can hit that bar.