January 31, 2008
Why are so many computer programs hard to use, with misleading or confusing screen prompts, counterintuitive methods to accomplish simple tasks, and lots of options that are rarely used but that clutter the screen and that often slow users down? There’s no need for me to ask whether this is the case, as friends and colleagues complain to me all the time. On what appliance other than a personal computer would you press a ‘start’ button to turn the machine off?
At home, these issues may be simply an annoyance. At work, they represent lost productivity, increased training time, and, often, less accurate data. Programs that are hard to use cost the company every day.
As a user-interface designer and usability consultant, my job is to ensure that programs are easy to use: intuitive, friendly, receptive to good data, and likely to reject data that doesn’t make sense. My secret in this work? I spend lots of time with the users — first learning about the tasks they need to do, then watching how they use their current systems, paying careful attention as they try out our proposed designs in rough prototype form, and remaining attentive during further testing.
Even with my extensive training and — dare I say — vast experience as a computer software designer, I must confess that the users know more than I do! They live in their subject world, and regularly use their knowledge of that world as they take orders, review insurance claims, schedule manufacturing or repair orders, or whatever. Not paying attention to their experience is a much too common, and often fatal, mistake.
Here’s the story of one interface design assignment, and how careful listening turned the whole project around. The project seemed straightforward, if not exciting. My new client had developed a computer system to help collection agencies in their work, and wanted me to redesign the screens and to offer some advice about the underlying technologies they were using for development. That’s what I thought, before I arrived at their office.
Actually, their primary agenda was more mundane and problematic. They wanted me to magically stuff a lot more information on the already crowded screens. It wasn’t at all clear how this would be helpful, and I knew that it would make their screens uglier and harder to use. But how could I say this in a polite and helpful way, while steering this client to a more productive agenda?
For me, the first step was quite clear. My business card says, “Listening to Users”, and that is exactly what I planned to do. After a brief review of the system design, I asked if we could visit their one client who was in the same Midwestern town as their offices. They agreed, expecting, I believe, that we’d spend an hour or two. We ended up spending two days, and totally transformed our agenda and their product.
I asked for a “training harness”, so that we could hear the debt collectors conversing with their “clients” as I watched them navigating through the screens of this computer system. It turned out that most of these “clients” were repeats, who regularly defaulted on medical payments, furniture store bills, checks written to pizza parlors, and credit card bills. The computer system showed all these bills, payments made towards them, and promised payments that were never made. Typically there were screens and screens full of data.
In between calls, I had time to ask the collectors about their work — how they assessed each situation, when they felt they were getting honest responses, how they decided on a strategy, and what led them to accept the final agreements (or lack thereof). There were many interesting stories, often filled with amazing complexity. Along with these details were the feelings of heartache, as more and more of these “clients” were getting caught up in a web of debt that seemed inescapable.
My role was that of the cyber-age anthropologist, noting the debt collectors’ behavior and beginning to discern some patterns. I could see some common themes — almost some rituals — but some aspect of the collectors’ behavior kept eluding me.
Finally, on the second day, I blurted out what might have been obvious from the start: “So you’re in the business of getting these clients to make promises that they will keep . . . and it’s the keeping part that is so important!” The collector I was with almost jumped out of his chair with excitement. “Wow”, he said. “Nobody has ever said it so clearly. Yes — that’s exactly what we do, and the keeping promises part is what it’s all about.”
I ventured another statement, a bit more tentatively: “And what are you doing with all these screens of data? Are you computing how often clients have kept their promises (or not)?” The collector was even more excited. “Why, that’s exactly it”, he said. “I want to know what promises the client has offered or agreed to, and how well those were kept.” And he eagerly agreed when I suggested that, “It sounds like we need to display some indexes of promise-keeping”. When I followed up with the question, “How would you compute these?” we began an active dialog, with the user taking the lead. The initial computer screens should show the promise-keeping indexes, with all the data details available to the collector on later screens, but often not needed.
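The article doesn’t specify how the collectors ultimately defined their indexes, but one plausible version — sketched here purely for illustration, with the `Promise` record and dollar-weighted formula being my own assumptions — is the fraction of promised money that was actually paid:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record of one promised payment and what was actually paid.
@dataclass
class Promise:
    due: date
    amount_promised: float
    amount_paid: float

def promise_keeping_index(promises: list[Promise]) -> float:
    """Fraction of promised dollars actually paid, from 0.0 to 1.0.

    This is only one candidate index; a count-based variant would
    score each promise kept in full as 1 and every other promise as 0.
    """
    total_promised = sum(p.amount_promised for p in promises)
    if total_promised == 0:
        return 1.0  # no promises made, so none broken
    total_paid = sum(min(p.amount_paid, p.amount_promised) for p in promises)
    return total_paid / total_promised

history = [
    Promise(date(2007, 11, 1), 100.0, 100.0),  # kept
    Promise(date(2007, 12, 1), 100.0, 25.0),   # partly kept
    Promise(date(2008, 1, 1), 100.0, 0.0),     # broken
]
print(round(promise_keeping_index(history), 2))  # 0.42
```

A summary number like this could sit at the top of the first screen, with the screens full of bill and payment detail kept available behind it for the cases that need digging.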
As long as users regard me as a priest of technology, they’re hesitant to come forward and play an active role. Once the discussion shifted to how data is managed in what now felt like their system, however, I became the process consultant and they became the experts.
Unfortunately, users typically are in the back seat as systems are being designed, and yet they are the ones with the most real subject knowledge. By listening carefully to the users, responding thoughtfully, and really trying to understand their work process, I could get their active engagement in this design process. In this case it led to a radical redesign of the whole screen concept, and, I believe, to a much more powerful system.
This is an especially clear example of a scenario that I see constantly — users who would not otherwise have been part of designing the very systems they will use, but whose deep knowledge and understanding are critical to the system’s success. Typically the users feel technically inferior, and, in fact, they don’t speak the “systems” language that computer consultants are so fond of using. They absent themselves from the process, and are not invited by anybody else to join in.
In lecturing on the system design process, I explain that my first attempts to design a system are usually reasonably good, but it’s the users who correct me and really take it to a higher level. Watching them work with rough prototypes I can see where they struggle with my designs, and where they can easily find their way. And when the users have to struggle to use the proposed system, the problem is almost always that my design is not clear enough — not that the users need more training or experience.
When computer systems are hard to use, most of us blame ourselves and our lack of experience or training. We don’t consider that usability should be a prime characteristic, or that as reasonably intelligent people we should be able to find our way through. We also tend to assume that the system is properly designed to give us the benefits we expect.
Often, however, the systems we use today are simply reworks of systems designed earlier, that are themselves reworks, and at some point we can trace back to a system modeled on how people once did a task. Years ago I was asked to work on a system for an organic produce distributor, who wanted me to track exactly what was on hand of each product in their coolers. This sounds reasonable enough — until you realize that on-hand quantities were not what this distributor was selling. They had trucks full of produce coming east from California, and were selling whatever portion of those expected arrivals was still available to promise to customers after subtracting what had already been sold. That’s a lot of computation for people to do, but not hard at all for a computer. The computer system had to be rethought so that it could feed its users the available-to-promise data that were the basis of sales and pricing decisions. My job was to step back from the detailed operational data, and focus on how users were trying to interpret and understand this data and use it to make critical business decisions.
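The available-to-promise arithmetic itself is simple once stated. A minimal sketch — the function name, quantities, and products here are invented for illustration, not taken from the distributor’s actual system — might look like this:

```python
def available_to_promise(expected_arrivals: dict[str, int],
                         already_sold: dict[str, int]) -> dict[str, int]:
    """For each product, the expected inbound quantity minus what has
    already been committed to customers, floored at zero."""
    return {
        product: max(qty - already_sold.get(product, 0), 0)
        for product, qty in expected_arrivals.items()
    }

arrivals = {"romaine": 400, "kale": 120}   # cases on trucks heading east
sold = {"romaine": 310, "kale": 150}       # cases already promised
print(available_to_promise(arrivals, sold))  # {'romaine': 90, 'kale': 0}
```

The point is that this subtraction, tedious and error-prone for a person juggling trucks and orders, is trivial for the computer — which is exactly why the system, not the user, should be doing it.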
Information systems will add value when they truly enable users to be more productive, more confident, more correct in communicating with customers, vendors, and other employees. Simply using the latest technologies and having the fanciest, most beautiful screens is clearly not enough. Buzzwords like “real time” or “bus architecture” mean little by themselves. Here are some guidelines that can help ensure that systems are really workable and usable tools, that exemplify best practices, and that really work for your organization:
1. Include actual system users, at a variety of levels, in the design and review of the system. Their experience and insight are critical.
2. Start with a clear understanding of the business goals of the system. Don’t be guided by just a list of data elements to be tracked, or reports to be produced.
3. Watch the users working with the system — whether it is in rough prototype form, or is a production system running with lots of real data. Learn how the system is really used, and identify times when it probably should be used but, for whatever reason, is not.
4. Don’t count on your most technical staff to carry these concerns. Just as you need architects, engineers, and builders to design and construct a new facility (and, in fact, you need lots of subspecialties within this list), so you need people with skills in usability, user testing, analysis, system design, modern programming methodologies, and coding techniques to design and build a new system, or to review and revise an existing one.
5. Insist that the overall system design be written in language that is comprehensible to you and to many of your users.
6. Keep looking at your systems, even after you believe that they are “finished”. Most systems can be improved, but many never are.