Have you discovered Anne Holland’s wonderful website, “Which Test Won?” I found it through a recent Gerry McGovern article, “It’s Not What People Say – It’s What They Do” – and, of course, I had to play. Each week, Anne posts the two versions from a real side-by-side test. She tells you what specific element or wording was being tested (which is important because, most of the time, it’s hard to spot what’s different). Then you get to pick which one you think tested better.
Now, I am a seasoned web manager. I am off the chart in “intuition” on the Myers-Briggs tests. I minored in psychology in college and took counseling courses in grad school. I’ve been known to be pretty cocky about my ability to figure out what my customers want on the web. But you know what? On these side-by-side tests, I have picked wrong more often than I’ve picked the winner. The moral of the story? Customers know best. And it’s their results that count.
This week’s test on Anne’s site is especially intriguing to me. This example came from Sony, and it has an interesting twist. They were testing an email promo with a link to a specific page. In one version, the email says, “Save 25% on the Vegas Pro Production Assistant,” and it leads to a page with a banner that says, “Never start Vegas Pro with an empty project again!” The other version swapped those two: the email says, “Never start Vegas Pro with an empty project again!” and it leads to a page headed, “Save 25% on the Vegas Pro Production Assistant.”
I won’t tell you which version won – go make your own guess and see. But I will tell you that this test puts a new spin on “winning,” because one version got more opens and clicks, while the other actually generated more sales. Web designers probably were thrilled with the version that drew the clicks. But I guarantee you that Sony is much more interested in the one that generated the sales. And, as the evidence showed, so were the customers. It’s not enough to drive traffic to the web page (though that’s certainly important). What matters is whether the customer completes the task.
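To make that distinction concrete, here’s a purely hypothetical illustration (my numbers, not Sony’s): if one version draws 1,000 clicks and 2 percent of those visitors buy, that’s 20 sales; if the other draws only 600 clicks but 5 percent buy, that’s 30 sales. The “loser” on traffic is the clear winner on results.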
So, what does this mean for government web managers? Very simple. Learn from your customers. Focus on their results. Find out if they can complete the task. It’s not enough to count hits or clicks or page views and think your customers are successful in using your site. It’s not enough to ask their opinions about the site (because – like in house-hunting – what customers say and what they actually do often are not the same). And it sure as heck isn’t enough for you to sit in your office and think you can guess how your customers will behave. What matters is this: what did your customers actually accomplish? Were your customers successful in completing the task?
I’m not saying this is easy – especially in government. If you’ve got a service like selling stamps or applying for passports or paying taxes – services that start and end online – then you have no excuse. You should be homing in on those start and completion stats, perfecting the wording and steps through usability testing, and routinely interviewing your customers about the experience. You should know with relative certainty how successful your customers are in completing the task, and you should aim to improve that percentage.
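The math itself is simple. Using made-up numbers: if 4,000 people start a passport application on your site this month and 2,600 reach the confirmation page, your completion rate is 65 percent. That one number – tracked month over month, for each major task – tells you far more about customer success than any count of hits or page views ever will.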
When your service is offering information, or the beginning of a process that has to be completed elsewhere, it’s much more difficult to measure results. But it’s not impossible. Several years ago, HUD had a successful web-based kiosk program that provided information about homebuying, low-rent housing, and services for the homeless. How do I know it was successful? Because we hired a researcher (and honestly, it didn’t cost that much) to watch, interview, and follow up with kiosk users all over the country. We looked at their results. And we found that 74% of the 1,500 kiosk users we observed – a sample large enough to be statistically meaningful – actually did something with the information they found. We learned which parts of the kiosk content produced the most (and best) results – and which didn’t. That was really helpful: it enabled us to make our services better and more focused…and to drop those that weren’t being used and were only muddying the waters.
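(For what it’s worth on the “statistically meaningful” point: assuming those 1,500 observations were roughly a random sample of kiosk users, the standard margin-of-error formula puts a 74 percent finding within about plus or minus 2 percentage points at a 95 percent confidence level – more than precise enough for deciding which content to keep and which to drop.)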
Do you have to go hire a contractor to measure your website results? Of course not, though that certainly is one option. There are many ways to observe your customers and measure their results. The point is: do it.
I love Anne Holland’s “Which Test Won?” website. It’s humbling. Every week, it reminds me that it’s the customers – not me – who know best. And it’s their results that count!
I saw one of those HUD kiosks the other day. I’m pretty sure it was in a weird random mall and I thought it was awesome.
Sadly, HUD ended that kiosk program last year. It really did reach a population that we weren’t reaching through the web. I think at the time they stopped the program it was still getting 400,000 uses a year. With only about 75 kiosks out there, that was a pretty good reach. But the program always was a political football. And so it goes. Maybe it was an idea whose time hadn’t come.
Yeah. It was at a mall in the Tampa area that served the low-income part of town. So it was pretty cool.
Excellent post, Candi!
You can learn a great deal about how a particular website serves the needs of its customers, how it compares with expectations, and where it falls short in its support of a successful visit – as long as you’re listening to your customers.
Visitors to federal websites may have different objectives than do visitors to private-sector retail sites (for example); but they have objectives, and they are important.