In two previous posts (Brookings and ForeSee), I explored common methods for measuring websites in a Web 1.0 world to find applicability for Web 2.0. Essentially, I am summarizing these methods to educate myself and to share that knowledge with personnel from government agencies and other organizations that are beginning to think about social media metrics and analytics. Each of these measurement methods is included in a presentation that Ari Herzog and I delivered at the “Social Media for Government” conference a few weeks ago.
Another way that an organization may analyze its website is by using the “Web Site Review Scorecard 7.0” developed by Forrester Research, Inc. As a quick aside, I owe my appreciation to Alan Webber, formerly a government website reviewer with Forrester who is launching his own venture (called Ronin Research Group), for reviewing this summary of Forrester’s review methods.
Whereas ForeSee asks customers directly to complete a survey, Forrester engages an expert to evaluate a website using a scorecard review instrument. The expert assumes the role of the customer rather than being the customer. Technically, it’s a “heuristic evaluation”: a review of the user experience in light of acknowledged usability principles.
The scorecard begins by asking an organization to articulate its “evaluated user goals” to ground the review in a set of clearly defined outcomes. With this foundation, each review question is scored on a -2 to +2 scale:
• 2 = Strong Pass (Best practice)
• 1 = Pass (No problems found)
• -1 = Fail (One major problem or several minor problems)
• -2 = Fail (Two or more major problems, or one major problem and several minor problems)
The scorecard then asks several questions related to four elements. Specific criteria under each question determine the score, and a total of 25 (+1 on each of the 25 criteria) constitutes a pass. The elements and some of their associated questions are listed below:
Value
• Is essential content available where needed?
• Is essential function available where needed?
• Are essential content and function given priority in the display?
Navigation
• Do menu categories immediately expose or describe their subcategories?
• Is the wording in hyperlinks and controls clear and informative?
• Are keyword-based searches comprehensive and precise?
Presentation
• Does the site use graphics, icons, and symbols that are easy to understand?
• Do layouts use space effectively?
• Are interactive elements easy to recognize, and do they behave as expected?
Trust
• Does the site present privacy and security policies in context?
• Does site functionality provide clear feedback in response to user actions?
• Does the site help users avoid and recover from errors?
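To make the scoring arithmetic concrete, here is a minimal sketch of how a scorecard tally might work. The criterion names below are hypothetical placeholders, not Forrester’s actual instrument; the sketch only assumes the scale and pass threshold described above (scores of -2, -1, +1, or +2 on each of 25 criteria, with a total of 25 constituting a pass).

```python
# Illustrative sketch only: tallying a Forrester-style scorecard.
# The criteria here are hypothetical stand-ins, not the real instrument.

# Valid per-criterion scores on the -2 to +2 scale (note: no 0).
VALID_SCORES = {-2, -1, 1, 2}

def score_review(scores_by_criterion):
    """Sum per-criterion scores and apply the pass threshold of 25,
    i.e. an average of +1 across the 25 criteria."""
    for criterion, score in scores_by_criterion.items():
        if score not in VALID_SCORES:
            raise ValueError(f"invalid score {score} for {criterion!r}")
    total = sum(scores_by_criterion.values())
    return total, total >= 25

# Hypothetical example: 20 strong passes and 5 minor failures.
example = {f"criterion_{i}": 2 for i in range(1, 21)}
example.update({f"criterion_{i}": -1 for i in range(21, 26)})
total, passed = score_review(example)
print(total, passed)  # 40 - 5 = 35, which clears the threshold of 25
```

A site that merely passes every criterion (+1 on each of 25) lands exactly at the threshold, which matches the description of a score of 25 being a pass.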
The review scorecard is a bit more comprehensive, and presents yet another way of thinking about an organization’s Web presence. Per Forrester’s own description of the review, it “uncovers flaws that prevent users from accomplishing key goals on Web sites…. To get the most out of the Web Site Review, site owners should identify user goals that drive business metrics, review their sites using the tools available on Forrester’s Web site, and fix usability problems identified in the review.”
So what are the implications and applications for Government and Web 2.0?
A. Agencies may consider combining feedback from end users (ForeSee) and experts (Forrester). Alan suggested that “agencies should use multiple paths, including detailed analytics, usability reviews, user feedback” – much of which can be done internally.
B. Forrester offers a Blog Review tool that is accessible to its clients. Agencies that are current Forrester clients may want to examine this tool and conduct their own analysis if they use blogs to communicate with citizens.
C. Consider how some of the questions above apply to social media in evaluating the placement of social media tools on your agency’s website. Under “Value”: Where should you place a video or RSS stream in light of its relative importance to your goals and objectives in communicating with constituents? Under “Presentation”: Are you using only the icon of a Delicious or Twitter link rather than spelling out a description of that tool? We cannot assume that our end users know how to navigate a page adeptly or understand what these icons represent.
D. Start with the end in mind. Once your agency has decided to launch a social media tool, use the four Forrester elements and ask again: “How does the use of this tool on our website or elsewhere on the Web connect to our mission, goals and objectives?” That’s where Forrester starts its evaluation – with the “why”. So should you.