Originally posted at the GenerationShift blog.
In preparation for a workshop that Ari Herzog and I will be facilitating on March 26 at the Advanced Learning Institute’s Social Media for Government Conference, I spent some time this afternoon reviewing a report by Darrell West of the Brookings Institution called “State and Federal Electronic Government in the United States, 2008.” The report presents the findings of “a comprehensive analysis of 1,537 government websites (1,476 state government websites, 48 federal government legislative and executive sites, and 13 federal agencies).”
Each website was ranked on a 100-point index that awarded four points for the presence of each of 18 features, such as publications, audio and video clips, foreign language access, disability access, and privacy and security policies, for a total of 72 possible points. Another 28 points were awarded for the presence of online services (one point per service). The report provides a full list of the final rankings, but the top five states were Delaware, Georgia, Florida, California and Massachusetts. The top five federal sites were USA.gov, the Department of Agriculture, the General Services Administration, the Postal Service, and the Internal Revenue Service. Check out where your state or agency ranks.
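For illustration, the scoring scheme works out like this simple function (the function name and the example inputs are my own; the report itself lists the actual 18 features and services it checked for):

```python
# A minimal sketch of the Brookings-style 100-point index described above.
FEATURE_POINTS = 4        # each of the 18 features is worth 4 points (72 max)
MAX_FEATURES = 18
MAX_SERVICE_POINTS = 28   # one point per online service, capped at 28

def egov_score(features_present: int, online_services: int) -> int:
    """Return a 0-100 score: 4 points per feature plus 1 point per service."""
    feature_points = min(features_present, MAX_FEATURES) * FEATURE_POINTS
    service_points = min(online_services, MAX_SERVICE_POINTS)
    return feature_points + service_points

# Example: a hypothetical site with 15 of the 18 features and 20 online services
print(egov_score(15, 20))  # 15*4 + 20 = 80
```

Note that this measures presence only, which is exactly the limitation discussed below.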
This kind of measurement is important as it holds government agencies accountable by “examining whether e-government capitalizes on the interactive features available on the World Wide Web to improve service delivery and public outreach.” However, none of the report’s metrics took a careful, direct look at the use of what we’d call Web 2.0 or social media tools, such as blogs, wikis, social networks, Twitter accounts or other collaborative tools – features that draw a government entity deeper into conversation with its constituents and extend democratic dialogue beyond its own Internet borders to a more comprehensive and engaging presence across the Web.
To be fair, the report measures the use of audio and video clips and cites the presence of a blog called “Jay 360” hosted by the Louisiana Secretary of State, as well as podcasts and RSS feeds at the State of Michigan’s portal. Nonetheless, there is scarce mention or measurement of Web 2.0.
I want to reiterate the value of the study, but stress that it only scratches the surface: it measures the presence of features, not the actual user experience. It also does not measure the cost savings these features create, such as the number of staff members otherwise required to provide the services, or their ability to generate revenue through online fees. Perhaps a follow-up study is in the works. I intend to contact the author, Darrell West, to learn more about his plans and to explore the value of a study on Web 2.0 tools.
In the meantime, I would like to ask a couple of questions of readers:
1. Are you aware of any research being conducted that measures the presence and/or user experience of Web 2.0/social media tools on government websites?
2. If you were to create a 100-point index to measure the presence and/or value of social media tools, how would you assign points?
Many voices are calling for government entities to adopt social media tools. So what if they do? How will we measure the relative value of their use? Will we give them points for presence alone as in the Brookings study? Or is there another level of evaluation that should occur?
Much as increased sales prove value for a commercial enterprise, the measurement here should be of how much civic engagement increases. Was there more comment on proposed legislation? Higher voter registration, or even higher voter turnout? Just having blogs – much less counting them – is not very meaningful.
Another aspect that should be looked at is how successful gov’t is in reaching out to people on existing social networks. Younger folks expect information to come to them. [I’ll look up the citation for that back at work…] So, how successful are governments in getting news, information, etc., to them?
Identifying, gathering and analyzing suitable metrics for these is a challenge. I’m glad you’re looking at it!
Tough questions! I can’t dive into the nitty-gritty at the moment, but a quick response on evaluating 2.0 features:
- e-mail links
- blogs that actually have comments and responses
- adaptive GIS
- complaint tracking systems
Thanks for your thoughts, Sarah. The study was really asking “Do they have this stuff or not?” It didn’t go into any greater detail. I agree that just implementing is not enough…as taxpayers, we should be concerned about the investment of time and resources…and the tie back to the agency’s mission.
Thanks to you as well, Adriel – the study looks at whether or not the websites had direct emails to personnel (vs. only to the webmaster), but didn’t get into the number of blog comments and responses. I had a great conversation with the folks at TSA the other day about “Evolution of Security.” It seems that most comments are negative, but it’s the gems here and there that make it worth their effort…plus the ability to respond to rumors and misinformation in real time. I’d be curious to know more about your last two bullets. On the last point, it’d be great to have a way to dump data into one place and sort it…kind of like a tag cloud, to know which issues are most prevalent, then measure time to respond, number/percent of successful resolutions, etc.
There was an e-government student at the University of Maryland who did a very similar study as part of his master’s thesis. It was called “Advanced Content In State E-Government: Criteria for Evaluation”; I’m not sure if it is available online or was ever published. I think he is on GovLoop. Here is his website: http://www.zammarelli.com/chris/
Ya, he is on GovLoop: https://www.govloop.com/profile/ChristopherZammarelli