One of the best things I’ve read recently is Grant Vergottini’s Imagining Government Data in the 21st Century, where he lists and then explains seven characteristics of “data transparency”:
- The data must be available.
- The data must be provided in such a way that it is accessible and understandable by the widest possible audience.
- The data must be provided in such a way that it is easy for a computer to digest and analyze.
- The data provided must be useful.
- The data must be easy to find.
- The data should be as raw as possible – but still comprehensible.
- The data should be interactive.
This is a great list! Read it in conjunction with the same author’s review of the 2014 legislative data and transparency conference and you get a good view of the technical progress being made in making legislative data useful and available.
But as the author points out in the first article cited above, he’s “… a little concerned about how disjointed some of the initiatives seem to be. I would rather see new mandatory mandates for enforcing existing systems to be rethought rather than causing additional systems to be created–which can get very costly overtime.”
How true! A classic admonition for information system designers is, “Don’t let the perfect be the enemy of the good.” Even if it is possible to design a “perfect” system, you run the risk of making its development and implementation impossibly expensive, complicated, and disruptive to ongoing operations. You need to think through the implications of wholesale replacement (along with the attendant cost challenges) versus partial solutions that maintain selected legacy processes while “fixing” the most glaring problems (such as publishing data-rich documents only in .pdf format).
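To make the PDF example concrete, here is a minimal Python sketch contrasting the two publication approaches. The file name and column heading are invented for illustration; the point is simply how little work a machine-readable format demands of the data consumer.

```python
# A minimal sketch contrasting machine-readable publication with PDF-only
# publication. The file name and column heading below are hypothetical.
import csv

# When an agency publishes the same data as CSV, a few lines of standard
# library code are enough to load and summarize it:
with open("appropriations_2014.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

total = sum(float(row["amount"]) for row in rows)
print(f"{len(rows)} line items totaling ${total:,.2f}")

# If the same table exists only inside a PDF, the consumer must first run a
# text- or table-extraction tool (a third-party library or OCR), then repair
# whatever structure the extraction loses, which is far more fragile than
# the few lines above.
```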
This is one reason why so many system development efforts in the real world are compromises. The systems and the processes they support are so interconnected you need to think through both the strategic and the tactical impacts of making changes.
The real-world interconnectedness of systems and processes might be one reason that Vergottini was “concerned” about what he saw at the 2014 legislative data and transparency conference. When you have so many different development efforts going on it’s almost inevitable that, lacking a highly centralized (or authoritarian) program management operation, you end up with duplication and people seemingly working at cross purposes. In such situations you need, at a minimum, to maximize collaboration and knowledge sharing through formal and informal channels such as conferences, webinars, social networking, collaboration platforms — and getting together for a beer or softball after work.
Another point to consider when dealing with issues like improved data transparency is that “transparency” is not just a technical issue. Data at the heart of government data transparency efforts are associated with real people and real organizational processes. People and processes are constantly changing, including the people and processes associated with government legislation.
The fact of constant change in technology, people, and processes suggests that attempts to meaningfully control the different development efforts described by Vergottini in his transparency conference report might be difficult. Lacking a highly focused top-to-bottom development effort such as that overseen by the Recovery Accountability and Transparency Board (RATB) for the stimulus reporting required by the ARRA, a more “federated” or collaborative approach might be more appropriate, as I suggested for DATA Act implementation in How Should DATA Act Implementation Impact Federal Project Management Practices?
We want systems and processes to be more effective and transparent, and we want to be able to take advantage of improved standards and technologies when they make sense — but we also need to weigh the costs and benefits of change in a fiscally austere and change-resistant environment.
To me that means that someone needs to look at things from a strategic, big-picture perspective so that the consequences of individual changes can be put into context. That’s why lists such as the one Vergottini provides for defining transparency are so valuable. They can form the basis for developing a big-picture strategy against which individual changes — and practical implementation projects — can be evaluated.
Related reading:
- Has “Transparency” Concerning Federal Stimulus Funding Been a Success? Part 1
- The State of Government Data Transparency, 2013
- Meetings and the Limits of Government Transparency
- Transparency Is Not An End In Itself
- Framework for Transparency Program Planning and Assessment
- What Makes a Government Program “Transparent”?
- Collaboration Can Be Messy
Copyright © 2014 by Dennis D. McDonald, Ph.D. Dennis is a project management consultant based in Alexandria, Virginia. He is currently working with Michael Kaplan PMP on developing SoftPMO project management services and with BaleFire Global and Socrata on implementing open data portals. His experience includes consulting company ownership and management, database publishing and data transformation, managing the integration of large systems, corporate technology strategy, social media adoption, statistical research, and IT cost analysis. His web site is located at www.ddmcd.com and his email address is ddmcd@yahoo.com. On Twitter he is @ddmcd.