Open data by default: that was one of the big takeaways from the Digital Government Strategy released last year. But the DGS was a bit murky on how agencies should implement open data. The White House addressed that ambiguity with the Open Data Directive released in May.
Hudson Hollister is the Executive Director of the Data Transparency Coalition. He told Chris Dorobek on the DorobekINSIDER program that the directive doesn’t create any new principles, but it is a clarifying document.
What’s in the Directive?
- For the first time we have an official definition of open data. The President has given us a seven-part definition, which includes requirements that data be fully described, fully accessible, standardized and fully downloadable. That is important because most of the federal government’s data is not. So this is an aspirational definition.
- The President has given agencies specific direction on how to implement this policy. Agencies are directed to move toward this definition of open data for all their new systems. They are also required to review the data in their existing systems and move it from inaccessible to accessible.
A change in how government operates?
“Anytime the administration makes announcements like this you have to worry about implementation. That’s because where open data is valuable, the specifics of the data sets can be very complicated. The people who are working on open data policy are pretty far removed from the people at OMB and the Treasury Department whose job it is to oversee the filing and organizing of the data,” said Hollister.
Getting past the clay layer bureaucrats
“The middle layer is a challenge; they are the most resistant to change. That’s why the implementation steps that are new here are so important. Agency CIOs are asked to create an inventory of all of their data sets. It allows for so much activity by advocates outside of government,” said Hollister.