GovLoop

How to avoid another healthcare.gov-style procurement – Plus your weekend reads!

The political weight of Healthcare.gov placed a global spotlight on some of the issues plaguing federal IT. Federal budget cuts, slow adoption rates and long procurement cycles have forced agency CIOs to maintain 30-year-old piecemeal legacy systems rather than invest in new technologies.

Despite the effectiveness of some legacy systems, federal agencies spent over $50B on operational expenses in 2012 – an increase of 25% from the year before, and 70% of their total IT budgets.

Christian Heiter is the CTO of Hitachi Data Systems Federal. In part one of our interview, Heiter told Chris Dorobek that no single issue sank healthcare.gov – the problems were collective.

“A lot of the problems were due to the fact that the government has a lot of disparate systems. They have different ways of buying systems, different ways of specifying them, different architectures. Budget cuts have not helped, and they have had security concerns. Because of all of these disparate systems, it is hard to do large-scale integrations like the healthcare.gov system that was just implemented,” said Heiter. “When you are trying to do this and your data is coming from lots of different places and is not always easily accessible, it becomes more difficult to build large-scale systems. It really requires a much larger, solutions-based approach across the whole infrastructure.”

How do you fix the procurement problem?

“We have to look at this problem from the bottom up. We have to look at the infrastructure pieces first. We need to make sure we have all the techniques and capabilities necessary to build larger-scale implementations. It is not just about how much storage or how many servers you have – that will come into play as you start building this out – but really understanding access to the data that is out there: making sure we can adequately provide bandwidth to the data, that it is scalable, and that we can virtualize when necessary. We can add more people, and if we find that systems need to grow much more quickly, we can now use that virtualization technology to provide better scalability and yet still provide the isolation and access that is required,” said Heiter.

Is this possible given the size of the federal government?

“It really comes down to collaboration. We need to build the tools – it is in the hardware and the software, in our processes, techniques and procedures – to be able to share this information across agencies. We probably will never have one system across government. I doubt that will happen. But we can start focusing, even within an agency or portions of an agency, on building more cohesive systems, more intelligent hardware platforms, more intelligent solutions, and looking at different ways of approaching the problem. My guess is a good portion of what is happening right now uses unstructured data. Maybe we don’t have to structure everything. Maybe we can leave it unstructured. We can take advantage of systems that allow you to manage it as content. Now you can look at using metadata and its capabilities. We don’t have to do everything the way we have always done it in the past. We can find new ways and new techniques to work a lot smarter,” said Heiter.
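Heiter’s point about managing unstructured data as content rather than forcing it into a schema can be sketched in a few lines. The store, document shape, and field names below are hypothetical illustrations of the general idea, not part of any system he describes: the raw content is stored as-is, and metadata tags are what make it findable.

```python
from dataclasses import dataclass, field

# Hypothetical minimal content store: bodies stay unstructured (raw text),
# while attached metadata makes them searchable. All names are illustrative.

@dataclass
class Document:
    body: str                                   # unstructured content, stored as-is
    metadata: dict = field(default_factory=dict)

class ContentStore:
    def __init__(self):
        self._docs = []

    def add(self, body, **metadata):
        self._docs.append(Document(body, metadata))

    def find(self, **criteria):
        # Match on metadata alone; no schema is imposed on the body.
        return [d for d in self._docs
                if all(d.metadata.get(k) == v for k, v in criteria.items())]

store = ContentStore()
store.add("Eligibility notice for applicant...", agency="CMS", year=2013)
store.add("Enrollment record...", agency="CMS", year=2014)
store.add("Audit memo...", agency="GAO", year=2013)

matches = store.find(agency="CMS", year=2013)
```

The design choice mirrors the quote: because queries run against metadata instead of the documents themselves, new kinds of content can be added without re-specifying a shared schema across systems.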

Weekend Reads:
