Jason Miller’s report “Kundra says government’s approach to consolidation has been ill-suited” includes this statement by Vivek Kundra, Federal Chief Information Officer:
“I’ve talked to CIOs, buyers, business unit leaders across the government, and one of the biggest challenges is that if they want to provision services they have to go through so many steps that it is far easier for them to spend 10 times the money so they could do it in house, build it and provision it rather than leverage what is already out there,” he says. “It’s unacceptable in my view that it would take us a year or so to roll out solutions you could roll out in your personal life in minutes.”
This is preceded by Kundra’s description of how the move of government computing to the “cloud” is really a shift not just in how applications are procured but in how IT infrastructure is developed and supported.
What I also see in Kundra’s discussion of the Federal Government’s IT infrastructure is the realization that what he and his lieutenants are discussing (and implementing) is not just a shift in how applications are procured but also a shift in how IT infrastructure is architected.
An ironic aspect of this is that the complexity of government procurement policies and practices may actually be accelerating this “outsourcing” of IT. The reason is that it might just be easier, and potentially less expensive, to replace both applications and infrastructure in one fell swoop. Cloud computing may offer this potential.
This idea should not come as a surprise to anyone familiar with how IT operational costs are tied to both applications and infrastructure. In many large IT operations, attempts to re-architect an entire application portfolio are fraught with complexity and expense, given the spaghetti of touchpoints, data interchange, security, and standardization issues that must be addressed. Standardization and centralized migration may also have been stymied by their huge front-end costs.
Moving services and supporting infrastructure to the cloud, assuming data integration and security issues can be managed, is one way around this problem. This may be especially true if the alternative of “keeping things in-house,” managing and upgrading potentially incompatible applications and supporting infrastructures, is prohibitively expensive and prone to drawn-out, costly procurement cycles.
Some of the issues connected with moving IT “to the cloud” are similar to those raised when debating whether corporate IT should be outsourced. They’re also related to the discussion of which government operations are “inherently governmental” and which can safely be contracted out.
What “cloud computing” offers is the possibility of moving everything outside. And if this can be accomplished by walling off and eventually replacing expensive-to-maintain legacy applications, why not?
Copyright (c) 2009 by Dennis D. McDonald. Originally published (July 15, 2009) on Dennis McDonald’s Web Site.
Part of the real value in some sort of federal cloud and procurement is the ability to have one C&A, ATO, and security package. GSA pioneered some of the thought leadership with the Web 2.0 Terms of Service it negotiated with YouTube, Facebook, and others, but other agencies could basically leverage almost all of it.
The same would be great with new technologies. Rather than each agency figuring out whether a new tool it wants to use, like MediaWiki, WordPress, Blogger, Confluence, or Jive, meets its architecture and security needs, they could do one C&A and ATO package for the federal cloud.
Or at least that’s my hope…
I don’t understand the acronyms, but I get your point!
Sorry… Certification and Accreditation, and Authority to Operate. Basically, all the internal hoops you have to jump through to get software implemented.