It’s been a year and a half since the U.S. CIO’s office issued its 25-point implementation plan to reform federal technology management. Chief among its goals are the consolidation of at least 800 data centers by 2015 and a shift to a “Shared First” policy, under which the federal government has identified $20 billion in IT spending that can be moved to the cloud. From other quarters comes pressure to implement analytics and shore up security measures. The problem is that many in federal IT are simply looking to check off the boxes that show compliance with these goals, without stopping to make sure the changes will deliver the return on investment the federal government wants to achieve.
A large government agency with offices across the United States turned to IBM to deploy an asset management system covering physical assets such as vehicles and computers as well as software. Although the agency initially considered a distributed systems environment, it ultimately decided to tap two existing IBM mainframe servers to run IBM Maximo Asset Manager, saving on deployment time, acquisition costs and staffing while using the same amount of floor space, cooling and electricity. Had the agency merely focused on checking off the boxes that showed it was moving to new technology, it would have paid more than $1 million extra for software and servers.
At Canada’s Department of National Defence (DND), the mainframe was being pushed out of the organization as distributed systems took over. Soon DND was stuck in a quagmire of rising costs, cooling issues and lack of space, which prompted it to dig deeper and reconsider the mainframe. DND found that by moving one-third of a group of applications to Linux on IBM System z, it would save at least $1.5 million. Today, DND has four mainframes that help it meet the imperatives of the Government of Canada’s Shared Services initiative, which, much like Shared First, aims to streamline IT, save money, end waste and avoid duplication. DND believes it is now well positioned to transform its operations.
Like any organization, the government needs the most relevant, accurate information to make quick, informed decisions. Centralized computing supports high-speed analytics for better decision making. It also eliminates costly off-platform duplication of software and data, and it allows cloud services to be provisioned inexpensively and at scale. And because the volume of digital data around us is growing exponentially, cloud computing is essential to making big data useful for government.
Now is the time for governments to get smarter about IT. Centralized computing can, in many cases, lower costs while making good use of big data and providing quick, easy access through the cloud. Of course, one size does not fit all, but let’s be sure we’re furthering the federal government’s IT agenda, not just turning into a bunch of box checkers.
Jim Porell is an IBM Distinguished Engineer and Deputy CTO, Public Sector.
Good point. Government does a lot of box-checking with respect to initiatives like this and never really considers what value is being delivered at the end of the day. So what does turning to IBM Maximo Asset Manager have to do with cloud / “Shared First”? It sounds like centralizing asset management was simply a good technical decision compared with a distributed environment. Was there some alternative that involved the cloud and would have been more costly?