By: Mike Tanner, President and CEO, Hitachi Data Systems Federal Corporation
It is said that “knowledge is power.” But today, government agencies must contend with a rapidly growing volume of data that can get in the way of actionable, mission-focused knowledge. From DoD to FTC to HHS, technology is vital to an agency’s success. Yet, while $80 billion is spent annually on information technology, there has been a long-standing disconnect in the government space between the agency mission and the technology supporting it.
Agencies are asked to store more data than ever before – data of varying diversity and complexity, structured and unstructured, coming not just from computers but also from a wide range of machines, devices and sensors. This information needs to be stored for varying lengths of time and at different levels of accessibility. Some may be needed immediately, some regularly, and some retrieved only rarely. But when it is needed, data must be quickly accessible no matter how it is stored. And it needs to be reliable and secure from end to end.
Technology itself changes at lightning speed. According to Moore’s Law, the number of transistors in an integrated circuit doubles roughly every two years. That pace complicates the ability to store, access and analyze critical data. Even if agencies procure state-of-the-art technology, the rate of change makes IT systems outdated very quickly. Over time, the increasingly vast quantity of data is housed on a variety of hardware – magnetic tape, disks and solid-state drives. Often that hardware is not compatible and/or has been procured from a variety of different vendors. According to U.S. Chief Information Officer Tony Scott, 80% of government is using legacy hardware.
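To make that pace concrete, here is a simple back-of-the-envelope sketch of a doubling-every-two-years curve (the starting transistor count is purely illustrative, not a figure from the article):

```python
# Rough illustration of Moore's Law-style growth: the transistor count
# roughly doubles every two years. Starting value is hypothetical.
base_count = 1_000_000_000   # assume 1 billion transistors today
years = 10
doublings = years / 2        # one doubling per two-year period

projected = base_count * 2 ** doublings
print(f"After {years} years: ~{projected:,.0f} transistors")
# Prints roughly 32,000,000,000 -- a 32x increase in a single decade,
# far outpacing a typical three-to-four-year procurement cycle.
```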
This complex data storage burden is carried by all agencies – big, medium and small. Although they have different goals, budgets, mandates and technical requirements, all of them are being asked to make sense of their data. They need to gain better insights so they can execute their missions and serve citizens to the best of their ability. Regardless of budget limits, they all need the ability to store more, and increasingly diverse, data that can be efficiently organized, accessed, manipulated, analyzed and secured.
To accomplish their goals, agencies’ information management systems must be able to adapt to market needs, scale for efficient growth and leverage legacy networks while also embracing innovative solutions. Unfortunately, complexity and volume make utilizing a single tool to accomplish these tasks next to impossible. Traditionally, agencies have had to make a choice between investing in enterprise-wide, feature-rich but high-cost solutions and settling for lower-cost, modular solutions that often have significantly fewer capabilities.
The good news is that agencies no longer need to make that choice. Instead of purchasing brand new hardware every three to four years, they can deploy a software-defined infrastructure that enhances legacy systems. They no longer need to rip and replace with each new hardware iteration. Indeed, one moderately priced and elegantly designed software-defined solution can virtualize storage and be flexible as the agency grows, evolves and adapts to new programs, citizen needs or changing budgets.
Software-defined data centers can fill the middle ground. Government can implement economical storage options that do not sacrifice functionality, speed, reliability or feature richness.
It is important that information technology managers understand the options available to help meet their agencies’ needs. Regardless of their storage budget, agencies can access the same data management, migration, virtualization, replication, flash performance optimization and data protection capabilities found in higher-end systems. They can use a virtualized, tiered storage architecture and consolidate diverse storage solutions into a single, managed set of tiered data pools.
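As a rough illustration of the tiering idea (not a description of any particular product), the sketch below shows how a policy might place data into pools based on how often it is accessed; the tier names and thresholds are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical storage tiers, ordered from fastest/most expensive to slowest/cheapest.
TIERS = ["flash", "disk", "tape_archive"]

@dataclass
class DataSet:
    name: str
    accesses_per_month: int   # how often the data is read or written

def assign_tier(dataset: DataSet) -> str:
    """Place a dataset in a tier using a simple access-frequency policy.

    Thresholds are illustrative; a real policy would also weigh capacity,
    cost, retention mandates, and security requirements.
    """
    if dataset.accesses_per_month >= 100:
        return "flash"          # hot data: needed immediately and often
    if dataset.accesses_per_month >= 5:
        return "disk"           # warm data: needed regularly
    return "tape_archive"       # cold data: retrieved only rarely

# Example: consolidating a mix of workloads into tiered pools.
workloads = [
    DataSet("citizen_portal_sessions", 5000),
    DataSet("monthly_reports", 12),
    DataSet("decade_old_case_files", 1),
]
for w in workloads:
    print(f"{w.name} -> {assign_tier(w)}")
```

The point of the sketch is the policy layer: once storage is virtualized into pools, data can be moved among tiers by software rather than by buying and migrating to new hardware.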
Software-defined solutions and virtual storage platforms can help technology managers save their agencies money, reduce complexity and increase performance by offering enterprise-quality functionality. They can help agencies keep up with the explosion of data, make it available 24/7, and ease the transition to new big data computing methods, such as the cloud. Gartner estimates that 82% of data will be virtualized by 2016.
Agencies no longer need to be caught between full-feature, high-cost solutions and low-cost, limited-feature solutions. There is a middle ground: feature-rich, agile, scalable software-defined systems, which will help open the door to more effective use of emerging technologies – such as cloud and mobile – and help properly align government missions and technology. In this way, complex and massive data can be turned into actionable knowledge to best serve the American public.