With pockets of data in multiple, unconnected repositories and formats, it has become much more difficult to understand where data is and how it is being used. This is a particularly thorny issue for government agencies, which often have complex information-sharing relationships with other agencies.
With a background in both federal and state government, David DeVries has experienced these challenges firsthand. As Chief Information Officer (CIO) at the Office of Personnel Management (OPM), for example, he saw how complicated it was to keep information from different parts of an employee’s lifecycle separate, as the law requires, but also usable and sharable.
“During employees’ active years, the agency manages the data — not only basic data, but things like background investigations. But when an employee retires, OPM takes over, managing retirement benefits,” DeVries said. “So during an employee’s lifetime, there may be multiple types of data that have to be kept separate for compliance reasons, yet it’s also important to be able to look at that data holistically.”
Lack of visibility into where data resides and how it is being used can also hide security threats and lead to performance degradation. A recent report from the SANS Institute found that many IT professionals cite limited visibility and the complexity of managing data across infrastructures as major impediments to effective security defense. Poor visibility also raises concerns about data privacy and the risk of exposing personally identifiable information and other sensitive data.
Compliance and data governance also suffer in this environment. With data split across separate repositories, it can be particularly hard to meet service-level agreements and agency requirements for security, privacy and data management.
Solution: Holistic Data Management
The most effective way to address these issues is to manage data holistically. It is by far the best way to know what data you have, where it is and whether it is fully secure, and it allows agencies to govern that data in ways that comply with all requirements. Data management can no longer simply be delegated to individual business owners.
“Data is the lifeblood of an organization, from when the data is first created to the end of its natural lifecycle, when it is archived,” said Richard Breakiron, Director of Strategic Initiatives for the Federal, Civilian and Intelligence Community at Commvault.
And it works. According to a recent report from IDC, holistic data management results in a 50% to 61% reduction in exposure to compliance or audit failures and a 44% decrease in annual spending on data infrastructure.
The first step toward holistic data management is to embrace a multi-cloud environment with automated backup. This reduces costs, eliminates hardware and storage maintenance, and automates the movement of data across storage tiers and platforms throughout its lifecycle.
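To make the tiering idea concrete, the sketch below shows one way automated lifecycle movement can be expressed on a single cloud provider. It is a minimal example assuming an AWS S3 bucket managed with the boto3 library; the bucket name, tiering schedule and retention period are illustrative only, and a commercial data management platform would typically configure this across clouds and on-premises storage on the agency's behalf.

```python
"""Minimal sketch of automated lifecycle tiering on AWS S3.

The bucket name "agency-records" and the schedule are hypothetical;
an agency's actual retention policy would differ.
"""
import boto3

s3 = boto3.client("s3")

# Move objects to cheaper storage tiers as they age, then expire them
# at the end of an illustrative retention period.
s3.put_bucket_lifecycle_configuration(
    Bucket="agency-records",  # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-archive",
                "Filter": {"Prefix": ""},  # apply to every object in the bucket
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 90, "StorageClass": "STANDARD_IA"},  # infrequent-access tier
                    {"Days": 365, "StorageClass": "GLACIER"},     # archive tier
                ],
                "Expiration": {"Days": 2555},  # roughly seven years, illustrative only
            }
        ]
    },
)
```

Once a rule like this is in place, the cloud provider handles the transitions automatically, so no one has to remember to move or delete aging records by hand.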
The next step is standardizing on a solution that provides centralized visibility, full integration with all data repositories and applications, a complete audit trail, automated data analysis, and advanced security capabilities.
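As a rough illustration of the centralized-visibility piece, the sketch below builds a single inventory of objects across every storage bucket an account can see. It assumes AWS S3 and the boto3 library and is purely a toy example; a full solution would also integrate on-premises repositories, applications, audit trails and analytics, which this sketch does not attempt.

```python
"""Toy sketch of centralized visibility at the storage layer.

Builds one inventory recording where objects live, how large they are
and when they were last modified. Assumes AWS S3 via boto3.
"""
import boto3


def build_inventory():
    """Return a list of records describing every S3 object the account can see."""
    s3 = boto3.client("s3")
    inventory = []
    for bucket in s3.list_buckets()["Buckets"]:
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=bucket["Name"]):
            for obj in page.get("Contents", []):
                inventory.append(
                    {
                        "repository": bucket["Name"],
                        "key": obj["Key"],
                        "size_bytes": obj["Size"],
                        "last_modified": obj["LastModified"].isoformat(),
                    }
                )
    return inventory


if __name__ == "__main__":
    records = build_inventory()
    repositories = {r["repository"] for r in records}
    print(f"{len(records)} objects catalogued across {len(repositories)} repositories")
```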
In addition to solving data governance, compliance, security and storage challenges, holistic data management can help agencies make better use of advanced analytics, data sharing, artificial intelligence, machine learning and the Internet of Things.
“The more access you have to more data sources, the better your analytics can get,” Breakiron said.
Holistic data management also helps agencies prepare for the unexpected. When the COVID-19 crisis forced government employees to work from home, for example, agencies that already practiced holistic data management were better positioned to keep data available to their newly remote workforce.
This article is an excerpt from GovLoop’s recent report, “Cloud-Based Data Management: A Holistic Approach.”