This blog post is an excerpt from GovLoop’s recent guide, How You Can Use Data Analytics to Change Government. Download the full guide here.
What does 2016 look like for government IT? We may be in for a few surprises, but one thing is clear: data analytics will play a key role. There are several areas where data and analytics will drive more value in government, including data transparency, tax fraud, and transportation logistics.
Data analytics will require a continued effort to analyze increasingly large datasets. And these larger datasets are arriving at government agencies faster than ever before, with greater variety and complexity.
This is where the challenges arise. With more data to analyze, government agencies can easily be overwhelmed as they try to maintain system performance and data integrity while protecting sensitive information. And because most agencies have legacy systems in place, it can be difficult to integrate new systems and different vendors into their existing analytics ecosystems.
There are ways to combat these challenges. In an interview with GovLoop, Alan Ford, Director of Presales Consulting for Teradata Government Systems, shared how agencies can leverage data analytics, including those in the cloud, to generate actionable information and enable better data sharing within and across agencies.
Teradata provides end-to-end solutions and services like data warehousing to enable government agencies to create unified data ecosystems. The Teradata Unified Data Architecture, for example, integrates different platforms into a comprehensive analytics solution that enables fast, deep, and powerful data management, storage, and exploration. With this architecture and its inherent analytic capability, agencies can complement and augment their legacy systems rather than abandoning them.
These technologies will help all areas of government. Ford cited healthcare fraud as a major data analytics problem for government. Healthcare comprises some of the largest annual expenditures in both the private and public sectors. Medicare and Medicaid, in particular, receive a significant amount of government funding, which makes them attractive targets for fraudulent activity.
According to Ford, however, there is hope with increased data digitization and analytics automation. “Digitization has created a new way of automating fraud detection,” Ford said. “By combining traditional claims data with other separate but publicly available data – like bankruptcies, liens, judgments, or even repeated address changes – into an integrated data model or a unified data architecture, we can run advanced analytics that make fraud easier to track or even proactively predict the likelihood of its occurrence.”
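To make that idea concrete, here is a minimal Python sketch of the approach Ford describes: joining traditional claims data with publicly available records and scoring the likelihood of fraud. The file names, column names, and the simple logistic regression model are hypothetical stand-ins for illustration, not Teradata's actual integrated data model.

```python
# Illustrative sketch: enrich claims data with public records and score fraud risk.
# File names, column names, and the model are hypothetical stand-ins.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Traditional claims data: one row per provider with billing aggregates
# and a historical fraud label used for training.
claims = pd.read_csv("claims_by_provider.csv")    # provider_id, total_billed, claim_count, fraud_label

# Separate but publicly available data mentioned in the quote.
public = pd.read_csv("public_records.csv")        # provider_id, bankruptcies, liens, judgments, address_changes

# Integrate the two sources into a single analytic table.
model_table = claims.merge(public, on="provider_id", how="left").fillna(0)

features = ["total_billed", "claim_count", "bankruptcies",
            "liens", "judgments", "address_changes"]

# Fit a simple model on historically labeled cases...
clf = LogisticRegression(max_iter=1000)
clf.fit(model_table[features], model_table["fraud_label"])

# ...then proactively predict the likelihood of fraud for each provider.
model_table["fraud_risk"] = clf.predict_proba(model_table[features])[:, 1]
print(model_table.sort_values("fraud_risk", ascending=False).head(10))
```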
Teradata’s Aster Discovery Platform is one tool that helps run analytics that could only be done manually in the past. “With this tool, we can run older transcribed handwritten records or typed reports through complex text analytics capabilities and sentiment analysis in an automated fashion, dramatically decreasing the time to results,” Ford said. “And with less expensive storage accessible by sophisticated techniques, all within the unified data architecture, we’ll be able to analyze all the data rather than just representative samples of it.”
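A similarly hedged sketch of the automated text-analytics step might look like the following, where a basic keyword scan over digitized case notes stands in for the far richer text and sentiment analytics described above. The file and field names are assumptions for the example.

```python
# Illustrative sketch: automated review of digitized free-text records.
# The keyword scan is a simple stand-in for richer text analytics; the
# file name and column names are hypothetical.
import csv
import re

FRAUD_INDICATORS = re.compile(
    r"\b(duplicate|unbundl\w*|upcod\w*|kickback|phantom|ineligible)\b",
    re.IGNORECASE,
)

def score_note(text: str) -> int:
    """Count fraud-related terms in a single digitized record."""
    return len(FRAUD_INDICATORS.findall(text))

flagged = []
with open("digitized_case_notes.csv", newline="") as f:
    for row in csv.DictReader(f):                 # columns: record_id, note_text
        hits = score_note(row["note_text"])
        if hits:
            flagged.append((row["record_id"], hits))

# Records with the most indicator terms surface first for review,
# replacing a manual read of every transcribed report.
for record_id, hits in sorted(flagged, key=lambda x: -x[1]):
    print(record_id, hits)
```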
When looking to the future, Ford predicts data analytics and cloud will make a strong pairing for government. Governments will benefit from the reduced costs enabled by such technologies, so agencies can do more with less.
“The whole movement into the cloud environment is a huge cost saver, because no longer do organizations have to set up and maintain their own infrastructure,” Ford said. “It’s taken care of by the cloud provider. In this way, agencies convert acquisition costs into operating costs.”
This big leap to cloud and data analytics, however, may seem out of reach for government agencies with legacy systems. Having spent a tremendous amount of money acquiring those systems over time, agencies are reluctant to replace familiar platforms.
That’s where Ford says “the Holy Grail of analytics” – the unified data architecture – can come in handy. “This is an ecosystem which allows for the harnessing of all of an agency’s data,” he said. “We also refer to it as the logical data warehouse because it brings the right type of analytical processing to the underlying datasets. Further, it allows users to combine and analyze data from multiple disparate platforms all in a single query.”
A logical data warehouse can subsume older platforms, so agencies that choose not to transform and migrate their data into an integrated data model can still leverage their legacy systems.
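As a rough, toy-scale illustration of that "single query over disparate platforms" idea, the sketch below joins an in-memory SQLite table (standing in for the analytic warehouse) with a CSV export (standing in for a legacy system). It is only a conceptual analogy, with hypothetical names and numbers, not how the Unified Data Architecture is actually implemented.

```python
# Toy illustration of treating disparate platforms as one logical dataset.
import sqlite3
import pandas as pd

# "Warehouse" side: current-year claim counts by program.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE claims (program TEXT, fy2016_claims INTEGER)")
warehouse.executemany("INSERT INTO claims VALUES (?, ?)",
                      [("Program A", 1200), ("Program B", 800)])
current = pd.read_sql_query("SELECT * FROM claims", warehouse)

# "Legacy" side: prior-year figures kept in an older system's export.
history = pd.read_csv("legacy_claims_history.csv")  # columns: program, fy2015_claims

# A logical data warehouse lets analysts treat both as a single dataset;
# here, a pandas merge plays that role to produce one combined result.
combined = current.merge(history, on="program", how="left")
combined["change"] = combined["fy2016_claims"] - combined["fy2015_claims"]
print(combined)
```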
With a unified data architecture that accommodates legacy systems, complemented by cloud technologies, government agencies can better manage the vast amounts of data accelerating toward them. The future of data analytics does not have to be daunting. With a unified data architecture, inside or outside of the cloud, government can be ready for advanced analytical capabilities and harness large influxes of complex data to further agency initiatives.