This blog post is an excerpt from GovLoop’s recent guide, the Open Data Playbook for Government. Download the full guide here.
Data is powerful. It can tell us more than we ever hoped to know about ourselves and the world around us. It’s also vital for solving today’s problems and predicting government’s future needs.
This is especially true for agencies with healthcare, law enforcement and cybersecurity missions, said Prem Jadhwani, Chief Technology Officer at Government Acquisitions. As a solutions provider and IT reseller, Government Acquisitions uses a combination of hardware, software and commercially available technologies to help agencies use data to solve their biggest challenges.
“There is lots of data — some of it may be internal data, some of it may be public data, and you have to analyze all of that to uncover valuable patterns,” Jadhwani said. But that’s easier said than done. Gathering, analyzing and disseminating large amounts of data is no easy task, especially when time is a constraint.
One of the agencies benefiting from Government Acquisitions’ expertise works extensively in the healthcare space. The agency is charged with figuring out how much flu vaccine must be developed and shipped across the U.S. during an outbreak.
The agency must run numerous computations on large datasets to figure out how much of the vaccine each state needs. It used to take days or weeks to crunch the numbers on that volume of data, but now the answers come back much faster. “They can get quick analytical results, which allows them to make the right decisions, and it affects people’s lives, especially when every second counts,” Jadhwani explained.
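For illustration only, here is a minimal sketch of the kind of per-state computation such a workload involves, using made-up numbers, hypothetical column names and a simple proportional allocation rule; the agency’s actual models are far more sophisticated.

```python
# Hypothetical sketch: split a national vaccine supply across states
# in proportion to reported case counts. The numbers, column names and
# allocation rule are illustrative assumptions, not the agency's model.
import pandas as pd

cases = pd.DataFrame({
    "state": ["CA", "TX", "NY"],
    "reported_cases": [120_000, 95_000, 80_000],
})

total_supply = 1_000_000  # doses available this cycle (assumed)
cases["share"] = cases["reported_cases"] / cases["reported_cases"].sum()
cases["allocated_doses"] = (cases["share"] * total_supply).round().astype(int)

print(cases[["state", "allocated_doses"]])
```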
In the cyber realm, data analytics is also vital to carrying out critical missions. In the wake of a breach, victim agencies must pore over data to determine what happened, how it happened and what can be done to prevent future breaches.
“Cyber breaches are becoming widespread, and now they are moving from just credit card numbers to Social Security numbers and healthcare records,” Jadhwani said. “Again, data analytics is a very powerful tool for analyzing all the different patterns and trends and coming up with a way that you can do predictive analysis and stop the cyber crime before it happens. The other area that is also affecting mission is the prevention of fraud, waste, and abuse.”
Much of what we’ve talked about until this point has focused on analyzing internal data, not data that is released to the public. There’s a difference.
“Big data is typically something that is within the agency — a large volume of data,” Jadhwani noted. “But open data is something that is publicly available, and it is licensed in a way that people can reuse that data, and it can be applied to the citizens in general.”
The Smart Policing Program, implemented in more than 30 U.S. police departments, funds and empowers local, data-focused crime prevention tactics. Here’s a brief recap of the initiative:
Working with solution providers, the law enforcement agencies collect and analyze GIS and other data using predictive analytics solutions. These solutions help agencies address problems such as street robberies, repeat violent offenders and neighborhood drug markets, and they have contributed to a double-digit drop in criminal activity in several jurisdictions.
Using publicly available crime statistics, participants can apply predictive analytics to historical data and forecast future trends. “Now they are able to effectively identify at-risk areas for crime and deploy the appropriate manpower patrols in the right locations,” Jadhwani said.
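As a rough illustration of the idea (not the program’s actual methodology), the sketch below flags “at-risk” districts by comparing recent incident counts against each district’s historical average in an open crime dataset. The file name, column names and threshold are assumptions.

```python
# Illustrative sketch: flag at-risk districts from open crime data by
# comparing the latest month's incident count to the historical average.
# File name, columns and the 1.25x threshold are assumed for illustration.
import pandas as pd

incidents = pd.read_csv("open_crime_data.csv", parse_dates=["date"])  # hypothetical export
incidents["month"] = incidents["date"].dt.to_period("M")

monthly = incidents.groupby(["district", "month"]).size().rename("count").reset_index()

baseline = monthly.groupby("district")["count"].mean()
latest = monthly[monthly["month"] == monthly["month"].max()].set_index("district")["count"]

# Districts running well above their own historical average get flagged.
at_risk = latest[latest > 1.25 * baseline.reindex(latest.index)]
print(at_risk.sort_values(ascending=False))
```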
When it comes to analyzing structured and unstructured data, there are a lot of innovative solutions to get the job done. Jadhwani highlighted several of them:
In-memory predictive analytics
Predictive in-line analytics with in-memory databases provide real-time operational analytics without requiring customers to maintain a separate data warehouse. The shift to an in-memory database significantly speeds up business intelligence and advanced visualization reporting, cutting turnaround from weeks to near real time.
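To make the concept concrete, here is a minimal sketch of in-memory analytics using DuckDB as the in-memory engine; the table and column names are illustrative assumptions, and any in-memory database could stand in.

```python
# Minimal sketch of in-memory analytics: load operational records into an
# in-memory database and run an aggregate query directly, with no separate
# data warehouse. Table and column names are illustrative assumptions.
import duckdb
import pandas as pd

transactions = pd.DataFrame({
    "agency": ["A", "A", "B", "B", "B"],
    "amount": [120.0, 75.5, 310.0, 42.0, 18.5],
})

con = duckdb.connect(":memory:")   # everything stays in RAM
con.register("transactions", transactions)

report = con.execute("""
    SELECT agency, COUNT(*) AS txns, SUM(amount) AS total_spend
    FROM transactions
    GROUP BY agency
    ORDER BY total_spend DESC
""").df()
print(report)
```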
Adoption of all-flash arrays in place of traditional hard drives
All-flash arrays, such as those from Pure Storage, are storage systems that are 100 percent solid state and provide a complete set of robust features, including compression, automatic data tiering, snapshots and replication, similar to traditional hard-drive-based storage systems. However, flash memory can help reduce data center operational costs and significantly speed up mission-critical analytics applications.
Massively parallel processing and hyperconverged infrastructure
Massively parallel processing means analytics can run on multiple datasets at the same time, speeding up query processing. Hyperconverged infrastructure solutions, such as Nutanix, allow agencies to build their data analytics environment in small chunks of compute and storage and scale out seamlessly as their workloads and data volumes grow. This way, they can start small and expand their analytics use cases in small increments.
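The parallel idea itself is simple, as this small sketch shows: partition a dataset, run the same analytic over each chunk simultaneously, then combine the results. The summing analytic and the chunking scheme are placeholders, not any particular product’s approach.

```python
# Sketch of parallel processing: split the data into partitions, run the
# same analytic on each partition in separate processes, then combine.
from concurrent.futures import ProcessPoolExecutor

def analyze(chunk):
    # Stand-in for a real per-partition analytic (aggregation, scoring, etc.).
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # four partitions

    with ProcessPoolExecutor(max_workers=4) as pool:
        partials = list(pool.map(analyze, chunks))

    print("combined result:", sum(partials))
```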