On the journey to modern data centers, federal agencies have undoubtedly come across the advice to “tear down data storage silos.” Those are wise words, but how can agencies actually follow them?
Today’s storage silos aren’t just office walls. They’re complicated contracts and legacy technology: proprietary lock-ins and the isolated data practices that come with them.
“We have to collapse data into a central repository. To do that, you have to look at modernizing your current infrastructure,” said Dallas Nash, Senior Director of Sales for Global Government at Dell EMC.
Simply put, the goal is to have data live in one shared space, accessible from anywhere. That centralized repository is known as a data lake.
GovLoop recently interviewed Nash about how agencies can turn unstructured data into a collective resource. Below are the areas he outlined – the edge, core and cloud.
1. EDGE
Increasingly, data is generated outside offices and at the edge, where work is now conducted. While valuable, information at the edge is often unstructured and goes unused – picture the footage from video surveillance cameras that is never touched. Unstructured data can be critically important, but it lacks consistent standards for format, type and values.
The imperative for modern data intake, Nash said, is to start with an end goal in mind. Without a target for unstructured data and the ability to glean real-time business intelligence, agencies are left with dormant data that takes up space and drives up costs.
If those cameras had their data pooled and analyzed to identify license plates and vehicle types, that information would carry true mission-critical value. Acted on, it could save lives.
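As a rough illustration of that idea, here is a minimal sketch in Python. The pooled-frame directory and the recognize_plate() helper are hypothetical stand-ins – not an agency system or a Dell API – and a real deployment would plug in an actual license plate recognition (ALPR) model.

```python
import csv
from pathlib import Path

# Hypothetical layout: frames from every camera are pooled in one shared directory.
FRAME_POOL = Path("/data-lake/cameras/frames")  # illustrative path

def recognize_plate(frame: Path):
    """Placeholder for an ALPR call, e.g., an open-source library or a
    cloud vision service. Returns None when no plate is found."""
    return None  # a real model might return {"plate": "ABC123", "vehicle_type": "truck"}

def scan_frames(out_csv="plate_hits.csv"):
    """Run plate recognition over every pooled frame and record the hits."""
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "plate", "vehicle_type"])
        for frame in sorted(FRAME_POOL.glob("**/*.jpg")):
            hit = recognize_plate(frame)
            if hit:  # only frames where a plate was actually identified
                writer.writerow([frame.name, hit["plate"], hit["vehicle_type"]])
```

The point is less the recognition itself than the pooling: once frames from every camera land in one place, a single pass can turn raw footage into structured, searchable records.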
2. CORE
To make information actionable, all of those external sources must come together in a core, scale-out network-attached storage solution – in other words, a data lake.
A data lake is a large-scale repository for unstructured data. Different data streams throughout the enterprise – sensors, cameras and file portals – flow like tributaries into one central location.
The benefit of a data lake is that data scientists can return to the same watering hole for reliable information from established sources. They don’t have to waste time retrieving data from scattered, disparate sources.
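To make the tributary idea concrete, here is a minimal sketch, assuming a hypothetical shared mount point for the lake; the paths and stream names are illustrative only.

```python
import shutil
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical central namespace: one lake, many tributaries.
LAKE_ROOT = Path("/mnt/data-lake")

def ingest(source_file: Path, stream: str) -> Path:
    """Copy one record from a tributary (sensor, camera, file portal)
    into the shared lake under a predictable, dated layout."""
    stamp = datetime.now(timezone.utc).strftime("%Y/%m/%d")
    dest_dir = LAKE_ROOT / stream / stamp
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.copy2(source_file, dest_dir / source_file.name))

# Every producer writes to the same lake, e.g.:
#   ingest(Path("/edge/cam01/0001.jpg"), stream="cameras")
#   ingest(Path("/edge/sensor07/readings.json"), stream="sensors")
# ...so analysts always read from the same watering hole:
#   camera_files = (LAKE_ROOT / "cameras").glob("**/*")
```

Because every stream lands under one root with a predictable layout, a data scientist needs to learn exactly one path convention, not one per source.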
“The data lake allows both customers and data creators to have mass-scale ability to pull value out of data,” Nash said.
What agencies need to consider is how information traverses the network into the data lake. Platforms like Dell Isilon and PowerScale move data from the edge to the core.
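Those platforms handle edge-to-core movement natively. Purely to illustrate the flow, here is a minimal one-way sync sketch, assuming hypothetical edge and core mount points rather than any Isilon or PowerScale API.

```python
import shutil
from pathlib import Path

# Hypothetical mount points: an edge device's local capture directory
# and the core NAS (e.g., an NFS mount) where the data lake lives.
EDGE_DIR = Path("/edge/capture")
CORE_DIR = Path("/mnt/core-nas/ingest")

def sync_edge_to_core() -> int:
    """One-way sync: push any edge file the core doesn't have yet."""
    moved = 0
    for src in EDGE_DIR.rglob("*"):
        if not src.is_file():
            continue
        dest = CORE_DIR / src.relative_to(EDGE_DIR)
        if not dest.exists():  # skip files already present at the core
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)
            moved += 1
    return moved
```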
3. CLOUD
Finally, agencies need to link the core – the data lake – to the cloud, where ready-to-deploy artificial intelligence (AI) solutions assess and analyze unstructured data. The cloud makes those insights portable and accessible.
Dell Technologies offers a low-lift AI starter pack with compute, networking and storage so that agencies can flexibly deliver insights across multiple data centers and private, public or hybrid clouds.
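As a sketch of what linking the lake to the cloud can look like, the snippet below posts a file from the lake to a cloud-hosted analysis endpoint. The URL, authentication and response shape are assumptions for illustration, not part of any specific Dell offering.

```python
from pathlib import Path

import requests

# Hypothetical endpoint: a cloud-hosted model that tags unstructured files.
INFER_URL = "https://cloud.example.gov/api/v1/analyze"  # illustrative only

def analyze(lake_file: Path) -> dict:
    """Ship one file from the data lake to a cloud AI service and
    return its structured assessment (labels, entities, etc.)."""
    with open(lake_file, "rb") as f:
        resp = requests.post(INFER_URL, files={"file": f}, timeout=60)
    resp.raise_for_status()
    return resp.json()
```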
Eventually, agencies might settle on a private cloud for simplicity and consistency, Nash said. Private clouds can also eliminate the significant data egress fees that public clouds charge.
“Options are wonderful, especially when looking to make that initial move,” Nash said.
This article is an excerpt from GovLoop’s recent guide, “Your Data in the Year of Everything Else.” Download the full guide here.