This interview is an excerpt from GovLoop’s recent guide, What the Internet of Things Means for the Public Sector, which explores insights into, and best practices for, how government is using automated machine-to-machine transactions and implementing IoT in the public sector.
As organizations connect more devices, they produce more data. This isn’t an earth-shattering revelation. In fact, that’s the point of the Internet of Things (IoT). However, Chris Steel, Chief Solutions Architect at Software AG Government Solutions, an enterprise software company, explained that the implications of that data deluge are worth thinking about.
“If you think of the amount of additional data that agencies are now going to be exposed to through the Internet of Things, they are just simply not prepared to deal with it,” Steel said.
According to Steel, many organizations make the mistake of accumulating unnecessary data in case they find relevance for it later. “Up until now that’s been fine, because our storage capacity has been increasing exponentially over the last 20 years,” he explained. “But what the Internet of Things does is bring a whole new magnitude to the amount of information that we’re trying to grab, and we just simply aren’t going to be able to take the traditional approach and store all that data for later analysis.”
Accommodating IoT data requires a more thoughtful approach. “What we need to do is decide up front what the valuable data is and triage it. Determine what data you’ll ingest, how you’re going to store it, and for how long,” Steel said. “Otherwise we’re going to be overwhelmed with this deluge of unnecessary information that’s going to tie up all of our storage and hamper efforts to analyze it.”
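The triage Steel describes can be sketched in code. The following is a minimal illustration, not anything from the interview: the sensor types, thresholds, and retention periods are all hypothetical, and a real policy would be driven by an agency’s own data governance rules.

```python
from dataclasses import dataclass

@dataclass
class TriageRule:
    ingest: bool          # keep this reading at all?
    retention_days: int   # how long to store what we keep

def triage(reading: dict) -> TriageRule:
    """Decide up front whether a sensor reading is worth storing,
    and for how long. All branches here are illustrative."""
    if reading.get("type") == "heartbeat":
        # Routine device heartbeats: confirm liveness, then discard.
        return TriageRule(ingest=False, retention_days=0)
    if reading.get("value", 0) > reading.get("alert_threshold", float("inf")):
        # Anomalous readings are the valuable data: keep them long-term.
        return TriageRule(ingest=True, retention_days=365)
    # Normal readings: keep only a short rolling window for trend analysis.
    return TriageRule(ingest=True, retention_days=7)

normal = triage({"type": "temperature", "value": 70, "alert_threshold": 100})
alert = triage({"type": "temperature", "value": 120, "alert_threshold": 100})
```

The point of deciding this at ingest time, rather than after the fact, is that the discarded data never ties up storage in the first place.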
Steel advised starting your IoT data analytics journey by first deciding what questions you’re trying to answer or what opportunities you’re trying to seize. This is in contrast to the approach many organizations take, where they look at their cache of data to determine both the problem and solutions it might provide. However, starting with a problem allows an agency to extract the data that could offer a solution and discard the data that won’t.
Once an organization targets the data that will solve its original problem, it can begin to invest in analytics. Steel said that to make the most of this investment, agencies must think holistically. “We’re not going to have one tool or one process that’s going to allow us to seize these opportunities. What we need is an entire architecture that’s going to have a lot of different pieces,” he said.
According to Steel, specific components of this architecture will vary depending on unique business needs. However, certain tools will be necessary across all organizations:
- Real-time streaming analytics engine: “This is going to be able to scale to millions of events per second,” Steel explained. “That will allow us to ingest all the different data we’re getting from different sensors and devices and perform our analyses in flight, storing only the results, rather than all the unnecessary raw data.”
- Predictive analytics: This capability allows organizations to detect and mitigate problems before they occur, preventing disruptions, saving costs, and decreasing maintenance requirements.
- Visual analytics capabilities: “Now that we have all this data, it’s easy to get overwhelmed. What we want is a powerful class of visual analytics tools that allow us to sift through it to find the right data,” said Steel. Such a tool permits organizations to easily identify anomalies without having to manually parse through large amounts of data.
- Integration engine: This component allows different connected devices, operating on different protocols, to work together.
- In-memory computing: The value of data comes from being able to take action. Therefore, “we need to be able to do the analysis right here in the moment, and to do that we need in-memory computing,” said Steel. Storing data in memory allows for faster processing speeds to quickly execute analytics.
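Two of the components above, streaming ingestion and in-memory computing, can be illustrated together. The sketch below (with a hypothetical event format and sensor names) analyzes events in flight, keeping only running per-sensor statistics in memory and discarding the raw readings, in the spirit of storing “only the results, rather than all the unnecessary raw data.”

```python
from collections import defaultdict

class StreamingAggregator:
    """Analyze sensor events in flight: retain per-sensor running
    statistics in memory, never the raw event stream itself."""

    def __init__(self):
        self.stats = defaultdict(
            lambda: {"count": 0, "total": 0.0, "max": float("-inf")}
        )

    def ingest(self, sensor_id: str, value: float) -> None:
        s = self.stats[sensor_id]
        s["count"] += 1
        s["total"] += value
        s["max"] = max(s["max"], value)
        # The raw value goes out of scope here; only aggregates survive.

    def mean(self, sensor_id: str) -> float:
        s = self.stats[sensor_id]
        return s["total"] / s["count"]

agg = StreamingAggregator()
for v in (10.0, 20.0, 30.0):
    agg.ingest("pump-7", v)
```

A production system would use a purpose-built streaming engine rather than a single in-process class, but the design choice is the same: the analysis happens at the moment of ingestion, so storage holds results instead of millions of raw events.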
The way these components integrate to create a holistic architecture specific to an organization’s individual needs will vary. That’s why Software AG always takes the time to first talk with government customers about what value they want from their connected devices and the data they produce. Then, they create a proof-of-value demo to test the proposed architecture before it’s implemented.
“What we want to do is provide access to the right data in the right time frame so that we can take action on it or gain opportunity out of it,” Steel noted.
Finally, Steel re-emphasized that the power of the IoT lies not in the sheer volume of data and connections it contains but in the timely access to the right data that it makes possible. With access to the right data sources, with tools to capture, manipulate, and make sense of that data in real time, and with an architecture designed to enable a well-considered, real-time response to the opportunities a specific situation presents, an organization can be well-positioned to succeed in the IoT.
Photo Credit: Flickr/Gilad Lotan