by John L Pulley
Federal agencies have long mitigated risks to industrial control systems by separating or “air-gapping” information technology (IT) and operational technology (OT).
OT systems manage a wide range of industrial control systems that are prime targets for terrorists and saboteurs. Unlike IT systems, which were built to connect safely to the outside world, OT systems were designed to operate in a virtual vacuum, which leaves them vulnerable once they are exposed to external threats.
Much of the OT in use today was designed when network connectivity was limited and operational activities were concentrated at a few locations. From the beginning, the isolation of OT was a security feature. Like a person born without an immune system, OT existed and functioned in a bubble.
That bubble is threatening to burst.
As agencies pursue efficiency, OT and IT networks are converging. Ramping up the use of automation and remote monitoring of geographically distributed OT systems increases cyber vulnerability. Deploying Internet of Things (IoT) devices — cameras, thermal imaging systems, thermostats — further increases the security risk to OT systems.
This gradual convergence of OT, IT and IoT systems is helping agencies reduce costs and improve effectiveness through greater agility. It also delivers some security benefits, such as the ability to apply modern cybersecurity tools and techniques to OT. In the process, however, it introduces new operational and security challenges that must be mitigated.
One problem is that the aging distributed control systems (DCS) and supervisory control and data acquisition (SCADA) systems at the center of many OT environments were never intended to connect to IT networks. As a result, remote access, visibility and threat analysis were not design priorities.
“The level of processing power that some of these OT devices have can’t stand up to network scans,” said Andrew Callan, Systems Engineer for ClearShark. “They fall over.”
In addition, IT best practices such as vulnerability scanning, patch management, and endpoint detection and response often aren't feasible on legacy OT systems.
“Some of these systems use such old operating systems, there’s not a patch available,” said Darshan Shah, Senior Manager for Solutions Marketing at Gigamon. “Endpoint detection is not going to work in those situations, and security falls back to the network.”
Two other key challenges compound the problem:
- Blind spots in data center and cloud visibility that prevent security tools from seeing a complete picture of all network activity. Most network security tools were designed to protect communications based on the TCP/IP protocol suite, but OT networks often rely on specialized protocols. Some of these protocols are proprietary, and even the standards-based ones are not always supported by commercial security products. And even when a security product does support the necessary protocol, constraints of the OT architecture may make it impractical for the tool to gain visibility into the relevant traffic flows (see the protocol-parsing sketch after this list).
- Escalation of network traffic volumes to levels that overwhelm the security monitoring and enforcement tools used to protect the environment. Monitoring tools can get overwhelmed by large amounts of duplicate or irrelevant traffic, limiting their accuracy and effectiveness. This makes it difficult to inspect traffic in a timely manner, which puts mission-critical functions at risk.
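To make the first challenge concrete, here is a minimal sketch, not drawn from the report, of what protocol-specific parsing involves. It decodes the MBAP header of a Modbus TCP frame, one of the more common standards-based OT protocols; a tool that understands only generic TCP/IP headers would see nothing but an opaque payload on port 502.

```python
import struct

# Minimal sketch: parse the 7-byte MBAP header plus function code of a
# Modbus TCP application data unit (ADU). Illustrative only; a production
# parser would validate lengths and handle exception responses.
MODBUS_TCP_PORT = 502  # well-known Modbus TCP port

FUNCTION_CODES = {
    1: "Read Coils",
    3: "Read Holding Registers",
    5: "Write Single Coil",
    6: "Write Single Register",
    16: "Write Multiple Registers",
}

def parse_modbus_adu(payload: bytes) -> dict:
    """Decode the MBAP header and function code from a raw TCP payload."""
    if len(payload) < 8:
        raise ValueError("payload too short to be a Modbus TCP ADU")
    transaction_id, protocol_id, length, unit_id, function_code = struct.unpack(
        ">HHHBB", payload[:8]
    )
    if protocol_id != 0:
        raise ValueError("protocol identifier should be 0 for Modbus")
    return {
        "transaction_id": transaction_id,
        "length": length,
        "unit_id": unit_id,
        "function": FUNCTION_CODES.get(function_code, f"code {function_code}"),
    }

if __name__ == "__main__":
    # Example frame: read 2 holding registers starting at address 0 on unit 1.
    frame = bytes.fromhex("000100000006010300000002")
    print(parse_modbus_adu(frame))
```

Multiply this kind of protocol-specific handling across the many proprietary and standards-based protocols found in a typical OT environment, and the visibility gap becomes clear.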
Without attention, these issues present real-world safety and security risks. Unfortunately, performing the necessary security tool upgrades to keep up with escalating traffic volumes is cost-prohibitive for many budget-constrained agencies.
Solution: A Holistic Approach to IT, OT and IoT Security
A robust visibility and analytics solution provides a scalable and comprehensive platform for ongoing insights into both legacy OT and modern IT networks, along with the ability to deliver optimized traffic feeds to security tools that protect converged IT, OT and IoT environments.
A third-party solution from a reputable vendor will extend visibility across legacy OT infrastructures, modern on-premises data centers and public cloud infrastructure. This includes tapping the switches that carry Ethernet traffic between programmable logic controllers (PLCs) and human-machine interfaces (HMIs), along with other methods of observing OT traffic.
“With OT, IoT and IT convergence, visibility is more critical than ever before,” said John Quezada, a Federal Sales Engineer at Gigamon. “You always want to have visibility into what’s going on within the environment and be able to monitor it.”
In addition to delivering visibility across converged OT, IT and IoT environments, premier solutions also help agencies overcome the performance and scalability challenges mentioned above.
A good solution “offloads a lot of the processes that tools typically have to apply to traffic before they can get to the core mission of analyzing,” Quezada said. “A lot of legacy tools are only expecting traffic at a rate of 100 megs or less. One or 10 gigabits per second line rates will be more than these tools can handle.”
Integrated traffic de-duplication reduces the amount of unnecessary traffic directed to security monitoring and enforcement tools, often by 40 to 60 percent. That prevents the packet loss that can occur when tools are overloaded and avoids expensive upgrades. Many medium-sized organizations have saved millions of dollars in tool upgrades by using traffic reduction techniques such as de-duplication, packet slicing, flow mapping and NetFlow generation.
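The core idea behind de-duplication is simple enough to sketch, assuming a software model rather than the line-rate hardware a real visibility fabric uses: hash each packet and drop additional copies seen within a short window, which is roughly what happens when the same flow is mirrored from multiple tap points.

```python
import hashlib
import time

# Simplified sketch of packet de-duplication: forward the first copy of a
# packet and drop identical copies seen within a short window. Real de-dup
# engines do this at line rate and hash selected header fields plus payload.
WINDOW_SECONDS = 0.05  # duplicates from multiple tap points arrive close together

_last_seen = {}  # packet digest -> timestamp of the last copy seen

def is_duplicate(packet: bytes) -> bool:
    """Return True if an identical packet was already seen within the window."""
    now = time.monotonic()
    digest = hashlib.sha1(packet).hexdigest()
    previous = _last_seen.get(digest)
    _last_seen[digest] = now
    return previous is not None and (now - previous) <= WINDOW_SECONDS

def forward_unique(packets):
    """Send only non-duplicate packets on to the monitoring tool."""
    return [p for p in packets if not is_duplicate(p)]

if __name__ == "__main__":
    mirrored = [b"flow-a", b"flow-a", b"flow-b", b"flow-a", b"flow-c"]
    kept = forward_unique(mirrored)
    print(f"{len(kept)} of {len(mirrored)} packets forwarded to tools")
```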
Consistent use of smart application filtering culls data at application-layer (Layer 7) granularity, so each tool receives only the traffic it needs to perform its critical tasks.
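The sketch below illustrates the concept under simplifying assumptions: flows are classified by well-known port rather than true Layer 7 inspection, and the tool names and subscriptions are hypothetical placeholders, not a vendor feature list.

```python
# Simplified sketch of application-aware filtering: classify flows and send
# each monitoring tool only the application traffic it actually inspects.
APP_SIGNATURES = {
    502: "modbus",    # common Modbus TCP port
    20000: "dnp3",    # common DNP3 port
    443: "tls",
    53: "dns",
}

# Hypothetical mapping of monitoring tools to the applications they need.
TOOL_SUBSCRIPTIONS = {
    "ot-ids": {"modbus", "dnp3"},
    "dns-analytics": {"dns"},
}

def classify(dst_port: int) -> str:
    """Very rough application classification by well-known port."""
    return APP_SIGNATURES.get(dst_port, "other")

def tools_for(dst_port: int) -> list:
    """Return the tools that should receive a flow to this port."""
    app = classify(dst_port)
    return [tool for tool, apps in TOOL_SUBSCRIPTIONS.items() if app in apps]

if __name__ == "__main__":
    for port in (502, 443, 53):
        print(port, "->", tools_for(port) or ["dropped"])
```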
Reducing the burden of processing irrelevant data helps agencies apply the Purdue model, the widely used reference architecture for segmenting industrial control networks into levels, and other best practices at scale without interruptions or unnecessary costs.
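As a toy illustration of Purdue-style segmentation, the sketch below flags flows that skip levels, such as an enterprise host talking directly to a PLC. The level assignments and host names are illustrative assumptions, not a prescribed architecture.

```python
# Toy sketch of a Purdue-model segmentation check: flag traffic that skips
# levels. Level assignments here are illustrative, not a reference design.
PURDUE_LEVELS = {
    "sensor": 0,        # physical process
    "plc": 1,           # basic control
    "hmi": 2,           # supervisory control
    "historian": 3,     # site operations
    "dmz-proxy": 3.5,   # IT/OT DMZ
    "erp-server": 4,    # enterprise systems
}

def is_allowed(src: str, dst: str, max_hop: float = 1.0) -> bool:
    """Allow flows only between the same or adjacent Purdue levels."""
    gap = abs(PURDUE_LEVELS[src] - PURDUE_LEVELS[dst])
    return gap <= max_hop

if __name__ == "__main__":
    print(is_allowed("hmi", "plc"))          # True: adjacent levels
    print(is_allowed("erp-server", "plc"))   # False: skips the DMZ and supervisory layers
```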
This article is an excerpt from GovLoop’s report, “You Can’t Secure What You Can’t See: Cybersecurity in a Converged IT/OT/IoT Environment.” Download the full report here.