This blog has been repurposed from the May-June 2020 edition of InTech.
Awareness of challenges and collaboration on solutions can secure critical resources
Protecting automation systems from cyberattacks—particularly those in critical infrastructure—has been an imperative for almost 20 years. Professional and trade associations have developed standards, practices, and guidelines. Government agencies and national laboratories in several countries have supplied frameworks, guidelines, and in some cases, regulations. Traditional automation companies have retooled their product and service offerings to focus more on security. Entirely new companies are offering technology and services to address perceived needs. In spite of all of this activity and investment, many experts and industry pundits say that we are still far from being able to ensure the security of these critical systems. What is holding us back?
There are as many answers to this question as there are specific circumstances. However, there are some common themes. Many asset owners find it difficult to define the business case for changing their security practices without evidence of specific and pressing risk. Others may struggle with selecting from what often appears to be a large, confusing, and perhaps conflicting collection of available guidance. Many small to midsized companies simply do not have the necessary staff or expertise to adequately address the need for a cybersecurity program.
Common or industry specific?
One of the more interesting topics of discussion is whether standards and guidance should be broad and general or tailored to specific industries. It has been hotly debated within the industrial cybersecurity community for several years. The assertion that industries are more similar than different, and that standards and practices should be developed to be applicable across industries has elicited a range of responses depending on the perspective of the individual. There is an interesting analogy here with the Kübler-Ross model, also known as the stages of grief, which describes a progression of emotional states associated with traumatic events (figure 1).
While this model may not be a perfect fit for the situation, it does supply a context and steps for moving the discussion beyond the question of the suitability of particular guidance and toward what is required to reduce cyberrisks. The aim must be to gain maximum value from the guidance and practical examples available, without discounting information simply because it comes from a different industry.
Similar risks and consequences
There are certainly differences between industries, but the risks associated with potential cyberattacks or security deficiencies are quite similar. Risk is commonly defined as a function of threat, vulnerability, and consequence, combined with an estimate of the likelihood of occurrence. To fully assess similarities across industries and companies, it is necessary to look at each of these factors.
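To make the definition concrete, the factors can be combined into a simple comparative score. The following sketch is illustrative only: the 1–5 severity scales, the multiplicative model, and the scenario values are assumptions chosen for the example, not a prescribed risk methodology.

```python
# Illustrative only: risk as a function of threat, vulnerability, and
# consequence, weighted by likelihood of occurrence. The 1-5 scales and
# the multiplicative combination are assumptions for this sketch.

def risk_score(threat: int, vulnerability: int, consequence: int,
               likelihood: float) -> float:
    """Combine the risk factors into a single comparable score.

    threat, vulnerability, consequence: severity ratings on a 1-5 scale.
    likelihood: estimated probability of occurrence, 0.0-1.0.
    """
    for factor in (threat, vulnerability, consequence):
        if not 1 <= factor <= 5:
            raise ValueError("severity ratings must be between 1 and 5")
    if not 0.0 <= likelihood <= 1.0:
        raise ValueError("likelihood must be between 0.0 and 1.0")
    return threat * vulnerability * consequence * likelihood

# Two hypothetical assets facing the same threat and sharing the same
# vulnerability, but with very different consequences of compromise.
office_pc = risk_score(threat=3, vulnerability=4, consequence=2, likelihood=0.5)
safety_system = risk_score(threat=3, vulnerability=4, consequence=5, likelihood=0.5)
print(office_pc)       # 12.0
print(safety_system)   # 30.0
```

The point of the example is the comparison, not the numbers: because two assets can share identical threat and vulnerability ratings, it is the consequence and likelihood factors that ultimately separate their risk, which is why those factors must be assessed per industry and per process.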
Threats to industrial systems come in many forms, ranging from direct attacks to nonspecific attacks that capitalize on the nature of these systems and their availability or accessibility via the Internet. While many asset owners feel that they are not of a high enough profile to be the target of a directed attack, they can easily become collateral damage when malicious software is released on the Internet. Recent cases of ransomware attacks illustrate this very clearly. Those releasing this software may not have an individual target in mind but are simply looking for situations where vulnerabilities can be exploited to encrypt data and demand payment for its release.
The second major factor in calculating risk is vulnerability. It is here that we see the greatest level of commonality across applications. Virtually all industries that employ computer-based or automated systems use products from the same suppliers. In recent years, the number of major suppliers has decreased, with all using essentially the same commercial-off-the-shelf technology for components such as databases, operating systems, and network components. This has created a technology monoculture where the vulnerabilities inherent in automation solutions are common to all.
Vulnerability mitigation requires asset owners to update or patch their installed systems as quickly as possible. In cases where such patching is not practical or even possible, it is often necessary to employ compensating countermeasures or controls to mitigate the vulnerabilities. Examples include the use of various isolation methods, up to and including disconnecting such systems from networks. Products, such as industrial firewalls and unidirectional gateways, are now available for this purpose.
Perhaps the most important component of the risk calculation is the potential consequence of system compromise. These consequences may extend well beyond shutting down the automation system itself to loss of view of, or control over, the underlying process. In some cases, it is difficult or impossible to operate such a process safely without automation, creating a dependence on automated safety systems to bring it to a safe condition or shut it down safely. As recent events have shown, even the safety systems themselves may be susceptible to cyberattack.
Although the details of these consequences vary across industries, their nature is often similar, if not identical, ranging from loss of product or service to equipment damage, explosions, or releases of hazardous or noxious materials. Perhaps the most common grouping of industries based on potential consequences is what is known as the critical infrastructure sectors. In the U.S., these are defined as sectors that are “. . . considered so vital to the United States that their incapacitation or destruction would have a debilitating effect on security, national economic security, national public health, or safety, or any combination thereof.”
Based on the above analysis, the risks faced by different industries appear more similar than some might perceive, which in turn suggests that more cross-industry sharing would be beneficial.
There are other factors that have often limited the creation of effective cybersecurity programs for industrial systems. These also tend to be common across a wide range of industries.
Standards complexity. Industry standards are written with very precise structure and terminology to allow the creation of formal conformance specifications. The language can seem arcane to those who are not experts in the subject matter. Unfortunately, this means that these documents may appear intimidating, even incomprehensible, to those trying to apply them in practical situations. This in turn affects the level of acceptance and adoption. Clearly, standards by themselves are not sufficient to promote adoption of proven and effective practices.
Lack of practical examples. While standards typically document what has already become accepted engineering practice, they also supply a starting point for the collection of case studies and use cases. Case studies describe the approach and results from a specific application, while use cases focus on a specific aspect of the topic. In both cases, it is important to identify and describe the common principles and fundamental concepts that form the basis of an effective response, irrespective of the industry involved.
Unfortunately, practical and representative case studies are often hard to find. Asset owners may be reluctant to share what they consider proprietary or otherwise sensitive information, or they may simply lack the time and resources to prepare and publish such documents. The most common scenario is for suppliers to publish case studies that have much of the identifying information redacted, but these may be viewed as thinly veiled advertisements of specific products and solutions.
Need for work process guidance. There is a specific type of guidance for which there appears to be particularly high demand. Asset owners and others who wish to establish an effective cybersecurity program are very interested in understanding how their peers have integrated such programs into their normal work processes. There is ample evidence and experience that shows that a project approach to cybersecurity is not sustainable over the long term. Just as with safety, security must become an integral part of normal processes and procedures.
Multiple competencies. There is also the question of what skills or competencies are required to adequately address cybersecurity risks. Obvious needs include expertise in the management and protection of information, and of the systems and networks on which it resides. But this is not enough when dealing with automation systems. It is also necessary to have a thorough understanding of both the process under control and the strategy and logic used to effect that control. This expertise is only available from automation and similar engineering disciplines. This combination of information, systems, and network security with engineering expertise is needed to fully address the problem.
Little of what has been presented to this point is specific to an industry. The threat and vulnerability components of risk are largely the same for everyone, and while the detailed consequences may vary, the potential impact is very similar for sectors considered to be part of the critical infrastructure. The challenges faced by those trying to mount an effective response to cyberrisk are also largely the same. Moreover, most of these challenges can be more effectively addressed with increased sharing of practices and experiences.
Elements of the response
Given that there is so much commonality between industries, it seems clear that more collaboration would help address the need for improved cybersecurity of operations systems. To borrow an often-overused phrase, “We’re all in this together.” Such collaboration should include several essential elements:
- Context: First, there must be a common context that can be used to position components of the response and establish relationships between them.
- Concepts and terminology: Effective collaboration and cooperation across industries and disciplines is only possible if there is a common set of concepts and terminology. Although this exists for functional elements in the automation industry and for the system elements in the networking and security disciplines, there are sometimes difficulties when these disciplines must work together. This situation is improving as each constituency becomes more familiar with the other.
- Comprehensive requirements: An effective response is only possible if there is a clear and unambiguous description of the desired future state. Typically, this comes in the form of a set of normative requirements and associated supplemental guidance. It is important that these requirements are constrained to defining what is to be done without making assumptions or assertions about how this is to be achieved.
- Recommended practices: Requirements—even when accompanied with supporting or explanatory rationale—are not enough, as they often use broad or generic terms in a form that allows for the definition of conformance criteria. Recommended practices take these requirements and restate them using terminology that is tailored more to the specific environment. Thus, there may be several practices based on a single standard, each addressing a specific scenario.
- Case studies and use cases: These are perhaps the most useful resources for system integrators and asset owners, because they describe what has and has not worked in previous situations.
In creating, vetting, and sharing the above resources it is essential to involve all relevant disciplines and stakeholders. Suppliers must work closely with system integrators, asset owners, and service providers to address all phases of the life cycle, from specification and development through implementation, operation, and support. It is also crucial to draw on the expertise from all relevant and affected disciplines. For example, risk assessments must include input from engineers and operations personnel who are familiar with the underlying processes and possible consequences of compromise.
Help is available
Much of the above already exists, albeit not from a single source. This can cause a lack of awareness and understanding on the part of those needing the information. More collaboration is necessary to put the pieces together and provide the full range of necessary guidance. There is also a need for increased awareness and understanding of what is readily available.
The National Institute of Standards and Technology Cybersecurity Framework (NIST CSF) has been available for several years. Although developed in the U.S., it reflects contributions from other parts of the world and has become widely accepted as a general framework for characterizing an effective cybersecurity response.
NIST CSF core functions
It is now very common for suppliers of products and services to describe their offerings in terms of alignment to the core functions described in the NIST CSF. In addition, NIST has published an implementation guide for the Cybersecurity Framework Manufacturing Profile that is aligned with manufacturing sector goals and industry best practices.
Several organizations have addressed industrial cybersecurity concepts, models, and terminology in various forms; ISA and IEC, however, have jointly developed what is arguably the most comprehensive set of standards in this area. The ISA/IEC 62443 standards are not industry specific; instead, they define their scope based on a combination of activity-, asset-, and consequence-based criteria (figure 2).
The benefit of the ISA/IEC 62443 approach is that the standards are not tied or limited to a specific industry or sector. While originally developed with process industries in mind, they have since been successfully applied in industries such as rail transportation and mining. The most common approach in developing these applications involves interpretation of the concepts and requirements in the context of the industry in question, and using this interpretation as the foundation of more focused recommended practices.
The practices developed thus far can serve as useful examples for those interested in developing similar guidance for other industries. As these practices are applied to specific situations, the results can be documented in the form of case studies.
Industry also needs clear definitions of the skills and expertise required to staff cybersecurity programs. Several organizations have addressed this need through the use of competency models or similar tools. For example, ISA has worked with the U.S. Department of Labor to create competency models for both automation and cybersecurity. These are valuable resources for those trying to decide how best to staff their programs.
For the above to be successful, it is necessary to increase the general awareness of what resources are available and how they may be used. In 2019, ISA created the ISA Global Cybersecurity Alliance (ISA GCA) to advance cybersecurity readiness and awareness in manufacturing and critical infrastructure facilities and processes. The Alliance brings end-user companies, automation and control systems providers, information technology infrastructure providers, services providers, system integrators, and other cybersecurity stakeholder organizations together to address common interests and advance the state of automation cybersecurity.
The intent of ISA GCA is to be the basis of the collaboration required to address cybersecurity challenges, irrespective of industry sector. Because cybersecurity is a cross-competency activity, ISA GCA also hopes to bridge the IT-OT gaps within manufacturing and process industry companies. End user members are welcome, and are encouraged to include people from multiple disciplines.
Cybersecurity risks are common across industries. An effective response to this risk must be multidisciplinary. Being aware of the challenges and collaborating on the solutions is the way to secure our critical resources today and into the future.