“situational awareness and the information infrastructure battlespace”

In my previous post, I suggested that cyber had become the new battlespace. While the role of the cyber environment as a fully-fledged warfighting domain remains open to sustained debate, it is clearly an environment in which a range of targeted operations can be conducted.

In this post I will elaborate on the association between the battlespace and situational awareness. As outlined previously, terms such as battlespace and attack have become common parlance when discussing the protection of information infrastructures from a wide range of cyber-based information operations, as has another term: situational awareness. But what do we mean by situational awareness? Well, Endsley’s definition,

“the perception of elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future”,

will serve as a useful introduction. However, what does that mean?

The study of situational awareness has its roots in military theory. Its goal is to understand the state of a particular environment and to use that understanding to decide how to proceed and how to respond to events. There are different models and frameworks for situational awareness in the computer networks field, but there is general agreement that its core consists of three levels:

  • Perception — becoming aware of situational events;
  • Comprehension — interpreting what is happening to form a situational understanding of the events;
  • Projection (i.e., prediction) — using the understanding to inform what actions (if any) should be taken to control the network.
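The three levels above can be sketched as a simple monitoring pipeline. This is a toy illustration only: the event records, the failed-login rule, and the threshold are all invented for the example, not part of any real tool.

```python
from collections import Counter

# Toy sketch of the three situational-awareness levels applied to a
# network-monitoring feed. All event data and thresholds are hypothetical.

def perceive(raw_events):
    """Level 1 - Perception: become aware of situational events
    (here, simply pick out failed-login records from a raw feed)."""
    return [e for e in raw_events if e["type"] == "failed_login"]

def comprehend(events):
    """Level 2 - Comprehension: interpret the events to form a
    situational understanding (count failures per source host)."""
    return Counter(e["src"] for e in events)

def project(understanding, threshold=3):
    """Level 3 - Projection: use the understanding to inform action
    (flag sources that look like brute-force attempts)."""
    return [src for src, n in understanding.items() if n >= threshold]

raw = [
    {"type": "failed_login", "src": "10.0.0.5"},
    {"type": "failed_login", "src": "10.0.0.5"},
    {"type": "failed_login", "src": "10.0.0.5"},
    {"type": "dns_query",    "src": "10.0.0.7"},
    {"type": "failed_login", "src": "10.0.0.9"},
]

suspects = project(comprehend(perceive(raw)))
print(suspects)  # ['10.0.0.5']
```

The point of the sketch is the shape, not the detail: each level consumes the output of the one below it, and only the final level proposes an action.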

So in essence situational awareness is a process, consisting of (i) becoming conscious of the immediate environment, and (ii) understanding how temporal and spatial events (which you may or may not control) will impact on that environment. It is generally understood that inadequate situational awareness is an element of poor decision-making, especially in situations characterized by high information flow, with any disastrous consequences of that poor decision-making being attributed to ‘human error’.

Unfortunately, cyberspace is characterized, amongst other things, by a lack of natural visibility and tangibility. Human defensive postures are sense-based: sight, smell, touch and sound underpin our innate defences. The challenge of cyberspace is that none of these senses, the core of our sensory toolkit, is effective in the cyber environment without technology and tools. Network administrators therefore depend upon these tools, and upon the way in which they have been developed and configured, to provide the situational awareness necessary to make sense of the network’s behaviour.

However, it is not unreasonable to expect that unforeseen network events will increase cognitive workload, degrading situational awareness and consequently lengthening decision-making time. During this decision-making process, the situational awareness being drawn upon will itself be shaped by the cognitive effort needed to acquire and comprehend it.

It should be noted that while situational awareness as a concept is well defined and understood, it depends on cognitive processes, and although the concept makes general references to cognitive processing, very little detail is available on which cognitive processes are involved, or indeed how they function. It nevertheless remains an extremely useful metaphor for describing the perceived understanding an individual has of their immediate environment, and in an information-rich, time-constrained environment, a clear understanding of the current state of the battlespace, i.e., situational awareness, becomes a battle-winning factor. In securing cyberspace, situational awareness represents a way of perceiving threat activity against an information infrastructure, such that network administrators can actively defend the network.

When discussing the manoeuvrist approach previously, I suggested that in order to gain and maintain the initiative in a particular area of operations, the first step was to achieve an understanding of the area and the activity within it. This clearly echoes Endsley’s model: perception and comprehension of information enable projection, that is, actions to seize the initiative in a particular situation.

Moreover, the manoeuvrist perspective on situational awareness looks in even more detail at information operations and at intelligence collection and collation as part of the process of converting perception into comprehension and projection. This is directly relevant to the information space and implies a degree of planning and direction through the acquisition, analysis and dissemination of intelligence. In many contexts, especially the high-tempo information space, analysis is intuitive and organic; nevertheless, we must acknowledge its role as an active part of the practical process. It is this transition from information to intelligence which takes us from Endsley’s Comprehension level to the Projection level.

However, while Endsley’s model is useful for understanding the levels of situational awareness, an example from the kinetic sphere will illustrate how it adds value in a practical context. So let’s take a brief step into kinetic military doctrine, and view the computer incident response process in the context of Boyd’s OODA loop theory.

John Boyd was commissioned by the US Department of Defense in 1976 to analyze why US pilots in Korea were so successful despite the fact that the opposing MiG-15 aircraft were technically superior in many respects. His simple theory, which postulated that certain aspects of the US aircraft design enabled the pilots to react more quickly to the changing battle, has gained much traction since. Boyd theorized that combat pilots made decisions using a cycle comprising four steps: observe, orient, decide, and act (OODA). In a contest between two opposing pilots, the individual who could complete this cycle the quickest would have the advantage. Boyd suggested that the increased speed at which the US pilots could react and reorient themselves outweighed the technical superiority of the MiG-15.

Refinements have since been made to Boyd’s OODA model, and it is particularly pertinent in the context of cyber security and the defence of information networks. The information network environment is characterized by high tempo and granularity, coupled with low visibility and tangibility. Administrators are therefore dependent on complex and granular data feeds for information about what is happening, and must often further translate this view into language that can be understood by decision makers. The use of tools can simplify this complex data picture, but each analysis layer introduces margin for error and adds Clausewitzian friction. Added to this are the practical limitations of human physiology and cognition; it is practically impossible for most people to sit watching complex visual data feeds concurrently with other activity without quickly losing effectiveness.
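To make the mapping from OODA to incident response concrete, here is a minimal sketch of the loop in code. It is an illustration only: the event feed, the known-bad address, and the block/log responses are invented placeholders, not a real incident-response tool.

```python
# Toy sketch of Boyd's observe-orient-decide-act cycle framed as a
# network incident-response loop. All data and actions are hypothetical.

def observe(feed):
    """Observe: pull the next event from a monitoring feed."""
    return feed.pop(0) if feed else None

def orient(event, known_bad=frozenset({"198.51.100.23"})):
    """Orient: place the observation in context (is the source a
    known-bad host?)."""
    if event is None:
        return "no_event"
    return "hostile" if event["src"] in known_bad else "benign"

def decide(assessment):
    """Decide: select a course of action based on the orientation."""
    return {"hostile": "block", "benign": "log", "no_event": "wait"}[assessment]

def act(decision, event):
    """Act: carry out the decision (here we simply report it)."""
    return f"{decision}:{event['src']}" if event else "idle"

feed = [{"src": "198.51.100.23"}, {"src": "203.0.113.8"}]
while feed:
    event = observe(feed)
    print(act(decide(orient(event), ), event))
# prints:
# block:198.51.100.23
# log:203.0.113.8
```

Boyd’s insight applies directly here: whichever side completes this cycle faster, attacker or defender, gains the initiative, which is why each analysis layer that slows orientation matters.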

So, one of the principal challenges is that of maintaining situational awareness in an environment which is essentially intangible and invisible. In order to increase our visibility in this area, we need to identify indicators that may warn of attacks or inappropriate activity.

This will be considered in my next post.