
Leading in the Time of Data Breaches

Posted August 26, 2014 | Leadership | Technology | Amplify

Historically, there have been eras of rampant theft analogous to today's era of data breaches. The items raided were stores of portable wealth -- art, gems, fabrics, spices, and metals. Wealth was defended through secrecy and physical means, including fortifications and strongboxes. Attacks on such stores still occur today, but they are much rarer, largely because the steps needed to steal that property and convert it into usable forms of wealth are closely monitored. In other words, borrowing a concept from software engineering, the end-to-end1 attack process for portable physical wealth is now better defended than it once was.

That is not yet so for data theft. Acquiring the needed equipment and knowledge without detection is relatively easy. After malicious actors have the data in hand, they are relatively free to use it or exchange it for money or deposits.

We can understand the difference between the success rates for theft of physical valuables and the theft of data in terms of what network security experts call the kill chain or the cyber kill chain.2 In conventional theft, defenders can monitor the entire conventional kill chain, end-to-end. Law enforcement personnel and others have access to databases containing such information as criminal records, photographs, fingerprints, known associates, patterns of past criminal activity, medical records, and much more. They have networks of informants, and they can access detailed information about commercial transactions. The power of this array of capabilities is illustrated in the Michael Mann film Heat.3 In data theft, analogous means of monitoring the activities of data thieves are only in the early stages of development, and are therefore not yet as effective as they will eventually become. At a fundamental level, this effectiveness gap is the reason why attackers remain as successful as they are.

Suggestions and best practices for enhancing enterprise data security abound, but data breaches continue. In some cases, breaches have occurred even after deployment of security software and procedures that should have prevented those breaches. Clearly, something is needed beyond deploying the latest technical countermeasures.

It is therefore useful to explore the possibility that the difficulty lies in the way we as human beings think. Countermeasures for conventional theft of valuables usually involve concrete, physical objects -- fences, locks, badges, armament, steel, and so on. Countermeasures for data theft are much less tangible -- software, procedures, alerts, and the like. Because physicality imposes a kind of discipline on our thought processes, it is possible that we face special challenges when we design, implement, or use the largely nonphysical countermeasures we employ for data security.

Cognitive biases might provide an explanation. The study of cognitive biases is an area of psychology that has drawn increasing attention from psychologists, management theorists, economists, system designers, and others in recent decades, because these biases lead us to think in ways that deviate systematically from the results a rational thought process would produce. Let us now look at the steps organizations should take to enhance data security and examine how cognitive biases might lead to disappointing results from the deployment of measures intended to achieve this goal.

KNOW THE DIFFERENCE BETWEEN A CRISIS AND AN EMERGENCY

A crisis is a crucial time or state of affairs in which a decisive change is impending but has not yet arrived. An emergency is an unforeseen situation that calls for immediate action. The difference between the two is immediacy. We are now in a crisis that calls for rethinking the strategies and tactics we use to ensure data security. Until we have dealt with the crisis, we will undoubtedly encounter a series of emergencies that we call data breaches.

Selecting members of crisis response teams and emergency response teams is an important task for organizational leaders. It is a delicate and difficult task, because crisis response and emergency response are two different activities that require overlapping, but different, sets of knowledge, skills, and talents. Selection committee members must recognize that the people who deal well with crises might be different from the people who deal well with emergencies.

Selection committees must also recognize that they themselves are subject to cognitive biases. For example, social comparison bias4 can lead selection committee members to look with disfavor upon candidates whose strengths and experience match or exceed their own. In-group favoritism5 is the tendency to favor candidates whom the selectors perceive as being members of their own social group -- in this case, their own professions or organizations.

Choose your crisis and emergency response teams carefully. Choose your selection committees even more carefully.

PROTECT ENTERPRISE DATA IN ALL ITS FORMS

Because any element of the enterprise -- human or not -- can provide entry and exit points for malicious actors, every element of the enterprise -- human or not -- could potentially be involved in data security preparedness and breach response.

How we choose to attend to the various elements of the enterprise is sometimes unrelated to the security benefit that attention can produce, because our choices are influenced by a group of cognitive biases that lead to the phenomenon of selective perception (see Figure 1).6 Selective perception influences how we perceive the world around us based on our preconceptions, interests, experiences, or expectations.

Figure 1 -- A playing card like the one used in experiments on selective perception. Do you notice anything strange about it?

Because of selective perception, some stores of enterprise data might not be recognized as such by people whose ordinary focus is data in the form of official documents or computer files. For example, data we usually regard as residing on servers might also reside in physical trash, including paper invoices, receipts, sticky notes, packing boxes, package labels, envelopes, manuals, memos, discarded travel itineraries, shredder confetti, and more.

Effective data security requires careful attention to data in all its forms, beyond the bytes themselves. That attention might be distorted by selective perception.

COMMIT TO INTER-ENTERPRISE COLLABORATION

Collaborations between and among enterprises can be the basis for unified action that aids all collaborators. These actions can include influencing government policy, participating in standards development and propagation, exchanging information about vulnerabilities, and exchanging lessons learned.

With more thorough and widespread collaboration, the Heartbleed defect in the OpenSSL cryptography library7 might not have caught the IT community by surprise. The Core Infrastructure Initiative of the Linux Foundation8 is an example of the kind of closely collaborative support for shared resources that could have uncovered the problem much sooner or prevented it altogether.

Noticing opportunities for inter-enterprise collaboration is difficult, in part, because of a cognitive bias known as functional fixedness,9-11 which causes us to assess situations in conventional ways. In other words, it prevents us from thinking outside the box -- or, in this instance, outside the enterprise. Because those accustomed to thinking within the enterprise might have difficulty developing inter-enterprise collaborations, develop cyber defense collaboration teams by recruiting those from within the enterprise who have experience in forming strategic partnerships or other kinds of collaborations.

TEST THE WHOLE DEFENSE SYSTEM, END-TO-END

When we consider cyber defense infrastructure, we tend to focus on software and hardware. In many cases, the focus is even more specific -- the network. But the typical enterprise cyber defense infrastructure actually has four components:

  1. Hardware
  2. Software
  3. Procedures
  4. Behavior

A striking example of the failure to take this whole-system perspective is the 2013 Target data breach, in which the software and hardware did function as intended, raising alerts of suspicious activity, but the alert evaluation team elected not to investigate those alerts.12, 13 The incident illustrates the importance of end-to-end testing of cyber defenses, including their procedural and behavioral components.

The behavioral components are subject to an array of cognitive biases too numerous to list. One example is the observer-expectancy effect.14 This effect causes a person who is being observed to behave in conformance with what he or she believes the observer expects. In the case of a breach response team, the team knows that senior management hopes and expects that the system is breach-proof. Declaring an emergency thus violates what the team believes are management's expectations, which makes the team reluctant to declare a breach.

To give a team experience with this cognitive bias, run a drill in which a breach does not occur but several false alerts are signaled. A team that expects the drill designers to have programmed a breach will attempt to meet expectations by (incorrectly) declaring a breach. If that happens, and even if it does not, the drill debriefing presents an opportunity to educate the team about the observer-expectancy effect.
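One way to make such a drill concrete is sketched below. This is only an illustrative fragment, not part of any published drill methodology; the alert fields, sources, and the debrief function are hypothetical assumptions introduced here for the example.

```python
# Hypothetical sketch of a drill scenario containing only false alerts,
# used to surface the observer-expectancy effect during the debriefing.
# All names and fields are invented for illustration.
import random

def generate_false_alert_drill(n_alerts: int = 5, seed: int = 7) -> list[dict]:
    """Build a drill scenario in which no alert corresponds to a real breach."""
    random.seed(seed)
    sources = ["ids", "endpoint", "proxy", "dlp"]
    return [{"alert_id": i, "source": random.choice(sources), "actual_breach": False}
            for i in range(n_alerts)]

def debrief(team_declared_breach: bool, alerts: list[dict]) -> str:
    """Compare the team's declaration with the drill's ground truth."""
    breach_present = any(a["actual_breach"] for a in alerts)
    if team_declared_breach and not breach_present:
        return "Breach declared with none present: discuss the observer-expectancy effect."
    return "Declaration matched ground truth: reinforce the team's judgment."

print(debrief(team_declared_breach=True, alerts=generate_false_alert_drill()))
```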

PREPARE FOR MULTIPLE CONCURRENT INCIDENTS

Because the environment might present indicators of several threats or attacks simultaneously, saturation of breach response capacity can cause response failures far more severe than any single breach would cause on its own.

When a team is tracking multiple threats, a threat analysis discussion must necessarily move from one topic to the next. These transitions cause the participants to switch mental contexts repeatedly, which is exhausting, and fatigue can reduce performance. Moreover, cognitive biases can also affect performance in such threat evaluation meetings. Two examples of biases relevant to these situations are the availability heuristic15 and the bias blind spot.16

The availability heuristic distorts our perceptions of reality because we tend to remember some things better than others. The error occurs when we mistake what we remember about a class of items as representative of that class, or as having greater importance than other members of that class. For example, if we discussed one subsystem's vulnerability to a particular threat just a few moments ago, we are more likely to overestimate the relevance of that vulnerability to whatever subsystem and threat happen to be next on the morning's agenda.

The bias blind spot is the tendency to believe that one is less susceptible to cognitive biases than others are, or to believe that one is able to identify more manifestations of cognitive biases in others than in oneself. This bias reduces the effectiveness of training aimed at teaching breach response teams about the dangers of cognitive biases.

Attackers know that breach response resources can become saturated when breach response teams must track multiple targets and threats simultaneously. Have a plan that addresses the possibility of multiple concurrent incidents by recognizing the importance and value of breach response capacity. Unlike routine operations, in which many organizations try to optimize net income by matching capacity to load, breach response capacity must match the maximum anticipated load, even if such a match creates idle capacity much of the time.
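To put numbers on the capacity argument, here is a minimal sketch; the incident counts and staffing figures are invented assumptions, not benchmarks from any real organization.

```python
# Minimal sketch of sizing breach response capacity (all figures hypothetical).

average_concurrent_incidents = 1    # typical load during routine operations
maximum_concurrent_incidents = 4    # worst case the response plan anticipates
responders_per_incident = 3

# Sizing to the average optimizes routine cost but saturates under concurrent attacks.
capacity_sized_to_average = average_concurrent_incidents * responders_per_incident

# Sizing to the anticipated maximum accepts idle capacity much of the time.
capacity_sized_to_maximum = maximum_concurrent_incidents * responders_per_incident

print(f"Average-sized team: {capacity_sized_to_average} responders")
print(f"Maximum-sized team: {capacity_sized_to_maximum} responders")
```

The gap between the two numbers is the idle capacity the plan must deliberately accept in order to survive a worst-case day.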

CONDUCT BREACH RESPONSE DRILLS AND RETROSPECTIVES

In data breach incidents, people must work well together even if they do not work together during routine operations. Drills are therefore important, because high performance under stressful conditions requires that people know each other well. If your operations span multiple sites, and if you anticipate the need for virtual operations of the breach response teams, virtual drills are a must.

Use drills to simulate actual situations, emphasizing those situations that you expect might uncover defects in breach response plans. When conducting drills, have role players unknown to the employees represent media, law enforcement, investors, suppliers, and others. Design some drills to explore breach response under capacity saturation conditions. Drills can be low cost if breach response capacity exceeds levels that are needed routinely, because the drills would then use otherwise idle capacity.

Many organizations conduct retrospectives following drills to improve performance by uncovering resource requirements and necessary policy and procedure variances. Retrospectives also provide additional social contact opportunities ("face time") for breach response team members, which helps to strengthen their relationships. In addition, retrospectives can give the group opportunities to explore the effects of a cognitive bias called shared information bias.17

Shared information bias is the tendency for groups to allocate too much time and resources to discussing items that everyone already knows (shared information) and too little time to discussing items that only some group members know (unshared information). This behavior degrades decision quality. Design your drills to explore this bias by distributing information unevenly during the drill. Give the team opportunities to learn the dangers of this bias and to develop skills for eliciting unshared information from each other.

A word of caution: an actual incident might occur during a drill. This is a low-probability event, but the consequences can be severe. If it happens, it will be necessary to suspend the drill immediately and gracefully. Design all drills with this requirement in mind.

CONDUCT OPERATION SUSPENSION DRILLS

Operation suspension drills have two goals. The first is determining with certainty that data access can be completely terminated. The second is ensuring that the suspension of operations does not affect the operational capabilities of breach response teams. Actually suspending operations is the only way to accurately assess an organization's ability to completely halt, and then restore, access to all data.

Reluctance to conduct such drills might arise from a cognitive bias known as the valence effect.18 This is the tendency to overestimate the probability of a positive outcome of an event, and likewise to underestimate the probability of a negative outcome. Arguments against operation suspension drills typically assert that the system as a whole is sound, and therefore actual suspension is not needed because component subsystems have been tested successfully.

Suspensions are expensive because they interrupt business flow. However, avoiding such drills can be even more expensive if the organization learns during a real incident that it has not actually halted all data access. To limit the duration of actual suspension of operations during a drill, construct and maintain a simulator to use for preliminary exercises.

UNDERSTAND THE FUNDAMENTAL ATTRIBUTION ERROR

When people try to explain the behavior of others, they tend to attribute too much to character and not enough to circumstances. This is a cognitive bias known as the fundamental attribution error.19 The fundamental attribution error can lead to incorrect conclusions about conduct that compromises data security. Those incorrect conclusions can lead to ineffective corrective actions that leave the enterprise vulnerable.

Consider the example of data vulnerabilities owing to the conduct of insiders (i.e., people who have been officially granted access to enterprise data). Unintentional insider threats are those in which the insider is unintentionally responsible for actions (or inactions) that lead to data breaches. Intentional insider threats are those in which the insider intentionally executes or assists with attacks that result in data breaches.20

We are inclined to regard as negligent those who write their passwords on sticky notes affixed to the walls near their computers, even though this behavior might be a strategy for coping with the lack of effective password management software. Treating the behavior as negligence, and then disciplining or terminating offending individuals, is usually an example of the fundamental attribution error. The underlying problem can be solved only by addressing the circumstances -- in this case, by providing effective password management tools.

As a second example, consider an intentional insider who is a victim of blackmail or extortion. The root cause is the outside criminal, not the intentional insider. Terminating the insider might prevent the attack or end the attack (if it is still in progress), but the root cause is still in place.

To defend against intentional insiders, administer periodic examinations, monitor communications and computer usage, and conduct periodic background checks. Periodic training and retraining can improve the ability of employees to notice unusual activity on the part of other employees and motivate them to report it. Manage costs by conducting these activities in a manner consistent with an individual's assessed level of risk. Most importantly, remove from the organization anyone known to harbor ill will toward it or toward management.

BEWARE THE TECHNOLOGY MINDSET

Because of a range of cognitive biases, IT professionals and executives with technology backgrounds are apt to devote too much attention to the technological elements of defenses and not enough to the procedural elements or behavioral elements. The result is a defense architecture in which procedural and behavioral elements tend to be less resistant to attack than the technological elements. More specifically, we are inclined to focus on the network, even though we know that other technological elements also have weaknesses.21

Psychologists use the term mental set22, 23 to denote the tendency to approach situations using familiar patterns. The tendency is not necessarily reinforced by past success; rather, it is based on familiarity. For example, in an IT leadership team whose members have strong technical backgrounds, the shared mental set could cause the team to test procedural and behavioral elements of defenses less thoroughly than technological components. When such teams do test procedural or behavioral elements, they tend not to recruit test designers with psychology expertise. Instead, they rely on in-house personnel, who often also have a technological mental set. The results are tests that do not always effectively uncover flaws in procedural or behavioral defenses.

Some security certification processes are also excessively focused on technology. Certification processes should assess the judgment of the people who must halt operations, the record of management retribution for false alarms, and management policy with respect to false alarms.

UNDERSTAND ALL COSTS

Because so many of these suggestions are expensive, some organizations are reluctant to deploy them. This reluctance can arise, in part, from a failure to accurately account for the expected costs of data breaches. The decision to deploy a subset of these suggestions should be based on comparing the quantity:

<Cost of Cyber Defense Deployment> + <Estimated Cost of Breaches with Deployment>

to the quantity:

<Estimated Cost of Breaches without Deployment>

If the latter quantity is greater than the former, then deployment is the low-cost option. Too often, we make the deployment decision with inaccurate estimates of the two breach costs.
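As a worked illustration of this comparison, the following sketch plugs in hypothetical figures; the deployment cost, breach probabilities, per-breach cost, and planning horizon are placeholders, not estimates for any real organization.

```python
# Hypothetical worked example of the deployment decision described above.
# Every figure is a placeholder; real values come from the organization's
# own cost and risk estimates.

def expected_breach_cost(annual_probability: float,
                         cost_per_breach: float,
                         horizon_years: int = 3) -> float:
    """Rough expected breach cost over the planning horizon."""
    return annual_probability * cost_per_breach * horizon_years

deployment_cost = 2_000_000  # cost of cyber defense deployment

cost_with_deployment = deployment_cost + expected_breach_cost(0.05, 10_000_000)
cost_without_deployment = expected_breach_cost(0.30, 10_000_000)

if cost_without_deployment > cost_with_deployment:
    print(f"Deploy: {cost_with_deployment:,.0f} < {cost_without_deployment:,.0f}")
else:
    print(f"Deployment is not the low-cost option: "
          f"{cost_with_deployment:,.0f} >= {cost_without_deployment:,.0f}")
```

The point of the sketch is not the arithmetic but the symmetry: both breach-cost terms deserve estimates made with comparable care.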

Errors in making the deployment decision can arise from any of several cognitive biases. Two examples are the focusing illusion24 and the aforementioned availability heuristic.

The focusing illusion causes us to pay too much attention to some aspects of situations, out of proportion to their importance. In the case of allocating resources to data security, the focusing illusion can lead us to demand precise estimates of the cost of deploying a countermeasure, to within 5% or so, while demanding much less precision in estimates of the expected cost of a data breach. Unless we can determine with similar precision the expected cost of a breach, such high-precision estimates for countermeasures are of little utility for decision making. Even if we did demand such precision, it is probably not achievable.

The availability heuristic25 affects our ability to assess probabilities. If we can easily imagine an event occurring, we're more likely to assign a high value to its probability. In an organization whose people have never experienced a data breach, one might expect underestimates of breach probabilities.

The valence effect26 mentioned earlier causes us to overestimate the probability of good things happening to us, relative to the probability of bad things happening. That is, we are likely to underestimate the probability and severity of data breaches.

Disconfirmation bias27 is the tendency to critically examine information that contradicts our beliefs and to uncritically accept information consistent with our beliefs. This bias can cause us to overlook or inappropriately dismiss proposed breach vulnerabilities, or to reject assessments of probabilities that are higher than we would like.

The prevalence of these effects suggests the need for careful examination of breach estimation processes.

CONCLUSION

How clearly we think about protecting the data we hold can determine how well we do it. Although awareness of relevant cognitive biases cannot prevent errors of thought in the moment, it can give us the tools we need to examine our thinking and correct it. Close examination of our reasoning processes is a necessary first step toward ending the time of data breaches.

ENDNOTES

1 Saltzer, J.H., D.P. Reed, and D.D. Clark. "End-to-End Arguments in System Design." Paper presented to the Second International Conference on Distributed Computing Systems, Paris, France, April 1981.

2 Hutchins, Eric M., Michael J. Cloppert, and Rohan M. Amin. "Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains." Proceedings of the 6th International Conference on Information Warfare. Academic Conferences International, 2011.

3 The plotline of Heat is available at Wikipedia. Although the film is a work of fiction, it is based on the work of Chicago police detective Chuck Adamson, who hunted down professional criminal Neil McCauley in 1963.

4 Garcia, Stephen M., Hyunjin Song, and Abraham Tesser. "Tainted Recommendations: The Social Comparison Bias." Organizational Behavior and Human Decision Processes, Vol. 113, No. 2, November 2010.

5 Banaji, Mahzarin R., Max H. Bazerman, and Dolly Chugh. "How (Un)Ethical Are You?" Harvard Business Review, December 2003.

6 Bruner, Jerome S., and Leo Postman. "On the Perception of Incongruity: A Paradigm." Journal of Personality, Vol. 18, No. 2, December 1949.

7"Common Vulnerabilities and Exposures, CVE-2014-0160." MITRE Corporation.

8 The Linux Foundation's Core Infrastructure Initiative was launched shortly after the Heartbleed incident. For more information, see linuxfoundation.org.

9 Duncker, Karl. "On Problem-Solving." Psychological Monographs, Vol. 58, No. 5, 1945.

10 Kershaw, Trina C., and Stellan Ohlsson. "Training for Insight: The Case of the Nine-Dot Problem." Proceedings of the 23rd Annual Conference of the Cognitive Science Society, edited by Johanna D. Moore and Keith Stenning. Lawrence Erlbaum Associates, 2001.

11 King, Margaret J. "Apollo 13 Creativity: In-the-Box Innovation." Journal of Creative Behavior, Vol. 31, No. 4, December 1997.

12 Harris, Elizabeth A., and Nicole Perlroth. "Target Missed Signs of a Data Breach." The New York Times, 13 March 2014.

13 Riley, Michael, Ben Elgin, Dune Lawrence, and Carol Matlack. "Missed Alarms and 40 Million Stolen Credit Card Numbers: How Target Blew It." Bloomberg BusinessWeek, 13 March 2014.

14 Rosenthal, Robert, and Lenore Jacobson. "Teachers' Expectancies: Determinants of Pupils' IQ Gains." Psychological Reports, Vol. 19, No. 1, August 1966.

15 Schwarz, Norbert, et al. "Ease of Retrieval as Information: Another Look at the Availability Heuristic." Journal of Personality and Social Psychology, Vol. 61, No. 2, August 1991.

16 Pronin, Emily, Daniel Y. Lin, and Lee Ross. "The Bias Blind Spot: Perceptions of Bias in Self Versus Others." Personality and Social Psychology Bulletin, Vol. 28, No. 3, March 2002.

17 Baker, Diane F. "Enhancing Group Decision Making: An Exercise to Reduce Shared Information Bias." Journal of Management Education, Vol. 34, No. 2, April 2010.

18 Rosenhan, David L., and Samuel Messick. "Affect and Expectation." Journal of Personality and Social Psychology, Vol. 3, No. 1, January 1966.

19 Jones, Edward E., and Victor A. Harris. "The Attribution of Attitudes." Journal of Experimental Social Psychology, Vol. 3, 1967.

20 "Combating the Insider Threat." US Computer Emergency Readiness Team (US-CERT), 2 May 2014.

21 "Operational Levels of Cyber Intelligence." Intelligence and National Security Alliance, Cyber Intelligence Task Force, September 2013.

22 Luchins, Abraham S. "Mechanization in Problem Solving: The Effect of Einstellung." Psychological Monographs, Vol. 54, No. 6, 1942.

23 Öllinger, Michael, Gary Jones, and Günther Knoblich. "Investigating the Effect of Mental Set on Insight Problem Solving." Experimental Psychology, Vol. 55, No. 4, 2008.

24 Schkade, David A., and Daniel Kahneman. "Does Living in California Make People Happy? A Focusing Illusion in Judgments of Life Satisfaction." Psychological Science, Vol. 9, No. 5, September 1998.

25 Schwarz (see 15).

26 Rosenhan (see 18).

27 Taber, Charles S., and Milton Lodge. "Motivated Skepticism in the Evaluation of Political Beliefs." American Journal of Political Science, Vol. 50, No. 3, July 2006.

About The Author
Rick Brenner
Rick Brenner is Principal of Chaco Canyon Consulting. He works with people in dynamic problem-solving organizations who make complex products or deliver sophisticated services that need state-of-the-art teamwork and with organizations that achieve high performance by building stronger relationships among their people. Mr. Brenner focuses on improving personal and organizational effectiveness, especially in atypical situations, as in the case of…