The human factor is an opportunity for IS security

People: the weakest link or an opportunity for security?

Have you ever listened to security experts? They all point out that the human factor is the weakest link in the security chain. They often make fun of people who fall victim to social engineering, for instance, and support their claim by showing how successful phishing attacks have been so far. Sometimes the human factor seems to be an easy explanation, or even an excuse, for security failures... However, no one can claim to do security by working only on improving technologies or processes while stigmatizing people instead of genuinely taking them into account. Moreover, it’s time to stop deploring the weakness of the human factor. Fortunately, there’s another theory of the human factor, put forward by Robert Longeon, an IS security engineer at CNRS (https://www.cnrs.fr/index.html). Robert and I teamed up to introduce his theory more broadly and to try to make you realize that while the human factor might be a weak link, people are in fact an opportunity for information system security (IS security).

First of all, we both acknowledge that inappropriate behavior by information system actors can be a source of security incidents. Indeed, at the root of almost every security incident there is a person or a process deficiency. The gap between the level of security one would like to have and the level one actually has is usually due either to a transgression of security rules, whether malicious or negligent, or to a violation of security policies. Obviously, the outcome of an incident can be severely worsened by inadequate behavior, insufficient vigilance or, conversely, overconfidence on the part of people in key roles.

Let’s talk about transgressions. Many authors [1] have worked on the human factor, some specifically on information system security. Their work has led to reference theories in this domain (behavior theories, theories inspired by psychology or criminology) and to explanatory models (behavior deviating from morals, the technology acceptance model…). All these theories explain behaviors and the situations in which they arise, which helps avoid some mistakes, but none of them provides a practical method! More fruitful work has been done in safety, reliability and ergonomics research [2]: “The way operators manage their working conditions is highly variable: the behaviors observed reveal various logics of prioritization when constraints arise. They may take the form of a search for a compromise between getting the job done and its cost, or of a determination to reach an operational goal to the detriment of safety.” [3] So here is an interesting first point: inappropriate behavior does not necessarily result from malicious intent. Better still, a transgression is not arbitrary: it is usually the result of pursuing a specific goal by adapting the rules in response to a constraint. Therefore, some issues can be reduced by improving system ergonomics so that people are not forced into a bad trade-off between their goal and the company’s rules. In other words, nothing is inevitable: we don’t have to stand by watching issues arise while specialists comment on them in erudite language!

When trying to improve ergonomics, we still need to avoid a common pitfall: believing that a technical solution can be found for this issue. Let’s face it, when people stigmatize the human factor, they usually think that securing information systems would be much easier if no human being were involved at all. Thinking that machines could do the job better is overly simplistic and loses sight of the fact that the root cause of the most frequent and most serious security incidents usually lies in management errors such as:

- failing to take security into account in the company’s strategic goals;
- not including security from the very beginning of a project;
- overconfidence of techno-friendly people in security devices;
- lacking or insufficient security training and education;
- loss of staff motivation due to weak values in the corporate culture;
- mistakes in defining security goals;
- deficient structures;
- lack of rules and procedures;
- diluted responsibility…

Moreover, information systems are called systems for a reason: information in a company, while it may look like chaos at first, is in fact a complex system. And complexity cannot be handled by a finite-state automaton, because a finite-state machine can only do what it was designed for. Only the human mind is able to grasp complex issues. On the one hand, in a normal situation, the surprisingly unpredictable and fundamentally irrational nature of human beings can ruin the Information Security Officer’s day. On the other hand, in an unknown, and therefore not previously programmed, situation, it is a major asset. In such turbulent situations, the human factor is irreplaceable and highly valuable.
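As a minimal illustration of that limitation (a sketch in Python; the states and events are hypothetical, chosen only for this example), a finite-state automaton reacts only to the transitions enumerated at design time and is helpless outside them:

```python
# Minimal sketch of a finite-state automaton (all states and events are
# hypothetical, for illustration only). The machine knows exactly the
# transitions listed below -- nothing more.
TRANSITIONS = {
    ("idle", "login_ok"): "session_open",
    ("idle", "login_failed"): "locked",
    ("session_open", "logout"): "idle",
}

def step(state: str, event: str) -> str:
    """Advance the automaton by one event."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        # An event the designers never anticipated: the automaton has no
        # answer, whereas a human operator can still improvise one.
        raise ValueError(f"no transition defined for {state!r} on {event!r}")

print(step("idle", "login_ok"))  # -> session_open
try:
    step("idle", "usb_key_found_in_parking_lot")
except ValueError as exc:
    print(exc)  # the machine can only do what it was designed for
```

A human operator facing the unanticipated event can still improvise a sensible response; the automaton, by construction, cannot.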

IS security is about information risk management and, as everyone knows, this risk cannot be eliminated. Therefore, one has to make choices. “To decide, or to make strategic choices, is an activity in itself that cannot be reduced to a sum of technical decisions.” [4] IS security is the realm of non-deterministic choices based on a variety of notions that cannot be modeled in a finite-state automaton: our perception of reality, our vision of security, our understanding of our best interests at a given time… The human factor allows us to deal with the non-deterministic nature of information systems. Any project management model relying on a deterministic logic of systematically and progressively reducing risk does not work. A technical system cannot pilot IS security; people can. Without the human factor, there’s simply no IS security.

Piloting IS security is often more a matter of deciding under uncertainty than of managing risk. Indeed, the situations we face are new and cannot be deduced from past events (which prevents us from predicting or preventing on the basis of statistics). Worse, we don’t know everything about the events as they occur. What’s more, the decisions we take modify both the environment and the very parameters we assessed to take them, and understanding an issue usually requires a systemic approach drawing on knowledge from other areas. Zero-day attacks based on an unlikely chain of events are unpredictable. All of this makes objective probabilities difficult to compute. Using subjective probabilities has the drawback of relying on the limited rationality of decision makers and of leaving their beliefs open to manipulation. Work from other disciplines, in particular Knight [5] and Keynes [6] in economics, teaches us how to distinguish risk (situations where probabilities can be computed) from uncertainty (situations where they cannot) and gives us clues on how to deal with the human factor. Therefore, the human factor is an uncertainty, not a risk! And one can try to reduce it through management effort.
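To make Knight’s distinction concrete, here is a minimal formal sketch (the notation is ours, not Knight’s or Keynes’s):

```latex
% Under risk, the set of outcomes and their probabilities p_i are known,
% so an expected loss over the possible losses \ell_i can be computed:
\mathbb{E}[L] \;=\; \sum_{i=1}^{n} p_i \,\ell_i
% Under (Knightian) uncertainty, the p_i are unknown -- or the outcome
% set itself is open-ended -- so the sum above cannot be evaluated and
% no objective probabilistic treatment is possible.
```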

Conclusion and proposals for reducing the uncertainty of the human factor through management

We agree that inadequate behavior by IS actors can lead to security incidents. However, the complexity of information systems makes any deterministic solution, such as a purely technical one, inappropriate. The only way we know to deal with a non-deterministic system is to put a human organization in place. That’s why the human factor is an opportunity for information system security, not a risk. To put it another way, a popular Russian proverb says: “some eagles may fly lower than some hens, but a hen will never fly higher than an eagle”.

To reduce the uncertainty related to inappropriate behaviors, it takes a combination of the following three ingredients, which the Human Resources division cultivates:

- Knowledge: people need to have been trained to react correctly. This means appropriate, targeted and non-condescending training.

- Empowerment: people need the means and the necessary authority to react. Some people need to be identified and selected to be individually accountable. The corporate culture should reward personal initiative. For instance, if someone decides to innovate when facing a never-before-seen situation rather than blindly following procedures that are leading the company to failure, they should be confident that they will be recognized, not penalized, for not sticking to the usual, inappropriate procedures.

- Will: people need to be willing to react in the organization’s best interest, that is, to be responsible actors: “if any of us becomes aware of their role, they discover that they are more than servants: they are sentinels, and each and every sentinel is responsible for the empire.” [7]

To improve your security, you need to rely on people to complement technologies by detecting, more or less intuitively, abnormal events when they occur. The required qualities are not learned in training centers; that’s why selection is necessary. However, the values of a company’s culture and its management style can help maintain and develop those qualities rather than stifle them. Reducing the uncertainty of the human factor is therefore a matter of management!

Adding more technology is not a solution to organizational issues in which decision processes need to be redesigned. Worse, a purely technological view of IS security can make you more vulnerable. To emphasize the importance of employee involvement over the technological arsenal, let’s quote Thucydides, a Greek historian of the 5th century BC: “The thickness of a wall is less important than the will to defend it”. Indeed, the involvement of a company’s employees in IS security is an excellent indication of its dynamism and social health. This can become a useful indicator for investors to assess the risk… of their investment.

References:

[1] See for instance: Rosé P. (1995), La criminalité informatique, Paris: PUF;

Venkatesh V., Morris M.G., Davis G.B., Davis F.D. (2003), "User acceptance of information technology: Toward a unified view", MIS Quarterly, Vol. 27, No. 3;

Adams A., Sasse M.A. (1999), "Users are not the enemy", Communications of the ACM, Vol. 42, No. 12;

Dhillon G., Backhouse J. (2001), "Current directions in IS security research: towards socio-organizational perspectives", Information Systems Journal, Vol. 11.

[2] See for instance: Guérin F., Laville A., Daniellou F., Duraffourg J., Kerguelen A. (1997), Comprendre le travail pour le transformer. La pratique de l’ergonomie, ANACT, Collection Outils et Méthodes.

[3] Noulin M. (2002), Ergonomie, Toulouse: Octarès Éditions.

[4] Rochet C., Manager dans la complexité.

[5] Knight F. (1921), Risk, Uncertainty and Profit, Houghton Mifflin Company.

[6] Keynes J.M. (1921), A Treatise on Probability, London: Macmillan.

[7] Saint-Exupéry A. de (1939), Terre des Hommes.