Michael Bret Hood (21puzzles@gmail.com), a former senior-level FBI agent, is a professional trainer on financial crime, money laundering, ethics, and executive leadership development, based in Raleigh-Durham, North Carolina, USA.
The belief, after the fact, that you should have foreseen what happened is often referred to as hindsight bias. In the compliance world, this bias often surfaces after a corporate scandal or serious compliance violation. When such an offense occurs, executives and compliance officials look for answers to determine how the system failed and why the particular incident was not foreseen. In reality, noncompliance is not predictable. Human beings are much too diverse in their behaviors to accurately predict when someone in an organization will deviate from established norms. Todd Haugh, assistant professor of business law and ethics at the Indiana University Kelley School of Business, writes, “Accurately predicting the probability and scope of compliance failures is more difficult than currently understood.”[1]
How the unconscious brain affects compliance
When compliance failures occur, the typical response is to “plug the holes,” or, in other words, create new policies/procedures to ensure that the failure does not repeat itself. Although this step is certainly necessary to protect organizational interests, it is not the be-all and end-all solution that most people perceive: “Companies believe bad employee conduct will transpire in their organizations in a manner that conforms to a recognizable, and ultimately manageable, pattern. This approach, it is further believed, will foster a positive corporate culture, thereby improving corporate compliance en masse.”[2] What executives and compliance leaders fail to take into account is that our ethical breakdowns and rule/procedure deviations are often committed without conscious knowledge.
In his landmark book, Thinking, Fast and Slow, author Daniel Kahneman described two types of thinking, which he labeled System 1 and System 2. In System 1, your brain instantly reacts to stimuli and makes decisions using previous experiences, belief systems, culture, and desires. System 2, on the other hand, is the more rational part of your brain. In System 2, you think before you act.[3] While most people believe they give careful thought and analysis before making important decisions, you may be surprised to know that almost 95% of your daily decisions are made using System 1, including whether or not to follow established norms.[4] Typical compliance frameworks are based upon employees properly assessing a situation and making a correct and ethical choice. This, however, is rarely the case.
Noncompliant offenders rarely consider the consequences of their behavior prior to committing the action because of the manner in which System 1 operates. On most occasions, the dimensions of the decision have been altered or transformed in such a way that the decision-maker disregards, or is completely unaware of, the ethical ramifications of his/her decision. “In many cases, wrongdoers have trouble recognizing their own unethicality, meaning that they act wrongly not because they are willing to pay some external or internal price, but rather because they have a biased assessment of what it is they are doing.”[5] These System 1 manifestations can lead you and your colleagues to, among other things, ignore the ethical dimensions of decisions, refrain from speaking up about known compliance violations, and accede to the adopted rules of the informal culture, even if doing so violates established laws, policies, and procedures.