Why good people do bad things in business


Only 4% of the population is considered 'bad', yet the corporate world attracts those with sociopathic tendencies. So how can you avoid the pitfalls of perceptual blindness?

AC Ping

Good and bad

Your first reaction to reading the headline of this article may be to think of ‘them’ – the bad people who do bad things like insider trading, misrepresentation, fraud, bribery and corruption. But it may surprise you to know that research by Stout shows that only about 4% of the population is considered ‘bad’ – that is, people who habitually act in an amoral and antisocial way. And although other research by Babiak & Hare indicates that the corporate world may in fact attract those with sociopathic tendencies, I’m hoping that your self-assessment excludes you from being one of ‘them’. That said, how is it that good people end up presiding over monumental ethical breaches? And how might you, as one of those good people, avoid tripping over something that, in hindsight, would have you scratching your head in wonder?

The vast majority of almost 40 years of research into business ethics has been based on the assumption that ethical decision-making is a rational cognitive process. That is, you recognise an ethical dilemma; set your intention to be ethical (good); apply some sort of ethical decision-making framework – such as Aristotle’s virtue ethics, Kant’s duty-based ethics, or Bentham’s utilitarianism; make a decision; and then act on that decision.

The problem is that research in other fields – criminology from Heath, social psychology from Bandura, Caprara & Zsolnai, and neuro-cognitive science from Reynolds – indicates that this assumption is flawed and that there is a gap between ethics in theory and ethics in action.

In theory, we apply the process detailed above. In reality, we miss ethical cues due to perceptual blindness, causing us to apply the wrong type of decision-making criteria. For example, a major client indicates that a rival has outbid you for a long-term contract by guaranteeing preferential treatment in the form of kickbacks on executive travel and entertainment. She indicates over a luncheon meeting that if you can offer the same sort of preferential treatment, your firm will get the contract. Your decision-making criteria can be business-based – in which case a quick handshake will win the deal – or ethics-based, resulting in a stance based on principles and the loss of the contract.

You may shake your head and think ‘I’d never do that’, but this would discount the effect of situational and contextual factors that may influence your thinking. Bazerman & Tenbrunsel show that incentives can motivate us to miss ethical cues, causing us to begin a slide down the ‘slippery slope’. As my own research illustrates, once we have made a decision, we then use flawed justifications that effectively neutralise the very values we hold dear. These justifications, drawn from research into delinquency by Sykes & Matza, include: it’s not hurting anyone; I’m just following orders; they deserve it; it’s a stupid rule anyway; or a call to a higher purpose.

This theory of neutralisation has more recently been applied to the field of business by Heath to explain why the corporate environment presents such opportunities for ‘good’ people to do bad things. To the five justifications proposed by Sykes and Matza, Heath added two more – everyone else is doing it; and a claim to entitlement.

Once on the ‘slippery slope’, new ethical dilemmas are resolved by pattern matching to the decisions already made, creating an environment where, slowly but surely, the culture and accepted norms shift (Zimbardo). I’m guessing the question you’re now asking is: ‘Why don’t people recognise their errors and recant?’ This is where things get tricky.

In my research I have found that subjective perceptions of justice lie at the heart of the matter. The initial trigger that invokes a flawed justification often arises from a perception that what is happening is not fair – in the case above, for example, it’s not fair that the other company might take that action. This threat to fairness prompts a response to ‘balance the scales of justice’ by taking action. Once taken, the action is seen as righteous, and so the person is unwilling to change their mind and admit to themselves that they were in fact wrong. Self-righteousness then holds the person on the slippery slope, always believing that they were right and that they can ‘fix it’.

The moral of the story? Your belief that you would never do bad things opens you up to the possibility of being blindsided by perceptual blindness. And the solution? Clarity of moral intent. Be clear about what you are trying to create and what values and principles form the boundaries for that outcome. Check that you are not being seduced by flawed justifications that neutralise those values. Finally, ensure that you have good advisers around you who don’t let you shoot yourself in the foot with self-righteousness.

Above all, remember one thing. You’re not as smart as you think you are.


AC Ping is Colin Brain Governance fellow at Queensland University of Technology and a regular PS Contributor. This article first appeared on The People Space on 8 May 2017

