
CHAPTER NINETEEN. THE PSYCHOLOGY OF THE LUCY LETBY CASE.

It is a disgraceful obscenity that an angel of mercy was portrayed as an angel of death as a “punishment” for whistle-blowing and feminism, and then imprisoned. The question is:- How could this happen? What PSYCHOLOGICAL factors were in play that allowed it to happen? To examine these psychological factors, I carried out three Wikipedia searches – for Confirmation Bias, for Groupthink, and for Conformity (ie:- Conformity Bias). Below are the results of these three searches. I have removed irrelevant material, and highlighted relevant material, but have added no extra text. I have labelled sections with the letters A, B, C, etc., so that the reader can easily refer back to specific points.


 

From Wikipedia, the free encyclopedia (article: Confirmation bias)

(A). Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.[3] People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes.

(B). The effect is strongest for desired outcomes, for emotionally charged issues, (My comment:- The murder of infants is indeed an “emotionally charged issue”.)

(C). the irrational primacy effect (a greater reliance on information encountered early in a series) Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on,

(D). A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Explanations for the observed biases include wishful thinking

(E). Confirmation bias, previously used as a "catch-all phrase", was refined by English psychologist Peter Wason, as "a preference for information that is consistent with a hypothesis rather than information which opposes it."[4]

(F). Confirmation bias is a result of automatic, unintentional strategies rather than deliberate deception.[8][9]

(G). Biases in belief interpretation are persistent, regardless of intelligence level.

(H). Biased recall of information

People may remember evidence selectively to reinforce their expectations. This effect is called “selective recall”.

(I). Francis Bacon (1561–1626) wrote:  “The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it.”

In the second volume of his The World as Will and Representation (1844), German philosopher Arthur Schopenhauer observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."[52]

(J). Desired conclusions are more likely to be believed true.

(K). In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence. (My comment:- This is exactly what happened in The Lucy Letby investigation.)

From Wikipedia, the free encyclopedia (article: Groupthink)

(L). Groupthink is a psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome.

(M). Cohesiveness, or the desire for cohesiveness, in a group may produce a tendency among its members to agree at all costs.[1] This causes the group to minimize conflict and reach a consensus decision without critical evaluation.[2][3]

(N). Members of a group can often feel under peer pressure to "go along with the crowd" for fear of "rocking the boat" or of how their speaking out will be perceived by the rest of the group.

(O). "Groupthink by Compulsion [...] [G]roupthink at least implies voluntarism. When this fails, the organization is not above outright intimidation

(P). Stereotyping those who are opposed to the group as weak, evil, biased, spiteful, impotent, or stupid. Self-censorship of ideas that deviate from the apparent group consensus.

(Q). Direct pressure to conform placed on any member who questions the group, couched in terms of "disloyalty".

(R). Members may disagree with the organization's decision, but go along with the group for many reasons, such as maintaining their group status and avoiding conflict with managers or workmates. Such members think that suggesting opinions contrary to others may lead to isolation from the group.


From Wikipedia, the free encyclopedia (article: Conformity)


(S). Conformity or conformism is the act of matching attitudes, beliefs, and behaviors to group norms, 

(T). This tendency to conform may result from subtle unconscious influences, or from direct and overt social pressure.

(U). People often conform from a desire for security within a group,

(V). This is often referred to as groupthink: a pattern of thought characterized by self-deception, forced manufacture of consent, and conformity to group values and ethics, which ignores realistic appraisal of other courses of action.

(W). Unwillingness to conform carries the risk of social rejection. Conformity strongly affects humans of all ages.[10]

(X). According to Herbert Kelman, there are three types of conformity, which include public conformity, motivated by the need for approval or the fear of disapproval;

(Y). According to Deutsch and Gérard (1955), conformity results from a motivational conflict (between the fear of being socially rejected and the wish to say what we think is correct)

With the above features in mind, let’s now see how Confirmation Bias, Groupthink, and Conformity (ie:- Conformity Bias) brought about this obscene witch-hunt.

We start out with a conflict that arose between the consultants and Lucy Letby. She was a “whistle-blower”, and they wanted her removed from the unit solely on this basis. The consultants, in their bid to remove her, accused her of harming babies on the ward. The question now arises:- Did the consultants know perfectly well that she had never harmed any babies, and accuse her nevertheless? If that were the case, then they would be utterly evil. However, I do not believe that this was the case. In order to clarify the “psychology” that pertained to this accusation, allow me to provide some quotes from the world of literature which may illuminate the issue:-

“Men freely believe that which they desire.” (Caesar, De bello Gallico, Book iii, section 18.)

“Man prefers to believe what he prefers to be true.” (Francis Bacon, Aphorisms, Number 49.)

“What ardently we wish, we soon believe.” (Young, Night Thoughts, Night vii, 2, 1233.)

“You believe that easily which you hope for earnestly.” (Terence.)

“Thy wish was father, Harry, to that thought.” (Shakespeare, Henry IV, Part 2.)

“Obstinacy’s ne’er so stiff – As when ‘tis in a wrong belief.” (Butler.)

Also, from the Wikipedia material above:- (J). Desired conclusions are more likely to be believed true.

The above quotes and passages illuminate the psychological process involved in the accusations levelled by the consultants at Lucy Letby. The consultants wanted her out of the unit. They managed to persuade themselves that they really did have a genuine and valid reason for removing her. The human mind is capable of an infinite degree of self-deception. These consultants first deceived themselves, and then commenced the process of deceiving everyone else.

Once the consultants had reported their suspicions to the police, the police were duty bound to investigate the matter. The police spent several million pounds on the investigation. At this point, the police succumbed to THE SUNK COST FALLACY. This is what Wikipedia has to say about the sunk cost fallacy (see the Wikipedia article “Sunk cost”):- Sunk costs often influence people's decisions, with people believing that investments (i.e., sunk costs) justify further expenditures. People demonstrate "a greater tendency to continue an endeavour once an investment in money, effort, or time has been made". This is the sunk cost fallacy, and such behaviour may be described as "throwing good money after bad", while refusing to succumb to what may be described as "cutting one's losses".

Once the police had spent several million pounds, they had “crossed the Rubicon”. Deciding to proceed no further would have seemed like a “defeat” for the police. Getting a “result” would seem like a “win” for them. If they “cut their losses”, then heads would roll, and careers would be blighted. They HAD to proceed towards a conviction. The police started to believe the scenario that they “ardently wished for”. “What ardently we wish, we soon believe.” (Young, Night Thoughts, Night vii, 2, 1233.)

Furthermore, GROUPTHINK now came into play. Recalling the Wikipedia article:- 

(L). Groupthink is a psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome.

(M). Cohesiveness, or the desire for cohesiveness, in a group may produce a tendency among its members to agree at all costs.[1] This causes the group to minimize conflict and reach a consensus decision without critical evaluation.[2][3]

(W). Unwillingness to conform carries the risk of social rejection. Conformity strongly affects humans of all ages.[10]

The police form a “group”, and it is necessary for any individual police officer to adhere to the group’s beliefs, in order to advance their career.

Also, CONFIRMATION BIAS came into play. Recalling the Wikipedia article above:-  (K). In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.[122]

The Crown Prosecution Service and the Prosecuting Counsel also “suffered” the above psychological mechanisms. Furthermore, in the Adversarial System current in The UK, the prosecuting counsel's “job” is to WIN, not to pursue some nebulous notion of “truth and justice”. If they can get away with deliberately (or “accidentally” – NO! I DON'T THINK SO!) hiding exculpatory evidence from the jury, then they are only doing the job that they are paid to do, leaving it up to the defence to seek and find exculpatory evidence as best they can.

When it comes to Doctor Dewi Evans, he is “ruled” by the same psychological mechanisms as detailed above. He “ardently wishes” to believe the scenarios that he proposes, and “What ardently we wish, we soon believe.” (Young, Night Thoughts, Night vii, 2, 1233.) He has the further “motivation” of financial gain (although I am sure this could not possibly have influenced his testimony in court – and I am also sure that Father Christmas exists!). If he had testified that it was impossible to hazard anything more than a guess as to the cause of the deaths of the infants, then the police would never have employed him again.

Now we come to the jury. They heard weeks of prosecution evidence before hearing any (scant) defence evidence. According to accounts of people who overheard the comments of some of the jury members, they had already made up their minds.

To substantiate this claim, the following quote is from the article by Rachel Aviv in The New Yorker (issue for May 20th, 2024), entitled “A Reporter at Large – Conviction”, page 48:- The court received a communication from someone claiming to have heard a juror saying that the jurors “had already made up their minds about her case from the start”. (My comment:- This relates to (C), the irrational primacy effect – a greater reliance on information encountered early in a series. Since the evidence in a jury trial can be complex, jurors often reach decisions about the verdict early on.)

Once the jury had made up their minds, it would be psychologically difficult for them to later on alter their previously formed opinion. Referring to the Wikipedia articles above:-

(A). Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.[3]

(B). The effect is strongest for - - - emotionally charged issues, (My comment:- This was a very “emotionally charged issue!”)

(D). A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs.

There is one further issue to explore:- Doctor Jayaram believes that he remembers that Lucy Letby was alone with a baby whose breathing tube had become dislodged. I discuss this specific issue in Chapter 15. All the evidence suggests that he is mistaken about this incident (or simply a malicious liar – but no – I would never accuse him of actually lying!). Referring back to the Wikipedia article –

(H). Biased recall of information

People may remember evidence selectively to reinforce their expectations. This effect is called “selective recall”.

Doctor Jayaram is suffering from “selective recall” here.

Doctor Jayaram also SUBSEQUENTLY “selectively recalled” that a baby had a rash (implying “air embolism” – the deliberate injection of air into the baby). He did not mention this at the time in the written notes. (See Chapter 12 for details.)

“Men freely believe that which they desire.” (Caesar, De bello Gallico, Book iii, section 18.)

Doctor Jayaram “desires” to discover some information that can have “whistle-blower” Lucy Letby removed from the unit, and he “freely believes” that which he “desires”. The human mind is capable of an infinite degree of self-deception.

To confirm the above statements, here are some quotes from the paper “Healthcare Serial Killer or Coincidence? Statistical Issues in Investigation of Suspected Medical Misconduct”, by The Royal Statistical Society's Statistics and The Law Section. (Available on the internet. Type into Google the search Royal Statistical Society Healthcare serial killer or coincidence.)

Justice systems are sometimes called upon to evaluate cases in which healthcare professionals are suspected of killing their patients illegally. These cases are difficult to evaluate

The cases often turn, in part, on statistical evidence that is difficult for lay people and even legal professionals to evaluate.

Furthermore, the statistical evidence may be distorted by biases, hidden or apparent, in the investigative process that render it misleading.

Our focus is on ways that investigators’ desires and expectations may unintentionally and even unconsciously influence what they look for, how they characterise and classify what they find, what they deem to be relevant and irrelevant, and what they choose to disclose. Examiner bias is a well-known phenomenon in both scientific and forensic investigations. It arises in large part from what are known as observer effects, a tendency for human beings to look for data confirming their expectations (confirmation bias) and to interpret data in ways that are subtly (and often unconsciously) influenced by their expectations and desires. Statisticians have long studied the ways in which examiner bias can distort statistical evidence emerging from scientific and forensic investigations.

 We go on to discuss ways to reduce “tunnel vision” in which the investigation becomes a search for evidence confirming a particular investigative theory while ignoring or dismissing evidence inconsistent with that theory.

4. Investigative bias. Because criminal investigations are carried out by human beings, investigative findings may be influenced by common human tendencies (often called biases) that can affect the way investigators search for and evaluate evidence, as well as how they choose to report findings.

People’s expectations and desires can influence what they look for and how they evaluate what they find when they seek answers to important questions. The tendency of preconceptions and motives to influence people’s interpretation of evidence has been called “one of the most venerable ideas of … traditional epistemology…” as well as “… one of the better demonstrated findings of twentieth-century psychology”.

The potential for observer effects to distort scientific investigations was recognised by early astronomers, who discovered differences in reported findings of the same astronomical phenomena by different observers.

Scientists’ failure to notice (or at least to report) phenomena inconsistent with their theory-based expectations; reported findings that support pet theories but cannot be replicated; and the statistically improbable degree of correspondence that has been observed between some reported findings and theoretical expectations

The same scope for biased data collection has been noted in criminal investigations. Miscarriages of justice are often attributed to “tunnel vision” and “confirmation bias,” processes that may lead investigators to “focus on a particular conclusion and then filter all evidence in a case through the lens provided by that conclusion.

All information supporting the adopted conclusion is elevated in significance, viewed as consistent with the other evidence and deemed relevant and probative. Evidence inconsistent with the chosen theory is easily overlooked, or dismissed as irrelevant, incredible, or unreliable. Common investigative practices noted in the commentary by Findlay and Scott include tendencies for investigators: • to settle too quickly on a preferred theory, without adequately considering alternatives; • to look for evidence that confirms or supports the preferred theory rather than seeking evidence that might disconfirm it or support alternatives; • to notice, remember and record evidence more readily and reliably when the evidence is consistent with the preferred theory than when it is not; • to interpret ambiguous evidence in a manner consistent with the preferred theory; • to view evidence and interpretations as more credible when they support the preferred theory, and vice versa; • to report findings with a higher degree of confidence if they support rather than contradict the expected result; • to fail to hand over or disclose all the countervailing evidence to the defence; and • to have skewed incentives to boost their case.

Psychological research suggests that people have a general tendency to gravitate toward criminality as an explanation for seemingly anomalous events, rather than looking at situational or institutional factors; this is called the “fundamental attribution error”. It causes people to look to the person (ie, to human agency) rather than the situation when explaining events. There is strong and well-demonstrated psychological tendency for people to assume that bad things are caused by bad people rather than bad circumstances (cf. the common public need to attribute blame to individuals, for “heads to roll”, in cases of systemic failure in public services). Hence, people may tend to look for scapegoats to blame for bad medical outcomes arising from other causes, and this is often encouraged by sensationalist reporting in the media.

Cognitive biases can also affect the way that investigators interpret and classify data, and thereby distort the findings that emerge from an investigation. Epidemiological and statistical methods used in investigations of disease outbreaks or clusters of adverse events are applicable to investigating clusters of deaths. Whether a particular death should be deemed “suspicious”, for example, might be influenced by a variety of factors, including factors that have little or no diagnostic value. Cognitive psychologists have found that people often have limited insight into the factors that influence such evaluations, so can be influenced by their own expectations or motives without realising it.34 The largely unconscious nature of these processes makes the resulting biases difficult to remedy.

“Contextual bias”. For present purposes, the key finding of the Dror et al. study is that forensic pathologists’ manner-of death determinations can be influenced by contextual information, such as information about who was caring for the decedent.

Let us consider how that might affect the fairness of the kinds of investigations we are discussing here. Suppose, for example, that a forensic pathologist is more likely to determine that a patient’s death was “suspicious” and hence possibly due to homicide if aware that the patient was under the care of a suspected serial killer.

This might happen because the forensic pathologist thinks it is proper and appropriate to consider such information when evaluating cause of death. Even if the forensic pathologist tries to ignore such information, however, it may still bias the evaluation by creating an expectation of homicide when the pathologist reviews cases associated with the suspected serial killer, and it may do so without the pathologist being aware of it. Contextual information of this kind may also affect thresholds for reporting. Concern about missing possible victims may cause them to lower their threshold for reporting “possible homicide” when evaluating patients attended by the suspect; while concern about casting suspicion on an innocent person causes them to raise the reporting threshold for patients attended by other nurses. Consequently, when their evaluation of the medical evidence leaves them uncertain, forensic pathologists may be more likely to report a case as a possible homicide if they know the nurse on duty was a suspected serial killer, and less likely if another nurse was on duty. Regardless of how it occurs, this kind of bias would undermine the fairness of the investigation by causing an increase in the count of “suspicious” deaths associated with the nurse. The higher count would arise from the very suspicions that the investigation is supposed to evaluate – an example of circular reasoning.

Like judgments about manner of death, judgments about access and opportunity to kill are complex subjective assessments on which different experts may have differing opinions (and, where experts are party-appointed, have implicit “advocacy-bias”). Hence, they are also the kinds of judgments that may be influenced by contextual bias. There is a risk that investigators will be influenced by their expectations and desires. It is possible, for example, that they will cast a wider net when looking for “suspicious deaths” that can be linked to a suspect; and a narrower net when counting suspicious deaths that occurred when the suspect was not present. As a consequence, the deaths counted against the suspected individual could increase (relative to deaths counted against others) for the very reason that the suspect has come under suspicion.

The officials who guide the investigation may have an interest in supporting particular outcomes, which could hinder the ability of investigators to identify the full range of possible causal factors.

 

 Suppose for example, that the increase in deaths that prompted the investigation arose after administrative changes that affected staff levels, training, or supervision. To avoid any implications of responsibility for a surge in deaths, the administrators may well prefer that the investigation focus on a single bad apple on the staff, rather than these background factors, and hence may de-emphasise or ignore them. This self-interested guidance may prevent investigators from recognising causal factors that confound their assessment of the rate of deaths attributable to a particular individual.

The potential for cognitive bias and the subtle and often unconscious ways it can influence expert judgments,

Medical authorities may have an interest in the outcome of the investigation that influences what they tell the police about possible causal factors. For example, faced with an upsurge in patient deaths, hospital administrators may find it easier to imagine that it was caused by individual misconduct of a “bad apple” on the staff than to acknowledge that it may have arisen from administrative decisions related to staffing and service levels; this may happen unconsciously. (My comment:- Yes. I think that the Hospital Management of The Countess of Chester Hospital did indeed have an “interest” in finding a “bad apple” as a “patsy” or “scapegoat”, so as to avoid being themselves sued for negligence that caused unnecessary deaths!)

The United States’ National Academy of Sciences observed that “forensic science experts are vulnerable to cognitive and contextual bias” that “renders experts vulnerable to making erroneous identifications.” 

In the United Kingdom, the Forensic Science Regulator reached similar conclusions.

Bias may arise from investigators’ failure to consider all possible explanations for the deaths or other negative outcomes under investigation.

The following quotes are from the book Lucy is Innocent, by Paul Bamford, SECOND EDITION, ISBN 9798326484130. (I strongly recommend this book.)

Page 385:- Doctor Shahram Heshmat, associate professor of health economics at The University of Illinois, states that strong emotion tends to produce tunnel vision. (My comment:- A nurse suspected of killing babies might well produce strong emotions, and hence tunnel vision.)

Page 219:- Regarding the air embolism accusation in relation to Baby M. (This was suspected because it was incorrectly believed that a rash on a baby could suggest air embolism.) Doctor Jayaram did not record a rash on the baby in the case notes. He only LATER recalled this rash (ie:- when he realised that it would incriminate Lucy Letby). (My comment:- This is “selective recall”. (H). Biased recall of information. People may remember evidence selectively to reinforce their expectations. This effect is called “selective recall”.)

THE PREJUDICIAL MEDIA FRENZY SURROUNDING THE TRIAL.

Here is a quote from the book Baby Killer 2: Is Lucy Letby Innocent?, by Stu Armstrong, ISBN 9798301688835, printed by Amazon. (I strongly recommend this book.)

Pages 89 to 92:- When Lucy Letby was first arrested in 2018, the tabloids displayed headlines such as “Angel of Death”. The investigative journalist James Calder stated that this media frenzy created intense pressure on the police to secure a conviction.

This kind of coverage – just to sell more newspapers – is highly prejudicial to a case. If you don’t believe me, here is a quote from the book Math on Trial – How Numbers Get Used and Abused in The Courtroom, by mathematicians Leila Schneps and Coralie Colmez, published by Basic Books, 2013:-

Page 46:- “Sneed requested a change of venue for the trial on the grounds that the murder had created a vast amount of publicity - - - and - - - local prejudice - - - the trial was moved” (to a distant location). (My highlighting.)

My comment:- So there we have it. A host of psychological factors caused doctors, police, the hospital, the prosecutors, and the “expert” witness to construct a shaky edifice of confabulation in order to convict an angel of mercy.