Lie Detection and the Law: Torture, Technology and Truth

My latest book, ‘Lie Detection and the Law: Torture, Technology and Truth’, is out now with Routledge (you can buy it on Routledge or Amazon, and you can preview it and buy the ebook on Google Books). It is part of Routledge’s ‘Law, Science and Society’ series, which is worth taking a look through.

Chapter 1 opens with an example from the history of the polygraph’s use in criminal investigations, briefly reports on how the device’s validity and reliability have long been contested, reviews some key philosophical starting points for understanding lying, and takes some tentative steps towards a sociological approach.

Chapter 2 explores the use of torture as a form of lie detection in practices of ‘trial by ordeal’ and in the Elizabethan period before outlining the literature on the emergence of the polygraph machine and the trial by jury in the United States. Together, these threads help describe how conceptualisations of the body and its relation to truth have been invested in lie detection technologies.

Chapter 3 provides an overview of a heuristic often used to explain the legal status of the polygraph machine in US criminal courts: that it is either inadmissible, admissible with prior stipulation, or admissible even without prior stipulation. It then argues that although this heuristic is useful, there is a need for greater detail, exploring the tensions and complexity involved in characterising the socio-legal status of the polygraph, a task followed up in Chapters 4 and 5.

A contemporary polygraph kit

Chapter 4 begins to provide a more detailed picture of how the polygraph machine is challenged and managed within US State Supreme Courts, outlining the ‘exclusionary toolkit’ and how this is used to highlight uncertainties in lie detection research and application.

Chapter 5 extends this analysis by examining how ontological uncertainties in polygraph science and interrogation practices are negotiated in trial settings by reference to broader techno-political currents in the United States. It shows that the ontological connections between body, truth and lying are enacted differently over time in a case study of the Massachusetts State Supreme Court.

Peter Reilly, who was convicted of murdering his mother on the basis of a polygraph-induced confession.

Chapter 6 explains how the polygraph machine is used in police interrogations of criminal suspects and witnesses. It describes in detail the deceptive techniques used by polygraph interrogators to manipulate subjects into confession. The chapter evaluates these techniques in light of the tragic case of Peter Reilly, showing how lie detection practices can bring about false confessions.

Chapter 7 explores the use of the polygraph in the socio-legal periphery of the criminal trial, namely in probation and treatment programmes for sex offenders. It describes the emergence of sex offender polygraph testing in the USA and UK, setting this against a background of moral panics in both countries, but also linking it to a longer history of socio-technical practices through which certain groups have been made into subjects of suspicion.

Chapter 8 reviews the ways in which uncertainty in lie detection has figured in its techno-legal configurations within socio-legal situations. It explores the implications of this account for developing a sociological approach to lying, drawing on key insights from Georg Simmel and others, and indicates why these need to be revised to reflect today’s biopolitical schemes of social order, and to address the complex struggles over information and truth in contemporary sociotechnical systems and ‘post-truth’ politics.

If you’re interested in the book, you might also want to read other posts on lying and lie detection on my blog, and some other papers I’ve written on the subject, which you can find on Academia.edu.



The Experiment with Lie Detection Should Be Ended

The polygraph machine – or ‘lie detector’ – has long been tied up with sex and sexuality, from the use of the device to out homosexuals during McCarthyist witch-hunts to the recent use of polygraphs to monitor convicted sex offenders. Reports of the success of this programme warrant scepticism and careful analysis, not least because the machine doesn’t detect lies, but also because the history of polygraphy shows how readily its use in one area spreads into all realms of social life.

The majority of psychologists will tell you that the polygraph machine does not work – and they are right. It doesn’t detect lies. Nonetheless, over the past century the machine has been used in a variety of situations in the United States, including job applicant screening, police investigations, divorce settlements, and the resolution of family and business disputes. Its proponents have sold it more on a political promise than on scientific credentials: as a tool in the fight against crime, to challenge police corruption or to combat terrorism. Sex offenders are just the most recent group to play a role in these scientific and political fictions.

In the UK we have thus far been more circumspect about lie detection, and thanks to this scepticism the polygraph does not have the kind of mythic quality that it seems to engender in American hearts. However, there are a few areas into which examinations have crept, chief amongst them their use to monitor sex offenders on probation. After changes to the Offender Management Act in 2007, trials were run and polygraphy was eventually rolled out across the UK. It is now reported that through these routine examinations 63 of 492 convicted sex offenders have been returned to prison after admitting breaches of their probation conditions.

On the face of it, of course, this is a good thing, for nobody could deny that sex offenders who breach their parole conditions should be returned to prison. However, there are serious concerns that should be raised about the efficacy of this technique and its longer-term consequences.

The chief problem is the kind of evidence the polygraph produces. The thrust of the research into polygraphing sex offenders aims to assess how far the machine goes to elicit so-called ‘clinically significant disclosures’. These are effectively confessions of one kind or another, and are typically used to evaluate the offender’s riskiness, change treatment strategies or alter probation conditions.

In other words, the validity of the polygraph is seen to rest not so much on whether it can detect lies as on whether it can get offenders to make more disclosures about their behaviour. However, reports from the programme’s trial show that most disclosures are made not during the actual test but in the pre-test or post-test interviews. The polygraph’s role is less lie detector and more threat, used to induce disclosures. In this way, it works only so long as offenders believe that it works, and any disclosure made is treated as evidence that offenders are being more honest.

This is not a good basis for making crucial legal decisions: sex offenders returned to prison are likely to learn from each other that the machine does not work and develop ways to cheat the test. There are courses one can take and books one can read about beating the machine. The fact is that the use of the polygraph in this context, as in others, is based on a game of bluff – who can convince whom that they are telling the truth? The examiner, when they say that the machine is infallible, or the examinee, when they say that they have not lied?

It is distinctly unwise to rely on such a game of bluff, even if only as an additional method for helping to determine risk. Offenders will, over time, adjust their strategies for getting through probation and back into the general population. Most sex offenders are already skilled liars and will not buy into the lie of the lie detector for very long.

It is also important to take into account the situation in which this technology has taken root. Lie detection was first considered in the UK at the peak of a media-provoked cycle of one-upmanship between Labour and the Conservative Party over who could be toughest on crime, a contest which has escalated over the last two decades and has always been most vociferous when focused on paedophiles. As the former Justice Minister for the Coalition government, Jeremy Wright, said of the implementation of the programme: “this will give us one of the toughest approaches in the world to managing this group.” But toughness cannot be an end in itself. This is not the kind of politics we need in such a sensitive and complex situation – where the prevention of crimes against children is at stake, cooler heads must prevail. Lie detection breeds confidence in probation decisions where no confidence is justified.

We have to be careful not to let the use of the polygraph in one context help it spread to another. In the USA the spread of polygraph exams began in the 1940s and today millions of exams are conducted every year by thousands upon thousands of examiners in many different situations. In some cases, it was the use of the lie detector by government departments like the CIA, FBI or Department of Energy, which helped justify its spread. Trial courts heard that since the machine was trusted in the most sensitive parts of government security, it should also be used as evidence of guilt or innocence. The machine is now used in criminal justice systems far more regularly than is realized by most observers. And, in the police interrogation room, there are more than a few cases in which the polygraph exam has been directly responsible for the production of false confessions.

We must guard against the use of lie detection in the UK, no matter the context, but especially in those situations in which the risks are so high. The crimes of sex offenders warrant surveillance during probation, but the polygraph does not warrant our credence. For it does not work, and sooner or later it will be found within a divorce proceeding, a job interview or a routine matter of airport surveillance, and your heart will be the one that is beating out a rhythm on the examiner’s chart.



Three Questions I Get Asked

Here are three questions that natural scientists and engineers often ask me, and which are commonly asked of social scientists participating in interdisciplinary collaborations. 

  1. Why do people not like X?

e.g. why do people not like synthetic food additives?

This question usually has to do with a perception that scientists, or a particular field of scientific work, is not viewed favourably by ‘the public’ or by industry, governments, NGOs, and so on. Natural scientists and engineers sometimes have the impression that social scientists will be able to explain why it is that their technical ambitions have been thwarted by a political misconception or misunderstanding.

The question is sometimes embedded in a range of other faulty assumptions about public understanding of science and science communication. As a question, it can position social scientists as experts in the irrational behaviour of individuals and groups. Social scientists can also be positioned through this style of questioning as brokers, and so be expected to help to open governance and/or public doors.

I’d rather talk about:

What do people actually say and do in relation to X? What practices are involved in regards to X and how do you envision X changing, supplementing or supplanting those practices?

When it comes to something like food additives, for example, I’d be interested to know how people make sense of ‘synthetic’ and ‘natural’ from within cooking, eating, feeding and caring practices. We could ask in what ways these concepts are important to the organisation of such practices, or in what ways these practices organise our demarcations of those concepts. It could be important to explore how novel methods of producing food additives sit alongside or displace existing methods of production, and with what global socioeconomic implications.

In this regard we could begin to understand why the notion or production of synthetic food additives might raise socio-political, economic or ethical questions from publics, NGOs, governments and so forth, rather than assuming people don’t like it because of ignorance.


  2. What will people think of X?

e.g. what will people think of brain-based lie detection?

This type of question usually has to do with natural scientists’ worries about what ‘the public’ will think about their planned innovations or technical recommendations. It comes from a recognition that publics are interested and invested in scientific knowledge production. However, it is often motivated by a desire to ensure that people will think positively about X, and so is sometimes accompanied by a will to understand how to encourage people to feel positively about X.

The question is sometimes embedded in a range of other faulty assumptions about how people’s negative feelings about proposed technologies will inexorably lead to market failure and that this is necessarily a bad thing. It positions social scientists as PR gurus or pollsters, who can help to take the public temperature and design marketing strategies for innovations.

I’d rather talk about:

What kinds of imagined (necessary, likely, possible or peripheral) actions, routines, relations and social structures are being embedded in the sociotechnical work being conducted? In other words, this puts the emphasis not so much on the object of technical interest as on how the envisaged object might impact upon existing practices of life, relations and social order, or open up new practices. Do we want these kinds of practices, and why or why not? What effects will these changes have, and with what implications for people’s experiences of social and technical phenomena?

In the given example of brain-based lie detection, I would want to know more about how we do lie detection now in different contexts and why we do it that way. I’d be interested to know how brain-based techniques would change these contexts if implemented and what implications such changes might have for our ways of living with each other. If we were talking about using brain scanning as evidence for use in legal proceedings, for example, we’d have to think carefully about why we currently use the investigation, interview, interrogation and jury systems. How would brain-based technologies fit in these practices and what would change? What kinds of life and forms of justice are implicated in such changes?

In this regard we could begin to unpick some of the tangle of concepts, practices, norms, politics and so on that are bound up with our current ways of doing lie detection and thus better understand what would be at stake for someone who is asked to give an opinion on a new lie detection technology.


  3. Is it okay to do X?

e.g. is it okay to engineer life?

This question usually has to do with a perceived ethical ‘implication’ of some proposed technical innovation. The questions often centre on technical objects. They involve a recognition that sociotechnical innovation generally implies changes in how things are done or understood. They might have to do with abstract concepts like life or nature. However, by emphasising the objects of technical innovation or abstract questions these kinds of concerns largely miss the everyday practices that are at the heart of how ethical decisions and dispositions are made and formed.

This type of question is sometimes embedded in a range of assumptions about scientific objectivity and how ethical implications arise only from the implementation of knowledge and new technologies in the world rather than in the practices of knowledge production itself. In addition, such questions often come with the implication that X is going to happen anyway, but it would be good to know what moral status it is going to be given when it does happen.

This style of questioning is more comfortable for some social scientists than others, since some of us are experts in ethics. However, in the way the question is generally posed it positions social scientists as ethical arbiters, who themselves are being asked to judge the moral status of objects and so assess the social value of proposed innovations in order to help scientists justify actions they know they are going to take. This is a bit tricky and can be a far less comfortable space to inhabit.

I’d rather talk about:

What kinds of moral or ethical values are embedded in the scientific practices out of which the question has emerged? In other words, what has been decided about ethics already, what kinds of questions have been closed off and with what justification?

I’d also be looking to explore what kinds of ethics and norms are used in contexts in which the proposed X is being invoked. Are there differences in the ethical frameworks used to think about X across different spaces and times and in what ways do these differ? If there are differences of opinion about the ethical valence of X how do we decide amongst such opinions in governance, regulation, and technical innovation practices?  

Tough on Crime? Lie detector programme for sex offenders doesn’t hold all the answers.

The Coalition has decided to drop the privatisation of polygraph, or ‘lie detector’, tests for sex offenders. But the continued use of this flawed technology within the probation service is misguided, and the whole programme should be scrapped.

Since the Offender Management Act was changed in 2007 to allow for the attachment of a polygraph condition to terms of probation, trials of the device for use with post-conviction sex offenders have been taking place in the Midlands. These concluded in 2012, were reported by the government to be a success, and the programme is due to be rolled out nationally.

However, in scientific and legal communities the polygraph’s validity (whether it can detect lies at all) and its reliability (how regularly it makes an error in categorising truths as lies and vice versa) are highly contested, and have been since its early development at the turn of the 20th century.
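The distinction matters because reliability interacts with base rates: even a test that is right most of the time will produce mostly false alarms when actual deception is rare. A minimal sketch of this base-rate arithmetic (the function name and every number below are illustrative assumptions of mine, not figures from the trials):

```python
# Base-rate sketch: how often is a 'deception indicated' result a real lie?
# All numbers are illustrative assumptions, not data from the UK trials.

def positive_predictive_value(sensitivity: float, specificity: float,
                              base_rate: float) -> float:
    """P(actually lying | test says 'deception indicated')."""
    true_pos = sensitivity * base_rate               # lies correctly flagged
    false_pos = (1 - specificity) * (1 - base_rate)  # truths wrongly flagged
    return true_pos / (true_pos + false_pos)

# Suppose the test flags 85% of lies (sensitivity), correctly clears 85% of
# truthful answers (specificity), and only 10% of answers are actually lies.
ppv = positive_predictive_value(0.85, 0.85, 0.10)
print(f"P(lying | 'deception indicated') = {ppv:.2f}")  # ≈ 0.39
```

On these assumed figures, only about four in ten ‘deception indicated’ results would correspond to actual lies – one reason headline accuracy claims deserve scepticism.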

Above all, it is far from clear whether the use of these measures reduces reoffending rates or improves offender rehabilitation outcomes.

The polygraph machine was first adopted in the UK at the peak of a media-provoked cycle of one-upmanship between Labour and the Conservative Party over who could be toughest on crime.

This rhetorical contest has escalated over the last two decades and has always been most vociferous when it focussed on sex offenders – particularly paedophiles.

Tony Blair’s 2005 Labour Party manifesto promised to trial the lie detector for use in the monitoring and treatment of paedophiles post-conviction, and thus opened up the UK to the official use of the device for the first time.

The political justification for their use continues to rely on the idea that parties have to be seen to be tough. As Justice Minister Jeremy Wright has said: “Introducing lie detector tests, alongside the sex offenders register and close monitoring in the community, will give us one of the toughest approaches in the world to managing this group.”

What about the scientific justification? According to proponents, the polygraph works by measuring the concurrence of certain physiological responses (e.g. pulse rate) with deceptive behaviours. The use of the device with sex offenders would be to ensure they are being truthful about their behaviour during probation. However, deception can occur in the absence of these physical responses, and the physical responses can occur in the absence of deception. Moreover, the most extensive US National Academy of Sciences report on the device concluded that the polygraph’s reliability was flawed as regards its real-world generalisability and that additional basic research was needed.

Scientists involved in the UK trials with sex offenders have conducted research into its efficacy. However, rather than focussing on validity and reliability the research has concentrated on the value of the polygraph as regards the elicitation of ‘clinically significant disclosures’ (CSDs).

Such CSDs are typically used to evaluate the offender’s riskiness, change their treatment strategies or alter their probation conditions.

In this context, the validity of the polygraph is seen to rest not so much on whether it can detect lies as on whether it can get offenders to make more disclosures about their behaviours. Even if we accept this premise, the question of reliability remains: how often does it make mistakes in categorising those disclosures as true or false?

The report on the 2012 trial has a worrying feature in this regard. It turns out that the majority of the disclosures are not made during the actual test but in the pre-test interview. As such, the polygraph’s role appears to be less a lie detector and more a threat or method of inducing confessions from offenders.

Reportedly, the use of the test is of value to offender managers because it gives them confidence that offenders are sticking to their probation conditions, discloses risks and allows managers to challenge them.

Given the focus on numbers of CSDs it seems distinctly unwise to rely on the polygraph as a method for helping to determine risk and of inducing confessions, particularly when many of these disclosures are being made before actually connecting up the device to the offender. As such, a significant risk in adopting the polygraph in this context may be an over-reliance on the veracity of CSDs.

Offenders are likely to be able to manipulate the examination to their own means just as much as managers and examiners are able to use the examination to elicit CSDs.

There is no research at all on strategies the offenders may use to make CSDs in relation to anticipated examinations or during pre- and post-test interviews.

If an offender knows that they have breached the terms of their probation and fears that their examination is going to result in a ‘deception indicated’ result, then they might well offer less significant CSDs in advance of the examination in order to help shape the interpretation of their results.

Furthermore, we don’t have any information on what might happen when an offender has an erroneous result of ‘deception indicated’. When false positives occur, it could be risky for offenders to maintain that they are not lying. If the polygraph is to be believed, this means they are not acting in a trustworthy manner. In line with this suspicion their treatment and probation conditions might be changed. Might offenders provide CSDs that are themselves lies in order to convince the examiner and manager that they are now telling the truth? We have to know a lot more about this technology and how it is used before we trust it to help determine the riskiness of offender behaviours.

Finally, technologies used to manage sexual offenders often leak into other areas and it is possible that polygraphs could in future find use in other contexts of treatment and rehabilitation, particularly if corporations could profit from their adoption.

However, the government’s idea to privatise the probation service has received a poor assessment from its own internal reviews and has been dropped. Instead, the Ministry of Justice plans to go ahead with the programme, but will keep this part of the probation service in-house.

But this doesn’t go far enough. Focussing on how tough the programme is on offenders and on the value of ‘clinically significant disclosures’ papers over the critical question of whether these measures reduce reoffending rates and improve offenders’ rehabilitation outcomes.

Ultimately, the safety of children, communities and the offenders themselves will not be improved by increasingly punitive measures if they do not tackle the causes of sexual offending, improve rehabilitation and reduce the rates of sexual offences.

The adoption of technologies that are seen to be tough on offenders should not be an end in itself.

The Grisly Truth about Truth Drugs Research

Damian Lewis from a scene in CIA TV drama ‘Homeland’

This is the second part of a post on truth drugs, the first part is here

Inscribed in the stone of the original CIA building was a motto taken from the Gospel according to St John: “And ye shall know the truth and the truth shall make you free.” Of course, one of the major responsibilities of the Central Intelligence Agency is the collection and collation of intelligence materials in order to discern the truth of other governments’ plans and operations. In the era of fears about Soviet espionage, the intensity of this demand was perhaps as great as it is today in the context of terrorism. The Agency was under immense pressure to pre-empt the actions of Soviet powers, and one important mechanism in its pursuit of this goal was the extraction of information during the interrogation of suspected spies and sympathisers. Interrogations, however, were a difficult business, particularly as operatives were often trained in resistance techniques. The CIA thus reached out to science in order to improve its interrogation results. Though far from being the beginning of the role of psychology in military and security services, this period was hugely influential in shaping the relation.

Dr Gottlieb

In 1953, the CIA collated their psychological research into a programme codenamed ‘MKUltra’. The director of the programme was a chemist named Dr Sidney Gottlieb, who believed the Agency had to understand the ‘mind-washing’ strategies he felt their enemies were undoubtedly developing, a fear that made it vital for the US to develop its own science in this area. Gottlieb became known as the ‘Black Sorcerer’ for his involvement with a number of projects concerned with controlling behaviour by use of drugs and a range of other techniques. His colleagues and collaborators included, most notably, General William Donovan, Colonel George H. White and Dr Stanley Lovell. Perhaps the most terrifying of their many psychological brutalities was the work conducted on ‘sleeper agents’, whom they wanted to program to undertake covert actions, most obviously assassinations, without ever knowing they had been directed to do so.

No less violent and degrading, however, a significant strand of research in the programme was to discover drugs that would influence behaviour in such a way as to improve interrogations. In short, one of MKUltra’s main goals was to find drugs that would compel suspects to tell the truth. As I described in a previous post, there was a ready-to-hand possible candidate in the form of scopolamine, a nightshade-derived drug. In the 1920s Dr Robert House had discovered that giving the drug to women during childbirth caused them to talk uninhibitedly about their feelings and thoughts, which led him to believe that scopolamine might help in criminal interrogations. Though his efforts to develop the truth drug into a policing tool largely failed, the CIA picked up his research thirty years later and began investigations for other such compounds.

Ergot fungus (growing on rye) is the source of LSD.

Among the drugs that the CIA tested were psychedelics, one of the most prominent compounds being lysergic acid diethylamide, or LSD. General Donovan believed that finding a ‘speech-inducing’ drug was vital to the security of the US and that this warranted any effort in its discovery. Chief amongst the many controversial actions undertaken with these substances was their use on unsuspecting US citizens, primarily men and women who were most vulnerable and on the fringes of society. The test subjects included unwitting prostitutes, homeless people and patients in psychiatric hospitals. George White created ‘safe houses’ in Greenwich Village and San Francisco, from where he tested the drugs on these victims. White also set up a study on sexual behaviour and prostitution to understand how these might be used to extract information. Using funding from MKUltra, a number of psychologists and psychiatrists working around the US in universities and private institutions were supported to conduct research on the effects of LSD and how it could be used to control behaviour. For example, between 1955 and 1958 over 1,000 American ‘volunteers’ participated in a series of tests at the Army Chemical Warfare Laboratories in Edgewood, Maryland. In one case, 95 military personnel were given LSD at a fake drinks reception without their knowledge and then subjected to interrogation, polygraph examinations and isolation chambers, to explore how their traditional security training stood up in the face of these ‘drug-enhanced’ interrogations.

A soldier on LSD during the Porton Down woodland operation.

A number of LSD experiments thus involved subjects being given the drug without their consent, producing situations in which people died and others suffered severe psychological trauma without any explanation. It wasn’t only in America that these kinds of covert, illegal experiments were being conducted. In the UK, scientists working with MI6 at the Porton Down military research facility gave military personnel LSD without consent and observed their interactions and attempts to conduct a staged operation in a woodland. Three of these men later sued the government and were awarded compensation in an out-of-court settlement in 2006.

Perhaps the most widely reported tragedy was the death of Dr Frank Olson, a civilian employee of the US Army who specialised in ‘aerobiology’. He was invited to a cabin in the countryside for a semi-annual review along with several other scientists. At the cabin, Gottlieb and Robert Lashbrook gave each member of the group a drink from a bottle that contained 70 micrograms of LSD mixed with Cointreau. In the days after the ‘experiment’ Olson developed severe depression, and Gottlieb and Lashbrook arranged for him to be treated by one of the CIA-cleared LSD researchers in Washington. During the treatment, however, Olson threw himself out of a 10th-floor hotel window and died from the fall.

The rhetoric that has often been adopted in the development of lie detection devices and truth drugs is that they represent a more humane way of getting to the truth when compared to physical torture. Indeed, in a documentary on the work conducted as part of MKUltra, one scientist involved argues that if there is going to be war, it is better that it is done using the least barbaric means possible. Robert House thought scopolamine might alleviate the practices of brutality used in police interrogations, as did the developers of the polygraph. The creation of fMRI lie detection has similarly been occasioned by claims that scanning the brains of suspects is far better than the practices used in torture camps like Guantanamo Bay. What this line of argument implies is that science can free us from the darker side of the pursuit of truth, shining the enlightenment into the cells of prisons, and throwing open the window of interrogation chambers. But the history of truth drugs cautions us to examine this thinking and to be mindful that not all scientists act in humane ways and that not all science ostensibly done in the service of peace and security is without consequence. It is difficult to know if the intelligence services are using contemporary developments in neuroscience to explore modern methods of information extraction. But if there are a couple of things we can learn from the history of lie detection they are that we seem unable to stop ourselves from pursuing the creation of these devices and that the desire for the truth can sometimes be a dark one, conducted in the murky and shadowy parts of scientific and military culture.

Some links:

MKUltra Documentary:

Video reporting on LSD Experiments at Porton Down:

Senate Hearing on MKUltra:

Scopolamine, Truth-telling and the Other Dr House

Portrait of Daniel Defoe

The reasoning that recording physiological correlates can be used to discern truth and deception can be traced at least as far back as Daniel Defoe’s 1730 essay on the prevention of street crime. He wrote: “Guilt carries fear always about with it; there is a tremor in the blood of the thief.” Defoe advocated holding the wrists and measuring the pulse to detect a person in possession of a false tongue.

As Geoffrey Bunn has recently argued, a range of important concepts emerged in the genesis of criminology, not least the notion of ‘criminal man’, whose animalistic, biological nature was the source of his criminal behaviour. So in the late 1800s and early 1900s the body was increasingly tied to the mind and the various inscriptions produced from reading the body became vital to theorising mental and emotional events. Similarly, the theorisation of the unconscious as a quasi-spatial repository of personal truths made the mind a focus for physiological study. Moreover, biologists of the time investigating heredity imagined memory to be material, conceiving of it as a vibration of cells in parents, which were then transmitted to offspring. This helped them account for the transmission of ostensibly non-physical qualities that nonetheless seemed to be transferred from parents to offspring.

These and a great many more small changes in the discourse of crime and the body helped to consolidate the idea that some technique or technology could be used to access the internal state of a criminal suspect resistant to interrogation. Practitioners of applied psychology, developing their work most fervently from the 1870s onwards, produced a central set of technologies that examined psychic states by monitoring physiological changes. The years from 1870 to 1940 thus saw the development of numerous lie detection devices such as truth serum, sphygmomanometers, pneumographs and galvanic skin response monitors, some of which were consolidated into the ‘polygraph’ machine, patented several times from the 1930s onwards and now used throughout the USA to police a range of suspect categories.

From good historical work we now know quite a lot about the history of the polygraph machine, particularly its early years of development and deployment in the USA. However, we know a lot less about the emergence and use of truth serum.

Dr Robert House, administering his “truth serum” drug to an arrested man in a Texas jail.

In the 1920s a nightshade-derived drug, scopolamine hydrobromide, was trialled by Robert House, a Texas obstetrician, for use in the interrogation of two prisoners at the Dallas county jail. Dr House had observed the effects of scopolamine, administered alongside morphine and chloroform, on women during childbirth. This drug-induced state, caused by blocking the action of the neurotransmitter acetylcholine, became known as ‘twilight sleep’. House felt that the drug’s effects on women might be similarly produced in people suspected of concealment. The two prisoners interviewed by Dr House retained their original story, indicating to the Dallas physician that they were innocent. The evidence was submitted and the prisoners were found not guilty at their trial. The use of scopolamine as a ‘truth serum’, a term coined by the media and eventually adopted by House, was short-lived, mostly due to its dangerous side effects, and though it found brief use in the legal field it was generally unsuccessful.

Dr Gregory House and his slogan, ‘Everybody lies’.

Its popularity resided mostly within the media and was propelled not solely, but incessantly, by House. Much like the proponents of the polygraph, House believed that the truth serum would act not only on individuals to produce justice but on institutions too. He feared that the corruption of powerful members of society, both public and private, had reached severe levels, most dangerously so within the criminal justice system. At the time, aggressive interrogation methods had become endemic in US police investigations, a practice that became known as ‘the third degree’. The doctor saw his serum as the antidote to this social ill. As the more popularly known Dr Greg House from the US TV show says, ‘Everybody lies’. Indeed, the TV character of House seems to be a modern inheritor of the historical House’s cause for lie detection techniques. In the TV show, House regularly calls his patients out on their deceptions and prevarications, and in a few episodes uses the hospital’s fMRI machine to scan their brains and determine whether or not they are telling the truth.

However, Dr Robert House’s hopes for a truth serum that might act like a societal vaccine were never realised: he died in 1930 and the use of scopolamine as a truth drug mostly died with him. It was around this time that the ‘inventors’ of the polygraph were pushing their devices as cures for the corruption of police investigation practices. So although the idea of using chemical compounds in the interrogation of suspects survived, becoming known as ‘narco-analysis’, it never really competed with the rise of the polygraph machine. One important context in which it did endure, however, was the programme of human behavioural modification explored by the CIA in Project MKUltra, about which I’ll talk more in a future post. For now, here are some references that might be of interest:

Robert House, The Use of Scopolamine in Criminology

Geoffrey Bunn, The Truth Machine

Alison Winter, The Making of “Truth Serum”, 1920–1940

Melissa Littlefield, The Lying Brain

Risky Bodies and Dangerous Desire [II]

This is part two of some comments on sex offenders and lie detection (part one here). It is also a bit of a promo for my recently published paper on the topic, which you can download here.


We are in a period in which child sex offences cause moral panics and thus help extend the measures we are willing to take in punishing and policing them. Laws are passed under the names of victims to remind us of the cruel and brutal acts committed against children. The media drives up fear and anger because it sells print, and because it knows we need an enemy. The paedophile is now the sexual terrorist: his actions undermine the structural organisation of Western society by explicitly challenging the notion of ‘childhood’. This notion is not simply a natural consequence of our biology but an entangling of ‘social’ and ‘material’ phenomena. Amongst a number of other causes of the entrenchment of the notion of the innocent child was the fact that, once we had machines and automation in factories, on farms and elsewhere, we no longer needed children to do hard labour. The contemporaneous emergence of psychiatry also introduced a whole host of ways in which adult sexuality was connected to childhood experience, and thus the fracture of innocence became connected to criminal and deviant behaviour in adulthood. This is not to say that our bodies do not change as we grow older, or that our emotional ability to manage relationships, both sexual and familial, does not similarly develop. Instead, it is to point towards the cultural production of a relation between innocence, sex and criminality that underscores the construction of sex offenders as contemporary monsters and underpins media and moral panics.

A boy walks through a field on his way home from school, in an advert for a new tractor.

Against this background, a number of Western strategies of governance in education, sexual health and criminal behaviour can be seen to be geared around securing healthy, happy, playful and above all innocent lives for our children. The corollary of this is the ‘adult’: the sexually, intellectually and economically mature individual now capable of work, reproduction and decision-making. Challenging the norms of childhood and adulthood, paedophiles are thus coded as monstrous not only because of the acts in which they engage but because their symbolic function is to uphold the binary they seek to destroy. This helps explain our punitive obsession with them and with their bodies. The paedophile marks the far edge of the lengths to which we go to monitor, manage and predict criminal behaviour, and is unique in regards to the kinds of treatments and punishments we mete out.

Indeed, current responses to sex offenders are not oriented entirely towards their criminal and violent acts. They also evidence a fear about the offender’s desire itself. In treatment, we don’t just want them to change their behaviours; we want them to change their desires. Practices of role-play, lie detection and plethysmography are sometimes used (particularly in the USA at the moment) to help steer desires towards ‘normal’ objects: age appropriate (not simply above the age of consent, but of a ‘normal’ age for the male being treated), sexually conservative (the desires should be quite vanilla) and heteronormative (male-male desire is coded as increased risk).

Should our response to terrible acts of violence simply be to accept that, because of the crime, any punishment and any treatment is justified? Is it a woolly (or worse, ‘suspect’) argument to claim that sex offenders have rights too, and that punishing their acts is sensible but changing their desires is not? Even putting philosophy and politics aside, a pragmatic approach alone tells us that this isn’t a sensible system. If we accept, as we should, that a great many sexual offenders have committed violent and abusive acts against children, and that these should be criminal acts, then we should think pragmatically about how to respond. Whatever notion of punishment or law we wish to adopt, we can probably all agree that we would like to see fewer instances of such abuse. The social isolation of offenders in post-probation settings (named and shamed in the community, on the register, by the media) results in a lifestyle that facilitates further offending. Training offenders’ desires to fit the norms of a sexually conservative society surely only serves to further stigmatise their non-criminal sexual behaviours and desires.

By avoiding offenders and ostracising them we don’t protect children from further suffering, we place them at continued risk, precisely because offenders have no social resources to draw upon in changing behaviours and avoiding risky situations. If the only people who will talk to them are other offenders, it isn’t hard to see how recidivism becomes connected to social factors. The conceptualisation of the offender as monstrous and incurable at the level of desire conflicts with the demands that they change these desires. Orienting treatment using these binaries and norms is an impediment to the development of non-criminal sexual behaviour.

The days of chemical castration aren’t behind us, and neither is the use of the polygraph. Recent reports of the ‘success’ of polygraph trials with sex offenders here in the UK are evidence that further measures are consolidating around the criminal act of paedophilia in such a way that its perpetrators are constituted as fundamentally deviant human monsters. Our obsession with the bodies of these offenders comes at the cost of understanding their social practices and, ultimately, at the cost of actually reducing the risk of future criminal acts. We have to stop seeing sex offenders as monstrous, and stop panicking about their crimes, in order to respond to them more effectively.