Insights Michael Baker

Debater-bot impresses but we outsource our critical faculties at our peril

Does IBM’s Project Debater herald a new approach to debate, where the influence of emotion, bias and ambiguity is limited?

IBM’s Project Debater was more convincing than its human opponent, Dan Zafrir

If decision-making has just become too much to cope with, there is hope on the horizon. Computer giant IBM unveiled their remarkable “Project Debater” artificial intelligence technology this week, which can crunch the internet’s vast repositories of data to make persuasive arguments that beat a human opponent in live debate.

Impressively, the machine could listen to its opponent’s argument and quickly craft a rebuttal.

IBM says the technology could sit in corporate boardrooms and police operations centres to help people make better decisions and limit the influence of emotion, bias and ambiguity.

Does this herald an end to the art of political persuasion, or to the carefully crafted advertising and PR campaigns designed to appeal to our emotions?

In a world of polarised identity politics, Brexit and Trump, where we are faced with vast quantities of information, it is comforting to think that we might be able to circumvent our emotions when making a decision. Nudged to the “correct” conclusion by a machine, we could avoid the anxiety of thinking about complex or even routine decisions altogether.

But that undermines the value, and the values, that emotions bring to our decision-making. What’s more, delegating to machines our capacity to think, scrutinise and interrogate would deprive us of our most valuable faculties in the information age.

Human decision-making is the product of what we know and what we feel, and the premise that emotions are detrimental to it is not, ironically, supported by science. Studies show that people who suffer damage to the part of the brain responsible for emotion are hopeless at making decisions. Far from becoming the Vulcan-like masters of logic seen in Star Trek, without emotion our ability to act is impaired. Taking emotion out of decision-making is like stripping the compass from a ship’s navigation systems.

A potential evolution for IBM’s Project Debater would be the courtroom, where a machine could coolly weigh the evidence and come to a verdict more quickly than 12 human jurors. But decisions based purely on logic, divorced from human values and compassion, may be “wrong” even when rational. The arguments are well-rehearsed among those pioneering autonomous vehicles: faced with a situation in which five people are certain to die unless a vehicle swerves into one person nearby, the rational course of action is to swerve for the greater good. But research shows that, for most people, the idea of abiding by such rational rules and throwing someone under a bus is distinctly uncomfortable. People who do think it the right course of action are – understandably – less trusted by the rest of the community.

Debate is not, then, a simple swap of facts between parties; it is about taking a position on important moral issues. In that context, our emotions help. Indeed, just because someone argues passionately does not mean they are irrational.

Even if machines can help us to process information, it is vital that we retain the cognitive capacity to question the output. Project Debater’s arguments are a product of the source information it draws from. While IBM says that the machine can check its sources – largely millions of news articles – for credibility and accuracy, those same sources carry the same emotion, bias and ambiguity that have been written into human knowledge over millennia.

We are undoubtedly in a new age of communication in which facts are subsumed by appeals to our emotions. But these emotions are what set us apart from robots in the first place. While IBM’s technological progress is admirable, we outsource our critical faculties at our peril. Our future is too important to simply switch ourselves off and leave for the robots to decide.