Many moons ago, I did my first degree in philosophy, at Aberdeen University, in the northeast of Scotland. I remember that one of the first things our lecturers explained, very wisely, was that in philosophy we should always criticise the theory and not the person. In undergraduate philosophy tutorials, especially in debates about applied philosophy, we had to discuss contentious issues like abortion, animal rights, and nuclear weapons. We were expected to do so dispassionately, with philosophical objectivity, and without taking offence or attacking other people, even if we would have been shocked to hear the same views expressed in ordinary life.
There’s no other way to do philosophy. If we want to think rationally ourselves, we have to focus on the evidence for and against what people say, and forgo criticism of the other person’s character. Attacking the person who states a theory is a well-known fallacy, traditionally called the argumentum ad hominem. There are many good reasons for avoiding ad hominem attacks.
- It’s fallacious reasoning. Criticising the character or actions of someone who holds a theory tells you absolutely nothing about the validity of the theory. Even the world’s stupidest people have good ideas. Sometimes bad people say the right things, albeit for the wrong reasons. Even a stopped clock is right twice a day. If Hitler said that one plus one equals two, for example, that wouldn’t make it any less true.
- It’s just not good manners in a philosophical or rational debate. Especially today, on social media, good etiquette would be to express disagreement dispassionately, without taking offence or offending other people by attacking their character.
- It’s what I call a “conversation killer” because it prevents rational discourse from continuing. So it’s really very unphilosophical. Wise people don’t kill conversations by derailing them with ad hominems. They try to evaluate what other people say objectively and respond reasonably and politely.
- It’s usually very presumptuous and tends to involve the fallacy of “mind-reading”. You don’t really know what the motivations of a stranger on the Internet are, so jumping to conclusions about what they’re thinking, rather than focusing on the validity of what they’ve actually said, is really not a good idea. When we jump to conclusions about other people’s reasons for saying something, I tend to find it says more about our own attitudes than about the other person. There’s some truth in the Freudian-Jungian concept of unconscious “projection”.
- We should be intellectually humble enough to always remember that the other person might actually turn out to have been right all along. Think of all the ad hominem attacks against Charles Darwin that portrayed him as a foolish moral degenerate, and the cartoons depicting him as a monkey – the real fools were the people dismissing what he said. Criticising the other person’s character potentially stops us from realising that what initially seemed false or stupid was actually correct. Put bluntly, using ad hominems risks making you more stupid.
This is one of my favourite anecdotes about Zeno of Citium, the founder of Stoicism… One day, Zeno came across an arrogant young intellectual who was discoursing loudly about the philosopher Antisthenes. (Antisthenes was one of Socrates’ closest friends, and greatly admired by him; his writings were held in very high regard in ancient Greece but none survive today.) A small crowd had gathered around the young man and he was showing off by doing a hatchet job on Antisthenes, denouncing what he perceived as the shortcomings of his philosophy. Zeno interrupted him and asked what he’d learned from Antisthenes that was of value, about wisdom or virtue. The young man said “nothing”. The story goes that Zeno told him he should be ashamed, therefore, to have spent so much time and energy picking over the flaws in a philosopher’s writings without first being able to identify what was actually of value in them. In Zeno’s day, the Platonic Academy became dominated by Skeptics who were adept at nit-picking flaws in any philosophical theory. The Stoics felt these people risked turning philosophy into nothing but clever wordplay and losing sight of any ideas that are actually of value.
This is similar (but not identical) to what philosophers today call the Principle of Charity. The Principle of Charity involves giving other people the benefit of the doubt, assuming they’re not stupid, and interpreting their statements in the most charitable way possible within the debate. There’s always some ambiguity about what other people mean, especially on social media. So if we’re not sure, it’s good etiquette to lean toward the most generous interpretation, i.e., not to assume the worst, but to see what others say in the most rational light. That would entail, for example, not “mind-reading” others and risking falsely attributing dishonest or stupid motives to them.
The philosopher Bertrand Russell, likewise, once said that the ideal way to study another philosopher consists of two distinct stages. In the first stage, we should be as sympathetic as possible toward their theories, and perhaps even try to find additional reasons to support them. We should empathise with their position and try to really understand them as deeply as possible. Once we’ve done that enough, we should enter into the second stage, of criticism, and adopt a more hostile position, in which we identify as many flaws as possible in their theory. It’s premature to criticise a theory until we’ve attempted to fully understand it. Many philosophers waste time and energy expounding lengthy criticisms of other philosophers that, on close inspection, just show they didn’t fully understand their theories to begin with. To some extent misunderstanding is inevitable. Scholars believe that even Aristotle failed to fully appreciate his master Plato’s teachings, despite having been his most prominent student for many years. On the other hand, at the extreme end of the scale, it’s not unusual to find people who publish long-winded criticisms of books they’ve obviously not read!
There is one exceptional circumstance, nevertheless, where I feel ad hominem criticisms may be legitimate. When I trained psychotherapists, I often found that people were very strongly invested in particular schools of thought in which they had previously been trained. There are now hundreds of competing psychotherapeutic theories, and they all say different things. As Arnold Lazarus, one of the pioneers of behaviour therapy, once put it: they can’t all be right, but they can all be wrong.
When I first began studying psychotherapy there were still many therapists deeply invested in Freudian theory. They believed in things like the primacy of the Oedipus Complex, even though no evidence supported this theory. Psychodynamic therapists believed that their form of therapy was the only effective form of therapy, even though countless research studies provided evidence conflicting with this claim. (Actually, it very often seems to be one of the least effective forms of therapy.) When I pressed these therapists for the reason they believed these things in the face of conflicting evidence, they’d often say something along the lines of: “Freud is widely regarded to be a great psychologist.” Likewise, in the 1990s, Neurolinguistic Programming (NLP) was reaching the peak of its popularity as a fad. There are also many studies on NLP, which overall show that it is ineffective and certainly does not produce the dramatic results its proponents claim. When I pressed NLP practitioners for the reason they believed in this approach, much like the Freudians (whom they hated!), after a lot of prevarication, they would say something like: “Because Bandler’s research shows that it works.” Richard Bandler, however, never conducted any scientific research into the theories and techniques he developed. He just published books on them, and trained others to use them, without testing them in clinical trials, etc.
Now, someone who holds pseudoscientific theories will very often attempt to support them by appealing to the perceived authority of the person who developed them. That’s obviously another fallacy: the appeal to (perceived) authority, or argumentum ad verecundiam. You can try pointing out to them that it’s a fallacy, but that often does nothing to dissuade them. In those cases, if their rationale for holding something to be true is based purely on the character and credentials of someone else, I think it’s legitimate to question whether that’s good evidence. Doing so may involve questioning the scruples or expertise of the person they’re citing, i.e., questioning their authority. For example, Freud is certainly famous. However, he is not highly regarded today as an expert on psychology or psychotherapy. In fact, Freud conducted no research whatsoever on psychotherapy and treated only a very small number of psychotherapy clients – perhaps fewer than one hundred in his lifetime, whereas most modern therapists treat thousands. Bandler, likewise, is qualified neither as a psychotherapist nor as a psychologist, and has published no scientific research in support of NLP. His books have been shown to base their arguments on basic scientific errors about neuropsychology.
Now, none of those observations necessarily means that psychoanalysis or NLP is wrong. They merely throw into question the reliability of the people behind them. However, in the exceptional case mentioned above, where an individual cites the perceived authority of Freud or Bandler as their sole reason for believing something, I think it’s valid to use something resembling an ad hominem argument. In that case, though, rather than attacking the character of the speaker, you’d be questioning whether someone they cite as an authority actually has the expertise and reliability being attributed to them. Even so, this is a last resort, because ideally your interlocutor should realise that such an appeal to authority is a fallacy to begin with. It’s especially foolish to use such appeals as a reason to discount scientific evidence that points in a contrary direction. Unfortunately, it’s still very common for people to think this way.