Killing for a Debate

by Ben Albahari 29. July 2011 00:56

What are we achieving by pointing the finger at a mass murderer?

A 5-year-old knocks over a vase and it shatters all over the floor. Do you point your finger at the kid and shout "Look, it's the dumbest person in the room!"? Yes, Breivik is the most evil person in the room right now. Congratulations for pointing that out. So now you have no responsibility for what he did? WRONG.

In such a horrific situation, what’s hard is to point the finger at ourselves. But we are complicit.

We never gave the guy a voice when his opinions were still malleable. We didn't let him debate. I've lived in a few western countries and here’s a cultural observation. If you say something politically incorrect, don't expect a debate. Prepare to be pounced on with patronizing admonishment. For any non-mainstream belief, you have to spend an inordinate amount of time learning how to delicately position it. So slowly, through a process of social reinforcement, you learn to talk less and less about politics and religion. And what do you talk about instead? Well, a friend of mine thought it was funny that I was up to date with the latest goss. This is hardly the case, but I do try to remain vaguely aware of it, because pop culture bullshit yields 10 times the social mileage of politics. We're rewarded for being superficial. For demonstrating social sensitivity rather than substance.

“In depth analysis and theoretical studies doesn‘t exactly go hand in hand with advancing your social skills which is more related to the skills of interaction and communication; sales, entertainment and manipulation. Unfortunately, many of my friends who are masters at it are apolitical and usually end up wasting their superb social skills on manipulating women into one night stands.”

That's Breivik. A good observation and it's fucked up, but good luck having a meaningful conversation about how fucked up that is.

I was recently at a bar discussing Californication, where David Duchovny's character gets charged with statutory rape for having a one-night stand with a girl who clearly looks and acts much older than her supposed age (the actress was in her 20s, following a script that made her more articulate than most of the people I was having the debate with). The only thing resembling a 16-year-old about this character was the label. Now I couldn't even suggest that what Duchovny's character did wasn't wrong without literally getting labelled a rapist. The label was meant jokingly, but on reflection it's insidious that you’re a joke for merely debating the idea. As a natural devil’s advocate I sort of get a perverse enjoyment from being the one to be rudely escorted out of the “let’s not go there” zone back to the invariably more boring comfort zone of polite small talk. But ultimately I never really push too far. These are the implicit rules of the game. Shut up and play.

Breivik never accepted the rules. And whenever he tried to openly talk about them and the effects he believed they’d had on his family he was basically told to shut up. The tragedy of this human being is how his great qualities were squandered. He was a big picture guy. A passionate guy. A guy who thought outside the box. And yet these qualities contributed to his alienation. In his manifesto, he writes "The phase of dialogue has now ended". No. The dialogue never really began. You never let him.


The Real Factor Pitch

by Ben Albahari 15. July 2011 17:03

Yesterday someone told me, in his emphatic half-duplex communication style whereby he’s so focused on espousing his wisdom to you that he has no brain left over to detect his listener’s rolling eyes: 

“Looks aren’t important at all! All that matters is...”

Which reminded me of another guy I’d once met, a chronic exaggerator, who gave his insight on becoming a millionaire:

“All you need is one good idea! One good idea!”

That’s right. Forget what you think you know about success. The real factor for success is...

I call this persuasion tactic The Real Factor Pitch. Less obviously retarded than its retarded cousin The Magic Bullet, but in essence just another form of exaggeration. This shit is the shit that’s the shit. That other shit ain’t shit.

On the heels of The Real Factor Pitch comes the exceptional anecdote. The ugly guy with the beautiful woman. This proves that looks aren’t important! Well, no. His ugliness was a big handicap, which he compensated for by having a ton of other compelling factors. And of course maybe she’s a handicap herself. In a competitive world, low-hanging fruit don’t last for long. They’ll be eaten long before you have a chance to read about them. The Real Factor Pitch relies on your naivety in believing the fruit is right in front of you if you just open your eyes.

Typically, success requires multiple contributing factors. Including skill. Including luck. Including shit you have. Including shit you don’t have. Including shit you can change. Including shit you can’t change. Including shit we know. Including shit we don’t know. 

The least you can do for yourself is to objectively weigh up the factors that you do know.


Dodging the Dilemma

by Ben Albahari 19. April 2011 20:27

There's lots of ways to reverse arguments. You can often reverse argument fallacies too. I'm just going to pick one example: The False Dilemma. Its opposite? Dodging the Dilemma.

To recap The False Dilemma: it's when you make it appear as if there are only two options, when in fact there are more. A famous example is Bush's "You're either with us or against us". Conversely, you can deny a real dilemma exists. Here's Julia Gillard, the Prime Minister of Australia:

We can certainly work with industries to transition to a clean-energy economy and in that clean-energy economy there will be more jobs.

Whatever. She's too afraid to say her policy is not a win-win situation. That yes, the immediate effect of a climate tax is to hurt the economy. That there's no magic way around that. That there's a cost now for a benefit later. That gulp, yes, the cost: jobs. She knows the power of soundbites, and knows what a present the truth would be to the opposition if she didn't spin it. So she's dodging the dilemma. Barack Obama loves this pitch. It is, however, a standard tool in any respectable politician's toolbox. Whatever policy they want to push will have a downside that their opponents can use to frame the policy as a dilemma. The parry is to Dodge the Dilemma as follows:

My opponents are misleading you that having X means we sacrifice Y... That's a false choice... We can have both X and Y.

Dodging the Dilemma ultimately manifests as Having One's Cake and Eating It Too. Here's Obama:

We don't have to choose between a future of spiraling debt and one where we forfeit our investment in our people and our country.

Like so many real-world examples of argument fallacies, this has more than one fallacy rolled into a single sentence. You'll have to replace "our investment in" with "some advantages for" to kill the glittering strawman. Now, obviously we do have to forfeit some advantages. Whether that's low taxes or social security benefits, something HAS to suffer*. He's dodging the dilemma.

-

* Later in his speech, Obama actually says his plan can succeed by "spending reductions in the tax code", a ridiculous euphemism for raising taxes (Jon Stewart had a field day with it).


TakeOnIt Editors Discussion

by Ben Albahari 19. March 2011 09:09

This blog post is here for editors to discuss TakeOnIt.


Intuitions

by Ben Albahari 12. March 2011 19:50

In an argument, some people will appeal to their own intuition as evidence that they're right and you're wrong. Unscientific? Yes. Patronizing? Yes. Always wrong? No. Sometimes intuition beats science.

Intuition: Fast, opaque, unreliable, context-dependent.
Science: Slow, transparent, reliable, context-free.


Intuitions are fast. Formalizing a hypothesis, constructing a theory, and gathering evidence is a serious amount of work. Intuitions are rough but ready.

Intuitions are opaque. You think or even know you're right, but you have to introspect to find out why. It's believe first and ask questions later. Science works in reverse. You ask questions first, and only believe after you've got thorough answers. This makes science much better for resolving differences of opinion, since all the evidence is on the table, rather than being strewn across the uncollective subconscious.

Intuitions are unreliable. The fact that people have seriously incompatible intuitions should make us question our trust in intuition. If we are to trust our intuitions, we at least need good reasons why intuitions are reliable for us but not for others. Similarly, we've all been sure of something that turned out to be wrong. Why were we wrong? How do we know we're right now? Without a way to evaluate intuitions, we either have to turn to science for answers, or admit that our beliefs are based on raw arrogance.

Intuitions are context-dependent. We routinely use intuitions successfully in situations untested in any laboratory. Business is impossible without a good sense of intuition. Intuition can't be practically reduced to a set of formulas you can follow for guaranteed success, no matter what the blurb on a self-help book says. Socializing is also impossible without intuition, as the hapless nerd aptly demonstrates when applying science to social situations. This is not to say that context-dependent situations can't be informed by science, but that one's intuition takes the particulars of each situation into account and makes the final call.

Conversely, people often treat context-free situations as if they're context-dependent. Suppose your intuition tells you that the spooky sound coming from grandma's attic has got to be a ghost. The problem here is that the laws of physics apply in all contexts. Whatever you believe has to be reconciled with what we know about fundamental forces. You heard it? You saw it? You saw the lamp it knocked over? This means the "immaterial" ghost is interacting with the material world, so physics applies to it. We must be able to detect one. Hell, let's capture one. In such an argument, the nonfictionally challenged X-Files fan will actually try to inject context into a context-free situation. "My mama has never lied about anything..." In contrast to nerds, most people are weak in systemizing thinking. While a nerd may overly systemize when it comes to context-dependent situations, many people under-systemize when it comes to context-free situations well understood with the scientific method. This makes them vulnerable to believing in the supernatural and quackery. Often incapable of seeing any situation as context-free, they're eager to buy into the idea that truth is relative, oblivious - due to their lack of systemizing thinking - to the philosophical gravity that position entails.


Properties of Beliefs

by Ben Albahari 27. December 2010 07:34

Confidence, entanglement, objectivity, social signaling, personal impact, and altruistic impact. Those are six properties of a belief where deep points of disagreement lie - points which can easily cause a discussion to degrade into a fruitless argument if you're not aware of them. And that's key - be aware of them. You don't necessarily want to explicitly mention these points in a discussion - in fact, doing so can easily cause an argument!

Confidence

For each of you, how confident are you that you're right? And really, what is that confidence based on? What is the thought process by which each of you acquires confidence in your belief? When the thought process itself diverges too much, any disagreement you have about the issue you're arguing about is only a surface-level disagreement. The real disagreement lies with the thought process itself. If someone believes trusting their gut is a legitimate way to acquire a belief, and you're a rationalist, it's very unlikely you'll get anywhere in an argument. Abortion can be an impossible topic to argue because the confidence underlying the belief is often based on gut feelings.

Entanglement

For each of you, how entangled are your beliefs in your other beliefs? For example, a belief in God tends to be highly entangled with other beliefs. In comparison, the belief that the Atkins diet works, unless you've used it and now swear by it and run seminars on it, is probably something that you could acquire or drop without really affecting any of your other beliefs. Entangled beliefs are far harder to discuss than unentangled beliefs, and are probably pointless to discuss if the thought processes by which you and the other person acquired confidence in your beliefs differ.

Objectivity

For each of you, do you consider your belief to be universally true or just true for you? Broadly speaking, universal truths are scientific in nature, while subjective truths tend to be about ethics and aesthetics. Some beliefs entail a complex mixture of both. Now of course some people claim their ethical and aesthetic judgements are objective truths, and others claim scientific truths are subjective. Even then, the most extreme advocates of these positions will reluctantly acknowledge a "superficial" difference between such beliefs. That itself - a person's philosophical stance regarding objectivity - is key information to be aware of in an argument. If I'm arguing with someone about astrology, and they claim truth is relative - I can stop right there - we're never going to agree if I think the truth is objective and they think it's subjective.

Social Signaling

For each of you, what does your belief signal socially? Does it portray you in a certain light, or align you with a particular group? The belief that Global Warming is real aligns you with environmentalism, while the belief that Global Warming is bunk aligns you with conservatism and skepticism. "How outrageous!" you might be thinking. Whether or not you or the other person likes or agrees with being aligned with a particular group based on a belief is itself something you have to be aware of in an argument. Politicians are masters of being aware of the signals a belief sends to others.

Personal Impact

For each of you, what does your belief personally impact? Any entangled belief has a high personal impact because it permeates your world view. Beliefs that send strong signals define your identity, and also have a high personal impact. However, other beliefs might have a high personal impact in that they can still strongly affect your decisions. For example, the belief that house prices are going to crash could have a huge impact on the biggest purchase decision most of us ever make. Thinking about the personal impact a belief has can be a handy reality check on a belief you might be taking too seriously. You might say you believe in life after death, but if on reflection you’re not going to act any differently regardless of the belief, then the belief is in nearly every sense immaterial.

Altruistic Impact

For each of you, what positive impact does your belief have on the world? In real-world arguments, some people tend to focus on idealistic impacts, while others tend to focus on realistic impacts. The idealistic impacts come from considering the best-case scenario for your belief. Your vote changed the election, or your hybrid tipped Earth's environmental balance. The martyr is the hero of the idealistic impact. In contrast, realistic impacts come from considering the most likely scenario for your belief. Perhaps your vote made no difference, and perhaps your hybrid, even if it did have a small positive effect on the environment, was a foolish use of your money if you really wanted to help the environment. The economist is the hero of the realistic impact. Idealists focus on stories; realists on spreadsheets. Idealists want to revolutionize politics; realists accept realpolitik as the name of the game.


Global Warming

by Ben Albahari 19. November 2010 13:03

Ask most climatologists, and they will tell you Global Warming is real and caused by humans. In other words, the consensus of climatologists is clear. Now, while one can somewhat glibly declare that the consensus is irrelevant, and demand clear-cut scientific tests a layperson can personally understand before accepting a hypothesis, it is not rational to do so. When a plethora of tests are required to test a hypothesis, and domain expertise is required to understand and synthesize those tests to assess the likelihood of that hypothesis, the rational thing to do is to outsource the assessment of that hypothesis to domain experts. This reasoning is described in more detail here.

When a scientific consensus exists but clear-cut tests do not, the matter is not settled, but the onus shifts to the contrarians to present a consistent set of claims that refute the prevailing view. If they fail to do so, a rational layperson should side with the consensus. At the top-right of this page you can see a breakdown of the major contentions of this issue, phrased as yes-no questions. I found these questions to be the most revealing:

Are the causes of climate change well understood?
Do negative feedback loops mostly cushion the effect of atmospheric CO2 increases?
Does cosmic radiation significantly affect earth's climate?

Through examining these arguments, I've come to the belief that the claims of climate skeptics are rife with inconsistency. Perhaps the most egregious inconsistency is claiming Earth's climate is too complex to understand, while simultaneously claiming that climate change is very likely caused by nature. Such inconsistencies are exactly what you'd expect from contrarians whose skeptical thought passes through a polarized filter. These inconsistent claims are promoted by leading climate skeptics with impressive scientific credentials such as S. Fred Singer and Roy Spencer. S. Fred Singer is a particularly important figure because he heads the NIPCC (Nongovernmental International Panel on Climate Change), a coalition very representative of the skeptical viewpoint.

I believe scores of bright climatologists with idealistic truth-seeking tendencies that override their fear of peer-suicide have had ample opportunity to provide convincing refutations of AGW. Given the amount of time I've spent so far seeking such refutations, I'm increasingly doubtful that any solid objections exist. (To address concerns that I may have conducted a motivated search, I actually started out leaning towards the skeptical side.) Given the consensus and the lack of coherent objections to that consensus, I believe AGW is likely to be true. However, I leave the door open to the possibility that AGW is false so long as we await clear-cut tests.

...

Expert opinions here.


Consensus Counts

by Ben Albahari 19. November 2010 13:00

When a hypothesis is supported by clear-cut tests, there is no need for consensus. Consensus is, however, useful when a plethora of tests are required to test the hypothesis, and domain expertise is required to understand and synthesize those tests to assess the likelihood of that hypothesis.

Hypotheses vary in how easy they are to test. The General Theory of Relativity, while a difficult theory to understand, has hypotheses with clear-cut tests that a layperson can easily grasp. Will a clock on a plane have drifted slightly from one on the ground by the end of its trip? Will the position of a planet be correctly predicted? If the theory predicts the outcome of these tests, a layperson has a straightforward reason to believe in that theory, even if they have little grasp of the complexity of the theory itself. In comparison, AGW is much harder to test. The central theory, the greenhouse effect, is actually remarkably simple, as is the hypothesis it suggests: that CO2 warms the planet. But testing that hypothesis in the real world requires synthesizing data from a plethora of tests, and you can't really do this without being pretty familiar with climate science.

I believe the complexity of testing the AGW hypothesis is what ultimately troubles earnest climate skeptics. If they can't understand the tests, how can they feel comfortable signing off on the hypothesis? The strategy of only believing in hypotheses that have been demonstrated with simple tests is not actually a bad rule of thumb - in fact, I wish more people used it! It will correctly lead you to believe in General Relativity and the Theory of Evolution, while rejecting homeopathy and astrology. The problem is that it's an overly aggressive "truth-filtering algorithm". It will cause people to reject theories that they would otherwise accept if they could understand the science behind the tests. It's like using a skeptical axe rather than a skeptical scalpel.

In the case one doesn't personally understand the tests for a hypothesis, the rational thing to do is to outsource the assessment of that hypothesis to the experts. If an expert consensus exists, our task is easier - they're probably right, and the dissent is most likely from deluded or deliberate noise-makers with claims of conspiracies and ignored evidence. However, we shouldn't dismiss such claims - contrarians are occasionally right. A fast screening process you can use on a contrarian, without spending your precious time delving into their arguments, is to see if they've made radical claims elsewhere. For example, Roy Spencer, a climate skeptic, is also an evolution skeptic, claiming that Intelligent Design is "no more religious, and no less scientific, than evolutionism". The more rigorous approach, however, is to examine specific claims challenging the consensus view for consistency.

Contrarians are usually "contrary for the sake of being contrary". Rather than disagreeing on a few specific points, they tend to disagree with everything they can about the consensus view. This inevitably leads to inconsistent claims. If the consensus view claims A and B, the contrarian will often claim NOT A and NOT B. But is it consistent to believe both NOT A AND NOT B? For example, climatologists believe A: That we know the cause of warming, and B: That the cause is humans. Skeptics often believe both NOT A: That we don't know the cause of warming, and NOT B: That the cause is very likely natural. But NOT A AND NOT B are inconsistent! To claim the cause is very likely natural is to claim you know the cause, which is exactly what NOT A denies.

In the absence of consistent contrarians, the rational thing to do is to trust the consensus. While a broad consensus has the negative effect of shunning dissent, it also has the positive effect of raising the number of people eyeballing the science, and raising the potential glory for the eagle-eyed maverick. If a consistent alternative to the consensus view can be made, it will readily surface.

...

Expert opinions here.


Truth is Not Relative

by Ben Albahari 19. November 2010 12:55

I was recently arguing with a good friend of mine, who happens to believe in alien encounters, psychic powers, reincarnation etc. The argument made a cold, sharp transition from a healthy discussion when I failed to suppress a smirk when she described what an alien looked like, and ended when she eventually resorted to the relativity of truth with regard to the debate. "No!" I cried, as if contradicting someone needs the adrenaline shot of an exclamation mark. "You can't say that it's true for you that aliens visited Earth, and it's not true for me! It's either true or it isn't. There are multiple beliefs on the matter, but only one truth".

Do you really believe truth is relative? Let's say I tell everyone I can that you eat kittens. I put posters up on the streets with photos of you putting kittens in an oven, I talk on radio shows about how delicious you think kitten sausages are, and I launch a website "savethekittens.org" etc.

Now, let's say two bystanders end up debating whether it's really true that you eat kittens. One of them insists that you are innocent, but the other really believes you're guilty, and after a heated argument declares "truth is relative" and "it's true for you that they don't eat kittens". Does that make sense?

Of course not. You do not eat kittens. Truth is not relative. Belief is relative. Do not confuse truth with belief. If you do not eat kittens, you must drop the belief that truth is relative.

But now let's suppose the debate between the bystanders changes to whether kittens are cute. Once again, the argument ends with one declaring "truth is relative" and "it's true for you that kittens are cute". In this case, you have to distinguish between subjective and objective statements. Subjective statements are as much about the observer as they are about the object, while objective statements are all about the object. To go back to the previous argument, whether you eat kittens is not dependent on anyone's judgement. In comparison, a kitten's cuteness is dependent on a person's judgement. If you can't distinguish between these different types of statements, your take on reality is muddled.

That aliens visited Earth is an objective statement (as are claims of reincarnation and psychic powers). It's in the same category as the statement that you eat kittens. It's not in the same category as the statement "kittens are cute". As a general rule, scientific statements are objective and impersonal, while aesthetic and ethical statements are subjective and emotional. Many statements, particularly social ones, are a complex mixture of both. Unless distinguishing between objective and subjective statements becomes a habit - a skill that's regularly sharpened - you're doomed to eat kittens. The cute ones too.

...

Expert opinions on this issue are here.


The (Endless) Appeal To Research Fallacy

by Ben Albahari 11. August 2010 13:54

I don't know if this fallacy has been named - let me know if it has.

It's the one where someone claims that your belief has little weight because you haven't researched the issue as much as they have.

For example, someone claims aliens have visited Earth. No matter what arguments you give, they will say: "But have you considered this piece of evidence? No? Oh really? Well you haven't really researched it enough. You need to read this book."

Every conspiracy theorist and alternative therapist loves this fallacy, because it insidiously shifts the burden of proof to their opponents. People who support a position will almost always spend far more time investigating that position than those who refute it. This asymmetry is the fallacy's ammunition. The homeopath has spent his life devoted to homeopathy, while the skeptic has merely written a book. The girl visiting the homeopath has read a book on homeopathy, while the dude has merely read an article on it.

This fallacy, when taken to its logical conclusion, requires that you be open-minded to any position in the case that someone has spent more time researching it than you. Purely in terms of time management, it makes the shtick of a skeptic untenable.

The fallacy will also endlessly appeal to research - there's no end to new evidence and new arguments coming from the believers - see the Bottomless Well of Bullshit.

The flaw in the fallacy is the assumption that a position cannot be easily refuted, and that the skeptics don't have good reasons that negate the need to explore the issue further. Suppose I tell you that gnomes are living in your basement. You tell me you don't have a basement. I say that's what you think, and that you need to reserve judgement until you've read "Gnomes in our Homes". Is there anything in that book that could change your mind? Of course not!

The sad truth of the matter is that many people have heavily invested in an easily refuted belief. When you point out the easy refutation, they get defensive, and say you need to read more. As Leo Tolstoy said:

"I know that most men — not only those considered clever, but even those who are very clever and capable of understanding most difficult scientific, mathematical, or philosophic, problems — can seldom discern even the simplest and most obvious truth if it be such as obliges them to admit the falsity of conclusions they have formed, perhaps with much difficulty — conclusions of which they are proud, which they have taught to others, and on which they have built their lives."

People usually pitch an Appeal To Research after their woolly explanation of their belief has failed to infect your brain. They might even say something like "Well I'm not an expert on this, but this guy is, and once you read his book, you will be convinced" (see also The Convert Pitch, The Testimonial Pitch). And if you don't look eager to read their book, they'll follow up with The Closed Minded Pitch. The hypocrisy! As if you should have to do work to make up for their woolliness. The onus is on them to read your books, not on you to read their books. Mirror them. If in the mood, tease them with their hypocritical reluctance to read your books on science and skepticism.
