思含 黃湘淇
We are easy to fool because we are too lazy. That may be an oversimplification, and rather too cynical. But look at the rumours that have recently swept through our social media feeds: many defy common sense and collapse under the slightest scrutiny, yet they spread unchecked, like wildfire. Why? Because we cannot be bothered to question information that comes from people we know, and even after a rumour has been debunked, we are too lazy to change our minds. A rumour may sound alarmist yet seem harmless enough, but it works like a slow poison, distorting facts, beguiling minds and doing endless damage, so it is something we must guard against.
If you ever need proof of human gullibility, cast your mind back to the attack of the flesh-eating bananas.2 In January 2000, a series of chain emails began reporting that imported bananas were infecting people with “necrotizing fasciitis”—a rare disease in which the skin erupts into livid purple boils before disintegrating and peeling away from muscle and bone.3
According to the email chain, the FDA was trying to cover up the epidemic to avoid panic.4 Faced with the threat, readers were encouraged to spread the word to their friends and family.
The threat was pure nonsense5, of course. But by 28 January, the concern was great enough for the US Centers for Disease Control and Prevention to issue a statement decrying the rumour.6
Did it help? Did it heck.7 Rather than quelling the rumour, they had only poured fuel on its flames.8 Within weeks, the CDC was hearing from so many distressed9 callers it had to set up a banana hotline. The facts became so distorted10 that people eventually started to quote the CDC as the source of the rumour. Even today, new variants of the myth have occasionally reignited those old fears.11 The banana apocalypse may seem comical in hindsight, but the same cracks in our rational thinking can have serious,12 even dangerous, consequences.
Why do so many false beliefs persist in the face of hard evidence? And why do attempts to deny them only add grist to the rumour mill13? It's not a question of intelligence—even Nobel Prize winners have fallen for some bizarre and baseless theories.14 But a series of recent psychological advances may offer some answers, showing how easy it is to construct a rumour that bypasses the brain's deception filters.15
One, somewhat humbling, explanation is that we are all “cognitive misers”—to save time and energy, our brains use intuition rather than analysis.16
As a simple example, quickly answer the following questions:
“How many animals of each kind did Moses take on the Ark?”17
“Margaret Thatcher was the president of what country?”18
Between 10% and 50% of study participants presented with these questions fail to notice that it was Noah, not Moses, who built the Ark, and that Margaret Thatcher was the prime minister, not the president—even when they have been explicitly asked to note inaccuracies.19
Known as the “Moses illusion”, this absentmindedness illustrates just how easily we miss the details of a statement, favouring the general gist in place of the specifics.20 Instead, we normally just judge whether it “feels” right or wrong before accepting or rejecting its message.
Based on the research to date, Eryn Newman at the University of Southern California suggests our gut reactions swivel around just five simple questions:21
Does a fact come from a credible22 source?
Do others believe it?
Is there plenty of evidence to support it?
Is it compatible with23 what I believe?
Does it tell a good story?
Crucially, our responses to each of these points can be swayed by frivolous, extraneous details that have nothing to do with the truth.24
Consider the questions of whether others believe a statement or not, and whether the source is credible. We tend to trust people who are familiar to us, meaning that the more we see a talking head, the more we will begrudgingly start to believe what they say.25 “The fact that they aren't an expert won't even come into our judgment of the truth,” says Newman. What's more, we fail to keep count of the number of people supporting a view; when that talking head repeats their idea on endless news programmes, it creates the illusion that the opinion is more popular and pervasive than it really is.26 Again, the result is that we tend to accept it as the truth.
Sticky nuggets27
Then there's the “cognitive fluency” of a statement—essentially, whether it tells a good, coherent story that is simple to imagine.28 “If something feels smooth and easy to process, then our default is to expect things to be true,”29 says Newman. This is particularly true if a myth easily fits with our expectations. “It has to be sticky—a nugget or soundbite that links to what you know, and reaffirms your beliefs,”30 agrees Stephan Lewandowsky at the University of Bristol in the UK.
In light of these discoveries, you can begin to understand why the fear of the flesh-eating bananas was so infectious.31 For one thing, the chain emails were coming from people you inherently32 trust—your friends—increasing the credibility of the claim, and making it appear more popular. The concept itself was vivid and easy to picture—it had high cognitive fluency. If you happened to distrust the FDA and the government, the thought of a cover-up would have fitted neatly33 into your worldview.
That can also help explain why those attempts to correct a myth have backfired so spectacularly, as the CDC found to their cost.34 Lab experiments confirm that offering counter-evidence only strengthens someone's conviction.35 “In as little as 30 minutes, you can see a bounce-back effect36 where people are even more likely to believe the statement is true,” says Newman.
Fraying37 beliefs
As a result of these frailties, we are instantly drawn to the juicier details of a story—the original myth—while forgetting the piddling little fact that it's been proven false.38 Worse still, by repeating the original myth, the correction will have increased the familiarity of the claim—and as we've seen, familiarity breeds believability.39 Rather than uprooting40 the myth, the well-intentioned correction has only pushed it deeper.
A debunked41 myth may also leave an uncomfortable gap in the mind. Lewandowsky explains that our beliefs are embedded in our “mental models” of the way the world works; each idea is interlinked with our other views.42 It's a little like a tightly bound43 book: once you tear out one page, the others may begin to fray as well. “You end up with a black hole in your mental representation44, and people don't like it.” To avoid that discomfort, we would often rather cling to the myth before our whole belief system starts unravelling.45
Fortunately, there are more effective ways to set people straight and make the truth stick.46 For a start, you should avoid repeating the original story (where possible) and try to come up with a whole alternative to patch up the tear in their mental model.47 For instance, when considering the fears that MMR vaccines may be linked to autism, it would be better to build a narrative around the scientific fraud that gave rise to the fears—rather than the typical “myth-busting” article that unwittingly reinforces the misinformation.48 Whatever story you choose, you need to increase the cognitive fluency with clear language, pictures, and good presentation. And repeating the message, a little but often, will help to keep it fresh in their minds. Soon, it begins to feel as familiar and comfortable as the erroneous49 myth—and the tide of opinion should begin to turn.
At the very least, staying conscious of these flaws in your thinking will help you to identify when you may be being deceived. It's always worth asking whether you have thought carefully about the things you are reading and hearing. Or are you just persuaded by biased50 feelings rather than facts? Some of your dearest opinions may have no more substance than the great banana hoax of the year 2000.51
1. gullible: credulous, easily deceived; gullibility, which appears below, is its noun form.
2. cast one's mind back: to think back, to recall; flesh-eating: that devours (human) flesh.
3. infect sb. with: to cause sb. to catch (a disease); necrotizing fasciitis: a rare infection that destroys the tissue beneath the skin; erupt into: (of boils, rashes, etc.) to break out suddenly on the skin; boil: an inflamed, pus-filled swelling; disintegrate: to break apart; peel away: to come off in strips or layers.
4. FDA: the Food and Drug Administration, the US agency that regulates food and medicines; epidemic: a widespread outbreak of infectious disease.
5. nonsense: absurd, baseless talk.
6. US Centers for Disease Control and Prevention: the US national public-health agency, abbreviated below as CDC; issue a statement: to release an official statement; decry: to denounce, to condemn.
7. Did it heck: an emphatic, colloquial British way of saying that it most certainly did not.
8. quell: to suppress, to put an end to; pour fuel on its flames: to make a bad situation worse, to add fuel to the fire.
9. distressed: anxious, upset.
10. distorted: twisted, misrepresented.
11. variant: an altered version; myth: here, a rumour, a false story; reignite: to stir up again.
12. apocalypse: a catastrophe, the end of the world; comical: funny, absurd; in hindsight: looking back afterwards; crack: a flaw, a fissure; rational thinking: logical reasoning.
13. add grist to the mill: literally, to supply a mill with more grain to grind; here a figure of speech for giving the rumour more to feed on, much like adding fuel to the fire.
14. fall for: to be taken in by, to believe (sth. false); bizarre: strange, outlandish; baseless: without any foundation.
15. bypass: to get around, to evade.
16. humbling: making one feel less proud of oneself, chastening; cognitive: relating to mental processes; miser: a person who hoards money, a penny-pincher; intuition: instinctive feeling.
17. Moses: the leader of the Israelites in the 13th century BC, as recorded in the Bible; the Ark: in the Book of Genesis, the great ship that Noah built at God's command so that he, his family and the land animals of the world could survive the Flood.
18. Margaret Thatcher: (1925–2013), the 49th Prime Minister of the United Kingdom.
19. explicitly: clearly and unambiguously; note: to take notice of; inaccuracy: an error, a point that is not accurate.
20. Moses illusion: also known as the “semantic illusion”, the failure of a listener or reader to notice an inaccuracy or inconsistency in a text; absentmindedness: inattention; illustrate: to show, to demonstrate; gist: the main point, the essence; in place of: instead of; specific: a particular detail.
21. to date: up to now; gut reaction: an instinctive response; swivel around: to turn on, to revolve around.
22. credible: believable, trustworthy.
23. compatible with: consistent with.
24. crucially: most importantly; be swayed by: to be influenced by; frivolous: trivial, not serious; extraneous: irrelevant, unrelated.
25. tend to: to be inclined to; talking head: a person shown speaking on television, a pundit; begrudgingly: reluctantly, grudgingly.
26. keep count of: to keep track of the number of; pervasive: widespread, prevalent.
27. sticky: memorable, tending to stick in the mind; nugget: a small piece of valuable information.
28. cognitive fluency: the ease with which information can be processed; essentially: in essence; coherent: logically connected, consistent.
29. smooth: effortless, easy to process; default: the standard, automatic assumption.
30. soundbite: a short, quotable extract from a speech or interview; reaffirm: to confirm again.
31. in light of: considering, given; infectious: spreading easily.
32. inherently: by nature, intrinsically.
33. neatly: exactly, snugly.
34. backfire: to produce the opposite of the intended effect; spectacularly: dramatically, strikingly; to one's cost: as one learns from bitter experience.
35. counter-evidence: evidence against a claim; conviction: a firmly held belief.
36. bounce-back effect: a rebound effect.
37. fraying: wearing thin, unravelling at the edges.
38. frailty: a weakness; juicier: comparative of juicy, more vivid, more entertaining; piddling: trivial, insignificant.
39. worse still: what is worse; familiarity: the state of being familiar; breed: to give rise to, to produce; believability: credibility.
40. uproot: to pull up by the roots, to eradicate.
41. debunked: exposed as false.
42. be embedded in: to be fixed firmly within; mental model: an internal picture of how the world works; be interlinked with: to be connected with.
43. bound: (of a book) fastened together, stitched or glued.
44. mental representation: an internal mental image or model of something.
45. cling to: to hold on to, to refuse to give up; unravel: to come apart, to fall to pieces.
46. set sb. straight: to correct sb.'s mistaken ideas; make sth. stick: to make sth. take hold and last.
47. alternative: a substitute, another option; patch up the tear: to mend the rip.
48. MMR vaccines: combined vaccines against measles, mumps and rubella; autism: a developmental disorder affecting communication and social interaction; narrative: a story, an account; scientific fraud: the falsification of research; give rise to: to cause; myth-busting: aimed at exposing false beliefs; unwittingly: without realizing it; reinforce: to strengthen; misinformation: false or misleading information.
49. erroneous: wrong, incorrect.
50. biased: prejudiced, one-sided.
51. dear: cherished, prized; substance: real content, solid basis; hoax: a deliberate deception, a practical joke.