Two recent magazine articles of linguistic interest: from the Atlantic issue for September 2018, “Your Lying Mind” by Ben Yagoda, about cognitive biases; and, in the New Yorker’s 9/3/18 issue, “The Mystery of People Who Speak Dozens of Languages: What can hyperpolyglots teach the rest of us?” (on-line title; “Maltese for Beginners” in print) by Judith Thurman.
Yagoda on cognitive biases.
(#1)
The Cognitive Biases Tricking Your Brain
Science suggests we’re hardwired to delude ourselves. Can we do anything about it?
Are we hardwired to delude ourselves? Those who study cognitive bias seem to think so. They disagree on whether we can do much about it.
When people hear the word bias, many if not most will think of either racial prejudice or news organizations that slant their coverage to favor one political position over another. Present bias, by contrast, is an example of cognitive bias—the collection of faulty ways of thinking that is apparently hardwired into the human brain. The collection is large. Wikipedia’s “List of cognitive biases” contains 185 entries, from actor-observer bias (“the tendency for explanations of other individuals’ behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation … and for explanations of one’s own behaviors to do the opposite”) to the Zeigarnik effect (“uncompleted or interrupted tasks are remembered better than completed ones”).
Some of the 185 are dubious or trivial. The ikea effect, for instance, is defined as “the tendency for people to place a disproportionately high value on objects that they partially assembled themselves.” And others closely resemble one another to the point of redundancy. But a solid group of 100 or so biases has been repeatedly shown to exist, and can make a hash of our lives.
The gambler’s fallacy makes us absolutely certain that, if a coin has landed heads up five times in a row, it’s more likely to land tails up the sixth time. In fact, the odds are still 50-50. Optimism bias leads us to consistently underestimate the costs and the duration of basically every project we undertake. Availability bias makes us think that, say, traveling by plane is more dangerous than traveling by car. (Images of plane crashes are more vivid and dramatic in our memory and imagination, and hence more available to our consciousness.)
The anchoring effect is our tendency to rely too heavily on the first piece of information offered, particularly if that information is presented in numeric form, when making decisions, estimates, or predictions. This is the reason negotiators start with a number that is deliberately too low or too high: They know that number will “anchor” the subsequent dealings. A striking illustration of anchoring is an experiment in which participants observed a roulette-style wheel that stopped on either 10 or 65, then were asked to guess what percentage of United Nations countries is African. The ones who saw the wheel stop on 10 guessed 25 percent, on average; the ones who saw the wheel stop on 65 guessed 45 percent. (The correct percentage at the time of the experiment was about 28 percent.)
The effects of biases do not play out just on an individual level. Last year, President Donald Trump decided to send more troops to Afghanistan, and thereby walked right into the sunk-cost fallacy. He said, “Our nation must seek an honorable and enduring outcome worthy of the tremendous sacrifices that have been made, especially the sacrifices of lives.” Sunk-cost thinking tells us to stick with a bad investment because of the money we have already lost on it; to finish an unappetizing restaurant meal because, after all, we’re paying for it; to prosecute an unwinnable war because of the investment of blood and treasure. In all cases, this way of thinking is rubbish.
If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view. Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
Confirmation bias plays out in lots of other circumstances, sometimes with terrible consequences. To quote the 2005 report to the president on the lead-up to the Iraq War: “When confronted with evidence that indicated Iraq did not have [weapons of mass destruction], analysts tended to discount such information. Rather than weighing the evidence independently, analysts accepted information that fit the prevailing theory and rejected information that contradicted it.”
Details in the article. The main story contrasts two camps of researchers: the team of Amos Tversky and Daniel Kahneman (pessimistic about our ability to learn to overcome the biases), and Richard E. Nisbett (more optimistic).
On the Tversky-Kahneman collaboration, on this blog: the 12/4/11 posting “Collaborations”. And in much greater detail, Michael Lewis’s book; from Wikipedia:
The Undoing Project: A Friendship That Changed Our Minds is a 2016 nonfiction book by American author Michael Lewis, published by W.W. Norton. The Undoing Project explores the close partnership of Israeli psychologists Daniel Kahneman and Amos Tversky, whose work on heuristics in judgment and decision-making demonstrated common errors of the human psyche, and how that partnership eventually broke apart.
The connection to linguistics is through what I’ve called illusions about variation in language use: especially Recency, Frequency, Out-Group, In-Group, Antiquity, and Local Color. Cognitive biases are driving forces in these illusions. (For an inventory of my postings on the illusions, see the Page on this blog.)
Thurman on hyperpolyglots.
Thurman’s piece follows one hyperpolyglot, Luis Miguel Rojas-Berscia (with 22 languages under his belt so far), as he tackles Maltese (and as Thurman herself attempts to learn the language); it also takes in other hyperpolyglots he leads her to, and the researcher Evelina Fedorenko. From the article:
The word “hyperpolyglot” was coined two decades ago, by a British linguist, Richard Hudson, who was launching an Internet search for the world’s greatest language learner. [The accepted threshold for hyperpolyglot status is eleven languages.]
… Richard Hudson’s casual search for the ultimate hyperpolyglot was inconclusive, but it led him to an American journalist, Michael Erard, who had embarked on the same quest more methodically. Erard, who has a doctorate in English, spent six years reading the scientific literature and debriefing its authors, visiting archives (including Mezzofanti’s, in Bologna), and tracking down every living language prodigy he had heard from or about. It was his online survey, conducted in 2009, that generated the first systematic overview of linguistic virtuosity. Some four hundred respondents provided information about their gender and their orientation, among other personal details, including their I.Q.s (which were above average). Nearly half spoke at least seven languages, and seventeen qualified as hyperpolyglots. The distillation of this research, “Babel No More,” published in 2012, is an essential reference book — in its way, an ethnography of what Erard calls a “neural tribe.”
Two postings on this blog about Erard’s book, primarily about selecting a title for the UK edition (the British publisher didn’t like the American title, Babel No More): on 2/5/13, “Rename that book”, and on 3/19/13, “The title saga”. The American cover:
And the eventual winner for the British cover:
A reader’s comment (August 31, 2018 at 4:26 am):

In this context, design is the translator for cognitive cultural biases. The cliché about American directness vs. what I’ll call English (because I’m Scottish) cleverness is clear in the two covers. I’m grabbed by the first and intrigued by the second. Thanks for all of these; they’re quite stimulating.