Archive for the ‘unSpun’ Category

Constant repetition doesn’t make it true

February 25, 2008

Constant repetition of a claim may cause people to believe it, but repetition doesn’t make it true.

unSpun by Brooks Jackson and Kathleen Hall Jamieson

Be a skeptic, not a cynic or a naive person

December 30, 2007

When confronted with a factual claim:

  • A skeptic demands evidence
  • A cynic doesn’t demand evidence; he simply assumes the claim is false
  • A naive person also doesn’t demand evidence; he simply assumes the claim is true

Both cynicism and naivete are forms of gullibility:

  • A cynic is easily led to believe that any claim made by xyz is false and not to be trusted
  • A naive person is easily led to believe that any claim made by abc is true and should be trusted

Be a skeptic. Demand and weigh evidence, and keep your mind open.

— Extracted from unSpun by Brooks Jackson and Kathleen Hall Jamieson

Absolute certainty is elusive

December 23, 2007

You might think that all swans are white because you have never seen a black one. But there are black swans, in Australia. Karl Popper, a famous philosopher, held that even the so-called laws of science are hypothetical, subject to being disproved someday by new evidence. You only need one counterexample to disprove a claim of “never” or “always.” All swans are white — until you see a black one. But you can never tell when that will happen.

Perfect knowledge is seldom if ever available to humans. For one thing, new information is constantly arriving, and human learning is constantly expanding.

While we can’t be absolutely certain, we can be certain enough to make a reasonable decision.

In the U.S. court system there are various standards of certainty. A criminal trial requires a much higher level of certainty to convict than a civil trial does to find liability. (Consequently, an individual may be found not guilty in a criminal trial yet liable in a civil trial, as happened with O. J. Simpson.)

In our everyday lives, we have to pick an appropriate standard of certainty. For trivial matters the level of certainty can be low, but irreversible decisions, such as choosing a spouse or a president, demand a much higher level of certainty.

Be as certain as you need to be.

unSpun by Brooks Jackson and Kathleen Hall Jamieson

TV, newspaper, and magazine reporters and editors decide what we learn

November 5, 2007

When a CNN/New York Times poll asked people where they learned most about health-related issues, only 1 in 10 said from a doctor; 6 in 10 said they learned most from television, newspapers, or magazines.

What reporters and editors find newsworthy often is a poor measure of what people really need to know. We get spun by mistaking how often we hear about something for how often it really occurs.

For example, breast cancer gets enormous attention in the news media. Yet the plain fact is that women are nine times more likely to die of heart disease and more than twice as likely to die of a stroke; lung cancer kills far more women than breast cancer, as do other chronic lung diseases such as emphysema.

Psychologists call this effect the availability heuristic, a mental bias that gives more weight to vividness and emotional impact than to actual probability.

unSpun by Brooks Jackson and Kathleen Hall Jamieson

When is a “large” coffee a large coffee? Names that deceive

October 19, 2007

“California Ripe Olives grow in a variety of sizes: small, medium, large, extra large, jumbo, colossal and super colossal,” the industry website informs us. Of seven sizes, “large” is actually the third smallest.

The Starbucks Corporation doesn’t even use the term “large.” The smallest size on the menu is a “Tall” coffee (12 oz); the next size up is a “Grande” (16 oz), and the largest size Starbucks calls a “Venti” (20 oz).

Always ask, “What’s behind that name? Does it really describe the thing they are trying to sell me? What would be a more accurate name for it?”

unSpun by Brooks Jackson and Kathleen Hall Jamieson

Fear Sells (advertisers and politicians know it and exploit it)

October 13, 2007

Poor Edna. She was one great-looking woman, so it was strange that she couldn’t land a husband. And nobody would tell her why she was often a bridesmaid but never a bride … The reason Edna was headed for spinsterhood was breath so offensive that “even your best friends won’t tell you.”

The above was an advertisement that Listerine Mouthwash ran in 1923. The ploy worked: the company sold tanker loads of Listerine.

This advertisement gives us a window into how we can be manipulated by appeals to our fears and insecurities. Advertisers know it and exploit it. So do politicians.

In his State of the Union address on January 28, 2003, President Bush said that Saddam Hussein was pursuing weapons of mass destruction and invited listeners to imagine what would have happened if Saddam had given any to the 9/11 hijackers: “It would take one vial, one canister, one crate slipped into this country to bring a day of horror like none we have ever known.”

This appeal to fear helped generate overwhelming public support for the war.

FUD – fear, uncertainty and doubt. Advertisers exploit it to sell their products. Politicians exploit it to sell their policies.

Fear has been a staple tactic of advertisers and politicians for so long that you would think we would have become better at detecting it. But fear and insecurity can still cloud our judgment.

Here’s the lesson in a nutshell: “If it’s scary, be wary.”

unSpun by Brooks Jackson and Kathleen Hall Jamieson

“Anecdotal evidence” is an oxymoron

September 17, 2007

[Definition] Oxymoron: two terms that contradict each other; a contradiction in terms.

Interesting stories (i.e. anecdotes) don’t prove anything. They could be far from typical. Anecdotes are not evidence.

Example: a person saw a crow drop a walnut onto a street as a car was approaching. The car ran over the walnut, breaking it apart. The crow then flew down and ate the contents.

This is an interesting story, but in no way does it prove that crows are clever enough to learn such a neat trick as using human drivers to prepare their meals for them. In fact, a scientific study concluded that crows do not possess this ability.

Lesson learned: an anecdote is just that – an interesting story. It doesn’t prove anything.

— Extracted from unSpun by Brooks Jackson and Kathleen Hall Jamieson

The price-equals-quality fallacy

September 11, 2007

“We tend to think of higher-priced goods as being of better quality than lower-priced goods; but while ‘you get what you pay for’ may be folk wisdom, it isn’t always true.”

“In the 1950s, Pepsi competed with Coca-Cola by selling its soda at half the price of Coke and advertising ‘twice as much for the nickel.’ But more people bought Pepsi after it raised its price, a lesson not lost on other marketers.”

“The price-equals-quality fallacy is exploited in many ways. Many second-tier private colleges and universities make sure the sticker price of their tuition is close to (or even higher than) Harvard’s, Princeton’s, and Yale’s, in the hope that parents and students will take the mental shortcut of equating price with quality.”

“Consumer Reports magazine, which conducts carefully designed tests on all sorts of products from automobiles to toasters to TV sets, often finds lower-priced goods to be of higher quality than those costing much more. For example, in a comparison of upright vacuum cleaners on the magazine’s website in 2006, the $140 Eureka Boss Smart Vac Ultra 4870 was rated better overall than the $1,330 Kirby Ultimate G Diamond Edition or the $700 Oreck XL21-700. The Eureka was also better than the highly advertised $500 Dyson DC150.”

unSpun by Brooks Jackson and Kathleen Hall Jamieson

The “I know I’m right” syndrome

September 9, 2007

“There’s evidence that the more misinformed we are, the more strongly we insist that we’re correct.”

“In a fascinating piece of research published in 2000, the political psychologist James H. Kuklinski and his colleagues reported findings from a random telephone survey of 1,160 Illinois residents. They found few who were well informed about the facts of the welfare system: only 3 percent got more than half the questions right. That wasn’t very surprising, but what should be a warning to us all is this: those holding the least accurate beliefs were the ones expressing the highest confidence in those beliefs.”

“Of those who said correctly that only 7 percent of American families were getting welfare, just under half said they were very confident or fairly highly confident in their answer. But 74 percent of those who grossly overestimated the percentage of those on welfare said they were very confident or fairly highly confident, even though the figure they gave (25 percent) was more than three times too high. This ‘I know I’m right’ syndrome means that those who most need to revise the pictures in their heads are the very ones least likely to change their thinking. Of such people, it is sometimes said that they are often in error but never in doubt.”

unSpun by Brooks Jackson and Kathleen Hall Jamieson

Word of the Day: barking moonbat

September 8, 2007

A barking moonbat is someone who bases their beliefs not on evidence, logic, or reason, but simply on what they want to believe, completely ignoring the facts.

unSpun by Brooks Jackson and Kathleen Hall Jamieson