The Life-Changing Science of Detecting Bullshit

Having been accused of it, I decided to do some research and started with this book.

It is perfectly normal for scientists to change their conclusions and opinions after learning new information. This is not a sign of weakness; it is, in fact, an essential feature of the scientific method. (page 8)

Cultural truisms are beliefs that most members of a society accept uncritically and have never so much as considered defending – they are so widely shared within the social environment that people have not heard them attacked or even considered an attack possible. (page 22)

I believe that Madoff succeeded because he had a lot of help from over 4,800 investors who, just like everyone else, are naturally bullible…. A gullible individual may believe something despite signs of dishonesty, but the bullible individual is a relatively lazy thinker who doesn’t even care about signs of dishonesty. (page 64)

When we give preference to what we wish is true over truth, we allow bullshit to flourish. (page 72)

Pennycook’s perspective is that people believe fake news and other forms of bullshit not because of motivated reasoning, such as wishing false things to be true, but because they tend to engage in a “lazy” style of thinking by putting very little thought into what they are seeing on social media. (page 88)

When you want to help people, you tell them the truth. When you want to help yourself, you tell them what they want to hear. Thomas Sowell (page 101)

Like it or not, we live in a world that pays more attention to bullshit than facts, evidence, and science. We live in a world that gives more credence to motivated bullshitters than to scientists and truth seekers. (page 104)

People are willing to communicate about things they know nothing about when they feel some obligation or opportunity to do so. People will sometimes be expected, if not obligated, to talk about things they know nothing about – and what often comes out is bullshit. (page 109)

Doctors often talk with their patients about the implications of positive test results in vague ways and opt to test ($), test ($), and retest ($) until they are subjectively satisfied of a diagnosis. Doctors do have specialized knowledge, but they are often under considerable pressure to speak about things they do not know. Because of these expectations, doctors rarely tell their patients that they simply do not know the answer to a question – and what often comes out is bullshit. (page 118)

Introducing accountability is one of the easiest ways to reduce bullshit. Accountability signals to potential bullshitters that someone is listening carefully. (page 120)

In his bestselling book Thinking, Fast and Slow, Nobel laureate Daniel Kahneman wrote that “people can maintain an unshakable faith in any proposition, however absurd, when they are sustained by a community of like-minded believers.” (page 120)

Because relying on evidence to make decisions does not appeal to them, high-propensity bullshitters tend to show signs of irritation when asked to provide reasons for their beliefs. (page 122)

Consistent thinking and behaving with a high need for evidence require a great deal of work – and few people are willing to do such work. Not only are high-propensity bullshitters indifferent to truth and evidence, they tend not to react kindly when presented with evidence. Receiving consistently irrational responses to cold, hard evidence indicates that you are probably dealing with a high-propensity bullshitter. A sure sign of a low-need-for-evidence individual – and likely a high-propensity bullshitter – is a person who appears willing to fabricate evidence or data. They often refer to data that only they have access to – and what often comes out is bullshit. (page 124)

People overestimate their own knowledge, in part, because people who are unskilled in a domain lack the ability to distinguish competence from incompetence. In other words, incompetent individuals are prone to erroneous conclusions and unfortunate choices because they usually do not know they are incompetent. (page 126)

Relying on anecdotal evidence is convenient because it requires doing the least amount of work to substantiate a claim. (page 142)

However, if one can discern no differences between the effects of a claim and the effects of its opposites, then the claim is pseudo-profound bullshit. Pseudo-profound bullshit contains vacuous and confusing buzzwords that obscure meaning and invite people to fill in the gaps with whatever they think the nonsense means – meanwhile, Deepak comes away sounding brilliant. (page 145)

Another sure-fire way to determine if someone is using pseudo-profundity is to ask them to clarify what they mean: “So you say, ‘Defund the police.’ What do you mean by that? What would that look like? How would it work? Tell me the logistics. How would we know it’s working?” There will be a stark difference in how academics and serious criminal-justice reform activists respond to these questions and how those who blindly advocate the phrase on Twitter respond. Clarification is a major antidote to bullshit because bullshitters find it difficult to clarify pseudo-profound bullshit by saying something that actually makes sense or reflects truth and evidence. (page 146)
