Magazine, Vol. 4: Changing Climate

“We need to forget about the idea that knowledge requires one hundred percent certainty”

Johan Huibers’ Ark, The Netherlands, 2010 © Sayler/Morris


Facts Are Futile: The Climate Deniers’ Philosophy


The facts are clear: the earth is heating up. Scientific consensus tells us that we are responsible for it. Yet some people are still not convinced – nor, it seems, do they want to be. Why is that? Professor of Philosophy N. Ángel Pinillos talks to 42 Magazine about the theory behind the doubt: about what it means to know something and whether it is justified to doubt scientists. He also proposes a strategy for dealing with skeptics on a day-to-day basis.

Dr. Pinillos, you’re working on epistemology – the question of what knowledge means – in relation to climate change skeptics. Philosophy has reached a form of consensus where knowledge is analyzed as “justified, true belief.” Do you agree with this analysis?

This analysis has been commonplace in philosophy since Plato. Most philosophers today think it is approximately correct and I agree. For example, I know that I attended college. This means that (a) I believe that I attended college, (b) my belief that I attended college is justified – through memory, testimony from classmates, the diploma in my drawer, and so on, and (c) it is true that I went to college. It is often helpful to think of knowledge in this way.

Is it philosophically justifiable to gain knowledge by believing authorities like scientists? Why should we believe them when they tell us that climate change is real and man-made?

Let’s think about a simpler example. Many non-scientists know some basic facts about stars. For example, it is common knowledge that many stars are millions of miles away, are very old and contain helium. We know lots about stars. But how does this happen? This is probably obvious to most, but it is worth repeating. The process which leads to knowledge has at least two stages. Firstly, astronomers use the scientific method including recording the movement, luminosity and radiation emitted by stars to determine their various properties. Secondly, this information is disseminated to the public. In the cases where there is consensus, like with the composition of stars, the information will be included in textbooks that you read in school.


“It is important to recognize that just because the system fails once in a while, it doesn’t mean that it can’t yield knowledge”


Does this system fail sometimes?

This two-level system works well enough, although it can break down at each step. At the level of the creation of knowledge, a scientist may fail to collect the appropriate evidence, mishandle data or make the wrong inference. At the level of dissemination, the message from the scientist may be distorted by reporters, blocked by special interest groups, or it may be that the scientist herself misreports her own findings.

It is important to recognize that just because the system fails once in a while, it doesn’t mean that it can’t yield knowledge in normal cases—just as the fact that my auditory system fails me sometimes doesn’t mean that my hearing can’t give me knowledge in normal cases. I know that there is music playing in the background right now even though my hearing has failed me in the past. We need to forget about the idea that knowledge requires one hundred percent certainty. One thing we learned from Descartes is that we can’t have certainty about even the most basic things anyway.

But is it really enough of a justification to listen to the findings of the scientific community without knowing anything about the experiments that led to these findings?

Yes, I think this follows from the idea that knowledge and justification can be transmitted via testimony. If I did laundry and I tell you about it, you now know that I did laundry – even though you haven’t checked yourself whether what I’m saying is true. Of course, this doesn’t always work because people lie on occasion. But testimony works in general. If you are thinking that testimony doesn’t usually transmit knowledge or justification, you are probably over-emphasizing the cases where it doesn’t work and you are ignoring all those boring, run-of-the-mill cases, where it works perfectly.

But at any rate, you could never really be acquainted with all the experiments surrounding a scientific result. There is too much cumulative knowledge in science for anyone to really understand everything that goes into establishing a claim. You always have to rely on testimony, whether it stems from some work you cited or from the technician at your lab, whom you trust to read off the right data from your instruments.


“When this consensus happens, it becomes less likely that something has gone wrong with the science”


How does the two-step process you mentioned earlier work when it comes to climate change? Do you believe it’s succeeding at educating people?

I think it seems to be working well. Scientists gain knowledge using the scientific method and this gets disseminated to the public – though we still need to do a better job in passing it on. But how can I be so confident that the system is working as it should in the case of climate change? There are two clues here. Firstly, you have a lot of scientists independently reaching the same results about man-made climate change. When this consensus happens, it becomes less likely that something has gone wrong with the science – this is why replicability is highly valued. Secondly, although there are thousands of climate scientists who each select a different approach in their research, there have been many collective attempts to summarize and catalogue their findings and bring the results to public attention. Again, there is convergence on these summaries. This makes it less likely that there is some problem with the transmission of knowledge.

Consequently, it’s not so much that I trust the authority of some scientist. It is rather that there is a great deal of evidence that the two-stage system of knowledge creation plus dissemination is working well. All signs point to the idea that human activity is having a significant impact on the environment.

Why do some people find it harder than others to believe these authorities?

The main reason is probably a lack of understanding of climate science, and a simple lack of awareness that there is a great deal of consensus among scientists regarding climate change. Another explanation, coming from social psychology, follows from the general idea that people’s alliances and feelings affect their beliefs. For example, if believing in human-caused climate change would create conflict between my family members, I would be less likely to believe it. This is supposed to be an implicit effect, so I might not be aware that my beliefs are being affected by these factors. From my perspective, I may be reasoning impeccably. In addition, I think that people may be afflicted by what we might call “local” philosophical skepticism. I don’t think this is enough to fully explain skeptical attitudes about climate change. But together with the other explanations, it can help sustain those attitudes. At least, this possibility from philosophy has been overlooked by discussants.

Skepticism is widely associated with Descartes’ cogito argument. How does one get from the famous phrase “cogito ergo sum” – “I think, therefore I am” – to doubting or even outright denying climate change?

Descartes was interested in figuring out which beliefs we can be certain about. The legacy from this work is the idea that in fact, we can’t be certain of anything our senses are telling us, although we can be certain about a few things like whether we exist – cogito ergo sum – and the inner world of our minds. If you think that it is obvious that we can’t be certain about the external world, you are a recipient of Descartes’ legacy. Descartes’ ideas were not at all obvious at the time he presented them. At any rate, a lot of philosophers take Descartes’ arguments to challenge not only the idea that we can have certainty but whether we can know or have justified beliefs.

Do you agree?

Some philosophers, including myself, think Descartes’ argument follows a certain formula which can be recreated to turn people into local skeptics. That is, we can use his recipe to get people to surgically doubt specific things while not necessarily doubting others. Imagine that you get tested for a disease and the test comes out negative. That’s a relief. But then I remind you, ‘Well, false negatives do happen. How can you know that this isn’t one of those false negative cases? Can you really rule out that possibility without further tests?’. Here, you may feel tempted to agree: ‘Ok, sure, I guess I don’t know if it is one of those false negative cases’. Although the exact details of how this creation of doubt works are currently being debated by philosophers, researchers tend to agree that it typically involves mentioning a possibility of error. But it is not the case that raising any old possibility of error is enough to make you doubt, or at least not to the same degree. Going back to the disease case, if I remind you that people’s memory sometimes fails and perhaps you didn’t get tested after all, you may just look at me like I’ve lost my mind.

You mentioned the concept of “local skepticism” a few times.

There are actually two types of skeptical positions in philosophy—global and local skepticism. The global skeptic thinks none of your beliefs about the external world amount to knowledge. The local skeptic is one who finds herself sincerely asserting she doesn’t know in this or that case, depending on how skeptical pressure is mounted. The local skeptic may deny knowing for sure that her medical test is not a false negative, but she will be happy to say she knows other ordinary things – she will still insist that she knows she’ll have a lunch meeting next Tuesday at 2 pm. It’s rare to find a global skeptic, so it would be hard to motivate climate skepticism in this way. Those individuals, if there are any, will say they fail to know human-caused climate change is real. But this is the least of their problems. Global skeptics will also say they fail to know many mundane things: that they have bodies, or that they are living in the 21st century.

People doubting climate change and people believing in conspiracy theories often use similar arguments and shared beliefs – specifically about society, the government, and science. Is there any structural evidence relating conspiracy theories to skeptical pressure?

That’s an interesting area to explore. I think that any time people ruminate and worry about some possibility, like a conspiracy theory, then local skepticism can creep in to help sustain a skeptical attitude. This can happen with people from all political sides. Just as some people are overly-suspicious of government and science, some are overly suspicious of corporations.

You speculated that an extreme version of skeptical pressure is present in obsessive-compulsive disorder (OCD). Could you elaborate on this?

OCD is a complex condition with different manifestations. However, it seems to be essentially connected to doubt. In fact, the 19th century French psychiatrist Jean-Pierre Falret called it ‘folie du doute’ or ‘madness of doubt’. Take a case where an agent feels compelled to keep checking if they turned off their stove. They can’t be sure they turned off the stove despite evidence to the contrary. Here are some similarities between the OCD patient and the local skeptic. They both mentally focus on the possibility that a certain belief is false. They are also both aware that in some sense there is great evidence against their attitude, yet they still find themselves pushed to doubt. For example, the patient who feels she doesn’t know the test was not a false negative, still agrees that false negatives are extremely rare. Finally, both local skepticism and OCD appear to be exacerbated when the belief at issue is important to the agent. A significant difference between OCD patients and local skeptics is, however, that the doubt in the former group rises to an obsessive level. I believe this to be local skepticism taken to the extreme. So the idea is that OCD is at the extreme end of a doubt spectrum, whereas ordinary local skeptical doubt is a manifestation of behavior at the more or less normal range of the spectrum. At this point in my thinking, the connection is just speculative. More work is needed, but I think it is an interesting area to explore.


“Once we engage in a discussion about probabilities, participants will find themselves more engaged with the actual data”


Regarding skeptical pressure – how can one argue against this position?

This is one of the great problems in philosophy and philosophers are still working on it. But I think we can identify some arguments that may be helpful. Local skepticism is interesting. One way to meet it is to point out how one’s local doubt appears to be in conflict with one’s beliefs on other topics. Consider again the person who is worried that they don’t know their negative test result for the deadly disease is not one of those rare false negative cases. We could try to assuage this worry by reminding them that they know lots of ordinary things with an even lower probability of being true. For example, they know they have a lunch meeting next week at 2:30 pm. Of course, the chances of this being correct are much lower than the chances that their test result is accurate – that is, not a rare false negative.

Another way to meet local skepticism in our daily lives is to stop talking about ‘knowledge’ and ‘justification’. We can shift to talk of probabilities. So instead of saying ‘I don’t know that my test result is not one of those rare false negative cases’, we can just state the probability, which is actually known for medical tests. For example, you can say ‘There is a 0.0005% chance that this is a false negative case’. A nice thing about working with probabilities is that it pushes us into more systematic, deliberate “system 2” reasoning. In addition, it allows us to make cleaner comparisons with other beliefs.
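The probability comparison sketched above can be made concrete. In this illustration, only the 0.0005% false-negative rate comes from the example in the interview; the 90% figure for the lunch meeting is an assumed, hypothetical value chosen for illustration:

```python
# Sketch of the probability comparison from the interview.
# The 0.0005% false-negative rate is from the example above;
# the 90% lunch-meeting estimate is a hypothetical assumption.
p_false_negative = 0.0005 / 100        # 0.0005% expressed as a probability
p_test_correct = 1 - p_false_negative  # chance the negative result is accurate
p_lunch_happens = 0.90                 # assumed chance the planned lunch occurs

# We comfortably say we "know" about the lunch meeting, even though the
# test result is far more likely to be correct.
print(p_test_correct > p_lunch_happens)  # True
```

Stating the numbers side by side makes the tension explicit: the belief we unhesitatingly call knowledge is the less probable of the two.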

“System 2” reasoning – could you explain this term?

In the tradition associated with Nobel prize winner Daniel Kahneman and many of his collaborators, “system 2” refers to a thinking process that is slow, deliberate, conscious, effortful and which taxes working memory. In contrast, “system 1” thinking tends to be automatic, fast, and subconscious. Both systems are needed for humans to get along in the world. But when it comes to public discourse about the future of our planet, we should be engaging in careful, system 2 reasoning.

So how can this reasoning be used when arguing about climate change?

Talk of “knowledge” and “justification” has some pitfalls. It is sometimes helpful to look at things from different perspectives, like using probabilities. In the case of climate change, this is useful. The person who denies knowing that climate change is caused by humans will presumably accept that there is some chance that it is, in fact, caused by humans. Once we engage in a discussion about probabilities, participants will find themselves more engaged with the actual data.

Interview: Sara Pichireddu




© Angel Pinillos

Nestor Ángel Pinillos is a Professor of Philosophy at Arizona State University. He holds a PhD in Philosophy and a BA in Mathematics, and specializes in questions of epistemology and the philosophy of language. His work also includes experimental philosophy. He is currently working on a new book called “Mind and Doubt”, which will present a theory of how people make decisions and come to conclusions in the first place.
