ICBS Everywhere

What’s “Right” or “Are you now, or have you ever been, a troll?”

Each individual has the ability to make judgments about what is “right”, in terms of what is true or correct and in terms of what is moral or ethical. These judgments are the basis for personal knowledge and for personal moral codes. The terms “rational” and “irrational” can apply in either case.

Skepticism is about shared knowledge, but the process of gaining shared knowledge can be easily mapped onto shared morality.

In science, the closest that we can come to what is right (as in “correct”) is whatever the consensus of experts in the field says is most likely right. This is one of the reasons that I cite for why conclusions-based activism is not good skepticism.

You see, the problem is that everybody thinks that their beliefs (i.e., what they hold to be true) are correct; that’s the very definition of “belief”. What’s more, everybody thinks that they arrived at their beliefs through logic and evaluation of evidence. Everyone. This is a problem because unless we all agree, we cannot all be “right”. So, anyone who insists that their beliefs be adopted by everyone is basically saying “I know the absolute truth for certain,” which is the ultimate display of arrogance and irrationality.

So where does that put skeptical activism? Right where we left it, because good skeptical activism addresses the questions, not the answers. As I have discussed in several blog posts now, good skepticism doesn’t involve telling people that there is no such thing as ghosts or psychic abilities. It involves examining evidence and providing alternative explanations for phenomena. It involves education about this method.

On rare occasions when Skepticism (big-S refers to organized skeptical activism & scientific skepticism) addresses conclusions, it does so because an overwhelming scientific consensus is common knowledge and the consequences of failing to recognize that consensus are potentially very harmful. Again, this is rare; some examples that come to mind are vaccine safety and climate change.

Shared knowledge demands the consensus of those with expertise who have followed a number of steps designed to reduce the impact of human biases and heuristics. It is a process that works slowly, but works well overall.

Moral judgments are not very different from judgments about truth. Good moral reasoning uses critical thinking (i.e., considering plausible alternatives and hidden facts) and is open-minded. As with reasoning about facts, moral reasoning is subject to a number of biases and heuristics which favor what we currently believe and what we want to believe.

As with reasoning about facts, arrogance and cognitive laziness are sources of irrational behavior. We fail to consider long-term ramifications of moral judgments, especially when we choose to act on them. For example, we fail to consider that there may be trade-offs between safety and personal freedoms (think TSA). We fail to consider what we may not know about a situation. We fail to recognize fraud, framing, and spin, especially if the outcome supports our views. We fail to hold our views tentatively while we consider alternatives and we favor the choices that promote our personal goals (e.g., allow us to keep our friendships intact).

As with reasoning about facts, human beings will jump through all kinds of hoops to reconcile conflicting attitudes and actions so that we can maintain our self-image. We justify, rationalize, redefine, backpedal, redirect, and use all of those other wonderful self-deception tools.

As with reasoning about facts, if we don’t all agree on what’s right/righteous, then we can’t all be right/righteous.

As with reasoning about “facts” almost everyone believes that their own moral compass points to what is righteous.

So how do we have justice? Where do we get shared morality? Through something we call the justice system.

Like science, the justice system is imperfect, but self-correcting. Laws are created and changed as culture changes and mistakes are made. It is complex, but flexible and built on a strong foundation (in the U.S., the Constitution).

Like science, we are all responsible for providing fuel and direction. In science, that includes the problems that human beings need solved, such as the technology to explore outer space. This dictates priorities for science and encourages the growth of funding sources. In morality, it’s changing culture and technological advances that provide priorities and encourage changes in laws.

Science provides a framework that is adopted outside of the formal process (e.g., academic research and publishing) in many ways. We use scientific methods to investigate crimes and to produce technology. Likewise, the legal process inspires policies such as rules and regulations in employment situations and grievance processes.

Again, both of these are imperfect systems.

However, given all of their imperfections and our tendency to discard them when convenient, science is the best system we have for gaining and sorting out shared knowledge (what’s right), and the system based on our Constitution is the best system we have for dealing with issues of justice (what’s righteous).

Whether it is science or the justice system, when it does not tell us what we want to hear, human beings tend to leave it behind. And that’s a mistake.

Skeptical activism is about promoting scientific consensus and the methods used to achieve it. It’s about limiting the impact of those who choose to circumvent those methods.

This is my tiny effort to limit the impact of those who choose to circumvent methods of shared morality.


6 Comments

  1. Wendy Hughes says:

    Not so tiny. I think the same things – but could not have put it into words. Thank you.

  2. Thanks for writing this, Barbara!

  3. Doug Duncan says:

    Happy to see you posting again–it’s been a while!

    1. Thanks and sorry about that. I’m what’s known as a “SUPER slow” blogger! :)

  4. Skeptek says:

    This is an excellent summary of what can sometimes be a difficult notion to grasp – that it is precisely because it’s human nature for us to believe all manner of things, and to use myriad tricks to hide that nature from ourselves, that we must be so cautious about what we believe… and why we believe it. Paranoia, more or less, is a reasonable response to that fact of our nature.

    We’re so skeptical about everything, all the time, only because it’s always so easy, to be so wrong, and never even know it.

    Great article. Shared. Favorite quote: “cognitive laziness” (stealing that).

  5. Steersman says:

    Excellent post that highlights both the “nature of the beast”, and the necessity for making some efforts to forestall those who would “circumvent methods of shared morality”. Although I might quibble a bit over this statement of yours:

    As with reasoning about facts, if we don’t all agree on what’s right/righteous, then we can’t all be right/righteous.

    While I’ll agree about “righteous” as that seems largely a subjective term, that we might all think and agree that we are all right on any given issue doesn’t necessarily preclude the possibility that we are all in fact wrong. Which should, I think, engender some degree of “fear and trembling” over embarking on any program, particularly one where we all agree.

    However, I think that the crux of the problem is that, as you phrased it, “everybody thinks that they arrived at their beliefs through logic and evaluation of evidence”, particularly if not exclusively relative to various religious, political and sexual issues where moral judgments hold sway. But the evidence seems to suggest that many if not most of our premises derive from inductive leaps – the problem of induction – that are predicated, in each of our cases, on very different sets of data points. Rather analogous to the childhood pastime or puzzle of connecting a bunch of dots to draw a picture: many different figures can be drawn, very few if any actually corresponding to “reality”. Not surprising then that many of us frequently wind up at loggerheads insisting that the other person “just doesn’t get it”, and then raising questions about the other person’s ethics, morality or honesty. Somewhat apropos is this passage from Michael Shermer’s The Believing Brain [highly recommended]:

    Shermer wrote: As we saw in the previous chapter, politics is filled with self-justifying rationalizations. Democrats see the world through liberal-tinted glasses, while Republicans filter it through conservative shaded glasses. When you listen to both “conservative talk radio” and “progressive talk radio” you will hear current events interpreted in ways that are 180 degrees out of phase. So incongruent are the interpretations of even the simplest goings-on in the daily news that you wonder if they can possibly be talking about the same event. [pg 263] [my emphasis]

    For an interesting and graphic analogy of that process, I would recommend Massimo Pigliucci’s post on The limits of reasonable discourse wherein he uses the “fitness landscape” metaphor of evolutionary biology to model the process of discourse – reasonable and otherwise. In each case, very slight differences in the “fitness” of the related “premises” – for physical survival, or for basing policy on – can lead to very different conclusions even though steps based on each of those slightly different premises are entirely “logical”.