Wednesday, 28 October 2015

Factual Claims and Fear of Religion

On 26 October 2015, some poor teacher, who was slightly less paranoid than perhaps she should have been, got into a little hot water after she presented her class with an activity worksheet:

Now, believe it or not, she did not get into trouble because of the blatant and apparently unapologetic use of Comic Sans font*, and she did not get into trouble because of the unnecessary use of the Oxford comma in the activity's title (and again in the title of the Anchor Chart).  No, there was instead a storm in a teacup about her use of the statement "There is a God."

This story was picked up by a few outlets, including ABC13 Eyewitness News (which quotes the school district's statement) and Live Leak (which has a video of the child's testimony to the school board, as also linked to in the school district's statement - see agenda item 8).  I came across both of these representations of the story in /r/atheism.

Within the context of discussing the ABC13 coverage of this event, I made a comment that I thought the teacher was wrong because (in my opinion) the statement "There is a god." is actually a factual claim, rather than an opinion or a commonplace assertion (and I also distinguish between "facts" and "factual claims" as I shall explain below).  Here it is, as recorded in internet history:

I smell a beat-up. And, even speaking as an atheist, I have to say that if the teacher was claiming that the statement "There is a god" is not a factual claim and only an opinion, then that teacher was wrong.
For some people making that statement, it most certainly is a factual claim - it is simply not a correct claim. Some people will make that statement as no more than an opinion (weak theists). In some parts of the world it's also a commonplace assertion (particularly if Dennett is right about "believing in belief").

Now this comment didn't get much of a response, probably not enough to truly warrant a blog article all by itself, but I also asked around and found that quite a few people are labouring under the misapprehension that a "factual claim" is the same as a "correct claim" (or a "fact").  My interlocutor on /r/atheism even went so far as to provide me with an extract of the definition of the word "factual":

adjective, concerned with what is actually the case rather than interpretations of or reactions to it.

Now I have no huge issue with this definition of the term "factual" - so long as it is understood that it is only a partial definition of the term and that this partial definition only works within certain contexts.  It is interesting that, when presenting a definition, the commenter chose to take one from a site which in turn presented a definition from the Oxford Pocket Dictionary of Current English.  They didn't just go with what a Google search on "define factual" provided:

concerned with what is actually the case

Nor did they go with

1. of or relating to facts; concerning facts: factual accuracy.
2. based on or restricted to facts: a factual report.

or with

1:  of or relating to facts <a factual error>

2:  restricted to or based on fact <a factual statement>

These definitions are not hugely different, but there is a subtle difference that perhaps he was aware of (leading him to seek out a definition that he thought was more supportive of his position).  Note that both, in the primary definition, leave open the possibility of talking about things being factually correct or factually incorrect.

But let's assume that my interlocutor was factually correct in his apparent claim that "factual" had no more than the contorted and somewhat limited definition that he dug up for it.  The facts of the matter (as presented in the ABC13 news article) were that the teacher presented an assignment "to classify each statement as an 'opinion', 'factual claim' or 'commonplace assertion.'"

We are at a bit of a disadvantage of course, because we only have a snippet from the teaching materials - we don't have access to the "Anchor Chart on Facts, Opinions, and Commonplace Assertions".  It is likely that there were clear definitions of the terms (although I do have mild reservations about her use of the word "facts"), but I have not seen them presented anywhere.  Using only what we have to hand (as presented in the ABC13 news article) there is no reason to believe that the teacher was asking students to provide an opinion on whether the statements were correct or not.  In the context of the activity, it would be totally bizarre if she were - but then this was happening in Texas, so who knows.

What we need to look at, if we are going to make the attempt to be reasoned, is the meaning of "factual claim".

My position is that this term refers to the nature of the claim being made, particularly from the perspective of the person making the claim, and nothing about the veracity (or otherwise) of the claim.

Imagine for a moment that you know someone who would be entirely comfortable saying that aliens exist and, furthermore, they are working together with the UN in order to enslave humanity.  If this person said "There are aliens", would this be an opinion, a factual claim or a commonplace assertion?

It's certainly not a commonplace assertion where I live, so I can eliminate that.

I asked some people whether such a statement would be an opinion or a factual claim and they were all quick to categorise it as an opinion.  But when I asked whether this conspiracy nut would categorise his own claim as "opinion", they wavered - no, they said, from his perspective it would be a factual claim.  And then they conceded that the conspiracy nut would likely have "facts" and "evidence" to support his wild theories (the problem of course is that these facts and evidence would be cherry-picked and carefully interpreted to support his preferred conspiracy).

Imagine now that I make a statement that is clearly an opinion - such as "black cats are attractive".  The problem here is that our language can be a bit vague as to whether a statement is being presented as an opinion or as a fact.  Few people would seriously claim that it is factually correct that black cats are attractive (and in my opinion such people are likely to be witches).

If we were anally retentive, or in a formal debate, we might clarify things by saying "in my opinion black cats are attractive" but most of us can distinguish between clear cases of opinion and statements that are clearly being presented as facts (also known as "factual claims").  And this is the very point of the activity in question, drawing attention to the fact that "X is Y" is sometimes an opinion, sometimes an assertion of fact (a factual claim) and sometimes just an echoing of something that everyone says without actually thinking too much about it (a commonplace assertion).

The problem, in my opinion, is that if we were to say that statements such as "There is a god." are merely opinions then we are effectively saying that this statement resolves down to "In my opinion, there is a god."  Such a statement can never be shown to be wrong.  It's possible that there are situations where, as Dennett points out, people might be lying about their religious beliefs, but even then this just means that such a person would be using a different code - something to the effect of "In my opinion, it is beneficial to make statements to the effect that there is a god, such as this statement, even if one does not in actuality believe that there is a god."

I accept that there may well be people who claim there is a god while either knowing that they don't actually mean it or understanding the claim to be no more than an opinion.  But these are surely in the minority.  Most people making the statement "There is a god." will be of the opinion that they are making a factual claim.  The fact that atheists consider such a claim to be either incorrect or indefensible doesn't change the fact that it is a claim about an asserted fact regarding the universe - which I believe to be a reasonable definition of a "factual claim".  And if a theist is making a factual claim about the existence of a god, as opposed to stating an opinion or reporting a commonplace assertion, then we are entitled to challenge that claim either in an attempt to show that that claim is false or in order to show that the claim stands up to scrutiny.

It seems pretty bizarre to me that some extreme theists might blow up family planning centres or abortion clinics on the basis that they have an opinion that there is a god who disapproves of such things.  Surely such people sincerely believe that the existence of their god is a fact?

There is also the rhetorical trick of taking someone's assertion of fact (factual claim) and saying "well, that's your opinion".  It's certainly a good way to rattle someone, to annoy them and belittle them, but it's not an honest tactic.  If an assertion of fact is made and that assertion of fact is erroneous, and you care about what is being asserted, then you should address the factual problems with the assertion.


At the beginning of this article, I presented an image of the activity that the students were given.  This was taken from the school district's FAQs Regarding 7th grade Classroom Activity.

It's really worth scrolling down and looking at the actual question and answer section.  Note that the school asserts that the teacher did not tell her students that god is a myth, that no students cried and that there were no arguments.  Despite this, "personnel action will be taken" and the activity would no longer be used.

It's quite disturbing that an activity like this, "designed to encourage critical thinking skills and dialogue", should have such an outcome.  It's almost like a section of society are uncomfortable with children being given critical thinking skills …

And the school's actions seem to indicate a fear of religion which is very sad indeed.


* The absence of Comic Sans from Google Fonts destroyed my hilarious running joke in which "There is a God." always appeared in that font.  My apologies for any inconvenience caused.

Monday, 26 October 2015

The Nature of Paradox

At first my reaction involved thinking that this person was clearly confused, but then I wondered if, perhaps, I think about paradoxes in a slightly different way to most people.  If that is so, then I should clarify what I mean when I use the term "paradox".

I've actually written about paradoxes a few times (Patently Paradoxical Pabst's Perplexing Performance, WLC Takes Us for a Ride, There is no Twin Paradox, Immovability and a series on the Bertrand Paradox) but it was in my response to Melchior regarding the Bertrand Paradox that, possibly, I have most clearly articulated my position on what a paradox means.

Let me try again.

As far as I am concerned, thinking only about the strictest meaning of the term "paradox", if a statement is paradoxical then it is:

  • wrong,
  • self-referential, or
  • self-referential and wrong

If you are thinking through the logic associated with a proposition and you come across a paradox, then there is something wrong with either the proposition or your thinking about it.  (Note that we can use paradoxes to identify where our thinking is incorrect, but we can't use them to bootstrap the non-existent into existence.)  For this reason, I tend to think in terms of resolving a paradox - which means identifying the problem in thinking that leads to the appearance of a paradox.  Once you've eliminated the problem, then you no longer have a paradox.

There are some paradoxes for which the problem cannot really be eliminated, because a statement is in some sense self-referential, but these tend to be either meaningless or vague.  An example is the classic "this statement is false".  Sure, it's paradoxical, but it's also meaningless, since it refers only to itself.  Another is the even more classic "all Cretans are liars" (as spoken by a Cretan).  It's only paradoxical if you define "liar" to mean a person who always lies, as opposed to the rather more accurate, if also somewhat vague definition - namely someone who lies (with some undefined frequency).
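The emptiness of the liar sentence can even be checked mechanically.  Here's a minimal sketch (a toy boolean model, nothing more): the liar demands a truth value equal to its own negation, and enumerating both candidates shows that neither satisfies it.

```python
# The liar sentence "this statement is false" asserts its own falsehood,
# i.e. it demands a truth value v satisfying v == (not v).
# Checking both candidates shows that no such value exists -- the
# sentence is paradoxical rather than simply true or false.

satisfying = [v for v in (True, False) if v == (not v)]
print(satisfying)  # [] -- no consistent truth value
```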

Where a paradox is meaningful (at least in some sense), it tends to arise because of limitations on logic.  Russell's paradox, for example, is self-referential, but it's not meaningless because of its application to set theory. That said, it did show that naïve set theory was flawed, so it is amenable to a trivial resolution.  Another paradox that can be trivially resolved is the paradox of the stone.  The paradox hangs on the notion of omnipotence.  Once you accept the fact that omnipotent beings can't exist, the "paradox" dissipates.

It's worth noting that logic works within a framework.  For example, we could look at a simple syllogism:

(Major Premise) if A then B → (Minor Premise) A → (Conclusion) therefore B

Using this form, we could conclude that, given that I have walked the dogs, the dogs will be tired.  What we can't conclude, using this syllogism, is that the form of the syllogism is true and valid.  Trying to avoid the assumption that the form of the syllogism is true and valid leads to a sort of paradox:
(Major Premise) if a syllogism of the form
  • if A then B → A → therefore B
is true and valid, then the syllogism
  • if I have walked the dogs then the dogs will be tired → I have walked the dogs → therefore the dogs will be tired
will be true and valid
(Minor Premise) a syllogism of the form
  • if A then B → A → therefore B
is true and valid
(Conclusion) therefore the syllogism
  • if I have walked the dogs then the dogs will be tired → I have walked the dogs → therefore the dogs will be tired
will be true and valid

While this seems to be saying that the conclusion is conditional on the truth of the minor premise, which is always the case for syllogisms of this form, the whole structure itself is in the form of the syllogism that is the subject of the minor premise (as the Major Premise, Minor Premise and Conclusion labels show).
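To be clear about what the form itself does, the mechanical application of modus ponens can be sketched in a few lines.  This is a toy model only - premises are plain strings, and the function name and string format are my own illustrative choices - and, as the text points out, it applies the form without being able to justify it.

```python
# A minimal sketch of modus ponens: given "if A then B" and "A",
# conclude "B".  Premises are modelled as plain strings; this only
# illustrates the *form* of the inference, not its justification.

def modus_ponens(premises):
    """Return every B for which both 'if A then B' and 'A' are asserted."""
    conditionals = [p for p in premises if p.startswith("if ")]
    conclusions = []
    for c in conditionals:
        antecedent, _, consequent = c[len("if "):].partition(" then ")
        if antecedent in premises:
            conclusions.append(consequent)
    return conclusions

premises = [
    "if I have walked the dogs then the dogs will be tired",
    "I have walked the dogs",
]
print(modus_ponens(premises))  # ['the dogs will be tired']
```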

Now when I say this is a "sort of" paradox, I don't mean that it is necessarily an "actual" paradox.  Remember I said that we can use paradoxes to identify where our thinking is incorrect.  What this means is that we have falsifiability.  If this structure ever fails, then we say that we have falsified this form of syllogism.  It's about as scientifically rigorous as you can get, as well as being logically rigorous.

Similarly, we can test science scientifically and we do so all the time.  Our working hypothesis is that the scientific method always works - and this is a falsifiable hypothesis.  If we come across any situation in which rigorous application of the scientific method doesn't work, then (pseudo-paradoxically) we will have used the scientific method to show that the scientific method doesn't always work.  Good luck with that!

Sunday, 25 October 2015

Hands Off Our Logic, God-Boy

“This is a lie.”
Well, actually it isn’t.  In which case, it is and therefore … well, it’s a paradox.  It’s also a good demonstration of how logic can fail when self-reference is involved.  Many moons ago, I used to challenge people to come up with a paradox that doesn't involve self-reference, but then I stumbled across the "Bertrand Paradox". (Why quotation marks?  Because some will argue that it's not a real paradox, that it can be resolved if one thinks about it in the right way - I happen to agree.) There is also the twins paradox (which I also believe can be resolved).  Note that I am not delineating between self-reference and circular reference, which I consider to be self-reference at one remove.  Yablo’s paradox is self-reference at greater than one remove, but it still involves self-reference within the system as a whole.
I guess I should have included those “paradoxes” which are rooted in vagueness, even though I don’t count these as proper paradoxes.
The Sorites paradox is an example of a vagueness paradox, in which a heap of sand can be reduced one grain of sand at a time, but remains a heap – perhaps even up until the very last grain is removed.  However, it is only a paradox in so much as the term “heap” is not clearly defined.  If a heap is defined as two or more grains of sand lying in close proximity such that at least one grain of sand lies on another grain of sand, then the “paradox” disappears.  The same type of resolution can be applied to the Ship of Theseus.
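That sharp definition can be made completely mechanical, which is exactly why the vagueness (and hence the "paradox") disappears.  A toy sketch, with grains as grid coordinates and "lying on" simplified to sitting directly above another grain - all names and the geometry are my own illustrative assumptions:

```python
# Toy model of the proposed sharp definition of "heap": two or more
# grains, with at least one grain resting on another.  Grains are
# (x, y) positions; "resting on" is modelled as occupying the cell
# directly above another grain -- a simplification, not physics.

def is_heap(grains):
    if len(grains) < 2:
        return False
    positions = set(grains)
    # A heap needs at least one grain sitting directly on another.
    return any((x, y - 1) in positions for (x, y) in positions if y > 0)

pile = [(0, 0), (1, 0), (0, 1)]   # one grain stacked on another
row  = [(0, 0), (1, 0), (2, 0)]   # grains side by side, none stacked
print(is_heap(pile))  # True
print(is_heap(row))   # False
```

With a definition this crisp, removing grains one at a time produces a definite point at which the pile stops being a heap - there is no longer anything paradoxical about the process.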
Then there are numerous curiosities of science which produce unexpected results that aren’t really paradoxes at all, but still manage to appear on lists of paradoxes.
Nevertheless, even if there were to be other valid sorts of logical paradox, what we can say with confidence is that there are plenty of self-referential paradoxes like the Liar paradox above which was first put in recorded words by Epimenides.
So what? I don’t hear you ask.

What I want to point out here is that apologists like Craig (or Plantinga) should not be dabbling in logic at all.  This is not only because they fail so spectacularly when they attempt it, but also because their so-called “logical arguments”, designed to lend credibility to their assumptions regarding a creator, are inherently self-referential. The logic they are using is, as a consequence, fragile in the extreme.
Let’s look at one of Craig’s major arguments in a little more detail.
The cosmological argument from first cause derives from an apparent paradox.  I’ll paraphrase the argument to highlight the paradox.
If all things that begin to exist have an antecedent cause,
and the universe (where “the universe” is generally understood to mean “all things” including time) began to exist,
then the universe has an antecedent cause.
This is impossible because without time (and therefore without “the universe”), there can be no antecedent cause.
There is also the niggling fact that the only evidence that we have suffices only to support the claim that:
All things that begin to exist in the universe have an antecedent cause within the universe.
Clearly we do exist, in some form or another, but Craig’s argument seems to say that that is not possible.
The paradox is a blend of vagueness and self-reference.  The term “all things” is not usually understood to include time itself, and the lack of clarity is heightened when the term is used in the context of an indirect reference to time.  The term “the universe” is presented as if it meant “all things” but really refers to all things plus time.  The self-reference is revealed when it is understood that we are talking about a cause of (inter alia) time that is antecedent with respect to time.
Craig tries desperately to avoid problems by defining away his god, to make it not part of the set of all things.  Timeless, changeless, immaterial and uncaused.  He fails, however, when he admits that his god is “enormously powerful”.  Well, excuse me, but Craig should check what “energy” and “power” mean.  If Craig wants to use physics to explain his god, he can’t abandon it when it becomes inconvenient – or he is guilty of his very own fallacy.  As it is, he’s left with something which depends on time creating time – a paradox which he could not escape – unless he wants to introduce magic, thus losing all the credibility that the use of science and logic was supposed to provide.
And that, Ladies and Gentlemen, is a paradox.

Monday, 19 October 2015

What Really Happened to the Dinosaurs?

This brief article, about what happened to the dinosaurs, is written with the hope that it will go some way to displacing the current site that tops the Google search results when you type in the search string "what happened to the dinosaurs" or "what really happened to the dinosaurs".

If you carry out this search and find that the result is as disturbing as I found it, then you can do something about it.  Scroll to the bottom of the page and see there is a link to "Send feedback".  You can write what you like, but here's what I wrote:
Your search engine is biased towards creationism - alternatively it has been gamed by creationists - for the search term "what happened to the dinosaurs".  Very disappointing.  I use a few of your products and it would be sad, but not overwhelmingly difficult, to leave them behind in order to protest your apparent lack of action with respect to this problem ( was posted almost five months ago).
The idea is that when you perform an internet search, you should have some level of confidence that the results are not complete crap and this example serves only to reduce my confidence - in Google.  To be fair, I also tried the search string "what happened to the dinosaurs" in other search engines.  For example, I typed "what happened to the dinosaurs" into Bing (there is a reason for the repetitive nature of this post).

Sadly, they also have the same problem.  So I gave similar feedback.

Then I typed "what happened to the dinosaurs" into Yahoo! (Does anyone still use Yahoo!?) Because Yahoo! is powered by Google, I got Google's results.

I got a bit smarter and pasted "what happened to the dinosaurs" into Gigablast - finally finding a search engine that doesn't have a creationist website as the top result.  Instead it's a website talking about how Google Answers used to have an answer to "what happened to the dinosaurs" which quotes an article saying that Google had it wrong.  And the same creationist result appears shortly after (with something from "Present Truth Mag", a Seventh Day Adventist publication, just pipping it).

Getting a bit bored with the whole process, I decided to check one last search engine - DuckDuckGo - and, yep, the top result for "what happened to the dinosaurs" is yet again that same nonsense creationist website.

I think therefore that it's not really Google's fault, although Google could do something about it.  It's just that the creationists have managed to game the search rankings so that the search string "what happened to the dinosaurs" promotes their web site.

In order to resolve this, I suggest that people who have a web presence try to create an article that would edge out those peddling nonsense about what happened to the dinosaurs. Ideally, if we got enough people involved, we'd see the web site in question relegated to the second or third page.

Oh, and don't forget to provide an answer or a link which accurately explains what happened to the dinosaurs.  I'd suggest:

  • Dinosaur - talks about dinosaurs in general and points out that they weren't all killed in one single event, in part because birds are dinosaurs and in part because only late Cretaceous period (non-avian) dinosaurs were driven extinct by the asteroid - admittedly this does include three of the most readily identifiable dinosaurs (none of which existed in the Jurassic period): Tyrannosaurus rex, Triceratops and Velociraptor
  • Cretaceous-Paleogene Extinction Event - talks about the big event that many believe is the answer to the question "what happened to the dinosaurs", but isn't really the whole story (because a lot more than dinosaurs went extinct at the time and many dinosaurs were already extinct)

Edit one day later: sadly enough, it seems that this has absolutely no effect at all.  I'm going to have to live with the minor irritation.

Thursday, 15 October 2015

The subtle distinction between "believe" and "believe in"

If I had a dollar for every time I've seen someone going on about "believing in evolution", well … let's just say that I could afford quite a few hot dinners.  Despite being well-fed, however, I'd still be grumpy about the lack of understanding about how "believe" differs from "believe in".

It is probably a largely semantic issue - what do we really mean by believe?

For me, a belief is related simply to the state of considering something to be true.  I believe that I am sitting on a chair (in part because I am sitting on a chair).  For a theist (the sort of person most likely to err and say that others "believe in evolution") belief can mean something more, so that it involves an act of will, perhaps even the overcoming of doubt (often caused by evidence that supports the notion that a specific belief is false).

Sometimes an atheist like myself will use "believe" in a slightly different way, so as to imply doubt - not doubt that has been overcome, but doubt that remains.  So I might say that I believe my football team will win on the weekend, because I don't know that they will and I'm a little more confident than I would be if I just said that I hope that they will win.  But I am nevertheless aware that there's an element of hope in there - you know, like wishful thinking.

I think that this can lead us atheists, at times, to misconstrue how a theist thinks about belief.  We tend to see this form of belief as lying a lot closer to hoping (or dreading if we are less optimistic) while I think that theists see it as being a lot closer to knowing (perhaps some theist or ex-theist could shed light on this).  If so, then there's an element of magical thinking, similar to those life-coaches who rattle off such nonsense as "if you truly believe it, it will come true".

"Believing in" something is quite a different concept to "believing" it.  Say I was listening to the coach of my football team as he described how the team planned to make the finals next season, that they've all been working very hard, making sacrifices and putting the "I" in "team".  To say that I don't "believe" the coach is to say that I think he's lying - they haven't been working hard enough, not enough sacrifices have been made and there's no "I" in "team".  To say that I don't "believe in" the coach is to say one of (at least) three things.

Firstly, I could be saying that I don't have any confidence in him, that I don't think his plans are sufficient to get my team through next season or something like that.

Secondly, I could be implying that I don't think that the idea of coaching is a good one, suggesting that it would be better if all the members of the team just did their own thing and turned up on the day and tried their best (an approach which I heartily recommend to all competitors in the league).

Alternatively, I could be saying that I don't believe that he exists.  This would of course be ridiculous, he's right there, giving a speech that I am listening to.  It would of course be quite different if I didn't believe in the coach that you were listening to, if I could neither hear nor see him.

If we talk about "believing in evolution" then strictly speaking, we know it happens - because we breed dogs, and cats, and sheep, and pigeons ... we can force them to evolve.  Not even a creationist could argue against that (well ... a somewhat mythical beast, the rational creationist, couldn't argue against that).  The sort of people making a claim along the lines of "atheists believe in evolution" are actually referring to the theory of evolution.  Does the theory of evolution exist?  Well, yes, it does.  No-one is contesting that, as far as I know, not even the creationists.

We could ask a similar question: "do atheists believe in creationism?"  Strictly speaking, we do, because creationism exists.  We sort of wish it didn't because it rots people's minds, but wishing something doesn't make it true.

Then there are the other meanings of "believe in".

Do we "believe in" natural evolution meaning that we think it is a good idea?  Well, no, I don’t think so.  Instead, we tend to avoid it.  After all, we don't look on dispassionately as our children fail to evolve quickly enough to counter the latest version of the influenza virus.  We don't let women die in childbirth because their hips aren't wide enough.  We don't let myopic people die out (imagine making them drive without glasses for the sake of evolution …)

Or do we have confidence in evolution, in the same way as we might have confidence in and thus believe in the coach?  Since evolution is not goal oriented, this doesn't really seem applicable.  There might be some people who have some bizarre faith that, if we don't come to grips with climate change, everything will be alright because we'll just evolve gills.  Such people don't comprehend how evolution tends to viciously prune its metaphorical tree of failed experiments: something like 99.9% of all species that have ever existed are now extinct (bearing in mind that species are designated by humans, so really there are only as many species as we are collectively aware of, and this estimate rests on some rather rubbery assumptions).  If humans manage to be part of screwing the environment up so badly that we can't live in it, we won't magically adapt - we'll just die out, like most of the dinosaurs did as the Earth's climate changed on them over a period of 175 million years.

So, we don't believe that evolution is necessarily good, nor that it necessarily aligns with our goal of ongoing survival as a species.  Therefore, we don't "believe in" evolution in either of those senses.

But, given careful definitions of the terms "believe", "evolution" and "is true", almost every single one of us does believe that evolution is true.

Tuesday, 13 October 2015

A Response to Josh's Screen Argument

This article is in response to Josh Willms' "screen argument".


This is a rather impressive photo of a waterspout.  In the same vein as Josh's investigation into the colours green and red, I ask "Does the waterspout exist?"

Note that in this case I am not speaking about a category associated with the waterspout, I'm not asking whether wetness exists, or twistiness, or white.

Josh's approach to thinking about green and red was relentlessly reductive.  We experience red and green in the image of an apple, but where is the image or at least the experience of the image?  Eventually he gets to the point where he is looking very closely at the brain and sees … neurons, nothing but neurons.

We could do the same thing with the waterspout.  If we get closer and closer to the waterspout (assuming that we don't mind getting wet and we are immune to being tossed about by it), it will eventually disappear and we might see nothing more than water molecules banging together.  Is a waterspout more than the water molecules?

Well, yes, of course it is.  Despite its name, a waterspout is largely an air-based phenomenon - it's a tornado-like event above water, so we are just seeing water pulled up into the air and tossed around.  But even if we accept that, is a waterspout more than the air and water molecules banging together?

Again, yes, it is.  The waterspout is a located process.

By this I mean that we are not so much observing a thing that exists but rather we are observing a thing that is happening.  And in order for it to happen, just where it is happening, there are factors that have to be conducive to a waterspout - a body of warm water, an air mass and (in order for this to be an observed process so that it becomes an experience) an observer.

I think that this is a major issue in Josh's thinking.  He's looking for a thing when he should, at the very least, be accepting the possibility that what he is looking at (consciousness) is more of a process than a thing.

There are factors that are conducive to (and I would go so far as to suggest necessary as a prerequisite to) consciousness happening - like the presence of lots and lots of neurons and sensory input, but consciousness exists in neither the neurons nor the sensory input.


Another problem with the example given by Josh is associated with looking at an apple.

The implication is that a detailed, faithful image of the apple is generated in the brain, similar to the image above as displayed on a screen.  Then Josh asks, "where is this image?"

The problem is that Josh is begging the question here.  I believe that by assuming that there is an image of the sort that we believe that we experience (which is essentially indistinguishable from saying simply "of the sort that we experience"), Josh is heading off in the wrong direction.

There is plenty of research indicating that we don't generate and then maintain images directly from our sensory input - at least not comprehensive and faithful images.  We don't have the ability to take in as much information as exists in our standard interface with the world.  For example, if I stop and look around me, I will see dozens of artefacts in my field of view.  Then when I go back to typing, those artefacts are still there in my peripheral vision, but I am not strictly looking at them with enough focus to paint in the details - my brain is doing the hard work via imagination and memory to put items in locations which make sense to me.

What seems to happen (and perhaps I might be wrong in this) is that the brain has a bunch of labels and applies them to those things within my field of vision - cup, bottle of water, mouse, screen, roll of tape, aircon controller, plastic bag, cables, piece of paper, pad of post-it notes, receipt, calling card, keyboard, hands, roll of paper.  Each of those, if I think about it, triggers a memory, such that I don't even have to consciously look at the item.  But if I don't think about them, they simply don't exist - like the fork, the battery, the pair of scissors and the two jars that I didn't "see" until I stopped and looked around more carefully for what I had missed.

These items also exist in a context, so I have a method by which I can place them in my reconstruction of what I "see".  (So the cup is to the left and just behind the post-it notes and between them is the pair of previously invisible scissors.)

I think the same happens even with things we are looking directly at.  Melvin will break the view of the apple up into labels, "apple", "red skin", "green leaf", "moderate size", "generally unblemished" and this will allow him to regenerate the image if he is required to recall it later.  But it is the "regeneration" of the image that he is experiencing, not the image itself.
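The label-and-context idea can be caricatured with a toy sketch (purely illustrative - the names and structure here are my own invention, not a model of any real cognitive process): what is stored is labels plus relations, and an "image" only exists while it is being regenerated from them.

```python
# Toy caricature of label-based scene storage and on-demand regeneration.
# Nothing here models real cognition; it only illustrates the idea that
# what is stored is labels and relations, not a bitmap.
scene = {
    "apple": ["red skin", "green leaf", "moderate size", "generally unblemished"],
}

def regenerate(label):
    """'Experiencing' the image = running this regeneration,
    not reading back a stored picture."""
    attributes = scene.get(label, [])
    return f"{label}: " + ", ".join(attributes)

print(regenerate("apple"))
# apple: red skin, green leaf, moderate size, generally unblemished
```

The point of the sketch is only that nothing image-like persists between calls: stop calling `regenerate` and there is no picture anywhere to extract.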

And furthermore, I am pretty sure that neuroscience has shown that the parts of the brain that are brought to bear in imagining tasks are the same as those used in processing images based on sensory input.  So my argument would be that when looking at an apple Melvin is not experiencing an image per se but is experiencing the process of generating an image.  As soon as he stops generating the image, it's gone - there's no created image that an external observer could extract from his brain.


This is all complicated, a little, by "attentional circuitry" in the brain.  Most people who looked at the image of the waterspout above will not have noticed that the person who took the photo was standing on a balcony with a metal fence around it to the left, nor that there is a set of stairs heading down to the right towards the beach, nor that what looks like a red chair is close to a brick wall that surrounds the remainder of the balcony.  Most would probably have noticed the impressive lensing effect on the sun, the plume to the left of the waterspout, the twisted clouds, the palm tree, the beach, the jetty and the fact that the waterspout is happening in a bay (due to the landmasses to the left and right).

Monday, 12 October 2015

A Trickle is Not a Flood

In May of this year, Twenge et al. published a research article Generational and Time Period Differences in American Adolescents’ Religious Orientation, 1966–2014.  At first there was very little interest in it.  Here's the abstract to the article, as partial explanation for the lack of interest:

In four large, nationally representative surveys (N = 11.2 million), American adolescents and emerging adults in the 2010s (Millennials) were significantly less religious than previous generations (Boomers, Generation X) at the same age. The data are from the Monitoring the Future studies of 12th graders (1976–2013), 8th and 10th graders (1991–2013), and the American Freshman survey of entering college students (1966–2014). Although the majority of adolescents and emerging adults are still religiously involved, twice as many 12th graders and college students, and 20%–40% more 8th and 10th graders, never attend religious services. Twice as many 12th graders and entering college students in the 2010s (vs. the 1960s–70s) give their religious affiliation as “none,” as do 40%–50% more 8th and 10th graders. Recent birth cohorts report less approval of religious organizations, are less likely to say that religion is important in their lives, report being less spiritual, and spend less time praying or meditating. Thus, declines in religious orientation reach beyond affiliation to religious participation and religiosity, suggesting a movement toward secularism among a growing minority. The declines are larger among girls, Whites, lower-SES individuals, and in the Northeastern U.S., very small among Blacks, and non-existent among political conservatives. Religious affiliation is lower in years with more income inequality, higher median family income, higher materialism, more positive self-views, and lower social support. Overall, these results suggest that the lower religious orientation of Millennials is due to time period or generation, and not to age.

It wasn't ignored entirely however.  There was one report on it the very next day, filed at the on-line version of the Pacific Standard magazine, in a piece titled Millennials are Less Religious-and Less Spiritual Too.  Here's a sample:

It’s common knowledge that many members of the Millennial generation in the United States are rejecting religion. But some analysts argue that, while these emerging adults are less likely to participate in organized religion, they retain an interest in spirituality.

Not so, concludes a newly published study.

“American adolescents in the 2010s are significantly less religiously oriented, on average, than their Boomer and Generation X predecessors were at the same age,” writes a research team led by San Diego State University psychologist Jean Twenge. Confirming earlier evidence, the study finds they are less likely than members of previous generations to attend religious services, and less supportive of religious organizations.

After that there was nothing for more than two weeks.  A wider subset of the general population then became aware of the research article through Reddit towards the end of May, after Twenge got some assistance from her university.  The publicity article, published by the San Diego State University NewsCenter and picked up by the AAAS science news aggregator EurekAlert!, was titled The Least Religious Generation.  Here's how they kicked off their release:

In what may be the largest study ever conducted on changes in Americans’ religious involvement, researchers led by San Diego State University psychology professor Jean M. Twenge found that millennials are the least religious generation of the last six decades, and possibly in the nation’s history.

The researchers — including Ramya Sastry from SDSU, Julie J. Exline and Joshua B. Grubbs from Case Western Reserve University and W. Keith Campbell from the University of Georgia — analyzed data from 11.2 million respondents from four nationally representative surveys of U.S. adolescents ages 13 to 18 taken between 1966 and 2014.

The distribution of interest at Reddit is interesting.  The article was linked to eight times, at the following sub-reddits (in order of popularity of the article within the sub-reddit [as gauged by score]): /r/HowGodWorks [1], /r/Christianity [3], /r/lostgeneration [10], /r/UUreddit [11], /r/theworldnews [14], /r/NoShitSherlock [74], /r/atheism [182] and /r/science [4397].  Not only was the article more popular at /r/science, but there were far more comments there as well, so people got involved in discussions about it.

There was a little more interest after The Raw Story published their take on the article, in a piece titled Teens are fleeing religion like never before: Massive new study exposes religion’s decline.  Here's their intro:

Religion is rapidly losing the youngest generation of Americans, according to new research.

America’s rising generation of adults are the least religiously observant of any generation in six decades, determined an expansive study led by Jean Twenge, a psychology professor at San Diego State.

Oddly though, although this seems to be the sort of article that atheists would be keen on, it was once again the science sub-reddit that gave it the most votes and the most comments.  There were quite a few other likely sources that covered Twenge's article from various perspectives and got no reddit interest at all - such as patheos, whose focus was on females, who are apparently departing religion at a higher rate than males.  (Note: I rectified this total lack of interest by linking the patheos article myself.  At last count, it has received more positive interest than the last two articles put together.)

So, why was there so little interest from atheists?

My hope is that it is because many atheists responded to The Raw Story's breathless commentary on Twenge's research in much the same way as I did.  By noticing that it's misleading.  By noticing that even Twenge's far more restrained abstract is misleading.

Let's look at that abstract a little more closely.  It is claimed that "twice as many 12th graders and college students, and 20%–40% more 8th and 10th graders, never attend religious services".  The next sentence implies that the comparison is with cohorts in the 1960s and 1970s.

For anyone with an interest in promoting rationality and secularism, a significant decrease in religiosity in the dominant exporter of culture in this world has got to be good news, right?  Well, yes.  But this isn't really what is being said here - even if what is being said is entirely true (which I question).

Twenge is writing about reported irreligiosity, which in the case of 12th graders and college students has doubled.  But double "not very much" is still not very much, especially when compared to more advanced nations.  For example, the Christian Research Association estimates that "10 per cent of all Australian young people in secondary school attend" (church monthly or more often) while Wikipedia reports that only 2% of Swedes are reported to be regular attendees at church services.  Note however that this Swedish data is more than 20 years old and only refers to one church - when other churches are considered, as reported by the Swedish Institute, "only eight per cent of Swedes attend any religious services regularly".  In neither Sweden nor Australia is it possible for irreligiosity to increase by 20%, let alone double.

(For those among us who are not maths geeks, irreligiosity stands at about 92% in Sweden generally and 90% for Australian teenagers.  An increase of 20% would make the irreligiosity rate 110% in Sweden and 108% in Australia.  It just can't happen.)
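For those who would rather see the arithmetic run than read it, here is a minimal sketch (using the rates quoted above, and assuming the increases in the abstract are relative, i.e. multiplicative) of why a 20% increase is impossible at those levels:

```python
# Relative ("20% more") increase applied to already-high irreligiosity rates.
# Rates are the figures quoted above: ~92% for Sweden, ~90% for Australian teens.
def increased_rate(rate_percent, relative_increase):
    """Apply a relative increase, e.g. 0.20 for '20% more'."""
    return rate_percent * (1 + relative_increase)

for country, rate in [("Sweden", 92.0), ("Australia", 90.0)]:
    new_rate = round(increased_rate(rate, 0.20), 1)
    verdict = "impossible" if new_rate > 100 else "possible"
    print(country, new_rate, verdict)
# Sweden 110.4 impossible
# Australia 108.0 impossible
```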

Such improvement in the US is, sadly enough, still possible.  Check Twenge's chart for church attendance (but note that the question was restricted to "never" attending church - which should exclude those who traipse along for cultural events like christmas or easter):

There was no chart provided for regular attendance, but there are some entries in a table, replicated below:

[Table not reproduced: percentages of 8th, 10th and 12th graders attending church weekly or more]

Ignoring the difference between weekly and monthly attendance and using a simple average of the figures for 8th, 10th and 12th graders, this means about 35% of American teenagers are regular churchgoers and the irreligiosity rate is therefore 65%.  It is pleasing to see that even in the US, a doubling of this rate is not possible, but an increase of 50% is still possible - to 98%.
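The same relative-increase arithmetic, applied to the ~65% American rate derived above, shows the remaining headroom (a quick sketch of my own working, not anything from Twenge's paper):

```python
# US "irreligiosity" (non-regular-attendance) rate derived above: 100% - ~35%.
us_rate = 100.0 - 35.0      # 65.0%

doubled = us_rate * 2.0     # 130.0% - impossible, exceeds 100%
half_again = us_rate * 1.5  # 97.5%  - still (just) possible
print(doubled, half_again)
# 130.0 97.5
```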

Using the most recent figures, we can see that about 21% of 12th graders never go to church while 30% attend regularly (weekly or more).  That leaves almost 50% who attend "non-regularly", which could mean once a month, only on special days, or it might include christenings, weddings or funerals but nothing else.  There's potential for a broad scope of belief in that 50%.
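That residual "non-regular" group is simple subtraction:

```python
# Most recent figures for 12th graders, as quoted above.
never = 21.0     # % who never attend
regular = 30.0   # % who attend weekly or more
non_regular = 100.0 - never - regular
print(non_regular)  # 49.0 - the "almost 50%" middle group
```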

Overall though, what the research is telling us is that as far as the battle between the religious and the rational in America goes, there is still a long way to go.  Which is a bit of a worry.

What worried me most however was when I saw an article crowing that the American teenagers "are fleeing religion like never before".  Excuse me?  Fleeing?

How about we compare what is happening with youth and religion in America to the other sort of fleeing that American teenagers are far too often required to do.  You know, the sort of fleeing that happens when a gunman enters school grounds and starts shooting at people (I don't want to appear to be overstating the case, so I want to make clear that this has happened only 22 times so far in 2015, at the time of writing).  If we were to hear that students fled from a classroom, we would not expect to then read that the number of students absent from the classroom rose from 8% to 27% over a period of more than 30 minutes (and from 25% to 27% in the past two minutes).  This is hardly a flood of students leaving.  It's barely a trickle.

Then there is also the question of what the figures actually mean.  Twenge's research is based on survey responses but she makes only one reference to privacy and says nothing about confidentiality.  We therefore have no idea about how the respondents would have felt about their freedom to answer candidly.  America still has a problem dealing appropriately with youths who are not religiously inclined: there is talk among atheist circles of "coming out", with the same trepidation that a gay person might feel about coming out to their friends and family; there is even a movement called "Atheist Havens" dedicated to providing shelter to youths who are thrown out of home by their religious parents.  It's not a huge leap to imagine that some respondents would be disinclined to be entirely candid with respect to the survey.

That said, it has become increasingly likely that students who are not religiously inclined might be encouraged to speak openly due to the media attention given to celebrity atheists such as Sam Harris, Richard Dawkins, Michael Shermer and so on.  There's no indication that Twenge has controlled for this effect across the years.  Therefore, it's possible that the vast majority of what appears to be an increase in irreligiosity in America is merely the effect of increased confidence on the part of those who are not religious and their increased willingness to admit to it.  This too militates against any breathless claim that American youth is "fleeing" religion.

Even so, I don't think that it's all doom and gloom.  Increased openness on the part of the non-religious will help provide an environment in which those who are not overly committed to their religion may reconsider their beliefs and perhaps come to see the irrationality of whatever faith they were brought up in.  Calmly walking away from religion, even if it's a slower process, is a much more dignified approach than fleeing from it.

And if it is actually true that atheism isn't growing at the rate that some have been led to believe, this might galvanise proponents of rationality into action, encouraging them to review what they've been doing, to take lessons from the latest research in psychology (particularly about the creation and maintenance of beliefs and attitudes) and to consider whether other approaches might be more suitable.

Meanwhile, those of us who are lucky enough to be living in a nation that is relatively free of religion should not rest on our laurels but keep a close eye on those who would have us return to the Iron Age.