Sunday, 27 December 2015

Is Luke Barnes Even Trying Anymore?

I have, more than once, accused the cosmologist Luke Barnes of being a closet theist or, at the very least, someone who knowingly provides support to apologists such as William Lane Craig.

He doesn’t go so far as to deny it, but instead protests that his beliefs aren’t relevant. More recently, he has admitted to being “more than a deist” which, in the context of other comments, makes him a theist.

So, my question here is: is Barnes’ almost certainly christian theism relevant?  It wouldn’t be if Barnes were a total nobody, and I would suggest that Barnes would be a total nobody if it weren’t for the fact that he is doing so much wonderful work for various liars for christ, oops, I mean apologists.  And Barnes’ work appears, to a certain extent, to be informed by his theistic leanings.  A quite recent example is his latest paper, Binding the Diproton in Stars: Anthropic Limits on the Strength of Gravity, which he announced on his blog, just in time for christmas.

That doesn’t sound very crazy religious though, does it? Well, it does when you break it apart and when you look at just what Barnes is trying to do.

What are “diprotons” and the “anthropic limits” on them?  Diprotons are, as Barnes explains, two protons bound together.  He postulates that if an evil genius were to find a way to make the binding of protons into diprotons possible and turned this power on our sun, this would be catastrophic.  The sun would burn through all its fuel and go out in less than a second.  So, if we want life in our universe, it’s a good thing that protons don’t bind to each other.  What a relief that that didn’t happen, right?

In effect, what Barnes is arguing is that, if gravity were stronger (or, rather, less weak) then diprotons could form, and we’d have what he terms a “diproton disaster”.  As Barnes puts it:
Regardless of the strength of electromagnetism, all stars burn out in mere millions of years unless the gravitational coupling constant is extremely small, αG ≲ 10⁻³⁰
Ok, I can accept that.  I do notice, however, that gravity isn’t too strong.

The mediating constant (the one that is the crux of Barnes’ paper) is this αG, the gravitational coupling constant, which Barnes gives as the square of the ratio of the mass of a proton to the Planck mass (ie αG = (mp/mplanck)²) but which is given elsewhere as the square of the ratio of the mass of an electron to the Planck mass (ie αG = (me/mplanck)²).
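As a sanity check, both definitions are easy to compute.  A minimal Python sketch, assuming rounded CODATA-style values for the masses:

```python
# Masses in kg (assumed, rounded CODATA-style values)
m_proton = 1.6726e-27
m_electron = 9.1094e-31
m_planck = 2.1764e-8

# Barnes' definition: square of the proton-to-Planck mass ratio
alpha_G_proton = (m_proton / m_planck) ** 2
print(f"alpha_G (proton definition):   {alpha_G_proton:.3e}")   # ~5.9e-39

# The alternative, electron-based definition
alpha_G_electron = (m_electron / m_planck) ** 2
print(f"alpha_G (electron definition): {alpha_G_electron:.3e}") # ~1.8e-45
```

Note that the two conventions differ by more than six orders of magnitude, which matters when people quote “the” value of αG.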

Note that the other constant that Barnes could have talked about with reference to gravity is known as the gravitational constant.  It might seem curious that this constant didn’t come up for review, but the value of this constant is precisely 1 when expressed in natural units (ie Planck units, such that G = (ħ·lplanck)/(mplanck²·tplanck), where ħ (known as h-bar) is the reduced Planck constant, which has, in Planck units, a value of precisely 1).  There’s no apologetic wriggle room with the gravitational constant.
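The claim that G is 1 in Planck units is just a rearrangement of the definitions, and plugging SI values into that expression recovers the familiar SI value of G.  A quick check, with rounded values assumed:

```python
# SI values (assumed, rounded)
hbar = 1.0546e-34      # J*s, reduced Planck constant
l_planck = 1.6162e-35  # m, Planck length
t_planck = 5.3912e-44  # s, Planck time
m_planck = 2.1764e-8   # kg, Planck mass

# G expressed via the Planck units, as in the text
G = hbar * l_planck / (m_planck**2 * t_planck)
print(f"G = {G:.3e} m^3 kg^-1 s^-2")  # ~6.67e-11
```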

It should be noted that the gravitational coupling constant as defined by Barnes is, effectively, a measure of the mass of a proton.  And, thus, it should be further noted that what Barnes, and his fellow seekers after fine tuning, cannot legitimately do is claim both fine tuning of the gravitational coupling constant (as defined by Barnes) and fine tuning of the mass of the proton, because they are the same thing.  In other words, it would be quite questionable to claim the masses of component elements of a proton (the up and down quarks, so mup and mdown) as a separate example of fine tuning.  But guess what!  This is precisely what Barnes does in the discussion section of his paper.

Naughty, naughty boy.

But leaving that aside, one might be interested in knowing just how finely tuned the gravitational coupling constant is.  Barnes does go into this: he provides some formulae together with some funky graphs and arrives at the conclusion that, to be life-permitting, the gravitational coupling constant must be less than or approximately equal to 10⁻³⁰, a value that Barnes characterises as “unnaturally small” (note that this might have originally been the opinion of Martin Rees).  The point Barnes is making with this description is that the value 10⁻³⁰ is much less than 1 – because the mass of the proton (and even more so the electron) is much less than the Planck mass.  This is true.  The Planck mass isn’t that small at all, weighing in at about 2.176×10⁻⁸ kg.  This is equivalent to the mass of five human ova, 20,000 normal human cells, 1/4 of a human eyelash, 1/10 of the dry weight of a fruit fly or approximately one flea egg.  Compared to a proton, that’s ... um ... massive.

So, Barnes’ argument sort of resolves down to a question as to why the Planck mass is so huge, especially given that the Planck length, Planck time and Planck charge are all so tiny (at about 1.616×10⁻³⁵ m, 5.391×10⁻⁴⁴ s and 1.875×10⁻¹⁸ C, respectively).
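For the record, all four of those Planck units fall straight out of ħ, G and c (plus ε₀ for the charge); none of them is an independent dial.  A sketch, with rounded SI constants assumed:

```python
import math

# SI values of the fundamental constants (assumed, rounded)
hbar = 1.0546e-34   # J*s, reduced Planck constant
G = 6.674e-11       # m^3 kg^-1 s^-2, gravitational constant
c = 2.9979e8        # m/s, speed of light
eps0 = 8.8542e-12   # F/m, vacuum permittivity

# The Planck units follow directly from these constants
l_planck = math.sqrt(hbar * G / c**3)                # ~1.616e-35 m
t_planck = l_planck / c                              # ~5.391e-44 s
m_planck = math.sqrt(hbar * c / G)                   # ~2.176e-8 kg
q_planck = math.sqrt(4 * math.pi * eps0 * hbar * c)  # ~1.875e-18 C

print(l_planck, t_planck, m_planck, q_planck)
```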

Well, the answer to that can be found by considering what the Planck mass actually is.  The Planck length and the Planck time can both be considered as the smallest meaningful divisions of their dimensions (which might not actually be the case, but we can’t make sense of smaller divisions – Wikipedia merely states that “According to the generalized uncertainty principle (a concept from speculative models of quantum gravity), the Planck length is, in principle, within a factor of 10, the shortest measurable length – and no theoretically known improvement in measurement instruments could change that”, but no citation is given to support this statement).

Note also that there is, in both the Planck length and the Planck time entries, a statement to the effect that “there is no reason to believe that exactly one Planck unit has any special physical significance”.  Again there is no citation to support this statement, and I do have my doubts with regard to it.  Particularly when there is no such statement in the Planck mass entry.

The thing is that Planck length and Planck time are linked by the speed of light, c=lplanck/tplanck (and they are basically the same thing anyway given that time and space are, to an extent, interchangeable according to general relativity - making them alternative measures of spacetime).  And Planck length and Planck mass are linked via black holes.  The Planck mass is the mass of a black hole for which the Schwarzschild radius is two Planck lengths.

rS = 2GM/c²

For M = mplanck = √(ħc/G):

rS = 2G·mplanck/c²

rS = 2G·√(ħc/G)/c²

rS = 2·√(ħG/c³) = 2·lplanck
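That identity is easy to verify numerically.  A minimal Python sketch, assuming rounded SI values for the constants:

```python
import math

# SI values (assumed, rounded)
hbar = 1.0546e-34  # J*s, reduced Planck constant
G = 6.674e-11      # m^3 kg^-1 s^-2, gravitational constant
c = 2.9979e8       # m/s, speed of light

m_planck = math.sqrt(hbar * c / G)     # Planck mass
l_planck = math.sqrt(hbar * G / c**3)  # Planck length

# Schwarzschild radius of a Planck-mass black hole
r_s = 2 * G * m_planck / c**2
print(r_s / l_planck)  # the ratio comes out as 2, as the algebra says
```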

The upshot of this is that if the Planck mass were lower, then black holes would be forming all over the place and the universe as we know it would not exist.  It makes sense for the Planck mass to be (relatively) large in comparison to the mass of a proton; it’s not “unnatural” at all.

But we still don’t know how “fine-tuned” the gravitational coupling constant is (unless you, dear reader, have scooted away and read Barnes’ article).  Remember, he arrived at the conclusion that αG ≲ 10⁻³⁰.  And what is the actual value?  According to Barnes … αG ≈ 5.9×10⁻³⁹.  This is a value that seems to have been taken from Martin Rees’ work, possibly via William Lane Craig since it’s quoted to only one significant figure, which is a bit odd for a figure quoted in a scientific paper.  Wikipedia has Rees in 2000 giving the figure as 5.906×10⁻³⁹, during some discussion on the value of N.  N is one of Rees’ six numbers and is given by “the strength of the electrical forces that hold atoms together, divided by the force of gravity between them” – or in other words, the fine structure constant [hereafter to be known as the James Bond Number for reasons which might shortly be made plain] divided by the gravitational coupling constant, N = α/αG = 0.007/5.906×10⁻³⁹ ≈ 1×10³⁶, which hopefully we can all agree is a Very Big Number, given that there are estimated to be only about 10²⁴ stars in the universe.  We will never know where Barnes got his αG number from; he just throws it out there in the discussion without discussing how it was arrived at.  Again, that’s a bit naughty.
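Rees’ N can be reproduced with a one-liner, assuming his rounded values:

```python
alpha = 0.007        # fine structure constant, as Rees rounds it
alpha_G = 5.906e-39  # gravitational coupling constant (Rees 2000)

# N: electrical force between atoms divided by gravitational force
N = alpha / alpha_G
print(f"N ~ {N:.1e}")  # ~1.2e+36
```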

Anyways … it’s a value that could be (about) a hundred million times larger without leading to a “diproton disaster” (10⁻³⁰/5.9×10⁻³⁹ ≈ 1.7×10⁸) and could (at least notionally) be as small as you like.  That could hardly be considered “fine-tuned” if you ask me.

And that’s only when you use Barnes’ definition of αG and Barnes’ figure for αG.  Otherwise, you have values of 1.7518×10⁻⁴⁵ or 3.217×10⁻⁴².  Clearly it doesn’t matter much what value this constant has …
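Those two alternative figures appear to correspond to the electron-squared convention and an electron–proton product convention respectively – that identification is my inference, but the arithmetic checks out.  A quick comparison of the three conventions, with rounded particle masses assumed:

```python
# Particle and Planck masses in kg (assumed, rounded values)
m_p = 1.6726e-27   # proton
m_e = 9.1094e-31   # electron
m_pl = 2.1764e-8   # Planck mass

conventions = {
    "(m_p/m_planck)^2 (Barnes)": (m_p / m_pl) ** 2,     # ~5.9e-39
    "(m_e/m_planck)^2":          (m_e / m_pl) ** 2,     # ~1.75e-45
    "m_e*m_p/m_planck^2":        m_e * m_p / m_pl**2,   # ~3.22e-42
}
for name, value in conventions.items():
    print(f"{name}: {value:.4e}")
```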

Oh, and who funded Barnes’ paper?  Yep.  Templeton.

---

When doing a little research for this article, I looked at some of Barnes’ older fine tuning posts, given that he himself linked to some of them in the post announcing his new paper.  One was particularly revealing.  It was an attack on PZ Myers, a person I have never warmed to although I can’t quite put my finger on precisely why.  The details aren’t terribly important, because what I found revealing was this comment right at the end:

Not content with merely demonstrating his ignorance, Myers proceeds to parade it as if it were a counterargument, allowing him to dismiss some of the finest physicists, astronomers, cosmologists and biologists of our time as “self-delusional”.

This isn’t controversial, I suppose, but under “physicists, astronomers, cosmologists and biologists” were links to specific examples: Paul Davies (Templeton winner), Martin Rees (Templeton winner), John Barrow (Templeton winner) and Simon Conway Morris.  The last name is not yet that of a Templeton winner, but he is “a Christian, … most popularly known for his theistic views of biological evolution”, so he is quite likely to be a future nominee if not winner.  It’s also interesting to note this from his Wikipedia entry:

He is now involved on a major project to investigate both the scientific ramifications of convergence and also to establish a website (www.mapoflife.org) that aims to provide an easily accessible introduction to the thousands of known examples of convergence. This work is funded by the John Templeton Foundation.


What were the chances?
