Tuesday 19 September 2023

The Patrick T Brown Debacle - An Own Goal or Something More Sinister?

On 5 September 2023, the climate scientist Patrick T Brown published an article at “The Free Press” in which he implied that he had perpetrated a Sokal-style hoax on the journal Nature.  His explicit claim was that an “unspoken rule in writing a successful climate paper (is that the) authors should ignore—or at least downplay—practical actions that can counter the impact of climate change.”

 

Note the title of the article, “I Left Out the Full Truth to Get My Climate Change Paper Published”, and the sub-heading, “I just got published in Nature because I stuck to a narrative I knew the editors would like. That’s not the way science should work.”  Note also that the link includes the word “overhype”, suggesting that the editor had a different title in mind.  That is another claim in itself, although it doesn’t really appear in the text.

 

The paper he co-authored was Climate warming increases extreme daily wildfire growth risk in California.

 

This all raises some key questions.  Who is Patrick T Brown?  Where does he hail from?  And are his claims reasonable?

 

Patrick T Brown is, among other things, a co-director of the Climate and Energy Group at the Breakthrough Institute.  This institute, established by Michael Shellenberger and Ted Nordhaus, is focused on “ecomodernism”, which tends to favour using technology to solve problems – replacing fossil fuels with nuclear energy (not entirely bad), but resisting anything approaching efforts to minimise current reliance on fossil fuels.  To be cynical, they appear to be in the “we don’t need to worry about climate change because we can fix it with our technology” camp of climate deniers.

 

If it were true that there was a real effort by academia to squash efforts to address climate change, this would indeed be a problem.  We should do all the research: research into the impact of human activities on the climate (which we can more easily moderate), into the effects of climate change, and into ways of mitigating those effects.  However, different journals address different aspects of science.

 

What does Nature publish?  According to their website:

 

The criteria for publication of scientific papers (Articles) in Nature are that they:

  • report original scientific research (the main results and conclusions must not have been published or submitted elsewhere)
  • are of outstanding scientific importance
  • reach a conclusion of interest to an interdisciplinary readership.

 

Note that they don’t indicate that they publish articles on technological developments (which is where much of the detail on efforts to mitigate climate change would be expected to appear).  However, there is a journal in the Nature stable precisely for that, the open access journal npj Climate Action.  So, the question is, did Patrick T Brown do any original scientific research into other contributions to climate change?  He doesn’t say, so we don’t know.

 

Does Nature refuse to publish papers on natural contributions to climate change?  No – see, for example, “Contribution of natural decadal variability to global warming acceleration and hiatus” and “Indirect radiative forcing of climate change through ozone effects on the land-carbon sink”.  Admittedly these are old (about a decade), but there’s no indication that new original research into other factors has been rejected.  There are newer papers on the effect of the release of methane due to melting permafrost, such as this one from 2017: “Limited contribution of permafrost carbon to methane release from thawing peatlands”.

 

Did Nature give any indication that they didn’t want to publish a paper that talked about other drivers of climate change?  No – the opposite, in fact.  His co-author, Steven J Davis (reported at phys.org), said: “we don't know whether a different paper would have been rejected.  … Keeping the focus narrow is often important to making a project or scientific analysis tractable, which is what I thought we did. I wouldn't call that 'leaving out truth' unless it was intended to mislead—certainly not my goal.”

 

Nature provides visibility of the peer review comments, available here, and in those comments, there are references to other factors “that play a confounding role in wildfire growth” and the fact that “(t)he climate change scenario only includes temperature as input for the modified climate.”  Two of the reviewers rejected the paper, but neither of them did so on the basis that it mentioned other factors than anthropogenic climate change.

 

In the rebuttal to the reviewer comments, the authors wrote:

 

We agree that climatic variables other than temperature are important for projecting changes in wildfire risk. In addition to absolute atmospheric humidity, other important variables include changes in precipitation, wind patterns, vegetation, snowpack, ignitions, antecedent fire activity, etc. Not to mention factors like changes in human population distribution, fuel breaks, land use, ignition patterns, firefighting tactics, forest management strategies, and long-term buildup of fuels.

Accounting for changes in all of these variables and their potential interactions simultaneously is very difficult. This is precisely why we chose to use a methodology that addresses the much cleaner but more narrow question of what the influence of warming alone is on the risk of extreme daily wildfire growth.

 

We believe that studying the influence of warming in isolation is valuable because temperature is the variable in the wildfire behavior triangle (Fig 1A) that is by far the most directly related to increasing greenhouse gas concentrations and, thus, the most well-constrained in future projections. There is no consensus on even the expected direction of the change of many of the other relevant variables.

 

So the decision to make the study very narrow, in their (or his) own words, was made on the basis of ease and clarity, not to overcome publishing bias.  Perhaps Patrick T Brown was lying.  But there would be little point, since the paper’s authors write:

 

Our findings, however, must be interpreted narrowly as idealized calculations because temperature is only one of the dozens of important variables that influences wildfire behaviour.

 

So, that’s true.  Like much of science, it’s all about trying to eliminate confounding factors and working out what the effect of one factor is (or of a limited number of factors).  In this case, the authors have (with the assistance of machine learning) come to the staggering conclusion that if forests are warmer and drier, they burn more.  The main criticism that could be made is that Nature published a paper with such a mundane result.  However, the mechanism, using machine learning, is potentially interesting.  It could easily contribute to modelling – both in predicting the outcomes of various existing models and potentially by being redeployed to improve existing models (or to create new and better ones).

 

It’s a bizarre situation.  Why did Patrick T Brown, as a climate scientist, do this?  Maybe he has been prevented from publishing something in the past.  Perhaps his new institute (or group) has been prevented from publishing something.  That would be interesting to know.

 

Or is it something else?

 

Well, if you search hard enough, you can find that Patrick T Brown posted at Judith Curry’s blog back when he was a PhD student.  And if you look into Judith Curry, you will find that she is what Michael Mann labelled a “delayer” – “delayers claim to accept the science, but downplay the seriousness of the threat or the need to act”.

 

Is it merely coincidence that the Breakthrough Institute for whom Patrick T Brown works, and his fellow ecomodernists, are also the types who appear to accept the science, but downplay the seriousness of the threat of climate change and the need to act, or at least criticise all current efforts to act?

 

---

 

My own little theory is that Patrick T Brown was not so much involved in scoring an own goal in the climate science field, but that he was attempting deliberate sabotage.

Wednesday 13 September 2023

A Further Departure from MOND

Looking more closely at Milgrom’s Scholarpedia entry on MOND, I found something else that I didn’t like.  It was the method by which he arrives at an equation that I used in the previous post, A Minor Departure from MOND, namely (in the MOND regime) g = √(GMa0)/r.
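One consequence of that expression is worth noting: for a circular orbit, g = v²/r, and equating that with √(GMa0)/r makes r cancel, so the rotation curve flattens out at v = (GMa0)^(1/4) – the baryonic Tully-Fisher relation, one of MOND’s known successes.  A quick numerical sketch (the galaxy mass here is purely an illustrative assumption, of roughly Milky Way baryonic scale):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
a0 = 1.2e-10       # Milgrom's acceleration constant, m/s^2
M_sun = 1.989e30   # solar mass, kg

# Illustrative baryonic galaxy mass (an assumption, ~Milky Way scale)
M = 6e10 * M_sun

# v^2 / r = sqrt(G * M * a0) / r  =>  r cancels, v is independent of radius
v = (G * M * a0) ** 0.25

print(f"asymptotic rotation speed: {v / 1000:.0f} km/s")  # ~176 km/s
```

Note that v depends only on M and a0, which is why deep-MOND rotation curves are flat regardless of where the visible mass ends.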

 

I was walking the dogs actually, mulling over things, and realised that I couldn’t for the life of me remember how I arrived at that equation.  I must have seen it, got stuck in a mental alleyway and just automatically applied it.  Very embarrassing.

 

While it works, and seems to work better from one perspective with the different value of a0, it won’t wash if there’s no derivation.  And there’s no derivation.  This is the numerology that I was complaining about a few posts ago.

 

What Milgrom writes is: “The fact that only A0 and M can appear in the deep-MOND limit dictates, in itself, that in the spherically symmetric, asymptotic limit we must have g ∝ (MA0)^(1/2)/r, since this is the only expression with the dimensions of acceleration that can be formed from M, A0, and r.”  The term A0 had been introduced earlier in the text: “A0 is the “scale invariant” gravitational constant that replaces G in the deep-MOND limit. It might have been more appropriate to introduce this limit and A0 first, and then introduce a0≡A0/G as delineating the boundary between the G-controlled standard dynamics and the A0-controlled deep-MOND limit.”
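Milgrom’s dimensional argument can be checked mechanically.  Treating each quantity as a vector of exponents over (length, mass, time), and using A0 ≡ a0·G, the combination (MA0)^(1/2)/r does indeed come out with the dimensions of acceleration – a minimal sketch:

```python
from fractions import Fraction as F

# Dimensions as (length, mass, time) exponents
ACCEL = (1, 0, -2)                  # m s^-2
G_DIM = (3, -1, -2)                 # m^3 kg^-1 s^-2
# A0 = a0 * G, so exponents add: an acceleration times G
A0_DIM = tuple(a + g for a, g in zip(ACCEL, G_DIM))   # (4, -1, -4)
M_DIM = (0, 1, 0)
R_DIM = (1, 0, 0)

def combine(*terms):
    """Sum (dimension, power) pairs into one exponent vector."""
    total = [F(0), F(0), F(0)]
    for dim, power in terms:
        for i, e in enumerate(dim):
            total[i] += F(e) * F(power)
    return tuple(total)

# (M * A0)^(1/2) / r
result = combine((M_DIM, F(1, 2)), (A0_DIM, F(1, 2)), (R_DIM, -1))
print(result == ACCEL)  # True
```

The half-power of M is forced by the requirement that mass cancel, and the 1/r then fixes the remaining length exponent – which is the sense in which this is “the only expression with the dimensions of acceleration”.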

 

The problem I have is that, in Towards a physical interpretation of MOND's a0, I considered the critical density of our universe, which very specifically uses the gravitational constant (G), and then the gravitational acceleration at the surface of a Schwarzschild black hole with that same density, an equation which also very specifically uses G.  However, the resultant acceleration would be right on the border between “the G-controlled standard dynamics and the A0-controlled deep-MOND limit”, so there’s an issue right there.
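The numbers behind this can be sketched quickly.  A Schwarzschild black hole whose mean density equals the critical density ρ_c = 3H0²/(8πG) works out to have a horizon radius of exactly c/H0, and a Newtonian surface gravity GM/r² = cH0/2, which lands within a few times a0 (the H0 value below is an assumption, roughly 70 km/s/Mpc):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
H0 = 2.27e-18        # Hubble constant, 1/s (~70 km/s/Mpc, an assumption)
a0 = 1.2e-10         # Milgrom's constant, m/s^2

rho_c = 3 * H0**2 / (8 * math.pi * G)       # critical density, kg/m^3

# Schwarzschild BH with mean density rho_c:
# rho = M / ((4/3) pi r^3) with r = 2GM/c^2  =>  r = c / H0
r = c * math.sqrt(3 / (8 * math.pi * G * rho_c))
M = r * c**2 / (2 * G)
g_surface = G * M / r**2                    # Newtonian surface gravity

print(f"r = {r:.3e} m (Hubble radius c/H0 = {c / H0:.3e} m)")
print(f"g_surface = {g_surface:.2e} m/s^2 vs a0 = {a0:.1e} m/s^2")
```

With these inputs g_surface comes out around 3.4e-10 m/s², the same order of magnitude as a0, which is why the result sits so awkwardly on the boundary between the two regimes.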

 

There’s also an issue with the fact that forces are vector quantities – in the case of gravity, directed towards the centre of mass (although that direction is the net result of summing the sub-forces created by every element of the mass).

 

When considering the surface of a black hole, the gravitational force is towards the centre of the mass of the black hole.  Now, in earlier posts, I have indicated that the density of the universe is the same as the density of a Schwarzschild black hole with a radius equivalent to the age of the universe times the speed of light.  What I have never said, at any point, is that the universe is inside a black hole.

 

My position has been more that the universe *is* a black hole, which may seem rather esoteric, but the point is that I don’t consider there to be an outside in which there would be a black hole inside of which our universe would sit.  To the extent that there is a universe in which our universe is nestled, that “outer” universe is on the other side of the Big Bang.  So it’s not so much a “where” question, but rather a “when” question.

 

But even then, it’s not correct to say that the “outer” universe is in our past, because time in that universe was/is orthogonal to our time, and in the same way the spatial dimensions of the “outer” universe were/are orthogonal to our spatial dimensions.

 

(I know this is difficult to grasp initially, but this video may go some way to explaining a version of the concept.)

 

This introduces another issue.  If we could, in any way, consider our universe to be a black hole in an “outer” universe, then our universe would be smeared across the surface of that black hole and any gravitational force due to the total mass of that black hole would be orthogonal to our spatial dimensions.

 

So, while it’s tempting to consider a value of a0 that is linked to the mass of a black hole with the dimensions of a FUGE universe, it doesn’t seem supportable.

 

I had tried a method, considering the curvature of the “fabric of spacetime”, but I suspect that it introduces more problems than it solves.



An image like this illustrates curvature of two dimensions, but it represents curvature of three dimensions.  We could eliminate another dimension, to get something like this:

 

 

In this image, the notional gravitation that a0 would represent would be a vector field throughout with a downwards trajectory.  Without a mass deforming spacetime, that vector field would be orthogonal to it, but with any deformation, there would be a component that is not orthogonal.

 

It made sense at the time, since it does tie the effect of a gravitational force that should be uniform throughout the universe to a mass that is deforming spacetime, but I don’t have any confidence that it works, since the upshot would be additional deformation, which could have a potential runaway effect.

 

Someone else might have an idea as to how this could work, even if it seems to me to be a dead-end.