"I am a professional Climate Scientist whose papers and input have
featured in the IPCC"
"The IPCC is talking bollocks, based on sound science, proper statistics
and their own data"
"At worse we might see a 2°C rise in 100 years which would be entirely
Does that summarise it accurately?
Not quite, although it is considerably better than the denier tripe that
usually gets posted here. He does appear to be genuinely interested in
getting the right answer, which is something of a novelty in the
contrarian camp. There was a bit more on Watts Up With That in April,
and his paper has been accepted for publication in Journal of Climate.
As a hard line objective Bayesian I reckon he does have a technical
point that a uniform prior over a wide range is inappropriate here.
To summarise the Bayesian approach: to determine an unknown parameter x,
you assign an uninformative prior probability to its value before you
start, then do some horrid integrals and get back a best-estimate
probability distribution after applying the observational data
constraints you have.
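The "horrid integrals" can be sketched numerically. Here is a minimal grid-based illustration (mine, not the poster's; the data, noise level and flat prior are all assumed purely for illustration): posterior ∝ prior × likelihood, normalised only after the data constraints are applied.

```python
import numpy as np

# Hypothetical example: infer an unknown location parameter x from noisy
# observations under a flat (uninformative) prior. All numbers are made up.
xs = np.linspace(-10.0, 10.0, 2001)       # grid over the parameter
prior = np.ones_like(xs)                  # p(x) = 1: improper, but usable on a finite grid
data = np.array([1.8, 2.3, 1.9, 2.4])     # illustrative noisy measurements
sigma = 0.5                               # assumed known noise level

# Likelihood of the data at each candidate x (Gaussian noise model),
# accumulated in log space for numerical stability
log_like = sum(-0.5 * ((d - xs) / sigma) ** 2 for d in data)
post = prior * np.exp(log_like - log_like.max())
post /= post.sum() * (xs[1] - xs[0])      # normalise after applying the data

best = xs[np.argmax(post)]                # best estimate: the posterior mode
```

With a flat prior and Gaussian noise the posterior mode lands on the sample mean, which is the sanity check one would expect from an uninformative prior.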
The priors are almost always un-normalisable, which makes mathematicians
and classical statisticians unhappy, but it can make otherwise
intractable inference problems solvable. Scientists starting with Laplace
have used the approach for difficult problems of inference from
incomplete and noisy data. It can usually outperform classical analysis,
which is why it is used in the most demanding roles.
A few examples of priors (stated without derivation - see Jaynes' work):
Unknown position x:        p(x) = 1           (-inf ... inf)
Unknown scale factor x:    p(x) = 1/x         (0 ... inf)
Unknown binary value x:    p(x) = 1/(x(1-x))  (0 ... 1)
These are pure results derived from deep symmetry arguments and a
requirement for self consistency. They represent the initial state of
ignorance of x in a mathematical formula. You can cut the range down if
you have other information to incorporate before you do the experiment.
It is interesting to note here that the last of these is the a priori
probability distribution for any binary question,
e.g. "Is there a God?" or even "Is AGW really happening?".
It is symmetrical in x and very sharply peaked at 0 and 1 with a minimum
at x = 0.5. It cannot be integrated on this range without wild
divergence until you have data. It reflects the state of belief of a
population in the absence of evidence. They either believe it or they
deny it. You don't find many people hacking people to death in the high
street to defend their fundamentalist belief that p(God exists) = 0.5.
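As an illustration of how that prior behaves once data arrive (my sketch, not the poster's): the binary prior p(x) ∝ 1/(x(1-x)) is the improper Beta(0, 0) distribution, but after observing s successes and f failures (both at least 1) the posterior is the proper Beta(s, f), whose mean is exactly the observed frequency, with no pull from the prior.

```python
import math

def beta_pdf(x, a, b):
    # Density of Beta(a, b); proper only for a, b > 0.
    logc = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return math.exp(logc + (a - 1) * math.log(x) + (b - 1) * math.log(1 - x))

# Prior 1/(x(1-x)) = Beta(0, 0): its integral diverges at 0 and 1,
# matching the "wild divergence until you have data" above.
# After s successes and f failures (s, f >= 1) the posterior is Beta(s, f).
s, f = 7, 3                     # illustrative counts only
posterior_mean = s / (s + f)    # Beta(s, f) mean: exactly the observed frequency
```

The point of the example is that the improper prior does no harm once any mixed evidence exists: the data alone determine the answer.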
The sorts of problems it gets used on are the likes of "weigh Saturn"
(Laplace) or Wolf's dice experiment (Jaynes). The latter is analysed to
death, in a way that illustrates Bayesian analysis and maximum entropy,
in a famous paper by Ed Jaynes. Basically: given the fact that you have a
six-sided die and the actual occurrence of each face, work out what
defects the die has. Or, for extra marks, given only the expectation
value of many throws (should be 3.5 but is observed to be 4.5).
Q: what is the best estimate of p(1), p(2) ... p(6) ?
(there is a provably unique solution)
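The expectation-value version can be solved numerically. A sketch (mine, not Jaynes'), assuming the exponential-family form p(i) ∝ exp(λi) that the maximum-entropy principle guarantees for a mean constraint, with λ found by bisection:

```python
import math

# Maximum-entropy distribution over die faces 1..6 with a fixed mean:
# p(i) is proportional to exp(lam * i); choose lam so the mean is 4.5.
def mean_for(lam):
    w = [math.exp(lam * i) for i in range(1, 7)]
    z = sum(w)
    return sum(i * wi for i, wi in zip(range(1, 7), w)) / z

def solve_lambda(target, lo=-5.0, hi=5.0, tol=1e-10):
    # mean_for is monotone increasing in lam, so bisection converges
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam = solve_lambda(4.5)
w = [math.exp(lam * i) for i in range(1, 7)]
z = sum(w)
probs = [wi / z for wi in w]   # the provably unique maxent solution
```

Because the observed mean (4.5) exceeds a fair die's 3.5, λ comes out positive and the solution tilts probability towards the higher faces, as intuition demands.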
The paper is here for those curious (higher mathematics needed)
However, I would equally expect that had they used the correct prior he
or some other contrarian would then have accused them of not allowing
for the possibility that the "warming" effect of CO2 was in fact
negative. And also of using a "controversial" unnormalisable prior.
Perhaps I am being a little cynical here. I was inclined to lump him
with the usual deniers for hire until I looked him up. His paper *has*
been published and the above would carry more weight if he had included
a summary of the results he obtained by using the right prior.
I suspect the people who did the original analysis thought it would be
easier to defend a simple uniform prior from external attacks.
Interestingly this guy has done an objective Bayesian analysis of these
data with the correct prior and got a 90% confidence range for CO2
doubling of +1.0 to +3.0°C, with a most likely value of +1.6°C.
The corresponding values for the naive wide-prior analysis are +2.1 to
+8.9°C with a most likely value of +2.9°C, and for the expert prior
(not fully specified in the review I found) +1.9 to +4.7°C, most likely
value not stated.
So he claims that using his analysis the estimate for CO2 doubling is
about a degree lower than, and roughly half of, what other practitioners
estimate.
However, it is worth pointing out that the climate modellers are rather
conservative in their choices of parameters so that they can be more
easily defended from politically motivated attacks. This tends to bias
the simulations down towards lower values of CO2 sensitivity as was
demonstrated when Nature organised a wider ensemble of simulations.
One thing people seem to forget is that global warming is an average.
There is every possibility that the UK could end up with cold weather
more appropriate to its high latitude in a globally warmer world.
Not really. He does appear to have a point though.
An excellent, careful, detailed destruction of the methods used to
date. Keep this for future reference.
His destruction of the Subjective Bayesian statistical methods (used by
the 'Global Warmers'), contrasted with the Objective methods used by
'the rest of the scientific community' ("The alternative that Jewson
mentions, the Objective Bayesian approach, was until recently almost
unheard of in climate sensitivity studies"), is IMV particularly
damning. He neatly puts the boot into the discrepancy by saying "The
inquiries into the Climategate affair noted the lack of interaction
between climatologists and professional statisticians, which may explain
the ubiquity of such inappropriate methodology in this area", which is
about as dismissive as it's possible to get.
The GW models get a similar pasting:
"Apart from these known shortcomings in GCMs, it is fundamental to the
scientific method that when modelled values do not agree with
observations then the hypothesis embodied in the model is modified or
rejected. The refusal in AR5 to accept the implications of the best
observational evidence and of the over-estimation of warming by the
climate models and accordingly to either:
- reject the ensemble of GCM projections;
- use projections from a subset of GCMs with ECS and TCR values fairly
  close to the best observational estimates; or
- scale all GCM projections to reflect those estimates,
is unscientific."
I like graphs, and this is summed up nicely in Fig 1, where only two
sets of predictions meet the 'actual' temperature line, and that's at
the extreme lower bound of their ranges.
Finally, it should be noted that the predicted rise for the rest of
this century, after winding in the excesses of the IPCC reports,
reflects the sort of climate difference between Bristol or
Hereford and Exeter, that is, not very much.
TNP has put it all rather more succinctly.
Well, not entirely, but the problem is that the changes are slow and
muddied by the general noise in the figures. I think one of the main
issues is why, not if. The climate has always changed, or we would not
be able to see the ice ages and more temperate times in the geological
and fossil record.
The question is: are we affecting it enough that stopping what we do
will change it sufficiently to allow the expected changes to be
mitigated? I suggest it's still debatable.
We did however manage to sort out the Ozone problem as there has been a
demonstrable effect of stopping the CFC emissions already.
Humans are also crap at long-term planning and execution of those plans;
we are far better when our backs are against the wall on short-term
projects, so it's hard to see how we will be able to stop whatever
happens, at least for now. It will, I suspect, take some more
catastrophes before anyone gets their act together.
We live on a dynamic planet which wobbles in space so we get what we get.
Thanks, I shall bookmark all this to read and try to digest when I have
time over the hols!
Used to work with Frank Duckworth (of the Duckworth-Lewis method) and
never really understood all this Bayes stuff.
Arctic sea ice extent this September reached outside the upper
2-sigma range for 1981 - 2010, as did that for 2011.
"This modeled Antarctic sea ice decrease in the last three decades is
at odds with observations, which show a small yet statistically
significant increase in sea ice extent," says the study, led by
Colorado State University atmospheric scientist Elizabeth Barnes.
Er, no. Apparently not. Ozone holes have come and gone since then, ISTR.
Humans are very good at taking natural events and interpreting them in
an anthropocentric way.
Sin caused the black death too, you know.
The earth is still the centre of the universe, ruled by a vicious
god(dess?) who created it purely to educate men into using windmills not
1) That is completely irrelevant to the paper under discussion.
2) Despite its date, the article only mentions the status in September
2012, when the ice extent was more than two standard deviations from the
long term average. In simple terms, that means it was a very rare event.
A year later, the sea ice was at its greatest extent since records
began, 35 years ago. That was within two standard deviations from the
long term average.
CryoSat data also shows that the volume of ice varies far less than the
extent and that there has been a significant growth in multi-year ice.
Unfortunately, since then there has been a sharp recovery.
Sea ice was only a little under the long-term average this year.
Yes. Climate models don't handle the Antarctic well.
In fact they don't really handle anything well.
If all the money, time and effort poured into climate research has
taught us one thing, it's that the science ain't settled and it's not
that
Sorry, should have made it clear that 'this year' was 2013. It looks
like Arctic sea-ice recovery is under way, with good-quality
multi-year ice being formed.
The wheels are well and truly falling off the GW/AGW/CC/CCC/CACC (I
forget what it's called this week) wagon.
What a bunch of Dismal Jimmies the Beeb found as commentators:
"But scientists caution against reading too much into one year's
I thought it was more than one year, but moving swiftly on...
"Although the recovery of Arctic sea ice is certainly welcome news, it has to be considered against the backdrop of changes that have occurred over the last few decades," said Prof Andy Shepherd of University College London, UK.
Yes, that's why we can call it a 'recovery'. It's a change in the
And it's not good news for the modellers.
"It's estimated that there were around 20,000 cu km of Arctic sea ice
each October in the early 1980s, and so today's minimum still ranks
among the lowest of the past 30 years," he told BBC News, without
the Beeb asking what the lowest figure was.
Well, it would be, wouldn't it, if it's recovering from a minimum. If
it had recovered in one year to a level 50% more than the minimum, it
would rank as an event equal to the Second Coming.