Calculating your carbon footprint - a load of bollocks

You're always at liberty to publish data that has been smoothed, and compare that with the same data that has been statistically analysed, to demonstrate how much better smoothing is for determining the underlying trends.

Reply to
Terry Fields

Yes, thanks. I make sure I see the doc first thing in the morning when my readings are usually good. That way "we" don't have a problem

Reply to
stuart noble

The message from Terry Fields contains these words:

Snip all detail

Terry, I don't have Excel so don't have any familiarity with the underlying mechanics of Excel's smoothing filters (which is in essence what that regression is). The Met Office filter, at least when it has the full range of numbers, has the advantage that it weights the closest data higher than the more distant data while introducing no variation in the overall total. Where it seems to go wrong is the fiddle used to extend the smoothed curve up to the end of the data. I could be wrong, but it doesn't appear that Excel does anything to correct any possible bias introduced by a short-term effect at the end of the sequence, though almost certainly there is some form of weighting in its polynomial regression. It wouldn't work with non-linear data otherwise.
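For what it's worth, a weighted filter of the kind described can be sketched in a few lines. This is a minimal illustration only, not the Met Office's actual filter (whose exact weights I don't know); I've assumed simple binomial weights, and left the endpoints as NaN rather than fiddling them:

```python
import numpy as np

def binomial_smooth(x, order=4):
    """Smooth a series with symmetric binomial weights that sum to 1,
    so the closest points count most and the overall total is
    preserved. Endpoints where the full window doesn't fit are left
    as NaN rather than fiddled."""
    w = np.array([1.0])
    for _ in range(order):
        w = np.convolve(w, [0.5, 0.5])   # order=4 -> [1,4,6,4,1]/16
    half = order // 2
    out = np.full(len(x), np.nan)
    out[half:len(x) - half] = np.convolve(x, w, mode="valid")
    return out

series = np.array([1.0, 2.0, 4.0, 3.0, 5.0, 4.0, 6.0])
smoothed = binomial_smooth(series)
```

The higher-order the weights, the more the centre point dominates its neighbours.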

You can place too much reliance on manipulated data. For instance your straight line, the 'linear regression', is almost certainly an artifact.

I can't be sure but I suspect that your moving average is the simple one with no weighting.

As you have Excel it would be relatively simple for you to take the Met Office figures and see how polynomial regression differs from the MO filter both during the period where the filter can work on all 21 years and in the final 10 where the end bias gets stronger and stronger.
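Were the figures to hand, the comparison could be run in a few lines. The sketch below is illustrative only, with made-up numbers standing in for the Met Office data:

```python
import numpy as np

# Synthetic stand-in for the Met Office annual figures (illustrative
# only -- the real data isn't to hand).
years = np.arange(1988, 2009)
rng = np.random.default_rng(0)
anom = 0.02 * (years - 1988) + rng.normal(0.0, 0.1, years.size)

# Polynomial regression of the sort Excel's trendline does: a global
# least-squares fit with every point weighted equally.
coeffs = np.polyfit(years, anom, deg=2)
trend = np.polyval(coeffs, years)

# The fitted curve runs right up to the final year, but its shape
# there leans heavily on the last few points -- which is where any
# end-of-series bias would show.
```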

Reply to
Roger

I don't like Excel much, but have to use it for the sake of compatibility - I work from home on contract, and the stuff I produce has to be fully compliant with other systems. A spin-off of this is that I don't know how the smoothing works; as you say, it's probably a simple moving average. I only included that graph for interest's sake.

I'd like to try playing with the MetO data, but two things aren't working in my favour: a brief search of the MetO site didn't turn up the figures - have you come across them, by any chance? - and I'm supposed to be putting together a presentation for a formal meeting next week, but more interesting things keep turning up :-/

Reply to
Terry Fields

You miss the point. Smoothing is what statistics do. It's just another compression algorithm.

You are comparing apples and apples and calling them oranges.

The innate characteristic of any compression algorithm is that it implies an underlying pattern in the data, and seeks to find that pattern and suppress the noise.

What you are essentially doing when analysing, smoothing, or otherwise processing a time series is applying a low-pass filter to it: what the filter has as output is critically dependent on the form of the filter. You are merely comparing two different forms of filter, and if they give different results, that merely calls into question the validity of using ANY of them.
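To make the filter view concrete, here is a short sketch (my own, not anything from the thread's data) showing that a plain moving average really is a low-pass filter: its gain is 1 at DC and falls away with frequency.

```python
import numpy as np

# A plain N-point moving average is a FIR low-pass filter.
N = 5
w = np.ones(N) / N                       # equal weights, sum to 1

freqs = np.array([0.0, 0.1, 0.25, 0.5])  # 0 = DC, 0.5 = Nyquist
# Gain = magnitude of the DFT of the weights at each frequency.
gain = np.array([abs(np.sum(w * np.exp(-2j * np.pi * f * np.arange(N))))
                 for f in freqs])

# DC (the long-term trend) passes unchanged; the fastest wiggles are
# heavily attenuated -- which is exactly what "smoothing" means.
```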

Reply to
The Natural Philosopher

The message from Terry Fields contains these words:

Not as yet. As it is fine atm I am spending very little time in the house. If the worst comes to the worst I could always extract the figures from the graph but they would be rather imprecise.

Reply to
Roger

I can think of a way that might lend some precision, but it'll be tedious: bring up the graph in say Paint Shop Pro, hover the cursor over each point in turn, and use the pixel-coordinate readout to give each point a value in X,Y format. Calibrate by doing the same for each axis. Might get the points to a precision of 1 percent - which could be enough for our purposes.
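The calibration arithmetic amounts to a linear mapping per axis. A sketch, with all the pixel positions below entirely made up for illustration:

```python
# Map pixel coordinates read off a graph to data values, given two
# known calibration points on each axis.
def make_axis(pix_a, val_a, pix_b, val_b):
    """Linear pixel -> value mapping from two calibration points."""
    scale = (val_b - val_a) / (pix_b - pix_a)
    return lambda pix: val_a + (pix - pix_a) * scale

# Hypothetical calibration: x-axis pixels 40..680 span 1988..2008,
# y-axis pixels 400..60 span -0.2..0.6 (pixel y grows downward, so
# the mapping comes out with a negative scale, which is fine).
to_year = make_axis(40, 1988, 680, 2008)
to_anom = make_axis(400, -0.2, 60, 0.6)

year, anom = to_year(360), to_anom(230)
```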

Reply to
Terry Fields

I know it's been a while since I did any statistics in anger, but from my first lecture in college in 1963 up to what's laughingly called retirement, I've never heard of statistics being called an LPF.

Perhaps the picture in my mind, of an LPF rejecting HF components of something, isn't what I see as being done by statistics, which is to discard nothing in order to return the best estimates of the provenance of the data.

How would a comparison of say, satellite data analysed by two different organisations, with ditto from ground-based data, by putting them through an analysis of variance, be classed as low-pass filtering?

Apologies if I appear dim, but I'm struggling to visualise the LPF concept.

Reply to
Terry Fields

I hope it isn't time for round two already but The Independent front page today has the latest twist on the GW saga - methane emissions in the Arctic.

Reply to
Roger

Note that I did say with respect to a *time series*.

Indeed you probably would not, but let me assure you that the digital implementation of a low-pass filter is a statistically weighted moving average of a sort.

Just like e.g. a 'moving average'.

A resistor and a capacitor do a beautiful moving average as it happens ;-) More resistors and capacitors change its characteristics.

Anyway, as you know my background is in analog and digital electronics, and that's a unique viewpoint that allows me to see the things that you regard - possibly because the methodology you use appears so different - as different things: they are not. They are data compressors. Of which a low pass filter - or indeed any filter - is a classic example, but done 'analogue'.

An average is a complete and most basic reduction of a data set to ONE data point.

Low pass filters discard nothing: it's simply that the shorter the duration of a section of the raw graph is, the less it affects the final output.

Sounds a perfect description to me. A differential amplifier and low pass filter..

That, I am afraid, is your problem.

Broadly speaking, any device which allows the time variance of a signal to affect the amplitude of the output as well as the amplitude of that signal does, is some kind of filter.

The moment you derive an output at a given time, not just from the input data at that time, but from other times as well, you are applying some sort of frequency style filter in practice, even if you think you are doing statistics.
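The RC filter mentioned above has a one-line digital equivalent: a single-pole IIR filter, i.e. an exponentially weighted moving average. A sketch, where alpha stands in for dt/(RC + dt):

```python
import numpy as np

def rc_filter(x, alpha=0.3):
    """Digital equivalent of an RC low-pass: a one-pole IIR filter,
    i.e. an exponentially weighted moving average."""
    y = np.empty_like(x, dtype=float)
    y[0] = x[0]
    for i in range(1, len(x)):
        # Each output mixes the new input with the decaying memory of
        # everything before it -- nothing is discarded, older samples
        # just count exponentially less.
        y[i] = alpha * x[i] + (1 - alpha) * y[i - 1]
    return y

step = np.array([0.0, 1, 1, 1, 1, 1, 1, 1])
resp = rc_filter(step)
```

The step response creeps up towards 1 without ever quite getting there in finite time - exactly what a capacitor charging through a resistor does.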

Reply to
The Natural Philosopher

Yup. Not sure what the breakdown mechanism of methane is - guess it sort of oxidises in due course.

It's not particularly soluble, is it?

Perhaps it's the answer to 'renewable' aircraft fuel. Scoop it up and burn it ;-)

Or catalyse it with CO2 to make ethanol..

Reply to
The Natural Philosopher

If they're right, the release is just in time to counter the Jan 07 - Jan 08 cooling ;-)

Reply to
Terry Fields

formatting link
if you want to play.

Reply to
dennis

I would like to see the evidence that the seabed has warmed at all. It sounds like another scare story just to ask for more funding. They need money to do endless surveys and change the fudge factors in their models.

Reply to
dennis

The message from The Natural Philosopher contains these words:

IIRC it is not particularly long-lived in the atmosphere - a few years at the most.

Not AFAIK but I have no detailed knowledge of the subject.

Either case would be extraordinarily difficult given the widespread nature of the release.

Reply to
Roger

The message from Terry Fields contains these words:

I don't have PSP. I do, however, have an old CAD package (TurboCAD v6.5), but in the past I have had problems importing objects into that. Alternatively I could pretend it was a scanned map and take each year's co-ordinates, but given the small size of the original graph I wouldn't be at all hopeful of 99% accuracy except by sheer chance.

Reply to
Roger

The message from Terry Fields contains these words:

formatting link
you can find out what it all means. :-)

Reply to
Roger

I've worked out that 1.0000000E+30 means 'data missing'....

....but not a lot else :-(
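If the figures do turn up, the sentinel is easy to deal with in code - a sketch, assuming any value above 1e29 marks a missing entry:

```python
import numpy as np

# Hypothetical column of figures using 1.0000000E+30 as the
# "data missing" sentinel: replace it with NaN so it can't
# contaminate any averaging.
raw = np.array([14.2, 14.5, 1.0000000e30, 14.8, 1.0000000e30])
data = np.where(raw > 1e29, np.nan, raw)

mean = np.nanmean(data)   # ignores the missing entries
```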

The best hope might be to find the actual data needed on a blog, but that seems to be just as obscure....

Reply to
Terry Fields

The message from Terry Fields contains these words:

I am some way into extracting approximate figures from the graph. Do you have a valid email address I can send it to?

Reply to
Roger

It might be easier to ftp the figures as a txt file to your webspace; that way they'd be available for others too. But even as a table of figures they wouldn't take much bandwidth, so you could post them on here - or on a test or misc group if people were unhappy with that.

Reply to
Terry Fields
