Posted 20 November 2010

"There is a widespread public perception that increasing emissions of greenhouse gases into the atmosphere by humans is harmful to the climate and that it causes a general warming of the earth’s surface, There is also a widespread public perception that these beliefs have been proven to be true by scientific evidence. It is the purpose of this paper to examine whether there is scientific evidence for the supposed harmful influence of human-emitted greenhouse gases."  Dr Vincent Gray, New Zealand, finds no such evidence.

**SCIENTIFIC METHOD AND THE “GREENHOUSE” THEORY**

by Dr Vincent Gray

75 Silverstream Road, Crofton Downs. Wellington 6035, New Zealand

Email [email protected]

ABSTRACT

The theory that increased human-related emissions of so-called “greenhouse” gases have a harmful effect on the climate has not been confirmed by the procedures of the scientific method. The theory cannot be confirmed by experiment on the climate, climate observations cannot be repeated, and the assumptions cannot be falsified by validation of the models.

This situation is accepted by the Intergovernmental Panel on Climate Change (IPCC), which insists that its models provide only “projections”, not “forecasts” or “predictions”, and that its models are “evaluated” only by “experts” and “attributed”, not “validated”.

The temperature measurements used for producing the Mean Global Surface Temperature Anomaly Record (MGSTAR) are shown to be so inaccurate that “trends” in the record are meaningless. Attempts to show a correlation between this record and model outcomes fail to include the most important influences on temperature and are therefore unreliable.

Despite the lack of scientific rigour, public perception continues to support the “greenhouse” theory of “global warming”.

Absence of scientific rigour also exists in other scientific disciplines, but its acceptance there does not have such important consequences as the current policies that depend on the inadequate advice of climate science.

**PUBLIC PERCEPTION**

There is a widespread public perception that increasing emissions of greenhouse gases into the atmosphere by humans is harmful to the climate and that it causes a general warming of the earth’s surface. There is also a widespread public perception that these beliefs have been proven to be true by scientific evidence. It is the purpose of this paper to examine whether there is scientific evidence for the supposed harmful influence of human-emitted greenhouse gases.

The study must begin by examining what is meant or implied by “scientific evidence”. The term “science” can be applied to almost any empirical knowledge, usually involving some form of reasoning. It can apply to the knowledge shown by primitive humans, and even to aspects of the behaviour of other organisms. Scientific evidence needs to go beyond mere description and comprise the methodical procedures of the scientific method in order to justify the economically expensive measures which are currently being carried out or proposed for controlling greenhouse gas emissions.

**THE SCIENTIFIC METHOD**

According to Wikipedia1, the Scientific Method

“consists of the collection of data through observation and experimentation, and the formulation and testing of hypotheses”

It is also generally considered that scientific observations and experiments must be capable of being repeated by independent observers.

The philosopher Karl Popper2 considered that:

“In so far as a scientific statement speaks about reality, it must be falsifiable; and in so far as it is not falsifiable, it does not speak about reality”.

Falsifiability is tested by a process called “validation”. This includes the ability to simulate existing and past behaviour, but it must also be shown that a scientific hypothesis can forecast future behaviour to a satisfactory level of accuracy. Without this step it is impossible to derive an estimate of the accuracy of any result.
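
As a purely illustrative sketch of what such a validation step could look like in practice (not a procedure taken from any of the sources cited here, and with all numbers invented for the example), a “projection” made in advance can be scored against the observations that only become available later:

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between a forecast and the later observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

# Hypothetical, invented figures: a projection issued in year 0 for years 1-5,
# and the values actually observed afterwards (degrees C anomaly).
projection   = [0.10, 0.20, 0.30, 0.40, 0.50]
observations = [0.05, 0.12, 0.08, 0.22, 0.15]

print(f"out-of-sample error: {rmse(projection, observations):.2f} C")
# A validation protocol would have to state in advance what error counts as
# "satisfactory"; without that threshold no estimate of accuracy follows.
```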

Current climate science, as summarized by the Intergovernmental Panel on Climate Change (IPCC)3, fails to implement the procedures of the scientific method for the following reasons:

•    “Experimentation” is impossible. “Experiments” with computers are not experiments on the climate.

•    Climate observations cannot be repeated by independent observers. Yesterday’s measurement of temperature cannot be independently checked.

•    None of the statements by the IPCC3 on the “likely” behaviour of future climate can be falsified at present, because they all refer to the year 2100.

•    The opinions of “experts” cannot replace proper scientific investigation.

Edward Lorenz4 claimed that any information about the climate is subject to “chaos”. He stated:

“in view of the inevitable inaccuracy and incompleteness of weather observations, precise, very long range forecasting would seem to be non-existent”

His arguments arise from the fact that a scientific understanding of the behaviour of fluids involves the use of nonlinear equations whose solution depends on exact knowledge of present conditions.
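
Lorenz’s point can be illustrated with the three nonlinear equations from his 1963 paper4. In the sketch below (a crude fixed-step integration, written only for illustration), two runs started from initial states differing by one part in a million drift apart until the difference is as large as the variables themselves:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) system."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

# Two trajectories whose initial conditions differ by one part in a million.
a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)

for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 500 == 0:
        separation = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:4.0f}  separation = {separation:.6f}")
# The separation grows roughly exponentially, so a tiny error in the
# initial state eventually dominates the computed "forecast".
```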

The IPCC5 has tried to claim immunity from “chaos” in the following statement:

“A common confusion between weather and climate arises when scientists are asked how they can predict climate 50 years from now when they cannot predict the weather a few weeks from now. The chaotic nature of weather makes it unpredictable beyond a few days. Projecting changes in climate (i.e., long-term average weather) due to changes in atmospheric composition or other factors is a very different and much more manageable issue”.

Rind6, however, disagrees:

"The climate that we experience results both from ordered forcing and chaotic behaviour, the result of a system with characteristics of each. In forecasting prospective climate changes for the next century, the focus has been on the ordered system's responses to anthropogenic forcing. The chaotic component may be much harder to predict, but at this point it is not known how important it will be"

The IPCC appreciated from the beginning the necessity of attempting to falsify results by a process of validation. Its first Report (1990)7 has a Chapter 4, “Validation of Climate Models”.

A similar chapter appeared in the First Draft of the next (1995) Report and, as an “Expert Reviewer” at the time, I submitted the comment that, since no climate model has ever been validated, the term was inappropriate. Somewhat to my surprise, they agreed with me. In the Second Draft, not only had the title of the chapter been changed to “Evaluation of Climate Models”, but the words “validation” and “validated” had been altered to “evaluation” and “evaluated” no less than fifty times in the text. In addition, all references to “forecasting” and “prediction” had been removed, and all model results are now “projections” whose value depends on the extent to which their assumptions can be believed.

These practices are now standard throughout all the IPCC Reports. In other words, the IPCC admits that climate science cannot meet the requirements usually regarded as essential for the scientific method.

Attempts at correlation between climate properties and model outputs are made, despite an admission that this procedure does not prove cause and effect. It is called “attribution”, and techniques such as intercomparison of models and Bayesian statistics are used to improve the “reliability” of the models. They actually ensure that all models make uniform mistakes.

The recent Report by Sir Muir Russell8, which enquired into the disclosure of emails from climate scientists at the Climatic Research Unit (CRU) at the University of East Anglia, confirmed current IPCC practice as follows:

"8. Modern digital technologies permit the acquisition and manipulation of very much larger datasets than formerly. To enable proper validation of the conclusions, such datasets must be made freely available, along with details of the associated computational manipulation Its purpose is to produce a 'best estimate‘ of what is currently understood, through the work of a group of scientists chosen for their expertise and experience to make reasoned assessments on the balance of evidence.”

This statement confirms that a “best estimate” obtained from “experts” has replaced scientific investigation. It does not mention that a “proper validation” of climate models is never made.

They did not seem to appreciate that “datasets” of the original surface temperature measurements are not available. The “datasets” that are available are the result of multiple manipulations of the original observations, but the processes and the inaccuracies of these manipulations are not revealed.

The members of the panel took the trouble to confirm that these publicly available processed figures do give a version of the published Mean Global Surface Temperature Anomaly Record (MGSTAR) when manipulated by approved techniques.

Other disciplines which are considered to be “sciences” may also adopt procedures which depart from the scientific method. Geology, Biology, Anthropology, Psychology, Sociology, and Economics may have to deal with observations or experiments that are not repeatable and theories that are not falsifiable, and they may sometimes also use correlation and the opinions of “experts” instead of proper scientific evidence, but it is rare that the economic consequences are as severe as those of inadequate climate advice.

It is a long-established logical principle that a correlation, however convincing, cannot prove cause and effect. My old statistics textbook9 gives two examples that show this is so: a high correlation between teachers’ salaries and the consumption of alcohol, and a high correlation between the birth rate and the number of storks in Holland, neither of which is proof of a causal relationship. Yet 64% of Americans appear to believe that correlation does prove causation10, and we are constantly being told of “links” which can be made between one thing and another.
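
The statistical point is easy to reproduce. In the sketch below, two series invented purely for illustration each drift upward over the same period for unrelated reasons; their correlation coefficient comes out high even though neither has anything to do with the other:

```python
import random
random.seed(1)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

years = range(30)
# Two unrelated quantities that each trend upward with some noise,
# in the spirit of the salaries-and-alcohol textbook example.
series_a = [50.0 + 1.0 * t + random.gauss(0, 2) for t in years]
series_b = [10.0 + 0.3 * t + random.gauss(0, 1) for t in years]

print(f"r = {pearson(series_a, series_b):.2f}")
# The high r reflects nothing more than a shared trend, not a causal link.
```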

**TRUTH AND PROBABILITY**

The Scottish philosopher David Hume11 wrote:

“all knowledge degenerates into probability; and this probability is greater or less, according to our experience of the veracity or deceitfulness of our understanding, and according to the simplicity or intricacy of the question”.

Until the development of mathematical techniques for the estimation of probability, scientific measurements were only available in the form of “ranges” of values from different experiments or observers. There was no technique for preferring any one figure. Studies in the 1930s by Fisher, Yates and others supplied an array of mathematical techniques for obtaining a most probable estimate from a number of observations, together with a means of estimating the probable reliability of this figure. There are also techniques for estimating the reliability of (mainly linear) “trends”.
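
The two calculations referred to here can be sketched in a few lines of standard least-squares arithmetic: the most probable estimate of a repeatedly measured quantity (the mean, with its standard error) and the reliability of a linear trend (the slope, with its standard error). The numbers below are invented for illustration only:

```python
import math

def mean_and_standard_error(values):
    """Most probable estimate from repeated measurements, with its standard error."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / (n - 1)
    return mean, math.sqrt(variance / n)

def trend_and_standard_error(t, y):
    """Least-squares slope of y against t, with the standard error of the slope."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    sxx = sum((ti - mt) ** 2 for ti in t)
    slope = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y)) / sxx
    intercept = my - slope * mt
    residual_var = sum((yi - (intercept + slope * ti)) ** 2
                       for ti, yi in zip(t, y)) / (n - 2)
    return slope, math.sqrt(residual_var / sxx)

# Invented repeated measurements of the same quantity by different observers.
observations = [14.2, 14.5, 13.9, 14.8, 14.1, 14.4]
m, se = mean_and_standard_error(observations)
print(f"estimate = {m:.2f} +/- {se:.2f}")

# Invented yearly values: is the apparent drift larger than its own uncertainty?
years = list(range(10))
values = [0.10, 0.00, 0.20, 0.15, 0.10, 0.25, 0.20, 0.30, 0.25, 0.35]
b, sb = trend_and_standard_error(years, values)
print(f"trend = {b:.3f} +/- {sb:.3f} per year")
```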

These techniques are now widespread and even familiar. They are supplied on every “scientific” calculator and computer spreadsheet. The results are quoted as part of opinion polls and medical experiments for assessing the value of drugs. It is unfortunate that there is widespread ignorance, even amongst scientists, as to the degree to which quoted results comply with the assumptions of the mathematical equations used.

The commonest mathematical technique employs the Gaussian distribution, or “bell curve”, as a model for a probability distribution. The main reasons for this are that it is mathematically fairly simple and that it does approximately fit many real sets of measurements. Its estimates are, however, unreliable unless the results comply with the following assumptions:

•    Samples must closely resemble a Gaussian curve

•    Samples must be random and unbiased

•    Samples must be completely uniform in all circumstances

•    The distribution curve of results must be symmetrical

Many observations of the climate do not conform to one or more of these assumptions. For example, although it is often true that measurements close to the average roughly fit the Gaussian curve, the fit is usually not so successful for low-probability, outlier measurements. It is therefore wrong to base predictions of the behaviour of outliers on statistical estimates obtained from the most probable measurements. Thus we are always finding that “100 year” floods, droughts or hurricanes occur much more frequently than “expected”.
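
The point about outliers can be illustrated by fitting a Gaussian to a sample that is actually skewed and asking how often a supposedly 1-in-100 value really turns up. The lognormal sample below is only an arbitrary stand-in for a skewed climate quantity such as daily rainfall; nothing here comes from the records discussed in the text:

```python
import math
import random
random.seed(2)

# A skewed, heavy-tailed sample standing in for a quantity such as daily rainfall.
sample = [random.lognormvariate(0.0, 1.0) for _ in range(100000)]

n = len(sample)
mean = sum(sample) / n
std = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))

# The threshold a Gaussian fit would call a 1-in-100 event
# (the 99th percentile, roughly mean + 2.33 standard deviations).
gaussian_threshold = mean + 2.326 * std

exceedances = sum(x > gaussian_threshold for x in sample)
print(f"Gaussian fit expects about {n // 100} exceedances; the sample contains {exceedances}")
# For a skewed distribution the Gaussian fit misjudges the frequency of
# extreme values, which is exactly where "return periods" for floods and
# droughts are calculated.
```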

**TEMPERATURE MEASUREMENT**

The most important climate property for establishing the “greenhouse” theory is temperature.

It is the public perception that the globe is warming. Yet there is no technique currently available to us to discover whether this is true to a known level of accuracy. It is just not possible to place temperature measuring equipment in a random and representative fashion over the entire surface of the earth. Even measuring the surface temperature in one single place cannot be done in a satisfactory manner. This point is eloquently made by Hansen12 and elaborated by Pielke et al.13

Yet it was Hansen himself14 who was responsible for the suggestion that “temperature anomalies” could be established by making use of temperature measurements at weather stations. He proposed a system of dividing the globe into latitude/longitude boxes and averaging the temperature measurements from approved stations within each box; by comparing the “anomaly” figures for each year it would be possible to establish a temperature “trend” for the entire earth’s surface.
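
In outline, the box-averaging idea can be written down in a few lines. The sketch below is not the code actually used by any of the groups discussed here, only a schematic of the scheme as described: station temperatures are averaged within each latitude/longitude box, each box average is turned into an anomaly against that box’s own reference-period mean, and the box anomalies are then combined into a single yearly figure. The station values and reference means are invented:

```python
from collections import defaultdict

def box_of(lat, lon, size=5.0):
    """Assign a station to a latitude/longitude box of the given size in degrees."""
    return (int(lat // size), int(lon // size))

def yearly_anomalies(records, reference_mean_by_box):
    """records: list of (lat, lon, year, temperature). Returns {year: global anomaly}."""
    by_box_year = defaultdict(list)
    for lat, lon, year, temp in records:
        by_box_year[(box_of(lat, lon), year)].append(temp)

    anomalies_by_year = defaultdict(list)
    for (box, year), temps in by_box_year.items():
        box_mean = sum(temps) / len(temps)
        anomalies_by_year[year].append(box_mean - reference_mean_by_box[box])

    # Simple unweighted average of box anomalies; a real scheme would weight
    # by box area and somehow handle boxes with no stations at all.
    return {year: sum(a) / len(a) for year, a in sorted(anomalies_by_year.items())}

# Invented example: two stations sharing one box, a third station in another.
records = [
    (51.2, 0.4, 2009, 10.1), (51.8, 1.1, 2009, 10.5), (-36.9, 174.8, 2009, 15.2),
    (51.2, 0.4, 2010, 10.4), (51.8, 1.1, 2010, 10.9), (-36.9, 174.8, 2010, 15.0),
]
reference = {box_of(51.2, 0.4): 10.0, box_of(-36.9, 174.8): 15.1}
print(yearly_anomalies(records, reference))
```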

There are many objections to this procedure. The greatest is that the original observations, which would have consisted of daily records carried out by many people in many places, appear to be lost. At least they are not publicly available. The Mean Global Surface Temperature Anomaly Record (MGSTAR) cannot be checked by using the original observations.

Locations of weather stations are grossly unrepresentative of the earth’s surface. They do not include the 71% that is ocean. Inclusion of sea surface temperatures has been made by the CRU15, but these measurements are even less accurate than the land surface measurements, and Hansen at the Goddard Institute for Space Studies (GISS) and the other US system, the Global Historical Climatology Network (GHCN), have never accepted them.

Few people seem to understand how limited the actual temperature measurements made at weather stations are, and under what conditions they are made. The equipment has tended to consist of a Stevenson screen, situated 2 metres or so above the ground, containing liquid-in-glass thermometers read only once a day. Some of the early readings are single figures, but most were of the maximum and the minimum figure. If read in the morning, the maximum would be for the previous calendar day.

Until recently, there has been no method for continuous measurement. The “mean daily temperature” that is the basis for the Hansen/CRU/GHCN temperature anomaly record has been the average of the daily maximum and minimum.

Surface temperatures at different times in any one place do not form a symmetrical sequence. The daytime temperatures are dependent on the sun and its changing elevation, but at night there is no sun and the temperature regime is entirely different. There is no definable average temperature for this skewed distribution.

Even if there were an acceptable average, it cannot be related to the mean of a maximum and a minimum, which is all we have as a “mean daily temperature”. A study I made recently16 compared the maximum/minimum mean with the 24-hourly mean for a summer and a winter day at 20 weather stations in New Zealand. I found that the difference between the two means could be as high as ±2ºC. This figure would be expected to be higher for places with a greater temperature variation and for many past temperatures.
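
The difference is easy to compute once hourly readings exist: the same day yields one figure from (maximum + minimum) / 2 and another from the average of all 24 hourly values. The hourly profile below is invented, but a day with a short warm afternoon and a long, flat cool night is enough to pull the two figures well apart:

```python
import math

# Invented hourly temperatures (deg C) for one day: a brief afternoon peak
# and a long, relatively flat cool night.
hourly = [8.0 + 7.0 * max(0.0, math.sin(math.pi * (h - 7) / 10)) for h in range(24)]

max_min_mean = (max(hourly) + min(hourly)) / 2
mean_of_24_hourly = sum(hourly) / 24

print(f"(max + min) / 2    = {max_min_mean:.2f} C")
print(f"mean of 24 hourly  = {mean_of_24_hourly:.2f} C")
print(f"difference         = {max_min_mean - mean_of_24_hourly:+.2f} C")
# The "mean daily temperature" in the station records is the first figure;
# the second is what a continuous record of the same day would give.
```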

The procedure adopted to obtain the Mean Global Surface Temperature Anomaly Record (MGSTAR) calculates, at every step, simple arithmetic averages of distributions of figures that are not symmetrical. Each of these individual “mean daily temperatures” has to be averaged with all the others in the chosen box, then monthly, then yearly, and then subtracted from the average figure for the whole set over a reference period. The figures for each box are then averaged to give the MGSTAR. The uncertainties of each of these processes are certainly very high, but they are ignored completely when compiling the MGSTAR.
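
As a rough sketch of why the treatment of uncertainty matters, the fragment below (all numbers assumed for illustration) contrasts the two kinds of error involved in such a chain of averages. A purely random error per reading shrinks with the usual 1/√N rule as the averages are taken, but a bias shared by many stations, of the kind discussed in the following paragraphs, is carried straight through to the final figure:

```python
import math

# Assumed, illustrative numbers only.
random_error_per_reading = 1.0   # deg C, treated as independent between readings
stations_per_box = 5
boxes = 2000
days_per_year = 365

# If every error were independent and random, averaging would shrink it sharply.
readings = days_per_year * stations_per_box * boxes
random_part = random_error_per_reading / math.sqrt(readings)
print(f"random error surviving the chain of averages: {random_part:.4f} C")

# A bias shared by many stations (for example growing urbanization around
# them, or a common change of instrument exposure) does not average away.
shared_bias = 0.5                # deg C, assumed for illustration
print(f"shared systematic bias carried straight through: {shared_bias:.1f} C")
# Averaging arguments address only the first kind of error; the objection in
# the text is that errors of the second kind are neither random nor quantified.
```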

D’Aleo and Watts17 have recently provided a long list of sources of inaccuracy in surface temperature records. Watts has carried out a comprehensive survey of US weather stations which showed that 82% of them are incapable of measuring temperature to better than one or two degrees.

It must surely be concluded that a “trend” of less than one degree in 100 years in the MGSTAR is far smaller than the likely error of the method, and is therefore unreliable.

It is interesting that, in a discussion of the uncertainties in the CRU temperature anomaly record, Brohan et al.18 admit that there are “unknown unknowns” which they are unable to quantify, citing the well-known expert on this subject, Donald Rumsfeld.

The attempted simulation of the MGSTAR by the IPCC has difficulties beyond those related to the very low accuracy of the record itself. In the First Draft of the 4th IPCC Report this simulation attempt appeared in Chapter 8, “Evaluation of Climate Models”. The attempt included natural climate effects such as volcanic eruptions and changes in the Sun. I commented that the simulation did not include the most important influences on the MGSTAR, which are the ocean oscillations and the urbanization influence around weather stations. I felt so strongly about this issue that I repeated it in an additional comment.

When the Report was finished3, I was glad to see that my comment had been heeded, for the attempt to simulate the MGSTAR had been removed from Chapter 8. I felt that I had made a useful contribution. But then I found that it had been included in “Frequently Asked Questions” No. 8.1 and also in the preliminary “Technical Report”, where I had somehow escaped making comments. But my comment still stands: the simulation is defective.

It might be remarked that, despite the evidence of very large “unknown unknown” uncertainties in the MGSTAR, the record does seem capable of responding to several of the more obvious natural influences on temperature, such as volcanic eruptions and ocean oscillations. The uncertainties do imply, however, that the “trend” that has been found can be plausibly explained, without any contribution from greenhouse gas emissions, by the many sources of upward bias, such as urban effects on weather stations, ocean oscillations, and changes in the number of participating stations.

Other climate “data” are no better than temperature measurements. Rainfall and snow are more difficult to measure than temperature. Sea level is now measured more accurately, but its results are ignored19. Sunspots are an extremely crude means of estimating the activity of the Sun, and the Southern Oscillation Index is measured extremely crudely as the difference in air pressure between Tahiti and Darwin. It is surely difficult to find any global climate sequence of sufficient accuracy to serve as a means of validating climate models.

The technique of seeking the opinions of experts runs into the problem of conflict of interest, since most of these “experts” would lose income and status if their models were not given a high level of “confidence”, “likelihood” or “probability”. Independent “experts” are never consulted.

It must surely be concluded that the public perception that the globe is warming, and that this is caused by increased emissions of greenhouse gases, has no scientific basis: it does not comply with the requirements of the scientific method, and it remains unsupported even after accepting the departures from the scientific method currently practised by the IPCC.

REFERENCES

1. Wikipedia. “Scientific Method”. http://en.wikipedia.org/wiki/Scientific_method

2. Popper, K. R. 2002. “The Logic of Scientific Discovery”, p. 316.

3. Solomon, S., D. Qin, M. R. Manning, M. Marquis, K. Averyt, M. H. Tignor, H. L. Miller and Z. Chen (Eds.) 2007. “Climate Change 2007: The Physical Science Basis” (IPCC). Cambridge University Press.

4. Lorenz, E. N. 1963. “Deterministic Nonperiodic Flow”. J. Atmos. Sci. 20, 130-141.

5. Solomon, S., D. Qin, M. R. Manning, M. Marquis, K. Averyt, M. H. Tignor, H. L. Miller and Z. Chen (Eds.) 2007. “Climate Change 2007: The Physical Science Basis” (IPCC), Frequently Asked Questions No. 2 in Chapter 1. Cambridge University Press.

6. Rind, D. 1999. “Complexity and Climate”. Science 284, 105-107.

7. Houghton, J. T., G. J. Jenkins and J. J. Ephraums (Eds.) 1990. “Climate Change: The IPCC Scientific Assessment”. Cambridge University Press.

8. Russell, Sir Muir. 2010. “The Independent Climate Change Emails Review”. http:/. /pdf/FINAL REPORT.pdf

9. Freund, J. E. 1967. “Modern Elementary Statistics”, 3rd Edition, p. 359. Prentice Hall International.

10. “New Poll Finds Correlation is Causation”. 1998. http://obereed.net/hh/correlation.html

11. Hume, D. 1748. “An Enquiry Concerning Human Understanding”, Part 4, Section 1. Many editions.

12. Hansen, J. 2010. “GISS Surface Temperature Analysis: The Elusive Absolute Surface Air Temperature (SAT)”. http://data.giss.nasa.gov/gistemp/abs_temp.html

13. Pielke, R. A. Sr. and 12 others. 2007. “Unresolved Issues with the Assessment of Multi-Decadal Global Land-Surface Temperature Trends”. J. Geophys. Res. 112, D24S08, doi:10.1029/2006JD008229.

14. Hansen, J. and S. Lebedeff. 1987. “Global Trends of Measured Surface Air Temperature”. J. Geophys. Res. 92, 13345-13372.

15. Folland, C. K. and D. E. Parker. 1995. “Correction of instrumental biases in historical sea surface temperature data”. Quart. J. Met. Soc. 15, 1195-1218.

16. Gray, V. R. 2007. “Climate Change 2007: The Physical Science Basis: Summary for Policymakers”. Energy and Environment 18, 433-440.

17. D’Aleo, J. and A. Watts. 2010. “Surface Temperature Records: Policy-Driven Deception?”. http://scienceandpublicpolicy.org/originals/policy_driven_deception.html

18. Brohan, P., J. J. Kennedy, I. Harris, S. F. B. Tett and P. D. Jones. 2006. “Uncertainty estimates in regional and global observed temperature changes: A new data set from 1850”. J. Geophys. Res. 111, D12106, doi:10.1029/2005JD006546.

19. Gray, V. R. 2009. “South Pacific Sea Level: A Reassessment”. http://nzclimatescience.net/images/PDFs/spsl3.pdf

22nd July 2010

Vincent R. Gray , M.A.,Ph.D., F.N.Z.I.C.

Climate Consultant

75 Silverstream Road

Crofton Downs

Wellington 6004, New Zealand. Phone (FAX): (064) (04) 9735939

Email [email protected]
