E. Allan Blair ('A More Practical Goal', Port of Call, Dec. 20) writes that:
"We should all be aware global warming has been going on for a very long time...the article in the 'gcrio' online magazine Consequences shows that during the Medieval Warm Period, beginning about the years 1050 and 1250, the temperature was somewhat higher than it is now. It's hard to see how man could have caused those higher temperatures. 125,000 years ago it was warmer still."
Okay, let's back up a bit here and try to shed some light on the basis of the warming in the Medieval period. A long-standing issue in climate research has been constructing a reliable base of observations for comparing solar radiation intensity across different epochs, and thereby better delineating the emergence of any anthropogenic effects. To this day, the best work on the Sun-climate relationship has probably been done by the late Dr. John Eddy, a solar physicist.
The core problem is comparing eras of solar variation for which there were no telescopes to detect sunspots (such as the 12th-13th centuries) with eras for which spot records exist. We're fairly confident, for example, that the Maunder Minimum (1645-1715) really did have few or no spots, because we're the beneficiaries of the observational records of such famous astronomers as Flamsteed, Halley, Cassini and Hevelius, whose work in other areas was above reproach.
That observational record, spanning some 70 years, discloses only occasional sunspots, mostly single and scattered across six separate 11-year solar cycles, with locations mainly near the solar (heliographic) equator.
But what about discerning solar variations in earlier epochs, say the 1200s when the so-called Middle Ages "warming" occurred? Eddy has pointed out (The New Solar Physics, p. 16) that the breakthrough for this arrived in the 1960s when a series of papers demonstrated that radiocarbon (C14) in plant cellulose could be used as an indirect or proxy register of solar activity.
In general, C14 is produced in the upper atmosphere when high-energy cosmic rays - mainly from galactic sources - strike it, generating neutrons that convert atmospheric nitrogen (N14) into C14. Solar activity in turn modulates the intensity of these cosmic rays via the action of the heliosphere, which deflects a fraction of the intense cosmic ray flux and other harmful interstellar radiation. (This shield, by the way, is shrinking - see: Suns-protective-bubble-is-shrinking )
When the Sun is more active, the heliosphere is stronger, shielding the Earth from more of the intense cosmic rays - the effect of which is to reduce the C14 produced in the Earth's upper atmosphere. Conversely, when the Sun is less active - as it was from 2000-2008 - the shield is weaker, more cosmic rays penetrate to our upper atmosphere, and more C14 is produced. It follows that if a record of the C14 to C12 ratio could be obtained - extracted from tree rings or other plant tissue - one would have a proxy indicator of solar activity for any time. A falling C14 to C12 ratio would imply higher solar activity, and a rising ratio lower solar activity. If the same ratios were obtained for the modern era, it would be feasible to normalize all the results onto a common scale, compare them, and draw conclusions.
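To make the sign convention concrete, here is a minimal Python sketch of reading such a record; the numerical values are illustrative placeholders of my own, not Damon's actual series:

```python
# A minimal sketch of reading a Delta-C14 record as a solar proxy.
# The sample values are illustrative placeholders, NOT Damon's actual data.

# C14/C12 deviations in parts per thousand, relative to an arbitrary
# 19th-century reference level, keyed by approximate year.
delta_c14 = {
    1200: -18.0,   # hypothetical value near the Medieval Warm Period
    1700: +10.0,   # hypothetical value near the Maunder Minimum
    1890:   0.0,   # the reference (zero) level
    1950: -40.0,   # hypothetical modern value (fossil-fuel dilution included)
}

def solar_activity_index(deviation):
    """Crude relative solar-activity index: lower C14/C12 means the
    heliosphere deflected more cosmic rays, i.e. higher solar activity,
    so the sign is simply flipped."""
    return -deviation

for year, dev in sorted(delta_c14.items()):
    print(f"{year}: Delta C14 = {dev:+6.1f} per mil -> "
          f"solar activity index = {solar_activity_index(dev):+6.1f}")
```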
Most crucially, the Sun isn't the exclusive influence on this record: an anthropogenic effect (the dilution of atmospheric C14 by fossil-fuel carbon) would also alter the measured ratio, as would a changing magnetic moment for the planet. The C14 record embedded in tree rings must therefore be expected to carry these other histories too.
Fortuitously, a 2000-year record of C14:C12 deviations has been compiled by P.E. Damon (The Solar Output and Its Variation, The University of Colorado Press, Boulder, 1977), and this is shown in the accompanying graph.
To make the curve track solar activity, the plot is drawn with increasing radiocarbon (C14 %) downward, indicated with (+). The deviations in parts per thousand are shown relative to an arbitrary 19th century reference level. As John Eddy observes concerning this output (Eddy, op. cit. p. 17):
"The gradual fall from left to right (increasing C14/C12 ratio) is...probably not a solar effect but the result of the known, slow decrease in the strength of the Earth's magnetic moment [1] exposing the Earth to ever-increased cosmic ray fluxes and increased radiocarbon production.
"The sharp upward spike at the modern end of the curve, representing a marked drop in relative radiocarbon, is generally attributed to anthropogenic causes - the mark of increased population and the Industrial Age. The burning of low radiocarbon fossil fuels - coal and oil - and the systematic burning off of the world's forests for agriculture can be expected to dilute the natural C14/C12 ratio in the troposphere to produce an effect like the one shown, though a real increase in solar activity may be hidden under the curve"
Assuming the validity of the arbitrary norm (the zero line) for 1890, it is clear that the magnitude of the Middle Ages warming period (relative C14 strength of -18), for example, is less than half the relative effect attributed mainly to anthropogenic sources in the modern era (-40). Even if one fourth of the latter magnitude is assigned to solar activity (based on the solar variability component detected over 1861-1990 amounting to 0.1-0.5 W/m^2, vs. 2.0-2.8 W/m^2 for the heating component arising from greenhouse gas emissions, cf. Martin I. Hoffert et al, Nature, Vol. 401, p. 764), the remaining anthropogenic effect (about -30) is still more than one and a half times that for the last (exclusively solar) warming period.
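To lay that arithmetic out explicitly, here is a small Python check (my own working, reading the deviations off the Damon curve):

```python
# Back-of-the-envelope check of the comparison above (my own arithmetic),
# using the relative C14 deviations read off the Damon curve.

medieval_dev = -18.0   # Middle Ages warming period (taken as exclusively solar)
modern_dev   = -40.0   # modern end of the curve

solar_fraction = 0.25  # generous share of the modern spike assigned to the Sun

anthropogenic_dev = modern_dev * (1.0 - solar_fraction)   # -30.0
ratio = anthropogenic_dev / medieval_dev                   # ~1.67

print(f"Anthropogenic share of the modern deviation: {anthropogenic_dev:.1f}")
print(f"Ratio to the Medieval (solar-only) deviation: {ratio:.2f}")
```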
These results comport with modern findings that the last ten years have been the warmest since records have been kept, according to data from the World Meteorological Organization (WMO). For reference: parts of Greenland had an average temperature 5.4 F above normal. Meanwhile, Russian officials have ascribed some 11,000 "excess deaths" to their prolonged heat wave. According to the WMO:
"The year 2010 is almost certain to rank in the top three warmest years since the beginning of instrumental records in 1850."
Beyond this, I suggest Dr. Blair consult some of the groundbreaking ice core research of Dr. Gunther Weller (of the Geophysical Institute, University of Alaska Fairbanks), who was among the first to detect the accelerated warming of the Arctic ca. 1985. The ice core record of Weller and his group showed that, when CO2 was assessed in comparative cores, we were entering a period with CO2 concentrations (in parts per million) that haven't been seen for nearly 600,000 years.
The danger is that, unlike those earlier epochs, humans are now flirting with CO2 concentrations that have the potential to trigger a runaway greenhouse effect. (Most climate scientists, like Weller, put the threshold at about 500 ppm.) In that case, all controls and hope for adaptation will evaporate with the seas, and humans will be measuring the time to their extinction in decades instead of possibly millions of years.
In terms of possible adaptations to, or even mitigation of, anthropogenic warming, Dr. Blair may be interested in a recent issue of The Economist ('How to Live with Climate Change' - lead story, Nov. 27 - Dec. 3, 2010) and also Eos, Transactions AGU, Vol. 90, No. 21, 26 May 2009, p. 181 ('New Study for Climate Modeling Analysis and Scenarios'). The latter article covers in commendable detail - at a reasonable level - what the European Commission is currently doing with its ENSEMBLES project, which aims to provide policy makers with information from the latest climate modeling analyses.
ENSEMBLES, as the paper notes, is primarily concerned with quantifying the (politically relevant) aggressive mitigation scenario: what happens - and by what time - if we cut CO2 emissions by so much? Their working scenario thus far (given existing assumptions and variables) leads to a peak in the CO2-equivalent concentration in the atmosphere of nearly 535 parts per million in 2045, before eventually stabilizing at 450 ppm. Even so, the concentration peak is precariously close to what many (e.g. the late Dr. Carl Sagan, who originally quantified the runaway greenhouse effect on Venus in his Ph.D. dissertation) have claimed is the cusp of the runaway greenhouse effect.
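To put those numbers side by side with the roughly 500 ppm runaway threshold cited earlier, here is a trivial Python comparison (my own framing, not part of the ENSEMBLES analysis):

```python
# A quick comparison (my own framing, not the ENSEMBLES analysis itself) of
# the working-scenario figures quoted above against the roughly 500 ppm
# runaway-greenhouse threshold cited earlier.

scenario_peak_ppm     = 535.0   # CO2-equivalent peak, reached around 2045
stabilization_ppm     = 450.0   # eventual stabilization level in the scenario
runaway_threshold_ppm = 500.0   # approximate threshold cited by Weller et al.

print(f"Peak overshoots the ~500 ppm threshold by {scenario_peak_ppm - runaway_threshold_ppm:.0f} ppm")
print(f"Stabilization level sits {runaway_threshold_ppm - stabilization_ppm:.0f} ppm below the threshold")
```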
A warning given in the piece - and a cautionary note for all over-simplistic takes - is that while simpler models often give useful results, they almost uniformly show only a modest warming, of say 2 C, in the period up to 2100. Once one factors in complexities - for example, removing the current global dimming factor, which masks roughly two-thirds of the warming - things change, and fast: warming ramps up to the 5-6 C range, again close to what would be expected in a runaway greenhouse scenario.
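The arithmetic behind that jump is straightforward; here is a short Python check using the figures quoted above (my own reading of them):

```python
# The arithmetic behind that jump, using the figures quoted above.
simple_model_warming = 2.0      # deg C by 2100, typical of the simpler models
masked_fraction = 2.0 / 3.0     # share of the warming hidden by global dimming

# If 2 C reflects only the unmasked third of the signal, the full warming
# once the dimming mask is removed is:
full_warming = simple_model_warming / (1.0 - masked_fraction)
print(f"Warming with the dimming mask removed: ~{full_warming:.0f} C")   # ~6 C
```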
Most basic temperature projections are of this simpler form, so they carry much less weight than those from the complex models that incorporate factors like global dimming or changing albedo.
Dr. Blair is quite correct that we need to somehow learn to live with the changes global warming will introduce. The problem is whether it may be too late for any such reasonable attainment, assuming we are now on the cusp of the runaway greenhouse effect.
[1] Estimated currently using:
m = r^3 B(r, L) / [1 + 3 sin^2(L)]^(1/2)
where r is the distance from the center of the Earth, L is the (geomagnetic) latitude, and B(r, L) is the magnetic field intensity as a function of r and L.
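For completeness, here is a small Python sketch of that estimate; the input values (Earth's mean radius, a nominal 50 microtesla mid-latitude field) are illustrative figures of my own, not from any of the sources cited above:

```python
import math

# A minimal numerical sketch of the dipole-moment estimate in footnote [1].
# The input values (Earth's mean radius and a typical mid-latitude surface
# field) are rough illustrative figures of my own, not from the article.

def dipole_moment(r_m, B_T, lat_deg):
    """Estimate m = r^3 * B(r, L) / sqrt(1 + 3 sin^2(L)).

    With r in metres and B in tesla this gives m in T*m^3 (the Gaussian-style
    convention used above); multiply by 4*pi/mu_0 = 1e7 to get the usual
    SI dipole moment in A*m^2.
    """
    lat = math.radians(lat_deg)
    return r_m**3 * B_T / math.sqrt(1.0 + 3.0 * math.sin(lat)**2)

# Example: r ~ 6.371e6 m, B ~ 50 microtesla at 45 degrees latitude.
m = dipole_moment(6.371e6, 50e-6, 45.0)
print(f"m ~ {m:.2e} T*m^3  (~{m * 1e7:.1e} A*m^2)")
```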