Warming’s happening now – it’s real and anthropogenic
Rahmstorf, ‘8 (Stefan, Professor of Physics @ Potsdam University, Member of the German Advisory Council on Climate Change, Global Warming: Looking Beyond Kyoto, ed. Ernesto Zedillo, Prof. IR @ Yale, p. 42-49, accessed 6-27, Google Books, JG)
It is time to turn to statement B: human activities are altering the climate. This can be broken into two parts. The first is as follows: global climate is warming. This is by now a generally undisputed point (except by novelist Michael Crichton), so we deal with it only briefly. The two leading compilations of data measured with thermometers are shown in figure 3-3, that of the National Aeronautics and Space Administration (NASA) and that of the British Hadley Centre for Climate Change. Although they differ in the details, due to the inclusion of different data sets and use of different spatial averaging and quality control procedures, they both show a consistent picture, with a global mean warming of 0.8°C since the late nineteenth century. Temperatures over the past ten years clearly were the warmest since measured records have been available. The year 1998 sticks out well above the long-term trend due to the occurrence of a major El Niño event that year (the last El Niño so far and one of the strongest on record). These events are examples of the largest natural climate variations on multiyear time scales and, by releasing heat from the ocean, generally cause positive anomalies in global mean temperature. It is remarkable that the year 2005 rivaled the heat of 1998 even though no El Niño event occurred that year. (A bizarre curiosity, perhaps worth mentioning, is that several prominent "climate skeptics" recently used the extreme year 1998 to claim in the media that global warming had ended. In Lindzen's words, "Indeed, the absence of any record breakers during the past seven years is statistical evidence that temperatures are not increasing.")33 In addition to the surface measurements, the more recent portion of the global warming trend (since 1979) is also documented by satellite data. It is not straightforward to derive a reliable surface temperature trend from satellites, as they measure radiation coming from throughout the atmosphere (not just near the surface), including the stratosphere, which has strongly cooled, and the records are not homogeneous due to the short life span of individual satellites, the problem of orbital decay, observations at different times of day, and drifts in instrument calibration. Current analyses of these satellite data show trends that are fully consistent with surface measurements and model simulations. If no reliable temperature measurements existed, could we be sure that the climate is warming? The "canaries in the coal mine" of climate change (as glaciologist Lonnie Thompson puts it) are mountain glaciers. We know, both from old photographs and from the position of the terminal moraines heaped up by the flowing ice, that mountain glaciers have been in retreat all over the world during the past century. There are precious few exceptions, and they are associated with a strong increase in precipitation or local cooling.36 I have inspected examples of shrinking glaciers myself in field trips to Switzerland, Norway, and New Zealand. As glaciers respond sensitively to temperature changes, data on the extent of glaciers have been used to reconstruct a history of Northern Hemisphere temperature over the past four centuries (see figure 3-4). Cores drilled in tropical glaciers show signs of recent melting that is unprecedented at least throughout the Holocene, the past 10,000 years.
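The Lindzen claim quoted above illustrates a statistical pitfall that is easy to reproduce with toy numbers: after a single outlier year like 1998, several years can pass with no new record even while the underlying warming trend continues unchanged. A minimal sketch with synthetic values (not the actual temperature record; the 0.015°C/yr trend and 0.3°C spike are illustrative assumptions):

```python
import numpy as np

# Toy annual series: a steady 0.015 deg C/yr warming trend plus one large
# El Nino-style spike (the "1998" analogue) at year 25; no other noise.
years = np.arange(50)
temps = 0.015 * years
temps[25] += 0.3

after_spike = temps[26:33]                  # the seven years following the spike
print((after_spike > temps[25]).sum())      # 0: no "record breakers" in those seven years
print(np.polyfit(years, temps, 1)[0])       # fitted trend is still about +0.015 deg C/yr
```

The absence of new records in a short window right after an outlier therefore says almost nothing about whether the trend has stopped.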
Another powerful sign of warming, visible clearly from satellites, is the shrinking Arctic sea ice cover (figure 3-5), which has declined 20 percent since satellite observations began in 1979. While climate clearly became warmer in the twentieth century, much discussion particularly in the popular media has focused on the question of how "unusual" this warming is in a longer-term context. While this is an interesting question, it has often been mixed incorrectly with the question of causation. Scientifically, how unusual recent warming is (say, compared to the past millennium) in itself contains little information about its cause. Even a highly unusual warming could have a natural cause (for example, an exceptional increase in solar activity). And even a warming within the bounds of past natural variations could have a predominantly anthropogenic cause. I come to the question of causation shortly, after briefly visiting the evidence for past natural climate variations. Records from the time before systematic temperature measurements were collected are based on "proxy data," coming from tree rings, ice cores, corals, and other sources. These proxy data are generally linked to local temperatures in some way, but they may be influenced by other parameters as well (for example, precipitation), they may have a seasonal bias (for example, the growth season for tree rings), and high-quality long records are difficult to obtain and therefore few in number and geographic coverage. Therefore, there is still substantial uncertainty in the evolution of past global or hemispheric temperatures. (Comparing only local or regional temperatures, as in Europe, is of limited value for our purposes, as regional variations can be much larger than global ones and can have many regional causes, unrelated to global-scale forcing and climate change.) The first quantitative reconstruction for the Northern Hemisphere temperature of the past millennium, including an error estimation, was presented by Mann, Bradley, and Hughes and rightly highlighted in the 2001 IPCC report as one of the major new findings since its 1995 report; it is shown in figure 3-6.39 The analysis suggests that, despite the large error bars, twentieth-century warming is indeed highly unusual and probably was unprecedented during the past millennium. This result, presumably because of its symbolic power, has attracted much criticism, to some extent in scientific journals, but even more so in the popular media. The hockey stick-shaped curve became a symbol for the IPCC, and criticizing this particular data analysis became an avenue for some to question the credibility of the IPCC. Three important things have been overlooked in much of the media coverage. First, even if the scientific critics had been right, this would not have called into question the very cautious conclusion drawn by the IPCC from the reconstruction by Mann, Bradley, and Hughes: "New analyses of proxy data for the Northern Hemisphere indicate that the increase in temperature in the twentieth century is likely to have been the largest of any century during the past 1,000 years." This conclusion has since been supported further by every single one of close to a dozen new reconstructions (two of which are shown in figure 3-6). Second, by far the most serious scientific criticism raised against Mann, Hughes, and Bradley was simply based on a mistake.40
The prominent paper of von Storch and others, which claimed (based on a model test) that the method of Mann, Bradley, and Hughes systematically underestimated variability, "was [itself] based on incorrect implementation of the reconstruction procedure."41 With correct implementation, climate field reconstruction procedures such as the one used by Mann, Bradley, and Hughes have been shown to perform well in similar model tests. Third, whether their reconstruction is accurate or not has no bearing on policy. If their analysis underestimated past natural climate variability, this would certainly not argue for a smaller climate sensitivity and thus a lesser concern about the consequences of our emissions. Some have argued that, in contrast, it would point to a larger climate sensitivity. While this is a valid point in principle, it does not apply in practice to the climate sensitivity estimates discussed herein or to the range given by the IPCC, since these did not use the reconstruction of Mann, Hughes, and Bradley or any other proxy records of the past millennium. Media claims that "a pillar of the Kyoto Protocol" had been called into question were therefore misinformed. As an aside, the protocol was agreed in 1997, before the reconstruction in question even existed. The overheated public debate on this topic has, at least, helped to attract more researchers and funding to this area of paleoclimatology; its methodology has advanced significantly, and a number of new reconstructions have been presented in recent years. While the science has moved forward, the first seminal reconstruction by Mann, Hughes, and Bradley has held up remarkably well, with its main features reproduced by more recent work. Further progress probably will require substantial amounts of new proxy data, rather than further refinement of the statistical techniques pioneered by Mann, Hughes, and Bradley. Developing these data sets will require time and substantial effort. It is time to address the final statement: most of the observed warming over the past fifty years is anthropogenic. A large number of studies exist that have taken different approaches to analyze this issue, which is generally called the "attribution problem." I do not discuss the exact share of the anthropogenic contribution (although this is an interesting question). By "most" I simply mean "more than 50 percent." The first and crucial piece of evidence is, of course, that the magnitude of the warming is what is expected from the anthropogenic perturbation of the radiation balance, so anthropogenic forcing is able to explain all of the temperature rise. As discussed here, the rise in greenhouse gases alone corresponds to 2.6 W/m2 of forcing. This by itself, after subtraction of the observed 0.6 W/m2 of ocean heat uptake, would cause 1.6°C of warming since preindustrial times for medium climate sensitivity (3°C). With a current "best guess" aerosol forcing of 1 W/m2, the expected warming is 0.8°C. The point here is not that it is possible to obtain the exact observed number (this is fortuitous, because the amount of aerosol forcing is still very uncertain) but that the expected magnitude is roughly right. There can be little doubt that the anthropogenic forcing is large enough to explain most of the warming. Depending on aerosol forcing and climate sensitivity, it could explain a large fraction of the warming, or all of it, or even more warming than has been observed (leaving room for natural processes to counteract some of the warming).
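The back-of-envelope attribution arithmetic in this card can be reproduced as a rough sketch. One number is supplied here as an assumption rather than taken from the card: the standard value of roughly 3.7 W/m2 of forcing per doubling of CO2, which converts a 3°C-per-doubling sensitivity into warming per unit of forcing.

```python
# Hedged sketch of Rahmstorf's attribution arithmetic; values marked "card" are
# from the text above, F_2XCO2 is an assumed standard conversion factor.
F_GHG = 2.6         # greenhouse gas forcing to date, W/m^2 (card)
F_AEROSOL = -1.0    # "best guess" aerosol forcing, W/m^2 (card)
OCEAN_UPTAKE = 0.6  # heat currently flowing into the ocean, W/m^2 (card)
SENSITIVITY = 3.0   # warming per CO2 doubling, deg C (card's medium value)
F_2XCO2 = 3.7       # forcing per CO2 doubling, W/m^2 (assumed standard value)

def expected_warming(total_forcing):
    """Warming realized so far: sensitivity scaled by the forcing not yet stored in the ocean."""
    return SENSITIVITY * (total_forcing - OCEAN_UPTAKE) / F_2XCO2

print(round(expected_warming(F_GHG), 1))              # ~1.6 deg C from greenhouse gases alone
print(round(expected_warming(F_GHG + F_AEROSOL), 1))  # ~0.8 deg C once aerosols are included
```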
The second important piece of evidence is clear: there is no viable alternative explanation. In the scientific literature, no serious alternative hypothesis has been proposed to explain the observed global warming. Other possible causes, such as solar activity, volcanic activity, cosmic rays, or orbital cycles, are well observed, but they do not show trends capable of explaining the observed warming. Since 1978, solar irradiance has been measured directly from satellites and shows the well-known eleven-year solar cycle, but no trend. There are various estimates of solar variability before this time, based on sunspot numbers, solar cycle length, the geomagnetic AA index, neutron monitor data, and carbon-14 data. These indicate that solar activity probably increased somewhat up to 1940. While there is disagreement about the variation in previous centuries, different authors agree that solar activity did not significantly increase during the last sixty-five years. Therefore, this cannot explain the warming, and neither can any of the other factors mentioned. Models driven by natural factors only, leaving the anthropogenic forcing aside, show a cooling in the second half of the twentieth century (for an example, see figure 2-2, panel a, in chapter 2 of this volume). The trend in the sum of natural forcings is downward. The only way out would be either some as yet undiscovered forcing or a warming trend that arises by chance from unforced internal variability in the climate system. The latter cannot be completely ruled out, but it has to be considered highly unlikely. No evidence in the observed record, proxy data, or current models suggests that such internal variability could cause a sustained trend of global warming of the observed magnitude. As discussed, twentieth-century warming is unprecedented over the past 1,000 years (or even 2,000 years, as the few longer reconstructions available now suggest), which does not support the idea of large internal fluctuations. Also, those past variations correlate well with past forcing (solar variability, volcanic activity) and thus appear to be largely forced rather than due to unforced internal variability. And indeed, it would be difficult for a large and sustained unforced variability to satisfy the fundamental physical law of energy conservation. Natural internal variability generally shifts heat around different parts of the climate system: for example, the large El Niño event of 1998 warmed the atmosphere by releasing heat stored in the ocean. This mechanism implies that the ocean heat content drops as the atmosphere warms. For past decades, as discussed, we observed the atmosphere warming and the ocean heat content increasing, which rules out heat release from the ocean as a cause of surface warming. The heat content of the whole climate system is increasing, and there is no plausible source of this heat other than the heat trapped by greenhouse gases. A completely different approach to attribution is to analyze the spatial patterns of climate change. This is done in so-called fingerprint studies, which associate particular patterns or "fingerprints" with different forcings. It is plausible that the pattern of a solar-forced climate change differs from the pattern of a change caused by greenhouse gases. For example, a characteristic of greenhouse gases is that heat is trapped closer to the Earth's surface and that, unlike solar variability, greenhouse gases tend to warm more in winter and at night.
Such studies have used different data sets and have been performed by different groups of researchers with different statistical methods. They consistently conclude that the observed spatial pattern of warming can only be explained by greenhouse gases.49 Overall, it has to be considered highly likely that the observed warming is indeed predominantly due to the human-caused increase in greenhouse gases. This paper discussed the evidence for the anthropogenic increase in atmospheric CO2 concentration and the effect of CO2 on climate, finding that this anthropogenic increase is proven beyond reasonable doubt and that a mass of evidence points to a CO2 effect on climate of 3°C ± 1.5°C of global warming for a doubling of concentration. (This is the classic IPCC range; my personal assessment is that, in the light of new studies since the IPCC Third Assessment Report, the uncertainty range can now be narrowed somewhat to 3°C ± 1.0°C.) This is based on consistent results from theory, models, and data analysis, and, even in the absence of any computer models, the same result would still hold based on physics and on data from climate history alone. Considering the plethora of consistent evidence, the chance that these conclusions are wrong has to be considered minute. If the preceding is accepted, then it follows logically and incontrovertibly that a further increase in CO2 concentration will lead to further warming. The magnitude of our emissions depends on human behavior, but the climatic response to various emissions scenarios can be computed from the information presented here. The result is the famous range of future global temperature scenarios shown in figure 3-6.50 Two additional steps are involved in these computations: the consideration of anthropogenic forcings other than CO2 (for example, other greenhouse gases and aerosols) and the computation of concentrations from the emissions. Other gases are not discussed here, although they are important to get quantitatively accurate results; CO2 is the largest and most important forcing. Concerning concentrations, the scenarios shown basically assume that ocean and biosphere take up a similar share of our emitted CO2 as in the past. This could turn out to be an optimistic assumption; some models indicate the possibility of a positive feedback, with the biosphere turning into a carbon source rather than a sink under growing climatic stress. It is clear that even in the more optimistic of the (non-mitigation) scenarios shown, global temperature would rise by 2-3°C above its preindustrial level by the end of this century. Even for a paleoclimatologist like myself, this is an extraordinarily high temperature, which is very likely unprecedented in at least the past 100,000 years. As far as the data show, we would have to go back about 3 million years, to the Pliocene, for comparable temperatures. The rate of this warming (which is important for the ability of ecosystems to cope) is also highly unusual and unprecedented probably for an even longer time. The last major global warming trend occurred when the last great Ice Age ended between 15,000 and 10,000 years ago: this was a warming of about 5°C over 5,000 years, that is, a rate of only 0.1°C per century.52
The expected magnitude and rate of planetary warming are highly likely to come with major risks and impacts in terms of sea level rise (Pliocene sea level was 25-35 meters higher than now due to smaller Greenland and Antarctic ice sheets), extreme events (for example, hurricane activity is expected to increase in a warmer climate), and ecosystem loss. The second part of this paper examined the evidence for the current warming of the planet and discussed what is known about its causes. This part showed that global warming is already a measured and well-established fact, not a theory. Many different lines of evidence consistently show that most of the observed warming of the past fifty years was caused by human activity. Above all, this warming is exactly what would be expected given the anthropogenic rise in greenhouse gases, and no viable alternative explanation for this warming has been proposed in the scientific literature. Taken together, the very strong evidence accumulated from thousands of independent studies has over the past decades convinced virtually every climatologist around the world (many of whom were initially quite skeptical, including myself) that anthropogenic global warming is a reality with which we need to deal.
Warming is rapid – fossil fuel emissions risk the release of stored carbon, which causes runaway warming – the next 30 years are key.
Stein 11 (David, Science Editor for The Canadian (newspaper), “Antarctic melting becomes Global time bomb,” July 17, http://www.agoracosmopolitan.com/news/nature/2011/07/17/172.html)
The immediate threat of runaway global warming and climate change melt-down: there are 'carbon bombs': carbon in soils, carbon in warming temperate and boreal forests and in a drought-struck Amazon, methane in Arctic peat bogs and in methane hydrates melting in warming ocean waters. "For several decades it has been hypothesized that rising temperatures from increased greenhouse gases in the atmosphere due to burning fossil fuels could be releasing some of, and eventually all of, these stored carbon stocks to add substantially more potent greenhouse gases to the atmosphere," Bill Henderson further elaborates. Given time lags of 30-50 years, we might have already put enough extra greenhouse gases into the atmosphere to have crossed a threshold to these bombs exploding, their released greenhouse gases leading to ever-accelerating global warming with future global temperatures maybe tens of degrees higher than our norms of human habitation and therefore extinction or very near extinction of humanity. "(T)he science is clear. We need not a 20% cut by 2020; not a 60% cut by 2050, but a 90% cut by 2030 (1). Only then do we stand a good chance of keeping carbon concentrations in the atmosphere below 430 parts per million, which means that only then do we stand a good chance of preventing some of the threatened positive feedbacks. If we let it get beyond that point there is nothing we can do. The biosphere takes over as the primary source of carbon. It is out of our hands," George Monbiot says. Ticking Time Bomb, by John Atcheson, a geologist writing in the Baltimore Sun, is the best and almost only mainstream media explanation of runaway global warming and how close we are to extinction. "There are enormous quantities of naturally occurring greenhouse gases trapped in ice-like structures in the cold northern muds and at the bottom of the seas. These ices, called clathrates, contain 3,000 times as much methane as is in the atmosphere. Methane is more than 20 times as strong a greenhouse gas as carbon dioxide."
Warming leads to extinction – try or die
Romm 10 (Joe, Editor of Climate Progress, “Disputing the “consensus” on global warming,” http://climateprogress.org/2010/06/16/scientific-consensus-on-global-warming-climate-science/, JG)
A good example of how scientific evidence drives our understanding concerns how we know that humans are the dominant cause of global warming. This is, of course, the deniers’ favorite topic. Since it is increasingly obvious that the climate is changing and the planet is warming, the remaining deniers have coalesced to defend their Alamo — that human emissions aren’t the cause of recent climate change and therefore that reducing those emissions is pointless. Last year, longtime Nation columnist Alexander Cockburn wrote, “There is still zero empirical evidence that anthropogenic production of CO2 is making any measurable contribution to the world’s present warming trend. The greenhouse fearmongers rely entirely on unverified, crudely oversimplified computer models to finger mankind’s sinful contribution.” In fact, the evidence is amazingly strong. Moreover, if the relatively complex climate models are oversimplified in any respect, it is by omitting amplifying feedbacks and other factors that suggest human-caused climate change will be worse than is widely realized. The IPCC concluded last year: “Greenhouse gas forcing has very likely (>90 percent) caused most of the observed global warming over the last 50 years. This conclusion takes into account … the possibility that the response to solar forcing could be underestimated by climate models.” Scientists have come to understand that “forcings” (natural and human-made) explain most of the changes in our climate and temperature both in recent decades and over the past millions of years. The primary human-made forcings are the heat-trapping greenhouse gases we generate, particularly carbon dioxide from burning coal, oil and natural gas. The natural forcings include fluctuations in the intensity of sunlight (which can increase or decrease warming), and major volcanoes that inject huge volumes of gases and aerosol particles into the stratosphere (which tend to block sunlight and cause cooling)…. Over and over again, scientists have demonstrated that observed changes in the climate in recent decades can only be explained by taking into account the observed combination of human and natural forcings. Natural forcings alone just don’t explain what is happening to this planet. For instance, in April 2005, one of the nation’s top climate scientists, NASA’s James Hansen, led a team of scientists that made “precise measurements of increasing ocean heat content over the past 10 years,” which revealed that the Earth is absorbing far more heat than it is emitting to space, confirming what earlier computer models had shown about warming. Hansen called this energy imbalance the “smoking gun” of climate change, and said, “There can no longer be genuine doubt that human-made gases are the dominant cause of observed warming.” Another 2005 study, led by the Scripps Institution of Oceanography, compared actual ocean temperature data from the surface down to hundreds of meters (in the Atlantic, Pacific and Indian oceans) with climate models and concluded: A warming signal has penetrated into the world’s oceans over the past 40 years. The signal is complex, with a vertical structure that varies widely by ocean; it cannot be explained by natural internal climate variability or solar and volcanic forcing, but is well simulated by two anthropogenically [human-caused] forced climate models. We conclude that it is of human origin, a conclusion robust to observational sampling and model differences.
Such studies are also done for many other observations: land-based temperature rise, atmospheric temperature rise, sea level rise, arctic ice melt, inland glacier melt, Greenland and Antarctic ice sheet melt, expansion of the tropics (desertification) and changes in precipitation. Studies compare every testable prediction from climate change theory and models (and suggested by paleoclimate research) to actual observations. How many studies? Well, the IPCC’s definitive treatment of the subject, “Understanding and Attributing Climate Change,” has 11 full pages of references, some 500 peer-reviewed studies. This is not a consensus of opinion. It is what scientific research and actual observations reveal. And the science behind human attribution has gotten much stronger in the past 2 years (see a recent literature review by the Met Office here). That brings us to another problem with the word “consensus.” It can mean “unanimity” or “the judgment arrived at by most of those concerned.” Many, if not most, people hear the second meaning: “consensus” as majority opinion. The scientific consensus most people are familiar with is the IPCC’s “Summary for Policymakers” reports. But those aren’t a majority opinion. Government representatives participate in a line-by-line review and revision of these summaries. So China, Saudi Arabia and that hotbed of denialism — the Bush administration — get to veto anything they don’t like. The deniers call this “politicized science,” suggesting the process turns the IPCC summaries into some sort of unscientific exaggeration. In fact, the reverse is true. The net result is unanimous agreement on a conservative or watered-down document. You could argue that rather than majority rules, this is “minority rules.” Last April, in an article titled “Conservative Climate,” Scientific American noted that objections by Saudi Arabia and China led the IPCC to remove a sentence stating that the impact of human greenhouse gas emissions on the Earth’s recent warming is five times greater than that of the sun. In fact, lead author Piers Forster of the University of Leeds in England said, “The difference is really a factor of 10.” Then I discuss the evidence we had even back in 2008 that the IPCC was underestimating key climate impacts, a point I update here. The bottom line is that recent observations and research make clear the planet almost certainly faces a greater and more imminent threat than is laid out in the IPCC reports. That’s why climate scientists are so desperate. That’s why they keep begging for immediate action. And that’s why the “consensus on global warming” is a phrase that should be forever retired from the climate debate. The leading scientific organizations in this country and around the world, including all the major national academies of science, aren’t buying into some sort of consensus of opinion. They have analyzed the science and observations and expressed their understanding of climate science and the likely impacts we face on our current emissions path — an understanding that has grown increasingly dire in recent years (see “An illustrated guide to the latest climate science” and “An introduction to global warming impacts: Hell and High Water”).
Independently it displaces billions – causes civil wars, terrorism, and genocide in the short term
CSM, ‘7 – Christian Science Monitor [4/19/2007, Christian Science Monitor, “Could global warming cause war?,” http://www.csmonitor.com/2007/0419/p02s01-usgn.html, DS]
For years, the debate over global warming has focused on the three big "E's": environment, energy, and economic impact. This week it officially entered the realm of national security threats and avoiding wars as well. A platoon of retired US generals and admirals warned that global warming "presents significant national security challenges to the United States." The United Nations Security Council held its first ever debate on the impact of climate change on conflicts. And in Congress, a bipartisan bill would require a National Intelligence Estimate by all federal intelligence agencies to assess the security threats posed by global climate change. Many experts view climate change as a "threat multiplier" that intensifies instability around the world by worsening water shortages, food insecurity, disease, and flooding that lead to forced migration. That's the thrust of a 35-page report (PDF) by 11 admirals and generals issued this week by the Alexandria, Va.-based national security think tank The CNA Corporation. The study, titled National Security and the Threat of Climate Change, predicts: "Projected climate change will seriously exacerbate already marginal living standards in many Asian, African, and Middle Eastern nations, causing widespread political instability and the likelihood of failed states.... The chaos that results can be an incubator of civil strife, genocide, and the growth of terrorism. "The U.S. may be drawn more frequently into these situations, either alone or with allies, to help provide stability before conditions worsen and are exploited by extremists. The U.S. may also be called upon to undertake stability and reconstruction efforts once a conflict has begun, to avert further disaster and reconstitute a stable environment." "We will pay for this one way or another," retired Marine Gen. Anthony Zinni, former commander of American forces in the Middle East and one of the report's authors, told the Los Angeles Times. "We will pay to reduce greenhouse gas emissions today … or we'll pay the price later in military terms. And that will involve human lives." As quoted in the Associated Press, British Foreign Secretary Margaret Beckett, who presided over the UN meeting in New York April 17, posed the question "What makes wars start?" The answer: "Fights over water. Changing patterns of rainfall. Fights over food production, land use. There are few greater potential threats to our economies ... but also to peace and security itself." This is the concern behind a recently introduced bipartisan bill by Sens. Richard Durbin (D) of Illinois and Chuck Hagel (R) of Nebraska. It would require all US intelligence agencies – the CIA, the NSA, the Pentagon, and the FBI – to conduct a comprehensive review of potential security threats related to climate change around the world. "Many of the most severe effects of global warming are expected in regions where fragile governments are least capable of responding to them," Senator Durbin said in a story from the Inter Press Service news agency in Rome. "Failing to recognize and plan for the geopolitical consequences of global warming would be a serious mistake." Rep. Edward J. Markey (D) of Massachusetts, chairman of the newly formed House Select Committee on Energy Independence and Global Warming, is proposing companion legislation that would fund climate change plans by the Department of Defense. On his website, Mr.
Markey called for action based on the retired senior officers' report, saying: "Global warming's impacts on natural resources and climate systems may create the fiercest battle our world has ever seen. If we don't cut pollution and head off severe global warming at the pass, we could see extreme geopolitical strain over decreased clean water, environmental refugees, and other impacts." In a speech April 16 to BritishAmerican Business Inc., a trans-Atlantic business organization, British Foreign Secretary Beckett "praised the growing actions of US business executives and state politicians in addressing climate change, including California Governor Arnold Schwarzenegger, who along with British Prime Minister Tony Blair announced plans last year to work toward a possible joint emissions-trading market," reported the Associated Press. Ms. Beckett also told the business executives that clean technology is going to create "massive" market opportunities: "Those who move into that market first – first to design, first to patent, first to sell, first to invest, first to build a brand – have an unparalleled chance to make money." The Bush administration has taken a less stark view of the security implications of greenhouse-gas emissions than many scientists and military officers. But in a broader context, the administration has agreed that environmental issues could present national and international security challenges. In its 2006 National Security Strategy (PDF), the administration acknowledged that environmental destruction, including that caused by human activity, "may overwhelm the capacity of local authorities to respond, and may even overtax national militaries, requiring a larger international response." "These challenges are not traditional national security concerns, such as the conflict of arms or ideologies. But if left unaddressed they can threaten national security." These concerns are likely to keep growing and continue to be on the agendas at international meetings. A strongly worded draft communiqué for June's G8 summit in Heiligendamm, Germany, warns that "tackling climate change is an imperative, not a choice," reported the British newspaper The Independent on Sunday. The draft says: "Global warming caused largely by human activities is accelerating [and it] will seriously damage our common natural environment and severely weaken [the] global economy, with implications for international security."
Warming’s net negative now – insects, diseases, invasive species, and droughts all outweigh fertilization
Garber, ‘8 – reporter for US News and World Report [Kent, 5/28/2008, US News and World Report, “How Global Warming Will Hurt Crops,” http://www.usnews.com/articles/news/2008/05/28/how-global-warming-will-hurt-crops.html, DS]
The global food supply, as recent events have shown all too clearly, is threatened by many problems. Some of them are man-made; some are natural. The natural ones tend to be obvious—droughts, floods, hurricanes, earthquakes—and, in the past year alone, they have been notably devastating. Searing droughts in Australia and central Europe have squandered wheat supplies; more recently, Cyclone Nargis destroyed rice stocks for millions of people in Myanmar. Historically, the damage to food supplies by bad weather has been regarded as fleeting: catastrophic in the short term but ultimately remitting. Droughts ease, floodwaters recede, and farmers replant their crops. But as a new government report indicates, such views are increasingly narrow and outdated, in that they fail to acknowledge the creeping reach of global climate change. The report, released Tuesday, offers one of the most comprehensive looks yet at the impact that climate change is expected to have on U.S. agriculture over the next several decades. Not surprisingly, the prognosis is grim. Temperatures in the United States, scientists say, will rise on average by about 1.2 degrees Celsius by 2040, with carbon dioxide levels up more than 15 percent. The consequences for American-grown food, the report finds, will most likely be far-reaching: Some crop yields are predicted to drop; growing seasons will get longer and use more water; weeds and shrubs will grow faster and spread into new territory, some of it arable farmland; and insect and crop disease outbreaks will become more frequent. The new report, which was produced by more than a dozen agencies over multiple years and reflects the findings of more than 1,000 scientific studies, offers only predictions, but the predictions reflect a high degree of confidence. In a sense, there is a vein of fatalism among most scientists about what will happen in the next few decades. Government actions, they say, may alter the trajectory of climate change 50 to 100 years from now, but the fate of climate change in the short term has been largely shaped by past behavior, by carbon already released into the atmosphere. The question now is the extent of its impact. Some agricultural changes are already observable. In the central Great Plains, in states known for their grassy prairies and sprawling row crops, there are new neighbors: trees and large shrubs, often clustering in islands in the middle of fields. In the Southwest, perennial grasses have been largely pushed out by mesquite bushes, those long-rooted staples of the desert. And the invasive kudzu vine, formerly a nuisance only to the South, has advanced steadily northward, forming a staggered line stretching from Connecticut to Illinois. Human practices in all three cases have abetted the turnover, but climate change, scientists say, has been a primary driver, as invasive species reproduce more quickly and expand into areas once deemed too cold for their survival. In turn, high-quality pastureland, once ideal for livestock grazing, has become poor-quality brush, and farmland faces competitors for space. In the next 30 years these problems will very likely expand and multiply, as an already taxed food system faces threats on multiple fronts. A rise in temperature—even as little as 1 degree Celsius—could cause many plantings to fail, the report indicates, since pollen and seeds are sensitive to slight temperature changes. Yields of corn and rice are expected to decline slightly.
Heat-sensitive fruits and vegetables, such as tomatoes, will most likely suffer. Some of the potential damage will be blunted by higher carbon dioxide levels; soybean yields, for instance, will probably improve, because soybeans (and several other crops) thrive from higher carbon inputs. But if temperatures keep rising, the balance will ultimately tip: At some extreme temperature, cells stop dividing, and pollen dies. High ozone levels, which have risen sixfold in the United States in the past century and are expected to rise further, will suppress yields as well. In fact, ozone levels are already extremely high in the eastern and midwestern regions of the country, rivaled globally only by eastern China (no model of air quality, to be sure) and parts of western Europe. One recent study, for instance, found that high ozone levels significantly suppress yields of soybean, wheat, and peanuts in the Midwest. Eventually, the effects of climate change, far from being limited to individual plants, could percolate throughout entire ecosystems. If springs become warmer, as predicted, the crop-growing season will expand. Insects and pests, thriving in warmer winters, will reproduce more frequently and spread more rapidly. Many, in fact, are proliferating already, as reflected in reports of abnormally high rates of disease outbreaks in the western half of the United States. Higher temperatures also are usually accompanied by declining rainfall, threatening to slowly transform once lush areas into arid expanses. At the same time, droughts and heavy isolated rainfalls could become more numerous. For all the criticism that has been piled upon the $300 billion farm bill that Congress recently passed over President Bush's veto, the bill does include many provisions that pertain directly to concerns cited in the new report. Fruit and vegetable growers, for instance, will receive millions of dollars of new funding for research on pest and disease resistance.
SPS is the only solution to warming
Hsu 10 (Feng, PhD in Engineering, 12-2010, “Harnessing the Sun: Embarking on Humanity's Next Giant Leap,” Online Journal of Space Communication, http://spacejournal.ohio.edu/issue16/hsu.html, JG)
It has become increasingly evident that facing and solving the multiple issues concerning energy is the single most pressing problem that we face as a species. In recent years, there has been extensive debate and media coverage about alternative energy, sustainable development and global climate change, but what has been missing (at least in the mainstream media) is the knowledge and point of view of scientists and engineers. From the scientist's or engineer's perspective, this paper discusses the prospects for mankind's technological capability and societal will in harnessing solar energy, and focuses on the issues of 1) space-based solar power (SBSP) development and 2) why it is imperative that we harness the unparalleled power of the sun on a massive and unprecedented scale, which I believe will be humanity's next giant leap forward. Solar Power from a Historic Perspective Whether terrestrially based or space based, solar energy has not yet emerged as a significant solution in public discussions of global warming. Yet, among scientists and engineers and other visionaries, it is starting to be viewed as one of the most promising and viable ways to eventually remove human dependence on fossil fuels. Nearly three years ago at the Foundation For the Future (FFF) International Energy Conference, my presentation was one of the few that took a look back at energy use in human history[1]. In this paper, I would like to offer a brief summary of the various stages mankind has passed through in our quest for energy, and how long they lasted. To understand and fully appreciate the profound idea that humankind has and can continue to harness the sun's energy, it is imperative for us to learn from the history of our civilization and from the perspective of human evolution, especially from those societies in crisis over energy. Reviewing the history of human energy consumption and energy technologies, we can see that there were three such eras. In the early years of human presence on this planet, we relied on wood-generated energy, based on the burning of firewood, tree branches and the remains of agricultural harvests. Starting in the 1600s, our forefathers discovered the energy properties of coal, which taught us how to tap stored supplies of fossil fuels. Less than two hundred years later, about the middle of the 1800s, we found petroleum and learned to commercialize the use of oil and gas, which brought about our current industrial civilization. In the 20th century, society witnessed the dawn of electricity generation via hydro-power and atomic energy. Today, demand for energy continues to soar, but we're rapidly using up our supplies of easily accessible fossil fuels. What is more, a profound environmental crisis has emerged as the result of our total reliance on energy sources based on those fuels. In the 21st century, there is great uncertainty about world energy supplies. If you plot energy demand by year of human civilization on a terawatt scale, you will see the huge bump that occurred barely a hundred years ago (Figure 1). Before that, in the Stone Age, the cultivation of fire led to the emergence of agriculture, cooking, tool making, and all the early stages of human civilization. Now, after about 150 years of burning fossil fuels, the earth's 3 billion years' store of solar energy has been plundered. In my view, mankind must now embark on the next era of sustainable energy consumption and re-supply, the most obvious source of which is the mighty energy resource of our sun.
With adequate guidance and the use of human creativity and innovation, the 21st century will become the next great leap forward in human civilization, taming solar energy and transforming our combustion world economy into a lasting solar-electric world economy. [Figure 1. An approximation of the fossil fuel age on the scale of human history.] In solving humanity's energy problems we must learn from our ancestors. Taming the natural forces of the sun will be much like our ancestors' early efforts to harness the power of wild fire. We must use common sense, as they did, developing the tools and technologies that address the needs of our time. The Romans used flaming oil containers to destroy the Saracen fleet in 670. In the same century, the Japanese were digging wells to a depth approaching 900 feet with picks and shovels in search of oil. By 1100, the Chinese had reached depths of more than 3,000 feet in search of energy. This happened centuries before the West had sunk its first commercial well in 1859 in Titusville, Pennsylvania. With all such human creativity in the past, the search for energy has been driven by our combustion world economy, which focused primarily on what's beneath the surface of our planet - the secondary energy resources which originated from the power of our sun. Now it's time for mankind to lift their heads and start focusing our profound creativity on harnessing the sun and making our way into the energy technology frontiers in the sky. Solar Energy - The Ultimate Answer to Anthropogenic Climate Change The evidence of global warming is alarming. The potential for a catastrophic climate change scenario is dire. Until recently, I worked at Goddard Space Flight Center, a NASA research center in the forefront of space and earth science research. This Center is engaged in monitoring and analyzing climate changes on a global scale. I received first-hand scientific information and data relating to global warming issues, including the latest dynamics of ice cap melting and changes that occurred at either pole of our planet. I had the chance to discuss this research with my Goddard colleagues, who are world-leading experts on the subject. I now have no doubt global temperatures are rising, and that global warming is a serious problem confronting all of humanity. No matter whether these trends are due to human interference or to the cosmic cycling of our solar system, there are two basic facts that are crystal clear: a) there is overwhelming scientific evidence showing positive correlations between the level of CO2 concentrations in the earth's atmosphere and the historical fluctuations of global temperature changes; and b) the overwhelming majority of the world's scientific community is in agreement about the risks of a potential catastrophic global climate change. That is, if we humans continue to ignore this problem and do nothing, if we continue dumping huge quantities of greenhouse gases into earth's biosphere, humanity will be at dire risk. As a technical and technology risk assessment expert, I could show with confidence that we face orders of magnitude more risk doing nothing to curb our fossil-based energy addictions than we will in making a fundamental shift in our energy supply. This is because the risk of a catastrophic anthropogenic climate change is potentially the extinction of the human species, a risk that is simply too high for us to take any chances.
Of course, there will be economic consequences to all societies when we restrict the burning of fossil fuels in an effort to abate "global warming." What we are talking about are options and choices between risks. All human activities involve risk taking; we cannot avoid risks but only make trade-offs, hopefully choosing wisely. In this case, there has to be a risk-based probabilistic thought process when it comes to adopting national or international policies in dealing with global warming and energy issues. As the measure of risk is a product of "likelihood" and "consequence," when the consequence or risk of a potential human extinction (due to catastrophic climate change) is to be compared with the potential consequence or risk of loss of jobs or slowing the growth of the economy (due to restriction of fossil-based energy consumption), I believe the choice is clear. My view is that by making a paradigm shift in the world's energy supply over time through extensive R&D, technology innovations and increased production of renewable energy, we will create countless new careers and jobs and end up triggering the next level of economic development, the kind of pollution-free industrial revolution mankind has never before seen. The aggravation and acceleration of a potential anthropogenic catastrophic global climate change, in my opinion, is the number one risk incurred from our combustion-based world economy. At the International Energy Conference in Seattle, I showed three pairs of satellite images as evidence that the earth's glaciers are disappearing at an alarming rate.[2] Whether this warming trend can be reversed by human intervention is not clear, but this uncertainty in risk reduction doesn't justify human inaction in adopting policies and countermeasures on renewable energy development for a sustainable world economy, and for curbing the likelihood of any risk event of anthropogenic catastrophic climate change. What is imperative is that we start to do something in a significant way that has a chance to make a difference.
It’s not too late – best simulations prove that massive cuts can check back
NERSC, 9 (National Energy Research Scientific Computing Center, NERSC.gov, “It’s not too late to change Global Warming’s Course: Simulations show that cuts in greenhouse gas emissions would save arctic ice, reduce sea level rise,” http://www.nersc.gov/news-publications/science-news/2009/it-s-not-too-late/, JG)
The threat of global warming can still be greatly diminished if nations cut emissions of heat-trapping greenhouse gases by 70 percent this century, according to a study led by scientists at the National Center for Atmospheric Research (NCAR). While global temperatures would rise, the most dangerous potential aspects of climate change, including massive losses of Arctic sea ice and permafrost and significant sea level rise, could be partially avoided. "This research indicates that we can no longer avoid significant warming during this century," says NCAR scientist Warren Washington, the lead author. "But if the world were to implement this level of emission cuts, we could stabilize the threat of climate change and avoid an even greater catastrophe." To simulate a century of climate conditions, the researchers used more than 2000 processors of Franklin, the National Energy Research Scientific Computing Center's (NERSC) Cray XT4 system, as well as computers at the Oak Ridge and Argonne Leadership Computing Facilities and at NCAR. Over the past two years, the NCAR team received a total allocation of 50 million processor hours on NERSC computers for a variety of climate studies. Average global temperatures have warmed by close to 1 degree Celsius (almost 1.8 degrees Fahrenheit) since the pre-industrial era. Much of the warming is due to human-produced emissions of greenhouse gases, predominantly carbon dioxide. This heat-trapping gas has increased from a pre-industrial level of about 284 parts per million (ppm) in the atmosphere to more than 380 ppm today. With research showing that additional warming of about 1 degree C (1.8 degrees F) may be the threshold for dangerous climate change, the European Union has called for dramatic cuts in emissions of carbon dioxide and other greenhouse gases. The U.S. Congress is also debating the issue. To examine the impact of such cuts on the world's climate, Washington and his colleagues ran a series of global supercomputer studies with the NCAR-based Community Climate System Model (CCSM). They assumed that carbon dioxide levels could be held to 450 ppm at the end of this century. That figure comes from the U.S. Climate Change Science Program, which has cited 450 ppm as an attainable target if the world quickly adopts conservation practices and new green technologies to cut emissions dramatically. In contrast, emissions are now on track to reach about 750 ppm by 2100 if unchecked. The team's results showed that if carbon dioxide were held to 450 ppm, global temperatures would increase by 0.6 degrees C (about 1 degree F) above current readings by the end of the century. In contrast, the study showed that temperatures would rise by almost four times that amount, to 2.2 degrees C (4 degrees F) globally above current observations, if emissions were allowed to continue on their present course (Figure 1). Holding carbon dioxide levels to 450 ppm would have other impacts, according to the climate modeling study: Sea level rise due to thermal expansion as water temperatures warmed would be 14 centimeters (about 5.5 inches) instead of 22 centimeters (8.7 inches). Significant additional sea level rise would be expected in either scenario from melting ice sheets and glaciers. Arctic ice in the summertime would shrink by about a quarter in volume and stabilize by 2100, as opposed to shrinking at least three-quarters and continuing to melt.
Some research has suggested the summertime ice will disappear altogether this century if emissions continue on their current trajectory. Arctic warming would be reduced by almost half, helping preserve fisheries and populations of sea birds and Arctic mammals in such regions as the northern Bering Sea. Significant regional changes in precipitation, including decreased precipitation in the U.S. Southwest and an increase in the U.S. Northeast and Canada, would be cut in half if emissions were kept to 450 ppm (Figure 2). The climate system would stabilize by about 2100, instead of continuing to warm. The research team used supercomputer simulations to compare a business-as-usual scenario to one with dramatic cuts in carbon dioxide emissions beginning in about a decade. The authors stressed that they were not studying how such cuts could be achieved nor advocating a particular policy. "Our goal is to provide policymakers with appropriate research so they can make informed decisions," Washington says. "This study provides some hope that we can avoid the worst impacts of climate change—if society can cut emissions substantially over the next several decades and continue major cuts through the century."
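The relation between the two concentration paths in this card (450 ppm versus roughly 750 ppm) and the temperature numbers it reports can be approximated with the standard logarithmic CO2 forcing rule. The 5.35 W/m2 coefficient and the 3°C-per-doubling sensitivity below are textbook assumptions supplied here, not values stated by the NCAR study, and the study's own figures are transient by-2100 responses, so they sit somewhat below this equilibrium-style estimate:

```python
import math

C0 = 380.0                      # approximate present-day CO2, ppm (from the card)
F_2XCO2 = 5.35 * math.log(2.0)  # forcing per CO2 doubling, ~3.7 W/m^2 (assumed)
SENSITIVITY = 3.0               # equilibrium warming per doubling, deg C (assumed)

def equilibrium_warming(c_ppm):
    """Approximate equilibrium warming above today for a given CO2 level."""
    forcing = 5.35 * math.log(c_ppm / C0)
    return SENSITIVITY * forcing / F_2XCO2

print(round(equilibrium_warming(450.0), 1))  # ~0.7 deg C (study's transient value: 0.6)
print(round(equilibrium_warming(750.0), 1))  # ~2.9 deg C (study's transient value: 2.2)
```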
Prefer peer-reviewed experts and scientific consensus – climate deniers circumvent accountability and should be rejected
Lewandowsky, 6/20 - Australian Professorial Fellow, Cognitive Science Laboratories at University of Western Australia [Stephan, 6/20/2011, The Conversation, “Climate change denial and the abuse of peer review,”
On 20 April 2010, a BP oil rig exploded in the Gulf of Mexico, killing 11 workers and creating the largest oil spill in history. When President Obama sought to hold the corporation accountable by creating a $20B damage fund, this provoked Republican Congressman from Texas Joe Barton to issue a public apology. An apology not to the people affected by the oil spill … but to BP. In a peculiar inversion of ethics, Barton called the President’s measures a “shakedown”, finding it a “tragedy in the first proportion” that a corporation should be held accountable for the consequences of its actions. What does a Congressman’s inverted morality have to do with climate denial? Quite a bit. In a similar inversion of normal practice, most climate deniers avoid scrutiny by sidestepping the peer-review process that is fundamental to science, instead posting their material on the internet or writing books. Books may be impressively weighty, but remember that they are printed because a publisher thinks they can make money, not necessarily because the content has scientific value. Fiction sells, even if dressed up as science. During peer review, by contrast, commercial interests are removed from the publication decision because journals are often published by not-for-profit professional organizations. Even if private publishers are involved, they make their profit primarily via university subscriptions, and universities subscribe to journals based on their reputation, rather than based on individual publication decisions. Very occasionally a contrarian paper does appear in a peer-reviewed journal, which segments of the internet and the media immediately hail as evidence against global warming or its human causes, as if a single paper somehow nullifies thousands of previous scientific findings. What are we to make of that handful of contrarian papers? Do they make a legitimate if dissenting contribution to scientific knowledge? In some cases, perhaps. But in many other cases, troubling ethical questions arise from examination of the public record surrounding contrarian papers. For example, in 2003 the reputable journal Climate Research published a paleoclimatological analysis that concluded, in flat contradiction to virtually all existing research, that the 20th century was probably not the warmest of the last millennium. This paper, partially funded by the American Petroleum Institute, attracted considerable public and political attention because it seemingly offered relief from the need to address climate change. The paper also engendered some highly unusual fall-out. First, three editors of Climate Research resigned in protest over its publication, including the incoming editor-in-chief who charged that “…some editors were not as rigorous in the review process as is otherwise common.” This highly unusual mass resignation was followed by an even more unusual public statement from the publisher that acknowledged flaws in the journal’s editorial process. Three editorial resignations and a publisher’s acknowledgement of editorial flaws are not standard scientific practice and call for further examination of the authors and the accepting editor. The first author of this paper, Dr Willie Soon, is an astrophysicist by training. In U.S. congressional testimony, he identified his “training” in paleoclimatology as attendance at workshops, conferences, and summer schools. (The people who teach such summer schools, actual climate scientists, published a scathing rebuttal of Soon’s paper.)
Undaunted, Dr Soon has since become an expert on polar bears, publishing a paper that accused the U.S. Geological Survey of being “unscientific” in its reports about the risks faced by polar bears from climate change. Most recently, Dr Soon has become an expert on mercury poisoning, using the Wall Street Journal as a platform to assuage fears about mercury-contaminated fish because, after all, “mercury has always existed naturally in Earth’s environment.” Lest one wonder what links paleoclimatology, Arctic ecology, and environmental epidemiology, the answer is not any conventional area of academic expertise but ideology. As Professor Naomi Oreskes and historian Erik Conway have shown in their insightful book, Merchants of Doubt, the hallmark of organized denial is that the same pseudo-experts emerge from the same shadowy “think” tanks over and over to rail against what they call “junk science”. Whether it is the link between smoking and lung cancer, between mercury and water poisoning, or between carbon emissions and climate change, ideology inverts facts and ethics whenever overwhelming scientific evidence suggests the need to regulate economic activity. So what of the editor who accepted the flawed Climate Research paper, Dr Chris de Freitas of Auckland? De Freitas later co-authored a 2009 paper that some media outlets heralded as showing that climate change was down to nature. One of the authors, adjunct academic Bob Carter of James Cook University, claimed that “our paper confirms what many scientists already know: which is that no scientific justification exists for emissions regulation.” Welcome news indeed, at least for the coal industry, but does the paper support this conclusion? No. For starters, the 2009 paper by McLean, de Freitas, and Carter did not address long-term global warming at all. It discussed the association between ocean currents and air temperature — in particular the time lag between the warm El Niño current and the ensuing increase in temperature. Indeed, the article does not even contain the words “climate change” except in a citation of the IPCC, and its only conceivable connection with climate change arises from the speculative phrase “ … and perhaps recent trends in global temperature …” in the final sentence. It appears ethically troubling to derive strong statements about emissions regulations from such a tentative clause in one’s final sentence in a paper on quite a different issue. Such statements appear even more troubling if one considers paragraph 14 of the paper, which reads, “to remove the noise, the absolute values were replaced with derivative values based on variations. Here the derivative is the 12-month running average subtracted from the same average for data 12 months later.” What happens to data if successive annual values are subtracted from each other? This mathematically removes any linear time trend. In other words, temperatures could have risen steadily throughout the record and the long-term warming would have escaped detection by the authors. This removal of the trend did not escape detection by the scientific community, however, and the published rebuttal of this “it’s-all-natural” paper was as swift and devastating as it was for Dr Soon’s. To remove the linear trend from temperature data in a paper that does not address climate change, and to then claim that nature is responsible for global warming and there is no scientific basis for emissions regulations smacks of an inversion of scientific ethics and practice.
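To see concretely why that transform hides a warming trend, here is a minimal Python sketch on synthetic data. It is an illustration of the effect described above, not the authors' code: the numbers and variable names are invented, and the raw series shows its trend while the "derivative" series does not.

# Minimal sketch (illustration only, not the authors' code): synthetic data showing
# why the "derivative" transform described in paragraph 14 cannot reveal a
# long-term warming trend.
import numpy as np

months = np.arange(480)                                   # 40 years of monthly data
trend = 0.002 * months                                    # steady warming of ~0.024 degC per year
noise = np.random.default_rng(0).normal(0.0, 0.1, months.size)
temps = trend + noise                                     # synthetic temperature series

def running_mean(x, window=12):
    # 12-month running average
    return np.convolve(x, np.ones(window) / window, mode="valid")

smoothed = running_mean(temps)
# The paper's "derivative": the 12-month running average subtracted from the
# same average 12 months later.
derivative = smoothed[12:] - smoothed[:-12]

# Slope of the raw series recovers the imposed warming of ~0.024 degC/yr ...
print(np.polyfit(months, temps, 1)[0] * 12)
# ... but the derivative series has essentially zero slope: the warming now hides
# in a constant offset, so a correlation analysis on it cannot see the trend.
print(np.polyfit(np.arange(derivative.size), derivative, 1)[0] * 12)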
Let us return to Congressman Barton. Before apologizing to BP (an apology not for the nearly $3,000,000 he has received in contributions from the oil, gas, and energy industries, but for President Obama’s effort to hold the corporation accountable), Mr Barton also sponsored a contrived investigation of the famed “hockeystick” paper by Professor Michael Mann and colleagues. The hockeystick is the iconic graph that shows the sky-rocketing temperatures of the last few decades in comparison to the relatively constant temperatures during the preceding centuries. The U.S. National Academy of Sciences affirmed the basic conclusions of Professor Mann, as have numerous other papers published during the last decade. Mr Barton, however, relied on a report by a certain Professor Wegman, who claimed to have identified statistical flaws in the analysis underlying the original hockeystick. (Even if correct, that criticism has no bearing on the overall conclusion of Professor Mann’s paper or on the numerous independent hockeysticks produced by other researchers.) Professor Wegman subsequently published part of his report in the journal Computational Statistics and Data Analysis. Although the journal normally conducts peer review, in this instance the paper was accepted a few days after submission, in July 2007, an especially ironic twist given that the paper itself tried to cast doubt on the quality of peer review in climate research. Alas, the paper’s lifetime was cut tragically short when it was officially withdrawn by the publisher a few weeks ago. Why? The paper by Wegman and colleagues was withdrawn because of substantial plagiarism. Conforming to the typical pattern of inversions, Wegman also appears to have plagiarized large parts of his initial hockeystick critique for Congressman Barton, while additionally distorting and misrepresenting many of the conclusions of the cited authors. We have examined just the tip of an iceberg of inversion of normal standards of ethics and scientific practice. These multiple departures from common scientific practice are not isolated incidents; on the contrary, they represent a common thread that permeates all of climate denial. Because climate denial is just that: denial, not scepticism. Science is inherently sceptical, and peer-review is the instrument by which scientific scepticism is pursued. Circumventing or subverting that process does not do justice to the public’s need for scientific accountability. At a time when Greenland is losing around 9,000 tonnes of ice every second, all of which contributes to sea level rise, it is time to hold accountable those who invert common standards of science, decency, and ethics in pursuit of their agenda to delay action on climate change.
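For scale, the 9,000-tonnes-per-second figure can be converted to an annual mass loss and a sea-level contribution. The short calculation below is a back-of-envelope check, not from the article itself; it uses the standard conversion that roughly 360 Gt of land ice raises global mean sea level by about 1 mm.

# Back-of-envelope check (not from the article): converting ~9,000 tonnes of
# Greenland ice loss per second into an annual figure and a sea-level contribution.
seconds_per_year = 365.25 * 24 * 3600              # ~3.16e7 s
mass_loss_t_per_yr = 9_000 * seconds_per_year      # ~2.8e11 t/yr
mass_loss_gt_per_yr = mass_loss_t_per_yr / 1e9     # ~280 Gt/yr
sea_level_mm_per_yr = mass_loss_gt_per_yr / 360    # ~360 Gt of ice per mm of sea level
print(f"~{mass_loss_gt_per_yr:.0f} Gt/yr, roughly {sea_level_mm_per_yr:.1f} mm of sea-level rise per year")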
Models are good – more accurate than ever and improving
ScienceDaily, ‘8 [4/6/2008, “Climate Models Look Good When Predicting Climate Change,” ScienceDaily, http://www.sciencedaily.com/releases/2008/04/080402100001.htm, DS]
The accuracy of computer models that predict climate change over the coming decades has been the subject of debate among politicians, environmentalists, and even scientists. A new study by meteorologists at the University of Utah shows that current climate models are quite accurate and can be valuable tools for those seeking solutions for reversing global warming trends. Most of these models project a global warming trend that amounts to about 7 degrees Fahrenheit over the next 100 years. In the study, co-authors Thomas Reichler and Junsu Kim from the Department of Meteorology at the University of Utah investigate how well climate models actually do their job in simulating climate. To this end, they compare the output of the models against observations of present-day climate. The authors apply this method to about 50 different national and international models that were developed over the past two decades at major climate research centers in China, Russia, Australia, Canada, France, Korea, Great Britain, Germany, and the United States. Also included is the latest model generation, which was used for the recent (2007) report of the Intergovernmental Panel on Climate Change (IPCC). "Coupled models are becoming increasingly reliable tools for understanding climate and climate change, and the best models are now capable of simulating present-day climate with accuracy approaching conventional atmospheric observations," said Reichler. "We can now place a much higher level of confidence in model-based projections of climate change than in the past." The many hours spent studying the models and comparing them with observed climate address the growing need to know how much one can trust climate models and their predictions. Given the significance of climate change research in public policy, the study's results also provide an important response to critics of global warming. Earlier this year, Working Group One of the IPCC released its fourth global warming report. The University of Utah study results directly relate to this highly publicized report by showing that the models used for the IPCC paper have reached an unprecedented level of realism. Another important aspect of the research is that climate models built in the U.S. are now some of the best models worldwide. Increased efforts in the U.S. over the past few years to build better climate models have paid off, and according to the authors' measure of reliability, one of the U.S. models is now one of the leading climate models worldwide.
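The comparison the study describes, checking each model's simulated present-day climate against observations and ranking models by how close they come, can be sketched in a few lines of Python. This is an illustration under stated assumptions, not Reichler and Kim's actual performance index: the gridded fields are synthetic, the model names are invented, and the metric here is a simple area-weighted root-mean-square error rather than their published measure.

# Minimal sketch (assumed setup, not the study's code): scoring climate models by
# how closely a simulated present-day climatology matches observations.
import numpy as np

def area_weighted_rmse(model_field, obs_field, lats):
    # RMSE between a model's climatological field and observations, weighted by
    # cos(latitude) so small polar grid cells do not dominate the score.
    weights = np.cos(np.deg2rad(lats))[:, None] * np.ones_like(obs_field)
    err2 = (model_field - obs_field) ** 2
    return np.sqrt(np.average(err2, weights=weights))

# Synthetic stand-ins for gridded annual-mean surface temperature (lat x lon), in K.
lats = np.linspace(-87.5, 87.5, 36)
lons = np.linspace(0.0, 357.5, 72)
obs = 288 + 30 * np.cos(np.deg2rad(lats))[:, None] + np.zeros(lons.size)  # crude climatology

rng = np.random.default_rng(1)
models = {f"model_{i}": obs + rng.normal(0.0, sigma, obs.shape)   # models with varying error
          for i, sigma in enumerate([0.5, 1.0, 2.0])}

scores = {name: area_weighted_rmse(field, obs, lats) for name, field in models.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: {score:.2f} K")   # smaller = closer to observations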