We know there are larger forces outside the atmosphere that affect the Earth's climate. The Earth and Sun move through the galaxy, exposed to different environments as the solar system orbits the galactic core over a period of 225-250 million years. The eccentricity of the Earth's orbit (its deviation from a perfect circle) changes over a roughly 100,000-year period. The tilt of the Earth's axis varies over 41,000 years, from about 22.1 degrees to 24.5 degrees. The axis itself changes direction, undergoing precession (see here) over about 26,000 years. All of these change the amount of sunlight received, and the timing of the seasons.
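As a rough illustration (my own sketch, not a real insolation model), the three orbital cycles can be pictured as superposed sinusoids with the periods given above. The amplitudes here are made up; only the periods come from the text.

```python
import math

# Approximate orbital-cycle periods, in years (values from the text above)
PERIODS = {
    "eccentricity": 100_000,
    "obliquity": 41_000,
    "precession": 26_000,
}

def toy_orbital_signal(t_years, amplitudes=(1.0, 0.5, 0.3)):
    """Toy superposition of the three orbital cycles.

    Purely illustrative: real insolation depends on latitude, season,
    and the detailed orbital solution, not equal-phase cosine waves
    with invented amplitudes.
    """
    return sum(a * math.cos(2 * math.pi * t_years / p)
               for a, p in zip(amplitudes, PERIODS.values()))

# Sample the combined signal every 10,000 years over one eccentricity cycle
signal = [toy_orbital_signal(t) for t in range(0, 100_001, 10_000)]
```

Because the three periods differ, the combined signal drifts in and out of phase, which is why the climate record shows beats at several timescales rather than one clean cycle.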
The 100,000-year and 41,000-year periods in these orbital changes correspond closely to similar periods seen in the climate data (see the 5-million-year graph above). It's certainly not a stretch to think that the Sun has something to do with climate!
The output of the Sun itself varies. The number of sunspots rises and falls on an 11-year "Schwabe Cycle." Sunspot numbers also vary over longer periods: a 75-90-year "Gleissberg Cycle," a 200-500-year "Suess Cycle," and a 1,100-1,500-year "Bond Cycle." In the National Post article here, R. Timothy Patterson says that a close correlation between the output of the Sun and climate is seen:
Our finding of a direct correlation between variations in the brightness of the sun and earthly climate indicators (called "proxies") is not unique. Hundreds of other studies, using proxies from tree rings in Russia's Kola Peninsula to water levels of the Nile, show exactly the same thing: The sun appears to drive climate change.
Sunspots have been counted continuously since around 1749, with scattered earlier observations back to 1610. Before that, carbon-14 levels can be used as a proxy for solar activity. The last 1,100 years are shown below.
You can compare this data with the 2,000-year temperature graph above (which unfortunately runs in the other direction). Around 1000 AD, at the peak of the Medieval Warm Period, there is a maximum in solar activity. Similarly, the modern warm spell also corresponds to a peak. The Little Ice Age, on the other hand, lines up with a minimum of activity.
The brightness of the sun doesn't vary enough to cause these effects on temperature, so until recently there was no mechanism that would explain the correlation. However, a laboratory experiment has shown that cosmic rays can affect cloud formation. If I understand correctly, an increase in solar activity strengthens the solar wind, which shields the Earth from cosmic rays. Fewer cosmic rays mean fewer clouds form, allowing more sunlight to reach the ground. In times when solar activity is low, the reverse happens: more clouds form, and less sunlight gets through.
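The causal chain described above can be written down as a toy sign model. This is my own sketch, not anything from the experiment or the articles; every coefficient is arbitrary, and only the signs of the relationships matter.

```python
def toy_cloud_chain(solar_activity):
    """Toy sketch of the proposed chain:
    more solar activity -> stronger solar wind -> fewer cosmic rays
    -> fewer clouds -> more sunlight at the surface.
    All coefficients are invented; only the signs matter.
    """
    solar_wind = 1.0 * solar_activity          # activity strengthens the wind
    cosmic_rays = 1.0 - 0.5 * solar_wind       # wind shields Earth from rays
    cloud_cover = 0.3 + 0.4 * cosmic_rays      # rays seed cloud formation
    sunlight = 1.0 - cloud_cover               # clouds block sunlight
    return sunlight
```

Running the chain with high and low activity confirms the claimed direction: an active sun ends up delivering more sunlight to the ground than a quiet one, even though the sun's own brightness barely changes in this sketch.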
This is the "deniers'" answer to the current change in climate. The sun is more active than it has been in 8,000 years. This is heating the Earth. The CO2 increase results from the oceans releasing the gas as they warm. Human-produced CO2 is a minor factor.
There are some reports of similar changes on Mars, which would naturally also be warmed by changes in the Sun (though I would think that cloud formation happens differently in the thin Martian atmosphere, and that the solar wind thins out farther from the Sun, making Mars less sensitive). See here for more information, or here for a rebuttal of the whole idea (from the pro-warming site RealClimate).
Bad Data

We've seen that the climate is extremely variable, over both the longest timescales and the shortest. We're told that temperature has risen one degree centigrade in the last 100 years. As we've seen, shifts this large happen to the climate all the time. In the National Post article by R. Timothy Patterson here, we have this paragraph:
Ours is one of the highest-quality climate records available anywhere today and in it we see obvious confirmation that natural climate change can be dramatic. For example, in the middle of a 62-year slice of the record at about 4,400 years ago, there was a shift in climate in only a couple of seasons from warm, dry and sunny conditions to one that was mostly cold and rainy for several decades.
We're not even sure that the temperature data is correct. I've listed a few of the criticisms: various reconstructed temperature sequences don't agree, and modern temperature data (from weather stations) may be biased by the growth of cities around the stations. What data we do have doesn't cover the whole world (far from it), so some of the effects we've seen may be local.
The timing of the modern warming doesn't correspond very well to CO2 levels (which Jaworowski thinks are also poorly reconstructed). The 2,000-year graph above includes reconstructions that show a continuous temperature increase since the worst of the Little Ice Age around 1600 -- long before industrialization could have played a role.
The solar cycle certainly seems to have some effect on climate. Even if the warming is real, you'd have to subtract the influence of the solar cycle to gauge the human impact.
Most of the concern about the future comes not from any measurement, or from any simple theory of what will happen. Instead, it comes from predictions produced by extremely complicated computer models, models which are known to be incomplete and which can't be realistically tested against real data for decades yet.
You and I can't directly evaluate these models, but it's not as if these are the only computer models in existence. Models are used for all kinds of things we are familiar with. Computer models give us our daily weather forecasts. Computer models predict hurricane and tornado tracks during a storm. Computer models are used by traders on stock exchanges. We have experience with these models and how accurate they are. Climate models are the biggest models around, yet they have less of a track record than any other economic or weather model.
If someone told you they had a computer model that would predict the size of the economy in 2100, you'd laugh. If they demanded (and got) a massive government spending program based on this model, you'd be angry. Yet the warming model must include an economic model. The economic model tells you how much CO2 is being put into the atmosphere, which then drives the climate model. If the economic model is wrong, so is the climate model.
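The dependence can be made concrete with a back-of-the-envelope sketch. The numbers here are invented for illustration, taken from no real model: if projected warming scales with projected cumulative emissions, then an error in the economic model passes straight through to the climate projection.

```python
def projected_warming(cumulative_emissions_gtc, sensitivity=0.0015):
    """Toy linear projection: warming proportional to cumulative emissions.

    `sensitivity` (degrees C per gigatonne of carbon) is an invented
    illustrative number, not a value from any published model.
    """
    return sensitivity * cumulative_emissions_gtc

# If the economic model overstates century emissions by 50%,
# the warming projection is overstated by the same 50%.
truth = projected_warming(1000)       # hypothetical "true" emissions path
overstated = projected_warming(1500)  # economic model 50% too high
error_ratio = overstated / truth      # -> 1.5
```

In a linear sketch like this, the error passes through one-for-one; a real climate model is nonlinear, but the point stands that its output can be no more trustworthy than the emissions scenario fed into it.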
It's certainly possible that we are scaring ourselves over nothing. It has happened before! In the 1970s, scientists were just as certain that the world was heading into a new ice age. They were sure this was imminent because temperatures fell from the 1940s to the 1970s. If they had had access to the ice core data, they would have been even more certain. The graphs above show that the Earth has been in an ice age for millions of years, and that our brief warm period is already as long as the typical interglacial in each 100,000-year cycle.
But the predictions of a new ice age were wrong. After 1975, the climate started to warm again. We think of the climate as stable, but it's not. We think the current climate is normal, but we're wrong. We have to remember that these events happen on a much longer timescale than human lives or even human history. The period we live in is a brief gap in an ice age lasting millions of years.
Think of it another way. If the Dust Bowl drought of the 1930s were to happen today, everyone would blame it on global warming. Yet there's no way that global temperature had risen enough by the 1930s to cause droughts. In fact, the geological record in the Southwest shows that droughts have come and gone for millennia, and that some naturally lasted over 100 years.
So when you look at the global warming idea, you could just put it down to us not having a long enough experience to know what's "normal." We're like mayflies who have never seen summer and think it's the end of the world.