Humans and climate change can take credit for a much warmer Arctic, according to new research
By David Biello
Courtesy: Scientific American
Based on its long-term orbit, Earth should be heading into an ice age. But instead of continuing to cool—as it had been for at least the past 2,000 years—the Arctic has started to warm. And the reason is humans’ impact on the composition of the atmosphere, new research suggests.
To look at this trend, geologist Darrell Kaufman of Northern Arizona University and a consortium of colleagues reconstructed Arctic temperatures decade by decade over the past two millennia by pulling
sediment cores from the bottoms of 14 Arctic lakes—backed up by records in tree rings and ice cores.
In warm summers, relatively more sediment is deposited thanks to
more meltwater from the glaciers that create these lakes, and the abundance of algae in the sediment layers reveals the length of growing seasons. So, these sediment cores provide a picture of the climate that goes back millennia.
The record they reveal is of a cooling pole. As the Earth has moved slightly farther from the sun due to
vagaries in its orbit—it’s roughly 600,000 miles farther away now than in 1 C.E.—some parts of the Arctic received as much as 6 watts per square meter less sunlight than in 1 C.E. That, in turn, has led to a cooling rate of roughly 0.2 degrees Celsius per 1,000 years. But at some point in the 20th century, that trend stopped and reversed.
"Orbitally driven summer insolation continued to decrease through the 20th century, implying that summer temperatures should have continued to cool," the researchers wrote this week in the September 4 edition of Science. "Instead, the
shift to higher temperatures during the 20th century reversed the millennial scale cooling trend."
In the past decade, summertime Arctic temperatures have been 1.4 degrees Celsius higher on average than would be expected and 1.2 degrees Celsius higher than in 1900. And the Arctic is merely the trendsetter—the northernmost latitudes are among the
fastest-warming parts of the globe due to various feedbacks. For example, melting Arctic sea ice exposes more open ocean, which in turn absorbs more sunlight and further amplifies the warming.
A graph of the warming trend largely replicates the so-called "hockey stick," a previous reconstruction that showed relatively stable temperatures suddenly spiking upward in recent history. It also accurately reveals the impact of historical climate events like the Little Ice Age, which took place from the 17th to 19th centuries.
Without greenhouse gas emissions in the atmosphere, a true ice age might have been expected, as a 21,000-year wobble in Earth’s tilt relative to the sun shifts the
intensity of sunlight. That cooling trend wouldn’t have reversed naturally for at least another 4,000 years. Yet, despite this decline in sunlight, Arctic temperatures have soared, and the most likely culprit is the build-up of greenhouse gases in the atmosphere from fossil fuel burning, forest clearing and other human activity, Kaufman and his colleagues wrote.
"The most recent 10-year interval (1999–2008) was the warmest of the past 200 decades," they wrote. "Temperatures were about 1.4 degrees C higher than the projected values based on the linear cooling trend and were even more anomalous than previously documented."
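The arithmetic behind that anomaly is straightforward to sketch: extend the long-term linear cooling trend forward, then subtract the projection from the observed temperature. In the sketch below, only the 0.2 °C-per-millennium rate comes from the study; the baseline value and the observed temperature are illustrative assumptions, not data from the paper.

```python
# Sketch of an anomaly-versus-trend calculation. The cooling rate of
# 0.2 degrees C per 1,000 years is from the reconstruction; the baseline
# and observed values below are illustrative assumptions only.

COOLING_RATE = -0.2 / 1000.0  # degrees C per year


def projected_temp(year, baseline_temp=0.0, baseline_year=1):
    """Temperature projected by extending the linear cooling trend."""
    return baseline_temp + COOLING_RATE * (year - baseline_year)


def anomaly(observed_temp, year):
    """How far an observed temperature sits above the trend line."""
    return observed_temp - projected_temp(year)


# Extending the trend across two millennia implies roughly 0.4 degrees C
# of expected cooling since 1 C.E.
print(round(projected_temp(2001), 2))  # -> -0.4

# An (assumed) observed temperature 1.0 degree C above the baseline would
# then sit 1.4 degrees C above the projected trend value.
print(round(anomaly(1.0, 2001), 2))  # -> 1.4
```

This is how a measured warming of about 1 °C over the baseline can register as a 1.4 °C anomaly: the comparison is against where the cooling trend says temperatures should have been, not against where they started.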
Of course, summer temperatures when the warming portion of the wobble cycle peaked roughly 7,500 years ago were at least 0.8 degrees Celsius warmer than 20th-century average temperatures. Nonetheless, this current, countercyclical warming trend will likely continue—potentially exceeding that earlier warming—unless
greenhouse gas levels begin to come back down. In the meantime, polar denizens adapted to the cooler climate can blame humanity for a balmier Arctic.