The balance, or more accurately the imbalance, of radiation at the Earth's surface is the fundamental driver of climate and weather variability, on timescales ranging from seconds to millennia. Short-wavelength (shortwave) radiation from the Sun heats the Earth's surface, and the surface emits long-wavelength (longwave) infrared radiation back out to space. Clouds and atmospheric gases such as water vapor and carbon dioxide also emit longwave radiation that is absorbed by the ground, and this is a major component of the surface radiation budget. The relative magnitude of each of these radiative fluxes plays a large role in determining temperature changes at the surface, with the most obvious contrasts being those between day and night, and between summer and winter.
The rate of heating by shortwave radiation from the Sun is obviously affected by the time of year and of day, a location's latitude, the extent and character of cloudiness, and other factors like atmospheric aerosols (smoke, haze) and the degree of reflectivity (albedo) of the surface. As for the longwave radiation emitted by the ground and by the atmosphere, the rate of emission is essentially determined by the temperature of the respective components, as per the Stefan-Boltzmann law. So for example a warmer ground surface emits much more radiation upward than a cold surface; and a layer of low (warm) clouds aloft emits much more radiation downward than a clear sky. (This is why cloudy nights are usually warmer than clear nights, and clouds bring warmth to interior Alaska in winter.)
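To see how strongly the Stefan-Boltzmann law favors a warm emitter, here's a minimal sketch; the two temperatures are illustrative round numbers for a summer-like and a winter-like surface, not measurements:

```python
# Stefan-Boltzmann law: emitted flux scales as the 4th power of temperature.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_flux(temp_k: float, emissivity: float = 1.0) -> float:
    """Longwave flux (W/m^2) from a surface at temperature temp_k (kelvin)."""
    return emissivity * SIGMA * temp_k ** 4

# Illustrative warm vs. cold surfaces (about +17 C and -23 C):
warm = emitted_flux(290.0)
cold = emitted_flux(250.0)
print(round(warm), round(cold))  # the warm surface emits roughly 80% more
```

The fourth-power dependence is why a modest temperature difference produces such a large difference in emitted energy.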
It's an interesting exercise to develop a rough estimate of the radiation budget in the Fairbanks area. We can do this using observed shortwave radiation data from the CRN site 11 miles northeast of Fairbanks; this site has measured shortwave radiation since mid-2002. There are undoubtedly some differences between the CRN site and Fairbanks itself, but we'll ignore that for now.
The next assumption we make is that the radiation budget is in balance at both the summer peak of temperature and at the winter minimum of temperature; in other words, the following equation is satisfied on these dates:
absorbed shortwave + downward longwave = upward longwave
We know what the normal mid-summer and mid-winter temperatures are according to the 1981-2010 normals (63.7°F and -8.7°F at Fairbanks airport), so using Stefan-Boltzmann we can calculate the longwave radiation emitted upward from the ground at these temperatures. Then given a balanced budget, subtracting the absorbed shortwave immediately gives us the downward longwave on these two dates. The table below shows the results in terms of daily energy transfer per square meter; note that I've assumed a shortwave albedo of 0.132 on both dates. (This is about right for mixed coniferous and deciduous forest in summer; it certainly should be higher in winter, but forested areas still have a fairly low albedo even with snow on the ground. Of course, in the depths of winter the albedo doesn't make much difference as there is so little solar insolation.)
| Daily energy (Wh/m²) | Summer temperature maximum | Winter temperature minimum |
|---|---|---|
| Absorbed shortwave | 4382 | 17 |
| Upward longwave | 9726 | 5362 |
| Downward longwave | 5344 | 5344 |
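The longwave numbers in the table follow directly from the normal temperatures; here's a sketch of the arithmetic, with the absorbed-shortwave totals taken as given from the table:

```python
# Reproduce the table's longwave values from the 1981-2010 normal
# temperatures, assuming a balanced budget on each date.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def f_to_k(temp_f: float) -> float:
    """Convert degrees Fahrenheit to kelvin."""
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

def daily_upward_lw(temp_f: float) -> float:
    """Upward longwave over 24 hours (Wh/m^2) at a constant surface temperature."""
    return SIGMA * f_to_k(temp_f) ** 4 * 24.0

for label, temp_f, absorbed_sw in [("summer", 63.7, 4382.0),
                                   ("winter", -8.7, 17.0)]:
    up = daily_upward_lw(temp_f)
    down = up - absorbed_sw  # balance: absorbed SW + downward LW = upward LW
    print(label, round(up), round(down))
```

The results land within a watt-hour or two of the table, the small differences being down to rounding and the precision used for the Stefan-Boltzmann constant.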
With the arbitrary but realistic albedo choice that I made above, the estimated downward longwave radiation is (by design) exactly the same in summer and winter. I don't know how close this is to the truth, but I suspect it is not a bad assumption; the albedo constraints suggest it can't be far off. If we proceed with the assumption that the downward longwave is constant throughout the year, we can then calculate the radiation imbalance for any day of the year. The result is shown below.
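With the downward longwave held fixed, the day-by-day imbalance is a short calculation. The sketch below uses made-up example inputs (a day's absorbed shortwave and normal temperature) rather than the actual CRN and airport data:

```python
# Daily radiation imbalance under the constant-downward-longwave assumption.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
DOWNWARD_LW = 5344.0    # Wh/m^2 per day, assumed constant year-round

def f_to_k(temp_f: float) -> float:
    """Convert degrees Fahrenheit to kelvin."""
    return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

def daily_imbalance(absorbed_sw: float, normal_temp_f: float) -> float:
    """Net radiation (Wh/m^2 per day): positive = surplus, negative = deficit.

    absorbed_sw   -- that day's absorbed shortwave (Wh/m^2)
    normal_temp_f -- that day's normal temperature (deg F)
    """
    upward_lw = SIGMA * f_to_k(normal_temp_f) ** 4 * 24.0
    return absorbed_sw + DOWNWARD_LW - upward_lw

# Hypothetical spring day: surface still cold, insolation growing quickly.
print(round(daily_imbalance(3000.0, 20.0)))  # positive, i.e. a surplus
```

By construction the function returns (nearly) zero at the summer and winter temperature extremes, and the sign elsewhere tells us whether radiation is driving warming or cooling.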
Although I've made a few assumptions along the way - and this is admittedly a crude approach - the chart shows the basic features of the surface radiation processes in interior Alaska. The springtime gain in shortwave radiation outpaces the longwave losses, as the surface remains cold, and this leads to a radiation surplus and a warming trend. In autumn, the direct solar radiation drops off more rapidly than the emitted longwave, because the surface remains warm, leading to a net loss of radiation. We also see that autumn's peak rate of energy loss is greater than springtime's peak rate of gain, and so the temperature change is a bit more rapid in autumn than in spring.