The chart below shows the monthly normals for the familiar shortwave radiation budget near Fairbanks, according to ERA5. Solar input (red columns) is close to zero from November through January; it rises quickly in spring under relatively clear skies, and its decline in autumn is more gradual, because increased cloud cover in July and August brings a premature downturn (e.g. July has more daylight than May, but much more cloud cover).
The blue columns show the normal upward (reflected) shortwave radiation at ground level; this peaks in April, because the ground is usually snow-covered for most of that month under relatively strong sunshine. However, the albedo (i.e. the fraction of incoming solar radiation that's reflected) drops off markedly during April as snow disappears from trees and eventually from the ground, and it remains near 0.1 until October, when snow cover typically returns.
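To make the albedo idea concrete, here's a minimal sketch of the calculation - just the ratio of upward to downward shortwave flux. The numbers below are round placeholder values I've chosen for illustration, not the actual ERA5 normals.

```python
# Albedo = reflected (upward) shortwave / incoming (downward) shortwave.
# Monthly-mean fluxes in W/m^2 -- placeholder values, not real ERA5 data.
sw_down = {"Apr": 160.0, "Jun": 230.0, "Aug": 130.0, "Oct": 25.0}
sw_up   = {"Apr": 95.0,  "Jun": 25.0,  "Aug": 14.0,  "Oct": 10.0}

for month, down in sw_down.items():
    albedo = sw_up[month] / down  # fraction of incoming solar reflected
    print(f"{month}: albedo = {albedo:.2f}")
```

With numbers like these, the snow-covered April value comes out near 0.6, while the snow-free summer months sit close to 0.1, matching the seasonal pattern described above.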
This much is pretty straightforward to understand. However, the longwave radiation budget is less intuitive, and in particular it can be a surprise to see that the rate of energy transfer for both incoming and outgoing longwave radiation is much greater than for shortwave - see below. Probably not many of us would guess that much more radiation is warming us from the sky above than we receive directly in the form of sunshine, even in summer. Of course this is largely because we're bathed in longwave radiation at all hours of the day and night (emitted by clouds and the atmosphere), with little change from hour to hour, but intense sunshine is confined to only a portion of the day.
But although the incoming longwave is large, the outgoing is even larger at all times of the year, because the ground temperature is higher than the average emitting temperature of the clouds and air above. The longwave radiation flux is proportional to the fourth power of temperature, so both upward and downward components track very closely with the seasonal temperature cycle. The only obvious departure from a simple seasonal cycle that I can see is that the downward flux doesn't increase from January to March as quickly as the upward flux, and that's because the surface warms up more quickly than the air aloft (and also because the air stays very dry well into spring - water vapor is very efficient at absorbing and emitting these wavelengths).
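The fourth-power dependence is the Stefan-Boltzmann law: a blackbody at temperature T emits σT⁴ watts per square meter. A quick check with round-number temperatures (my own examples, not ERA5 values) shows why a surface even modestly warmer than the effective emitting level above it loses longwave energy on net:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(t_kelvin):
    """Longwave flux emitted by a blackbody at temperature t_kelvin."""
    return SIGMA * t_kelvin ** 4

# Example: a surface near freezing vs. a colder effective emitting
# temperature for the cloud/air layer above it.
upward   = blackbody_flux(273.0)  # ~315 W/m^2
downward = blackbody_flux(250.0)  # ~221 W/m^2
print(f"net longwave loss: {upward - downward:.0f} W/m^2")
```

A 23 K temperature difference is enough to produce a net loss of roughly 90 W/m², and because of the fourth power, the same temperature gap costs more flux at warmer temperatures than at colder ones.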
So we have a net loss of longwave energy at all seasons, and a net gain of shortwave in all but winter. What does the overall net look like?
Here we see that there's a net gain of radiative energy from March (barely) through September, and a small loss from October through February; and overall it's a significant gain over the course of a year, which is perhaps a bit surprising at a latitude of nearly 65°N.
Now someone may ask why the temperature in Fairbanks drops so dramatically in the early autumn when there's still a net gain of radiation - even in September, according to ERA5. The answer is that this radiation budget pertains to the ground surface, not the air above. The atmosphere absorbs relatively little shortwave radiation (and emits essentially none), so the longwave balance is the only thing at play - and consider that the atmosphere gains longwave energy mainly from below, while it emits it both downward to the ground and upward to space. This implies a significant net radiative loss for the atmosphere year-round, and when heat transfer from the surface (by mixing/convection) drops off in the autumn, there's nothing to stop the air from cooling rapidly.
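This one-gain, two-loss bookkeeping for the atmosphere can be sketched with a toy single-layer model. The perfect-absorber assumption and the temperatures below are simplifications of my own, not anything from ERA5:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def atmosphere_net_longwave(t_surface, t_air):
    """Toy single-layer atmosphere: it absorbs all the upward longwave
    from the surface, but emits sigma*T^4 twice -- once upward to space
    and once downward to the ground."""
    absorbed = SIGMA * t_surface ** 4   # gain from the ground below
    emitted = 2 * SIGMA * t_air ** 4    # loss: up to space + down to ground
    return absorbed - emitted

# For realistic values (air only modestly colder than the surface),
# emission exceeds absorption, i.e. a net radiative loss for the air:
print(atmosphere_net_longwave(273.0, 250.0))  # negative
```

In this toy setup the layer only breaks even if it's far colder than the surface, so with typical surface-air temperature differences the atmosphere runs a steady longwave deficit, consistent with the argument above.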
In another post I'll take a look at long-term trends in the different radiation components.