The CERES and CRN data are not well correlated year by year, but both series show a statistically significant downward trend in average incoming solar radiation. This is consistent with increasing cloud cover associated with a warming Arctic in which reduced ice cover and a warmer ice-free ocean surface tend to produce more evaporation and higher atmospheric moisture content.
The CERES estimate of reflected solar radiation is more variable from year to year than the downward flux, but it too suggests a downward trend. If annual average albedo is decreasing (owing to a shorter ice/snow season), then we would expect reflected solar radiation to decrease even more rapidly than incoming radiation, and in fact the linear trends do support this. According to this short period of record, the surface is absorbing slightly more solar energy even though cloudiness has increased markedly.
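The bookkeeping behind that statement is simple enough to sketch. This is purely illustrative, with invented numbers rather than actual CERES values: if reflected solar radiation falls faster than incoming, the surface absorbs more even as total sunshine declines.

```python
# Illustrative sketch only - these are made-up fluxes, not CERES data.
def albedo(sw_down, sw_up):
    """Annual albedo: reflected over incoming shortwave."""
    return sw_up / sw_down

def absorbed_solar(sw_down, sw_up):
    """Shortwave energy absorbed by the surface (W/m^2)."""
    return sw_down - sw_up

# Hypothetical example: incoming drops 120 -> 115 W/m^2, but reflected
# drops faster (60 -> 52), so albedo falls and absorption rises.
early = absorbed_solar(120, 60)   # 60 W/m^2, albedo 0.50
later = absorbed_solar(115, 52)   # 63 W/m^2, albedo ~0.45
```

The point of the toy numbers is just that the sign of the absorbed-solar trend depends on the *relative* rates of decline, which is why the faster drop in reflected radiation matters.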
The situation in Fairbanks is of course much different. There are no significant trends in the shortwave radiation data, although it's interesting to note that 2018 was the least sunny year in the CRN data set by some margin. 2013 stands out as being a sunny year, but that's mostly because of the extraordinary weather of May and June that year.
How about longwave radiation? First I'll emphasize how closely tied the longwave energy transfer is to the surface temperature; in both Fairbanks and Utqiaġvik, the annual average emission of infrared radiation upward from the surface is correlated with the annual mean temperature at about R=0.9; this is just the Stefan-Boltzmann law in action.
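To make the Stefan-Boltzmann connection concrete, here's a minimal sketch; the emissivity value of 0.98 is a typical assumption for natural surfaces, not a number from the CERES or CRN products.

```python
# Stefan-Boltzmann law: upward infrared emission scales as T^4.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def upward_longwave(temp_c, emissivity=0.98):
    """Surface infrared emission (W/m^2) for a temperature in Celsius.

    The 0.98 emissivity is an assumed typical value, not measured.
    """
    t_k = temp_c + 273.15
    return emissivity * SIGMA * t_k ** 4

# A 1 deg C warming near -10 C raises emission by roughly 4 W/m^2:
delta = upward_longwave(-9.0) - upward_longwave(-10.0)
```

The T^4 dependence is why annual mean temperature tracks upward longwave emission so tightly (R~0.9): over the narrow range of annual means, the curve is nearly linear.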
With longwave radiation being closely tied to temperature, it's no surprise that (a) there is not much variation in absolute terms from year to year, but (b) values have been noticeably higher since 2016 at Utqiaġvik - see below. This means that the surface is losing energy upward at a greater rate; it's a natural consequence of warming and it's a negative feedback, although the downward longwave has also increased by nearly the same amount. The CERES data hint at a small increase in the net energy loss from the surface at these wavelengths.
As with the shortwave data, the trends are smaller in Fairbanks; it appears there has been a small increase in net energy loss - similar to Utqiaġvik - but it's very marginal in terms of statistical significance (p~0.1).
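For readers curious what sits behind a statement like "p~0.1", here's a bare-bones version of the usual least-squares trend test: fit a slope of annual values against year and compare it to its standard error. The data in the example are made up, and a real analysis would convert the t-statistic to a p-value with the t-distribution (e.g. via scipy).

```python
# Minimal ordinary-least-squares trend test sketch (invented data).
import math

def trend_slope_t(years, values):
    """Return the OLS slope and its t-statistic for values vs. years."""
    n = len(years)
    xbar = sum(years) / n
    ybar = sum(values) / n
    sxx = sum((x - xbar) ** 2 for x in years)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(years, values))
    slope = sxy / sxx
    # Residuals from the fitted line, then the slope's standard error.
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(years, values)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, slope / se
```

With only a decade or two of annual values, the standard error is large relative to any modest slope, which is exactly why trends like the Fairbanks longwave change hover at marginal significance.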
What happens when we add up the net shortwave and longwave contributions? See the chart below. For Fairbanks, there is a hint of a downward trend in the overall annual radiation gain owing to the longwave changes (increased upward emission), but again it's not really significant.
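The addition itself is just flux arithmetic; a sketch with the usual sign convention (positive = surface energy gain), using made-up fluxes rather than the actual station numbers:

```python
# Net surface radiation from its four components (all W/m^2).
# Sign convention: positive net means the surface gains energy.
def net_radiation(sw_down, sw_up, lw_down, lw_up):
    net_sw = sw_down - sw_up  # absorbed solar
    net_lw = lw_down - lw_up  # typically negative: surface emits more IR
    return net_sw + net_lw

# Hypothetical annual means: 70 W/m^2 absorbed solar offset by
# 50 W/m^2 net longwave loss leaves a 20 W/m^2 gain.
example = net_radiation(100, 30, 250, 300)
```

This framing makes the offsetting trends easy to see: a small increase in net shortwave and a comparable increase in longwave loss can leave the total essentially flat.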
Utqiaġvik's total radiation gain is noisy, but here the interesting result is the absence of a trend; it turns out that the small gain in shortwave radiation (lower albedo) is almost exactly offset by a small increase in net energy loss via longwave. Does this suggest that the ice-albedo effect is less of a concern than we might have thought? I don't think so, but it would be worth examining more closely for other parts of the Arctic; there is always more to learn.
One caveat associated with all of this is of course that the surface CERES data are computed with a radiative transfer model that depends on all sorts of inputs, both measured and estimated. The model undoubtedly has some deficiencies, so the results - and especially the trend analyses - could be sensitive to details of the model; in other words, I frankly have no idea what the error distributions might be on the CERES data.