Raw Earth imagery from satellites looks blue-ish, because the molecules in the atmosphere scatter the blue portion of sunlight the most. We run correction algorithms to recover colors that are truer to what you would see from the ground. This is important not only for beautiful maps, but also for measuring vegetation indices, albedo, burned area, land cover type and temporal change. In short, removing atmospheric effects like this blue cast is key for remote sensing.

Here is a look at Venice, Italy, after the effects of the atmosphere have been removed using the NASA-developed LEDAPS code. When we run LEDAPS, the code essentially takes a parameterized model of the atmosphere, pulls in ancillary data from other instruments, and applies scattering physics to invert the process and find the input [surface] reflectance that produces the image recorded by the satellite. The code has to implement various techniques to exclude pixels such as clouds and cloud shadows, and uses lookup tables to speed up the highly compute-intensive inversion. NASA has been developing, refining and publishing this code since 2005, and is currently implementing the changes needed to support Landsat 8.
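To give a feel for the inversion idea (this is a minimal sketch, not the actual LEDAPS implementation), a very simplified one-band atmosphere model says the reflectance seen at the satellite is a path term added by the atmosphere plus the surface signal attenuated by the atmosphere's transmittance. Solving that for the surface reflectance looks like this; the numbers and the `correct_band` helper are hypothetical, whereas LEDAPS derives its terms from ancillary data and radiative-transfer lookup tables:

```python
def correct_band(rho_toa, rho_path, transmittance):
    """Invert a simplified atmosphere model:
    rho_toa = rho_path + transmittance * rho_surface,
    solved for the surface reflectance."""
    return (rho_toa - rho_path) / transmittance

# Hypothetical pixel: bright-ish at the top of the atmosphere (0.18),
# of which 0.10 is atmospheric path reflectance, with 80% transmittance.
rho_surface = correct_band(rho_toa=0.18, rho_path=0.10, transmittance=0.8)
print(round(rho_surface, 3))  # → 0.1
```

The real code inverts a full radiative-transfer model per band and per pixel, which is exactly why the lookup tables mentioned above are needed.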

This scattering process is what makes the sky blue from the ground looking up, and the surface look blue-ish to the satellites looking down. The shorter the wavelength, the more efficient the scattering. Since blue is the shortest wavelength the human eye sees well, the sky is blue. About 2/3 of the solar light gets deflected. Also, the more atmosphere the light passes through, the more scattering. That is why clear noon skies are blue, and why humid sunsets show a horizontal red gradient.
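The wavelength dependence above is Rayleigh scattering, whose efficiency scales roughly as 1/wavelength⁴. A quick back-of-the-envelope comparison (the 450 nm and 650 nm values are typical choices for blue and red, not numbers from the text):

```python
def rayleigh_ratio(short_nm, long_nm):
    """How many times more strongly the shorter wavelength is
    scattered, using the Rayleigh 1/wavelength^4 scaling."""
    return (long_nm / short_nm) ** 4

# Blue (~450 nm) vs red (~650 nm) visible light:
print(round(rayleigh_ratio(450, 650), 1))  # → 4.4
```

So blue light scatters roughly four times more than red, which is why the atmosphere adds a blue cast looking down and a blue sky looking up.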

Sunset picture, from ahumbleperspective

And yes, we also remove clouds.