In 2003 the Scan Line Corrector on Landsat 7 failed. This malfunction now causes stark black diagonals across all images. Despite the malfunction, NASA kept the satellite in operation; it's still collecting imagery, but at 80% of its intended throughput.

This imagery is incredibly valuable. It is the primary source used by the Satellite team for the next release of our Cloudless Atlas. Part of our job is to remove artifacts stemming from this Landsat 7 hardware malfunction.

Unfortunately, the malfunction propagates through our cloudless pipeline. Certain regions of the world have far fewer pixels due to the diagonals, resulting in banding artifacts that coincide with the Landsat 7 null regions.

## Background

A Fourier Transform is a function that decomposes a signal, such as an image, into the periodic waves that make it up. Since the banding artifacts are periodic, their wave components accumulate over the entire image, and appear in the transform as coalesced points of large amplitude.

## The approach

### Transform the image

Every declouded tile from across the world is transformed using a Fast Fourier Transform (FFT). The peaks that correspond to the bands can be subtle to the eye, and vary in inclination and strength.

An algorithm to deband these images is becoming clear:

- Transform the image using a Fast Fourier Transform
- Locate the artifact peaks
- Remove the artifacts
- Transform the image back using an inverse transform
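As a sketch, the round trip at the heart of those steps looks like this in NumPy (the synthetic tile and banding pattern are made up for illustration):

```python
import numpy as np

# A hypothetical tile: a smooth gradient plus periodic diagonal banding.
h, w = 256, 256
yy, xx = np.mgrid[0:h, 0:w]
tile = xx / w + 0.3 * np.sin(2 * np.pi * (xx + yy) / 16)

# Forward transform; fftshift moves the zero-frequency term to the center,
# so periodic banding shows up as symmetric bright peaks around it.
spectrum = np.fft.fftshift(np.fft.fft2(tile))
power = np.log1p(np.abs(spectrum))  # log scale tames the dynamic range

# ... locate and zero out the artifact peaks here ...

# Inverse transform recovers the (corrected) spatial image.
restored = np.fft.ifft2(np.fft.ifftshift(spectrum)).real
```

With no peaks removed, the round trip is lossless up to floating point error, which is a handy sanity check when wiring up the pipeline.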

### Metadata is important

Before searching the image for artifacts, we use metadata from EarthExplorer to narrow the search region. The inclination of the artifacts is calculated using the location of the image on Earth. We then calculate the diagonal region on the power spectrum where the bands will be located. This step is crucial since it greatly reduces the number of pixels to search, and filters other false positives.
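One way to express that restriction is as a boolean band around the metadata-derived line through the spectrum center. The angle and band width below are illustrative values, not our production parameters:

```python
import numpy as np

def diagonal_search_mask(shape, angle_deg, half_width=10):
    """Boolean mask covering a diagonal band through the spectrum center.

    angle_deg is the expected inclination of the artifact line in the
    power spectrum, derived upstream from the scene's location on Earth.
    """
    h, w = shape
    cy, cx = h / 2, w / 2
    yy, xx = np.mgrid[0:h, 0:w]
    theta = np.deg2rad(angle_deg)
    # Perpendicular distance from each pixel to the line through the center.
    dist = np.abs((xx - cx) * np.sin(theta) - (yy - cy) * np.cos(theta))
    return dist <= half_width

mask = diagonal_search_mask((256, 256), angle_deg=35.0)
```

Only pixels inside this band are handed to the peak search, which is where the big reduction in candidates comes from.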

### Search for artifacts

Though using a Fourier representation helps manage this problem, it also introduces complexity. The Fourier image is volatile. It has a large dynamic range, often causing sharp changes between neighboring pixels. Before searching for artifacts, the image needs to be smoothed. Using an image processing technique called dilation, the overall image noise is reduced and the artifacts are accentuated. The metadata-guided region then goes through an off-the-shelf peak finding algorithm from the `scikit-image` Python library.
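A minimal sketch of that smoothing and search, using `scikit-image` (the synthetic spectrum, footprint size, and thresholds are illustrative):

```python
import numpy as np
from skimage.feature import peak_local_max
from skimage.morphology import dilation

# Synthetic log-power spectrum: background noise plus two bright
# peaks standing in for banding artifacts.
rng = np.random.default_rng(0)
power = rng.random((128, 128))
power[40, 90] = 8.0
power[88, 38] = 8.0

# Dilation replaces each pixel with its local maximum, flattening the
# noise between neighbors while keeping the artifact peaks prominent.
smoothed = dilation(power, np.ones((3, 3), dtype=bool))

# Off-the-shelf peak finder from scikit-image.
peaks = peak_local_max(smoothed, min_distance=10, threshold_abs=4.0)
```

Each row of `peaks` is a `(row, col)` coordinate in the spectrum; these are the candidates the filtering steps below operate on.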

The search returns many potential artifacts to investigate. From a visual inspection we know that not all of these are real artifacts; the false positives need to be filtered out.

### Which are the real artifacts?

By looking at many images, we know a few things about the banding artifacts:

- they are near symmetric between the upper right and lower left quadrants
- the maxima nearest the image center are *often* related to banding
- they lie along a diagonal line

Using this information we can start filtering out maxima that are not related to banding. Each maximum in the first quadrant should have a counterpart in the third quadrant; the first step is to filter all maxima by symmetry.
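The symmetry filter can be sketched as follows (the pixel tolerance is a hypothetical value):

```python
import numpy as np

def filter_by_symmetry(maxima, shape, tol=2):
    """Keep maxima that have a counterpart reflected through the
    spectrum center, within a small pixel tolerance."""
    h, w = shape
    center = np.array([h // 2, w // 2])
    pts = np.asarray(maxima)
    kept = []
    for p in pts:
        mirror = 2 * center - p  # point reflection through the center
        if np.any(np.all(np.abs(pts - mirror) <= tol, axis=1)):
            kept.append(tuple(p))
    return kept
```

A maximum with no mirror partner is almost certainly noise, so it is dropped before any of the line-based filtering below.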

Next up is to locate the two maxima nearest the image center. These are pretty often (but not always) responsible for most of the banding. The line that passes through these maxima should have nearly the same inclination as the one derived from metadata. If the inclination of this line is not sufficiently close to the expected angle, the next two nearest maxima are selected, and so on until there is a match. Once the two angles are sufficiently close, we use the line as another filter criterion. The distance to the line is checked for every maximum: those far from the line are filtered out, while those near the line are kept.
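Sketched in code, with illustrative tolerances and a simplified pairing order (walking the center-sorted maxima two at a time):

```python
import numpy as np

def filter_by_line(maxima, shape, expected_angle_deg, angle_tol=5.0, dist_tol=4.0):
    """Keep maxima near the line through the two maxima closest to the
    spectrum center, provided that line's inclination matches the
    metadata-derived angle. All tolerances here are illustrative."""
    pts = np.asarray(maxima, dtype=float)
    center = np.array(shape, dtype=float) / 2
    order = np.argsort(np.linalg.norm(pts - center, axis=1))
    for i in range(0, len(order) - 1, 2):
        a, b = pts[order[i]], pts[order[i + 1]]
        angle = np.degrees(np.arctan2(b[0] - a[0], b[1] - a[1])) % 180
        if abs(angle - expected_angle_deg) <= angle_tol:
            # Perpendicular distance from every maximum to the line a-b.
            d = np.abs((b[0] - a[0]) * (pts[:, 1] - a[1])
                       - (b[1] - a[1]) * (pts[:, 0] - a[0])) / np.linalg.norm(b - a)
            return [tuple(p) for p in pts[d <= dist_tol].astype(int)]
    return []  # no candidate pair matched the expected inclination
```

If no pair of maxima produces a line at the expected angle, nothing survives the filter, which effectively bypasses debanding for that tile.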

One last step ensures that the remaining maxima correspond to artifacts. The maxima and their reflected values (reflected across the *x* axis) are compared. We expect false positives to have similar values, while true artifacts differ by a large amount. Only maxima with a large difference from their reflected values are kept.

### Correction mask

At this point we have an array of points, each corresponding to an artifact peak.

```
[(1312, 1673), (1322, 1594), (1333, 1517), (1347, 1438), (1369, 1278), (1383, 1199), (1394, 1122), (1404, 1043)]
```

Peaks near the image center are stronger and wider than those far from the center, so the radii used in the mask must shrink as we move away from the image center. Choosing the right size for these radii has been one of the trickiest parts of the entire debanding process. Radii that are too large can introduce other artifacts, including:

- saturated pixels or streaks
- image blur
- additional banding
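A sketch of mask construction under these constraints. The linear shrink from the starting radius down to a minimum is an illustrative schedule, not necessarily the one we ship:

```python
import numpy as np

def correction_mask(shape, peaks, start_radius=6, min_radius=2):
    """Boolean mask with a disk over each artifact peak. Radii shrink
    linearly with distance from the spectrum center, from start_radius
    at the nearest peak down to min_radius at the farthest."""
    h, w = shape
    center = np.array([h / 2, w / 2])
    pts = np.asarray(peaks, dtype=float)
    dists = np.linalg.norm(pts - center, axis=1)
    span = max(dists.max() - dists.min(), 1e-9)
    t = (dists - dists.min()) / span  # 0 at nearest peak, 1 at farthest
    radii = np.round(start_radius - t * (start_radius - min_radius)).astype(int)
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros(shape, dtype=bool)
    for (py, px), r in zip(pts, radii):
        mask |= (yy - py) ** 2 + (xx - px) ** 2 <= r ** 2
    return mask

peaks = [(1312, 1673), (1322, 1594), (1333, 1517), (1347, 1438)]
mask = correction_mask((2048, 2048), peaks)
```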

To minimize the introduction of these artifacts we examine a patch around the nearest peak.

The patch is smoothed using a Gaussian filter (remember the data is volatile!), and a profile through the image center is taken.

For each pixel, the percentile over the patch is computed.

The *x* axis represents the pixel number along the profile, while the *y* axis is the percentile. The drop off on the right side marks where the profile leaves the area around the artifact peak. The distance from the image center to this drop off gives us the radius for the two nearest peaks. In this case the drop off happens around *x = 14*, or 6 pixels from the patch center, so we use a starting radius of 6 pixels.
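The radius estimate can be sketched like this. The synthetic patch, the smoothing sigma, and the 90th-percentile cutoff are all illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hypothetical 21x21 patch around a peak: a bright blob over noise.
rng = np.random.default_rng(1)
yy, xx = np.mgrid[0:21, 0:21] - 10
patch = rng.random((21, 21)) + 6.0 * np.exp(-(yy**2 + xx**2) / (2 * 3.0**2))

# Smooth first -- remember the data is volatile.
smoothed = gaussian_filter(patch, sigma=1.0)
profile = smoothed[10, :]  # horizontal profile through the patch center

# Percentile of each profile pixel, taken over the whole patch.
percentile = np.array([(smoothed < v).mean() * 100 for v in profile])

# The radius is the distance from the patch center to the fall off,
# defined here as dropping below the 90th percentile.
above = np.nonzero(percentile >= 90)[0]
radius = int(above.max() - 10)
```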

All subsequent radii are gradually reduced to create the mask below.

### Check our work!

*Check our work like in the good ole math class days*

Now that we have the correction mask, it’s applied to all bands (spectral bands) of the image. Each band is transformed to its Fourier representation, the correction mask is applied, then it’s transformed back to its original form using an inverse Fourier transform. We compare some simple statistics (mean, median, standard deviation) between the original pixels and the corrected pixels, and if there’s a large deviation, we opt to bypass debanding. In many cases the change in statistics is subtle, and only related to artifact removal. This is the desired result, and produces higher quality images.
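Per band, the correction and the statistics check might look like this (the mean-shift threshold is an illustrative stand-in for the fuller mean/median/standard-deviation comparison):

```python
import numpy as np

def apply_correction(band, mask, max_mean_shift=0.05):
    """Mask out artifact frequencies in one spectral band and transform
    back; bypass debanding if the statistics move too much."""
    spectrum = np.fft.fftshift(np.fft.fft2(band))
    spectrum[mask] = 0  # zero the artifact peaks in the frequency domain
    corrected = np.fft.ifft2(np.fft.ifftshift(spectrum)).real
    # Sanity check: a large shift in the mean suggests the mask removed
    # real signal, so keep the original band instead.
    if abs(corrected.mean() - band.mean()) > max_mean_shift * abs(band.mean()):
        return band
    return corrected
```

Running this over every spectral band with the correction mask from the previous step completes the pipeline.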