RapidEye has high-res, extremely up-to-date satellite data. Their satellite constellation provides daily images of anywhere in the world with 5 meter resolution, making it especially useful for some of our agricultural and industrial subscribers. And this beautiful imagery is easy to work with: you can go from a data delivery to a rendered map in just a few steps, using all free software.
Here’s how to take data directly from a RapidEye download, through processing, and into a cloud-published map in under 10 minutes. To follow along, you’ll want a recent version of GDAL and a copy of ImageMagick with tiff support. Both are in most Unix package systems, or you can get them from their project pages.
RapidEye imagery comes already georeferenced and corrected for topography – at level 3A, in remote sensing jargon. The delivery will have assorted metadata files, a small preview (“browse”) image, and a large geotiff with the data payload. The geotiff contains 5 spectral bands, the first three of which are visible blue, green, and red. We can make an ordinary RGB image and reproject it right away (your input tiff’s name will vary, of course):
gdal_translate -b 3 -b 2 -b 1 RapidEye-delivery.tif rgb.tif
gdalwarp -co photometric=RGB -co tfw=yes -t_srs EPSG:3857 rgb.tif rgb-proj.tif
The -b flags tell GDAL which bands to pull out of the source image, and -co photometric=RGB means they’ll be interpreted as red, green, and blue respectively in the output. (We’ll cover the -co tfw=yes shortly.) As you may have seen if you’ve worked with other satellite data, the raw image is dark and muddy:
But that’s only because it has to represent a huge range of information. The bright, natural-looking image is in there, it’s just hiding. Add lightness and contrast:
convert -sigmoidal-contrast 30x15% -depth 8 rgb-proj.tif rgb-proj-bright.tif
You’ll see some warnings as convert skips the georeference tags, and you may find that a different brightness/contrast mix (say, 10x20%) works better for your scene. We also drop down to 8-bit color now that we’re done with processing. And presto! We’ve taken a RapidEye download to a true-color picture:
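If you’re curious what -sigmoidal-contrast actually does to each pixel, here’s a minimal Python sketch of the S-shaped transfer curve ImageMagick documents, normalized so 0 and 1 stay fixed. The function name and the 30x15% constants mirror the command above; this is an illustration of the curve, not ImageMagick’s exact implementation.

```python
import math

def sigmoidal_contrast(u, contrast=30.0, midpoint=0.15):
    """Approximate ImageMagick's -sigmoidal-contrast 30x15% on a
    normalized pixel value u in [0, 1]: a logistic curve rescaled
    so that 0 maps to 0 and 1 maps to 1."""
    def s(x):
        return 1.0 / (1.0 + math.exp(contrast * (midpoint - x)))
    return (s(u) - s(0.0)) / (s(1.0) - s(0.0))

# Dark values get stretched apart; the endpoints are preserved.
print(round(sigmoidal_contrast(0.0), 3))   # 0.0
print(round(sigmoidal_contrast(0.15), 3))  # 0.494 -- the midpoint lands near middle gray
print(round(sigmoidal_contrast(1.0), 3))   # 1.0
```

The low midpoint (15%) is what rescues a dark satellite scene: most of the output range is spent on the shadows, where the detail is hiding.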
To import this into TileMill, take the .tfw file that the gdalwarp step created (that was the -co tfw=yes) and rename it to match the final image, then use GDAL one last time to bundle the adjusted image data and the geographical information into a geotiff:
cp rgb-proj.tfw rgb-proj-bright.tfw
gdal_translate rgb-proj-bright.tif RapidEye-ready-for-mapping.tif
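For reference, a .tfw world file is nothing mysterious: six plain-text numbers defining an affine transform from pixel coordinates to map coordinates (x pixel size, two rotation terms, negative y pixel size, then the map coordinates of the upper-left pixel’s center). A quick Python sketch, with made-up EPSG:3857 numbers for a 5 m/px image:

```python
def parse_world_file(text):
    """Parse the six numbers of a .tfw world file into a function
    mapping (col, row) pixel coordinates to map coordinates."""
    a, d, b, e, c, f = (float(line) for line in text.split())
    def pixel_to_map(col, row):
        return (a * col + b * row + c, d * col + e * row + f)
    return pixel_to_map

# Hypothetical .tfw contents: 5 m pixels, no rotation, arbitrary origin.
tfw = "5.0\n0.0\n0.0\n-5.0\n-13180000.0\n4060000.0\n"
to_map = parse_world_file(tfw)
print(to_map(0, 0))    # (-13180000.0, 4060000.0) -- upper-left pixel center
print(to_map(10, 20))  # (-13179950.0, 4059900.0) -- east and south of it
```

This is why the rename trick works: the world file carries the georeference by filename convention, so as long as the numbers travel with the adjusted image, gdal_translate can fold them back into a proper geotiff.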
TileMill will now happily open it as SRS 900913. I assembled some sample images into a larger scene using this process:
Update: The demo that supported this post is no longer available.
If you head over to the southwest corner of this map, you’ll see turquoise specks in the residential areas. The color seemed so out of place that I was worried I’d misprocessed the imagery. I’d looked at this part of the world – the suburbs of Los Angeles – while working on Landsat 8 processing, and certainly hadn’t noticed any neighborhoods full of houses with light blue roofs. Then I put two and two together: those are swimming pools! You simply can’t see them at Landsat 8’s resolution:
Porter Ranch, a neighborhood of Los Angeles, in late June 2013. Left: Landsat 8 data (courtesy of USGS) at 15 m/px. Right: RapidEye data, 5 m/px.
For every Landsat 8 pixel, even after pansharpening, RapidEye delivers nine. But it’s not just spatial resolution that matters – there’s also temporal resolution, or the time between successive images. For Landsat 8, that’s 16 days, and usually the trade-off is slower revisits as resolution increases. But of course, with five identical satellites and the ability to aim, RapidEye can deliver daily.
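That nine-to-one figure is just the square of the linear resolution ratio, since pixels cover area:

```python
landsat_px = 15.0   # m/px, pansharpened Landsat 8
rapideye_px = 5.0   # m/px, RapidEye
pixels_per_landsat_pixel = (landsat_px / rapideye_px) ** 2
print(pixels_per_landsat_pixel)  # 9.0
```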
Among many other applications, this is a powerful tool for large-scale agriculture. A big farm operation might have crops planted further than the eye can see, and very small changes out in the field (such as how the plants are responding to irrigation, how fast they’re ripening, or whether they’re showing indications of disease) can be vital to discover as soon as possible. The RapidEye sensors offer another unusual service to agriculture: a red edge band, between red and conventional near-infrared (NIR). This slice of the spectrum is even more sensitive than NIR to differences in vegetation – between healthy and unhealthy, but also between different varieties, like trees v. ground crops.
To get a sense of what the red-edge band can show, let’s construct a false-color image using it. In the delivered geotiff, spectral bands are numbered in order from shortest wavelength (band 1 is blue) to longest (band 5 is near-infrared). To make an image with red-edge as red, red as green, and blue as blue:
gdal_translate -b 4 -b 3 -b 1 -co photometric=RGB RapidEye-delivery.tif 431.tif
Brightened with convert -sigmoidal-contrast 40x14% for clarity, this shows urban areas and bare land in blues and grays, while agriculture and natural vegetation are in reds and yellows:
We can even dabble in band math, using ImageMagick’s -fx operator. It’s not the fastest tool, but its sheer flexibility is hard to beat. Let’s look at red-edge NDVI – an index that highlights leafy, healthy plants. The -fx operator can be a little finicky, and it works best on images that ImageMagick itself constructed, so first run the 4-3-1 image through convert:
convert 431.tif 431-prepared.tif
The formula for NDVI is (NIR − red) ÷ (NIR + red). -fx will find NIR (in this case red-edge, or very near infrared) in the red channel of the image, which it calls u.r, and actual red in the green channel, or u.g. I’m also throwing in a -monitor, which you can add to any convert command to track its progress:
convert -monitor 431-prepared.tif \
  -fx '(u.r - u.g) / (u.r + u.g)' ndvi.tif
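If you’d rather prototype the band math outside ImageMagick, the same per-pixel formula is a one-liner in Python (a sketch, with band values assumed normalized to 0–1 the way -fx sees them):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red).
    In the 4-3-1 image, 'NIR' is the red-edge band sitting in the red
    channel, and the actual red band sits in the green channel."""
    if nir + red == 0:
        return 0.0  # avoid dividing by zero on empty pixels
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in the red edge and absorbs red:
print(ndvi(0.6, 0.1))  # ~0.714 -- leafy pixel, bright in the NDVI image
print(ndvi(0.3, 0.3))  # 0.0    -- bare ground, mid-gray
```

The index runs from −1 to 1, which is why vegetated fields pop out as the brightest tones in the grayscale result.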
This gives us a grayscale image that, even without further processing, clearly shows where there are healthy plants. Zooming in on an area with crops, we can find very small variations in plant vitality within individual fields:
It only takes a couple minutes of processing to start finding this kind of insight. If that farm is your business, you know exactly where your water and other resources are best applied tomorrow morning, or even this afternoon. And RapidEye’s revisit capability can sustain fine-grained analysis over time, so you see not just points of data but curves and trends.
We’re building infrastructure for a future where this kind of frequent, high-quality imagery becomes a normal part of not only agriculture, but logistics, science, policy, journalism, and so on. If you’re interested, you can sign up for the MapBox Satellite Live beta, and as always you should say hi to me (@vruba), Chris (@hrwgc), or Bruno (@brunosan) on Twitter if you have questions or comments. We’re happy to put you in touch with our contacts at RapidEye if you’d like to start ordering imagery today and working with it in TileMill.