Lab 2: Characteristics of remotely sensed data

Purpose: The purpose of this lab is to demonstrate concepts of spatial, spectral, temporal and radiometric resolution. You will be introduced to image data from several sensors aboard various platforms. At the completion of the lab, you will be able to distinguish remotely sensed datasets based on sensor characteristics and choose an appropriate dataset based on these concepts.

Prerequisites: Lab 1

1. Spatial resolution

In the present context, spatial resolution means pixel size. In practice, spatial resolution depends on the projection of the sensor’s instantaneous field of view (IFOV) on the ground and on how the set of radiometric measurements is resampled into a regular grid. To see the differences in spatial resolution resulting from different sensors, visualize data from several sensors at a range of scales.

a. MODIS. There are two Moderate Resolution Imaging Spectro-Radiometers (MODIS) aboard the Terra and Aqua satellites. Different MODIS bands produce data at different spatial resolutions; for the visible bands, the lowest common resolution is 500 meters (the red and NIR bands are natively 250 meters).

i. Search for ‘MYD09GA’ and import ‘MYD09GA.006 Aqua Surface Reflectance Daily Global 1km and 500m’. Name the import myd09. (See the complete list of MODIS land products. Note that Terra MODIS datasets start with ‘MOD’ and Aqua MODIS datasets start with ‘MYD’.)

ii. Zoom the map to SFO airport:

// Define a region of interest as a point at SFO airport.
var sfoPoint = ee.Geometry.Point(-122.3774, 37.6194);

// Center the map at that point.
Map.centerObject(sfoPoint, 16);

iii. To display a false-color MODIS image, select an image acquired by the Aqua MODIS sensor and display it for SFO:

// Get a surface reflectance image from the MODIS MYD09GA collection.
var modisImage = ee.Image(myd09.filterDate('2017-07-01').first());

// Use these MODIS bands for red, green, blue, respectively.
var modisBands = ['sur_refl_b01', 'sur_refl_b04', 'sur_refl_b03'];

// Define visualization parameters for MODIS.
var modisVis = {bands: modisBands, min: 0, max: 3000};

// Add the MODIS image to the map.
Map.addLayer(modisImage, modisVis, 'MODIS');

iv. Note the size of pixels with respect to objects on the ground. (It may help to turn on the satellite basemap to see high-resolution data for comparison.) Print the size of the pixels (in meters) with:

// Get the scale of the data from the first band's projection:
var modisScale = modisImage.select('sur_refl_b01')
    .projection().nominalScale();

print('MODIS scale:', modisScale);

v. It’s also worth noting that these MYD09 data are surface reflectance scaled by 10000 (not TOA reflectance), meaning that clever NASA scientists have done a fancy atmospheric correction for you!
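If you prefer to work with reflectance on a 0–1 scale, you can apply that scale factor yourself. Here is a minimal sketch (the regex band selection and the display maximum of 0.3 are illustrative choices, not part of the lab):

// Rescale the 'sur_refl_b*' bands from scaled integers to reflectance (0-1).
var modisReflectance = modisImage
    .select('sur_refl_b.*')
    .multiply(0.0001);

// Display with the maximum adjusted to the rescaled range (3000 * 0.0001 = 0.3).
Map.addLayer(modisReflectance,
    {bands: modisBands, min: 0, max: 0.3},
    'MODIS reflectance (rescaled)', false);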

b. MSS. Multi-spectral scanners (MSS) were flown aboard Landsats 1-5. MSS data have a spatial resolution of 60 meters.

i. Search for ‘landsat 5 mss’ and import the result called ‘USGS Landsat 5 MSS Collection 1 Tier 2 Raw Scenes’. Name the import mss.


ii. To visualize MSS data over SFO using a relatively cloud-free image, use:

// Filter MSS imagery by location, date and cloudiness.
var mssImage = ee.Image(mss
    .filterBounds(Map.getCenter())
    .sort('CLOUD_COVER')
    // Get the least cloudy image.
    .first());

// Display the MSS image as a color-IR composite.
Map.addLayer(mssImage, {bands: ['B3', 'B2', 'B1'], min: 0, max: 200}, 'MSS');

iii. Check the scale (in meters) as previously:

// Get the scale of the MSS data from its projection:
var mssScale = mssImage.select('B1')
    .projection().nominalScale();

print('MSS scale:', mssScale);

c. TM. The Thematic Mapper (TM) was flown aboard Landsats 4-5. (It was succeeded by the Enhanced Thematic Mapper Plus (ETM+) aboard Landsat 7 and the Operational Land Imager (OLI) / Thermal Infrared Sensor (TIRS) aboard Landsat 8.) TM data have a spatial resolution of 30 meters.

i. Search for ‘landsat 5 toa’ and import the first result (which should be ‘USGS Landsat 5 TM Collection 1 Tier 1 TOA Reflectance’). Name the import tm.

ii. To visualize TM data over SFO, at approximately the same time of year as the MODIS image, use:

// Filter TM imagery by location, date and cloudiness.
var tmImage = ee.Image(tm
    .filterBounds(Map.getCenter())
    .filterDate('2011-05-01', '2011-10-01')
    .sort('CLOUD_COVER')
    .first());

// Display the TM image as a color-IR composite.
Map.addLayer(tmImage, {bands: ['B4', 'B3', 'B2'], min: 0, max: 0.4}, 'TM');

iii. For some hints about why the TM data is not the same date as the MSS data, see this page.

iv. Check the scale (in meters) as previously:

// Get the scale of the TM data from its projection:
var tmScale = tmImage.select('B1')
    .projection().nominalScale();

print('TM scale:', tmScale);

d. NAIP. The National Agriculture Imagery Program (NAIP) is an effort to acquire imagery over the continental US on a 3-year rotation using airborne sensors. The imagery has a spatial resolution of 1-2 meters.

i. Search for ‘naip’ and import the first result (which should be ‘NAIP: National Agriculture Imagery Program’). Name the import naip.

ii. Since NAIP imagery is distributed as quarters of Digital Ortho Quads (DOQs) at irregular cadence, load everything from the most recent year in the acquisition cycle (2012) over the study area and mosaic() it:

// Get NAIP images for the study period and region of interest.
var naipImages = naip.filterDate('2012-01-01', '2012-12-31')
    .filterBounds(Map.getCenter());

// Mosaic adjacent images into a single image.
var naipImage = naipImages.mosaic();

// Display the NAIP mosaic as a color-IR composite.
Map.addLayer(naipImage, {bands: ['N', 'R', 'G']}, 'NAIP');

iii. Check the scale by getting the first image from the mosaic (a mosaic doesn’t know what its projection is, since the mosaicked images might all have different projections), getting its projection, and getting its scale (meters):

// Get the NAIP resolution from the first image in the mosaic.
var naipScale = ee.Image(naipImages.first()).projection().nominalScale();
print('NAIP scale:', naipScale);

2. Spectral resolution

Spectral resolution refers to the number and width of spectral bands in which the sensor takes measurements. A sensor that measures radiance in multiple bands is called a multi-spectral sensor, while a sensor with many bands (possibly hundreds) is called a hyper-spectral sensor (these are not hard and fast definitions). For example, compare the multi-spectral OLI aboard Landsat 8 to Hyperion, a hyperspectral sensor aboard the EO-1 satellite.

There is an easy way to check the number of bands in Earth Engine, but no way to get an understanding of the relative spectral response of the bands, where spectral response is a function measured in the laboratory to characterize the detector.

a. To see the number of bands in an image, use:

// Get the MODIS band names as a List
var modisBands = modisImage.bandNames();

// Print the list.
print('MODIS bands:', modisBands);

// Print the length of the list.
print('Length of the bands list:', modisBands.length());

i. It’s worth noting that only some of those bands contain radiometric data. Lots of them have other information, like quality control data. So the band listing isn’t necessarily an indicator of spectral resolution, but can inform your investigation of the spectral resolution of the dataset. Try printing the bands from some of the other sensors to get a sense of spectral resolution.
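For example, here is a minimal sketch that acts on that suggestion using the images already loaded in Section 1 (it assumes the tmImage and naipImage variables are still defined in your script):

// Print band names and counts for the TM image and NAIP mosaic from Section 1.
// The band count reflects the bands distributed in Earth Engine, which is not
// always the same as the number of spectral channels measured by the detector.
print('TM bands:', tmImage.bandNames());
print('Number of TM bands:', tmImage.bandNames().length());

print('NAIP bands:', naipImage.bandNames());
print('Number of NAIP bands:', naipImage.bandNames().length());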

3. Temporal resolution

Temporal resolution refers to the revisit time, or temporal cadence, of the image data. (Sometimes temporal resolution refers to the dwell time or integration time: the amount of time the sensor “stares” at a location through its IFOV.) Think of this as the frequency of pixels in a time series at a given location.

a. MODIS. MODIS (either Terra or Aqua) produces imagery at approximately daily cadence. To see the time series of images at a location, you can print() the ImageCollection, filtered to your area and date range of interest. For example, to see the MODIS images in 2011:

// Filter the MODIS mosaics to one year.
var modisSeries = myd09.filterDate('2011-01-01', '2011-12-31');

// Print the filtered MODIS ImageCollection.
print('MODIS series:', modisSeries);

i. Expand the features property of the printed ImageCollection to see a List of all the images in the collection. Observe that the date of each image is part of the filename. Note the daily cadence. Observe that each MODIS image is a global mosaic, so there’s no need to filter by location.
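To quantify the cadence rather than reading dates off the list, you could also count the images in the filtered collection; a minimal sketch:

// Count the images in one year of the MYD09GA collection.
// A value near 365 is consistent with approximately daily global coverage.
print('Number of MODIS images in 2011:', modisSeries.size());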

b. Landsat. Landsats (5 and later) produce imagery at 16-day cadence. TM and MSS are on the same satellite (Landsat 5), so it suffices to print the TM series to see the temporal resolution. Unlike MODIS, data from these sensors is produced on a scene basis, so to see a time series, it’s necessary to filter by location in addition to time:

// Filter to get a year's worth of TM scenes.
var tmSeries = tm
    .filterBounds(Map.getCenter())
    .filterDate('2011-01-01', '2011-12-31');

// Print the filtered TM ImageCollection. 
print('TM series:', tmSeries);

i. Again expand the features property of the printed ImageCollection. Note that a careful parsing of the TM image IDs indicates the day of year (DOY) on which the image was collected. A slightly more cumbersome method involves expanding each Image in the list, expanding its properties and looking for the ‘DATE_ACQUIRED’ property.

ii. To make this into a nicer list of dates, map() a function over the ImageCollection. First define a function to get a Date from the metadata of each image, using the system properties:

var getDate = function(image) {
  // Note that you need to cast the argument.
  var time = ee.Image(image).get('system:time_start');
  // Return the time (in milliseconds since Jan 1, 1970) as a Date.
  return ee.Date(time);
};

iii. Turn the ImageCollection into a List and map() the function over it:

var dates = tmSeries.toList(100).map(getDate);

iv. Print the result:

print(dates);
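If you would rather see human-readable date strings than ee.Date objects, here is a sketch of two options (the 'YYYY-MM-dd' format pattern and the use of aggregate_array() are illustrative choices, not part of the lab):

// Option 1: format each timestamp as a 'YYYY-MM-dd' string in the mapped function.
var dateStrings = tmSeries.toList(100).map(function(image) {
  // Cast the argument and format its system:time_start timestamp.
  return ee.Date(ee.Image(image).get('system:time_start')).format('YYYY-MM-dd');
});
print('TM dates (formatted):', dateStrings);

// Option 2: aggregate the DATE_ACQUIRED metadata property across the collection.
print('TM dates (from metadata):', tmSeries.aggregate_array('DATE_ACQUIRED'));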

4. Radiometric resolution

Radiometric resolution is determined from the minimum radiance to which the detector is sensitive (Lmin), the maximum radiance at which the sensor saturates (Lmax), and the number of bits used to store the DNs (Q):

Radiometric resolution = (Lmax - Lmin) / 2^Q.

It might be possible to dig around in the metadata to find values for Lmin and Lmax, but computing radiometric resolution is generally not necessary unless you’re studying phenomena that are distinguished by very subtle changes in radiance.
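If you do want to compute it, here is a minimal sketch for TM band 1 using the TOA image from Section 1. It assumes the Collection 1 metadata properties RADIANCE_MINIMUM_BAND_1 and RADIANCE_MAXIMUM_BAND_1 are present on the image and that the DNs were quantized with Q = 8 bits:

// Read Lmin and Lmax for TM band 1 from the image metadata.
// (Property names assumed from the Collection 1 MTL metadata.)
var lMin = ee.Number(tmImage.get('RADIANCE_MINIMUM_BAND_1'));
var lMax = ee.Number(tmImage.get('RADIANCE_MAXIMUM_BAND_1'));

// TM DNs are stored with Q = 8 bits, so there are 2^8 quantization levels.
var levels = ee.Number(2).pow(8);

// Radiometric resolution = (Lmax - Lmin) / 2^Q.
var radiometricResolution = lMax.subtract(lMin).divide(levels);
print('TM band 1 radiometric resolution (radiance per DN):', radiometricResolution);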

5. Orbits and sensor motion (optional)

The image data are collected from moving platforms (satellites or aircraft). The motion of the platform, together with the imaging geometry of the sensor, determines the spatio-temporal resolution of the data. To get an idea of how these design choices interact to produce the wonderful imagery in Earth Engine, examine the orbit of the Aqua satellite for a selected day in 2013.

a. Load and display the MODIS (Aqua) sensor zenith angle for imagery collected on September 1, 2013:

var aquaImage = ee.Image(myd09.filterDate('2013-09-01').first());

// Zoom to global level.
Map.setCenter(81.04, 0, 3);

// Display the sensor-zenith angle of the Aqua imagery.
var szParams = {bands: 'SensorZenith', min: 0, max: 70*100};
Map.addLayer(aquaImage, szParams, 'Aqua sensor-zenith angle');

i. To understand where those visualization parameters come from, expand the ‘Layers’ section on this page.

b. We’ve stored some satellite positions from September 1, 2013 in a Google Fusion Table. (The positions were downloaded from this site as a CSV table, then uploaded to a Fusion Table). Load the Fusion Table into Earth Engine to get a FeatureCollection of points:

var aquaOrbit = ee.FeatureCollection('ft:1ESvPygQ76WvVflKMN2nc14sS2wwtzqv3j2ueTqg');

c. Display the points by adding them to the map. (If you have other layers displayed from the previous sections, either comment the Map.addLayer() lines, or add false as a fourth argument to Map.addLayer(), so that they will be turned off by default.) Zoom the map to a global view:

Map.addLayer(aquaOrbit, {color: 'FF0000'}, 'Aqua Positions');

d. Compare this orbit, and the swath of imagery, to a Landsat orbit from the same day. To explore the difference, color the Landsat imagery blue:

// Load Landsat ETM+ data directly, filter to one day.
var landsat7 = ee.ImageCollection('LANDSAT/LE7')
    .filterDate('2013-09-01', '2013-09-02');

// Display the images by specifying one band and a single color palette.
Map.addLayer(landsat7, {bands: 'B1', palette: 'blue'}, 'Landsat 7 scenes');

6. Assignment

  1. In your code, set the following variables to the scales of the following bands:
  2. Make this point in your code: ee.Geometry.Point([-122.30144181223767, 37.80215861281014]). How many MYD09GA images are there in 2017 at this point? Set a variable called mod09ImageCount with that value. How many Landsat 8 surface reflectance images are there in 2017 at this point? Set a variable called l8ImageCount with that value.

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright 2015-2018, Google Earth Engine Team
