Marine mammal detection
For those involved with marine mammal surveillance, infrared analysis in both daytime and nighttime conditions is an effective means of building accurate species inventories.
Imagery provides more than plain pictures. Some sensors detect energy beyond what is humanly visible, allowing us to “see” across broad swaths of the electromagnetic spectrum. This enables scientists, geologists, farmers, botanists, and other specialists to examine conditions, events, and activities that would otherwise be hidden. The implications are profound and the applications are seemingly endless.
Every day, the earth is directly imaged from scores of sensors in the sky and from orbit in space. Almost everything that happens is measured, monitored, photographed, and explored by thousands of imaging devices mounted on satellites, aircraft, drones, and robots. Much of this information ends up as imagery that is integrated into a large living, virtual GIS of the world, deployed on the web.
Some of these sensors see beyond what our eyes see, enabling us to view what’s not apparent. Multispectral imagery measures and captures this information about a world that has many more dimensions than just the colors of the rainbow—it sees past the limits of what our eyes perceive.
Other active sensor technologies such as lasers and radar beam out signals that are reflected back at the speed of light, adding even more information to the collective repository. Some image sensors can see through clouds and under trees. Some detect things too subtle for any of our senses to distinguish. The richness and immediacy of this information is leading to a heightened understanding of the natural processes and human activities that influence our communities and environment. The ability to gather and exploit these new information sources is increasingly important to GIS practitioners.
This has been the mission of the remote sensing community since the first camera went up in an aircraft, and today the output from these sensors, across all the spectral ranges, is absolutely essential in enabling people to make better decisions.
The new kinds of multispectral sensors used in scientific work and analysis function the same way in GIS as traditional natural-light scenes; the basic principles are the same. These days, the speed and extent of collection and transmission mean that the information is more immediate than ever, enabling us to make vital comparisons in near real time after major man-made and natural events occur.
Web GIS is the nervous system for the planet, and imagery across the spectrum plays a vital role.
The camera on your phone is a sensor designed to capture photos—light and colors that represent objects we recognize in the way we are accustomed to seeing. These photos are collections of pixels expressed as the depth of red, green, and blue colors. Many aerial and satellite platforms capture images in this same way—along the visible spectrum—resulting in what are essentially georeferenced pictures of the earth from above. While not as exotic as some types of images, the value of natural color imagery is exceptionally high.
Simply capturing a series of pictures of the visible landscape from the air delivers fresh insight and helps us to understand many things within the framework of their geographic context and location. Additionally, the growing number of sensors and frequency of capture is increasing the application value of imagery and photography.
In the early history of powered aircraft, aerial photographs—pictures of the earth from above—began to be found useful for military and scientific applications. Quite quickly, imaging professionals and scientists realized that it was possible to detect beyond what is visible to the unassisted human eye. Deeper and richer information could be revealed by detecting waveforms from beyond the rainbow of visible light, into the invisible. As it turns out, these hard-to-detect realms of the spectrum offered some of the most meaningful insights. Hidden in these signals were previously unknown facts about Earth that have enabled us to understand our world far more effectively than had been possible.
Multispectral imagery measures different ranges of frequencies across the electromagnetic spectrum. One way to think of these different frequencies is as colors, where some colors are not directly visible to human eyes. These frequency ranges are called bands. Different image sensors measure different band combinations. The longest-running and perhaps most well-known multispectral imaging program has been Landsat, which began Earth image collection in the 1970s. By assigning data from three bands of the sensor to the red, green, and blue channels of an electronic display (or printer for a hard copy), color visualizations are created. Here are some examples of various alternate band combinations and their applications.
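The channel-assignment idea can be sketched in a few lines of Python. This is a hypothetical illustration (the `composite` function and the pixel values are invented for the example); the band numbers follow the Landsat 8 convention of 4 = red, 3 = green, 2 = blue, 5 = near infrared.

```python
# Hypothetical sketch of assigning multispectral bands to display channels.

def composite(bands, channel_map):
    """Build an RGB visualization by assigning one band to each
    display channel. bands maps band number -> 2D list of values;
    channel_map is (band_for_red, band_for_green, band_for_blue)."""
    r, g, b = (bands[n] for n in channel_map)
    return [
        [(r[i][j], g[i][j], b[i][j]) for j in range(len(r[0]))]
        for i in range(len(r))
    ]

# A tiny 1 x 2 scene: the first pixel is vegetation, which reflects
# strongly in the near infrared (band 5)
scene = {
    4: [[30, 120]],   # red
    3: [[40, 110]],   # green
    2: [[35, 100]],   # blue
    5: [[200, 60]],   # near infrared
}

natural = composite(scene, (4, 3, 2))      # true-color combination
false_color = composite(scene, (5, 4, 3))  # NIR in red: vegetation glows red
```

Swapping a single entry in the channel map is all it takes to move from a natural-color view to a false-color one that highlights vegetation.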
Panchromatic imagery, commonly known as pan, is typically recorded at a higher resolution than the multispectral bands on any given satellite. It remains a critical source for many GIS applications as a reference for basic interpretation and analysis. Pan is often combined with other bands through a process called pansharpening to generate higher-resolution scenes.
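Pansharpening can be done several ways; one of the simplest is the Brovey transform, which scales each multispectral value by the ratio of the panchromatic value to the mean of the three color bands. A minimal per-pixel sketch in Python (the function name and sample values are illustrative, not any particular product's implementation):

```python
def brovey_pansharpen(r, g, b, pan):
    """Brovey transform for one pixel: scale each multispectral value
    by the ratio of the pan value to the mean of the three bands,
    injecting the pan band's spatial detail into the color image."""
    mean = (r + g + b) / 3.0
    if mean == 0:
        return (0.0, 0.0, 0.0)
    ratio = pan / mean
    return (r * ratio, g * ratio, b * ratio)

# A bright pan pixel (40) sharpens a darker multispectral pixel
sharpened = brovey_pansharpen(10, 20, 30, 40)
```

In practice this runs over the resampled multispectral grid at the pan band's finer resolution, which is where the sharpening effect comes from.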
The Normalized Difference Moisture Index (NDMI) estimates moisture levels in vegetation: wetlands and moist vegetation appear in shades of blue (darker blue indicating higher moisture), while drier areas appear in yellow to brown shades. Image analysts apply a formula that combines selected multispectral bands to calculate indexes like this one.
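The standard NDMI formula divides the difference between the near-infrared (NIR) and shortwave-infrared (SWIR) reflectances by their sum. A minimal Python sketch, with made-up reflectance values:

```python
def ndmi(nir, swir):
    """Normalized Difference Moisture Index for one pixel:
    (NIR - SWIR) / (NIR + SWIR). Values approach +1 for moist
    vegetation and turn negative for dry soil and built surfaces."""
    if nir + swir == 0:
        return 0.0
    return (nir - swir) / (nir + swir)

wet = ndmi(0.5, 0.1)   # moist vegetation: strongly positive
dry = ndmi(0.2, 0.4)   # dry ground: negative
```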
The French organization Réseau d’Observation du Littoral Normand et Picard uses imagery from several platforms to study the evolution of the coastline from Normandy to Picardy. Presented in French, this stunning story map tracks the transit of sediments, sand, and gravel along the coastal strip, carried by the action of tidal currents, waves, and prevailing winds. Coastal erosion is having a significant impact on the beaches and cliffs.
This comparison of two images from western Serbia shows two rivers overflowing extensively into the surrounding fields following a major flood in 2014. The towns of Krupanj and Obrenovac are completely flooded, their ground and parcels of land hidden by water and mud. The image on the left is a TerraSAR-X radar image taken the night the flooding started, already showing breaches in the earthen dams; the optical SPOT image (right), taken once the cloud cover cleared, displays the full devastation of the flood breach. © CNES 2013-2014, Distribution Airbus DS
These views from near Tehran, Iran, show a natural color band image on the left and a short-wave infrared (SWIR) image on the right. Note how one particular rock type pops out in pink using the SWIR bands and is not as easily discernible in the natural color band combination. The variation in rock types allows analysts to easily identify specific mineral patterns, greatly narrowing the search areas for particular materials.
Copernicus is the European Space Agency’s (ESA) earth observation program to monitor how our planet and its environment are changing. Finite natural resources are under pressure from our global population growth, generating an ever-increasing demand for safe living space, fresh water, fertile land, and clean air.
To make effective decisions, public authorities, policy makers, businesses, and citizens need reliable and up-to-date information services. The Copernicus program is founded on a dedicated constellation of satellites named the Sentinels—more than a dozen will be launched into orbit over the next 10 years, covering marine, land, climate, emergency, security, and atmospheric applications. The first one—Sentinel-1A—is a polar-orbiting, all-weather, day-and-night radar imaging mission for land and ocean services. It came online in late 2014. The second radar satellite, Sentinel-1B, launched successfully in April 2016.
Sentinel-2A was launched in June 2015 to monitor land and vegetation, as well as coastal waters. This new satellite carries a high-resolution optical instrument that covers 13 spectral bands with a swath width of 290 kilometers. The refined band sensitivity of the sensor means it’s particularly adept at monitoring urban sprawl and land use, as seen in this image (right) of Rome, Italy, taken in August 2016.
You can download the ESA app to watch the Sentinel satellites orbit the earth live. Search “ESA Sentinel” in the iTunes App Store or on Google Play for Android.
The Landsat program is the longest-running enterprise for acquisition of satellite imagery of Earth. On July 23, 1972, the first Earth Resources Technology Satellite was launched. This was eventually renamed Landsat. The most recent, Landsat 8, was launched in 2013. The instruments on the Landsat satellites have acquired millions of images. Archived at Landsat receiving stations around the world, these images are a unique resource for global change research, agriculture, cartography, geology, forestry, regional planning, surveillance, and education. Historical archives can be viewed through the USGS EarthExplorer website. Through Landsat 7, the data has eight spectral bands with spatial resolutions ranging from 15 to 60 meters. Every part of the Landsat coverage is rephotographed every 16 days.
Landsat 8 added two additional bands. The three key mission and science objectives for the latest “bird” were to collect and archive medium-resolution (30-meter per pixel) multispectral image data affording seasonal coverage of the global landmasses for a period of no less than five years; ensure that Landsat 8 data is sufficiently consistent with data from the earlier Landsat missions in terms of coverage and spectral characteristics, output product quality, and data availability to permit studies of land cover and land-use change over time; and distribute Landsat 8 data products to the general public on a nondiscriminatory basis at no cost to the user.

Video: Landsat data continuity mission overview
Multispectral remote sensing serves up radically new perspectives on crop health and vigor. The red edge is the boundary between the (visible) red and (invisible to humans) near infrared (NIR), and it’s called an edge because the spectral profile of vegetation shows a dramatic rise in brightness from red to NIR. When vegetation is stressed and the profile changes, the edge moves, and a narrow spectral band at the right wavelength can detect a dramatic difference. MicaSense’s RedEdge cameras are calibrated for classic spectral bands of blue, green, red, and NIR, but also have a fifth band at 720 nanometers explicitly to detect movement of the red edge. An inexpensive drone equipped with a MicaSense camera can be flown as often as necessary, allowing the grower to finely calibrate irrigation, fertilization, and the application of insecticides (which can also be applied via drone).
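One common way to quantify red-edge movement is a normalized-difference index computed from the NIR and red-edge bands, often called NDRE. The sketch below is illustrative (the reflectance values are invented), but the formula itself is the standard normalized-difference form:

```python
def ndre(nir, red_edge):
    """Normalized-difference red-edge index for one pixel:
    (NIR - RedEdge) / (NIR + RedEdge). Healthy vegetation keeps NIR
    well above the red-edge band, so the index drops as a crop
    becomes stressed and the red edge shifts."""
    if nir + red_edge == 0:
        return 0.0
    return (nir - red_edge) / (nir + red_edge)

healthy = ndre(0.6, 0.2)    # vigorous canopy
stressed = ndre(0.45, 0.3)  # stressed canopy: lower index
```

Mapped across a drone flight, the index gives the grower a per-plant picture of where stress is developing.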
Space-based image sensors typically measure solar light reflected from the ground. This is often called passive sensing. In contrast, active sensors such as lidar, radar, and sonar emit pulses of energy and then monitor the return of energy. As the return energy arrives at the sensor, the intensity and time stamps of the return signals are used to determine the precise shape and location of the object. Because they supply their own energy, active sensors work equally well at night.
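The timing-based ranging that lidar and radar rely on reduces to a simple time-of-flight calculation: the pulse travels out and back at the speed of light, so the one-way distance is half the round-trip time multiplied by that speed. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_echo(round_trip_seconds):
    """Convert a round-trip echo time to a one-way distance.
    The pulse travels to the target and back, hence the divide-by-two."""
    return C * round_trip_seconds / 2.0

# An echo arriving 2 microseconds after the pulse left
distance_m = range_from_echo(2e-6)
```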
KyFromAbove is a statewide mapping program for the Kentucky state government that includes the comprehensive lidar collection of surface elevation at high resolution throughout the commonwealth. This story map tells the tale of the statewide collection and how it is being put to use.
LAS files (the generic lidar exchange format) are a collection of points, each with horizontal coordinates and a vertical elevation value. LAS files provide a common format for storing additional information such as laser intensity, scan angle, and return information. When encoded as red-green-blue (RGB), the scenes take on a photorealistic appearance, like this visualization from Petaluma, California.
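The per-point content of a LAS file can be sketched with a small data class. This is a simplified stand-in for the real LAS point record formats (it ignores the binary layout and header), but it shows the attributes the format carries alongside each coordinate:

```python
from dataclasses import dataclass

@dataclass
class LidarPoint:
    """Simplified stand-in for a LAS point record: a position plus
    the extra attributes the format stores (intensity, return info,
    scan angle, and optional RGB color for photorealistic display)."""
    x: float
    y: float
    z: float
    intensity: int = 0
    return_number: int = 1
    num_returns: int = 1
    scan_angle: float = 0.0
    rgb: tuple = (0, 0, 0)

def last_returns(points):
    """Keep only last returns, which are the most likely to have
    reached the ground beneath vegetation."""
    return [p for p in points if p.return_number == p.num_returns]

# Two returns from one pulse: canopy first, ground last
cloud = [
    LidarPoint(0.0, 0.0, 12.5, return_number=1, num_returns=2, rgb=(34, 120, 40)),
    LidarPoint(0.0, 0.0, 1.8, return_number=2, num_returns=2, rgb=(90, 80, 60)),
]
ground = last_returns(cloud)
```

Filtering on return number is one of the basic operations behind products like bare-earth elevation models.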
Weather radar is used to detect precipitation and its type (such as rain, snow, or hail), as well as to track the motion of storm systems. Modern weather radars are mostly Doppler radars, capable of detecting the motion and location of rain droplets in addition to the intensity of precipitation. Radar data can be analyzed to determine the structure of storms and their potential to cause severe weather.
The Shuttle Radar Topography Mission (SRTM) was a NASA Space Shuttle-based research effort that obtained digital elevation models on a near-global scale from 56° S to 60° N in an effort to generate a complete high-resolution digital topographic database of Earth from space.
To acquire elevation data, the Space Shuttle Endeavour was outfitted with two radar antennas, one in the shuttle’s payload bay and the other tethered on the end of a 60-meter mast. The radar instruments on board applied synthetic aperture radar, which was used to generate terrain surface maps of the earth at a resolution of 30 meters. Once the mission was complete and the data could be processed, it was shared publicly in the first few years at a reduced resolution of 90 meters. Recently, elevation data has been released for the world at the full resolution of 30 meters.
All objects on Earth emit infrared radiation because they have a temperature. This long-wavelength energy can be collected by thermal infrared (TIR) sensors. Thermal imaging works day or night because it requires no illumination; all objects radiate energy on their own. Hotter objects radiate more energy, so they appear brighter on a thermal image.
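The "hotter objects appear brighter" relationship follows the Stefan-Boltzmann law, in which radiated power grows with the fourth power of absolute temperature. A small Python sketch of the law (an idealized blackbody, ignoring emissivity):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiant_exitance(kelvin):
    """Total power radiated per unit area by an ideal blackbody
    at the given absolute temperature (Stefan-Boltzmann law)."""
    return SIGMA * kelvin ** 4

body_heat = radiant_exitance(310)    # roughly human skin temperature
cool_ground = radiant_exitance(290)  # cool nighttime ground
```

The fourth-power dependence is why even modest temperature differences, a warm animal against cool water, for example, stand out so sharply in a thermal scene.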
Hyperspectral sensors see the world using a broad swath of the electromagnetic spectrum, but unlike multispectral sensors, the hyperspectral systems provide many more spectral bands, enabling observation of detailed spectral signatures. Hyperspectral images can enable identification of specific plants and minerals.
Many projects that use hyperspectral sensors are designed for specialized focus on particular bands to discover the presence of specific phenomena. These signatures enable identification of the materials that make up a scanned object. Detection of known spectral objects is aided by their tendency to have very similar spectral characteristics wherever they occur. For example, the spectral signature of a white pine tree is consistent and distinct from the signature of a sugar maple. Rocks that hold significant amounts of one mineral are distinct from similar-looking rocks holding another type of mineral. These distinctions are used to identify and extract features for use in a variety of applications.
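A common way to compare a measured spectrum against a reference signature is the spectral angle mapper, which treats each spectrum as a vector and measures the angle between them; small angles mean similar materials regardless of overall brightness. A minimal sketch (the spectra here are made-up numbers, not real signatures):

```python
import math

def spectral_angle(a, b):
    """Angle in radians between two spectra treated as vectors.
    The measure is insensitive to overall brightness and responds
    only to the shape of the spectrum."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    cosine = dot / (norm_a * norm_b)
    return math.acos(max(-1.0, min(1.0, cosine)))

# Same material at different brightness: angle is (near) zero
same = spectral_angle([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
# Dissimilar spectra: large angle
different = spectral_angle([1.0, 0.0], [0.0, 1.0])
```

Classifying a pixel then amounts to picking the reference signature with the smallest angle.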
This hyperspectral map of Cuprite, Nevada, provides a synoptic view of the surface mineralogy and identified a previously unrecognized early steam-heated hydrothermal event that resulted in an extensive distribution of iron-bearing elements.
Individual materials scanned using hyperspectral imagery have unique characteristics, or fingerprints. This graph compares the reflectance of hematite (an iron ore) with malachite and chrysocolla (copper-rich minerals) from 200 to 3,000 nanometers in wavelength.
There may be hundreds of thousands, if not millions, of undiscovered ancient sites across the globe, and Sarah Parcak wants to locate them. As a satellite archaeologist, she analyzes infrared imagery collected from far above the earth’s surface to identify subtle changes that signal a man-made presence hidden from view. Doing so, she and her colleagues aim to make invisible history visible once again—and to offer a new understanding of the past.
Inspiration comes from her grandfather, an early pioneer of aerial photography. While studying Egyptology in college, Parcak took a class on remote sensing and went on to develop a technique for processing satellite data to see sites of archaeological significance in Egypt. The method allows for the discovery of new sites in a rapid and cost-effective way.
She and her husband, Greg Mumford, have directed survey and excavation projects in various places in Egypt. She’s used several types of satellite imagery to look for water sources and archaeological sites.
Her latest work focuses on the looting of ancient sites. By satellite-mapping Egypt and comparing sites over time, the team noted a 1,000 percent increase in looting since 2009 at major ancient sites. It’s likely that millions of dollars’ worth of ancient artifacts are stolen each year. The hope is that, through mapping, unknown sites can be protected to preserve our rich, vibrant history.
Not all imagery applications require the projection of sensor data onto a map or, in other words, the registration of imagery into a geographic coordinate system. There are many applications where it is more effective and appropriate to work with the original image and view it from the perspective of the camera. This is referred to as working in image space, in contrast to working in a map coordinate system. Numerous military and civilian reconnaissance applications involve the use of both a map view and an image window. For example, inspection applications effectively use an image view and a map view in concert.
ArcGIS has the ability to integrate and incorporate full motion video (referred to as FMV), presuming you have metadata that describes the geographic location for your video. This is akin to how aerial imagery is georeferenced except that every frame in the video is georeferenced. Such georeferenced videos adhere to formats established by the Motion Imagery Standards Board (MISB), which oversees standards for full motion video capture pertaining to the defense and intelligence communities in the United States.
This enables MISB-compliant video frame locations to be placed as windows in your map views, and your map data as optional overlays in your video. FMV technology enables you to quickly and easily analyze video data from many kinds of airborne platforms, including aircraft, drones, and other UAVs.
The quickest way to access multispectral imagery in ArcGIS Online is through the Living Atlas of the World. But this is only a starting point. Once you’ve opened any of these services in ArcGIS, you can use the Display Image menu selection to alter the band combination and create your own.
In these lessons, you’ll assume the role of a geospatial scientist tasked with calculating the change in area of the lake between 1984 and 2014. Using Landsat imagery, you’ll classify land cover in three images of the lake taken at various times over the past 30 years to show only the surface area of the lake. You’ll then determine the change in lake area over time.
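Once the imagery is classified, the lake-area calculation reduces to counting water pixels and multiplying by the cell area (30-meter cells for Landsat). A minimal sketch with a made-up classified grid; the function name and class codes are illustrative, not part of the lesson's actual tooling:

```python
def water_area_km2(classified, cell_size_m=30.0):
    """Count pixels classified as water (code 1) and convert the
    count to square kilometers using the cell size."""
    count = sum(row.count(1) for row in classified)
    return count * cell_size_m ** 2 / 1_000_000.0

# A made-up 3 x 3 classified grid: 1 = water, 0 = other land cover
grid = [
    [0, 1, 1],
    [0, 1, 1],
    [0, 0, 0],
]
lake_km2 = water_area_km2(grid)
```

Running the same calculation on each classified date and differencing the results gives the change in lake area over time.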
Poyang Lake, China’s largest freshwater lake, has always had significant seasonal fluctuations in water level. Fed both by rains and the Yangtze River, Poyang Lake has lately experienced even more extreme fluctuations due to several years of drought and the construction of the Three Gorges Dam.
Dry season water levels are alarmingly low, and even rainy season water levels have fallen. The changes have impacted the local economy and altered the land cover of the area. But if locals want to do something about their lake’s disappearance, they’ll need to back their live observations with scientific facts.