GEOSCAN Search Results: Fastlink



Title: Drone based very-high resolution imagery analysed with geographic object-based image analysis: the perfect match for mapping intertidal habitats?
Author: Diesing, M; Archer, S; Bremner, J; Dolphin, T; Downie, A-L; Scougal, C
Source: Program and abstracts: 2017 GeoHab Conference, Dartmouth, Nova Scotia, Canada; by Todd, B J; Brown, C J; Lacharité, M; Gazzola, V; McCormack, E; Geological Survey of Canada, Open File 8295, 2017 p. 46, https://doi.org/10.4095/305846 (Open Access)
Links: GeoHab 2017
Year: 2017
Publisher: Natural Resources Canada
Meeting: 2017 GeoHab: Marine Geological and Biological Habitat Mapping; Dartmouth, NS; CA; May 1-4, 2017
Document: open file
Language: English
Media: on-line; digital
Related: This publication is contained in Todd, B J; Brown, C J; Lacharité, M; Gazzola, V; McCormack, E; (2017). Program and abstracts: 2017 GeoHab Conference, Dartmouth, Nova Scotia, Canada, Geological Survey of Canada, Open File 8295
File format: pdf
Subjects: geophysics; mapping techniques; oceanography; marine environments; coastal studies; conservation; marine organisms; marine ecology; resource management; remote sensing; biology; habitat mapping; habitat conservation; habitat management; algorithms; object-based image analysis; drones
Program: Ocean Management Geoscience, Offshore Geoscience
Released: 2017 09 26
Abstract: Intertidal zones act as a natural buffer against storms and wave activity, and support rich assemblages of invertebrates and vertebrates of high economic, conservation and aesthetic importance. They are highly dynamic environments subject to constant change, and are at the same time threatened by anthropogenic stresses, including coastal development and habitat degradation. Mapping intertidal habitats and monitoring their temporal change due to natural processes and human activities is therefore important.
The challenges of mapping and monitoring the intertidal zone are numerous, including restricted access, remote locations, tides and the hazards these entail. Remotely Piloted Aircraft Systems, commonly known as drones, offer exciting new opportunities for intertidal ecology and habitat mapping: they allow access to areas that are otherwise difficult to reach, can be mobilised relatively quickly, and collect imagery of the intertidal zone at spatial scales relevant to answering ecological research questions. Drones allow intertidal zone mapping at very high spatial resolution (0.5-5 cm) in the visual and near-infrared spectrum. Additionally, digital surface models can be generated using structure-from-motion. At such resolutions, scene objects are much larger than the pixel size of the image. In this so-called H-resolution case, pixel-based image analysis methods become increasingly inefficient, as they struggle to derive meaningful spectral signatures from real-world objects with high within-class spectral variability. Conversely, geographic object-based image analysis (GEOBIA) is well suited to analysing and classifying highly resolved imagery. In the GEOBIA approach, the imagery is first segmented into discrete regions that are internally coherent and distinct from their surroundings (so-called image objects). These image objects are then classified using image object features, which may include object statistics on input layers, geometry (shape and size), texture, topology (e.g. relations to neighbouring objects) and others.
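The per-object feature step described above can be illustrated with a minimal pure-Python sketch. The function, data and feature choices below are hypothetical stand-ins, not the authors' implementation (which would operate on real segmentation output and multispectral rasters); the sketch only shows how a segmentation label map and a co-registered image band yield object statistics such as size and within-object spectral mean and variability.

```python
from statistics import mean, pstdev

def object_features(labels, band):
    """Compute per-object features (size, mean, std dev) from a segmentation
    label map and a co-registered image band.

    labels: 2-D list of integer object IDs (output of a segmentation step)
    band:   2-D list of pixel values (e.g. a near-infrared band)
    Both are illustrative stand-ins for real raster arrays.
    """
    pixels = {}
    for row_l, row_b in zip(labels, band):
        for obj_id, value in zip(row_l, row_b):
            pixels.setdefault(obj_id, []).append(value)
    return {
        obj_id: {
            "size": len(vals),  # object geometry: area in pixels
            "mean": mean(vals),  # object statistic on the input layer
            "std": pstdev(vals) if len(vals) > 1 else 0.0,  # within-object variability
        }
        for obj_id, vals in pixels.items()
    }

# A toy 4x4 scene segmented into two image objects (IDs 1 and 2)
labels = [[1, 1, 2, 2],
          [1, 1, 2, 2],
          [1, 1, 2, 2],
          [1, 1, 2, 2]]
band = [[0.1, 0.2, 0.8, 0.9],
        [0.1, 0.2, 0.8, 0.9],
        [0.1, 0.2, 0.8, 0.9],
        [0.1, 0.2, 0.8, 0.9]]
features = object_features(labels, band)
print(features[1]["size"], round(features[1]["mean"], 3))
```

In practice these feature vectors (extended with texture, shape and topology metrics) would feed a classifier such as the random forest mentioned below.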
This contribution presents results of a two-year project investigating the applicability of GEOBIA to very-high resolution imagery collected with a fixed-wing drone over (i) a muddy intertidal zone in the East of England (Two Trees Island, Essex, UK) with the aim of mapping seagrass beds, and (ii) a rocky shore platform situated in the Bristol Channel (UK) mapped repeatedly to detect change in the cover of Corallina sp., a red seaweed with a calcareous skeleton. We demonstrate that intertidal habitats can be mapped with high accuracy (>90% sensitivity, specificity and balanced accuracy across all classes). For change detection, we use image datasets from two dates (T1 and T2). Initially, a habitat map is created for T1 using the random forest algorithm. Segmentation on temporally stacked image data, followed by iterative trimming of outliers, allows us to identify changed image objects in a statistically robust way. Subsequently, unchanged image objects of the T1 map are used as 'samples' to predict habitat classes in the T2 map. Post-classification change detection (T1 vs T2) allows definition of the direction of change (from-to). We show that the inclusion of these latter stages in the change detection methodology not only yields more information on the nature of change, but also improves the accuracy of change detection.
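The "iterative trimming of outliers" step above can be sketched as follows. This is a hedged reconstruction, not the authors' code: it assumes that each image object is summarised by a single mean spectral difference between T1 and T2, and that objects are flagged as changed when their difference lies beyond an illustrative k standard deviations of the remaining population, repeating until no further outliers emerge.

```python
from statistics import mean, pstdev

def flag_changed_objects(diff_by_object, k=2.0):
    """Iteratively trim outliers from per-object T1-to-T2 differences.

    diff_by_object: dict mapping object ID -> mean spectral difference (T2 - T1)
    k: trimming threshold in standard deviations (illustrative choice)
    Returns the set of object IDs flagged as 'changed'.
    """
    unchanged = dict(diff_by_object)
    changed = set()
    while True:
        values = list(unchanged.values())
        if len(values) < 2:
            break
        mu, sigma = mean(values), pstdev(values)
        if sigma == 0:
            break  # no spread left: remaining objects are all 'unchanged'
        outliers = {oid for oid, d in unchanged.items()
                    if abs(d - mu) > k * sigma}
        if not outliers:
            break  # converged: no further outliers to trim
        changed |= outliers
        for oid in outliers:
            del unchanged[oid]
    return changed

# Most objects show near-zero difference; objects 7 and 9 changed markedly
diffs = {1: 0.01, 2: -0.02, 3: 0.00, 4: 0.02, 5: -0.01,
         6: 0.01, 7: 0.75, 8: -0.02, 9: -0.70, 10: 0.00}
print(sorted(flag_changed_objects(diffs)))  # → [7, 9]
```

The objects left in the unchanged set would then serve as the training 'samples' for classifying the T2 map, as described above.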
GEOSCAN ID: 305846