FUSION OF HYPERSPECTRAL, MULTISPECTRAL, COLOR AND 3D POINT CLOUD INFORMATION FOR THE SEMANTIC INTERPRETATION OF URBAN ENVIRONMENTS
by
M. Weinmann, M. Weinmann
2019, Volume XLII-2/W13, pp. 1899-1906
Abstract
In this paper, we address the semantic interpretation of urban environments on the basis of multi-modal data in the form of RGB color imagery, hyperspectral data and LiDAR data acquired from aerial sensor platforms. We extract radiometric features based on the given RGB color imagery and the given hyperspectral data, and we also consider different transformations to potentially better data representations. For the RGB color imagery, these are achieved via color invariants, normalization procedures or specific assumptions about the scene. For the hyperspectral data, we involve techniques for dimensionality reduction and feature selection as well as a transformation to multispectral Sentinel-2-like data of the same spatial resolution. Furthermore, we extract geometric features describing the local 3D structure from the given LiDAR data. The defined feature sets are provided separately and in different combinations as input to a Random Forest classifier. To assess the potential of the different feature sets and their combination, we present results achieved for the MUUFL Gulfport Hyperspectral and LiDAR Airborne Data Set.
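The classification scheme described in the abstract (feature sets provided separately or in combination to a Random Forest classifier) can be sketched as follows. This is a minimal illustration with synthetic data, not the paper's implementation: the feature dimensions, class count and classifier settings are assumptions, and the fusion shown is a simple per-pixel concatenation of the feature sets.

```python
# Hedged sketch: early fusion of radiometric and geometric feature sets,
# followed by Random Forest classification (scikit-learn). All data below
# is synthetic; dimensions and hyperparameters are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200                                # number of labeled pixels (toy)
rgb_feats  = rng.random((n, 3))        # radiometric features from RGB imagery
hsi_feats  = rng.random((n, 10))       # reduced hyperspectral features
geom_feats = rng.random((n, 8))        # local 3D structure features from LiDAR
labels     = rng.integers(0, 4, n)     # toy semantic class labels

# Fusion by concatenating the per-pixel feature sets into one vector each;
# using a subset of the arrays here would correspond to a single feature set.
X = np.hstack([rgb_feats, hsi_feats, geom_feats])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)
pred = clf.predict(X)                  # one predicted class per pixel
```

Evaluating each feature set alone versus the concatenation, as the abstract describes, amounts to repeating the `fit`/`predict` step with different column subsets of `X`.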
Archived Files and Locations
application/pdf, 1.6 MB
www.int-arch-photogramm-remote-sens-spatial-inf-sci.net (publisher); web.archive.org (webarchive)
Open Access Publication
In DOAJ
In ISSN ROAD
In Keepers Registry
ISSN-L: 1682-1750
Crossref Metadata (via API)
Worldcat
SHERPA/RoMEO (journal policies)
wikidata.org
CORE.ac.uk
Semantic Scholar
Google Scholar