Scientific Publications (2 results)
An intensity, image-based method to estimate gap fraction, canopy openness and effective leaf area index from phase-shift terrestrial laser scanning
Grotti, Mirko; Calders, Kim; Origo, Niall; Puletti, Nicola; Alivernini, Alessandro; Ferrara, Carlotta; Chianucci, Francesco
Abstract
Accurate in situ estimates of leaf area index (LAI) are essential for a wide range of ecological studies and applications. Because direct measurements are destructive and impractical, indirect optical methods have mostly been used in the field to derive LAI from gap fraction measurements. Terrestrial laser scanning (TLS) is increasingly used for this purpose, as this active technology possesses several advantages over passive sensors. However, edge effects and partial beam interceptions pose significant challenges for the accurate retrieval of gap fraction from the 3D point cloud data acquired with TLS, particularly with phase-shift instruments, which in turn requires point cloud filtering to correct erroneous point measurements. Because these limitations affect the point cloud itself, we propose a new method based only on the laser return intensity (LRI) information derived from raw TLS data, which is used to generate 2D intensity images. The intensity image contains all the unfiltered LRI information captured by the TLS and is used to separate gap from non-gap pixels, using a procedure comparable to the standard image-analysis processing of digital hemispherical images. This allows a theoretically consistent comparison between active and passive optical measurements of gap fraction across the full zenith angle range. The method was tested in real and simulated forests. Gap fraction, canopy openness and effective leaf area index derived from real and simulated intensity TLS images were compared with those obtained using digital hemispherical photography (DHP). Results indicated that the intensity, image-based method outperformed DHP: the higher pixel resolution of the intensity images and the larger distance covered by TLS allowed detection of many small canopy elements, particularly at higher zenith angles (longer optical distance), which are not detected in DHP.
The main findings support the reliability of the intensity, image-based method for standardizing protocols for TLS phase-shift scan data processing, and support the use of the produced canopy estimates as a benchmark for passive optical measurements. © 2019 Elsevier B.V.
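The gap fraction to effective LAI step described in the abstract is conventionally done with a Beer-Lambert inversion. A minimal sketch, assuming a pre-thresholded binary intensity image (1 = gap pixel, 0 = canopy pixel) and the common G(θ) ≈ 0.5 projection coefficient; the function names are illustrative, not from the paper:

```python
import numpy as np

def gap_fraction(binary_img):
    """Fraction of gap (sky) pixels in a thresholded intensity image.

    binary_img: 2D array with 1 = gap pixel, 0 = canopy pixel.
    """
    return binary_img.mean()

def effective_lai(p_gap, zenith_deg=57.5, g=0.5):
    """Beer-Lambert inversion: Le = -cos(theta) * ln(P) / G(theta).

    At zenith ~57.5 deg, G(theta) is near 0.5 regardless of leaf
    angle distribution, which is why that angle is often used.
    """
    theta = np.radians(zenith_deg)
    return -np.cos(theta) * np.log(p_gap) / g
```

For example, a measured gap fraction of 0.5 at 57.5° yields an effective LAI of about 0.74 under these assumptions.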
A new method to estimate clumping index integrating gap fraction averaging with the analysis of gap size distribution
Keywords: leaf area index; hemispherical photography; canopy nonrandomness; ordered weighted averaging (OWA) operator; orness
Abstract
Estimates of clumping index (Ω) are required to improve the indirect estimation of leaf area index (L) from optical field-based instruments such as digital hemispherical photography (DHP). A widely used method allows estimation of Ω from DHP using simple gap fraction averaging formulas (LX). This method is simple and effective, but it is sensitive both to the spatial scale used for averaging (i.e., the azimuth segment size in DHP) and to canopy density. In this study, we propose a new method to estimate Ω (LXG) based on ordered weighted gap fraction averaging (OWA) formulas, which addresses the disadvantages of LX and also accounts for gap size distribution. The new method was tested in 11 broadleaved forest stands in Italy; Ω estimated from LXG was compared with other commonly used clumping correction methods (LX, CC, and CLX). Results showed that LXG yielded more accurate Ω estimates, which were also more correlated with the values obtained from the gap size distribution methods (CC and CLX) than Ω obtained from LX. Leaf area index estimates adjusted by LXG are only 5%–6% lower than direct measurements obtained from litter traps, while the other commonly used clumping correction methods yielded larger underestimation. © 2019, Canadian Science Publishing. All rights reserved.
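The LX baseline that the abstract compares against is the standard Lang and Xiang logarithmic gap fraction averaging formula; the paper's OWA-based LXG variant is not reproduced here. A minimal sketch of LX, assuming gap fractions measured over azimuth segments (the function name is illustrative):

```python
import numpy as np

def clumping_lx(segment_gap_fractions):
    """Lang-Xiang clumping index from per-segment gap fractions:

        Omega_LX = ln(mean(P)) / mean(ln(P))

    Equals 1 for a spatially uniform canopy (all segments equal)
    and drops below 1 as foliage becomes more clumped, since the
    log of the mean exceeds the mean of the logs.
    """
    p = np.asarray(segment_gap_fractions, dtype=float)
    return np.log(p.mean()) / np.mean(np.log(p))
```

For equal gap fractions across segments the index is exactly 1; for a clumped canopy with segments [0.1, 0.5] it falls to roughly 0.80, which would then divide the effective LAI to yield the true LAI.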