ABSTRACT
The management of natural and planted forests can be conducted sustainably by implementing techniques that consider the spatial and temporal variability of the plant and soil. In this context, precision silviculture through remote sensing can play a vital role, mainly when using Unmanned Aerial Vehicles (UAVs) equipped with specific sensors. In the present study, an automated computational routine (based on computer vision techniques) was developed and validated to perform forest inventory in commercial Eucalyptus grandis forests, using an orthophoto mosaic obtained with an RGB sensor built into a UAV. The developed routine employs computer vision techniques, including template matching to locate plants, Delaunay triangulation to create a mesh and indicate the predominant orientations of the planting rows, and an adaptation of the Hough transform to estimate the analytical parameters of each row. These parameters are refined using linear regression to generate the lines best fitting the input data. Finally, the failure segments on each row are identified by detecting the plants in each row, and a simulation of regular point distribution on each segment is used to locate the planting failures. This process allows each failure point to be geolocated and quantified for replanting. The routine has significant potential for forest inventory, geolocating failures with overall accuracies of 0.97 and 0.99 for maximum positional errors of 0.15 m and 0.20 m, respectively.
Hough transform; point clustering; precision forestry; remote sensing; unmanned aerial vehicles
Introduction
Planting row reconstitution has been extensively studied in automated guidance of agricultural vehicles using proximal Red-Green-Blue (RGB) sensors. There are different methodological approaches to solving this problem, with the main ones presented in the works of Vidović et al. (2016), Basso and Freitas (2020), and Rabab et al. (2021).
In general, the process of crop row detection involves three main steps: the first is image acquisition, which can be conducted using a proximal or aerial sensor; the second is image segmentation, which involves the creation of a binary image and the removal of noise; and the third is crop or planting row detection, which is based on pre-defined parameters and a proposed method (Rabab et al., 2021). The most common method in this type of approach is the Hough transform, as demonstrated by Jiang et al. (2015) and Vidović et al. (2016). However, when using RGB orthophoto mosaics from UAV images, as the images are obtained with a sensor directed at the nadir and positioned further from the target (plants), the approach is different, and new research is necessary (Chen et al., 2021). A summary of some advances is presented in Table 1.
Another approach, developed by Oliveira et al. (2020), proposed a solution based on computer vision techniques. This involved the application of template matching to identify eucalyptus plants and the use of Delaunay triangulation to identify cases of planting failures (Figure 1A). The computational routine developed by the authors demonstrated high accuracy (> 93 %) for the plant inventory. However, the failure to associate each plant with its respective planting row compromises the routine's capacity to identify failures in abnormal situations, such as those resulting from excessive planting (Figure 1B). Notably, in their methodology, the Delaunay triangulation is unsuitable in areas with many failures, as it cannot reconstitute the planting rows accurately.
– A) Failure identification (green line and red circle), planting rows (yellow color), and between rows (magenta color) (Oliveira et al., 2020); B) Limitation of the Delaunay technique for reconstituting rows and planting failures in abnormal situations.
Although the studies mentioned above present advances in planting row reconstitution, they have yet to present a specific and satisfactory approach for forest inventory (for monitoring tree mortality). Given these challenges, this work aims to develop and validate a fully automated solution for carrying out the forest inventory (for monitoring tree mortality) in commercial eucalyptus plantations. This solution is based on orthophoto mosaics obtained by RGB sensors built into a UAV, even in more extreme situations (Figure 1B).
Materials and Methods
Study site and dataset
In the present work, a commercial plantation area (47.60 ha) of Eucalyptus grandis W. Hill ex Maiden with a spacing of 2.5 m × 3.0 m and 90 days of regrowth (under coppice), located in the municipality of Martinho Campos, Minas Gerais state, Southeastern Brazil, was selected. However, the development of the computational routine was delimited by a test area of 3.28 ha (Figure 2A-D). The aerophotogrammetric data were collected on 27 Aug 2021 using a UAV of multirotor type (Phantom 4 Pro) equipped with an RGB sensor (20 Megapixels, 13.2 mm × 8.8 mm and focal length of 8.8 mm). The flight planning was prepared in the DJI Pilot® App, with 85 % of forward and lateral overlap. The maximum speed was 5 m s –1 , flight height of 80 m, with eight bits of radiometric resolution. Nine support points were set up: five ground control points and four checkpoints to control quality (accuracy), both systematically distributed within the area of interest.
– Location of the experimental site (Martinho Campos – MG): A) Federative Units of Brazil; B) state of Minas Gerais (MG); C) municipality of Martinho Campos; and D) limits of the experimental area.
The sensor settings and operations during the flight were as follows: ISO = 100; diaphragm = F/2.8; exposure time < 1/640; EV = –0.3; photo = single shot; white balance = sunny; style = landscape; image size = 3/2; mechanical shutter = on; camera focus = infinity. Finally, the mission was executed at 11h00 (to minimize the shadowing effect), and 186 aerial images were collected. With the flight parameters adopted, flying over the test area in approximately 9 min was possible.
Subsequently, the collected images were imported into Agisoft Metashape Professional Edition® software (version 1.8.2) for the geoprocessing, specifically Structure from Motion (SfM). This generated the Classic Orthophoto Mosaic (COM-RGB) with a ground sample distance (GSD) of 5 cm. The checkpoints for root mean squared error (RMSE) [X, Y] and RMSE [Z] were 5.2 cm and 6.1 cm, respectively. The Test Area (19°30’57.56” S, 45°18’46.87” W, 760 m) was clipped in QGIS 3.22.11 software.
Implementing the automated routine
The developed routine, designated "FindRow&Failure", comprises a workflow with two modules and their respective inputs and outputs. The workflow of the computational routine and its two modules is shown in Figure 3. On the left (Module I) is highlighted a step-by-step approach, as proposed in the work of Oliveira et al. (2020), for the identification and counting of eucalyptus plants based on the Classic Orthophoto Mosaic (COM-RGB) input. In the present approach, the generation of the geo-located plants file from the output of this module is also implemented; consequently, this module can be applied when the analyst lacks access to a geo-located plants file. The second column in the workflow illustrates (Module II) a step-by-step process that takes a geo-located plants file as input and generates geo-located rows and geo-located failures files as output. It should be noted that Module II is independent and can accept input from the output of Module I or from a georeferenced plant file produced by any other method. Module II represents the primary focus of this study and is discussed in detail in subsequent sections.
– Workflow of the FindRow&Failure. The grey trapezium represents the inputs, the white rectangles represent algorithms, and the remaining polygons (colored) are the outputs.
The integration of the two modules enables the routine to geolocate and count plants, eucalyptus planting lines, and failures, thereby facilitating the forest inventory. All custom routines were developed to operate within the Python 3.8 terminal of QGIS 3.22.11, using the following libraries: matplotlib, numpy, scipy.spatial, os, cv2, geopandas and osgeo.
The details of Module I can be found in Oliveira et al. (2020), where template matching is employed to identify the coordinates (pixels) of each plant within the studied image. This process is initiated by selecting a template, a manual clipping of a region patch in the image used to identify a plant. In a sliding-window process, this patch is compared with each image point to produce a map of the template's correlation index with respect to each image point. The image regions associated with the plant canopy are segmented based on a minimum correlation value. Finally, the local maximum points are computed and chosen to indicate the locations of the plants, generating the coordinate file (pixels) but without the geolocation. In the present work, the geolocation of the identified plants is implemented, specifically the Pixel2Coord transformation, which yields the geo-located plants file.
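The template-matching step of Module I can be sketched as follows. This is an illustrative NumPy implementation of normalized cross-correlation followed by local-maximum selection; the actual routine relies on the cv2 library, and the function name and thresholds here are assumptions for illustration only:

```python
import numpy as np

def match_template(image, template, min_corr=0.8):
    """Slide the template over the image, compute the normalized
    cross-correlation at each position, and return the centres of local
    maxima whose correlation exceeds min_corr (candidate plant locations)."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    corr = np.full((ih - th + 1, iw - tw + 1), -1.0)
    for r in range(corr.shape[0]):
        for c in range(corr.shape[1]):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            if denom > 0:
                corr[r, c] = (wz * t).sum() / denom
    peaks = []
    for r in range(corr.shape[0]):           # keep only local maxima
        for c in range(corr.shape[1]):
            if corr[r, c] >= min_corr:
                nb = corr[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
                if corr[r, c] >= nb.max():
                    peaks.append((r + th // 2, c + tw // 2))
    return peaks
```

In practice the same result is obtained far more efficiently with cv2.matchTemplate; the explicit loops above only make the sliding-window correlation described in the text visible.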
The details of Module II are discussed in the following subsections, which encompass the Delaunay triangulation, Hough transform, Point clustering, Linear regression and failure detection, and filling routines.
Delaunay triangulation
From the locations of the plants obtained by Module I (or by another method), it is possible to construct a triangular mesh defining edges between the points in the image. In this step, the Delaunay triangulation was employed, which has the property of constructing a more regular mesh, with triangles closer to the isosceles case, as described by Lee and Schachter (1980). From the Delaunay edges, those aligned with the planting lines are selected. The selection considers the angle defined by each edge and the x-axis: a probability distribution of these angles allows the definition of an interval of predominant edge angles, and this predominant direction indicates the planting lines.
The process of selecting edges in the predominant direction, which indicates the planting lines, is illustrated in Figure 4A-D. Initially, the points associated with the identified plants are highlighted (Figure 4A). Subsequently, the Delaunay triangulation establishes a mesh connecting the points (Figure 4B). The distribution of principal orientations of the Delaunay edges allows for the highlighting of the edges in the predominant direction, which indicates the planting lines, as illustrated in Figure 4C. Finally, the remaining edges are removed, and only edges aligned with the planting lines are selected, as shown in Figure 4D.
– Process of selecting predominant edge direction. A) identified plants; B) Delaunay triangulation; C) edges with predominant orientation highlighted; D) selected edges aligned with the planting rows.
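The edge-selection step above can be sketched with scipy.spatial, which the routine already uses. The function name, the 5° histogram bins, and the 10° tolerance are illustrative assumptions, not the authors' exact parameters:

```python
import numpy as np
from scipy.spatial import Delaunay

def predominant_edges(points, tol_deg=10.0):
    """Triangulate plant locations, measure each Delaunay edge's angle to
    the x-axis, and keep only the edges near the predominant orientation,
    which indicates the planting-row direction."""
    pts = np.asarray(points, dtype=float)
    tri = Delaunay(pts)
    edges = set()
    for s in tri.simplices:                      # collect unique edges
        for i in range(3):
            a, b = sorted((int(s[i]), int(s[(i + 1) % 3])))
            edges.add((a, b))
    edges = sorted(edges)
    ang = np.array([np.degrees(np.arctan2(*(pts[b] - pts[a])[::-1])) % 180.0
                    for a, b in edges])          # undirected angles in [0, 180)
    hist, bins = np.histogram(ang, bins=36, range=(0.0, 180.0))
    phi = bins[np.argmax(hist)] + 2.5            # centre of the modal 5-degree bin
    diff = np.minimum(np.abs(ang - phi), 180.0 - np.abs(ang - phi))
    keep = [e for e, d in zip(edges, diff) if d <= tol_deg]
    return phi, keep
```

For a grid of plants laid out in near-horizontal rows, the modal angle phi falls near 0° and the kept edges are the within-row connections of Figure 4D.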
Irregularities or planting failures cause the Delaunay edges to deviate from the intended layout of the planting rows, creating discontinuities in the lines. In the study conducted by Oliveira et al. (2020), these discontinuities were not addressed due to the difficulty of recovering the analytical expression of the line representing the planting row. This limitation is addressed in the present study by using the Hough transform.
Contrary to previous approaches that used only points associated with the detected plants as input for the Hough transform, the present work considers all points associated with the edges that indicate the direction of the planting lines, as detailed in Figure 4D. This strategy enhances line indication and significantly reduces false positives and false negatives, which are common in Hough transform applications.
Hough transform
The Hough transform is a widely used technique for detecting curves from a set of points within an image I. It can identify several parametric curves with a known analytical formulation, including lines, circles, and ellipses (Duda and Hart, 1972). In the case of line detection, a parametric space P is defined where each point (a, b) ∈ P is associated with a line of equation y = ax + b. Conversely, each point (x, y) in image I corresponds to a line of equation b = x(−a) + y in the parametric space P. Thus, a line defined by k collinear points {(x_i, y_i), i = 1, ..., k} in the image I is identified by the intersection of the k lines associated with those points in the parameter space. To enhance the efficiency of the implementation, the polar representation ρ = x cos(θ) + y sin(θ) is used to define the parameter space, where θ represents the orientation of the line and ρ denotes the distance from the line to the origin.
Implementing the Hough transform typically generates a parameter space where the axis ρ varies from 0 to h, where h is the diagonal size of the image in pixels, and the axis θ varies from 0 to 360°. Given the predominant direction of the edges that indicate the planting lines (Figure 4D), the proposed implementation reduces the search space on the θ axis according to Eq. (1):

θ ∈ [ϕ − δ, ϕ + δ] (1)
where: ϕ is the angle of the predominant direction of the planting lines and δ is a threshold that sets the maximum variation of the angle θ around ϕ. This strategy was used by Chen et al. (2021), where the threshold δ was set to 5°. In the present experiments, the optimal results were achieved with δ = 3°.
The proposed algorithm adaptation for the Hough transform as a voting process in the parameter space P is detailed in Table 2 . The parameter space is initialized with zeros, and for each non-zero point in the image space, a voting process increases the values for the points in the parameter space associated with the lines that pass through that point in the image.
– Adaptation (algorithm) for the Hough transform as a voting process in the parameter space P, P = Hough(I, R, T, ϕ, δ).
The points of each planting line in the image are not perfectly aligned, which may result in false positives. Furthermore, the Hough transform may return more than one line associated with the same planting row. False positives are typically associated with a neighborhood of the local maxima in the parameter space. In order to address this, the proposed algorithm selects the local maxima and ignores points from the parameter space that are in the neighborhood of the local maxima. The effect of this selection of local maxima in the parameter space is illustrated in Figure 5A and B.
– False positives removal in line detection. A) neighboring points of the local maxima in the parameter space generate false positives; B) the selection of local maxima allows for the isolation of correct lines, removing false positives.
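The voting process with a restricted θ window and the subsequent local-maxima selection can be sketched as below. This is a minimal illustration, not the authors' Table 2 algorithm: the bin resolutions, the minimum vote count, and the ρ suppression radius are assumed values, and non-negative (pixel) coordinates are assumed:

```python
import numpy as np

def hough_lines(points, phi, delta=3.0, theta_res=0.5, rho_res=1.0,
                min_votes=3, nms_rho=2):
    """Vote in (theta, rho) space with theta restricted to +/- delta
    degrees around the normal of the predominant row direction phi
    (degrees), then keep local maxima, suppressing neighbours in rho so
    each planting row yields a single line."""
    pts = np.asarray(points, dtype=float)
    normal = phi + 90.0              # rho = x cos(t) + y sin(t) uses the line normal
    thetas = np.deg2rad(np.arange(normal - delta, normal + delta + theta_res,
                                  theta_res))
    rho_max = np.hypot(*pts.max(axis=0)) + 1.0
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((len(thetas), n_rho), dtype=int)
    for x, y in pts:                 # each point votes once per theta bin
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + rho_max) / rho_res).astype(int)
        acc[np.arange(len(thetas)), idx] += 1
    lines = []
    while True:
        t, r = np.unravel_index(np.argmax(acc), acc.shape)
        if acc[t, r] < min_votes:
            break
        lines.append((np.rad2deg(thetas[t]), r * rho_res - rho_max))
        # the theta window is narrow, so suppress the whole window around
        # this rho to remove duplicate lines (false positives, Figure 5)
        acc[:, max(r - nms_rho, 0):r + nms_rho + 1] = 0
    return lines
```

Two horizontal rows of plants at y = 0 and y = 10, for example, yield exactly two lines with ρ near 0 and 10, because the suppression step discards the neighbouring accumulator cells of each maximum.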
Point clustering
The Hough transform returns lines that implicitly determine a grouping of points, assigning each plant to its respective planting row. However, the transform returns only the parameters of each line, without establishing a relationship between each line and its respective set of points. In a subsequent post-processing step, a distance matrix is constructed to compute the distance from each point to each line. From this matrix, each point is associated with the closest line. The algorithm ( Table 3 ) takes the set P of plant points and the set L of planting lines and associates each plant of coordinates ( x i , y i ) ∈ P with the line of parameters (θ k , ρ k ) ∈ L that best approximates the point.
This approach groups the points by association, each planting row k defining a set of points P k = {( x i , y i ); K ( i ) = k }. These points correspond to the plants detected for that row and are used to refine the parameters of each planting row using linear regression.
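The distance-matrix assignment described above is short enough to show in full. This sketch assumes lines are given as (θ in degrees, ρ) pairs, matching the polar form ρ = x cos(θ) + y sin(θ) used in the text:

```python
import numpy as np

def cluster_points(points, lines):
    """Associate each plant with its closest planting row: the distance
    from point (x, y) to line (theta, rho) is
    |x cos(theta) + y sin(theta) - rho|. Returns one row index per point."""
    pts = np.asarray(points, dtype=float)
    th = np.deg2rad([t for t, _ in lines])
    rho = np.array([r for _, r in lines])
    # distance matrix: one row per point, one column per line
    dist = np.abs(pts[:, :1] * np.cos(th) + pts[:, 1:2] * np.sin(th) - rho)
    return dist.argmin(axis=1)
```

With two horizontal rows (θ = 90°, ρ = 0 and ρ = 10), plants near y = 0 map to row 0 and plants near y = 10 map to row 1.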
Linear regression
Consider the set P k = {( x i , y i ); K ( i ) = k } with n points associated with the planting row k . The analytical expression of the line with the best fit to the points is given by Eq. (2):

ax + by + c = 0 (2)
The linear regression finds the parameters a , b , and c that lead to the best-fit line. Ideally, Eq. (3) should be satisfied for all n points on the line:

ax i + by i + c = 0, i = 1, ..., n (3)
However, as the points are typically not precisely aligned, this homogeneous system generally has no exact non-zero solution. In such cases, the best-fit line is determined by the least squares method and is indicated by the eigenvector associated with the smallest eigenvalue of the matrix A t A , where A , the coefficient matrix of the linear system given by Eq. (4), is the n × 3 matrix whose i -th row is [ x i , y i , 1].
The eigenvector associated with the smallest eigenvalue of A t A is denoted [ a , b , c ], and the new parameters of the line in polar coordinates are retrieved by Eq. (5):

θ = arctan( b / a ), ρ = − c / √( a ² + b ²) (5)
Adjusting analytical parameters for the planting row allows for the replanting of failed plants in the most appropriate location within the row, thus enhancing the accuracy of georeferencing planting failure points.
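The eigenvector-based refinement can be sketched directly with numpy.linalg, following the A᥀A formulation described in the text (function name and sign-normalisation convention are illustrative choices):

```python
import numpy as np

def refine_line(row_points):
    """Refine one row's parameters by fitting ax + by + c = 0: the best-fit
    coefficients are the eigenvector of A^T A with the smallest eigenvalue,
    where the i-th row of A is [x_i, y_i, 1]; the polar parameters
    (theta in degrees, rho) are then recovered from (a, b, c)."""
    pts = np.asarray(row_points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    _, eigvecs = np.linalg.eigh(A.T @ A)   # eigh returns eigenvalues ascending
    a, b, c = eigvecs[:, 0]                # smallest-eigenvalue eigenvector
    norm = np.hypot(a, b)
    theta = np.degrees(np.arctan2(b, a)) % 360.0
    rho = -c / norm
    if rho < 0:                            # normalise so rho >= 0
        rho, theta = -rho, (theta + 180.0) % 360.0
    return theta, rho
```

For plants lying exactly on the horizontal line y = 10, the smallest eigenvalue is zero and the recovered parameters are θ = 90° and ρ = 10, consistent with ρ = x cos(θ) + y sin(θ).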
Failure detection and filling
The analytical equation of the planting row, obtained in the previous subsection, is employed in the failure detection step to identify segments with a length greater than the theoretical spacing between plants in each row. This identification depends on the correct association of each detected plant with its respective planting row, as detailed in the section Point clustering. Each failure segment is then subdivided into replanting points by the theoretical spacing originally predicted between plants.
The detection of failure segments and the distribution of failure points for replanting along the segments are illustrated in Figure 6A-C. Figure 6A depicts the detected plants and reconstructed planting rows, while Figure 6B highlights the failure segments in yellow on each planting row. Finally, Figure 6C indicates the points for replanting over each failure segment. It should be noted that although the detected plants are not precisely aligned in the row, the points for replanting are distributed precisely along the planting row, respecting the planned spacing between plants.
– Indication of failure segments and points for replanting. A) the plants were detected, and planting rows were reconstructed; B) the failure segments are highlighted in yellow; C) points for replanting are indicated over the failure segments.
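The failure-segment subdivision can be sketched as below. This is an illustrative reading of the step, assuming the first and last input points are distinct plants of the same row and using an assumed 50 % gap tolerance; the authors' exact thresholds are not stated:

```python
import numpy as np

def replant_points(row_points, spacing, tol=0.5):
    """Order the plants of a single row along the row direction; every gap
    longer than the theoretical spacing (plus a tolerance fraction) is a
    failure segment, subdivided into equally spaced replanting positions."""
    pts = np.asarray(row_points, dtype=float)
    direction = pts[-1] - pts[0]
    direction = direction / np.linalg.norm(direction)
    pts = pts[np.argsort(pts @ direction)]     # sort plants along the row
    replant = []
    for p, q in zip(pts[:-1], pts[1:]):
        gap = np.linalg.norm(q - p)
        n_missing = int(round(gap / spacing)) - 1
        if gap > spacing * (1.0 + tol) and n_missing > 0:
            for k in range(1, n_missing + 1):  # regular point distribution
                replant.append(p + (q - p) * k / (n_missing + 1))
    return np.array(replant)
```

For plants at x = 0, 2.5 and 10 m with a 2.5 m spacing, the 7.5 m gap yields two replanting points at x = 5.0 and 7.5 m, placed exactly on the fitted row as in Figure 6C.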
Georeferencing
In order to maintain the georeferencing of the products (outputs), the gdal library was employed to read and transform geodesic coordinates into pixels (Coord2Pixel) and vice versa (Pixel2Coord). Consequently, the analyst can import the Classic Orthophoto Mosaic (COM-RGB) and Geo-located Plants files directly into the routine, as well as export the products in file formats that can be read and edited (if necessary) in geoprocessing software, such as QGIS.
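The Pixel2Coord/Coord2Pixel conversions are affine transforms driven by the six-element geotransform that gdal's GetGeoTransform returns; the arithmetic itself needs no library, as this sketch shows (the sample geotransform values are illustrative):

```python
def pixel2coord(gt, col, row):
    """Pixel indices -> map coordinates with a GDAL-style geotransform
    gt = (x_origin, pixel_w, row_rot, y_origin, col_rot, pixel_h)."""
    x = gt[0] + col * gt[1] + row * gt[2]
    y = gt[3] + col * gt[4] + row * gt[5]
    return x, y

def coord2pixel(gt, x, y):
    """Inverse transform for north-up rasters (rotation terms are zero)."""
    col = (x - gt[0]) / gt[1]
    row = (y - gt[3]) / gt[5]
    return col, row
```

With a 5 cm GSD mosaic, pixel_w = 0.05 and pixel_h = −0.05 (y decreases down the image), so the round trip pixel → UTM coordinate → pixel is exact up to floating-point precision.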
Data analysis
The efficacy of the implemented routine was evaluated by comparing the number of failures extracted automatically by the routine with the number of failures identified manually through photointerpretation of the RGB orthophoto mosaic in QGIS software, considering the plant spacing (2.5 m × 3.0 m). Three performance metrics (Eqs. (6), (7) and (8)) were then applied, as presented by Cleverdon and Keen (1966):

Se = TP / (TP + FN) (6)

Sp = TN / (TN + FP) (7)

Acc = (TP + TN) / (TP + TN + FP + FN) (8)
where: TP , TN , FP , and FN mean, respectively, true positive (number of eucalyptus failures identified correctly), true negative (number of “non-failures” identified), false positive (number of failures identified incorrectly), and false negative (number of eucalyptus failures identified incorrectly) (Figure 7A). The sensitivity (Se) is a measure of the algorithm’s ability to detect eucalyptus failures, while specificity (Sp) is a metric that indicates the algorithm’s effectiveness in identifying “non eucalyptus failures”. The accuracy (Acc) is an overall measure of the performance of the proposed method.
– A) TP = identification of true positive; TN = true negative; FP = false positive; and FN = false negative. B) Spatial range of TP, FN and FP, according to radius variation.
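The evaluation metrics translate directly into code. The sketch below uses the standard definitions of Se, Sp and Acc from the confusion counts; the exact form of the relative error in Eq. (9) is an assumption (absolute relative difference between detected and true counts, in percent):

```python
def metrics(tp, tn, fp, fn):
    """Sensitivity (failure-detection ability), specificity (ability to
    identify non-failures) and overall accuracy from confusion counts."""
    se = tp / (tp + fn)
    sp = tn / (tn + fp)
    acc = (tp + tn) / (tp + tn + fp + fn)
    return se, sp, acc

def relative_error(n_detected, n_true):
    """Relative difference (%) between the number of failures detected by
    the routine (TP + FP) and the true number of failures; the exact form
    of Eq. (9) is assumed here."""
    return abs(n_detected - n_true) / n_true * 100.0
```

For instance, with the counts reported in the Results (3,091 detected vs. 3,130 true failures), this relative error is about 1.25 %.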
The relative error (Re) was calculated by Eq. (9) to increase the rigor of the algorithm evaluation, following the recommendations of Armstrong and Collopy (1992) and Stine et al. (2004),
where: N fr = TP + FP is the total number of eucalyptus failures detected by the routine, and N indicates the real number of eucalyptus failures in the study site, obtained manually (by photointerpretation in QGIS) by measuring the size of each "free span" and, consequently, quantifying and geolocating each failure according to the theoretical spacing between plants (2.5 m).
In order to associate the aforementioned metrics with the positional error (geolocation) of eucalyptus failures, a circular geometric figure was established, with radius varying from 0.05 m to 1.0 m around the real geolocation of each failure (true failure). This figure determined the spatial range of TP, FN, and FP (Figure 7B). Given that the positional accuracy of vector data is contingent upon dimensionality, with points typically assessed by the Euclidean distance (Zandbergen, 2008; Lee et al., 2016), the average positional accuracy was estimated using the Average Euclidean Error (AEE), as defined in Eq. (10),
where: ∆x i and ∆y i are, respectively, the absolute error in the x direction ( e xi ) and the absolute error in the y direction ( e yi ) ( Figure 8 ).
– Euclidean distance ( d i ) or positional error, absolute error in the x direction ( e xi ), and absolute error in the y direction ( e yi ).
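As a worked sketch of Eq. (10), the AEE is the mean of the per-point Euclidean distances between each true failure and its detected counterpart (function name chosen here for illustration):

```python
import numpy as np

def average_euclidean_error(true_xy, detected_xy):
    """Average Euclidean Error: mean distance between each true failure
    and its detected counterpart (positional accuracy, Eq. (10))."""
    t = np.asarray(true_xy, dtype=float)
    d = np.asarray(detected_xy, dtype=float)
    return float(np.mean(np.hypot(d[:, 0] - t[:, 0], d[:, 1] - t[:, 1])))
```

For example, errors of (0.3, 0.4) m and (0, 0) m on two points give distances of 0.5 m and 0 m, hence an AEE of 0.25 m.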
To analyze the computational processing time of the developed routine, a laptop computer with a Linux operating system (Linux Mint 20.2 Cinnamon), CPU Intel(R) Core(TM) i7-4500U (4 cores, 1.80 GHz, L2 cache 4096 KB), RAM 16 GB, SSD Kingston SA 400S3, and GPU 2.0 GB (GeForce GT 740M) was used.
Results
The efficacy of the identification of planting failures by the presented method is contingent upon the Se and Acc of the initial identification of the plants, which Module I resolves with high accuracy (Oliveira et al., 2020). Thus, rather than considering plant evaluations, the present work solely analyzed the planting failures identified by Module II. Following the automatic detection of plants by Module I, the vector editing tools of QGIS were employed to manually complete (by photointerpretation) the survey of the forest stand, resulting in a total of 5,355 plants and 3,130 true failures of eucalyptus in the study site (Figure 9A), along with 21.213 km of planting rows.
– A) Spatial distribution of plants and real failures and B) detected failures by the routine in the Test Area. C) Detail of the planting-row reconstitution in regions with and D) without false negatives.
The routine identified a total of 3,091 instances of eucalyptus failure and 21.471 km of planting rows, illustrated in Figure 9B and C, respectively. Some regions exhibited false negatives (FN). In particular, in planting rows comprising a single eucalyptus plant, as noted in the section Implementing the automated routine, there are no conditions for obtaining Delaunay edges, and even less so for Hough lines or linear regression. This circumstance influences the Se of the routine, as the quantity of FN is increased in these regions (Figure 9C). Nevertheless, only 0.06 km of planting rows were not reconstructed. Where there are at least two plants, the reconstitution of the line becomes favorable (Figure 9D).
The results of the quantitative evaluation metrics are presented in Table 4. The efficacy of the routine is directly related to the tolerance adopted for the geolocation of planting failures (represented by the circle radius): the smaller the radius, the lower the accuracy of the routine. With a radius of 0.05 m, equivalent to the ground sample distance (GSD) of the image, the routine demonstrated a limited capacity to detect eucalyptus failures, with a sensitivity (Se) of 0.382. However, this low detection ability did not influence the number of true negatives (TN), resulting in a Sp of 0.739. The overall accuracy (Acc) for that case was 0.631.
– Performance of the proposed routine for forest inventory of Eucalyptus plantations using aerial images obtained by UAV-RGB. R = radius or magnitude of positional accuracy of planting failures; TP = true positive; FP = false positive; TN = true negative; FN = false negative; Se = sensitivity; Sp = specificity; Acc = overall accuracy; and Re = relative error.
When the magnitude of the positional accuracy was doubled, to 0.10 m, there was a gain of 87.46 % in the algorithm’s ability to detect planting failures (Se = 0.716). In turn, Sp had an increase of 16.86 % and, consequently, the accuracy of the routine increased, with gains of 28.96 % for Acc (0.814). As the magnitude of positional accuracy is increased to a radius of 0.15 m, further improvements are observed in the aforementioned metrics, with values exceeding 0.90. In this interval, there is a significant reduction in the number of false negatives (FN) (– 81.42 %) and false positives (FP) (– 85.16 %), resulting in an increase in the number of true positives (TP) (+ 32.25 %).
For radii greater than 0.15 m, the gains in performance metrics were not as significant as before. For the 0.20 m radius, there was an improvement of only 2.66 % for Se (0.973), 1.46 % for Sp (0.991), and 1.88 % for Acc (0.984). In general, an increase in radius (or in the magnitude of positional accuracy) results in the incorporation of false positives (FP) into the region of influence of true positives (TP), which provides improvements in the accuracy values ( Figure 10 ). This change is more significant for radii less than 0.20 m. Over 0.20 m, the values of the performance measures remained practically constant ( Figure 11 ).
– Radius variation (magnitude of positional accuracy) and true positive (TP) influence region.
– Metric evaluation. For increasing influence radius, values for sensitivity (Se), specificity (Sp), and accuracy (Acc) increase correspondingly. The gain in accuracy approaches zero for radii greater than 0.20 m.
A comparison of the coordinates in the UTM projection of the true points (real failure) and their counterparts (detected failure) generated by the routine revealed an average positional accuracy (AEE) of 0.077 m (σ = ± 0.144 m). The directional error in the x-axis ( e x ) and in the y-axis ( e y ), respectively, was 0.049 m (σ = ± 0.134 m) and 0.053 m (σ = ± 0.05 m), as illustrated in Figure 12 .
It is important to emphasize that the developed routine requires no user intervention to configure parameters (100 % automated). The analyst is solely required to indicate the input and output directories and files, and the theoretical spacing between plants. Regarding computational processing, only 51 s were required to analyze the aforementioned test area.
Discussion
Conducting a forest inventory is a crucial aspect of sustainable forest management. It requires spatial and temporal analysis to identify changes in forest cover, including growth and regeneration (Coops et al., 2023). When monitoring tree mortality, replanting within 15 to 30 days after planting is recommended, which underscores the need for a rapid and precise inventory. Given the significance of this analysis, surveying by UAV and very high spatial resolution satellite (VHSRS) represents an optimal alternative for forest inventory (Choudhry and O’Kelly, 2018). When employing technologies based on remote sensing, it is also essential to assess the requisite resolution (temporal, spatial, spectral, and swath) (Mukherjee et al., 2019).
While both platforms (UAV and VHSRS) have their respective advantages and disadvantages, the UAV is more flexible (Nikolakopoulos et al., 2019; Olson and Anderson, 2021). This flexibility is particularly advantageous when mapping must be carried out on a specific date or at a specific time. However, the UAV may require more pre-processing time and management, and is subject to restrictive legislation and the need for a trained operator. Conversely, the VHSRS facilitates mapping larger areas with images that are generally readily available and suitable for analysis (no pre-processing required) on proprietary web-based platforms. Weather restrictions must also be considered: UAVs are subject to limitations related to wind and precipitation, whereas the VHSRS is susceptible to cloud cover.
A comparison of the two data sources, UAV and VHSRS, reveals that the advantages of one platform offset the disadvantages of the other, and vice versa. This suggests a strong potential for synergies (Alvarez-Vanhard et al., 2021). There are complementarities between UAV-based and VHSRS-based resolution requirements, and data fusion is a well-known technique for dealing with this multi-source synergy (Barbedo, 2022).
The statistics were applied solely to cases of planting failure. If Module I alone were utilized, the detection accuracy would be 93 %, which would certainly compromise the calibration and validation of the routine. If the analyst chose to identify the plants using other methodologies, or already had the geolocation of the plants, they could employ only Module II, which makes the routine flexible enough to be adapted to other forest and agricultural crops.
The accuracy of the computational routine is directly related to the magnitude of the accepted positional error for the geolocation of a planting failure (represented by the radius of the circle of influence): the greater this value, the greater the routine accuracy. The routine has significant potential for use in forest inventory, as it allows for the geolocation of failures with an overall accuracy of 0.97 and 0.99 for maximum positional errors of 0.15 m and 0.20 m, respectively. Forest inventory becomes more efficient and economical when more detailed and accurate acquisition techniques are used (Choudhry and O’Kelly, 2018; Zhao et al., 2021).
Additionally, Zhao et al. (2021) aimed to identify and enumerate failures in eucalyptus plantation rows, achieving overall detection rates of 91.8 % and 95 % in two experiments. It is worth noting that the methodology proposed by those authors, which relies on segmenting the plant canopy (using the watershed method), is contingent upon the accuracy of this segmentation, specifically in identifying each plant. When the same metric was applied to the approach proposed in this study, an overall detection rate of 98.8 % was obtained.
Fareed and Rehman (2020) presented an automated method for extracting eucalyptus planting rows based on digital surface models (DSM), specifically drone-based image point clouds (DIPC). The study demonstrated the efficacy of the DIPC-based solution in extracting planting rows, with an F1-score of 0.94 for the optimal scenario evaluated. When the proposed routine was evaluated with this metric, an F1-score of 0.992 was obtained.
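For reference, the F1-score compared above is the harmonic mean of precision and recall. A minimal sketch with illustrative counts (hypothetical, not the study's raw data):

```python
def f1_score(tp, fp, fn):
    """F1-score from confusion-matrix counts: the harmonic mean of
    precision (TP / (TP + FP)) and recall (TP / (TP + FN))."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Illustrative counts where precision = recall = 0.992:
f1 = f1_score(tp=992, fp=8, fn=8)
```

Because it is a harmonic mean, F1 is pulled toward the weaker of the two components, so a high F1 requires both few false positives and few false negatives.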
The findings of the present work are crucial for commercial forestry, as the sooner failures are detected, the greater the chances of obtaining a homogeneous forest, with a reduction in idle areas and gains in productivity (Zhao et al., 2021). Typically, this information is obtained through the traditional inventory, which involves extensive and laborious field visits, in some cases conducted row by row and plant by plant. Although highly accurate, this method is extremely time-consuming, and most operations are labor-intensive and costly (Dainelli et al., 2021). The proposed methodology allows for a significant reduction in the time required to collect and process the data.
The approach presented a routine that is 100 % automated and calibrated in its computer vision aspects, which can be incorporated into the forest inventory workflow of commercial eucalyptus plantations, especially for areas where the occurrence of failures is outside normality, as illustrated in Figure 1B. The routine can be used in open-source software (QGIS and Spyder IDE) and exhibits optimal performance on orthophoto mosaics obtained by an RGB sensor, a cost-effective and user-friendly sensor (Yao et al., 2019). In addition, the routine is adaptable to other forest species (rubber tree, African mahogany, Australian cedar, etc.) and crops (mango, avocado, coffee, etc.).
One of the primary factors contributing to the high accuracy in detecting planting failures was the precise identification of the planting rows, which significantly reduced the number of false positives and false negatives. This improvement was due to the identification of the Delaunay edges aligned with the planting rows, which reinforced, in the Hough transform, the parameters effectively associated with the planting rows. However, the Hough transform alone does not associate each point with its respective row. In the proposed implementation, this association was obtained by combining the two approaches, allowing linear regression to be used to refine the parameters of each line. This refinement contributed to indicating failures with a smaller error radius and, therefore, to greater accuracy in the geolocation of the identified failures.
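The core voting idea can be illustrated with a plain point-based Hough transform. This is only a minimal sketch of the standard technique, not the routine's implementation: the paper's adaptation additionally reinforces the accumulator with the orientations of the Delaunay edges, which is omitted here. Each plant location votes for every line (theta, rho) passing through it, and accumulator peaks are candidate planting rows:

```python
import math
from collections import Counter

def hough_lines(points, theta_steps=180, rho_res=0.25, min_votes=3):
    """Plain point-based Hough transform (illustrative sketch).
    A line is parameterized as rho = x*cos(theta) + y*sin(theta);
    each point votes in a discretized (theta, rho) accumulator, and
    cells with at least `min_votes` votes are returned as candidate rows."""
    acc = Counter()
    for x, y in points:
        for i in range(theta_steps):
            theta = math.pi * i / theta_steps
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(i, round(rho / rho_res))] += 1
    return [(math.pi * i / theta_steps, r * rho_res)
            for (i, r), votes in acc.items() if votes >= min_votes]

# Three plants on the line y = 0 (theta = 90 degrees, rho = 0):
rows = hough_lines([(0.0, 0.0), (3.0, 0.0), (6.0, 0.0)])
```

In the full pipeline described by the paper, the points falling near each accumulator peak would then be passed to a linear regression to refine the line parameters.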
Despite these advances, several avenues can be pursued to extend this research. For instance, line detection using the Hough transform and the regression adjustment depends on detecting a minimum number of points in each planting row. However, an entire row may present 100 % planting failure, in which case there would be insufficient plants to serve as references for reconstructing the row, and it would not be possible to indicate the presence or absence of failures for that row. A potential improvement would be to use the theoretical spacing between planting rows: whenever the distance between two detected rows exceeds the theoretical spacing, the absence of an entire row would be indicated.
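The suggested improvement could be sketched as a check on the perpendicular offsets (rho values) of the detected parallel rows. The function below is a hypothetical illustration of that idea, not part of the published routine:

```python
def missing_row_offsets(rhos, row_spacing, tol=0.5):
    """Hypothetical sketch: given the perpendicular offsets (rho, metres)
    of detected parallel planting rows, flag gaps wider than the theoretical
    row spacing. A gap spanning ~n spacings implies n - 1 entirely missing
    rows; their expected offsets are returned for field verification."""
    missing = []
    rhos = sorted(rhos)
    for a, b in zip(rhos, rhos[1:]):
        n = round((b - a) / row_spacing)          # spacings the gap covers
        if n >= 2 and abs((b - a) - n * row_spacing) <= tol:
            missing.extend(a + k * row_spacing for k in range(1, n))
    return missing

# Rows detected at offsets 0, 3, and 9 m with a 3 m theoretical spacing:
# the gap between 3 and 9 m implies a fully failed row at 6 m.
gaps = missing_row_offsets([0.0, 3.0, 9.0], row_spacing=3.0)
```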
The implemented routine considers only straight lines; therefore, it cannot handle crops that do not follow this planting-row pattern. In this sense, future research implementing the detection of general curves can expand the spectrum of applications of the presented routine. In general, non-linear planting rows, although not straight, can be approximated locally by straight lines. Consequently, partitioning the image into smaller patches would allow the construction of short segments that reconstruct the approximate planting row as a polygonal curve.
Acknowledgments
The authors would like to thank ArcelorMittal BioFlorestas Ltda. for hosting the experiments. The authors would also like to thank the Fundação de Amparo à Pesquisa do Estado de Minas Gerais (FAPEMIG) and the Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) for granting the scholarships.
References
- Alvarez-Vanhard E, Corpetti T, Houet T. 2021. UAV & satellite synergies for optical remote sensing applications: A literature review. Science of Remote Sensing 3: 100019. https://doi.org/10.1016/j.srs.2021.100019
- Armstrong JS, Collopy F. 1992. Error measures for generalizing about forecasting methods: Empirical comparisons. International Journal of Forecasting 8: 69-80. https://doi.org/10.1016/0169-2070(92)90008-W
- Barbedo JGA. 2022. Data fusion in agriculture: resolving ambiguities and closing data gaps. Sensors 22: 2285. https://doi.org/10.3390/s22062285
- Basso M, Freitas EP. 2020. A UAV guidance system using crop row detection and line follower algorithms. Journal of Intelligent & Robotic Systems 97: 605-621. https://doi.org/10.1007/s10846-019-01006-0
- Biglia A, Zaman S, Gay P, Aimonino DR, Comba L. 2022. 3D point cloud density-based segmentation for vine rows detection and localisation. Computers and Electronics in Agriculture 199: 107166. https://doi.org/10.1016/j.compag.2022.107166
- Chen P, Ma X, Wang F, Li J. 2021. A new method for crop row detection using unmanned aerial vehicle images. Remote Sensing 13: 3526. https://doi.org/10.3390/rs13173526
- Choudhry H, O'Kelly G. 2018. Precision Forestry: A Revolution in the Woods. McKinsey, Chicago, IL, USA. Available at: https://www.mckinsey.com/industries/paper-forest-products-and-packaging/our-insights/precision-forestry-a-revolution-in-the-woods [Accessed June 8, 2023]
- Cleverdon C, Keen M. 1966. ASLIB Cranfield Research Project: Factors Determining the Performance of Indexing Systems. 2ed. Cranfield University, Cranfield, UK.
- Coops NC, Tompalski P, Goodbody TRH, Achim A, Mulverhill C. 2023. Framework for near real-time forest inventory using multi source remote sensing data. Forestry 96: 1-19. https://doi.org/10.1093/forestry/cpac015
- Dainelli R, Toscano P, Di Gennaro SF, Matese A. 2021. Recent advances in unmanned aerial vehicle forest remote sensing: a systematic review. Part I: A general framework. Forests 12: 327. https://doi.org/10.3390/f12030327
- Di Gennaro SF, Matese A. 2020. Evaluation of novel precision viticulture tool for canopy biomass estimation and missing plant detection based on 2.5D and 3D approaches using RGB images acquired by UAV platform. Plant Methods 16: 1-12. https://doi.org/10.1186/s13007-020-00632-2
- Duda RO, Hart PE. 1972. Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM 15: 11-15. https://doi.org/10.1145/361237.361242
- Edelsbrunner H, Mücke EP. 1994. Three-dimensional alpha shapes. ACM Transactions on Graphics 13: 43-72. https://doi.org/10.1145/174462.156635
- Fareed N, Rehman K. 2020. Integration of remote sensing and GIS to extract plantation rows from a drone-based image point cloud digital surface model. ISPRS International Journal of Geo-Information 9: 151. https://doi.org/10.3390/ijgi9030151
- Jiang G, Wang Z, Liu H. 2015. Automatic detection of crop rows based on multi-ROIs. Expert Systems with Applications 42: 2429-2441. https://doi.org/10.1016/j.eswa.2014.10.033
- La Rosa LEC, Oliveira DAB, Zortea M, Gemignani BH, Feitosa RQ. 2020. Learning geometric features for improving the automatic detection of citrus plantation rows in UAV images. IEEE Geoscience and Remote Sensing Letters 19: 1-5. https://doi.org/10.1109/LGRS.2020.3024641
- Lee DT, Schachter BJ. 1980. Two algorithms for constructing a Delaunay triangulation. International Journal of Computer & Information Sciences 9: 219-242. https://doi.org/10.1007/BF00977785
- Lee L, Jones M, Ridenour GS, Bennett SJ, Majors AC, Melito BL, et al. 2016. Comparison of Accuracy and Precision of GPS-Enabled Mobile Devices. IEEE, Piscataway, NJ, USA. https://doi.org/10.1109/CIT.2016.94
- Liu M, Su W, Wang X. 2023. Quantitative evaluation of maize emergence using UAV imagery and deep learning. Remote Sensing 15: 1979. https://doi.org/10.3390/rs15081979
- Mukherjee A, Misra S, Raghuwanshi NS. 2019. A survey of unmanned aerial sensing solutions in precision agriculture. Journal of Network and Computer Applications 148: 102461. https://doi.org/10.1016/j.jnca.2019.102461
- Nikolakopoulos K, Kyriou A, Koukouvelas I, Zygouri V, Apostolopoulos D. 2019. Combination of aerial, satellite, and UAV photogrammetry for mapping the diachronic coastline evolution: the case of Lefkada Island. ISPRS International Journal of Geo-Information 8: 489. https://doi.org/10.3390/ijgi8110489
- Oliveira WF, Vieira AW, Santos SR. 2020. Quality of forest plantations using aerial images and computer vision techniques. Revista Ciência Agronômica 51: e20197070. https://doi.org/10.5935/1806-6690.20200080
- Olson D, Anderson J. 2021. Review on unmanned aerial vehicles, remote sensors, imagery processing, and their applications in agriculture. Agronomy Journal 113: 971-992. https://doi.org/10.1002/agj2.20595
- Rabab S, Badenhorst P, Chen YP, Daetwyler HD. 2021. A template-free machine vision-based crop row detection algorithm. Precision Agriculture 22: 124-153. https://doi.org/10.1007/s11119-020-09732-4
- Ribeiro JB, Silva RR, Dias JD, Escarpinati MC, Backes AR. 2023. Automated detection of sugarcane crop lines from UAV images using deep learning. Information Processing in Agriculture 10: 400-415. https://doi.org/10.1016/j.inpa.2023.04.001
- Rocha BM, Fonseca AU, Pedrini H, Soares F. 2023. Automatic detection and evaluation of sugarcane planting rows in aerial images. Information Processing in Agriculture 10: 400-415. https://doi.org/10.1016/j.inpa.2022.04.003
- Stine BE, Hess C, Weiland LH, Ciplickas DJ, Kibarian J. 2004. System and method for product yield prediction using a logic characterization vehicle. US Patent 6,834,375. WIPO, Geneva, Switzerland. (PCT): WO2001037322A9
- Vidovic I, Cupec R, Hocenski Ž. 2016. Crop row detection by global energy minimization. Pattern Recognition 55: 68-86. https://doi.org/10.1016/j.patcog.2016.01.013
- Yao H, Qin R, Chen X. 2019. Unmanned aerial vehicle for remote sensing applications: a review. Remote Sensing 11: 1443. https://doi.org/10.3390/rs11121443
- Zandbergen PA. 2008. Positional accuracy of spatial data: non-normal distributions and a critique of the national standard for spatial data accuracy. Transactions in GIS 12: 103-130. https://doi.org/10.1111/j.1467-9671.2008.01088.x
- Zhao H, Wang Y, Sun Z, Xu Q, Liang D. 2021. Failure detection in eucalyptus plantation based on UAV images. Forests 12: 1250. https://doi.org/10.3390/f12091250
Publication Dates
- Publication in this collection: 09 Sept 2024
- Date of issue: 2024
History
- Received: 25 Aug 2023
- Accepted: 20 Dec 2023