ABSTRACT
Unmanned aerial vehicles (UAVs) are a promising tool for technology development and transfer and for the economic success of the agricultural sector. The objective of this study is to assess the validity of biomass estimation in a commercial maize plantation using aerial images obtained by a UAV. The proposed methodology involved analyzing images acquired in scheduled flights, processing orthophoto (georeferenced image) data, evaluating digital terrain elevation models, and assessing the quality of dense point clouds. Data were collected using two cameras, one with a 16-megapixel flat lens and the other with a 12-megapixel fish-eye lens, coupled to a UAV at two flight altitudes (30 and 60 meters) over a hybrid maize (AG1051) crop irrigated by center pivot in the municipality of Limoeiro do Norte, Ceará, Brazil. Crop biomass was estimated in randomly sampled 1 m2 plots, and the data were validated by interpreting aerial images of the target areas. Biomass estimates obtained from UAV-based aerial images were promising. The estimated values were more accurate using the fish-eye lens at 30 m altitude, corresponding to 2.97 kg m-2, which is very close to the value measured in the field (2.92 kg m-2).
KEYWORDS
precision agriculture; Structure from Motion; unmanned aerial vehicles; Zea mays L.
INTRODUCTION
Precision agriculture is promising for developing technologies that contribute to the economic success of agricultural activities. Unmanned aerial vehicles (UAVs) are an advanced technology, and several UAV-based applications have been developed over the years, allowing market expansion and an increase in the demand for services (De Lara et al., 2019; Han et al., 2019). UAVs are useful for acquiring high spatial resolution aerial images at a cost lower than that of other methods (Honkavaara et al., 2013; Marcial-Pablo et al., 2019; Niu et al., 2019).
UAV-based applications in agriculture have contributed to technological innovation. Calderón et al. (2013) used a UAV coupled to multispectral and thermal sensors to identify fungal infection in an olive plantation. Bendig et al. (2014) used remote sensing techniques and UAVs to estimate plant biomass and height in barley and rice crops. Shahbazi et al. (2015) and James et al. (2017) developed and evaluated UAV systems for high-precision mapping and 3D model generation, assessing the quality of the georeferencing model and the point clouds produced. Han et al. (2016) and Santesteban et al. (2017) combined thermal and multispectral cameras with UAVs to determine the correlation between temperature and water content in crops. Alsalam (2017) used UAV-based high-resolution images to identify and map invasive plants in the cultivation of a forage species.
Aerial images are processed using specific computer software, which builds three-dimensional (3D) models from two-dimensional (2D) data by applying photogrammetric principles to overlapping images. These programs use algorithms such as structure from motion (SfM) (Ullman, 1979), which recognize patterns by overlapping and aligning images acquired by a moving camera over an agricultural area. The algorithm detects and describes the attribute or local pattern of each 2D point, and this procedure is repeated for each image in which the same pattern is found. Several studies have described the structure and applicability of SfM (Verhoeven et al., 2012; Mancini et al., 2013). In addition, SfM photogrammetry has become even more attractive in recent years because of advances in matching images from moving cameras, allowing the accurate acquisition of dense 3D point clouds at full coverage (Piermattei et al., 2019). The 3D structure of the canopy surface is constructed from 2D images because the position of features in multiple overlapping images (together with their respective coordinates) can be estimated in a 3D environment using triangulation (Swinfield et al., 2019).
In crop yield forecasting, it is essential to assess the possibility of estimating biomass using remote sensing techniques because of their advantages over conventional (destructive) field sampling methods (Fassnacht et al., 2014; Wani et al., 2015), as demonstrated by Bendig et al. (2014), who used high spatial resolution UAV images to correlate biomass in barley and rice crops with plant height derived from digital elevation models and found strong statistical correlations.
The objective of this study is to develop a UAV-based aerial data processing methodology to estimate maize biomass. The proposed technique uses digital image processing software and procedures to create 3D models.
MATERIAL AND METHODS
Study area
The study was carried out in the municipality of Limoeiro do Norte (5°12.771′ S, 38°1.388′ W; 198 km from Fortaleza), Ceará, Brazil, in October 2015, in a 100-hectare area used for the commercial cultivation of maize (AG 1051, Agroceres®) irrigated by center pivot, 74 days after planting (pre-harvest for silage) (Figure 1).
Data collection
A Phantom 2 UAV (DJI Innovations) was used as the platform for the image acquisition sensor. Phantom UAVs are multi-rotor (quad-rotor) aircraft with a flight autonomy of approximately 15 min. The Phantom 2 model has a 5,200 mAh battery with a working voltage of 11.1 V.
The sample area was delimited in the field using the Ground Station software, which allowed programming the automatic flight path, the desired altitude, and the cruising speed of the UAV.
The images were acquired in October 2015 during three flights using a 16-megapixel Ricoh GR digital camera (focal length of 18.3 mm) at an altitude of 60 m above ground level and a 12-megapixel GoPro Hero 4 Silver camera with a fish-eye lens (focal length of 2.8 mm) at altitudes of 30 and 60 m above ground level. The flights performed at 60 and 30 m covered areas of approximately 5.1 ha (161 m × 316 m) and 1.0 ha, respectively. It was not possible to use the Ricoh GR camera at 30 m because of technical defects in the sensor. Image overlap was 65% lateral and 85% frontal, values at the lower limits required for applying the SfM algorithm (Ullman, 1979). To achieve the overlap necessary for the composition of dense point clouds, the software defined the horizontal flight speed (5 m s-1) and the image capture frequency.
The flight altitudes were chosen on the basis of the pixel size produced in the orthophotos (less than 0.1 m), providing a level of detail compatible with the study objectives and avoiding mixed pixels on targets such as the plant canopy, thus allowing canopy height to be measured.
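The relationship between flight altitude, sensor geometry, and frontal overlap described above can be sketched numerically. In the sketch below, only the focal length (18.3 mm), altitude (60 m), speed (5 m s-1), and 85% frontal overlap come from the text; the 23.7 mm sensor width is an illustrative assumption (typical of an APS-C sensor such as the Ricoh GR's):

```python
# Pinhole-camera sketch: ground footprint of one frame and the capture
# interval needed to reach a given frontal overlap at a given speed.

def ground_footprint(sensor_width_m, focal_length_m, altitude_m):
    """Width of the ground strip imaged by one frame (pinhole model)."""
    return sensor_width_m * altitude_m / focal_length_m

def capture_interval(footprint_m, overlap, speed_ms):
    """Seconds between shots so consecutive frames overlap by `overlap`."""
    advance = footprint_m * (1.0 - overlap)  # ground distance between frames
    return advance / speed_ms

# 23.7 mm sensor width is an assumption; the other values are from the text.
fp = ground_footprint(0.0237, 0.0183, 60.0)  # ~77.7 m along-track footprint
dt = capture_interval(fp, 0.85, 5.0)         # ~2.3 s between frames
print(f"footprint = {fp:.1f} m, interval = {dt:.1f} s")
```

This makes explicit why the software, rather than the pilot, fixes the capture frequency: the interval follows directly from the footprint, the overlap target, and the cruising speed.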
To calculate the biomass in the field and later validate the results obtained with the proposed methodology, data were collected in eight 1 m² sampling plots (validation units, each with approximately 10 plants) in the maize cultivation area. The canopy volume was measured in each plot, and the average plant height was measured in the field. A conventional (destructive) sampling method was performed by cutting the plants with pruning shears to calculate the projected volume in each plot and obtain fresh mass values in the laboratory, according to the methodology proposed by T'Mannetje (2000). The soil characteristics, cultural practices, and maize variety were similar between the plots. A total of eight composite samples were collected because this number was sufficient to estimate the agronomic characteristics of the crop. These data allowed determining the silo density (SD), canopy density (CD), and fresh biomass (FB) according to the following equations:
SD = FM / VC

Where:
SD is the silo density (kg m-3);
FM is the fresh mass of maize (kg),
VC is the volume of chopped maize (m³).
The measured CD values were multiplied by the volume obtained from the point clouds of the 3D model, resulting in the total mass of the maize crop.
CD = FM / (A × H)

Where:
CD is the canopy density (kg m-3);
A is the area of one sampling unit (1 m²);
H is the average plant height in each sampling unit (m).
FB = CD × H

Where:
FB is the fresh biomass (kg m-2).
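A minimal numeric sketch of the three quantities above. The equation forms SD = FM/VC, CD = FM/(A × H), and FB = CD × H are reconstructions from the variable definitions given in the text, and the plot values below (fresh mass, chopped volume) are invented for illustration:

```python
def silo_density(fresh_mass_kg, chopped_volume_m3):
    """SD = FM / VC (kg m^-3)."""
    return fresh_mass_kg / chopped_volume_m3

def canopy_density(fresh_mass_kg, area_m2, height_m):
    """CD = FM / (A * H) (kg m^-3): mass spread over the canopy volume."""
    return fresh_mass_kg / (area_m2 * height_m)

def fresh_biomass(canopy_density_kgm3, height_m):
    """FB = CD * H (kg m^-2): biomass per unit ground area."""
    return canopy_density_kgm3 * height_m

# Hypothetical 1 m^2 plot: 2.9 kg of fresh mass, plants 2.2 m tall.
sd = silo_density(10.0, 0.02)        # e.g. 10 kg of chopped maize in 0.02 m^3
cd = canopy_density(2.9, 1.0, 2.2)   # kg m^-3
fb = fresh_biomass(cd, 2.2)          # recovers FM / A = 2.9 kg m^-2
print(sd, cd, fb)
```

Note that FB reduces to FM/A for each plot; the point of routing it through CD is that CD can later be multiplied by the canopy volume derived from the point cloud to scale up to the whole field.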
Before the flights, nine ground control points (GCPs) made of ceramic material (Figure 2A) were distributed across the study site and used to register the sample plots during the georeferencing of the digital elevation models. The coordinates of each GCP were obtained with millimeter precision using GNSS receivers (Trimble® R4 and Magellan® ProMark3; Figure 2B). An image acquired from the UAV is shown in Figure 2C.
A – A ground control point (GCP) made of ceramic material; B - Acquisition of geographic coordinates from a GCP using a GPS (R4 Trimble); C - Aerial image of a maize crop acquired with a UAV sensor.
A computer with a 3.40 GHz Intel Core™ i7-3770 processor, 8 GB of RAM, and a 64-bit Windows 8 operating system was used for image processing. The PhotoScan software (Agisoft) allowed the efficient and sequential generation and manipulation of the orthomosaic and the composition of the desired 3D model. Data processing was divided into seven stages: (1) import of aerial images; (2) image matching; (3) camera calibration; (4) creation of the 3D mesh and georeferencing; (5) production of dense point clouds; (6) generation of orthophotos and export of documents; and (7) export of dense point clouds. Dense clouds are clusters of 3D points created when the SfM algorithm identifies the same pixel in two or more UAV photographs, resulting in a point pattern that represents the terrain model.
For processing and interpreting the data generated in the 3D model, the biomass was calculated for the plot located in the center of the study area, corresponding to approximately 1 hectare (10,000 m²), and the dense point cloud of this plot was exported by the software. The generated text file (.txt) contained the geographic coordinates of the tie points on the x, y, and z axes according to a predefined datum, color (red-green-blue [RGB]) values, and surface orientation (pitch, roll, and yaw) information (normals).
The dense point cloud data were grouped into height classes using an algorithm written in C++; the pixel color (RGB) and orientation (pitch, roll, and yaw) data were excluded, leaving only the height information. The height of each point was sorted in ascending order and grouped into classes with two decimal places, a level of precision that proved satisfactory. The average heights and respective standard deviations of the cloud points were calculated for the data from the three flights to evaluate the data dispersion around the average. A column with the frequency of occurrence of each height class was added to create a histogram, from which the height frequencies were determined; the two most frequent values were considered the stand height of the maize crop and the elevation of the bare soil.
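The grouping step described above (implemented by the authors in C++) can be sketched as follows; Python is used here for brevity, and the sample heights are invented. Heights are rounded to two decimal places, counted per class, and the two most frequent classes are taken as the crop stand height and the bare-soil elevation:

```python
from collections import Counter

def height_classes(z_values):
    """Round heights to 2 decimal places and count occurrences per class."""
    return Counter(round(z, 2) for z in z_values)

def stand_and_soil(z_values):
    """Return (canopy height, soil elevation) from the two modal classes."""
    (h1, _), (h2, _) = height_classes(z_values).most_common(2)
    # The higher of the two modes is the canopy; the lower is bare soil.
    return max(h1, h2), min(h1, h2)

# Invented point cloud: soil near 0.05 m, canopy near 2.20 m, one stray point.
z = [0.051, 0.049, 0.052, 0.048, 2.201, 2.199, 2.204, 2.196, 2.2, 1.1]
canopy, soil = stand_and_soil(z)
print(canopy, soil)  # 2.2 0.05
```

The two-decimal rounding plays the role of the height-class binning in the text; a real point cloud would have millions of points, but the modal-class logic is identical.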
Pixels of the same size and spatial location were sampled in the point clouds in the areas corresponding to the validation units for each processing run. A completely randomized design with four treatments, eight repetitions, and ten experimental units was used. Ten plants were used to determine the average height in the eight repetitions performed in the field (control treatment [T4]). The pixel clusters (plant heights) from the point clouds were sampled according to the cloud density generated for the other treatments (T1, T2, and T3). The average heights in the following treatments were subjected to analysis of variance and Tukey's test: T1 - Phantom UAV equipped with a 16-megapixel Ricoh GR camera (focal length of 18.3 mm) at a height of 60 m and data processing using PhotoScan software; T2 - Phantom UAV equipped with a 12-megapixel GoPro Hero 4 Silver camera at a height of 60 m and data processing in PhotoScan software; T3 - Phantom UAV equipped with a 12-megapixel GoPro Hero 4 Silver camera at a height of 30 m and data processing in PhotoScan software; T4 (control) - plant heights determined in the field using a measuring tape.
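The analysis of variance applied to the treatment heights can be illustrated with a minimal one-way ANOVA F statistic. This is a textbook computation, not the authors' statistical package, and the three groups of heights below are invented (the study's actual results are in Tables 4 and 5):

```python
def anova_f(groups):
    """One-way ANOVA F statistic: between-group vs within-group variance."""
    k = len(groups)                                # number of treatments
    n = sum(len(g) for g in groups)                # total observations
    grand = sum(sum(g) for g in groups) / n        # grand mean
    means = [sum(g) / len(g) for g in groups]      # treatment means
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (n - k))

# Invented plant heights (m) for three hypothetical treatments.
heights = [[2.1, 2.2, 2.3], [2.2, 2.3, 2.4], [1.4, 1.5, 1.6]]
f_stat = anova_f(heights)  # large F: the third group clearly differs
print(f_stat)
```

A large F would then motivate a post hoc means test such as Tukey's, as done in the study, to identify which treatment pairs differ.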
A probability test was performed to identify the height classes at a confidence level of 5%, and the amplitude of the classes was calculated as the difference between height values. The pixel cluster size was based on the dense point clouds, the pixel area, and the frequency of occurrence according to the following equation:
S = PA × PH × F

Where:
S is the pixel cluster size (m³),
PA is the pixel area (m²),
PH is the pixel height (m), and
F is the frequency of appearance of the pixel in question.
The sum of the pixel cluster sizes corresponded to the total volume of the imaged area.
The biomass of the area of interest was obtained by multiplying S by CD. The biomass per unit area was then determined by dividing this mass by the effective area (approximately 1 hectare [10,000 m²], excluding the outliers).
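The last two steps can be sketched together: summing S = PA × PH × F over all height classes gives the stand volume, and multiplying by CD gives the total mass. The pixel area, class frequencies, and CD value below are invented for illustration:

```python
def stand_volume(pixel_area_m2, class_counts):
    """Total volume: sum of PA * PH * F over {height: frequency} classes."""
    return sum(pixel_area_m2 * h * f for h, f in class_counts.items())

def stand_mass(volume_m3, canopy_density_kgm3):
    """Total mass (kg) = volume * CD."""
    return volume_m3 * canopy_density_kgm3

# Invented example: 0.01 m^2 pixels, a canopy class and a bare-soil class.
classes = {2.20: 900_000, 0.05: 100_000}   # height (m) -> pixel count
vol = stand_volume(0.01, classes)          # m^3 over the imaged area
mass = stand_mass(vol, 1.3)                # kg, with an assumed CD
print(vol, mass)
```

Dividing `mass` by the effective area (about 10,000 m²) would then give the per-area biomass (kg m-2) that is compared with the field samples.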
Biomass and volume were assessed in the three treatments. The stand biomass and volume values in each treatment were compared with each other and with the means calculated using the field sampling method.
The explanatory flowchart of data processing, including the acquisition of aerial images using the UAV and data analysis, is shown in Figure 3.
RESULTS AND DISCUSSION
The average values of fresh mass, plant height, crop volume, SD, FB, crop volume per hectare, and mass per hectare measured in the field samples on two sampling days are shown in Table 1.
The values of the agronomic traits (Table 1) have the same order of magnitude as those reported by Santos et al. (2010), in which the average FM of six maize hybrids cultivated for silage production in Pernambuco, Brazil, was 33.8 t ha-1, and by Guareschi et al. (2010), in which the average FM of three maize hybrids grown for spike and silage production in Goiás, Brazil, was 32.05 t ha-1.
The aerial images (1 hectare) of a maize crop are shown in Figure 4.
Georeferenced orthophotos from three 1-ha sections of a maize plantation located in Limoeiro do Norte, Ceará, Brazil, 2015. A - T1; B - T2; C - T3.
The georeferencing error values for the processing of T1, T2, and T3, corresponding to 0.15, 0.31, and 0.34 m, respectively, were below the average error of 0.8 m found by Bachmann et al. (2013), who used PhotoScan to construct orthophotos of an agricultural area and test the geographic data acquisition system of a UAV (Oktokopter, HiSystems GmbH) together with an RTK-GNSS millimeter precision system. This difference is related to the flight altitude used by Bachmann et al. (2013) (100 m): flight altitude is directly related to the spatial resolution (level of detail) of the images and to the georeferencing error. In addition, nine GCPs were used for model georeferencing in the present study, whereas Bachmann et al. (2013) used three GCPs, which increased the precision error of their digital imaging.
Average errors of plant height measurements from dense point clouds and respective standard deviations are shown in Table 2. Errors below 0.26 m were determined in the three treatments.
Average errors of plant height measurements from dense point clouds and standard deviations.
Zarco-Tejada et al. (2014) evaluated the quality of olive tree height estimates obtained with a low-cost camera. The authors found a high correlation (R² = 0.83) between point cloud data and height measurements under field conditions and an estimated error below 0.5 m, demonstrating the high accuracy of reconstructing digital elevation models using UAVs.
In the present study, the height error was determined by comparing the GNSS altitudes with the point cloud heights immediately above the GCPs. T1, which was performed using images from the Ricoh GR camera, presented the lowest errors among the treatments (Table 2), in agreement with Siebert & Teizer (2014), who estimated height accuracy in a survey using a UAV with buildings of known height as a reference and found an error of 0.025 m.
The height frequencies (Figures 5A, C, and E) and the respective probabilities of occurrence (Figures 5B, D, and F), as well as the results of the statistical analysis, are shown in Figure 5. Outliers (values with a probability of occurrence below 0.5% or above 99.5%) were excluded from the analysis to reduce the variance around the mean. The heights of the pixels in the point clouds were similar to the field values.
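The outlier rule described above (discarding heights below the 0.5th or above the 99.5th percentile) can be sketched with a simple symmetric trim; the data and the nearest-rank truncation used here are illustrative assumptions:

```python
def trim_outliers(values, tail=0.005):
    """Drop the lowest and highest `tail` fraction of values (0.5% each)."""
    s = sorted(values)
    cut = round(len(s) * tail)          # points to discard at each end
    return s[cut:len(s) - cut] if cut else s

# Illustrative data: 1000 heights; the trim removes 5 from each tail.
heights = list(range(1000))
kept = trim_outliers(heights)
print(len(kept), kept[0], kept[-1])  # 990 5 994
```

Trimming both tails before computing the mean and standard deviation reduces the influence of spurious points (e.g. mismatched tie points far above the canopy or below the soil) on the variance.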
Maize plant height frequency and probability. A and B, T1 processing; C and D, T2 processing; E and F, T3 processing.
The highest- and lowest-frequency peaks (Figures 5A, C, and E) of the most representative heights in the point clouds were considered plant heights and bare-soil elevations, respectively.
Frequency and probability values were similar across the samples. The peaks shown in Figure 5A are closer to each other and have smaller amplitudes, whereas the four peaks shown in Figure 5E are well defined (Table 3). This result may be related to the quality of the georeferencing: the more position information associated with the orthophoto mosaic, the higher the accuracy of the entire mosaic. In this study, only nine GCPs were used, and this characteristic affected the quality of the point clouds.
Errors were calculated by subtracting the actual plant heights from the heights derived from the point clouds. The height differences (ΔH) (Table 3) were compared with the average height measured in the field (2.22 m), and the results of T2 processing were the closest to the field values.
The results of the variance analysis of point cloud-derived heights in the four treatments are shown in Table 4. The results of the means test and the statistical relationship between treatments are shown in Table 5. The values were obtained from eight random samplings using the respective point clouds of each treatment.
Treatments 2 and 3 did not differ significantly (p > 0.05) from the control (treatment 4) and had a better definition in the generated model, which improved the estimation of plant height. This finding is related to the high degree of overlap produced by the fish-eye lens, which has a much wider field of view than the flat lens used in treatment 1. This flight configuration results in significant image overlap, i.e., more tie points between images. Therefore, the higher the degree of overlap of the aerial images, the more pixels are matched during the construction of the point clouds.
The crop volume and biomass results from processing point cloud data for calculating the FB of maize are shown in Table 6.
The mass values estimated by the proposed model varied between 23.58 and 31.77 tons, corroborating the studies by Santos et al. (2010) (33.8 t ha-1) and Guareschi et al. (2010) (32.05 t ha-1). The estimated biomass values were similar to those reported in the literature for maize plantations, evidencing the adequacy of the number of samples used to compose the method.
The biomass values estimated by the proposed method are shown in Table 6 and evidence the close approximation of the T3 values to the field values (2.93 kg m-2). In contrast, T1 underestimated the field biomass as a result of the poor quality of the point cloud, caused by the narrow field of view of the flat lens of the Ricoh camera. In this case, a flight plan with greater image overlap is required to increase the pattern recognition capability of the software algorithm. The image overlap achieved with the GoPro camera (T2 and T3) was greater than that of T1 because of its spherical lens, generating a denser point cloud. Therefore, T2 and T3 represented the sampled area with the highest fidelity, allowing plant height to be measured, biomass to be estimated, and the validity of this estimation to be assessed in a commercial maize crop using UAV-based aerial images.
The information on canopy volume and height inferred from the point clouds allowed biomass in the study area to be estimated consistently. In this respect, Bendig et al. (2014) estimated biomass in barley and rice plantations by correlating plant height with biomass using UAV images and concluded that plant height models were strongly correlated with plant biomass, which opens new horizons for determining harvest yield by high spatial resolution image processing.
Therefore, estimating agricultural characteristics using remote sensing methods represents a new field of research involving UAVs, and, as shown in the present study, the quality of terrain modeling is crucial in precision agriculture.
CONCLUSIONS
The proposed methodology proved to be adequate for estimating maize biomass. The methodology involving the processing and generation of a 3D terrain model using aerial images acquired with a GoPro Hero 4 Silver camera at an altitude of 30 m produced values closer to those determined using a standard (destructive) sampling method.
REFERENCES
- Alsalam B (2017) A small autonomous UAV for detection and action in precision agriculture. PhD thesis. Queensland University of Technology. Available: https://eprints.qut.edu.au/104318/ Accessed: Jun 15, 2017.
- Bachmann F, Herbst R, Gebbers R, Hafner VV (2013) Micro UAV based georeferenced orthophoto generation in VIS+NIR for precision agriculture. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 11-16. Available: http://www.intarch-photogramm-remote-sens-spatial-inf-sci.net/XL-1W2/11/2013/isprsarchives-XL-1-W2-11-2013.pdf Accessed: Dec 27, 2015.
- Bendig J, Bolten A, Bennertz S, Broscheit J, Eichfuss S, Bareth G (2014) Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sensing 6(11):10395-10412. DOI: https://doi.org/10.3390/rs61110395
- Calderón R, Navas-Cortés JA, Lucena C, Zarco-Tejada PJ (2013) High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sensing of Environment 139:231-245. DOI: https://doi.org/10.1016/j.rse.2013.07.031
- De Lara A, Longchamps L, Khosla R (2019) Soil Water Content and High-Resolution Imagery for Precision Irrigation: Maize Yield. Agronomy 9(4):174. DOI: https://doi.org/10.3390/agronomy9040174
- Fassnacht FE, Hartig F, Latifi H, Berger C, Hernández J, Corvalán P, Koch B (2014) Importance of sample size, data type and prediction method for remote sensing-based estimations of aboveground forest biomass. Remote Sensing of Environment 154:102-114. DOI: https://doi.org/10.1016/j.rse.2014.07.028
- Guareschi RF, Brasil RB, Perin A, Ribeiro JMM (2010) Produção de silagem de híbridos de milho e sorgo sem nitrogênio de cobertura em safra de verão. Pesquisa Agropecuária Tropical 40(4):541-546. DOI: https://doi.org/10.5216/pat.v40i4.6389
- Han M, Zhang H, Dejonge KC, Comas LH, Trout TJ (2016) Estimating maize water stress by standard deviation of canopy temperature in thermal imagery. Agricultural Water Management 177:400-409. DOI: https://doi.org/10.1016/j.agwat.2016.08.031
- Han L, Yang G, Dai H, Xu B, Yang H, Feng H, Li Z, Yang X (2019) Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 15(1):10. DOI: https://doi.org/10.1186/s13007-019-0394-z
- Honkavaara E, Saari H, Kaivosoja J, Pölönen I, Hakala T, Litkey P, Pesonen L (2013) Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture. Remote Sensing 5(10):5006-5039. DOI: https://doi.org/10.3390/rs5105006
- James MR, Robson S, d'Oleire-Oltmanns S, Niethammer U (2017) Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 280:51-66. DOI: https://doi.org/10.1016/j.geomorph.2016.11.021
- Mancini F, Dubbini M, Gattelli M, Stecchi F, Fabbri S, Gabbianelli G (2013) Using Unmanned Aerial Vehicles (UAV) for high-resolution reconstruction of topography: The structure from motion approach on coastal environments. Remote Sensing 5(12):6880-6898. DOI: https://doi.org/10.3390/rs5126880
- Marcial-Pablo MDJ, Gonzalez-Sanchez A, Jimenez-Jimenez SI, Ontiveros-Capurata RE, Ojeda-Bustamante W (2019) Estimation of vegetation fraction using RGB and multispectral images from UAV. International Journal of Remote Sensing 40(2):420-438. DOI: https://doi.org/10.1080/01431161.2018.1528017
- Niu Y, Zhang L, Zhang H, Han W, Peng X (2019) Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sensing 11(11):1261. DOI: https://doi.org/10.3390/rs11111261
- Piermattei L, Karel W, Wang D, Wieser M, Mokroš M, Surový P, Koren M, Tomaštík J, Pfeifer N, Hollaus M (2019) Terrestrial Structure from Motion Photogrammetry for Deriving Forest Inventory Data. Remote Sensing 11(8):950. DOI: https://doi.org/10.3390/rs11080950
- Santesteban LG, Di Gennaro SF, Herrero-Langreo A, Miranda C, Royo JB, Matese A (2017) High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard. Agricultural Water Management 183:49-59. DOI: https://doi.org/10.1016/j.agwat.2016.08.026
- Santos RD dos, Pereira LGR, Neves ALA, Azevêdo JAG, De Moraes AS, Costa CTF (2010) Características agronômicas de variedades de milho para produção de silagem. Acta Scientiarum. Animal Sciences 32(4):367-373. DOI: https://doi.org/10.4025/actascianimsci.v32i4.9299
- Shahbazi M, Sohn G, Théau J, Menard P (2015) Development and Evaluation of a UAV-Photogrammetry System for Precise 3D Environmental Modeling. Sensors 15(11):27493-27524. DOI: https://doi.org/10.3390/s151127493
- Siebert S, Teizer J (2014) Mobile 3D mapping for surveying earthwork projects using an Unmanned Aerial Vehicle (UAV) system. Automation in Construction 41:1-14. DOI: https://doi.org/10.1016/j.autcon.2014.01.004
- Swinfield T, Lindsell JA, Williams JV, Harrison RD, Gemita E, Schönlieb CB, Coomes DA (2019) Accurate Measurement of Tropical Forest Canopy Heights and Aboveground Carbon Using Structure From Motion. Remote Sensing 11(8):928. DOI: https://doi.org/10.3390/rs11080928
- T'Mannetje L (2000) Measuring biomass of grassland vegetation. In: T'Mannetje L, Jones RM (eds). Field and laboratory methods for grassland and animal production research. Wallingford, CABI Publishing, p151-178.
- Ullman S (1979) The interpretation of structure from motion. Proceedings of the Royal Society of London B: Biological Sciences 203(1153):405-426. Available: http://rspb.royalsocietypublishing.org/content/203/1153/405.short Accessed: Dec 17, 2015.
- Verhoeven G, Doneus M, Briese C, Vermeulen F (2012) Mapping by matching: a computer vision-based approach to fast and accurate georeferencing of archaeological aerial photographs. Journal of Archaeological Science 39(7):2060-2070. DOI: https://doi.org/10.1016/j.jas.2012.02.022
- Wani AA, Joshi PK, Singh O (2015) Estimating biomass and carbon mitigation of temperate coniferous forests using spectral modeling and field inventory data. Ecological Informatics 25:63-70. DOI: https://doi.org/10.1016/j.ecoinf.2014.12.003
- Zarco-Tejada PJ, Diaz-Varela R, Angileri V, Loudjani P (2014) Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods. European Journal of Agronomy 55:89-99. DOI: https://doi.org/10.1016/j.eja.2014.01.004
Publication Dates
- Publication in this collection: 09 Dec 2019
- Date of issue: Nov-Dec 2019
History
- Received: 18 Sept 2016
- Accepted: 01 Oct 2019