ESTIMATING VEGETATION VOLUME OF COFFEE CROPS USING IMAGES FROM UNMANNED AERIAL VEHICLES

ABSTRACT

Tree crops, such as Arabica coffee (Coffea arabica L.), present enormous technical challenges in terms of pesticide application. The correct deposition and distribution of the active ingredient throughout the aerial part of these plants depend on knowledge of the canopy volume, but manually determining this volume is time consuming and imprecise. The objectives of this study were to develop a method to determine the vegetation volume of coffee crops from digital images captured by a camera onboard unmanned aerial vehicles and to compare this approach with traditional vegetation volume estimation (the tree row volume (TRV) method). Manual measurements of the canopy volume in four coffee cultivation areas were compared with data obtained using the method presented in this paper. It was concluded that the vegetation volume of coffee trees, a highly important variable in defining pesticide application techniques (in addition to other uses), could be determined in a practical and precise way by digitally processing the images captured by unmanned aerial vehicles. The method is fast and permits the assessment of large areas. Furthermore, estimates based on this method and the traditional TRV method were not significantly different.

KEYWORDS unmanned aircraft system; digital image processing; canopy volume

INTRODUCTION

Agriculture is increasingly linked to information technology and automation, and in this context, the unmanned aerial vehicle (UAV) is a tool with many applications (Gómez-Candón et al., 2014). UAVs are aircraft that take off from the ground and fly without carrying a human pilot, being operated instead by remote or autonomous control.

Although the UAV is a technology that has been under development for many years, it long went unnoticed in the agricultural sphere (Ballesteros et al., 2014). Jorge & Inamasu (2014) conducted a literature review on the use of UAVs in precision agriculture and mention that Przybilla & Wester-Ebbinghaus (1979) performed the first experiments using a UAV for photogrammetry. Since then, the UAV has proven effective for many uses, including crop inspection, weed detection, identification of planting failures, target volumetrics, normalized difference vegetation index (NDVI) mapping, vegetation health assessment, spot detection in cultivated areas (Peña et al., 2013; Castaldi et al., 2017), and digital terrain model analysis to define contour lines. The combination of photogrammetry with UAVs appears to be a viable alternative for various agricultural applications (Vega et al., 2015; Khot et al., 2016; Hunt et al., 2018), and research on the use of UAVs for precision agriculture has increased considerably in recent years (Peña et al., 2013; Ballesteros et al., 2014; Torres-Sánchez et al., 2014; López-Granados et al., 2015).

Pesticide use is very important in commercial-scale agriculture because a crop can only fully express its genetic potential if pests, diseases, and weeds are effectively controlled. Tree crops, such as Arabica coffee (Coffea arabica L.), pose enormous technical challenges in terms of pesticide application. The correct deposition and distribution of the active ingredient throughout the aerial part of a plant depends on several factors, such as plant size, shape and planting density.

The pesticide application rate must be adjusted to allow adequate wetting of the plant while minimizing losses through falling droplets that enter the soil. However, information on the application rate of most products that are recommended for foliar spraying is imprecise or vague, and not adjusted for different crop stages.

An alternative for improving pesticide application to tree crops is the use of the tree row volume (TRV) method, developed by Byers et al. (1971). This method has shown good results when used to calibrate sprayers for pesticide application to fruiting trees (Sutton & Unrath, 1988; Rüegg et al., 1999; Siegfried et al., 2007).

The TRV method is based on measuring the canopy of the trees and then determining the application rate for each situation based on a volume index, and it has been successfully applied in fruit cultivation throughout Europe. In orchards with continuous tree rows, we have [eq. (1)]:

(1) TRV = (H × L × 10000) / D

In which,

TRV - volume of trees, m3 ha-1;

H - height, m;

L - width, m, and

D - distance between rows, m.

If the recommended spray volume per unit vegetation volume (VI, L per 1,000 m3) is known, the rate of application (Ra, L ha-1) is expressed by [eq. (2)]:

(2) Ra = (TRV × VI) / 1000
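As a sketch, the two equations above can be combined into a short routine. The function names and example values below are illustrative, not taken from the study; the division by 1,000 simply accounts for VI being expressed per 1,000 m3 of vegetation.

```python
# Illustrative sketch of eqs (1) and (2); names and values are assumptions.

def tree_row_volume(height_m: float, width_m: float, row_spacing_m: float) -> float:
    """TRV in m3 ha-1 for continuous tree rows (eq. 1)."""
    return height_m * width_m * 10_000 / row_spacing_m

def application_rate(trv_m3_ha: float, vi_l_per_1000_m3: float) -> float:
    """Application rate Ra in L ha-1 (eq. 2); dividing by 1,000 converts
    VI from its per-1,000 m3 basis."""
    return trv_m3_ha * vi_l_per_1000_m3 / 1_000

# A 2.2 m tall, 1.8 m wide canopy with 3.5 m row spacing:
trv = tree_row_volume(2.2, 1.8, 3.5)   # about 11,314 m3 ha-1
ra = application_rate(trv, 50.0)       # about 566 L ha-1 for VI = 50
```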

The TRV method seems complicated and time consuming to farmers, which has limited its adoption. Traditionally, the vegetation volume has been determined from manual measurements in the field that, in addition to being laborious, do not always yield satisfactory results due to poor accuracy and low sampling rates (Escolà et al., 2017; Torres-Sánchez et al., 2018).

Nevertheless, the vegetation volume of the various orchards on a farm can be calculated using aerial images captured by UAVs, which is a simpler and less costly approach than, for example, using LiDAR (Light Detection And Ranging) sensors, which measure the distance to a target by illuminating it with pulsed laser light and measuring the reflected pulses. The estimated volume can be used to determine a suitable pesticide application rate for each orchard, resulting in more efficient and environmentally safe cultivation. This practice reduces the ecological impact of agriculture through the efficient use of resources such as water, fertilizer and pesticides (Burkart et al., 2018).

Because UAVs are a new technology in precision agriculture, baselines for routine image collection must be established before comparisons can be made with established methodologies (Hunt et al., 2018). The objectives of this study were to develop a method to estimate the vegetation volume (TRV) from digital images captured by UAVs of coffee crops and to compare these estimates with traditional vegetation volume estimation.

MATERIAL AND METHODS

Field trials were carried out in the coffee growing sector of the Glória Experimental Farm of the Federal University of Uberlândia (UFU) in the municipality of Uberlândia, Minas Gerais, Brazil. Different coffee cultivation areas, characterized by different crop stages under different management conditions (Figure 1), were used to carry out this study. The terrain is nearly level (slope under 3%). An example of the area is shown in Figure 2.

FIGURE 1
Aerial photo (A) and localization (B) of the study site at Gloria Experimental Farm on the UFU campus.
FIGURE 2
Example of one evaluated area at Gloria Experimental Farm on the UFU campus.

The coffee canopy volume was estimated via two methods, manually and using images collected by UAV, in four coffee plantation areas. Spacing between rows was 3.5 m, and spacing between plants was 0.7 m. Four coffee plantation areas with TRVs ranging from 5,000 to 15,000 m3 ha-1 were selected, according to the year of planting and pruning management, to verify the adequacy of the methods under different conditions. Each selected area was 50 m long and 7 m wide (2 rows).

Using the methodology adapted from Favarin et al. (2002) and Castro et al. (2018), manual estimation was performed with 20 randomly selected individual plants from each orchard (four areas). The height (H) of the plants; the widths of the lower (Li), middle (Lm), and upper (Lu) thirds of the canopy; and the spacing between planting rows (D) were measured (Figure 3). The tree width was taken as the average of the widths of the lower, middle, and upper thirds of the canopy, and the tree volume was then calculated in m3 ha-1 using [eq. (1)].
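The manual estimation above can be sketched as follows; the function names are ours and the measurement values are hypothetical, not the study's data.

```python
# Sketch of the manual TRV estimation (methodology adapted from Favarin
# et al., 2002, and Castro et al., 2018). All numbers are hypothetical.

def mean_canopy_width(li_m: float, lm_m: float, lu_m: float) -> float:
    """Tree width as the average of the lower, middle and upper thirds."""
    return (li_m + lm_m + lu_m) / 3

def trv_manual(h_m: float, li_m: float, lm_m: float, lu_m: float, d_m: float) -> float:
    """TRV (m3 ha-1) from one plant's measurements via eq. (1)."""
    return h_m * mean_canopy_width(li_m, lm_m, lu_m) * 10_000 / d_m

# One plant: H = 2.0 m, thirds of 1.2, 1.8 and 0.9 m, rows 3.5 m apart.
# In the study, such values were averaged over 20 plants per area.
print(round(trv_manual(2.0, 1.2, 1.8, 0.9, 3.5), 1))  # prints 7428.6
```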

FIGURE 3
Parameters used to calculate the vegetation volume using manual estimation (H: height of the plants; Li, Lm, and Lu: widths of the lower, middle, and upper thirds of the canopy; D: spacing between planting rows).

The second estimation method was performed by digitally processing the aerial images. The UAV model was a DJI Phantom 4 quadcopter (DJI, Shenzhen, China), which was chosen because of its stabilized flight feature, generation of georeferenced images, manual and/or automatic flight modes, high-efficiency motor assembly, and a 5,350 mAh battery that enables an autonomous flight time of approximately 28 minutes under ideal flying conditions (low wind speed). Its camera system (model FC330, DJI, Shenzhen, China) features a 3-axis (x, y and z) image stabilizer, 4K video capture at 30 frames per second, full 1080p HD video capture at 120 frames per second, an aspherical lens with a 94° field of view (FOV), and a 1/2.3” CMOS sensor that delivers high image quality. The radio control operates at a frequency of 2.4 GHz, allowing flights of approximately 3.5 km of horizontal distance without interference (in the absence of obstacles). The vertical distance is limited to 120 m by current national legislation. The equipment was operated with an octa-core smartphone (1.5 GHz) with 16 GB of memory running Android OS version 6.0.1.

First, to carry out the flight plan, the coordinates of the study site were collected, and a polygon was created in Google Earth Pro (Google, Mountain View, USA) that delimited the flyover region for the UAV. Next, the polygon was saved in KML format and sent to DroneDeploy software (DroneDeploy, San Francisco, USA) installed on a smartphone (Figure 4). To finalize the flight plan, some parameters in the software had to be defined, such as flight altitude (60 m), flight direction, forward (80%) and lateral (80%) image overlap (to generate quality models, an overlap of not less than 60% is recommended by Wolf & Dewitt (2000)) and the maximum flight speed (11 m s-1). After inputting the parameters, a report was generated containing information about the flight, such as the area to be flown, ground sample distance (GSD, or pixel size in cm), and the number of batteries needed.
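The GSD reported in the flight report can be approximated from standard photogrammetric relations. In the sketch below, the sensor width (6.17 mm), focal length (3.61 mm) and image width (4000 px) are nominal values we assume for the FC330 camera, not figures from the study.

```python
# Approximate ground sample distance (GSD) from flight altitude.
# Sensor parameters are assumed nominal values for the FC330 camera.

def gsd_cm(altitude_m: float, sensor_width_mm: float = 6.17,
           focal_length_mm: float = 3.61, image_width_px: int = 4000) -> float:
    """GSD in cm per pixel for a nadir-pointing camera."""
    return (altitude_m * 100 * sensor_width_mm) / (focal_length_mm * image_width_px)

print(round(gsd_cm(60), 2))  # prints 2.56 (cm per pixel at 60 m altitude)
```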

FIGURE 4
Polygon example of the autonomous flight path illustrated in DroneDeploy software.

The software PIX4D Mapper 3.2.23 (PIX4D, Lausanne, Switzerland) was used to process the georeferenced images obtained by the UAV on a desktop computer. The image processing was performed in three steps: 1 - initial processing (Figure 5), 2 - point cloud and mesh generation, and 3 - red-green-blue (RGB) orthomosaic, digital surface model (DSM) (Figure 6), and index generation.

FIGURE 5
Alignment of the photos for image processing and model generation (for volume estimation) in the PIX4D Mapper software.
FIGURE 6
RGB Orthomosaic (A) and the corresponding sparse Digital Surface Model (DSM) (B).

Initially, the obtained images were selected and aligned in the sequence in which they were captured and then calibrated using the standard method of the software, based on the parameters and the density of the point cloud, with the image resolution scale in multiple scale mode (½ resolution). The point cloud density was optimized by requiring each point to be re-projected in at least three photos. The spatial resolution of the orthophoto was set automatically, varying according to its GSD value.

Subsequently, the volume of the targets in the selected areas of each plot (Figure 7) was measured using the PIX4D Mapper software. All the plants inside the area were considered. It was not possible to individualize the plants because there was no space between them that would allow the software to perform the separation, so the calculation was performed for the entire row. The software has a specific computational routine for estimating the volume of the target, which consists of five steps:

  1. In the Menus bar, select the View, Volumes and New Volume functions.

  2. In 3D view, mark the vertices of the surface of the target base whose volume you want to measure.

  3. Finalize the vertex to create the volume base.

  4. For greater accuracy, correct the positioning of the vertices to the position of interest.

  5. Finally, process the volume calculation with the Compute function.

FIGURE 7
Study plots at Gloria Experimental Farm on the UFU campus.

The volume is computed using the DSM (Digital Surface Model); to draw a new volume, the point cloud and the DSM must first be generated. Pix4Dmapper creates the base by accounting for the altitude of each vertex. The base surface is parallel to the XY plane, with an altitude equal to the highest altitude of all vertices. The software projects a grid with GSD spacing onto the base, and for each cell of the grid, the volume is computed as the product of the cell's length, width, and height. The length and width are equal to the project's GSD.
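The cell-wise computation described above can be sketched with NumPy. The DSM patch is synthetic, and clipping cells below the base to zero is our assumption for illustration, not documented Pix4D behaviour.

```python
import numpy as np

# Sketch of the grid-based volume: each cell contributes
# GSD x GSD x (DSM altitude - base altitude). Clipping negative
# heights to zero is an assumption of this sketch.

def volume_from_dsm(dsm: np.ndarray, base_altitude_m: float, gsd_m: float) -> float:
    """Approximate volume (m3) of a DSM patch above a flat base surface."""
    heights = np.clip(dsm - base_altitude_m, 0.0, None)  # per-cell height, m
    return float(heights.sum() * gsd_m * gsd_m)          # sum of cell volumes

# Synthetic 3 x 3 DSM patch (altitudes in m) with a 0.1 m GSD:
patch = np.array([[101.0, 101.5, 101.0],
                  [101.5, 102.0, 101.5],
                  [101.0, 101.5, 101.0]])
print(round(volume_from_dsm(patch, 100.0, 0.1), 3))  # prints 0.12
```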

To compare the adequacy of the TRV (vegetation volume) values measured using the aerial image of each plot with the values measured manually in the field, a One-Sample t-Test (p≤0.05) was performed. This test determines whether the sample mean is statistically different from a known or hypothesized population mean. Winter (2013) showed that the t-test is feasible and robust even for small sample sizes.
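As a sketch, this comparison can be reproduced with a hand-rolled one-sample t statistic; the TRV values and the manual mean below are hypothetical, not the study's data.

```python
import math
from statistics import mean, stdev

# One-sample t statistic (df = n - 1), used to check whether
# DIP-derived TRV values differ from a manually measured mean.

def one_sample_t(sample: list[float], popmean: float) -> float:
    n = len(sample)
    return (mean(sample) - popmean) / (stdev(sample) / math.sqrt(n))

dip_trv = [9800, 10150, 9900, 10300, 10050]  # m3 ha-1, hypothetical
t = one_sample_t(dip_trv, 10000.0)           # hypothetical manual mean

# |t| below the two-tailed critical value t(0.05, df = 4) ≈ 2.776
# means the null hypothesis of equal means is not rejected.
print(abs(t) < 2.776)  # prints True
```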

RESULTS AND DISCUSSION

The tested statistical hypothesis was that the estimates of the coffee vegetation volume produced by the two methods would be the same. The results of the One-Sample t-Test were not significant (Table 1), indicating no significant difference between the two methods of determining vegetation volume and, consequently, supporting the estimating ability of the digital image processing method.

TABLE 1
Results of the One-Sample t-Test (p≤0.05) comparing the vegetation volumes estimated manually and with digital image processing (DIP) for each of the plots evaluated in the field.

In coffee crops, to the best of the authors' knowledge, no previous investigations have reported assessments of the vegetation volume from UAV-based crop surface models. Burkart et al. (2018) analysed a field trial with two barley cultivars and two contrasting sowing densities in a random plot design, over two consecutive years, using aerial images from 28 flight campaigns. They concluded that aerial images collected by UAV can be used to provide quantitative data in crop management and precision agriculture. The objective of their work was different from ours, but it showed that it is possible to use aerial images to measure crop phenotypes. Guerra-Hernández et al. (2016) showed that a UAV equipped with low-cost consumer-grade cameras has the potential to provide large amounts of information for forest variable mapping and might be used for surveying spatial variations in growth or estimating yield differences. In another study, Hunt Jr. et al. (2018) concluded that small UAV platforms and sensors may not provide value to farmers for in-season nitrogen management. Therefore, it is necessary to establish baselines for routine UAV-image collection before comparisons with established technologies can be undertaken.

Castro et al. (2018) compared vine heights using manual measurements and a UAV image method. They found a good fit between estimated and measured heights. In a previous investigation, Burgos et al. (2015) also obtained similar results using image-based UAV technology for vine height detection, although an exhaustive validation was not conducted. However, Matese et al. (2017) reported a significant discordance (0.50 m) between the actual and estimated vine heights from UAV-based crop surface models due to a low-resolution sensor (1.3 MP), which caused a smoothing effect in the digital surface model generation.

In coffee production areas, the canopy is not uniform and can even vary within a single field. Determining the vegetation volume with the manual method over extensive areas becomes costly, requiring a longer execution time and possibly generating inaccurate data. When one collects data manually in the field, plants are chosen randomly, and the number may not be representative of the dimensions of the plot. With the digital image processing method, the sample size can vary from some plants to all the plants in the plot, resulting in more accurate data. In this way, some researchers have achieved good results in terms of vegetation detection using images from UAVs (Castaldi et al., 2017; Castro et al., 2018).

In this work, forward and lateral image overlaps were 80%. Future work can evaluate other values in coffee crops to improve accuracy. Using 60% lateral overlap allowed researchers to achieve high accuracy in the geometrical characterization of woody crops (Torres-Sánchez et al., 2015). In regard to forward overlap, Torres-Sánchez et al. (2018) suggested 95% flying at 100 m altitude. Using a large number of images results in long processing times, while using fewer images results in lower accuracy.

The canopy volume of tree crops can be determined with the aid of UAVs to optimize pesticide application at variable rates, thereby reducing the environmental impact caused by losses via runoff and increasing the operational capacity of agricultural machinery. Prior to pesticide application, a farmer can fly a UAV over the area to be sprayed, determine the vegetation volume of the target area(s), and calculate the appropriate application rates based on the specific volume indices for each crop, enabling the spraying equipment to be correctly adjusted. Terrestrial laser scanners have potential for estimating vegetation volume (Escolà et al., 2017) more precisely and provide other useful information, such as the porosity of trees, but this approach requires more complex equipment and methodology.

It is important to note that the area evaluated in the present study was flat. In the case of sloping terrain, it would be necessary to program the software so that the drone always flies at the same height above the ground, which would minimize possible distortions in the images. Moreover, it is important to highlight that the use of ground control points surveyed with a GNSS receiver could improve the accuracy of the measurements. Finally, more studies are necessary to verify the applicability of the methodology to areas with different characteristics, pruning regimes and canopy porosities.

CONCLUSIONS

It is possible to determine coffee vegetation volume by digitally processing images captured by UAVs. This method is fast and permits the assessment of large areas. Furthermore, its results do not present statistically significant differences from those obtained with the traditional TRV method.

ACKNOWLEDGEMENTS

This study was supported partially by the National Council of Scientific and Technological Development (CNPq, Brazil, 304353/2017-5) and the Research Foundation of the State of Minas Gerais (Fapemig, Brazil, PPM-00085-16).

REFERENCES

  • Ballesteros R, Ortega JF, Hernández D, Moreno MA (2014) Applications of georeferenced high resolution images obtained with unmanned aerial vehicles. Part II: Application to maize and onion crops of a semi-arid region in Spain. Precision Agriculture 15(6):593-614. DOI: http://dx.doi.org/10.1007/s11119-014-9357-6
    » http://dx.doi.org/10.1007/s11119-014-9357-6
  • Byers RE, Hickey KD, Hill CH (1971) Base gallonage per acre. Virginia Fruit 60:19-23.
  • Burgos S, Mota M, Noll D, Cannelle B (2015) Use of very high-resolution airborne images to analyse 3D canopy architecture of a vineyard. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Science 40:399-403. DOI: http://dx.doi.org/10.5194/isprsarchives-XL-3-W3-399-2015
    » http://dx.doi.org/10.5194/isprsarchives-XL-3-W3-399-2015
  • Burkart A, Hecht VL, Kraska T, Rascher U (2018) Phenological analysis of unmanned aerial vehicle based time series of barley imagery with high temporal resolution. Precision Agriculture 19(1): 134-146. DOI: https://doi.org/10.1007/s11119-017-9504-y
    » https://doi.org/10.1007/s11119-017-9504-y
  • Castaldi F, Pelosi F, Pascucci S, Casa R (2017) Assessing the potential of images from unmanned aerial vehicles (UAV) to support herbicide patch spraying in maize. Precision Agriculture 18(1):76-94. DOI: https://doi.org/10.1007/s11119-016-9468-3
    » https://doi.org/10.1007/s11119-016-9468-3
  • Castro AI, Jiménez-Brenes FM, Torres-Sánchez J, Peña JM, Borra-Serrano I, López-Granados FL (2018) 3-D characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture applications. Remote Sensing 10(4):584. DOI: https://doi.org/10.3390/rs10040584
    » https://doi.org/10.3390/rs10040584
  • Escolà A, Martínez-Casasnovas JA, Rufat J, Arnó J, Arbonés A, Sebé F, Pascual M, Gregorio E, Rosell-Polo JR (2017) Mobile terrestrial laser scanner applications in precision fruticulture/horticulture and tools to extract information from canopy point clouds. Precision Agriculture 18(1):111-132. DOI: https://doi.org/10.1007/s11119-016-9474-5
    » https://doi.org/10.1007/s11119-016-9474-5
  • Favarin JL, Neto DD, García AG, Villa-Nova NA, Favarin MGGV (2002) Equações para estimativa do índice de área foliar do cafeeiro. Pesquisa Agropecuária Brasileira 37(6):769-773.
  • Gómez-Candón D, Castro AI, López-Granados FL (2014) Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precision Agriculture 15(1):44-56. DOI: https://doi.org/10.1007/s11119-013-9335-4
    » https://doi.org/10.1007/s11119-013-9335-4
  • Guerra-Hernández J, González-Ferreiro E, Sarmento A, Silva J, Nunes A, Correia AC, Fontes L, Tomé M, Díaz-Varela R (2016) Using high resolution UAV imagery to estimate tree variables in Pinus pinea plantation in Portugal. Forest Systems 25(2):1-5. DOI: http://dx.doi.org/10.5424/fs/2016252-08895
    » http://dx.doi.org/10.5424/fs/2016252-08895
  • Hunt Jr. ER, Horneck DA, Spinelli CB, Turner RW, Bruce AE, Gadler DJ, Brungardt JJ, Hamm PB (2018) Monitoring nitrogen status of potatoes using small unmanned aerial vehicles. Precision Agriculture 19(2):314-333. DOI: https://doi.org/10.1007/s11119-017-9518-5
    » https://doi.org/10.1007/s11119-017-9518-5
  • Jorge LA, Inamasu RY (2014) Uso de veículos aéreos não tripulados (VANT) em agricultura de precisão. In: Bernardi ACC, Naime JM, Resende AV, Bassol LH, Inamasu RY (ed). Agricultura de precisão: resultados de um novo olhar. Brasília, Embrapa, p109-134.
  • Khot LR, Sankaran S, Carter AH, Johnson DA, Cummings TF (2016) UAS imaging-based decision tools for arid winter wheat and irrigated potato production management. International Journal of Remote Sensing 37(1): 125-137. DOI: https://doi.org/10.1080/01431161.2015.1117685
    » https://doi.org/10.1080/01431161.2015.1117685
  • López-Granados F, Torres-Sánchez J, Serrano-Pérez A, de Castro AI, Mesas-Carrascosa FJ, Peña-Barragán J (2015) Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds. Precision Agriculture 17(2):183-199. DOI: https://doi.org/10.1007/s11119-015-9415-8
    » https://doi.org/10.1007/s11119-015-9415-8
  • Matese A, Gennaro SFD, Berton A (2017) Assessment of a canopy height model (CHM) in a vineyard using UAV-based multispectral imaging. International Journal of Remote Sensing 38(8):2150-2160. DOI: https://doi.org/10.1080/01431161.2016.1226002
    » https://doi.org/10.1080/01431161.2016.1226002
  • Peña JM, Torres-Sánchez J, de Castro AI, Kelly M, López-Granados F (2013) Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. Plos One 8(10):e77151. DOI: https://doi.org/10.1371/journal.pone.0077151
    » https://doi.org/10.1371/journal.pone.0077151
  • Przybilla HJ, Wester-Ebbinghaus W (1979) Bildflug mit ferngelenktem Kleinflugzeug. Bildmessung und Luftbildwesen 47(5):137-142.
  • Rüegg J, Viret O, Raisigl U (1999) Adaptation of spray dosage in stone-fruit orchards on the basis of the tree row volume. EPPO Bulletin 29(1):103-110.
  • Siegfried W, Viret O, Huber B, Wohlhauser R (2007) Dosage of plant protection products adapted to leaf area index in viticulture. Crop Protection 26(2):73-82. DOI: https://doi.org/10.1016/j.cropro.2006.04.002
    » https://doi.org/10.1016/j.cropro.2006.04.002
  • Sutton TB, Unrath CR (1988) Evaluation of the tree-row-volume model for full season pesticide application on apples. Plant Disease 72(7):629-632. DOI: https://doi.org/10.1094/PD-72-0629
    » https://doi.org/10.1094/PD-72-0629
  • Torres-Sánchez J, López-Granados F, Borra-Serrano I, Peña JM (2018) Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precision Agriculture 19(1): 115-133. DOI: https://doi.org/10.1007/s11119-017-9502-0
    » https://doi.org/10.1007/s11119-017-9502-0
  • Torres-Sánchez J, Peña JM, de Castro AI, López-Granados F (2014) Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Computers and Electronics in Agriculture 103:104-113. DOI: https://doi.org/10.1016/j.compag.2014.02.009
    » https://doi.org/10.1016/j.compag.2014.02.009
  • Torres-Sánchez J, López-Granados F, Serrano N, Arquero O, Peña JM (2015) High-throughput 3-D monitoring of agricultural-tree plantations with unmanned aerial vehicle (UAV) technology. PLoS One 10(6):e0130479. DOI: https://doi.org/10.1371/journal.pone.0130479
    » https://doi.org/10.1371/journal.pone.0130479
  • Vega FA, Ramírez FC, Saiz MP, Rosúa FO (2015) Multitemporal imaging using an unmanned aerial vehicle for monitoring a sunflower crop. Biosystems Engineering 132:19-27. DOI: https://doi.org/10.1016/j.biosystemseng.2015.01.008
    » https://doi.org/10.1016/j.biosystemseng.2015.01.008
  • Winter JCF (2013) Using the Student's t-test with extremely small sample sizes. Practical Assessment, Research & Evaluation 18(10):1-12.
  • Wolf PR, Dewitt BA (2000) Elements of Photogrammetry: with applications in GIS (Vol. 3). New York, McGraw-Hill, 624.

Edited by

  • Area Editor: Fabio Henrique Rojo Baio

Publication Dates

  • Publication in this collection
    09 Sept 2019
  • Date of issue
    Sept 2019

History

  • Received
    21 Feb 2019
  • Accepted
    03 June 2019
Associação Brasileira de Engenharia Agrícola - SBEA, Departamento de Engenharia - FCAV/UNESP, Via de Ac. Prof. Paulo Donato Castellane, KM 05, CEP: 14884-900, Phone: +55 (16) 3209-7619 - Jaboticabal - SP - Brazil
E-mail: revistasbea@sbea.org.br