Mapping Mosaic Virus in Sugarcane Based on Hyperspectral Images

Érika Akemi Saito Moriya, Nilton Nobuhiro Imai, Antonio Maria Garcia Tommaselli, and Gabriela Takahashi Miyoshi

Abstract—The aim of this research was to develop a methodology involving aerial surveying with an unmanned aerial system (UAS) and the processing and analysis of images obtained by a hyperspectral camera, achieving results that enable the discrimination and recognition of sugarcane plants infected with mosaic virus. It was necessary to characterize the spectral response of healthy and infected sugarcane plants in order to define the correct mode of operation for the hyperspectral camera, which provides many spectral band options for imaging but limits each image to 25 spectral bands. Spectral measurements of the leaves of infected and healthy sugarcane taken with a spectroradiometer were used to produce a spectral library. Once the most appropriate spectral bands had been selected, it was possible to configure the camera and carry out the aerial survey. The empirical line approach was adopted to obtain hemispherical conical reflectance factor values, with a radiometric block adjustment used to produce a mosaic suitable for the analysis. A classification based on spectral information divergence was applied, and the results were evaluated with the Kappa statistic. Areas of sugarcane infected with mosaic were identified from these hyperspectral images acquired by UAS, and the results obtained had a high degree of accuracy.

Index Terms—Phytosanitation, precision agriculture, unmanned aerial system (UAS).

Manuscript received March 7, 2016; revised June 15, 2016 and November 14, 2016; accepted November 20, 2016. Date of publication December 19, 2016; date of current version January 23, 2017. This work was supported in part by the Fundação de Amparo à Pesquisa de São Paulo (FAPESP) under Grant 2013/50426-4, in part by the Coordination for the Improvement of Higher Education Personnel (CAPES), and in part by the National Council for Scientific and Technological Development (CNPq), through a Doctoral Scholarship and a Master's Degree Scholarship. (Corresponding author: Dr. Érika Akemi Saito Moriya.)

É. A. S. Moriya and G. T. Miyoshi are with the School of Technology and Sciences, São Paulo State University, Presidente Prudente 19060-900, Brazil (e-mail: erikaasaito@gmail.com; takahashi.gabi@gmail.com).

N. N. Imai and A. M. G. Tommaselli are with the Department of Cartography, School of Technology and Sciences, São Paulo State University, Presidente Prudente 19060-900, Brazil (e-mail: nimai@fct.unesp.br; tomaseli@fct.unesp.br).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

Digital Object Identifier 10.1109/JSTARS.2016.2635482

I. INTRODUCTION

TECHNOLOGICAL innovations in precision agriculture have contributed to agronomic development by automating and optimizing processes involved in agriculture, leading to benefits for production and for the environment [1]. It is well known that agriculture is very dynamic, requiring frequent monitoring. Products stemming from remote sensing and photogrammetry can provide accurate information to the farmer in quasi-real time [2]; hence, technologies and products based on these techniques have been widely used in precision farming.
According to Mulla [3], there has been significant interest in remote sensing applications for evaluating growth and stress in crops. In the particular case of sugarcane in the State of São Paulo, Brazil, the use of satellite images is among the leading precision farming technologies adopted by sugarcane plantations (76%), followed by the use of automatic pilot (39%) and aerial images (33%) [4].

Hyperspectral remote sensing measurements are taken over a broad spectral range but in narrow spectral bands. In this way, hyperspectral imaging has revolutionized the ability to distinguish multiple characteristics of agricultural crops, including nutrients, water, pests, diseases, weeds, biomass, and canopy structure [3]. In order to characterize the phenological stages of barley, Lausch et al. [5] used vegetation indices, with parameters related to phenology, derived from hyperspectral images. The authors argue that the use of hyperspectral images (or data cubes) is important for recording plant development, enabling quantification of the changes occurring at each stage, which is fundamental in ecological modeling.

Hyperspectral data obtained in the field or laboratory with a spectroradiometer have also been used for the diagnosis of agricultural diseases, with methodologies based on the spectral properties of the plants to distinguish diseased plants [6]–[8]. The identification of diseased plants is important for monitoring the phytosanitary state of crops and for defining appropriate management to control the spread of the disease. The combination of aerial multispectral and hyperspectral images with spectral measurements taken in the field and laboratory provides even more refined data to identify changes resulting from diseases in agriculture, as in the case of citriculture reported by Li et al. [9], who identified areas affected by greening, or Huanglongbing (HLB), a bacterial disease affecting citrus plants.

Sensor systems with greater flexibility, allowing easy access to the site, are required for many applications [10]. Airborne sensors offer greater flexibility than satellite platforms in mission planning. They also offer higher spatial resolution, allowing better detail of the ground target [11].

A useful technological tool is unmanned aerial system (UAS) based aerial remote sensing, which can fill the gap between manned aircraft and terrestrial measurement [10]. Images from UASs generally have a high spatial resolution, at the centimeter level, with the potential of avoiding cloud cover [12]. Furthermore, multispectral and hyperspectral images can enable accurate spectral analysis provided they are properly radiometrically calibrated.

Brazil is the world leader in producing and exporting various agricultural products and is the leading producer of sugarcane [13]. More than 216 diseases affecting sugarcane crops had been identified by 1994, of which 58 are found in Brazil. At least ten, which can be caused by bacteria, fungi, or viruses, are economically significant for farmers [14]. The most important diseases are usually controlled by the use of resistant varieties.
However, these diseases are caused by living organisms, able to adapt to the environment to ensure the survival of their species, generating new infective organisms with greater genetic variability and greater resistance, linked to factors such as the production environment. A new contamination can therefore occur, resulting in an epidemic [14].

Mosaic was the main disease of viral origin in Brazilian sugarcane cultivation [15]. A mosaic epidemic hampered the sugarcane industry in the State of São Paulo between 1922 and 1930, resulting in a 93% reduction in sugarcane production and a 90% reduction in alcohol production [16]. The virus causing the disease occurs in grasses such as maize (Zea mays) and sorghum (Sorghum bicolor), and in several weeds [15]. Transmission of the virus occurs through the use of contaminated sugarcane seedlings and through insect vectors, aphids that bite the plant and transmit the virus [16].

One suitable approach for disease monitoring in sugarcane is based on hyperspectral images taken from a UAS. However, a major challenge for this technique is the generation of an orthophoto mosaic free of the effects caused by the bidirectional reflectance distribution function (BRDF), with pixels representing radiometric values that can be compared to the signatures of a spectral library. The Rikola hyperspectral camera is very versatile and can be configured to take image cubes with 25 bands, each with a bandwidth of approximately 15 nm, interactively selected from a large number of wavelengths.

In this paper, it is assumed that the spectral characterization of healthy sugarcane and of sugarcane infected by disease can be compared with the signatures of a spectral library to classify images using 25 spectral bands. For this reason, it is necessary that the wavelengths acquired by the camera characterize the spectral regions where the differences are most evident. To achieve this, an image mosaic representing a reflectance factor free of the effects caused by the geometry of illumination and acquisition must be produced.

The sugarcane plant has long and narrow leaves that are usually upright. Changes in the spectral response caused by diseases manifest themselves throughout each leaf, but the crop canopy shows only a small projection of these leaves in vertical images. Nevertheless, we assumed in this paper that a well-corrected radiometric mosaic of 25-wavelength image cubes can be compared to the spectral signatures of sample leaves measured in the laboratory or in the field. The major challenge is to produce a reliable spectral reflectance factor to be analyzed.

This paper presents an innovative approach to monitoring sugarcane plantations, which involves the selection of spectral bands to determine the configuration of the hyperspectral camera used for aerial surveying, and the processing and analysis of the images obtained by the UAS sensor. The final product is a map of infected and healthy sugarcane produced by applying a classifier that adopts signatures from a spectral library. An accurate, reliable, and cost-effective solution for sugarcane monitoring, such as the technique presented in this paper, is highly relevant considering its economic and social impact, because its widespread application will improve productivity while reducing environmental impacts.

Sugarcane has great economic, social, and environmental importance to Brazil because of its use in the production of various products (food, biofuels, bioenergy) and its role in job creation.
Concern over the scarcity of fossil fuels encourages the production and use of renewable fuels such as ethanol, produced from sugarcane, and these factors are contributing to the expansion of the sugarcane crop. However, it is essential that this expansion be oriented toward sustainable standards of agricultural production, meeting criteria aimed at reducing the environmental impact.

This paper presents a relevant contribution to the development of a precision agriculture methodology for the detection and mapping of disease in sugarcane with high spatial and spectral detail, with a specific application in sugarcane health monitoring. The map of areas with phytosanitary problems in sugarcane provides crucial information for production, planning, and management. This kind of map makes it possible to perform localized application of pesticides, contributing to lower environmental impact and production cost.

II. METHODOLOGY

A. Field Procedure

The study area is located in the municipality of Euclides da Cunha Paulista, in the interior of the State of São Paulo. The area planted with sugarcane in the municipality amounts to 4045 hectares (40.45 km²) [17]. The selected sugarcane area is near the "Ponte Branca" farm belonging to the Ênio Pipino Foundation and is located at 22°23'51.21"S, 52°31'3.90"W in the WGS84 system [18]. These sugarcane plantations belong to Alcídia, a Brazilian sugarcane company.

Weather conditions on the day of the flight (05/19/2015) were moderately cloudy, with a temperature around 19.6 °C, relative humidity of 83%, atmospheric pressure of 985.3 hPa, wind speed of 0.1 m/s, and radiation of 472.9 kJ/m². These data were collected from the Paranapoema-PR automatic weather observation station [19].

Fifteen ethylene vinyl acetate (EVA) targets, with dimensions of 0.80 m × 0.49 m, were distributed along the access road to the sugarcane plantation to evaluate the quality of the spectral response of the corrected images. Seven of these EVA targets were white, three grey, and four black. The coordinates of the EVA targets were measured with a Garmin GPS navigation receiver. Four targets of size 1.50 m × 0.75 m with two black circles [20] [see Fig. 1(a)], used as ground control points (GCPs), were distributed over the area for the phototriangulation process (bundle block adjustment). Fig. 1(b) shows a control target in a hyperspectral image. The coordinates of these targets were determined using a Topcon Hiper dual-frequency (L1/L2) GNSS receiver.

Fig. 1. (a) Control target [20]. (b) Control target in hyperspectral image.

Fig. 2. Mosaic in sugarcane.

Plants infected with mosaic in the sugarcane plantation were identified in the field. Furthermore, the presence of weeds was observed between the rows of the planted sugarcane crop. Radiometric reference targets were measured with an Analytical Spectral Devices (ASD) FieldSpec Ultraviolet/Near-infrared (UV/NIR) spectroradiometer with a field of view (FOV) of 1°. Radiometric measurements were collected from healthy sugarcane leaves and leaves infected with mosaic (see Fig. 2), as well as from weed and soil samples, to calculate the hemispherical conical reflectance factors (HCRFs).
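As context for how field HCRF values such as these are commonly derived (the paper does not spell out the computation, so this is a minimal, illustrative sketch): the radiance measured over the sample is ratioed against a near-simultaneous measurement of a calibrated reference panel under the same illumination and scaled by the panel's known reflectance. All variable names and numbers below are illustrative.

```python
import numpy as np

def hcrf(target_radiance, panel_radiance, panel_reflectance):
    """Approximate HCRF of a field sample from spectroradiometer readings.

    target_radiance:   spectrum measured over the leaf/soil sample
    panel_radiance:    spectrum measured over the reference panel under the
                       same illumination, as close in time as possible
    panel_reflectance: calibrated reflectance of the panel (scalar or spectrum)
    """
    target_radiance = np.asarray(target_radiance, dtype=float)
    panel_radiance = np.asarray(panel_radiance, dtype=float)
    return panel_reflectance * target_radiance / panel_radiance

# Example with invented numbers at three wavelengths:
leaf = np.array([12.0, 30.5, 55.2])      # arbitrary radiance units
panel = np.array([40.0, 61.0, 78.0])
print(hcrf(leaf, panel, 0.99))           # approx. [0.30, 0.50, 0.70]
```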
An SX8 multirotor UAS with eight propellers [see Fig. 3(a)] was equipped as follows:
1) a hyperspectral camera, model DT-0014, based on a Fabry-Perot interferometer (FPI), developed by the VTT Technical Research Centre and manufactured by Rikola [see Fig. 3(b)];
2) an inertial navigation system (INS), model IGM-S1, from Novatel, used for platform positioning;
3) a Raspberry Pi portable computer used to record the INS data;
4) an irradiance sensor;
5) a navigation-grade GNSS receiver attached to the Rikola camera;
6) a GPS receiver for the UAS autopilot; and
7) two batteries for power supply.

The SX8 is an octocopter with a structure made of carbon fiber and aircraft aluminum; this light material ensures greater operational safety. The UAS has an operational range of up to one hour of flight. An autonomous navigation option based on predefined flight plans can be set and monitored from the UAS control station.

Fig. 3. (a) SX8 multirotor UAS. (b) Hyperspectral camera based on a Fabry-Perot interferometer (Source: Rikola Ltd. [21]).

The hyperspectral camera has two monochrome complementary metal-oxide-semiconductor (CMOS) sensors with a pixel size of 5.5 μm × 5.5 μm, which allow image acquisition with an integration time from 10 to 50 ms; Rikola Ltd. [21] recommends an integration time of 10 ms for sunny days and 20–50 ms for cloudy days. The camera has horizontal and vertical FOVs of 37°, a focal length of 9 mm, and an approximate weight of 700 g. A GNSS receiver and an irradiance sensor are included in the camera package. The exposure time can be adjusted between 0.06 and 3000 ms. The Rikola camera is able to acquire an image data cube with up to 25 bands at selectable wavelengths, and its spectral range is 500 to 900 nm. The image data cube is not collected instantaneously, because the interferometer air gap must be changed to acquire the different spectral bands. Owing to this feature of the camera, band-to-band registration is required, a task that can be performed with software provided by the manufacturer.

Field measurements were performed first in order to select the most suitable spectral bands. Spectral signatures were measured from sugarcane leaves, both diseased and healthy, using an ASD handheld FieldSpec UV/NIR spectroradiometer to form a spectral library [22]. Spectral differences between the signatures in this spectral library were assessed in order to determine the spectral regions where the differences are largest. These results were combined with the absorption bands of the photosynthetic pigments, carotenoids and chlorophyll [23]–[25], to define the band configuration of the camera. Table I shows the hyperspectral camera band configuration with the full width at half maximum (FWHM) of each band.

TABLE I
HYPERSPECTRAL CAMERA BAND CONFIGURATION

Band  Central wavelength (nm)  FWHM (nm)
1     506.1                    15.6
2     520.0                    17.5
3     535.5                    16.4
4     550.8                    15.2
5     564.7                    16.6
6     580.1                    15.1
7     591.5                    14.7
8     605.6                    13.8
9     619.5                    14.6
10    629.9                    15.9
11    650.3                    24.1
12    660.3                    24.1
13    670.0                    21.7
14    680.1                    21.0
15    689.6                    21.7
16    699.6                    21.9
17    709.7                    20.8
18    720.0                    20.8
19    729.6                    20.8
20    740.5                    20.6
21    749.7                    19.4
22    770.5                    19.4
23    790.1                    18.5
24    810.2                    17.7
25    829.9                    18.6
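As an illustration of the band-selection step described above (this is not the authors' exact procedure), a minimal sketch: given healthy and infected library signatures resampled to a common wavelength grid, the wavelengths with the largest reflectance differences can be ranked and then reconciled with the known pigment absorption features. The spectra, function names, and candidate count below are illustrative.

```python
import numpy as np

def rank_band_regions(wavelengths, healthy, infected, top_n=25):
    """Rank wavelengths by the absolute reflectance difference between the
    healthy and infected library signatures."""
    diff = np.abs(np.asarray(healthy) - np.asarray(infected))
    order = np.argsort(diff)[::-1]            # largest differences first
    return wavelengths[order[:top_n]], diff[order[:top_n]]

# Illustrative use with a 1-nm grid covering the camera's 500-900 nm range
# and two synthetic signatures (a healthy-like and a mosaic-like curve).
wl = np.arange(500.0, 901.0, 1.0)
healthy = np.interp(wl, [500, 680, 720, 900], [0.08, 0.05, 0.45, 0.50])
infected = np.interp(wl, [500, 680, 720, 900], [0.10, 0.09, 0.35, 0.40])

candidate_wl, candidate_diff = rank_band_regions(wl, healthy, infected)
# These candidates would then be cross-checked against carotenoid and
# chlorophyll absorption bands before fixing the camera configuration.
print(sorted(candidate_wl[:10]))
```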
The flight took place over the selected area at approximately 11:50 a.m., at a flight height of 160 m, acquiring 113 images with a ground sample distance (GSD) of approximately 0.11 m. These images have a forward overlap of 60% and a side overlap of 30%. The camera coverage angle is normal, and the maximum viewing angle was around 17.45° (see Fig. 4), considering that the areas nearest to the nadir (half the forward overlap) are used to produce the mosaic. This acquisition geometry was planned to avoid strong variations of the BRDF.

Fig. 4. Maximum viewing angle considering a forward overlap of 60%.

B. Digital Image Processing

Fig. 5 shows a flowchart summarizing the steps of the hyperspectral image processing, which are detailed in this section.

Fig. 5. Hyperspectral image processing chain.

1) Radiometric Correction: The images obtained by the hyperspectral sensor were corrected for dark current using an image of a dark target as the dark reference, in order to remove the camera's electronic noise. The displacement caused by the changing FPI air gap was eliminated by registering the bands, taking one band as a reference [26]. For this purpose, the band-matching process available in the CoRegister software provided by Rikola was applied, ensuring the spatial coincidence of pixels in the data cube.

With the data from geometric calibration (interior orientation parameters), initial coordinates from the GNSS receiver, and attitude parameters from the INS, the coregistered images were triangulated using the ERDAS LPS software (Leica Photogrammetry Suite) to estimate the exterior orientation parameters of all images. Because of the large volume of images and data cubes, bundle block adjustment was performed for only one spectral band. Band 8 (centered at 605.6 nm) was chosen because it presented good contrast, which facilitates the identification of targets and GCPs. After carrying out the bundle block adjustment for one of the bands composing the hyperspectral cube, georeferencing of all images was carried out. This procedure was performed using the ENVI software.

Radiometric processing of the hyperspectral images was performed using software developed by Honkavaara [26], [27], which enables radiometric bundle block adjustment of the images. The BRDF model (1), developed by Walthall et al. [28] and used in this software, performs the radiometric processing in blocks of images and compensates for the anisotropy of the targets:

R_{jk}(\theta_i, \theta_r, \varphi) = R_k(0, 0, \varphi) \left( a'\theta_r^2 + b'\theta_r \cos\varphi + 1 \right)    (1)

where R_{jk}(\theta_i, \theta_r, \varphi) is the reflectance of point k in image j; R_k(0, 0, \varphi) is the reflectance factor with the viewing geometry at nadir; the coefficients a' and b' are derived with respect to the reflection geometry at \theta_r = 0; and \varphi = \varphi_r - \varphi_i is the relative azimuth angle.

The software also corrects for possible illumination variations during the acquisition of the images. The methodology developed by Honkavaara et al. [26], [29], which considers measurements of irradiance taken in the field during image acquisition, was adopted for the correction of this illumination variation in the blocks of images. In this study, measurements of irradiance (total irradiance with solar and diffuse components) were calculated with a grey plate. The irradiance is the flux of radiation incident on a surface per unit area, given in W·m⁻² [30].
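A minimal sketch of how the anisotropy factor in (1) can be evaluated and used to normalize an off-nadir observation to the nadir viewing geometry. The coefficients a' and b' below are placeholders standing in for values estimated by the radiometric block adjustment; they are not values from the paper.

```python
import numpy as np

def walthall_anisotropy(theta_r, phi, a_prime, b_prime):
    """Multiplicative anisotropy factor of the Walthall model in (1):
    R_jk = R_k(nadir) * (a'*theta_r**2 + b'*theta_r*cos(phi) + 1).
    theta_r: view zenith angle (rad); phi: relative azimuth angle (rad)."""
    return a_prime * theta_r**2 + b_prime * theta_r * np.cos(phi) + 1.0

def to_nadir(reflectance_obs, theta_r, phi, a_prime, b_prime):
    """Normalize an off-nadir reflectance observation to the nadir factor."""
    return reflectance_obs / walthall_anisotropy(theta_r, phi, a_prime, b_prime)

# Example: observation at 15 deg view zenith and 30 deg relative azimuth,
# with purely illustrative coefficients.
theta_r = np.deg2rad(15.0)
phi = np.deg2rad(30.0)
print(to_nadir(0.42, theta_r, phi, a_prime=-0.2, b_prime=0.1))
```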
The radiometric correction was made by applying (2) [31]:

L_{jc}(\lambda)_{\text{at sensor}} = L_j(\lambda)_{\text{at sensor}} \left( \frac{E_j(\lambda)}{E_{\text{ref}}(\lambda)} \right) = L_j(\lambda)_{\text{at sensor}} \, C_j(\lambda)    (2)

where L_{jc}(\lambda)_{\text{at sensor}} is the corrected radiance of image j; L_j(\lambda)_{\text{at sensor}} is the at-sensor radiance of image j; E_j(\lambda) is the target irradiance in image j; E_{\text{ref}}(\lambda) is the irradiance of the reference (ref) target measured in situ; and C_j(\lambda) is the correction factor for image j that normalizes the irradiance to the reference.

Since the irradiance measurements of the reference plate were collected between 325 and 1075 nm, it was necessary to tune the wavelength ranges so that they matched the band configuration of the hyperspectral sensor that acquired the images. The spectral curves were therefore simulated in accordance with the spectral ranges of the bands of the hyperspectral sensor, adopting a Gaussian curve for the spectral sensitivity [32], [33], because no data are available concerning this sensitivity. In this procedure for correcting the illumination of the images, the irradiance measurement used was the one closest to the instant the image was acquired by the hyperspectral sensor, and the irradiance of a selected scene was used as a reference, producing a factor that compensates for differences in irradiance between scenes according to (2). In addition, the absolute radiometric calibration parameters cannot be ignored, otherwise errors may be introduced in the results. The dependence between the digital number (DN) and the radiance of each sensor is given in (3), where c_1 and c_2 are the radiometric calibration parameters:

DN = c_1 L(\lambda)_{\text{at sensor}} + c_2.    (3)

The transformation factors from DNs to physical values, such as reflectance, were calculated from a linear regression [34]. An image was selected as a reference for the empirical line calibration process; this image includes the white, black, and gray EVA plates and a sugarcane sample for which the HCRF was measured in the field using the ASD spectroradiometer.

The radiometric bundle adjustment therefore used tie points based on a radiometric digital surface model (DSM) with 10 m spacing in order to produce the relative illumination correction values. To adjust the relative correction values, it was necessary to use the acquisition geometry, so that the angles of the viewing ray to the target and of the solar illumination on the target surface were taken into account. The parameters needed for the relative radiometric calibration were taken from the exterior orientation parameters computed by bundle adjustment and from the time at which the scenes were acquired.

Finally, an orthophoto mosaic was produced using the interior and exterior orientation parameters and the radiometric correction parameters [26], [29]. Overlapping RGB images acquired by an Ultracam-XP [35] with 0.40 m GSD were used to produce the DSM employed in the orthophoto mosaic generation. The GSD of both the DSM and the orthophoto mosaic produced is 0.50 m.
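To make the two steps just described concrete, a small illustrative sketch (not the authors' code) that (a) simulates a camera band value from a finely sampled ASD spectrum using a Gaussian spectral sensitivity defined by the band center and FWHM, and (b) fits a per-band empirical line mapping DNs to reflectance using reference targets. The band center and FWHM are taken from Table I (band 8); the spectrum, DNs, and target reflectances are invented.

```python
import numpy as np

def gaussian_srf(wl, center, fwhm):
    """Gaussian spectral sensitivity defined by a band center and FWHM."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-0.5 * ((wl - center) / sigma) ** 2)

def simulate_band(wl, spectrum, center, fwhm):
    """Band-average a finely sampled spectrum with Gaussian weighting."""
    w = gaussian_srf(wl, center, fwhm)
    return np.sum(w * spectrum) / np.sum(w)

def empirical_line(dn_targets, reflectance_targets):
    """Per-band empirical line: reflectance = gain * DN + offset."""
    gain, offset = np.polyfit(dn_targets, reflectance_targets, 1)
    return gain, offset

# Band 8 of Table I (605.6 nm, FWHM 13.8 nm) applied to a made-up spectrum
wl = np.arange(500.0, 901.0, 1.0)
asd_spectrum = 0.2 + 0.3 * (wl - 500.0) / 400.0
print(simulate_band(wl, asd_spectrum, 605.6, 13.8))

# Empirical line from three reference targets (black, grey, white plates);
# the DNs and HCRFs are invented for illustration only.
dn = np.array([2100.0, 9800.0, 16500.0])
rho = np.array([0.04, 0.35, 0.82])
gain, offset = empirical_line(dn, rho)
print(gain, offset)
```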
2) Classification Process: Once the radiometric bundle block adjustment process had been completed, the image pixels store physical values; that is, the orthophoto mosaic holds pixels with HCRF values. Subsequently, infected areas of sugarcane were classified using the spectral information divergence (SID) classification [36], available in ENVI; the classification process is summarized in the flowchart shown in Fig. 6.

Fig. 6. Classification process flowchart.

SID is a classification approach that compares the similarity between a signature from a spectral library and the spectral response of each pixel of the orthophoto mosaic, measuring the discrepancy (divergence) between the two signals [37]. The minimum divergence value between the signals indicates the highest degree of similarity to the reference spectrum [36].

The spectral responses (see Fig. 7) of healthy sugarcane leaves, leaves infected by mosaic virus, weed (guinea grass, Panicum maximum), and bare soil were included in the spectral library created for use in the classification process. One sample for each spectral reference was measured in the field with the spectroradiometer. Samples representing variations in the severity of the disease were not included, because the objective was only to identify diseased plants, not the damage level.

Fig. 7. (a) Spectral curve of healthy sugarcane. (b) Spectral curve of sugarcane infected with mosaic virus. (c) Spectral curve of weed (Panicum maximum). (d) Spectral curve of bare soil.

In the SID classification process, a maximum divergence threshold value was assigned to each reference curve, which made it possible to obtain a better classification result. The possibility of specifying a tolerance value for the difference between pairwise compared spectral responses is a great advantage for the purpose of this paper; thus, a single signature for each class was sufficient for classification. The threshold values, obtained by empirical analysis, were 0.20 for healthy sugarcane, 0.04 for mosaic-infected sugarcane, 0.03 for the weed, and 0.07 for bare soil.

Data collected in the field were used to check the reliability of the maps of diseased sugarcane, and the accuracy of the classification was verified with the confusion matrix and the Kappa coefficient [38]–[40]. Because of access limitations caused by the high density of the sugarcane, it was only possible to make verifications in the border areas of the plantation. Therefore, 80 points were used for validation in the study area.
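A minimal sketch of the SID measure and the threshold rule described above (an illustrative re-implementation, not the ENVI code): each pixel spectrum and each library signature are normalized to probability-like distributions, SID is the sum of the two relative entropies [37], and a pixel is assigned to the class with minimum SID provided that value does not exceed the class threshold. The library spectra below are invented; the thresholds are those reported above.

```python
import numpy as np

def sid(x, y, eps=1e-12):
    """Spectral information divergence between two spectra: normalize both
    to probability-like distributions and sum the two relative entropies."""
    p = np.asarray(x, dtype=float) / (np.sum(x) + eps) + eps
    q = np.asarray(y, dtype=float) / (np.sum(y) + eps) + eps
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

def classify_pixel(pixel, library, thresholds):
    """Assign the pixel to the library class with minimum SID, if that SID
    does not exceed the class threshold; otherwise leave it unclassified."""
    scores = {name: sid(pixel, sig) for name, sig in library.items()}
    best = min(scores, key=scores.get)
    return best if scores[best] <= thresholds[best] else "unclassified"

# Illustrative 25-band signatures (values invented) and the paper's thresholds
rng = np.random.default_rng(0)
library = {
    "healthy sugarcane": np.linspace(0.05, 0.50, 25),
    "mosaic-infected sugarcane": np.linspace(0.08, 0.38, 25),
}
thresholds = {"healthy sugarcane": 0.20, "mosaic-infected sugarcane": 0.04}

pixel = library["mosaic-infected sugarcane"] * (1 + 0.02 * rng.standard_normal(25))
print(classify_pixel(pixel, library, thresholds))
```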
III. RESULTS AND DISCUSSION

A. Classification Results and Accuracy

Fig. 8 shows the sugarcane map produced by the SID classifier for the study area, divided into four classes: healthy sugarcane, sugarcane infected with mosaic, weed, and bare soil. The study area covers a total of 11,178.75 m², of which 39.9% was infected with mosaic. The areas of the other classes are given in Table II.

Fig. 8. Sugarcane infected with mosaic.

TABLE II
MAPPED CLASSES

Class                                   Mapped area
Healthy sugarcane                       6049.25 m² (54.1%)
Sugarcane infected with mosaic virus    4457.75 m² (39.9%)
Weed                                    364.25 m² (3.3%)
Bare soil                               147.75 m² (1.3%)
Unclassified                            159.75 m² (1.4%)

Table III shows the classification confusion matrix for the study area. The overall accuracy was 92.50%, the omission error for sugarcane infected with mosaic was 8.57%, and the inclusion (commission) error for healthy sugarcane was 5.88%. The Kappa coefficient obtained was 0.87, which is regarded as an excellent rating by Landis and Koch [41].

TABLE III
CONFUSION MATRIX (ROWS: CLASSIFICATION; COLUMNS: FIELD REFERENCE)

Classification          Healthy sugarcane  Sugarcane with mosaic  Weed  Bare soil  Unclassified  Total
Healthy sugarcane       32                 3                      0     0          0             35
Sugarcane with mosaic   1                  34                     0     0          0             35
Weed                    0                  0                      6     0          0             6
Bare soil               0                  0                      0     4          0             4
Unclassified            0                  0                      0     0          0             0
Total                   33                 37                     6     4          0             80

In general, the SID classifier performed well, giving a detailed identification of the areas infected by the disease. During the verification in the field, it was noted that indications of disease in sugarcane often do not occur over the entire clump (infected area), but only on some leaves. Even so, the SID classifier, based on the spectral curves measured in the field, was efficient and able to detect the disease.

The areas of sugarcane infected with mosaic were located across the entire plantation, worsening toward the central region and forming a main clump extending westwards in the selected area. Many of the areas classified as infected had deteriorated badly, as could be seen both in the image and in the field.
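For reference, a small sketch showing how the overall accuracy, the Kappa coefficient, and per-class omission and commission errors reported above are conventionally computed from a confusion matrix [38]–[40]. The matrix used in the example is invented for illustration and is not the one in Table III.

```python
import numpy as np

def accuracy_metrics(cm):
    """Overall accuracy, Kappa, and per-class omission/commission errors for a
    square confusion matrix (rows: classification, columns: reference)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_o = np.trace(cm) / n                                   # observed agreement
    p_e = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n ** 2   # chance agreement
    kappa = (p_o - p_e) / (1.0 - p_e)
    omission = 1.0 - np.diag(cm) / cm.sum(axis=0)            # per reference class
    commission = 1.0 - np.diag(cm) / cm.sum(axis=1)          # per mapped class
    return p_o, kappa, omission, commission

# Illustrative three-class matrix (not the Table III values)
cm = [[45, 4, 1],
      [3, 40, 2],
      [2, 1, 52]]
p_o, kappa, omission, commission = accuracy_metrics(cm)
print(round(p_o, 3), round(kappa, 3))
```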
B. Weed, Unclassified Areas, and Bare Soil

Some plants in the field were classified as weed. There was a large concentration of weeds in the study area, mainly crabgrass (Digitaria horizontalis) [see Fig. 9(a)]. Guinea grass (Panicum maximum) can reach 1–2 m in height [see Fig. 9(b)] and forms clumps, whereas Digitaria horizontalis can reach 0.60 m in length and is thus smaller than guinea grass. A smaller quantity of weed was detected in the study area, which may be because crabgrass is a smaller plant than sugarcane; the sugarcane may have obscured the spectral characteristics of the weed recorded in the image.

Fig. 9. (a) Crabgrass (Digitaria horizontalis). (b) Guinea grass (Panicum maximum).

The presence of weeds can damage the sugarcane plantation, as they act as competitors in the production environment, absorbing nutrients present in the soil. Guinea grass, for example, is difficult to control and affects the quality of the cane plantation, so it must be properly managed [42]. The location of weeds in crops is therefore important, and aerial images are a promising tool for identifying and mapping weeds in agricultural crops. Remote sensing is a noninvasive method of acquiring a synoptic view of the quantity of weeds in the soil [11]. The hyperspectral images, combined with the spectral curve information about the weeds obtained in the field, were useful in detecting and determining the position of the areas infested by weeds.

The unclassified areas were those in shaded parts of the image, areas of exposed dark-colored soil, and the background of the image. Many of the areas of bare soil are areas of the sugarcane plantation showing gaps in the planting rows, which could come from plants that did not survive the mosaic infection.

C. Sugarcane With Mosaic

The plantation was checked for the presence of mosaic in February (02/13/2015), that is, three months before the images were taken (05/19/2015). It was noted that areas of sugarcane infected with the mosaic virus were relatively degraded. As the sugarcane plants had been attacked by the mosaic virus in the early stages of growth, many plants could not survive, causing gaps in the planting line [see Fig. 10(a) and (b)]. Losses caused by mosaic in sugarcane vary depending on the resistance of the variety planted, the fertility of the soil, the presence of other diseases, and the proximity of the virus host, resulting in a decrease in production of between 46% and 86% [43].

Fig. 10. Gaps in the planting line resulting from mosaic infection in sugarcane.

Plants that survived show low growth and low clump density [see Fig. 10(c)]. Tokeshi and Rago [43] confirm that clumps infected with mosaic show retarded development, possibly resulting in their height being reduced by half, as was observed under field conditions [see Fig. 10(c)]. Tokeshi and Rago [43] also pointed out that mosaic symptoms often occur in young plantations with good vegetative growth, and that an alternative management practice that can be adopted to facilitate the identification of diseased plants is to apply high doses of nitrogen to the sugarcane, which increases the contrast between green and chlorotic areas. The contrast between plants infected with mosaic, which appear in a lighter shade of gray in the image, and healthy plants with more vigor, which show a stronger shade of gray, can be seen in Fig. 10(c).

IV. CONCLUSION

This paper proposed a processing chain in which hyperspectral images taken from a UAS were first treated geometrically to obtain the exterior orientation parameters by photogrammetric bundle adjustment. A DSM was then produced by photogrammetric processing of a set of overlapping RGB images acquired by an Ultracam-XP, so that the acquisition geometry could be recovered and an orthophoto mosaic of the image cubes could be produced. The radiometric variation effects introduced by the BRDF were corrected based on the acquisition geometry, considering the viewing ray of each pixel in relation to the canopy surface as well as the angle of incidence of the solar radiation. The orthophoto mosaic produced delivers physical values for the image, the HCRF, and the analysis of these values enabled accurate determination of the sites infected with disease.

The higher spatial resolution of the images acquired by optical sensors onboard UAS produces a large volume of data to be processed. However, it provides more detail on crop canopy development, making feasible a better understanding of the phenomenon, especially of plant health status, highlighting the diseased plants. UAS-based aerial remote sensing is a potential tool for precision agriculture owing to its flexibility in performing imaging and its capability to acquire images below cloud cover. These characteristics are fundamental for detecting diseases during particular crop development periods and for monitoring the evolution of the affected plants. The hyperspectral images acquired by UAS have high potential for application not only to sugarcane, but also to other crops, although adaptations are necessary according to the type of crop being studied.

The spectral library, in which there is only one signature to represent each class of interest, was enough to recognize the targets in the data cube. The accuracy of the spectral response of the data cube, the suitability of the wavelengths chosen to configure the hyperspectral camera, and the similarity distance adopted to evaluate each pixel were the main keys to the success of this work.

The production of hyperspectral image mosaics with this rigorous approach is possible only by combining accurate acquisition geometry with the application of radiometric models that compensate for the spectral response variations often found in images taken from aircraft. The radiometric calibration applied was fundamental in creating an accurate spectral response for each pixel, which allowed the pixels to be classified with the required accuracy.
Mapping areas of infected sugarcane provides the exact location of plant health problems in the crop, not only in relation to the disease, but also with respect to weeds. The precise location of these problems enables pesticides to be applied directly to outbreaks of disease or pests and supports the most appropriate crop management, whether through the application of pesticides or even the removal of diseased plants.

ACKNOWLEDGMENT

The authors would like to thank Grupo Ruette Agroindustrial, Unidade Monte Rey, in Ubarana, for their important support in the collection of radiometric measurements in the field, and especially R. de Souza, corporate human resources manager, L. A. Camargo, and F. Benaducci. The authors are thankful to the São Paulo Research Foundation (FAPESP) for supporting part of this research (2013/50426-4), and to the Coordination for the Improvement of Higher Education Personnel (CAPES) and the National Council for Scientific and Technological Development (CNPq) for assistance in the form of a Doctoral Scholarship and a Master's Degree Scholarship.

REFERENCES

[1] N. Zhang et al., "Precision agriculture: A worldwide overview," Comput. Electron. Agriculture, vol. 36, pp. 113–132, 2002.
[2] S. K. Seelan et al., "Remote sensing applications for precision agriculture: A learning community approach," Remote Sens. Environ., vol. 88, pp. 157–169, 2003.
[3] D. J. Mulla, "Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps," Biosyst. Eng., vol. 114, no. 4, pp. 358–371, 2013.
[4] B. C. Silva et al., "Adoption and use of precision agriculture technologies in the sugarcane industry of São Paulo State, Brazil," Precision Agriculture, vol. 12, no. 1, pp. 67–81, 2011.
[5] A. Lausch et al., "Deriving phenology of barley with imaging hyperspectral remote sensing," Ecological Model., vol. 295, pp. 123–135, 2015.
[6] P. Sirisomboon et al., "Study on non-destructive evaluation methods for defect pods for green soybean processing by near-infrared spectroscopy," J. Food Eng., vol. 93, no. 4, pp. 502–512, 2009.
[7] N. Jin et al., "Hyperspectral identification of cotton verticillium disease severity," Optik, vol. 124, no. 16, pp. 2569–2573, 2013.
[8] L. Yuan et al., "Analysis of spectra difference between the foreside and backside of leaves in yellow rust disease detection for winter wheat," Precision Agriculture, vol. 14, no. 5, pp. 495–511, 2013.
[9] X. Li et al., "Spectral difference analysis and airborne imaging classification for citrus greening infected trees," Comput. Electron. Agriculture, vol. 83, pp. 32–46, 2012.
[10] T. Hakala et al., "Acquisition of bidirectional reflectance factor dataset using a micro unmanned aerial vehicle and a consumer camera," Remote Sens., vol. 2, no. 3, pp. 819–832, 2010.
[11] D. W. Lamb and R. B. Brown, "Remote sensing and mapping of weeds in crops," J. Agricultural Eng. Res., vol. 78, no. 2, pp. 117–125, 2001.
[12] C. Zhang and J. M. Kovacs, "The application of small unmanned aerial systems for precision agriculture: A review," Precision Agriculture, vol. 13, no. 6, pp. 693–712, 2012.
[13] UNICA, União da Indústria de Cana-de-Açúcar, 2008. [Online]. (In Portuguese).
[14] R. Rosseto and A. D. Santiago, "Doenças da cana-de-açúcar e seu controle," Informações Agronômicas, Piracicaba, Brazil, no. 67, 1994. [Online]. (In Portuguese).
[15] M. C. Gonçalves, "Doenças causadas por vírus," in Cana-de-Açúcar, L. Dinardo-Miranda et al., Eds. Campinas, Brazil: Instituto Agronômico, 2010. (ISBN: 978-85-855564-17-9) (In Portuguese).
[16] A. Sanguino, "As principais doenças da cana-de-açúcar," in Curso à Distância: Tópicos da Cultura de Cana-de-Açúcar. Ribeirão Preto, Brazil: Instituto Agronômico, 2012. (In Portuguese).
[17] Instituto Brasileiro de Geografia e Estatística, Cidades, Brazil, 2014. [Online]. Available: http://www.cidades.ibge.gov.br/. (In Portuguese).
[18] Sistema Integrado de Gerenciamento de Recursos Hídricos do Estado de São Paulo (SIGRH), 2014. [Online]. (In Portuguese).
[19] Instituto Nacional de Meteorologia, Dados de estações automáticas, 2015. [Online]. (In Portuguese).
[20] A. Berveglieri and A. M. G. Tommaselli, "Exterior orientation of hyperspectral frame images collected with UAS for forest applications," in Proc. Int. Arch. Photogrammetry, Remote Sens. Spatial Inf. Sci., Eur. Calibration and Orientation Workshop, Lausanne, Switzerland, Feb. 2016, vol. XL-3/W4, pp. 45–50.
[21] Rikola Ltd., "Hyperspectral camera," 2013. [Online].
[22] É. A. S. Moriya, "Identificação de bandas espectrais para detecção de cana-de-açúcar sadia e doente utilizando câmara hiperespectral embarcada em VANT," M.S. thesis, Dept. Cartography, São Paulo State Univ., Presidente Prudente, Brazil, 2015. (In Portuguese).
[23] A. A. Gitelson et al., "Assessing carotenoid content in plant leaves with reflectance spectroscopy," Photochem. Photobiol., vol. 75, no. 3, pp. 272–281, 2002.
[24] D. O. Hall and K. K. Rao, Photosynthesis. London, U.K.: Edward Arnold, 1977.
[25] H. Lodish et al., Molecular Cell Biology. New York, NY, USA: Freeman, 2000.
[26] E. Honkavaara et al., "Processing and assessment of spectrometric, stereoscopic imagery collected using a lightweight UAV spectral camera for precision agriculture," Remote Sens., vol. 5, no. 10, pp. 5006–5039, 2013.
[27] E. Honkavaara, "Calibrating digital photogrammetric airborne imaging systems using a test field," Doctoral dissertation, Dept. Surveying, Helsinki Univ. Technol., Espoo, Finland, 2008.
[28] C. L. Walthall et al., "Simple equation to approximate the bidirectional reflectance from vegetative canopies and bare soil surfaces," Appl. Opt., vol. 24, no. 3, pp. 383–387, 1985.
[29] E. Honkavaara et al., "Hyperspectral reflectance signatures and point clouds for precision agriculture by light weight UAV imaging system," in Proc. ISPRS Ann. Photogrammetry, Remote Sens. Spatial Inf. Sci., Melbourne, VIC, Australia, Aug. 25–Sep. 1, 2012, vol. I-7, pp. 353–358.
[30] J. R. Jensen, Remote Sensing of the Environment: An Earth Resource Perspective. Upper Saddle River, NJ, USA: Prentice-Hall, 2007.
[31] T. Hakala et al., "Spectral imaging from UAVs under varying illumination conditions," in Proc. Int. Arch. Photogrammetry, Remote Sens. Spatial Inf. Sci., Rostock, Germany, Sep. 4–6, 2013, vol. XL-1/W2, pp. 189–194.
[32] S. Q. Kidder and T. H. V. Haar, Satellite Meteorology: An Introduction. New York, NY, USA: Academic, 1995.
[33] R. A. Schowengerdt, Remote Sensing: Models and Methods for Image Processing, 2nd ed. New York, NY, USA: Academic, 1997.
[34] R. N. Sahoo et al., Processing of Hyperspectral Remote Sensing Data. New Delhi, India: Div. Agricultural Physics, Indian Agricultural Res. Inst., 2013.
[35] R. A. Oliveira et al., "Potential of dense image matching for DSM generation in tropical forests using UAV and aerial images," in Proc. Int. Soc. Photogrammetry Remote Sens. Geospatial Week 2015, France, Sep. 28–Oct. 2, 2015, p. 4.
[36] H. Du et al., "New hyperspectral discrimination measure for spectral characterization," Opt. Eng., vol. 43, no. 8, pp. 1777–1786, 2004.
[37] C. I. Chang, "An information-theoretic approach to spectral variability, similarity, and discrimination for hyperspectral image analysis," IEEE Trans. Inf. Theory, vol. 46, no. 5, pp. 1927–1932, Aug. 2000.
[38] J. Cohen, "A coefficient of agreement for nominal scales," Educ. Psychol. Meas., vol. 20, pp. 37–46, 1960.
[39] W. D. Hudson and C. V. Ramm, "Correct formulation of the kappa coefficient of agreement," Photogrammetric Eng. Remote Sens., vol. 53, no. 4, pp. 421–422, 1987.
[40] R. G. Congalton and K. Green, Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, FL, USA: Lewis Publishers, 1999.
[41] J. R. Landis and G. G. Koch, "The measurement of observer agreement for categorical data," Biometrics, vol. 33, no. 1, pp. 159–174, 1977.
[42] E. B. Melo et al., "Alternativas para a catação química de touceiras de capim-colonião e capim-braquiária em cana-soca," Revista Brasileira de Herbicidas, vol. 12, no. 3, pp. 307–317, 2014. (In Portuguese).
[43] H. Tokeshi and A. Rago, "Doenças da cana-de-açúcar," in Manual de Fitopatologia: Doenças das Plantas Cultivadas, H. Kimati et al., Eds. São Paulo, Brazil: Biblioteca Agronômica Ceres, 2005, vol. 2. (ISBN 85-318.0043-9) (In Portuguese).

Érika Akemi Saito Moriya received the Graduate degree in cartographic engineering from the School of Technology and Sciences, São Paulo State University, Presidente Prudente, Brazil, in 2007, the Master's degree in remote sensing from the National Institute for Space Research, and the Doctorate degree in cartographic sciences from São Paulo State University, in 2010 and 2015, respectively. Her experience covers the following topics: geosciences, remote sensing and GIS, precision agriculture, plant pathology, sugarcane, and deforestation and human occupation in the Amazon.

Nilton Nobuhiro Imai received the Graduate degree in agricultural engineering from the State University of Campinas, Campinas, Brazil, in 1979, the Master's degree in remote sensing from the National Institute for Space Research, and the Doctorate degree in geography (human geography) from the University of São Paulo, São Paulo, Brazil, in 1986 and 1996, respectively. He is currently an Assistant Professor in the School of Technology and Sciences, São Paulo State University, Presidente Prudente, Brazil. His experience is in geosciences and remote sensing, with emphasis on remote sensing and GIS in the analysis and development of models for inferring physical environment variables.

Antonio Maria Garcia Tommaselli received the Graduate degree in cartographic engineering from the School of Technology and Sciences, São Paulo State University (UNESP), Presidente Prudente, Brazil, in 1983, the Master's degree in geodetic sciences from the Federal University of Paraná, Curitiba, Brazil, in 1988, the Ph.D. degree in electrical engineering from the State University of Campinas, Campinas, Brazil, in 1993, the Habilitation degree in photogrammetry from UNESP, in 1998, and the Postdoctorate in photogrammetry from University College London, London, U.K., in 1998. He is currently a Professor at UNESP.
His experience is in geosciences, with an emphasis on photogrammetry, acting on the following topics: digital photogrammetry, feature extraction, mapping, and calibration of digital cameras. Prof. Tommaselli is a Collaborator of the International Society for Photogrammetry and Remote Sensing and a Fellow (FP 1C) of the National Scientific and Technological Development Council.

Gabriela Takahashi Miyoshi received the Graduate degree in cartographic engineering from the School of Technology and Sciences, São Paulo State University (UNESP), Presidente Prudente, Brazil, in 2014, and the Master's degree in cartographic sciences from UNESP, in 2016, with emphasis on remote sensing of the environment.