
Validation of a crop field modeling to simulate agronomic images

Gawain Jones1, Christelle Gée1*, Sylvain Villette1 and Frédéric Truchetet2

1AgroSup Dijon, Unité Propre GAP: Génie des Agro-équipements et des Procédés, 26 Bld Dr Petitjean, BP 87999, 21079 Dijon Cedex, France
2UMR 5158 uB-CNRS, 12 rue de la Fonderie, 71200 Le Creusot, France

*[email protected]

Abstract: In precision agriculture, crop/weed discrimination is often based on image analysis, but although several algorithms using spatial information have been proposed, none has been tested on relevant databases. A simple model that simulates virtual fields is developed to evaluate these algorithms. Virtual fields are made of crops, arranged according to agricultural practices and represented by simple patterns, and weeds that are spatially distributed using a statistical approach. The model ensures a user-defined Weed Infestation Rate (WIR). Experimental devices using cameras are then simulated with a pinhole model. The ability of the model to characterize the spatial reality is demonstrated through different pairs of (real, virtual) pictures. Two spatial descriptors (the nearest neighbor method and Besag’s function) have been set up and tested to validate the spatial realism of the crop field model, comparing a real image to the homologous virtual one.

©2010 Optical Society of America

OCIS codes: (200.4740) Optical processing; (330.4060) Vision modeling; (330.6100) Spatial discrimination; (330.7326) Visual optics, modeling.

References and Links

1. L. Streit, P. Federl, and M. Costa Sousa, “Modelling Plant Variation Through Growth,” Comput. Graph. Forum 24, 497–506 (2005).

2. J. Bossu, C. Gée, and F. Truchetet, “Development of a machine vision system for a real time precision sprayer,” Electron. Lett. Comput. Vision 7, 54–66 (2008).

3. J. Bossu, C. Gée, G. Jones, and F. Truchetet, “Wavelet transform to discriminate between crop and weed in perspective agronomic images,” Comput. Electron. Agric. 65(1), 133–143 (2009).

4. C. Onyango, J. Marchant, A. Grundy, K. Phelps, and R. Reader, “Image Processing Performance Assessment using Crop Weed Competition Models,” Precis. Agric. 6(2), 183–192 (2005).

5. G. Jones, C. Gée, and F. Truchetet, “Modelling agronomic images for weed detection and comparison of crop/weed discrimination algorithm performance,” Precis. Agric. 10(1), 1–15 (2009).

6. G. Jones, C. Gée, and F. Truchetet, “Assessment of an inter-row weed infestation rate on simulated agronomic images,” Comput. Electron. Agric. 67(1-2), 43–50 (2009).

7. J. Cardina, G. Johnson, and D. Sparrow, “The nature and consequence of weed spatial distribution,” Weed Sci. 43, 364–373 (1997).

8. R. A. Fisher, and R. E. Miles, “The role of spatial pattern in the competition between plants and weeds. A theoretical analysis,” Math. Biosci. 18(3-4), 335–350 (1973).

9. F. Goreaud, “Apports de l’analyse de la structure spatiale en forêt tempérée à l’étude et la modélisation des peuplements complexes,” (Clermont-Ferrand, 2000).

10. O. Faugeras, Three-Dimensional Computer Vision (MIT Press, 1993).

11. D. M. Woebbecke, G. E. Meyer, K. Von Bargen, and D. A. Mortensen, “Color indices for weed identification under various soil, residue, and lighting conditions,” Trans. ASAE 38, 259–269 (1995).

12. J.-Y. Bouguet, “Camera Calibration Toolbox for Matlab” (June 2008), retrieved July 2009, http://www.vision.caltech.edu/bouguetj/calib_doc/index.html.

13. J. G. Skellam, “Studies in statistical ecology I. Spatial pattern,” Biometrika 39(3/4), 346–362 (1952).

14. B. D. Ripley, Spatial Statistics (Wiley, New York, 1981).

15. P. J. Diggle, Statistical Analysis of Spatial Point Patterns (Academic Press, 1983).

16. F. Goreaud, M. Loreau, and C. Millier, “Spatial structure and the survival of an inferior competitor: a theoretical model of neighbourhood competition in plants,” Ecol. Modell. 158(1-2), 1–19 (2002).

17. J. E. Besag, “Comments on Ripley’s paper,” J. R. Stat. Soc. B 39, 193–195 (1977).

18. B. D. Ripley, “Modelling spatial patterns,” J. R. Stat. Soc. B 39, 172–212 (1977).

19. H. A. Gleason, “Some applications of the quadrat method,” Bull. Torrey Bot. Club 47(1), 21–33 (1920).

20. N. A. C. Cressie, Statistics for Spatial Data, Wiley Series in Probability and Mathematical Statistics (John Wiley and Sons, New York, 1993).


21. S. Jacquemoud, S. L. Ustin, J. Verdebout, G. Schmuck, G. Andreoli, and B. Hosgood, “Estimating leaf biochemistry using the PROSPECT leaf optical properties model,” Remote Sens. Environ. 56(3), 194–202 (1996).

22. B. Hapke, “Bidirectional Reflectance Spectroscopy,” J. Geophys. Res. 86(B4), 3039–3054 (1981).

1. Introduction

Nature modeling is widely used by different communities for multiple purposes. In computer graphics, trees, flowers and plants are designed to reproduce reality and to beautify virtual scenes [1]. Agronomists need complex models to estimate yield or weed invasion through plant competition, growth, or response to weather, fertilizer or herbicides. Each simulation needs a different level of detail, depending on its aim (for a video game, the aim could be to obtain the best realism/computing-time ratio).

Remote sensors (i.e., cameras, radiometers…) are currently being developed in agriculture to reduce chemical pollution at the source. When embedded in mobile platforms (tractors, aircraft) [2], they help to improve farmers’ decision making. For instance, various automatic weed control systems based on vision have been developed to reduce herbicide application: they spray only the weed-infested areas rather than the entire field. Most crop/weed discrimination algorithms [3] require spatial, morphological or spectral information and are applied to very specific applications (real-time or post-herbicide treatments). However, the effectiveness of these algorithms has to be assessed. A few authors evaluate their performance by simply comparing their ability to discriminate crop plants from weed plants with a manual validation involving a few images [4]. Such a tedious technique is impractical when the number of images to process is large. Jones et al. [5] consequently investigated a modeling approach based on spatial information and developed a virtual agronomic field simulation tool specifically dedicated to the spraying context (i.e., plants are considered at the early two-leaf growth stage). They built a large database of simulated images in which the position of the virtual camera and the parameters of the CCD are varied. The initial characteristics of the images are perfectly known: location of each crop and weed plant, corresponding pixel number, Weed Infestation Rate (WIR)… Thus all realistic or unrealistic situations (i.e. 0% to 100% WIR) can be generated. However, the realism of the spatial structure of this field simulation model had not been validated to confirm that virtual scenes can be substituted for real pictures. After a short description of the model, this paper aims at using statistical descriptors to validate the spatial structure of both crop and weed plants, and at comparing a real picture to the homologous virtual one.

2. Crop field modeling

Before going any further, we should make a distinction between a picture and an image to avoid ambiguity: a picture always refers to a photograph (virtual or taken by a camera) while an image refers to a model.

A spatial crop field model is designed using a predefined spatial distribution of crop and weed plants. This model, previously described in [6], consists of two steps: (1) simulation of the crop field, where weed plants are described by spatial stochastic models (point or cluster processes), and (2) construction of a virtual photograph of the crop field according to the extrinsic and intrinsic parameters of the virtual CCD sensor.

The goal of the model is to test the efficiency of crop/weed discrimination algorithms in crop fields subjected to different Weed Infestation Rates, determined by Eq. (1):

$$\mathrm{WIR}\,(\%) = \frac{\text{weed pixels}}{(\text{crop} + \text{weed})\ \text{pixels}} \times 100 \qquad (1)$$

WIR values are predefined on demand before running the model.
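As a minimal sketch (not the authors' code), Eq. (1) can be evaluated from two binary masks; the function and mask names below are ours.

```python
import numpy as np

def weed_infestation_rate(crop_mask, weed_mask):
    """Weed Infestation Rate (Eq. (1)): weed pixels over all vegetation pixels, in percent."""
    weed = np.count_nonzero(weed_mask)
    crop = np.count_nonzero(crop_mask)
    if crop + weed == 0:
        return 0.0
    return 100.0 * weed / (crop + weed)

# Toy example: 3 weed pixels out of 12 vegetation pixels -> WIR = 25%.
crop = np.zeros((4, 4), dtype=bool); crop[:3, :3] = True   # 9 crop pixels
weed = np.zeros((4, 4), dtype=bool); weed[3, :3] = True    # 3 weed pixels
print(weed_infestation_rate(crop, weed))                   # 25.0
```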


2.1. Crop model

Crop sowing complies with reality: rows are equally spaced with an inter-row width depending on the crop type (16–18 cm for wheat, 45 cm for sunflower, etc.). Two different sowing schemes are proposed: an “in-line” one for cereals and a periodical one for sunflower or maize. Periodical sowing introduces the notion of intra-row frequency, defined by the distance between crop plants within the row. For each crop, three different patterns and sizes are available (with three different orientations) to mimic the unpredictable nature of plant growth. The presence of each crop plant is controlled by a stochastic variable to reproduce growth issues.
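An illustrative sketch of such a sowing scheme, assuming a periodical layout; the function name, parameters and the presence probability are our assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sow_crop(field_w, field_h, inter_row, intra_row, presence=0.9):
    """Place crop plants (metres): rows are `inter_row` apart, plants are
    `intra_row` apart within a row, and each plant is present with
    probability `presence` to reproduce growth issues."""
    row_x = np.arange(inter_row / 2.0, field_w, inter_row)     # row abscissas
    plant_y = np.arange(intra_row / 2.0, field_h, intra_row)   # intra-row positions
    pts = np.array([(x, y) for x in row_x for y in plant_y])
    return pts[rng.random(len(pts)) < presence]                # stochastic presence

# Sunflower-like sowing: 45 cm inter-row, 10 cm intra-row, 5 m x 5 m field.
plants = sow_crop(5.0, 5.0, inter_row=0.45, intra_row=0.10)
```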

2.2. Weed model

Weeds, which are also described by different patterns, are placed in the field according to spatial distributions. As for crop plants, they can be represented at different growth stages, and their size varies as a function of a stochastic variable.

In crop fields, two weed spatial distributions are usually observed, point and aggregate, as presented by Cardina et al. [7].

2.2.1. Point distribution

Weed positions can be described by a discrete stochastic distribution with no memory between two successive events: the Poisson law [8].

The virtual field is split into several parts S to preserve the sparsity modeled by the law. The size of S thus depends on the WIR, and λ defines the weed density. The Poisson law, given by Eq. (2), yields the probability of obtaining k weed plants on a surface S:

$$P_S(\lambda) = P(X = k) = \frac{(\lambda S)^k}{k!}\, e^{-\lambda S} \qquad (2)$$

The number of weed plants in each area is given by Eq. (2), and their positions are chosen stochastically, with a uniform distribution restricted to the size of S.
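A sketch of this point process, assuming square cells of user-chosen side; function and parameter names are ours.

```python
import numpy as np

rng = np.random.default_rng(1)

def point_weeds(field_w, field_h, cell, lam):
    """Point (Poisson) weed distribution, Eq. (2): the field is split into
    square cells of side `cell`; the weed count per cell is drawn from
    Poisson(lam * cell area) and positions are uniform inside each cell."""
    pts = []
    for x0 in np.arange(0.0, field_w, cell):
        for y0 in np.arange(0.0, field_h, cell):
            k = rng.poisson(lam * cell * cell)       # k weeds on this surface S
            pts += list(zip(x0 + rng.uniform(0, cell, k),
                            y0 + rng.uniform(0, cell, k)))
    return np.array(pts)

weeds = point_weeds(5.0, 5.0, cell=1.0, lam=3.0)     # ~3 weeds per square metre
```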

2.2.2. Aggregate distribution

Due to multiple factors (seed pools, soil heterogeneity and suitability, crop production practices), weeds also follow aggregate distributions. This type of distribution has been proposed for forest representation with a Neyman-Scott process [9]. The process is split into two steps: germ points are first randomly chosen as the centres of given regions of the field. Second, for each region, the number of weed plants is calculated by a Poisson law (the initial parameter of the law determines the region size), and each weed is assigned a position around the germ, taking agricultural practices into account. Consequently, the shape of the aggregate is approximated by an ellipsoid oriented in the crop row direction. Since it is unlikely that weeds follow only one distribution, a mixture of both is also proposed, with the ability to control the ratio of point to aggregate distribution.
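A sketch of such a cluster process under Gaussian scatter, which we assume for illustration (the paper only specifies an ellipsoid elongated along the row direction); names are ours.

```python
import numpy as np

rng = np.random.default_rng(2)

def aggregate_weeds(field_w, field_h, n_germs, mean_weeds, sigma_row, sigma_across):
    """Neyman-Scott-like aggregate distribution: germs are drawn uniformly,
    each germ receives a Poisson number of weeds, scattered as an ellipse
    elongated along the row direction (sigma_row > sigma_across)."""
    pts = []
    for _ in range(n_germs):
        gx, gy = rng.uniform(0, field_w), rng.uniform(0, field_h)
        k = rng.poisson(mean_weeds)                        # weeds in this cluster
        xs = np.clip(gx + rng.normal(0, sigma_across, k), 0, field_w)
        ys = np.clip(gy + rng.normal(0, sigma_row, k), 0, field_h)
        pts += list(zip(xs, ys))
    return np.array(pts)

# Five clusters of ~20 weeds, stretched along the (vertical) row direction.
weeds = aggregate_weeds(5.0, 5.0, n_germs=5, mean_weeds=20,
                        sigma_row=0.5, sigma_across=0.15)
```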

3. Virtual photographs: world to camera transformation

To create an adaptive model that simulates pictures shot from a remote sensor embedded in a tractor, a drone or any other experimental platform, a simple mathematical projective model is used: the pinhole model, which maps a 3D world point into picture coordinates [10]. It requires two kinds of input parameters: the intrinsic parameters, which define the camera characteristics (CCD size, focal length, distortion), and the extrinsic parameters, which define the camera location in 3D space (rotation and translation). A matrix (Eq. (3)) encompasses them and allows a simple and fast calculation of the transformation.


$$M = \begin{pmatrix} f/\mathrm{Cell}_i & S_{uv} & u_0 \\ 0 & f/\mathrm{Cell}_j & v_0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} R & T \end{pmatrix} \qquad (3)$$

where f is the focal length, S_uv the skewness, (u_0, v_0) the principal point in pixels, and Cell_i, Cell_j the CCD cell width and height.
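A minimal sketch of this world-to-picture transformation, assuming the standard pinhole form K [R | T]; the parameter values in the example are illustrative, not taken from the paper.

```python
import numpy as np

def projection_matrix(f, s_uv, u0, v0, cell_i, cell_j, R, T):
    """3x4 pinhole projection P = K [R | T], with K built from the
    intrinsic parameters listed after Eq. (3)."""
    K = np.array([[f / cell_i, s_uv,       u0],
                  [0.0,        f / cell_j, v0],
                  [0.0,        0.0,        1.0]])
    return K @ np.hstack([R, T.reshape(3, 1)])

def project(P, X):
    """Map a 3D world point X = (x, y, z) to pixel coordinates (u, v)."""
    u, v, w = P @ np.append(X, 1.0)     # homogeneous coordinates
    return u / w, v / w                 # perspective division

# Example: camera 1.2 m above the origin, looking straight down (R = identity).
P = projection_matrix(f=8e-3, s_uv=0.0, u0=320, v0=240,
                      cell_i=6e-6, cell_j=6e-6,
                      R=np.eye(3), T=np.array([0.0, 0.0, 1.2]))
print(project(P, np.array([0.1, 0.0, 0.0])))   # a field point 10 cm off-axis
```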

The field image must lie within the camera’s field of view. Consequently, the field size is given by the smallest rectangle containing the CCD projection. As shown in Fig. 1, the difference in resolution between the field image and the CCD implies that several field pixels are mixed into each picture pixel. In particular, at the border between a vegetation leaf and the soil, a picture pixel can be composed of both types of field pixels. The resulting picture is a grey-scale picture.
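The pixel mixing of Fig. 1 can be illustrated by block averaging, assuming for simplicity an integer resolution ratio and no perspective (the actual model projects through Eq. (3)); the function name is ours.

```python
import numpy as np

def integrate_pixels(field, ccd_h, ccd_w):
    """Average blocks of high-resolution binary field pixels into one CCD
    pixel each, producing grey levels at leaf/soil borders (Fig. 1)."""
    h, w = field.shape
    bh, bw = h // ccd_h, w // ccd_w                 # field pixels per CCD pixel
    field = field[:bh * ccd_h, :bw * ccd_w].astype(float)
    return field.reshape(ccd_h, bh, ccd_w, bw).mean(axis=(1, 3))

# A 6x6 binary field image reduced to a 2x2 grey-scale picture.
field = np.zeros((6, 6), dtype=int)
field[:4, :4] = 1                       # a square "leaf"
print(integrate_pixels(field, 2, 2))    # [[1.    0.333], [0.333 0.111]]
```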

Fig. 1. Results of pixel integration and grey-level appearance.

As an example, Fig. 2 presents two pictures of two different crops (i.e. wheat and sunflower) obtained from two different virtual camera configurations.

Fig. 2. (a) Wheat picture: inter-row width = 16 cm, camera height = 1.20 m, pitch angle = 65°, roll angle = 0°, and WIR = 20% with a mixture of both distributions. (b) Sunflower picture: inter-row width = 45 cm, intra-row frequency = 10 cm, camera height = 15 m, pitch angle = 0°, roll angle = 20°, and WIR = 10% with a point distribution.

Testing discrimination algorithms in various configurations is now possible with a large range of parameters:

- crop type and its corresponding inter-row width (e.g. 18 cm for wheat, 45 cm for sunflower…),

- weed spatial distribution (point, aggregate and a mixture of both),

- Weed Infestation Rate (WIR),


- camera’s intrinsic (focal length, CCD size, skewness) and extrinsic (3D location and orientation) parameters.

Based on this knowledge, we are able to evaluate the accuracy of any spatial crop/weed discrimination algorithm by a simple comparison between the initial and detected crop and weed locations.

4. Modeling validation

To evaluate the realism of the spatial structure in the virtual pictures, it is important to note that a preprocessing step using vegetation indices and a threshold is applied to real pictures. It results in a binary picture that separates vegetation from soil [11]. The comparison has to take this step into account to be relevant. The validation procedure is described hereafter. Starting from a real picture whose extrinsic and intrinsic parameters are determined, its homologous virtual picture is created. The pinhole model was tested to validate the world-to-camera transformation: a camera calibration toolbox for Matlab [12] was used, and the resulting intrinsic and extrinsic parameters served to simulate calibration pictures (a checkerboard viewed from different positions). As the simulated checkerboard perfectly fits the original picture, we conclude that our implementation of the pinhole model offers a realistic world-to-picture transformation.

As an example, Fig. 3 presents a pre-processed sunflower field picture taken by a camera embedded in a drone (Fig. 3a) flying at a height of 10 m, and its homologous virtual picture (Fig. 3b).

Then, the degree of spatial similarity between the real picture and its homologous virtual one was estimated for both crop and weed patterns using spatial descriptors at different length scales, from local to global information.

Fig. 3. (a) Pre-processed sunflower field picture from a drone. (b) Virtual picture from initial parameters deduced from the real one.

4.1. Validation of the spatial crop distribution

To quantify the multi-scale similarity of the crop pattern, a range nearest neighbor (RNN) method [13] has been developed to highlight the spatial crop distribution.

Local moving windows are commonly used to define the local information content around a centre pixel. In our case, we consider the neighbors of vegetation pixels, i.e. pixels with a value of one. We chose patterns that describe the spatial structure of the images at different scales and grouped them according to vertical and/or horizontal symmetry. The local windows used for this study were composed of nine different patterns (with 0 or 1), defined as follows (starting from the left, clockwise): pattern 1 {0 0 0 0}; pattern 2 {1 0 0 0 or 0 0 1 0}; pattern 3 {1 0 1 0}; pattern 4 {0 1 0 0 or 0 0 0 1}; pattern 5 {1 1 0 0 or 0 1 1 0 or 1 0 0 1 or 0 0 1 1}; pattern 6 {1 1 1 0 or 1 0 1 1}; pattern 7 {0 1 0 1}; pattern 8 {1 1 0 1 or 0 1 1 1}; and pattern 9 {1 1 1 1}. The spatial information analysis based on a moving window across the whole image was conducted with various window sizes, from 1 to (image width)/2 pixels, for each of the nine patterns (Fig. 4).

An example of a local window composed of pattern 1 is presented in Fig. 4.

Fig. 4. Structure of the local moving window (example for pattern 1), centered on a pixel of intensity equal to one. The window size d varies from 1 to (image width)/2, allowing a spatial study at different scales.
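The following sketch illustrates this pattern counting under our reading of the neighbor ordering (left, top, right, bottom at distance d); the dictionary and function names are hypothetical.

```python
import numpy as np

# The nine symmetry classes listed above; 4-neighbour configurations are read
# (left, top, right, bottom), i.e. clockwise starting from the left.
PATTERNS = {
    (0, 0, 0, 0): 1,
    (1, 0, 0, 0): 2, (0, 0, 1, 0): 2,
    (1, 0, 1, 0): 3,
    (0, 1, 0, 0): 4, (0, 0, 0, 1): 4,
    (1, 1, 0, 0): 5, (0, 1, 1, 0): 5, (1, 0, 0, 1): 5, (0, 0, 1, 1): 5,
    (1, 1, 1, 0): 6, (1, 0, 1, 1): 6,
    (0, 1, 0, 1): 7,
    (1, 1, 0, 1): 8, (0, 1, 1, 1): 8,
    (1, 1, 1, 1): 9,
}

def pattern_histogram(img, d):
    """Count, for window size d, how often each of the nine patterns occurs
    around vegetation pixels (value one) of a binary image."""
    counts = np.zeros(9, dtype=int)
    h, w = img.shape
    ys, xs = np.nonzero(img[d:h - d, d:w - d])   # centre pixels, borders excluded
    for y, x in zip(ys + d, xs + d):
        key = (int(img[y, x - d]), int(img[y - d, x]),
               int(img[y, x + d]), int(img[y + d, x]))
        counts[PATTERNS[key] - 1] += 1
    return counts

# Occurrence curves for window sizes 1 .. (image width)/2, as in the text.
img = (np.random.default_rng(3).random((64, 64)) < 0.2).astype(int)
curves = np.array([pattern_histogram(img, d) for d in range(1, 32)])
```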

The number of occurrences in the real picture (line) and the virtual picture (dots) is analyzed as a function of the local window size for each pattern. Figure 5 presents the results obtained for each of the nine patterns. The error bars, deduced from the simulation of 2000 virtual images, indicate an estimate of the standard deviation of the virtual sampling distribution. For each of the nine local window patterns tested, the standard deviation σ_simulated is reported in Table 1.


Fig. 5. Comparison of the evolution of the crop spatial information at different length scales in the real picture (line) and the virtual picture (dots) for every pattern. The width ([−σ_simulated, +σ_simulated]) of the error bars characterizes the standard deviation of a virtual data set composed of 2000 images.


To assess the quality of the crop field modeling, we calculate the Root Mean Square Error (RMSE), defined as follows:

$$\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{n=1}^{N} \left( \rho_{\mathrm{simulated}}(X_n) - \rho_{\mathrm{real}}(X_n) \right)^2} \qquad (4)$$

where ρ_real and ρ_simulated are the real and virtual data issued from the pictures, and N is the number of points. RMSE values are expressed in pixel units.

The RMSE values (Table 1) have been calculated for the nine patterns of local window tested. The smallest RMSE reaches 0.0003 (local window with pattern 9) and the largest 0.008 (local window with pattern 4). These values have to be compared with the measurement accuracy (σ_simulated): RMSE values lower than σ_simulated indicate that the spatial crop pattern is well modeled; otherwise, the modeling of the crop pattern must be improved.
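For reference, Eq. (4) reduces to a one-line computation; the curve values below are made-up numbers.

```python
import numpy as np

def rmse(simulated, real):
    """Eq. (4): root mean square error between the simulated and real
    occurrence curves, sampled at the same N window sizes."""
    simulated, real = np.asarray(simulated, float), np.asarray(real, float)
    return float(np.sqrt(np.mean((simulated - real) ** 2)))

print(rmse([0.10, 0.20, 0.30], [0.11, 0.19, 0.31]))  # 0.01
```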

From Table 1, the comparison between the RMSE and σ values shows that the crop field model creates virtual pictures that clearly mimic the crop spatial information contained in the real picture.

Other pairs of pictures (real/virtual) have been tested, in particular pictures with perspective effects such as those shown in Fig. 2. The validation results are not as good as those obtained for non-perspective images, but the RMSE values (< σ_simulated) confirm the ability of the crop field model to reproduce real spatial crop sowing.

Table 1. σ_simulated and RMSE values for the nine patterns, deduced from the real picture and its homologous virtual one presented in Fig. 3.

Pattern number   1        2        3        4        5        6        7        8        9
σ_simulated      0.0407   0.0437   0.0472   0.0572   0.0160   0.0032   0.0271   0.0071   0.0006
RMSE             0.0045   0.0077   0.0060   0.0080   0.0037   0.0011   0.0037   0.0038   0.0003

4.2. Validation of the spatial weed plant distribution

In the case of weed plants, the approach was simplified: each plant is represented by a point defined by its spatial coordinates (x, y). Among all the methods [14–16] available to analyze the spatial structure of a point pattern, we consider Besag’s L(r) function [17], a linearization of Ripley’s K(r) function [18]. Compared to other similar methods such as the quadrat method [19], it allows the spatial distribution (Poisson, Neyman-Scott…) to be characterized at different ranges [20]. Assuming a distribution with a homogeneous density, Ripley’s function is defined by:

$$\hat{K}(r) = \frac{1}{\hat{\lambda}} \cdot \frac{1}{N} \sum_{i=1}^{N} \sum_{j \neq i} k_{ij} \qquad (5)$$

where i and j are weed plants, k_ij = 1 if the distance between i and j is lower than r (0 otherwise), λ̂ = N/S is the global density of the picture, N is the number of points in the pattern, and S is the area of the studied region. The quantity λ̂K̂(r) is the expected number of neighbors in a circle of radius r centered on an arbitrary point of the process [18].

Since this function is usually complex to analyze, a simplification of its interpretation is proposed by Besag [17], who defines the linearized function L̂(r) as:

$$\hat{L}(r) = \sqrt{\frac{\hat{K}(r)}{\pi}} - r \qquad (6)$$


Three types of spatial distribution can then be characterized by Besag’s function. For a Poisson pattern, the result must remain inside a predefined confidence interval and the function value is zero. For a clustered pattern, L̂(r) is positive at distance r, while it is negative for a regular pattern at distance r.
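A compact sketch of Eqs. (5) and (6), assuming a rectangular study region and ignoring the edge corrections that a production estimator would include; names are ours.

```python
import numpy as np

def besag_L(points, area, r):
    """Estimate Ripley's K(r) (Eq. (5)) for a point pattern in a region of the
    given area, and return Besag's linearization L(r) (Eq. (6)).
    L(r) ~ 0: Poisson; L(r) > 0: clustered; L(r) < 0: regular."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    lam = n / area                                   # global density: lambda = N/S
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                      # exclude the i == j pairs
    K = np.count_nonzero(d < r) / (lam * n)          # (1/lambda)(1/N) sum k_ij
    return np.sqrt(K / np.pi) - r

# A uniform pattern of 177 points on a 5 m x 5 m plot behaves like Poisson.
pts = np.random.default_rng(4).uniform(0.0, 5.0, size=(177, 2))
print(besag_L(pts, area=25.0, r=0.5))                # close to zero
```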

Fig. 6. (a) and (b): the real and the virtual weed patterns; the pictures contain 177 weed plants. (c) and (d): the point weed distributions associated with each weed plant pattern for the real and virtual pictures. (e) and (f): Besag’s functions deduced from the real and virtual pictures, calculated with a confidence interval of 1%. In both cases, they reveal that the weed plant pattern is a homogeneous point distribution.

To estimate the degree of spatial similarity of the weed patterns between the real picture and the homologous virtual one, the procedure is based on two steps:


- the weed plant pattern (Fig. 6a), and thus a point pattern (Fig. 6c), is manually extracted from a real picture. Then the homologous virtual picture of weed plants (Fig. 6b) and its point pattern (Fig. 6d) are created according to the number of weed plants and the weed distribution (Poisson or Neyman-Scott process).

- the validation of the spatial distribution model of weed plants is obtained by a simple comparison between the types of spatial distribution characterized by Besag’s function deduced from the real and virtual pictures (Fig. 6e and Fig. 6f). If the distributions are similar, the model correctly describes the spatial statistical reality of weed plants.

Several real pictures have been tested successfully with this procedure: for each pair of (real, virtual) pictures, we verified that the weed plants follow similar distributions.

The validation of Poisson and Neyman-Scott processes by Ripley’s function could also prove useful in other applications, such as radiography or defect detection in materials, in which these processes are used to simulate noise or material textures.

5. Conclusion and perspectives

A two-dimensional statistical and spatial crop field model was developed; it allows given specificities (crop size, inter-row width…) to be selected for different crop species (wheat, sunflower, maize…). The Weed Infestation Rate is set on demand with three different spatial distributions (point, aggregate, or a mixture), ensuring the diversity of the resulting simulated fields. The pinhole model is used to simulate snapshots of virtual fields with total freedom in the parameter choice (from the camera position to its intrinsic parameters). In summary, a powerful tool that can be adapted to any experimental device is proposed to evaluate the robustness of spatial discrimination algorithms.

The spatial realism of the model was validated by spatial descriptors examining the spatial information at different scales through pairs of (real, virtual) pictures. Concerning the spatial crop distribution, a nearest neighbor method based on a local window moving across the whole image was applied with various window sizes and various patterns to both real and virtual pictures. The resulting RMSE values confirm the ability of the model to mimic the agronomic reality. In the case of the weed distribution, the spatial descriptor is based on Besag’s function, revealing whether or not weed plants have a homogeneous (i.e. point or aggregate) distribution in real conditions. Moreover, the weed distribution in the homologous virtual picture is compared to the one deduced from the real picture.

Currently, the model can be run to generate large image databases with different WIRs in order to correctly assess the performance of any crop/weed discrimination algorithm based on spatial information.

The simplicity of the model allows the addition of other layers to bring more functionality. A promising one is to explore spectral possibilities to discriminate crops from weeds. To add this spectral layer, we need a tool that allows the calculation of a spectral signal with respect to the viewing angle, the lighting angle and the object type (crop, weed and soil) [21]. A radiative transfer model developed by Hapke [22] allows a set of parameters to be determined from multiple sample measurements (with different camera and light orientations), and it can be used to approximate the spectral signal of the sample in different configurations. This approach requires a large number of precise spectral measurements to obtain reliable Hapke parameters, and a goniospectrophotometer can accomplish this task.

Acknowledgements

This work was funded by the Burgundy Council, the European Social Fund (FSE) and the Tecnoma Company (http://www.tecnoma.com).
