LSGI332 Remote Sensing. Lecture 11: Image Classification. Lillesand and Kiefer, 6th edition, pp. 545-581, 585-592

Remote Sensing Lec 11



Page 1: Remote Sensing Lec 11

LSGI332 Remote sensing

Lecture 11: Image Classification. Lillesand and Kiefer, 6th edition, pp. 545-581, 585-592

Page 2: Remote Sensing Lec 11

Image classification

• Objective: automatically categorise all pixels using the numerical spectral pattern of each pixel. NB: ‘pattern’ refers to the spectral pattern, not the spatial pattern
• Spectral pattern recognition includes a range of classification procedures
• Usually refers to land cover mapping
• Uses more than one band: otherwise it is a density slice
• Comprises 2 main approaches:
– Supervised, and
– Unsupervised

Page 3: Remote Sensing Lec 11

Classified SPOT image of Hong Kong by Supervised classification method

Note the scattered water pixels. What causes these?

Page 4: Remote Sensing Lec 11


Image classification: spectral pattern recognition

Fig. 7.18

Page 5: Remote Sensing Lec 11

Supervised classification

• Operator controlled
• There are three distinct stages:
– Training
– Classification
– Output

Page 6: Remote Sensing Lec 11


Supervised classification

Fig. 7.17

Page 7: Remote Sensing Lec 11

Training stage in supervised classification

• Training stage: the analyst identifies representative training areas and develops a numerical description of the spectral attributes of each cover type in the scene
• The analyst must develop training statistics for all spectral classes constituting each information class to be discriminated by the classifier

Examples:
• ‘water’ may be represented by 4 or 5 spectral classes
• ‘agriculture’ may contain several crop types, and each crop type may contain several spectral classes due to different planting dates, moisture conditions, etc.

Page 8: Remote Sensing Lec 11

Training area creation

• Main objective: to assemble a set of descriptive statistics
• Training areas must be representative and complete
• ‘The point that must be emphasised is that all spectral classes constituting each information class must be adequately represented in the training set statistics used to classify an image’ (Lillesand and Kiefer)

Page 9: Remote Sensing Lec 11


Training area statistics for one class, 6 bands

Page 10: Remote Sensing Lec 11

Sample histograms for pixels in training areas for the cover type ‘hay’

Page 11: Remote Sensing Lec 11

Coincident spectral plots for training areas in 5 bands for 6 cover types

• Very useful for examining the data before classifying
• The thermal band is often very useful if the optical bands overlap

Page 12: Remote Sensing Lec 11

Classification stage

• each pixel in the dataset is categorised into the land cover class it most closely resembles
• if not similar to any class, it is labelled ‘unknown’
• the category label is written to the output image
• every pixel is classified – a pixel-by-pixel operation (unlike image enhancement and image filtering)

Page 13: Remote Sensing Lec 11

Classification methods

• Several different approaches:
– Minimum-distance-to-means classifier
– Parallelepiped classifier
– Gaussian maximum likelihood classifier
– Neural network
• Listed in order of increasing sophistication

Page 14: Remote Sensing Lec 11

Scatterplots for training areas from SPOT image of Hong Kong

[Figure: a pixel in 2-dimensional space (Band 1 vs Band 2); training area pixels plotted as TM Band 3 vs TM Band 4 (0-255) for the classes concrete, high buildings, grass slope, water, bare soils and forest]

Training area pixels in 2-D space: should be visualised as n-dimensional

Page 15: Remote Sensing Lec 11

Scattergram of training areas

• scattergram illustrating the pixel observations of six cover types

[Figure: scattergram, TM Band 3 vs TM Band 4 (0-255), with the six classes labelled]

Cover type      Colour   No. of points
Water           Cyan     3793
Concrete        Purple   975
High buildings  Thistle  1866
Bare soils      Coral    784
Grass slope     Yellow   924
Forest          Green    3122

Page 16: Remote Sensing Lec 11

Scattergram of real image

• Spectral ranges of the training areas on the scattergram (right) and the image classes to which they belong (left).

[Figure: scattergram, TM Band 3 (14-88) vs TM Band 4 (8-90), with the Concrete and Water class ranges outlined]

Page 17: Remote Sensing Lec 11

Scatter plot for training areas

Source: Lillesand and Kiefer, 6th ed.

Page 18: Remote Sensing Lec 11


Minimum Distance to Means Classifier
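The decision rule can be sketched in a few lines of Python. This is not the lecture's implementation: the two bands, class names and mean vectors below are invented purely for illustration.

```python
import math

# Hypothetical class mean vectors in two bands (e.g. TM Band 3, TM Band 4),
# as would be derived from training area statistics -- values are made up.
class_means = {
    "water":  (20, 15),
    "forest": (40, 90),
    "soil":   (80, 60),
}

def min_distance_classify(pixel, means):
    """Assign the pixel to the class whose mean vector is nearest
    in spectral space (Euclidean distance)."""
    return min(means, key=lambda cls: math.dist(pixel, means[cls]))

# A pixel dark in both bands falls closest to the 'water' mean.
print(min_distance_classify((22, 18), class_means))  # -> water
```

In practice a distance threshold is often added so that pixels far from every class mean are labelled ‘unknown’.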

Page 19: Remote Sensing Lec 11

Scatterplot of training area pixels for SPOT image of Hong Kong, with cluster means

[Figure: scatterplot, TM Band 3 vs TM Band 4 (0-255); classes: concrete, high buildings, grass slope, water, bare soils, forest; two candidate pixels, 1 and 2, shown relative to the cluster means]

Page 20: Remote Sensing Lec 11

Parallelepiped Classifier
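A minimal sketch of the parallelepiped rule in Python (the band ranges and class names are invented for illustration; real training statistics supply the min/max per band):

```python
# Hypothetical (min, max) range per band for each class, as taken from
# training area statistics -- the values here are made up.
class_boxes = {
    "water":  [(10, 30), (5, 25)],    # band 1 range, band 2 range
    "forest": [(30, 55), (70, 110)],
}

def parallelepiped_classify(pixel, boxes):
    """Label the pixel with the first class whose box contains it in
    every band; 'unknown' if it lies outside all boxes."""
    for cls, ranges in boxes.items():
        if all(lo <= value <= hi for value, (lo, hi) in zip(pixel, ranges)):
            return cls
    return "unknown"

print(parallelepiped_classify((20, 10), class_boxes))    # -> water
print(parallelepiped_classify((200, 200), class_boxes))  # -> unknown
```

Note the classic weakness: when class boxes overlap, a pixel in the overlap is simply given to the first matching class.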

Page 21: Remote Sensing Lec 11

Scatterplot of training area pixels for SPOT image of Hong Kong, with boxes drawn for the range of each class

[Figure: scatterplot, TM Band 3 vs TM Band 4 (0-255); classes: concrete, high buildings, grass slope, water, bare soils, forest; two candidate pixels, 1 and 2, shown relative to the class boxes]

Page 22: Remote Sensing Lec 11

Parallelepiped Classifier with stepped class region boundaries

Page 23: Remote Sensing Lec 11


Equiprobability Contours defined by Maximum Likelihood Classifier

Page 24: Remote Sensing Lec 11


Probability density function for Maximum Likelihood Classifier
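The maximum likelihood rule can be sketched as follows, with invented training statistics. For simplicity this assumes independent bands (a diagonal covariance matrix); the full method uses the complete covariance matrix of each class.

```python
import math

# Hypothetical per-class mean and variance in each band (made-up values).
class_stats = {
    "water":  {"mean": (20, 15), "var": (16, 16)},
    "forest": {"mean": (40, 90), "var": (25, 64)},
}

def log_likelihood(pixel, mean, var):
    """Log of the Gaussian density, summed over (assumed independent) bands."""
    return sum(-0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
               for x, m, v in zip(pixel, mean, var))

def max_likelihood_classify(pixel, stats):
    """Assign the pixel to the class with the highest likelihood."""
    return max(stats, key=lambda cls: log_likelihood(
        pixel, stats[cls]["mean"], stats[cls]["var"]))

print(max_likelihood_classify((22, 18), class_stats))  # -> water
```

Cutting the likelihood surface at chosen levels gives the equiprobability contours shown on the following slides.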

Page 25: Remote Sensing Lec 11

Equiprobability contours plotted against scatterplot of Hong Kong training area data

[Figure: scatterplot, TM Band 3 vs TM Band 4 (0-255); classes: concrete, high buildings, grass slope, water, bare soils, forest; two candidate pixels, 1 and 2, shown relative to the equiprobability contours]

Page 26: Remote Sensing Lec 11


Comparison of SPOT image classified by Maximum Likelihood and Minimum Distance to Means classifiers

Page 27: Remote Sensing Lec 11

Unsupervised classification

• Software controlled
• Examines all bands for each pixel in an image – the process is iterative
• Aggregates pixels into a number of classes based on natural groupings or clusters present in the image values
• Results in spectral classes (clusters) whose identity is initially unknown
• Each class is then identified from knowledge or reference data
• Operator intervention only after the data have been placed into classes
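The clustering loop behind this can be sketched with basic k-means (the ISODATA algorithm used in practice adds splitting and merging of clusters on top of a loop like this; the pixel values below are synthetic):

```python
import random

def kmeans(pixels, k, iterations=20, seed=0):
    """Tiny k-means sketch: alternate between assigning pixels to the
    nearest cluster centre and recomputing each centre as the mean."""
    rng = random.Random(seed)
    centres = rng.sample(pixels, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centres[c])))
            clusters[nearest].append(p)
        for i, members in enumerate(clusters):
            if members:  # keep the old centre if a cluster empties
                centres[i] = tuple(sum(band) / len(members)
                                   for band in zip(*members))
    return centres

# Two natural groupings in two bands; the centres converge to their means.
pixels = [(10, 12), (12, 10), (11, 11), (90, 88), (88, 91), (91, 90)]
print(sorted(kmeans(pixels, 2)))
```

The resulting spectral clusters would then be labelled by the analyst using reference data, as described above.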

Page 28: Remote Sensing Lec 11


Menu for Unsupervised classification

Page 29: Remote Sensing Lec 11


ISODATA classification of part of Guangdong

Page 30: Remote Sensing Lec 11


Classes resulting from Unsupervised Classification

Page 31: Remote Sensing Lec 11

Classification and post-processing

• Classified image before (left) and after (right) the post-classification process (majority filter)

Classes: Water, Concrete, High buildings, Bare soils, Grass slope, Forest
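A majority filter of the kind used here can be sketched in Python (a 3 x 3 window over a toy label image; the class letters are illustrative):

```python
from collections import Counter

def majority_filter(classified, size=3):
    """Replace each pixel's label with the most common label inside its
    size x size neighbourhood (truncated at the image edges)."""
    h, w = len(classified), len(classified[0])
    r = size // 2
    out = [row[:] for row in classified]
    for y in range(h):
        for x in range(w):
            window = [classified[j][i]
                      for j in range(max(0, y - r), min(h, y + r + 1))
                      for i in range(max(0, x - r), min(w, x + r + 1))]
            out[y][x] = Counter(window).most_common(1)[0][0]
    return out

# An isolated 'W' (water) pixel inside 'F' (forest) is smoothed away.
img = [["F", "F", "F"],
       ["F", "W", "F"],
       ["F", "F", "F"]]
print(majority_filter(img))
```

This is the kind of operation that removes the scattered water pixels noted on the classified SPOT image earlier.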

Page 32: Remote Sensing Lec 11

Problems of land use classification in urban areas

• Similar spectral response of urban surfaces
• Bi-modal spectral response of urban land use features
• Land cover is a more useful concept than land use

Page 33: Remote Sensing Lec 11

A. The nature of urban areas affecting image interpretation

Source: Remote Sensing of the Environment by J.R. Jensen, 2000

(i) Spectral characteristics of common urban materials

Page 34: Remote Sensing Lec 11


Appearance of tree canopy: colour infra-red air photo and IKONOS multispectral images

Page 35: Remote Sensing Lec 11


Product validation: Accuracy assessment

• Error matrix (or confusion table)
– Comparison of the relationship between known reference data and the corresponding results of the classification
– It is common to average the correct classifications and regard this as the overall classification accuracy (in this case 81%)
– Omission errors are shown in the columns and commission errors in the rows

Page 36: Remote Sensing Lec 11

Example of confusion matrix

Training data (known cover types) in columns; classification results in rows

Classification    Water  Concrete  High bldgs  Bare soils  Grass slopes  Forest  Row total
Water                93         0           2           1             0       0         96
Concrete              0        65           4           6             0       0         75
High buildings        2         3         124           5             9      12        155
Bare soils            2         3          21         165            24      12        227
Grass slopes          0         0           6          16           201      45        268
Forest                0         0           8           9            76     512        605
Column total         97        71         165         202           310     581       1426

Producer's accuracy                        User's accuracy
W = 93/97 = 96%    B = 165/202 = 82%       W = 93/96 = 97%    B = 165/227 = 73%
C = 65/71 = 92%    G = 201/310 = 65%       C = 65/75 = 87%    G = 201/268 = 75%
H = 124/165 = 75%  F = 512/581 = 88%       H = 124/155 = 80%  F = 512/605 = 85%

Overall accuracy = (93 + 65 + 124 + 165 + 201 + 512) / 1426 = 81%
Kappa = (1160 - 365.11) / (1426 - 365.11) = 0.749

• 124 sample points have been correctly classified as high buildings
• but 2 genuine high-building samples have been classified as water,
• and 2 water samples have been classified as high buildings
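These figures can be reproduced directly from the matrix; a short Python check (the matrix values are those in the table above):

```python
# Error matrix from the slide: rows = classification results,
# columns = training (reference) data.
m = [[93, 0, 2, 1, 0, 0],
     [0, 65, 4, 6, 0, 0],
     [2, 3, 124, 5, 9, 12],
     [2, 3, 21, 165, 24, 12],
     [0, 0, 6, 16, 201, 45],
     [0, 0, 8, 9, 76, 512]]

n = sum(sum(row) for row in m)                  # 1426 samples
diagonal = sum(m[i][i] for i in range(len(m)))  # 1160 correct
overall = diagonal / n
producers = [m[i][i] / sum(row[i] for row in m) for i in range(len(m))]
users = [m[i][i] / sum(m[i]) for i in range(len(m))]

print(f"overall accuracy = {overall:.0%}")         # -> 81%
print(f"producer's (Water) = {producers[0]:.0%}")  # -> 96%
print(f"user's (Water) = {users[0]:.0%}")          # -> 97%
```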

Page 37: Remote Sensing Lec 11

Interpretation of confusion matrix

• Producer's accuracy
– Correctly classified pixels / column total (the total number of reference pixels actually belonging to the class), e.g. 124/165 = 75%
– Meaning: there are 165 pixels which are actually High Buildings, and only 124 of those are correctly assigned to High Buildings
• User's accuracy
– Correctly classified pixels / row total (the total number of pixels classified as the class), e.g. 124/155 = 80%
– Meaning: a total of 155 pixels have been classified as High Buildings, but only 124 of those are actually High Buildings

Page 38: Remote Sensing Lec 11

Accuracy assessment: kappa coefficient

• Global assessment of classification accuracy: Cohen's kappa coefficient (κ)
• The optimal score is 1.0 (perfect classification). In the previous example, N = 1426, d = 1160, q = 520609, and κ = 0.749.

κ = (N·d − q) / (N² − q)

where
N = total number of samples
d = total number of cases in the diagonal cells of the error matrix
q = Σ_i (x_{i+} · x_{+i})
x_{i+} = Σ_j x_{ij}, i.e. the sum over all columns for row i
x_{+j} = Σ_i x_{ij}, i.e. the sum over all rows for column j
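As a check, κ can be computed directly from the error matrix on the previous slide (here q is derived from the row and column totals rather than taken from the slide):

```python
# Error matrix: rows = classification results, columns = reference data.
m = [[93, 0, 2, 1, 0, 0],
     [0, 65, 4, 6, 0, 0],
     [2, 3, 124, 5, 9, 12],
     [2, 3, 21, 165, 24, 12],
     [0, 0, 6, 16, 201, 45],
     [0, 0, 8, 9, 76, 512]]

n_total = sum(sum(row) for row in m)             # N = 1426
d = sum(m[i][i] for i in range(len(m)))          # d = 1160
row_totals = [sum(row) for row in m]
col_totals = [sum(row[j] for row in m) for j in range(len(m))]
q = sum(r * c for r, c in zip(row_totals, col_totals))

# kappa = (N*d - q) / (N^2 - q)
kappa = (n_total * d - q) / (n_total ** 2 - q)
print(f"kappa = {kappa:.3f}")  # -> 0.749
```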

Page 39: Remote Sensing Lec 11


Output stage

• production of thematic maps, tables or statistics

• the classification output becomes a GIS input

• needs knowledge of area and/or reference data