Lecture 12: Introduction to Computer Vision - Image Texture Analysis
Posted 14-Dec-2015

TRANSCRIPT


A few examples

• Morphological processing for background illumination estimation

• Optical character recognition

Roger S. Gaborski


Image with nonlinear illumination

Original image | Thresholded with graythresh

Obtain Estimate of Background


background = imopen(I,strel('disk',15)); %GRAYSCALE

figure, imshow(background, [])

figure, surf(double(background(1:8:end,1:8:end))),zlim([0 1]);


%subtract background estimate from original image

I2 = I - background;

figure, imshow(I2), title('Image with background removed')

level = graythresh(I2);

bw = im2bw(I2,level);

figure, imshow(bw),title('threshold')
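The opening-then-subtract idea above can be sketched outside MATLAB as well. The following is a minimal pure-Python sketch (not the lecture's code) on a 1D signal: grayscale opening is an erosion (local minimum) followed by a dilation (local maximum), which flattens narrow bright objects while tracking the slowly varying illumination; subtracting the opened signal leaves the objects.

```python
# Sketch: grayscale opening = erosion (local min) then dilation (local max),
# used to estimate a smooth background, as in imopen(I, strel(...)).
def erode(sig, k):
    r = k // 2
    return [min(sig[max(0, i - r):i + r + 1]) for i in range(len(sig))]

def dilate(sig, k):
    r = k // 2
    return [max(sig[max(0, i - r):i + r + 1]) for i in range(len(sig))]

def opening(sig, k):
    return dilate(erode(sig, k), k)

# A bright narrow peak (an "object") sitting on a sloping illumination ramp:
signal = [10, 11, 12, 13, 90, 15, 16, 17]
background = opening(signal, 3)               # peak removed, ramp kept
removed = [s - b for s, b in zip(signal, background)]
print(background)
print(removed)
```

The structuring element here is a flat length-3 window, standing in for the disk-shaped strel in the MATLAB snippet; the window is truncated at the borders, which differs from MATLAB's padding behavior.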


Comparison

Original | Thresholded | Background removal + threshold

Optical Character Recognition

• After segmenting a character we still need to recognize the character.

• How do we determine if a matrix of pixels represents an ‘A’, ‘B’, etc?


Approach

• Select line of text
• Segment each letter
• Recognize each letter as ‘A’, ‘B’, ‘C’, etc.


Select line 3:

Samples of segmented individual letters in line 3:


• We need labeled samples of each potential letter to compare to unknown

• Take the product of the unknown character with each labeled character and determine which labeled character is the closest match


%Load Database of characters (samples of known characters)

load charDB08182009.mat

whos char08182009

Name           Size      Bytes    Class    Attributes

char08182009   26x1050   218400   double

Each row is a vectorized character bitmap.


BasicOCR.m

CODE SOMETHING LIKE THIS:

cc = ['A' 'B' 'C' 'D' 'E' 'F' 'G' 'H' 'I' 'J' 'K' 'L' 'M' 'N' 'O' ...
      'P' 'Q' 'R' 'S' 'T' 'U' 'V' 'W' 'X' 'Y' 'Z'];

% First, convert the matrix of the text character to a row vector t

for j = 1:26
    score(j) = sum(t .* char08182009R(j,:));
end

ind(i) = find(score == max(score));

fprintf('Recognized Text %s, \n', cc(ind))

OUTPUT: Recognized Text HANSPETERBISCHOF,
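The template-matching idea above can be sketched in a few lines. This is a toy illustration with made-up 3x3 bitmaps and hypothetical template names; the lecture's charDB08182009 database is not reproduced here.

```python
# Sketch: recognize a character by taking the inner product of its
# vectorized bitmap with each labeled template and picking the max,
# as in score(j) = sum(t .* row).
templates = {
    'I': [0, 1, 0,  0, 1, 0,  0, 1, 0],   # hypothetical 3x3 'I' bitmap
    'L': [1, 0, 0,  1, 0, 0,  1, 1, 1],   # hypothetical 3x3 'L' bitmap
}

def recognize(t):
    scores = {c: sum(a * b for a, b in zip(t, row))
              for c, row in templates.items()}
    return max(scores, key=scores.get)   # label of the closest match

print(recognize([0, 1, 0,  0, 1, 0,  0, 1, 0]))   # -> I
```

Note that a plain inner product favors templates with many on-pixels, so practical systems usually normalize the scores; the slide's method is shown as-is.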

How can I segment this image?

University of Bonn

Assumption: uniformity of intensities in local image region

What is Texture?


• Edge Detection
• Histogram
• Threshold – graythresh


lev = graythresh(I)

lev =

0.5647

>> figure, imshow(I<lev)

What is Texture?

• No formal definition
  – There is significant variation in intensity levels between nearby pixels
  – Variations of intensities form certain repetitive patterns (homogeneous at some spatial scale)
  – The local image statistics are constant or slowly varying
• Human visual system: textures are perceived as homogeneous regions, even though textures do not have uniform intensity


Texture

• Apparent homogeneous regions:
  – In both cases the HVS will interpret areas of sand or bricks as a ‘region’ in an image
  – But close inspection will reveal strong variations in pixel intensity

A brick wall | Sand on a beach

Texture

• Is the property of a ‘group of pixels’/area; a single pixel does not have texture
• Is scale dependent
  – at different scales texture will take on different properties
• Involves a large number of (if not countless) primitive objects
  – If the objects are few, then a group of countable objects is perceived instead of texture
• Involves the spatial distribution of intensities
  – 2D histograms
  – Co-occurrence matrices

Scale Dependency

• Scale is important – consider sand
• Close up
  – “small rocks, sharp edges”
  – “rough looking surface”
  – “smoother”
• Far away
  – “one object – brown/tan color”

Terms (Properties) Used to Describe Texture

• Coarseness
• Roughness
• Direction
• Frequency
• Uniformity
• Density

How would you describe dog fur, cat fur, grass, wood grain, pebbles, cloth, steel?

“The object has a fine grain and a smooth surface”

• Can we define these terms precisely in order to develop a computer vision recognition algorithm?


Features

• Tone – based on pixel intensity in the texture primitive
• Structure – spatial relationships between primitives
• A pixel can be characterized by the tonal/structural properties of the group of pixels it belongs to

• Tonal:
  – Average intensity
  – Maximum intensity
  – Minimum intensity
  – Size, shape
• Spatial Relationship of Primitives:
  – Random
  – Pair-wise dependent

Artificial Texture


Artificial Texture


Segmenting into regions based on texture

Color Can Play an Important Role in Texture

Statistical and Structural Texture


Consider a brick wall:

• Statistical Pattern – close up pattern in bricks
• Structural (Syntactic) Pattern – the brick pattern on the previous slides can be represented by a grammar, such as ababab


Most current research focuses on statistical texture.

Edge density is a simple texture measure: edges per unit distance.

Segment object based on edge density.

How do we estimate edge density?

Move a window across the image and count the number of edges in the window.

ISSUE – window size?
How large should the window be?
What are the tradeoffs?
How does window size affect accuracy of segmentation?

Segment object based on edge density.

Move a window across the image and count the number of edges in the window.

ISSUE – window size?
How large should the window be?
  Large enough to get a good estimate of edge density.
What are the tradeoffs?
  Larger windows result in larger overlap between textures.
How does window size affect accuracy of segmentation?
  Smaller windows result in better region segmentation accuracy, but a poorer estimate of edge density.

Segment object based on edge density.

Average Edge Density Algorithm

• Smooth image to remove noise
• Detect edges by thresholding image
• Count edges in n x n window
• Assign the count to the window position
• Feature vector: [gray level value, edge density]
• Segment image using feature vector
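The counting step above can be sketched as follows. This is a minimal pure-Python sketch (not the lecture's code), assuming a binary edge map is already available and that windows are simply truncated at the image border.

```python
# Sketch: average edge density = fraction of edge pixels in an n x n
# window centered on each pixel (window truncated at the borders).
def edge_density(edges, n):
    h, w, r = len(edges), len(edges[0]), n // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            win = [edges[y][x]
                   for y in range(max(0, i - r), min(h, i + r + 1))
                   for x in range(max(0, j - r), min(w, j + r + 1))]
            out[i][j] = sum(win) / len(win)   # edge pixels per window pixel
    return out

edge_map = [[0, 0, 0, 0],
            [1, 1, 1, 1],   # a horizontal edge
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
d = edge_density(edge_map, 3)
```

The window-size tradeoff on the slide shows up directly here: a larger n smooths the density estimate but blurs the boundary between textured regions.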

Run Length Coding Statistics

• Runs of ‘similar’ gray level pixels• Measure runs in the directions 0,45,90,135

Image:

0 1 2 3
0 2 3 3
2 1 1 1
3 0 3 0

Y(L, LEV, d)

where Y is the number of runs of length L, LEV is the gray level value, and d is the direction.

Image:

0 1 2 3
0 2 3 3
2 1 1 1
3 0 3 0

Empty run-length tables for 0 degrees and 45 degrees: rows are gray level LEV (0–3), columns are run length L (1–4). (Filled in on the next slide.)

Image:

0 1 2 3
0 2 3 3
2 1 1 1
3 0 3 0

0 degrees (rows: gray level LEV; columns: run length L):

LEV\L  1  2  3  4
0      4  0  0  0
1      1  0  1  0
2      3  0  0  0
3      3  1  0  0

45 degrees:

LEV\L  1  2  3  4
0      4  0  0  0
1      4  0  0  0
2      0  0  1  0
3      3  1  0  0
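The 0-degree table can be computed mechanically by splitting each row of the image into maximal runs of equal values. A small Python sketch (not the lecture's code):

```python
# Sketch: run-length matrix at 0 degrees (horizontal runs) for the 4x4
# example image; Y[lev][L-1] counts runs of gray level lev with length L.
from itertools import groupby

def run_length_matrix_0deg(img, levels, max_run):
    Y = [[0] * max_run for _ in range(levels)]
    for row in img:
        for lev, run in groupby(row):        # maximal runs of equal values
            Y[lev][len(list(run)) - 1] += 1
    return Y

img = [[0, 1, 2, 3],
       [0, 2, 3, 3],
       [2, 1, 1, 1],
       [3, 0, 3, 0]]
Y = run_length_matrix_0deg(img, levels=4, max_run=4)
for r in Y:
    print(r)
```

The 45-, 90-, and 135-degree matrices are built the same way after extracting diagonals or columns instead of rows.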

Run Length Coding

• For gray level images with 8 bits: 256 shades of gray gives 256 rows
• For a 1024x1024 image the maximum run length is 1024, giving 1024 columns
• Reduce the size of the matrix by quantizing:
  – Instead of 256 shades of gray, quantize each 8 levels into one, resulting in 256/8 = 32 rows
  – Quantize runs into ranges; runs 1–8 in the first column, 9–16 in the second, and so on, resulting in 128 columns

Gray Level Co-occurrence Matrix, P[i,j]

• Specify displacement vector d = (dx, dy)
• Count all pairs of pixels separated by d having gray level values i and j. Formally:

P(i, j) = |{ ((x1, y1), (x2, y2)) : I(x1, y1) = i, I(x2, y2) = j, (x2, y2) = (x1 + dx, y1 + dy) }|

Gray Level Co-occurrence Matrix

• Consider simple image with gray level values 0,1,2

2 1 2 0 1
0 2 1 1 2
0 1 2 2 0
1 2 2 0 1
2 0 1 0 1

• Let d = (1,1): one pixel right (x), one pixel down (y)

2 1 2 0 1
0 2 1 1 2
0 1 2 2 0
1 2 2 0 1
2 0 1 0 1

Count all pairs of pixels in which the first pixel has value i and the second value j when displaced by d:

P(1,0): pairs where a pixel of value 1 has a pixel of value 0 at displacement d
P(2,1): pairs where a pixel of value 2 has a pixel of value 1 at displacement d
Etc.

Co-occurrence Matrix, P[i,j]

2 1 2 0 1
0 2 1 1 2
0 1 2 2 0
1 2 2 0 1
2 0 1 0 1

P(i, j), with i the first value (rows) and j the second (columns):

      j=0  j=1  j=2
i=0    0    2    2
i=1    2    1    2
i=2    2    3    2

There are 16 pairs, so normalize by 16.
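The counting procedure can be sketched directly from the definition. A small Python sketch (not the lecture's code), restricted to non-negative displacements:

```python
# Sketch: GLCM for displacement d = (dx, dy) = one pixel right, one down.
# P[i][j] counts pixel pairs (value i, value j) separated by d.
def glcm(img, dx, dy, levels):
    P = [[0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y][x]][img[y + dy][x + dx]] += 1
    return P

img = [[2, 1, 2, 0, 1],
       [0, 2, 1, 1, 2],
       [0, 1, 2, 2, 0],
       [1, 2, 2, 0, 1],
       [2, 0, 1, 0, 1]]
P = glcm(img, dx=1, dy=1, levels=3)
total = sum(map(sum, P))                        # 16 pairs for d = (1,1)
Pn = [[v / total for v in row] for row in P]    # normalized GLCM
```

On the 5x5 example this reproduces the matrix above, and the 16 valid pair positions explain the normalization by 16.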

Uniform Texture

d = (1,1)

Let Black = 1, White = 0

P[i,j]:
P(0,0) = ?
P(0,1) = ?
P(1,0) = ?
P(1,1) = ?

Uniform Texture

d = (1,1)

Let Black = 1, White = 0

P[i,j]:
P(0,0) = 24
P(0,1) = 0
P(1,0) = 0
P(1,1) = 25

Uniform Texture

d = (1,0)

Let Black = 1, White = 0

P[i,j]:
P(0,0) = ?
P(0,1) = ?
P(1,0) = ?
P(1,1) = ?

Uniform Texture

d = (1,0)

Let Black = 1, White = 0

P[i,j]:
P(0,0) = 0
P(0,1) = 28
P(1,0) = 28
P(1,1) = 0

Randomly Distributed Texture


What if the black and white pixels were randomly distributed? What will matrix P look like?

1 1 1 0 0 1 0 0

0 0 1 0 1 0 0 1

1 1 0 0 0 1 0 1

0 1 1 1 0 0 1 1

1 1 0 0 1 1 0 0

1 1 0 0 1 1 1 1

0 0 1 0 0 1 0 1

0 0 0 1 1 0 1 1

No preferred set of gray level pairs; matrix P will have an approximately uniform population.

Co-occurrence Features

• Gray Level Co-occurrence Matrices (GLCM)
  – Typically GLCMs are calculated at four different angles: 0, 45, 90 and 135 degrees
  – For each angle different distances can be used: d = 1, 2, 3, etc.
  – Size of the GLCM of an 8-bit image: 256x256 (2^8 levels). Quantizing the image results in smaller matrices; a 6-bit image results in 64x64 matrices
  – 14 features can be calculated from each GLCM. The features are used for texture calculations

Co-occurrence Features

• P(ga, gb, d, t):
  – ga: gray level of pixel ‘a’
  – gb: gray level of pixel ‘b’
  – d: distance
  – t: angle (0, 45, 90, 135)

In many applications the transitions ga to gb and gb to ga are both counted. This results in symmetric GLCMs: for P(0,0,1,0), an adjacent pixel pair ‘0 0’ results in an entry of 2 for the ‘0 0’ entry.

Co-occurrence Features

• The data in the GLCM are used to derive the features, not the original image data

• How do we interpret the contrast equation?

Contrast = sum_{i,j} (i - j)^2 P(i,j)

Co-occurrence Features

• The data in the GLCM are used to derive the features, not the original image data. Contrast measures the local variations in the gray-level co-occurrence matrix.
• How do we interpret the contrast equation? The term (i - j)^2 is a weighting factor (a squared term):
  – values along the diagonal (i = j) are multiplied by zero. These values represent adjacent image pixels that do not have a gray level difference.
  – entries further away from the diagonal represent pixels that have a greater gray level difference, that is, more contrast, and are multiplied by a larger weighting factor.

Contrast = sum_{i,j} (i - j)^2 P(i,j)

Co-occurrence Features

• Dissimilarity:

dissimilarity = sum_{i,j} |i - j| P(i,j)

  – Dissimilarity is similar to contrast, except the weights increase linearly

Co-occurrence Features

• Inverse Difference Moment (IDM)

IDM = sum_{i,j} P(i,j) / (1 + (i - j)^2)

  – IDM has smaller values for images with high contrast, larger values for images with low contrast

Co-occurrence Features

• Angular Second Moment (ASM) measures orderliness: how regular or orderly the pixel values are in the window

ASM = sum_{i,j} P(i,j)^2

• Energy is the square root of ASM

Energy = sqrt( sum_{i,j} P(i,j)^2 )

• Entropy:

Entropy = sum_{i,j} ( -P(i,j) ln P(i,j) ),  where ln(0) = 0
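The features above can be computed directly from a normalized GLCM. A Python sketch (not the lecture's code), using the deck's definition of energy as sqrt(ASM):

```python
# Sketch: GLCM features for a normalized matrix P (entries sum to 1).
# The "ln(0) = 0" convention is handled by the p > 0 guard.
from math import log, sqrt, isclose

def glcm_features(P):
    n = len(P)
    pairs = [(i, j) for i in range(n) for j in range(n)]
    contrast = sum((i - j) ** 2 * P[i][j] for i, j in pairs)
    dissim   = sum(abs(i - j) * P[i][j] for i, j in pairs)
    idm      = sum(P[i][j] / (1 + (i - j) ** 2) for i, j in pairs)
    asm      = sum(P[i][j] ** 2 for i, j in pairs)
    energy   = sqrt(asm)
    entropy  = sum(-P[i][j] * log(P[i][j]) for i, j in pairs if P[i][j] > 0)
    return contrast, dissim, idm, asm, energy, entropy

# A perfectly diagonal GLCM: no gray level differences at all.
P = [[0.5, 0.0], [0.0, 0.5]]
contrast, dissim, idm, asm, energy, entropy = glcm_features(P)
```

For this diagonal P, contrast and dissimilarity are zero and IDM is maximal (1.0), matching the interpretation on the previous slides.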

Matlab Texture Filter Functions

Function     Description
rangefilt    Calculates the local range of an image.
stdfilt      Calculates the local standard deviation of an image.
entropyfilt  Calculates the local entropy of a grayscale image. Entropy is a statistical measure of randomness.

rangefilt

A =

1 3 5 5 2

4 3 4 2 6

8 7 3 5 4

6 2 7 2 2

1 8 9 6 7

Symmetrical Padding

1 1 3 5 5 2 2
1 1 3 5 5 2 2
4 4 3 4 2 6 6
8 8 7 3 5 4 4
6 6 2 7 2 2 2
1 1 8 9 6 7 7
1 1 8 9 6 7 7

Top-left 3x3 window: max = 4, min = 1, range = 3

rangefilt Results (3x3)

A =

1 3 5 5 2

4 3 4 2 6

8 7 3 5 4

6 2 7 2 2

1 8 9 6 7

>> R = rangefilt(A)

R =

3 4 3 4 4

7 7 5 4 4

6 6 5 5 4

7 8 7 7 5

7 8 7 7 5
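The 3x3 result can be reproduced with a short pure-Python sketch (not MATLAB's implementation) of a range filter with symmetric padding, where border indices are mirrored including the edge pixel itself:

```python
# Sketch: local range filter (max - min in a k x k window) with
# MATLAB-style symmetric padding at the borders.
def sym(i, size):
    if i < 0:
        return -i - 1            # mirror below 0, edge pixel included
    if i >= size:
        return 2 * size - i - 1  # mirror past the last index
    return i

def range_filter(img, k):
    h, w, r = len(img), len(img[0]), k // 2
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            vals = [img[sym(i + di, h)][sym(j + dj, w)]
                    for di in range(-r, r + 1) for dj in range(-r, r + 1)]
            row.append(max(vals) - min(vals))
        out.append(row)
    return out

A = [[1, 3, 5, 5, 2],
     [4, 3, 4, 2, 6],
     [8, 7, 3, 5, 4],
     [6, 2, 7, 2, 2],
     [1, 8, 9, 6, 7]]
R = range_filter(A, 3)
```

On the matrix A above this reproduces the 3x3 rangefilt output shown on the slide.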

rangefilt Results (5x5)

A =

1 3 5 5 2

4 3 4 2 6

8 7 3 5 4

6 2 7 2 2

1 8 9 6 7

>> R = rangefilt(A, ones(5))

R =

7 7 7 5 4

7 7 7 5 5

8 8 8 7 7

8 8 8 7 7

8 8 8 7 7

Original image


Imfilt = rangefilt(Im);

figure, imshow(Imfilt, []), title('Image by rangefilt')


Imfilt = stdfilt(Im);

figure, imshow(Imfilt, []), title('Image by stdfilt')


Imfilt = entropyfilt(Im);

figure, imshow(Imfilt, []), title('Image by entropyfilt')


Matlab function: graycomatrix

• Computes GLCM of an image
  – glcm = graycomatrix(I) analyzes pairs of horizontally adjacent pixels in a scaled version of I. If I is a binary image, it is scaled to 2 levels. If I is an intensity image, it is scaled to 8 levels.
  – [glcm, SI] = graycomatrix(...) returns the scaled image used to calculate the GLCM. The values in SI are between 1 and 'NumLevels'.


Parameters

• ‘Offset’ determines the number of co-occurrence matrices generated
• offsets is a q x 2 matrix
  – Each row in the matrix has the form [row_offset, col_offset]
  – row_offset specifies the number of rows between the pixel of interest and its neighbor
  – col_offset specifies the number of columns between the pixel of interest and its neighbor


Offset

• [0 1] specifies a neighbor one column to the right

Angle  Offset
0      [0 D]
45     [-D D]
90     [-D 0]
135    [-D -D]


Orientation of offset

• The figure illustrates the array: offset = [0 1; -1 1; -1 0; -1 -1]

90 degrees: [-1 0]; 135 degrees: [-1 -1]; 45 degrees: [-1 1]; 0 degrees: [0 1]


Intensity Image

– mat2gray: Convert matrix to intensity image. I = mat2gray(A, [AMIN AMAX]) converts the matrix A to the intensity image I. The returned matrix I contains values in the range 0.0 (black) to 1.0 (white).

graycomatrix Example

From textbook, p. 649

>> f = [ 1 1 7 5 3 2;

5 1 6 1 2 5;

8 8 6 8 1 2;

4 3 4 5 5 1;

8 7 8 7 6 2;

7 8 6 2 6 2]

f =

1 1 7 5 3 2

5 1 6 1 2 5

8 8 6 8 1 2

4 3 4 5 5 1

8 7 8 7 6 2

7 8 6 2 6 2

Need to convert to an Intensity image [0,1]


>> fm = mat2gray(f)

fm =

0 0 0.8571 0.5714 0.2857 0.1429

0.5714 0 0.7143 0 0.1429 0.5714

1.0000 1.0000 0.7143 1.0000 0 0.1429

0.4286 0.2857 0.4286 0.5714 0.5714 0

1.0000 0.8571 1.0000 0.8571 0.7143 0.1429

0.8571 1.0000 0.7143 0.1429 0.7143 0.1429

Quantize to 8 Levels

IS =

1 1 7 5 3 2

5 1 6 1 2 5

8 8 6 8 1 2

4 3 4 5 5 1

8 7 8 7 6 2

7 8 6 2 6 2


>> offsets = [0 1];
>> [GS, IS] = graycomatrix(fm, 'NumLevels', 8, 'Offset', offsets)

GS =

1 2 0 0 0 1 1 0

0 0 0 0 1 1 0 0

0 1 0 1 0 0 0 0

0 0 1 0 1 0 0 0

2 0 1 0 1 0 0 0

1 3 0 0 0 0 0 1

0 0 0 0 1 1 0 2

1 0 0 0 0 2 2 1

See next page.

GS =

1 2 0 0 0 1 1 0

0 0 0 0 1 1 0 0

0 1 0 1 0 0 0 0

0 0 1 0 1 0 0 0

2 0 1 0 1 0 0 0

1 3 0 0 0 0 0 1

0 0 0 0 1 1 0 2

1 0 0 0 0 2 2 1

IS =

1 1 7 5 3 2

5 1 6 1 2 5

8 8 6 8 1 2

4 3 4 5 5 1

8 7 8 7 6 2

7 8 6 2 6 2


'GrayLimits'

Two-element vector, [low high], that specifies how the grayscale values in I are linearly scaled into gray levels. Grayscale values less than or equal to low are scaled to 1. Grayscale values greater than or equal to high are scaled to NumLevels. If 'GrayLimits' is set to [], graycomatrix uses the minimum and maximum grayscale values in the image as limits, [min(I(:)) max(I(:))].

>> [GS, IS] = graycomatrix(f, 'NumLevels', 8, 'Offset', offsets, 'G', [])

>> [GS, IS] = graycomatrix(f,'NumLevels', 8, 'Offset', offsets, 'G',[])

>> I = rand(5)

I =

0.0085 0.8452 0.2026 0.1901 0.6818

0.6311 0.1183 0.1947 0.1580 0.5397

0.2303 0.8539 0.6766 0.8251 0.9968

0.4624 0.7807 0.7231 0.5540 0.1104

0.3995 0.4229 0.7560 0.3559 0.6204


>> [GS, IS] = graycomatrix(f,'NumLevels', 8, 'Offset', offsets, 'G',[])

GS =

1 2 0 0 0 1 1 0

0 0 0 0 1 1 0 0

0 1 0 1 0 0 0 0

0 0 1 0 1 0 0 0

2 0 1 0 1 0 0 0

1 3 0 0 0 0 0 1

0 0 0 0 1 1 0 2

1 0 0 0 0 2 2 1

IS =

1 1 7 5 3 2

5 1 6 1 2 5

8 8 6 8 1 2

4 3 4 5 5 1

8 7 8 7 6 2

7 8 6 2 6 2


>> [GS, IS] = graycomatrix(f,'NumLevels', 4, 'Offset', offsets, 'G',[])

GS =

3 0 3 1

1 2 1 0

6 1 1 1

1 0 4 5

IS =

1 1 4 3 2 1
3 1 3 1 1 3
4 4 3 4 1 1
2 2 2 3 3 1
4 4 4 4 3 1
4 4 3 1 3 1

(original image quantized to 4 levels)

Texture feature formulas

Energy: the sum of squared elements in the GLCM (square root of ASM):
  Energy = sqrt( sum_{i,j} P(i,j)^2 )

Entropy: measures uncertainty (variation) in the image:
  Entropy = sum_{i,j} ( -P(i,j) ln P(i,j) )

Contrast: measures the local variations in the gray-level co-occurrence matrix:
  Contrast = sum_{i,j} (i - j)^2 P(i,j)

Homogeneity: measures the closeness of the distribution of elements in the GLCM to the GLCM diagonal:
  Homogeneity = sum_{i,j} P(i,j) / (1 + |i - j|)

glcms = graycomatrix(Im, 'NumLevels', 256, 'G', []);
stats = graycoprops(glcms, {'Contrast', 'Correlation', 'Homogeneity'});
figure, plot([stats.Correlation]);
title('Texture Correlation as a function of offset');
xlabel('Horizontal Offset'); ylabel('Correlation')

Texture Measurement

Quantize 256 gray levels to 32 -> data window (31x31 or 15x15) -> GLCM0, GLCM45, GLCM90, GLCM135 -> features for each matrix (energy, entropy, contrast, etc.) -> generate a feature matrix for each feature

Image | Ideal map

Classmaps generated using the 3 best CO feature images

Classmaps generated using the 7 best CO feature images

31x31 produces the best results, but large errors at borders.

Laws’ Texture Energy Features

• Use texture energy for segmentation
• General idea: energy measured within textured regions of an image will produce different values for each texture, providing a means for segmentation
• Two part process:
  – Generate 2D kernels from 5 basis vectors
  – Convolve images with kernels

Laws’ Kernel Generation

Level  L5 = [  1  4 6  4  1 ]
Edge   E5 = [ -1 -2 0  2  1 ]
Spot   S5 = [ -1  0 2  0 -1 ]
Wave   W5 = [ -1  2 0 -2  1 ]
Ripple R5 = [  1 -4 6 -4  1 ]

To generate kernels, multiply one vector by the transpose of itself or another vector:

L5E5 = [ 1 4 6 4 1 ]’ * [ -1 -2 0 2 1 ]

-1  -2   0   2  1
-4  -8   0   8  4
-6 -12   0  12  6
-4  -8   0   8  4
-1  -2   0   2  1

• 25 2D kernels are possible, but only 24 are used
• L5L5 is sensitive to mean brightness values and is not used
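The kernel generation step is just an outer product. A Python sketch (not the lecture's code) reproducing L5E5:

```python
# Sketch: a Laws 2D kernel is the outer product of two 1D basis vectors,
# here L5E5 = L5' * E5.
def outer(u, v):
    return [[a * b for b in v] for a in u]

L5 = [1, 4, 6, 4, 1]      # Level
E5 = [-1, -2, 0, 2, 1]    # Edge
L5E5 = outer(L5, E5)
for row in L5E5:
    print(row)
```

The remaining 23 usable kernels come from the other vector pairings; each one is then convolved with the image to produce an energy map.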


textureExample.m

• Reads in image
• Converts to double and grayscale
• Create energy kernels
• Convolve with image
• Create data ‘cube’

stone_building.jpg


Test 2


Scale

• How will scale affect energy measurements?
• Reduce image to one quarter size:
  imGraySm = imresize(imGray, 0.25, 'bicubic');

Data ‘cube’

102

>> data = cat(3, im(:,:,1), im(:,:,2), im(:,:,3), imL5R5, imR5E5);

>> figure, imshow(data(:,:,1:3))

>> data_value=data(7,12,:)

data_value(:,:,1) = 142

data_value(:,:,2) = 166

data_value(:,:,3) = 194

data_value(:,:,4) = 22

data_value(:,:,5) = 10


Fractal Dimension

• The Hurst coefficient can be used to calculate the fractal dimension of a surface
• The fractal dimension can be interpreted as a measure of texture

Consider the 5 pixel wide neighborhood (13 pixels):

    d
  c b c
d b a b d
  c b c
    d

Pixel class  Number  Distance from center
a            1       0
b            4       1
c            4       1.414
d            4       2

Fractal Dimension Algorithm

• Lay mask over original image• Examine pixels in each of the classes• Record the brightest and darkest for each class• The pixel brightness difference (range) for each pixel

class is used to generate the Hurst plot• Use least squares fit to construct a ln distance vs ln

range plot• The slope of this line is the Hurst coefficient for the

specific pixel
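The final fitting step can be sketched as follows. This is a Python sketch (not the lecture's code) of just the least-squares slope; the class distances match the b, c, d classes of the mask above, and the ranges are synthetic values chosen so the true exponent is known.

```python
# Sketch: Hurst coefficient = slope of the least-squares line through
# (ln distance, ln range) points.
from math import log, sqrt

def hurst_slope(distances, ranges):
    xs = [log(d) for d in distances]
    ys = [log(r) for r in ranges]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
            sum((x - mx) ** 2 for x in xs))

dists = [1.0, sqrt(2.0), 2.0]        # classes b, c, d of the mask
ranges = [d ** 0.5 for d in dists]   # synthetic: range grows as d^0.5
H = hurst_slope(dists, ranges)       # should recover 0.5
```

In the actual algorithm, the ranges come from the brightest-minus-darkest values recorded per class, and H is computed per pixel as the mask slides over the image.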
