Color Image Denoising in Wavelet Domain Using Adaptive Thresholding Incorporating the Human Visual System
discontinuities. The wavelet representation naturally facilitates the construction of such spatially adaptive algorithms, since it compresses the essential information of a signal into relatively few large coefficients. Thresholding methods compare each input coefficient to a given threshold and set it to zero if its magnitude is less than the threshold. The idea is that coefficients insignificant relative to the threshold are likely due to noise, whereas significant coefficients represent important signal structures. Thresholding essentially creates a region around zero in which the coefficients are considered negligible; outside this region, the thresholded coefficients are kept at full precision. Here, two different adaptive soft thresholding methods are used for image denoising: BayesShrink, proposed by Chang [1],[2], and Lee's threshold, proposed by Lee [15].
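As a concrete sketch of the two steps (pure Python, with function names of my own choosing; the soft-threshold rule, the BayesShrink threshold T = sigma_n^2/sigma_x, and the median noise estimator follow Chang et al. [1],[2]):

```python
import statistics

def soft_threshold(coeffs, t):
    """Soft thresholding: zero coefficients with |c| <= t, shrink the rest by t."""
    return [max(abs(c) - t, 0.0) * (1.0 if c >= 0 else -1.0) for c in coeffs]

def estimate_noise_sigma(hh1_coeffs):
    """Robust noise estimate from the finest diagonal sub-band:
    sigma = median(|HH1|) / 0.6745."""
    return statistics.median(abs(c) for c in hh1_coeffs) / 0.6745

def bayes_shrink_threshold(detail_coeffs, noise_sigma):
    """BayesShrink threshold T = sigma_n^2 / sigma_x, where sigma_x is the
    signal standard deviation estimated from the noisy sub-band variance."""
    var_y = sum(c * c for c in detail_coeffs) / len(detail_coeffs)
    sigma_x = max(var_y - noise_sigma ** 2, 0.0) ** 0.5
    if sigma_x == 0.0:
        # Sub-band is essentially all noise: threshold everything away.
        return max(abs(c) for c in detail_coeffs)
    return noise_sigma ** 2 / sigma_x
```

In a full pipeline these functions would be applied per detail sub-band of the DWT, with a fresh threshold estimated for each sub-band.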
IV. DWT WITH ADAPTIVE THRESHOLDING USING THE HVS MODEL
The color image denoising algorithm using the HVS is shown in Fig. 2. First, the RGB color images are converted to the CIELAB color space. CIE L*a*b* (CIELAB) is the most complete color space, describing all the colors visible to the human eye. The wavelet transform is a desirable choice for the HVS model [16]. The CSF can be implemented by modifying the wavelet coefficients in every sub-band using an invariant single factor (ISF).
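The paper does not spell out the color conversion itself; a standard sRGB-to-CIELAB conversion for a single pixel (assuming the usual D65 reference white, which the paper does not state) might look like:

```python
def srgb_to_lab(r8, g8, b8):
    """Convert one 8-bit sRGB pixel to CIE L*a*b* (D65 reference white)."""
    def to_linear(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (to_linear(c) for c in (r8, g8, b8))
    # Linear sRGB -> CIE XYZ (D65 primaries)
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    xn, yn, zn = 0.95047, 1.0, 1.08883  # D65 white point
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

Applied pixel-wise, this yields the L*, a*, b* planes that are then wavelet-decomposed independently.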
Fig. 2. Denoising Algorithm with HVS Model.
A. CSF implementation in Wavelet Denoising:
The CSF describes the sensitivity of the HVS as a function of contrast and spatial frequency. The spatial frequency response of the luminance channel has the characteristics of a band-pass filter, while the spatial frequency responses of the Red-Green and Blue-Yellow channels have the characteristics of low-pass filters. Sensitivity is good at low spatial frequencies but declines at higher frequencies.
B. Mapping the CSF curve into DWT coefficients:
Let us assume that an image is optimized for a given resolution of r pixels (dots) per inch and a viewing distance of v meters. The spatial sampling frequency fs in pixels per degree is then given by

fs = 2 v r tan(0.5°) / 0.0254    (1)
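A quick numeric sketch of (1), plugged with the viewing setup reported in Section V (96 dots/inch, 0.55 m viewing distance):

```python
import math

def sampling_frequency(r_dpi, v_meters):
    """Eq. (1): spatial sampling frequency in pixels/degree for a display of
    r dots per inch viewed from v meters; 0.0254 converts inches to meters."""
    return 2 * v_meters * r_dpi * math.tan(math.radians(0.5)) / 0.0254

fs = sampling_frequency(96, 0.55)  # monitor setup from Section V
f_max = 0.5 * fs                   # Nyquist limit in cycles/degree
```

fs scales linearly with both resolution and viewing distance, so the same screen viewed from farther away covers more pixels per degree of visual angle.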
If the signal is critically down-sampled at the Nyquist rate, 0.5 cycles per pixel are obtained. That means the maximum frequency represented in the signal, measured in cycles per degree, is fmax = 0.5 fs.
The sub-band of level j = 2 and orientation θ = 2 is highlighted in Fig. 4. This sub-band contains mostly horizontal
Fig. 3. CSF Curves for Luminance and Chromatic channels.
Fig. 4. Relation between CSF and 5-level 2D Mallat wavelet decomposition
details. The horizontal spatial frequencies of this sub-band vary from 0 to 0.25 fmax and the vertical frequencies from 0.25 fmax to 0.5 fmax. The weighting that must be used for the wavelet coefficients in this sub-band is described by the corresponding portion of the 2D CSF function.
C. Extracting coefficients for CSF compensation:
The CSF is used to modify the wavelet coefficients on the basis of an invariant single factor (ISF) [6],[7]. In the ISF approach, a weighting factor is assigned to every sub-band; it is meant to describe the average sensitivity of the HVS over the covered frequency range. The weighting factor is used to multiply the coefficients before wavelet thresholding.
f = (fh, fv) = [(fL, fL), T[w1csf(fL, fH)], T[w2csf(fH, fL)], T[w3csf(fH, fH)]]    (2)

where wicsf (i = 1, 2, 3) is the CSF weighting factor along each orientation and the operator T stands for adaptive thresholding.
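Equation (2) amounts to scaling each detail sub-band by its CSF weight and then thresholding it, while the approximation band passes through untouched. A minimal sketch (the sub-band names and dictionary layout are illustrative choices of mine, not from the paper):

```python
def csf_weight_and_threshold(subbands, weights, threshold_fn):
    """Apply eq. (2): scale each detail sub-band by its CSF weight, then
    apply the adaptive thresholding operator T; the LL approximation band
    is passed through unchanged.

    subbands:     {'LL': [...], 'LH': [...], 'HL': [...], 'HH': [...]}
    weights:      {'LH': w1, 'HL': w2, 'HH': w3} (one CSF weight per orientation)
    threshold_fn: callable implementing the operator T of eq. (2)
    """
    out = {'LL': subbands['LL']}
    for name in ('LH', 'HL', 'HH'):
        scaled = [weights[name] * c for c in subbands[name]]
        out[name] = threshold_fn(scaled)
    return out
```

In the full algorithm this step is repeated for every decomposition level, with the per-sub-band weights derived in Section IV-C.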
To obtain the CSF weighting factor wcsf for each sub-band, the CSF curve is sampled at the mid frequency of the frequency range of that sub-band. The CSF function for luminance was proposed by Mannos and Sakrison [4]:
H(fLum) = 2.6 (0.0192 + 0.114 fLum) e^(−(0.114 fLum)^1.1)    (3)
The chromatic curves are given by the following equation:

H(fChrom) = e^(−(a fChrom + b))    (4)

where fLum and fChrom are the spatial frequencies in cycles/degree. For RG, a = 0.13 and b = 4.84; for YB, a = 0.14 and b = 4.22 [17].
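A sketch of equations (3) and (4) as reconstructed above; the exact constants and sign conventions should be checked against [4] and [17], and the absolute scale of (4) is immaterial because the weights are normalized later:

```python
import math

def csf_luminance(f):
    """Eq. (3): Mannos-Sakrison luminance CSF, a band-pass shape in
    spatial frequency f (cycles/degree)."""
    return 2.6 * (0.0192 + 0.114 * f) * math.exp(-(0.114 * f) ** 1.1)

def csf_chromatic(f, a, b):
    """Eq. (4): low-pass chromatic CSF.
    RG channel: a = 0.13, b = 4.84; YB channel: a = 0.14, b = 4.22."""
    return math.exp(-(a * f + b))
```

The luminance curve peaks at mid frequencies and falls off on both sides, while the chromatic curves decrease monotonically, matching the band-pass/low-pass description in Section IV-A.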
The CSF weighting factors are obtained directly from the CSF curves in the normalized spatial frequency domain. The image is decomposed into 5 levels, for which an 11-weight system is proposed instead of the generally used 6-weight system. The 11 weights are derived as follows.
1) For each level of decomposition, the average of the CSF curve over the frequency range of the HH sub-band, as well as over the corresponding low-frequency band, is calculated. Thus, for the first level of decomposition, the average of the CSF curve over the range 0.5 fmax to fmax is taken and denoted by p1, and the average over the range 0 to 0.5 fmax is denoted by q1. Similarly, for the second level, the average over 0.5^2 fmax to 0.5 fmax is denoted by p2 and the average over 0 to 0.5^2 fmax by q2, and so on up to p5 and q5.
2) The CSF weight for the HH1 sub-band is given by p1, for the HH2 sub-band by p2, and so on. The CSF weight for the final fifth-level LL sub-band is given by q5.
3) The CSF weight for the LH1 and HL1 sub-bands is given by √(p1 q1), for the LH2 and HL2 sub-bands by √(p2 q2), and so on.
4) Finally, all 11 weights are normalized so that the lowest weight equals 1.
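The four steps above can be sketched as follows (the numerical averaging scheme, the sub-band key names, and the use of the geometric mean √(p q) for the LH/HL weights are my reading of the procedure, not the authors' code):

```python
def csf_weights_11(csf, f_max, levels=5, samples=256):
    """Derive the 11-weight system from a CSF curve:
    p_j = average of csf over the HH band of level j
          (0.5**j * f_max .. 0.5**(j-1) * f_max),
    q_j = average of csf over 0 .. 0.5**j * f_max,
    LH/HL weight of level j = sqrt(p_j * q_j),
    then normalize so that the lowest weight equals 1."""
    def avg(lo, hi):
        # Midpoint-rule average of csf over [lo, hi].
        step = (hi - lo) / samples
        return sum(csf(lo + (i + 0.5) * step) for i in range(samples)) / samples

    p = [avg(0.5 ** j * f_max, 0.5 ** (j - 1) * f_max) for j in range(1, levels + 1)]
    q = [avg(0.0, 0.5 ** j * f_max) for j in range(1, levels + 1)]

    weights = {f'HH{j}': p[j - 1] for j in range(1, levels + 1)}          # 5 weights
    weights.update({f'LHHL{j}': (p[j - 1] * q[j - 1]) ** 0.5
                    for j in range(1, levels + 1)})                       # 5 weights
    weights[f'LL{levels}'] = q[-1]                                        # 1 weight

    lowest = min(weights.values())
    return {k: v / lowest for k, v in weights.items()}
```

Any CSF curve (luminance or chromatic) can be passed in, yielding one 11-entry weight mask per channel.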
Fig. 5 shows the CSF curve for luminance for a spatial sampling frequency fs = 64 pixels/degree. The 11 CSF weights extracted from this curve are shown in Fig. 6.
Fig. 5. CSF curve for luminance for fs=64 pixels/degree.
The weights thus extracted are multiplied with the wavelet coefficients, modifying the image into a contrast-enhanced one.
Fig. 6. CSF mask with 11 unique weights for fs = 64 pixels/degree.
The modified image is then denoised by an adaptive thresholding algorithm applied to the various frequency components obtained by the DWT. In this paper, two such thresholding schemes are used: BayesShrink [1],[2] and Lee's thresholding [15]. The denoised sub-bands are reconstructed into the denoised image using the inverse DWT.
V. EXPERIMENTAL RESULTS
The images LENA, PEPPERS and BARBARA have been used to test the algorithm. The BayesShrink and Lee's adaptive thresholding methods have been used for denoising. The results for different values of noise variance for the two methods are shown in Table I. The images are viewed on a 19-inch monitor with a screen resolution of 96 pixels (dots)/inch, and the viewing distance is taken to be 55 cm. The biorthogonal 6.8 wavelet is used for the decomposition and reconstruction DWT filters.
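The two quality measures reported in Table I can be computed as below (a global, single-window version of the Q index for brevity; [8] applies the index over sliding windows and averages the results):

```python
import math

def psnr(original, denoised, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length pixel lists."""
    mse = sum((o - d) ** 2 for o, d in zip(original, denoised)) / len(original)
    return 10 * math.log10(peak ** 2 / mse)

def quality_index(x, y):
    """Wang-Bovik universal image quality index [8], computed globally:
    Q = 4 * cov(x,y) * mean(x) * mean(y)
        / ((var(x) + var(y)) * (mean(x)^2 + mean(y)^2))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / (n - 1)
    vy = sum((b - my) ** 2 for b in y) / (n - 1)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))
```

Q lies in [-1, 1] and equals 1 only when the two images are identical, which makes it more discriminating than PSNR for structural distortions.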
TABLE I
PERFORMANCE EVALUATION IN TERMS OF PSNR AND Q FOR DIFFERENT TEST IMAGES WITH VARYING NOISE VARIANCE USING BAYESSHRINK AND LEE'S ADAPTIVE THRESHOLD

                        BayesShrink           Lee's
IMAGE       σ²         PSNR       Q         PSNR       Q
LENA        0.03      23.55    0.117      23.88    0.124
            0.05      23.64    0.121      23.21    0.116
            0.07      22.55    0.105      22.81    0.118
PEPPERS     0.03      21.56    0.166      23.34    0.205
            0.05      20.73    0.152      20.65    0.152
            0.07      21.22    0.151      21.48    0.158
BARBARA     0.03      25.81    0.071      25.74    0.073
            0.05      24.82    0.069      24.36    0.069
            0.07      24.07    0.063      24.32    0.066
In general, the PSNR and the Universal Image Quality Index (Q) [8] of the denoised output image decrease with increasing noise variance, as seen in Table I. Table II shows the comparison between the traditional 6-weight system and the proposed 11-weight system; the comparison clearly shows that the 11-weight system outperforms the 6-weight system in terms of both PSNR and Q.
It is observed from Fig. 7 and Fig. 8 that the incorporation of the CSF provides better image quality by enhancing the contrast. The comparison of the output images without implementation
TABLE II
COMPARISON OF PERFORMANCES OF 6-WEIGHT AND 11-WEIGHT SYSTEMS IN TERMS OF PSNR AND Q FOR DIFFERENT TEST IMAGES FOR A NOISE VARIANCE σ² = 0.03 USING BAYESSHRINK AND LEE'S ADAPTIVE THRESHOLD

                      6 weights            11 weights
IMAGE               PSNR       Q         PSNR       Q
BayesShrink
LENA               23.20    0.10       23.55    0.1168
PEPPERS            22.91    0.17       21.56    0.1658
BARBARA            25.48    0.0662     25.81    0.0707
Lee's
LENA               23.73    0.12       23.88    0.1235
PEPPERS            22.62    0.18       23.34    0.2046
BARBARA            24.73    0.0745     25.74    0.0726
Fig. 7. Comparison of denoising methods with and without CSF using BayesShrink adaptive threshold for images corrupted with Gaussian noise having σ² = 0.03. (a) Original LENA image, (b) Noisy LENA image, (c) Output without CSF, (d) Output with CSF, (e) Original PEPPERS image, (f) Noisy PEPPERS image, (g) Output without CSF, (h) Output with CSF.
of the CSF shows that BayesShrink gives a perceptually better image than Lee's method. However, with the implementation of the CSF, the blurring effect in the case of Lee's thresholding is removed and the final output has a better PSNR and Q than that of BayesShrink.
VI. CONCLUSION
This paper compensates for the varying contrast sensitivity of the HVS to obtain a better denoised output. It is evident from the experimental results that the 11-weight system outperforms the general 6-weight system by providing a better mapping of the CSF curve onto the different frequency sub-bands of the image. Also, Lee's thresholding algorithm gives better PSNR than BayesShrink in most cases. Further improvements in image quality are possible by considering the masking effects of the HVS or by employing a more complex HVS model.
REFERENCES
[1] S. G. Chang, M. Vetterli, "Spatial adaptive wavelet thresholding for image denoising," Proc. of Int. Conf. on Image Processing, pp. 374-377, 1997.
[2] S. G. Chang, B. Yu, M. Vetterli, "Adaptive wavelet thresholding for image denoising and compression," IEEE Trans. Image Processing, vol. 9, pp. 1532-1546, September 2000.
Fig. 8. Comparison of denoising methods with and without CSF using Lee's adaptive threshold for images corrupted with Gaussian noise having σ² = 0.03. (a) Original LENA image, (b) Noisy LENA image, (c) Output without CSF, (d) Output with CSF, (e) Original PEPPERS image, (f) Noisy PEPPERS image, (g) Output without CSF, (h) Output with CSF.
[3] K. Romberg, H. Choi, R. G. Baraniuk, "Bayesian tree-structured image modeling using wavelet-domain hidden Markov models," IEEE Trans. Image Processing, vol. 10, pp. 1056-1068, 2001.
[4] P. G. J. Barten, Contrast Sensitivity of the Human Eye and Its Effects on Image Quality, SPIE, Bellingham, Washington, 1999.
[5] M. A. García-Pérez, "The perceived image: efficient modelling of visual inhomogeneity," Spatial Vision, pp. 89-99, 1992.
[6] M. J. Nadenau, J. Reichel, M. Kunt, "Wavelet-based color image compression: exploiting the contrast sensitivity function," IEEE Trans. Image Processing, vol. 12, pp. 58-70, 2003.
[7] A. B. Watson, G. Y. Yang, J. A. Solomon, J. Villasenor, "Visual thresholds for wavelet quantization error," Proc. of SPIE Conf. on Human Vision and Electronic Imaging, vol. 2657, 1997.
[8] Z. Wang, A. Bovik, "A universal image quality index," IEEE Signal Process. Lett., vol. 9, pp. 81-84, 2002.
[9] A. R. Weeks, Fundamentals of Electronic Image Processing, SPIE Optical Engineering Press and IEEE Press, New York, 1996.
[10] G. Sharma, M. Vrhel, H. Trussell, "Color imaging for multimedia," Proc. of IEEE, vol. 86, pp. 1088-1108, June 1998.
[11] I. Daubechies, W. Sweldens, "Factoring wavelet transforms into lifting steps," Journal of Fourier Anal. Appl., vol. 4, no. 3, pp. 247-269, 1998.
[12] E. Peli, "Contrast in complex images," Journal of Opt. Soc. Am. A, vol. 7, pp. 2032-2040, 1990.
[13] W. Schreiber, Fundamentals of Electronic Imaging Systems, Springer, New York, 1993.
[14] S. A. Klein, et al., "Seven models of masking," Proc. of SPIE, San Jose, CA, vol. 3016, pp. 13-24, 1997.
[15] Y. H. Lee, S. B. Rhee, "Wavelet-based image denoising with optimal filter," Int. Journal of Information Processing Systems, vol. 1, no. 1, pp. 32-35, 2005.
[16] S. Mallat, "Wavelets for a vision," Proc. of IEEE, vol. 84, pp. 604-614, 1996.
[17] K. T. Mullen, "The contrast sensitivity of human colour vision to red-green and blue-yellow chromatic gratings," Journal of Physiology, vol. 359, pp. 381-400, 1985.