

Urban Visibility Measurements During Tropical Weather Events Using Image Processing

Roy Khristopher O. Bayot, Rollyn T. Labuguen, Edsel Jose P. Volante, Nathaniel Joseph C. Libatique, Gregory L. Tangonan

Department of Electronics, Computer & Communications Engineering
Ateneo de Manila University

Quezon City, Philippines

Abstract— Visibility measurements are useful in areas such as air traffic control, air pollutant monitoring, urban traffic monitoring, vehicle safety, and rain monitoring. Instead of using transmissometers and nephelometers, a simple off-the-shelf camera was deployed on the 11th floor of a multi-tenant building to sample a scene with a wide depth of field. The camera was automated to take pictures once per minute for seven days during and after a typhoon. Selected daytime events were processed. Three patches at varying distances from the camera were processed for six different channels (RGB and HSV) over the entire duration using six different criteria: mean, column contrast, row contrast, gradient mean, spectral norm, and information entropy. In a separate deployment over the same area, an acoustic sensor was placed in the field at a location corresponding to one patch in the image. It was demonstrated that the contrasts move together with the gradient, the spectral norms move together with the mean, and the saturation entropy has an inverse relationship with both the contrast and the mean. It was also demonstrated that the envelope of the RGB mean follows the envelope of the acoustic sensor signal.

Keywords: rainfall sensing, visibility, image processing

I. INTRODUCTION

Quantitative measures of visibility are becoming increasingly important in various areas because visibility is primarily affected by precipitation, which in turn affects other important events. For example, the presence of fog at airports affects visibility, making its measurement useful in air traffic control: by knowing the visibility at the airport, pilots can take off or land safely. Sudden heavy bursts of rain or the appearance of fog on the ground can also affect vehicle traffic. As drivers experience a decrease in visibility, cars slow down, which can induce traffic jams and even cause vehicular accidents. Drivers may also be unable to read or detect road signs that may be important as they travel through unfamiliar areas. Finally, the degree of decrease in visibility may be indicative of the degree of rainfall and may indicate an impending flood. Tropical rain also decreases the signal strength of various telecommunication links. As rain falls over an area with a microwave link, there is a notable drop in signal strength; this also happens for other electromagnetic signals, such as those used in GSM networks and for broadband internet. In the same area, a burst of tropical rain also produces an observable drop in visibility. Aside from rain, air pollution and air quality also affect visibility: visibility is reduced with more pollutants in the atmosphere, and low visibility might suggest low air quality. Therefore, by studying visibility and its measurement, there may be a way to characterize and mitigate the problems mentioned.

Although visibility is usually measured using transmissometers and nephelometers, this study uses cameras as visibility measurement devices. Cameras are ubiquitous and come in varying grades of quality, from off-the-shelf webcams to consumer point-and-shoot cameras, DSLRs, and industrial-grade cameras. Furthermore, cameras record in different wavelength channels that may reveal added information. Finally, pictures reveal structure in a scene that may be useful in measuring visibility. The focus of this study is to explore various methods of analysis to quantify visibility from images taken by off-the-shelf web cameras. Both the RGB and HSV channels are explored, and regions are selected from the series of images for analysis. The first method uses summary statistics for all channels in a given patch. The second uses the patch contrast, computed as for the modulation transfer function. Third is the use of image gradients. Fourth is the spectral norm. Finally, information entropy is also used as a method to analyze the images.

II. RELATED LITERATURE

The World Meteorological Organization published a guide [1] that estimates visibility using a human observer. To eliminate the subjectivity of such a measurement, the meteorological optical range (MOR) was defined. MOR is the path length in the atmosphere needed to reduce the luminous flux of a collimated beam from an incandescent lamp, at a color temperature of 2700 K, to 5 percent of its original value. It depends mainly on the extinction coefficient and is usually measured with telephotometric instruments, visual extinction meters, transmissometers, and visibility lidars.

However, this work departs from the usual instrumentation and explores the use of digital cameras. One of the early works on visibility and cameras is the study by Steffens, which used film cameras and densitometers to measure light transmission [2]. It used the contrast of three black objects against

TENCON 2012 1569633959


the horizon to estimate the visibility. However, this work differs from such an implementation because black objects cannot be readily placed in an urban setting.

The approach is similar to [3], [4], [5], [6], [7], [8], [9], [17], where a camera is fixed to take photos of the same scene over a period of time. The approach uses contrast within a patch and not against the sky. However, it does not consider the energy of the high-frequency content of a Fourier-transformed image, unlike [8] and [9]. It also does not use Haar functions or homomorphic filters, unlike [10], nor does it employ multiple objects along the same path, unlike [2]. It does, however, check three different landmarks at different angles; it is similar to [9] in that regard but does not use any of its landmark recognition methods. Aside from local contrast, it uses summary statistics, the spectral norm, and information entropy for six different channels. It would be interesting to incorporate models such as those of [14], [15], [16]; however, that would increase the complexity of the algorithms. Furthermore, the idea of dark channel priors from [12] was not used, because specific patches are already considered and such a prior may well not exist within a patch, given the nature of tropical rain. The approach taken by this study also differs from Poduri et al. [20], who use a mobile phone and compute the azimuthal angle on a sunny day. Another adaptation done by this work is related to [18].

Finally, the images analyzed in this study are degraded not by haze or fog but specifically by tropical rain. As Ando et al. [21] observed localized behavior of rain, this work provides an image dimension to their approach of using wireless links.


III. METHODOLOGY

A. Hardware Setup

A Logitech C109 USB webcam was used for the capture. It was connected to an Eee PC running Ubuntu 8.04. The camera was securely fastened to a mount, which was then fixed to the side of the building, outside the window of a room on the 11th floor of a condominium. The camera was oriented to a heading of approximately 32 degrees. Given the location, this orientation gives a view with a high depth of field and useful landmarks as features.

Figure 1. Logitech C109 USB enabled webcam

B. Capture Settings

FSwebcam, a program with a command-line interface, was scripted to capture one picture every minute for seven days. To ensure uniformity in the image size, the script also automatically resized each image to 640x480. White balance was kept automatic. The resulting pictures were combined to form a series of videos; each video shows the images at 15 frames per second and covers a 12-hour capture interval. The daytime interval consists of pictures from 6:00 AM until 5:59 PM; the nighttime interval consists of pictures from 6:00 PM until 5:59 AM of the following day.
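The capture loop described above can be sketched as a crontab entry driving fswebcam; the device path, output directory, and exact flags below are assumptions for illustration, not the authors' actual script.

```shell
# Hypothetical crontab entry: capture one 640x480 frame per minute.
# /dev/video0 and the output directory are assumed; cron requires % to be escaped.
* * * * * fswebcam -d /dev/video0 -r 640x480 --no-banner /home/capture/frames/$(date +\%Y\%m\%d-\%H\%M).jpg
```

A post-processing step (e.g. assembling each 12-hour batch of frames at 15 fps with a video encoder) could then produce the videos described.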

C. Software Workflow

The workflow is illustrated by the flowchart below.

Figure 2. Software flowchart

For each image, three predefined patches were selected. Patch 1 is centered at the pixel at row 128, column 60; patch 2 at row 237, column 254; and patch 3 at row 240, column 535. Each patch is 41x41 pixels and was selected because of its interesting features. Each patch also lies at a different distance within the scene. The figure below shows a sample image of the scene and the patches.
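As a concrete sketch of the patch selection, assuming frames are loaded as 480x640 NumPy arrays (the paper's resized resolution), the three 41x41 patches could be extracted as follows; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def extract_patch(image, center_row, center_col, size=41):
    """Extract a size x size patch centered at (center_row, center_col).

    `image` is an H x W (or H x W x C) array; size=41 follows the
    paper's 41x41 patch definition.
    """
    half = size // 2
    r0, r1 = center_row - half, center_row + half + 1
    c0, c1 = center_col - half, center_col + half + 1
    return image[r0:r1, c0:c1]

# Patch centers given in the text as (row, column).
centers = {"patch1": (128, 60), "patch2": (237, 254), "patch3": (240, 535)}

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # placeholder 640x480 RGB frame
patches = {name: extract_patch(frame, r, c) for name, (r, c) in centers.items()}
```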

Figure 3. Different selected patches for each frame

Figure 4. Bird’s eye view of the deployment

Figure 4 shows a bird’s eye view of the deployment taken from Google Earth. The approximate distances from the camera to the patches were also measured. Patch 1 (blue) is


approximately 214 meters; patch 2 (green) is 244 meters; and patch 3 (red) is 300 meters away from the camera.

After the patches were selected, they were processed in both the RGB and HSV channels, and six different criteria were computed for each channel. First is the mean of the channel. The second and third are the mean contrast, with separate calculations for column and row contrast. This is given by the equation below.

C = (V_max - V_min) / (V_max + V_min)    (1)
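Equation (1) applied per column or per row of a patch and then averaged can be sketched as follows; averaging the per-line contrasts is an assumption, since the paper does not spell out how the per-patch value is aggregated.

```python
import numpy as np

def michelson_contrast(values):
    """C = (V_max - V_min) / (V_max + V_min) for a 1-D slice, per Eq. (1)."""
    vmax, vmin = float(values.max()), float(values.min())
    if vmax + vmin == 0:
        return 0.0
    return (vmax - vmin) / (vmax + vmin)

def column_contrast(patch):
    """Mean Michelson contrast taken down each column of the patch."""
    return float(np.mean([michelson_contrast(patch[:, c]) for c in range(patch.shape[1])]))

def row_contrast(patch):
    """Mean Michelson contrast taken along each row of the patch."""
    return float(np.mean([michelson_contrast(patch[r, :]) for r in range(patch.shape[0])]))
```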

The fourth is the image gradient, which is computed by convolving the image with Sobel filters and taking the magnitude. This is given by the equations below.

D_x = S_x ⊗ I(x, y)    (2)
D_y = S_y ⊗ I(x, y)    (3)

      | -1  0  1 |
S_x = | -2  0  2 |    (4)
      | -1  0  1 |

      |  1  2  1 |
S_y = |  0  0  0 |    (5)
      | -1 -2 -1 |
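A minimal sketch of the gradient-mean criterion using the Sobel kernels of Eqs. (4) and (5); the boundary handling (scipy's default reflection) is an implementation choice not specified in the paper.

```python
import numpy as np
from scipy.ndimage import convolve

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # Eq. (4)
SOBEL_Y = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=float)  # Eq. (5)

def gradient_mean(patch):
    """Mean gradient magnitude after convolving with the Sobel kernels
    (Eqs. 2-3): mean of sqrt(D_x^2 + D_y^2)."""
    gx = convolve(patch.astype(float), SOBEL_X)
    gy = convolve(patch.astype(float), SOBEL_Y)
    return float(np.mean(np.hypot(gx, gy)))
```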

The fifth criterion is the spectral norm, defined for a matrix A as the square root of the largest eigenvalue of AᵀA (equivalently, its largest singular value). Finally, information entropy is taken as the last criterion. Given events S1, S2, ..., Sn which occur with probabilities p(S1), p(S2), ..., p(Sn), the average uncertainty associated with the possible events is given by the equation below.

H(S) = - Σ_{i=1}^{n} p(S_i) log2 p(S_i)    (6)
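The last two criteria can be sketched as below; estimating the probabilities p(S_i) from a 256-bin intensity histogram is an assumption, as the paper does not state how they were obtained.

```python
import numpy as np

def spectral_norm(patch):
    """Spectral norm: the largest singular value of the patch matrix,
    i.e. the square root of the largest eigenvalue of A^T A."""
    return float(np.linalg.norm(patch.astype(float), ord=2))

def information_entropy(patch, bins=256):
    """Shannon entropy (Eq. 6) in bits, with p(S_i) estimated from the
    intensity histogram of the patch (an assumed choice)."""
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```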

D. Acoustic Sensor Setup

The acoustic sensor setup implemented by Trono [22] is basically an HTC Wildfire Android smartphone with a microphone attachment, placed inside a cylindrical aluminum can. The inside of the can is lined with styrofoam to reduce echoes. For this deployment, the Android phone was programmed to record 2 to 6 samples per second for a duration of three hours. It was deployed in an area near the location corresponding to patch 2 in the images. The succeeding images show this acoustic sensor. After the recording, the program stores the decibel values of all the samples, with their time stamps, into a comma-separated values file.
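Reading such a log back could look like the sketch below; the two-column "timestamp,level_db" layout and the timestamp format are assumptions, since the paper only says decibel values are stored with time stamps.

```python
import csv
from datetime import datetime

def load_acoustic_log(path):
    """Read the sensor's CSV log into (timestamp, level_db) pairs.

    The 'YYYY-MM-DD HH:MM:SS,level' row format is assumed for
    illustration; the actual log layout is not given in the paper.
    """
    samples = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            ts = datetime.strptime(row[0], "%Y-%m-%d %H:%M:%S")
            samples.append((ts, float(row[1])))
    return samples
```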

IV. RESULTS AND DISCUSSION

A. Vision System during Typhoon Pedring

1) RGB Mean and Spectral Norm

If we examine the graphs for patch 1, similarities can be observed. In the succeeding graph, the RGB channels all seem to follow a common trend. When we examine the frames that correspond to the peaks and valleys in the graph, we observe a peak in the graph when the rain intensifies and blurs the image.

Figure 5. RGB Mean over Time.

The higher the mean value of the RGB channels, the stronger the rain and the greater the blurring. All of the channels tend to have similar peaks and valleys that move together, but the blue channel has the lowest mean intensity among the three; most of the time, either the red or the green channel has the higher intensity. This could be evidence of Rayleigh scattering, wherein the degree to which light is scattered depends on its wavelength.

One similarity between the graphs is that the value channel of HSV tracks the pattern of the RGB channels. This is not much of a surprise, since the value channel is computed directly from the RGB intensities. What is more interesting is that the RGB spectral norm follows the RGB mean, except that it is shifted up to values around 5000, as shown in the figure below.

Figure 6. RGB spectral norm over time

The spectral norm acts as a distance-like measure on the image matrices. Perhaps this distance can be approximated by the mean of the values within the matrix.

2) RGB Contrast and Gradient

A more evident similarity exists between column contrast, row contrast, and gradient mean. As seen in the graphs on the succeeding page, they rise and fall together. Row and column contrast were computed separately because patch 1 displays bar patterns in both directions; the two contrasts could therefore differ and need separate computations. In this case, however, either one demonstrates the contrast degradation due to rain. In cases where the pattern runs in only one direction, or in a direction not aligned with x or y, it may still be better to check contrast in both directions. This relationship also explains why the gradient mean follows the contrast: gradients in one direction are essentially subtractions of adjacent intensity values, much like how contrast is computed, while contrast subtracts the minimum intensity from the maximum. There is a subtle difference, however: since the object is at a distance, pixel values are close together, and adjacent values do not differ much from the minimum or maximum. If the object were much nearer, then perhaps


changes in contrast would be more pronounced than changes in gradient.

Figure 7. RGB column contrast in time of patch 1

Figure 8. RGB row contrast in time of patch 1

Figure 9. RGB gradient mean in time of patch 1

Another similarity appears between the information entropy of the saturation channel and the gradient mean, column contrast, and row contrast.

3) HSV Information Entropy

The succeeding figure shows HSV entropy over time. Observing the saturation channel and comparing it with the graphs in Figs. 7 to 9, we can see that as the gradient or contrast rises, so does the information contained in the saturation channel. This is an interesting observation, since the information seems to track the gradient and the contrast. The saturation information entropy ranges from 1.89 bits to 6.431 bits (a 70% drop) and seems to be more sensitive than contrast or gradient.

Figure 10. HSV information entropy in time.

The first main difference observed is that the mean of the RGB channels in Fig. 5, as well as the RGB spectral norms in Fig. 6, do not move in the same direction as the contrast, gradient, and entropy in Figs. 7 to 10. This is not a surprise: as rain falls over the area containing the patch, the gradient and contrast decrease because the rain blurs the objects, averaging out the values of adjacent pixels. With values more uniform than before, the information entropy of the patch drops. Another observation is that once rain falls over an area, illumination there also decreases, which likewise contributes to drops in the contrasts, gradients, and entropy. The graph of RGB entropy over time below shows another result.

Figure 11. RGB entropy over time.

There are only a few spikes in the figure, and they correspond to the photos with a clear view of the building.

4) Comparison of Different Criteria

We have also compared the different criteria tested on the images. Three pairs of frames were selected to check the relative change in the values of the different criteria; each pair corresponds to one frame capturing a heavy rain event and one capturing a light or no-rain event. The frames were inspected manually. The representative frames are 44 (clear), 111 (rainy), 190 (clear), 237 (rainy), 492 (clear), and 502 (rainy).
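The paper does not state the exact %DIFF formula; normalizing the absolute difference by the larger of the two values reproduces the mean and spectral-norm columns of Table I, so the sketch below uses that as a plausible reconstruction rather than the authors' confirmed definition.

```python
def relative_change(clear, rainy):
    """Relative change between a clear-frame and a rainy-frame value.

    Assumed reconstruction: |clear - rainy| / max(|clear|, |rainy|).
    This matches the mean and spectral-norm %DIFF entries of Table I
    but is not stated explicitly in the paper.
    """
    hi = max(abs(clear), abs(rainy))
    return abs(clear - rainy) / hi if hi else 0.0

# Frame 44 vs 111, red-channel mean: 95.04 -> 171.38 gives ~0.445.
```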

TABLE I. COMPARISON OF RELATIVE CHANGE OF ALL RGB CRITERIA ON REPRESENTATIVE FRAME NUMBERS 44 (CLEAR) AND 111 (RAINY), 190 (CLEAR) AND 237 (RAINY), AND 492 (CLEAR) AND 502 (RAINY).

Frame  | Mean (R/G/B)            | Column contrast (R/G/B) | Row contrast (R/G/B)  | Gradient (R/G/B)       | Spectral norm (R/G/B) | Entropy (R/G/B)
44     | 95.04 / 92.8 / 83.680   | 0.333 / 0.351 / 0.395   | 0.386 / 0.397 / 0.437 | 117.90 / 85.36 / 83.26 | 3941 / 3854 / 3484    | 0 / 0 / 0
111    | 171.38 / 173.5 / 155.19 | 0.069 / 0.071 / 0.070   | 0.023 / 0.024 / 0.022 | 70.67 / 50.16 / 45.95  | 7033 / 7122 / 6368    | 0 / 0 / 0
%DIFF  | 0.445 / 0.46 / 0.461    | 0.617 / 0.611 / 0.567   | 0.762 / 0.748 / 0.731 | 0.657 / 0.681 / 0.715  | 0.440 / 0.459 / 0.453 | -
190    | 90.17 / 83.1 / 70.13    | 0.127 / 0.137 / 0.171   | 0.092 / 0.100 / 0.117 | 40.50 / 27.22 / 23.70  | 3712 / 3426 / 2896    | 0 / 0 / 0
237    | 159.65 / 156.2 / 144.05 | 0.082 / 0.079 / 0.092   | 0.022 / 0.021 / 0.027 | 66.49 / 45.76 / 42.90  | 6552 / 6412 / 5914    | 0 / 0 / 0
%DIFF  | 0.435 / 0.46 / 0.513    | 0.358 / 0.421 / 0.459   | 0.763 / 0.789 / 0.766 | 0.391 / 0.405 / 0.447  | 0.434 / 0.466 / 0.510 | -
492    | 110.10 / 107.6 / 86.47  | 0.132 / 0.148 / 0.189   | 0.105 / 0.108 / 0.135 | 72.55 / 48.81 / 43.34  | 4523 / 4423 / 3558    | 0 / 0 / 0
502    | 144.22 / 141.5 / 124.10 | 0.098 / 0.091 / 0.115   | 0.058 / 0.060 / 0.068 | 76.69 / 52.41 / 47.71  | 5920 / 5809 / 5096    | 0 / 0 / 0
%DIFF  | 0.237 / 0.24 / 0.303    | 0.261 / 0.388 / 0.388   | 0.448 / 0.443 / 0.495 | 0.054 / 0.069 / 0.092  | 0.236 / 0.239 / 0.302 | -


TABLE II. COMPARISON OF RELATIVE CHANGE OF ALL HSV CRITERIA ON REPRESENTATIVE FRAME NUMBERS 44 (CLEAR) AND 111 (RAINY), 190 (CLEAR) AND 237 (RAINY), AND 492 (CLEAR) AND 502 (RAINY).

As seen in Table I, the row contrast of the RGB channels gives the highest relative change in values. In Table II, the HSV row contrast is also among the highest for the first pair, but for the other pairs the largest changes come from the saturation mean, the value row contrast, the saturation gradient, and the saturation contrast.

5) Representative Frames

The figures below show two sample frames: one captured during a rain event and one with little or no rain.

Figure 12. Representative frame for an image with no rain (frame 45)

Figure 13. Representative frame for an image blurred by rain (frame 110)

Figure 12 shows a representative frame for an image that was not affected by rain. In Figure 13, patch 1 is affected by rain that looks heavier than that at the locations of patch 2 and patch 3. This shows that rain can be localized even over a distance of 50 meters.

B. 01-10-2011: Vision System with Rain Acoustic Sensors

The figure below shows the mean of RGB over time.

Figure 14. RGB sensor reading in time

Some of the local maxima are at frames 31, 69, 101, 128, 179, and 205; these peaks correspond to rain events. Some of the local minima are at frames 44, 89, 164, and 199 and correspond to clear images with little to no rain. An interesting result comes from our simultaneous measurements of acoustic power, shown in the succeeding graph.

Figure 15. Acoustic sensor reading in time

The envelope of the acoustic level over time seems to match the envelope of the RGB values over time. The intensities in Figure 14 range from 22.34 to 101.4, which seems to correspond to the range of values in Figure 15, from -72.7634 dB to -37.1057 dB in acoustic signal level. This is interesting because the result recorded from a point sensor (the acoustic sensor) has a strong relation with that of a volume sensor (the camera). In our experiment, targeted analysis of certain patches in the field of view reveals spatial localization of a rain event, as confirmed by our point sensors.
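The envelope comparison could be quantified along these lines; the rolling-maximum window size and the use of Pearson correlation are choices made here for illustration, not methods from the paper.

```python
import numpy as np

def envelope(series, window=15):
    """Rolling-maximum envelope of a 1-D signal; the window size is an
    arbitrary smoothing choice, not taken from the paper."""
    s = np.asarray(series, dtype=float)
    pad = window // 2
    padded = np.pad(s, pad, mode="edge")
    return np.array([padded[i:i + window].max() for i in range(len(s))])

def envelope_correlation(a, b, window=15):
    """Pearson correlation between the envelopes of two equally sampled
    series, e.g. per-frame RGB mean vs. acoustic level in dB."""
    ea, eb = envelope(a, window), envelope(b, window)
    return float(np.corrcoef(ea, eb)[0, 1])
```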

Figure 16. Tipping Bucket Data. Rainfall in mm/min. Time in (hh:mm)

Finally, there is also a recording from a tipping bucket deployed by the Manila Observatory weather station during the rain event. Figure 16 shows the measurements from the device, which follow a trend similar to that seen from the camera and the acoustic sensor. Since we measured the visibility effects of rain, other sensors such as the tipping bucket and the acoustic rain sensor could be used in tandem for possible sensor fusion.

V. CONCLUSION

This study was able to demonstrate the use of cameras as a visibility sensor and, more specifically, as a rain intensity indicator, as well as the use of various algorithms to compute such values. It was demonstrated that the intensity levels of the RGB channels can be used as visibility indicators, as can variations of contrast and gradient. Information entropy in the saturation channel can also serve as an indicator. The study was able to show fluctuations from 1.89 bits to

TABLE II.
Frame  | Mean (H/S/V)          | Column contrast (H/S/V) | Row contrast (H/S/V)  | Gradient (H/S/V)      | Spectral norm (H/S/V) | Entropy (H/S/V)
44     | 0.13 / 0.130 / 95.3   | 0.38 / 0.526 / 0.334    | 0.466 / 0.650 / 0.385 | 0.115 / 0.131 / 85.55 | 5.580 / 5.551 / 3954  | 3.491 / 5.413 / 0
111    | 0.19 / 0.106 / 173.6  | 0.07 / 0.113 / 0.071    | 0.047 / 0.101 / 0.024 | 0.085 / 0.041 / 50.16 | 7.639 / 4.355 / 7125  | 2.151 / 2.956 / 0
%DIFF  | 0.28 / 0.181 / 0.45   | 0.819 / 0.784 / 0.788   | 0.899 / 0.844 / 0.938 | 0.260 / 0.686 / 0.414 | 0.270 / 0.275 / 0.445 | 0.384 / 0.454 / -
190    | 0.11 / 0.224 / 90.17  | 0.08 / 0.155 / 0.127    | 0.077 / 0.11 / 0.092  | 0.055 / 0.084 / 29.25 | 4.452 / 9.250 / 3712  | 1.786 / 4.550 / 0
237    | 0.13 / 0.098 / 159.6  | 0.11 / 0.145 / 0.082    | 0.078 / 0.10 / 0.022  | 0.069 / 0.037 / 46.66 | 5.326 / 4.025 / 6552  | 1.544 / 2.999 / 0
%DIFF  | 0.16 / 0.563 / 0.44   | 0.29 / 0.068 / 0.358    | 0.009 / 0.120 / 0.763 | 0.206 / 0.564 / 0.373 | 0.164 / 0.565 / 0.434 | 0.136 / 0.341 / -
492    | 0.15 / 0.216 / 110.1  | 0.13 / 0.243 / 0.132    | 0.069 / 0.16 / 0.105  | 0.077 / 0.105 / 49.04 | 6.129 / 8.916 / 4523  | 2.729 / 4.789 / 0
502    | 0.15 / 0.140 / 144.2  | 0.13 / 0.131 / 0.098    | 0.080 / 0.10 / 0.058  | 0.071 / 0.054 / 53.11 | 5.955 / 5.744 / 5920  | 2.137 / 3.390 / 0
%DIFF  | 0.029 / 0.352 / 0.237 | 0.007 / 0.464 / 0.261   | 0.134 / 0.391 / 0.448 | 0.079 / 0.481 / 0.077 | 0.028 / 0.356 / 0.236 | 0.217 / 0.292 / -


6.431 bits for the saturation-channel information entropy, a relative drop of 70%. However, comparing all the criteria on pairs of images visually inspected to show heavy blurring and rain versus light blurring and rain, the row contrast of the RGB channels showed the highest relative change in values. This is because the selected patch has a high-contrast horizontal pattern.

An interesting result is that the mean RGB graph generated from the camera images has the same envelope as the signal recorded from the acoustic sensor, as well as that of the tipping bucket data. Finally, we have also observed localization of rain events even at a range of 50 meters.

VI. RECOMMENDATIONS

The results given are mostly dimensionless quantities, except for the mean intensity values and the entropy. These results can be compared with simultaneous independent measurements from transmissometers and nephelometers for the extinction coefficient as well as the meteorological optical range. Correlation with other data sources should also be done.

ACKNOWLEDGMENT

We would like to thank DOST-ERDT for their full support in this project. We would also like to thank the Manila Observatory for the tipping bucket data.

REFERENCES

[1] WMO CIMO Guide. Internet: http://www.wmo.int/pages/prog/www/IMOP/publications/CIMO-Guide/CIMO%20Guide%207th%20Edition,%202008/CIMO_Guide-7th_Edition-2008.pdf [Jan. 20, 2012].

[2] C. Steffens. “Measurement of visibility by photographic photometry,” Industrial Engineering Chemistry, vol. 41, pp. 2396-2399, January 1949.

[3] Z. Bin. "Visibility Measurement Using Digital Cameras in Qingdao," in Symposium on Photonics and Optoelectronics (SOPO), 2010, pp. 1-4, 19-21 June 2010.

[4] J.A. Betts. "The instrumental assessment of visual range," in Proceedings of the IEEE, vol. 59, no. 9, pp. 1370-1371, Sept. 1971.

[5] C. Luo, S. Liu, and C. Yuan, “Measuring Atmospheric Visibility by Digital Image Processing,” Aerosol and Air Quality Research, vol.2, pp. 23-29, 2002.

[6] F. M. Janeiro, F. Wagner, P. M. Ramos, and A. M. Silva. "Automated atmospheric visibility measurements using a digital camera and image registration," in 1st IMEKO TC19 International Symposium on Measurements and Instrumentation for Environmental Monitoring, Iaşi, Romania, Sept. 2007.
J.G. Shannon. "Light attenuation measurements with a unique transmissometer system," in IEEE International Conference on Engineering in the Ocean Environment, Ocean '74, pp. 254-259, 21-23 Aug. 1974.

[7] F. Caimi, D. Kocak, and J. Justak. “Remote visibility measurement technique using object plane data from digital image sensors,” in Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, 2004, pp. 3288–3291

[8] L. Xie, A. Chiu, and S. Newsam. "Estimating Atmospheric Visibility Using General-Purpose Cameras," in Proceedings of the 4th International Symposium on Advances in Visual Computing, Part II, p. 367. Springer, 2008.

[9] J.V. Molenar, D.S. Cismoski, F. Schreiner, and W.C. Malm. "Analysis of digital images from Grand Canyon and Great Smoky Mountains," in Regional and Global Perspectives on Haze: Causes, Consequences and Controversies Visibility Specialty Conference, 2004.

[10] J.-J. Liaw, S.-B. Lian, Y.-F. Huang, and R.-C. Chen. “Using sharpness image with Haar function for urban atmospheric visibility measurement,” Aerosol and Air Quality Research 10, 323–330, 2010.

[11] N. Graves and S. Newsam. "Using visibility cameras to estimate atmospheric light extinction," in IEEE Workshop on Applications of Computer Vision (WACV), 2011, pp. 577-584, 5-7 Jan. 2011.
H.F. Shu and R.D. Benton. "Data acquisition and analysis on the two-color-laser transmissometer system," in IEEE Proceedings of Southeastcon '91, pp. 810-814 vol. 2, 7-10 Apr. 1991.

[12] K. He, J. Sun, and X. Tang. “Single image haze removal using dark channel prior,” In IEEE Conference on Computer Vision and Pattern Recognition (CVPR’09), pp. 1956–1963, 2009.

[13] N. Hautiére, R. Babari, É. Dumont, R. Brémond and N. Paparoditis. “Estimating Meteorological Visibility using Cameras: a Probabilistic Model-Driven Approach,” in 10th Asian Conference on Computer Vision, 2010.

[14] R. Babari, N. Hautière, E. Dumont, and N. Paparoditis. A model-driven approach to estimate atmospheric visibility with ordinary cameras. Atmospheric Environment, 2011.

[15] K. Garg and S. Nayar. “Vision and rain,” International Journal of Computer Vision, vol. 75 no. 1, pp.3–27, October 2007.

[16] S.G. Narasimhan and S.K. Nayar, “Contrast Restoration of Weather Degraded Images,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 6 pp. 713-724, June 2003.

[17] J-P.Andreu, H. Ganster, E. Schmidt, M. Uray, and H. Mayer. “Comparative Study of Landmark Detection Techniques for Airport Visibility Estimation,” Austrian Association for Pattern Recognition Workshop.

[18] H.S. Prasantha, H.L. Shashidhara, and K.N. Balasubramanya Murthy. "Image Compression Using SVD," in International Conference on Computational Intelligence and Multimedia Applications, 2007, vol. 3, pp. 143-145, 13-15 Dec. 2007.

[19] D.Y. Tsai, Y. Lee and E. Matsuyama. “Information Entropy Measure for Evaluation of Image Quality,” Journal of Digital Imaging: The Official Journal of the Society for Computer Applications in Radiology, vol. 21 (3), pp. 338-47, Sep 2008.

[20] S. Poduri, A. Nimkar, and G. S. Sukhatme. “Visibility monitoring using mobile phones,” Technical report, Department of Computer Science, University of Southern California, 2010.

[21] M. Ando, M.M. Hasan, R. Jayawardene, T. Hirano, and J. Hirokawa. "Localized behaviors of rain measured in Tokyo Tech millimeter-wave wireless network," in Proceedings of the 5th European Conference on Antennas and Propagation (EUCAP), pp. 1688-1692, 11-15 April 2011.

[22] E. M. Trono (2011). Design and Performance of an Android-Based Integrated Rainfall Sensor and Alarm System. Quezon City: Ateneo de Manila University.
