Issue 60
Public Health Newsletter of Water Quality Research Australia
December 2010

In this Issue:
Cryptosporidium Outbreak In Sweden
DBPs, Pools And Bladder Cancer
EDC Sources In Water
News Items
From The Literature
Web Bonus Articles: Arsenic, Cyanobacteria, Disinfection Byproducts, Emerging Contaminants, Outbreaks, POU Treatment, Public Communication, Recycled Water, Reproductive Outcomes, Viruses
Mailing List Details

Editor: Martha Sinclair
Assistant Editor: Pam Hayes
WQRA Internet Address: www.wqra.com.au
A searchable Archive of Health Stream articles, literature summaries and news items is available via the WQRA Web page.

Cryptosporidium Outbreak In Sweden

Contamination of the drinking water supply for the Swedish town of Östersund has resulted in a cryptosporidiosis outbreak affecting over 11,000 people. Östersund is the only city and primary commercial centre for the county of Jämtland and is a popular venue for winter sports, including a number of international events. The city is located in the centre of Sweden and draws its water supply from Lake Storsjön (also known as the Great Lake or Grand Lake). The population of Östersund is about 58,000 people, with 53,000 receiving tap water from the municipal supply. According to reports from the Swedish Institute for Infectious Disease Control (SMI), a few sporadic laboratory-confirmed cases of cryptosporidiosis were detected among residents of Östersund in early November, although the number was not unusually high. This pathogen is considered to be a relatively rare cause of community gastroenteritis in Sweden; however, public health concerns had been raised by a foodborne cryptosporidiosis outbreak among 250 people attending a conference in a different town in Jämtland in early October. Thus initial investigation of the Östersund cases looked for evidence of a food-borne source, possibly linked to the previous outbreak, but no common food or drink exposures could be identified. By the third week of November increasing reports of workplace absenteeism due to “stomach flu” made it evident that a widespread outbreak was occurring in Östersund, and local authorities began to suspect that drinking water might be the source. On 26 November test results from intensified screening of faecal specimens revealed Cryptosporidium infection in the majority of patients
tested, with no evidence of other gastroenteric pathogens. A boil water notice was issued on 26 November, and the presence of the protozoan parasite in the drinking water supply was confirmed on 29 November by the results of water tests conducted by SMI. To date, genotyping of isolates from human patients has shown only C. hominis infections, indicating a human source for the contamination. An internet survey inviting residents to report gastroenteritis symptoms was initiated on the Östersund municipality website on 29 November, and more than 1,000 people responded within the first 24 hours. The number of reports of illness continued to grow and reached 11,200 on 9 December. The outbreak has caused widespread absenteeism among the workforce, but few people have been ill enough to require admission to hospital. The boil water notice has disrupted the normal activities of the community, and the municipality has announced that businesses will be able to claim compensation for their losses, but this will not extend to households. During this incident Lake Storsjön was described by the Environmental Manager for the municipality as “one of the cleanest lakes in Europe” although it is subject to agricultural run-off and sewage discharges. According to information on the website of the municipality, the high quality of raw water means that only relatively simple processes are needed to treat the water. The lake water is treated by alkalinisation, ozonation, filtration and disinfection before distribution to consumers. Ozone is normally considered to have good disinfectant capability against Cryptosporidium oocysts; however, its efficacy is affected by water temperature and other factors. Maximum ambient air temperatures in Östersund at this time of year are several degrees below zero Celsius, although Lake Storsjön does not normally freeze solid until later in the winter period. Media articles have cited a number of water quality experts and environmental scientists who have stated that the water quality in the lake is frequently poor and that the water treatment process is not sufficient to meet the potential contamination challenges. The location of the water intake point only 3 km from a sewage discharge outlet and the low degree of
sewage treatment prior to discharge have also been criticised. The municipality has reacted by inviting experts from the national company Swedish Water to conduct an independent review of the system and its operation to identify any problems. Ozonation and chlorination levels for the water supply were increased as an immediate response to the contamination, and installation of a UV treatment system to provide additional disinfection began on 10 December. Authorities have stated that the boil water notice may need to remain in force for several more weeks until the new UV treatment system is fully operational and the contaminated water has been flushed out of the distribution system. An extensive water sampling program has been undertaken to identify possible sources of the contamination, including the local sewage treatment plant as well as stormwater outlets and streams discharging into the lake. On 2 December authorities announced that a localised area of contamination had been found and that efforts would be made to contain the contamination, but no further details were given. On 9 December it was announced that the contamination had been traced to a single residential building in the city. The sewage line from the building was reported to have been illegally or mistakenly connected to a rainwater drainage system, which presumably drains into the lake via the stormwater system. Environmental crime investigators from the Swedish police force have been involved since the early stages of the investigation and it is possible that criminal charges may be laid over the incident after all the information has been assessed. At present the available evidence suggests that sewage contamination from a small number of people has caused this large scale outbreak. Volunteer studies of Cryptosporidium infection have demonstrated that infected individuals excrete large numbers of oocysts during the course of an infection (mean approximately 1 million, maximum over 100 million), and for some strains ingestion of as few as 10 oocysts may carry a 50% risk of infection. It would appear that the water treatment system did not provide an effective barrier against passage of viable oocysts into the drinking water supply.
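As a rough illustration (not part of the original report), the infectivity figure quoted above can be translated into a per-oocyst risk using the exponential dose-response model commonly applied to Cryptosporidium in quantitative microbial risk assessment. If a dose of 10 oocysts carries a 50% infection risk, then

\[ P_{\mathrm{inf}}(d) = 1 - e^{-rd}, \qquad 1 - e^{-10r} = 0.5 \;\Rightarrow\; r = \frac{\ln 2}{10} \approx 0.069, \]

so under this model even a single ingested oocyst would carry roughly a 7% chance of infection, which helps explain how a localised sewage input can produce thousands of cases across a distribution system.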


DBPs, Pools and Bladder Cancer

The November issue of the journal Environmental Health Perspectives featured four papers on disinfection byproducts: three relating to DBP exposure in swimming pools and one analysing associations between genetic markers and bladder cancer risk. The papers were produced from a joint research program between teams in Spain and the US, with additional collaborators in Canada, Greece, Germany and the Netherlands. The three swimming pool studies were carried out in public pools in Barcelona, Spain. Some studies of DBP exposure and bladder cancer have found elevated risks associated with swimming exposures as well as chlorinated tap water consumption. Complex mixtures of DBPs are formed in swimming pools due to continuous disinfection and the shedding of organic matter by swimmers. The three studies comprised:
• characterisation of DBP content and mutagenicity of one chlorinated pool and one brominated pool,
• a study of genotoxicity biomarkers in subjects swimming in the chlorinated pool,
• a study of respiratory function and lung biomarkers in subjects swimming in the chlorinated pool.

In the DBP characterisation study, mutagenicity of pool water extracts (two from each pool) was assessed using the Salmonella mutagenicity assay (1). Trihalomethanes in water were measured using gas chromatography/mass spectrometry (GC/MS), and disinfectant species were assayed by photometry. DBPs were identified using low- and high-resolution GC/MS, and THMs and trichloramine in air were also measured. Pool water samples (28 litres) were collected on five different days for the chlorinated pool and two different days for the brominated pool. Sampling days were several weeks apart. Concentrates were prepared using XAD resin and split for DBP characterisation (20 litre equivalent) or mutagenicity testing (8 litre equivalent). More than 100 DBPs were identified including haloacids, halomethanes, haloacetonitriles, haloaldehydes, haloketones, halonitromethanes, haloamides, haloalcohols, and halophenols. Many of these
compounds had not been previously reported in the literature or relevant databases. As expected due to the nitrogen-rich organic matter in swimming pool water, many of the identified DBPs were nitrogen-containing. Such compounds are considered to be more likely candidates for carcinogenic effects on the bladder than non-nitrogenous DBPs. Bromoform was present at higher levels in the brominated pool but other brominated DBPs occurred with equal frequency in both pool types, probably reflecting the relatively high levels of bromine in local tap water. The study of genotoxicity biomarkers was conducted in healthy adults who swam in the chlorinated indoor pool under controlled conditions (2). Participants in the study (n=50 healthy adult nonsmokers) were asked to abstain from swimming for 1 week before the scheduled 40 minute swimming exposure. Samples of blood, urine and exhaled air were collected from participants before and after the swimming exposure. Questionnaire data were collected on personal and family history of atopic diseases, cigarette smoke exposure, demographics, water consumption etc. Composite water samples were collected from four points in the pool during each swim and assayed for free chlorine and a range of DBPs. The concentration of THMs in exhaled breath was used as the exposure metric, and comparisons were made of micronuclei and DNA damage (as measured by the comet assay) in lymphocytes before swimming and one hour after swimming. The mutagenicity of urine before and two hours after swimming was compared, and the presence of micronuclei in exfoliated bladder epithelial cells shed in urine was also compared before swimming and two weeks after. Laboratory analyses were conducted blinded to the type of sample (before or after swimming). One subject was excluded from analysis due to failure to complete all sampling requirements. As expected, levels of THM in exhaled breath were significantly increased after swimming. Post-swimming blood samples showed that an increase in the frequency of micronuclei in lymphocytes was associated with levels of brominated THMs in breath but not with levels of chloroform. The comet assay showed no significant changes in levels of DNA damage after swimming. Urine tests showed a
significant increase in mutagenicity after swimming and this was associated with bromoform levels in exhaled breath. No changes were seen in the frequency of micronuclei in exfoliated bladder cells after swimming. The authors comment that although swimming undoubtedly has many health benefits, and adequate disinfection of pool water is needed to prevent waterborne infections, disinfection may also carry health risks which warrant further attention. The same group of subjects took part in a study assessing respiratory function, biomarkers of airway inflammation, oxidative stress and lung permeability (3). Some previous studies have indicated an association of respiratory effects and asthma with swimming in chlorinated pools, but results have been inconsistent. Lung function tests and sampling of exhaled breath and blood were performed before and after the 40 minute swim. Two subjects were excluded from analysis due to a history of asthma. The only significant change observed after swimming was a moderate increase in levels of the Clara cell secretory protein (CC16) in the serum (median increase 3.3%). This lung surfactant protein is secreted by the lung epithelium. The change in CC16 levels was associated with energy expenditure during swimming and change in THM concentrations in exhaled breath. However given that CC16 levels are highly variable in healthy subjects, the clinical significance (if any) of this short term increase is unknown. In the fourth paper, data from a previously published case-control study of bladder cancer in Spain was further analysed using genotyping information for three enzymes involved in the metabolism of DBPs (4). All of these genes are known to be polymorphic in humans (i.e. to exist in at least two genetically determined forms with differing enzymic activity). Cytochrome P450 2E1 (CYP2E1) is involved in the primary oxidation of THMs and also metabolizes a wide variety of aliphatic hydrocarbons, solvents, and industrial monomers. Glutathione S-transferase (GST) theta-1 (GSTT1) is able to activate brominated THMs to mutagenic form in a transgenic strain of Salmonella. GST zeta-1 (GSTZ1) catalyses the oxygenation of dichloro- and other α-haloacids, some of which are animal carcinogens.

The analysis was carried out on 680 cases (595 males, 85 females) and 714 controls (622 males, 92 females) where both genotype and adequate exposure information existed. These numbers were similar to those analysed in the previously published study and represent approximately half of potentially eligible incident bladder cancer cases (n=1457) and their controls (1465) identified during the study period. As in the previous analysis, there was a significant association between bladder cancer and increasing levels of THM exposure without taking genotype into consideration. THM exposure was calculated as the average THM concentration in home tap water from age 15 until age of cancer diagnosis (or equivalent age for controls). The adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for THM exposure quartiles 2, 3 and 4 were 1.2 (0.8–1.8), 1.8 (1.1–2.9), and 1.8 (0.9–3.5), respectively. When the genotypes were included in the analysis, the associations between THM exposure and bladder cancer differed between people with different versions of each of the three enzyme genes. The findings were consistent with understanding of the biochemical functions of the enzymes. Cancer risks were particularly elevated for those carrying the high risk forms of both GSTT1 and GSTZ1. For this subgroup, the ORs for quartiles 2, 3, and 4 of THM exposure were 1.5 (0.7–3.5), 3.4 (1.4–8.2), and 5.9 (1.8–19.0), respectively. Based on the frequency of genotypes in the control group, more than 20% of the population may carry high-risk gene combinations. The authors note that these findings strengthen the evidence base for a causative role of DBPs in bladder cancer, and similar studies in other populations are needed to confirm these observations.

(1) What’s in the Pool? A Comprehensive Identification of Disinfection By-products and Assessment of Mutagenicity of Chlorinated and Brominated Swimming Pool Water. Richardson SD, DeMarini DM, Kogevinas M et al. (2010) EHP 118(11):1523–1530.
(2) Genotoxic Effects in Swimmers Exposed to Disinfection By-products in Indoor Swimming Pools. Kogevinas M, Villanueva CM, Font-Ribera L et al. (2010) EHP 118(11):1531–1537.
(3) Short-Term Changes in Respiratory Biomarkers after Swimming in a Chlorinated Pool. Font-Ribera L, Kogevinas M, Zock JP et al. (2010) EHP 118(11):1538–1544.
(4) Polymorphisms in GSTT1, GSTZ1, and CYP2E1, Disinfection By-products, and Risk of Bladder Cancer in Spain. Cantor KP, Villanueva CM, Silverman DT, et al. (2010) EHP 118(11):1545–1550.
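The quartile-specific odds ratios quoted above come from adjusted logistic regression models, but the basic building block is the 2x2 table. The short sketch below uses hypothetical counts (not the study data) to show how an unadjusted odds ratio and a Wald 95% confidence interval of the kind reported in such studies can be computed.

```python
import math

def odds_ratio_ci(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table."""
    a, b, c, d = exposed_cases, exposed_controls, unexposed_cases, unexposed_controls
    or_estimate = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of ln(OR)
    lower = math.exp(math.log(or_estimate) - z * se_log_or)
    upper = math.exp(math.log(or_estimate) + z * se_log_or)
    return or_estimate, lower, upper

# Hypothetical counts: cases and controls in the top THM exposure quartile
# versus the lowest quartile (illustration only, not the study data).
print(odds_ratio_ci(exposed_cases=120, exposed_controls=90,
                    unexposed_cases=100, unexposed_controls=135))
# -> (1.8, ~1.24, ~2.62)
```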


EDC Sources In Water

US researchers have recently published a review of evidence on the sources of endocrine disrupting chemicals (EDCs) which showed that, contrary to popular belief, oral contraceptives (OC) are not the major contributor of such chemicals in environmental waters (1). Using data from the US and western Europe, the study assessed the daily excretion of natural estrogens by the human population, the excretion of prescribed pharmaceutical estrogens (OC, hormone replacement therapy (HRT), and hormone treatment for certain cancers), veterinary medical pharmaceuticals, pesticides, plant and food industry sources and natural hormones excreted by livestock. The human body produces three types of steroid estrogens: estrone (E1), 17-beta-estradiol (E2), and estriol (E3). The synthetic estrogen hormone most commonly used in OCs is 17-alpha-ethinyl estradiol (EE2), while HRT comprises a mixture of conjugated equine estrogens. These hormones differ in potency and are normally measured relative to the potency of E2 in biological assays (which is deemed to have a potency of 1). Their relative potencies are EE2: 2.0; E2: 1; E1: 0.2-0.4; E3: 0.024-0.026. Prescribed HRT comprises a mixture of related hormones with highly variable potencies relative to E2. The amounts of estrogen hormones excreted each day vary by gender, age and reproductive state. Non-pregnant women of reproductive age excrete about 2 to 2.5 times more estrogens than men, while during pregnancy, estrogen excretion rises about 400-fold. The contribution of each component of the population to estrogens in sewage effluent will depend on the age and gender profile, pregnancy rates and prevalence of OCP and HRT use. A Dutch estimate made in 2001 found that pregnant women contributed around 44% of excreted estrogens by weight, non-pregnant women contributed 36%, men 12%, elderly people 2%, children below 14 years of age 2%, and children aged 14 to 19 years contributed 3%. OC use was estimated to contribute only 1% of excreted estrogens from the population in a situation where 43% of women of reproductive age used OC. There are few data on excretion of estrogens from HRT, but one study reported that the amounts of
prescribed estrogen from HRT and veterinary pharmaceuticals exceeded the amount in prescribed OCPs. Activated sludge treatment of sewage results in removal of at least 85% of E2 and EE2. Livestock excrete the same three estrogen hormones as humans, and generally in larger amounts, although there is insufficient information to determine the relative proportion originating from natural and veterinary sources. It has been estimated that livestock in the US produce about 13 times more solid waste than the total sanitary waste production from the human population. While human sewage usually undergoes treatment before release into the environment, livestock manure is generally untreated and is often spread as a fertiliser. It has been estimated that animal waste contributes 90% of estrogens in the environment, and is likely to be the major source of estrogenic compounds entering surface and ground waters. A number of pesticides have endocrine disrupting properties and although these compounds have much lower potency than E2 (100 to 1,000 fold less) they have been detected at high levels in some water sources. Plants also produce estrogenic compounds (phytoestrogens) and these dietary sources are believed to contribute to estrogen excretion levels in humans and animals. The potency of these compounds is also very low compared to E2 but there have been reports of levels sufficient to cause estrogenic effects in fish in streams receiving wastewater from soy milk and biodiesel factories in some locations in the US. A wide range of industrial chemicals also possess endocrine disrupting activities, and the occurrence of these in sewage and source waters varies greatly. This analysis indicates that 17-alpha-ethinyl estradiol (EE2) from oral contraceptives is at most a minor contributor to EDC effects in environmental waters. The authors conclude that the particular focus on this synthetic hormone in the popular media and scientific literature may not be warranted, and greater attention should be paid to other EDC sources, particularly those that are currently unregulated and untreated. (1) Are Oral Contraceptives a Significant Contributor to the Estrogenicity of Drinking Water? Wise A, O’Brien K and Woodruff T. (2010) Environmental Science & Technology doi: 10.1021/es1014482
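As a worked illustration of the relative-potency weighting underlying this comparison (the concentrations below are hypothetical and are not taken from the review), the combined estrogenic strength of a mixture can be expressed as estradiol equivalents:

\[ \mathrm{EEQ} = \sum_i c_i \times \mathrm{RP}_i \]

For example, an effluent containing 20 ng/L E1, 5 ng/L E2, 30 ng/L E3 and 0.5 ng/L EE2 would have, using mid-range potencies from the list above, EEQ ≈ 20(0.3) + 5(1) + 30(0.025) + 0.5(2.0) = 12.75 ng/L E2 equivalents, of which EE2 contributes only about 8% despite being the most potent single compound.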


News Items

Long Term Health Risks From E. coli O157

A paper published recently in the British Medical Journal has highlighted the risks of long term adverse effects following infection by E. coli O157. Increased risks of hypertension, cardiovascular disease and renal impairment were found among Walkerton residents who suffered severe gastroenteritis during the waterborne outbreak in May 2000 compared to those who had mild illness or were not ill during the event. After a median follow up of 7.9 years, those who had severe gastroenteritis were twice as likely to have suffered a heart attack, stroke or congestive heart failure. The risk of developing hypertension increased by 33% and the risk of renal impairment increased by 15%.

Long term risk for hypertension, renal impairment, and cardiovascular disease after gastroenteritis from drinking water contaminated with Escherichia coli O157:H7: a prospective cohort study. Clark WF et al. (2010) British Medical Journal 341: doi:10.1136/bmj.c6020

Climate Change Impacts On Poor

In the lead up to the Cancun Climate Change Conference in Mexico, the international aid agency Oxfam has issued a media release detailing the impacts of climate change on developing nations, especially among the poorest sectors of their populations. According to the release, extreme weather events during the first 9 months of 2010 have killed 21,000 people, more than twice as many as the previous year. Major events this year included massive floods in Pakistan and a heat wave in Russia that was accompanied by numerous forest fires.

Costs of Waterborne Diseases

The US Centers for Disease Control and Prevention have estimated that the combined healthcare costs for infections caused by Legionella, Giardia and Cryptosporidium in the US may be as much as $539 million per year. This estimate is based on extrapolation of data from a large insurance database and covers only inpatient hospitalised cases. There were large uncertainty factors around the estimate but the average costs for individual cases of each disease were estimated at $34,000 for Legionnaires' disease, approximately $9,000 for giardiasis and more than $21,000 for cryptosporidiosis.

From the Literature

Web-bonus articles

Summaries of these additional articles are available in the web page version of Health Stream and are included in the searchable archive at: www.wqra.com.au/WQRA_publications.htm

Arsenic exposure and risk of spontaneous abortion, stillbirth, and infant mortality. Rahman A, Persson LA, Nermell B, et al. (2010) Epidemiology 21(6):797-804.

Lung function in adults following in utero and childhood exposure to arsenic in drinking water: preliminary findings. Dauphine DC, Ferreccio C, Guntur S, et al. (2010) International Archives of Occupational and Environmental Health, DOI 10.1007/s00420-010-0591-6

Disinfection by-products in drinking water and colorectal cancer: a meta-analysis. Rahman MB, Driscoll T, Cowie C and Armstrong BK. (2010) International Journal of Epidemiology, 39(3); 733-745.

Pharmaceuticals, personal care products and endocrine-disrupting chemicals in U.S. surface and finished drinking waters: A proposed ranking system. Kumar A and Xagoraraki I. (2010) Science of the Total Environment 408(23); 5972-5989.

Cytotoxic drugs in drinking water: A prediction and risk assessment exercise for the Thames catchment in the United Kingdom. Rowney NC, Johnson AC and Williams RJ. (2009) Environmental Toxicology and Chemistry, 28(12); 2733-2743.

Association between rainfall and pediatric emergency department visits for acute gastrointestinal illness. Drayna P, McLellan SL, Simpson P, Li SH and Gorelick MH. (2010) Environmental Health Perspectives 118(10); 1439-1443.

An outbreak of shiga toxin-producing Escherichia coli infection associated with a school camp. McCall BJ, Slinko VG, Smith HV, et al. (2010) Communicable Diseases Intelligence, 34(1); 54-56.

Epidemiologic evidence on the health effects of perfluorooctanoic acid (PFOA). Steenland K, Fletcher T and Savitz DA. (2010) Environmental Health Perspectives, 118(8); 1100-1108.

Two-year monitoring of Cryptosporidium parvum and Giardia lamblia occurrence in a recreational and drinking water reservoir using standard microscopic and molecular biology techniques. Helmi K, Skraber S, Burnet JB, et al. (2010) Environmental Monitoring and Assessment, DOI 10.1007/s10661-010-1726-7

High-level exposure to lithium, boron, cesium, and arsenic via drinking water in the Andes of Northern Argentina. Concha G, Broberg K, Grander M, et al. (2010) Environmental Science and Technology, 44(17); 6875-6880.

Presence of enteric viruses in source waters for drinking water production in the Netherlands. Lodder WJ, Van Den Berg HHJL, Rutjes SA and De Roda Husman AM. (2010) Applied and Environmental Microbiology, 76(17); 5965-5971.


Arsenic

Lifetime exposure to arsenic in drinking water and bladder cancer: A population-based case-control study in Michigan, USA. Meliker, J.R., Slotnick, M.J., AvRuskin, G.A., Schottenfeld, D., Jacquez, G.M., Wilson, M.L., Goovaerts, P., Franzblau, A. and Nriagu, J.O. (2010) Cancer Causes and Control, 21(5); 745-757.

There is evidence to show that chronic exposure to arsenic in drinking water increases the risk of bladder cancer in humans; however, evidence for this association has been based on studies in high arsenic regions (greater than 150 micro g/L). At lower levels of arsenic exposure (less than 100 micro g/L) risks remain uncertain. A population based bladder cancer case-control study was undertaken in Michigan. High rates of bladder cancer mortality have been reported for nearly 50 years in southeastern Michigan, however these elevated rates are not well explained. This region of Michigan has a population of 2.8 million people, with about 1.6 million drinking groundwater. Around 230,000 people are estimated to be drinking groundwater with arsenic levels greater than 10 micro g/L (the new Maximum Contaminant Level set by the U.S. EPA). This study was conducted in eleven counties of southeastern Michigan. These counties were selected to investigate the risk of moderately elevated arsenic concentrations in groundwater (less than 100 micro g/L). There were 411 cases diagnosed with urinary bladder cancer between 2000 and 2004 recruited from the Michigan Cancer Surveillance Program. To be eligible, cases had to have resided in the eleven county study area for at least the previous 5 years and be aged between 21 and 80 years at diagnosis. There were 566 controls, recruited by random digit dialling, who were frequency matched to cases by age (plus or minus 5 years), gender and race. Participants answered a 30-45 min computer-aided telephone interview which included questions on water and other fluid consumption, dietary habits, smoking and medical history. Residential and occupational history details were collected via postal questionnaires.

Water samples were collected from residential sources used for drinking (including coffee) and cooking, as well as untreated water if participants used a private well. Water samples were analysed for arsenic using an inductively coupled plasma mass spectrometry unit (ICP-MS). A geostatistical model was developed to predict arsenic concentrations at past residences in private well water. To estimate arsenic concentrations at past residences in public water supplies, 1,675 arsenic measurements in public well water supplies in the study area were extracted from a Michigan state arsenic database (1983-2004). A water supply history between 1920 and 2003 was also obtained. The state arsenic database was combined with the water supply history to generate an arithmetic mean of arsenic concentration for each public groundwater supply over applicable time periods. Estimates of arsenic concentration in drinking water were linked to individual lifetime residential histories. Arsenic concentration estimates were also calculated for public and private supplies outside the study area. Arsenic concentration estimates inside and outside of the study area were combined for each participant to provide estimates of arsenic concentration in drinking water over the life-course. Two arsenic exposure metrics were used for each participant: (a) arsenic concentration estimates (micro g/L) from the lifetime exposure assessment; (b) arsenic intake (micro g/day) over the adult life-course. These metrics were used to construct time-weighted average (TWA) exposure measures. TWA life-course arsenic concentration and arsenic intake exposure metrics were not associated with the risk of bladder cancer in unadjusted or multivariate adjusted analyses. The TWA life-course arsenic concentration metric resulted in higher ORs than the arsenic intake metric. Multivariate-adjusted analyses stratified by ever smoking produced no significant findings using both exposure metrics. Among never-smokers however, risk was significantly elevated for higher levels of TWA life-course arsenic concentration using continuous measures (OR = 1.29; 95% CI: 1.03, 1.63, per 5 micro g/L increase). In the subgroup who drank more than 1 litre of water at home each day, multivariate-adjusted analyses showed a near-significant association between arsenic and bladder cancer in continuous analyses
(OR = 1.17; 95% CI: 0.94, 1.44, per 5 micro g/L increase). Yearly estimates of exposure were analysed for a relationship with bladder cancer. In analyses stratified by water consumption greater than 1 L/day, multivariate analyses indicated that exposures occurring when participants were between 45 and 52 years old were associated with peaks in estimated risk using continuous or categorical measures of arsenic exposure. Over this 8 year age-at-exposure period, 30-32 cases and 20-25 controls were in the above average water consumption and high-arsenic (greater than 10 micro g/L) strata contributing to these results. Risk was elevated for exposures between ages 30 and 55, but this was not significant for all years. Further stratification by above average percentage of fluids that contain water from home revealed a similar time period of elevated risk. Private well use (regardless of arsenic content) was not associated with bladder cancer in the multivariate-adjusted full analyses (OR = 1.01; 95% CI: 0.76, 1.32), or among the stratum of greater than 1 L/day home water consumers (OR = 1.16; 95% CI: 0.78, 1.73), or among the stratum of above average home water consumers who also consumed a greater than average proportion of fluids from their home water source (OR = 1.13; 95% CI: 0.63, 2.04). Overall, the findings indicate that low-level TWA arsenic concentration in drinking water and arsenic intake were not associated with bladder cancer. In the underpowered stratified analyses, however, effects were suggested for those consuming greater than 1 L/day of water with arsenic greater than 10 micro g/L and who drank an above average percentage of fluids containing water from home. Time-specific analyses suggested that arsenic exposure levels greater than 10 micro g/L among individuals between ages 30 and 55 with above average home water consumption were associated with subsequent development of bladder cancer. Further studies are required to investigate effects of low-level arsenic exposure in analyses stratified by fluid consumption and to determine whether there are one or more age-specific intervals of exposure susceptibility.
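The time-weighted average (TWA) exposure metric described above can be illustrated with a small sketch. The residential history and concentrations below are hypothetical; the actual study combined modelled private-well estimates and public-supply records rather than known values.

```python
def time_weighted_average(residences):
    """Time-weighted average arsenic concentration over a residential history.

    residences: list of (years_at_residence, arsenic_ug_per_L) tuples covering
    the exposure window (here, age 15 up to the diagnosis/reference age).
    """
    total_years = sum(years for years, _ in residences)
    weighted_sum = sum(years * concentration for years, concentration in residences)
    return weighted_sum / total_years

# Hypothetical residential history (illustration only):
history = [
    (10, 2.0),   # 10 years on a public supply averaging 2 micro g/L
    (20, 15.0),  # 20 years on a private well averaging 15 micro g/L
    (15, 5.0),   # 15 years on a public supply averaging 5 micro g/L
]
print(round(time_weighted_average(history), 2))  # -> 8.78 micro g/L
```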

Cyanobacteria

Vulnerability of Quebec drinking-water treatment plants to cyanotoxins in a climate change context. Carriere, A., Prevost, M., Zamyadi, A., Chevalier, P. and Barbeau, B. (2010) Journal of Water and Health, 8(3); 455-465.

The province of Quebec, Canada, has experienced an increasing occurrence of cyanobacterial blooms in recent years, with 356 bloom cases reported from 2004 to 2007. This increase in occurrence is considered to be due to climate change with increases in water temperature, growing season, runoff and nutrient availability. Studies conducted in Quebec to evaluate the concentration of cyanotoxins entering the treatment plants found maximum concentrations of 5.35 micro g/L equ. microcystin-LR (MC-LR) and 2.3 micro g/L anatoxin-a. This study was undertaken to evaluate the vulnerability of the 284 surface water treatment plants of the province of Quebec and their capacity to treat cyanotoxins to achieve compliance with relevant drinking water guidelines (maximum recommended concentration of 1.5 micro g/L for MC-LR and the provisional concentration of 3.7 micro g/L for anatoxin-a). A subgroup of 29 surface water treatment plants (WTPs) in Quebec were selected for an in depth investigation of their efficiency. This involved a questionnaire and telephone interview with personnel. These plants were selected on the basis of (i) use of a recognised treatment for cyanotoxins, (ii) history of cyanobacterial blooms in the source water, (iii) geographical coverage of the province and (iv) source water type (lake and river). For these 29 plants, the efficiency of individual treatment units in coping with microcystins and anatoxin-a was calculated based on the information found in the literature for each treatment process. On the basis of the calculated treatment unit efficiencies, extrapolation was made to the other surface water plants. The study assumed that 100% of the cyanotoxin was in soluble extracellular form in order to assess the maximum theoretical risk scenario. In reality, a proportion of the toxin(s) in a bloom would be intracellular, and physical removal of intact cells would contribute to toxin reduction.


The theoretical maximum toxin removal was calculated for the 11 plants where powder activated carbon (PAC) is employed using their maximum PAC dosage capacity (2.5-45 mg/L). The predicted adsorption was expressed on a log scale. There were four WTPs which presented negligible removal of toxins due to the low capacity of PAC dosage (less than 10 mg/L). The other plants obtained between 0.5 and 2.8 toxins log removal, with mean and median removals of 1.1 and 0.89 log, respectively. Three oxidants (chlorine, ozone and potassium permanganate) were also investigated for their potential to oxidise toxins. The pH, contact time, and residual concentration of oxidant and the hydraulic efficiency were obtained from the questionnaire and telephone interviews. Potassium permanganate was used in 5 plants of the 29, mostly for taste and odour control. The dosages of permanganate applied were all below 1.0 mg/L. The residual permanganate concentrations required to calculate toxin removals were only available for two plants. The use of permanganate for toxin treatment would require a better process control including residual concentration measurement. All of the current systems inject permanganate in raw water so that manganese levels in finished water are minimised, however in the context of a cyanobacterial bloom pre-oxidation may lead to cell lysis and release of intracellular toxins. The log removals of anatoxin-a and MC-LR by chlorine and ozone were calculated. The MC-LR log removals by chlorine varied widely (0.15 to 31 log) reflecting large differences between the WTPs in oxidant exposure, contact time and pH. The use of chlorine to oxidise anatoxin-a was not a viable option for the WTPs investigated as the maximum removal calculated is under 1 log. Ozonation, on the other hand, provided reductions of both anatoxin-a and microcystin-LR in excess of 2 log, with the average reduction exceeding 50 log. The efficiency of treatment trains to control MC-LR and anatoxin-a was calculated considering three occurrence scenarios: (i) historical maximums, (ii) proposed guidance for design and (iii) climate change. As the vulnerability of the various water
bodies to cyanobacterial blooms was not known, all plants were considered vulnerable, which is a very conservative hypothesis. The percentage of treatment trains able to lower MC-LR concentration under the acceptable limit of 1.5 micro g/L was calculated considering the three source water occurrence scenarios. The percentage of treatment trains able to lower anatoxin-a concentration under the acceptable limit of 3.7 micro g/L considering the three scenarios was also calculated. A performance projection was made for the 284 surface WTPs in Quebec for increasing source water toxin concentrations using the information obtained from the scenarios. It was calculated that 80% of the WTPs are presently able to treat the maximum historical concentration of microcystin measured in Quebec (5.35 micro g/L MC-LR equ.). About two-thirds of WTPs can cope with MC-LR concentrations of 30 micro g/L (the Quebec design guidance value). A climate change scenario of 60 micro g/L MC-LR changes this percentage to 60%. Anatoxin-a presents a more difficult challenge for the water industry. When concentrations are above 15 micro g/L in source water, only a fraction of the plants using PAC (45%) and all those using ozone are able to deal with this toxin. Potassium permanganate may be able to improve anatoxin-a removals however significant operational challenges still need to be resolved with this process. With the growing concern in Quebec about cyanotoxins in the treatment of surface water, further research is required to correctly evaluate the risks. Increased anatoxin-a levels in source waters would be significantly harder to deal with for the water industry than MC-LR. Anatoxin-a is highly resistant to conventional treatment and occurrence data are limited. PAC and/or potassium permanganate may be suitable options for treatment, however further research is needed to evaluate their applicability and resolve operational questions.
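As a back-of-envelope illustration of the compliance calculation described above (the specific treatment train and removal values are assumed for illustration, not taken from the paper), log removals of sequential treatment units are conventionally treated as additive, so the finished water concentration is

\[ C_{\text{finished}} = C_{\text{source}} \times 10^{-\sum_i LR_i}. \]

A source water at the Quebec historical maximum of 5.35 micro g/L MC-LR treated by a train providing the median PAC removal of 0.89 log would give 5.35 × 10^{-0.89} ≈ 0.69 micro g/L, below the 1.5 micro g/L limit, whereas the design guidance value of 30 micro g/L would require a total removal of at least log10(30/1.5) ≈ 1.3 log.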

Emerging Contaminants

Toxicological relevance of pharmaceuticals in drinking water. Bruce, G.M., Pleus, R.C. and Snyder, S.A. (2010) Environmental Science and Technology, 44(14); 5619-5626.


Interest is increasing in the public health significance of trace levels of pharmaceuticals in potable water. Even though most pharmaceuticals are tested for human safety before marketing, the potential for adverse effects in nontarget populations exposed to low levels in the environment has not been established. Suggested possible health effects of long-term exposure to pharmaceuticals in drinking water may include endocrine disruption, antibiotic resistance, genotoxicity, carcinogenicity or allergic reactions, as well as effects on reproduction or foetal/child development. No regulatory limits have been established for pharmaceuticals in drinking water, however agencies are beginning to recommend monitoring. There have been several studies investigating the potential for adverse effects from exposure to pharmaceuticals in water, and in general the authors have concluded that human health risks from exposures are not expected. In the U.S., however, an examination of the significance of pharmaceuticals detected in drinking waters has not been conducted. This study was undertaken to identify a target list of pharmaceuticals likely to be present or represent a potential health risk, obtain occurrence data from drinking water sources, develop health risk-based screening levels and compare screening levels to detected concentrations. Target pharmaceuticals were selected for analysis on the basis of likelihood of presence and potential toxicity as well as interest among the public or water utilities. The target analytes comprised a range of drug classes, including three of the top 25 compounds prescribed during 2003 or 2004. Meprobamate, an antianxiety agent, was included because of evidence of frequent detection. Seven pharmaceuticals that California DHS suggests be included in monitoring programs for groundwater recharge reuse projects were included: atorvastatin, carbamazepine, gemfibrozil, meprobamate, phenytoin, sulfamethoxazole and trimethoprim. In total there were 15 pharmaceutically active ingredients and four metabolites. Health risk-based screening levels were derived to establish exposure levels not likely to be associated with adverse health effects following chronic exposure. For noncarcinogenic end points, threshold
doses for toxic effects were identified from studies in animals or humans that identified no observed adverse effect levels (NOAELs) and lowest observed adverse effects levels (LOAELs) for reproductive, developmental, systemic and other toxicity end points. NOAELs and LOAELs were divided by Uncertainty Factors (UFs) to account for potentially sensitive populations and uncertainties in the data set. The lowest dose resulting was chosen as the point of departure (POD) for estimating noncarcinogenic risks. For compounds having evidence of carcinogenicity in animal studies, a linear extrapolation model was used to predict the tumourigenic response at low doses if data on tumour incidence were available. The resulting slope factor (SF) was the upper-bound estimate of risk per increment of dose (1 mg/kg-d of exposure), and this was used to estimate the probability of developing cancer at a given dose. For those drugs with evidence of carcinogenicity in animals but for which tumour incidence data were not found, the maximum tolerated dose (MTD) was used as the basis for predicting the cancer potential. Screening levels were converted to drinking water equivalent levels (DWELs) for comparison to concentrations in water by multiplying by an average adult body weight of 70 kg and dividing by a daily drinking water ingestion rate of 1 L/d. DWELs for carcinogens were estimated assuming a lifetime excess cancer risk of one additional cancer per million lifetime exposures, and continuous exposure for 30 years over a 70-year lifetime. DWELs were compared to the highest concentration detected in drinking water collected from 19 drinking water treatment plants (DWTPs) across the United States or water collected at 15 distribution sites representing 13 of the DWTPs. DWELs were found to range from 0.49 micro g/L (risperidone) to 20,000 micro g/L (naproxen). Of the 15 parent compounds, eight had screening levels based on noncarcinogenic effects and of these, seven were based on reproductive or developmental end points; these compounds had UFs ranging from 300 to 1,000. The meprobamate screening level was based on effects of carisoprodol on the liver and a UF of 10,000 was applied due to limited data. For the
remaining seven compounds, screening levels were based on evidence of carcinogenicity. There was sufficient tumour incidence data to develop SFs for gemfibrozil and phenytoin and for the other five compounds screening levels were calculated from the MTD. Maximum-detected concentrations in finished or distribution water were compared to DWELs. There were ten compounds detected in one or both sources. The other nine compounds were not detected at detection limits of 0.5 ng/L or lower in either source. Of the detected compounds, the estimated risks associated with atenolol, diazepam, fluoxetine, meprobamate, norfluoxetine and sulfamethoxazole were all low, with a margin of exposure (MOE, the ratio of the DWEL to the maximum detected concentration) of 3,800 or greater. The screening levels for these compounds were based on noncarcinogenic effects. The screening levels for the other four compounds (carbamazepine, gemfibrozil, phenytoin and risperidone) were based on carcinogenicity. Out of these compounds, risperidone appears to be the most potent, with reported increases in pituitary gland and endocrine pancreatic adenomas, mammary gland adenocarcinomas and renal tubular adenomas and adenocarcinomas in rodents. However, risperidone was only detected above the reporting limit in one of the 15 distribution water samples and in no finished drinking water or source sample, with an MOE of 170. The other three carcinogens were detected in one-quarter or more of the finished and distribution water samples, with MOEs ranging from 110 (phenytoin) to 6,500 (gemfibrozil). Gemfibrozil is considered a Group 3 carcinogen (not classifiable as to its carcinogenicity in humans), as there is inadequate evidence in humans and limited evidence in animals. However, phenytoin is considered a possible human carcinogen based on evidence in rodents but unclear evidence in humans. Overall, the concentrations of the detected pharmaceuticals at the DWTPs and distribution sites in the U.S. were well below DWELs and would not be expected to cause adverse health effects. As drinking water standards for pharmaceuticals have not been established at present and little has been
published on the safety of long-term exposure levels, it is challenging to determine health-based treatment goals for drinking water utilities. Each community therefore must determine whether it will wait for more information to become available or take proactive measures to treat and remove pharmaceuticals, regardless of the cost and uncertain benefits. Research examining other pharmaceutically active ingredients and exposure to mixtures of these compounds should continue, particularly for sensitive population groups.
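As a worked illustration of the screening calculation described in this summary (the toxicity values below are hypothetical, not those derived in the paper), a noncarcinogenic DWEL follows from

\[ \mathrm{DWEL} = \frac{(\mathrm{NOAEL}/\mathrm{UF}) \times 70\ \mathrm{kg}}{1\ \mathrm{L/d}}. \]

For a hypothetical compound with a NOAEL of 5 mg/kg-d and a composite UF of 1,000, the screening level is 0.005 mg/kg-d and the DWEL is 0.005 × 70 / 1 = 0.35 mg/L (350 micro g/L); if the maximum detected concentration were 0.01 micro g/L (10 ng/L), the margin of exposure would be MOE = 350 / 0.01 = 35,000, comfortably above the levels of concern discussed above.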

Outbreaks

An extensive gastroenteritis outbreak after drinking-water contamination by sewage effluent, Finland. Laine, J., Huovinen, E., Virtanen, M.J., Snellman, M., Lumio, J., Ruutu, P., Kujansuu, E., Vuento, R., Pitkanen, T., Miettinen, I., Herrala, J., Lepisto, O., Antonen, J., Helenius, J., Hanninen, M.L., Maunula, L., Mustonen, J. and Kuusi, M. (2010) Epidemiology and Infection. doi:10.1017/S0950268810002141

This article reports on an investigation of a serious waterborne gastroenteritis outbreak that occurred in a Finnish town in autumn 2007. Nokia (population 30,016) is located in southern Finland and over 90% of the population are served by the municipal drinking-water supply. The waterworks uses groundwater that is treated with sodium hypochlorite for disinfection and pH adjustment. One municipal waste water plant processes sewage and this serves 86% of the population. On 28 November 2007, maintenance work was carried out at the wastewater plant in Nokia. On the same day, technical problems were encountered in the municipal drinking water system, after which customers complained about a bad smell and taste in their tap water. Environmental health and municipal waterworks officials received reports of episodes of diarrhoea and vomiting among residents from 30 November, and an increase in the number of patients with gastroenteritis was observed at the town’s municipal health centre. A water sampling program began to determine the extent of contamination, a boil-water notice was issued and the
water chlorine concentration was increased. The water contamination was found to have occurred because a valve in the waste-water plant connecting a drinking water line and a waste-water effluent line had been opened during the maintenance work and accidentally left open for 2 days. This allowed sewage effluent to enter the drinking water supply. It was estimated that 450 cubic metres of waste water was mixed with drinking water and contamination levels were highest in the southern parts of the town. Clean water was distributed and schools and daycare centres were closed from 6 to 9 December. The boil water notice was withdrawn after 12 days in the uncontaminated area, but remained in place in the contaminated area of the system for nearly 3 months. A comprehensive investigation was conducted to determine the magnitude and consequences of the waterborne outbreak. Patients who visited the municipal health centre between 28 November and 31 December 2007 due to acute gastroenteritis were registered and information recorded included the timing and range of symptoms and visits made to emergency rooms in Tampere University Hospital. The town was divided into two study areas, contaminated (population 9,538) and uncontaminated (population 20,478), based on microbiological data and water-flow modelling. A control population (population 27,259) was chosen 35 km from Nokia. There were 1,000 persons randomly selected from the population register from each study group. Groups were matched by age and gender. A postal questionnaire was conducted 8 weeks after the contamination had occurred. Participants were asked about the timing and nature of gastrointestinal symptoms. A case was defined as a person suffering acute diarrhoea or vomiting between 28 November 2007 and 20 January 2008. Faecal specimens were analysed for bacterial pathogens (Campylobacter, Salmonella, Yersinia and Shigella spp.) from cases with more severe symptoms. Five patients were randomly selected within the first week of the outbreak for detection of norovirus. Other viral pathogens were studied in samples from children admitted to the emergency department of the University Hospital. Specimens were taken from ten consecutive patients during the
second week of the outbreak in order to detect enteric parasites. When cases of giardiasis were detected, parasites were investigated in all symptomatic patients. Specimens were taken at the health centre or in the University Hospital. Samples from the drinking-water distribution system were analysed for the presence of Campylobacter, Salmonella and Clostridium spp, enteric viruses, Giardia cysts and Cryptosporidium oocysts. The health centre database showed that there were 1222 visits made between 28 November and 31 December 2007 due to gastrointestinal symptoms. The number of patients presenting with gastrointestinal complaints during the busiest week was 53-fold higher than expected compared to the weekly median number of cases in a 7-month period preceding the outbreak. The largest group were those aged less than 10 years. Data were available on 1024 patients with the most common symptoms reported being diarrhoea followed by vomiting, fever and then abdominal pain. There were 204 visits made to the Tampere University Hospital emergency rooms and 145 (71%) involved children. There were 2123 participants who correctly completed the questionnaire. Most of the subjects fell ill within 1 week of the water supply contamination. The most common symptoms were diarrhoea, vomiting, nausea and abdominal pain. There were 579 respondents who met the case definition, of which 540 (93%) were residents of Nokia. When results were extrapolated for the whole of the Nokia population, it was estimated that about 8453 residents fell ill with gastroenteritis. The excess number of cases was 6500 (95% CI 5613-7370) in Nokia, compared to the control population. The gastroenteritis attack rate was eightfold higher in the polluted area (53%) and 2.4-fold higher in the unpolluted area (15.6%) compared to the control municipality (6.5%). The probability of falling ill increased along with the quantity of tap water consumed in the contaminated area; this was not the case in the uncontaminated area or in the control area. There were 7 enteropathogens detected in faecal specimens. The most frequently detected was Campylobacter sp. (mostly C. jejuni), followed by Giardia. All of the five samples investigated for
norovirus were positive. Other findings included non-typhoidal Salmonella species, Clostridium difficile and rotavirus. Four of the stool cultures grew Shigella boydii. The same pathogens were also detected in the water-distribution system samples apart from Shigella boydii. Within the first 3 weeks of the outbreak, 98% of Campylobacter- and Salmonella-positive cultures occurred. This was the largest reported waterborne outbreak in Finland with over half of the population in the contaminated area falling ill with gastroenteritis. This probably reflects the high concentration of pathogens in the drinking water as well as their infectivity, especially in the case of viruses. The range of pathogens in both water and patient samples was wide and is consistent with the mechanism of contamination. Most subjects fell ill shortly after the valve was opened and the microbes found in stools were in accordance with those found in pipeline samples. This study showed the value of a population based questionnaire in estimating the disease burden in outbreaks when illnesses are mild or moderate. The questionnaire showed that the actual number of those affected was almost seven times higher than the number of people that sought medical care. The use of the control population also allowed baseline morbidity to be differentiated from the observed morbidity caused by the water contamination, thereby enabling a more accurate estimate of the burden of disease of this outbreak.

Comment

This outbreak was reported in Health Stream issues 48 and 49.
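
The burden estimate in this study comes from simple questionnaire arithmetic: an attack rate among respondents is extrapolated to the supplied population, and the background rate in an unaffected control municipality is subtracted to obtain excess cases. A minimal Python sketch of that general calculation is given below; the population figure used is a placeholder for illustration only (the summary above reports the resulting estimates of about 8453 ill residents and 6500 excess cases).

# Illustrative reconstruction of a questionnaire-based burden estimate.
# Counts and attack rates are from the summary; the population is a placeholder.
respondents = 2123                 # questionnaire respondents
cases = 579                        # respondents meeting the case definition
attack_rate = cases / respondents  # about 0.27 among respondents

rate_polluted, rate_unpolluted, rate_control = 0.53, 0.156, 0.065
print(rate_polluted / rate_control)    # roughly 8-fold higher attack rate
print(rate_unpolluted / rate_control)  # roughly 2.4-fold higher

population = 30_000                               # placeholder, not from the paper
ill = population * attack_rate                    # extrapolated number of ill residents
excess = population * (attack_rate - rate_control)  # excess over background morbidity
print(round(ill), round(excess))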

POU Treatment

Evaluation of long-term performance of Point of Use (POU) systems for drinking water treatment. Muhammad, N., Sinha, R., Krishnan, E.R., Heckman, L. and Patterson, C.L. (2010) Proceedings of the World Environmental and Water Resources Congress 2010: Challenges of Change, p289-299. Point-of-Use (POU) devices are of increasing interest as a final barrier before water consumption for the routine protection of drinking water or as an emergency measure against intentional and

accidental contamination of drinking water. A research program was initiated in 2008 by the Water Supply and Water Resources Division of the U.S. Environmental Protection Agency (EPA) National Risk Management Research Laboratory to evaluate the performance of various POU devices for treating tap water in homes and transient non-community water systems. This paper describes the results of studies conducted on two reverse osmosis (RO) units. A POU test apparatus was designed and constructed to allow different POU devices to be plumbed in for testing their effectiveness in removing biological contaminants. The two RO POU devices tested were: a system designed by Watts Premier (Phoenix, AZ) and a Purefecta system from Kinetico (Newbury, OH). Both systems included five stages of treatment. In the first three stages of the Watts Premier system, the inlet water passes through a sediment filter and two sequential carbon-block filters. The fourth stage incorporated the RO membrane element. The permeate water is then sent to a 3-gallon storage tank and the reject water is sent to a drain. The fifth stage is a granular activated carbon filter downstream of the storage tank. In the Kinetico Purefecta system, influent water passes through a carbon or sediment filter and then through the RO membrane element. The RO permeate then passes through a mechanical filter to remove viruses and is stored in a 3-gallon tank. After leaving the storage tank, the water passes through a carbon filter and then through the last stage of the Pall biofilter for bacteria removal. The POU devices were operated for about 22 months (670 days) in a manner which mimicked household use to evaluate their long-term performance. The POU devices were challenged with microbiological contaminants at different time intervals to evaluate continued treatment effectiveness. To evaluate bacterial removal, the devices were challenged with Bacillus subtilis, a predominant aerobic spore and Escherichia coli, a human pathogen. Virus removal was evaluated using MS2 bacteriophage, a surrogate for pathogenic enteric viruses. Cryptosporidium parvum removal was evaluated using a non-biological surrogate, Polystyrene Latex (PSL) beads, with a mean size of 2.83 micrometers. All tests were undertaken using dechlorinated potable water.

Both the POU systems achieved complete removal of PSL beads during 660 days of operation and satisfied the Long Term 2 Enhanced Surface Water Treatment Rule log removal value of 5.5 for the highest Bin Category 4. There was no deterioration of removal seen during the long-term operation of the units. Both of the POU systems achieved excellent removal of E. coli during operation for around 660 days. The average log removal values achieved by the Watts Premier and Kinetico Purefecta RO systems were 6.0 and 5.4, respectively. There was no noticeable deterioration of removal performance observed during the long-term operation of the Watts Premier RO system, and it satisfied the EPA Microbiological Purifier requirement of a 6.0 log removal value for E. coli. There was a slight deterioration found in removal performance during the long-term operation of the Kinetico Purefecta RO system. Both of the POU systems achieved excellent removal of B. subtilis during long-term operation of around 650 days. The average log removal value achieved by both units was 6.0. There was no noticeable deterioration of performance seen during the long-term operation of the units. Both of the POU systems achieved complete removal of MS2 bacteriophage during long-term operation of around 650 days. Both units satisfied the log removal value requirement of 4.0 for viruses as suggested by the EPA for POU devices. There was no deterioration of removal performance seen during the long-term operation of the units. The test results conducted to date show complete removal of the selected microorganisms and surrogates by the POU systems both at start-up and after long-term operation.
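
The performance figures above are expressed as log removal values (LRVs), i.e. the base-10 logarithm of the ratio of the challenge concentration fed to the device to the concentration in the treated water. A short Python sketch of this calculation is given below; the function and the example concentrations are illustrative only and are not taken from the paper.

import math

def log_removal(influent, effluent):
    # LRV = log10(C_in / C_out); e.g. a 6-log removal is a 99.9999% reduction
    return math.log10(influent / effluent)

def percent_removal(lrv):
    # convert a log removal value back to a percentage reduction
    return 100 * (1 - 10 ** (-lrv))

# hypothetical challenge of 1e6 E. coli per 100 mL reduced to 1 per 100 mL
print(log_removal(1e6, 1))    # 6.0
print(percent_removal(5.5))   # about 99.9997%, the LT2 Bin 4 benchmark cited above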

Public Communication

Communication, perception and behaviour during a natural disaster involving a 'Do Not Drink' and a subsequent 'Boil Water' notice: a postal questionnaire study. Rundblad, G., Knapton, O. and Hunter, P.R. (2010) BMC Public Health, 10:641. Good communication between the emergency response agencies and the public is essential when

there is a major public health emergency. Effective communication is particularly important to ensure that people protect themselves from injury or disease. During the period from 10th to 20th July 2007, the equivalent of two months of rain (125 mm) fell over Gloucestershire and neighbouring areas in the United Kingdom, causing widespread flash flooding and fluvial flooding of the River Severn and the River Avon. About two days later the Mythe water treatment works, one of the main waterworks in Gloucestershire and managed by Severn Trent Water (STW), was flooded and the electricity substation had to be shut down. There were 140,000 homes without mains water and 48,000 homes without electricity. Electricity was restored within 48 hours; however, consumers were left without drinking water for up to 17 days. Alternative water was supplied, with 40 million bottles of water distributed and an average of 3 million litres of water per day provided by mobile water tanks. Consumers were issued with a ‘Do Not Drink’ notice when water supply restoration began, which was subsequently replaced by a water industry standard precautionary ‘Boil Water’ notice seven days later. On the 7th of August the water company declared the tap water safe to drink. It is exceptionally rare for a combination of the two notices to be issued, and this provided a unique opportunity to investigate compliance with public health advice and the factors associated with risky or cautious responses to this advice. In January-February 2009, a postal questionnaire was sent to 1,000 households supplied with mains water from the Mythe waterworks which were affected by the loss of water and the two subsequent water notices during the summer of 2007. The questionnaire contained a short section on demographics followed by four sections on: the initial period without mains water (‘Water Loss’), the ‘Do Not Drink’ notice, the ‘Boil Water’ notice and the time immediately after the water was declared safe to drink (‘Water Safe’). There were 159 completed questionnaires included in the final analysis. Information about the imminent loss of tap water reached consumers via the local radio station or from family/friends (30.7% and 30.0% respectively; N=140). Authorities delivered official water notice leaflets for the three consecutive notices (‘Do Not Drink’, ‘Boil Water’ and ‘Water Safe’); however, only an average of 40% of all households used them as an information source.

Information about notices tended to come from the local radio, followed by family/friends, local newspapers and STW (by telephone or internet). Predictors of information source use were entered into a Multivariate Analysis of Variance (MANOVA); the key observations were that increasing age and being in paid employment were negatively associated with using leaflets, and that using the local newspaper was marginally associated with increasing age. Most consumers (75.6%) were informed about the loss of tap water beforehand. The majority of consumers reported receiving tap water advice, with 63.9% reporting receiving all three notices. Five consumers reported receiving no advice at all. There was considerable confusion about which water notices were in place during the ‘Do Not Drink’ and ‘Boil Water’ stages, and correct recall varied from 23.2% to 26.7%. A MANOVA was used to test the association between information source and clarity of advice and feeling informed. The use of local newspapers as an information source was positively associated with increased clarity of advice. Use of the water company as a source was associated with feeling informed, and the association between use of the local radio and feeling informed approached significance. For the whole incident, 21 of the 69 participants who used temporary water supplies reported serious problems securing access to them. Access failures were mainly due to empty bowsers and/or no available stock of bottled water (n=15); other reasons included problems travelling to and from the water sites (n=6) or not being able to locate them (n=7). Collecting bottled water from distribution sites was the most favoured means of accessing water (84.9%, N=159), compared to bowser water (57.9%). Consumers also bought bottled water (35.8%) or collected water from family/friends (8.2%). Consumers were asked to rank different types of drinking water in terms of frequency of use during the ‘Do Not Drink’ and ‘Boil Water’ stages. Tap water, including tap water from family/friends, was the least preferred water source for both stages, while bottled water was the most popular choice.

Overall, non-compliance with water advice (including advice on brushing teeth and preparing/cooking food) was 62.9% (100/159) for the ‘Do Not Drink’ notice and 48.3% (56/116) for the ‘Boil Water’ notice. The higher non-compliance rate for ‘Do Not Drink’ was mostly attributed to the high use of boiled water. Some consumers engaged in over-cautious behaviour such as flushing the toilet with boiled tap water or avoiding flushing. Risky behaviour such as brushing teeth and preparing food with unboiled tap water was also reported. The main factor associated with both drinking water compliance and overall compliance was incident stage (i.e., the ‘Do Not Drink’ stage versus the ‘Boil Water’ stage). Drinking water compliance was also associated with employment, with those in paid employment less likely to comply. A high proportion of consumers, especially the elderly, reported not having used the official leaflets containing advice on safe water usage behaviour; instead, local media and family/friends were the main sources of information. Therefore, when an incident occurs, contacts with local media and community/personal networks should be established, maintained and kept up to date with drinking water standards and emergency protocols so that official advice can reach consumers of all demographic backgrounds in a timely manner. The high degree of non-compliance, especially with the ‘Do Not Drink’ notice, is presumably due to incorrect beliefs regarding water contaminants and boiling. Commonly, unsafe behaviour was coupled with over-cautious behaviour, showing that consumers were often unaware of which actions were safe and which were not. Drinking water knowledge needs to be included in current public health education, in order to minimise risky behaviour and avoid the unnecessary stress of over-cautious behaviour during incidents.
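
The compliance figures quoted above are simple proportions of respondents, so approximate confidence intervals can be attached to them directly. The Python sketch below illustrates this using the reported counts; the normal-approximation interval is our own choice for illustration and is not a method stated in the paper.

import math

def proportion_ci(successes, n, z=1.96):
    # point estimate and approximate 95% CI for a proportion (normal approximation)
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# non-compliance counts reported in the summary
print(proportion_ci(100, 159))   # 'Do Not Drink' notice: about 62.9%
print(proportion_ci(56, 116))    # 'Boil Water' notice: about 48.3%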

Recycled Water

Valuing the subsurface pathogen treatment barrier in water recycling via aquifers for drinking supplies. Page, D., Dillon, P., Toze, S., Bixio, D., Genthe, B., Jimenez Cisneros, B.E. and Wintgens, T. (2010) Water Research, 44(6); 1841-1852.

There is increasing worldwide interest in the recharge of aquifers for augmenting urban drinking water supplies. Aquifer recharge may use a variety of non-traditional source waters, including urban stormwater and reclaimed water from sewage effluent, that may contain hazards to human health. A quantitative microbial risk assessment (QMRA) was conducted at four managed aquifer recharge (MAR) sites where reclaimed wastewater and stormwater are recycled via aquifers for drinking water supplies. The Tula Valley site is a semiarid area located 100 km north of Mexico City and has received untreated wastewater from Mexico City since 1986. It is estimated that about 60 cubic m/s of wastewater is used for irrigation in the area and as a result the local aquifer is being recharged with at least about 25 cubic m/s from the infiltration of untreated wastewater from unlined irrigation channels, storage dams and excess water used for irrigation. The infiltrated wastewater is hydraulically connected to local springs (aquifer residence time 20-40 days) that are used as drinking water supplies. This study only considers one spring, the Cerro Colorado spring, which currently produces 0.6 cubic m/s of potable water. Chlorination is the only treatment after extraction. The Parafield aquifer storage transfer and recovery (ASTR) site is located about 17 km from the city centre in a northern suburb of Adelaide. Urban stormwater from a mixed residential and industrial catchment is passed through a constructed wetland prior to recharge via injection wells into a confined limestone aquifer. Water is recovered via separate wells with a mean residence time of 241 days. The site is currently managed as a trial to determine the suitability of the water for drinking supplies. Post-aquifer treatment options are being considered, including UV and chlorine disinfection. The Atlantis site is 40 km north of Cape Town. Secondary treated reclaimed water along with wetland-treated urban stormwater from a residential catchment is recharged to an unconfined sandy aquifer. Pre-treatment includes secondary wastewater treatment (activated sludge) prior to blending with urban stormwater flows and passing through a wetland. Water is infiltrated using two recharge basins and has a residence time in the aquifer of about one year prior to recovery using two well fields. Post-treatment involves softening of the water and chlorination before the water is blended with Cape Town mains water entering the drinking water distribution system. The Torreele/St-Andre site is located at the Flemish coast in Belgium. Tertiary (reverse osmosis) treated effluent is recharged to an unconfined sandy aquifer via an infiltration basin and recovered via a series of extraction wells after a residence time of a minimum 30 days and average 55 days. Post-treatment includes aeration, rapid sand filtration and UV disinfection prior to supply to the drinking water network. The microbial risk assessment approach used here measured health impact in terms of disability adjusted life years (DALYs) rather than maximum levels of infection or disease. Three representative pathogens were used, rotavirus, Cryptosporidium and Campylobacter, to assess the risk of viruses, protozoa and bacteria. The mean, median and 95th percentile were routinely calculated for each pathogen risk. The tolerable mean risk adopted was 10^-6 DALYs per person per year. A QMRA was performed to determine the residual risk for each case study site and the value of the aquifer treatment. A triangular probability distribution function (PDF) describing each engineered treatment barrier was adopted from the literature for each pathogen. The triangular distribution was defined by a minimum, most likely and maximum log10 removal value. The log10 removal value for the aquifer treatment barrier was calculated from the product of two PDFs: the aquifer residence time and a daily pathogen decay rate (expressed in log10/day). The initial pathogen numbers in the stormwater and wastewater were sourced from the literature. Tula Valley was found to have the highest residual risk for rotavirus (median 8.4 x 10^-4 DALYs) and Cryptosporidium (median 1.5 x 10^-3 DALYs). This is due to the lack of pre-treatment and the low residence time of the reclaimed water in the aquifer before recovery. Atlantis had acceptable risk for Campylobacter (median less than 1.0 x 10^-10 DALYs), but higher risk for Cryptosporidium (median 5.3 x 10^-9 DALYs) and rotavirus (median 4.9 x 10^-5 DALYs).

Parafield had low risks for all of the pathogens. Torreele/St-Andre had a very low risk for each pathogen because of the large pre- and post-recovery treatment trains. A sensitivity analysis was also performed for each barrier in the treatment train at each of the study sites and the factor sensitivity (FS) was calculated. The FS calculation standardises the comparison between each of the water treatment barriers and the aquifer, and therefore helps in valuing the aquifer as part of the larger treatment train. The initial pathogen numbers in the water to be recharged at each MAR site were a function of the pre-treatment barriers and the type of source water (stormwater versus sewage effluent). The probability of infection was multiplied by the DALYs per infection to calculate the final residual risk of each MAR system. The change in DALYs from Campylobacter as a function of aquifer residence time was not quantifiable for all sites. The change in mean DALYs from rotavirus and Cryptosporidium with increasing residence time in the aquifer was calculated for Tula Valley, Parafield and Atlantis but not Torreele/St-Andre, as the calculated risk was less than 1.0 x 10^-10 annualised DALYs for each of the reference pathogens. For Tula Valley and Atlantis, the risks from rotavirus remained high. For Parafield, the risk from rotavirus was reduced after approximately 200 days in the aquifer, comparable to the average aquifer residence time at this site. For Cryptosporidium at Parafield, the risk was low due to the nature of the source water (stormwater). For the Atlantis site, over 500 days is required to reach the value of 1 x 10^-6 DALYs for Cryptosporidium, which is within the actual range of residence times for this site but exceeds the mean value. The risks at Tula Valley remained higher than at the other sites for Cryptosporidium. In general the risks for each of the MAR sites were in the following order: Tula Valley - Atlantis - Parafield - Torreele/St-Andre for Cryptosporidium and rotavirus, and all had low risks for Campylobacter. Only Torreele/St-Andre and Parafield met the WHO guideline for all the reference pathogens.

For each of the sites studied, the aquifer consistently had the highest predicted log10 removal for Campylobacter, whereas rotavirus and Cryptosporidium had more variable removal rates which depended on the residence time in the aquifer. QMRA was found to be a useful way of establishing the value of the aquifer within the treatment train and allowed the assessment of human health risk from pathogens in terms of DALYs. The sensitivity analysis used was valuable in assessing which of the treatment barriers was the most important in each of the MAR systems. The approach used here allows the integration of the aquifer treatment characteristics into the larger engineered treatment train and may be used to quantitatively assess the reduction of human health risk for MAR systems more generally.
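
The key step described above, treating the aquifer's log10 removal as the product of a residence-time distribution and a daily decay rate and then propagating the result through a dose-response model to annual DALYs, lends itself to a simple Monte Carlo calculation. The Python sketch below illustrates the general shape of such a QMRA for a Cryptosporidium-like pathogen; all numerical values (source concentration, triangular distribution limits, dose-response and DALY parameters) are placeholders chosen for illustration and are not the values used in the paper.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                        # Monte Carlo iterations

source_conc = 10.0                                 # oocysts per litre in source water (placeholder)
lrv_pretreat = rng.triangular(0.5, 1.0, 2.0, n)    # wetland/pre-treatment log10 removal (placeholder)
residence = rng.triangular(100, 241, 500, n)       # aquifer residence time, days (placeholder)
decay = rng.triangular(0.005, 0.01, 0.02, n)       # pathogen decay, log10 per day (placeholder)
lrv_aquifer = residence * decay                    # aquifer log10 removal = residence time x decay rate

conc_recovered = source_conc * 10.0 ** -(lrv_pretreat + lrv_aquifer)
dose = conc_recovered * 1.0                        # assume 1 L/day of unheated water consumed
p_inf_day = 1.0 - np.exp(-0.004 * dose)            # exponential dose-response (illustrative r value)
p_inf_year = 1.0 - (1.0 - p_inf_day) ** 365
dalys = p_inf_year * 0.7 * 1.5e-3                  # P(illness|infection) x DALYs per illness (placeholders)

print(np.median(dalys), np.percentile(dalys, 95))  # compare against the 10^-6 DALY benchmark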

Reproductive Outcomes

The relationship between water intake and foetal growth and preterm delivery in a prospective cohort study. Wright, J.M., Hoffman, C.S. and Savitz, D.A. (2010) BMC Pregnancy and Childbirth, 10:48. Previous studies have suggested a possible increased risk of impaired foetal growth with increasing exposure to disinfection by-products, but a decreased risk of preterm delivery (PTD). Interpretation of previous associations between water intake and adverse birth outcomes is difficult, as the amount and type of water consumed can be a non-specific marker of exposure or of underlying behavioural characteristics. Using a prospective cohort study, birth weight, the risk of small for gestational age (SGA) among term births and the risk of PTD (i.e., less than 37 gestational weeks) were examined in relation to daily bottled, cold tap, total tap and total water intake. The study population included 2766 pregnant women enrolled in a prospective cohort study conducted from December 2000 to May 2004 in three study sites in the United States. Participants were enrolled early in pregnancy or while planning to become pregnant. Subjects were eligible if they were 18 years of age or older, did not have any fertility treatment for the study pregnancy and intended to deliver in the study area. There were 2039 pregnancies available for the PTD analysis after exclusions.

There were 1854 live births available for the birth weight analysis, and 1783 live term births available for the SGA analysis. Information on infant date of birth, birth weight and gender was available from medical records for 43% of live births, from vital records for 57% and from participant self-report for less than 1%. Gestational age at birth was estimated from self-reported last menstrual period (LMP) and an early ultrasound combined with infant date of birth. Data on exposures and potential confounding factors were collected via telephone interviews before 16 weeks of gestation (baseline interview) and between 20-24 weeks of gestation (follow-up interview). The interviews included detailed questions about the pregnancy, maternal health, demographic information, behavioural characteristics and water use practices. Maternal health characteristics included pre-pregnancy body mass index (BMI). Behavioural variables included recreational exercise, smoking, and intake of caffeine, vitamins, alcohol and illicit drugs. Detailed information on hot and cold water consumption at work and home was collected. Water use measures were divided into quartiles and analysed using the lowest quartile as the referent. When compared to the lowest quartile of total water intake (less than or equal to 51 ounces/day), the unadjusted mean birth weight was higher for the upper three quartiles (greater than 51-78, greater than 78-114, and greater than 114 ounces/day) by 60 (95% CI: -1, 122), 67 (95% CI: 5, 128) and 83 (95% CI: 21, 145) grams, respectively. After adjustment for confounders the mean birth weight was increased by 27 (95% CI: -34, 87), 39 (95% CI: -22, 99) and 50 (95% CI: -11, 111) grams in the second, third and fourth quartiles compared to the lowest quartile. When total water intake was examined as a continuous measure (per 20 ounce/day increase in intake), the adjusted increase in mean birth weight was 7.3 (95% CI: -0.8, 15.5) grams. Adjusted results were found to be similar in magnitude for the upper quartiles of cold tap water intake, total tap water intake, and total water intake for both the follow-up data and an average of the follow-up and baseline data. Compared to no bottled water intake, the increase in adjusted mean birth weight for bottled water consumers was 31 grams (95% CI: -20, 82) based on follow-up data and 43 grams (95% CI: -27, 113) based on an average of the follow-up and baseline data. Risk ratios (RRs) and 95% CIs for SGA for the upper three quartiles compared with the lowest quartile following adjustment for confounding were 0.8 (0.4, 1.4), 0.6 (0.3, 1.0) and 0.9 (0.5, 1.6), respectively. Relative to the lowest quartile, adjusted RRs were slightly higher for cold tap and total tap water intake, especially for the average of the follow-up and baseline data. In comparison to women not drinking bottled water, the adjusted RR for SGA was 0.9 (95% CI: 0.5, 1.4) based on follow-up data and 1.4 (0.6, 3.0) based on the average of the follow-up and baseline data. RRs and 95% CIs for PTD for the upper three quartiles compared with the lowest quartile following adjustment for confounding were 1.2 (0.7, 1.9), 1.1 (0.7, 1.8) and 1.4 (0.9, 2.2), respectively. RRs and 95% CIs for PTD for the cold tap and total tap water quartiles were generally below 1.0 compared to the lowest quartile for both the follow-up data and the average of the follow-up and baseline data. Compared to women not drinking bottled water, the adjusted RR for any bottled water intake was 1.2 (95% CI: 0.8, 1.8) for the follow-up data and 0.8 (95% CI: 0.5, 1.4) based on the average of the baseline and follow-up data. There was limited evidence of an association between specific measures of water intake and risk of adverse pregnancy outcomes such as SGA and PTD. A small increase in mean birth weight, ranging from 27-50 grams for higher levels of total water intake during pregnancy, and an exposure-response relationship for cold tap and total tap water were suggested. Further examination of this relationship with mean birth weight may be warranted in higher risk populations.

Viruses

Assessment of sewer source contamination of drinking water wells using tracers and human enteric viruses. Hunt, R.J., Borchardt, M.A., Richards, K.D. and Spencer, S.K. (2010) Environmental Science and Technology, 44(20); 7956-7963.

Recent studies have shown the occurrence of human enteric viruses in domestic and municipal wells in the United States. Virus-contaminated groundwater has caused outbreaks in many countries, suggesting that hydrologic conditions suitable for virus survival and transport are widespread. Most studies on pathogens in wells have focused on sampling at the wellhead and through the distribution system. However, in order to assess the vulnerability of drinking water wells to virus contamination, a knowledge of virus source and source-to-well travel time is also needed. Travel times through aquifers may be greatly affected by pumping rates from wells. This study evaluated sanitary sewers as a potential source of human enteric viruses in drinking water wells in several communities in Wisconsin, U.S. In 2005 and 2006, 33 municipal wellheads in 14 communities were sampled once to determine viruses and wastewater compounds; 37 wells were sampled twice for stable isotopes of water. The wastewater compounds analysed included a range of commonly used household and industrial chemicals and faecal sterols. Stable isotopes of oxygen and hydrogen were measured to assess the contribution of surface water to well water. On the basis of these results, one well site in each of three communities was chosen to be studied more intensively. Viruses were sampled monthly during four three-month periods in the spring and fall of 2006 and 2007. Wastewater compounds were sampled in the fall of 2006 (monthly, September-November) at all three sites; however, in the fall of 2007, monthly sampling was limited to Community No. 3 only. The subsurface water quality and transport environment was characterised by detailed site instrumentation at two locations at each well site: an area near the wellhead (‘near’) and a second, more distant area along a flowline from a suspected source (‘far’). Instrumentation at each location consisted of a water table well with data loggers recording water level and temperature at 15 min intervals; specific conductance data were also collected at Community No. 3. Estimates of travel time were obtained by evaluating inflection points in the specific conductance time series. Piezometer nests were installed at each of the three sites near each water table well for depth-discrete water-quality sampling.

Groundwater samples were collected for analysis of stable water isotopes, nitrate, wastewater tracers and viruses. Samples were analysed by two-step quantitative reverse transcription polymerase chain reaction (RT-qPCR) for enterovirus, norovirus genogroups I and II, hepatitis A virus and rotavirus, and by qPCR for adenovirus. Samples positive by qPCR for enterovirus and adenovirus were further evaluated for virus infectivity. Enterovirus and adenovirus serotypes were determined by direct sequencing. Of the 33 wells sampled for viruses during the 2006 wastewater sampling, five were qPCR positive for viruses: two wells were positive for enteroviruses, two were positive for adenoviruses and one was positive for genogroup II norovirus. The two enteroviruses tested positive for infectivity by cell culture. Only two wells were simultaneously positive for both viruses and wastewater tracers when sampled concurrently. Analysis of hydraulic gradients showed that gradients were toward the municipal wellhead at all three sites. Conductance measurements from the Community No. 3 site provided estimated travel times: the lags in specific conductance inflection from the ‘far’ to the ‘near’ monitoring well were consistent with a flow direction toward the municipal well and ranged from about 42 days during the relatively high pumping in the fall to about 66 days during the lower pumping that followed (November-February). If similar average travel times are assumed for the entire site, and viruses are assumed to move at the same velocity as the groundwater and to survive for 2 years in the subsurface, then viruses transported from sources within 100-150 m could be expected to be infectious at the municipal wellhead. Nitrate was found to be common in the groundwater sampled from piezometers at the three sites, ranging from 1.3 to 7.9 mg/L as N. There was no relationship evident between nitrate concentration and virus or wastewater tracer occurrence. During the wastewater sampling in the fall of 2006, all three sites showed unambiguous wastewater tracers in both the municipal wells and associated piezometers, but usually not during the same month.

Community No. 3 was characterised by high numbers of tracer detections; however, none of the wastewater tracer detections exceeded regulatory concentrations. During the September-November 2007 sampling at Community No. 3, wastewater tracers were detected in both the municipal well and the associated piezometer nests, but not necessarily during the same month or for the same tracers. The 2006-2007 sampling for virus analyses from the three community municipal wells found that each well had two or more virus-positive samples. Of the 12 samples from the Community No. 3 well, six were positive for adenovirus, enterovirus (Coxsackievirus B2), or genogroup I norovirus. In the 2007 sampling of piezometers at this site, both the near and far piezometer nests had two virus-positive samples; one piezometer contained infectious adenovirus 2 and the other contained infectious echovirus 30. Along the hydraulic gradient from the far piezometer to the near piezometer to the well, two of the three detected serotypes (adenoviruses 2 and 5) were common to all three sampling locations, which suggests a common source. Every virus-positive well and piezometer was also positive at some point during the study for an unambiguous wastewater tracer.

It is often assumed that transport distances are too long and virus survival times too short for viral contaminants to pose a threat to groundwater supplies; however, this study has shown otherwise. The high viral load of human faeces, the low numbers of viruses required to cause infection and the ability of viruses to survive and be transported in the subsurface suggest that fast pathways of groundwater flow require characterisation, even if such pathways only contribute minor amounts of water to the well. Given the age and condition of many sanitary sewer systems, the associated leakage reported from such systems and the short distances that commonly separate sanitary sewers from drinking water wells, public health may be best protected by disinfection of the drinking water withdrawn from such wells. The 14 communities investigated in this study were not using any disinfection systems and therefore may be at risk from viral contamination.
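
The setback reasoning above follows directly from an average linear groundwater velocity: the conductance lag between two monitoring points gives a velocity, and multiplying that velocity by an assumed virus survival time gives the distance from which infectious viruses could still reach the well. The Python sketch below shows the arithmetic; the 10 m well separation is a hypothetical figure for illustration only, since the actual separation at the study sites is not given in this summary.

def max_infectious_distance(separation_m, travel_time_days, survival_days=2 * 365):
    # average linear velocity between the 'far' and 'near' monitoring points
    velocity = separation_m / travel_time_days   # metres per day
    # distance a virus could travel while still remaining viable
    return velocity * survival_days

separation = 10.0             # hypothetical distance between monitoring wells, metres
for travel_time in (42, 66):  # lag estimates reported for high and low pumping periods
    print(travel_time, round(max_infectious_distance(separation, travel_time)))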

Disclaimer

Whilst every effort is made to reliably report the data and comments from the journal articles reviewed, no responsibility is taken for the accuracy of articles appearing in Health Stream, and readers are advised to refer to the original papers for full details of the research.

Health Stream is the quarterly newsletter of Water Quality Research Australia. Health Stream provides information on topical issues in health research which are of particular relevance to the water industry, together with news and updates on the recent literature. This newsletter is available free of charge to the water industry, public health professionals and others with an interest in water quality issues. An electronic version of the newsletter and a searchable archive of Health Stream articles are available via the WQRA Web page. Summaries of Web-bonus articles are available only in the electronic version. To be placed on the print mailing list for Health Stream please send your postal address details to:

Pam Hayes
Epidemiology and Preventive Medicine
Monash University - SPHPM
Alfred Hospital, Melbourne VIC 3004 AUSTRALIA
Phone +61 (0)3 9903 0571
Fax +61 (0)3 9903 0556
Email [email protected]

To be placed on the email notification list for Health Stream, please fill in the form on our website under Publications at:

http://www.wqra.com.au

© Copyright WQRA. Health Stream articles may be reproduced and communicated to third parties provided WQRA is acknowledged as the source. Literature summaries are derived in part from copyright material by a range of publishers. Original sources should be consulted and acknowledged.

Health Stream Issue 60 - Web bonus articles

Arsenic

Arsenic exposure and risk of spontaneous abortion, stillbirth, and infant mortality. Rahman, A., Persson, L.A., Nermell, B., El Arifeen, S., Ekstrom, E.C., Smith, A.H. and Vahter, M. (2010) Epidemiology 21(6):797-804. Arsenic is an established carcinogen and has been associated with a wide range of other chronic illnesses such as diabetes, hypertension and vascular diseases. Arsenic is known to cross the placental barrier, and a number of epidemiological studies have suggested that arsenic exposure via drinking water may affect early human development; however, findings have been varied. Several studies have reported an association of arsenic exposure with spontaneous abortion, stillbirth and infant mortality, but findings have been inconsistent. This study evaluated the associations of individual arsenic exposure with spontaneous abortion, stillbirth and infant mortality in a prospective cohort of pregnant women with a wide range of arsenic exposure. The study was conducted in the subdistrict of Matlab, Bangladesh. This prospective study was nested within a food and micronutrient supplementation trial, the Maternal and Infant Nutrition Interventions, Matlab Study. Women with positive urine-based pregnancy tests were invited to provide the remaining urine sample for analysis of arsenic concentration. Urine samples were collected around gestational week 8. There were 2924 women who donated urine samples for arsenic analysis between February 2002 and January 2003. Women were enrolled in the Matlab study if they met the eligibility criteria: viable foetus, gestational age less than 14 weeks by ultrasound, no severe illness and consent to participate. Additional urine samples were collected in the late gestational period, around gestational week 30. There were 1725 women who had urine arsenic concentrations measured in both early and late gestation.

Exposure was assessed by measuring the concentration of inorganic arsenic and its metabolites in urine, which is a measure of the ongoing exposure to inorganic arsenic from all sources. After pregnancy was diagnosed by urine test, women were advised to attend the subcentre clinic for an ultrasound to verify the pregnancy and its gestational age. Pregnant women were interviewed each month to identify pregnancy outcomes (spontaneous abortion, induced abortion, stillbirth, live birth) and survival of the infant after birth. Detailed information was collected on the women’s age, gravidity, parity, education and household assets from the surveillance system databases and from interviews with the cohort participants. The women’s weight and height were measured during the visit to health facilities at study enrolment. Spontaneous abortion (unintended loss of a foetus in the first 28 weeks of gestation) risk was evaluated in relation to urinary arsenic concentrations at gestation week 8. Stillbirth (birth of a dead foetus after 28 weeks of gestation) and infant mortality (death of a live-born baby before 12 months of age) were assessed in relation to the average of 2 urinary arsenic concentrations measured at weeks 8 and 30. The median urinary arsenic concentration in gestational week 8 was 80 micro g/L (range = 1-1253 micro g/L). The mean concentration was 154 micro g/L (SD = 176). Women with urine arsenic concentrations in the fifth quintile in early gestation had a 44% increased risk of spontaneous abortion (OR = 1.44 [95% CI = 0.96-2.15]), in comparison to women who had arsenic concentrations in the first quintile. The median urinary arsenic concentration at gestational week 30 was 82 micro g/L, the mean was 171 micro g/L (SD = 217) and the range was 2-3384 micro g/L; concentrations were significantly correlated with those in gestational week 8 (Spearman r = 0.61). Although the cohort average exposure was similar in early and late pregnancy, there was a distinct intra-individual variation in urine arsenic concentrations during pregnancy. Urinary arsenic concentrations decreased by at least 50 micro g/L between gestational weeks 8 and 30 for 198 women; however, for 255 women the concentrations increased by at least 50 micro g/L.

The highest odds ratio for stillbirth (3.4 [0.92-13]) was found for women with average urinary arsenic in the range 134-267 micro g/L (fourth quintile), in comparison with women who had arsenic concentrations below 38 micro g/L (first quintile). For infant mortality, a dose-response association with arsenic exposure was found. In comparison with women in the first quintile of urinary arsenic, the hazard ratio (HR) was 1.5 (95% CI = 0.49-4.6), 1.7 (0.57-5.2), 1.5 (0.46-4.7) and 3.4 (1.2-9.4) for women in the second, third, fourth and fifth quintiles, respectively (adjusted for asset index, gestational age, season and location of women’s residence). Of the infant deaths (n=44), 8 were excluded from the analysis due to birth asphyxia and accidents; the resulting HR for infant mortality was about 5 times higher (HR = 5.0 [1.4-18]) for women with average urinary arsenic concentrations in the fifth quintile (268-2019 micro g/L) in comparison with women with average arsenic concentrations below 38 micro g/L (first quintile). A dose-response trend was observed for infant mortality (P value for linear trend = 0.005). There were no meaningful changes in the associations of arsenic with stillbirth and infant death when results were stratified by exposure pattern, i.e., low in both periods, low at 8 weeks and high at 30 weeks, high at 8 weeks and low at 30 weeks, and high in both periods. There was a clear increase in infant mortality with increasing prenatal arsenic concentrations in the mother’s urine. There was little evidence, however, of increased risks of spontaneous abortions or stillbirths; some odds ratios were elevated but the confidence intervals were wide and there was no clear evidence of dose-response relationships. The mechanisms for arsenic-induced foetal and infant mortality are not known and require further investigation. The findings of this study were robust even after adjustment for sociodemographic and biologically important covariates.

Lung function in adults following in utero and childhood exposure to arsenic in drinking water: preliminary findings. Dauphine, D.C., Ferreccio, C., Guntur, S., Yuan, Y., Hammond, S.K., Balmes, J., Smith, A.H. and Steinmaus, C. (2010) International Archives of Occupational and Environmental Health, DOI 10.1007/s00420-010-0591-6. Arsenic affects many body tissues; however, the human lung seems particularly susceptible. Lung cancer appears to be the most common cause of death associated with high arsenic exposure from drinking water. Evidence also suggests that arsenic in drinking water causes non-malignant lung disease; however, nearly all the data involve exposed adults. The long-term impacts of early-life arsenic exposures are mostly unknown. A pilot study was undertaken on adult lung function in relation to estimated early-life exposure in Antofagasta, northern Chile, using individual-level data. The drinking water supply for the city of Antofagasta in northern Chile has had variable arsenic levels over time. Prior to 1958 the level was about 90 micro g/L, but in 1958 the source of supply was changed to river water with higher levels (greater than 800 micro g/L). This supply was used until May 1970, when the world’s first large arsenic removal plant became operational. From then on arsenic concentrations remained below 150 micro g/L with few exceptions. Current levels are around the World Health Organization guideline value of 10 micro g/L. The nearby city of Arica has always had drinking water arsenic levels around 10 micro g/L. Both cities are located in an arid region and the majority of their populations drink municipal tap water. The changing exposure levels in Antofagasta over defined time periods have provided the setting for a number of epidemiological studies. This study compared lung function and the prevalence of respiratory symptoms in adults with and without high early-life arsenic exposure. The exposed population consisted of long-term residents of Antofagasta and the unexposed comparison group consisted mostly of long-term residents of Arica. A sample population was recruited by local nurse-interviewers in each city; participants were employees at the major nursing schools. There were 97 subjects enrolled, who were 32-65 years old and would have been young children or in utero during the high exposure period in Antofagasta. Each participant was administered a structured questionnaire to assess lifetime residential and occupational history, water source types (municipal tap water, bottled, other), current medications and medical history.

Smoking histories were taken, and subjects were also asked about the types of fuels used at home and about their respiratory history. Height and weight were measured by nurse-interviewers and lung function was assessed. The main lung function values assessed were forced expiratory volume in 1 second (FEV1) and forced vital capacity (FVC). Municipal drinking water records were obtained from a database of over 15,000 arsenic measurements in Antofagasta and 11 other cities in northern Chile between 1962 and 1990. Drinking water records were linked with each participant’s residential history to obtain age-specific estimates of arsenic exposure. High arsenic exposure in early life was defined as drinking water containing greater than 800 micro g/L arsenic before age 10. There were 32 subjects exposed to greater than 800 micro g/L arsenic in drinking water before age 10, half of whom were exposed in utero, and 65 subjects without early-life arsenic exposure. High early-life exposure to arsenic was associated with an 8.0% lower FEV1 and a 7.9% lower FVC (P = 0.05). When the analysis was restricted to the 33 subjects who reported never smoking regularly, estimates changed to a 16.9% and 19.7% decrease in FEV1 and FVC, respectively (P less than 0.05 for both; results may be unstable due to the small number of subjects). Exposure-response relationships between peak arsenic concentration before age 10 and FEV1 and FVC were examined with subjects stratified into three arsenic exposure groups: less than 50, 50-250 and greater than 800 micro g/L. Subjects exposed to 50-250 micro g/L and greater than 800 micro g/L had 4.6% (P = 0.18) and 11.5% (P = 0.04) lower FEV1, respectively, than those exposed to less than 50 micro g/L. Similar patterns were seen for FVC. The prevalence odds ratios (PORs) and 95% confidence intervals (CIs) for a range of respiratory symptoms were calculated. Of the exposed subjects, 38% reported breathlessness walking at a group pace compared to 14% of unexposed subjects (POR = 5.94, 95% CI 1.36-26.02). The POR for reporting any breathlessness was 2.53 (95% CI 0.68-9.45). The results of this study suggest that early-life exposure to arsenic in drinking water affects lung function and that these effects remain many years after high exposure has ceased. The magnitude of the decrease in both FEV1 and FVC suggests that early-life exposure to arsenic could have effects similar to smoking throughout adulthood, and possibly greater than those of secondhand smoke or air pollution. As this study had a small number of participants and non-random methods of recruitment, the results need to be confirmed by a larger study with participants who are representative of the source population. A larger study would also be able to investigate the effects of lower exposures as well as effect modification and confounding.

Disinfection Byproducts

Disinfection by-products in drinking water and colorectal cancer: a meta-analysis. Rahman, M.B., Driscoll, T., Cowie, C. and Armstrong, B.K. (2010) International Journal of Epidemiology, 39(3); 733-745. The possible carcinogenic effects of ingesting disinfection by-products (DBPs) via drinking water were first investigated in ecological studies comparing cancer incidence or mortality rates in different communities. These studies suggested that people drinking chlorinated water, especially chlorinated surface water, might be at increased risk of bladder, colon and rectal cancers. A meta-analysis conducted by Morris and co-workers in 1992 concluded that higher exposure to DBPs in drinking water could be associated with a 10-40% excess risk of cancers of the bladder, colon and rectum. Since this meta-analysis, several other case-control studies have investigated the association between DBPs and colon and rectal cancers (CRCs); however, results have been inconsistent. The current meta-analysis was conducted to include all relevant case-control and cohort studies of DBPs and risk of CRCs. A literature search, including ‘grey literature’, was conducted to identify all relevant studies investigating the relationship between colorectal cancer and DBPs in drinking water.

No studies were excluded on the basis of language or time limits. Studies were included if: (i) they were case-control or cohort studies of DBPs and CRCs; (ii) a relative risk (RR) or odds ratio (OR) was reported, or one could be estimated from the published data; and (iii) the exposure was assessed by a method more specific for DBP exposure than just water source (whether surface water or ground water). A data extraction form was prepared to extract relevant information from the studies included in the analysis. Separate risk estimates for colon and rectal cancer were extracted from the studies. The quality of the studies included in the systematic review was assessed and each study was given a score (maximum value 100). Relative risks or odds ratios comparing the highest exposure category with the lowest were pooled using the inverse-variance weighted random effects method. The final meta-analysis comprised 13 studies: 3 cohort studies and 10 case-control studies. There were 10 studies that considered colon cancer, 10 that considered rectal cancer, and 7 that considered both colon and rectal cancer. Cohort studies scored a little higher for total quality than case-control studies, with an average of 41 points (range 38-44) versus an average of 40 points (range 26-56). The pooled RR estimated for colon cancer was 1.11 [95% confidence interval (CI): 0.73-1.70] for cohort studies, 1.33 (95% CI: 1.12-1.57) for case-control studies, and 1.27 (95% CI: 1.08-1.50) when both study types were combined. The pooled RR estimated for rectal cancer was 0.88 (95% CI: 0.57-1.35) for cohort studies, 1.40 (95% CI: 1.15-1.70) for case-control studies, and 1.30 (95% CI: 1.06-1.59) when both study types were combined. When each study was omitted in turn from the analysis, no single study was found to importantly influence the pooled estimate for either colon or rectal cancer. There was no evidence of publication bias for colon cancer; however, this may have been a minor issue for rectal cancer. Of the six studies that provided some dose-response data, three of the five that reported on colon cancer and three of the four that reported on rectal cancer found an increased risk with increasing DBP exposure.

After pooling the estimates from the 13 studies, this meta-analysis found similar, statistically significant, but weakly increased risks of colon cancer and rectal cancer associated with exposure to DBPs. An unavoidable weakness of this analysis is that there were relatively few articles that had sufficiently specific exposure measures to allow them to be included. The small number of studies and problems with study quality prevent causal inference. However, as most people drink water containing DBPs and colorectal cancer is common, a possible small association could translate into a large population-attributable risk and a high number of cases. Cohort studies therefore need to be conducted using improved methods of exposure assessment to assess this risk.
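
The pooling method named above, inverse-variance weighted random effects, can be written down compactly: each study's log relative risk is weighted by the inverse of its variance plus a between-study variance component estimated from the heterogeneity statistic. The Python sketch below shows the usual DerSimonian-Laird form of this calculation on made-up example data; the study estimates are illustrative only and are not the studies pooled in the meta-analysis.

import math

def random_effects_pool(rrs, cis):
    # rrs: point estimates; cis: (lower, upper) 95% CIs on the ratio scale
    y = [math.log(rr) for rr in rrs]                        # work on the log scale
    var = [((math.log(u) - math.log(l)) / (2 * 1.96)) ** 2 for l, u in cis]
    w = [1 / v for v in var]                                # fixed-effect (inverse-variance) weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q heterogeneity statistic
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                 # DerSimonian-Laird between-study variance
    w_re = [1 / (v + tau2) for v in var]                    # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)

# illustrative (made-up) study estimates, not the actual included studies
print(random_effects_pool([1.2, 1.5, 0.9, 1.4], [(0.9, 1.6), (1.1, 2.0), (0.6, 1.3), (1.0, 2.0)]))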

Emerging Organic Contaminants

Pharmaceuticals, personal care products and endocrine-disrupting chemicals in U.S. surface and finished drinking waters: A proposed ranking system. Kumar, A. and Xagoraraki, I. (2010) Science of the Total Environment 408(23); 5972-5989. Pharmaceuticals and personal care products (PPCPs) and endocrine-disrupting chemicals (EDCs) have recently been identified as emerging organic contaminants (together termed EOCs) in environmental waters that may produce adverse ecological and human health effects. Selected EOCs have been monitored in recent studies in stream waters, raw and finished drinking waters and wastewater effluents. Selection for monitoring has been based on diverse criteria such as occurrence, analytical capability, chemical properties, public perception and possible health effects. The presence of PPCPs in the influents and effluents of wastewater treatment plants has been well documented, and studies have been done on removal by various wastewater and water treatment processes. Nevertheless, it is still problematic for drinking water utilities to select EOCs for monitoring and treatment purposes. This study was undertaken to develop a comprehensive ranking system called “EOCRank” for prioritising the monitoring and treatment of EOCs in U.S. stream water/source water and finished drinking water supplies.

treatment of EOCs in U.S. stream water/source water and finished drinking water supplies. The proposed ranking system consisted for five steps: (1) selection of criteria, (2) development of attributes for every criterion, (3) data collection for every criterion, (4) calculation of rank score for every criterion and (5) calculation of overall rank score. The study used four different types of criteria: (1) occurrence in water (O), (2) treatment in drinking water utilities (T), (3) ecological effects (E) and (4) health effects (H). The “treatment” criterion was not considered for ranking EOCs in stream water/source water as stream water/source water does not undergo treatment. The “ecological effects” criterion was not considered for ranking EOCs in finished drinking water as aquatic species are not exposed to finished drinking water. Each of the criteria was represented by various attributes. The first criterion “occurrence” (O) is represented using two attributes: (1) prevalence (O1), represented as frequency of detection and (2) magnitude (O2), represented as concentration of an EOC in water. The second criterion “treatment in drinking water utilities” (T) is represented using an “overall removal” attribute (T1) of an EOC in drinking water treatment plant. The third criterion “ecological effects” (E) is represented using two attributes: (1) bioaccumulation (E1) and (2) ecotoxicity (E2). The fourth criterion “health effects” (H), is represented using two attributes (1) pregnancy category (H1) and (2) effects category (H2). The effects attribute (H2) is represented by 7 sub-attributes: carcinogenicity, mutagenicity, impairment of fertility, central nervous system (CNS) acting, developmental effects, immunotoxic effects and endocrine effects. Multiple criteria, attributes and sub-attributes have different importance in influencing overall ranking of EOCs in water and are expressed as importance weights. For every category (i.e., criteria or attributes or sub-attributes), the total sum of importance weights remains 1. Some attributes such as prevalence, magnitude, treatment, bioaccumulation and ecotoxicity effect have numerical values and other attributes such as

pregnancy category and human health category have qualitative values. To assign numerical values to quantitative/qualitative values of different attributes and sub-attributes, different utility functions (U) are used. Rank scores of EOCs were calculated as summations of multiplications of importance weights and utility functions of multiple criteria. The EOCRank system developed a priority list of 100 EOCs in U.S. waters. An EOC with a high overall rank score requires immediate attention compared to an EOC with a relatively low overall rank score. Two different ranking lists of EOCs were developed for U.S. finished drinking water and for stream water/source water. Statistical comparisons of overall scores of 100 EOCs in ranking lists of the two water types (score range: 0.2-0.7) indicated that they differ significantly (p=0.0 less than α). Therefore these two water types require individual considerations for addressing different priority EOCs. There were 20 priority EOCs highlighted in both water types to aid water utilities in selecting priority EOCs. EOCs in different ranking lists were classified into five categories of concern (COC) to study their relative importance in ranking lists of different U.S. waters: (1) pharmaceuticals, (2) personal care products (PCPs), (3) EDCs, (4) antibiotics and (5) teratogenic compounds. An increase in the number of pharmaceutical compounds were observed among priority 20 EOCs in the U.S. finished drinking water ranking list compared to that in the stream water/source water ranking list (16 pharmaceutical compounds versus 6 pharmaceutical compounds). This raised health concerns about the ingestion of finished water. In general, numbers of EOCs were seen to increase in the finished drinking water ranking list compared to the stream water/source water ranking list for all COCs, expect for the PCPs category. This indicates that category specific analyses of ranking lists provides additional information about EOCs. Category-specific ranking lists of 20 priority EOCs were devised using their overall rank scores to aid water utilities in conducting category-specific monitoring and/or removing specific EOCs from water. This EOCRank system could be used for ranking EOCs in other geographical locations by updating
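
The scoring step described above is a weighted-sum calculation: each attribute value is mapped through a utility function onto a 0-1 scale, multiplied by an importance weight, and the weighted utilities are summed to give the overall rank score. The Python sketch below illustrates that structure with two criteria and made-up weights and utility functions; the numbers and attribute choices are illustrative only and do not reproduce the EOCRank weights or data.

def utility_prevalence(freq_detect):
    # map a detection frequency (0-1) straight onto a 0-1 utility value
    return min(max(freq_detect, 0.0), 1.0)

def utility_concentration(conc_ng_l, conc_max_ng_l=1000.0):
    # scale a concentration against an assumed maximum observed value
    return min(conc_ng_l / conc_max_ng_l, 1.0)

def rank_score(freq_detect, conc_ng_l, weights=(0.5, 0.5)):
    # overall score = sum of (importance weight x utility value); weights sum to 1
    utilities = (utility_prevalence(freq_detect), utility_concentration(conc_ng_l))
    return sum(w * u for w, u in zip(weights, utilities))

# hypothetical EOCs: (detection frequency, concentration in ng/L)
candidates = {"compound A": (0.8, 250.0), "compound B": (0.3, 900.0)}
ranked = sorted(candidates, key=lambda name: rank_score(*candidates[name]), reverse=True)
print(ranked)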

Cytotoxic drugs in drinking water: A prediction and risk assessment exercise for the Thames catchment in the United Kingdom. Rowney, N.C., Johnson, A.C. and Williams, R.J. (2009) Environmental Toxicology and Chemistry, 28(12); 2733-2743. A proportion of the pharmaceuticals that are consumed every day are excreted unmetabolised, or in the form of conjugates that may subsequently be reactivated. Sewage treatment plants (STPs) often do not completely eliminate these compounds and trace levels are discharged into surface waters. Over 100 such compounds have so far been identified in surface and groundwater environments. There are presently no regulatory requirements to monitor pharmaceuticals in the environment, nor statutory maximum emission levels. Cytotoxic drugs are used in cancer chemotherapy and are designed to disrupt or prevent cellular proliferation, usually by interfering with DNA synthesis in some way. These high-potency drugs have fetotoxic, genotoxic and teratogenic properties and are potentially among the most dangerous contaminants of our water systems. This study investigates the water quality implications for the Thames catchment (United Kingdom) which may arise from the routine discharge of these drugs, after use, into the drain and into the river. Literature was collected for 13 cytotoxic drugs (alkylating agents, antimetabolites and anthracycline antibiotics) on the consumption per capita, typical human excretion and estimated proportion of residue escaping from a typical STP. These values were used to give an effluent emission per capita per day and used in the Low Flows 2000 Water Quality Extension (LF2000-WQX) model for the Thames catchment. This model was developed from the Low Flows 2000 geographic information system (GIS) hydrological model to predict concentrations of chemicals in real catchments. The LF2000-WQX model can predict statistical distributions of concentrations of down-the-drain chemicals in river stretches downstream of all major STPs in England and Wales. The predicted environmental concentrations (PECs) throughout the Thames catchment in general represented the upper-end loading of cytotoxic drugs into British rivers, as the higher consumption and excretion percentages and lower STP removal factors were selected. The predicted concentrations at drinking water abstraction points were then calculated. The drinking water abstraction points examined were at Farmoor, which serves the population of much of southern Oxfordshire and Wiltshire, and Walton, which serves the citizens of London. It was assumed that no cytotoxic drugs would be removed in drinking water purification and that an individual consumes 2 L of water per day. Exposures were calculated by multiplying drinking water consumption by the PEC at the intake point. Concentrations in the catchment for the alkylating agent group ranged from 0 to 145 ng/L, values for the antimetabolites ranged from 0 to 27.4 ng/L, and values for the anthracycline antibiotics ranged from 0 to 0.7 ng/L. The predicted mean and 90th percentile raw water intake concentrations for Walton were higher than for Farmoor. The predicted mean and 90th percentile environmental concentrations at Walton were 11 and 20 ng/L, respectively, for the alkylating agent group; 2 and 4 ng/L, respectively, for the antimetabolites; and 0.05 and 0.09 ng/L, respectively, for the anthracycline antibiotics. The predicted raw water concentrations were compared with effects or assumed safe levels for human consumption. Exposure values were compared with the threshold of toxicological concern (TTC) and no-significant-risk-level (NSRL) standards found in the literature and used by regulatory agencies. The European Medicines Agency (EMEA) has set a TTC intake value of 1.5 micro g/d for any individual genotoxic impurity.

The European Medicines Agency (EMEA) has set a TTC intake value of 1.5 micro g/d for any individual genotoxic impurity. Kroes et al. have suggested a 10-fold lower TTC of 0.15 micro g/d for high-potency carcinogens. Schulman et al. have used an NSRL of 1 micro g/d for cyclophosphamide, as this is the level of risk recommended by the State of California Environmental Protection Agency for drinking water. Considering the 90th percentile exposure assessment, which represents concentrations likely under low flow conditions, the predicted anthracycline concentrations are extremely low, at less than 0.1 ng/L combined, and therefore represent negligible risk. Antimetabolites and alkylating agents could reach 4 and 20 ng/L, respectively; a person drinking 2 L of water per day under this worst-case scenario would therefore be exposed to 8 to 40 ng/d of these cytotoxic drugs during low flow periods. The predicted intake of the combined alkylating agents is about a factor of 25 below the Schulman et al. no-significant-risk level, a factor of 40 below the EMEA level and a factor of 4 below the Kroes et al. limit for high-potency drugs. The prevalence of cytotoxic drugs in drinking water is likely to increase due to population growth, the possibility of hotter and drier summers in the future, and rising rates of cancer in England as the population ages. Therefore, the presence of cytotoxic drugs in drinking water warrants further investigation. The risk from exposure to cytotoxic drugs for healthy adults is probably low; however, specific subgroups in the population such as pregnant women, their foetuses and breastfeeding infants may be particularly vulnerable to highly potent pharmaceuticals.
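As a rough illustration of the exposure arithmetic summarised above, the Python sketch below multiplies the 90th percentile alkylating-agent concentration quoted for Walton (20 ng/L) by the assumed consumption of 2 L of water per day and compares the resulting daily intake with the three intake thresholds cited (1.5, 0.15 and 1 micro g/d). The values are taken from this summary and the sketch is purely illustrative.

    # Worked example of the intake-versus-threshold comparison described above.
    # Concentration and threshold values are those quoted in the summary.

    NG_PER_UG = 1000.0

    def daily_intake_ng(concentration_ng_per_l, litres_per_day=2.0):
        # Daily intake (ng/d) = concentration (ng/L) x water consumed (L/d)
        return concentration_ng_per_l * litres_per_day

    intake = daily_intake_ng(20.0)   # 90th percentile alkylating agents at Walton
    print(intake)                    # 40.0 ng/d

    thresholds_ug_per_day = {
        "EMEA TTC (genotoxic impurity)": 1.5,
        "Kroes et al. TTC (high-potency carcinogens)": 0.15,
        "Schulman et al. NSRL (cyclophosphamide)": 1.0,
    }

    # These margins correspond approximately to the factors of 40, 4 and 25
    # quoted in the summary above.
    for name, limit in thresholds_ug_per_day.items():
        margin = limit * NG_PER_UG / intake
        print(f"{name}: intake is about {margin:.0f}-fold below this level")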

Endemic Gastroenteritis

Association between rainfall and pediatric emergency department visits for acute gastrointestinal illness. Drayna, P., McLellan, S.L., Simpson, P., Li, S.H. and Gorelick, M.H. (2010) Environmental Health Perspectives, 118(10); 1439-1443.

Waterborne disease has been associated with heavy rainfall in the United States and in other countries; however, most of the literature has focused on reported disease outbreaks, which represent a minority of actual cases. During heavy rainfall, the quality of water may be adversely affected in several ways: contamination of surface or groundwater sources by stormwater runoff from impermeable or saturated surfaces where faecal contaminants are introduced; cross-contamination due to infiltration and inflow between sewage and water pipes, especially in places with ageing water infrastructure; and release of sewage into local waterways due to sewage overflows or bypass. The association between rainfall and acute gastrointestinal illness (AGI) in children in the absence of sewage releases and disease outbreaks has not been well described. As climate change is predicted to lead to an increase in extreme precipitation events, the role of weather in the incidence of waterborne disease needs to be determined in the interest of public health. This study was undertaken in Wisconsin, United States, to examine the association between rainfall and AGI visits to the Children's Hospital of Wisconsin Emergency Department (CHW ED) in Wauwatosa. The CHW ED serves approximately 60,000 patients per year from a wide catchment area which includes urban, suburban and rural communities. Most of the patients seen in the CHW ED live in communities served by surface water from Lake Michigan provided by public utilities; the remainder use groundwater, mostly from municipal wells. Daily total visits to the CHW ED with a discharge diagnosis of AGI were identified over a 6-year period from 1 January 2002 to 31 December 2007. Daily rainfall totals were obtained from the National Oceanic and Atmospheric Administration monitoring station at General Mitchell International Airport. Data on sewage release events during the study period were obtained from the Milwaukee Metropolitan Sewerage District and Milwaukee Riverkeeper. An autoregressive moving average model, adjusted for confounding variables such as sewage release events and season, was used to test for an association between daily visits and rainfall at lags of 1 to 7 days.

A total of 17,357 AGI visits to the ED were identified (mean daily total 7.9; range 0-56) over the 2,191-day study period. During the study period there were 776 days of rain (35.4% of days). The average daily rainfall on days with rain was 0.58 cm (range 0.02-7.01 cm), with a median of 0.25 cm. Extreme precipitation events (greater than the 95th percentile, defined as 24-hour rainfall greater than 2.54 cm or a 48-hour total greater than 3.81 cm) occurred 37 times. Eight sewer overflows greater than 3,785,412 L were reported during the study period. Seasonal effects were seen, with increased AGI in the winter months as expected. One of the eight sewer overflows occurred during the winter, with an estimated volume of 147,631,100 L, and this was significantly associated with increased visits. Of the lags tested, only rainfall 4 days prior was significantly associated with the number of visits. Each day was classified as rain or no rain, and the average visits (and percentage change in visits) were calculated from the model at a lag of 4 days. The estimated mean number of visits 4 days after any rainfall was 8.1 (95% CI, 7.2-8.5), which is 11% higher than the estimated mean of 7.3 visits (95% CI, 7.1-7.6) 4 days after days without rainfall. This effect was smaller than the effect of winter months. Results were unchanged when visits were restricted to patients in areas served by surface water only (85% of total visits). The association found here between rainfall and paediatric ED visits for AGI is plausible, given that pathogens are detectable in surface water and groundwater at baseline levels that increase after rainfall. The increase in ED visits for AGI seen in this study occurred in the absence of any disease outbreaks reported to public health officials in the region, and suggests that rainfall-associated illness may be underestimated. It is important to increase our understanding of the impact that rainfall has on the burden of disease, as this is the first step in finding ways to mitigate the risk of disease, especially in the context of climate change. Further study is required to better examine this association and the potential mechanisms for the introduction of pathogens into environmental and drinking water.
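To make the lag analysis concrete, the sketch below (Python, using pandas and statsmodels) regresses daily visit counts on a rain indicator lagged by 4 days. This is a simplified Poisson-regression stand-in for the autoregressive moving average model actually used in the study; the input data are randomly generated, and there is no adjustment for season or sewage releases.

    # Simplified stand-in for the lagged rainfall analysis described above.
    # Hypothetical data; the study used an autoregressive moving average model
    # adjusted for season and sewage release events.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n_days = 2191
    df = pd.DataFrame({
        "rain": rng.random(n_days) < 0.354,     # ~35% of days with rain
        "visits": rng.poisson(7.9, n_days),     # daily AGI visit counts
    })
    df["rain_lag4"] = df["rain"].shift(4)       # rainfall 4 days before the visit day
    df = df.dropna()

    X = sm.add_constant(df["rain_lag4"].astype(float))
    model = sm.GLM(df["visits"], X, family=sm.families.Poisson()).fit()

    # exp(coefficient) is the rate ratio for visits 4 days after rain versus no rain;
    # in the study this corresponded to 8.1 vs 7.3 visits, about an 11% increase.
    # (With the random data above the ratio will be close to 1.)
    print(np.exp(model.params["rain_lag4"]))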

Outbreaks

An outbreak of shiga toxin-producing Escherichia coli infection associated with a school camp. McCall, B.J., Slinko, V.G., Smith, H.V., Heel, K., Culleton, T.H., Kelk, V.R. and Stafford, R.J. (2010) Communicable Diseases Intelligence, 34(1); 54-56.

Shiga toxin-producing Escherichia coli (STEC) are an important cause of gastroenteritis and are associated with considerable mortality and morbidity. Severe complications may occur in up to 8% of cases, including haemolytic uraemic syndrome (HUS) in children or thrombotic thrombocytopenic purpura in adults. In Europe, North America and other developed countries, a number of large outbreaks of STEC infection have been associated with contaminated food or water; however, in Australia there have been few reported outbreaks of STEC. In November 2008 the Brisbane Southside Public Health Unit was notified of STEC infection in a teenage male, and a remote rural school camp was identified as the likely source of infection. In accordance with Queensland Health protocols, a case investigation was conducted along with contact tracing and the collection of faecal samples. A questionnaire was administered to students who had attended the camp to collect details of potential exposures, including water sources, facilities used, and animal and environmental exposures. An environmental health investigation was undertaken at the school camp site, including testing of water samples from rainwater tanks, bores and other potential environmental sources. Stool samples were requested from students who attended the camp as well as from school and site maintenance staff. Faecal sample testing for STEC was also offered to household contacts of those camp attendees who were found to be positive for STEC. Among the 20 students who completed the questionnaire, no specific water source or other potential risk factor was identified. A faecal sample from the index case was positive for Shiga-like toxin, had stx1, stx2, eaeA and ehxA genes detected, and was serotyped as O26:H11.

All faecal samples from the index case's household contacts were negative for STEC. Thirteen other students provided faecal samples, and four asymptomatic students tested positive for a variety of STEC genes. One of the 15 household contacts of the 4 asymptomatic cases tested positive for stx2; this parent had visited the camp and consumed water from a container in one of the common areas prior to the introduction of bottled water. Two water samples collected from the camp had STEC isolated. A sample from the case's dormitory had O2:H8 isolated and stx1, stx2, ehxA and saa genes detected. The other positive sample had O91:H10 isolated and also had stx2, ehxA and saa genes detected. Of the 13 samples from rainwater tanks and their outlets, 10 had unacceptably high levels of E. coli (up to 310 E. coli per 100 ml) and did not meet the microbiological requirements of the Australian Drinking Water Guidelines 2004. The bore water was found to be microbiologically safe, and no Shiga toxin genes were found in animal faecal samples collected at the camp. Food handling and general hygiene standards at the camp were found to be satisfactory. It is postulated that the tank water supply at the camp was the most likely source of this outbreak. The rainwater tanks were not fitted with 'first flush' devices (rainwater diverters), had not been cleaned out for many years and were not subject to any form of disinfection. There was no documented evidence of routine maintenance or of a chemical or microbiological testing program for the tank or bore water supplies. As a result of this outbreak, remedial action was undertaken at the school camp to ensure the safety of the drinking water supply: the tank water reticulation system was redesigned, ultraviolet sterilisation of the drinking water sources was instituted, and all drinking water tanks were cleaned and disinfected. This outbreak is notable for the high degree of asymptomatic carriage of STEC. It is not currently known how long asymptomatic carriage may occur prior to the onset of infection, or whether there are certain physiological triggers that can switch carriage to infection. The clinical relevance of asymptomatic carriage of STEC is still unknown. The potentially serious progression of STEC infection means that measures to improve the detection of asymptomatic carriage of STEC and support disease control are important.

Perfluorinated Compounds

Epidemiologic evidence on the health effects of perfluorooctanoic acid (PFOA). Steenland, K., Fletcher, T. and Savitz, D.A. (2010) Environmental Health Perspectives, 118(8); 1100-1108.

Perfluorooctanoic acid (PFOA or C8) does not occur naturally but has been used as an industrial chemical since the 1940s. PFOA does not break down once in the environment and has been detected in the serum of human populations in a number of industrialised countries. There have been some reports of drinking water sources near manufacturing facilities being contaminated with PFOA; however, population exposure sources are generally not well understood. PFOA has an estimated half-life in humans of about 3 years and is not metabolised in the body. Animal data indicate that PFOA can cause several types of tumours, low birth weight and neonatal death, and may have toxic effects on the immune system, liver and endocrine system. Data on the health effects of PFOA in humans are sparse, and the relevance of the animal data is controversial, as findings may not be generalisable to humans. The most consistent findings in the epidemiological literature on the health effects of PFOA have been modest increases in cholesterol, and to a lesser degree a modest increase in uric acid, in those with higher PFOA levels. These findings, however, have mostly been based on cross-sectional data in which exposure and outcome were measured simultaneously, making causal inference difficult. Three longitudinal studies have shown changes in PFOA over time correlated with changes in cholesterol levels, and one longitudinal study has shown a similar effect on uric acid. These longitudinal studies add strength to a causal relationship; however, they do not preclude the possibility that PFOA and cholesterol, or PFOA and uric acid, are both associated with some other biological process that changes over time.

Other cross-sectional studies have shown some evidence of an increase in liver enzymes with higher serum PFOA levels; however, these findings have not been consistent across studies and the observed increases in enzyme levels have been modest. Human studies assessing immune suppression and PFOA are limited and the evidence is insufficient. Reduction of thyroid hormone in relation to high exposure levels of PFOA has been examined, and the occupational data show no consistent evidence of an association. There are very limited human data on hormones in relation to serum PFOA levels, and results have been inconsistent. Mortality has been assessed in two cohort studies of occupationally exposed workers. Findings in these studies have been inconsistent apart from diabetes mortality, which was significantly elevated in both studies when worker comparison groups were used; however, another large study of diabetes prevalence showed no relation with PFOA. There is some evidence of increased kidney cancer and heart disease from another cohort study, and another study showed a positive exposure-response trend with stroke and prostate cancer. The evidence for chronic disease is sparse and inconsistent, and no conclusions can be drawn from it. The research on reproductive and developmental health end points is growing rapidly and offers sporadic evidence that is hard to interpret. The studies provide inconsistent suggestions of a possible decrease in birth weight associated with PFOA exposure. Different measures of foetal growth across studies make it difficult to assess whether the studies of background PFOA exposure and those considering more elevated PFOA levels are consistent with each other; however, so far the positive evidence for an effect on birth weight comes from studies of background exposure levels. Other indicators of foetal growth have been studied, with suggestions of adverse effects of PFOA on head circumference and ponderal index (the relationship between mass and height) and on abdominal circumference and birth length. Studies examining preterm birth have reported elevated risks for intermediate exposure categories only, or no association. Child health and development have been examined to a very limited extent; an association has been reported between PFOA levels exceeding the 90th percentile and birth defects in aggregate, but in a study based on birth certificates from the mid-Ohio Valley, birth defects in aggregate were not associated with residence in a high-PFOA area. In a study examining parent-reported developmental milestones in infants at 6 and 18 months of age, no indication of delays was associated with higher PFOA exposure. The epidemiological evidence on the health effects of PFOA is still limited, and the data are insufficient to draw definite conclusions regarding the role of PFOA in relation to any of the diseases of concern. The epidemiological evidence so far comes mainly from two occupational cohorts and several occupational cross-sectional studies, as well as community populations with background exposure levels and one community with elevated exposure in the mid-Ohio Valley. The exposed community in the mid-Ohio Valley may provide an opportunity to extend knowledge regarding the health effects of PFOA, and studies are currently underway in this community. Levels of PFOA exposure in this population are markedly above background levels, and the population is large and comprises a wide age range.

Comment

This paper was written by members of a scientific panel established under a C8 Class Action Settlement Agreement between the DuPont company (owner of a PFOA manufacturing facility) and nearby residents alleging adverse effects from exposure to PFOA.

Protozoa

Two-year monitoring of Cryptosporidium parvum and Giardia lamblia occurrence in a recreational and drinking water reservoir using standard microscopic and molecular biology techniques. Helmi, K., Skraber, S., Burnet, J.B., Leblanc, L., Hoffmann, L. and Cauchie, H.M. (2010) Environmental Monitoring and Assessment, DOI 10.1007/s10661-010-1726-7.

The occurrence of Giardia lamblia and Cryptosporidium parvum was investigated in the Esch-sur-Sure reservoir, located in the north of the Grand Duchy of Luxembourg. This reservoir is used for recreational activities such as swimming and boating and is also the main drinking water resource for the country.

Water is treated for drinking by ozonation, coagulation, slow sand filtration, pH adjustment and disinfection by chlorination. Water samples were collected over two years, 2006-2007, from three sampling sites. Site A was located at the Pont-Misere predam and corresponded to the point where the Sure River enters the reservoir. Site B corresponded to the drinking water treatment plant inlet and is located 18 km downstream of site A and 4 km downstream of a recreational area. Site C corresponded to the drinking water treatment plant outlet. Samples were collected once a month during 2006 and twice a month during 2007. Giardia and Cryptosporidium (oo)cysts were detected in the environmental samples using standardised microscopy techniques as well as molecular quantitative PCR (qPCR) techniques. Faecal indicator bacteria were quantified in water samples collected from sites B and C, but not from site A; the microbiological water quality was assessed by quantifying the bacterial indicators Escherichia coli, intestinal enterococci and Clostridium. Physicochemical characteristics of the water were measured at site B and included pH, conductivity and turbidity, according to standard methods. Rainfall data for the 3 days prior to each sampling day were collected from the nearest meteorological station at Esch-sur-Sure. The risk of infection from exposure to G. lamblia and C. parvum (oo)cysts was estimated using an exponential dose-response model. For the risk assessment associated with recreational activities in the reservoir, the average volumes of water accidentally ingested were assumed to be 16 ml for an adult and 37 ml for a child. There were 90 samples collected during the two years of the study. The highest parasite concentrations were observed at site A. Both parasites were present in the reservoir throughout the year, with G. lamblia cysts in higher numbers than C. parvum oocysts in all surface water samples. The median concentrations at site A were 5.0 Giardia cysts vs 1.65 Cryptosporidium oocysts per 10 L, and at site B, 0.9 Giardia cysts vs 0.3 Cryptosporidium oocysts per 10 L. Neither parasite was detected in any of the finished drinking water samples from site C. Several sources of contamination could explain the occurrence of these parasites, including discharge from a nearby wastewater treatment plant; in addition, agriculture constitutes up to 50% of land use in the Upper-Sure watershed, 14% of which consists of pasture land for cattle, and it has been shown that cattle, and in particular pre-weaned calves, can be an important source of C. parvum oocysts. Microscopy was found to be more sensitive than qPCR: only 27% and 23% of microscopy-positive samples were positive by qPCR for Giardia and Cryptosporidium, respectively. This may be explained either by an overestimation of (oo)cyst counts using microscopy or by PCR inhibitors in surface water samples. Concentrations of (oo)cysts assessed by microscopy and by qPCR were broadly similar, although discrepancies between the two methods were seen, especially for Cryptosporidium; oocysts were always detected in higher concentrations by immunofluorescence microscopy. This overestimation of the oocyst concentration by microscopic counting may be due to cross-reactions with other species of similar size and shape, and the presence of empty oocysts, which can be found in high numbers in environmental samples, may also contribute to the difference. Even though (oo)cyst occurrence could be correlated with the presence of different faecal indicators (E. coli, streptococci and enterococci), the results show that the absence of bacterial indicators is not sufficient to guarantee the absence of pathogenic protozoa, as 12% of the water samples collected at site B were negative for faecal indicator bacteria while positive for parasites at concentrations ranging from 0.1 to 0.3 (oo)cysts per 10 litres. Conversely, 9% of the samples were positive for bacterial indicators while negative for (oo)cysts. Overall, 22% of the surface water samples collected at site B showed discrepancies between indicator and protozoa occurrence. Both parasites had their highest occurrence during the winter season, from December to March. Parasite occurrence and temperature appeared to be inversely correlated (Spearman Rank Correlation test, p less than 0.05), while pH appeared to be positively correlated (Spearman Rank Correlation test, p less than 0.05).

No correlation between (oo)cyst occurrence and rainfall events was found at site A or B. A positive correlation between increasing (oo)cyst occurrence and increasing turbidity was observed at site B, and a significant correlation was found between Giardia cyst and Cryptosporidium oocyst concentrations. The risk of infection with G. lamblia and C. parvum during recreational activity in the Esch-sur-Sure reservoir was assessed for the summer, as no recreational activities take place on the lake in winter. At site B during the summer the maximum risk of infection per swimming event (corresponding to the highest concentration of parasites detected in the water) ranged from 0.0006% to 0.0015% for Giardia and from 0.0002% to 0.0004% for Cryptosporidium. At site A the corresponding summer risk ranged from 0.24% to 0.56% for Giardia and from 0.0015% to 0.0064% for Cryptosporidium. No (oo)cysts were detected in large volumes of finished drinking water, and a risk of infection would only arise in the case of malevolent acts, deficiencies in water treatment or problems occurring during maintenance operations. This is the first study to report the occurrence of G. lamblia and C. parvum (oo)cysts in the surface and drinking waters of the Grand Duchy of Luxembourg. However, the risk of infection to swimmers was estimated to be less than 1%, indicating a low probability of infection by these parasites in the studied area.
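The exponential dose-response calculation underlying these recreational risk estimates can be sketched as follows in Python. The dose-response parameters shown (r of roughly 0.02 for Giardia and 0.004 for Cryptosporidium) are commonly cited literature values and may differ from those used by the authors; the concentrations and ingestion volumes are taken from this summary, and the calculation is illustrative only.

    # Sketch of an exponential dose-response calculation for a single swimming
    # event. Parameter values are illustrative, not necessarily the study's own.

    import math

    def infection_risk(conc_per_10L, ingested_ml, r):
        # P(infection) = 1 - exp(-r * dose), where dose = concentration x volume ingested
        conc_per_litre = conc_per_10L / 10.0
        dose = conc_per_litre * (ingested_ml / 1000.0)
        return 1.0 - math.exp(-r * dose)

    R_GIARDIA = 0.0199   # commonly cited exponential parameter for Giardia
    R_CRYPTO = 0.0042    # commonly cited exponential parameter for Cryptosporidium

    # Median site A concentrations from the summary (5.0 Giardia cysts and
    # 1.65 Cryptosporidium oocysts per 10 L), with the assumed ingestion
    # volumes of 16 mL (adult) and 37 mL (child).
    for label, volume_ml in [("adult", 16.0), ("child", 37.0)]:
        g_risk = infection_risk(5.0, volume_ml, R_GIARDIA)
        c_risk = infection_risk(1.65, volume_ml, R_CRYPTO)
        print(f"{label}: Giardia {g_risk:.6f}, Cryptosporidium {c_risk:.7f}")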

Toxic elements

High-level exposure to lithium, boron, cesium, and arsenic via drinking water in the Andes of Northern Argentina. Concha, G., Broberg, K., Grander, M., Cardozo, A., Palm, B. and Vahter, M. (2010) Environmental Science and Technology, 44(17); 6875-6880.

In many areas of the world surface water is lacking or is highly polluted, and people are becoming increasingly dependent on groundwater for drinking. Groundwater is generally much less contaminated by human activity than surface water and is commonly believed to be suitable for drinking. However, groundwater aquifers may sometimes contain potentially toxic metals, depending on the local geology. Elevated concentrations of arsenic in groundwater have been reported in many parts of the world and, because of the increasing use of groundwater for drinking, the number of people exposed to elevated arsenic continues to increase. In Argentina, elevated concentrations of arsenic in groundwater have been detected particularly in the Puna region of the Andes, the Chaco region, Cordoba and the Pampean Plain. This study was conducted to assess human exposure to other potentially toxic elements, in addition to arsenic, through drinking water in the same Puna villages of the Argentinean Andes. The study was conducted in San Antonio de los Cobres and five surrounding Andean villages in northern Argentina, and assessed the exposure to potentially toxic elements of 198 women in this area. The women were interviewed about their sources of drinking water, amounts of water consumed, dietary habits and time of residence in the area. Water intake was estimated from the number of glasses of water consumed daily. Spot urine samples were obtained from all women to measure concentrations of a range of potentially toxic and essential elements; urine is the major route of excretion of absorbed arsenic and several other elements. Drinking water was collected from the public water distribution systems or private wells in 20-mL acid-washed polyethylene bottles after flushing the water for about 1 minute. Water and urinary concentrations of 31 different elements were determined using inductively coupled plasma mass spectrometry (ICP-MS). The drinking water had high concentrations of arsenic (maximum 210 micro g/L), lithium (maximum 1000 micro g/L), cesium (maximum 320 micro g/L), rubidium (maximum 47 micro g/L) and boron (maximum 5950 micro g/L). The concentration of vanadium was also moderately elevated in some villages. Concentrations of other toxic metals such as cadmium, lead, beryllium, chromium and uranium were low.

None of the water samples contained significant amounts of essential trace elements, and concentrations of calcium, magnesium, iron and manganese, which are often found to be elevated in groundwater, were low. Elevated concentrations of arsenic, lithium, cesium and boron were also found in the urine of the women in the study. Village median values ranged from 26 to 266 micro g/L for arsenic, 340 to 4550 micro g/L for lithium, 34 to 531 micro g/L for cesium and 2980 to 16,560 micro g/L for boron. The interviews showed that local tap water was the main source of drinking water. For the women in San Antonio de los Cobres there was a significant correlation of arsenic or lithium in urine with the number of glasses of water consumed per day (range one to more than three glasses), based on questionnaire data (for arsenic rs = 0.180, p = 0.023; for lithium rs = 0.173, p = 0.029). The temporal variation in the concentrations of elements (Li, B, V, As, Rb, Sr, Mo, Cs, U) in water samples collected in San Antonio de los Cobres over several years was fairly small. In four of the six studied Puna villages the concentrations of arsenic and boron exceeded the maximum levels allowed by the Argentinean Alimentary Code, as well as the guideline values recommended by the WHO; guideline values have yet to be established for lithium and cesium. This study showed highly elevated exposure to arsenic, lithium, cesium and boron through drinking water in the Argentinean Puna region, in particular in the largest village, San Antonio de los Cobres, which has a population of about 5000. There are apparent risks of toxic effects from long-term exposure to several of the elements studied. Studies of the health effects of exposure to lithium, cesium and boron in particular are warranted, especially considering the combined, life-long exposure. This study shows that analyses of drinking water contaminants should not be limited to arsenic but need to include many other potentially toxic elements, which is possible using ICP-MS.
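For readers unfamiliar with the rank correlation statistic quoted above, a Spearman correlation between urinary concentrations and reported glasses of water per day can be computed as in the short Python sketch below. The data in the sketch are hypothetical and are not the study's measurements.

    # Hypothetical illustration of a Spearman rank correlation of the kind
    # reported above (scipy.stats.spearmanr returns rs and its p-value).

    from scipy.stats import spearmanr

    glasses_per_day = [1, 1, 2, 2, 3, 3, 3, 4, 4, 4]                   # reported intake
    urinary_arsenic = [30, 55, 60, 90, 120, 80, 150, 140, 200, 170]    # micro g/L (made up)

    rs, p_value = spearmanr(glasses_per_day, urinary_arsenic)
    print(f"rs = {rs:.3f}, p = {p_value:.3f}")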

Viruses

Presence of enteric viruses in source waters for drinking water production in the Netherlands. Lodder, W.J., Van Den Berg, H.H.J.L., Rutjes, S.A. and De Roda Husman, A.M. (2010) Applied and Environmental Microbiology, 76(17); 5965-5971.

In the Netherlands, surface water is used as source water for the production of drinking water, but also for recreational purposes and for the cultivation of shellfish for human consumption. Surface water may contain high concentrations of human pathogenic viruses, which may potentially affect drinking water quality. The Dutch Drinking Water Directive makes it obligatory to establish drinking water quality by quantitative microbial risk assessment (QMRA) for index pathogens. To comply with this directive, the annual infection risk from consumption of unboiled tap water should be lower than one infected person in every 10,000 persons. To assess this risk, the index pathogens in raw source water need to be measured and the removal efficiency of the treatment processes used must be assessed. This study examined the quality of Dutch source waters for drinking water production at 10 locations with respect to enteric viruses. Water samples were taken from 1999 to 2002, with the sampling period varying between locations. Large volumes of source water for drinking water production, 200 to 600 litres, were collected from each of the 10 locations in the Netherlands, and sampling was repeated 1 to 12 times at each location during the study period. The locations were either the intake areas for the drinking water companies or upstream of a source water intake area. Enteric virus concentrations were determined in the surface water samples. Bacteriophages were also determined because they have previously been suggested as useful virus indicators. Enteroviruses, reoviruses and bacteriophages were detected by cell culture, and norovirus and rotavirus RNA was detected by reverse transcription-PCR (RT-PCR). Physical parameters including temperature, pH and turbidity were also measured. A total of 75 samples were taken from the 10 locations throughout the 4-year study period. Using cell culture methods, enteroviruses and reoviruses could be detected in 75% and 83% of all the samples tested, respectively. The mean and median enterovirus concentrations per location varied between 0.0052 and 2.4 PFU/litre and between 0 and 2.4 PFU/litre, respectively.

The mean and median reovirus concentrations per location varied between 0.013 and 1.3 PFU/litre and between 0 and 0.8 PFU/litre, respectively. The enterovirus concentrations in the positive samples ranged from 0.0033 to 5.2 PFU/litre, and the reovirus concentrations from 0.0030 to 5.9 PFU/litre. Somatic coliphages were found in all of the tested water samples, and F-specific phages were found in 73 of the 75 tested samples. The mean somatic and F-specific bacteriophage concentrations in the samples varied between 105 and 1.7 x 10^4 PFU/litre and between 2.0 and 4.3 x 10^3 PFU/litre, respectively. The median somatic and F-specific bacteriophage concentrations in the samples varied between 88 and 1.2 x 10^4 PFU/litre and between 0.44 and 3.4 x 10^3 PFU/litre, respectively. The concentrations of somatic coliphages found in the samples ranged from 1.1 to 1.1 x 10^5 PFU/litre, and the concentrations of F-specific phages found in the positive samples ranged from 0.18 to 1.4 x 10^4 PFU/litre. Overall, the concentrations of viruses found were somewhat higher in the colder months of the year than in the warmer months. Of the 75 water samples, 69 were tested for the presence of norovirus and rotavirus RNA, which could be detected in 45% and 48% of these samples, respectively. The numbers of norovirus and rotavirus RNA detected in the 69 samples tested ranged from 0 to 171 PCR-detectable units (PDU)/litre and from 0 to 2.2 x 10^3 PDU/litre, respectively. The mean norovirus concentrations varied between 0 and 26 PDU/litre and the mean rotavirus concentrations between 0.88 and 375 PDU/litre. No correlation could be found between the numbers of norovirus-positive and rotavirus-positive samples. A very strong correlation (P less than 0.00005) was found between the concentrations of somatic coliphages and F-specific phages detected in the samples, and a significant correlation was also found between the presence of these two phages and enteroviruses (P less than 0.0005). No other significant correlations between the tested viruses were seen. There were differences in virus concentrations between locations, but the average virus concentrations found at a specific location in the same season over several years were similar.

The pH, temperature and turbidity of the water were measured for 93%, 77% and 50% of the samples, respectively. A moderate correlation (P less than 0.005) was found between the pH of the water and the norovirus concentrations, with higher norovirus concentrations found at higher pH. No other significant correlations with physical parameters were found. An unacceptable risk to public health may occur if peak concentrations of these enteric viruses coincide with periods when water treatment is insufficient or failing, which highlights the importance of monitoring for waterborne pathogenic viruses. To further assess the risk of transmission of viruses via water, it is necessary to gather more information about the identity and viability of the viruses present in different types of water samples.
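The Dutch target mentioned in this summary, an annual infection risk below one in 10,000 from consumption of unboiled tap water, can be related to a per-day risk as in the sketch below. The daily risk value used is hypothetical and is included only to show the arithmetic.

    # Illustration of how a daily infection risk accumulates into an annual risk,
    # for comparison with the target of less than 1 infection per 10,000 persons
    # per year. The daily risk value is hypothetical.

    daily_risk = 2e-8                                   # per-day probability of infection
    annual_risk = 1.0 - (1.0 - daily_risk) ** 365

    print(f"annual risk: {annual_risk:.2e}")            # about 7.3e-06
    print(f"meets 1-in-10,000 target: {annual_risk < 1e-4}")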