Document for Public Comment
Prepared by the Federal-Provincial-Territorial Committee on Drinking Water
Consultation period ends January 28, 2011
The Federal-Provincial-Territorial Committee on Drinking Water (CDW) has assessed the available information on enteric protozoa with the intent of revising the current drinking water guideline. The purpose of this consultation is to solicit comments on the proposed guideline, on the approach used for its development and on the potential economic costs of implementing it, as well as to determine the availability of additional exposure data.
The CDW has requested that this document be made available to the public and open for comment. Comments are appreciated, with accompanying rationale, where required. Comments can be sent to the CDW Secretariat via email at firstname.lastname@example.org. If this is not feasible, comments may be sent by mail to the CDW Secretariat, Water, Air and Climate Change Bureau, Health Canada, 3rd Floor, 269 Laurier Avenue West, A.L. 4903D, Ottawa, Ontario, Canada K1A 0K9. All comments must be received before January 28, 2011.
It should be noted that this guideline technical document on enteric protozoa in drinking water will be revised following evaluation of comments received, and a drinking water guideline will be established, if required. This document should be considered as a draft for comment only.
Where treatment is required for enteric protozoa, the proposed guideline for Giardia and Cryptosporidium is a health-based treatment goal of a minimum 3-log removal and/or inactivation of cysts and oocysts. Depending on the source water quality, a greater log removal and/or inactivation may be required. Treatment technologies and watershed or wellhead protection measures known to reduce the risk of waterborne illness should be implemented and maintained if source water is subject to faecal contamination or if Giardia or Cryptosporidium has been responsible for past waterborne outbreaks.
Protozoa are a diverse group of microorganisms. Most are free-living organisms that can reside in fresh water and pose no risk to human health. Some enteric protozoa, such as Giardia and Cryptosporidium, are pathogenic and have been associated with drinking water-related outbreaks. They may be found in water following direct or indirect contamination by the faeces of humans or other animals. Person-to-person transmission is a common route of transmission of both Giardia and Cryptosporidium.
Health Canada recently completed its review of the health risks associated with enteric protozoa in drinking water. This Guideline Technical Document reviews and assesses identified health risks associated with enteric protozoa in drinking water. It evaluates new studies and approaches and takes into consideration the methodological limitations for the detection of protozoa in drinking water. From this review, the guideline for protozoa in drinking water is proposed as a health-based treatment goal of a minimum 3-log reduction of enteric protozoa.
During its November 2009 meeting, the Federal-Provincial-Territorial Committee on Drinking Water reviewed the proposed Guideline Technical Document for enteric protozoa and gave approval for this document to undergo public consultations.
The health effects associated with exposure to Giardia and Cryptosporidium, like those of other pathogens, depend upon features of the host, pathogen and environment. The host's immune status, the (oo)cyst's infectivity and the degree of exposure are all key determinants of infection and illness. Infection with Giardia or Cryptosporidium can result in both acute and chronic health effects.
Theoretically, a single cyst of Giardia would be sufficient to cause infection. However, studies have shown that the dose required for infection is usually more than a single cyst and is dependent on the virulence of the particular strain. Typically, Giardia is non-invasive and results in asymptomatic infections. Symptomatic giardiasis can result in nausea, diarrhoea (usually sudden and explosive), anorexia, an uneasiness in the upper intestine, malaise and occasionally low-grade fever or chills. The acute phase of the infection commonly resolves spontaneously, and organisms generally disappear from the faeces. Some patients (e.g., children) suffer recurring bouts of the disease, which may persist for months or years.
As is the case for Giardia and other pathogens, a single organism of Cryptosporidium can potentially cause infection, although studies have shown that more than one organism is generally required. Individuals infected with Cryptosporidium are more likely to develop symptomatic illness than those infected with Giardia. Symptoms include watery diarrhoea, cramping, nausea, vomiting (particularly in children), low-grade fever, anorexia and dehydration. The duration of infection depends on the condition of the immune system. Immunocompetent individuals usually carry the infection for a maximum of 30 days.
Giardia cysts and Cryptosporidium oocysts can survive in the environment for extended periods of time, depending on the characteristics of the water. They have been shown to withstand a variety of environmental stresses, including freezing and exposure to seawater. (Oo)cysts are routinely found in Canadian source waters. The sudden and rapid influx of these microorganisms into source waters, for which available treatment may not be sufficient or adequate, is likely responsible for the increased risk of infection associated with transmission through drinking water.
Giardia and Cryptosporidium are common causes of waterborne disease outbreaks; Giardia is the most commonly reported intestinal protozoan in Canada, North America and worldwide.
The multi-barrier approach is the best approach to reduce enteric protozoa and other waterborne pathogens in drinking water. Source water assessments should be part of routine vulnerability assessments and/or sanitary surveys. They should include routine monitoring of Giardia and Cryptosporidium as well as identification of potential sources of human and animal faecal contamination in the watershed/aquifer that may impact the quality of the water. A method that allows for the simultaneous detection of these protozoans is available, and has been validated for surface water. Where routine monitoring for Giardia and Cryptosporidium is not feasible (e.g., small supplies), (oo)cyst concentrations can be estimated. Estimates should be based on a source water assessment along with other water quality parameters that can provide information on the risk and/or level of faecal contamination in the source water.
Once the source water quality has been characterized, pathogen removal targets and effective treatment barriers can be established in order to achieve safe levels in the finished drinking water. In general, all water supplies should be disinfected, and an adequate concentration of disinfectant residual should be maintained throughout the distribution system at all times. The combination of physical removal and disinfection barriers is the most effective way to reduce protozoa in drinking water, because of their resistance to commonly used disinfectants such as chlorine. Treatment systems that rely solely on chlorine as the treatment barrier will not be able to effectively inactivate Giardia and Cryptosporidium that may be present in the source water.
Although the absence of E. coli and total coliforms does not necessarily indicate the absence of enteric protozoa, they remain the best available indicators for verifying microbiological drinking water quality. The application and control of a multi-barrier, source-to-tap approach, in combination with monitoring of a variety of indicators (e.g., turbidity, chlorine residual, E. coli) can be used to verify that the water has been adequately treated and is therefore of an acceptable microbiological quality.
Quantitative microbial risk assessment (QMRA) can be used as part of a multi-barrier approach to help provide a better understanding of risk related to a water system. QMRA uses source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the burden of disease associated with exposure to pathogenic microorganisms in a drinking water source. Through this assessment, variations in source water quality and treatment performance can be evaluated for their contribution to the overall risk. Such analysis can be used to assess the adequacy of existing control measures or the requirement for additional treatment barriers or optimization and help establish limits for critical control points.
QMRA makes use of reference protozoa: specific enteric protozoa whose characteristics make them good representatives of all similar pathogenic protozoa. It is assumed that controlling the reference protozoan would ensure control of all other similar protozoa of concern. Cryptosporidium parvum and Giardia lamblia have been selected as the reference protozoa for this risk assessment because of their high prevalence rates, potential to cause widespread disease, resistance to chlorine disinfection and the availability of a dose-response model for each organism.
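To illustrate how these elements fit together, the following minimal sketch (in Python) chains an assumed source water concentration, a treatment log reduction and an exponential dose-response model into a rough annual infection risk estimate. The function, the daily consumption volume and the dose-response parameter r are illustrative assumptions for demonstration only; they are not values prescribed by this document.

```python
import math

def annual_infection_risk(source_conc_per_100L, log_reduction, r,
                          litres_per_day=1.0, days=365):
    # Concentration remaining after treatment, in (oo)cysts per litre
    treated_conc_per_L = (source_conc_per_100L / 100.0) * 10 ** (-log_reduction)
    # Mean number of organisms ingested per day
    daily_dose = treated_conc_per_L * litres_per_day
    # Exponential dose-response model: P(infection) = 1 - exp(-r * dose)
    p_daily = 1 - math.exp(-r * daily_dose)
    # Annual risk, assuming independent daily exposures
    return 1 - (1 - p_daily) ** days

# Illustrative numbers only: 100 oocysts/100 L in the source water, a 3-log
# treatment reduction, and an assumed dose-response parameter r.
print(annual_infection_risk(100, 3.0, r=0.018))
```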
Note: Specific guidance related to the implementation of the drinking water guideline should be obtained from the appropriate drinking water authority in the affected jurisdiction.
Exposure to Giardia and Cryptosporidium should be limited by implementing a "source-to-tap" approach to protect the quality of drinking water. This approach includes assessing the entire drinking water system, from the source water through the treatment and distribution systems to the consumer, in order to identify risks and appropriate measures to mitigate those risks.
Source water assessments should be part of routine vulnerability assessments and/or sanitary surveys. They should include routine monitoring of Giardia and Cryptosporidium as well as identification of potential sources of human and animal faecal contamination in the watershed/aquifer, and potential pathways and/or events (low to high risk) by which protozoa can make their way into the source water and impact water quality. Sources of human faecal matter, such as sewage treatment plant effluents, sewage lagoon discharges and improperly maintained septic systems, have the potential to be significant sources of Giardia and Cryptosporidium. Faecal matter from agricultural animals, wildlife and other animals are also considered an important source of Giardia and Cryptosporidium species capable of causing illness in humans.
It is important to conduct a comprehensive assessment of groundwater sources to classify them as either secure or as groundwater under the direct influence of surface water (GUDI). These assessments should include, at a minimum: a hydrogeological assessment; an evaluation of well integrity; and a sanitary survey of activities and physical features in the area. Secure groundwater sources, if properly classified, should not have protozoa present. However, even secure groundwater sources will have a degree of vulnerability, and should be periodically reassessed.
Assessments of water quality need to consider the "worst-case" scenario for that source water. For example, there may be a short period of poor source water quality following a storm. This short-term degradation in water quality may in fact embody most of the risk in a drinking water system. Collecting and analysing source water samples for Giardia and Cryptosporidium can provide important information for determining the level of treatment, and mitigation (risk management) measures, that should be in place to reduce the concentration of (oo)cysts to an acceptable level. Where source water sampling and analysis for Giardia and Cryptosporidium are not feasible (e.g., small supplies), (oo)cyst concentrations can be estimated. Estimates should take into account information obtained from the source water assessment along with other water quality parameters that can provide information on the risk and/or level of faecal contamination in the source water. Because these estimates will have a high level of uncertainty, additional factors of safety during engineering and design or upgrade of the treatment plant or a greater log reduction than calculated using a quantitative microbial risk assessment (QMRA) approach should be applied in order to ensure production of drinking water of an acceptable microbiological quality.
The information obtained from source water assessments is a key component of carrying out site-specific risk assessments. This information should be used along with treatment and distribution system information to help assess risks from source to tap. This document suggests the use of QMRA as a tool that can help provide a better understanding of the water system by evaluating the impacts of variations in source water quality and treatment process performance on the overall risk, including the potential impact of hazardous events, such as storms, contamination events or the failure of a treatment barrier. The resulting analysis can be used to assess the adequacy of existing control measures, to determine the need for additional treatment barriers or for optimization and to help establish limits for critical control points.
Where treatment is required, a minimum 3-log removal and/or inactivation of Giardia and Cryptosporidium (oo)cysts should be achieved. In many surface water sources, a greater log reduction may be necessary.
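For reference, an n-log reduction corresponds to a factor-of-10^n decrease in concentration; the short sketch below (Python, illustrative numbers only) shows the arithmetic behind the 3-log treatment goal.

```python
import math

def log_reduction(c_in, c_out):
    # Log10 reduction achieved between influent and effluent concentrations
    return math.log10(c_in / c_out)

def fraction_removed(logs):
    # Fraction of organisms removed and/or inactivated for a given log reduction
    return 1 - 10 ** (-logs)

print(fraction_removed(3.0))    # a 3-log reduction removes/inactivates 99.9%
print(200 * 10 ** (-3.0))       # e.g., 200 cysts/100 L reduced by 3 logs leaves 0.2 cysts/100 L
print(log_reduction(200, 0.2))  # the corresponding log reduction is 3.0
```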
Reductions can be achieved through physical removal processes, such as filtration, and/or by inactivation processes, such as ultraviolet light disinfection. Generally, minimum treatment of supplies derived from surface water sources or groundwater under the direct influence of surface waters should include adequate filtration (or equivalent technologies) and disinfection. The appropriate type and level of treatment should take into account the potential fluctuations in water quality, including short-term water quality degradation, and variability in treatment performance. Pilot testing or other optimization processes may be useful for determining treatment variability. In systems with a distribution system, a disinfectant residual should be maintained at all times.
As part of the multi-barrier approach, a variety of indicators (e.g., turbidity, chlorine residual, E. coli) should be routinely monitored in order to verify that the water has been adequately treated and, therefore, meets the health-based treatment goal. These indicators can also be used to assess the distribution system and to verify that the microbiological quality of the water is being maintained through the distribution system to the consumer's tap.
Protozoa are a diverse group of eukaryotic, typically unicellular, microorganisms. The majority of protozoa are free-living organisms that can reside in fresh water and pose no risk to human health. However, some protozoa are pathogenic to humans. These protozoa fall into two functional groups: enteric parasites and free-living protozoa. Human infections caused by free-living protozoa are generally the result of contact during recreational bathing (or domestic uses of water other than drinking); as such, this group of protozoa is addressed in the Guidelines for Canadian Recreational Water Quality (Health Canada, 2009). Enteric protozoa, on the other hand, have been associated with several drinking water-related outbreaks, and drinking water serves as an important route of transmission for these organisms; as such, a discussion of enteric protozoa is presented here.
Enteric protozoa are common parasites in the gut of humans and other mammals. They, like enteric bacteria and viruses, can be found in water following direct or indirect contamination by the faeces of humans or other animals. These microorganisms can be transmitted via drinking water and have been associated with several waterborne outbreaks in North America and elsewhere (Schuster et al., 2005; Karanis et al., 2007). The ability of this group of microorganisms to produce (oo)cysts that are extremely resistant to environmental stresses and conventional drinking water disinfection has facilitated their ability to spread and cause illness.
The enteric protozoa that are most often associated with waterborne disease in Canada are Cryptosporidium and Giardia. These protozoa are commonly found in source waters, are highly pathogenic, can survive for long periods of time in the environment and are highly resistant to chemical disinfection. Thus, they are the focus of the following discussion. A brief description of other enteric protozoa of human health concern (i.e., Toxoplasma gondii, Cyclospora cayetanensis and Entamoeba histolytica) is provided in Appendix C.
Giardia is a flagellated protozoan parasite (Phylum Protozoa, Subphylum Sarcomastigophora, Superclass Mastigophora, Class Zoomastigophora, Order Diplomonadida, Family Hexamitidae). It was first identified in human stool by Antonie van Leeuwenhoek in 1681 (Boreham et al., 1990). However, it was not recognized as a human pathogen until the 1960s, after community outbreaks and its identification in travellers (Craun, 1986; Farthing, 1992).
Giardia inhabits the small intestines of humans and other animals. The trophozoite, or feeding stage, lives mainly in the duodenum but is often found in the jejunum and ileum of the small intestine. Trophozoites (9-21 µm long, 5-15 µm wide and 2-4 µm thick) have a pear-shaped body with a broadly rounded anterior end, two nuclei, two slender median rods, eight flagella in four pairs, a pair of darkly staining median bodies and a large ventral sucking disc (cytostome). Trophozoites are normally attached to the surface of the intestinal villi, where they are believed to feed primarily upon mucosal secretions. After detachment, the binucleate trophozoites form cysts (encyst) and divide within the original cyst, so that four nuclei become visible. Cysts are ovoid, 8-14 µm long by 7-10 µm wide, with two or four nuclei and visible remnants of organelles. Environmentally stable cysts are passed out in the faeces, often in large numbers. A complete life cycle description can be found in a review paper by Adam (2001).
The taxonomy of the genus Giardia is rapidly changing as emerging data on the isolation and identification of new species and genotypes, strain phylogeny and host specificity become available. The current taxonomy of the genus Giardia is based on the species definition proposed by Filice (1952), who defined three species: G. duodenalis (syn. G. intestinalis, G. lamblia), G. muris and G. agilis, based on the shape of the median body, an organelle composed of microtubules that is most easily observed in the trophozoite. Other species have subsequently been described on the basis of cyst morphology and molecular analysis. Currently, six Giardia species are recognized (Table 1) (Thompson, 2004; Thompson and Monis, 2004). These six species have been reported from mammals, birds, rodents and amphibians and are not easily distinguished. Their host preferences have been widely debated, except for G. agilis, which is morphologically different, has been reported only from amphibians and is not regarded as infective to humans (Adam, 1991).
Table 1: Giardia species (and assemblages) and their major hosts
| Species (assemblage) | Major host(s) |
| G. lamblia (A) | Humans, livestock, other mammals |
| G. lamblia (B) | Humans |
| G. lamblia (C) | Dogs |
| G. lamblia (D) | Dogs |
| G. lamblia (E) | Cattle, other hoofed livestock |
| G. lamblia (F) | Cats |
| G. lamblia (G) | Rats |
| G. microti | Muskrats, voles |
The name G. lamblia is commonly applied to isolates from humans, although this species is capable of infecting a wide range of mammals. Molecular characterization of this species has demonstrated the existence of genetically distinct assemblages: assemblages A and B infect humans and other mammals, whereas the remaining assemblages C, D, E, F and G have not yet been isolated from humans and appear to have restricted host ranges (and likely represent different species or groupings) (Table 1) (Adam, 2001; Thompson, 2004; Thompson and Monis, 2004; Xiao et al., 2004; Smith et al., 2007). In addition to genetic dissimilarities, these variants also exhibit phenotypic differences, including differential growth rates and drug sensitivities (Homan and Mank, 2001; Read et al., 2002). These genetic differences have been exploited as a means of distinguishing human-infective Giardia from other strains or species (Amar et al., 2002; Cacciò et al., 2002; Read et al., 2004); however, the applicability of these methods to analysis of Giardia within water has been limited (see Section 6.6). Thus, at present, it is necessary to consider that any Giardia cysts found in water are potentially infectious to humans.
Cryptosporidium is a protozoan parasite (Phylum Apicomplexa, Class Sporozoasida, Subclass Coccidiasina, Order Eucoccidiorida, Suborder Eimeriorina, Family Cryptosporidiidae) that was first recognized as a potential human pathogen in 1976 in a previously healthy 3-year-old child (Nime et al., 1976). A second case of cryptosporidiosis occurred two months later in an individual who was immunosuppressed as a result of drug therapy (Meisel et al., 1976). The disease became best known in immunosuppressed individuals exhibiting the symptoms now referred to as acquired immunodeficiency syndrome, or AIDS (Hunter and Nichols, 2002).
The recognition of Cryptosporidium as a human pathogen led to increased research into the life cycle of the parasite and an investigation of the possible routes of transmission. Cryptosporidium has a multi-stage life cycle, typical of an enteric coccidian. The entire life cycle takes place in a single host and involves six major stages: 1) excystation, where sporozoites are released from an excysting oocyst; 2) schizogony (syn. merogony), where asexual reproduction takes place; 3) gametogony, the stage at which gametes are formed; 4) fertilization of the macrogametocyte by a microgamete to form a zygote; 5) oocyst wall formation; and 6) sporogony, where sporozoites form within the oocyst (Current, 1986). A complete life cycle description and diagram can be found in a review paper by Smith and Rose (1990). Syzygy, a sexual reproduction process that involves association of the pre-gametes end to end or laterally prior to the formation of gametes, was recently described in two species of Cryptosporidium, C. parvum and C. andersoni, providing new information regarding Cryptosporidium's biology (life cycle) and transmission (Hijjawi et al., 2002; Rosales et al., 2005).
As a waterborne pathogen, the most important stage in Cryptosporidium's life cycle is the round, thick-walled, environmentally stable oocyst, 4-6 µm in diameter. There is sometimes a visible external suture line, and the nuclei of sporozoites can be stained with fluorogenic dyes such as 4',6-diamidino-2-phenylindole (DAPI). Upon ingestion by humans, the parasite completes its life cycle in the digestive tract. Ingestion initiates excystation of the oocyst and releases four sporozoites, which adhere to and invade the enterocytes of the gastrointestinal tract (Spano et al., 1998a; Pollok et al., 2003). The resulting parasitic vacuole contains a feeding organelle along with the parasite, which is protected by an outer membrane. The outer membrane is derived from the host cell (intracellular). The sporozoite undergoes asexual reproduction (schizogony), releasing merozoites that spread the infection to neighbouring cells. Sexual multiplication (gametogony) then takes place, producing either microgametes ("male") or macrogametes ("female"). Microgametes are then released to fertilize macrogametes and form zygotes. A small proportion (20%) of zygotes fail to develop a cell wall and are termed "thin-walled" oocysts. These forms rupture after the development of the sporozoites, but prior to faecal passage, thus maintaining the infection within the host. The majority of the zygotes develop a thick, environmentally resistant cell wall and four sporozoites to become mature oocysts, which are then passed in the faeces.
Our understanding of the taxonomy of the genus Cryptosporidium is continually being updated. Cryptosporidium was first described by Tyzzer (1907), when he isolated the organism, which he named Cryptosporidium muris, from the gastric glands of mice. Tyzzer (1912) found a second isolate, which he named C. parvum, in the intestine of the same species of mice. This isolate was considered to be structurally and developmentally distinct by Upton and Current (1985). Although numerous species names have been proposed based on the identity of the host, most isolates of Cryptosporidium from mammals, including humans, are similar to C. parvum as described by Tyzzer (1907, 1912). At present, 20 valid species have been recognized (Table 2) (Egyed et al., 2003; Thompson and Monis, 2004; Xiao et al., 2004; Fayer et al., 2008; Jirku et al., 2008; Power and Ryan, 2008; Ryan et al., 2008).
Table 2: Cryptosporidium species and their major hosts
| Species (genotype) | Major host |
| C. fayeri | Red kangaroos |
| C. galli | Finches, chickens |
| C. hominis (genotype H, I or 1) | Humans, monkeys |
| C. macropodum | Eastern grey kangaroos |
| C. meleagridis | Turkeys, humans |
| C. parvum (genotype C, II or 2) | Cattle, other ruminants, humans |
With the advent of molecular techniques, several genotypes of Cryptosporidium have been proposed among various animal groups, including rodents, marsupials, reptiles, wild birds and primates, and research suggests that these genotypes vary with respect to their development, drug sensitivity and disease presentation (Chalmers et al., 2002; Xiao and Lal, 2002; Thompson and Monis, 2004; Xiao et al., 2004). To date, over 40 genotypes have been identified (Xiao et al., 2004; Feng et al., 2007; Fayer, 2008; Fayer and Xiao, 2008). The molecular analysis of C. parvum human and bovine isolates, linked to human cryptosporidiosis outbreaks, indicates the existence of two predominant, genetically distinct genotypes in humans (Morgan et al., 1997; Peng et al., 1997; Spano et al., 1998b; Sulaiman et al., 1998; Widmer et al., 1998; Awad-El-Kariem, 1999; Ong et al., 1999; Cacciò et al., 2000; McLauchlin et al., 2000; Xiao et al., 2001). Genotype 1 (syn. genotype I, genotype H and C. hominis) isolates are limited, for the most part, to humans, whereas genotype 2 (syn. genotype II and genotype C) isolates are zoonotic and have been reported in calves and other ruminants/ungulates, mice and humans. Genotype 1 was subsequently recognized as a new species, C. hominis (Morgan-Ryan et al., 2002). Further studies have identified additional genotypes in humans. Pieniazek et al. (1999) identified two novel Cryptosporidium genotypes, similar to a dog and a cat genotype, in persons infected with human immunodeficiency virus (HIV). Two new Cryptosporidium genotypes have been identified in humans, one similar to a cervine (deer) isolate (Ong et al., 2002) and a genotype not previously identified in humans or other animals (Wong and Ong, 2006). These findings have important implications for communities whose source water may be contaminated by faeces from wildlife. The epidemiological significance of these genotypes is still unclear, but findings suggest that certain genotypes are adapted to humans and transmitted (directly or indirectly) from person to person. Numerous other Cryptosporidium genotypes, for which a strain designation has not been made, have also been identified (Feng et al., 2007; Smith et al., 2007; Xiao and Fayer, 2008; Fayer, 2008).
Human and other animal faeces, especially cattle faeces, are major sources of Giardia. Giardiasis has been shown to be endemic in humans and in over 40 other species of animals, with prevalence rates ranging from 1% to 100% (Olson et al., 2004; Pond et al., 2004; Thompson, 2004; Thompson and Monis, 2004). Table 3 summarizes the prevalence of Giardia among humans and selected livestock animals and highlights the relatively high levels of giardiasis in cattle. Giardia cysts are excreted in large numbers in the faeces of infected humans and other animals (both symptomatic and asymptomatic). Infected cattle, for example, have been shown to excrete up to one million (10^6) cysts per gram of faeces (O'Handley et al., 1999; Ralston et al., 2003; O'Handley and Olson, 2006). Cysts are easily disseminated in the environment and are transmissible via the faecal-oral route. Beaver, dog, muskrat and horse faeces are also sources of Giardia, including human-source G. lamblia (Davies and Hibler, 1979; Hewlett et al., 1982; Erlandsen and Bemrick, 1988; Erlandsen et al., 1988; Traub et al., 2004; Eligio-García et al., 2005; Traub et al., 2005). Giardia can also be found in bear, bird, cat and other animal faeces, but it is unclear whether it is pathogenic to humans (refer to Section 5.1.3).
Giardia cysts are commonly found in sewage and surface waters and occasionally in drinking water. In a cross-Canada survey of 72 municipalities performed between 1991 and 1995, Wallis et al. (1996) found that 72.6%, 21% and 18.2% of raw sewage, raw water and treated water samples, respectively, contained Giardia cysts. Table 4 highlights a selection of studies that have investigated the occurrence of Giardia in surface waters in Canada. Typically, Giardia concentrations in surface waters ranged from 2 to 200 cysts/100 L. Concentrations as high as 8700 cysts/100 L were reported and were associated with record spring runoff, highlighting the importance of event-based sampling (see Section 7.0) (Gammie et al., 2000). The typical range reported above is at the lower end of that described in a recent international review (Dechesne and Soyeux, 2007). Dechesne and Soyeux (2007) found that Giardia concentrations in source waters across North America and Europe ranged from 0.02 to 100 cysts/L, with the highest levels reported in the Netherlands. Source water quality monitoring data were also gathered for nine European (France, Germany, the Netherlands, Sweden and the United Kingdom) water sources and one Australian source. Overall, Giardia was frequently detected at relatively low concentrations, and levels ranged from 0.01 to 40 cysts/L. An earlier survey by Medema et al. (2003) revealed that concentrations of cysts in raw and treated domestic wastewater (i.e., secondary effluent) typically ranged from 5000 to 50 000 cysts/L and from 50 to 500 cysts/L, respectively.
Table 4: Concentrations of Giardia cysts in Canadian surface waters
| Province | Unit of measure | Giardia concentration^b (cysts/100 L) | Reference |
| Alberta | Single sample | 494 | LeChevallier et al., 1991a |
| Alberta | Annual geometric mean | 8-193 | Gammie et al., 2000 |
| Alberta | Maximum | 2500^d | Gammie et al., 2000 |
| Alberta | Annual geometric mean | 98 | EPCOR, 2005 |
| British Columbia | Geometric mean | 60.4 | Ong et al., 1996 |
| Ontario | Median | 71 | Van Dyke et al., 2006 |
| Ontario^e | Range | < 2.5-20 | Douglas et al., 2006 |
| Ontario^e | Arithmetic mean | 3.6 and 3.9 (at two intakes) | Douglas et al., 2006 |
| Quebec^e | Geometric mean | 1376 | Payment and Franco, 1993 |
| Quebec | Geometric mean | 200 | Payment et al., 2000 |
b It is important to consider that the sampling and analysis methods employed in these studies varied, and, as such, it may not be appropriate to compare cyst concentrations. It is also important to consider that the viability and infectivity of cysts were rarely assessed; as such, little information is available regarding the potential risk to human health associated with the presence of Giardia in these samples.
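Several entries in Table 4 (and in Table 6 below) are reported as geometric means. As a minimal illustration of how such a summary statistic is typically computed from individual monitoring results, the Python sketch below uses hypothetical concentrations; how non-detects are handled is a site-specific choice that is not addressed here.

```python
import math

def geometric_mean(concentrations):
    # Geometric mean of monitoring results; all values must be greater than zero
    return math.exp(sum(math.log(c) for c in concentrations) / len(concentrations))

# Hypothetical monthly results, in cysts/100 L (illustrative numbers only)
samples = [12, 8, 150, 30, 5, 60]
print(round(geometric_mean(samples), 1))  # far less sensitive to the single high value than the arithmetic mean
```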
Treated water in Canada is rarely tested for the presence of Giardia. When testing has been conducted, cysts are typically not present or are present in very low numbers (Payment and Franco, 1993; Ong et al., 1996; Wallis et al., 1996, 1998; EPCOR, 2005; Douglas et al., 2006), with some exceptions. In 1997, a heavy spring runoff event in Edmonton, Alberta, resulted in the presence of 34 cysts/100 L in treated water (Gammie et al., 2000). Cysts have also been detected in treated water derived from unfiltered surface water supplies (Payment and Franco, 1993; Wallis et al., 1996).
Giardia cysts can survive in the environment for extended periods of time. Survival in water can range from weeks to months (or possibly longer), depending on a number of factors, including the characteristics specific to the strain and of the water, such as temperature. The effect of temperature on survival rates of Giardia has been well studied. In general, as the temperature increases, the survival time decreases. For example, Bingham et al. (1979) observed that Giardia cysts can survive up to 77 days in tap water at 8°C, compared with 4 days at 37°C. DeRegnier et al. (1989) reported a similar effect in river and lake water. This temperature effect is, in part, responsible for peak Giardia prevalences reported in winter months (Isaac-Renton et al., 1996; Ong et al., 1996). Exposure to ultraviolet (UV) light will also shorten the survival time of Giardia. A detailed discussion of the effects of UV light on Giardia is provided in Section 7.1.4.
It is commonly assumed that the viability of Giardia cysts found in water is high, but monitoring experience suggests otherwise. Cysts found in surface waters are often dead, as shown by propidium iodide (PI) dye exclusion (Wallis et al., 1995). Observations by LeChevallier et al. (1991b) also suggest that most of the cysts present in water are non-viable; 40 of 46 cysts isolated from drinking water exhibited "non-viable-type" morphologies (i.e., distorted or shrunken cytoplasm). Studies have frequently revealed the presence of empty cysts ("ghosts"), particularly in sewage.
Person-to-person transmission is by far the most common route of transmission of Giardia (Pond et al., 2004; Thompson, 2004). Persons become infected via the faecal-oral route, either directly (i.e., contact with faeces from a contaminated person, such as children in daycare facilities) or indirectly (i.e., ingestion of contaminated drinking water, recreational water and, to a lesser extent, food). Animals may also play an important role in the (zoonotic) transmission of Giardia, although it is not clear to what extent. Cattle have been found to harbour human-infective (assemblage A) Giardia, as have dogs and cats. Assemblage A Giardia genotypes have also been detected in wildlife, including beavers and deer.
Although there is some evidence to support the zoonotic transmission of Giardia, most of this evidence is circumstantial or compromised by inadequate controls. Thus, it is not clear how frequently zoonotic transmission occurs or under what circumstances. Overall, these data suggest that, in most cases, animals are not the original source of human-infective Giardia, but may amplify zoonotic genotypes present in other sources (e.g., contaminated water). In cattle, for example, the livestock Giardia genotype (assemblage E) predominates; however, cattle are susceptible to infection with human-infective (zoonotic) genotypes of Giardia. It is likely that cattle acquire zoonotic genotypes of Giardia from their handlers and/or from contaminated water sources. Given that calves infected with Giardia commonly shed between 10^5 and 10^6 cysts per gram of faeces, they could play an important role in the transmission of Giardia.
The role that wildlife plays in the zoonotic transmission of Giardia is also unclear. Although wildlife, including beavers, can become infected with human-source G. lamblia (Davies and Hibler, 1979; Hewlett et al., 1982; Erlandsen and Bemrick, 1988; Erlandsen et al., 1988; Traub et al., 2004, 2005; Eligio-García et al., 2005) and have been associated with waterborne outbreaks of giardiasis (Kirner et al., 1978; Lopez et al., 1980; Lippy, 1981; Isaac-Renton et al., 1993), the epidemiological and molecular data do not support zoonotic transmission via wildlife as a significant risk for human infections (Hoque et al., 2003; Stuart et al., 2003; Berrilli et al., 2004; Thompson, 2004; Hunter and Thompson, 2005; Ryan et al., 2005a). The data do, however, suggest that wildlife acquire human-infective genotypes of Giardia from sources contaminated by human sewage. As population pressures increase and as more human-related activity occurs in catchment areas, the potential for faecal contamination of source waters becomes greater, and the possibility of contamination with human sewage must always be considered. Erlandsen and Bemrick (1988) concluded that Giardia cysts in water may be derived from multiple sources, and that epidemiological studies that focus on beavers may be missing important sources of cyst contamination. Some waterborne outbreaks have been traced back to human sewage contamination (Wallis et al., 1998). Ongerth et al. (1995) showed that there is a statistically significant relationship between increased human use of water for domestic and recreational purposes and the prevalence of Giardia in animals and surface water. It is known that beaver and muskrat can be infected with human-source Giardia (Erlandsen et al., 1988), and these animals are frequently exposed to raw or partially treated sewage in Canada. The application of genotyping procedures has provided further proof of this linkage. Thus, it is likely that wildlife and other animals can act as a reservoir of human-infective Giardia from sewage-contaminated water and, in turn, amplify concentrations of Giardia cysts in water. If infected animals live in close proximity to drinking water treatment plant intakes, then they could play an important role in the waterborne transmission of Giardia. Thus, watershed management, to control both sewage inputs and the populations of aquatic mammals in the vicinity of water intakes, is important to disease prevention.
As is the case for livestock and wildlife animals, it is unclear what role domestic animals play in the zoonotic transmission of Giardia. Although dogs and cats are susceptible to infection with zoonotic genotypes of Giardia, few studies have provided direct evidence of transmission between them and humans (Eligio-García et al., 2005; Shukla et al., 2006; Thompson et al., 2008).
Giardia is the most commonly reported intestinal protozoan in North America and worldwide (Farthing, 1989; Adam, 1991). The World Health Organization (WHO) estimates its worldwide incidence at 200 million cases per year. In Canada, just over 4000 confirmed cases of giardiasis were reported in 2004. This represents a significant decline from the 9543 cases that were reported in 1989. Incidence rates have similarly declined over this period (from 34.98 to 13.08 cases per 100 000 persons) (PHAC, 2007).
Giardia is a common cause of waterborne infectious disease outbreaks in Canada and elsewhere (Hrudey and Hrudey, 2004). Between 1974 and 2001, Giardia lamblia was the most commonly reported causative agent associated with infectious disease outbreaks related to drinking water in Canada (Schuster et al., 2005). Giardia was linked to 51 of the 138 outbreaks for which causative agents were identified. The majority (38/51; 75%) of these Giardia outbreaks were associated with public drinking water systems. Inadequate treatment appears to have been a major contributing factor (Schuster et al., 2005).
In the United States, outbreaks have been reported from 24 states (Jakubowski, 1994), especially Colorado and New England. During the period from 1965 to 1992, 115 outbreaks were reported that resulted in 26 530 known cases of giardiasis in the United States (Moore et al., 1993; Jakubowski, 1994). In an earlier study, Craun (1979) identified reliance on surface water, minimal treatment (usually only chlorination) and inadequate treatment facilities as common causes of waterborne giardiasis. Small water treatment systems that used otherwise good-quality surface water of low turbidity seemed to be the most frequently affected. A useful review of some select U.S. outbreaks has been compiled by Lin (1985), who concluded that these and other outbreaks had been caused by lack of filtration, improper filter operations, inadequate chlorination, cross-connections to sewers and drinking contaminated surface waters. Giardia was the most frequently identified etiological agent associated with waterborne outbreaks in the United States between 1971 and 1990, accounting for 18% of outbreaks (Craun et al., 2006). This trend continued into 1991 and through to 2002, with Giardia linked to 16% of reported waterborne outbreaks in the United States. Between 2001 and 2002, the U.S. Centers for Disease Control and Prevention (CDC) reported 17 waterborne disease outbreaks associated with drinking water; three of these outbreaks were linked to Giardia (CDC, 2004). In a recent worldwide review of waterborne protozoan outbreaks, G. lamblia accounted for 40.6% of the 325 outbreaks reported between 1954 and 2002 (Karanis et al., 2007).
Humans and animals, especially cows, are important reservoirs for Cryptosporidium. Human cryptosporidiosis has been reported in more than 90 countries across six continents (Fayer et al., 2000; Dillingham et al., 2002). Reported prevalence rates of human cryptosporidiosis range from 1% to 20% (Table 5), with higher rates reported in developing countries (Caprioli et al., 1989; Zu et al., 1992; Mølbak et al., 1993; Nimri and Batchoun, 1994; Dillingham et al., 2002; Cacciò and Pozio, 2006). Livestock, especially cattle, are a significant source of C. parvum (Table 5). In a survey of Canadian farm animals, Cryptosporidium was detected in faecal samples from cattle (20%), sheep (24%), hogs (11%) and horses (17%) (Olson et al., 1997). Oocysts were more prevalent in calves than in adult animals; conversely, they were more prevalent in mature pigs and horses than in young animals. Infected calves can excrete up to 10^7 oocysts per gram of faeces (Smith and Rose, 1990) and represent an important source of Cryptosporidium in surface waters (refer to Section 5.2.2). Wild ungulates (hoofed animals) and rodents are not a significant source of human-infectious Cryptosporidium (Roach et al., 1993; Ong et al., 1996).
Oocysts are easily disseminated in the environment and are transmissible via the faecal-oral route. Person-to-person transmission is one of the most common routes of transmission of Cryptosporidium. Contaminated drinking water, recreational water and food are also important mechanisms for transmission of Cryptosporidium. Contact with animals, especially livestock, also appears to be a major pathway for transmission. A more detailed discussion of zoonotic transmission is provided in Section 5.2.3.
Cryptosporidium oocysts are commonly found in sewage and surface waters and occasionally in treated water. In a cross-Canada survey of 72 municipalities performed between 1991 and 1995, Wallis et al. (1996) found that 6.1%, 4.5% and 3.5% of raw sewage, raw water and treated water samples, respectively, contained Cryptosporidium oocysts. Table 6 highlights a selection of studies that have investigated the occurrence of Cryptosporidium in surface waters in Canada. Typically, Cryptosporidium concentrations in surface waters ranged from 1 to 100 oocysts/100 L. Concentrations as high as 10 300 oocysts/100 L were reported and were associated with a record spring runoff, highlighting the importance of event-based sampling (see Section 7.0) (Gammie et al., 2000).
Table 6: Concentrations of Cryptosporidium oocysts in Canadian surface waters
| Province | Unit of measure | Cryptosporidium concentration^b (oocysts/100 L) | Reference |
| Alberta | Single sample | 34 | LeChevallier et al., 1991a |
| Alberta | Annual geometric mean | 6-83 | Gammie et al., 2000 |
| Alberta | Maximum | 10 300^d | Gammie et al., 2000 |
| Alberta | Annual geometric mean | 9 | EPCOR, 2005 |
| British Columbia | Geometric mean | 3.5 | Ong et al., 1996 |
| Ontario | Median | 15 | Van Dyke et al., 2006 |
| Ontario^e | Range | < 2.5-95 | Douglas et al., 2006 |
| Ontario^e | Arithmetic mean | 3.4 and 5.7 (at two intakes) | Douglas et al., 2006 |
| Quebec^e | Geometric mean | 742 | Payment and Franco, 1993 |
| Quebec | Geometric mean | 14 | Payment et al., 2000 |
b It is important to consider that the sampling and analysis methods employed in these studies varied, and, as such, it may not be appropriate to compare oocyst concentrations. It is also important to consider that the viability and infectivity of oocysts were rarely assessed; as such, little information is available regarding the potential risk to human health associated with the presence of Cryptosporidium in these samples.
A recent international review of source water quality data demonstrated that concentrations of Cryptosporidium in source waters across North America and Europe vary greatly (Dechesne and Soyeux, 2007). Cryptosporidium concentrations ranged from 0.006 to 250 oocysts/L. As part of this initiative, source water quality monitoring data were gathered for nine European (France, Germany, the Netherlands, Sweden and the United Kingdom) water sources and one Australian source. Overall, Cryptosporidium was frequently detected at relatively low concentrations, and levels ranged from 0.05 to 4.6 oocysts/L. In an earlier survey, Medema et al. (2003) reported concentrations of oocysts in raw and treated domestic wastewater (i.e., secondary effluent) ranging from 1000 to 10 000 oocysts/L and from 10 to 1000 oocysts/L, respectively.
Little is known about the occurrence of Cryptosporidium in groundwaters in Canada. Studies in the U.S. and elsewhere have reported the presence of oocysts in groundwaters, although at low frequencies and low concentrations (Hancock et al., 1998; Moulton-Hancock et al., 2000; Gaut et al., 2008).
The presence of Cryptosporidium in treated water in Canada is rarely assessed. When testing has been conducted, oocysts are typically not present or are present in very low numbers (Payment and Franco, 1993; Ong et al., 1996; Wallis et al., 1996; EPCOR, 2005; Douglas et al., 2006), with some exceptions (Gammie et al., 2000). Oocysts have been detected in treated water derived from unfiltered surface water supplies (Wallis et al., 1996) and after extreme contamination events. For example, in 1997, a heavy spring runoff event in Edmonton, Alberta, resulted in the presence of 80 oocysts/100 L in treated water (Gammie et al., 2000).
Cryptosporidium oocysts have been shown to survive in cold waters (4°C) in the laboratory for up to 18 months (AWWA, 1988). Robertson et al. (1992) reported that C. parvum oocysts could withstand a variety of environmental stresses, including freezing (viability greatly reduced) and exposure to seawater.
Although it is commonly assumed that the majority of oocysts in water are viable, monitoring experience suggests otherwise. Smith et al. (1993) found that oocyst viability in surface waters is often very low. A more recent study by LeChevallier et al. (2003) reported that 37% of oocysts detected in natural waters were infectious. It should, however, be emphasized that although low concentrations of viable oocysts are routinely found in raw water, they may not represent an immediate public health risk; rather, it is the sudden and rapid influx of parasites into source waters that is likely responsible for the increased risk of infection associated with transmission through drinking water. Environmental events such as flooding or high precipitation can lead to a rapid rise in oocyst concentration within a defined area of a watershed.
Low oocyst viability has also been reported in filtered water. A survey by LeChevallier et al. (1991b) found that, in filtered waters, 21 of 23 oocysts had "non-viable-type" morphology (i.e., absence of sporozoites and distorted or shrunken cytoplasm).
Direct contact with livestock and indirect contact through faecally contaminated waters are major pathways for transmission of Cryptosporidium (Fayer et al., 2000; Robertson et al., 2002; Stantic-Pavlinic et al., 2003; Roy et al., 2004; Hunter and Thompson, 2005). Cattle are a significant source of C. parvum in surface waters. For example, a weekly examination of creek samples upstream and downstream of a cattle ranch in the B.C. interior during a 10-month period revealed that the downstream location had significantly higher levels of Cryptosporidium oocysts (geometric mean 13.3 oocysts/100 L, range 1.4-300 oocysts/100 L) than the upstream location (geometric mean 5.6/100 L, range 0.5-34.4 oocysts/100 L) (Ong et al., 1996). A pronounced spike was observed in downstream samples following calving in late February. During a confirmed waterborne outbreak of cryptosporidiosis in British Columbia, oocysts were detected in 70% of the cattle faecal specimens collected in the watershed close to the reservoir intake (Ong et al., 1997).
Waterfowl can also act as a source of Cryptosporidium. Graczyk et al. (1998) demonstrated that Cryptosporidium oocysts retain infectivity in mice following passage through ducks. However, histological examination of the avian respiratory and digestive systems at 7 days post-inoculation revealed that the protozoa were unable to infect birds. In an earlier study (Graczyk et al., 1996), the authors found that faeces from migratory Canada geese collected from seven of nine sites on Chesapeake Bay contained Cryptosporidium oocysts. Oocysts from three of the sites were infectious to mice. Based on these studies, it appears that waterfowl can pick up infectious Cryptosporidium oocysts from their habitat and can carry and deposit them in the environment, including drinking water supplies.
Cryptosporidium is one of the most commonly reported enteric pathogens in North America and worldwide. In Canada, over 550 confirmed cases of cryptosporidiosis were reported in 2004; a similar number of cases was reported in 2000 (i.e., 623 cases). Incidence rates increased over this period from 1.85 (2000) to 2.67 (2004) cases per 100 000 persons (PHAC, 2007).
Cryptosporidium parvum and C. hominis are the major species associated with human cryptosporidiosis, although C. hominis appears to be more prevalent in North and South America, Australia and Africa, whereas C. parvum is responsible for more infections in Europe (McLauchlin et al., 2000; Guyot et al., 2001; Lowery et al., 2001b; Yagita et al., 2001; Ryan et al., 2003; Learmonth et al., 2004).
Waterborne outbreaks of cryptosporidiosis have been reported in many countries, including Canada (Fayer, 2004; Joachim, 2004; Smith et al., 2006). Between 1974 and 2001, Cryptosporidium was the third most reported causative agent associated with infectious disease outbreaks related to drinking water in Canada, representing 12 of the 138 outbreaks for which causative agents were identified (Schuster et al., 2005). The majority (11/12; 92%) of these Cryptosporidium outbreaks were associated with public drinking water systems. Inadequate treatment appears to be a major contributing factor (Schuster et al., 2005).
In the United States between 1984 and 2000, 10 outbreaks were associated with the presence of Cryptosporidium in drinking water; 421 000 cases of illness were reported, most of which (403 000) were associated with the Milwaukee outbreak in 1993 (U.S. EPA, 2006a). Between 2001 and 2002, CDC reported 17 waterborne disease outbreaks associated with drinking water; only one of these outbreaks was linked to Cryptosporidium (CDC, 2004). Cryptosporidium was the second most frequently identified infectious agent associated with waterborne outbreaks in the United States between 1991 and 2002, accounting for 7% of outbreaks (Craun et al., 2006). Nineteen outbreaks have been reported in the United Kingdom (Craun et al., 1998). In a recent worldwide review of waterborne protozoan outbreaks, Cryptosporidium accounted for 50.6% of the 325 outbreaks reported between 1954 and 2002 (Karanis et al., 2007). Attack rates were typically high, ranging from 26% to 40%, and many thousands of people were affected. In addition, there have been several outbreaks associated with swimming pools, wave pools and lakes.
The indicator organisms routinely monitored in Canada as part of the multi-barrier "source-to-tap" approach for assessing water quality are E. coli and total coliforms. E. coli is an effective indicator of recent faecal contamination and the potential presence of enteric pathogens, including enteric protozoa. However, its absence does not necessarily indicate that enteric protozoa are also absent. Total coliforms are not faecal specific and therefore cannot be used to indicate faecal contamination (or the potential presence of enteric pathogens). Instead, total coliforms are used to indicate general water quality issues. Further information on the role of E. coli and total coliforms in water quality management can be found in the Guideline Technical Documents on E. coli and total coliforms (Health Canada, 2006a,b).
Compared with protozoans, E. coli and members of the coliform group do not survive as long in the environment (Edberg et al., 2000) and are more susceptible to many of the disinfectants commonly used in the drinking water industry. As a result, although the presence of E. coli indicates the potential presence of enteric protozoa, the absence of E. coli does not necessarily indicate that enteric protozoa are also absent. As evidence of this, Giardia and Cryptosporidium (oo)cysts have been detected in filtered, treated drinking water meeting existing regulatory standards and have been linked to waterborne disease outbreaks (LeChevallier et al., 1991b; Craun et al., 1997; Marshall et al., 1997; Rose et al., 1997; Nwachuku et al., 2002; Aboytes et al., 2004).
Despite their limitations, E. coli and total coliforms are currently the best available indicators for verifying microbiological drinking water quality. Thus, if a multi-barrier, source-to-tap approach is in place and each barrier in the drinking water system has been controlled to ensure that it is operating adequately based on the quality of the source water, then E. coli and total coliforms can be used as part of the verification process to show that the water has been adequately treated and is therefore of an acceptable microbiological quality.
Several studies have investigated the relationship between indicator organisms and the presence or absence of enteric protozoa in surface water sources. In general, studies have reported little (Medema et al., 1997; Atherholt et al., 1998; Payment et al., 2000) or no (Rose et al., 1988, 1991; Chauret et al., 1995; Stevens et al., 2001; Hörman et al., 2004; Dorner et al., 2007; Sunderland et al., 2007) correlation between protozoa and faecal indicators, including E. coli. In the cases where a correlation has been reported, it is with Giardia and at very high indicator levels. This lack of a strong correlation is likely due to a variety of factors, including differential survival rates in the environment, sampling location, and methodological differences related to the analysis of water (Payment and Pintar, 2006). Watershed characteristics, including sources and levels of faecal contamination, and geochemical factors may also influence the correlation between faecal indicators and protozoa, leading to site-specific differences (Chauret et al., 1995).
These observations have raised significant questions regarding the appropriateness of using E. coli as an indicator of protozoan contamination in surface waters, and highlighted the need for routine protozoa monitoring of surface waters.
Only a few studies have reported the presence of enteric protozoa, specifically Cryptosporidium, in groundwater (see Section 5.2.1). As such, the usefulness of E. coli as an indicator of enteric protozoa contamination of groundwater sources has not been assessed.
The most widely recognized and used method for the detection of Giardia and Cryptosporidium in water is the U.S. Environmental Protection Agency's (EPA) Method 1623, as this method allows for the simultaneous detection of these protozoans and has been validated in surface water (U.S. EPA, 2005, 2006a). Although other methods for the detection of Giardia and Cryptosporidium in water exist, they have demonstrated lower recoveries and increased variance compared with the U.S. EPA's Method 1623 (Quintero-Betancourt et al., 2002). Like most methods used for the detection of Cryptosporidium and Giardia in water, the U.S. EPA's Method 1623 consists of four steps: 1) sample collection, 2) sample filtration and elution, 3) sample concentration and separation (purification) and 4) (oo)cyst detection. These steps are described in the following sections. Some emerging detection methods are also discussed, along with methods used for assessing (oo)cyst viability and infectivity.
Water samples can be collected as bulk samples or filtered in the field and then shipped on ice to a laboratory for processing as quickly as possible (ideally, within 24 hours). The volume of water collected depends on the expected level of (oo)cysts in the water (i.e., site specific); the lower the expected density of (oo)cysts, the greater the sample volume needed. In most cases, anywhere from 10 to 1000 L of water is collected. In the case of raw water, samples are typically collected near and at the depth of the drinking water intake point, in an effort to sample the source water used for supplying drinking water.
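As a rough illustration of the relationship between expected (oo)cyst density and the volume that needs to be collected, the sketch below estimates the volume required to capture at least one (oo)cyst with a chosen probability, assuming organisms are randomly (Poisson) distributed and assuming an overall method recovery efficiency. Both assumptions and the example numbers are illustrative only.

```python
import math

def required_volume_L(expected_conc_per_100L, recovery=0.4, detection_prob=0.95):
    # Volume (L) needed so that, under a Poisson assumption, at least one
    # (oo)cyst is captured with probability `detection_prob`, given an
    # assumed overall recovery efficiency (0-1).
    conc_per_L = expected_conc_per_100L / 100.0
    return -math.log(1 - detection_prob) / (conc_per_L * recovery)

# e.g., an expected density of 10 cysts/100 L and an assumed 40% recovery
print(round(required_volume_L(10), 1))   # roughly 75 L, within the 10-1000 L range noted above
```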
(Oo)cysts are generally present in small numbers in faecally contaminated water; as such, bulk water samples must be filtered to concentrate the pathogens to a detectable level. Typically, water is pumped through a filter, and (oo)cysts, along with extraneous particulate materials, are retained on the filter. Filtration can be achieved using a variety of filter types, including wound filters, membrane filters, hollow fibre filters and compressed foam filters. These filters vary in terms of the volume of water that they can process, their filtration rates, their practicality, their compatibility with subsequent processing steps, their cost and their retention ability. These differences account for the wide range of recovery efficiencies reported in the literature (Sartory et al., 1998; DiGiorgio et al., 2002; Quintero-Betancourt et al., 2003; Ferguson et al., 2004). A number of filters have been validated by the U.S. EPA for use with Method 1623 (U.S. EPA, 2005). Once filtration is complete, entrapped (oo)cysts on the filter are released through the addition of eluting solutions, producing a filter eluate.
(Oo)cysts in the filter eluate are further concentrated through centrifugation and separated from other particulates through immunomagnetic separation (IMS)/immunocapture. Alternatively, flotation (i.e., density gradient centrifugation) can be used for (oo)cyst separation; however, this approach has been associated with significant (oo)cyst losses and does not effectively remove other biological materials (e.g., yeast and algal cells) (Nieminski et al., 1995), which may affect subsequent (oo)cyst detection.
The partially concentrated (oo)cysts are then centrifuged, resulting in the formation of a pellet. This pellet is resuspended in a small volume of buffer. The concentrate is mixed with (oo)cyst-specific monoclonal antibodies attached to magnetized beads, also referred to as immunomagnetic beads. These beads selectively bind to (oo)cysts. A magnetic field is then applied, resulting in the separation of (oo)cyst-bead complexes from extraneous materials. These materials are removed, the (oo)cyst-bead complex is dissociated and the beads are extracted, resulting in a concentrated suspension of (oo)cysts. Several studies have assessed the recovery potential of the IMS step alone. Fricker and Clancy (1998) reported that (oo)cysts added to (i.e., seeded into) low-turbidity waters can be recovered with efficiencies above 90%. In comparison, mean recoveries from turbid waters ranged from 55.9% to 83.1% for oocysts and from 61.1% to 89.6% for cysts (McCuin et al., 2001). Others have reported similar recoveries (Moss and Arrowood, 2001; Rimhanen-Finne et al., 2001, 2002; Sturbaum et al., 2002; Ward et al., 2002; Chesnot and Schwartzbrod, 2004; Greinert et al., 2004; Hu et al., 2004; Ochiai et al., 2005; Ryan et al., 2005b). Although IMS aids in reducing false positives by reducing the level of debris on slide preparations for microscopic analysis, it is a relatively expensive procedure, with few manufacturers supplying the immunomagnetic beads. Moreover, it has been reported that high levels of turbidity and/or iron (Yakub and Stadterman-Knauer, 2000), along with changes in pH (i.e., optimum pH of 7) (Kuhn et al., 2002), may inhibit IMS.
Once samples have been concentrated and (oo)cysts have been separated from extraneous materials, a number of detection techniques can be applied. The most commonly used detection approach is the immunofluorescence assay (IFA). Alternative detection methods, such as the polymerase chain reaction (PCR), flow cytometry and other molecular approaches, are increasingly being used. Molecular detection methods are generally more rapid and sensitive and have the potential of being paired with a variety of other methods to provide species/genotype information. However, only small volumes can be processed using these methods, and some methods (e.g., PCR) are susceptible to environmental inhibitors.
Following sample concentration and separation, a portion of the (oo)cyst suspension is transferred to a microscope slide. Fluorescently labelled antibodies directed at specific antigens on the (oo)cyst surface are then applied to the slide and allowed to incubate. Direct immunofluorescence microscopy is then used to locate fluorescing bodies, which are potential (oo)cysts. This process is referred to as an IFA. The assay requires specialized equipment and a high level of technical skill. It can be highly sensitive; however, because some autofluorescent algae are very close in size and staining characteristics to (oo)cysts, final identification of (oo)cysts often requires additional staining and microscopy. In most cases, a 4′,6-diamidino-2-phenylindole (DAPI) stain is applied. Because DAPI binds to deoxyribonucleic acid (DNA), it will highlight (oo)cyst nuclei and facilitate their identification.
Flow cytometry can be used as an alternative technique for detecting (oo)cysts following concentration. Flow cytometry allows the sorting, enumeration and examination of microscopic particles suspended in fluid, based on light scattering. Fluorescence-activated cell sorting (FACS) is the flow cytometric technique used to enumerate and separate Cryptosporidium and Giardia from background particles. Typically, immunofluorescent antibodies are introduced into the (oo)cyst suspension, and the suspension is passed through a beam of light (within the flow cytometer). As particles pass through the beam, their fluorescence is measured, and they are then sorted into two or more vials.
FACS has proven to be highly sensitive and specific and is being used more and more as an alternative (oo)cyst detection technique (Vesey et al., 1997; Bennett et al., 1999; Reynolds et al., 1999; Delaunay et al., 2000; Lindquist et al., 2001; Kato and Bowman, 2002; Lepesteur et al., 2003; Hsu et al., 2005). This approach has the advantage of being rapid, allowing for high throughput. However, flow cytometers are expensive, and their operation requires significant user training. In addition, like IFA, this procedure can be adversely influenced by the presence of autofluorescent algae and antibody cross-reactivity with other organisms and particles. FACS also requires confirmation of (oo)cysts by microscopy, which is why it is often coupled with U.S. EPA's Method 1623. Although FACS shows promise, it is still in the development stage and is not used for routine analysis.
A number of molecular approaches have also been used in the detection of Giardia and Cryptosporidium (oo)cysts. A brief description of some of these methods is provided below. It is important to note that although molecular methods have many advantages, they also possess significant disadvantages that make them unsuitable for routine analysis of water. There are currently no validated molecular methods for the detection of Giardia and Cryptosporidium in water.
PCR is the most commonly used molecular method for detection of (oo)cysts. This method involves lysing (oo)cysts to release DNA and then introducing primers that are targeted at specific Giardia or Cryptosporidium coding regions (e.g., 18S ribosomal ribonucleic acid [rRNA]) and amplification of these regions. PCR can be highly sensitive (i.e., level of a single (oo)cyst) and specific (Deng et al., 1997, 2000; Bukhari et al., 1998; Di Giovanni et al., 1999; Kostrzynska et al., 1999; Rochelle et al., 1999; Hallier-Soulier and Guillot, 2000; Hsu and Huang, 2001; McCuin et al., 2001; Moss and Arrowood, 2001; Rimhanen-Finne et al., 2001, 2002; Sturbaum et al., 2002; Ward et al., 2002). It can be combined with other molecular techniques, such as restriction fragment length polymorphism (RFLP), to discriminate between species and genotypes of Cryptosporidium and Giardia (Morgan et al., 1997; Widmer, 1998; Lowery et al., 2000, 2001a,b), although this approach can be problematic, in that it can produce similar banding patterns for different species and genotypes. PCR is also amenable to automation, and reverse transcriptase (RT) PCR may permit discrimination of viable and non-viable (oo)cysts. However, PCR inhibition by divalent cations and humic and fulvic acids is a significant problem (Sluter et al., 1997). In an effort to remove these inhibitors, samples must go through several purification steps. In addition to inhibition, inefficient (oo)cyst lysis is often an issue. Despite these problems, many PCR assays have been developed for detection of waterborne (oo)cysts (Stinear et al., 1996; Kaucner and Stinear, 1998; Griffin et al., 1999; Lowery et al., 2000; Gobet and Toze, 2001; Karasudani et al., 2001; Ong et al., 2002; Sturbaum et al., 2002; Ward et al., 2002).
Other emerging molecular methods for detection of (oo)cysts include fluorescence in situ hybridization (FISH), real-time PCR and microarrays. FISH involves hybridizing a fluorescently labelled oligonucleotide probe that is targeted at the 18S rRNA region of Giardia and Cryptosporidium. This technique has shown some success, but it is limited by relatively weak signals (i.e., (oo)cysts do not fluoresce sufficiently) and related difficulties in microscopic interpretation (Deere et al., 1998; Vesey et al., 1998; Dorsch and Veal, 2001). Real-time PCR is a modified PCR that involves oligonucleotide probes that fluoresce. As the target region within (oo)cysts is amplified, the emitted fluorescence is measured, thereby allowing quantification of the PCR products. This method has several advantages, including the lack of post-PCR analysis, increased throughput, decreased likelihood of contamination (i.e., closed vessel system), ability to quantify (oo)cysts (MacDonald et al., 2002; Fontaine and Guillot, 2003; Bertrand et al., 2004) and ability to assess (oo)cyst viability (when paired with cell culture) (Keegan et al., 2003; LeChevallier et al., 2003). This approach has other unique advantages, including its ability to differentiate between species of Cryptosporidium and Giardia (using melting curve analysis) (Limor et al., 2002; Ramirez and Sreevatsan, 2006) and simultaneously detect different microorganisms (Guy et al., 2003). Although this assay has several advantages over traditional PCR and IFA and has proven useful in identification and enumeration of (oo)cysts, it requires a real-time PCR analyser, which is very costly and may limit its widespread use. Microarrays represent a very novel approach to (oo)cyst detection. A microarray is a collection of microscopic DNA spots, usually on a glass slide, against which pathogen DNA is hybridized. This approach has proven useful in the detection and genotyping of Giardia and Cryptosporidium (Straub et al., 2002; Grow et al., 2003; Wang et al., 2004), although more research is required to determine its specificity and sensitivity.
An integral part of the Giardia and Cryptosporidium detection process involves determining recovery efficiencies. As mentioned previously, there can be significant losses of (oo)cysts during the concentration and separation processes. In addition, the characteristics of the water (e.g., presence of suspended solids, algae) can significantly impact recovery efficiency. As a result, the true concentration of (oo)cysts in a water sample is almost always higher than the measured concentration. Thus, recovery efficiencies are determined to better approximate the actual concentration of (oo)cysts. The recovery efficiency is generally measured by introducing a known number of (oo)cysts into the water sample (i.e., seeding) before the sample is analysed. Ideally, the recovery efficiency should be determined for each sample; however, because this is expensive, recovery efficiency data are usually collected for a subset of samples. With the introduction of commercial preparations containing a certified number of (oo)cysts, this process may become more cost-effective and routine.
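As a minimal sketch of how recovery data might be applied, the following code computes a recovery efficiency from a hypothetical seeded (matrix spike) sample and uses it to adjust a measured concentration. The spike size, observed counts, sample volume and helper names are illustrative assumptions introduced here; they are not procedures or values specified by Method 1623 or this guideline.

```python
def recovery_efficiency(spiked_count: int, recovered_count: int) -> float:
    """Fraction of seeded (oo)cysts recovered from a matrix spike sample."""
    return recovered_count / spiked_count

def corrected_concentration(observed_count: int, volume_litres: float,
                            recovery: float) -> float:
    """Approximate the true concentration ((oo)cysts per 100 L) by dividing the
    observed concentration by the recovery efficiency."""
    observed_per_100L = observed_count / volume_litres * 100.0
    return observed_per_100L / recovery

# Hypothetical example: 100 oocysts seeded and 42 recovered;
# 3 oocysts observed in a 50 L sample of the same water
r = recovery_efficiency(100, 42)            # 0.42
print(corrected_concentration(3, 50.0, r))  # ~14.3 oocysts/100 L
```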
Several studies have attempted to describe the range of recovery efficiencies across laboratories and using different types of filters. McCuin and Clancy (2003) evaluated U.S. EPA Methods 1622 and 1623 using a commercial filtration system; seeded raw and finished water samples were analysed. Analysis of seeded tap water samples yielded recoveries of 48.4 ± 11.8% for oocysts and 57.1 ± 10.9% for cysts. Recoveries from raw water samples ranged from 19.5% to 54.5% for oocysts and from 46.7% to 70% for cysts. In 1995, Health Canada commissioned a similar study of commercial, government and research laboratories in Canada, and Giardia cyst recovery ranged from 0% to 90% (average 21%) for eight laboratories analysing 10 unknown samples. Cryptosporidium oocyst recovery ranged from 0% to 43% (average 5.3%) for the same samples (Clancy Environmental Consultants, Inc., 1996). LeChevallier et al. (1995) conducted a critical analysis of the immunofluorescence method and concluded that losses of Cryptosporidium oocysts typically exceed losses of Giardia cysts and that major losses occur during sample concentration and separation. Significant loss of (oo)cysts has also been reported during the filtration process (Feng et al., 2003).
A major drawback of existing methods for the detection of Giardia and Cryptosporidium is that they provide very limited information on the viability or human infectivity of (oo)cysts, which is essential in determining their public health significance. Whereas viability can be assessed relatively easily and rapidly, assessment of infectivity is much more complex. Methods used to evaluate viability and infectivity are very costly because of the need for maintaining cell lines, animals and qualified staff.
A variety of in vitro and in vivo methods have been developed to assess viability and infectivity. In vitro methods used to assess viability include excystation, fluorogenic dye inclusion/exclusion (i.e., staining), RT-PCR and FISH. Infectivity is assessed using animal infectivity assays (in vivo) and cell culture assays (in vitro). A brief discussion of these methods is provided in the following sections.
Viability (but not infectivity) can be estimated by subjecting (oo)cysts to conditions similar to those in the gut, in an effort to stimulate excystation (i.e., release of trophozoites/sporozoites). Excystation "cocktails" and conditions vary considerably and may result in conflicting observations. If (oo)cysts are capable of excystation, they are considered viable. Giardia can be excysted using acid and enzymes such as trypsin and grown in TYI-S-33 medium (Diamond et al., 1978; Rice and Schaefer, 1981), but the excystation rate for Giardia is often low. Cryptosporidium parvum oocysts can also be excysted as a measure of viability (Black et al., 1996). However, excystation methods have been shown to be relatively poor indicators of Cryptosporidium oocyst viability. Neumann et al. (2000b) observed that non-excysted oocysts recovered after commonly used excystation procedures are still infectious to neonatal mice.
Various staining methods have been developed to assess (oo)cyst viability based on the inclusion or exclusion of two fluorogenic dyes, DAPI and propidium iodide (PI) (Robertson et al., 1998; Freire-Santos et al., 2000; Neumann et al., 2000b; Gold et al., 2001; Iturriaga et al., 2001). Three classes of (oo)cysts can be identified: 1) viable (inclusion of DAPI, exclusion of PI), 2) non-viable (inclusion of both DAPI and PI) and 3) quiescent or dormant (exclusion of both DAPI and PI, but potentially viable). In general, DAPI and PI give good correlation with in vitro excystation (Campbell et al., 1992). Neumann et al. (2000a) demonstrated a strong correlation between DAPI/PI staining intensity and animal infectivity of freshly isolated C. parvum oocysts. These stains have also been successfully used in conjunction with fluorescently labelled antibodies (used in FACS) to determine the viability and infectivity of (oo)cysts in water samples, because their fluorescence spectra do not overlap with those of the antibodies (Belosevic et al., 1997; Bukhari et al., 2000; Neumann et al., 2000b). In spite of these positive correlations, dye inclusion/exclusion, like excystation procedures, overestimates the viability and potential infectivity of (oo)cysts (Black et al., 1996; Jenkins et al., 1997).
RT-PCR can also be applied to the direct detection of viable (oo)cysts in water concentrates (Kaucner and Stinear, 1998). RT-PCR amplifies a messenger ribonucleic acid (mRNA) target molecule. As only viable organisms can produce mRNA, this experimental method may prove useful in assessing (oo)cyst viability. For example, when compared with the IFA DAPI/PI method, the frequency of detection of viable Giardia increased from 24% with IFA to 69% with RT-PCR. An advantage of this approach is that it can be combined with IMS, allowing for simultaneous detection and viability testing (Hallier-Soulier and Guillot, 2000, 2003); it can also be quantitative. RT-PCR, like other PCR-based methods, is highly susceptible to environmental inhibition and suffers from inefficient extraction of nucleic acids from (oo)cysts.
FISH has shown modest success in differentiating between living and dead (oo)cysts (Davies et al., 2005; Lemos et al., 2005; Taguchi et al., 2006); however, false positives are common (Smith et al., 2004). Because 18S rRNA is present in high copy numbers in viable (oo)cysts but in low numbers in non-viable (oo)cysts, it is a useful target for assessing viability. Like DAPI/PI staining, FISH is limited by its inability to assess (oo)cyst infectivity. Further research is required to validate this assay for use in assessing (oo)cyst viability.
The most direct method for assessing (oo)cyst viability and infectivity is to inoculate a susceptible animal and monitor for (oo)cyst shedding and any histological evidence of disease development. Giardia and Cryptosporidium are used to infect experimental animals such as the gerbil (for Giardia) (Belosevic et al., 1983) and the neonatal CD-1 mouse (for Cryptosporidium) (Finch et al., 1993). This approach has shown moderate success (Delaunay et al., 2000; Korich et al., 2000; Matsue et al., 2001; Noordeen et al., 2002; Okhuysen et al., 2002; Rochelle et al., 2002), but it is not practical, as most analytical laboratories do not maintain animal colonies, and animal infectivity assays are expensive to perform. In addition, there is limited knowledge on the diversity of species and genotypes of Giardia and Cryptosporidium that can infect animal models (i.e., some species/genotypes may not be amenable to infecting a particular animal host). Even with this information, this approach is not sensitive enough for environmental monitoring (i.e., high median infective dose [ID50]). These assays are typically reserved for research purposes, such as assessing disinfection effectiveness, rather than for routine assessment of (oo)cyst viability/infectivity.
Unlike Giardia, Cryptosporidium is an intracellular parasite that relies on host cells for replication; thus, oocysts cannot be routinely propagated in cell-free culture media. In vitro cell culture assays for Cryptosporidium infectivity assessment overcome several of the limitations associated with the use of animal models. These assays involve exposing oocysts to excystation stimuli followed by their inoculation onto a cultured mammalian cell line, such as human ileocaecal adenocarcinoma (HCT-8) cells, which supports the parasite's growth and development. Oocysts are typically inoculated onto HCT-8 cell monolayers. After a 24- to 48-hour incubation, the cell monolayer is examined for the presence of Cryptosporidium reproductive stages using either an indirect IFA (Slifko et al., 1997) or PCR (Rochelle et al., 1997).
This approach has been used to estimate the infectivity of oocysts in water (Di Giovanni et al., 1999; Hijjawi et al., 2001; Weir et al., 2001; Rochelle et al., 2002; Johnson et al., 2005; Schets et al., 2005; Coulliette et al., 2006) and has been shown to yield results comparable to those of the mouse infectivity model (Hijjawi et al., 2001; Rochelle et al., 2002; Slifko et al., 2002). In other comparison studies, average percent viabilities were comparable for cell culture, excystation and DAPI/PI assays (Slifko et al., 1997).
There are several advantages to the cell culture assay, including its high sensitivity (i.e., detection of a single viable oocyst), applicability to analysis of raw and treated water samples, ease of performance and rapid turnaround time for results. Another advantage of this approach is that C. parvum and C. hominis can be maintained in vitro for long periods of time, facilitating viability and immunotherapy studies. In addition, cell culture can be combined with other methods, including PCR, to more accurately assess viability/infectivity. Cell culture PCR (CC-PCR) has proven useful in assessing watershed contamination and in estimating risk (Joachim et al., 2003; LeChevallier et al., 2003; Masago et al., 2004). Although cell culture infectivity assays have several advantages, they also possess a number of disadvantages, including the need to maintain a cell line and poor reproducibility among similar samples for quantitative assessments. Moreover, existing cell culture methods detect only C. parvum and C. hominis; very little is known about how other Cryptosporidium species and genotypes of human health concern infect culture systems. The development of C. parvum in a host cell-free culture was recently reported (Hijjawi et al., 2004), but could not be reproduced (Girouard et al., 2006).
The multi-barrier approach, including watershed or wellhead protection, optimized filtration and disinfection, a well-maintained distribution system and monitoring the effectiveness of treatment (e.g., turbidity, disinfection residuals), is the best approach to reduce protozoa and other waterborne pathogens in drinking water. In general, all water supplies should be disinfected, and an adequate concentration of disinfectant residual should be maintained throughout the distribution system at all times.
Where events leading to protozoan impacts on the source water are well characterized, it may be possible to implement other barriers/risk management measures in addition to those mentioned above. These may include limiting capture of raw water during high risk events, selectively operating an additional barrier during high risk events, use of alternative sources, or blending of varying sources (groundwater and surface water).
Source water quality should be characterized. The best means of achieving this is to conduct routine analyses for Giardia and Cryptosporidium. Sanitary surveys, to identify potential sources of faecal contamination from humans and other animals, are also useful, but they are not a substitute for the routine monitoring of Giardia and Cryptosporidium. In order to understand the full range of source water quality, data should be collected during normal conditions as well as during extreme weather or spill/upset events (e.g., spring runoff, storms). For example, the flooding of sewage collection and treatment systems during heavy rainfall events can lead to sudden increases in protozoa and other microbial pathogens in the source water.
Once the source water quality has been initially characterized, a health-based treatment goal can be established for the specific source water, and effective pathogen removal and/or inactivation strategies put in place in order to achieve safe levels in the finished drinking water. To optimize performance for removal of microbial pathogens, the relative importance of each barrier must be understood. Some water systems have multiple redundant barriers, such that failure of a given barrier still provides adequate treatment. In other cases, all barriers must be working well to provide the required level of treatment. For these systems, failure of a single treatment barrier could lead to a waterborne outbreak.
The inactivation of protozoa in raw water is complicated by their resistance to commonly used disinfectants such as chlorine. Treatment systems that rely solely on chlorine as the treatment barrier will not be able to effectively inactivate Giardia and Cryptosporidium that may be present in the source water. The combination of physical removal and disinfection barriers is the most effective way to reduce protozoa in drinking water. In most cases, a well-operated conventional treatment plant should be able to produce water with a negligible risk of infection. Options for treatment and control of protozoa are discussed briefly in this document; however, more detailed information is available in other publications (U.S. EPA, 1991; Health and Welfare Canada, 1993; AWWA, 1999; Deere et al., 2001; Hijnen et al., 2004a; LeChevallier and Au, 2004; Smeets et al., 2006). These treatment and control options also need to take into account other treatment requirements, such as turbidity, disinfection by-product (DBP) formation and distribution system maintenance.
Treatment of surface water or surface-impacted groundwater systems should include physical removal methods, such as chemically assisted filtration (coagulation, flocculation, clarification and filtration), and disinfection, or equivalent technologies. It is essential that the physical removal and disinfection targets are achieved before drinking water reaches the first consumer in the distribution system. Adequate process control measures and operator training are also required to ensure the effective operation of treatment barriers at all times (U.S. EPA, 1991; Health and Welfare Canada, 1993; AWWA, 1999).
The level of treatment needed is based on the source water pathogen concentration and the required drinking water quality. Most source waters are subject to faecal contamination; as such, treatment technologies should be in place to achieve a minimum 3-log (99.9%) removal and/or inactivation of Cryptosporidium and Giardia. With this level of treatment, a source water concentration of 34 cysts/100 L can be reduced to 3.4 × 10⁻² cysts/100 L, which meets the population health target of 10⁻⁶ disability-adjusted life year (DALY)/person per year (see Section 9.0). Similarly, a source water concentration of 13 oocysts/100 L can be reduced to 1.3 × 10⁻² oocysts/100 L. However, many surface waters in Canada have much higher (oo)cyst concentrations (see Section 5.1.2) and therefore require additional removal/inactivation in order to meet the same concentration in the treated drinking water (see Section 9.3.4).
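The arithmetic behind these figures reduces to the log10 ratio of the source water concentration to the treated water concentration associated with the health target. The short sketch below reproduces the 3-log example above and shows how a more contaminated source would require a greater reduction; the 1500 oocysts/100 L figure is a hypothetical value used only for illustration.

```python
import math

def required_log_reduction(source_per_100L: float, target_per_100L: float) -> float:
    """Log10 reduction needed to bring a source water concentration down to the
    treated water concentration associated with the health-based target."""
    return math.log10(source_per_100L / target_per_100L)

# Figures from the text: 34 Giardia cysts/100 L reduced to 3.4e-2 cysts/100 L
print(required_log_reduction(34.0, 3.4e-2))               # 3.0 log
# Hypothetical, more contaminated source: 1500 oocysts/100 L, same target of 1.3e-2
print(round(required_log_reduction(1500.0, 1.3e-2), 1))   # ~5.1 log
```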
Source water Giardia and Cryptosporidium concentrations should be determined based on actual water sampling and analysis. Such characterization should take into account normal conditions as well as event-based monitoring, such as spring runoff, storms or spill events. Testing results should also take into account recovery efficiencies for the analytical method and pathogen viability in order to obtain the most accurate assessment of infectious pathogens present in the source water. Where source water sampling and analysis for Giardia and Cryptosporidium are not feasible (e.g., small supplies), (oo)cyst concentrations can be estimated. Estimates should be based on a source water assessment along with other water quality parameters that can provide information on the risk and/or level of faecal contamination in the source water. Because these estimates will have a high level of uncertainty, engineering safety factors or additional treatment reductions should be applied in order to ensure production of microbiologically safe drinking water.
The health-based treatment goal can be achieved through one or more treatment steps involving physical removal and/or primary disinfection. The (oo)cyst log reductions for each separate treatment barrier can be combined to define the overall reduction for the treatment process. In Canada, Payment and Franco (1993) showed that 99.998% (4.7 logs) of Giardia cysts and Cryptosporidium oocysts were removed from heavily polluted water by full conventional treatment (flocculation, settling, pre- and post-disinfection with chlorine dioxide and chlorine, and filtration) at three treatment plants in the Montreal, Quebec, area.
Conventional filtration is a practical method to achieve high removal rates of (oo)cysts. A recent review of pilot- and full-scale study data concluded that coagulation, flocculation and sedimentation processes were associated with a 1.6-log Cryptosporidium removal credit (range of 0.4-3.7 logs) and a 1.5-log Giardia removal credit (range of 0-3.3 logs) (Hijnen et al., 2004a). Another review (Emelko et al., 2005) found that granular media filtration can achieve a 3-log removal, or better, of Cryptosporidium when filters are operated at or near optimal conditions. Coagulation and flocculation should be optimized for particles to be effectively removed by filtration. The end of a filter run is a vulnerable period for filter operation. Deterioration in oocyst removal by several log units has been observed in the early stages of breakthrough, when filter effluent particle counts had just begun to rise and turbidity had not always increased (Huck et al., 2002). Filters must be carefully controlled, monitored and backwashed such that particle breakthrough does not occur (Huck et al., 2001; Emelko et al., 2005), and filter backwash water should not be recirculated through the treatment plant without additional treatment. Slow sand and diatomaceous earth filtration can also be highly effective, with physical removals in the range of > 4 logs and 3.3 logs for Cryptosporidium and Giardia, respectively (Hijnen et al., 2004b). As there is wide variability in the characteristics of source waters, the selection of the most appropriate system must be made by experienced engineers after suitable analysis and/or pilot testing.
Many treatment processes are interdependent and rely on optimal conditions upstream in the treatment process for efficient operation of subsequent treatment steps. Thus, in order to effectively remove Cryptosporidium and Giardia through filtration barriers, it is important that the preceding coagulation and flocculation steps be optimized.
Membrane filtration has become an increasingly important component of drinking water treatment systems (Betancourt and Rose, 2004; Goh et al., 2005). Microfiltration membranes have the largest pore size (0.1 µm; Taylor and Weisner, 1999). Whereas nanofiltration and reverse osmosis processes are effective in removing protozoan (oo)cysts, microfiltration and ultrafiltration are the most commonly applied technologies for microbial removal because of their cost-effectiveness. Jacangelo et al. (1995) evaluated the removal of G. muris and C. parvum from three source waters of varying quality using a variety of microfiltration and ultrafiltration membranes. Microfiltration membranes of 0.1 µm and 0.2 µm and ultrafiltration membranes of 100, 300 and 500 kilodaltons were assessed. Both microfiltration and ultrafiltration were capable of absolute removal of G. muris and C. parvum. The concentration of protozoa in the different raw waters tested varied from 10⁴ to 10⁵/L, and log removals of 4.7-7.0 for G. muris and 4.4-7.0 for C. parvum were achieved. More recently, States et al. (1999) reported absolute removal of Cryptosporidium (challenge concentration of 10⁸ oocysts) and Giardia (challenge concentration of 10⁷ cysts) by microfiltration. Parker et al. (1999) also reported an absolute removal of C. parvum from an influent concentration of approximately 2 × 10⁵/100 L to an effluent concentration of less than 1/100 L (5.3-log removal) using microfiltration membranes (0.2 µm).
Although membrane filtration is highly effective for removal of protozoan (oo)cysts, system integrity (breaks, O-rings, connectors, glue), membrane fouling and degradation must be considered. Membrane fouling is usually caused by accumulation of particles, chemicals and biological growth on membrane surfaces. Membrane degradation is typically the result of hydrolysis and oxidation. There can be very significant differences in pathogen removal, because the physical characteristics of a membrane can vary with the manufacturing process and between manufacturers, and because polymeric membranes, regardless of their nominal classification, have a range of pore sizes. The (oo)cyst removal efficiency for a specific membrane must be demonstrated through challenge testing and verified by direct integrity testing. This process involves measuring pressure loss across the membrane or assessing removal of spiked particulates using a marker-based approach. More detailed information on filtration techniques can be found in the guideline technical document on turbidity (Health Canada, 2004).
Drinking water treatment plants that meet the turbidity limits established in the Guideline Technical Document for turbidity (Health Canada, 2004) can apply the estimated potential removal credits for Giardia and Cryptosporidium given in Table 7. These log removals are adapted from the removal credits established by the U.S. EPA as part of the "Long Term 2 Enhanced Surface Water Treatment Rule" (LT2ESWTR) (U.S. EPA, 2006b) and the "Long Term 1 Enhanced Surface Water Treatment Rule" (LT1ESWTR) Disinfection Profiling and Benchmarking Guidance Manual (U.S. EPA, 2003). Alternatively, log removal rates can be established on the basis of demonstrated performance or pilot studies. The physical log removal credits can be combined with the disinfection credits to meet overall treatment goals. For example, if an overall 5-log (99.999%) Cryptosporidium removal is required for a given system and conventional filtration provides 3-log removal, then the remaining 2-log reduction must be achieved through another barrier, such as primary disinfection.
Table 7: Potential log removal credits for Giardia and Cryptosporidium by filtration technology

|Treatment technology||Cryptosporidium removal credit||Giardia removal credit|
|Conventional filtration||3 log||3 log|
|Direct filtration||2.5 log||2.5 log|
|Slow sand filtration||3 log||3 log|
|Diatomaceous earth filtration||3 log||3 log|
|Microfiltration and ultrafiltration||Demonstration and challenge testingᵈ||Demonstration and challenge testingᵈ|
|Nanofiltration and reverse osmosis||Demonstration and challenge testingᵈ||Demonstration and challenge testingᵈ|

c Values based on review of Schuler and Ghosh, 1990; AWWA, 1991; Schuler and Ghosh, 1991; Nieminski and Ongerth, 1995; Patania et al., 1995; McTigue et al., 1998; Nieminski and Bellamy, 2000; U.S. EPA, 2003; DeLoyde et al., 2007; Assavavasilasukul et al., 2008.
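Because log reduction credits from individual barriers are additive, the 5-log example described above Table 7 can be checked with a few lines of code. The sketch below sums the credits assigned to each barrier and reports any shortfall that must be met by an additional barrier such as primary disinfection; the function name and the second example's UV credit are illustrative assumptions, not values taken from this guideline.

```python
def remaining_log_credit(required_log: float, *barrier_credits: float) -> float:
    """Log reduction still to be provided after summing the credits assigned to
    each treatment barrier (credits are additive on the log scale)."""
    achieved = sum(barrier_credits)
    return max(required_log - achieved, 0.0)

# Example from the text: 5-log Cryptosporidium reduction required,
# conventional filtration credited with 3 log
print(remaining_log_credit(5.0, 3.0))        # 2.0 log must come from disinfection
# Hypothetical case: adding a disinfection barrier credited with 3 log closes the gap
print(remaining_log_credit(5.0, 3.0, 3.0))   # 0.0
```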
Chemical disinfectants commonly used for treating drinking water include chlorine, chloramine, chlorine dioxide and ozone. Disinfection is typically applied after treatment processes that remove particles and organic matter. This strategy helps to ensure efficient inactivation of pathogens and minimizes the formation of DBPs. It is important to note that when describing microbial disinfection of drinking water, the term "inactivation" is used to indicate that the pathogen is no longer able to multiply within its host and is therefore non-infectious, although it may still be present.
Physical characteristics of the water, such as temperature, pH and turbidity, can have a major impact on the inactivation and removal of pathogens. For example, inactivation rates for Cryptosporidium and Giardia increase 2- to 3-fold for every 10°C rise in temperature (see concentration × time [CT] tables in Appendices A and B). When water temperatures are close to 0°C, as is often the case in winter in Canada, the efficacy of disinfection is reduced, and an increased disinfectant concentration and/or contact time are required to achieve the same level of inactivation.
The effectiveness of some disinfectants is also dependent on pH. When using free chlorine, increasing the pH from 6 to 9 reduces the level of Giardia inactivation by a factor of 3 (see CT tables in Appendix A). On the other hand, pH has been shown to have little effect on Giardia inactivation when using ozone or chlorine dioxide.
Reducing turbidity is an important step in the inactivation of Cryptosporidium and Giardia and other microorganisms. Chemical disinfection may be inhibited by particles that can protect Cryptosporidium and Giardia and other microorganisms. Additionally, turbidity will consume disinfectant and reduce the effectiveness of chemical disinfection. An increase in turbidity from 1 to 10 nephelometric turbidity units (NTU) resulted in an 8-fold decrease in free chlorine disinfection efficiency (Hoff, 1986). The effect of turbidity on treatment efficiency is further discussed in the guideline technical document on turbidity (Health Canada, 2004).
The efficacy of chemical disinfectants can be predicted based on knowledge of the residual concentration of a specific disinfectant and of the factors that influence its performance, mainly temperature, pH, contact time and the level of disinfection required (AWWA, 1999). This relationship is commonly referred to as the CT concept, where CT is the product of "C" (the residual concentration of disinfectant, measured in mg/L) and "T" (the disinfectant contact time, measured in minutes) for specific disinfectants at the pH and temperatures encountered during water treatment. To account for disinfectant decay, the residual concentration is usually determined at the exit of the contact chamber rather than using the applied dose or initial concentration. Also, the contact time "T" is often calculated using a T10 value, which is defined as "the detention time at which 90% of the water passing through the unit is retained within the basin" (AWWA, 1991) (i.e., 90% of the water meets or exceeds the required contact time). The T10 values can be estimated based on the geometry and flow conditions of the disinfection chamber or basin (AWWA, 1991). Hydraulic tracer tests, however, are the most accurate method to determine the contact time under actual plant flow conditions. Because the T value depends on the hydraulics of the treatment installation, it is less easily adjusted during treatment plant operation than the disinfectant dosage.
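As a minimal sketch of the CT concept, the following code computes the delivered CT from the residual measured at the contactor outlet and the T10 contact time, and compares it with a required CT. The required CT shown is a placeholder: in practice it must be read from the CT tables in Appendices A and B for the specific disinfectant, log inactivation target, temperature and pH; the residual and T10 values are likewise hypothetical.

```python
def delivered_ct(residual_mg_per_L: float, t10_minutes: float) -> float:
    """CT actually delivered: disinfectant residual at the contactor outlet (mg/L)
    multiplied by the T10 contact time (min)."""
    return residual_mg_per_L * t10_minutes

def meets_ct_target(residual_mg_per_L: float, t10_minutes: float,
                    required_ct: float) -> bool:
    """Compare delivered CT with the CT required for the desired log inactivation.
    The required value must come from the appropriate CT table for the
    disinfectant, temperature and pH in use."""
    return delivered_ct(residual_mg_per_L, t10_minutes) >= required_ct

# Hypothetical check: 1.0 mg/L free chlorine residual and a T10 of 45 min,
# against a placeholder required CT of 60 mg·min/L
print(delivered_ct(1.0, 45.0))               # 45 mg·min/L delivered
print(meets_ct_target(1.0, 45.0, 60.0))      # False: increase dose or contact time
```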
Complete CT tables for 0.5-log to 3-log inactivation of Giardia and Cryptosporidium can be found in Appendix A and Appendix B, respectively. Some selected CT values are presented in Table 8 for 3-log (99.9%) inactivation of Giardia using chlorine, chloramine, chlorine dioxide and ozone. The CT values illustrate the fact that chloramine is a much weaker disinfectant than free chlorine, chlorine dioxide or ozone, as much higher concentrations and/or contact times are required to achieve the same degree of cyst inactivation. Consequently, chloramine is not recommended as a primary disinfectant for protozoa.
b Selected values adapted from Tables A.1-A.5 in Appendix A.
Free chlorine is the most common chemical used for primary disinfection because it is widely available, is relatively inexpensive and provides a residual that can be used for maintaining water quality in the distribution system. However, inactivation of Giardia using free chlorine requires relatively high concentrations and/or contact times. Chlorination is also not practical for the inactivation of Cryptosporidium. Ozone and chlorine dioxide are effective disinfectants against Cryptosporidium and Giardia. Ozone is a very strong oxidant capable of effectively inactivating Cryptosporidium and Giardia, as noted by the low CT values required for 3-log inactivation (Table 8). Whereas both ozone and chlorine dioxide are effective disinfectants, they are typically more expensive and complicated to implement, particularly in small treatment systems. Also, ozone decays rapidly after being applied during treatment and cannot be used to provide a secondary disinfectant residual. Chlorine dioxide is also not recommended for secondary disinfection because of its relatively rapid decay (Health Canada, 2008).
Although protozoa can be inactivated through chemical disinfection, they are much more resistant than bacteria or viruses. In general, Cryptosporidium is much more resistant to chemical disinfection than Giardia. This is, in part, due to the thick protective wall surrounding the oocyst, which is difficult to penetrate. CT values required to inactivate Cryptosporidium are approximately 5-200 times higher than those for Giardia, most notably for chlorine-based disinfection (Korich et al., 1990; U.S. EPA, 1991; Finch et al., 1994, 1997). Therefore, achieving the free chlorine concentrations necessary for inactivation of Cryptosporidium is not feasible, because they would conflict with other water quality objectives (e.g., DBP formation, taste and odour). As such, treatment systems that use free chlorine as the primary disinfectant must remove or inactivate Cryptosporidium with an additional treatment barrier, such as granular media filtration or UV disinfection. Watershed protection and an intact distribution system are also key to reducing Cryptosporidium and other waterborne pathogens in drinking water produced by treatment plants relying upon chlorination.
In addition to differences in disinfectant susceptibility between Giardia and Cryptosporidium, varying levels of resistance to disinfectants among strains must be considered. Chauret et al. (2001) observed that a 2-log (99%) inactivation required CT values of 70, 530 and 1000 mg·min/L for three different strains of Cryptosporidium parvum. Differential susceptibilities to disinfection have also been reported between environmental and laboratory strains (Maya et al., 2003). These findings highlight the importance of considering strain variability when reviewing treatment removals and potential health risks.
In addition to microbial inactivation, chemical disinfection can result in the formation of DBPs, some of which pose a human health risk. The most commonly used disinfectant, chlorine, reacts with naturally occurring organic matter to form trihalomethanes (THMs) and haloacetic acids (HAAs), along with many other halogenated organic compounds (Krasner et al., 2006). The use of ozone and chlorine dioxide can also result in the formation of inorganic DBPs, such as bromate and chlorite/chlorate, respectively. When selecting a chemical disinfectant, the potential impact of DBPs should be considered. It is critical to ensure that efforts made to minimize the formation of these DBPs do not negatively impact the effectiveness of disinfection.
UV light disinfection is considered an acceptable method of disinfection. UV light is usually applied after particle removal barriers, such as filtration, in order to prevent shielding by suspended particles and allow better light penetration through to the target pathogens. Studies have shown that relatively low UV doses can achieve substantial inactivation of protozoa (Clancy et al., 1998; Bukhari et al., 1999; Craik et al., 2000, 2001; Belosevic et al., 2001; Drescher et al., 2001; Linden et al., 2001, 2002; Shin et al., 2001; Campbell and Wallis, 2002; Mofidi et al., 2002; Rochelle et al., 2002). Based on these and other studies, the U.S. EPA developed UV light inactivation requirements for Giardia and Cryptosporidium in the LT2ESWTR (U.S. EPA, 2006a). The LT2ESWTR requires UV doses of 12 and 11 mJ/cm² to receive a 3-log inactivation credit for Cryptosporidium and Giardia, respectively (see Table 9). For water supply systems in Canada, a UV dose of 40 mJ/cm² is commonly applied (MOE, 2006); thus, protozoa should be effectively inactivated.
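The dose requirements cited above lend themselves to a simple check of whether a validated, delivered UV dose earns the 3-log credit. The sketch below encodes the 12 and 11 mJ/cm² values from the text and compares them with the commonly applied 40 mJ/cm² dose; the function and dictionary names are illustrative and are not part of the LT2ESWTR or this guideline, and the delivered dose must come from reactor validation, not this calculation.

```python
# UV doses cited in the text for a 3-log inactivation credit (mJ/cm^2)
LT2_3LOG_DOSE = {"Cryptosporidium": 12.0, "Giardia": 11.0}

def earns_3log_credit(delivered_dose_mj_cm2: float, organism: str) -> bool:
    """True if the validated, delivered UV dose meets the cited 3-log requirement."""
    return delivered_dose_mj_cm2 >= LT2_3LOG_DOSE[organism]

# A system delivering the commonly applied 40 mJ/cm^2 dose
for organism in LT2_3LOG_DOSE:
    print(organism, earns_3log_credit(40.0, organism))   # True for both organisms
```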
Several recent studies have examined the effect of particles on UV disinfection efficacy, and most have concluded that the UV dose-response of microorganisms is not affected by variations in turbidity up to 10 NTU (Christensen and Linden, 2002; Oppenheimer et al., 2002; Mamane-Gravetz and Linden, 2004; Passantino et al., 2004). However, the presence of humic acid particles and coagulants has been shown to significantly impact UV disinfection efficacy for two viral surrogates (MS2 coliphage and bacteriophage T4), with lower inactivation levels being achieved (Templeton et al., 2005). Further research is needed to better understand the effect of particles and coagulants on microbial inactivation by UV light and the relevance of these findings to protozoa. The hydraulic design of a UV reactor influences the UV dose delivered to the microorganisms passing through the reactor. The reactor hydraulics should allow all microorganisms to receive the same dose of UV radiation (U.S. EPA, 2006c).
A multiple disinfectant strategy involving two or more primary disinfection steps is effective for inactivating protozoa, along with other microorganisms, in drinking water. For example, UV light and free chlorine are complementary disinfection processes that together can inactivate protozoa, viruses and bacteria. Because UV light is highly effective for inactivating protozoa (but less effective for viruses) and chlorine is highly effective for inactivating bacteria and viruses (but less effective for protozoa), the multi-disinfectant strategy allows for the use of lower doses of chlorine, with a corresponding decrease in the formation of DBPs. In some treatment plants, ozone is applied for the removal of taste and odour compounds, followed by chlorine disinfection. In such cases, both the ozone and chlorine disinfection may potentially be credited towards meeting the overall disinfection requirements, depending on factors such as the hydraulics of the ozone contactor and the presence of an ozone residual at the point of contactor effluent collection.
The sequential use of two disinfectants has proved more effective for inactivating protozoa (along with other microorganisms in drinking water) than their individual application (i.e., sequential combination of disinfectants resulted in higher (oo)cyst inactivation compared with the sum of both disinfectants used separately). Finch et al. (1997) found that the application of chlorine followed by chloramine resulted in a 1.6-log inactivation of Cryptosporidium oocysts (when infectivity was measured using mice). Chlorine disinfection following ozone or chlorine dioxide was also found to be particularly effective. The possible synergistic effects of UV light followed by chemical inactivation are currently being studied.
In an effort to better understand and evaluate treatment systems, surrogates have been used as indicators of microbial inactivation and removal. Given the resistance of oocysts to disinfectants, surrogates have been applied to assess the effectiveness of conventional and alternative disinfection procedures for inactivation of Cryptosporidium. Both biological and non-biological surrogates have been used, including bacterial spores and polystyrene microspheres, respectively. Bacterial spores are not appropriate surrogates, as they are inactivated more readily than Cryptosporidium at lower temperatures and are typically more sensitive to certain disinfectants (e.g., chlorine dioxide) (Driedger et al., 2001; Larson and Mariñas, 2003). Microspheres, on the other hand, could represent a feasible approach to evaluate oocyst inactivation, and recent studies have highlighted their possible usefulness in simulating oocyst inactivation in treatment systems that use multiple disinfectants (Baeza and Ducoste, 2004; Tang et al., 2005). Yeast cells have also been used (Rochelle et al., 2005) for assessing oocyst inactivation, but additional research on their feasibility is needed.
Residential scale treatment is also applicable to small drinking water systems. This would include both privately owned systems and systems with minimal or no distribution system that provide water to the public from a facility not connected to a public supply (also known as semi-public systems). Minimum treatment of all supplies derived from surface water sources or groundwater under the influence of surface waters should include adequate filtration (or equivalent technologies) and disinfection.
An array of options is available for treating source waters to provide high-quality drinking water. These include various filtration methods, such as reverse osmosis, and disinfection with chlorine-based compounds or alternative technologies, such as UV light or ozonation. These technologies are similar to the municipal treatment barriers, but on a smaller scale. In addition, there are other treatment processes, such as distillation, that can be practically applied only to small water systems. Most of these technologies have been incorporated into point-of-entry devices, which treat all water entering the system, or point-of-use devices, which treat water at only a single location, for example, at the kitchen tap.
Semi-public and private systems that apply disinfection typically rely on chlorine or UV light because of their availability and relative ease of operation. However, scaling or fouling of the UV lamp surface is a common problem when applying UV light in systems with moderate or high levels of hardness, such as groundwater supplies. Special UV lamp cleaning mechanisms or water softeners can be used to overcome this scaling problem.
Health Canada does not recommend specific brands of drinking water treatment devices, but strongly recommends that consumers look for a mark or label indicating that the device has been certified by an accredited certification body as meeting the appropriate NSF International (NSF)/American National Standards Institute (ANSI) standard. These standards have been designed to safeguard drinking water by helping to ensure the material safety and performance of products that come into contact with drinking water. For example, treatment units meeting NSF Standard 55 for Ultraviolet Disinfection Systems (Class A) are designed to inactivate microorganisms, including bacteria, viruses, Cryptosporidium oocysts and Giardia cysts, from contaminated water. They are not designed to treat wastewater or water contaminated with raw sewage and should be installed in visually clear water.
There are also NSF standards for cyst reduction claims; these include NSF Standard 58 for Reverse Osmosis, NSF Standard 53 for Drinking Water Treatment Units and NSF Standard 62 for Drinking Water Distillation Systems. These standards require a removal of 3 logs or better in order to be certified to a cyst reduction claim. However, they cannot be certified for inactivation claims, as the certification is only for mechanical filtration.
Certification organizations provide assurance that a product or service conforms to applicable standards. In Canada, the Standards Council of Canada (SCC; www.scc.ca) has accredited a number of organizations to certify drinking water devices and materials as meeting the appropriate NSF/ANSI standards.
An up-to-date list of accredited certification organizations can be obtained from the SCC (www.scc.ca).
The health effects associated with exposure to Giardia and Cryptosporidium, like those of other pathogens, depend upon features of the host, pathogen and environment. The host's immune status, the (oo)cyst's infectivity and the degree of exposure (i.e., number of (oo)cysts consumed) are all key determinants of infection and illness. Infection with Giardia or Cryptosporidium can result in both acute and chronic health effects, which are discussed in the following sections.
Theoretically, a single cyst is sufficient, at least under some circumstances, to cause infection. However, studies have shown that the ID50 (the dose required for infection to be observed in 50% of the test subjects) is usually more than a single cyst and is dependent on the virulence of the particular strain. Human adult volunteer feeding trials suggest that the ID50 of Giardia is around 50 cysts (Hibler et al., 1987), although some individuals can become infected at a much lower dose (Rendtorff, 1978; Stachan and Kunstýr, 1983). The ID50 of Giardia in humans can also be extrapolated from dose-response curves. Using this approach, the ID50 for Giardia in humans is around 35 cysts (Rose and Gerba, 1991), which is comparable to that reported above. Giardia strains that are well adapted to their hosts (e.g., by serial passage) can frequently infect with lower numbers of cysts (Hibler et al., 1987). For example, Rendtorff (1978) reported an ID50 of 19 cysts when using human-source cysts in volunteers.
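One common way to extrapolate an ID50 from feeding-trial data is the exponential dose-response model, in which the probability of infection after ingesting d organisms is 1 − exp(−r·d). The sketch below is an illustration under assumptions: it uses a per-cyst infectivity parameter of roughly 0.02, a value often associated with the Rose and Gerba analysis, which yields an ID50 of about 35 cysts, consistent with the figure cited above. The model choice and parameter value are assumptions for illustration, not figures established by this guideline.

```python
import math

def p_infection_exponential(dose: float, r: float) -> float:
    """Exponential dose-response model: probability of infection after ingesting
    `dose` organisms, where r is the per-organism probability of initiating infection."""
    return 1.0 - math.exp(-r * dose)

def id50_exponential(r: float) -> float:
    """Dose at which half of exposed subjects are expected to become infected."""
    return math.log(2.0) / r

# Assumed, illustrative per-cyst parameter for Giardia (often quoted as ~0.02)
r_giardia = 0.0199
print(round(id50_exponential(r_giardia)))                # ~35 cysts
print(round(p_infection_exponential(1, r_giardia), 3))   # ~0.02 for a single cyst
```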
The prepatent period (time between ingestion of cysts and excretion of new cysts) for giardiasis is 6-16 days (Rendtorff, 1978; Stachan and Kunstýr, 1983; Nash et al., 1987), although this can vary, depending on the strain. Research with animal models has shown that smaller inocula result in longer prepatent periods but do not influence the resulting parasite burden (Belosevic and Faubert, 1983).
The specific mechanism(s) by which Giardia causes illness are not well understood, and no specific virulence factors have been identified. Some suggest that Giardia causes mechanical irritation or mucosal injury by attaching to the brush border of the intestinal tract. Others have proposed that Giardia attachment results in repopulation of the intestinal epithelium by relatively immature enterocytes with reduced absorptive capacities (leading to diarrhoea).
The host-parasite relationship is complex, and Giardia has been shown to be versatile in the expression of antigens (Nash, 1994), so universal lasting immunity is improbable. Humoral immune response is revealed by increased levels of circulating antibodies (immunoglobulin G [IgG] and IgM) and secretion of antibodies (IgA) in milk, saliva and possibly intestinal mucus. These antibodies may play a role in eliminating disease (Heyworth, 1988), but lasting immunity has not been demonstrated. Very little is known about cellular immunity, but spontaneous killing of trophozoites by human peripheral blood monocytes has been described (denHollander et al., 1988).
Typically, Giardia is non-invasive and results in asymptomatic infections. Based on U.S. data, 24% of individuals will develop symptomatic illness after infection with Giardia (Macler and Regli, 1993). Symptomatic giardiasis can result in nausea, anorexia, an uneasiness in the upper intestine, malaise and occasionally low-grade fever or chills. The onset of diarrhoea is usually sudden and explosive, with watery and foul-smelling stools (Wolfe, 1984). The acute phase of the infection commonly resolves spontaneously, and organisms generally disappear from the faeces. Assemblage A has been associated with mild, intermittent diarrhoea, whereas assemblage B has been linked to severe, acute or persistent diarrhoea (Homan and Mank, 2001; Read et al., 2002). Giardia infection can also lead to lactase deficiency (i.e., lactose intolerance) and a general malabsorptive syndrome. Some patients become asymptomatic cyst passers for a period of time and have no further clinical manifestations. Other patients, particularly children, suffer recurring bouts of the disease, which may persist for months or years (Lengerich et al., 1994). In the United States, an estimated 4600 persons per year are hospitalized for severe giardiasis, a rate similar to that of shigellosis (Lengerich et al., 1994). The median length of hospital stay is 4 days.
Giardiasis can be treated using a number of drugs, including metronidazole, quinacrine, furazolidone, tinidazole, ornidazole, nitazoxanide and nimorazole. Olson et al. (1994) showed that the potential for a vaccine exists, but infections and symptoms are only attenuated, and prevention of infection is not feasible at this time.
Although human cryptosporidiosis is not well understood, dose-response information has become available through human volunteer feeding trials involving immunocompetent individuals. As is the case for Giardia and other pathogens, a single organism can potentially cause infection, although studies have shown that more than one organism is generally required (DuPont et al., 1995; Okhuysen et al., 1998, 2002; Chappell et al., 1999, 2006). Together, these studies suggest that the ID50 of Cryptosporidium is somewhere between 80 and 1000 oocysts (DuPont et al., 1995; Chappell et al., 1999, 2006; Okhuysen et al., 2002), indicating that Cryptosporidium isolates can differ significantly in their infectivity and ability to cause symptomatic illness. The TAMU isolate of C. parvum (originally isolated from a foal), for example, was shown to have an ID50 of 9 oocysts and an illness attack rate of 86%, compared with the UCP isolates of C. parvum (isolated from a calf), which had an ID50 of 1042 oocysts and an illness attack rate of 59% (Okhuysen et al., 1999). In contrast, the Iowa and Moredun isolates of C. parvum had an ID50 of 132 and approximately 300 oocysts, respectively, whereas illness attack rates were similar (i.e., 55-65%) (DuPont et al., 1995; Okhuysen et al., 2002). Based on a meta-analysis of these feeding studies, the ID50s of the TAMU, UCP and Iowa isolates were estimated to be 12.1, 2066 and 132 oocysts, respectively (Messner et al., 2001). The genetic basis for these differences is not known, although a number of virulence factors have been identified (Okhuysen and Chappell, 2002). In a separate meta-analysis using the TAMU, UCP and Iowa human study data, the probability of infection from ingesting a single infectious oocyst was estimated to range from 4% to 16% (U.S. EPA, 2006a). This estimate is supported by outbreak data, including observations made during the 1993 Milwaukee outbreak (Gupta and Haas, 2004).
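At the very low mean doses relevant to drinking water exposure, the exponential dose-response model described above is nearly linear, so the per-oocyst infection probability of 4-16% cited above translates almost directly into risk per ingested oocyst. The sketch below illustrates this with a hypothetical mean dose of 0.001 oocysts; the dose value is an assumption for illustration and is not derived from any monitoring data in this document.

```python
import math

def p_infection_exponential(mean_dose: float, r: float) -> float:
    """Exponential dose-response: infection probability for a given mean ingested dose,
    where r is the per-oocyst probability of initiating infection."""
    return 1.0 - math.exp(-r * mean_dose)

# Per-oocyst infection probabilities spanning the 4-16% range cited in the text,
# applied to a hypothetical mean dose of 0.001 oocysts
for r in (0.04, 0.16):
    print(r, f"{p_infection_exponential(0.001, r):.2e}")
# At such low doses the risk is approximately r * dose (the model is nearly linear)
```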
The prepatent period for cryptosporidiosis is 4-9 days (Ma et al., 1985; DuPont et al., 1995; Okhuysen et al., 1999, 2002), although this can vary, depending on the isolate.
Infections of Cryptosporidium spp. in the human intestine are known to cause at least transient damage to the mucosa, including villous atrophy and lengthening of the crypt (Tzipori, 1983); however, the molecular mechanisms by which Cryptosporidium causes this damage are unknown. Several molecules are thought to mediate its motility, attachment and invasion of host cells, including glycoproteins, lectins and other protein complexes, antigens and ligands (Okhuysen and Chappell, 2002; Tzipori and Ward, 2002). Most of the pathological data available have come from AIDS patients, and the presence of other opportunistic pathogens has made assessment of damage attributable to Cryptosporidium spp. difficult.
The primary mechanism of host defence appears to be cellular immunity (McDonald et al., 2000; Lean et al., 2002; Riggs, 2002), although humoral immunity is also known to be involved (Riggs, 2002; Okhuysen et al., 2004; Priest et al., 2006). Studies using animal models have demonstrated the importance of helper (CD4+) T cells, interferon gamma (IFN-γ) and interleukin 12 (IL-12) in recovery from cryptosporidiosis (Riggs, 2002). Antibody responses against certain glycoproteins involved in Cryptosporidium adhesion have been demonstrated (Riggs, 2002).
It is not clear whether prior exposure to Cryptosporidium provides protection against future infections or disease. Okhuysen et al. (1998) reported that initial exposure to Cryptosporidium was inadequate to protect against future bouts of cryptosporidiosis. Although the rates of diarrhoea were similar after each of the exposures, the severity of diarrhoea was lower after re-exposure. Chappell et al. (1999) reported that volunteers with pre-existing C. parvum antibodies (suggesting previous infection) exhibited a greater resistance to infection, as demonstrated by a significant increase in the median infectious dose, compared with those who were antibody negative. However, in contrast to the earlier findings (Okhuysen et al., 1998), the severity of diarrhoea (defined by the number of episodes and duration of the illness) was greater among the subjects presumed previously infected.
Individuals infected with Cryptosporidium are more likely to develop symptomatic illness than those infected with Giardia (Macler and Regli, 1993; Okhuysen et al., 1998, 1999). The most common symptom associated with cryptosporidiosis is diarrhoea, characterized by very watery, non-bloody stools. The volume of diarrhoea can be extreme, with 3 L/day being common in immunocompetent hosts and with reports of up to 17 L/day in immunocompromised patients (Navin and Juranek, 1984). This symptom can be accompanied by cramping, nausea, vomiting (particularly in children), low-grade fever (below 39°C), anorexia and dehydration. Extraintestinal cryptosporidiosis (i.e., in the lungs, middle ear, pancreas, etc.) and death have been reported, primarily among persons with AIDS (Farthing, 2000; Mercado et al., 2007), but are considered rare.
The duration of infection is dependent on the condition of the immune system (Juranek, 1995) and can be broken down into three categories: 1) immunocompetent individuals, who clear the infection in 7-14 days; 2) AIDS patients or others with severely weakened immune systems (i.e., individuals with CD4 cell counts < 180 cells/mm³), who in most reported cases never completely clear the infection (it may develop into an infection with long bouts of remission followed by mild symptoms); and 3) individuals who are immunosuppressed as a result of chemotherapy, short-term immune depression due to illness (e.g., chickenpox) or malnutrition. In cases where the immunosuppression is not AIDS related, the infection usually clears (no oocyst excretion, and symptoms disappear) within 10-15 days of the onset of symptoms. However, there have been reported cases involving children in which the infection has persisted for up to 30 days. The sensitivity of diagnosis of cryptosporidiosis by stool examination is low, so low that oocyst excreters may be counted as negative prematurely. The application of more sensitive and rapid diagnostic tools, such as immunochromatographic lateral-flow assays, will help to reduce the number of false negatives (Cacciò and Pozio, 2006). Immunocompetent individuals usually carry the infection for a maximum of 30 days. With the exception of AIDS cases, individuals may continue to pass oocysts for up to 24 days. In an outbreak in a daycare facility, children shed oocysts for up to 5 weeks (Stehr-Green et al., 1987). The reported rate of asymptomatic infection is believed to be low, but a report on an outbreak at a daycare facility in Philadelphia, Pennsylvania, concluded that up to 11% of the children were asymptomatic (Alpert et al., 1986), and Ungar (1994) discussed three separate studies in daycare centres where the asymptomatic infection rate ranged from 67% to 100%. It has been suggested that many of these asymptomatic cases were mild cases that were incorrectly diagnosed (Navin and Juranek, 1984).
Nitazoxanide is the only drug approved for treatment of cryptosporidiosis in children and adults (Fox and Saravolatz, 2005), despite more than 200 drugs having been tested both in vitro and in vivo (Tzipori, 1983; O'Donoghue, 1995; Armson et al., 2003; Cacciò and Pozio, 2006). The scarcity of effective drugs can be explained, in part, by the fact that most inhibitors target biochemical pathways resident in the apicoplast (a plastid-derived organelle) (Wiesner and Seeber, 2005), a structure that C. parvum (Abrahamsen et al., 2004) and C. hominis (Xu et al., 2004) lack. Some progress has been reported with furazolidone in reducing the symptoms of immunocompetent patients. Spiramycin has apparently been used with some success in Chile and the United States, but at this time it is not licensed for general use by the U.S. Food and Drug Administration (Janoff and Reller, 1987).
Analysis of the complete genome sequence of Cryptosporidium may help to identify virulence determinants and mechanisms of pathogenesis, thereby facilitating the development of antimicrobials (Umejiego et al., 2004), vaccines (Wyatt et al., 2005; Boulter-Bitzer et al., 2007) and immunotherapies (Crabb, 1998; Enriquez and Riggs, 1998; Schaefer et al., 2000; Takashima et al., 2003) against Cryptosporidium.
The adoption of a risk-based approach, such as a multi-barrier approach, is essential to the effective management of drinking water systems (CCME, 2004). This approach should include assessment of the entire drinking water system, from the watershed/aquifer and intake through the treatment and distribution chain to the consumer, to assess potential impacts on drinking water quality and public health.
Current drinking water quality guidelines encourage the adoption of a multi-barrier approach to produce clean, safe and reliable drinking water. Various indicators, such as indicator microorganisms, turbidity and disinfectant residuals, are used as part of the multi-barrier approach to determine the quality of the treated drinking water. For example, E. coli and total coliforms are bacteriological indicators that are routinely used to verify the microbiological quality of drinking water. Although indicators are an important aspect of a multi-barrier approach, they do not provide any quantitative information on pathogens or the potential disease burden in the population that would be associated with drinking water of a given quality. It is important to note that even water of an acceptable quality carries some risk of illness, although it is extremely low.
Quantitative microbial risk assessment (QMRA) is gaining acceptance as part of a multi-barrier approach. QMRA is a process that uses source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the burden of disease associated with exposure to pathogenic microorganisms in a drinking water source. The benefit of using a QMRA approach is that assessments can be carried out by each water system to provide site-specific information:
Site-specific variations should include the potential impact of hazardous events, such as storms, contamination events or the failure of a treatment barrier. When interpreting the results from a QMRA, the following should be considered:
Because of these limitations, QMRA should not be used to try to estimate levels of illness in a population resulting from a particular water system. Rather, the disease burden estimates produced from a QMRA are useful for site-specific system evaluations as part of a multi-barrier approach to safe drinking water.
Health-based targets are the "goalposts" or "benchmarks" that have to be met to ensure the safety of drinking water. In Canada, microbiological hazards are commonly addressed by two forms of health-based targets: water quality targets and treatment goals. An example of a water quality target is the bacteriological guideline for E. coli, which sets a maximum acceptable concentration of this organism in drinking water (Health Canada, 2006a). Treatment goals describe the reduction in risk to be provided by measures such as treatment processes aimed at reducing the viability or presence of pathogens. Treatment goals assist in the selection of treatment barriers and should be defined in relation to source water quality. They need to take into account not only normal operating conditions, but also the potential for variations in water quality and/or treatment performance. For example, short periods of poor source water quality following a storm or a decrease in treatment effectiveness due to a process failure may in fact embody most of the risk in a drinking water system. The wide array of microbiological pathogens makes it impractical to measure all of the potential hazards; thus, treatment goals are generally framed in terms of categories of organisms (e.g., bacteria, viruses and protozoa) rather than individual pathogens. The health-based treatment goal for Giardia and Cryptosporidium is a minimum 3-log reduction and/or inactivation of (oo)cysts. Many source waters may require a greater log reduction and/or inactivation to maintain an acceptable level of risk.
The burden of disease estimates calculated during a risk assessment should be compared with a reference level of risk, that is, a level of risk that is deemed tolerable or acceptable. This comparison is needed to understand the public health implications of the disease burden estimate and to set health-based treatment goals.
Risk levels have been expressed in several ways. WHO's Guidelines for Drinking-water Quality (WHO, 2008) use DALYs as a unit of measure for risk. The basic principle of the DALY is to calculate a value that considers both the probability of experiencing an illness or injury and the impact of the associated health effects (Murray and Lopez, 1996a; Havelaar and Melse, 2003). The WHO (2008) guidelines adopt 10⁻⁶ DALY/person per year as a health target. The Australian National Guidelines for Water Recycling (NRMMC-EPHC, 2006) also cite this target. In contrast, other agencies set acceptable microbial risk levels based on the risk of infection and do not consider the probability or severity of associated health outcomes. For example, the U.S. EPA has used a health-based target of an annual risk of infection of less than 1 in 10 000 (10⁻⁴) persons (Regli et al., 1991).
For comparison, the reference level of 10⁻⁶ DALY/person per year is approximately equivalent to an annual risk of illness for an individual of 1/1000 (10⁻³) for a diarrhoea-causing pathogen with a low fatality rate. For an illness with more severe health outcomes, such as a cancer, 10⁻⁶ DALY/person per year is approximately equivalent to a lifetime additional risk of cancer over background of 10⁻⁵ (i.e., 1 excess case of cancer over background levels per 100 000 people ingesting 1.5 L of drinking water containing the substance at the guideline value per day over a 70-year life span). QMRA is a useful tool in estimating whether a drinking water system can meet this health target, as current disease surveillance systems in developed nations such as Canada are not able to detect illness at such a low level.
The risk assessment in this guideline technical document estimates the disease burden in DALYs. There are several advantages to using this metric. DALYs take into account both the number of years lost due to mortality and the number of years lived with a disability (compared with the average healthy individual for the region) to determine the health impact associated with a single type of pathogenic organism. The use of DALYs also allows for comparison of health impacts between different pathogens and potentially between microbiological and some chemical hazards. Although no common health metric has been accepted internationally, DALYs have been used by numerous groups, and published, peer-reviewed information is available. The WHO (2008) reference level of 10⁻⁶ DALY/person per year is used in this risk assessment as an acceptable level of risk.
QMRA uses mathematical modelling and relevant information from selected pathogens to derive disease burden estimates. It follows a common approach in risk assessment, which includes four components: hazard identification, exposure assessment, dose-response assessment and risk characterization.
The first step of QMRA is hazard identification, a qualitative process of identifying hazards to the drinking water system or to human health, such as microorganisms or toxic chemicals. The enteric protozoa of most concern as human health hazards in Canadian drinking water sources are Giardia and Cryptosporidium. These organisms can cause serious illness in immunocompetent and immunocompromised individuals. Illness caused by Cryptosporidium is more serious because it is capable of causing death, particularly in immunocompromised individuals, and extraintestinal (i.e., lungs, pancreas, etc.) damage can occur.
The presence and types of Giardia and Cryptosporidium in a given drinking water source are variable. Therefore, it is important to identify all potential sources and events, regardless of whether they are under the control of the drinking water supplier, that could lead to Giardia and Cryptosporidium being present at concentrations exceeding baseline levels, on a site-specific basis. Faeces from humans and other animals are the main sources of enteric protozoa and may originate from point sources of pollution, such as municipal sewage discharges, or non-point sources, such as septic tanks and urban or livestock runoff. In addition to the potential sources of contamination, it is necessary to consider whether the presence of protozoa is continuous or intermittent or has seasonal pollution patterns and how rare events, such as droughts or floods, will influence the Giardia and Cryptosporidium concentrations in the source water.
Although all enteric protozoa of concern need to be identified, risk assessments do not usually consider each individual enteric protozoan. Instead, the risk assessment includes only specific enteric protozoa whose characteristics make them a good representative of all similar pathogenic protozoa. It is assumed that if the reference protozoan is controlled, this would ensure control of all other similar protozoa of concern. Ideally, a reference protozoan will represent a worst-case combination of high occurrence, high concentration and long survival time in source water, low removal and/or inactivation during treatment and a high pathogenicity for all age groups. Cryptosporidium parvum and Giardia lamblia have been selected as the reference protozoa for this risk assessment because of their high prevalence rates, potential to cause widespread disease, resistance to chlorine disinfection and the availability of a dose-response model for each organism.
Exposure assessments provide an estimate (with associated uncertainty) of the occurrence and level of a contaminant in a specified volume of water at the time of the exposure event (ingestion, inhalation and/or dermal absorption). The principal route of exposure considered in this risk assessment is consumption of drinking water. To determine exposure, the concentration of Cryptosporidium or Giardia and the volume of water ingested need to be known or estimated. Exposure can be determined as a single dose of pathogens ingested by a consumer at one time.
Drinking water is not usually monitored for protozoans. Therefore, to determine exposure, the concentrations of the reference protozoa in the source water need to be measured or estimated. Measurements, as opposed to estimates, will result in the highest-quality risk assessment. Short-term peaks in Cryptosporidium or Giardia concentrations may increase disease risks considerably and even trigger outbreaks of waterborne disease; thus, seasonal variation and peak events such as storms should be included in the measurements or estimates. Some of the factors that should be taken into consideration when determining concentrations in drinking water are the recovery efficiencies of Cryptosporidium and Giardia concentration and detection methods, which are much less than 100%, the variability around treatment removal and inactivation, and the viability or infectivity of the pathogen in the finished water. A variety of methods can be used to assess (oo)cyst viability and infectivity (see Section 6.6). In this risk assessment, the (oo)cysts reported in the source water are assumed to be viable and infectious. Once the source water concentrations are determined, treatment reductions are calculated to determine the concentration in the finished drinking water. This risk assessment assumes that any (oo)cysts that were not removed or inactivated during treatment are still capable of causing infection and illness.
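As an illustration of the recovery-efficiency adjustment (the figures used here are hypothetical and chosen only to show the arithmetic): if a detection method recovers 40% of seeded oocysts and a sample yields a measured concentration of 10 oocysts/100 L, the estimated source water concentration is approximately 10 ÷ 0.40 = 25 oocysts/100 L.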
For the volume of water ingested, it is important to consider only the unboiled amount of tap water consumed, as boiling the water inactivates pathogens; including boiled water in the estimate would overestimate exposure (Gale, 1996; Payment et al., 1997; WHO, 2008). In Canada, approximately 1.5 L of tap water is consumed per person per day. However, approximately 35% is consumed in the form of coffee or tea (Health and Welfare Canada, 1981). The elevated temperatures (boiling or near boiling) used for making coffee and tea would inactivate any enteric pathogens present. Therefore, for estimating risk from pathogenic organisms, the risk assessment uses an average consumption of 1 L of water per person per day for determining exposure. This estimate is similar to consumption patterns in other developed nations (Westrell et al., 2006; Mons et al., 2007). WHO, in its Guidelines for Drinking-water Quality, also suggests using an estimate of 1 L for consumption of unboiled tap water (WHO, 2008).
The dose-response assessment uses dose-response models to estimate the probability of infection and the risk of illness after exposure to (oo)cysts. The probability of infection (Pinfection) for this risk assessment is calculated using dose-response models for C. parvum and G. lamblia. These dose-response data are best explained by the exponential model (Haas et al., 1999):
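In its standard single-hit exponential form (Haas et al., 1999), the probability of infection from a single exposure can be written as:

P(infection) = 1 − exp(−r × μ × V)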
This exponential model describes mathematically the distribution of the individual probabilities of any one organism surviving and initiating infection, where V is the single volume of liquid ingested, μ is the number of organisms per litre in the ingested volume and r is the fraction of ingested organisms that survive to initiate infection. The r parameter is different for Cryptosporidium and Giardia. In the case of C. parvum, r = 0.018 (Messner et al., 2001), whereas for G. lamblia, r = 0.0199 (Rose and Gerba, 1991). The r parameter is derived from dose-response studies of healthy volunteers and may not adequately represent effects on sensitive subgroups, such as immunocompromised persons, young children or the elderly.
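For reference, under the exponential model the median infectious dose is related to r by ID₅₀ = ln(2)/r; the fitted values above therefore correspond to ID₅₀ values of approximately 39 oocysts for C. parvum and 35 cysts for G. lamblia.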
An individual's daily dose of organisms is estimated using the information from the exposure assessment. An individual's yearly probability of infection is estimated using equation 2. For this risk assessment, it is assumed that there is no secondary spread of infection.
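A conventional formulation of this yearly estimate, assuming 365 independent daily exposures at the same dose (a sketch of what equation 2 expresses), is:

P(infection per year) = 1 − [1 − P(infection per day)]^365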
Not all infected individuals will develop a clinical illness. The risk of illness per year for an individual is estimated using equation 3:
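A standard formulation consistent with the parameters cited below (a sketch of what equation 3 expresses) is:

Risk of illness per year = P(infection per year) × I × S

where I is taken here to be the proportion of infections that result in symptomatic illness and S the proportion of the population that is susceptible.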
The risk assessment is based on I values of 0.70 and 0.24 for Cryptosporidium (Okhuysen et al., 1998) and Giardia (Macler and Regli, 1993), respectively. S is assumed to be 1.
To translate the risk of illness per year for an individual to a disease burden per person, the DALY is used as a common unit of risk. The key advantage of the DALY as a measure of public health is cited as its aggregate nature, combining life years lost (LYL) with years lived with disability (YLD) to calculate the disease burden. DALYs can be calculated as follows:
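A sketch of this calculation, consistent with the entries in Table 10, is:

DALY/case = YLD + LYL, where each component is the product of its outcome fraction, duration of illness (or life years lost) and severity weight.

For example, the morbidity component in Table 10 is 0.99999 × 0.01918 years × 0.067 ≈ 1.29 × 10⁻³ DALY/case.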
For Giardia and Cryptosporidium, the health effects vary in severity from mild diarrhoea to more severe diarrhoea and potentially death. It is important to note that, as no published mortality information is available for Giardia, this risk assessment assumes that the risk of death is the same as that for Cryptosporidium. The health burden of gastroenteritis resulting from infection with Giardia and Cryptosporidium in drinking water is 1.70 DALYs/1000 cases (1.70 × 10⁻³ DALY/case) (Table 10).
|Health outcome|Outcome fractionᵃ|Duration of illnessᵇ|Severity weightᶜ|DALY/case|
|Morbidity (YLD): mild diarrhoea|0.99999|0.01918 years (7 days)|0.067|1.29 × 10⁻³|
|Mortality (LYL): death|0.00001|Life expectancyᵈ; age at deathᵉ|1|4.15 × 10⁻⁴|
|Health burden (total)| | | |1.70 × 10⁻³|
Using this health burden and the risk of illness per year in an individual, the disease burden in DALYs/person per year can be estimated:
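In point-estimate form, this calculation is simply:

Disease burden (DALYs/person per year) = risk of illness per person per year × 1.70 × 10⁻³ DALY/case

At the reference level of 10⁻⁶ DALY/person per year, this corresponds to a tolerable risk of illness of approximately 5.9 × 10⁻⁴ cases/person per year, consistent with the value of less than 1 case per 1000 people per year cited below.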
Risk characterization brings together the data collected or estimated on pathogen occurrence in source water, pathogen removal or inactivation through treatment barriers, consumption patterns to estimate exposure and pathogen dose-response relationships to estimate the burden of disease. Using this information, the potential disease burden associated with the specified drinking water system can be calculated. Example disease burden calculations are provided in Figures 1 and 3. These calculations have been presented using point estimates; however, when mathematical models are used for QMRA, the calculations generally include probability functions with associated uncertainties (Appendix D). The calculated disease burden can then be compared with the acceptable risk level to determine if the drinking water being produced is of an acceptable quality. If the disease burden estimate associated with the drinking water does not meet the acceptable risk level, QMRA can then be used to calculate the level of treatment that would be required to meet the acceptable health risk target (10⁻⁶ DALY/person per year).
For example, as shown in Figures 1 and 2, when source waters have a concentration of 13 oocysts/100 L and the treatment plant consistently achieves at least a 3-log reduction in oocyst concentration, the burden of disease in the population would meet the reference level of 10⁻⁶ DALY/person per year (less than 1 case/1000 people per year). Although this source water oocyst concentration falls within the range of oocyst concentrations that would typically be found in Canadian source waters, many surface water sources will have higher Cryptosporidium concentrations (see Section 5.0). These higher levels would require a greater log reduction to meet the acceptable health burden. For example, when source waters have a concentration of 1300 oocysts/100 L, a 5-log reduction in oocyst concentration would have to be achieved in order to meet the disease burden target.
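The chain of calculations behind these examples can be illustrated with a short point-estimate computation. The sketch below is illustrative only (the function and variable names are not part of any Health Canada model) and assumes the dose-response parameters, illness fractions, consumption volume and DALY/case value cited in this document:

import math

DALY_PER_CASE = 1.70e-3   # health burden (DALY/case) from Table 10
VOLUME_L = 1.0            # unboiled tap water consumed per person per day (L)

def disease_burden(source_conc_per_100L, log_reduction, r, illness_fraction, susceptible_fraction=1.0):
    """Point-estimate disease burden in DALYs/person per year."""
    # Concentration in treated water (organisms/L) after the stated log reduction
    treated_conc = (source_conc_per_100L / 100.0) * 10 ** (-log_reduction)
    daily_dose = treated_conc * VOLUME_L
    p_inf_daily = 1.0 - math.exp(-r * daily_dose)       # exponential dose-response model
    p_inf_year = 1.0 - (1.0 - p_inf_daily) ** 365       # 365 independent daily exposures
    p_ill_year = p_inf_year * illness_fraction * susceptible_fraction
    return p_ill_year * DALY_PER_CASE

# Cryptosporidium: 13 oocysts/100 L with a 3-log reduction (Figures 1 and 2)
print(disease_burden(13, 3, r=0.018, illness_fraction=0.70))     # approx. 1 × 10⁻⁶

# Cryptosporidium: 1300 oocysts/100 L with a 5-log reduction
print(disease_burden(1300, 5, r=0.018, illness_fraction=0.70))   # approx. 1 × 10⁻⁶

# Giardia: 34 cysts/100 L with a 3-log reduction (Figures 3 and 4)
print(disease_burden(34, 3, r=0.0199, illness_fraction=0.24))    # approx. 1 × 10⁻⁶

Each scenario returns a value close to the reference level of 10⁻⁶ DALY/person per year, which is consistent with the worked examples described in this section.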
Figures 3 and 4 show that a source water with a concentration of 34 cysts/100 L of water would require the treatment plant to consistently achieve at least a 3-log reduction in cyst concentration in order to meet the acceptable reference level of risk. In contrast, a concentration of 3400 cysts/100 L of water would require the treatment plant to consistently achieve at least a 5-log reduction in cyst concentration in order to meet the acceptable reference level of risk. Consequently, the health-based treatment goal of a 3-log reduction of Giardia and Cryptosporidium is a minimum requirement. A site-specific assessment should be done to determine what level of (oo)cyst reduction is needed for any given source water. Monitoring, as opposed to estimating, source water Cryptosporidium and Giardia concentrations will result in the highest-quality risk assessment. However, if measurements are not possible, estimated concentrations may be based on perceived source water quality. Information obtained from sanitary surveys, vulnerability assessments and information on other water quality parameters can be used to help estimate the risk and/or level of faecal contamination in the source water. It is important to consider, as part of the site-specific assessment, events that can significantly change source water quality, such as hazardous spills or storm events. These will have an important impact on the treatment required, and including variations in source water quality will provide the best estimate of the risk in a system. Understanding and planning for the variations that occur in source water quality create a more robust system that can include safety margins. It is also important to take into consideration the level of uncertainty that is inherent in carrying out a QMRA, to ensure that the treatment in place is producing water of an acceptable quality. A sensitivity analysis using a QMRA model such as the one described in Appendix D can also help identify critical control points and their limits.
QMRA is increasingly being applied by international agencies and governments at all levels as the foundation for informed decision-making surrounding the health risks from pathogens in drinking water. WHO, the European Commission, the Netherlands, Australia and the United States have all made important advances in QMRA validation and methodology (Staatscourant, 2001; Medema et al., 2006; NRMMC-EPHC, 2006; U.S. EPA, 2006a,b; WHO, 2008). With the exception of the U.S. EPA, these agencies and governments have adopted an approach that takes full advantage of the potential of QMRA to inform the development of health targets (i.e., acceptable levels of risk or disease) and site-specific risk management (e.g., water safety plans as described in WHO, 2008). Building on the WHO work, the European Commission's MicroRisk project has published an extensive guidance document that establishes methods and a strong science basis for QMRA of drinking water (Medema et al., 2006).
The Netherlands and the U.S. EPA provide two examples of QMRA-based regulatory approaches. In the Netherlands, consistent with the WHO approach, water suppliers must conduct a site-specific QMRA on all surface water supplies to determine if the system can meet a specified level of risk. Dutch authorities can also require a QMRA of vulnerable groundwater supplies. In contrast, recent regulatory activity in the United States has seen the U.S. EPA assess the health risks from waterborne pathogens through QMRA and apply this information to set nationwide obligatory treatment performance requirements (U.S. EPA, 2006a,b). In general, drinking water systems must achieve a 3-log removal or inactivation of Giardia (U.S. EPA, 1989). To address risk from Cryptosporidium, drinking water systems must monitor their source water, calculate an average Cryptosporidium concentration and use those results to determine if their source is vulnerable to contamination and requires additional treatment. Water systems are classified into categories ("bins") based on whether they are filtered or unfiltered systems; these bins specify additional removal or inactivation requirements for Cryptosporidium spp. (U.S. EPA, 2006a).
Health Canada and the Federal-Provincial-Territorial Committee on Drinking Water have chosen the same approach as WHO (2008), providing QMRA-based performance targets as minimum requirements, but also recommending the use of a site-specific QMRA as part of a multi-barrier source-to-tap approach. This QMRA approach offers a number of advantages, including 1) the ability to compare the risk from representative groups of pathogens (e.g., viruses, protozoa, bacteria) in an overall assessment; 2) the transparency of assumptions; 3) the potential to account for variability and uncertainty in estimates; 4) the removal of hidden safety factors (these can be applied as a conscious choice by regulatory authorities at the end of the process, if desired); 5) the site-specific identification of critical control points and limits through sensitivity analysis; and 6) the clear implications of system management for public health outcomes.
Several species and genotypes of Giardia and Cryptosporidium are known to infect humans. These pathogens are excreted in the faeces of infected persons and animals and can potentially be found in source water. Their occurrence in source water varies over time and can be significantly affected by extreme weather or spill/upset events (i.e., increases in (oo)cyst levels associated with these events). The best way to safeguard against the presence of hazardous levels of Cryptosporidium and Giardia in drinking water is based on the application of the multi-barrier approach, including source water protection and adequate treatment, as demonstrated using appropriate physicochemical parameters, followed by the verification of the absence of faecal indicator organisms in the finished water. The protection of public health is accomplished by setting health-based treatment goals. To set health-based treatment goals, the level of risk deemed tolerable or acceptable needs to be determined. The Federal-Provincial-Territorial Committee on Drinking Water has chosen this acceptable level of risk as 10⁻⁶ DALY/person per year, which is consistent with the reference level adopted by WHO. This is a risk management decision that balances the estimated disease burden from Cryptosporidium and Giardia with the lack of information on the prevalence of these pathogens in source waters, limitations in disease surveillance and the variations in performance within different types of water treatment technologies.
Although all enteric protozoa of concern need to be identified, risk assessments do not usually consider each individual enteric protozoan. Instead, the risk assessment includes only specific enteric protozoa (reference pathogens or, in this case, reference protozoa) whose characteristics make them a good representative of all similar pathogenic protozoa. It is assumed that if the reference protozoa are controlled, this will ensure control of all other similar protozoa of concern. C. parvum and G. lamblia have been selected as the reference protozoa for this risk assessment because of their high prevalence rates, potential to cause widespread disease, resistance to chlorine disinfection and the availability of a dose-response model for each organism.
In Canada, many surface water sources will have Cryptosporidium and Giardia concentrations in the range of 1-200 (oo)cysts/100 L of water. The QMRA approach used in this guideline demonstrates that if a source water has a concentration of (oo)cysts at the lower end of this range (for example, approximately 13 oocysts/100 L and/or 34 cysts/100 L), a water treatment plant would need to consistently achieve at least a 3-log reduction in (oo)cyst concentration in order to meet the reference level of 10⁻⁶ DALY/person per year. Thus, a minimum 3-log reduction and/or inactivation of Cryptosporidium and Giardia has been established as a health-based treatment goal. Many source waters in Canada may require more than the minimum treatment goal to meet the acceptable level of risk.
QMRA can be used on a site-specific basis to evaluate how variations in source water quality may contribute to microbiological risk and to assess the adequacy of existing control measures or the requirement for additional treatment barriers or optimization. In most cases, a well-operated treatment plant employing effective coagulation, flocculation, clarification, filtration and disinfection achieving a sufficient CT value should produce water with a negligible risk of infection from enteric protozoa. Where possible, watersheds or aquifers that are used as sources of drinking water should be protected from faecal waste.
[Tables: log inactivation versus water temperature (°C)]
Toxoplasma gondii is an obligate, intracellular parasite that affects almost all warm-blooded animals, including humans. It is usually transmitted by ingestion of tissue cysts through consumption of raw or undercooked infected meat, by ingestion of sporulated oocysts through consumption of contaminated food or water or after handling contaminated soil or infected cat faeces. Oocysts are extremely resistant to environmental conditions, including drying, and appear to retain their infectivity for several months (at temperatures of -5°C) (Dubey, 1998). Although this organism tends to cause mild flu-like symptoms, it can be life-threatening for immunocompromised individuals and pregnant women. Infection can result in mental retardation, loss of vision, hearing impairment and mortality in congenitally infected children. Little is known about the distribution of this organism in water sources; however, oocysts have been reported to survive for up to 17 months in tap water. There have been six reported human outbreaks of toxoplasmosis linked to ingestion of contaminated soil and water, including an outbreak in British Columbia in 1995 (Karanis et al., 2007). This outbreak involved 110 acute cases, including 42 pregnant women and 11 neonates (Bowie et al., 1997), and was thought to be due to contamination of a water reservoir by domestic and wild cat faeces (Isaac-Renton et al., 1998; Aramini et al., 1999). Limited information is available on the efficacy of water treatment processes in removing or inactivating T. gondii. However, because of its size, it should be readily removed by conventional coagulation/sedimentation and filtration processes. Accordingly, water treatment processes applied for the removal/inactivation of Giardia and Cryptosporidium should be effective against this organism.
Cyclospora cayetanensis is an obligate, intracellular coccidian parasite whose only natural host is humans (Eberhard et al., 2000). Cyclosporiasis has been reported worldwide but appears to be endemic throughout the tropics (Soave, 1996). Exact routes of transmission have yet to be elucidated; however, person-to-person transmission is unlikely (i.e., unsporulated oocysts are shed in faeces and require a period of maturation). Transmission is likely through food and water that have been contaminated with human faeces. Cyclospora cayetanensis has been detected in environmental samples, including water and wastewater, but detection still represents a challenge; few prevalence studies exist owing to the lack of sensitive methods, including methods to assess viability and infectivity. Cyclospora cayetanensis infection causes symptoms that mimic those caused by Cryptosporidium (i.e., nausea, anorexia, diarrhoea, etc.). Illness is usually self-limiting, but long-term health effects have been reported, including Reiter's syndrome. Epidemiological evidence strongly suggests that water can transmit C. cayetanensis. The first outbreak of cyclosporiasis to be associated with drinking water occurred in 1990 among hospital staff in Chicago, Illinois (Karanis et al., 2007), and was linked to a chlorinated water supply, suggesting that C. cayetanensis is resistant to the levels of chlorine used in drinking water treatment. Although the efficacy of drinking water treatment processes for removal and/or inactivation of C. cayetanensis has not been evaluated, removal by conventional coagulation and filtration should be at least as effective as for Cryptosporidium, given that C. cayetanensis oocysts are larger.
Entamoeba histolytica is an obligate parasite that affects humans and other primates. Humans are the only reservoirs of significance, shedding trophozoites, cysts or both in their faeces. Entamoeba histolytica can be transmitted through ingestion of faecally contaminated water and food, but person-to-person contact is thought to be the primary route of transmission. Most infections are asymptomatic, but some can cause serious illness (i.e., amoebiasis). In the case of symptomatic infections, diarrhoea, fever and abdominal pain are common. More serious health effects, including chronic colitis, liver abscesses and death, have been reported (Kucik et al., 2004). Entamoeba histolytica cysts are resistant to environmental degradation; however, their survival is primarily a function of temperature. Cysts are rapidly killed by modest heat and freezing (Gillin and Diamond, 1980). Although no waterborne outbreaks of amoebiasis have been reported in Canada, outbreaks have been reported in the United States and elsewhere (Karanis et al., 2007). Outbreaks have occurred when chlorinated water became contaminated with sewage. Limited information is available on the efficacy of water treatment processes in removing or inactivating Entamoeba histolytica. However, because of its large cysts, it should be readily removed by conventional coagulation/sedimentation and filtration processes. Accordingly, water treatment processes applied for the removal/inactivation of Giardia and Cryptosporidium should be effective against this organism.
Mathematical models have been developed as a means to quantitatively assess the potential microbiological risks associated with a drinking water system, including the potential risks associated with bacterial, protozoan and viral pathogens. These models have been developed by international organizations (Smeets et al., 2008; Teunis et al., 2009), as well as by groups within Canada (Jaidi et al., 2009). QMRA models have also been used to estimate the potential health risks through other routes of exposure (Mara et al., 2007; Armstrong and Haas, 2008; Diallo et al., 2008). Although some of the assumptions vary between models (e.g., the choice of reference pathogen or the selection of dose-response variables), all are based on the accepted principles of QMRA: hazard identification, exposure assessment, dose-response assessment and risk characterization.
A QMRA model was developed by Health Canada as part of the risk assessment process for enteric pathogens in drinking water. This probabilistic model explores the potential disease burden, with associated uncertainty, for user-defined scenarios for a drinking water system. The model includes user inputs for the protozoal quality of the raw water source and the specific treatment train (defined in terms of filtration and disinfection approaches). Cryptosporidium and Giardia are used as the reference protozoa. For drinking water systems where data are lacking for the above parameters, the model includes values from the published literature and from expert opinion as a starting point for the assessment. For source water quality, the model provides users with a choice of four categories. These source water quality estimates were developed to be used only within the context of the QMRA model for evaluating the impacts of variations in source water quality on the overall microbiological risks. It should be noted that although a source may fall into a particular category for enteric protozoa, it may fall into a different source water quality category for bacterial or viral pathogens. For treatment processes, the model uses a range of literature values to more accurately represent the range of effectiveness of treatment methodologies.
The QMRA model uses this exposure information, along with the dose-response models and the DALY calculations for Cryptosporidium and Giardia, to estimate the potential disease burden (in DALYs/person per year) for the site-specific scenario information. The quality of the outputs from the QMRA model is dependent on the quality of the information that is input into the model. Measurements, as opposed to estimates, of exposure levels will result in a higher-quality risk assessment output. Even with high-quality exposure data, the QMRA model requires numerous assumptions that introduce uncertainty into the assessment: