March 2011
The guideline for enteric viruses is a health-based treatment goal of a minimum 4-log reduction (i.e., removal and/or inactivation) of enteric viruses. Depending on the source water quality, a greater log reduction may be required. Methods currently available for the detection of enteric viruses are not applicable for routine monitoring. Treatment technologies and watershed or wellhead protection measures known to reduce the risk of waterborne illness should be implemented and maintained if source water is subject to faecal contamination or if enteric viruses have been responsible for past waterborne outbreaks.
Viruses are extremely small microorganisms that are incapable of replicating outside a host cell. In general, viruses are host specific, which means that viruses that infect animals or plants do not usually infect humans, although a small number of enteric viruses have been detected in both humans and animals. Most viruses also infect only certain types of cells within a host; consequently, the health effects associated with a viral infection vary widely. Viruses that can multiply in the gastrointestinal tract of humans or animals are known as "enteric viruses." There are more than 140 enteric viruses known to infect humans.
Although there are methods capable of detecting and measuring viruses in drinking water, they are not practical for routine monitoring because of methodological and interpretation limitations. The microbiological quality of drinking water continues to be verified by relying on the monitoring of indicators, such as Escherichia coli. The presence of E. coli indicates faecal contamination and the potential presence of enteric viruses. However, the absence of E. coli does not necessarily mean that enteric viruses are also absent. Therefore, E. coli monitoring needs to be used as part of a "source-to-tap" multi-barrier approach to protect the quality of drinking water.
Health Canada recently completed its review of the health risks associated with enteric viruses in drinking water. This Guideline Technical Document reviews and assesses identified health risks associated with enteric viruses in drinking water. It evaluates new studies and approaches and takes into consideration the methodological and interpretation limitations in available methods for the detection of viruses in drinking water. From this review, the guideline for enteric viruses in drinking water is established as a treatment goal of a minimum 4-log reduction of enteric viruses.
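For clarity, the log reduction achieved by treatment relates the virus concentration entering the treatment process (C_in) to the concentration in the treated water (C_out); this is the standard definition and is not specific to this document:

```latex
\text{log reduction} = \log_{10}\!\left(\frac{C_{\text{in}}}{C_{\text{out}}}\right),
\qquad
4\text{-log} \;\Rightarrow\; \frac{C_{\text{out}}}{C_{\text{in}}} = 10^{-4} \;(\text{i.e., a } 99.99\% \text{ reduction})
```

For example, reducing a source water concentration of 100 infectious viruses/L to 0.01 infectious viruses/L in the treated water corresponds to a 4-log reduction.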
The human illnesses associated with enteric viruses are diverse. The main health effect associated with enteric viruses is gastrointestinal illness. The incubation time and severity of health effects are dependent on the specific virus responsible for the infection. In addition to gastroenteritis, enteric viruses can cause serious acute illnesses, such as meningitis, poliomyelitis and non-specific febrile illnesses. They have also been implicated in chronic diseases, such as diabetes mellitus and chronic fatigue syndrome.
The seriousness of the health effects from a viral infection will depend on the specific virus, as well as the characteristics of the individual affected (e.g., age, health status). In theory, a single infectious virus particle can cause infection; however, it usually takes more than a single particle. For many enteric viruses, the number of infectious virus particles needed to cause an infection is presumed to be low.
Enteric viruses cannot multiply in the environment, but they can survive longer in water than most intestinal bacteria and are more infectious than most other microorganisms. Enteric viruses are excreted in the faeces of infected individuals, and some enteric viruses can also be excreted in urine.
Enteric viruses have been detected in surface water and groundwater sources. Recent scientific data have also shown the presence of enteric viruses in groundwater that had been considered less vulnerable to faecal contamination.
The multi-barrier approach is the best approach to reduce enteric viruses and other waterborne pathogens in drinking water. For these types of contaminants, the focus should be on characterizing source water risks and ensuring that effective treatment barriers are in place to achieve safe drinking water. Generally, minimum treatment of supplies derived from surface water sources or groundwater under the direct influence of surface waters should include adequate filtration (or technologies providing an equivalent log reduction credit) and disinfection. Recent published information has shown the presence of enteric viruses in some groundwater sources that were considered to be less vulnerable to faecal contamination (i.e., those not under the direct influence of surface waters). As a result, it is recommended to ensure adequate treatment of all groundwaters to remove/inactivate enteric viruses, unless exempted by the responsible authority.
Once the source water quality has been characterized, pathogen removal/inactivation targets and effective treatment barriers can be established in order to achieve safe levels in the finished drinking water. The removal of enteric viruses from raw water is complicated by their small size and relative ease of passage through filtration barriers. However, viruses are effectively inactivated through the application of various disinfection technologies, individually or in combination, at relatively low dosages. In drinking water supplies with a distribution system, a disinfectant residual should be maintained at all times.
Quantitative microbial risk assessment (QMRA) can be used as part of a multi-barrier approach to help provide a better understanding of risk related to a water system. QMRA uses source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the burden of disease associated with exposure to pathogenic microorganisms in a drinking water source. Through this assessment, variations in source water quality and treatment performance can be evaluated for their contribution to the overall risk. Such analysis can be used to assess the adequacy of existing control measures or the requirement for additional treatment barriers or optimization and help establish limits for critical control points.
Specific enteric viruses whose characteristics make them good representatives of all similar pathogenic viruses are considered in QMRA; from these, a reference virus is selected. Ideally, a reference virus will represent a worst-case combination of high occurrence, high concentration and long survival time in source water, low removal and/or inactivation during treatment, and a high pathogenicity for all age groups. If the reference virus is controlled, it is assumed that all other similar viruses of concern are also controlled. Numerous enteric viruses have been considered as candidates. As no single virus has all of the characteristics of an ideal reference virus, this risk assessment incorporates key characteristics of rotavirus together with other enteric viruses to better represent all enteric viruses of concern.
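As an illustration of the QMRA calculation chain described above, the following sketch (in Python) estimates an annual risk of infection from a source water concentration, a treatment log reduction and a daily consumption volume. The function names, the assumed consumption of 1 L/day and the approximate beta-Poisson dose-response parameters shown here are illustrative assumptions only, broadly of the kind used for rotavirus by Haas et al. (1999); they are not values prescribed by this document.

```python
def beta_poisson_infection_prob(dose, alpha=0.253, n50=5.597):
    """Approximate beta-Poisson dose-response model.

    alpha and n50 (median infective dose) are illustrative rotavirus-type
    values of the kind reported by Haas et al. (1999); other enteric viruses
    would require different parameters.
    """
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)


def annual_risk_of_infection(source_conc_per_litre, log_reduction,
                             litres_per_day=1.0, days=365):
    """Estimate the annual risk of infection for a single reference virus.

    source_conc_per_litre : infectious viruses per litre in the source water
    log_reduction         : total log10 removal/inactivation achieved by treatment
    litres_per_day        : assumed daily consumption of unboiled tap water
    """
    treated_conc = source_conc_per_litre * 10.0 ** (-log_reduction)
    daily_dose = treated_conc * litres_per_day
    p_daily = beta_poisson_infection_prob(daily_dose)
    # Assume independent daily exposures over one year.
    return 1.0 - (1.0 - p_daily) ** days


# Example: 1 infectious virus/L in the source water, treated with a 4-log reduction.
print(annual_risk_of_infection(source_conc_per_litre=1.0, log_reduction=4.0))
```

Repeating such a calculation across the observed range of source water concentrations, and for hazardous events such as short-term water quality degradation, shows whether the minimum 4-log reduction or a greater reduction is needed to meet a given health-based target.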
Note: Specific guidance related to the implementation of the drinking water guidelines should be obtained from the appropriate drinking water authority in the affected jurisdiction.
Exposure to viruses should be limited by implementing a "source-to-tap" approach to protect the quality of drinking water. This approach includes assessing the entire drinking water system, from the source water through the treatment and distribution systems to the consumer.
Source water assessments should be part of routine vulnerability assessments and/or sanitary surveys. They should include identifying potential sources of faecal contamination in the watershed/aquifer that may impact the quality of the water. Groundwater sources should also be assessed for their vulnerability to contamination. Risk factors may include the type of overlying soil, the land uses surrounding the well and the condition/construction of the well. Sources of human faecal matter, such as sewage treatment plant effluents, sewage lagoon discharges and improperly maintained septic systems, have the potential to be significant sources of human enteric viruses. Faecal matter from wildlife and other animals is not considered a significant source of enteric viruses capable of causing illness in humans, since viruses are generally host specific.
Assessments of water quality need to consider the "worst-case" scenario for that source water. For example, there may be a short period of poor source water quality following a storm. This short-term degradation in water quality may in fact embody most of the risk in a drinking water system. Although routine monitoring of drinking water for enteric viruses is not practical, collecting and analysing source water samples for enteric viruses can provide useful information to help determine the level of treatment that should be in place to reduce the risk of illness from enteric viruses. Source water samples are generally collected at a location that is representative of the quality of the water supplying the drinking water system, such as at the intake of the water treatment plant or, in the case of groundwater, close to the well. In many places, source water sampling for enteric viruses may not be feasible. The potential risk of enteric viruses can be estimated using information from the source water assessment along with the results of other water quality parameters, such as indicator organisms, to provide information on the risk and/or level of faecal contamination in the source water. Because all water quality assessments will have a level of uncertainty associated with them, additional factors of safety during engineering and design of the treatment plant or a greater log reduction than calculated using a QMRA approach should be applied in order to ensure production of drinking water of an acceptable microbiological quality.
The information obtained from source water assessments is a key component to carrying out site-specific risk assessments. This information should be used along with treatment and distribution system information to help assess risks from source to tap. This document suggests the use of QMRA as a tool that can help provide a better understanding of the water system by evaluating the impacts of variations in source water quality and treatment process performance on the overall risk, including the potential impact of hazardous events, such as storms, contamination events or the failure of a treatment barrier. The resulting analysis can be used to assess the adequacy of existing control measures, to determine the need for additional treatment barriers or for optimization and to help establish limits for critical control points.
A minimum 4-log reduction of enteric viruses is recommended for all water sources, including groundwater sources. Recent published information has shown the presence of enteric viruses in some groundwater sources that were considered to be less vulnerable to faecal contamination. A jurisdiction may allow a groundwater source to have less than the recommended minimum 4-log reduction if the assessment of the drinking water system has confirmed that the risk of enteric virus presence is minimal. In many source waters, particularly surface water sources, a greater than 4-log reduction is necessary.
Reductions can be achieved through physical removal processes, such as filtration, and/or by inactivation processes, such as disinfection. Generally, minimum treatment of supplies derived from surface water sources or groundwater under the direct influence of surface waters should include adequate filtration (or technologies providing an equivalent log reduction credit) and disinfection. For groundwater sources (i.e., those not under the direct influence of surface waters), it is recommended to ensure adequate treatment to remove/inactivate enteric viruses, unless exempted by the responsible authority. The appropriate type and level of treatment should take into account the potential fluctuations in water quality, including short-term water quality degradation, and variability in treatment performance. Pilot testing or other optimization processes may be useful for determining treatment variability. In drinking water supplies with a distribution system, a disinfectant residual should be maintained at all times.
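Where jurisdictions assign log reduction credits to individual treatment barriers, the overall reduction is typically the sum of the individual credits on the log scale. The credit values in the example below are purely illustrative; actual credits depend on the process, its validated performance and the responsible authority:

```latex
\text{total log reduction} \;=\; \sum_{i} (\text{log credit})_i,
\qquad \text{e.g.,} \quad
\underbrace{2.0}_{\text{filtration}} \;+\; \underbrace{2.0}_{\text{primary disinfection}} \;=\; 4.0\text{-log}
```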
As part of the multi-barrier approach, indicators that can be routinely monitored, including turbidity, chlorine residual and organisms such as E. coli and total coliforms, should be used to verify that the water has been adequately treated and is therefore of an acceptable microbiological quality (see Guideline Technical Documents on E. coli, total coliforms, and turbidity). These indicators can also be used for assessing the distribution system and to verify that the microbiological quality of the water is being maintained through the distribution system to the consumer's tap.
Viruses range in size from 20 to 350 nm. They consist of a nucleic acid genome core (either ribonucleic acid [RNA] or deoxyribonucleic acid [DNA]) surrounded by a protective protein shell, called the capsid. Some viruses have a lipoprotein envelope surrounding the capsid; these are referred to as enveloped viruses. Non-enveloped viruses lack this lipoprotein envelope. Viruses can replicate only within a living host cell. Although the viral genome does encode for viral structural proteins and other molecules necessary for replication, viruses must rely on the host's cell metabolism to synthesize these molecules.
Viral replication in the host cells results in the production of infective virions and numerous incomplete particles that are non-infectious (Payment and Morin, 1990). The ratio between physical virus particles and the actual number of infective virions ranges from 10:1 to over 1000:1. In the context of waterborne diseases, a "virus" is thus defined as an infectious "complete virus particle," or "virion," with its DNA or RNA core and protein coat as it exists outside the cell. This would be the simplest form in which a virus can infect a host. Infective virions released in the environment will degrade and lose their infectivity. They can still be seen by electron microscopy or detected by molecular methods, but will have lost their potential for infection.
In general, viruses are host specific. Therefore, viruses that infect humans do not usually infect non-human hosts, such as animals or plants. The reverse is also true: viruses that infect animals and viruses that infect plants do not usually infect humans, although a small number of enteric viruses have been detected in both humans and animals. Most viruses also infect only specific types of cells within a host. The types of susceptible cells are dependent on the virus, and consequently the health effects associated with a viral infection vary widely, depending on where susceptible cells are located in the body. In addition, viral infection can trigger immune responses that result in non-specific symptoms. Viruses that can multiply in the gastrointestinal tract of humans or animals are known as "enteric viruses." Enteric viruses are excreted in the faeces of infected individuals, and some enteric viruses can also be excreted in urine. These excreta can contaminate water sources. Non-enteric viruses, such as respiratory viruses, are not considered waterborne pathogens, as non-enteric viruses are not readily transmitted to water sources from infected individuals.
More than 140 different serological types of enteric viruses known to infect humans have been described (AWWA, 1999a; Taylor et al., 2001). The illnesses associated with enteric viruses are diverse. In addition to gastroenteritis, enteric viruses can cause serious acute illnesses, such as meningitis, poliomyelitis and non-specific febrile illnesses. They have also been implicated in the aetiology of some chronic diseases, such as diabetes mellitus and chronic fatigue syndrome. Further information on the enteric viruses commonly associated with human waterborne illnesses, including noroviruses, hepatitis A virus (HAV), hepatitis E virus (HEV), rotaviruses and enteroviruses, and on other viruses of potential concern is included below.
Noroviruses are non-enveloped, single-stranded RNA viruses, 27-32 nm in diameter, belonging to the family Caliciviridae. Noroviruses are currently subdivided into five genogroups (GI, GII, GIII, GIV, GV), which are composed of 22 distinct genotypes. However, new norovirus variants continue to be identified (Jiang et al., 1999). Genogroups GI and GII contain the norovirus genotypes that are usually associated with human illnesses. For example, Norwalk virus is included in the GI genogroup. Genogroup GIV has also been associated with human illness, but much less frequently than the GI and GII genogroups (Bon et al., 2005). Although most noroviruses appear to be host specific, there is some recent research showing human norovirus GII variants isolated from farm animals (Mattison et al., 2007). Other genogroups, such as GIII and GV, have been detected only in non-human hosts (Vinje et al., 2004).
Norovirus infections occur in infants, children and adults. The incubation period is 24-48 h (Kapikian et al., 1996; Chin, 2000). Health effects associated with norovirus infections are self-limiting, typically lasting 24-48 h. Symptoms include nausea, vomiting, diarrhoea, abdominal pain and fever. In healthy individuals, the symptoms are generally highly unpleasant but are not considered life threatening. In vulnerable groups, such as the elderly, illness is considered more serious. Theoretically, a single infectious virus particle is sufficient to cause infection, with or without disease symptoms. However, the median dose required to initiate infection is usually more than a single infectious particle. For noroviruses, this dose is unknown, but presumed to be low. Immunity to norovirus infection seems to be short-lived, on the order of several months. After this period, individuals appear to become susceptible to the same strain of virus again (Parrino et al., 1977). There is some research showing an inherent resistance in some individuals to infection with noroviruses. It is thought that these individuals may lack a cell surface receptor necessary for virus binding or may have a memory immune response that prevents infection (Hutson et al., 2003; Lindesmith et al., 2003; Cheetham et al., 2007).
Noroviruses are shed in both faecal matter and vomitus from infected individuals and can be transmitted through contaminated water. They are also easily spread by person-to-person contact. Many of the cases of norovirus gastroenteritis have been associated with groups of people living in a close environment, such as schools, recreational camps, institutions and cruise ships. Infections with noroviruses show seasonality, with a peak in norovirus infections most common during winter months (Mounts et al., 2000; Haramoto et al., 2005; Maunula et al., 2005; Westrell et al., 2006b).
To date, six types of hepatitis viruses have been identified (A, B, C, D, E and G), but only two, hepatitis A virus (HAV) and hepatitis E virus (HEV), appear to be transmitted via the faecal-oral route and are therefore associated with waterborne transmission. Although HAV and HEV can both result in the development of hepatitis, they are two distinct viruses.
HAV is a 27- to 32-nm non-enveloped, small, single-stranded RNA virus with an icosahedral symmetry. HAV belongs to the Picornaviridae family and was originally placed within the Enterovirus genus; however, because HAV has some unique genetic structural and replication properties, this virus has been placed into a new genus, Hepatovirus, of which it is the only member (Carter, 2005).
HAV infections, commonly known as infectious hepatitis, result in numerous symptoms, including fever, malaise (fatigue), anorexia, nausea and abdominal discomfort, followed within a few days by jaundice. HAV infection can also cause liver damage, resulting from the host's immune response to the infection of the hepatocytes by HAV. In some cases, the liver damage can result in death. The incubation period of HAV infection is between 10 and 50 days, with an average of approximately 28-30 days. The incubation period is inversely related to dose: the greater the dose, the shorter the incubation period (Hollinger and Emerson, 2007). Theoretically, a single infectious virus particle is sufficient to cause infection, with or without disease symptoms. However, the median dose required to initiate infection is generally more than a single infectious particle. The median dose for HAV is unknown, but is presumed to be low.
Infection with HAV occurs in both children and adults. Illness resulting from HAV infection is usually self-limiting; however, the severity of the illness increases with age. For example, minimal or no symptoms are seen in younger children (Yayli et al., 2002); however, in a study looking at HAV cases in persons over 50 years of age, a case fatality rate 6-fold higher than the average rate of 0.3% was observed (Fiore, 2004). The virus is excreted in the faeces of infected persons for 3-10 days before the development of hepatitis symptoms, leading to transmission via the faecal-oral route (Chin, 2000; Hollinger and Emerson, 2007). HAV is also excreted in the urine of infected individuals (Giles et al., 1964; Hollinger and Emerson, 2007). Convalescence may be prolonged (8-10 weeks), and in some HAV cases, individuals may experience relapses for 12 months or more (Carter, 2005).
HEV is a non-enveloped virus with a diameter of 30-34 nm and a single-stranded polyadenylated RNA genome. HEV was initially placed in the family Caliciviridae; however, based on its genomic organization and enzymatic capabilities, it has since been moved out of this family and is currently not assigned to a family, although it does belong to the genus Hepevirus (Fauquet et al., 2005).
HEV infection, previously referred to as enterically transmitted non-A non-B hepatitis, is clinically indistinguishable from HAV infection. Symptoms include malaise, anorexia, abdominal pain, arthralgia, fever and jaundice. Theoretically, a single infectious virus particle is sufficient to cause infection, with or without disease symptoms. However, the median dose required to initiate infection is usually more than a single infectious particle. The median dose for HEV is unknown. The incubation period for HEV varies from 14 to 63 days. HEV infection usually resolves in 1-6 weeks after onset. Virions are shed in the faeces for a week or more after the onset of symptoms (Percival et al., 2004). The illness is most often reported in young to middle-aged adults (15-40 years old). The fatality rate is 0.5-3%, except in pregnant women, for whom the fatality rate can approach 20-25% (Matson, 2004). Illnesses associated with HEV are rare in developed countries, with most infections being linked to international travel. Although most human enteric viruses do not have non-human reservoirs, HEV has been reported to be zoonotic (transmitted from animals to humans, with non-human natural reservoirs) (AWWA, 1999a; Meng et al., 1999; Wu et al., 2000; Halbur et al., 2001; Smith et al., 2002).
Rotaviruses are non-enveloped, double-stranded RNA viruses approximately 70 nm in diameter, belonging to the family Reoviridae. These viruses have been divided into six serological groups, three of which (groups A, B and C) infect humans. Group A rotaviruses are further divided into serotypes using characteristics of their outer surface proteins, VP7 and VP4. There are 14 types of VP7 (termed G types) and approximately 20 types of VP4 (P types), generating great antigenic diversity (Carter, 2005). Although most rotaviruses appear to be host specific, there is some research indicating the potential for zoonotic transmission of rotaviruses (Cook et al., 2004; Kang et al., 2005; Gabbay et al., 2008; Steyer et al., 2008); however, it occurs infrequently.
In general, rotaviruses cause gastroenteritis, including vomiting and diarrhoea. Vomiting can occur for up to 48 h prior to the onset of diarrhoea. The severity of the gastroenteritis can range from mild, lasting for less than 24 h, to, in some instances, severe, which can be fatal. In young children, extra-intestinal manifestations, such as respiratory symptoms and seizures, can occur and are due to the infection being systemic rather than localized to the jejunal mucosa (Candy, 2007). The incubation period is about 4-7 days (Carter, 2005). The illness generally lasts between 5 and 8 days. Theoretically, a single infectious virus particle is capable of causing infection, although more than one infectious virus particle is generally required. The median infectious dose for rotavirus is 5.597 (Haas et al., 1999). The virus is shed in extremely high numbers from infected individuals, possibly as high as 10⁹/g of stool. Some rotaviruses may also produce a toxin protein that can induce diarrhoea during virus-cell contact (Ball et al., 1996; Zhang et al., 2000). This is unusual, as most viruses do not have toxin-like effects.
Group A rotavirus is endemic worldwide and is the most common and widespread rotavirus group. Infections are referred to as infantile diarrhoea, winter diarrhoea, acute non-bacterial infectious gastroenteritis and acute viral gastroenteritis. Children 6 months to 2 years of age, premature infants, the elderly and the immunocompromised are particularly prone to more severe symptoms caused by infection with group A rotavirus. Group A rotavirus is the leading cause of severe diarrhoea among infants and children and accounts for about half of the cases requiring hospitalization, usually from dehydration. In the United States, approximately 3.5 million cases occur each year (Glass et al., 1996). Asymptomatic infections can occur in adults, providing another means for the virus to be spread in the community. In temperate areas, illness associated with rotavirus occurs primarily in the cooler months, whereas in the tropics, it occurs throughout the year (Moe and Shirley, 1982; Nakajima et al., 2001; Estes and Kapikian, 2007). Illness associated with group B rotavirus, also called adult diarrhoea rotavirus, has been limited mainly to China, where outbreaks of severe diarrhoea affecting thousands of persons have been reported (Ramachandran et al., 1998). Group C rotavirus has been associated with rare and sporadic cases of diarrhoea in children in many countries and regions, including North America (Jiang et al., 1995). The first reported outbreaks occurred in Japan and England (Caul et al., 1990; Hamano et al., 1999).
The enteroviruses are a large group of viruses belonging to the genus Enterovirus and the Picornaviridae family. They are 20- to 30-nm non-enveloped, single-stranded RNA viruses with an icosahedral symmetry. The members of this group that are associated with illness in humans include the polioviruses (3 serotypes), coxsackieviruses A (23 serotypes) and B (6 serotypes), echoviruses (31 serotypes) and numerous ungrouped enteroviruses (types 68-91) (Nwachuku and Gerba, 2006). Further enterovirus serotypes continue to be identified.
The incubation period and the health effects associated with enterovirus infections are varied. The incubation period for enteroviruses ranges from 2 to 35 days, with a median of 7-14 days. Many enterovirus infections are asymptomatic. However, when symptoms are present, they can range in severity from mild to life threatening. Viraemia (i.e., passage in the bloodstream) often occurs, providing transport for enteroviruses to various target organs and resulting in a range of symptoms. Mild symptoms include fever, malaise, sore throat, vomiting, rash and upper respiratory tract illnesses. Acute gastroenteritis is less common. The most serious complications include meningitis, encephalitis, poliomyelitis, myocarditis and non-specific febrile illnesses of newborns and young infants (Rotbart, 1995; Roivainen et al., 1998). Other complications include myalgia, Guillain-Barré syndrome, hepatitis and conjunctivitis. Enteroviruses have also been implicated in the aetiology of chronic diseases, such as inflammatory myositis, dilated cardiomyopathy, amyotrophic lateral sclerosis, chronic fatigue syndrome and post-poliomyelitis muscular atrophy (Pallansch and Roos, 2007; Chia and Chia, 2008). There is also some work that supports a link between enterovirus infection and the development of insulin-dependent diabetes mellitus (Nairn et al., 1999; Lönnrot et al., 2000). Although many enterovirus infections are asymptomatic, it is estimated that approximately 50% of coxsackievirus A infections and 80% of coxsackievirus B infections result in illness (Cherry, 1992). Coxsackievirus B has also been reported to be the non-polio enterovirus that has most often been associated with serious illness (Mena et al., 2003). Enterovirus infections are reported to peak in summer and early fall (Nwachuku and Gerba, 2006; Pallansch and Roos, 2007).
Enteroviruses are endemic worldwide, but few water-related outbreaks have been reported. The large number of serotypes, the usually benign nature of the infections and the fact that they are highly transmissible in a community by personal contact probably explain why so little is known of their transmission by the water route (Field et al., 1968; Lenaway et al., 1989; Ikeda et al., 1993; Kee et al., 1994; Melnick, 1996; Jaykus, 2000; Lees, 2000; Amvrosieva et al., 2001; Mena et al., 2003).
Adenoviruses are members of the Adenoviridae family. Members of this family include 70- to 100-nm non-enveloped icosahedral viruses containing double-stranded DNA. At present, there are 51 serotypes of adenoviruses; about 30% of these are pathogenic in humans, most causing upper respiratory tract infections (Carter, 2005; Wold and Horwitz, 2007). Serotypes 40 and 41 are the cause of the majority of adenovirus-related gastroenteritis. The majority of waterborne isolates are types 40 and 41; however, other serotypes have also been isolated (Van Heerden et al., 2005). Symptoms of adenovirus gastroenteritis include diarrhoea and vomiting. The incubation period lasts 3-10 days, and illness may last a week (Carter, 2005).
Adenoviruses are a common cause of acute viral gastroenteritis in children (Nwachuku and Gerba, 2006). Infections are generally confined to children under 5 years of age (FSA, 2000; Lennon et al., 2007) and are rare in adults. The viral load in faeces of infected individuals is high (~10⁶ particles/g of faecal matter) (Jiang, 2006). This aids in transmission via the faecal-oral route, either through direct contact with contaminated objects or through recreational water and, potentially, drinking water. In the past, adenoviruses have been implicated in drinking water outbreaks, although they were not the main cause of the outbreaks (Kukkula et al., 1997; Divizia et al., 2004). The main route of exposure to adenoviruses is not through drinking water.
Astroviruses are members of the Astroviridae family. Astroviruses are divided into eight serotypes (HAst1-8) that comprise two genogroups (A and B) capable of infecting humans (Carter, 2005). Members of this family include 28- to 30-nm non-enveloped viruses containing a single-stranded RNA. Astrovirus infection typically results in diarrhoea lasting 2-3 days, with an initial incubation period of anywhere from 1 to 4 days. Infection generally results in milder diarrhoea than that caused by rotavirus and does not lead to significant dehydration. Other symptoms that have been recorded as a result of astrovirus infection include headache, malaise, nausea, vomiting and mild fever (Percival et al., 2004; Méndez and Arias, 2007). Serotypes 1 and 2 are commonly acquired during childhood (Palombo and Bishop, 1996). The other serotypes (4 and above) may not occur until adulthood (Carter, 2005). Outbreaks of astrovirus in adults are infrequent, but do occur (Gray et al., 1987; Oishi et al., 1994; Caul, 1996). Healthy individuals generally acquire good immunity to the disease, so reinfection is rare. Astrovirus infections generally peak during winter and spring (Gofti-Laroche et al., 2003).
Coronaviruses are members of the Coronaviridae family. Coronaviruses are enveloped, single-stranded RNA viruses. Coronaviruses are primarily respiratory pathogens; they are a frequent cause of the common cold in both children and adults (McIntosh, 1970; Mäkelä, 1998). In the past, coronaviruses have not been a concern for waterborne transmission. However, a new coronavirus, the causative agent of severe acute respiratory syndrome (SARS), has been detected in faeces of infected patients. In one location during the SARS epidemic, sewage was suspected as the vehicle of transmission (WHO, 2003). Although the SARS coronavirus is potentially spread through the faecal-oral route, its major mode of transmission is person-to-person contact through respiratory secretions. However, further research is still needed to understand the persistence of this virus in the environment and, consequently, its potential transmission through a waterborne route.
Other viruses that have the potential to be transmitted by the faecal-oral route include parvoviruses, TT virus and JC virus. These viruses have all been detected in sewage (Vaidya et al., 2002; Bofill-Mas and Girones, 2003; AWWA, 2006). JC virus is also excreted in urine. These viruses have been associated with illnesses in immunocompromised individuals, such as gastroenteritis, respiratory illnesses and other more serious diseases, including progressive multifocal leukoencephalopathy and colon cancer (AWWA, 2006). Although they have been detected in sewage, their transmission through water has not been documented. It is also important to note that new enteric viruses continue to be detected and recognized.
The main source of human enteric viruses in water is human faecal matter. Enteric viruses are excreted in large numbers in the faeces of infected persons (both symptomatic and asymptomatic). They are easily disseminated in the environment through faeces and are transmissible to other individuals via the faecal-oral route. Infected individuals can excrete over 1 billion (10⁹) viruses/g of faeces. Some enteric viruses can also be excreted in urine from infected individuals. The presence of these viruses in a human population is variable and reflects current epidemic and endemic conditions (Fields et al., 1996). Sewage plant effluents, sewage lagoon discharges, combined sewer overflows and septic tank leakage can be responsible for contamination of water sources. Enteric virus concentrations have been reported to peak in sewage samples during the autumn/winter, suggesting a possibly higher endemic rate of illness during this time of year or better survival of enteric viruses at cold temperatures. Animals can be a source of enteric viruses; however, the enteric viruses detected in animals generally do not cause illnesses in humans (Cox et al., 2005), although there are some exceptions. As mentioned above, one exception is HEV, which may have a non-human reservoir. To date, HEV has been an issue in developing countries, and therefore most of the information on HEV occurrence in water sources results from research in these countries. There is limited information on HEV presence in water and sewage in developed countries (Clemente-Casares et al., 2003; Kasorndorkbua et al., 2005). It is important to remember that person-to-person and foodborne spread are also important mechanisms for transmission of enteric viruses.
Most of the enteric viruses described above, including noroviruses, rotaviruses, HAV, HEV, enteroviruses, adenoviruses and astroviruses, have been detected in sewage, surface water sources, groundwater sources and drinking water sources around the world, including Canada (Subrahmanyan, 1977; Sattar, 1978; Sekla et al., 1980; Payment et al., 1984, 2000, 2001; Gerba et al., 1985; Raphael et al., 1985a,b; Payment, 1989, 1991, 1993; Bloch et al., 1990; Payment and Franco, 1993; Pina et al., 1998, 2001; AWWA, 1999a; Jothikumar et al., 2000; Scipioni et al., 2000; Van Heerden et al., 2005; Locas et al., 2007). These studies report varying prevalence and concentrations of enteric viruses; however, they cannot be readily compared, given the range of detection methods used (Payment and Pintar, 2006). In general, the level of infectious enteric virions in sewage ranges from 100 to 10 000 infectious units/L (Sano et al., 2004; Sedmak et al., 2005; Geldreich et al., 1990). In contaminated surface water, levels of 1-100 infectious enteric virions/L are common. In less polluted surface water, their numbers are closer to 1-10/100 L (Gerba et al., 1985; Bloch et al., 1990; AWWA, 1999a; Jothikumar et al., 2000; Scipioni et al., 2000; Pina et al., 2001; Dorner et al., 2007). Groundwater sources have been shown to have between 0 and 200 infectious enteric virions/100 L, depending on the level of contamination; however, most contaminated groundwater systems are thought to have very low levels (< 2/100 L) (U.S. EPA, 2006a). These concentrations were generally obtained through targeted studies, since water and wastewater sources are not routinely monitored for enteric viruses. It should also be noted that the concentration of enteric viruses in a source water can have significant temporal and spatial variability depending on whether the pollution source is continuous or the result of a sudden influx of faecal contamination.
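As a rough illustration of what these concentrations imply for treatment, the required log reduction can be calculated for a given source water concentration and an assumed treated-water target. In the sketch below (Python), the target of 1 infectious virus per 100 000 L is an assumption chosen for illustration only; an actual target would be derived from a health-based risk level through a QMRA.

```python
import math

# Assumed treated-water target: 1 infectious virus per 100,000 L (illustrative only).
TARGET_PER_LITRE = 1.0 / 100_000.0


def required_log_reduction(source_conc_per_litre, target_per_litre=TARGET_PER_LITRE):
    """Log10 reduction needed to bring the source concentration down to the target."""
    return math.log10(source_conc_per_litre / target_per_litre)


# Concentration ranges reported in the text, expressed per litre.
examples = {
    "contaminated surface water, upper range (100/L)": 100.0,
    "less polluted surface water (10/100 L)": 0.1,
    "contaminated groundwater (2/100 L)": 0.02,
}

for label, conc in examples.items():
    print(f"{label}: {required_log_reduction(conc):.1f}-log reduction needed")
```

A calculation of this kind makes clear why a greater than 4-log reduction is often necessary for surface water sources.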
Contamination of water sources can occur through various routes, including wastewater plant effluent, disposal of domestic wastewater or sludges on land, septic tank field effluents and infiltration of surface water into groundwater aquifers (Bitton, 1999; Hurst et al., 2001). Migration of enteric viruses into groundwater sources depends on the extent of virus retention in the surrounding soils and their survival rate. For example, research on retention of particles based on subsurface composition has shown that viruses tend to adsorb more strongly to clay than to silt and sand particles (Goyal and Gerba, 1979). It is important to note that adsorption of viruses in the subsurface does not inactivate viruses, and adsorption is a reversible process, dependent on the ionic characteristics of the percolating water (Bales et al., 1993). Therefore, retained infectious viruses, if desorbed from soil, can potentially still contaminate water sources. Many factors, including rainfall, temperature, hydraulic stresses and soil-specific characteristics, such as pH and soil water content, along with virus-specific attributes, such as isoelectric point, virus size and virus load, can impact subsurface movement of viruses (Schijven and Hassanizadeh, 2000; U.S. EPA, 2003). Under certain conditions, viruses can migrate significant distances. Studies have reported viruses being detected in groundwater samples more than 100 m from known septic sources and in groundwaters from confined aquifers (Gerba and Bitton, 1984; Bales et al., 1993; Borchardt et al., 2007; Locas et al., 2007).
As mentioned previously, viruses cannot replicate outside their host's tissues and therefore cannot multiply in the environment; however, they can survive in the environment for extended periods of time. Early experiments investigating the survival of enteric viruses reported survival times ranging from 28 to 188 days (Rhodes et al., 1950; Wellings et al., 1975; Stramer and Cliver, 1984).
Survival depends on numerous factors, including, but not limited to, virus-specific characteristics, the presence of other microorganisms and the characteristics of the water, such as pH, temperature, turbidity and ultraviolet (UV) light levels. Some of these factors have been characterized. Temperature effects on survival rates have been defined for many enteric viruses (Yates et al., 1985; U.S. EPA, 2003). In general, as the temperature increases, the survival time decreases. Exposure to UV light also shortens the survival time of viruses. Other parameters, such as microbial activity, are less well characterized. It has been suggested that bacteria and protozoa can inactivate waterborne viruses, especially in surface waters. Inactivation may be the result of enzymatic activity destroying viral capsid proteins or predation (Herrmann and Cliver, 1973; Pinheiro et al., 2007). In either case, the amount of inactivation is dependent on the microbial ecology and is currently not well understood. Conversely, survival of viruses can be prolonged by factors such as the presence of sediments to which viruses can readily adsorb.
The survival rate varies between types of viruses. Enteric viruses of concern for waterborne transmission are generally non-enveloped and are more resistant to environmental degradation than enveloped viruses. Comparisons between enteric viruses also show variability, with adenoviruses potentially surviving longer in water than other enteric viruses, such as HAV and polioviruses (Enriquez et al., 1995).
Enteric virus survival rates also differ from survival rates for protozoa and bacteria. In the environment, enteric viruses have been reported to be more resistant to environmental degradation than bacteria and some protozoa (e.g., Giardia) (Johnson et al., 1997). Survival of viruses through drinking water treatment processes also differs from survival of bacteria and protozoa. For example, enteric viruses have been detected at low levels (i.e., 1-20/1000 L) in treated drinking water free of coliform bacteria in 100-mL samples (Payment, 1989; Gerba and Rose, 1990; Bitton, 1999; Payment et al., 2000; Ehlers et al., 2005). The survival of enteric viruses, and consequently presence in drinking water, can result from the absence of treatment (for many groundwater sources) or from insufficient treatment for the level of virus in the source water (Payment, 1989; Payment and Armon, 1989; Gerba and Rose, 1990; Payment et al., 1997; Bitton, 1999).
Enteric viruses are transmitted via the faecal-oral route. Vehicles for transmission can include water, food (particularly shellfish and salads), aerosols, fomites (inanimate objects, such as door handles that, when contaminated with an infectious virion, facilitate transfer of the pathogen to a host) and person-to-person contact. Poor hygiene is also a contributing factor to the spread of enteric viruses. In addition, the high incidence of rotavirus infections, particularly in young children, has suggested to some investigators that rotavirus may also be spread by the respiratory route (Kapikian and Chanock, 1996; Chin, 2000). There is also some evidence that noroviruses can be spread by contact with vomitus (Marks et al., 2003). For many of the enteric viruses discussed above, outbreaks have occurred both by person-to-person transmission and by common sources, involving contaminated foods, contaminated drinking water supplies or recreational water.
Exposure to enteric viruses through water can result in both an endemic rate of illness in the population and waterborne outbreaks. Endemic rates of enteric illness are difficult to measure or estimate. In Canada, there are roughly 1.3 episodes of enteric illness per capita per year (Majowicz et al., 2004; PHAC, 2007). This estimate includes gastrointestinal illnesses caused by all types of enteric pathogens, not just enteric viruses, and includes all sources of transmission. The numerous routes of transmission and the highly infectious nature of enteric viruses make it difficult to determine what proportion of the endemic enteric illness is specifically related to drinking water sources.
Waterborne outbreaks caused by enteric viruses have been reported in Canada, and these viruses are a common cause of outbreaks worldwide. Some of the viral agents responsible for these outbreaks have only recently been identified (Craun, 1986, 1992; Fields et al., 1996; Payment and Hunter, 2001). The true prevalence of viral-related waterborne outbreaks in Canada and worldwide is unknown. In Canada, between 1974 and 2001, there were 24 reported outbreaks and 1382 confirmed cases of waterborne illness caused by enteric viruses (Schuster et al., 2005). Ten of these outbreaks were attributed to HAV, 12 were attributed to noroviruses and 2 to rotaviruses (O'Neil et al., 1985; Health and Welfare Canada, 1990; Health Canada, 1994; INSPQ, 1994, 1998, 2001; Boettger, 1995; Health Canada, 1996; Beller et al., 1997; Todd and Chatman, 1997, 1998; De Serres et al., 1999; Todd et al., 2001; BC Provincial Health Officer, 2001). There were also 138 outbreaks of unknown aetiology, a portion of which could be the result of enteric viruses, and a single outbreak that involved multiple viral pathogens. Of the 10 reported outbreaks attributed to waterborne HAV, 4 were due to contamination of public drinking water supplies, 2 were the result of contamination of semi-public supplies and the remaining 4 were due to contamination of private water supplies. Only 4 of the 12 reported waterborne outbreaks of norovirus infections in Canada occurred in public water supplies, and the remainder were attributed to semi-public supplies. Both rotavirus outbreaks arose from contamination of semi-public drinking water supplies.
In the United States, between 1991 and 2002, 15 outbreaks and 3487 confirmed cases of waterborne viral illness were reported. Of these, 12 outbreaks and 3361 cases were attributed to noroviruses, 1 outbreak and 70 cases were attributed to "small round-structured virus" and 2 outbreaks and 56 cases were attributable to HAV (Craun et al., 2006). During this period, 77 outbreaks resulting in 16 036 cases of unknown aetiology were also reported. It is likely that enteric viruses were responsible for a significant portion of these outbreaks (Craun et al., 2006). Prior to 1991, outbreaks associated with rotavirus contamination had also been reported (Hopkins et al., 1984).
Waterborne outbreaks of noroviruses and HAV occur worldwide (Brugha et al., 1999; De Serres et al., 1999; Brown et al., 2001; Boccia et al., 2002; Anderson et al., 2003; Carrique-Mas et al., 2003). A study of waterborne outbreaks in Finland (1998-2003) found that, among the samples analysed for viruses, norovirus was the most prominent virus detected (Maunula et al., 2005). Groundwater sources are also frequently reported to be associated with outbreaks of noroviruses and HAV (Häfliger et al., 2000; Maurer and Stürchler, 2000; Parshionikar et al., 2003).
Major waterborne epidemics of HEV have occurred in developing countries (Guthmann et al., 2006), but none has been reported in Canada or the United States (Purcell, 1996; Chin, 2000). Astroviruses and adenoviruses have also been implicated in drinking water outbreaks, although they were not the main cause of the outbreaks (Kukkula et al., 1997; Divizia et al., 2004). However, astrovirus RNA in tap water was correlated with an increased risk of intestinal disease in a study in France (Gofti-Laroche et al., 2003). The development of new detection methods to determine the agent responsible in the numerous outbreaks of unidentified aetiology could potentially link these viruses to outbreaks (Martone et al., 1980; Turner et al., 1987; Hedberg and Osterholm, 1993; Gray et al., 1997; Kukkula et al., 1997, 1999; Lees, 2000).
Monitoring for enteric viruses still suffers from methodological and interpretation limitations inherent to pathogen detection (Medema et al., 2003; Payment and Pintar, 2006). These limitations include the necessity to concentrate large volumes of water, the need for specialized laboratory equipment and highly trained personnel and the cost of analysis, as well as determining which pathogens to test for, given the multitude of pathogens that may be present, which can vary over time and space. As such, routine monitoring for enteric viruses is currently not practical. Instead, indicator organisms that can be routinely monitored are used to indicate faecal contamination and the potential presence of enteric viruses. Commonly used indicators include bacteria, such as E. coli, enterococci and Clostridium perfringens spores, and viruses of bacteria (i.e., bacteriophages). Total coliforms can also be used, not as indicators of faecal pollution, but to provide general water quality information, especially in groundwater sources (Locas et al., 2007). Non-microbial indicators, such as faecal sterols, caffeine or chloride, have also been used in research studies to indicate faecal contamination (Borchardt et al., 2003; Peeler et al., 2006; Shah et al., 2007; Hussain et al., 2010); however, further research is needed before these indicators can be used routinely.
There are many published studies investigating the relationship between the various indicator organisms and the presence of enteric viruses in treated drinking water, surface water and groundwater. Determining a relationship between pathogens and faecal indicators has some inherent difficulties, foremost because of methodological differences related to the volumes of water analysed. For indicator organisms, usually 100-mL sample volumes are tested, whereas for pathogens, tens to hundreds of litres of water are concentrated, and then a portion of this volume is analysed. Even with these limitations, faecal indicators have been found to be useful for indicating the potential presence of enteric viruses in various water sources. The most appropriate indicator (or indicators) will depend on whether they are being used to provide information on virus presence in groundwater or surface water sources or as indicators of removal or inactivation of viruses by treatment processes and treated drinking water quality.
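The practical effect of this volume difference can be illustrated with a simple calculation. Assuming viruses are randomly (Poisson) distributed in the water, the probability that a 100-mL grab sample contains at least one virus when viruses are present at a concentration of 1 per 1000 L is:

```latex
P(\geq 1 \text{ virus in } 0.1\ \text{L}) \;=\; 1 - e^{-cV}
\;=\; 1 - e^{-(0.001\ \text{L}^{-1})(0.1\ \text{L})} \;\approx\; 0.01\%
```

This is why virus analyses concentrate tens to hundreds of litres of water, and why the absence of an organism in a 100-mL sample provides limited assurance about pathogens present at low concentrations.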
The indicator organisms routinely monitored in Canada as part of the multi-barrier "source-to-tap" approach for verifying drinking water quality are E. coli and total coliforms. The presence of E. coli in drinking water indicates recent faecal contamination and the potential presence of enteric pathogens, including enteric viruses. Total coliforms, however, are not faecal specific and therefore cannot be used to indicate faecal contamination (or the potential presence of enteric pathogens). Instead, total coliforms are used to indicate general water quality issues. Further information on the role of E. coli and total coliforms in drinking water quality management can be found in the Guideline Technical Documents on E. coli and total coliforms (Health Canada, 2006a,b). As mentioned previously, the survival of bacteria and viruses differs in the environment and through treatment processes. As a result, although the presence of E. coli indicates the potential presence of enteric viruses, the absence of E. coli does not necessarily indicate that all enteric viruses are also absent. However, if a multi-barrier, source-to-tap approach is in place and each barrier in the drinking water system has been controlled to ensure that it is operating adequately based on the quality of the source water, then E. coli and total coliforms can be used as part of the verification process to show that the water has been adequately treated and is therefore of an acceptable microbiological quality.
Other faecal indicators that may be used to verify the adequacy of treatment include enterococci, Clostridium perfringens spores and various bacteriophages (somatic coliphages, F-specific RNA coliphages and phages of Bacteroides). As enterococci and Clostridium perfringens spores are both bacterial indicators, they suffer from the same limitations as E. coli, in that their survival and response to treatment processes differ from those of enteric viruses. The concentration of these indicators in source waters can also be much lower than the concentrations of E. coli and total coliforms, making them less useful than E. coli and total coliforms for routinely verifying treatment processes. Bacteriophages, since they are viruses (of bacteria), have survival rates that are more similar to those of enteric viruses, and they are often used as surrogates for enteric viruses for determining treatment efficiencies. However, their concentrations in source waters are generally insufficient to make them useful for verifying treatment adequacy on a routine basis.
Several studies have investigated the relationships between indicator organisms and the presence or absence of human enteric viruses in surface water sources. Escherichia coli and Clostridium perfringens have been found to be associated with the presence of enteric viruses in surface waters that are impacted by human faecal pollution (Payment and Franco, 1993; Payment et al., 2000; Ashbolt et al., 2001; Hörman et al., 2004). Bacteriophages have also been found to be associated with the presence of enteric viruses in some studies (Skraber et al., 2004; Ballester et al., 2005; Haramoto et al., 2005), but not in others (Hörman et al., 2004; Choi and Jiang, 2005). Both correlations to coliform bacteria (Haramoto et al., 2005) and lack of correlations to coliform bacteria (Skraber et al., 2004; Ballester et al., 2005; Choi and Jiang, 2005) have also been observed. Based on these studies, it is evident that no one faecal indicator can be used to indicate enteric virus presence in all surface water sources. The most suitable indicator or indicators will depend on the surface water source and its site-specific faecal pollution inputs. Although not used for routine monitoring, targeted studies can also be carried out to determine enteric virus concentrations directly, as opposed to using indicator organisms.
Microbial indicators normally used to indicate faecal contamination of a water source, such as E. coli, do not necessarily migrate through the subsurface or have a survival rate in the environment comparable with that of enteric viruses. Several recent studies have investigated the usefulness of E. coli and total coliforms for indicating enteric virus contamination of groundwater sources.
The presence of any of the indicator organisms in a groundwater source is a good indication that the source may be at risk of faecal contamination that could adversely affect human health. In a study by Craun et al. (1997) on groundwater consumption, it was found that the presence of coliforms correlated very well with the presence of viral gastroenteritis. However, the absence of indicators may not necessarily indicate the absence of enteric viruses. A study of private wells in the United States found that 8% of the wells tested by PCR were positive for one or more enteric viruses; however, none of the contaminated wells contained indicators of faecal contamination (i.e., E. coli, enterococci, coliphages), and only 25% of the virus-impacted wells were positive for total coliforms (Borchardt et al., 2003). Several other studies conducted in the United States have also reported no link between the detection of an indicator organism and the detection of enteric viruses in a groundwater sample (Abbaszadegan et al., 1998, 2003; Borchardt et al., 2004), with approximately 15% of samples testing positive for enteric viruses in the absence of indicators (Abbaszadegan et al., 2003). However, some studies observed that upon repeat sampling, if a site tested positive for pathogens, it usually tested positive, at some point in time, for one of the microbial indicators (Lieberman et al., 2002; Abbaszadegan et al., 2003). A study on Canadian groundwater quality that monitored 23 municipal wells with a history of acceptable bacteriological quality found that, among wells that underwent repeat monitoring (122 samples collected from 16 wells), only a small number of samples (7/122) tested positive for indicator organisms or enteric viruses. Indicator organisms were detected in 4 of the 16 wells, while enteric viruses were detected in only 1 of the 16 wells; however, the well positive for enteric viruses was not one of the wells positive for indicator organisms (Locas et al., 2008). An additional study investigating several groundwater aquifers in various countries determined that using a combination of a bacterial indicator and a bacteriophage was more useful for assessing groundwater contamination than using only bacterial indicators (Lucena et al., 2006). Based on these studies, ongoing routine monitoring with bacterial indicators, in combination with bacteriophages in some instances, together with data collected from sanitary surveys and vulnerability assessments, can provide a useful assessment of the likelihood of enteric virus presence in groundwater sources.
Pathogen detection still suffers from methodological and interpretation limitations (Payment and Pintar, 2006). These limitations include the necessity to concentrate large volumes of water, the need for specialized laboratory equipment and highly trained personnel and the cost of analysis, as well as determining which pathogens to test for, given the multitude of pathogens that may be present, which can vary over time and space. Therefore, routine monitoring of drinking water for enteric viruses is currently not practical.
Although not used for routine monitoring purposes, detection of enteric viruses in source water samples can be used as a tool to evaluate risks that may be associated with using a specific raw water source and to ensure that appropriate treatment is in place. As well, during outbreak investigations where epidemiological evidence indicates that drinking water could be the source of infection, testing for enteric viruses can provide invaluable data to researchers and public health authorities.
Standard methods for enteric virus recovery and detection have been published (U.S. EPA, 1996, 2001c; APHA et al., 1998; ASTM, 2004). These methods have been validated and can be used by laboratories with the capacity to monitor for enteric viruses. The following sections provide an overview of these methodologies along with information on recent advancements in virus detection that have been used in research settings.
Enteric viruses are generally present in small numbers in faecally contaminated water; as such, 10-1000 L of water may need to be filtered to concentrate the pathogens to a detectable level.
Two methods of filtration have traditionally been used for initial virus concentration: filtration by adsorption and filtration by size exclusion (ultrafiltration). Adsorption filtration can employ electropositive filters, such as those prescribed by the U.S. Environmental Protection Agency's (EPA) Information Collection Rule for the recovery of viruses from water (U.S. EPA, 1996), negatively charged filters (Beuret, 2003; Fuhrman et al., 2005; Villar et al., 2006) or nitrocellulose membranes (Hsu et al., 2006). At ambient pH, most enteric viruses are negatively charged; therefore, they are captured by electropositive filter media. To adsorb viruses using negatively charged filter media, a source of cations, such as magnesium chloride, needs to be added to the sample, and the pH of the sample may need to be adjusted to an acidic pH. Since the viruses adsorb to the filter media, they must subsequently be eluted from the filter using an alkaline solution that alters the surface charge of the viral particles so that they are released back into solution. Eluents commonly incorporate beef extract, glycine, tryptose phosphate buffer and/or sodium hydroxide (Katayama et al., 2002; Hörman et al., 2004; Brassard et al., 2005; Villar et al., 2006). Size exclusion methods, such as ultrafiltration, are independent of pH and have the advantage of not requiring an elution step (Olszewski et al., 2005). Ultrafiltration does have some disadvantages. Because of the small size of viruses, the filter pore size must be extremely small, and the filters can become clogged. Typically, only approximately 20 L of water can be filtered at one time (Griffin et al., 2003), although volumes up to 100 L are being used in some laboratories (Linquist et al., 2007). Ultrafiltration is also less cost- and time-effective than adsorption filtration (Fong and Lipp, 2005). Work is ongoing to investigate the use of ultrafiltration for the simultaneous recovery of protozoa, bacteria and viruses, which could be advantageous from an economic and potentially time-saving perspective (Morales-Morales et al., 2003; Hill et al., 2005).
The initial concentration of the water sample is usually followed by a secondary concentration step, reducing the sample volume to 1-2 mL, to produce a concentrate sufficient for detection of viruses. Secondary concentration methods include organic flocculation, polyethylene glycol precipitation and ultracentrifugation.
Following concentration of the sample, detection methods for the enteric viruses are used. In general, virus detection methods have recovery efficiencies around 50% (Payment et al., 2000). The most commonly used detection methods include cell culture methods and polymerase chain reaction (PCR) methods, or a combination of both methods.
Historically, cell culture was the most widely used technique for the detection of viruses, and it is still the best method for determining the occurrence of infectious viruses in water. The ability to detect infectious viruses in environmental samples is important for predicting health risks to the public. However, not all enteric viruses can be readily detected by cell culture. Some enteric viruses do not produce a clear cytopathogenic effect, which is necessary for visual detection; as a result, the concentration of viruses in a sample can be underestimated. For other enteric viruses, such as some noroviruses, successful cell culture has only recently been accomplished using new three-dimensional cell culture techniques (Straub et al., 2007). While some viruses grow rapidly in a few days, most cell culture assays require several weeks to confirm negative results and to detect slow-growing viruses. In addition, plaque assays may underestimate virus concentrations since, as mentioned previously, not all viruses produce a clear cytopathogenic effect. Other reasons for underestimation include aggregation of viruses in a sample, so that an individual plaque may arise from more than one virus particle (Teunis et al., 2005); the inability to maintain the cell monolayer long enough for some slow-growing viruses to produce a visible plaque; and the presence of fast-growing enteric viruses, which can lead to an underestimate of the concentration of slow-growing viruses (Irving and Smith, 1981; Fong and Lipp, 2005).
PCR-based detection methods have been developed for most of the key enteric viruses of concern for waterborne transmission. Recent improvements in technology have made real-time or quantitative PCR (q-PCR) the PCR method most often used for the detection and quantitation of enteric viruses. It should be noted, however, that quantitation using q-PCR is not yet very precise and relies on materials that are not routinely available. Also, direct comparison of q-PCR results with cell culture results is not possible. PCR detection methods have some advantages over cell culture methods: they are rapid (results within 24 h), highly sensitive and, if properly designed, very specific. The main disadvantages of PCR-based methods are that they cannot determine whether the viruses are infectious and that they are subject to inhibition by common environmental compounds, such as humic and fulvic acids, heavy metals and phenolic compounds (Fong and Lipp, 2005). Inhibitors can be removed from the samples, but this requires additional processing and results in a loss of sensitivity. Knowing whether a virus is infectious is important for determining whether there is a public health concern. For example, a recent study of adenoviruses in a water source was unable to find infectious virus using cell culture, yet approximately 16% of the samples tested positive for adenoviruses using q-PCR (Choi and Jiang, 2005). These limitations need to be considered when interpreting PCR results.
Methods integrating cell culture and PCR make it possible to shorten the processing time (compared with cell culture alone) and to detect infectious viruses. Cell culture methods can also be combined with immunological methods to improve virus detection. Integrated methods have been reported to be both sensitive and specific, including for those viruses that are difficult to assay using conventional cell culture, such as adenoviruses and rotaviruses (Payment and Trudel, 1993; Jothikumar et al., 2000; Hurst et al., 2001; Payment, 2001, 2007; Reynolds et al., 2001; Greening et al., 2002; Ko et al., 2003). An additional advantage of combining cell culture with immunological or molecular methods is improvement in the sensitivity of the assay, as the infected cells amplify the quantity of virus, providing more target material for detection.
Methods for the detection of viruses in water are not practical for routine monitoring, and therefore various surrogate parameters (i.e., indicators) have been proposed to evaluate water treatment efficiency or to indicate the presence of enteric viruses in water (Deere et al., 2001; WHO, 2004). The indicators proposed to date include E. coli, total coliforms, enterococci, Clostridium perfringens spores and bacteriophages.
Escherichia coli is the microbial indicator that is used most often for determining faecal contamination of water sources. Further information on detection methods for E. coli is provided in the Guideline Technical Document on E. coli (Health Canada, 2006a).
Total coliforms, although not an indicator of faecal contamination, are useful as an indicator of overall water quality. Further information on detection methods for total coliforms is provided in the Guideline Technical Document on total coliforms (Health Canada, 2006b).
Enterococci can be used to indicate faecal contamination and indirectly indicate the presence of viruses (U.S. EPA, 2000; Ashbolt et al., 2001). Standardized methods for the detection of enterococci in water have been published (APHA et al., 1998; U.S. EPA, 2002a,b). Commercial kits for the detection of these indicators are also available.
Clostridium perfringens spores are indicators of both recent and past faecal contamination, but they are not as numerous as coliforms in faeces or contaminated water. Clostridium perfringens spores are also used as indicators of treatment efficiency. Standardized detection methods for C. perfringens have been published (ASTM, 2002; HPA, 2004).
Three types of bacteriophages are generally used as indicators: the somatic coliphages, male-specific F-RNA bacteriophages (also referred to as F-specific coliphage) and Bacteroides phages (i.e., phages infecting Bacteroides fragilis, B. thetaiotaomicron and Bacteroides strain GB-124). In the United States, standardized methods for the detection of somatic and male-specific coliphages have been developed (U.S. EPA, 2001a,b). The International Organization for Standardization (ISO) has also published standardized methods (ISO 10705 series) for the detection of bacteriophages (Mooijman et al., 2001, 2005).
The multi-barrier approach is the best approach to reduce enteric viruses and other waterborne pathogens in drinking water. Since available analytical methods make it impractical to routinely monitor for microbial pathogens in treated drinking water, the focus should be on characterizing source water risks and ensuring that effective treatment barriers are in place to achieve safe drinking water. Source water protection measures to minimize faecal contamination, especially control of domestic and sanitary sewage, should be implemented where feasible.
Source water quality should be characterized in terms of the concentrations and variability of waterborne pathogens and faecal indicators. Means of achieving this include routine analysis for microbial pathogens and/or faecal indicators in source water, as well as sanitary surveys and/or source tracking to identify potential sources of human and animal faecal contamination. In order to understand the full range of source water quality, data should be collected during normal conditions as well as during extreme weather or spill/upset events (e.g., spring runoff, storms). For example, the flooding of sewage collection and treatment systems during heavy rainfall events can lead to sudden increases in enteric viruses and other microbial pathogens in the source water.
Generally, minimum treatment of supplies derived from surface water sources or groundwater under the direct influence of surface waters should include adequate filtration (or technologies providing an equivalent log reduction credit) and disinfection. Recent published information has shown the presence of enteric viruses in some groundwaters that were considered to be less vulnerable to faecal contamination (i.e., those not under the direct influence of surface waters) (Abbaszadegan et al., 1998, 1999; Borchardt et al., 2003, 2004; Locas et al., 2007). As a result, it is recommended to ensure adequate treatment of all groundwaters to remove/inactivate enteric viruses, unless exempted by the responsible authority. In the case of small systems, technologies classified as residential scale may be used to achieve a 4-log reduction of enteric viruses, depending on the capacity requirements of the system. Although they may be classified as residential scale, many of them have a rated capacity for volumes greater than that of a single residence. Specific guidance on technologies that can be used in small systems should be obtained from the appropriate drinking water authority in the relevant jurisdiction.
Once the source water quality has been characterized, pathogen removal/inactivation targets and effective treatment barriers can be established in order to achieve safe finished drinking water. To optimize performance for removal of microbial pathogens, the relative importance of each barrier must be understood. Some water systems have multiple redundant barriers such that failure of a given barrier still provides adequate treatment. In other cases, all barriers must be working well to provide the required level of treatment. For these systems, failure of a single treatment barrier could lead to a waterborne disease outbreak.
The removal of enteric viruses from raw water is complicated by their small size and relative ease of passage through filtration barriers. However, viruses are effectively inactivated through the application of various disinfection technologies individually or in combination, at relatively low dosages. In most cases, a well-operated conventional treatment plant should be able to produce water with a negligible risk of disease transmission. Options for treatment and control of viruses are discussed briefly in this document; however, more detailed information is available in other references (U.S. EPA, 1991; Health and Welfare Canada, 1993; AWWA, 1999b; Deere et al., 2001; Hijnen et al., 2004, 2006; LeChevallier and Au, 2004; Medema et al., 2006).
In general, all drinking water supplies should be disinfected, and a disinfectant residual should be maintained throughout the distribution system at all times. In addition to primary disinfection, treatment of surface water or groundwater under the direct influence of surface waters should include physical removal methods, such as chemically assisted filtration (coagulation, flocculation, clarification and filtration) or technologies providing an equivalent log reduction credit. It is essential to achieve the physical removal and disinfection targets prior to the first consumer in the distribution system. Adequate process control measures and operator training are also required to ensure the effective operation of treatment barriers at all times (U.S. EPA, 1991; Health and Welfare Canada, 1993; AWWA, 1999b).
Treatment technologies should be in place to achieve a minimum 4-log (99.99%) removal and/or inactivation of enteric viruses. With this level of treatment, a source water concentration of 1 virus/100 L can be reduced to 1 × 10−4 virus/100 L, which meets the population health target of 10−6 disability adjusted life year (DALY)/person per year (see Section 8.0 for a detailed discussion of the DALY). However, raw water could have a much higher virus concentration and therefore require additional treatment for removal/inactivation in order to produce safe drinking water.
The level of treatment needed is based on the known or estimated concentration of pathogens in the source water. A source water with higher virus concentrations will require greater removal/inactivation in order to meet an acceptable level of risk in the treated drinking water. Table 1 indicates the overall level of treatment required for a range of source water virus concentrations to achieve an acceptable level of risk of 1 × 10−6 DALY/person per year.
| Source water virus concentration (no./100 L) | Overall required treatment reduction for viruses (log10) |
|---|---|
| 1 | 4 |
| 10 | 5 |
| 100 | 6 |
| 1000 | 7 |
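As a simple illustration of the relationship summarized in Table 1, the short Python sketch below computes the overall log reduction needed for a given source water virus concentration, assuming (as in the example above) that a treated-water concentration of 1 × 10−4 virus/100 L corresponds to the 10−6 DALY/person per year target. The function name and structure are illustrative only.

```python
import math

def required_log_reduction(source_viruses_per_100L, treated_target_per_100L=1e-4):
    """Overall log reduction needed to bring a source water virus concentration
    down to the treated-water level assumed above (1e-4 virus/100 L)."""
    return math.log10(source_viruses_per_100L / treated_target_per_100L)

# Reproduces Table 1: 1 -> 4 log, 10 -> 5 log, 100 -> 6 log, 1000 -> 7 log
for c in (1, 10, 100, 1000):
    print(f"{c:>5} viruses/100 L -> {required_log_reduction(c):.0f} log")
```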
Where possible, source water virus concentrations should be characterized based on actual water sampling and analysis. Such characterization should take into account normal conditions as well as event-based monitoring, such as spring runoff, storms or spill events. Testing results should also take into account recovery efficiencies for the analytical method and pathogen viability in order to obtain the most accurate assessment of infectious pathogens present in the source water. In many places, source water sampling for enteric viruses may not be feasible and the potential risk from enteric viruses may be estimated using a combination of sanitary surveys, comparison with other source waters, indicator organisms or related research studies. Because it is difficult to analyse for viruses, regular monitoring of indicator organisms and source water characteristics can be a practical solution for assessing the need for treatment adjustments. However, given the uncertainty of such estimates, engineering safety factors or additional treatment reductions should be applied in order to ensure production of microbiologically safe drinking water.
The overall treatment requirements can be achieved through one or more treatment steps involving physical removal and/or primary disinfection. The virus log reductions for each separate treatment barrier can be combined to define the overall reduction for the treatment process.
The physical removal of viruses can be achieved by clarification and filtration processes. Clarification is typically followed by filtration, although some filtration systems are operated without prior clarification (direct filtration). Viruses in the water can be either free or particle associated, and their adsorption depends on a number of factors, such as the isoelectric point and hydrophobicity of both the virus and the particle (Templeton et al., 2008). The isoelectric point is the pH at which the virus has no net electrical charge; it varies among viral species. The association of viruses with particles plays a role in both the physical removal and the disinfection/inactivation of the virus.
The addition of a chemical coagulant to the raw water produces flocs that adsorb the particle-associated viruses. These flocs are then removed from the water using gravity sedimentation, a sludge blanket or dissolved air flotation. Studies have shown virus removal ranging from 1.1 to 3.4 log for the clarification process only (coagulation, flocculation and sedimentation steps) (LeChevallier, 1999; Hijnen et al., 2004).
Granular media filters are traditionally the most common type used. Treatment systems using coagulation, flocculation, clarification and rapid sand filtration are often referred to as conventional treatment. Studies have shown virus removals of 0.1-3.8 log for the filtration step alone. Combining these process steps, conventional filtration is credited with a 2.0-log physical removal of enteric viruses (Table 2). A greater log removal of enteric viruses is possible using conventional filtration when the treatment is optimized for turbidity and particle removal (Xagoraraki et al., 2004).
| Treatment barrier | Virus removal credit (log10) |
|---|---|
| Conventional filtration | 2.0 |
| Direct filtration | 1.0 |
| Slow sand filtration | 2.0 |
| Diatomaceous earth filtration | 1.0 |
| Microfiltration | No credit |
| Ultrafiltration | Demonstration and challenge testing; verified by direct integrity testing |
| Nanofiltration and reverse osmosis | Demonstration and challenge testing; verified by integrity testing |
Membrane filtration by microfiltration does not provide an ultimate physical barrier to viruses because of the size of the pores, which range from 0.1 to 10 µm. However, several studies have demonstrated that viruses can be removed to a 4-log level when a coagulation process precedes microfiltration (Zhu et al., 2005a,b; Fiksdal and Leiknes, 2006). Ultrafiltration membranes have pore sizes ranging from 0.01 to 0.1 µm and can reject viruses. A review of several studies indicates that ultrafiltration membranes typically remove viruses to greater than a 3-log level (AWWA, 2005). Nanofiltration and reverse osmosis membranes are typically considered to be non-porous and represent a physical barrier to viruses. Because any breach in the integrity of the membranes would allow viruses to pass through the filter, direct integrity testing should be conducted during the filter operation for ultrafiltration membranes. Currently, it is not possible to conduct direct integrity testing for nanofiltration and reverse osmosis membranes without disrupting production for an extended period of time. However, indirect integrity testing is required on a continuous basis, and direct integrity testing should be performed on a regular basis.
It is important to note that many treatment processes are interdependent and rely on optimal conditions upstream in the treatment process for efficient operation of subsequent treatment steps. For example, coagulation and flocculation should be optimized for particles to be effectively removed by filtration. Filters must be carefully controlled, monitored and backwashed such that particle breakthrough does not occur (Huck et al., 2001), and filter backwash water should not be recirculated through the treatment plant without additional treatment subsequent to coagulation, flocculation and clarification (Medema et al., 2006).
Slow sand filtration can also be effective, with physical removals in the range of 0.9-3.5 log for viruses (Hijnen et al., 2004). Several factors can negatively affect the removal of viruses by slow sand filtration, such as cold water, higher hydraulic loading and decreased sand depth. A 1-year monitoring study of three full-scale riverbank filtration facilities reported an average male-specific and somatic bacteriophage reduction of 2.1 log and 3.2 log, respectively (Weiss et al., 2005).
Drinking water treatment plants that meet the turbidity limits established in the Guidelines for Canadian Drinking Water Quality (Health Canada, 2003b) can apply the estimated physical removal credits for enteric viruses given in Table 2. These log removal credits are based on the mean or median removals established by the U.S. EPA (1999) as part of the Disinfection Profiling and Benchmarking Guidance Manual and the Long Term 2 Enhanced Surface Water Treatment Rule (LT2ESWTR) (U.S. EPA, 2006b). Alternatively, log removal rates can be established on the basis of demonstrated performance or pilot studies. The physical log removal credits can be combined with the disinfection credits to meet overall treatment goals. For example, if an overall 4-log (99.99%) virus removal is required in a given water supply and conventional filtration provides 2-log removal, then the remaining 2-log reduction must be achieved through another barrier, such as disinfection.
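Because log reduction credits are additive across barriers, the share of the overall goal that must come from disinfection is simply the overall requirement minus the physical removal credits. The sketch below illustrates this bookkeeping using the Table 2 credits; the dictionary and function names are illustrative, not part of any published method.

```python
# Physical removal credits for viruses, from Table 2 (log10).
PHYSICAL_REMOVAL_CREDITS = {
    "conventional filtration": 2.0,
    "direct filtration": 1.0,
    "slow sand filtration": 2.0,
    "diatomaceous earth filtration": 1.0,
}

def remaining_disinfection_log(overall_goal_log, barriers):
    """Log reduction still required from disinfection after crediting
    the listed physical removal barriers."""
    physical = sum(PHYSICAL_REMOVAL_CREDITS[b] for b in barriers)
    return max(overall_goal_log - physical, 0.0)

# Example from the text: a 4-log overall goal with conventional filtration
# leaves 2 log to be achieved through disinfection.
print(remaining_disinfection_log(4.0, ["conventional filtration"]))  # 2.0
```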
Chemical disinfectants commonly used for treating drinking water include chlorine, chloramine, chlorine dioxide and ozone. Disinfection is typically applied after treatment processes that remove particles and organic matter. This strategy helps to ensure efficient inactivation of pathogens and minimizes the formation of disinfection by-products (DBPs). It is important to note that when describing microbial disinfection of drinking water, the term "inactivation" is used to indicate that the pathogen is no longer able to multiply within its host and is therefore non-infectious, although it may still be present.
Physical characteristics of the water, such as temperature, pH and turbidity, can have a major impact on inactivation and removal of pathogens. For example, inactivation rates increase 2- to 3-fold for every 10°C rise in temperature. When temperatures are near 0°C, as is often the case in winter in Canada, the efficacy of disinfection is reduced, and an increased disinfectant concentration or contact time, or a combination of both, is required to achieve the same level of inactivation.
The effectiveness of some disinfectants is also dependent on pH. When using free chlorine, increasing the pH from 6 to 10 reduces the level of virus inactivation by a factor of 8-10 (see CT tables in Appendix B). However, a recent study by Thurston-Enriquez et al. (2005a) reported that chlorine dioxide was 1.9 and 19.3 times more effective at pH 8 than at pH 6 for adenovirus type 40 and feline calicivirus (used as a surrogate for norovirus), respectively. Similar findings have been reported for other enteric viruses using chlorine dioxide (Alvarez and O'Brien, 1982; Moss and Olivieri, 1985). pH has been shown to have little effect on the virus inactivation efficiency of ozone, although a higher pH will reduce ozone stability and therefore increase ozone demand.
Reducing turbidity is an important step in the inactivation of viruses and other microorganisms, because microorganisms associated with particles can be shielded from chemical disinfectants. Negative impacts of particle-associated viruses on disinfection processes have been demonstrated in several studies (Templeton et al., 2008). The effect of turbidity on treatment efficiency is further discussed in the Guideline Technical Document on turbidity (Health Canada, 2003b).
The efficacy of chemical disinfectants can be predicted based on knowledge of the residual concentration of disinfectant, temperature, pH and contact time (AWWA, 1999b). This relationship is commonly referred to as the CT concept, where CT is the product of "C" (the residual concentration of disinfectant, measured in mg/L) and "T" (the disinfectant contact time, measured in minutes). Generally, CT objectives are determined in controlled laboratory studies. In treatment facilities, the residual concentration is usually determined at the exit of the contact chamber, rather than using the applied dose or initial concentration, to account for disinfectant decay. To account for the mixing hydraulics of the contact chamber, the contact time "T" is typically calculated as a T10 value, such that 90% of the water meets or exceeds the required contact time. The T10 value can be estimated based on the geometry and flow conditions of the disinfection chamber or basin; hydraulic tracer tests, however, are the most accurate method to determine the contact time under actual plant flow conditions.
Complete CT tables for 2-log, 3-log and 4-log inactivation of viruses can be found in Appendix B. Some selected CT values are presented in Table 3 for 4-log (99.99%) inactivation of enteric viruses using chlorine, chloramine, chlorine dioxide and ozone. The CT values illustrate the fact that chloramine is a much weaker disinfectant than free chlorine, chlorine dioxide or ozone, since much higher concentrations and/or contact times are required to achieve the same degree of virus inactivation. Consequently, chloramine is not recommended as a primary disinfectant. Free chlorine is the most common chemical used for primary disinfection because it is widely available, is relatively inexpensive and provides a residual that can be used for maintaining water quality in the distribution system. For example, a moderate chlorine concentration of 0.5 mg/L with 15-min contact time can achieve greater than 4-log virus inactivation at 20°C (Table 3). Ozone is another strong disinfectant for virus inactivation, as noted by the low CT values required for 4-log inactivation. However, ozone decays rapidly after being applied during treatment and cannot be used to provide a secondary disinfectant residual. Although ozone and chlorine dioxide are effective disinfectants, they are typically more expensive and complicated to implement, particularly in small treatment systems.
CT values (mg·min/L) for 99.99% (4-log) inactivation of enteric viruses:

| Temperature (°C) | Free chlorine (Cl2) | Chloramine (NH2Cl) | Chlorine dioxide (ClO2) | Ozone (O3) |
|---|---|---|---|---|
| 5 | 8 | 1988 | 33.4 | 1.2 |
| 20 | 3 | 746 | 12.5 | 0.5 |
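As a minimal sketch of how the CT concept is applied in practice, the example below checks the free chlorine scenario described above (0.5 mg/L residual with a 15-min contact time at 20°C) against the 4-log requirements in Table 3. The residual is assumed to be measured at the contact chamber exit and the contact time expressed as T10; variable and function names are illustrative only.

```python
# Required CT values (mg·min/L) for 4-log virus inactivation with free
# chlorine, taken from Table 3; site-specific design should use the full
# tables in Appendix B.
REQUIRED_CT_FREE_CHLORINE_4LOG = {5: 8.0, 20: 3.0}  # keyed by temperature (°C)

def achieved_ct(residual_mg_per_L, t10_minutes):
    """CT achieved in a contact chamber: residual concentration at the
    chamber exit multiplied by the T10 contact time."""
    return residual_mg_per_L * t10_minutes

ct = achieved_ct(0.5, 15)  # 0.5 mg/L free chlorine, 15 min contact at 20°C
print(ct, ct >= REQUIRED_CT_FREE_CHLORINE_4LOG[20])  # 7.5 True
```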
Research studies involving several enteric viruses have shown varying levels of resistance to chemical disinfectants (Engelbrecht et al., 1980; Payment et al., 1985; Hoff, 1986; Sobsey et al., 1988; Payment and Armon, 1989; U.S. EPA, 1989; AWWA, 1999a,b; Thurston-Enriquez et al., 2003a, 2005a,b). Table 4 presents CT values from various research studies for 2-log (99%) inactivation of several viruses using various chemical disinfectants. In these studies, HAV was found to be more resistant to chemical inactivation using chlorine dioxide and ozone than other types of viruses. For free chlorine disinfection, HAV was shown to be consistently more resistant than rotavirus and adenovirus 40; however, the susceptibility of coxsackievirus B5 and poliovirus 1 varied significantly between studies. Further research on the inactivation of these viruses is needed. As a result, virus disinfection targets and guidance tables of CT values have been based on HAV (U.S. EPA, 1991).
CT values (mg·min/L) for 99% (2-log) inactivation of selected enteric viruses:

| Virus | Free chlorine (Cl2), pH 6-7 | Chloramine (NH2Cl), pH 8-9 | Chlorine dioxide (ClO2), pH 6-7 | Ozone (O3), pH 6-7 |
|---|---|---|---|---|
| Poliovirus 1 | 1.1-6 | 768-3740 | 0.2-6.7 | 0.1-0.2 |
| Rotavirus | 0.01-0.05 | 3806-6476 | 0.2-2.1 | 0.006-0.06 |
| Hepatitis A virus | 0.7-1.18 | 428-857 | <0.17-2.8 | 0.5 |
| Coxsackievirus B5 | 1.7-12 | 550 | n.a. | n.a. |
| Adenovirus 40 | 0.02-2.4 | 360 | 0.25 | 0.027 |
In addition to microbial inactivation, chemical disinfection can result in the formation of DBPs, some of which may pose a human health risk. The most commonly used disinfectant, chlorine, reacts with naturally occurring organic matter to form trihalomethanes and haloacetic acids, along with many other halogenated organic compounds (Krasner et al., 2006). The use of ozone and chlorine dioxide can also result in the formation of inorganic DBPs, such as bromate and chlorite/chlorate, respectively. When selecting a chemical disinfectant, the potential impact of DBPs should be considered. Where possible, efforts should be made to minimize the formation of these DBPs without compromising the effectiveness of disinfection. The issues of DBPs are further discussed in the Guideline Technical Documents on trihalomethanes (Health Canada, 2006c), haloacetic acids (Health Canada, 2008), bromate (Health Canada, 1998) and chlorite and chlorate (Health Canada, 2008).
UV light disinfection is considered an alternative method of disinfection. UV dose, usually called fluence, is expressed in millijoules per square centimetre (mJ/cm2), which is equivalent to milliwatt seconds per square centimetre (mW·s/cm2). UV light is usually applied after particle removal barriers, such as filtration, in order to prevent shielding by suspended particles and allow better light penetration to the target pathogens. Several recent studies have examined the effect of particles on UV disinfection efficacy, and most have concluded that the UV dose-response of microorganisms is not affected by variations in turbidity up to 10 nephelometric turbidity units (Christensen and Linden, 2002; Oppenheimer et al., 2002; Mamane-Gravetz and Linden, 2004; Passantino et al., 2004). However, the presence of humic acid particles and coagulants has been shown to significantly reduce UV disinfection efficacy, with lower inactivation levels being achieved (Templeton et al., 2005). Further research is needed to better understand the effect of particles and coagulants on microbial inactivation by UV light.
Several studies have investigated the inactivation of enteric viruses using UV light (Chang et al., 1985; Arnold and Rainbow, 1996; Meng and Gerba, 1996; AWWA, 1999b; U.S. EPA, 2000; Cotton et al., 2001). Studies have shown that adenoviruses are much more resistant to UV disinfection than other enteric viruses (Cotton et al., 2001; Thurston-Enriquez et al., 2003b; Nwachuku et al., 2005). A relatively high UV dose of 152 mJ/cm2 was required for a 4-log (99.99%) inactivation of adenovirus 40 in buffered demand-free water. In contrast, a recent study (Linden et al., 2007) obtained a 3-log inactivation of adenovirus 40 using a polychromatic source at a fluence of approximately 30 mJ/cm2 and wavelengths of 220 and 228 nm. A 4-log inactivation of rotavirus was achieved using a mean UV dose of 40 mJ/cm2. Table 5 lists typical UV doses required to achieve 1- to 4-log inactivation for various types of enteric viruses. A more detailed table of UV doses for multiple log reductions of various viruses is presented in a study by Chevrefils et al. (2006).
UV dose (mJ/cm2) required for 1- to 4-log inactivation:

| Virus | 1-log | 2-log | 3-log | 4-log |
|---|---|---|---|---|
| Hepatitis A virus | 4.1-5.5 | 8.2-13.7 | 12.3-22 | 16.4-29.6 |
| Coxsackievirus B5 | 6.9-9.5 | 13.7-18 | 20.6-27 | 36 |
| Poliovirus type 1 | 4.0-8 | 8.7-15.5 | 14.2-23 | 20.6-31 |
| Rotavirus SA-11 | 7.1-10 | 14.8-26 | 23-44 | 36-61 |
| Adenovirus | 58 | 100 | 143 | 186 |
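The following sketch shows one way the Table 5 data might be used: it takes the upper end of each reported 4-log dose range as a conservative requirement and reports which viruses a given applied UV dose would cover. The data structure and function are illustrative only and are not part of any standard.

```python
# Conservative (upper-range) UV doses for 4-log inactivation, from Table 5 (mJ/cm2).
UV_DOSE_4LOG_UPPER = {
    "hepatitis A virus": 29.6,
    "coxsackievirus B5": 36.0,
    "poliovirus type 1": 31.0,
    "rotavirus SA-11": 61.0,
    "adenovirus": 186.0,
}

def viruses_meeting_4log(applied_dose_mj_cm2):
    """Viruses for which the applied dose meets or exceeds the conservative
    4-log requirement listed above."""
    return [v for v, dose in UV_DOSE_4LOG_UPPER.items() if applied_dose_mj_cm2 >= dose]

# A commonly applied dose of 40 mJ/cm2 covers hepatitis A virus, coxsackievirus B5
# and poliovirus, but not the upper end of the rotavirus range and not adenovirus.
print(viruses_meeting_4log(40))
```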
It appears that double-stranded DNA viruses, such as adenoviruses, are more resistant to UV radiation than single-stranded RNA viruses, such as HAV (Meng and Gerba, 1996). The mechanisms for this higher resistance are not totally understood. The resistance may occur as the result of the physical or chemical properties of the viruses or repair of the UV-induced damage either by the virus or with the help of host cell enzymes (Shin et al., 2005). Because of their high level of resistance to UV treatment and because adenoviruses cause illness in children and immunocompromised adults, adenoviruses have been used by the U.S. EPA as the basis for establishing UV light inactivation requirements for enteric viruses in the LT2ESWTR. Accordingly, the LT2ESWTR requires a UV dose of 186 mJ/cm2 to receive a 4.0-log credit for viral inactivation (U.S. EPA, 2006b).
For water supply systems in Canada, a UV dose of 40 mJ/cm2 is commonly applied, often in combination with chlorine disinfection or other physical removal barriers (MOE, 2006). This dose is considered to be protective of human health because most enteric viruses are inactivated at a UV dose of 40 mJ/cm2. However, a UV dose of 40 mJ/cm2 would provide only a 0.5-log inactivation of adenovirus; the addition of free chlorine can provide additional inactivation credit. In a laboratory study, Baxter et al. (2007) found that a free chlorine concentration of 0.22 mg/L with 1 minute of contact time in demand-free water provided a 4-log inactivation of adenovirus.
For drinking water sources considered to be less vulnerable to human faecal contamination, the responsible authority may choose an enteric virus such as rotavirus as the target organism (i.e., as found in Table 5) to determine the required UV dose. Where a system relies solely on UV disinfection for pathogen control and the source water is known or suspected to be contaminated with human sewage, either a higher UV dose such as that stated in the LT2ESWTR or a multi-disinfectant strategy should be considered.
A multiple disinfectant strategy involves using two or more primary disinfection steps to meet treatment objectives. For example, UV light and free chlorine are complementary disinfection processes, which can inactivate protozoa, viruses and bacteria. UV light is highly effective for inactivating protozoa and bacteria (but less effective for some viruses), whereas chlorine is highly effective for inactivating bacteria and many viruses (but less effective for protozoa). In some treatment plants, ozone may be applied for taste and odour control, followed by chlorine disinfection. In such cases, both the ozone and chlorine disinfection could potentially be credited towards meeting the minimum of 4-log reduction for enteric viruses while meeting taste and odour treatment objectives.
Site-specific assessments of drinking water supplies should be carried out in order to determine the most appropriate treatment strategy based on the source water quality, including the organisms of concern. For example, utilities using surface water or groundwater under the direct influence of surface water will need to treat source waters for all three types of organisms (protozoa, viruses and bacteria) and therefore may need to consider the use of a multi-disinfectant strategy. Groundwater from sources less vulnerable to faecal contamination, on the other hand, may need to be treated only for the presence of enteric viruses, and therefore a multi-disinfectant strategy would not be necessary. When determining whether a multiple disinfectant strategy is required to meet overall treatment objectives, the contribution from any physical removal treatment process also needs to be considered. Specific guidance on disinfection requirements should be obtained from the appropriate drinking water authority in the relevant jurisdiction.
Various options are available for treating source waters to provide high-quality pathogen-free drinking water. These include filtration and disinfection with chlorine-based compounds or alternative technologies, such as UV light. These technologies are similar to the municipal treatment barriers, but on a smaller scale. In addition, there are other treatment processes, such as distillation, that can be practically applied only to small or individual water supplies. Most of these technologies have been incorporated into point-of-entry devices, which treat all water entering the system, or point-of-use devices, which treat water at only a single location--for example, at the kitchen tap.
The use of UV light has increased owing to its availability and relative ease of operation. However, scaling or fouling of the UV lamp surface is a common problem when applying UV light to raw water with moderate or high levels of hardness, such as groundwater. UV light systems are often preceded by a pretreatment filter to reduce scaling or fouling. A pretreatment filter may also be needed to achieve the water quality that is required for the UV system to operate as specified by the manufacturer. In addition, the regular cleaning and replacement of the lamp, according to manufacturer's instructions, are critical in ensuring the proper functioning of the unit. Alternatively, special UV lamp-cleaning mechanisms or water softeners can be used to overcome this scaling problem.
Health Canada does not recommend specific brands of drinking water treatment devices, but it strongly recommends that consumers look for a mark or label indicating that the device has been certified by an accredited certification body as meeting the appropriate NSF International (NSF)/American National Standards Institute (ANSI) standard. These standards have been designed to safeguard drinking water by helping to ensure the material safety and performance of products that come into contact with drinking water.
For example, NSF/ANSI Standard 55 (Ultraviolet Disinfection Systems) provides performance criteria for two categories of certified systems, Class A and Class B. UV systems certified to NSF/ANSI Standard 55 Class A are designed to deliver a UV dose at least equivalent to 40 mJ/cm2 in order to inactivate microorganisms, including bacteria, viruses, Cryptosporidium oocysts and Giardia cysts, in contaminated water. As such, UV systems certified to NSF/ANSI Standard 55 Class A can provide a 4-log reduction for most viruses (Table 5) and are suitable for this use. However, these systems are not designed to treat wastewater or water contaminated with raw sewage, and they should be used only on water that is visually clear. In contrast, systems certified to NSF/ANSI Standard 55 Class B are designed to deliver a UV dose at least equivalent to 16 mJ/cm2 and cannot provide a 4-log reduction for most viruses (Table 5). Class B systems are intended for a drinking water supply that has already been disinfected, tested and deemed acceptable for human consumption.
Reverse osmosis membranes have a pore size smaller than the viruses and could provide a physical barrier to them. Currently, the NSF/ANSI standard for reverse osmosis systems does not include a claim for virus reduction; as a result, reverse osmosis units cannot be certified to this standard for virus reduction.
Certification organizations provide assurance that a product or service conforms to applicable standards. In Canada, the following organizations have been accredited by the Standards Council of Canada to certify drinking water devices and materials as meeting the appropriate NSF/ANSI standards:
The adoption of a risk-based approach, such as a multi-barrier approach, is essential to the effective management of drinking water systems (CCME, 2004). This approach should include assessment of the entire drinking water system, from the watershed/aquifer and intake through the treatment and distribution chain to the consumer, to assess potential impacts on drinking water quality and public health.
Current drinking water guidelines encourage the adoption of a multi-barrier approach to produce clean, safe and reliable drinking water. Numerous indicators, such as indicator microorganisms, turbidity and disinfectant residuals, are used as part of the multi-barrier approach to determine the quality of the treated drinking water. For example, E. coli and total coliforms are bacteriological indicators that are routinely used to verify the microbiological quality of drinking water. Although indicators are an important aspect of a multi-barrier approach, they do not provide any quantitative information on pathogens or the potential disease burden associated with drinking water of a given quality. It is important to note that even water of an acceptable quality carries some risk of illness, although it is extremely low.
QMRA is gaining acceptance as part of a multi-barrier approach. QMRA is a process that uses source water quality data, treatment barrier information and pathogen-specific characteristics to estimate the burden of disease associated with exposure to pathogenic microorganisms in a drinking water source. The benefit of using a QMRA approach is that assessments can be carried out by each water system to provide site-specific information:
Site-specific variations should include the potential impact of hazardous events, such as storms, contamination events or the failure of a treatment barrier. When interpreting the results from a QMRA, the following should be considered:
Because of these limitations, QMRA should not be used to try to estimate levels of illness in a population resulting from a particular water system. Rather, the disease burden estimates produced from a QMRA are useful for site-specific system evaluations as part of a multi-barrier approach to safe drinking water.
Health-based targets are the "goal-posts" or "benchmarks" that have to be met to ensure the safety of drinking water. In Canada, microbiological hazards are commonly addressed by two forms of health-based targets: water quality targets and treatment goals. An example of a water quality target is the bacteriological guideline for E. coli, which sets a maximum acceptable concentration for this organism in drinking water. Treatment goals describe the reduction in risk to be provided by measures such as treatment processes aimed at reducing the viability or presence of pathogens. Treatment goals assist in the selection of treatment barriers and should be defined in relation to source water quality. They need to take into account not only normal operating conditions, but also the potential for variations in water quality and/or treatment performance. For example, short periods of poor source water quality following a storm or a decrease in treatment effectiveness due to a process failure may in fact embody most of the risk in a drinking water system (Gale, 2002; Medema et al., 2006). The wide array of microbiological pathogens makes it impractical to measure for all of the potential hazards; thus, treatment goals are generally framed in terms of categories of organisms (e.g., bacteria, viruses and protozoa) rather than individual pathogens. The health-based treatment goal for enteric viruses is a minimum 4-log reduction and/or inactivation of viruses. Many source waters will require a greater log reduction and/or inactivation to maintain an acceptable level of risk.
The burden of disease estimates calculated during a risk assessment should be compared with a reference level of risk--that is, a level of risk that is deemed tolerable or acceptable. This comparison is needed to understand the public health implications of the disease burden estimate and to set health-based treatment goals.
Risk levels have been expressed in several ways. The World Health Organization's (WHO) Guidelines for Drinking-water Quality (WHO, 2004) use DALYs as a unit of measure for risk. The basic principle of the DALY is to calculate a value that considers both the probability of experiencing an illness or injury and the impact of the associated health effects (Murray and Lopez, 1996a; Havelaar and Melse, 2003). The WHO (2004) guidelines adopt 10−6 DALY/person per year as a health target. The Australian National Guidelines for Water Recycling (NRMMC-EPHC, 2006) also cite this target. In contrast, other agencies set acceptable microbial risk levels based on the risk of infection and do not consider the probability or severity of associated health outcomes. For example, the U.S. EPA has used a health-based target of an annual risk of infection of less than 1/10 000 persons (10−4) (Regli et al., 1991).
For comparison, the reference level of 10−6 DALY/person per year is approximately equivalent to an annual risk of illness for an individual of 1/1000 (10−3) for a diarrhoea-causing pathogen with a low fatality rate. For an illness with a more severe health outcome, such as cancer, 10−6 DALY/person per year is approximately equivalent to a lifetime additional risk of cancer over background of 10−5 (i.e., 1 excess case of cancer over background levels per 100 000 people ingesting 1.5 L of drinking-water per day containing the substance at the guideline value over a 70-year life span). QMRA is a useful tool in estimating whether a drinking water system can meet this health target, as current disease surveillance systems in developed nations such as Canada are not able to detect endemic illness at such a low level.
The risk assessment in this Guideline Technical Document estimates the disease burden in DALYs. There are several advantages to using this metric. DALYs take into account both the number of years lost due to mortality and the number of years lived with a disability (compared with the average healthy individual for the region) to determine the health impact associated with a single type of pathogenic organism. The use of DALYs also allows for comparison of health impacts between different pathogens and potentially between microbiological and some chemical hazards. Although no common health metric has been accepted internationally, DALYs have been used by numerous groups, and published, peer-reviewed information is available. The WHO (2004) reference level of 10−6 DALY/person per year is used in this risk assessment as an acceptable level of risk.
QMRA is an approach that uses mathematical modelling and relevant information from selected pathogens to derive disease burden estimates. QMRA follows a common approach in risk assessment, which includes four components: hazard identification, exposure assessment, dose-response assessment and risk characterization.
The first step of QMRA is hazard identification, a qualitative process of identifying hazards to the drinking water system or to human health, such as microorganisms, toxins or chemicals. The enteric viruses of most concern as human health hazards in Canadian drinking water sources include noroviruses, rotaviruses, hepatitis viruses, enteroviruses and adenoviruses.
The presence and types of enteric viruses in a given drinking water source are variable. Therefore, it is important to identify, on a site-specific basis, all potential sources and events, regardless of whether they are under the control of the drinking water supplier, that could lead to enteric viruses being present at concentrations exceeding baseline levels. Human faeces and, in some instances, urine are the main sources of enteric viruses, which may originate from point sources of pollution, such as municipal sewage discharges, or from non-point sources, such as septic systems. In addition to identifying the potential sources of contamination, it is necessary to consider whether the presence of enteric viruses is continuous, intermittent or seasonal, and how rare events, such as droughts or floods, will affect enteric virus concentrations in the source water.
Although all enteric viruses of concern need to be identified, risk assessments do not usually consider each individual enteric virus. Instead, the risk assessment includes only specific enteric viruses (reference pathogens or, in this case, reference viruses) whose characteristics make them good representatives of all similar pathogenic viruses. It is assumed that if the reference virus is controlled, all other similar viruses of concern are also controlled. Ideally, a reference virus will represent a worst-case combination of high occurrence, high concentration and long survival time in source water, low removal and/or inactivation during treatment, and high pathogenicity for all age groups. Numerous enteric viruses have been considered as reference viruses, including adenoviruses, noroviruses and rotaviruses; none of them meets all of the characteristics of an ideal reference virus. Adenoviruses represent a worst case for inactivation during UV treatment; however, they are less prevalent in the population than noroviruses or rotaviruses. Noroviruses, although they are a significant cause of viral gastroenteritis in all age groups, have no published dose-response model currently available. Rotaviruses are a common cause of infection in children, can have severe outcomes and have an available dose-response model; however, they are more susceptible to treatment than some other enteric viruses. As no single virus has all the characteristics of an ideal reference virus, this risk assessment incorporates the key characteristics of rotavirus, with CT values based on HAV and poliovirus (U.S. EPA, 1999) as the best currently available disinfection information for enteric viruses commonly found in surface water and groundwater sources.
Exposure assessments provide an estimate (with associated uncertainty) of the occurrence and level of a contaminant in a specified volume of water at the time of the exposure event (ingestion, inhalation and/or dermal absorption). The principal route of exposure considered in this risk assessment is consumption of drinking water. To determine exposure, the concentration of enteric viruses and the volume of water ingested need to be known or estimated. Exposure can be determined as a single dose of pathogens ingested by a consumer at one time or the total amount over several exposures (e.g., over a year).
Drinking water is not usually monitored for enteric viruses. Therefore, to determine exposure, the concentration of the reference virus in the source water needs to be measured or estimated. Measurements, as opposed to estimates, will result in the highest-quality risk assessment. Short-term peaks in virus concentration may increase disease risks considerably and even trigger outbreaks of waterborne disease; thus, seasonal variation and peak events such as storms should be included in the measurements or estimates. Once the source water concentrations are determined, treatment reductions are calculated to determine the concentration in the finished drinking water. This risk assessment assumes that any viruses that were not removed or inactivated during treatment are still capable of causing infection and illness.
For the volume of water ingested, it is important to consider only the unboiled amount of tap water consumed, as boiling the water inactivates pathogens; including boiled water would therefore overestimate exposure (Gale, 1996; Payment et al., 1997; WHO, 2004). In Canada, approximately 1.5 L of tap water is consumed per person per day; however, approximately 35% of this is consumed in the form of coffee or tea (Health and Welfare Canada, 1981). The elevated temperatures (boiling or near boiling) used for making coffee and tea would inactivate any enteric pathogens present. Therefore, for estimating risk from pathogenic organisms, this risk assessment uses an average consumption of 1 L of water per person per day for determining exposure. This estimate is similar to consumption patterns in other developed nations (Westrell et al., 2006a; Mons et al., 2007). The WHO Guidelines for Drinking-water Quality also suggest using an estimate of 1 L for consumption of unboiled tap water (WHO, 2004).
The dose-response assessment uses dose-response models to estimate the probability of infection and the risk of illness after exposure to the reference virus. The probability of infection (Pinfection) for this risk assessment is calculated using a dose-response model for rotavirus. The rotavirus dose-response data are well approximated by the beta-Poisson model (equation 1) (Haas et al., 1999):
Equation 1

$P_{\text{infection}} = 1 - \left(1 + \frac{d}{\beta}\right)^{-\alpha}$

where:
$P_{\text{infection}}$ = probability of infection resulting from ingestion of the estimated dose
$d$ = estimated dose of organisms ingested (from the exposure assessment)
$\alpha$, $\beta$ = dose-response parameters describing the probability of infection
The beta-Poisson model describes mathematically the distribution of the individual probabilities of any one organism to survive and start infection, where α and β are parameters describing the probability of infection. The approximation of the beta-Poisson model becomes poor, and therefore should not be used, at small values of β or when the estimated dose d is large. The α and β parameters are derived from dose-response studies of healthy volunteers and may not adequately represent effects on sensitive subgroups, such as immunocompromised persons, young children or the elderly (Ward et al., 1986). An individual's daily dose of organisms is estimated using the information from the exposure assessment. An individual's yearly probability of infection is estimated using equation 2. For this risk assessment, it is assumed that there is no secondary spread of infection.
Equation 2

$P_{\text{infection, annual}} = 1 - \left(1 - P_{\text{infection, daily}}\right)^{365}$

where $P_{\text{infection, daily}}$ is the daily probability of infection calculated using equation 1.
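A minimal Python sketch of equations 1 and 2 is shown below. The rotavirus dose-response parameters used here (α ≈ 0.253, β ≈ 0.426) are values commonly cited in the QMRA literature and are included only for illustration; they are assumptions for this sketch and may differ from the values used in the model described in Appendix C.

```python
ALPHA = 0.253   # illustrative rotavirus dose-response parameter (assumption)
BETA = 0.426    # illustrative rotavirus dose-response parameter (assumption)

def p_infection_daily(dose, alpha=ALPHA, beta=BETA):
    """Equation 1: approximate beta-Poisson probability of infection from a
    single daily dose of organisms."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

def p_infection_annual(daily_probability, days=365):
    """Equation 2: yearly probability of at least one infection, assuming
    independent daily exposures."""
    return 1.0 - (1.0 - daily_probability) ** days

# Example: a daily dose of 1e-6 organisms (e.g., 1e-6 virus/L in treated
# water with 1 L/day consumption).
p_daily = p_infection_daily(1e-6)
print(p_daily, p_infection_annual(p_daily))
```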
Not all infected individuals will develop a clinical illness. The risk of illness per year for an individual is estimated using equation 3:
Equation 3

$P_{\text{illness, annual}} = P_{\text{infection, annual}} \times S \times I$

where:
$P_{\text{infection, annual}}$ = yearly probability of infection (equation 2)
$S$ = fraction of the population susceptible to infection
$I$ = fraction of infections that result in symptomatic illness
The fraction of the population that is susceptible to infection and illness varies with the type of enteric virus being considered. For rotavirus, the population susceptible to infection is generally confined to young children. Based on Canadian data, this represents approximately 6% of the population (Ministry of Finance, 2003a,b). However, as this risk assessment uses rotavirus as a representative of all enteric viruses that may be present in drinking water, including those to which greater proportions or most of the population may be susceptible (e.g., norovirus), 100% of the population is assumed to be susceptible to infection. Not all infections result in symptomatic illness. Based on U.S. data, 88% of individuals will develop symptomatic illness after infection with rotavirus (Havelaar and Melse, 2003).
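A short sketch of equation 3, using the fractions described above (100% of the population assumed susceptible and 88% of infections resulting in symptomatic illness), might look as follows; the yearly infection probability passed in is a hypothetical example value.

```python
def risk_of_illness_per_year(p_infection_annual, susceptible_fraction=1.0,
                             illness_fraction=0.88):
    """Equation 3: annual risk of illness for an individual."""
    return p_infection_annual * susceptible_fraction * illness_fraction

# Hypothetical yearly infection probability of 2.2e-4 gives roughly 1.9e-4.
print(risk_of_illness_per_year(2.2e-4))
```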
To translate the risk of illness per year for an individual into a disease burden per person, the DALY is used as a common "metric" that can take into account diverse health outcomes. The key advantage of the DALY as a measure of public health is its aggregate nature, combining life years lost (LYL) with years lived with disability (YLD) to calculate the disease burden. DALYs can be calculated as follows:
Equation 4

$\text{DALYs/case} = \text{YLD} + \text{LYL}$

where:
YLD = years lived with disability, calculated for each non-fatal health outcome as the outcome fraction × duration of illness × severity weight
LYL = life years lost, calculated as the outcome fraction for mortality × the years of life lost per fatal case
For rotavirus, the health effects vary in severity from mild diarrhoea to more severe diarrhoea and potentially death. The disease burden of gastroenteritis resulting from infection with rotavirus in drinking water is 8.46 DALYs/1000 cases (8.46 × 10−3 DALY/case) (Table 6).
| Health outcome | Outcome fraction | Duration of illness | Severity weight | DALYs/case |
|---|---|---|---|---|
| Morbidity (YLD): mild diarrhoea | 0.50 | 0.01918 years (7 days) | 0.067 | 6.43 × 10−4 |
| Morbidity (YLD): bloody diarrhoea | 0.49 | 0.01918 years (7 days) | 0.39 | 3.67 × 10−3 |
| Mortality (LYL): death | 0.0001 | Life expectancy; age at death | 1 | 4.15 × 10−3 |
| Disease burden | | | | 8.46 × 10−3 |
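Applying equation 4 to the values in the table reproduces the total disease burden per case; the mortality entry corresponds to roughly 41.5 life years lost per death (4.15 × 10−3 ÷ 0.0001):

YLD (mild diarrhoea) = 0.50 × 0.01918 × 0.067 ≈ 6.43 × 10−4
YLD (bloody diarrhoea) = 0.49 × 0.01918 × 0.39 ≈ 3.67 × 10−3
LYL (death) ≈ 0.0001 × 41.5 ≈ 4.15 × 10−3
DALYs/case ≈ 6.43 × 10−4 + 3.67 × 10−3 + 4.15 × 10−3 ≈ 8.46 × 10−3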
Using this disease burden (DALYs/case) and the risk of illness per year in an individual, the disease burden in DALYs/person per year can be estimated:
Equation 5

Disease burden (DALYs/person per year) = risk of illness per year × disease burden per case

where:

- risk of illness per year is calculated using equation 3
- disease burden per case = 8.46 × 10−3 DALYs/case for rotavirus (Table 6)
Risk characterization brings together the data collected or estimated on pathogen occurrence in source water, pathogen removal or inactivation through treatment barriers, consumption patterns (exposure) and pathogen dose-response relationships to estimate the burden of disease. Using this information, the potential disease burden associated with a specified drinking water system can be calculated. An example of a disease burden calculation is provided in Figure 1. This calculation is presented as a point estimate; however, it was carried out using a mathematical model that includes probability functions with associated uncertainties (Appendix C). The calculated disease burden can then be compared with the acceptable risk level to determine whether the drinking water being produced is of an acceptable quality. If the disease burden estimate associated with the drinking water does not meet the acceptable risk level, QMRA can then be used to calculate the level of treatment that would be required to meet the acceptable health risk target (10−6 DALY/person per year).
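To make the chain of calculations concrete, the sketch below strings equations 1 through 5 together as a simple point estimate for a Figure 1-style scenario (1 rotavirus/100 L in source water, a 4-log treatment reduction and 1 L/day of unboiled tap water). The dose-response parameters (α = 0.253, β = 0.426) are commonly cited rotavirus values assumed here for illustration; because this point estimate omits the probability distributions and adjustments included in the full model (Appendix C), its output will not match Figure 1 exactly.

```python
# Simplified point-estimate version of the equations 1-5 calculation chain.
# Illustrative only: the full Health Canada model (Appendix C) is probabilistic,
# so results will differ. Parameter values marked "assumed" are not taken from
# this document.

ALPHA, BETA = 0.253, 0.426           # assumed rotavirus dose-response parameters
CONSUMPTION_L_PER_DAY = 1.0          # unboiled tap water consumed per person per day
FRACTION_SUSCEPTIBLE = 1.0           # entire population assumed susceptible
FRACTION_ILL_GIVEN_INFECTION = 0.88  # fraction of infections that become symptomatic
DALYS_PER_CASE = 8.46e-3             # disease burden per case of rotavirus illness


def disease_burden(source_conc_per_100L, log_reduction):
    """Estimate DALYs/person per year from a mean source water concentration
    (viruses/100 L) and a treatment log reduction."""
    # Exposure assessment: mean daily dose of viruses ingested.
    treated_conc_per_L = (source_conc_per_100L / 100.0) * 10.0 ** (-log_reduction)
    dose = treated_conc_per_L * CONSUMPTION_L_PER_DAY
    # Equation 1: daily probability of infection (approximate beta-Poisson model).
    p_inf_daily = 1.0 - (1.0 + dose / BETA) ** (-ALPHA)
    # Equation 2: annual probability of infection.
    p_inf_annual = 1.0 - (1.0 - p_inf_daily) ** 365
    # Equation 3: annual risk of illness.
    p_ill_annual = p_inf_annual * FRACTION_SUSCEPTIBLE * FRACTION_ILL_GIVEN_INFECTION
    # Equation 5: disease burden, using the equation 4 result (DALYs/case) as a constant.
    return p_ill_annual * DALYS_PER_CASE


if __name__ == "__main__":
    # Figure 1-style scenario: 1 rotavirus/100 L in source water, 4-log reduction.
    burden = disease_burden(source_conc_per_100L=1.0, log_reduction=4.0)
    print(f"Estimated disease burden: {burden:.1e} DALYs/person per year")
```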
For example, as shown in Figure 1, when source waters have a mean concentration of approximately 1 rotavirus/100 L and the treatment plant consistently achieves at least a 4-log reduction in virus concentration, the burden of disease in the population would meet the reference level of 10−6 DALY/person per year (1 case/1000 people per year). A source water concentration of 1 rotavirus/100 L of water is relatively low; it generally represents groundwater sources and relatively pristine surface water sources. Many surface water sources will have virus concentrations in the range of 1-100 viruses/L of water (100-10 000 viruses/100 L). These levels require a much greater log reduction to meet the acceptable disease burden.
Figure 2 shows that a source water with a mean concentration of 10 rotaviruses/L of water (1000 rotaviruses/100 L of water) would require the treatment plant to consistently achieve at least a 7-log reduction in virus concentration to meet the acceptable reference level of risk. Consequently, the health-based treatment goal of a 4-log reduction of enteric viruses is a minimum requirement. A site-specific assessment should be done to determine the level of enteric virus reduction needed for any given source water. Monitoring enteric virus concentrations in source water, as opposed to estimating them, will result in the highest-quality risk assessment. However, if measurements are not possible, concentrations may be estimated based on perceived source water quality. Information obtained from sanitary surveys, vulnerability assessments and other water quality parameters, such as indicator organisms, can be used to help estimate the risk and/or level of faecal contamination in the source water. As part of the site-specific assessment, it is important to consider events that can significantly change source water quality, such as hazardous spills or storms. These events will have an important impact on the treatment required, and including such variations in source water quality will provide the best estimate of the risk in a system. Understanding and planning for the variations that occur in source water quality creates a more robust system that can include safety margins. It is also important to take into consideration the level of uncertainty inherent in carrying out a QMRA to ensure that the treatment in place is producing water of an acceptable quality. A sensitivity analysis using a QMRA model, such as the one described in Appendix C, can also help identify critical control points and their limits.
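Building on the point-estimate sketch above, the following lines show one way to back-calculate the treatment reduction needed to reach the 10−6 DALY/person per year reference level for a given mean source water concentration. The same caveats apply: the parameters are assumed, the calculation is linearized for the low-dose regime, and the numbers it produces are indicative only and will differ somewhat from the 4-log and 7-log figures derived from the full probabilistic model (Figures 1 and 2, Appendix C).

```python
import math

# Screening sketch: log reduction needed to meet the 1e-6 DALY/person per year
# reference level, using the low-dose linear form of the beta-Poisson model
# (P_daily ~ alpha * dose / beta). Parameter values are assumed, as in the
# previous sketch; indicative only.

ALPHA, BETA = 0.253, 0.426
CONSUMPTION_L_PER_DAY = 1.0
FRACTION_SUSCEPTIBLE = 1.0
FRACTION_ILL_GIVEN_INFECTION = 0.88
DALYS_PER_CASE = 8.46e-3
REFERENCE_LEVEL = 1e-6


def required_log_reduction(source_conc_per_100L):
    """Approximate log reduction needed for a given mean source concentration."""
    # Annual disease burden contributed by each virus ingested per day (low-dose regime).
    burden_per_daily_virus = (365 * (ALPHA / BETA)
                              * FRACTION_SUSCEPTIBLE
                              * FRACTION_ILL_GIVEN_INFECTION
                              * DALYS_PER_CASE)
    untreated_daily_intake = (source_conc_per_100L / 100.0) * CONSUMPTION_L_PER_DAY
    return math.log10(untreated_daily_intake * burden_per_daily_virus / REFERENCE_LEVEL)


if __name__ == "__main__":
    for conc in (1.0, 1000.0):  # 1 rotavirus/100 L and 1000 rotaviruses/100 L
        print(f"{conc:6.0f} viruses/100 L -> ~{required_log_reduction(conc):.1f}-log reduction needed")
```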

Figure 1: Example of a risk assessment under specified conditions. The calculation is presented as a point estimate; however, the calculation was done using a mathematical model that included probability functions with associated uncertainties.

Figure 2: Treatment requirements to meet an acceptable level of risk of 10−6 DALY/person per year, based on 1 L consumption
QMRA is increasingly being applied by international agencies and governments at all levels as the foundation for informed decision-making surrounding the health risks from pathogens in drinking water. WHO, the European Commission, the Netherlands, Australia and the United States have all made important advances in QMRA validation and methodology (Staatscourant, 2001; WHO, 2004; NRMMC-EPHC, 2006; Medema et al., 2006; U.S. EPA, 2006a,b). With the exception of the U.S. EPA, these agencies and governments have adopted an approach that takes full advantage of the potential of QMRA to inform the development of health targets (i.e., acceptable levels of risk or disease) and site-specific risk management (e.g., Water Safety Plans, as described in WHO, 2004). Building on the WHO work, the European Commission's Microrisk project has published an extensive guidance document that establishes methods and a strong science basis for QMRA of drinking water (Medema et al., 2006).
The Netherlands and the U.S. EPA provide two examples of QMRA-based regulatory approaches. In the Netherlands, consistent with the WHO approach, water suppliers must conduct a site-specific QMRA on all surface water supplies to determine if the system can meet a specified level of risk. Dutch authorities can also require a QMRA of vulnerable groundwater supplies. In contrast, recent regulatory activity in the United States has seen the U.S. EPA assess the health risks from waterborne pathogens through QMRA and apply this information to set nationwide obligatory treatment performance requirements (U.S. EPA, 2006a,b). In general, drinking water systems must achieve a 4-log removal or inactivation of enteric viruses to address risk from enteric viruses (U.S. EPA, 2006a). To address risk specifically from Cryptosporidium, drinking water systems must monitor their source water, calculate an average Cryptosporidium concentration and use those results to determine if their source is vulnerable to contamination and requires additional treatment. Water systems are classified into categories ("bins") based on whether they are filtered or unfiltered systems; these bins specify additional removal or inactivation requirements for Cryptosporidium spp. (U.S. EPA, 2006b).
Health Canada and the Federal-Provincial-Territorial Committee on Drinking Water have chosen the same approach as WHO (2004), providing QMRA-based performance targets as minimum requirements, but recommend the use of a site-specific QMRA as part of a multi-barrier source-to-tap approach. The QMRA approach offers a number of advantages, including 1) the ability to compare the risk from representative groups of pathogens (e.g., viruses, protozoa, bacteria) in an overall assessment; 2) the transparency of assumptions; 3) the potential to account for variability and uncertainty in estimates; 4) the removal of hidden safety factors (these can be applied as a conscious choice by regulatory authorities at the end of the process, if desired); 5) the site-specific identification of critical control points and limits through sensitivity analysis; and 6) the clear implications of system management on a public health outcome.
There are more than 140 types of enteric viruses known to infect humans. They are excreted in the faeces, and sometimes the urine, of infected persons and animals and are potentially found in both surface water and groundwater sources. Enteric viruses are responsible mainly for acute illnesses, although some links to chronic illnesses have been made. Many of these viruses cannot be cultured, and their occurrence in water varies in time and space. At any given time, one virus type may be more prevalent than another in the sewage from a community and affect the source water quality in communities downstream. The best means of safeguarding against hazardous levels of enteric viruses in drinking water is the application of the multi-barrier approach, including source water protection and adequate treatment, demonstrated using appropriate physicochemical parameters, followed by verification of the absence of faecal indicator organisms in the finished water.
The large number of enteric viruses, the temporal and spatial variations in their occurrence and the limitations of available methods make it impractical to routinely monitor for these organisms; therefore, a water quality target, such as a maximum acceptable concentration, has not been established for enteric viruses in drinking water. Instead, the protection of public health is accomplished by setting health-based treatment goals.
To set health-based treatment goals, the level of risk deemed tolerable or acceptable needs to be determined. The Federal-Provincial-Territorial Committee on Drinking Water has chosen this acceptable level of risk as 10−6 DALY/person per year, which is consistent with the reference level adopted by WHO (2004). This is a risk management decision that balances the estimated disease burden from enteric viruses with the lack of information on the prevalence of these pathogens in source waters, limitations in disease surveillance and the variations in performance within different types of water treatment technologies.
Although all enteric viruses of concern need to be identified, risk assessments do not usually consider each individual enteric virus. Instead, the risk assessment includes only specific enteric viruses (reference pathogens or, in this case, reference viruses) whose characteristics make them a good representative of all similar pathogenic viruses. It is assumed that if the reference virus is controlled, this would ensure control of all other similar viruses of concern. Rotavirus has been selected as the reference virus for this risk assessment because of the prevalence of infection in children, the possibility of severe outcomes and the availability of a dose-response model.
A source water concentration of 1 rotavirus/100 L of water generally represents groundwater sources and relatively pristine surface water sources. In Canada, many surface water sources will have virus concentrations in the range of 1-100 viruses/L of water (100-10 000 viruses/100 L). The QMRA approach used in this guideline demonstrates that if a source water has a mean concentration of approximately 1 rotavirus/100 L, a water treatment plant would need to consistently achieve at least a 4-log reduction in virus concentration in order to meet the reference level of 10−6 DALY/person per year. Thus, a minimum 4-log removal and/or inactivation of viruses has been established as a health-based treatment goal. A jurisdiction may allow a groundwater source considered less vulnerable to faecal contamination to have less than the recommended minimum 4-log reduction if the assessment of the drinking water system has confirmed that the risk of enteric virus presence is minimal. Many source waters in Canada may require more than the minimum treatment goal to meet the acceptable level of risk.
QMRA can be used on a site-specific basis to evaluate how variations in source water quality may contribute to microbiological risk and to assess the adequacy of existing control measures or the requirement for additional treatment barriers or optimization. In most cases, a well-operated treatment plant employing effective coagulation, flocculation, clarification, filtration and disinfection achieving a sufficient CT value should produce water with a negligible risk of infection from enteric viruses. Where possible, watersheds or aquifers that are used as sources of drinking water should be protected from faecal waste.
CT values (in mg·min/L) for inactivation of viruses by free chlorine

| Temperature (°C) | 2-log, pH 6-9 | 2-log, pH 10 | 3-log, pH 6-9 | 3-log, pH 10 | 4-log, pH 6-9 | 4-log, pH 10 |
|---|---|---|---|---|---|---|
| 0.5 | 6 | 45 | 9 | 66 | 12 | 90 |
| 5 | 4 | 30 | 6 | 44 | 8 | 60 |
| 10 | 3 | 22 | 4 | 33 | 6 | 45 |
| 15 | 2 | 15 | 3 | 22 | 4 | 30 |
| 20 | 1 | 11 | 2 | 16 | 3 | 22 |
| 25 | 1 | 7 | 1 | 11 | 2 | 15 |
CT values (in mg·min/L) for inactivation of viruses by chloramine

| Temperature (°C) | 2-log | 3-log | 4-log |
|---|---|---|---|
| ≤1 | 1243 | 2063 | 2883 |
| 5 | 857 | 1423 | 1988 |
| 10 | 643 | 1067 | 1491 |
| 15 | 428 | 712 | 994 |
| 20 | 321 | 534 | 746 |
| 25 | 214 | 356 | 497 |
CT values (in mg·min/L) for inactivation of viruses by chlorine dioxide

| Temperature (°C) | 2-log | 3-log | 4-log |
|---|---|---|---|
| ≤1 | 8.4 | 25.6 | 50.1 |
| 5 | 5.6 | 17.1 | 33.4 |
| 10 | 4.2 | 12.8 | 25.1 |
| 15 | 2.8 | 8.6 | 16.7 |
| 20 | 2.1 | 6.4 | 12.5 |
| 25 | 1.4 | 4.3 | 8.4 |
CT values (in mg·min/L) for inactivation of viruses by ozone

| Temperature (°C) | 2-log | 3-log | 4-log |
|---|---|---|---|
| ≤1 | 0.9 | 1.4 | 1.8 |
| 5 | 0.6 | 0.9 | 1.2 |
| 10 | 0.5 | 0.8 | 1.0 |
| 15 | 0.3 | 0.5 | 0.6 |
| 20 | 0.25 | 0.4 | 0.5 |
| 25 | 0.15 | 0.25 | 0.3 |
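As a brief illustration of how these tables are applied, CT is the product of the disinfectant residual concentration C (in mg/L) and the contact time T (in minutes). Using the free chlorine values above, a 4-log inactivation of viruses at 5°C and pH 6-9 requires a CT of 8 mg·min/L; with a hypothetical free chlorine residual of 0.5 mg/L, the corresponding contact time would be:

T = CT ÷ C = 8 mg·min/L ÷ 0.5 mg/L = 16 min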
Mathematical models have been developed to quantitatively assess the potential microbiological risks associated with a drinking water system, including the potential risks associated with bacterial, protozoan and viral pathogens. These models have been developed by international organizations (Smeets et al., 2008; Teunis et al., 2009), as well as by groups within Canada (Jaidi et al., 2009). QMRA models have also been used to estimate the potential health risks through other routes of exposure (Mara et al., 2007; Armstrong and Haas, 2008; Diallo et al., 2008). Although some of the assumptions vary between models (e.g., the choice of reference pathogen or the selection of dose-response variables), all are based on the accepted principles of QMRA: hazard identification, exposure assessment, dose-response assessment and risk characterization.
A QMRA model was developed by Health Canada as part of the risk assessment process for enteric pathogens in drinking water. This probabilistic model explores the potential disease burden, with associated uncertainty, for user-defined scenarios for a drinking water system. The model includes user inputs for the virological quality of the raw water source and the specific treatment train (defined in terms of filtration and disinfection approaches). For drinking water systems where data are lacking for the above parameters, the model includes values from the published literature and from expert opinion as a starting point for the assessment. For source water quality, the model provides users with a choice of four categories: category A (0.1 rotavirus/100 L), category B (1 rotavirus/100 L), category C (10 rotaviruses/100 L) and category D (100 rotaviruses/100 L). Water sources that are category A or B for enteric viruses are assumed to have very little human faecal contamination. These may be groundwater sources or protected surface water sources where there is very limited human activity in the watershed. As the level of human impact in a watershed increases, the source water quality may fall into category C or D. The enteric virus concentrations are expressed as the number of rotaviruses per volume of water because the QMRA model uses the key characteristics of rotavirus, with the CT values based on HAV and poliovirus, for the risk calculations. These source water quality estimates were developed to be used only within the context of the QMRA model for evaluating the impacts of variations in source water quality on the overall microbiological risks. It should be noted that although a source may be category A for enteric viruses, it may fall into a different source water quality category for bacterial or protozoan pathogens. For treatment processes, the model uses a range of literature values to more accurately represent the range of effectiveness of treatment methodologies.
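One way to picture how these categories feed into a risk calculation is as a simple lookup from category to assumed mean source water concentration, which could then be passed to a point-estimate function such as the disease_burden() sketch shown earlier. The structure below is purely illustrative and is not the implementation of the Health Canada model.

```python
# Illustrative representation only: the source water quality categories described
# in the text, expressed as a lookup from category to assumed mean rotavirus
# concentration (viruses/100 L).

SOURCE_WATER_CATEGORIES = {
    "A": 0.1,    # e.g., protected groundwater with very little human faecal impact
    "B": 1.0,    # e.g., pristine or well-protected surface water
    "C": 10.0,   # increasing human impact in the watershed
    "D": 100.0,  # heavily impacted surface water
}


def category_concentration(category):
    """Return the assumed mean virus concentration (viruses/100 L) for a category."""
    return SOURCE_WATER_CATEGORIES[category.upper()]
```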
The QMRA model uses this exposure information, along with the dose-response model and the DALY calculations for rotavirus, to estimate the potential disease burden (in DALYs/person per year) for the site-specific scenario. The quality of the outputs from the QMRA model is dependent on the quality of the information that is input into the model. Measurements, as opposed to estimates, of exposure levels will result in a higher-quality risk assessment output. Even with high-quality exposure data, the QMRA model requires numerous assumptions that introduce uncertainty into the assessment, including those noted earlier in this risk assessment:

- rotavirus is used to represent all enteric viruses that may be present in drinking water;
- the dose-response parameters are derived from studies of healthy adult volunteers and may not adequately represent sensitive subgroups;
- 100% of the population is assumed to be susceptible to infection, and 88% of infections are assumed to result in symptomatic illness;
- no secondary spread of infection is assumed; and
- consumption is assumed to be 1 L of unboiled tap water per person per day.
For the purposes of this document, a semi-public water supply system is defined as a system with a minimal or no distribution system that provides water to the public from a facility not connected to a public supply. Examples of such facilities include schools, personal care homes, hotels and restaurants. The definition of a semi-public supply may vary between jurisdictions.