Table of Contents
- Executive Summary
- Chlorination and Public Health
- Chlorine: The Disinfectant of Choice
- The Risks of Waterborne Disease
- The Challenge of Disinfection Byproducts
- Drinking Water and Security
- Comparing Alternative Disinfection Methods
- The Future of Chlorine Disinfection
- Glossary
- References
Executive Summary
The treatment and distribution of water for safe use is one of the greatest achievements of the twentieth century. Before cities began routinely treating drinking water with chlorine (starting with Chicago and Jersey City in 1908), cholera, typhoid fever, dysentery and hepatitis A killed thousands of U.S. residents annually. Drinking water chlorination and filtration have helped to virtually eliminate these diseases in the U.S. and other developed countries.
Meeting the goal of clean, safe drinking water requires a multi-barrier approach that includes: protecting source water from contamination, appropriately treating raw water, and ensuring safe distribution of treated water to consumers’ taps.
During the treatment process, chlorine is added to drinking water as elemental chlorine (chlorine gas), sodium hypochlorite solution or dry calcium hypochlorite. When applied to water, each of these forms “free chlorine,” which destroys pathogenic (disease-causing) organisms.
Almost all U.S. systems that disinfect their water use some type of chlorine-based process, either alone or in combination with other disinfectants. In addition to controlling disease-causing organisms, chlorination offers a number of benefits including:
- Reduces many disagreeable tastes and odors;
- Eliminates slime bacteria, molds and algae that commonly grow in water supply reservoirs, on the walls of water mains and in storage tanks;
- Removes chemical compounds that have unpleasant tastes and hinder disinfection; and
- Helps remove iron and manganese from raw water.
As importantly, only chlorine-based chemicals provide “residual disinfectant” levels that prevent microbial re-growth and help protect treated water throughout the distribution system.
The Risks of Waterborne Disease
Where adequate water treatment is not readily available, the impact on public health can be devastating. Worldwide, about 1.2 billion people lack access to safe drinking water, and twice that many lack adequate sanitation. As a result, the World Health Organization estimates that 3.4 million people, mostly children, die every year from water-related diseases.
Even where water treatment is widely practiced, constant vigilance is required to guard against waterborne disease outbreaks. Well-known pathogens such as E. coli are easily controlled with chlorination, but can cause deadly outbreaks given conditions of inadequate or no disinfection. A striking example occurred in May 2000 in the Canadian town of Walkerton, Ontario. Seven people died and more than 2,300 became ill after E. coli and other bacteria infected the town’s water supply. A report published by the Ontario Ministry of the Attorney General concludes that, even after the well was contaminated, the Walkerton disaster could have been prevented if the required chlorine residuals had been maintained.
Some emerging pathogens such as Cryptosporidium are resistant to chlorination and can appear even in high quality water supplies. Cryptosporidium was the cause of the largest reported drinking water outbreak in U.S. history, affecting over 400,000 people in Milwaukee in April 1993. More than 100 deaths are attributed to this outbreak. New regulations from the U.S. Environmental Protection Agency (EPA) will require water systems to monitor Cryptosporidium and adopt a range of treatment options based on source water Cryptosporidium concentrations. Most water systems are expected to meet EPA requirements while continuing to use chlorination.
The Challenge of Disinfection Byproducts
While protecting against microbial contamination is the top priority, water systems must also control disinfection byproducts (DBPs), chemical compounds formed unintentionally when chlorine and other disinfectants react with natural organic matter in water. In the early 1970s, EPA scientists first determined that drinking water chlorination could form a group of byproducts known as trihalomethanes (THMs), including chloroform. EPA set the first regulatory limits for THMs in 1979. While the available evidence does not prove that DBPs in drinking water cause adverse health effects in humans, high levels of these chemicals are certainly undesirable. Cost-effective methods to reduce DBP formation are available and should be adopted where possible. However, a report by the International Programme on Chemical Safety (IPCS 2000) strongly cautions:
The health risks from these byproducts at the levels at which they occur in drinking water are extremely small in comparison with the risks associated with inadequate disinfection. Thus, it is important that disinfection not be compromised in attempting to control such byproducts.
Recent EPA regulations have further limited THMs and other DBPs in drinking water. Most water systems are meeting these new standards by controlling the amount of natural organic material prior to disinfection.
Chlorine and Water System Security
The prospect of a terrorist attack has forced all water systems, large and small, to re-evaluate and upgrade existing security measures. Since September 11th, 2001, water system managers have taken unprecedented steps to protect against possible attacks such as chemical or biological contamination of the water supply, disruption of water treatment or distribution, and intentional release of treatment chemicals.
With passage of the Public Health Security and Bioterrorism Response Act of 2002, Congress required community water systems to assess their vulnerability to a terrorist attack and other intentional acts. As part of these vulnerability assessments, systems assess the transportation, storage and use of treatment chemicals. These chemicals are both critical assets (necessary for delivering safe water) and potential vulnerabilities (may pose significant hazards, if released). Water systems using elemental chlorine, in particular, must determine whether existing protection systems are adequate. If not, they must consider additional measures to reduce the likelihood of an attack or to mitigate the potential consequences.
Disinfection is crucial to water system security, providing the “front line” of defense against biological contamination. However, conventional treatment barriers in no way guarantee safety from biological attacks. Additional research and funding are needed to improve prevention, detection and responses to potential threats.
The Future of Chlorine Disinfection
Despite a range of new challenges, drinking water chlorination will remain a cornerstone of waterborne disease prevention. Chlorine’s wide array of benefits cannot be provided by any other single disinfectant. While alternative disinfectants (including chlorine dioxide, ozone, and ultraviolet radiation) are available, all disinfection methods have unique benefits, limitations, and costs. Water system managers must consider these factors, and design a disinfection approach to match each system’s characteristics and source water quality.
In addition, world leaders increasingly recognize safe drinking water as a critical building block of sustainable development. Chlorination can provide cost-effective disinfection for remote rural villages and large cities alike, helping to bring safe water to those in need.
Chlorination and Public Health

Of all the advancements made possible through science and technology, the treatment and distribution of water for safe use is truly one of the greatest. Abundant, clean water is essential for good public health. Humans cannot survive without water; in fact, our bodies are 67% water! Both the U.S. Centers for Disease Control and Prevention and the National Academy of Engineering cite water treatment as one of the most significant advancements of the last century.
Disinfection, a chemical process whose objective is to control disease-causing microorganisms by killing or inactivating them, is unquestionably the most important step in drinking water treatment. By far, the most common method of disinfection in North America is chlorination.
Prior to 1908, no U.S. municipal water systems chemically disinfected water. Consequently, waterborne diseases exacted a heavy toll in illness and death. Without chlorination or other disinfection processes, consumers are at great risk of contracting waterborne diseases. Figure 1-1 shows the decline in the death rate due to typhoid fever following the introduction of chlorine to U.S. municipal drinking water systems in 1908. As more cities adopted water chlorination, U.S. death rates due to cholera and hepatitis A also declined dramatically. Worldwide, significant strides in public health and the quality of life are directly linked to the adoption of drinking water chlorination. Recognizing this success, Life magazine (1997) declared, “The filtration of drinking water plus the use of chlorine is probably the most significant public health advancement of the millennium.”
The timeline at the bottom of these pages highlights important developments in the history of drinking water chlorination.
Providing Safe Drinking Water: A Multi-Barrier Approach
Meeting the goal of clean, safe drinking water requires a multi-barrier approach that includes protecting raw source water from contamination, appropriately treating raw water, and ensuring safe distribution of treated water to consumers’ taps.
Chlorination Milestones 1870 – 2000
- 1870–1880s: Scientists demonstrate that microorganisms can cause disease.
- 1890s: First application of chlorine disinfectants to water facilities in England.
- 1908: First application of chlorine disinfectants to U.S. municipal water facilities in Jersey City and Chicago.
- 1915: First U.S. drinking water bacterial standard.
- 1917: Chloramination first used in the U.S. and Canada.
- 1918: Over 1,000 U.S. cities employ chlorine disinfection.
- 1925: U.S. drinking water bacterial standard becomes more stringent.
- Early 1960s: More than 19,000 municipal water systems operate throughout the U.S.
- 1970s: Chlorine dioxide begins to gain acceptance as a drinking water disinfectant.
- 1972: Passage of the U.S. Clean Water Act for restoring and maintaining surface water quality.
- 1974: Passage of the U.S. Safe Drinking Water Act; the U.S. Environmental Protection Agency is given authority to set water quality standards which states must enforce.
- 1996: Amendments to the U.S. Safe Drinking Water Act extend existing law to recognize source water protection, operator training, funding for water system improvements, and public information.
Source Water Protection
Source water includes any surface water (rivers and lakes) or groundwater used as a raw water supply. Every drop of rain and melted flake of snow that does not re-enter the atmosphere after falling to the ground wends its way, by the constant pull of gravity, into the vast interconnected system of Earth’s ground- and surface waters. Precipitation ultimately collects into geographic regions known as watersheds or catchment basins, the shapes of which are determined by an area’s topography.
Increasingly, communities are implementing watershed management plans to protect source water from contamination and ecological disruption. For example, stream buffers may be established as natural boundaries between streams and existing areas of development. In addition, land use planning may be employed to minimize the total area of impervious surfaces such as roads and walkways, which prevent water from soaking into the ground. Reservoirs may be protected from contamination by disinfecting wastewater effluents, prohibiting septic system discharges and even controlling beaver activity (beaver feces are potential sources of the harmful protozoan parasites Giardia lamblia and Cryptosporidium parvum). Similarly, the Safe Drinking Water Act requires wellhead protection programs for water systems using groundwater sources. In such programs, the surface region above an aquifer is protected from contaminants that may infiltrate groundwater. Because source water quality affects the kind of treatment needed, watershed management planning is a sustainable, cost-effective step in providing safe drinking water.

Water Treatment
Every day, approximately 170,000 (U.S. EPA, 2002) public water systems treat and convey billions of gallons of water through approximately 880,000 miles (Kirmeyer, 1994) of distribution system piping to U.S. homes, farms and businesses. Broadly speaking, water is treated to render it suitable for human use and consumption. While the primary goal is to produce a biologically (disinfected) and chemically safe product, other objectives also must be met, including: no objectionable taste or odor; low levels of color and turbidity (cloudiness); and chemical stability (non-corrosive and non-scaling). Individual facilities customize treatment to address the particular natural and manmade contamination characteristic of their raw water. Surface water usually presents a greater treatment challenge than groundwater, which is naturally filtered as it percolates through sediments. Surface water is laden with organic and mineral particulate matter, and may harbor protozoan parasites such as Cryptosporidium parvum and Giardia lamblia. The graphic on the following page illustrates and describes the four main steps in a water treatment plant employing chlorine disinfection.
Water Distribution
In storage and distribution, drinking water must be kept safe from microbial contamination. Frequently, slippery films of bacteria, known as biofilms, develop on the inside walls of pipes and storage containers. Among disinfection techniques, chlorination is unique in that a pre-determined chlorine concentration may be designed to remain in treated water as a measure of protection against harmful microbes encountered after leaving the treatment facility.
In the event of a significant intrusion of pathogens resulting, for example, from a broken water main, the average “chlorine residual” will be insufficient to disinfect the contaminated water. In such cases, it is a sudden drop in the chlorine residual, detected through routine monitoring, that gives water system operators the critical indication that contamination has entered the system.
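Because the warning sign is the drop itself, residual monitoring lends itself to a simple automated check. The following is a minimal illustrative sketch in Python; the readings and alert thresholds are hypothetical assumptions, not regulatory or operational values.

```python
# Minimal sketch of residual monitoring: flag readings where the chlorine
# residual falls below a floor or drops sharply between consecutive samples.
# The readings and thresholds below are illustrative assumptions only.

def flag_residual_alerts(readings_mg_per_l, min_residual=0.2, max_drop=0.5):
    """Return (index, reason, value) tuples for suspicious residual readings."""
    alerts = []
    for i, value in enumerate(readings_mg_per_l):
        if value < min_residual:
            alerts.append((i, "residual below floor", value))
        if i > 0 and readings_mg_per_l[i - 1] - value > max_drop:
            alerts.append((i, "sudden drop from previous reading", value))
    return alerts

# Example: a stable residual that suddenly collapses, as might follow a main break.
hourly_readings = [1.1, 1.0, 1.1, 1.0, 0.3, 0.1]
for index, reason, value in flag_residual_alerts(hourly_readings):
    print(f"reading {index}: {reason} ({value} mg/L)")
```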

1. Coagulation
Alum (an aluminum sulfate) or other metal salts are added to raw water to aggregate particles into masses that settle more readily than individual particles.
2. Sedimentation
Coagulated particles fall, by gravity, through water in a settling tank and accumulate at the bottom of the tank, clearing the water of much of the solid debris.
3. Filtration
Water from the sedimentation tank is forced through sand, gravel, coal, or activated charcoal to remove solid particles not previously removed by sedimentation.
4. Disinfection
Chlorine is added to filtered water to destroy harmful microorganisms. An additional amount, known as a “chlorine residual” is applied to protect treated water from re-contamination as it travels throughout the distribution system.
Source: Illustration by Bremmer and Goris Communications.
Chlorine: The Disinfectant of Choice
Chlorine is added to water as elemental chlorine (chlorine gas), sodium hypochlorite solution or dry calcium hypochlorite. When applied to water, each of these forms “free chlorine” (see Sidebar: How Chlorine Kills Pathogens). One pound of elemental chlorine provides approximately as much free available chlorine as one gallon of sodium hypochlorite (12.5% solution) or approximately 1.5 pounds of calcium hypochlorite (65% strength). While any of these forms of chlorine can effectively disinfect drinking water, each has distinct advantages and limitations for particular applications.
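As a rough check on this equivalence, the arithmetic can be worked through directly. The short Python sketch below assumes the 12.5% sodium hypochlorite figure is a trade percent (about 125 grams of available chlorine per liter of solution); that interpretation is an assumption added for illustration, not a statement from the text.

```python
# Back-of-the-envelope check of the available-chlorine comparison above.
# Assumption (not from the text): "12.5%" sodium hypochlorite is read as
# trade percent, i.e. about 125 g of available chlorine per liter.

GRAMS_PER_POUND = 453.6
LITERS_PER_GALLON = 3.785

# 1 gallon of 12.5 trade percent sodium hypochlorite
bleach_lb = 125 * LITERS_PER_GALLON / GRAMS_PER_POUND

# 1.5 pounds of 65% calcium hypochlorite
cal_hypo_lb = 1.5 * 0.65

print(f"1 gal of 12.5% NaOCl:   ~{bleach_lb:.2f} lb available chlorine")
print(f"1.5 lb of 65% Ca(OCl)2: ~{cal_hypo_lb:.2f} lb available chlorine")
# Both are close to the 1 lb of available chlorine in 1 lb of elemental
# chlorine, consistent with the approximate equivalence stated in the text.
```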
Almost all water systems that disinfect their water use some type of chlorine-based process, either alone or in combination with other disinfectants. Table 2-1 shows the percentage of drinking water systems using each of these methods.
The Benefits of Chlorine
Potent Germicide
Chlorine disinfectants can reduce the level of many disease-causing microorganisms in drinking water to almost immeasurable levels.
Taste and Odor Control
Chlorine disinfectants reduce many disagreeable tastes and odors. Chlorine oxidizes many naturally occurring substances such as foul-smelling algae secretions, sulfides and odors from decaying vegetation.
Biological Growth Control
Chlorine disinfectants eliminate slime bacteria, molds and algae that commonly grow in water supply reservoirs, on the walls of water mains and in storage tanks.
Chemical Control
Chlorine disinfectants destroy hydrogen sulfide (which has a rotten egg odor) and remove ammonia and other nitrogenous compounds that have unpleasant tastes and hinder disinfection. They also help to remove iron and manganese from raw water.
| Disinfectant | Large Systems (>10,000 persons) | Small Systems Using Groundwater (<10,000 persons) | Small Systems Using Surface Water (<10,000 persons) |
| --- | --- | --- | --- |
| Elemental Chlorine | 84% | 61% | 82% |
| Sodium Hypochlorite | 20% | 34% | 17% |
| Calcium Hypochlorite | <1% | – | 9% |
| Chloramines | 29% | – | 2% |
| Ozone | 6% | – | – |
| UV | – | – | – |
| Chlorine Dioxide | 8% | – | 6% |

Source: American Water Works Association 2000. Note: The totals may be greater than 100 percent because some systems use more than one type of disinfectant.
Residual Disinfection — Protecting All the Way to the Tap
EPA requires that a disinfectant residual be maintained in distribution system pipelines to prevent microbial re-growth and help protect treated water throughout the distribution system. EPA’s maximum residual disinfectant levels (MRDLs) are 4 mg/L for chlorine, 4 mg/L for chloramines and 0.8 mg/L for chlorine dioxide. Although chlorine levels are usually significantly lower in tap water, EPA believes that levels as high as the MRDLs pose no risk of adverse health effects, allowing for an adequate margin of safety (U.S. EPA, 1998a).
How Chlorine Kills Pathogens
How does chlorine carry out its well-known role of making water safe? Upon adding chlorine to water, two chemical species, known together as “free chlorine,” are formed. These species, hypochlorous acid (HOCl, electrically neutral) and hypochlorite ion (OCl-, electrically negative), behave very differently. Hypochlorous acid is not only more reactive than the hypochlorite ion, but is also a stronger disinfectant and oxidant.
The ratio of hypochlorous acid to hypochlorite ion in water is determined by the pH. At low pH (higher acidity), hypochlorous acid dominates, while at high pH hypochlorite ion dominates. Thus, the speed and efficacy of chlorine disinfection against pathogens may be affected by the pH of the water being treated. Fortunately, bacteria and viruses are relatively easy targets of chlorination over a wide range of pH. However, operators of surface water systems treating raw water contaminated by the parasitic protozoan Giardia may take advantage of the pH-hypochlorous acid relationship, adjusting pH so that disinfection remains effective against this parasite, which is much more resistant to chlorination than either viruses or bacteria.
A typical bacterium has a negatively charged slime coating on its exterior cell wall, which is effectively penetrated by electrically neutral hypochlorous acid, the species favored at lower pH. (Reprinted from The Chlorination/Chloramination Handbook by permission. Copyright © 1996, American Water Works Association.) Source: Connell, 1996.
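To make the pH relationship concrete, the short sketch below estimates the fraction of free chlorine present as hypochlorous acid at several pH values. The dissociation constant it uses (a pKa of roughly 7.5 near room temperature) is a commonly cited figure added here as an assumption; it does not appear in the sidebar above.

```python
# Illustrative calculation of the pH dependence described in the sidebar:
# the fraction of free chlorine present as hypochlorous acid (HOCl) rather
# than hypochlorite ion (OCl-). The pKa of ~7.5 is an assumed, commonly
# cited value used only for illustration.

def hocl_fraction(ph, pka=7.5):
    """Fraction of free chlorine present as HOCl at a given pH."""
    return 1.0 / (1.0 + 10 ** (ph - pka))

for ph in (6.0, 7.0, 7.5, 8.0, 9.0):
    print(f"pH {ph}: about {hocl_fraction(ph):.0%} HOCl")
# Lower pH favors HOCl, the stronger disinfectant, which is why operators
# may adjust pH when treating Giardia-contaminated surface water.
```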
Factors in Chlorine Disinfection: Concentration and Contact Time
In an attempt to establish more structured operating criteria for water treatment disinfection, the C×T concept came into use in 1980. Based on the work of several researchers, C×T values [final free chlorine concentration (mg/L) multiplied by minimum contact time (minutes)] offer water operators guidance in computing an effective combination of chlorine concentration and contact time required to achieve disinfection of water at a given temperature. The C×T formula demonstrates that if an operator chooses to decrease the chlorine concentration, the required contact time must be lengthened. Similarly, as higher strength chlorine solutions are used, contact times may be reduced (Connell, 1996).
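The trade-off can be illustrated with a small worked example. The target C×T value in the sketch below is purely illustrative; actual requirements depend on the target organism, water temperature and pH, and come from regulatory tables rather than this calculation.

```python
# Simple sketch of the C x T trade-off: contact time required to reach a
# target C x T value at a chosen free chlorine concentration. The target
# value below is hypothetical and for illustration only.

def required_contact_minutes(target_ct_mg_min_per_l, chlorine_mg_per_l):
    """Minimum contact time (minutes) to reach the target C x T value."""
    return target_ct_mg_min_per_l / chlorine_mg_per_l

TARGET_CT = 12.0  # mg*min/L, hypothetical
for concentration in (0.5, 1.0, 2.0):
    minutes = required_contact_minutes(TARGET_CT, concentration)
    print(f"{concentration} mg/L free chlorine -> {minutes:.0f} min contact time")
# Halving the chlorine concentration doubles the required contact time,
# and vice versa, as described above.
```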
The Risks of Waterborne Disease
It is easy to take for granted the safety of modern municipal drinking water, but prior to widespread filtration and chlorination, contaminated drinking water presented a significant public health risk. The microscopic waterborne agents of cholera, typhoid fever, dysentery and hepatitis A killed thousands of U.S. residents annually before disinfection methods were employed routinely, starting about a century ago. Although these pathogens are defeated regularly now by technologies such as chlorination, they should be thought of as ever-ready to “stage a come-back” given conditions of inadequate or no disinfection.
Cryptosporidium (image: © A.B. Dowsett/SPL/Photo Researchers, Inc.)
Illnesses Associated with Waterborne Pathogens
Worldwide, about 1.2 billion people lack access to safe drinking water, and twice that many lack adequate sanitation. As a result, the World Health Organization estimates that 3.4 million people, mostly children, die every year from water-related diseases (WHO, 2002a). In the U.S., outbreaks are commonly associated with contaminated groundwater which has not been properly disinfected. In addition, contamination of the distribution system can occur with water main breaks or other emergency situations (CDC, 2002).
Drinking water pathogens may be divided into three general categories: bacteria, viruses and parasitic protozoa. Bacteria and viruses contaminate both surface and groundwater, whereas parasitic protozoa appear predominantly in surface water. The purpose of disinfection is to kill or inactivate microorganisms so that they cannot reproduce and infect human hosts. Bacteria and viruses are well-controlled by normal chlorination, in contrast to parasitic protozoa, which demand more sophisticated control measures. For that reason, parasitic protozoan infections may be more common than bacterial or viral infections in areas where some degree of disinfection is achieved.
Bacteria
Bacteria are microorganisms often composed of single cells shaped like rods, spheres or spiral structures. Prior to widespread chlorination of drinking water, bacteria like Vibrio cholerae, Salmonella typhii and several species of Shigella routinely inflicted serious diseases such as cholera, typhoid fever and bacillary dysentery, respectively. As recently as 2000, a drinking water outbreak of E. coli in Walkerton, Ontario sickened 2,300 residents and killed seven when operators failed to properly disinfect the municipal water supply. While developed nations have largely conquered water-borne bacterial pathogens through the use of chlorine and other disinfectants, the developing world still grapples with these public health enemies.
Viruses
Viruses are infectious agents that can reproduce only within living host cells. Shaped like rods, spheres or filaments, viruses are so small that they pass through filters that retain bacteria. Enteric viruses, such as hepatitis A, Norwalk virus and rotavirus are excreted in the feces of infected individuals and may contaminate water intended for drinking. Enteric viruses infect the gastrointestinal or respiratory tracts, and are capable of causing a wide range of illness, including diarrhea, fever, hepatitis, paralysis, meningitis and heart disease (American Water Works Association, 1999).
Protozoan Parasites
Protozoan parasites are single-celled microorganisms that feed on bacteria found in multicellular organisms, such as animals and humans. Several species of protozoan parasites are transmitted through water in dormant, resistant forms, known as cysts and oocysts. According to the World Health Organization, Cryptosporidium parvum oocysts and Giardia lamblia cysts are introduced to waters all over the world by fecal pollution. The same durable form that permits them to persist in surface waters makes these microorganisms resistant to normal drinking water chlorination (WHO, 2002b). Water systems that filter raw water may successfully remove protozoan parasites.
Emerging Pathogens
An emerging pathogen is one that gains attention because it is one of the following:
- a newly recognized disease-causing organism
- a known organism that starts to cause disease
- an organism whose transmission has increased
(Source: Guerrant, 1997).
Cryptosporidium is an emerging parasitic protozoan pathogen because its transmission has increased dramatically over the past two decades. Evidence suggests it is newly spread in increasingly popular day-care centers and possibly in widely distributed water supplies, public pools and institutions such as hospitals and extended-care facilities for the elderly. Recognized in humans largely since 1982 and the start of the AIDS epidemic, Cryptosporidium is able to cause potentially life-threatening disease in the growing number of immunocompromised patients. Cryptosporidium was the cause of the largest reported drinking water outbreak in U.S. history, affecting over 400,000 people in Milwaukee in April, 1993. More than 100 deaths are attributed to this outbreak. Cryptosporidium remains a major threat to the U.S. water supply (Ibid.).
Giardia lamblia (image)
The EPA is developing new drinking water regulations to reduce Cryptosporidium and other resistant parasitic pathogens. Key provisions of the Long Term 2 Enhanced Surface Water Treatment Rule include source water monitoring for Cryptosporidium; inactivation by all unfiltered systems; and additional treatment for filtered systems based on source water Cryptosporidium concentrations. EPA will provide a range of treatment options to achieve the inactivation requirements. Systems with high concentrations of Cryptosporidium in their source water may adopt alternative disinfection methods (e.g., ozone, UV, or chlorine dioxide). However, most water systems are expected to meet EPA requirements while continuing to use chlorination. Regardless of the primary disinfection method used, water systems must continue to maintain residual levels of chlorine-based disinfectants in their distribution systems.
Giardia lamblia
Giardia lamblia, discovered approximately 20 years ago, is another emerging waterborne pathogen. This parasitic microorganism can be transmitted to humans through drinking water that might otherwise be considered pristine. In the past, remote water sources that were not affected by human activity were thought to be pure, warranting minimal treatment. However, it is known now that all warm-blooded animals may carry Giardia and that beaver are prime vectors for its transmission to water supplies.
There is a distinct pattern to the emergence of new pathogens. First, there is a general recognition of the effects of the pathogen in highly susceptible populations such as children, cancer patients and the immuno-compromised. Next, practitioners begin to recognize the disease and its causative agent in their own patients, with varied accuracy. At this point, some may doubt the proposed agent is the causative agent, or insist that the disease is restricted to certain types of patients. Finally, a single or series of large outbreaks result in improved attention to preventive efforts. From the 1960’s to the 1980’s this sequence of events culminated in the recognition of Giardia lamblia as a cause of gastroenteritis (Lindquist, 1999).
Waterborne Disease Trends
Detection and investigation of waterborne disease outbreaks is the primary responsibility of local, state and territorial public health departments, with voluntary reporting to the CDC. The CDC and the U.S. Environmental Protection Agency (EPA) collaborate to track waterborne disease outbreaks of both microbial and chemical origins. Data on drinking water and recreational water outbreaks and contamination events have been collected and summarized since 1971.
While useful, statistics derived from surveillance systems do not reflect the true incidence of waterborne disease outbreaks because many people who fall ill from such diseases do not consult medical professionals. For those who do seek medical attention, attending physicians and laboratory and hospital personnel are required to report diagnosed cases of waterborne illness to state health departments. Further reporting of these illness cases by state health departments to the CDC is voluntary, and statistically more likely to occur for large outbreaks than small ones.
Despite these limitations, surveillance data may be used to evaluate the relative degrees of risk associated with different types of source water and systems, problems in current technologies and operating conditions, and the adequacy of current regulations. (Craun, Nwachuku, Calderon, and Craun, 2002).
From 1991 to 2000, there were 155 outbreaks and 431,846 cases of illness in public and individual water systems in the U.S. Table 3-1 lists reported outbreaks, their causes, the numbers of cases of associated illness reported, and the types of water systems affected. By far, the largest outbreak of this period occurred in 1993 with the emerging pathogen Cryptosporidium in Milwaukee.
| Etiological Agent | Community Water Systems (2): Outbreaks | Community Water Systems (2): Cases | Noncommunity Water Systems (3): Outbreaks | Noncommunity Water Systems (3): Cases | Individual Water Systems (4): Outbreaks | Individual Water Systems (4): Cases | All Systems: Outbreaks | All Systems: Cases |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Giardia | 11 | 2,073 | 5 | 167 | 6 | 16 | 22 | 2,256 |
| Cryptosporidium* | 7 | 407,642 | 2 | 578 | 2 | 39 | 11 | 408,259 |
| Campylobacter jejuni | 1 | 172 | 3 | 66 | 1 | 102 | 5 | 340 |
| Salmonella, nontyphoid | 2 | 749 | 0 | 0 | 1 | 84 | 3 | 833 |
| E. coli | 3 | 208 | 3 | 39 | 3 | 12 | 9 | 259 |
| E. coli O157:H7/C. jejuni | 0 | 0 | 1 | 781 | 0 | 0 | 1 | 781 |
| Shigella | 1 | 83 | 5 | 484 | 2 | 38 | 8 | 605 |
| Plesiomonas shigelloides | 0 | 0 | 1 | 60 | 0 | 0 | 1 | 60 |
| Non-O1 V. cholerae | 1 | 11 | 0 | 0 | 0 | 0 | 1 | 11 |
| Hepatitis A virus | 0 | 0 | 1 | 46 | 1 | 10 | 2 | 56 |
| Norwalk-like viruses | 1 | 594 | 4 | 1,806 | 0 | 0 | 5 | 2,400 |
| Small, round-structured virus | 1 | 148 | 1 | 70 | 0 | 0 | 2 | 218 |
| Chemical | 18 | 522 | 0 | 0 | 7 | 9 | 25 | 531 |
| Undetermined | 11 | 10,162 | 38 | 4,837 | 11 | 238 | 60 | 15,237 |
| Total | 57 | 422,364 | 64 | 8,934 | 34 | 548 | 155 | 431,846 |
(1) Data in Table 3-1 are compiled from CDC Morbidity and Mortality Weekly Report Surveillance Summaries for 1991-1992, 1993-1994, 1995-1996, 1997-1998 and 1999-2000. Figures include adjustments to numbers of outbreaks and illness cases originally reported, based on more recent CDC data.
(2) Community water systems are those that serve communities of an average of at least 25 year-round residents and have at least 15 service connections.
(3) Noncommunity water systems are those that serve an average of at least 25 residents, have at least 15 service connections, and are used at least 60 days per year.
(4) Individual water systems are those serving fewer than 25 residents and having fewer than 15 service connections.
* There were 403,000 cases of illness reported in Milwaukee in 1993.
Figure 3-1
The pie chart in figure 3-1 illustrates the relative percentages of agents responsible for drinking water disease outbreaks in the 1991-2000 period. Protozoan parasites caused approximately 21% of reported drinking water outbreaks in this period, bacteria were responsible for about 18% and viruses caused approximately 6% of outbreaks. Chemical agents, such as copper, lead and nitrite, were responsible for about 16% of reported drinking water disease outbreaks.
Figure 3-2

From 1971 to 1998 statistics showed a gradual increase in the percentage of reported drinking water outbreaks for which causation is known. This trend was reversed in the 1999-2000 time period (see Figure 3-2). Untimely investigation, a lack of specimen collection, a lack of testing, or incomplete testing are all obstacles to a more complete understanding of the causes of waterborne outbreaks (Craun et al., 2002).
The number of reported drinking water outbreaks rose in 1999-2000, reversing a previously declining trend (see Figure 3-3). The number of reported illness cases due to these outbreaks, however, remained relatively static (see Figure 3-4).
Figure 3-3
Figure 3-4
Internet Reference on Drinking Water Pathogens
U.S. Geological Survey, Water Quality Information Pages: http://water.usgs.gov/owq/
Outbreak in Walkerton, Canada
Insufficient drinking water chlorination sowed the seeds of tragedy in the small southern Ontario town of Walkerton in the Spring of 2000. According to a report published by the Ontario Ministry of the Attorney General (2002), for years the town’s public utility commission operators failed to follow established Canadian Ministry of the Environment (MOE) guidelines on chlorine dosing, monitoring and recording chlorine residuals, and documenting periodic microbiological sampling locations. The report states that the operators knew their practices were “unacceptable and contrary to MOE guidelines and directives” (p.4). To make matters worse, the town’s public utility commissioners failed to properly respond to a 1998 MOE inspection report that set out significant concerns about water quality and several operating deficiencies in Walkerton (Ibid.).
Following a period of unusually heavy rainfall in early May of 2000, manure, applied as fertilizer to farm soil in the vicinity of one of the town’s municipal wells, leaked into that well. Bacteria in the manure contaminated the well water, as the chlorinator for that well was not operating due to inadequate maintenance. As the contaminated water from that well blended into the general water supply, the existing chlorine levels were overwhelmed by the sudden influx of organic matter and bacteria. Before long, schools emptied and emergency rooms filled with children and elderly patients suffering from diarrhea and gastrointestinal upset. By the time the cause of the symptoms was traced to contamination of the town’s municipal water supply, many of the town’s residents were very sick. DNA typing studies carried out later revealed that the E. coli O157:H7 and C. jejuni bacterial strains in the manure matched those that were prevalent in the human outbreak. The episode left seven people dead and 2,300 ill.
A thorough government investigation of the Walkerton outbreak culminated in an exhaustive report published by the Ontario Ministry of the Attorney General in 2002. The report concludes that the Walkerton disaster could have been prevented “by the use of continuous chlorine residual and turbidity monitors…” (p. 3). Without the margin of safety provided by a carefully maintained chlorine residual, harmful bacteria remained in the water that coursed through Walkerton taps. By failing to properly monitor chlorine residual levels, the water operators permitted the town water’s chlorine concentration to plummet, setting the stage for a serious outbreak of waterborne disease.
Walkerton Culprits
Samples taken from the Walkerton water system showed contamination with E. coli and C. jejuni bacteria.
Escherichia coli
Campylobacter jejuni
The Challenge of Disinfection Byproducts
Drinking water chlorination has contributed to a dramatic decline in waterborne disease rates and increased life expectancy in the United States. Largely because of this success, many Americans take it for granted that their tap water will be free of disease-causing organisms. In recent years, regulators and the general public have focused greater attention on potential health risks from chemical contaminants in drinking water. One such concern relates to disinfection byproducts (DBPs), chemical compounds formed unintentionally when chlorine and other disinfectants react with certain organic matter in water.
In the early 1970s, EPA scientists first determined that drinking water chlorination could form a group of byproducts known as trihalomethanes (THMs), including chloroform. Concerned that these chemicals may be carcinogenic to humans, EPA set the first regulatory limits for THMs in 1979. Since that time, a wealth of research has improved our understanding of how DBPs are formed, their potential health risks, and how they can be controlled. It is now recognized that all chemical disinfectants form some potentially harmful byproducts. The byproducts of chlorine disinfection are by far the most thoroughly studied.
While the available evidence does not prove that DBPs in drinking water cause adverse health effects in humans, high levels of these chemicals are certainly undesirable. Cost-effective methods to reduce DBP formation are available and should be adopted where possible. However, the International Programme on Chemical Safety (IPCS), a joint venture of the United Nations Environment Programme, the International Labor Organization, and the World Health Organization (IPCS 2000, p. 13) strongly cautions:
The health risks from these byproducts at the levels at which they occur in drinking water are extremely small in comparison with the risks associated with inadequate disinfection. Thus, it is important that disinfection not be compromised in attempting to control such byproducts.
Recent EPA regulations have further limited THMs and other DBPs in drinking water. Most water systems are meeting these new standards by controlling the amount of natural organic matter prior to disinfection, while ensuring that microbial protection remains the top priority.
DBP Science
DBPs and Human Cancer Risk
Toxicology studies have reported that high doses of some DBPs, including THMs and haloacetic acids (HAAs), can cause cancer in laboratory animals. Based largely on these animal data, EPA considers individual THMs and HAAs to be either possible or probable human carcinogens, although any risk from the low levels found in drinking water would be slight. After reviewing the full body of toxicology studies, the IPCS concluded, “None of the chlorination byproducts studied to date is a potent carcinogen at concentrations normally found in drinking water” (IPCS 2000, p. 376).
Some epidemiology studies have reported an association between human exposure to DBPs and elevated cancer risks, while other studies have found no association. EPA evaluated the existing cancer epidemiology studies and found that only for bladder cancer were associations with chlorinated water somewhat consistent. Even in these studies, cancer risks were not strongly correlated to measured THM levels, indicating that other factors cannot be ruled out (Craun et al., 2001). EPA has concluded, “The present epidemiologic data do not support a causal relationship between exposure to chlorinated drinking water and development of cancer at this time” (EPA 1998). The IPCS reached a similar conclusion in 2000, noting that a causal relationship between DBPs and increased cancer “remains an open question” (IPCS 2000).
Chloroform: No Cancer Risk at Low Exposures
Chloroform, typically the most prevalent THM measured in chlorinated water, is probably the most thoroughly studied disinfection byproduct. Toxicological studies have shown that high levels of chloroform can cause cancer in laboratory animals. Extensive research conducted since the early 1990s provides a clearer picture of what this means for humans exposed to far lower levels through drinking water.
One study (Larson et al. 1994a), conducted by the CIIT Centers for Health Research, observed that a very large dose of chloroform, when given to mice once per day into the stomach (a procedure known as gavage), produced liver damage and eventually cancer. In a second CIIT cancer study (Larson et al., 1994b), mice were given the same daily dose of chloroform through the animals’ drinking water. This time, no cancer was produced. Follow-up research showed that the daily gavage doses overwhelmed the capability of the liver to detoxify the chloroform, causing liver damage, cell death and regenerative cell growth, thereby increasing risks for cell mutation and cancer in exposed organs. When chloroform was given through drinking water, however, the liver could continually detoxify the chloroform as the mice sipped the water throughout the day. Without the initial liver toxicity, there was no cancer in the liver, kidney or other exposed organs (Butterworth et al., 1998).

In its most recent risk assessment, EPA considered the wealth of available information on chloroform, including the important work done at CIIT. EPA concludes that exposure to chloroform below the “threshold” level that causes cell damage is unlikely to increase the risk of cancer: “While chloroform is likely to be carcinogenic at a high enough dose, exposures below a certain dose range are unlikely to pose any cancer risk to humans” (US EPA, 2002a). For drinking water meeting EPA standards, chloroform is unlikely to be a health concern.
Developmental and Reproductive Effects
Several epidemiology studies have reported a possible association between disinfection byproducts and adverse reproductive outcomes, including spontaneous abortion (miscarriage). One study of women in several California communities (Waller et al. 1998) found a stronger association with bromodichloromethane (BDCM) than with other byproducts. Because the available studies have significant limitations, EPA and the American Water Works Association Research Foundation are sponsoring a new epidemiology study to replicate the 1998 Waller study. This study, conducted by researchers at the University of North Carolina, will be completed in 2005.
When the Waller study was published, the available toxicology data on reproductive and developmental effects of some DBPs were quite limited. It was recognized that BDCM, in particular, should be thoroughly studied for a potential causal relationship to reproductive and developmental toxicity. The Research Foundation for Health and Environmental Effects®, a tax-exempt foundation established by the Chlorine Chemistry Division of the American Chemistry Council, sponsored a set of animal studies (Christian et al. 2001, 2002), including two developmental toxicity studies on BDCM, a reproductive toxicity study on BDCM, and a reproductive toxicity study on dibromoacetic acid (DBA). The studies, published in the International Journal of Toxicology, found no adverse effects from BDCM and DBA at dose levels thousands of times higher than what humans are exposed to through drinking water. The studies were designed to comply with stringent EPA guidelines, and each study was independently monitored and peer reviewed.
Updating the Safe Drinking Water Act Regulations
EPA has regulated DBPs in drinking water since 1979. The first DBP standards limited THM levels to 100 parts per billion (ppb) for systems serving more than 10,000 people. In the 1996 Safe Drinking Water Act (SDWA) reauthorization, Congress called for EPA to revise its standards for disinfectants and DBPs in two stages. The revised regulations are designed to reduce potential DBP risks, while ensuring that drinking water is protected from microbial contamination.
Stage 1 DBP Rule
In December 1998 USEPA issued the Stage 1 Disinfectants and Disinfection Byproducts (Stage 1 DBP) rule. The regulations are based on an agreement between members of a Federal Advisory Committee that included representatives from water utilities, the Chlorine Chemistry Division of the American Chemistry Council, public health officials, environmentalists and other stakeholder groups. This diverse group of experts developed a consensus set of recommendations to cost-effectively reduce DBP levels, without compromising protection from microbial contaminants.
The Stage 1 DBP rule mandates a process called enhanced coagulation to remove natural organic matter, reducing the potential for DBPs to form. The rule also sets enforceable Maximum Contaminant Levels (MCLs) for total trihalomethanes at 80 ppb and the sum of five Haloacetic Acids (HAAs) at 60 ppb. These MCLs are based on system-wide running annual averages, meaning that concentrations may be higher at certain times and at certain points in the system, as long as the system-wide average for the year is below the MCL.
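For illustration, the running annual average basis can be expressed as a short calculation. In the sketch below the quarterly system-wide averages are hypothetical; only the 80 ppb and 60 ppb MCLs come from the rule as described above.

```python
# Sketch of the running-annual-average compliance basis described above.
# Quarterly system-wide averages are hypothetical; the 80 ppb MCL for total
# trihalomethanes (and 60 ppb for the five haloacetic acids) comes from the
# Stage 1 DBP rule as described in the text.

TTHM_MCL_PPB = 80

def running_annual_average(quarterly_averages_ppb):
    """Average of the four most recent quarterly system-wide averages."""
    last_four = quarterly_averages_ppb[-4:]
    return sum(last_four) / len(last_four)

tthm_quarters = [95, 70, 60, 85]  # hypothetical quarterly averages (ppb)
raa = running_annual_average(tthm_quarters)
status = "compliant" if raa <= TTHM_MCL_PPB else "exceeds MCL"
print(f"TTHM running annual average: {raa:.1f} ppb ({status})")
# Individual quarters may exceed 80 ppb provided the running annual average
# remains at or below the MCL, as noted above.
```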
In developing the Stage 1 DBP rule, EPA was very cautious about encouraging the use of alternative disinfectants. The Agency recognized that alternative disinfectants might reduce THMs and HAAs, but produce other, less understood, byproducts. The Agency also avoided making recommendations that would encourage utilities to reduce the level of disinfection currently being practiced.
Large water systems (those serving more than 10,000 persons) were required to comply with the Stage 1 DBP rule by December 2001. Systems serving fewer than 10,000 persons must comply by December 2003.
Stage 2 DBP Rule
As the Stage 1 rule is coming into full force, EPA is completing work on its Stage 2 DBP rule. The Stage 2 rule is being developed simultaneously with the Long Term 2 Enhanced Surface Water Treatment Rule (LT2) in order to address the risk trade-offs between pathogen control and exposure to DBPs. The LT2 rule deals primarily with controlling Cryptosporidium and other resistant pathogens discussed in Chapter 3. Again, the EPA sought recommendations from an advisory group, the Stage 2 Microbial and Disinfection Byproducts Federal Advisory Committee.
As outlined in the advisory committee’s September 2000 Agreement in Principle, the MCLs for THMs and five HAAs will remain 80 ppb and 60 ppb respectively, based on each utility’s system-wide running annual averages. However, the Stage 2 rule will also limit DBP levels at specific locations within distribution systems. When fully implemented, these locational running annual average limits will mean that no part of the distribution system will be allowed to exceed the MCLs for these substances.
EPA expects to finalize the Stage 2 rule in 2004, with compliance phased-in over the next eight years.
Peru Cholera Epidemic
A stark example of the continuing public health threat from waterborne disease outbreaks occurred in Peru in 1991, where a major causative factor was inadequate drinking water disinfection. The result: a five-year epidemic of cholera, the disease’s first appearance in the Americas in the 20th century. The epidemic spread to 19 Latin American countries, causing more than one million illnesses and 12,000 deaths. After the outbreak, U.S. and international health officials criticized Peruvian water officials for not chlorinating the entire water supply.
An official with the Pan American Health Organization (PAHO) blames the inadequate chlorination, at least in part, on concern over disinfection byproducts. In a 1997 article in the Journal of the American Water Works Association, Horst Otterstetter states, “Rather than being abated by increased use of chlorination, the waterborne transmission of cholera was actually aided because of worries about chlorination byproducts” (Otterstetter and Craun, 1997). Water officials in Peru and other Latin American countries clearly misinterpreted the risks posed by disinfection byproducts.

In May 1991, in the midst of the outbreak, PAHO Director Carlyle Guerra de Macedo wrote to EPA Administrator William Reilly stating:

Widespread publicity and the large number of scientific articles regarding the potential health significance of THMs in drinking water has caused many municipalities and communities of Latin America to abandon chlorination. This situation presents a serious problem at a time when the acute health risk due to enteric disease agents is four or five orders of magnitude greater than the chronic exposure risk from THMs.

To avoid further misunderstanding, Macedo asked EPA for a letter clarifying that chlorination to control waterborne diseases should be afforded top priority. EPA’s response stated:

Weighing the known benefits of disinfection as evidenced by decreased waterborne disease outbreaks, with a theoretical excess cancer risk, EPA strongly endorses disinfection of drinking water to control microorganisms.

The epidemic in Peru underscores the critical, global need for adequate drinking water disinfection. Disinfection byproducts should be reduced where feasible, as they are in the U.S., but never at the cost of compromised microbial protection.
Balancing DBP and Microbial Risks
Continuing evidence of waterborne disease occurrence suggests that microbial risks should receive a much higher level of attention than disinfection byproducts. For this reason, The American Academy of Microbiology (Ford and Colwell, 1996) has recommended, “the health risks posed by microbial pathogens should be placed as the highest priority in water treatment to protect public health.”
A report published by the International Society of Regulatory Toxicology and Pharmacology (Coulston and Kolbye, 1994) stated, “The reduction in mortality due to waterborne infectious diseases, attributed largely to chlorination of potable water supplies, appears to outweigh any theoretical cancer risks (which may be as low as zero) posed by the minute quantities of chlorinated organic chemicals reported in drinking waters disinfected with chlorine.”
The IPCS (IPCS 2000, p. 375) reached similar conclusions:
Disinfection is unquestionably the most important step in the treatment of water for drinking water supplies. The microbial quality of drinking water should not be compromised because of concern over the potential long-term effects of disinfectants and DBPs. The risk of illness and death resulting from exposure to pathogens in drinking water is very much greater than the risks from disinfectants and DBPs.
Controlling Disinfection Byproducts
Treatment techniques are available that provide water suppliers the opportunity to maximize potable water safety and quality while minimizing DBP risks. Generally, the best approach to reduce DBP formation is to remove natural organic matter precursors prior to disinfection. EPA has published a guidance document for water system operators entitled Controlling Disinfection Byproducts and Microbial Contaminants in Drinking Water (EPA, 2001).
The EPA guidance discusses three processes to effectively remove natural organic matter prior to disinfection:
1. Coagulation and Clarification
Most treatment plants optimize their coagulation process for turbidity (particle) removal. However, coagulation processes can also be optimized for natural organic matter removal with higher doses of inorganic coagulants (such as alum or iron salts), and optimization of pH.
2. Adsorption
Activated carbon can be used to adsorb soluble organics that react with disinfectants to form byproducts.
3. Membrane Technology
Membranes, used historically to desalinate brackish waters, have also demonstrated excellent removal of natural organic matter. Membrane processes use hydraulic pressure to force water through a semi-permeable membrane that rejects most contaminants. Variations of this technology include reverse osmosis (RO), nanofiltration (low pressure RO), and microfiltration (comparable to conventional sand filtration).
Other conventional methods of reducing DBP formation include changing the point of chlorination and using chloramines for residual disinfection. EPA predicts that most water systems will be able to achieve compliance with new DBP regulations through the use of one or more of these relatively low cost methods (EPA, 1998).
Water system managers may also consider switching from chlorine to alternative disinfectants to reduce formation of THMs and HAAs. However, all chemical disinfectants form some DBPs. Much less is known about the byproducts of these alternatives than is known about chlorination byproducts. Furthermore, each disinfection method has other distinct advantages and disadvantages. Chapter 6 discusses some of the key issues for water system managers to consider when choosing between methods.
Drinking Water and Security: Threats to Public Water Systems
Water treatment and distribution systems provide one of the most basic elements of life, a reliable supply of safe drinking water. Protecting these critical systems from intentional wrongdoing has always been a concern. For many systems, security measures were primarily designed to protect facilities and equipment from pranks and vandalism. Recently, though, the prospect of a terrorist attack on a water system has forced all water systems, large and small, to re-evaluate and upgrade existing security measures.
Even before the September 11th terrorist attacks on the World Trade Center and the Pentagon, officials recognized water systems as potential terrorist targets. For example, on January 24, 2001, the Federal Bureau of Investigation warned U.S. water utilities that the Bureau had received “a signed threat from a very credible, well-funded, North Africa-based terrorist group indicating that they intend to disrupt water operations in 28 U.S. cities.”
Since September 11th, 2001, water system managers have taken unprecedented steps to improve security at their facilities. With support from federal, state and local governments, water utilities are working to secure their reservoirs, treatment plants, and distribution systems from a terrorist attack and to minimize the potential impact if an attack were to occur.
Water Systems Move to Improve Security
Drinking water systems have numerous resources available to assist them in addressing security issues. The EPA, through its Water Protection Task Force and Regional Offices, is working to:
- Provide direct grant assistance to support counter-terrorism activities;
- Support development of tools, training and technical assistance; and
- Promote information sharing and research to improve treatment and detection methods.
In addition, water industry associations, including the American Water Works Association and the Association of Metropolitan Water Agencies, serve as clearinghouses for sharing critical information with the thousands of water systems in the U.S.
With passage of the Public Health Security and Bioterrorism Response Act of 2002, Congress required each community water system serving more than 3,300 persons to assess its vulnerability to a terrorist attack and other intentional acts. Vulnerability assessments provide a comprehensive analysis of potential threats to a drinking water system, including: chemical or biological contamination of the water supply; disruption of water treatment or distribution; and intentional release of treatment chemicals to harm employees and the public. Vulnerability assessments also provide prioritized plans for security upgrades, operational modifications, and/or policy changes to mitigate risks identified in the assessment.
Strengthening the security of both treatment plants and distribution systems is a top priority. For example, based on needs identified by its vulnerability assessment, the Metropolitan Water District of Southern California, which provides drinking water to nearly 17 million people, authorized $5.5 million for new security measures. Among the improvements, these funds will be used to enhance water-quality monitoring and to strengthen physical security for the District’s chemical storage and treatment processes.
Disinfection and Bioterrorism
Disinfection is crucial to water system security, providing the “front line” of defense against biological contamination. Normal filtration and disinfection processes would dampen or remove the threats posed by a number of potential bioterrorism agents. In addition, water systems should maintain an ability to increase disinfection doses in response to a particular threat.
However, conventional treatment barriers in no way guarantee safety from biological attacks. For many potential bioterrorism agents, there is little scientific information about what levels of reduction can be achieved with chlorine or other disinfectants. In addition, contamination of water after it is treated could overwhelm the residual disinfectant levels in distribution systems. Furthermore, typical water quality monitoring does not provide real-time data to warn of potential problems (Rose 2002).
Additional research and funding are needed to improve prevention, detection, and responses to potential threats.
Protecting Chlorine and Other Treatment Chemicals
As part of its vulnerability assessment, each water system must consider its transportation, storage and use of treatment chemicals. These chemicals are both critical assets (necessary for delivering safe water) and potential vulnerabilities (may pose significant hazards, if released). For example, a release of chlorine gas would pose an immediate threat to system operators, and a large release may pose a danger to the surrounding community. As part of its vulnerability assessment, a water system using chlorine must determine if existing layers of protection are adequate. If not, a system should consider additional measures to reduce the likelihood of an attack or to mitigate the potential consequences.
Possible measures to address chlorine security include: enhanced physical barriers (e.g., constructing secure chemical storage facilities), policy changes (e.g., tightening procedures for receiving chemical shipments), reducing quantities stored on site, or adopting alternative disinfection methods. These options must be weighed and prioritized, considering the unique characteristics and resources of each system.
Water system officials must evaluate the risk trade-offs associated with each option. For example, reducing the chemical quantities kept on-site may reduce a system’s ability to cope with an interruption of chemical supplies. Furthermore, changing disinfection technologies will not necessarily improve overall safety and security. As discussed in Chapter 6, each disinfectant has unique strengths and limitations that must be considered.
Comparing Alternative Disinfection Methods
Until the late 1970s, chlorine was virtually the only disinfectant used to treat drinking water. It was considered an almost ideal disinfectant based on its proven characteristics:
- Effective against most known pathogens
- Provides a residual to prevent microbial re-growth and protect treated water throughout the distribution system
- Suitable for a broad range of water quality conditions
- Easily monitored and controlled
- Reasonable cost
More recently, drinking water providers have faced an array of new challenges, including:
- Treating resistant pathogens such as Giardia and Cryptosporidium
- Minimizing disinfection byproducts
- New environmental and safety regulations
- Strengthening security at treatment facilities
To meet these new challenges, water system managers must design unique disinfection approaches to match each system’s characteristics and source water quality. While chlorination remains the most commonly used disinfection method by far, water systems may use alternative disinfectants, including chloramines, chlorine dioxide, ozone, and ultraviolet radiation. No single disinfection method is right for all circumstances, and in fact, water systems may use a variety of methods to meet overall disinfection goals at the treatment plant, and to provide residual protection throughout the distribution system.
The sections below describe various disinfection technologies, and discuss the major advantages and limitations associated with each.
CHLORINATION
Chlorine is applied to water in one of three forms: elemental chlorine (chlorine gas), hypochlorite solution (bleach), or dry calcium hypochlorite. All three forms produce free chlorine in water.
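For reference, the underlying chemistry can be summarized with the standard textbook reactions below; these equations are general water-treatment chemistry, not figures taken from this document.

```latex
% Formation of free chlorine (standard hydrolysis/dissociation reactions)
\begin{align*}
\mathrm{Cl_2 + H_2O} &\rightleftharpoons \mathrm{HOCl + H^+ + Cl^-} &&\text{(elemental chlorine)}\\
\mathrm{NaOCl + H_2O} &\rightarrow \mathrm{HOCl + Na^+ + OH^-} &&\text{(sodium hypochlorite)}\\
\mathrm{Ca(OCl)_2 + 2\,H_2O} &\rightarrow 2\,\mathrm{HOCl} + \mathrm{Ca^{2+}} + 2\,\mathrm{OH^-} &&\text{(calcium hypochlorite)}\\
\mathrm{HOCl} &\rightleftharpoons \mathrm{H^+ + OCl^-} &&\text{(pH-dependent equilibrium)}
\end{align*}
```

Free chlorine, as defined in the Glossary, is the sum of the hypochlorous acid (HOCl) and hypochlorite ion (OCl-) produced by these reactions.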
Advantages
- Highly effective against most pathogens
- Provides a residual to protect against recontamination and to reduce bio-film growth in the distribution system
- Easily applied, controlled, and monitored
- Strong oxidant meeting most preoxidation objectives
- Operationally the most reliable
- The most cost-effective disinfectant
Limitations
- Byproduct formation (THMs, HAAs)
- Will oxidize bromide to bromine, forming brominated organic byproducts
- Not effective against Cryptosporidium
- Requires transport and storage of chemicals
Elemental Chlorine
Elemental chlorine is the most commonly used form of chlorine. It is transported and stored as a liquefied gas under pressure. Water treatment facilities typically use chlorine in 100- and 150-lb cylinders or one-ton containers. Some large systems use railroad tank cars or tanker trucks.
Advantages
- Lowest cost of chlorine forms
- Unlimited shelf-life
Limitations
- Hazardous gas requires special handling and operator training
- Additional regulatory requirements, including EPA’s Risk Management Program and the Occupational Safety and Health Administration’s Process Safety Management program
Sodium Hypochlorite
Sodium hypochlorite, or bleach, is produced by adding elemental chlorine to sodium hydroxide. Typically, hypochlorite solutions contain from 5 to 15% chlorine and are shipped by truck in one- to 5,000-gallon containers.
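As an illustration of how solution strength translates into feed volumes, the sketch below estimates the daily volume of hypochlorite solution required for a given chlorine dose. The flow, dose, and 12.5% trade-strength figures are illustrative assumptions, not values from this document.

```python
# Illustrative sketch (assumed values): estimating the daily feed volume of
# sodium hypochlorite solution for a target free chlorine dose.
# Flow, dose, and the 12.5% "trade percent" strength are hypothetical
# example figures, not values taken from this document.

LB_PER_MGD_MG_L = 8.34  # lb of chemical per (million gallons x mg/L)

def hypochlorite_feed_gpd(flow_mgd: float, dose_mg_l: float,
                          trade_percent: float = 12.5) -> float:
    """Gallons per day of hypochlorite solution needed for a chlorine dose.

    Trade percent is grams of available chlorine per 100 mL of solution,
    so 12.5% corresponds to about 1.04 lb of available chlorine per gallon.
    """
    lb_chlorine_needed = flow_mgd * dose_mg_l * LB_PER_MGD_MG_L
    grams_per_liter = trade_percent * 10.0            # g available Cl2 per L
    lb_per_gallon = grams_per_liter * 3.785 / 453.6   # convert to lb/gal
    return lb_chlorine_needed / lb_per_gallon

# Example: a 5 MGD plant dosing 2 mg/L needs roughly 80 gallons per day.
print(round(hypochlorite_feed_gpd(5.0, 2.0), 1))
```

For a hypothetical 5 million-gallon-per-day plant dosing 2 mg/L, this works out to roughly 80 gallons of 12.5% solution per day.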
Advantages
- Solution is less hazardous and easier to handle than elemental chlorine
- Fewer training requirements and regulations than elemental chlorine
Limitations
- Limited shelf-life
- Potential to add inorganic byproducts (chlorate, chlorite and bromate) to water
- Corrosive to some materials and more difficult to store than most solution chemicals
- Higher chemical costs than elemental chlorine
Calcium Hypochlorite
Calcium hypochlorite is another chlorinating chemical used primarily in smaller applications. It is a white, dry solid containing approximately 65% chlorine, and is commercially available in granular and tablet forms.
Advantages
- More stable than sodium hypochlorite, allowing longer storage
- Fewer training requirements and regulations than elemental chlorine
Limitations
- Dry chemical requires more handling than sodium hypochlorite
- Precipitated solids formed in solution complicate chemical feeding
- Higher chemical costs than elemental chlorine
- Fire or explosive hazard if handled improperly
- Potential to add inorganic byproducts (chlorate, chlorite and bromate) to water
Onsite Hypochlorite Generation
In recent years some municipalities have installed on-site hypochlorite generators that produce weak hypochlorite solutions (~0.8%) using an electrolytic cell and a solution of salt water.
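The overall cell reaction is the standard electrolysis of a brine solution; the equation below is general chemistry, not taken from this document.

```latex
% On-site hypochlorite generation (overall electrolytic reaction)
\mathrm{NaCl + H_2O \;\xrightarrow{\text{electrical energy}}\; NaOCl + H_2\uparrow}
```

The hydrogen gas byproduct is typically vented safely to avoid an explosion hazard.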
Advantages
- Minimal chemical storage and transport
Limitations
- More complex and requires a higher level of maintenance and technical expertise
- High capital cost
- Operating costs are often higher than for commercial hypochlorite
- Requires careful control of salt quality
- Weak solution requires high volume chemical feed and control
- Byproducts in generated hypochlorite may be difficult to monitor and control
- System backup may be more difficult and costly
CHLORINE-BASED ALTERNATIVE DISINFECTANTS
Chloramines
Chloramines are chemical compounds formed by combining a specific ratio of chlorine and ammonia in water. Because chloramines are relatively weak as a disinfectant, they are almost never used as a primary disinfectant. Chloramines provide a durable residual, and are often used as a secondary disinfectant for long distribution lines and where free chlorine demand is high. Chloramines may also be used instead of chlorine in order to reduce chlorinated byproduct formation and to remove some taste and odor problems.
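For context, the basic formation reaction for monochloramine, the most stable chloramine species, is shown below. The reaction and the ratio cited are standard water-treatment chemistry, not figures from this document.

```latex
% Monochloramine formation
\mathrm{HOCl + NH_3 \rightarrow NH_2Cl + H_2O}
```

Utilities commonly maintain a chlorine-to-ammonia-nitrogen weight ratio on the order of 4:1 to 5:1 to favor monochloramine over the less desirable di- and trichloramine species.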
Advantages
- Reduced formation of THMs, HAAs
- Will not oxidize bromide to bromine, forming brominated byproducts
- More stable residual than free chlorine
- Excellent secondary disinfectant, has been found to be better than free chlorine at controlling coliform bacteria and biofilm growth
- Lower taste and odor than free chlorine
Limitations
- Weak disinfectant and oxidant
- Requires shipment and handling of ammonia or ammonia compounds as well as chlorinating chemicals
- Ammonia is toxic to fish, and may pose problems for aquarium owners
- Must be removed from water used for kidney dialysis
Chlorine Dioxide
Chlorine dioxide (ClO2) is generated on-site at water treatment facilities. In most generators, sodium chlorite and elemental chlorine are mixed in solution, which almost instantaneously forms chlorine dioxide. The characteristics of chlorine dioxide are quite different from those of chlorine. In solution it is a dissolved gas, which makes it largely unaffected by pH but volatile and relatively easily stripped from solution. Chlorine dioxide is also a strong disinfectant and a selective oxidant. While chlorine dioxide does produce a residual, it is only rarely used for this purpose.
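The generation chemistry referred to above is the standard reaction of chlorine with sodium chlorite; the equation below is general chemistry, not taken from this document.

```latex
% Chlorine dioxide generation from sodium chlorite and chlorine
\mathrm{2\,NaClO_2 + Cl_2 \rightarrow 2\,ClO_2 + 2\,NaCl}
```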
Advantages
- Effective against Cryptosporidium
- Up to five times faster than chlorine at inactivating Giardia
- Disinfection is only moderately affected by pH
- Will not form chlorinated byproducts (THMs, HAAs)
- Does not oxidize bromide to bromine (can form bromate in sunlight)
- More effective than chlorine in treating some taste and odor problems
- Selective oxidant used for manganese oxidation and targeting some chlorine resistant organics
Limitations
- Inorganic byproduct formation (chlorite, chlorate)
- Highly volatile residuals
- Requires on-site generation equipment and handling of chemicals (chlorine and sodium chlorite)
- Requires a high level of technical competence to operate and monitor the equipment, product, and residuals
- Occasionally poses unique odor and taste problems
- High operating cost (chlorite chemical cost is high)
NON-CHLORINE ALTERNATIVE DISINFECTANTS
Ozone
Ozone (O3) is generated on-site at water treatment facilities by passing dry oxygen or air through a system of high-voltage electrodes. Ozone is one of the strongest oxidants and disinfectants available. Its high reactivity and low solubility, however, make it difficult to apply and control. Contact chambers are fully contained, and non-absorbed ozone must be destroyed prior to release to avoid corrosive and toxic conditions. Ozone is applied more often for oxidation than for disinfection purposes.
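For reference, ozone generation in an electric (corona) discharge follows the standard overall reaction below; this is general chemistry, not taken from this document.

```latex
% Ozone generation in an electric discharge
\mathrm{3\,O_2 \;\xrightarrow{\text{electric discharge}}\; 2\,O_3}
```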
Advantages
- Strongest oxidant/disinfectant available
- Produces no chlorinated THMs, HAAs
- Effective against Cryptosporidium at higher concentrations
- Used with Advanced Oxidation processes to oxidize refractory organic compounds
Limitations
- Process operation and maintenance requires a high level of technical competence
- Provides no protective residual
- Forms brominated byproducts (bromate, brominated organics)
- Forms nonhalogenated byproducts (ketones, organic acids, aldehydes)
- Breaks down more complex organic matter; smaller compounds can enhance microbial re-growth in distribution systems and increase DBP formation during secondary disinfection processes.
- Higher operating and capital costs than chlorination
- Difficult to control and monitor particularly under variable load conditions
Ultraviolet Radiation
Ultraviolet (UV) radiation, generated by mercury arc lamps, is a non-chemical disinfectant. When UV radiation penetrates the cell wall of an organism, it damages genetic material, and prevents the cell from reproducing. Although it has a limited track record in drinking water applications, UV has been shown to effectively inactivate many pathogens while forming limited disinfection byproducts.
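UV performance is typically expressed as a delivered dose, or fluence, the product of UV intensity and exposure time; the relation below is the standard definition, not a figure from this document.

```latex
% UV dose (fluence)
D = I \times t, \qquad \left[\mathrm{mJ/cm^2}\right] = \left[\mathrm{mW/cm^2}\right] \times \left[\mathrm{s}\right]
```

Required doses vary with the target organism and the level of inactivation to be achieved.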
Advantages
- Effective at inactivating most viruses, spores and cysts
- No chemical generation, storage, or handling
- Effective against Cryptosporidium
- No known byproducts at levels of concern
Limitations
- No residual protection
- Low inactivation of some viruses (reoviruses and rotaviruses)
- Difficult to monitor efficacy
- Irradiated organisms can sometimes repair and reverse the destructive effects of UV through a process known as photo-reactivation
- May require additional treatment steps to maintain high-clarity water
- Does not provide oxidation, or taste and odor control
- High cost of adding backup/emergency capacity
- Mercury lamps may pose a potable water and environmental toxicity risk
The Future of Chlorine Disinfection
The previous chapters discuss a number of challenges facing drinking water providers. In response to new regulations, emerging science on microbial contaminants, and safety and security concerns related to treatment chemicals, water system managers will continue to evaluate chlorine and other disinfection methods. Despite these challenges, a number of factors indicate that drinking water chlorination will remain a cornerstone of waterborne disease prevention.
- Disinfection is unquestionably the most important step in drinking water treatment, and chlorine’s wide range of benefits cannot be provided by any other single disinfectant.
- It is uncertain that alternative disinfectants reduce potential DBP risks significantly (IPCS 2000). All chemical disinfectants produce byproducts. Generally, the best approach to control disinfection byproducts is to remove natural organic precursors prior to disinfection (EPA 2001).
- To comply with the forthcoming Long Term 2 Enhanced Surface Water Treatment Rule, some systems with high levels of Cryptosporidium in their source water may choose to adopt alternative disinfection methods (e.g., chlorine dioxide, ozone, or UV). However, most water systems are expected to meet disinfection requirements without changing treatment technologies.
- The U.S. EPA’s forthcoming Groundwater Rule, as well as efforts to strengthen Canadian drinking water standards following the E. coli outbreak in Walkerton, Ontario, will likely increase the use of chlorination for groundwater systems.
- Only chlorine-based disinfectants provide residual protection, an important part of the multi-barrier approach to preventing waterborne disease.
- World leaders increasingly recognize safe drinking water as a critical building block of sustainable development (see Sidebar). Chlorination can provide cost-effective disinfection for remote rural villages and large cities alike, helping to bring safe water to those in need.
Safe Water: A Building Block for Sustainable Development
“An adequate supply of clean water, sanitation and hygiene are the most important preconditions for sustaining human life, for maintaining ecological systems that support all life and for achieving sustainable development.” (African Ministerial Declaration at the International Conference on Freshwater, December 2001.)
Safe water is essential for life itself. Sadly, 1.2 billion people around the world lack access to safe drinking water, and twice that many lack adequate sanitation. As a result, the World Health Organization estimates that 3.4 million people, mostly children, die every year from water-related diseases (WHO 2002). Diarrheal disease, a result of inadequate water and sanitation services, has killed more children in the past 10 years than all the people lost to armed conflict since World War II (United Nations 2002). Many of these diseases can be prevented with appropriate water treatment and proper sanitation and hygiene practices.
Increasing access to safe water can improve more than public health. In Africa, women and girls spend as much as 3 hours a day fetching water, an expenditure of calories greater than one-third of their daily food intake (United Nations 2002). The task of keeping the home supplied with drinking water is often so laborious and time consuming that it can constitute the most significant single obstacle standing in the way of a child’s education. In addition, a reliable supply of water is necessary for almost all economic development.
The United Nations has recognized the critical link between safe water and sustainable development. At the 2002 World Summit on Sustainable Development (WSSD) in Johannesburg, South Africa, the UN reaffirmed its goal to reduce by one-half the proportion of people without access to safe water by 2015. The WSSD also adopted a comparable goal for improving access to basic sanitation. Meeting these goals will require sustained, coordinated action and billions of dollars in investment each year.
Glossary
Adsorption: Attachment of a substance to the surface of a solid.
Aquifer: A natural underground layer, often of sand or gravel, which contains water.
Bacteria: Microorganisms often composed of single cells shaped like rods, spheres or spiral structures.
Bioterrorism: Terrorism using biological agents.
Chlorination: The process of adding a form of chlorine to water or wastewater.
Clarification: Removal of bulk water from a dilute suspension of solids by gravity sedimentation, aided by chemical flocculating agents.
Coagulation: Irreversible combination or aggregation of particles to form a larger mass.
Contact Time: The period during which water is in contact with a disinfectant during treatment.
Disinfection: Destruction of harmful microorganisms, usually by the use of bactericidal chemical compounds.
Disinfection Byproducts: Compounds created by the reaction of a disinfectant with organic compounds in water.
Distribution System: A network of pipes leading from a treatment plant to customers’ plumbing systems.
Emerging Pathogen: A pathogen that gains attention because it is either a newly recognized disease-causing organism, a known organism that starts to cause disease, or an organism whose transmission has increased.
Epidemiology: The study of the occurrence and causes of health effects in human populations. An epidemiological study often compares two groups of people who are alike except for one factor, such as exposure to a chemical or the presence of a health effect. The investigators try to determine if any factor is associated with the health effect.
Filtration: The operation of separating suspended solids from a liquid (or gas) by forcing the mixture through a porous barrier.
Free Chlorine: The sum of hypochlorous acid and hypochlorite ions expressed in terms of mg/L or ppm.
Groundwater: The water that systems pump and treat from aquifers (natural reservoirs below the earth’s surface).
Haloacetic Acids: A group of disinfection byproducts that includes dichloroacetic acid, trichloroacetic acid, monochloroacetic acid, bromoacetic acid, and dibromoacetic acid.
Maximum Contaminant Level (MCL): The highest level of a contaminant that EPA allows in drinking water. MCLs are set as close to Maximum Contaminant Level Goals (MCLGs) as feasible using the best available treatment technology and taking cost into consideration. MCLs are enforceable standards.
Maximum Contaminant Level Goal (MCLG): The level of a contaminant, determined by EPA, at which there would be no risk to human health. This goal is not always economically or technologically feasible, and the goal is not legally enforceable.
Microbial Contamination: Contamination of water supplies with microorganisms such as bacteria, viruses and parasitic protozoa.
Microorganisms: Tiny living organisms that can be seen only with the aid of a microscope. Some microorganisms can cause acute health problems when consumed in drinking water. Also known as microbes.
Organic Matter: Matter derived from organisms, such as plants and animals.
Oxidation: Any reaction in which electrons are transferred.
Parasitic Protozoa: Single-celled microorganisms that feed on bacteria and are found in multicellular organisms, such as animals and people.
Pathogen: A disease-causing organism.
pH: A measure of the acidity or alkalinity of an aqueous solution.
Raw Water: Water in its natural state, prior to any treatment for drinking.
Residual: The measurement of chlorine in water after treatment.
Risk Assessment: The process of evaluating the likelihood of an adverse health effect, with some statistical confidence, for various levels of exposure.
Surface Water: The water that systems pump and treat from sources open to the atmosphere, such as rivers, lakes, and reservoirs.
Toxicology: The branch of medical science devoted to the study of poisons, including their modes of action, effects, detection, and countermeasures.
Trihalomethanes: A group of disinfection byproducts that includes chloroform, bromodichloromethane, bromoform, and dibromochloromethane.
Turbidity: The cloudy appearance of water caused by the presence of tiny particles. High levels of turbidity may interfere with proper water treatment and monitoring.
Ultraviolet Radiation: Radiation in the region of the electromagnetic spectrum including wavelengths from 100 to 3900 angstroms.
Viruses: Microscopic infectious agents, shaped like rods, spheres or filaments that can reproduce only within living host cells.
Waterborne Disease: Disease caused by contaminants, such as microscopic pathogens like bacteria, viruses and parasitic protozoa, in water.
Watershed: The land area from which water drains into a stream, river, or reservoir.
References
American Water Works Association, Water Quality Division Disinfection Systems Survey Committee Report (May, 2000). Journal of the American Water Works Association, 9, 24-43.
American Water Works Association (1999). Manual of Water Supply Practices: Waterborne Pathogens (1st ed.). Denver: American Water Works Association.
Anonymous (1997, Fall). The millennium: The 100 events headline: No. 46; Water purification. Life Magazine Special Double Issue.
Butterworth, B.E., Kedderis, G.L., and Conolly, R.B. (1998) The chloroform risk assessment: A mirror of scientific understanding. CIIT Activities,18 no.4.
Christian, M.S., York, R.G., Hoberman, A.M., Diener, R.M., Fisher, L.C., and Gates, G.A. (2001a). Biodisposition of dibromoacetic acid (DBA) and bromodichloromethane (BDCM) administered to rats and rabbits in drinking water during range-finding reproduction and developmental toxicity studies. International Journal of Toxicology, 20, 239-253.
Christian, M.S., York, R.G., Hoberman, A.M., Diener, R.M., and Fisher, L.C. (2001b). Oral (drinking water) developmental toxicity studies of bromodichloromethane (BDCM) in rats and rabbits. International Journal of Toxicology, 20, 225-237.
Christian, M.S., York, R.G., Hoberman, A.M., Fisher, L.C., and Brown, W.R. (2002a). Oral (drinking water) two-generation reproductive toxicity study of bromodichloromethane (BDCM) in rats. International Journal of Toxicology, 21, 115-146.
Christian, M.S., York, R.G., Hoberman, A.M., Frazee, J., Fisher, L.C., Brown, W.R., and Creasy, D.M. (2002b). Oral (drinking water) two-generation reproductive toxicity study of dibromoacetic acid (DBA) in rats. International Journal of Toxicology, 21, 1-40.
Connell, G.F. (1996). The chlorination/chloramination handbook. Denver: American Water Works Association.
Coulston, F., and Kolbye, A. (Eds.) (1994). Regulatory Toxicology and Pharmacology, vol. 20, no. 1, part 2.
Craun, G.F., Nwachuku, N., Calderon, R.L., and Craun, M.F. (2002). Outbreaks in drinking-water systems, 1991-1998. Journal of Environmental Health, 65, 16-25.
Craun, G.F., Hauchman, F.S. and Robinson D.E. (Eds.) (2001). Microbial pathogens and disinfection byproducts in drinking water: Health effects and management of risks, Conference Conclusions, (pp.533-545). Washington, D.C.: ILSI Press.
Ford, T.E. and Colwell R.R. (1996). A global decline in microbiological safety of water: A call for action, a report prepared for the American Academy of Microbiology.
Guerra de Macedo, G. (1991). Pan American Health Organization. Ref. No. HPE/PER/CWS/010/28/1.1.
Guerrant, R.L. (1997). Cryptosporidiosis: An emerging, highly infectious threat. Emerging Infectious Diseases, 3, Synopses. [On-Line.] Available: http://www.cdc.gov/ncidod/ied/vol3no1/guerrant.htm (accessed 12-5-02).
International Programme on Chemical Safety (2000). Disinfectants and disinfectant byproducts, Environmental Health Criteria 216.
Kirmeyer, G.J. (1994). An assessment of the condition of North American water distribution systems and associated research needs. American Water Works Association Research Foundation Project #706.
Larson, J.L., Wolf, D.C., and Butterworth, B.E. (1994a). Induced cytolethality and regenerative cell proliferation in the livers and kidneys of male B6C3F1 mice given chloroform by gavage. Fundamentals and Applied Toxicology, 23, 537-543.
Larson, J.L., Wolf, D.C., and Butterworth, B.E. (1994b). Induced cytotoxicity and cell proliferation in the hepatocarcinogenicity of chloroform in female B6C3F1 mice: comparison of administration by gavage in corn oil vs. ad libitum in drinking water. Fundamentals and Applied Toxicology, 22, 90-102.
Lindquist, H.D.A. (1999). Emerging pathogens of concern in drinking water. EPA Publication #EPA 600/R-99/070.
National Academy of Engineering (2000). Greatest engineering achievements of the 20th century. [On-Line]. Available: http://www.greatachievements.org/greatachievements/ (accessed 2-10-03).
Ontario Ministry of the Attorney General, The Honorable Dennis R. O’Connor (2002). Part one: A summary: Report of the Walkerton inquiry: The events of May 2000 and related issues.
Otterstetter, H. and Craun, C. (September, 1997). Disinfection in the Americas: A necessity. Journal of the American Water Works Association, 8-10.
Rose, J.B. (2002). Water quality security. Environmental Science and Technology, 36, 217-256.
U.S. Centers for Disease Control and Prevention, (November 22, 2002). Morbidity and Mortality Weekly Report, CDC Surveillance summaries: Surveillance for waterborne disease outbreaks—United States, 1999–2000.
U.S. Centers for Disease Control and Prevention (May 26, 2000). Morbidity and Mortality Weekly Report, CDC Surveillance summaries: Surveillance for waterborne disease outbreaks—United States, 1997–1998.
U.S. Centers for Disease Control and Prevention (December 11, 1998). Morbidity and Mortality Weekly Report, CDC Surveillance summaries: Surveillance for waterborne disease outbreaks—United States, 1995–1996.
U.S. Centers for Disease Control and Prevention (1997). Summary of notifiable diseases.
U.S. Centers for Disease Control and Prevention (April 12, 1996). Morbidity and Mortality Weekly Report, CDC Surveillance summaries: Surveillance for waterborne disease outbreaks—United States, 1993–1994.
U.S. Centers for Disease Control and Prevention (November 19, 1993). Morbidity and Mortality Weekly Report, CDC Surveillance summaries: Surveillance for waterborne disease outbreaks—United States, 1991–1992.
U.S. Environmental Protection Agency (2001a). Toxicological review of chloroform in support of summary information on the Integrated Risk Information System (IRIS). EPA Number 635/R-01/001.
U.S. Environmental Protection Agency (2001b). Controlling Disinfection byproducts and Microbial Contaminants in Drinking Water. EPA Number 600/R-01/110.
U.S. Environmental Protection Agency (1998a). National Primary Drinking Water Regulations: Disinfectants and Disinfection Byproducts; Final Rule. Federal Register, Vol. 63, No. 157, December 16, 1998.
U.S. Environmental Protection Agency (1998b). Regulatory Impact Analysis of Final Disinfectant/ Disinfection byproducts Regulations. Washington, D.C. EPA Number 815-B-98-002-PB 99-111304
U.S. Environmental Protection Agency (1991). Letter from Wilcher, L.S. to Guerra de Macedo, G.