Guidelines for Drinking-water Quality Management for New Zealand 2013

Third edition
Citation: Ministry of Health. 2013. Guidelines for Drinking-water Quality Management
for New Zealand 2013. Third edition. Wellington: Ministry of Health.
Published in October 2013
by the Ministry of Health
PO Box 5013, Wellington 6145, New Zealand
ISBN: 978-0-478-41533-9 (online)
HP 5735
This document is available at www.health.govt.nz
Document summary
The Guidelines for Drinking-water Quality Management for New Zealand 2013 (the
Guidelines) are produced in three volumes:
•	Volume 1: Chapters 1–19
•	Volume 2: Appendices 1–3 (Appendices 1 and 3 TO COME)
•	Volume 3: The Datasheets.
Chapters 1–5 are largely introductory, covering Ministry policy, risk management, source waters
and microbiological quality.
Chapters 6–11 discuss compliance issues.
Chapters 12–18 cover operations and maintenance of the water supply.
Chapter 19 concerns the smaller water supplies and water delivered by tanker.
The Guidelines are produced only in electronic form.
Guidelines for Drinking-water Quality Management for New Zealand 2013
iii
Contents

Document summary – iii

Chapter 1: Introduction – 1
1.1 Introduction – 1
1.1.1 Purpose of the guidelines – 1
1.1.2 Background – 1
1.1.3 Waterborne diseases in New Zealand – 3
1.1.4 The cost of providing safe drinking-water – 8
1.1.5 The benefits of safe drinking-water supplies – 9
1.2 Ministry of Health public health protection strategy for drinking-water 1993–1995 – 11
1.2.1 Strategy development – 11
1.2.2 Outline of the strategy – 12
1.2.3 Planned milestones – 12
1.2.4 Desired outcomes – 13
1.2.5 Promotion of awareness of public health issues related to drinking-water – 13
1.2.6 Outputs achieved by 1995 – 13
1.3 Strategy development 1995–2000 – 14
1.3.1 Consultation – 14
1.3.2 Inclusion of protozoa in the DWSNZ – 14
1.3.3 Human resource development – 15
1.3.4 DWSNZ 2000 – 15
1.4 Strategy development 2000–2005 – 16
1.4.1 Consultation – 16
1.4.2 Public health risk management plans – 16
1.4.3 Legislative development – 17
1.4.4 Development of the DWSNZ 2005 – 18
1.4.5 Development of the drinking-water assistance programme – 19
1.5 Operational development of the drinking-water management programme 2005–2009 – 20
1.5.1 The Drinking-water Assistance Programme, DWAP – 20
1.5.2 Legislation: The Health (Drinking Water) Amendment Act 2007 (Part 2a of the Health Act 1956 – the Act) – 21
1.5.3 The 2008 revision of DWSNZ 2005 – 21
1.5.4 Tankered drinking-water supplies – 22
1.5.5 Development of specifications for Rural Agricultural Drinking-Water Supplies (RADWS) – 22
1.5.6 Point-of-use and point-of-entry drinking-water treatment appliances standards – 22
1.6 Tools for promoting safe drinking-water supplies – 23
1.6.1 Introduction – 23
1.6.2 Drinking-water Standards for New Zealand – 23
1.6.3 Guidelines for Drinking-Water Quality Management in New Zealand – 25
1.6.4 Public health risk management plans – 26
1.6.5 Public health grading of drinking-water supplies – 27
1.6.6 Drinking-water assessment – 27
1.6.7 Monitoring – 28
1.6.8 Surveillance – 28
1.6.9 Identifying priority 2 determinands – 28
1.6.10 Register of Community Drinking-water Supplies and Suppliers in New Zealand – 29
1.6.11 Annual Review of Drinking-water Quality in New Zealand – 30
1.6.12 Register of recognised laboratories – 30
1.7 Other drinking-water requirements – 30
1.7.1 Drinking-water quality at airports – 30
1.7.2 Drinking-water quality in shipping – 31
References – 32
Chapter 2: Management of community supplies – 36
2.1 Introduction – 36
2.1.1 Components of a drinking-water supply – 37
2.1.2 Overview of management systems – 39
2.2 Risk management – 40
2.2.1 General – 40
2.2.2 Public Health Risk Management Plans (PHRMPs) – 43
2.2.3 Contingency planning – 58
2.2.4 Response to incidents – 59
2.2.5 Debriefings – 60
2.2.6 Sanitary surveys – 61
2.2.7 Staff and contractors – 62
2.3 Quality assurance – 64
2.3.1 Key features of quality assurance – 64
2.3.2 Application to drinking-water supplies – 64
2.4 Quality control – 66
References – 67
Chapter 3: Water sources – 70
3.1 Introduction – 70
3.2 Groundwater – 71
3.2.1 Description of a groundwater system – 71
3.2.2 The quality of groundwater – 73
3.2.3 Factors affecting groundwater quality – 77
3.2.4 Establishing the security of a bore water supply – 82
3.3 Surface water – 105
3.3.1 Rivers and streams – 106
3.3.2 Lakes and reservoirs – 109
3.3.3 Springs – 112
3.4 Legislation – 112
3.4.1 General – 112
3.4.2 National Environmental Standards (NES) – 115
3.5 Mitigation of pollutants and catchment protection – 117
3.5.1 Rural activities – 117
3.5.2 Urban and transportation pollutants – 122
References – 123
Chapter 4: Selection of water source and treatment – 132
4.1 Introduction – 132
4.2 Identifying potential sources – 133
4.2.1 Quantity, reliability, access – 133
4.3 Barriers to the transmission of micro-organisms – 136
4.3.1 Protection of water catchments – 137
4.3.2 Storage and pretreatment – 138
4.3.3 Coagulation and filtration – 138
4.3.4 Disinfection and inactivation – 139
4.4 Evaluating the sources – 140
4.4.1 Where to sample – 140
4.4.2 When to sample and how often – 142
4.4.3 What to sample for – 143
4.4.4 Effect of recycling washwater – 150
4.5 Selecting appropriate treatment processes – 151
4.5.1 Intakes – 151
4.5.2 Treatment selection – 152
References – 156
Chapter 5: General microbiological quality – 160
5.1 Introduction – 160
5.2 Micro-organisms in drinking-water – 162
5.2.1 Introduction – 162
5.2.2 Controlling waterborne infection – historical overview – 163
5.2.3 Maximum acceptable value (MAV) – 165
5.3 Microbial indicators – 166
5.3.1 Introduction – 166
5.3.2 Bacterial indicators – 167
5.3.3 Pathogenic protozoal indicators – 169
5.3.4 Pathogenic viral indicators – 170
5.3.5 Secondary bacterial indicators – 170
5.3.6 Indicators of general quality – 172
5.3.7 Indicators of effectiveness of treatment – 173
5.4 Waterborne pathogens – 174
5.4.1 Testing for specific pathogens – 174
5.4.2 Bacterial pathogens from faecal contamination – 175
5.4.3 Bacterial pathogens growing in the water supply – 176
5.4.4 Viruses – 176
5.4.5 Pathogenic protozoa – 177
5.4.6 Helminths – 180
5.4.7 Cyanobacteria (blue-green algae) – 181
5.4.8 Disease from waterborne pathogens – 181
5.5 Organisms causing problems other than disease – 182
5.5.1 General – 182
5.5.2 Organisms causing taste and odour – 183
5.5.3 Micro-organisms causing colour – 183
5.5.4 Iron and manganese deposits due to bacteria – 183
5.5.5 Corrosion resulting from iron and sulphur bacteria activity – 184
5.5.6 Large numbers of micro-organisms – 184
5.5.7 Invertebrate inhabitants of water systems – 185
References – 186
Chapter 6: Bacteriological compliance – 190
6.1 Introduction – 190
6.2 Monitoring for E. coli – 191
6.2.1 General principles – 191
6.2.2 Statistical considerations – 193
6.3 Microbiological compliance – 196
6.3.1 Introduction – 196
6.3.2 Methods for detecting and enumerating E. coli – 196
6.3.3 Effective monitoring programmes – 197
6.3.4 Monitoring drinking-water leaving a treatment plant – 198
6.3.5 Monitoring drinking-water from groundwater – 199
6.3.6 Monitoring drinking-water in the distribution system – 201
6.3.7 Chlorine testing as a substitute for E. coli – 202
6.4 Sampling and testing – 204
6.4.1 Sample handling – 204
6.4.2 Test methods and sources – 206
6.4.3 Laboratory competency – 208
6.5 Transgressions – 209
6.5.1 Response – 209
6.5.2 Record keeping – 213
Appendix: Boil water notices – 213
References – 214
Chapter 7: Virological compliance – 216
7.1 Introduction – 216
7.2 Health significance of human viruses in drinking-water – 217
7.3 Occurrence of human viruses in source waters – 219
7.4 Risk management – 219
7.4.1 International approaches – 220
7.4.2 Virus removal by current water treatment processes – 220
7.5 Sampling, testing and data interpretation – 223
7.6 C.t values – 224
References – 227
Chapter 8: Protozoal compliance – 231
8.1 Introduction – 231
8.2 Source water – 232
8.2.1 Introduction – 232
8.2.2 Approach to categorisation – 233
8.2.3 Catchment risk category approach – 235
8.2.4 Measurement of Cryptosporidium oocysts approach – 235
8.2.5 Comparison of protozoal risk assessment and oocyst monitoring data – 237
8.2.6 Catchment categorisation review – 238
8.3 The cumulative log credit approach – 239
8.3.1 Calculation of log credits – 239
8.3.2 Which treatment processes are additive for protozoal compliance? – 240
8.4 Compliance – 244
8.4.1 Pretreatment processes – 245
8.4.2 Coagulation processes – 248
8.4.3 Filtration processes – 255
8.4.4 Disinfection processes – 267
8.4.5 Other processes and other log credit determinations – 281
8.5 Challenge testing – 284
8.5.1 Using Cryptosporidium oocysts – 284
8.5.2 Using microspheres – 284
8.5.3 Using naturally occurring bacteria – 284
8.5.4 Bag and cartridge filter challenge – 285
8.5.5 Membrane filter challenge – 285
8.5.6 UV appliance challenge – 286
8.5.7 General – 287
8.6 Sampling and testing for protozoa and substitute compliance tests – 288
8.6.1 Giardia and Cryptosporidium testing – 288
8.6.2 Alternatives to Giardia and Cryptosporidium testing – 289
8.7 Transgressions – 303
8.7.1 Response – 303
8.7.2 Reporting – 304
References – 305
Appendix: Guidance Notes: Interpretation of Catchment Protozoal Risk Category – 310
Part 1: Surface waters – 310
Part 2: Bore water supplies – 312
Part 3: Five-yearly reassessment of protozoa risk category – 315
Chapter 9: Cyanobacterial compliance – 316
9.1 Introduction – 316
9.1.1 Algal bloom development – 317
9.1.2 Health significance of cyanotoxins – 319
9.1.3 Taste and odour caused by cyanobacteria – 321
9.1.4 Occurrence of toxic cyanobacteria internationally and in New Zealand – 321
9.2 Risk management – 324
9.2.1 Assessment of risk – 324
9.3 Monitoring – 326
9.4 Compliance – 327
9.5 Sampling and testing – 330
9.5.1 Sample testing – 330
9.5.2 Sample collection – 330
9.6 Transgressions – 337
9.7 Risk reduction – 338
9.7.1 Alert levels – 338
9.7.2 Preventive and remedial measures – 341
References – 349
Chapter 10: Chemical compliance – 354
10.1 Introduction – 354
10.1.1 Maximum Acceptable Value (MAV) – 354
10.2 Chemical determinands of health significance – 355
10.2.1 Background – 355
10.2.2 Inorganic chemicals of health significance – 357
10.2.3 Organic determinands of health significance and pesticides – 359
10.2.4 Health risk from toxic chemicals – 363
10.2.5 Derivation of MAVs for chemicals of health significance – 364
10.2.6 Plumbosolvent water – 375
10.3 Monitoring programme design – 377
10.3.1 Drinking-Water Standards for New Zealand – 377
10.3.2 Priority 2a and 2b chemical determinands – 378
10.3.3 Plumbosolvent water – 384
10.3.4 Discretionary monitoring – 387
10.4 Sampling procedures and techniques – 393
10.4.1 Chemical determinands – 393
10.4.2 Plumbosolvent water – 396
10.5 Analytical details – 397
10.5.1 Chemical determinands – 397
10.5.2 Laboratory competency – 401
10.5.3 Interpretation of analytical data – 401
10.6 Records and assessment of compliance – 402
10.6.1 Example: Records and assessment of chemical compliance for Bogus community drinking-water supply – 402
10.7 Response to transgressions – 405
References – 407
Chapter 11: Radiological compliance – 410
11.1 Introduction – 410
11.2 Radiological determinands – 410
11.2.1 Overview and occurrence of radiochemicals – 410
11.2.2 Routes of exposure – 413
11.2.3 Derivation of radiological MAVs – 415
11.3 Monitoring programme design – 416
11.4 Sampling procedures and techniques – 417
11.5 Analytical details – 417
11.5.1 Total alpha and beta radioactivity measurement – 417
11.5.2 Radiochemical analysis – 417
11.5.3 Radon determination – 418
11.6 Records and assessment of compliance – 418
11.7 Response to transgressions – 418
11.8 Radon removal – 418
References – 419
Chapter 12: Treatment processes, pretreatment – 421
12.1 Introduction – 421
12.2 Groundwater – 422
12.2.1 Aeration – 422
12.2.2 Oxidation processes – 425
12.2.3 pH adjustment – 426
12.3 Surface water – 427
12.3.1 Bank filtration and infiltration galleries – 427
12.3.2 Off-river storage – 431
12.3.3 Presedimentation – 434
12.3.4 Prefiltration – 435
12.3.5 Microstrainers – 436
12.3.6 Roughing filters – 437
References – 438
Chapter 13: Treatment processes, coagulation – 440
13.1 Introduction – 440
13.2 Coagulation process – 442
13.3 Coagulants and flocculants – 444
13.3.1 Definitions – 444
13.3.2 Coagulants – 444
13.3.3 Flocculants – 445
13.3.4 Health effects – 445
13.4 Coagulation and flocculation – 446
13.4.1 Overview – 446
13.4.2 Jar testing – 447
13.4.3 Performance and control – 448
13.5 Clarification and sedimentation – 449
13.5.1 Overview – 449
13.5.2 Clarifier types – 450
13.5.3 Optimisation and performance issues – 452
13.6 Lime softening and ion exchange – 454
13.6.1 Lime softening – 454
13.6.2 Ion exchange – 455
13.7 Rapid granular media filtration – 456
13.7.1 Overview – 456
13.7.2 Turbidity monitoring – 458
13.7.3 Filter operation – 458
13.7.4 Optimisation of the filtration process – 458
13.8 Second stage filtration – 461
References – 462
Chapter 14: Treatment processes, filtration and adsorption – 465
14.1 Introduction – 465
14.2 Diatomaceous earth filtration – 466
14.2.1 Vacuum or standard DE filtration – 467
14.2.2 Pressure or modified DE filtration – 468
14.2.3 Some operating issues with DE filtration – 469
14.2.4 Monitoring – 469
14.3 Slow sand filtration – 469
14.3.1 Cleaning – 471
14.3.2 Monitoring – 471
14.3.3 Aeration – 472
14.3.4 Some operating issues with slow sand filtration – 472
14.4 Membrane filtration – 473
14.4.1 Introduction – 473
14.4.2 Current experience in New Zealand and overseas – 473
14.4.3 Fundamentals of membrane filtration – 474
14.4.4 Membrane selection – 477
14.4.5 Membrane plant operations – 478
14.5 Cartridge filtration – 485
14.6 Bag filtration – 491
14.7 Adsorption processes – 493
14.7.1 Activated alumina – 493
14.7.2 Solid block activated carbon (SBAC) filters – 494
14.7.3 Granular activated carbon (GAC) filters – 494
14.7.4 Biologically active filters (BAC) – 495
14.8 Desalination – 495
References – 497
Chapter 15: Treatment processes, disinfection – 500
15.1 Introduction – 500
15.2 Disinfection effectiveness – 502
15.2.1 C.t values – 503
15.2.2 Disinfectant concentration – 504
15.2.3 Nature of the disinfectant – 504
15.2.4 Type of micro-organisms present – 505
15.2.5 pH – 505
15.2.6 Temperature – 506
15.2.7 Water quality – 506
15.2.8 Regrowth – 507
15.2.9 Disinfectant mixing and retention time – 507
15.3 Choice of disinfectant – 509
15.4 Disinfection by-product formation – 511
15.5 Disinfection processes – 519
15.5.1 Chlorine – 519
15.5.2 Chloramines – 525
15.5.3 Chlorine dioxide – 530
15.5.4 Ozone – 535
15.5.5 Ultraviolet disinfection – 542
15.5.6 Other disinfectants – 552
References – 557
Chapter 16: The distribution system – 563
16.1 Introduction – 563
16.1.1 Critical points in a distribution system – 563
16.2 Components of a distribution system – 565
16.2.1 Service reservoirs – 566
16.2.2 Distribution network – 567
16.2.3 Pump stations – 567
16.2.4 System monitoring and control – 568
16.2.5 Design issues affecting water quality – 569
16.2.6 Permeation and leaching – 570
16.3 Operations and maintenance – 574
16.3.1 Service reservoir operation – 574
16.3.2 Reticulation operation – 576
16.3.3 Maintenance of disinfection residual – 583
16.3.4 Barriers against recontamination – 584
16.4 Aesthetic considerations – 587
16.4.1 Wholesomeness – 588
16.4.2 Consumer complaints – 591
References – 592
Chapter 17: Monitoring, water treatment and drinking-water – 597
17.1 Introduction – 597
17.1.1 Test methods – 597
17.2 Sampling – 598
17.3 Monitoring for process control – 601
17.3.1 Planning a monitoring programme – 602
17.3.2 Installation – 605
17.3.3 Standardisation – 606
17.3.4 Process control – 608
17.4 Continuous monitoring for compliance – 613
17.4.1 Priority 1 determinands – 613
17.4.2 Priority 2 determinands and indirect indicators – 615
17.4.3 Control limits – 615
17.4.4 Recording and storing results – 616
17.5 Testing – 618
17.5.1 Introduction – 618
17.5.2 Appropriate testing – 618
17.5.3 Online monitoring – 619
17.5.4 Quality assurance, quality control and testing proficiency – 620
17.5.5 Accuracy, precision, uncertainty of measurement – 622
17.5.6 Referee methods, standards and traceability – 624
17.5.7 Calibrating a method against the referee method – 626
17.5.8 Reporting the results – 627
17.5.9 Records – 628
17.6 Comparing test results against a MAV – 630
17.6.1 Uncertainties of measurement – 630
17.6.2 Comparison of a measurement with a fixed value – 632
17.6.3 Approaches considered in developing the method used in the DWSNZ – 633
17.6.4 Approach adopted in the DWSNZ – 634
17.6.5 Detection – 635
References – 637
Chapter 18: Aesthetic considerations – 641
18.1 Introduction – 641
18.2 Aesthetic determinands – 642
18.2.1 Overview of aesthetic determinands – 642
18.2.2 Rationale for the aesthetic guideline values – 647
18.3 Water treatment for the removal of aesthetic determinands – 648
18.4 Monitoring programme design – 655
18.5 Aesthetic guidelines criteria – 655
18.6 Analytical details – 658
References – 661
Chapter 19: Small, individual and roof water supplies – 663
19.1 Introduction – 663
19.2 Small water supplies – 664
19.2.1 Key requirements of the DWSNZ – 664
19.2.2 Preparing a public health risk management plan – 665
19.2.3 Sanitary inspection – 667
19.2.4 Water quality monitoring – 672
19.2.5 Supplies not required to demonstrate compliance with DWSNZ – 674
19.2.6 Water supplies operated under the Building Act – 676
19.3 Individual household drinking-water supplies – 676
19.3.1 Water sources other than rainwater – 677
19.3.2 Sanitary inspection – 678
19.3.3 Water quality and monitoring – 678
19.3.4 Water treatment – 680
19.3.5 Plumbing considerations – 687
19.4 Roof-collected rainwater supplies – 688
19.4.1 Introduction – 688
19.4.2 Microbiological problems – 688
19.4.3 Chemical problems – 691
19.4.4 Maintenance problems – 693
19.4.5 Design and operation – 694
19.4.6 Information resources – 698
19.5 Tankered drinking-water supplies – 699
19.6 Rural agricultural drinking-water supplies – 700
References – 700
Appendix 1 (To Come)

Appendix 2: Statistical issues that relate to the Drinking-water Standards of New Zealand – 706
1 Compliance rules for percentile standards – 706
2 Handling non-detects – 710
References – 711

Appendix 3 (To Come)
List of Tables

Table 1.1: Waterborne enteric disease – cases associated with outbreaks – 3
Table 1.2: Reported rates of potentially waterborne notifiable diseases, 1999–2007 – 4
Table 1.3: Waterborne outbreaks in New Zealand, 1984–2006 – 5
Table 1.4: Economic benefits arising from water and sanitation improvements – 9
Table 1.5: Examples of priority allocation in the DWSNZ – 29
Table 2.1: Supply elements contributing to the four main barriers to bacterial contaminants – 49
Table 3.1: Concentrations of enteric pathogens and indicators in different types of source water – 73
Table 3.2: Statistics showing the variability in conductivity, chloride and nitrate-N for groundwaters – 92
Table 3.3: Example of calculations of coefficient of variation and standardised variance – 93
Table 3.4: Log reductions of viruses in Canterbury's alluvial aquifers – 98
Table 3.5: Human activities and associated inputs into freshwater ecosystems with human health risks – 106
Table 3.6: Some chemical constituents in untreated surface water used for drinking-water supply that present a potential problem – 107
Table 3.7: Faecal contamination in a range of New Zealand streams and rivers – 109
Table 3.8: Specific yields (kg/ha/y) for different land uses in New Zealand – 110
Table 4.1: Source water quality – 133
Table 4.2: Performance that can be achieved by effective barriers to contamination – 137
Table 4.3: Effect of impoundment on mean concentrations of some determinands during fairly dry summer/autumn periods – 141
Table 4.4: Summary of sources that may provide significant chemical contaminants of concern (COCs) to freshwater environments in New Zealand – 144
Table 4.5: Prioritising chemical monitoring in drinking-water using limited information – 146
Table 4.6: Treatment options for typical low colour source waters – 155
Table 4.7: Treatment options for source waters with colour that also needs to be removed – 155
Table 4.8: Treatment options for other types of source waters – 156
Table 4.9: Options for waters that only require disinfection – 156
Table 5.1: Waterborne pathogens and their significance in water supplies – 174
Table 6.1: Example spreadsheet for converting FAC to FACE – 203
Table 7.1: UV dose requirements for virus inactivation credit – 225
Table 7.2: C.t values for inactivation of viruses by free chlorine, pH 6–9 – 225
Table 7.3: C.t values for inactivation of viruses by chloramine – 226
Table 7.4: C.t values for inactivation of viruses by chlorine dioxide, pH 6–9 – 226
Table 7.5: C.t values for inactivation of viruses by ozone – 226
Table 8.1: Mean spore removal for full-scale sedimentation basins – 250
Table 8.2: Relationship between mean turbidity reduction during sedimentation and the percent of months when mean spore removal was at least 0.5 log – 250
Table 8.3: Studies of Cryptosporidium removal at different filtrate turbidity levels – 254
Table 8.4: Results from studies of Cryptosporidium (or surrogate) removal by bag filters – 259
Table 8.5: Results from studies of Cryptosporidium (or surrogate) removal by cartridge filters – 263
Table 8.6: Methods and terminology for calculating the log inactivation credit when using ozone – 269
Table 8.7: UV dose requirements for Cryptosporidium inactivation credits – 280
Table 8.8: Summary of some methods for standardisation and verification of turbidimeters – 293
Table A1: Log removal requirements for bore waters <10 m deep based on land use activities and soil / sub-surface material permeability – 314
Table A2: Sub-surface media and soil categories – 314
Table 9.1: Toxic cyanobacteria species and their geographical distribution – 322
Table 9.2: Cyanobacteria genera known to occur in New Zealand fresh waters and the toxins they are known to produce – 323
Table 9.3: Information that may help in situation assessment and management – 324
Table 9.4: Summary of performance of water treatment processes capable of removing cell-bound microcystins by removing whole cells – 346
Table 9.5: Efficiency of dissolved toxin removal by oxidants/disinfectants and activated carbons – 347
Table 10.1: Data for calculating the Langelier Index – 388
Table 11.1: The natural radionuclides that may be found in drinking-water – 412
Table 11.2: Radioactivity in New Zealand waters – 413
Table 12.1: Reduction times for selected micro-organisms in surface water – 432
Table 12.2: Studies of protozoa removal from off-river raw water storage – 433
Table 14.1: Properties of typical membrane materials – 478
Table 14.2: Typical design/operating criteria for MF/UF systems (guidance only) – 483
Table 14.3: Data log and check sheet – 484
Table 14.4: Chemical cleaning log sheet – provided by membrane supplier – 485
Table 15.1: C.t value ranges for 99 percent (2 log) inactivation of various micro-organisms by disinfectants at 5°C – 504
Table 15.2: Baffle factors for use in measuring detention time – 508
Table 15.3: Characteristics of different disinfectants – 511
Table 15.4: Disinfection by-products often present in disinfected waters – 512
Table 15.5: Chlorine C.t values for 99 percent inactivation (2 logs) – 520
Table 15.6: Variation of hypochlorous acid and hypochlorite ion with pH – 521
Table 15.7: Monochloramine C.t values for 99 percent inactivation (2 logs) – 525
Table 15.8: Chlorine dioxide C.t values for 99 percent inactivation (2 logs) – 531
Table 15.9: Ozone C.t values for 99 percent inactivation (2 logs) – 535
Table 15.10: UV irradiation doses required for 99 percent inactivation (2 logs) – 543
Table 17.1: Process control monitoring by treatment stage in a conventional process – 603
Table 17.2: Instruments and examples of their application – 604
Table 17.3: Drinking-water Standards for New Zealand: requirements for continuous online monitoring – 617
Table 17.4: Types of precision associated with test results – 623
Table 17.5: Suggested report form – 628
Table 18.1: Sampling frequencies for aesthetic guidelines (ex grading) – 657
Table 19.1: Common contaminants, related problems, and their likely sources – 679
Table 19.2: Contaminants and treatment methods – 681
Table 19.3: Point-of-use and point-of-entry devices and an indication of their effectiveness against various contaminants – 685
Table A1: Numbers of samples and allowable transgressions needed to keep maximum risks below 5 percent when assessing compliance with a 95th percentile standard – 707
List of Figures

Figure 1.1: Epidemic versus endemic/sporadic disease – 7
Figure 2.1: Schematic diagram of a drinking-water supply system – 37
Figure 2.2: Integrated management of water supply systems – 40
Figure 2.3: The risk management process – 41
Figure 2.4: Suggested approach for the development of PHRMPs – 47
Figure 2.5: Process for the implementation of PHRMPs – 54
Figure 2.6: Organisation of incident management teams – 60
Figure 2.7: Basic structure for a quality management system – 65
Figure 3.1: Atmospheric CFC and tritium concentrations in rain water – 90
Figure 3.2: Sanitary protection of a typical bore – 98
Figure 3.3: Catchment protection – 117
Figure 4.1: Confidence limits on a 95th percentile estimate – 142
Figure 4.2: Micro-organism size and treatability – 154
Figure 6.1: Confidence of compliance for a 95 percentile, over smaller and larger datasets – 195
Figure 6.2: Confidence of compliance for a 98 percentile – 195
Figure 9.1: Rapid assessment of the level of risk posed by toxic cyanobacteria in a drinking-water source – 325
Figure 9.2: Procedure for use of the integrated hose-pipe sampler for planktonic cyanobacteria and cyanotoxins – 333
Figure 9.3: Benthic cyanobacteria monitoring and sampling: schematic of layout of transects and survey areas – 335
Figure 9.4: Benthic cyanobacteria monitoring and sampling: schematic of transect cross-section showing arrangement of sampling points – 336
Figure 9.5: Alert levels framework for the management of cyanobacteria in drinking-water supplies – 339
Figure 10.1: Standard fitting for testing plumbosolvency – 386
Figure 10.2: Influence of water composition on meringue dezincification – 391
Figure 12.1: Tray aerator – 424
Figure 13.1: Conventional coagulation, sedimentation and filtration – 442
Figure 13.2: Direct filtration – 443
Figure 13.3: Lamella plates – 450
Figure 13.4: Adsorption clarifier – 451
Figure 13.5: Dissolved air flotation (DAF) – 451
Figure 13.6: Actiflo process – 452
Figure 13.7: Rapid granular media filter – 456
Figure 14.1: Diatomaceous earth pressure plant at Mokau, Waitomo District – 468
Figure 14.2: Typical submerged membrane system – 476
Figure 14.3: Cutaway showing cartridge seal – 488
Figure 14.4: Suggested arrangement for reading pressure differential across a cartridge filter – 489
Figure 14.5: Typical pressure drop across a cartridge during a filter run – 490
Figure 15.1: Baffle characteristics of a pipe and tank – 509
Figure 15.2: The effect of chlorine to nitrogen ratios in producing chloramine compounds – 527
Figure 15.3: Distribution diagram for chloramine species with pH – 528
Figure 16.1: Typical reticulation system – 564
Figure 16.2: Reservoir short-circuiting: severe (left) vs moderate short-circuiting – 566
Figure 16.3: Fire hydrant standpipe being used to flush a water main – 577
Figure 16.4: Commissioning a backflow prevention device – 579
Figure 16.5: Example of corroded water pipe – 589
Figure 16.6: Structure of a biofilm – 591
Figure 17.1: Typical standard curve applicable to most test parameters – 606
Figure 17.2: Closed loop control – a flow paced lime pump with pH correction – 608
Figure 17.3: An example of a distributed control arrangement – 610
Figure 17.4: An example of a SCADA arrangement – 611
Figure 17.5: A telemetry system arrangement – 612
Figure 17.6: Hypochlorite ion vs hypochlorous acid at various pH values – 615
Figure 17.7: Example of use of control limits – 616
Figure 19.1: Reducing the contamination of open reservoirs – 668
Figure 19.2: Prevention of contamination of storage tanks – 669
Figure 19.3a and b: Backflow prevention device installations – 670
Figure 19.4: Typical point-of-use installation – 682
Figure 19.5: Minimising contamination of roof water – 695
Figure A1: Bayesian confidence of compliance curves for a 95th percentile standard, using Jeffreys' uninformative prior – 708
Figure A2: Fitting a lognormal distribution to >L data (where L = 5), and extrapolating back to obtain values of <L data – 711
Chapter 1: Introduction
1.1
Introduction
1.1.1
Purpose of the guidelines
A wide range of gastrointestinal diseases can be caused by ingestion of food or drink that is contaminated by pathogenic micro-organisms or by toxic chemicals. Control of these hazards is an important feature of public health. It is achieved by regulating food safety, administered by the Ministry for Primary Industries, and by regulating the safety of community drinking-water supplies, administered by the Ministry of Health.
The purpose of these Guidelines for Drinking-water Quality Management for New Zealand is
to provide information on the tools used by the Ministry of Health to promote the provision, by
suppliers, of drinking-water that is safe to drink. The development of these tools commenced in
1993. This introduction to the Guidelines puts the Ministry’s tools in their historical context.
1.1.2
Background
In 1992 the public health oversight of drinking-water quality management in New Zealand was
in disarray after five years of central and local government restructuring and retrenchment. The
(then) Department of Health was receiving little information about the quality of public
drinking-water supplies (Taylor 1993a). However, an independent survey (Ogilvie 1989) had
shown that at least 45 to 50 percent of water suppliers did not monitor their chlorine dosage
satisfactorily, 28 percent never tested the bacteriological quality of water after it entered the
distribution system, and another 30 percent tested only four times per year. Thus, for most of
the year, the microbiological quality of the water was unknown. In 1991 bacterial quality was
reported for just 462 of the water supplies in NZ. As the drinking-water management tools came
into operation the number of known supplies increased until, as at 22 April 2013, 1421
community water supplies were listed on the Register of Drinking-water Suppliers for New
Zealand.
After the Department of Health was restructured into the Ministry of Health in 1993, an initial
appraisal of the public health safety management of the drinking-water industry was carried out
(Taylor 1993a, 1993b). The opportunity was taken to review and restructure the process of
public health management of drinking-water. There were also a number of major governance
and structural issues surrounding the management of the water resources and the water
industry that might have benefited from review, but responsibility for these lay outside the
health portfolio. Therefore the Ministry of Health concentrated on the public health
infrastructure, although it has contributed where possible to various governance and structural
reviews on related topics carried out by other agencies.
It soon became evident that there were a large number of small supplies about which little or nothing was known. Although the larger municipal supplies were generally well managed and safe, there were some whose standards were not as good as could be desired.
The Ministry of Health is responsible for the regulation of public health under the Health Act
1956 and subsequent amendments. This includes overview of drinking-water supplies to ensure
that the water from these supplies can be drunk without causing illness. A safe drinking-water
supply is a fundamental pre-requisite of public health.
In 1993 the World Health Organization (WHO) published the second edition of its Guidelines
for Drinking-water Quality which updated the information on drinking-water quality
requirements from the first (1983) edition. The Ministry of Health used the information in the
WHO Guidelines, and the knowledge of deficiencies in the public health management of
drinking-water that it had gained from its own review in NZ, to publish revised Drinking-water
Standards for New Zealand in 1995 and to develop a strategic plan and tools to improve the
public health safety of New Zealand drinking-water. The standards were further updated in
2000, 2005, and again in 2008.
The Drinking-water Standards for New Zealand 2000 updated the analytical methods for
drinking-water quality and made some minor changes to improve the interpretation and
robustness.
Additional new material was included in the Drinking-water Standards for New Zealand 2005
(DWSNZ) to accommodate the advances that had occurred in the previous five years. This
included protocols for the use of ultraviolet light disinfection to inactivate bacteria and protozoa,
a radical restructuring of the sections relating to protozoal compliance, and new sections on
cyanotoxins, small supplies, and tankered drinking-water. New information from the WHO Guidelines for
Drinking-water Quality (3rd edition, 2004) was included.
The current Standards are Drinking-water Standards for New Zealand 2005 (revised 2008).
The DWSNZ have two main components:
•	public health standards for drinking-water quality, which list the maximum concentrations of chemical, radiological and microbiological contaminants that can be present in drinking-water without presenting a public health risk
•	compliance criteria for community water supplies, which specify the sampling frequencies and testing procedures needed to demonstrate with 95 percent confidence that the water complies with the DWSNZ for at least 95 percent of the time, and provide for lesser confidence levels for smaller supplies.
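The statistical basis of the 95 percent / 95 percent criterion can be illustrated with a short calculation. The sketch below is illustrative only and is not taken from the DWSNZ compliance tables; it assumes a simple binomial model in which each sample independently complies with probability equal to the true compliance rate, and asks how many consecutive compliant samples are needed before a compliance rate below the target can be ruled out at the stated confidence level.

```python
def min_samples(p_target=0.95, confidence=0.95):
    """Smallest n such that observing n compliant samples out of n rules out,
    at the given confidence level, a true compliance rate below p_target.
    If the true rate were only p_target, the probability of n consecutive
    compliant samples is p_target**n; we require this to be at most
    (1 - confidence)."""
    n = 1
    while p_target ** n > 1 - confidence:
        n += 1
    return n

print(min_samples())            # 59 samples, all compliant
print(min_samples(0.95, 0.80))  # 32: a lesser confidence level needs fewer samples
```

The second call illustrates why the DWSNZ can allow lesser confidence levels for smaller supplies: relaxing the required confidence substantially reduces the number of samples a small supplier must take.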
Though the DWSNZ provide performance criteria for drinking-water quality management they
do not specify how the quality of water supplies should be managed. That is discussed in this
publication, the Guidelines for Drinking-water Quality Management in New Zealand, which
forms the companion volume to the DWSNZ.
The water properties addressed in the DWSNZ relate to health significance, not to aesthetic
qualities. The Guidelines for Drinking-water Quality Management in New Zealand (the
Guidelines) explain the principles underlying the DWSNZ, how the Maximum Acceptable
Values (MAVs) for determinands in drinking-water were derived, and the part that aesthetic
quality plays in producing a safe, wholesome and acceptable community drinking-water. The
Ministry’s drinking-water quality management tools and the scope of the Guidelines are
discussed in more detail in section 1.3.
1.1.3	Waterborne diseases in New Zealand
Sufficient waterborne disease outbreaks have been reported in New Zealand to indicate that
there is a significant risk of contracting gastrointestinal disease from drinking-water that is
untreated or inadequately treated (Ball 2006).
Table 1.1 shows the numbers of enteric disease outbreaks reported in New Zealand during the
2001–2007 period, the cases associated with them, and the subset attributed to waterborne
transmission. Table 1.2 shows that potentially waterborne disease represents about three
quarters (65 to 85 percent) of all reported notifiable disease, and that campylobacteriosis is
consistently prominent in these statistics. Salmonellosis, cryptosporidiosis and giardiasis are
also common. Another common waterborne disease-causing organism is norovirus, but
norovirus illness is not notifiable.
Table 1.1: Waterborne enteric disease – cases associated with outbreaks 1

Year   Total enteric outbreaks   Waterborne outbreaks   Enteric outbreak cases   Waterborne outbreak cases
2001            369                      22                     2095                      370
2002            317                       6                     2783                       18
2003            332                       7                     2603                       36
2004            314                      22                     3974                      116
2005            338                      27                     2264                      184
2006            481                      18                     6162                      284
2007            472                      15                     7821                      205
Note that the actual rate of potentially waterborne illnesses is rather higher than shown in
Table 1.1 because:
•	not all people who become ill are accounted for in the notifiable disease statistics; studies (eg, Wheeler et al 1999) suggest that only a minority get reported. For an illness to be recorded, the ill person must go to a doctor, who must authorise appropriate specimens to be competently examined; if the results are positive, they must then be reported to the medical officer of health and entered into the official statistics
•	many other potentially waterborne illnesses are not notifiable, eg, norovirus illness and sporadic general gastrointestinal illness.
The term ‘potentially waterborne’ does not necessarily mean that such an illness actually is
waterborne; the majority of gastrointestinal infections appear to be associated with food, and
person-to-person contact (Reilly et al 2004), and of those that are waterborne, only a proportion
could be associated with drinking-water (recreational water and shellfish consumption are the
other water routes). But this is not cause for complacency for at least four reasons:
1	Tables 1.1 and 1.2: Ball (2006), material from ESR reports; see www.surv.esr.cri.nz and select surveillance reports.
Table 1.2: Reported rates of potentially waterborne notifiable diseases, 1999–2007

Notifiable disease                Rate (cases per 100,000 people per annum)
                                  1999 a   2000 a   2001 b   2002 b
Campylobacteriosis                 225.6    232.5    271.5    334.2
Cryptosporidiosis                   27.0     21.4     32.3     26.1
Giardiasis                          49.6     46.6     42.9     41.4
Legionellosis                        1.9      1.9      1.2      1.4
Leptospirosis                        1.6      2.8      2.8      3.8
Salmonellosis                       57.4     49.9     64.7     50.0
Typhoid                              0.2      0.6      0.7      0.6
VTEC/STEC c                          1.8      1.9      2.0      2.0
Yersiniosis                         13.9     10.9     11.5     12.7
Total (potentially waterborne)     379.0    368.5    429.6    472.2
Total (all sources)                501.7    560.7    545.0    577.9
% Potentially waterborne            75.5     65.7     78.8     81.7

Notifiable disease                Rate (cases per 100,000 people per annum)
                                  2003     2004     2005     2006     2007
Campylobacteriosis                 395.7    326.8    370.3    379.3    302.2
Cryptosporidiosis                   21.9     16.4     23.8     17.6     21.9
Giardiasis                          42.0     40.5     32.9     29.0     33.1
Legionellosis                        2.1      1.7      2.3      1.2      1.6
Leptospirosis                        3.0      2.8      2.3      2.1      1.6
Salmonellosis                       37.5     28.9     37.0     31.9     30.1
Typhoid                              0.5      0.8      0.8      1.0      1.1
VTEC/STEC c                          2.8      2.4      2.5      2.1      2.4
Yersiniosis                         11.7     11.2     10.9     11.6     12.5
Total (potentially waterborne)     517.2    429.5    482.8    475.8    406.5
Total (all sources)                609.5    598.8    605.8    555.5    459.1
% Potentially waterborne            84.9     71.7     79.7     85.7     88.5

a	ESR 2001
b	Sneyd and Baker 2003
c	Verocytotoxin (Shiga toxin)-producing E. coli
Table 1.3 lists the waterborne outbreaks that have come to the attention of the Ministry of
Health since the mid-1980s.
Table 1.3: Waterborne outbreaks in New Zealand, 1984–2006 2

Incident                        Causal agent                      Cases     Reference
Queenstown, 1984                Unknown (sewage)                  (3500)    Thorstensen 1985
Ashburton, 1986                 Campylobacter                     19        Brieseman 1987
Canterbury, 1990                Campylobacter                     42        Stehr-Green et al 1991
Havelock North, 1991            Campylobacter                     12        M Hart, Health Care Hawkes Bay, pers comm
Northland, 1992                 Hepatitis A virus                 30        Calder and Collison 1992
Lonsdale Park, Northland, 1992  Campylobacter                     14        Jarman and Hennevald 1993
Waimate, 1992                   Campylobacter                     ?         R Parr, Crown Public Health, Timaru, pers comm
Dunedin                         Giardia                           50        Fraser et al 1991
Hawkes Bay, 1992                Campylobacter                     97        CDNZ 92(1):11–12
Auckland, 1993                  Giardia                           34        Thornton et al 1993
Raurimu, 1994                   Campylobacter                     16        D Vince, Ruapehu District Council, pers comm
Fairlie, 1994                   Campylobacter                     6         R Parr, Crown Public Health, Timaru, pers comm
Holiday camp, 1995              Gastroenteritis                   ca 100    A Bichan, Hutt Valley Health, pers comm
Ashburton, 1996                 Campylobacter                     19 (33)   Holmes 1996; Lees 1996; R Parr, Crown Public Health, Timaru, pers comm
Mt Hutt, 1996                   Norovirus                         59        Brieseman et al 2000
Auckland, 1996                  Salmonella typhimurium            2         Simmons and Smith 1997
Mt Arthur, 1996                 Suspected viral gastroenteritis   6         M Molloy, Nelson–Marlborough Health, pers comm
Denniston, 1996                 Giardia                           4         C Bergin, Crown Public Health, pers comm
Wainui, 1997                    Campylobacter                     6 (67)    Bohmer 1997
Waikato district, 1997          Cryptosporidium                   9 (170)   D Sinclair, MOH, Health Waikato, pers comm
Tauranga district, 1997         Cryptosporidium                   ?         TM Fowles, East Bay Health, pers comm
Te Aute College, 2001           Campylobacter                     137       Inkson 2002
Banks Peninsula, 2004           Shigella                          5 (18)    Morrison and Smith 2005
Camp near Nelson, 2004          Campylobacter                     3 (13)    Todd 2005
Cardrona skifield, 2006         Norovirus                         218       D Bell, MOH, Public Health South, pers comm
First, the majority of the disease burden occurs in sporadic or endemic cases, not in outbreaks 2
(Figure 1.1 demonstrates the distinction between these terms). Therefore, given the publicity
that outbreaks attract, the sporadic disease prevalence may be underestimated. To elaborate,
some illnesses may often occur in outbreaks (eg, cryptosporidiosis) and so can receive a lot of
publicity (eg, Baker et al 1998 and associated news items). Indeed there is a whole book devoted
to analysis of outbreaks attributed to poor drinking-water supplies (Hrudey and Hrudey 2004),
including a New Zealand water supply example. 3 However, other illnesses, such as
campylobacteriosis, are usually less associated with outbreaks (although these can happen,
2	For reporting purposes the outbreak case definition is “two or more cases linked to a common source” (ESR 2002, Disease Outbreak Manual, p 2; download from www.surv.esr.cri.nz). The sensitivity of surveillance for diseases will often be less, particularly for common enteric diseases where only a small proportion of those infected will advise health officials, thereby reducing the chances of identifying a common source.
3	Outbreaks of cryptosporidiosis in Milwaukee (USA) and pathogenic E. coli in Walkerton (Canada) are the most recent serious examples, where numbers of people died and others suffered life-long health impairment (usually renal failure).
particularly when water treatment systems are not operated well). 4 Health scientists are now in
broad agreement that outbreaks form only a minor part of the total drinking-water related
illness burden. For example, Dr Jamie Bartram of WHO, in introducing a section on
Investigation of Sporadic Waterborne Disease in an authoritative text on drinking-water and
disease (Hunter et al 2003), states that “a large proportion, and probably the vast majority, of
waterborne disease burden arises outside of detected outbreaks. This statement contrasts with
the view, predominant until only a few years ago and still periodically heard, that the failure to
detect outbreaks of waterborne disease illustrates that this route of disease transmission is
largely conquered in industrialised countries” (Bartram 2003).
Second, to identify the endemic and sporadic cases, special epidemiological investigations must
be conducted to see whether those cases are associated with drinking-water. Because of the cost
and resources required, such studies are seldom undertaken. When they are performed, an
association with the degree of drinking-water treatment is often identified.
This has been found both overseas (Payment 2003, Hunter et al 2003) and in New Zealand. The
New Zealand studies include giardiasis in a city in which two water supplies drawn from the
same source received different levels of treatment (Dunedin, Fraser and Cooke 1991);
campylobacteriosis in a number of rural water supplies (Eberhardt-Phillips et al 1997) and a
Hawkes Bay college; cryptosporidiosis in many communities (Duncanson et al 2000); and
microbial and chemical contamination of roof-collected rainwater supplies in Auckland, and
associated illnesses (Simmons et al 2001).
Third, there is a substantial level of faecal contamination of New Zealand freshwaters, including
Campylobacter, enteroviruses and adenoviruses, even at recreational and water supply
abstraction sites (McBride et al 2002). Human and livestock wastes may contain large numbers
of pathogens that can present a major threat to public health if released into the environment
and result in substantial health costs. Numerous waterborne outbreaks of infectious enteric
diseases worldwide have been associated with discharges of effluent and agricultural runoff
resulting in human exposure to faecally contaminated water. Fortunately, the larger New
Zealand drinking-water supplies are, in general, well managed. This has minimised the potential disease
level that could have been expected from the level of microbiological contamination of the
source waters. Cysts and oocysts of protozoan parasites such as Giardia and Cryptosporidium
are frequently found in environmental waters especially in areas of intensive livestock farming
(Ionas et al 1999). In the UK, significant costs ranging from £15 to 30 million per annum have
been estimated (Pretty et al 2000) to result from the agricultural contamination of drinkingwater with zoonoses (diseases transmitted from animals to humans) such as Cryptosporidium.
Finally, it should be noted that attaining a high standard of water treatment and reticulation is
health-protective. Careful and extensive examination of New Zealand drinking waters has failed
to find any trace of Campylobacter in well-treated and disinfected drinking-water supplies
although it is present in almost all riverine source water. 5
4	New Zealand examples include: Queenstown (Thorstensen 1985), Canterbury (Brieseman 1987; Stehr-Green et al 1991) and Hawkes Bay (McElnay and Inkson 2002). The 1984 Queenstown outbreak affected an estimated 3500 people, and at least one person affected has since required continual kidney dialysis. For further details see Taylor and Ball 2004: www.nzwwa.org.nz/projects/moh/pubconsultation04/Why do we need safe water 2a.pdf
5	A preliminary study occasionally found small concentrations of Campylobacter in finished, well-treated New Zealand water supplies (Savill et al 2001), but a subsequent full-scale study, using altered laboratory procedures, failed to repeat that finding (Nokes et al 2004).
Relevant terminology
Outbreak is a term used in epidemiology to describe an occurrence of disease greater than
would otherwise be expected in a particular time and place. It may affect a small and localised
group or impact upon thousands of people across an entire continent. Two linked events of a
rare infectious disease may be sufficient to constitute an outbreak. The outbreak detection level
is determined by epidemiologists on the basis of their knowledge of the disease under
consideration. Outbreaks may also refer to epidemics, which affect a region in a country or a
group of countries, or pandemics, which describe global disease outbreaks.
The disease rate is often reported as the number of cases per 100,000 people per annum.
Sometimes disease burden is used; this incorporates the duration of the illness and gives a
better idea of how serious the outbreak may be. For example, a norovirus infection may last one
or two days, but cryptosporidiosis may persist for more than two weeks.
1	Sporadic disease: a disease that occurs only occasionally in a population (ie, prevalence is usually zero).
2	Endemic disease: a disease that is always present in a population (ie, never zero prevalence).
3	Epidemic disease: a disease that many people acquire over a short period (ie, increasing incidence).
Figure 1.1 (amended from Frost et al 2003 and Craun et al 2004) illustrates the difference
between outbreaks of disease and endemic or sporadic disease occurrences.
Figure 1.1: Epidemic versus endemic/sporadic disease
Following recent outbreaks in Canada (eg, Walkerton), Schuster et al (2005) analysed
288 outbreaks in Canada between 1974 and 2001. They found 99 outbreaks (34 percent)
occurred in areas served by public supplies, 138 (48 percent) in semi-public supplies (private
supplies but providing drinking-water to the public), and 51 (18 percent) in private supplies. The
causative organisms, in descending frequency of occurrence, were: Giardia, Campylobacter,
Cryptosporidium, Norwalk-type viruses, Salmonella, and hepatitis A virus.
1.1.4	The cost of providing safe drinking-water
Water that complies with the DWSNZ should be safe to drink. Since 1995 a number of attempts
have been made to determine how much it will cost to upgrade all NZ drinking-water supplies to
enable them to comply with the DWSNZ.
Two classes of expenditure must be considered:
•	CAPEX – the capital expenditure required to provide treatment facilities that are capable of delivering compliant water
•	OPEX – the cost of operating the water supply system and monitoring its performance. Note that this should (but doesn’t always!) include maintenance.
Prior to 2000, Local Government NZ (LGNZ) made attempts to estimate these costs for local
authority operated supplies, but found it difficult to obtain adequate data.
Prior to the passage of the Health (Drinking Water) Amendment Act in 2007 the Ministry of
Health sponsored two major studies on the costs of providing treatment plants capable of
providing safe drinking-water:
•	in 2001 Beca Steven estimated the costs as $269.50 to $290.40 million (Beca 2001)
•	in 2004 Roseveare and Yeabsley of OMS (OMS 2004) estimated the:
	–	CAPEX as $329.8 million
	–	OPEX as $4.3 million per year.
The authors noted the wide error band of the estimates. The uncertainties arise from:
•	the unknown number of small water supplies that are not operated by local authorities and are not on the Register. Some 1500 of these were registered, but many more were thought to exist
•	the variable standard of the existing facilities. Although many were well maintained and serviceable, a significant number were inadequate or very poorly maintained and would be expensive to bring up to a serviceable standard.
It is interesting to note that the US 1996 SDWA Amendments mandated that information about
treatment technology performance and affordability be developed for small systems (<10,000
population). Affordability criteria (for the annual cost of drinking-water) are based on a
threshold of 2.5 percent of the median household income (quoted from USEPA 2003).
The OECD (2011) notes that the full magnitude of the benefits of water services is seldom
considered for a number of reasons. Non-economic benefits that are difficult to quantify but that
are of high value to the concerned individuals and society, ie, non-use values, dignity, social
status, cleanliness and overall wellbeing, are frequently under-estimated.
1.1.5	The benefits of safe drinking-water supplies
OMS (2004) estimated the direct annual benefit of avoiding waterborne disease in New Zealand
at $13 million to $37 million a year (assessed as costs foregone), based on the 18,000 notified
cases of waterborne enteric disease in 1999, the last full year of data available at the time.
The data from 1986 to 2004 were later systematically reviewed by Ball of ESR in 2006. Ball
concluded that the number of cases per year was at least 34,000, which would give a much
higher benefit than that calculated by OMS.
OMS noted that an uncertain level of additional benefits arose from:
•	protecting the sanctity of the public drinking-water infrastructure (analogous to the sanctity of the blood bank)
•	equality of access to a basic human right/need
•	maintaining the ‘New Zealand brand’ in terms of our clean, green and secure environment in the eyes of the overseas community for food exports, and as a destination for immigration and tourism
•	the time-savings associated with better access to water and sanitation facilities, the gain in productive time due to less time spent being ill, the health sector and patients’ costs saved due to less treatment of diarrhoeal diseases, and the value of prevented deaths.
Table 1.4: Economic benefits arising from water and sanitation improvements

(Columns: direct economic benefits of avoiding diarrhoeal disease; indirect economic benefits related to health improvement; non-health benefits related to water and sanitation improvement.)

Health sector
	Direct: less expenditure on treatment of diarrhoeal disease
	Indirect: value of fewer health workers falling sick with diarrhoea
	Non-health: more efficiently managed water resources and effects on vector bionomics

Patients
	Direct: less expenditure on treatment of diarrhoeal disease and fewer related costs; less expenditure on transport in seeking treatment; less time lost in seeking treatment
	Indirect: value of avoided days lost at work or at school; value of avoided time lost of parent or carer of sick children; value of deaths avoided
	Non-health: more efficiently managed water resources and effects on vector bionomics

Consumers
	Non-health: time savings related to water collection or accessing sanitary facilities; labour saving devices in the household; switch away from more expensive water sources; property value rise; leisure activities and non-use value

Agricultural and industrial sectors
	Direct: less expenditure on treatment of employees with diarrhoeal disease
	Indirect: less impact on productivity of ill-health of workers
	Non-health: benefits to agriculture and industry of improved water supply; more efficient management of water resources – time saving or income generating technologies and land use changes
Hutton and Haller (2004) summarised for WHO the benefits of safe drinking-water supplies
(see Table 1.4), noting that the intangible and unforeseen benefits often outweigh the direct
benefits of disease reduction.
In another New Zealand study the Wellington Regional Council estimated the cost of
waterborne illness at $5000 per affected household (WRC 1998), based on a household size of
three persons and a period of illness of 2.5 weeks. Using Hamilton as an example (160,000
people), the cost of a cryptosporidiosis outbreak in drinking-water could be estimated at about
$109 million, comprising $80 million in cost of illness if 30 percent of the city were affected,
and $28.8 million due to a 0.025 percent mortality rate amongst those infected (12 people).
Applying US figures, the cost of averting behaviour (hauling safe water, boiling water and/or
purchasing bottled water) as a result of an outbreak of waterborne disease, again relating to
Hamilton, is estimated at $14.8 to $46.8 million per month. These figures derive from
Harrington, Krupnick and Spofford (1985), who surveyed a community in Pennsylvania, USA,
that experienced a giardiasis outbreak in 1983. Averting behaviour expenditures were estimated
at US$153 to $483 per household per month (NZ$278 to $878 at an exchange rate of
NZ$/US$0.55), applied to the 53,300 households in Hamilton.
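The arithmetic behind the Hamilton estimates above can be checked with a short script. All inputs are taken from the text; the NZ$2.4 million value per fatality is not stated in the text and is inferred here by dividing the quoted $28.8 million by the 12 deaths.

```python
# Cost-of-illness estimate for a waterborne cryptosporidiosis outbreak in a
# city of Hamilton's size (160,000 people). Inputs from the text above; the
# NZ$2.4 million per death is an inferred assumption (28.8m / 12 deaths).
population = 160_000
household_size = 3
cost_per_household = 5_000            # NZ$ per affected household (WRC 1998)

cases = population * 30 // 100        # 30 percent attack rate
illness_cost = cases // household_size * cost_per_household
print(illness_cost / 1e6)             # 80.0 (million NZ$)

deaths = cases * 25 // 100_000        # 0.025 percent mortality rate
mortality_cost = deaths * 2_400_000   # inferred value per death
print(deaths, mortality_cost / 1e6)   # 12 28.8

# Averting-behaviour expenditure (Harrington, Krupnick and Spofford 1985):
# US$153 to $483 per household per month, converted at NZ$/US$0.55.
households = 53_300
for usd in (153, 483):
    nzd = usd / 0.55
    print(round(households * nzd / 1e6, 1))  # 14.8 then 46.8 (million NZ$/month)
```

Reassuringly, the components reproduce the quoted figures: $80 million plus $28.8 million gives the roughly $109 million total, and the household conversion reproduces the $14.8 to $46.8 million monthly range.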
A further saving arises because consumers do not need to install any supplementary treatment
device such as point-of-use (POU) equipment in any New Zealand water supply that complies
with the DWSNZ and has a good Grading. Consumers who choose to install such equipment
need to be careful that they do not introduce health concerns through improper use or
maintenance, such as allowing bacteria to grow in the equipment that removes chlorine. See
Chapter 19: Small, Individual and Roof Water Supplies, section 19.3.4 for further information.
Further savings arise because the purchase of bottled water is unnecessary. New Zealanders
spent $26 million on bottled water in 2004, including coolers (G Hall, Corporate Water Brands,
pers comm). At least some of this can be taken as the cost of a lack of confidence in public water
supplies.
LECG (2010) concluded a net economic benefit to New Zealand of $134 million would be
achieved by requiring large water suppliers to comply with both the bacteriological and
protozoal determinands; a net economic benefit is also expected if medium supplies comply
with both. The economic benefits to New Zealand for minor, small and neighbourhood supplies
complying with both would need to be considered on a case by case basis. The cost of
compliance was estimated by CH2M Beca (2010).
Dupont and Jahan (2012) examined factors that explained consumer spending on tap water
substitutes using information from a national survey undertaken with a representative set of
Canadian respondents. They developed a model to predict the percentage of households that
undertook such spending for the purpose of reducing perceived health risks from tap water
consumption. Using results from the model they estimated the magnitude of defensive
expenditures to be over half a billion dollars (2010 US$) per year for Canada as a whole. This is
equivalent to approximately $48 per household per year or about $19 per person per year.
Residents of Ontario, the province in which an Escherichia coli incident took place in 2000, had
the highest willingness-to-pay of approximately $60 per household per year.
1.2	Ministry of Health public health protection strategy for drinking-water 1993–1995
1.2.1	Strategy development
The purpose of the Ministry of Health (MoH) strategy for drinking-water, formally adopted by
the Director-General of Health in 1995, was to develop and apply the necessary tools for the
implementation of the policy for drinking-water management.
From 1993 to 1995 the MoH developed an initial strategy to protect the public health safety of
drinking-water. The goal of the MoH drinking-water policy was to achieve a high standard of
drinking-water quality and management in New Zealand by promoting the understanding and
application of the principles of public health safety by drinking-water suppliers and the general
public. The development and implementation of this strategy are outlined below.
To develop its public health protection strategy for drinking-water, the MoH:
•	developed goals and objectives for public health protection of drinking-water
•	assessed whether the desired goals and objectives were realistic and achievable
•	assessed the tools available for implementing the strategy:
	–	which tools were currently in use?
	–	what new tools were required?
	–	would they work?
	–	what were their strengths and weaknesses?
	–	could their performance be enhanced by designing them to work synergistically?
	–	was there statutory authorisation for their use?
•	planned an integrated strategy by:
	–	deciding which tools were needed to provide adequate protection for drinking-water quality
	–	ascertaining how to design the tools to ensure that they reinforced one another in use
	–	preparing a schedule of objectives for developing and implementing the tools.
The programme was designed to get the already-existing tools into operation as soon as possible
and concurrently to:
•	develop an overall strategy and a notional timetable for implementation
•	redesign the already-existing tools to work better together to achieve the desired goals
•	develop new tools as required.
1.2.2	Outline of the strategy
The Ministry of Health’s drinking-water strategy from 1993 to 1995 involved:
1	development of performance standards on all aspects of management of drinking-water quality
2	development of standards of competence for health officers and Medical Officers of Health working with drinking-water in order to achieve consistent national standards
3	development and application of an integrated set of tools for promoting the provision of safe, wholesome drinking-water supplies
4	provision of information to the public on public health issues concerning drinking-water and the quality of community drinking-water supplies
5	promotion of public health issues concerning drinking-water to the public
6	promotion of self-management by the water supply industry
7	promotion of the use of quality assurance management techniques by water supply authorities, including adequate documentation of management procedures, monitoring procedures and contingency plans
8	provision of the electronic database Water Information New Zealand (WINZ) to record all aspects of drinking-water supply performance and enable the assessment and reporting of improvement in performance
9	preparing for consultation on legislation to strengthen the implementation of the strategy
10	implementation of the overall strategy by a ‘ratchet’ process that improves performance in ‘digestible’ steps, using progress in one area to facilitate advance in another where progress had previously been difficult.
1.2.3	Planned milestones
•	Obtain a clear understanding, by the end of 1996, of who will take responsibility for each of the various categories of community drinking-water supplies.
•	Achieve implementation, by the end of 1997, of a programme of self-monitoring by the water suppliers, audited by health agencies, in 95 percent of all community drinking-water supplies.
•	Achieve informed community discussion and decision-making on public health safety issues on drinking-water by the end of 1997.
•	Achieve safe drinking-water in:
	–	95 percent of all large community drinking-water supplies (large = over 500 population supplied) by the end of 1996
	–	90 percent of all small community drinking-water supplies (small = 25 to 500 population supplied) by the end of 1998.
These targets are now for population rather than supplies.
1.2.4	Desired outcomes
•	Adequate and effective monitoring of the quality of drinking-water supplies by suppliers.
•	Sufficient knowledge and awareness of public health issues for the public to enable their effective participation in decision-making about public health issues relating to community drinking-water supplies.
•	Competent health officers assessing the quality of water supplies to a consistent standard throughout the country.
•	Improved public health safety standards for community and private drinking-water supplies.
•	An effective statutory basis for the public health protection of drinking-water supplies.
1.2.5	Promotion of awareness of public health issues related to drinking-water
This was achieved by the publication of reports on the public health grading of community
drinking-water supplies and on the presence of determinands of public health concern
(Priority 1 and 2 determinands) found in the supplies. Publication commenced in the Register of
Community Drinking-water Supplies in New Zealand in 1993 and immediately attracted media
attention.
Copies of the Drinking-water Standards for New Zealand 1995, the Guidelines for
Drinking-water Quality Management in New Zealand 1995 and the Register of Community
Drinking-water Supplies in New Zealand (originally published approximately twice a year)
were placed in every public library in the country to ensure that authoritative information on
drinking-water quality management was freely available.
1.2.6	Outputs achieved by 1995
A number of the planned outputs were achieved or were well advanced in the first three years.
•	Publication of the public health grading, together with drinking-water sources, of treatment plants and distribution zones commenced at the end of 1993.
•	The 1984 Drinking-water Standards for NZ were revised by the end of 1994 and published in 1995.
•	Most large community drinking-water supply treatment plants had been graded by the end of 1994.
•	Standards for drinking-water data and data transfer were in place by mid-1994.
•	New Regulations for drinking-water quality management had not been achieved by 1995, but a discussion paper on the need for drinking-water legislation was in preparation.
Guidelines for Drinking-water Quality Management for New Zealand 2013
13
1.3
Strategy development 1995–2000
1.3.1
Consultation
In 1995 a public discussion paper 6 on the introduction of the Ministry’s 1994 policy was published
and public meetings on it were held in the four main centres in association with the NZWWA.
Based on the feedback from the 1995 paper a further discussion paper 7 was produced in 1998
which reviewed all New Zealand legislation relating to drinking-water and outlined options for
consolidating the legislation. This underwent consultation similar to that held on the 1995
paper, with similar results.
1.3.2
Inclusion of protozoa in the DWSNZ
Prior to 1990, the focus of waterborne disease prevention was on bacterial pathogens, and much work was in progress to determine whether faecal coliforms (thermotolerant coliforms), enterococci or Clostridium spp. were the best indicators of the presence of faecal contamination.
Work was also in progress to distinguish between organisms of human and animal origin
because it was thought that animal organisms would not be pathogenic to humans. In the early
1990s it became evident that Giardia was a significant emerging cause of waterborne disease in New
Zealand. Consequently public health management of Giardia and Cryptosporidium was
addressed in the 1995 and 2000 DWSNZ. Because of the difficulty of monitoring these protozoa
directly, control in the 1995 and 2000 standards was by specifying turbidity (as a surrogate for
particle counts) and filter pore size.
6 Drinking-water public health issues – a public discussion paper. MoH, Wellington, March 1995. 67 submissions
were received representing the views of 101 agencies and groups. There was unanimous agreement that:
1.01
safe drinking-water is a key requirement for public health
1.02
all persons should have the right to expect that any water which they draw from a tap was safe to drink
unless they were specifically informed to the contrary. This was considered especially important in
premises handling food and to be particularly important for tourism, and travellers. It was considered that
there was a need for signs in hotels, camps, farmstays etc where the water does not meet the Standards
1.03
the legislation relating to drinking-water is outdated, fragmented, inadequate and in need of revision,
integration and cross-referencing. Over 36 Acts and Regulations are involved. In these, reference is made
to ‘acceptable’, ‘pure’, ‘wholesome’, ‘potable’, or ‘safe’, water, etc; what these terms mean is rarely defined.
All submissions on this subject recommended that the definitions of these terms should be standardised
throughout the legislation
1.04
any legislative revision should remove gaps, produce consistency and remove conflict between the various
Acts, Regulations and Bylaws which relate or refer to drinking-water quality, especially the Local
Government Act 1974, the Rating Powers Act 1988, the Health Act 1956, the Resource Management Act
1991, the Water Supplies Protection Regulations 1961, the Building Act 1991 and the model Bylaws
1.05
compliance mechanisms and penalties should be effective and of equivalent severity to those in the
Resource Management Act 1991 and the Health and Safety in Employment Act 1992
1.06
the Ministry of Health should be the lead organisation in the national management of drinking-water
quality, with territorial authorities having a key role in the enhancement of drinking-water quality control
1.07
the Ministry of Health should be statutorily empowered to promulgate drinking-water standards
1.08
the Ministry of Health should be statutorily empowered to carry out the public health grading of drinking-water supplies
1.09
the respective roles, relationships and responsibilities of the various agencies and statutory officers
involved in drinking-water quality management (principally the designated officers of the Ministry of
Health and the TA’s officers concerned with each of the supply, regulation and planning functions) need to
be more clearly defined in the legislation. This includes defining the responsibility for each of water supply
provision, monitoring, surveillance, audit etc
1.10
the public has a right to know about the quality and safety of drinking-water supplies and all information
about these should be publicly available.
7 Review of the Water Supplies Protection Regulations 1961 (Review of Regulations made under the Health Act
1956) – a discussion document. MoH (Wellington), February 1998.
1.3.3
Human resource development
Assessment of the results of the first round of drinking-water supply grading carried out after
public health grading was introduced in 1993 showed that the level of competence of the DHB
health protection officers in assessing the performance of drinking-water supplies was very
uneven. For effective administration of drinking-water quality management legislation it would
be essential that the assessment process is carried out to a consistently high level of competence.
To prepare for the needs of the proposed legislation, the Ministry sponsored the establishment
of a NZQA diploma in drinking-water assessment to provide appropriate training for HPOs in
drinking-water supply management and operation, complemented by training in the legal
requirements relating to their water supply duties. In addition, arrangements were made for the
water HPOs to obtain IANZ accreditation to ISO/IEC 17020 specifications for inspection and
assessment. Drinking-water assessment units (DWAUs) were set up in the participating DHBs. These were accredited by IANZ as inspection bodies. The officers who were to be appointed as DWAs after the legislation was promulgated were designated as approved signatories for the DWAUs and authorised to use the IANZ logo on reports on the water supply assessments that they had been accredited to perform.
1.3.4
DWSNZ 2000
The 1995 DWSNZ were revised and finally re-issued in 2000 with the assistance of the Expert
Committee on Drinking-water Quality. The main changes involved:
•
replacing faecal coliforms with E. coli as the indicator of faecal contamination
•
including Cryptosporidium in the protozoal compliance section. In the decade after 1995 the
understanding of the public health importance of protozoa in drinking-water increased
rapidly and the importance of Cryptosporidium as a major new waterborne pathogen that
was resistant to conventional disinfection procedures or practices rapidly overtook that of
Giardia. The scientific understanding of Cryptosporidium management advanced rapidly, and by 2000 it became necessary to update the 1995 DWSNZ to incorporate
the new knowledge. Cryptosporidium was selected as the representative protozoan because it
is the most difficult to control in drinking-water
•
introducing monitoring requirements for ozone and chlorine dioxide disinfection to meet the
protozoal requirements, and removing the C.t tables for Giardia inactivation using chlorine
•
the use of Bayesian statistics to guide the derivation of monitoring frequencies and
acceptable transgression rates
•
updating the MAVs based on the 1998 revision of the 2nd edition of the WHO Guidelines for
Drinking-water Quality
•
including PMAVs for cyanotoxins
1.4
Strategy development 2000–2005
1.4.1
Consultation
In 2000 consultation was held in conjunction with NZWWA on the procedures that had been
developed by ESR for the Ministry on Public Health Risk Management Plans (PHRMP) for
drinking-water supplies and on the need to update the Public Health Grading protocols. The
philosophy behind the PHRMP was generally accepted. Proposals were made to include the
PHRMP as part of the grading process, but it was decided to hold this over until there was more
experience with the use of PHRMPs.
1.4.2
Public health risk management plans
The limitations of the historical approach to drinking-water quality management by the quality
control (QC) procedure of assessing compliance with a product quality standard (the DWSNZ)
had become evident by 2000.
Although the QC approach established whether the drinking-water quality targets had been met,
this occurred only after the event. Also, because bacterial tests took two days to complete,
identification of a contamination event did not occur until two days after the event. The use of a
public health risk management plan (PHRMP) for a water supply was seen as a way of
introducing quality assurance (QA) procedures into drinking-water quality management. The
publication of the NZ guidelines on the preparation of PHRMPs (Ministry of Health 2001) was
followed by the publication of Chapter 4 on Water Safety Plans in the WHO Guidelines on
Drinking-water Quality (3rd edition, 2004). Following the WHO publication the use of
PHRMPs for drinking-water supplies (called water safety plans by WHO) has become an
internationally accepted procedure. WHO has used New Zealand DWAs to provide training in
water safety plans to the Pacific Island countries.
The stages in the development of a PHRMP are:
•
identification of what is intended to be achieved (the target, eg, compliance with the DWSNZ)
•
identification of the factors that could impede the achievement of the target (the risks). This
includes identification of the financial and technical resources required to achieve the target,
both the set-up and ongoing operational requirements
•
identification of the ways in which the risks could be overcome (managed). This includes
identification of the necessary financial and technical resources
•
identification of the relative magnitude of the risks and ranking of the priorities for dealing
with the risks, taking into account their importance and the relative ease of management
•
development of contingency plans for managing unusual but critical perturbations of normal
operation (eg, floods, droughts, power cuts, accidents to key personnel)
•
completion of a schedule for managing the risks (a three- to five-year timetable)
•
implementation of the PHRMP
•
monitoring and review of the PHRMP implementation performance, with revision if necessary.
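The risk-identification and ranking stages above can be sketched as a simple risk register. This is an illustration only: the Risk structure, the event names and the 1–5 likelihood/consequence/ease scores are assumptions for the example, not part of the Ministry's PHRMP methodology.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    event: str
    likelihood: int          # 1 (rare) .. 5 (almost certain) - illustrative scale
    consequence: int         # 1 (minor) .. 5 (catastrophic) - illustrative scale
    ease_of_management: int  # 1 (hard to manage) .. 5 (easy to manage)

    @property
    def magnitude(self) -> int:
        """Relative magnitude of the risk: likelihood times consequence."""
        return self.likelihood * self.consequence

# Hypothetical risks identified for a small surface-water supply
risks = [
    Risk("Heavy rain raises source-water turbidity", 4, 4, 3),
    Risk("Chlorine dosing pump failure", 2, 5, 4),
    Risk("Backflow from an unprotected connection", 2, 3, 2),
]

# Rank priorities: largest magnitude first, ties broken in favour of
# risks that are easier to manage (quick wins)
ranked = sorted(risks, key=lambda r: (-r.magnitude, -r.ease_of_management))
for r in ranked:
    print(f"{r.magnitude:2d}  {r.event}")
```

The ranking reflects the stage above that weighs both the importance of each risk and the relative ease of managing it.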
The public health risk management process can form the basis of a complete drinking-water
quality management programme.
The steps along the pathway of using the PHRMP as the basis of a programme for achieving the
target of a supply that delivers an adequate volume of water that is safe to drink are:
1
completion of a PHRMP for the supply
2
optimisation of the operation and management of the existing supply process
3
establishment of a programme for monitoring the performance of the water supply system
to verify progress
4
preparation of an improvement schedule for the supply, based on the information in the
PHRMP
5
preparation of a design report for upgrading the supply, if the target cannot be achieved
(an engineer’s report).
1.4.3
Legislative development
Building on the recommendations from the 1995 and 1998 public discussions on proposals for
strengthening the drinking-water quality management sections of the Health Act 1956, Cabinet
instructed the Ministry of Health in November 2000 to prepare a Health Act Amendment Bill
which would provide a statutory framework for the non-regulatory interventions that were
currently operating. The amendment was to strengthen and improve the existing legislation by:
1
providing assurance that a sector, with assets measured in the billions of dollars and
which is fundamental to economic development (including the tourism sector), would be
adequately managed
2
assisting local government to discharge their statutory duty “to improve, promote and
protect public health”
3
placing duties on drinking-water suppliers to take all practicable steps 8 to comply with the
drinking-water standards (and various other duties and powers ancillary to that)
4
providing a statutory framework for the promulgation of drinking-water standards
5
putting duties on the general public not to contaminate drinking-water supplies
6
requiring drinking-water suppliers to introduce and implement public health risk
management plans
7
providing for officers designated by the Ministry to act as assessors to verify:
•
compliance with the DWSNZ
•
the standard and implementation of public health risk management plans
•
the competence of water supply staff carrying out process and field analyses
8
requiring designated assessors to have their competence accredited by an internationally
recognised conformance accreditation agency
9
providing for appropriate record-keeping and publication of information about the
compliance of the supply with the Act.
8 All practicable steps, in relation to the achievement of any particular result, means all steps to achieve that result
that it is reasonably practicable to take in the circumstances, having regard to the:
a)
nature and severity of the harm that may be suffered if the result is not achieved, and
b)
current state of knowledge about the likelihood that harm of that nature and severity will be suffered if the
result is not achieved, and
c)
current state of knowledge about harm of that nature, and
d)
current state of knowledge about the means available to achieve that result, and about the likely efficacy of
each, and availability and cost of each of those means.
1.4.4
Development of the DWSNZ 2005
Sub-sections 1.4.4.1 to 1.4.4.5 discuss the main changes from the 2000 DWSNZ. The DWSNZ
2005 maintained the two principal components:
•
the water quality specification (standard), which defined the Maximum Acceptable Values
(MAVs) at which the risk of disease from drinking the water is negligible. A new concept,
operating requirements, was introduced where monitoring of a MAV was impracticable
•
the compliance specifications, which define the checks (and their frequencies) that are to be
taken to demonstrate compliance with the DWSNZ.
1.4.4.1
Introduction of the Log Credit approach
Because the methods available for identifying Cryptosporidium were not suitable for routine
monitoring, and there appeared to be no suitable indicator organisms, it became necessary to
improve the surrogate methods used to manage the public health risk. This was done by
improving the performance of treatment processes in removing or inactivating
Cryptosporidium oocysts. This stimulated the introduction of quality assurance methodology
for drinking-water supply operational management, which culminated in the development of
Public Health Risk Management Plans.
The risk of infection from drinking-water contaminated by waterborne protozoa is affected by
the:
•
concentration of Cryptosporidium oocysts or other protozoal cysts in the raw water
•
extent to which (oo)cysts are inactivated or removed by the treatment processes.
To take account of the additive effect of a series of treatment processes on the removal of protozoa, log credits are used, with Cryptosporidium as the reference organism. The log credit for a treatment process is the log10 reduction in protozoal concentration the process achieves: a 2-log credit corresponds to 99 percent removal or inactivation, a 3-log credit to 99.9 percent, and so on. The cumulative effect of successive treatment processes can be calculated by adding the log credits of all the qualifying processes in use; removals expressed as percentages cannot simply be added.
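The log credit arithmetic can be shown in a short sketch. The function names are ours, and the 99 percent and 99.9 percent removal figures are example values, not DWSNZ credit allocations.

```python
import math

def log_credit(percent_removed: float) -> float:
    """Log credit for a process removing `percent_removed` % of oocysts.

    The log credit is log10 of the influent/effluent concentration ratio:
    99 % removal gives a 2-log credit, 99.9 % a 3-log credit, and so on.
    """
    return -math.log10(1 - percent_removed / 100)

def combined_percent_removed(log_credits) -> float:
    """Overall % removal for processes in series: the log credits add."""
    total = sum(log_credits)
    return 100 * (1 - 10 ** -total)

# Coagulation/filtration at 99 % removal followed by UV at 99.9 %
# inactivation (illustrative figures):
credits = [log_credit(99), log_credit(99.9)]   # about 2.0 and 3.0 log
print(sum(credits))                            # about 5.0 log in total
print(combined_percent_removed(credits))       # about 99.999 % overall
```

Note that adding the percentages (99 + 99.9) would be meaningless, whereas the log credits combine directly.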
1.4.4.2
UV disinfection
Much work was done internationally on improving the understanding of the use of UV
irradiation for inactivating Cryptosporidium. Originally UV was thought to be ineffective,
because the oocysts appeared visually to be unchanged by exposure to UV. The development of means of measuring the extent of inactivation of the oocysts and their infectiousness, combined with genotyping, led to a significant increase in the understanding of control methods and demonstrated that UV was much more effective than initially thought.
UV will also control bacteria, but, like ozone, leaves no disinfectant residual.
1.4.4.3
Cyanobacteria
Prior to 2000, cyanobacteria were not considered a major problem in New Zealand surface
waters. Outbreaks were few and far between and confined mainly to standing waters such as
ponds and lakes. Since about 2000 the situation has changed and cyanobacteria have become much more prevalent, including a major bloom in the Waikato River. It was also realised that the public health concern was not the cyanobacteria themselves but the cyanotoxins that they produced.
Cyanobacteria produce a range of toxins similar to those found in toxic shellfish. Thus the
control measures had to be based on the management of the toxins at least as much as the
organisms. In addition to the planktonic cyanobacteria in the water mass, pads of benthic
cyanobacteria have also become a problem and have caused a number of dog deaths. It was
considered prudent to develop management techniques including action levels for cyanobacteria
and cyanotoxins in drinking and recreational waters.
1.4.4.4
Small water supplies
Before 2005 the development of drinking-water standards was largely targeted to the
management of water supplies serving populations greater than 500. Annual reviews of
drinking-water quality have shown that the smaller supplies consistently perform less well than
larger supplies. Also, the costs of monitoring water quality are relatively small per head of
population when spread over a large community, but excessive when spread over a small
number of people. Between 2000 and 2005 major consultations and discussions were held to
try to improve the situation for small supplies, so drinking-water standards based on risk management planning rather than formal compliance with water quality standards were developed for use in the 2005 DWSNZ.
1.4.4.5
Tankered drinking-water
In small unreticulated drinking-water supplies, especially ones in which roof water provides a
significant proportion of the available water, it is almost inevitable that some portion of the
water will be provided by tankered supplies.
It was considered necessary to provide for the management of the quality of this tankered
drinking-water because there was anecdotal evidence that some tankered water deliveries were
of dubious quality, due to use of dirty tankers or filling them from other than town supply
hydrants.
1.4.5
Development of the drinking-water assistance
programme
By 2003 it was evident from the annual review of drinking-water quality management that the improvement in water supplies due to the implementation of the 1993 policies had passed the point of diminishing returns and reached a plateau. Larger supplies were substantially
complying with the DWSNZ, but a number of the smaller supplies did not comply.
To ascertain the reasons for the non-performance of the smaller supplies, the Ministry sponsored surveys of some 120 smaller supplies to determine what these suppliers considered to be the major impediments to the improvement of their performance. 9 This was supplemented
by sixteen regional public meetings from Whangarei to Invercargill. 10 The principal impediment
to compliance with the proposed legislation was seen as the lack of technical training for operators and of available technical information. This was considered by the small
suppliers to be even more important than the costs that would be incurred in complying. Many
of these suppliers did not know how to effectively manage the use of the facilities that they
already had.
9 NZWWA/NZWERF. 2002. New Zealand Small Water Systems Survey (Report to the MoH). Wellington.
10 ESR. 2004. Report on regional consultation meetings for smaller and rural water supplies; FW0474. Meetings
were held in Whangarei, Tauranga, Hamilton, Taupo, Palmerston North, New Plymouth, Gisborne, Napier,
Masterton, Queenstown, Greymouth, Nelson, Invercargill, Balclutha, Timaru, Kaikoura.
During the government’s Infrastructure Stocktake (aimed at identifying the key utilities
required to underpin the economic well-being of New Zealanders), drinking-water and waste
management were identified as key utilities.
The need to improve the performance management of small supplies had been demonstrated by
their poor performance recorded in the annual reviews of drinking water quality and by the
responses of small communities to the consultation rounds. By 2000 sufficient information was being gathered to enable the management needs of the 20 percent of the population served by small drinking-water supplies to be addressed.
Planning to meet the needs of small supplies for technical information and operator training, together with financial assistance where this could be demonstrated to be necessary, commenced in 2001. As a result of the government's infrastructure stocktake, funding
for assistance to underperforming drinking-water supplies was made available in the 2005
Budget.
1.5
Operational development of the
drinking-water management
programme 2005–2009
1.5.1
The Drinking-water Assistance Programme, DWAP
To meet both the technical and financial needs of the small suppliers that had been identified by
consultation and planning in 2000-2005, the Drinking-water Assistance Programme (DWAP)
was designed to have two complementary components: the Technical Assistance Programme
and the Drinking-water Subsidy Scheme. Public health units have been appointed to implement the DWAP on behalf of the Ministry; the DWAP covers water supplies serving between 25 and 5000 people.
a)
The technical assistance programme
The first component of the DWAP is a Technical Assistance Programme that provides advice
and technical assistance to drinking-water suppliers. The Technical Assistance Programme
assists suppliers to evaluate their supply, produce a PHRMP and optimise the performance of
their existing facilities. Should the supply be incapable of complying with the DWSNZ after its
performance has been optimised, the Technical Assistance Programme can assist with assessing
the capital works needed to upgrade the supply so that it can meet its performance targets. The
Technical Assistance Programme is available to any supply serving fewer than 5000 people.
b)
The Subsidy Scheme
The Technical Assistance Programme is complemented by a Subsidy Scheme that administers
the funds available to the DWAP for capital assistance. Eligibility and priority criteria were
revised in 2010. Applications for subsidy are processed through the Sanitary Works Technical
Advisory Committee (SAWTAC).
Applicants for the Subsidy Scheme must meet the criteria set out in Applying for a Drinking-water Subsidy: Guidelines for applicants and district health board public health units available
on the Ministry of Health’s website. For the Subsidy Scheme:
•
$10 million is available for allocation each year until 2015
•
the scheme will pay up to 85 percent of costs (previously it was 95 percent)
•
only those communities with a Deprivation Index of 7 and above are eligible
•
the criteria clarify that asset replacement, maintenance, land purchase and applications from
city councils are not eligible for subsidies
•
an engineering review is required for subsidy applications that exceed $1000 subsidy per
person for a water supply scheme.
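Purely as an illustration of how the criteria above combine, a hypothetical eligibility check might look like the following. The function and its inputs are invented for the example; real applications are assessed against the published guidelines via SAWTAC.

```python
def subsidy_check(population: int, deprivation_index: int, total_cost: float) -> str:
    """Illustrative check against the Subsidy Scheme criteria in the text.

    The thresholds (Deprivation Index 7, 85 % of costs, $1000 subsidy per
    person triggering an engineering review) come from the criteria above;
    the function itself is hypothetical.
    """
    if deprivation_index < 7:
        return "not eligible (Deprivation Index below 7)"
    subsidy = 0.85 * total_cost          # scheme pays up to 85 percent of costs
    notes = []
    if subsidy / population > 1000:      # more than $1000 subsidy per person
        notes.append("engineering review required")
    return f"eligible: up to ${subsidy:,.0f}" + (f" ({'; '.join(notes)})" if notes else "")

# A 300-person community with Deprivation Index 8 and a $400,000 scheme:
print(subsidy_check(population=300, deprivation_index=8, total_cost=400_000))
```

Other criteria in the list (exclusion of asset replacement, maintenance, land purchase and city council applications) are categorical and are omitted from the sketch.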
Applications must be submitted to the Ministry of Health no later than 5 pm on 28 February of
each year until 2015. Further information is available from public health units and on the
Ministry of Health’s website.
1.5.2
Legislation: The Health (Drinking Water) Amendment Act
2007 (Part 2a of the Health Act 1956 – the Act)
The Health (Drinking Water) Amendment Bill that was passed in 2007 contained a number of
changes from the original proposals submitted to Cabinet in 2000, which are listed in
section 1.2.2. These changes were either authorised by Cabinet before the Bill went to
Parliament, or were recommended by the Select Committee. These included:
•
tankers, ports and airports to be classified as drinking-water suppliers
•
changes to the criteria for establishing whether the ‘all practicable steps’ criterion had been
met
•
addition of the new category of Rural Agricultural Drinking-water Supply to the list of
categories of water supplies
•
the requirement to assess whether an action required under the Act is affordable as part of
the procedure for deciding whether all practicable steps have been taken to achieve a result
required.
1.5.3
The 2008 revision of DWSNZ 2005
The amendment to the Act necessitated a number of changes to the DWSNZ 2005, including the
need to develop a section on Rural Agricultural Drinking-water Supplies. The changes were
published in the 2008 revision of the DWSNZ 2005. The introduction of the PHRMPs
necessitated a number of minor adjustments to ensure compatibility with the Act.
Although the DWSNZ 2005 had been the result of a consensus among members of the Expert
Committee on Drinking-water Quality, set up to advise the Ministry of Health, several
submissions from small water suppliers necessitated a major rewrite of section 10 (small supplies).
Water suppliers were invited to comment on the 2005 DWSNZ, resulting in the clarification of the other sections, particularly in relation to procedural matters in the protozoal compliance section. The opportunity was also taken to update the maximum acceptable value (MAV) tables
based on the latest World Health Organization (WHO) information. All water suppliers that had
commented on the 2005 DWSNZ were asked to confirm that their concerns had been addressed
in the draft revision.
1.5.4
Tankered drinking-water supplies
Standards have been developed for use with different types of source water for tankered
supplies. For operational guidance, the Tankered Drinking Water Carrier’s Association has
prepared Guidelines for the Safe Carriage and Delivery of Drinking-water. The initial draft was
produced in conjunction with the New Zealand Water and Wastes Association, as a Code of
Practice. The final version was published by the Ministry in 2008, as Guidelines.
1.5.5
Development of specifications for Rural Agricultural
Drinking-Water Supplies (RADWS)
The DWSNZ 2005 revised (2008) are prescriptive standards developed to ensure safe drinking-water for the population's water supplies. Many of the criteria used in these standards are
population-based and are appropriate for use in homogeneous reticulated communities such as
towns where the principal purpose of the water supply is for drinking. However, they do not meet the needs of the rural situation, where the purpose of the water supply is more
heterogeneous.
In rural communities a large proportion of the water supply may not be intended for drinking.
The water may be used mainly for irrigation or stock watering. Treating all of this water to
comply with the drinking-water standards may be pointless and unnecessarily costly. For this
reason it has been proposed that a Rural Agricultural category (RADWS) be developed in which
only water intended to be drunk by humans will be required to meet the drinking-water
standards. This will require different levels for the different categories of community and it is
likely that one of the acceptable solutions will involve the use of either point-of-use (POU) or
point-of-entry (POE) appliances.
At the time of writing the necessary consultation process on a Rural Agricultural Drinking-water Supply was about to be undertaken.
1.5.6
Point-of-use and point-of-entry drinking-water treatment
appliances standards
To facilitate the regulatory control over the performance of point-of-use (POU) and point-of-entry
(POE) drinking-water treatment appliances, the documentation of the appliance needs to specify:
a)
what contaminants the appliance will control
b)
what contaminants the appliance will NOT control
c)
performance standards for control of the contaminants of concern
d)
a clear indication of when the appliance is no longer achieving its performance standards.
There are four relevant international standards that deal with POU and POE appliances:
1
AS/NZS 4983:1998 Water supply – Domestic type water treatment appliances –
Performance requirements
2
AS/NZS 3497:1998 Amended Plumbing Requirements for POU and POE appliances
3
NSF/ANSI 53, and
4
NSF/ANSI 55.
Of these standards only AS/NZS 3497:1998 includes the documentation requirements specified
above.
The technical performance specifications of AS/NZS 4983 need to be brought up to the standard
of the specifications of NSF/ANSI 53 and NSF/ANSI 55 in order to ensure that the appliance
will deliver water that complies with the DWSNZ.
1.6
Tools for promoting safe drinking-water
supplies
1.6.1
Introduction
From 1992 to 1996 the Ministry of Health developed an integrated set of tools for improving
drinking-water quality to protect public health. These tools were designed in such a way that
they reinforce one another in use and included the:
•
public health grading of community drinking-water supplies
•
new 1995 Drinking-water Standards for New Zealand
•
development of national drinking-water databases, eg, WINZ
•
publication of:
– Register of Community Drinking-Water Supplies in New Zealand
– Guidelines for Drinking-Water Quality Management in New Zealand
– Annual Reports on Quality of Drinking-Water in New Zealand.
Subsequently, the following new tools have been added:
•
the introduction of public health risk management plans
•
the introduction of drinking-water assessment and the training of personnel for this
•
publication of the Register of Recognised Laboratories
•
publication of the Register of Drinking-water Assessors.
Several of these tools have been updated since 2000, eg, the DWSNZ, the Guidelines, the
Grading, and the WINZ software.
The tools are designed to promote maximum interaction and mutual support between the various stakeholders: the public, the media, the drinking-water supplier and the drinking-water
assessor. Emphasis is on using risk management planning techniques to promote a quality
assurance approach. This is complemented by a monitoring programme used as a final quality
control that also acts as a feedback loop and provides a trigger for remedial action where this is
necessary.
A description of the tools and the way they interact follows.
1.6.2
Drinking-water Standards for New Zealand
The DWSNZ prescribe maximum acceptable values (MAVs) for determinands of public health
significance, providing a yardstick against which drinking-water quality can be measured. A
MAV is the concentration of a determinand below which there is no significant risk to a
consumer over a lifetime of consumption.
Wherever possible, the MAVs have been based on the latest WHO guideline values. WHO calls
their guideline values provisional when there is a high degree of uncertainty in the toxicology
and health data, or if there are difficulties in water treatment or chemical analysis. The DWSNZ
adopt the same approach. Provisional MAVs (PMAVs) have also been applied to chemical
determinands when the Ministry of Health has derived a MAV in the absence of a WHO
guideline value. In terms of compliance with the DWSNZ, PMAVs are considered to be
equivalent to MAVs.
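As background to how health-based values of this kind are typically derived for threshold (non-carcinogenic) chemicals, the WHO Guidelines use the relationship: guideline value = TDI × body weight × fraction of intake allocated to water ÷ daily water consumption. A sketch follows; the TDI, body weight, allocation factor and consumption figures are illustrative defaults, not values taken from the DWSNZ.

```python
def guideline_value_mg_per_L(tdi_mg_per_kg_day: float,
                             body_weight_kg: float = 60,
                             allocation_to_water: float = 0.2,
                             consumption_L_per_day: float = 2) -> float:
    """GV = TDI x bw x P / C, the form used in the WHO Guidelines.

    TDI: tolerable daily intake (mg per kg of body weight per day)
    P:   fraction of the TDI allocated to drinking-water
    C:   daily drinking-water consumption (litres)
    """
    return (tdi_mg_per_kg_day * body_weight_kg * allocation_to_water
            / consumption_L_per_day)

# An illustrative TDI of 0.01 mg/kg of body weight per day gives:
print(guideline_value_mg_per_L(0.01))   # about 0.06 mg/L
```

The resulting value would then be rounded and checked against analytical and treatment achievability before adoption as a guideline value.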
Chemical (and cyanotoxin) MAVs are based on average values, and while a higher daily dose
could be safe for a certain period, consumption of that dose for a lifetime is not expected to be
safe. Average values are used for framing Regulations because provision cannot be made for all
possible combinations of exposures that individuals may encounter; an exception is cyanide
where the MAV has been established to protect consumers during short-term exposure
following a significant spill of cyanide to a drinking-water source (see datasheet in the
Guidelines). There is a short-term MAV for nitrate and nitrite as well, established to protect
against methaemoglobinaemia in bottle-fed infants.
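As an illustration of how co-occurring determinands are handled, the WHO-style rule for nitrate and nitrite is that the sum of each concentration divided by its (short-term) MAV should not exceed 1. The sketch below assumes the WHO short-term values of 50 mg/L for nitrate and 3 mg/L for nitrite; confirm the current figures against the DWSNZ themselves.

```python
# Illustrative check for co-occurring nitrate and nitrite, following the
# WHO-style rule that the sum of (concentration / MAV) ratios must not
# exceed 1. The MAVs used here (nitrate 50 mg/L, nitrite 3 mg/L) are the
# WHO short-term guideline values; confirm against the current DWSNZ.

NITRATE_MAV = 50.0   # mg/L as nitrate (short-term, bottle-fed infants)
NITRITE_MAV = 3.0    # mg/L as nitrite (short-term)

def nitrate_nitrite_compliant(nitrate_mg_l: float, nitrite_mg_l: float) -> bool:
    """Return True if the combined nitrate/nitrite ratio is within limits."""
    ratio = nitrate_mg_l / NITRATE_MAV + nitrite_mg_l / NITRITE_MAV
    return ratio <= 1.0

# Each determinand is individually below its MAV here, but the combined
# exposure still fails the ratio test (40/50 + 1/3 = 1.13 > 1).
print(nitrate_nitrite_compliant(40.0, 1.0))   # False
print(nitrate_nitrite_compliant(25.0, 1.5))   # True (ratio exactly 1.0)
```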
For carcinogenic chemicals, the MAVs set in the DWSNZ generally represent a risk of one
additional incidence of cancer per 100,000 people ingesting the water at the concentration of
the MAV for 70 years.
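The arithmetic behind such risk-based values can be sketched as follows. The defaults used here (2 L/day consumption, 60 kg body weight, a linear low-dose model) are WHO-style illustrative assumptions, and the slope factor is invented; none of these numbers is taken from the DWSNZ.

```python
# Sketch of how a concentration maps to a lifetime excess cancer risk
# under a linear low-dose model. Assumptions (not from the DWSNZ):
# 2 L/day consumption, 60 kg body weight, and an invented slope factor.

DAILY_INTAKE_L = 2.0    # litres of drinking-water per day (assumed)
BODY_WEIGHT_KG = 60.0   # adult body weight (assumed)

def concentration_for_risk(target_risk: float, slope_factor: float) -> float:
    """Concentration (mg/L) giving the target lifetime excess cancer risk.

    slope_factor is the cancer potency in risk per (mg/kg body weight/day).
    Linear model: risk = slope_factor * conc * intake / body_weight.
    """
    return target_risk * BODY_WEIGHT_KG / (slope_factor * DAILY_INTAKE_L)

# For a hypothetical slope factor of 0.5 per (mg/kg/day), the 1-in-100,000
# lifetime risk level corresponds to about 0.0006 mg/L (0.6 micrograms/L).
print(concentration_for_risk(1e-5, 0.5))
```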
MAVs apply to water intended for human consumption, food preparation, utensil washing, oral
hygiene or personal hygiene. Approximately one third of the daily average fluid intake is thought
to be derived from food. The remaining water requirement must be met from consuming fluids.
The criteria in the DWSNZ are applicable to all drinking-water except bottled water, which must
comply with the Food Act 1981.
The DWSNZ list the maximum concentrations of chemical, radiological and microbiological
contaminants acceptable for public health in drinking-water. For community drinking-water
supplies, the DWSNZ specify the sampling frequencies and testing procedures that must be used
to demonstrate that the water complies with the DWSNZ.
The sampling frequencies are chosen to give 95 percent confidence that the medium to large
drinking-water supplies comply with the Standards for at least 95 percent of the time. The larger
supplies are required to monitor more frequently. The DWSNZ 1995 used classical statistics to
derive the necessary monitoring frequencies, but the DWSNZ 2000 took advantage of more
recent advances in the use of statistics in which monitoring frequencies are derived using the
Bayesian approach (McBride and Ellis 2000).
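The Bayesian idea can be illustrated with a minimal sketch. This is not the McBride and Ellis (2000) derivation itself, just the underlying beta-binomial calculation under an assumed uniform prior on the exceedance rate.

```python
# Minimal beta-binomial sketch of the "95 percent confidence of 95 percent
# compliance" idea. Not the actual McBride and Ellis (2000) derivation -
# just the underlying calculation, assuming a uniform prior.
from math import comb

def confidence_of_compliance(n_samples: int, n_exceedances: int,
                             max_exceed_rate: float = 0.05) -> float:
    """P(true exceedance rate <= max_exceed_rate | data), uniform prior.

    The posterior is Beta(x+1, n-x+1); for integer parameters its CDF
    equals a binomial tail sum, so no special functions are needed.
    """
    n, x = n_samples, n_exceedances
    c = max_exceed_rate
    return sum(comb(n + 1, j) * c**j * (1 - c)**(n + 1 - j)
               for j in range(x + 1, n + 2))

# With no exceedances observed, about 58 samples are needed before the
# posterior gives 95 percent confidence that the exceedance rate is
# below 5 percent - hence larger supplies must monitor more frequently.
print(confidence_of_compliance(58, 0) >= 0.95)   # True
print(confidence_of_compliance(57, 0) >= 0.95)   # False
```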
The DWSNZ do not describe how a water supply should be managed. This is discussed in the
Guidelines for Drinking-water Quality Management in New Zealand.
The DWSNZ specify MAVs for more than 120 determinands. To minimise the number of
determinands that have to be monitored routinely in any specific drinking-water supply but still
maintain adequate safeguards to public health, the DWSNZ have grouped the determinands of
public health concern into four priority classes, see section 1.6.9 and Table 1.5.
The potential indicators of disease-causing organisms (micro-organisms characteristic of faecal
contamination) are given the highest priority (Priority 1) in the DWSNZ because the public
health implications of disease organisms in the water supply are almost always of greater
concern than the presence of chemical contaminants, which are usually slower acting.
It can be seen that the top priority is given to identifying potential causes of infectious disease
outbreaks. In an ideal world a screening test would be used that provides instant identification
of the presence of pathogenic organisms in drinking-water. At present no such test exists. Until
better tests have been developed, New Zealand, like the rest of the world, has to fall back on the
use of indicator organisms to identify the probability that the water has been contaminated by
excrement and, therefore, the possibility that pathogenic bacteria or viruses are present.
Because of the practical difficulties in routinely enumerating infectious protozoa in
drinking-water, surrogate methods have had to be used, based on checking that the water is from a safe
source or has received a level of treatment that has a high probability of removing protozoal
organisms.
Information on the supply-specific Priority 2 determinands, ie, those determinands in a
drinking-water supply that are of public health concern, is published in the Register of
Community Drinking-Water Supplies and Suppliers in New Zealand (see section 1.6.11),
available at every public library.
The MAVs in the DWSNZ apply to private and individual household drinking-water supplies as
well as to community supplies. Because of the wide variation in the circumstances of individual
supplies it is not possible to give explicit guidance on sampling strategies for individual supplies
in the DWSNZ. Individual household supplies are discussed in Chapter 19 of these Guidelines.
Advice on specific cases can be obtained from the drinking-water assessors.
Compliance with the DWSNZ demonstrates that a drinking-water supply is potable within the
meaning of the Act. The DWSNZ:
1 specify referee methods against which the methods used by individual laboratories have to be calibrated
2 require that laboratories carrying out compliance testing be approved for the purpose by the Ministry of Health
3 specify minimum remedial action needed in the event of the DWSNZ being breached.
The purpose of the referee methods is to overcome the problem of different analytical methods
giving differing results. The referee method provides a benchmark in case of disagreements.
Laboratories can use any method that has sufficient precision and sensitivity, but the
conformance assessment agency that accredits them is required to certify that the methods the
laboratory is accredited for have been calibrated against the referee method.
1.6.3
Guidelines for Drinking-Water Quality Management in
New Zealand
The Guidelines for Drinking-Water Quality Management in New Zealand (the Guidelines)
provide more detailed information on the public health management of drinking-water and the
properties of drinking-water determinands of public health concern than appears in the
DWSNZ. They provide access to information on public health aspects of drinking-water to water
supply personnel, health personnel and the general public.
The Guidelines provide background and supporting information for the DWSNZ and will be
revised as necessary. The Guidelines contain:
• guidance and good management principles for community drinking-water supplies
• volume 1, comprising the chapters: Chapters 1–5 are largely introductory and discuss risk management and source waters; Chapters 6–11 discuss compliance issues; Chapters 12–18 relate to operating and maintaining the supply; Chapter 19 covers small supplies
• volume 2, comprising the appendices, an assemblage of related technical material
• the datasheets, in volume 3, which describe how the criteria used in the DWSNZ were derived.
These datasheets provide background information about each determinand including their
sources, environmental forms and fates, typical concentrations either in New Zealand or
overseas drinking-water supplies, processes for removing the determinand from drinking-water,
analytical methods, health considerations, derivation of the MAVs for health significant
determinands and Guideline Values for aesthetic determinands, and references for further
reading. Datasheets for determinands of possible health and aesthetic significance are
included for general information.
1.6.4
Public health risk management plans
The introduction of public health risk management plans (PHRMPs) in 2001 marked the
transition of drinking-water quality management procedures from purely quality control (QC)
(monitoring compliance against product quality standards) to a combination of QC and quality
assurance (QA). Prior to 2001 public health management of supplies relied largely on
monitoring the quality of the water produced by individual water suppliers to check that it
complied with the DWSNZ. While monitoring is always important, PHRMPs for drinking-water
supplies provide the additional benefit of introducing management procedures that reduce the
likelihood of contaminants entering supplies in the first place. Also, by the time monitoring
shows that contaminants are present, something has already gone wrong and a hazard is already
present in the water.
PHRMPs encourage the use of risk-management principles during treatment and distribution so
that monitoring is not the only water quality management technique used thereby further
reducing the risk of contamination.
To assist drinking-water suppliers to develop PHRMPs for their drinking-water the Ministry of
Health produced 39 PHRMP Guides covering the system elements (eg, filtration, disinfection,
water storage, distribution etc) that are most frequently found in drinking-water supplies. The
model PHRMP Guides are available at http://www.moh.govt.nz/water then select publications
and Public Health Risk Management Plans ~ Reference Guides. PHRMPs are discussed in detail
in Chapter 2: Management of Community Supplies.
The first item, How to Prepare and Develop Public Health Risk Management Plans for
Drinking-water Supplies, should be read before using any of the PHRMP Guides because it explains the
risk management process and how the different guides are intended to be used to build up a
PHRMP for a particular water supply.
Subsequently, in 2005, simplified PHRMP procedures and multi-media training material were
developed especially for use by small water supplies and published, together with related
training CDs, as an integral part of the DWAP TAP.
All but the smallest community water supplies are required to prepare and implement a PHRMP
(HDWAA section 69Z). The timetable for compliance with this requirement is set out in
HDWAA sections 69C to 69F. Water supplies that are smaller than neighbourhood supplies
(usually smaller than 25 persons) are not required to have PHRMPs unless specifically required
to do so by the Medical Officer of Health.
The preparation of an approved PHRMP by a drinking-water supplier provides one way of
demonstrating that all practicable steps have been taken to meet the requirements of the
proposed drinking-water legislation (HDWAA section 69H), because it:
• identifies the nature and magnitude of public health risks inherent in the water supply process
• specifies what preventive and corrective procedures should be in place to manage/mitigate each risk
• identifies what will be done by the supplier to mitigate the risks
• identifies what the supplier is not able to do to mitigate the risks because of resource limitations.
1.6.5
Public health grading of drinking-water supplies
The grading of community drinking-water supplies is a voluntary system that has been in place
in various forms since 1962. The current grading system was updated in 2003 to incorporate
changes introduced by the Drinking-Water Standards for New Zealand 2000. There is no
requirement for a water supplier to participate in grading. If a water supplier chooses not to be
graded, the supplier is recorded in the Register of Drinking-Water Supplies in New Zealand as
being ungraded.
In 2008, following the amendments to the Health Act 1956 that introduced a statutory
compliance regime for drinking-water supplies, ESR Ltd surveyed water supply stakeholders to
see if there was support for a new grading framework. The survey found that grading was still
regarded by water suppliers as an important tool, and the purpose of providing a public
statement of safety was still desirable. Stakeholders agreed however that the existing framework
did not satisfactorily account for risk. It had no provision for public health risk management
plans (PHRMPs), or for the requirements of the Drinking-Water Standards for New Zealand
2005 (revised 2008).
At the time of writing the consultation on revision of the Grading Framework had closed with
submissions being analysed.
1.6.6
Drinking-water assessment
The role of the Drinking-water Assessors (DWAs) is to verify that the requirements of the
Health Act 1956 as they relate to drinking-water have been complied with. The DWAs are
appointed by the Director-General of Health, and their functions are set out in section 69ZL
of the Act.
DWAs are located in District Health Board public health units and are accredited as authorised
signatories. Maintenance of, and public access to, a Register of Drinking-water Assessors is a
requirement of the Act’s section 69ZX. The Register can be accessed
at: http://www.health.govt.nz/water then select legislation.
1.6.7
Monitoring
Assesses the extent to which a drinking-water supply complies with the DWSNZ at the time of
monitoring.
Monitoring of the quality of a community drinking-water supply was made the responsibility of
the drinking-water supplier in the DWSNZ 1995. Previously, under the DWSNZ 1984,
monitoring had been carried out by the (then) Department of Health.
To demonstrate compliance with the DWSNZ, the Priority 1 and 2 determinands have to be
monitored according to the protocols set down in the DWSNZ. The DWSNZ specify the
minimum frequency of compliance monitoring. Water suppliers also conduct process control
testing and quality assurance monitoring as part of their day-to-day management. Process
control test results can be used for compliance monitoring if the procedure used complies with
the requirements of the DWSNZ.
1.6.8
Surveillance
The definition of surveillance in the DWSNZ is: the process of checking that the management of
drinking-water supplies conforms to the specifications in the Drinking-water Standards for New
Zealand (DWSNZ); usually conducted by the public health agency. An example of surveillance is
the process that results in a chemical determinand being assigned as a P2 (see next section).
The WHO Guidelines describe drinking-water supply surveillance as “the continuous and
vigilant public health assessment and review of the safety and acceptability of drinking-water
supplies”. This surveillance contributes to the protection of public health by promoting
improvement of the quality, quantity, accessibility, coverage, affordability and continuity of
water supplies (known as service indicators) and is complementary to the quality control
function of the drinking-water supplier. Drinking-water supply surveillance does not remove or
replace the responsibility of the drinking-water supplier to ensure that a drinking-water supply
is of acceptable quality and meets predetermined health-based and other performance targets.
1.6.9
Identifying priority 2 determinands
Identifies chemical determinands of potential health significance in drinking-water distribution
zones serving more than 100 people.
The DWSNZ require Priority 2 determinands to be monitored, as set out in section 8 of the
DWSNZ, so that their health significance can be evaluated. The Priority 2 Chemical
Determinands Identification Programme (P2) identifies for water suppliers those determinands
in their supply that need to be monitored.
The procedure is described in Appendix 3: Priority 2 Determinand Identification Guide.
Table 1.5: Examples of priority allocation in the DWSNZ 11

Priority 1 (applies to all community drinking-water supplies):
• Escherichia coli (E. coli)
• Giardia
• Cryptosporidium

Priority 2 (applies to determinands where there is good reason to believe that the substance is present in concentrations that present a potential public health risk; this priority is specific to the particular supply):
• chemical and radiological determinands that could be introduced into the drinking-water supply by the treatment chemicals at levels potentially significant to public health (usually greater than 50 percent MAV), eg, acrylamide monomer where low specification polyacrylamide is used as a coagulant aid
• chemical and radiological determinands of health significance that have been demonstrated to be in the drinking-water supply at levels potentially significant to public health (usually greater than 50 percent MAV), eg, arsenic and boron in geothermal areas
• micro-organisms of health significance which have been demonstrated to be present in the drinking-water supply.

Priority 3 (applies to determinands not likely to be present in the supply to the extent where they could present a risk to public health):
• chemical and radiological determinands of health significance arising from treatment processes in amounts known not to exceed 50 percent MAV
• chemical and radiological determinands of health significance which are not known to occur in the drinking-water supply at greater than 50 percent MAV
• micro-organisms of health significance which could be present in the drinking-water supply
• determinands of aesthetic significance known to occur in the drinking-water supply.

Priority 4 (applies to determinands not likely to be present in New Zealand drinking-waters):
• chemical and radiological determinands of health significance that are known not to be likely to occur in the drinking-water supply, eg, pesticides not registered in, and not yet introduced into, New Zealand
• micro-organisms of health significance which are known not to be likely to be present in the drinking-water supply
• determinands of aesthetic significance not known to occur in the drinking-water supply.
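The decision logic of Table 1.5 for health-significant chemical and radiological determinands can be sketched as a simple rule. This is an orientation aid only, with a simplified notion of occurrence; the DWSNZ text, not this sketch, is authoritative.

```python
# Illustrative sketch of the Table 1.5 priority logic for chemical and
# radiological determinands of health significance. A simplification for
# orientation only - the DWSNZ text is authoritative. (Priority 1 covers
# the microbiological indicators and so is not modelled here.)

def chemical_priority(fraction_of_mav: float, likely_in_nz_supplies: bool) -> int:
    """Assign a priority class (2, 3 or 4) to a health-significant chemical.

    fraction_of_mav: demonstrated (or plausible) concentration as a
    fraction of the MAV; likely_in_nz_supplies: whether the determinand
    is at all likely to occur in New Zealand drinking-waters.
    """
    if not likely_in_nz_supplies:
        return 4                      # eg, pesticides not registered in NZ
    if fraction_of_mav > 0.5:
        return 2                      # potentially significant to health
    return 3                          # present, but at or below 50% of MAV

print(chemical_priority(0.8, True))    # 2 - eg, arsenic in a geothermal area
print(chemical_priority(0.2, True))    # 3
print(chemical_priority(0.0, False))   # 4 - eg, an unregistered pesticide
```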
1.6.10 Register of Community Drinking-water Supplies and
Suppliers in New Zealand
The Register of Community Drinking-Water Supplies and Suppliers in New Zealand is a
requirement of the Health Act (s69J). It is a public document that provides easily accessible
information about community water supplies and drinking-water carriers.
For each supply, the Register records:
• the name and address of the drinking-water supplier or carrier
• the source(s) of the supply
• unique codes for each component (to aid clear identification)
• when the supply was first registered
• the category of the supply.
11 The priority classification scheme was introduced to give guidance as to the relative public health concern relating to the many determinands of public health significance that are listed in the WHO Guidelines for Drinking-water Quality. A detailed discussion of the priority classes is given in the DWSNZ.
1.6.11 Annual Review of Drinking-water Quality in New Zealand
The Annual Review of Drinking-water Quality in New Zealand provides a public statement of
the extent to which a community water supply (serving over 100 people) complies with the
requirements of Part 2A of the Health Act 1956.
Publication of an annual report is a requirement of the Director-General under section 69ZZZB.
Annual reviews are available at http://www.health.govt.nz/water (then select publications).
1.6.12 Register of recognised laboratories
To be accepted by the Ministry of Health for the purpose of analysing samples for compliance
with the DWSNZ, a laboratory must satisfy the Ministry that it:
• requires suppliers who send samples from community drinking-water supplies for analysis for the purpose of demonstrating compliance with the DWSNZ, to identify all such samples with the appropriate unique site identification code as listed in the current Register of Community Drinking-Water Supplies and Suppliers in New Zealand
• has been recognised by an appropriate accreditation or certification authority as competent to perform those analyses for which acceptance by the Ministry is sought (this would involve accreditation to NZS/ISO/IEC 17025 [IANZ 2005] or equivalent); this includes IANZ accredited laboratories and laboratories recognised by IANZ as complying with Ministry of Health Level 2 Criteria (IANZ 2007)
• is operating appropriate quality assurance procedures
• is using methods that have been calibrated against the referee methods specified in the DWSNZ
• is actively engaged in on-going inter-laboratory method-comparison programmes to compare the results of their analyses of the determinands for which they wish to be accepted by the Ministry with analyses on those determinands carried out by other laboratories accepted by the Ministry.
Other requirements may be added from time to time.
The Register of Recognised Laboratories is available via http://www.health.govt.nz/water
(under Drinking-water/Legislation/Related websites).
1.7
Other drinking-water requirements
1.7.1
Drinking-water quality at airports
Annex 1 B 1(d) of the International Health Regulations (IHR) (WHO 2005) requires every
designated airport location worldwide to develop the capacity to provide potable water for the
aircraft that use their facilities. It is, however, the responsibility of each aircraft operator to
ensure that these standards are upheld on board, not just in terms of the quality of the water
taken on board from the source of supply on the ground. In accordance with Article 24(c) of the
IHR (WHO 2005), states shall take all practicable measures to ensure that conveyance operators
keep the water system free of sources of contamination and infection.
Airports should comply with the core capacity requirements of Annex 1 B 1(d). Under
Article 22(b), the competent authorities are to ensure, as far as practicable, that the facilities
are in a sanitary condition and kept free of sources of infection and contamination, for example
by providing potable water from an uncontaminated source approved by the competent authority.
For further information, see WHO (2009).
1.7.2
Drinking-water quality in shipping
Historically ships have played an important role in transmitting infectious diseases around the
world. For example, the spread of cholera pandemics in the nineteenth century was thought to
be linked to trade routes, and facilitated by merchant shipping.
The purpose of the International Health Regulations is “to provide security against the
international spread of disease while avoiding unnecessary interference with international
traffic”.
Waterborne outbreaks have been associated with loading poor quality water. Therefore, the first
waterborne disease prevention strategy should be to load ships with the safest water available at
port. To support this objective, ports should make good quality potable water available to ships.
Potable water for ships, including water-boats and water-barges, needs to be obtained only from
those water sources and water supplies that provide potable water of a quality in line with the
standards recommended in the Guidelines for Drinking-water Quality (WHO 2004), especially
in relation to bacteriological requirements and chemical and physical requirements.
Potable water would typically need to be obtained from those watering points approved by the
health administration or health authority. Facilities include piping, hydrants, hoses and any
other equipment necessary for the delivery of water from shore sources at the pier or wharf area
to the filling line for the ship’s potable water system. Plans for the construction or replacement
of facilities for loading potable water aboard vessels would typically be submitted to the port
health authority or other designated authority for review.
For further information, see WHO (2007 and 2011a).
Note: The International Health Regulations (2005), hereafter referred to as IHR (2005), are an
international WHO legal framework addressing risks of international disease spread and legally
binding on 194 states parties throughout the world, including all 193 WHO member states. The
IHR (2005) are very broad, focusing upon almost all serious public health risks that might
spread internationally, whether biological, chemical or radionuclear in origin, and whether
transmissible in goods (including food), by persons, on conveyances (aircraft, ships, vehicles),
through vectors or through the environment. The IHR (2005) contain rights and obligations for
states parties (and functions for WHO) concerning prevention, surveillance and response;
health measures applied by States to international travellers, aircraft, ships, ground vehicles and
goods; and public health at international ports, airports and ground crossings. For more
information, see http://www.who.int/csr/ihr/en/.
References
Baker M, Russell N, Roseveare C, et al. 1998. Outbreak of cryptosporidiosis linked to Hutt Valley
swimming pool. The New Zealand Public Health Report 5(6): 41–5.
Ball A. 2006. Estimation of the Burden of Waterborne Disease in New Zealand: Preliminary report
FW0687. Christchurch: ESR.
Bartram J. 2003. Investigation of sporadic waterborne disease. Preface to section 3 (on Investigation
of Sporadic Waterborne Disease) in PR Hunter, M Waite, E Ronchi. 2003. Drinking Water and
Infectious Disease. Boca Raton, FL: CRC Press and IWA Publishing.
Beca. 2001. Drinking Water Compliance Assessment. A report prepared (in March) for the Ministry
of Health by Beca Steven in association with BERL.
Brieseman MA. 1987. Town water supply as the cause of an outbreak of Campylobacter infection.
New Zealand Medical Journal 100: 212–13.
CH2M Beca. 2010. Drinking Water Standards New Zealand Cost Benefit Analysis – Engineering
Input. 103 pp. http://www.health.govt.nz/publication/drinking-water-cost-benefit-analysis
Craun G, Till DG, McBride GB. 2004. Epidemiological studies and surveillance. Chapter 10 in:
JA Cotruvo, A Dufour, G Rees, et al. Waterborne Zoonoses: Identification, Causes and Control.
London: IWA Publishing, for the World Health Organization. Available at:
http://www.who.int/water_sanitation_health/diseases/zoonoses/en/
Duncanson M, Russell N, Weinstein P, et al. 2000. Rates of notified cryptosporidiosis and quality of
drinking water supplies in Aotearoa, New Zealand. Water Research 14(15): 3804–12.
Dupont DP, Jahan N. 2012. Defensive spending on tap water substitutes: the value of reducing
perceived health risks. J Water and Health 10(1): 56–68.
Eberhardt-Phillips J, Walker N, Garrett N, et al. 1997. Campylobacteriosis in New Zealand: results of
a case-control study. Journal of Epidemiology and Community Health 51: 686–91.
ESR. Annual Summary of Outbreaks in New Zealand. Wellington: ESR. See www.surv.esr.cri.nz
then select surveillance reports, annual surveillance summary for the year desired, and then selected
tables.
Fraser G, Cooke KR. 1991. Endemic giardiasis and municipal water supply. American Journal of
Public Health 81(6): 760–2.
Frost F, Calderon RL, Craun GF. 2003. Improving waterborne disease surveillance. Chapter 2 in
FW Pontius (ed) Drinking Water Regulation and Health. New York, NY: Wiley-Interscience.
Harrington W, Krupnick A, Spofford W. 1985. The benefits of preventing an outbreak of giardiasis
due to drinking water contamination. Resources for the Future Report. Washington:
US Environmental Protection Agency, National Center for Environmental Economics.
Hrudey SE, Hrudey E. 2004. Safe Drinking Water: Lessons from Recent Outbreaks in Affluent
Nations. London: IWA Publishing.
Hunter P, Waite M, Ronchi E. 2003. Drinking Water and Infectious Disease: Establishing the Links.
Boca Raton, FL: CRC Press and IWA Publishing.
Hutton G, Haller L. 2004. Evaluation of the Costs and Benefits of Water and Sanitation
Improvements at the Global Level. WHO/SDE/WSH/04.04. Geneva: WHO.
See: http://www.who.int/water_sanitation_health/wsh0404/en/
IANZ. 2000. General Requirements for the Competence of Testing and Calibration Laboratories.
(NZS/ISO/IEC 17025).
IANZ. 2007. Supplementary Criteria for Accreditation No. 1.2/2.2 (2nd edition). Auckland:
Ministry of Health Register of Water Testing Laboratories, International Accreditation New Zealand.
See: http://www.ianz.govt.nz
Ionas G, Learmonth JJ, Keys EA, et al. 1999. Distribution of Giardia and Cryptosporidium in
natural water systems in New Zealand – a nationwide survey. Water Science and Technology
38(12): 57–60.
LECG. 2010. Cost Benefit Analysis of Raising the Quality of New Zealand Networked Drinking
Water. Report to the Ministry of Health. 274 pp. http://www.health.govt.nz/publication/drinking-water-cost-benefit-analysis
McBride G, Till D, Ryan T, et al. 2002. Freshwater Microbiology Research Report: Pathogen
occurrence and human health risk assessment analysis. Wellington: Ministry for the Environment.
McElnay C, Inkson I. 2002. Outbreak of Campylobacter in Hawke’s Bay traced to school water
supply. In: Annual Summary of Outbreaks in New Zealand 2001: A report to the Ministry of
Health. Wellington: Institute of Environmental Science and Research Limited (ESR).
Ministry of Health. 1992 to present day. Annual Reviews of Drinking-water Quality in New
Zealand. Wellington: Ministry of Health.
Ministry of Health. 1993 to present day. Register of Community Drinking-Water Supplies and
Suppliers in New Zealand. Wellington: Ministry of Health.
Ministry of Health. 1995, 2000, 2005 and 2008. Drinking-water Standards for New Zealand.
Wellington: Ministry of Health.
Ministry of Health. 2001. How to Prepare and Develop Public Health Risk Management Plans for
Drinking-Water Supplies. Wellington: Ministry of Health.
Nokes C, Devane M, Scholes P, et al. 2004. Survey of Campylobacter in New Zealand’s treated
drinking waters. Proceedings of the New Zealand Water and Wastes Annual Conference and Expo.
Christchurch, October 6–8.
OECD. 2011. Benefits of Investing in Water and Sanitation: An OECD Perspective. OECD
Publishing. 151 pp. http://dx.doi.org/10.1787/9789264100817-en
Ogilvie DJ. 1989. Water supply questionnaire, disinfection and bacteriology. New Zealand Water
Supply and Disposal Association Newsletter (now New Zealand Water & Wastes Association
Journal). Part I, issue #55: 8–14; Part II, issue #56: 19–26; Part III, issue #57: 20–4.
OMS. 2004. Assistance Options for Safer Drinking Water Systems: Report to the Ministry of
Health. Wellington: Outcome Management Services Ltd.
Payment P, Hunter P. 2003. Intervention studies. In PR Hunter, M Waite, E Ronchi. 2003. Drinking
Water and Infectious Disease. Boca Raton, FL: CRC Press and IWA Publishing.
Percival S, Chalmers R, Embrey M, et al. 2004. Microbiology of Water-borne Diseases. Amsterdam:
Elsevier Academic Press.
Pretty JN, Brett C, Gee D, et al. 2000. An assessment of the total external costs of UK agriculture.
Agricultural Systems 65: 113–36.
Reilly WJ, Browning LM. 2004. Zoonoses in Scotland – food, water, or contact? Chapter 11 in
JA Cotruvo, A Dufour, G Rees, et al (eds.). Waterborne Zoonoses Identification, Causes and Control.
London: IWA Publishing, for the World Health Organization, pp 167–190. Available
at: http://www.who.int/water_sanitation_health/diseases/zoonoses/en/
Reimann C, Banks D. 2004. Setting action levels for drinking water: are we protecting our health or
our economy (or our backs!). Science of the Total Environment 332: 12–21.
Savill MG, Hudson JA, Ball A, et al. 2001. Enumeration of Campylobacter in New Zealand
recreational and drinking waters. Journal of Applied Microbiology 91: 38–46.
Schuster CJ, Aramini JJ, Ellis AG, et al. 2005. Infectious disease outbreaks related to drinking water
in Canada 1974–2001. Revue Canadienne de Santé Publique 96(4): 254–8.
Shaw CP. 1993. The New Zealand Drinking-water Standards – a preview. New Zealand Water and
Wastes Association Annual Conference, September.
Simmons G, Hope V, Lewis G, et al. 2001. Contamination of potable roof-collected rainwater in
Auckland, New Zealand. Water Research 35(6): 1518–24.
Sneyd E, Baker M. 2003. Infectious diseases in New Zealand: 2002 annual surveillance summary.
ESR Client Report FW 0332, for New Zealand Ministry of Health.
Stehr-Green JK, Nicholls C, McEwan S, et al. 1991. Waterborne outbreak of Campylobacter jejuni in
Christchurch: the importance of a combined epidemiologic and microbiologic investigation. New
Zealand Medical Journal 104: 356–8.
Taylor MEU. 1993a. Review of the Management of Drinking-water Quality in New Zealand.
New Zealand Water and Wastes Association Annual Conference, September.
Taylor MEU. 1993b. Drinking-water for the 21st Century. The second annual AIC Water Resources
Industry Conference, Auckland, July.
Taylor MEU. 2000. Drinking-water management in New Zealand. Water (J Australian Water and
Waste Association), November issue.
Taylor M, Ball A. 2004. Why do we need safe water? Unpublished paper presented to the New
Zealand Environment Summit, Wellington, 22 March 2004.
Thorstensen AL. 1985. What happened at Queenstown – an object lesson for water supply
authorities. New Zealand Local Government, February, p 455.
Till DG, McBride GB. 2004. Potential public health risk of Campylobacter and other zoonotic
waterborne infections in New Zealand. Chapter 12 in: JA Cotruvo, A Dufour, G Rees, et al (eds).
Waterborne Zoonoses: Identification, Causes and Control. London: IWA Publishing, for the World
Health Organization. Available
at: http://www.who.int/water_sanitation_health/diseases/zoonoses/en/
USEPA. 2003. Small Drinking Water Systems Handbook: A guide to ‘packaged’ filtration and
disinfection technologies with remote monitoring and control tools. EPA/600/R-03/041. Office of
Research and Development, Water Supply and Water Resources Division, United States
Environmental Protection Agency. 73 pp. http://www.epa.gov/nrmrl/pubs/600r03041/600r03041.htm now goes
to: http://nepis.epa.gov/Exe/ZyNET.exe/100046K6.TXT?ZyActionD=ZyDocument&Client=EPA&Index=2000+Thru+2005&Docs=&Query=&Time=&EndTime=&SearchMethod=1&TocRestrict=n&Toc=&TocEntry=&QField=&QFieldYear=&QFieldMonth=&QFieldDay=&IntQFieldOp=0&ExtQFieldOp=0&XmlQuery=&File=D%3A%5Czyfiles%5CIndex%20Data%5C00thru05%5CTxt%5C00000006%5C100046K6.txt&User=ANONYMOUS&Password=anonymous&SortMethod=h%7C&MaximumDocuments=1&FuzzyDegree=0&ImageQuality=r75g8/r75g8/x150y150g16/i425&Display=p%7Cf&DefSeekPage=x&SearchBack=ZyActionL&Back=ZyActionS&BackDesc=Results%20page&MaximumPages=1&ZyEntry=1&SeekPage=x&ZyPURL
Wheeler JG, Sethi D, Cowden JM, et al. 1999. Study of infectious intestinal disease in England: rates
in the community, presenting to general practice, and reported to national surveillance. British
Medical Journal 318(7190): 1046–50.
WHO. 2004. Guidelines for Drinking-water Quality (3rd edition). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html.
See also the addenda.
WHO. 2005. International Health Regulations (IHR). Geneva: World Health Organization. 82
pp. http://www.who.int/ihr/9789241596664/en/index.html
WHO. 2007. International Health Regulations, Guide to Ship Sanitation (3rd edition) draft,
Version 10. Geneva: World Health Organization. 178 pp. Final version due December
2010. http://www.who.int/water_sanitation_health/gdwqrevision/rrships/en/index.html
WHO. 2009. Guide to Hygiene and Sanitation in Aviation (3rd edition). Geneva: World Health
Organization. 71 pp. See for general links: http://www.who.int/ihr/ports_airports/en/ and
also: http://www.who.int/water_sanitation_health/gdwqrevision/aviation/en/index.html and: http://www.who.int/water_sanitation_health/hygiene/ships/guide_hygiene_sanitation_aviation_3_edition.pdf
WHO. 2011. Guidelines for Drinking-water Quality 2011 (4th edition). Geneva: World Health
Organization. Available
at: http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
WHO. 2011a. Guide to Ship Sanitation (3rd edition). Geneva: World Health Organization. 171
pp. http://whqlibdoc.who.int/publications/2011/9789241546690_eng.pdf
WRC. 1998. Wholesale Water Supply Risk Analysis Discussion Paper. Wellington Regional Council.
Chapter 2: Management of community supplies
2.1 Introduction
This chapter discusses good management practices for community drinking-water supplies. A
community drinking-water supply is a reticulated, publicly or privately owned, drinking-water
supply connecting at least two buildings on separate titles, and serving at least 1500 person days
a year (eg, 25 people at least 60 days per year). An integrated management system should be
designed to meet the requirements of the Drinking-water Standards for New Zealand 2005
(revised 2008) (DWSNZ), statutory requirements and the consumers’ needs, as well as
environmental and cultural considerations.
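The person-day threshold in the definition above is simple arithmetic; the sketch below (the function name is illustrative, not from the Guidelines, and it checks only the person-day criterion, not the reticulation or separate-titles requirements) shows how the 1500 person-day test works.

```python
def meets_person_day_threshold(people_served: int, days_per_year: int) -> bool:
    """Return True if the supply serves at least 1500 person-days a year.

    Illustrative only: the full community-supply definition also requires
    a reticulated supply connecting at least two buildings on separate
    titles.
    """
    PERSON_DAY_THRESHOLD = 1500
    return people_served * days_per_year >= PERSON_DAY_THRESHOLD
```

For example, 25 people served for at least 60 days a year gives exactly 1500 person-days and meets the threshold, while 20 people for 60 days (1200 person-days) does not.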
The most important constituents of drinking-water are undoubtedly those that are capable of
having a direct impact on public health. It is up to the water suppliers to demonstrate to their
consumers that the management of the water supply system is being undertaken in a
responsible and efficient manner.
The proper management of a water supply system includes:
1 awareness and understanding of the physical and operational components of the system
2 adopting risk, quality assurance and asset management procedures in the operation of the water supply
3 maintaining a surveillance programme to confirm that all systems are operating effectively
4 establishing a preventive and remedial actions programme
5 establishing effective monitoring programmes to test compliance with the drinking-water quality standards
6 being aware of statutory requirements and consumer needs
7 having the ability to respond to consumer and community needs
8 establishing communication lines and techniques.
This chapter covers items 2–4 in some detail, and necessarily overlaps with items 1 and 5–8,
which are covered in other chapters.
Chapters 3 and 4 cover the selection and protection of water sources; Chapters 5–11 cover
compliance issues; Chapters 12–15 discuss treatment processes (including disinfection);
Chapter 16 discusses distribution system operations and maintenance; Chapter 17 covers
monitoring. Consumer satisfaction is important, especially with respect to the aesthetic quality
of the water supply; refer to Chapter 18: Aesthetic considerations. Management of small supplies
appears in Chapter 19.
Some management issues are covered in the ANSI/AWWA Standards G series; see the list at
http://www.awwa.org/files/Resources/Standards/StandardsSpreadsheet.xls.
2.1.1 Components of a drinking-water supply
The principal features of a drinking-water supply are shown in Figure 2.1, which is reproduced
from section 1.8 of the DWSNZ. The format for the information on drinking-water supplies
published in the Ministry’s Register of Community Drinking-water Supplies and Suppliers in
New Zealand is based on the schematic approach illustrated in Figure 2.1.
A community water supply comprises one or more of the following (see Figure 2.1):
• the source of raw water
• the treatment plant
• the distribution system.
Individual components and chemicals used in the water supply need to be appropriate, ie,
should not compromise the quality of the water. New Zealand, Australian, UK, ISO and US
standards should be referred to where possible. Examples include:
• List of Approved Products for Use in Public Water Supply in the United Kingdom, see
http://dwi.defra.gov.uk/drinking-water-products/approved-products/soslistcurrent.pdf
• AS/NZS 4020:1999. Products for use in contact with drinking water.
Figure 2.1: Schematic diagram of a drinking-water supply system
[Schematic: groundwater and river sources feed Treatment Plant A; a lake source feeds Treatment Plant B; the treatment plants supply Distribution Zones X, Y and Z.]
2.1.1.1 Source water
A community water supply may abstract raw water from rainwater, surface water or
groundwater sources. These are discussed in Chapters 3 and 4; Chapter 8 which describes the
source water categories for Cryptosporidium; and Chapter 19 which focuses on small water
supplies, including rainwater supplies.
Surface water is frequently contaminated by micro-organisms. Waters from shallow
groundwater sources and springs are considered microbiologically equivalent to surface waters
such as rivers, streams, lakes and reservoirs. Secure bore water is considered to be free from
microbiological (bacterial and protozoal) contamination; see Chapter 3.
A water supply may have more than one source of raw water. Secondary sources may be
permanent or temporary. Temporary water supplies are discussed in section 6.9 of WHO (2004,
3rd addenda, 2008). Tankered water is covered by the Guidelines for the Safe Carriage and
Delivery of Drinking-water (MoH 2008).
2.1.1.2 The treatment plant
A treatment plant is a facility that treats raw water to make it safe and palatable for drinking. To
harmonise the DWSNZ with the conventions used in the public health grading of community
drinking-water supplies, the treatment plant is considered to be that part of the system where
raw water becomes the drinking-water. This can range from a full-scale water treatment plant
comprising chemical coagulation, sedimentation, sand filtration, pH adjustment, disinfection
and fluoridation, to simply being the point in a pipeline where the water main changes from a
raw water main to a drinking-water supply main. In a simple water supply, the water may be
merely abstracted from a river, passed through a coarse screen and piped to town; thus the
water supply acts like a diverted stream. If the raw water is chlorinated, however, it will not be
considered to become drinking-water until it has been exposed to chlorine (or chlorine dioxide)
for the design contact time.
A treatment plant may receive raw water from more than one source.
Water treatment and disinfection processes are discussed in Chapters 12–15.
2.1.1.3 The distribution system
Once the water leaves the water treatment plant, it enters one or more distribution zone(s) that
serve the community. The DWSNZ and the Public Health Grading of drinking-water supplies
define a distribution zone as:
“the part of the water supply network within which all consumers receive drinking-water
of identical quality, from the same or similar sources, with the same treatment and usually
at the same pressure. It is part of the supply network that is clearly separated from other
parts of the network, generally by location, but in some cases by the layout of the pipe
network. For example, in a large city, the central city area may form one zone, with
outlying suburbs forming separate zones, or in a small town, the system may be divided
into two distinct areas. The main purpose of assigning zones is to separately grade parts of
the system with distinctly different characteristics.”
A distribution zone may receive water from more than one treatment plant. The distribution
system may comprise more than one distribution zone. See Figure 2.1.
Distribution zones are distinguished because they may:
• be fed by a pumping station so that they are isolated from nearby zones by pressure
• be fed from a service reservoir, which can markedly increase the retention time
• vary seasonally due to supplementary sources being used at peak draw-off times
• have boundaries that vary due to changes in pressure or draw-off
• vary due to the materials used in common sections of the distribution system
• receive their water from another supply by tanker that pumps the water into a storage tank
• receive their drinking-water from a water supply wholesaler via bulk mains.
The distribution zones selected for the Public Health Grading and the DWSNZ are based on
water quality considerations and will not necessarily coincide with the distribution zones which
the water suppliers identify for operational and management purposes. Many community
drinking-water supplies comprise one distribution zone only.
The distribution system is discussed in Chapter 16.
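The many-to-many relationships described above (a treatment plant may draw on several sources, and a distribution zone may receive water from more than one plant) can be sketched as a simple data model. The sketch below is illustrative only: the class names and the exact wiring between plants and zones are one plausible reading of Figure 2.1, not a structure defined by the DWSNZ.

```python
from dataclasses import dataclass

@dataclass
class TreatmentPlant:
    name: str
    sources: list  # raw water sources feeding this plant

@dataclass
class DistributionZone:
    name: str
    plants: list   # treatment plants feeding this zone

# One plausible topology (assumed, for illustration): Plant A draws on
# groundwater and river sources, Plant B on a lake source, and Zone Y
# receives water from both plants.
plant_a = TreatmentPlant("Treatment Plant A", ["groundwater source", "river source"])
plant_b = TreatmentPlant("Treatment Plant B", ["lake source"])

zone_x = DistributionZone("Zone X", [plant_a])
zone_y = DistributionZone("Zone Y", [plant_a, plant_b])
zone_z = DistributionZone("Zone Z", [plant_b])

def sources_for(zone: DistributionZone) -> set:
    """All raw-water sources that can reach a given distribution zone."""
    return {s for plant in zone.plants for s in plant.sources}
```

A query such as `sources_for(zone_y)` makes explicit why zones fed by multiple plants need to be graded on the combined risk of all their upstream sources.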
2.1.2 Overview of management systems
There are a number of concepts and techniques that are useful for the management of water
supply systems. These include:
• risk management (as outlined in section 2.2), which involves identifying, controlling and minimising the impact of uncertain events
• quality assurance (as outlined in section 2.3), which is based on controlling processes to provide consistent products that satisfy customer requirements
• quality control measures (as outlined in section 2.4), which provide the checks to demonstrate that the product complies with standards, and feedback on the adequacy of risk management
• asset management, which involves the management of assets to achieve the required levels of service.
From the time the Drinking Water Standards for New Zealand 1995 were prepared, the
emphasis of the Ministry of Health has shifted from quality assurance to risk management.
Quality assurance techniques were first devised for the control of manufacturing processes.
They aim to ensure a consistently acceptable end product by understanding and controlling the
processes used to produce that product. Quality assurance techniques recognise that there will
be a percentage of products that do not comply with the specified requirements, and concentrate
on reducing this quantity to an economically acceptable level. Whilst this approach offers many
benefits, it is not completely suitable for the management of water supplies, where the release of
even a very small amount of contaminated water can affect public health, and cause economic
and social impacts.
Risk management, on the other hand, concentrates on identifying, controlling and minimising
the impact of uncertain events. Like quality assurance, it recognises that sometimes things will
not go as planned; it aims to identify the causes of these problems and the early warnings that
events are starting, and to put in place measures to control their impact. Emphasis is placed on
developing plans that detail how to prevent events occurring and how to respond to events when
they do occur.
The use of risk management principles provides a greater certainty that the water being
provided to the public is safe than is given by merely monitoring compliance with standards
(quality control). This approach to water supply management leads water suppliers to consider
what can possibly go wrong in a water supply, to pinpoint what the causes may be, and once
identified, to take actions to reduce the likelihood of the event occurring.
Whilst quality assurance and risk management techniques have different emphases, they both
involve similar tasks and both techniques have a place, along with asset management
techniques, in the integrated management of water supply systems, as illustrated in Figure 2.2.
Risk management techniques are used to identify what can go wrong and to put in place
measures to reduce these risks. Often the measures will involve controlling everyday work
processes through quality assurance techniques. In other cases measures may involve the
maintenance or upgrading of assets such as treatment plants and distribution pipes, using asset
management techniques to decide when and/or how to upgrade or maintain these assets. The
understanding gained from the application of quality assurance and asset management to the
operation, maintenance and upgrading of the water supply system will in turn provide a better
understanding of the risks that can affect the system. When this is fed back into the risk
assessment the whole cycle starts again.
Figure 2.2: Integrated management of water supply systems
[Schematic: key stakeholder requirements feed a risk management cycle (identify context, identify risks, analyse risks, evaluate risks, treat risks); treating risks draws on asset management (maintain existing assets, construct new assets), incident management (plan responses to incidents) and quality management (control work processes).]
2.2 Risk management
2.2.1 General
Risk is measured in terms of:
• likelihood, ie, the probability or chance that the event will occur
• consequence, ie, the harm that will be caused by the event.
Risk management, therefore, places emphasis on preventing events occurring (to reduce the
likelihood) and responding to events when they do occur (to reduce the consequences).
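A common way to combine these two dimensions is a qualitative risk matrix, in which each dimension is rated on a simple scale and the combined score is banded into a risk level. The sketch below is a minimal example of that idea; the 1–5 ratings and the score bands are illustrative assumptions, not values taken from the DWSNZ or the PHRMP Guides.

```python
# Illustrative qualitative risk matrix: likelihood and consequence are
# each rated 1 (low) to 5 (high); the product is banded into a risk level.
# The band thresholds below are assumptions for the sake of the example.
def risk_level(likelihood: int, consequence: int) -> str:
    score = likelihood * consequence
    if score >= 15:
        return "high"
    if score >= 5:
        return "medium"
    return "low"

# Preventive measures lower the likelihood rating; response and contingency
# measures lower the consequence rating. Either moves an event to a lower band.
```

For example, a rare event (likelihood 1) with severe consequences (consequence 5) still scores "medium" under these bands, which illustrates why reducing consequences matters as much as reducing likelihood.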
In the case of a water supply, an event would be something that has the potential to compromise
the ability to supply safe drinking-water. Examples could include:
• the water supply being contaminated by faeces, and people becoming sick from drinking the contaminated water
• a truck spilling its chemical load upstream of the intake, and people becoming sick from drinking the contaminated water
• a power failure at the pump station, and people becoming sick because they do not have access to enough safe water
• an operator not taking appropriate action in response to a bad water test result, and people becoming sick from drinking contaminated water. See Water UK (2010).
The concept of the process for managing risk is shown in Figure 2.3.
Figure 2.3: The risk management process
[Schematic: a cycle of risk analysis, reduction, readiness, response and recovery.]
Risk analysis
Risk analysis involves:
• identifying events that may introduce hazards (contaminants that can make you sick) into the water
• establishing the likelihood and consequence of each event – the risk
• determining the tolerability or acceptability of the risks
• identifying possible options to reduce the chances of the events occurring and/or the impact of the events should they occur
• balancing the costs and benefits of each option for achieving acceptable risk levels.
Risk reduction
Risk reduction involves implementing the measures to reduce the likelihood or consequence of
the events determined in the risk analysis – a proactive step. Typical steps include:
• determining the scope/tasks of the reduction measures
• assigning responsibilities for the tasks
• determining a timeline for implementing the measures
• implementing the measures
• checking that the measures have been successful.
Examples of risk reduction measures may include additional/improved treatment processes,
online monitoring, or improved training.
Readiness
Where it is not practical to completely eliminate the risk through the implementation of risk
reduction measures, organisations need to be ready to deal with the risk event when it occurs –
preparation for the reactive response step. This involves:
• preparing contingency plans that detail what personnel will do when a risk event occurs – measures to deal with the event itself (eg, fix the broken pipe, restore the residual chlorine level) and measures to deal with the consequences of the event (eg, issue a boil water notice, alert medical services)
• establishing relationships, and outlining the channels of responsibility and communication, with key stakeholders, peer organisations, regulatory authorities, suppliers and service providers so that they are in a position to help during a risk event
• training staff in incident management techniques and their individual roles in managing incidents
• conducting exercises to train staff and test contingency plans.
Response
When a risk event arises, organisations need to be able to respond quickly and effectively,
implementing the contingency plans they have already developed in readiness – a reactive step.
Organisations therefore need to conduct team exercises regularly to practise and fine-tune the
response processes they have designed, and to prepare robust communication systems.
Recovery
Recovery involves two stages:
• firstly, the measures taken to return operations to normal and put to rest any customer or community dissatisfaction
• secondly, the analysis of the event and the carrying out of debriefings, to learn from the event and put in place measures to reduce the likelihood of it recurring.
The results of the debriefing should be fed back into the risk analysis thus closing the cycle and
allowing it to start again. Over time this will help organisations gain a better understanding of
the risks that can affect their operations and they should become smarter at handling them.
For further reading on the topic of risk management, see Chapter 10 of WHO (2001), WQRA
(2009) and NHMRC (2012).
2.2.2 Public Health Risk Management Plans (PHRMPs)
The Ministry of Health advocates the use of Public Health Risk Management Plans (PHRMPs)
for managing public health risks associated with water supplies. PHRMPs are action plans that
show how risks to public health that may arise from the drinking-water provided by the supply
will be reduced. The World Health Organization calls these water safety plans (WSPs). In 2012
WHO published Water Safety Planning for Small Community Water Supplies: Step-by-step risk
management guidance for drinking-water supplies in small communities, which includes a case
study based on New Zealand publications.
WHO (2005) discusses managing drinking-water quality from the catchment to the consumer.
WHO (2007) was written to help users at national or local level to establish which chemicals in a
particular setting should be given priority in developing strategies for risk management and
monitoring of chemicals in drinking-water. WHO (2009) is a manual for developing WSPs.
WHO (2011) was written to “increase confidence that safe water is consistently being delivered
to consumers by ensuring that key elements in the WSP process are not overlooked and that the
WSP remains up to date and is effective”.
Section 2.2.2 provides background information about the approaches that the Ministry of
Health has taken. It also includes subsections on the model Public Health Risk Management
Plan Guides (the Guides), which provide generic information that can be of assistance in the
preparation of public health risk management plans, and a discussion of the Ministry’s
document How to prepare and develop Public Health Risk Management Plans (MoH 2001),
which provides suggestions as to how PHRMPs can be developed from the Guides.
The Ministry has developed a Small Drinking-water Supplies: Public Health Risk Management
Kit (MoH 2008), which is discussed in Chapter 19: management of small supplies.
2.2.2.1 The Ministry of Health’s model approach to public health risk management
The key documents of the Ministry’s approach are the Public Health Risk Management Plan
Guides (the Guides). There is no requirement for water suppliers to make use of the Guides;
they may use them to whatever degree they wish.
Some terms used in the following subsections, with definitions that may be helpful, are:
• Supply element: a physical or operational component of a water supply. Supply elements act together to determine the quantity and quality of the water received by the consumer.
• Hazard: a microbiological or chemical determinand that may cause sickness.
• Event: an incident or situation that may introduce a hazard (or hazards) into the water.
• Cause: the situation, action or inaction resulting in an event.
• Preventive measure: an action taken, or process used, to reduce the likelihood of an event occurring.
2.2.2.2 PHRMP guides – development
To determine which Guides had to be prepared, water supplies were considered to consist of
three supply stages: source, treatment and the distribution system. Within each of these stages
supply elements were identified. The elements are the physical or operational components
contained in each stage. They act together to determine the quantity and quality of the water
received by the consumer.
Some elements, such as the process of disinfection, can be further subdivided, eg, chlorination,
ozonation, etc. These were termed sub-elements. PHRMP Guides have been prepared for all
elements and sub-elements where they existed. The contents of the Guides are discussed in
section 2.2.2.3.
The most important factor influencing the form of the Guides was the need to make them
generic documents, ie, generally applicable, not designed to meet the needs of a particular
supply.
A number of principles acted as the basis for the development of the Guides:
a) The Guides focus on what might go wrong within a supply (ie, the events), not the microbiological or chemical contaminants (hazards) in the water or the preventive measures. This was done to avoid overlooking:
– hazards that may not have been identified at the time the Guides were prepared
– other events, not identified in the Guide, because of too narrow a focus on the preventive measures.
b) The Guides identify preventive measures that might not be possible to act on at present in some supplies. These are included because they are considered important and need to be noted in case future developments allow them to be put in place. Examples of this are preventive measures that cannot be implemented because water suppliers lack the legislative authority to manage their own catchments. Future changes to legislation may allow these preventive measures to be implemented.
c) The Guides have been regarded as a means of improving industry practices where this seems reasonable. As a result, some water suppliers may find that their present practices fall short of some preventive measures and corrective actions in the Guides, and they will need to review whether an improvement in the way they manage their supplies can be achieved. These situations will probably arise most frequently in relation to distribution systems.
d) Events with various levels of risk have been included in each Guide. No attempt has been made to omit events because they were considered to be too low a risk. Each water supplier has to determine the importance of each event for their particular situation; the Guides only indicate what should be considered.
e) The Guides only provide generalised estimates of the levels of risk associated with each supply element. To obtain a fuller assessment of the risk associated with each event, water suppliers have to analyse the risks based on the circumstances in their supplies. The Guides do, however, contain two features that give an indication of the typical importance of events for public health:
– an estimate of the level of risk associated with each event (evaluated on what might be expected for most supplies)
– a risk summary, in which the event considered to present the greatest risk to public health for a particular supply element is identified, along with the most important preventive measures for this event.
2.2.2.3 PHRMP guides – content
The Guides are the building blocks from which public health risk management plans can be
prepared. They contain the following sections and information:
1 Introduction: The introduction outlines the topics covered by the Guide. It also sets out possible events that can be associated with the supply element, the possible public health consequences of each event, and how the particular element can influence, or be influenced by, other supply elements. This last item is important, because it provides the operator with guidance on how the risks associated with one element may be modified by another.
2 Risk summary: The risk summary’s purpose is to summarise the key information contained within the Guide. It is included for the supplier that may have limited understanding of drinking-water quality management. Even if the full information table later in the Guide cannot be understood, the risk summary provides, in simplified form, the most important information.
3 Risk information table: This table contains the detailed information that can be used in managing the risks associated with the supply element. The table is divided into sections, each of which deals with a particular event. The heading of each section states the event and the hazard(s) that may be introduced as a result of the event, and provides a guide to the typical level of risk associated with the event. The events contained in the tables are potential events. They are listed to alert water suppliers to events that may occur; their appearance in the table does not mean that they are all relevant to a particular supply. The supplier has to decide this for his/her own supply.
There are some deviations from this. There are some instances where micro-organisms are the hazard, but they may not be pathogens of faecal origin. For example, where sediment in part of the system is stirred up (eg, Event P2.2), faecal pathogens are not the concern. The organisms introduced into the water may be opportunistic pathogens. These organisms may be part of the normal microflora of the body, but under certain conditions cause disease in compromised individuals (Geldreich 1996).
Because the actual risks presented by a particular event will depend on the situation existing in the supply, an accurate indication of the level of risk cannot be provided in a generic document. The levels of risk given provide some guidance for those who feel unable to estimate more accurately the risks for their supply. Section 2.2.2.6 offers more detail about how to estimate a qualitative level of risk for an event.
Listed within each section of the risk information tables (ie, concerned with one event) are:
• possible causes of the event
• preventive measures that can be taken to reduce the likelihood of the event arising from that particular cause
• checks that can be made to determine whether the preventive measures are working
• signs from the checks that show when preventive measures have failed and action needs to be taken
• corrective actions that need to be taken if the event occurs despite the preventive measures in place.
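The structure of one section of a risk information table can be sketched as a simple record; the field names follow the list above, while the class name and the example values are hypothetical, not taken from an actual PHRMP Guide.

```python
from dataclasses import dataclass

@dataclass
class RiskTableEntry:
    """One section of a risk information table (illustrative structure)."""
    event: str                   # what might go wrong
    hazards: list                # determinand(s) the event may introduce
    risk_level: str              # generic guide: low / medium / high
    causes: list                 # situations that can produce the event
    preventive_measures: list    # reduce the likelihood of the event
    checks: list                 # verify the preventive measures are working
    signs_of_failure: list       # check results that show action is needed
    corrective_actions: list     # reduce the consequences if the event occurs

# Hypothetical example entry, for illustration only:
entry = RiskTableEntry(
    event="Chlorine dose too low",
    hazards=["faecal bacteria", "viruses"],
    risk_level="high",
    causes=["dosing pump failure", "change in raw water demand"],
    preventive_measures=["dosing pump maintenance", "online FAC monitoring"],
    checks=["daily FAC measurement at the plant outlet"],
    signs_of_failure=["FAC below the target concentration"],
    corrective_actions=["restore dosing", "increase monitoring", "flush mains"],
)
```

Laying the section out this way makes the pairing explicit: preventive measures address likelihood, while corrective actions address consequences, as the following paragraph explains.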
Preventive measures and corrective actions are distinguished by the way in which they
deal with the two aspects of risk. Preventive measures are intended to reduce the
likelihood of an event; corrective actions aim to reduce the consequences of the event if it
occurs. In some instances, corrective actions set up preventive measures that should have
been in place already.
The suggested checks are to determine when an event has occurred and a preventive
measure has not worked. Trouble-shooting may be assisted in some instances by checks
that are specific to certain preventive measures and the causes they are designed to
control. There are other checks, however, that are not so specific. These provide limited
help in identifying the cause of an event. For example, free available chlorine (FAC)
measurements are checks common to all causes that may result in the FAC concentration
being too low during chlorination. A low FAC result therefore indicates that an event has
occurred, and that a preventive measure has failed, but does not pinpoint what caused the
problem.
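The FAC example can be sketched as a simple non-specific check; the function name and the 0.2 mg/L threshold are illustrative assumptions (the DWSNZ set the actual compliance requirements).

```python
def fac_check(fac_mg_per_litre: float, minimum_fac: float = 0.2) -> str:
    """Classify a free available chlorine (FAC) reading.

    A low result shows that an event has occurred and a preventive measure
    has failed, but, as noted above, it does not identify which cause
    (pump failure, demand change, etc) produced the low FAC.
    The 0.2 mg/L default threshold here is illustrative only.
    """
    if fac_mg_per_litre < minimum_fac:
        return "event: FAC too low - investigate all candidate causes"
    return "ok"
```

A reading of, say, 0.05 mg/L would trigger the event branch, after which trouble-shooting falls back on the cause-specific checks described above.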
4
Contingency plans: Contingency plans have been prepared for events resulting in either
serious microbial contamination of the water, or substantial chemical contamination that
will have acute consequences. The concentrations at which chemical hazards generally
occur are low enough that their consequences are long-term. The contingency plans
contain information to assist in deciding when a contingency plan is needed, and the
actions that should be taken.
Contingency plans are distinguished from corrective actions on the basis of the level of
risk they are intended to manage. For example, the detection of low levels of a faecal
indicator in the treated water requires a corrective action, but not the implementation of a
contingency plan. The detection of high levels of faecal contamination, or evidence of
widespread sickness that is likely to be of water-borne origin signals the need to
implement a contingency plan. The need for the implementation of a contingency plan
may arise from the failure of corrective actions to reduce a hazard in the water to an
acceptable level.
5 Performance assessment: This section of the Guide lists checks that can be made to
establish how well the plan is working for the particular element in question, and how
frequently the checks need to be made. Many of the checks are the same as those noted in
the risk information table. Guidance is also provided on what needs to be done with the
results of checks, particularly with respect to their review, and the need to use this
information in the updating of the plan.
The Guides may not have identified all possible events, their causes or appropriate preventive
measures. It is therefore important that, when a PHRMP is prepared, the water supplier
remains alert to the possibility that events not listed may also occur, and does not rely solely on
the Guides.
2.2.2.4 The preparation of PHRMPs
Guidance on the preparation and implementation of PHRMPs is provided in the Ministry of
Health’s publication How to Prepare and Develop Public Health Risk Management Plans. The
publication serves a number of functions by:
• setting out which Guides are available
• explaining in general what they contain, and the terminology used
• offering direction in the use of the Guides in preparing PHRMPs
• offering direction for the use of the plans once they have been prepared.
46
Guidelines for Drinking-water Quality Management for New Zealand 2013
A suggested approach to the development of PHRMPs is set out in Figure 1 of the publication. It
outlines a series of steps that should be taken in preparing plans, provides some detail as to how
to carry out the step, and indicates what should come out of the step for addition to the supply’s
PHRMP. The steps are summarised in Figure 2.4 and outlined below.
It is preferable that water suppliers prepare their own PHRMPs, because during the process,
they will become more aware of each step involved in running the supply, and will therefore
consider the risks, the improvements and training needs associated with each step. If it is
considered necessary to use consultants, the water supplier must be closely involved in the
preparation of the PHRMP. It is recommended that the PHRMP makes frequent reference to all
relevant operations manuals.
Figure 2.4: Suggested approach for the development of PHRMPs
Risk analysis: produce overview of supply; identify barriers to contamination; identify events that may introduce hazards; identify possible causes of each event, preventive measures and corrective actions.
Reduction: decide where improvements should be made; decide on order of improvements; draw up timetable.
Readiness: identify links with other quality assurance systems; develop contingency plan; performance assessment of plans; develop communication policy.
Risk assessment
Step 1:
Produce an overview of the supply and decide which PHRMP Guides are needed
Contribution to the PHRMP: a flow diagram of the supply
The water supplier needs to identify all the elements in their supply. Without doing this, it is not possible to identify all events that may introduce hazards into the water. The step also serves to help the supplier determine which Guides will be required for preparation of the plan.
The task is best accomplished by methodically working through the supply from the catchment
or recharge zone, to the consumer’s property and identifying all activities, processes, or physical
components that may influence the quantity or quality of the water.
Although the term supply is used in the Guides, and in this section of the Guidelines, the
PHRMP is being prepared to protect the consumers in a particular distribution zone. A
neighbouring distribution zone may be subject to different events, so its PHRMP will need to be
different.
Where more than one plant and/or more than one source provides water for a distribution zone,
the flow diagram prepared in this step must include all supply elements that could influence the
quality of the water reaching the distribution zone of interest.
How a water supplier is to deal with more than one supply/distribution zone in determining the
priority of resource allocations is described in Step 6.
Step 2:
Identify the barriers to contamination
Contribution to the PHRMP: checklist of barriers present
Between the source water catchment or recharge zone and the consumer’s property, various
elements of a water supply act as barriers to the entry of contaminants. Each barrier contributes
to the safety of the supply, but it is generally recognised that the greatest protection to water
quality and public health is achieved by ensuring that four fundamental barriers are in place.
These four barriers must achieve the following:
1 prevention of contaminants entering the raw water of the supply
2 removal of particles from the water
3 inactivation of micro-organisms in the water
4 maintenance of the quality of the water during distribution.
Step 2 is very important in the development of the PHRMP because the absence of a barrier may
not necessarily become evident during other steps in the preparation of the plan. A water
supplier needs to know when any of these barriers is missing in their supply, because the
maximum level of public health protection, especially with regard to pathogens, cannot
otherwise be achieved. Supply elements that may contribute to each type of barrier are listed in
Table 2.1.
Table 2.1: Supply elements contributing to the four main barriers to bacterial contaminants

Barriers to stop contamination of raw waters: use of secure groundwaters; abstraction point positioned and constructed to avoid contamination; source protected from contamination; actions to avoid contamination of roof catchments, and contaminants being washed from roofs.

Barriers to remove particles from the water: coagulation/flocculation/clarification; dissolved air flotation; filtration.

Barriers to kill germs in the water: disinfection (chlorine, chlorine dioxide, ozone, UV light).

Barriers to prevent recontamination after treatment: measures to stop contamination of storage tanks; maintenance of a disinfecting residual; actions taken to avoid contamination during distribution; installation of backflow preventers where necessary.
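Step 2's checklist of barriers present can be captured in a few lines of code. This is a minimal sketch: the barrier names paraphrase the four fundamental barriers listed above, and the function and variable names are illustrative, not anything prescribed by the Guidelines.

```python
# The four fundamental barriers, paraphrased from Step 2 (illustrative names).
FOUR_BARRIERS = [
    "Prevent contaminants entering the raw water",
    "Remove particles from the water",
    "Inactivate micro-organisms in the water",
    "Maintain water quality during distribution",
]

def missing_barriers(barriers_present):
    """Return the fundamental barriers absent from a supply."""
    return [b for b in FOUR_BARRIERS if b not in barriers_present]

# Example: a supply with no disinfection step.
present = {FOUR_BARRIERS[0], FOUR_BARRIERS[1], FOUR_BARRIERS[3]}
print(missing_barriers(present))  # → ['Inactivate micro-organisms in the water']
```

A supply returning a non-empty list here cannot achieve the maximum level of public health protection, which is the point the text makes about knowing when a barrier is missing.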
Step 3:
Use the Guides to identify events that may introduce hazards into the water
Contribution to the PHRMP: Feeds into the supply’s risk information table
This is the first of two steps that are the basis for producing the supply’s customised risk
information table. The Guides have been prepared with the aim of identifying all possible events
associated with a particular supply element. It is possible that some events have been omitted,
or that events that are irrelevant to a particular supply have been included. For these reasons, a
water supplier needs to work through the events listed in the Guide, select those that are
relevant for their customised risk information table, and add other events of concern that have
not been considered.
Risk management
Step 4:
Use the Guides to identify:
• causes
• preventive measures
• corrective actions
Contribution to the PHRMP: Feeds into the supply’s risk information table
This is the second step contributing to the preparation of the customised risk information table.
Having identified the events relevant to their supply, the water supplier now needs to go
through the same process of identifying the causes, preventive measures, checks on preventive
measures and corrective actions that are relevant. The preventive measures, checks and
corrective actions that appear in their risk information table ought to include all that should be
in place, not simply those that are actually in place.
Section 2.2.2.2 noted that some preventive measures have been included that it might not be
possible to act on at present in some supplies. Preventive measures of this type should be
included in the supply’s risk management plan with a flag that the measure cannot be
implemented, and a note made of the reason. The supply’s assessor will verify this during the
assessment of the plan. The inclusion of these measures will serve as a reminder of the actions
that need to be taken when their implementation becomes practicable.
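One way to hold a row of the customised risk information table, including the flag for measures that cannot yet be implemented, is a simple record. The field names and the example values below are illustrative only; the Guides do not prescribe a data format.

```python
from dataclasses import dataclass

@dataclass
class RiskTableEntry:
    """One event's row in a customised risk information table (illustrative)."""
    event: str                        # event that may introduce a hazard
    causes: list                      # possible causes of the event
    preventive_measures: list         # measures that should be in place
    checks: list                      # checks on the preventive measures
    corrective_actions: list          # actions to take if a measure fails
    implementable: bool = True        # False: measure flagged as not yet practicable
    reason_not_implemented: str = ""  # noted for verification by the assessor

entry = RiskTableEntry(
    event="Low FAC concentration during chlorination",
    causes=["Dosing pump failure", "Increased chlorine demand"],
    preventive_measures=["Routine pump maintenance", "Continuous FAC monitoring"],
    checks=["Daily FAC measurement at the plant outlet"],
    corrective_actions=["Switch to the standby dosing pump"],
)
```

Recording the `implementable` flag and its reason directly in the table mirrors the text's requirement that flagged measures remain visible as a reminder for later implementation.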
Step 5:
Decide where improvements should be made in the supply to better protect public
health
Contribution to the PHRMP: Feeds into the supply’s improvement schedule
This is the first of three steps that are the basis for preparing the improvement schedule. The
purpose of this schedule is to list any of the four main barriers, preventive measures, checks or
corrective actions that are missing from a supply.
From Step 2 it will be possible to identify which, if any, of the four barriers are missing from the
supply. A secure bore water has met the first three barriers with respect to microbiological
contamination. In the absence of dissolved chemicals of public health significance, prevention of
contaminants entering the water after it is abstracted from the ground is then the only concern.
The preventive measures, checks and corrective actions that should be in place will be contained
in the risk information table as the result of Step 4. These now need to be compared with what is
actually in place in the supply. Identifying which preventive measures and checks are not in
place, but need to be, should be straightforward. The situation is different for corrective actions
however. These are actions that will not need to be taken until something goes wrong with the
preventive measures. Consequently, the main concern with the corrective actions is to make
sure that they are listed in the customised risk information table. In the event of something
going wrong, the person responsible for the supply can then refer to the table for guidance on
the appropriate action.
Step 6:
Decide on the order in which improvements need to be made
Contribution to the PHRMP: Feeds into the supply’s improvement schedule
In this step priorities must be assigned to the improvements identified in Step 5. The most
important factors to be taken into account when making these decisions are: public health,
availability of resources, and the ease with which the improvement can be made.
The suggested approach is first to produce a table that ranks the preventive measures that need
to be put in place in the order of the level of public health risk of the event they are intended to
stop. Preventive measures associated with high-risk events should be given high priority for
attention.
The Guides provide some help in obtaining estimates of public health risk:
• the risk summary in each guide indicates which events are considered to present the highest public health risk for the particular supply element, as well as the preventive measures considered most important in controlling these events
• the risk information table provides estimates of the level of risk for each event. The limitations of these typical values were discussed in section 2.2.2.2.
Where water suppliers wish to obtain an estimate of risk that is more tailored to their supply,
Appendix 2 of the Ministry’s guide to the preparation of PHRMPs (MoH 2001) describes how
the level of risk can be estimated from its two contributing factors of consequence and
likelihood. This is described more fully in section 2.2.2.6.
Once an order of importance based on public health has been determined, the water supplier
needs to consider how resources (financial and otherwise) and the ease of carrying out
improvements may modify this ranking. High priority should be given to improvements that can
be easily made at little cost. The improvement schedule from the PHRMP should contribute to
the preparation of the supplier’s asset management plan.
The PHRMP should contain information giving the reasons for the final order assigned to the
improvements; the assessor will seek this. The documentation should include:
• any information used to assess the likelihood of an event, if the supplier carried out their own risk estimation
• the basis for deciding on the priority of improvements when the qualitative estimated risk levels were the same
• information on costs of improvements
• links to the supply’s asset management plan
• a note on the ease of implementation where this influenced the ranking
• any other factors, eg, political, that have been important in making the ranking decision.
The detail provided should be proportional to the size of the supply in question: small supplies
require a minimal amount of detail, and large supplies considerably more.
Water suppliers with more than one supply face a more complex situation. When evaluating the
importance to public health of improvements required in a single supply, the population is a
common factor and does not have to be considered. When a supplier has responsibility for more
than one supply/distribution zone, however, account needs to be taken of the population when
comparing risk to public health of events in different supplies. The population of a supply
determines the number of people who may get sick, and may also influence political
considerations.
A possible approach to determining the order in which improvements should be made, or
resources allocated, to a number of supplies is as follows:
1 Prepare a PHRMP for each supply/distribution zone. The improvements schedule should not take account of resources, as account is taken of these later in the process. An overall schedule for improvements for all the supplies/distribution zones will also have to be prepared as part of the following process.
2 On the basis of the information in each plan, estimate the overall level of public health risk for each supply individually.
3 Using the level of public health risk for each supply found from the previous step (point 2), and taking account of the population, rank the supplies/distribution zones in order of their public health risk. Supplies with large populations and a high public health risk will be at the top of this list, and small supplies with a low public health risk at the bottom. Judgement will be required when determining the relative rankings where the situations are not so extreme: eg, small supplies with high public health risk, and large supplies with relatively low public health risk.
4 Having identified the supply with the greatest need for improvement from point 3, allocate funding to the highest priority improvement needed for this supply from its improvements schedule.
5 Return to point 2 and re-evaluate the overall level of public health risk assuming that the improvement in point 4 has been made and is working properly, ie, the likelihood of a particular event has been reduced. This process should be repeated until available resources are exhausted, or there are no more improvements to be made. By the end of the process, a list to provide the basis for an improvements schedule for all supplies/distribution zones will have been produced.
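The five-point process above amounts to a greedy allocation loop. The sketch below assumes each supply's public health risk can be summarised as a single numeric score weighted by population; that is a simplification of the qualitative judgement the text describes, and all names and figures are illustrative.

```python
def allocate_improvements(supplies, budget):
    """Greedy allocation across supplies (points 1-5 above, simplified).

    `supplies` maps a supply name to a dict with keys "risk" (a single
    numeric score standing in for the qualitative risk level), "population",
    and "improvements", a priority-ordered list of
    (description, cost, risk_reduction) tuples. Illustrative only.
    """
    schedule = []
    while budget > 0:
        candidates = [s for s in supplies.values() if s["improvements"]]
        if not candidates:
            break  # no improvements left to make
        # Point 3: rank by population-weighted public health risk.
        worst = max(candidates, key=lambda s: s["risk"] * s["population"])
        # Point 4: fund the highest-priority improvement for that supply.
        description, cost, risk_reduction = worst["improvements"].pop(0)
        if cost > budget:
            break  # available resources exhausted
        budget -= cost
        schedule.append(description)
        # Point 5: re-evaluate assuming the improvement is working properly.
        worst["risk"] = max(0.0, worst["risk"] - risk_reduction)
    return schedule

supplies = {
    "A": {"risk": 0.8, "population": 10000,
          "improvements": [("Install UV disinfection", 50, 0.4)]},
    "B": {"risk": 0.9, "population": 500,
          "improvements": [("Fence the intake", 5, 0.3)]},
}
print(allocate_improvements(supplies, 60))
# → ['Install UV disinfection', 'Fence the intake']
```

Note that supply A is funded first despite its lower risk score, because the population weighting dominates; this is the judgement call the text describes for "not so extreme" situations, reduced here to a single product for illustration.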
Step 7:
Draw up a timetable for making the improvements
Contribution to the PHRMP: Feeds into the supply’s improvement schedule
The final step in developing the improvement schedule is to assign a completion date and
responsibility to each improvement.
Step 8:
Identify links to other quality assurance systems
Contribution to the PHRMP: Note of other quality assurance systems in place
A PHRMP is one of a number of quality assurance systems a water supplier may have in place.
Other systems may include monitoring and maintenance programmes and ISO 9000/14000
series systems. Maintenance schedules and monitoring programmes are suggested in many of
the Guides. These, and other relevant programmes not mentioned in the Guides, should be
referenced in the plan once they are implemented.
A properly developed ISO quality assurance system should aim to achieve the same goal as the
PHRMPs, namely the protection of public health. Water suppliers with ISO systems in place
should check to ensure that the ISO system provides a degree of detail for managing public
health risk similar to that expected in the PHRMPs. If it does not, a PHRMP needs to be
developed and linked into the ISO system to cover those aspects of management not properly
dealt with by it.
A supply’s PHRMP aims to identify possible sources of hazards that may enter the supply and
the likely effectiveness of barriers to these hazards. This type of information cannot provide a
supplier with information about the actual hazards present, nor their concentrations. To
improve the assessment of actual risk to public health, monitoring, additional to that already
undertaken for compliance or process control, is of value. This additional monitoring will
identify which hazards are affecting water quality, their concentration and how variable their
concentrations are. The information will help in deciding on appropriate preventive measures.
Monitoring being undertaken for this purpose should also be referenced in the plan.
Step 9:
Prepare contingency plans
Contribution to the PHRMP: Contingency plans for each supply element
Suggested contingency plans are provided in each Guide for the supply element discussed. The
purpose of having contingency plans is to ensure that a set of steps, thought out in advance, is available for reacting rapidly to situations that may pose a major threat to the health of a
community through their water supply. A supply’s PHRMP, and its contingency plans in
particular, therefore need to be readily accessible to those who are likely to have to make supply
management decisions in such an emergency.
Suppliers should determine which of the contingency plans in the Guides are relevant to their
supply, and include additional ones if a potential situation of high risk is not covered by the
existing contingency plans. The contingency plans in the Guides provide a template for the
preparation of any new plans needed.
Contingency plans have been prepared to cover situations in which normal corrective actions
have failed to stop hazards entering the distribution zone. They are intended to deal with
circumstances in which high levels of pathogens have entered the distribution zone, or when
there is acute risk from chemical contaminants. Acute chemical risks may arise from such
incidents as chemical spills, volcanic eruptions, or flooding, which may deposit high
concentrations of chemical contaminants into a source water.
Drought is normally associated with water shortage, but it can also affect water quality. Cyanobacteria may become more abundant (eg, as occurred in Kaitaia in the 2009/10 summer), and domestic sewage can become stronger, with a possible reduction in effluent quality followed by reduced dilution in the receiving water. Ash and subsequent runoff after forest fires overseas have closed water treatment plants. Prolonged drought can cause groundwater levels to fall, increasing the risk of saline intrusion (CDC 2010). DWI (2012) discusses health impacts related to extreme water shortage events. UK Government policy is for emergency plans to go beyond routine operational events and prepare for events that may cut off water to a large number of consumers for over 72 hours and may involve more than one water supply or company. Planning also needs to take into account how an extreme event will affect the logistics of distributing alternative supplies, the health of the population without a water supply, power and sanitation, and how such periods will differ from routine operational events. Extreme events affect health beyond drinking-water, and this should be taken into account when planning the response and recovery.
Step 10:
Prepare instructions for performance assessment of the plan
Contribution to the PHRMP: Set of instructions for review of the performance of the
PHRMP
This step in the preparation of a supply’s PHRMP sets down a procedure for the review, and
where necessary, updating of the plan. The need to update a plan may arise because of:
• a change in the circumstances of a water supply
• the identification of possible new events and their causes
• the discovery that one or more preventive measures or corrective actions are unsatisfactory
• the failure of a contingency plan when implemented.
Any one of these reasons leads to the need to modify the plan to minimise its weaknesses.
The PHRMP performance assessment section of the Guides can be used as the basis for
preparing instructions for reviewing the operation of the overall PHRMP for the supply. In
addition to the components of the review noted in the Guides, the review instructions should
include the need to:
• note the frequency at which the plan should be reviewed
• record any events that have occurred since the last review, and the actions taken as a result of the event. These actions may include improvements to preventive measures, the introduction of additional preventive measures, corrective actions, and new monitoring or maintenance programmes
• record changes, additions or deletions that have been made to supply elements
• re-evaluate the improvement schedule. Changes occurring between reviews may require a revision of the relative importance of the improvements needed, and consequently a reordering of the schedule.
Step 11:
Decide on communication policy and needs
Contribution to the PHRMP: Set of instructions for reporting
The communication section of the plan should identify and record the people to whom reports
concerning the management of risk to the supply should be made, what information these
reports should contain and how often they should be made.
The people who need to receive reports will depend on the management/ownership structure of
the supply. For example, a school may be required to report to its board of trustees and a
municipal water supply manager to his or her managers, the local authority councillors, and the
ratepayers.
The nature of the material reported, and the language used, need to be appropriate for the
recipient(s) of the report. Thought should also be given to the way in which recipients may
perceive risk and how this may need to influence the wording of the report. Perceptions of risk
can vary widely depending on such things as the assumptions, concepts and needs of the
stakeholders.
2.2.2.5 The implementation of public health risk management plans
Figure 2 in the Ministry of Health’s How to Prepare and Develop PHRMPs publication
describes what should be done with the PHRMP once it has been prepared. This diagram is
summarised in Figure 2.5.
Figure 2.5: Process for the implementation of PHRMPs
Refer to improvement schedule in plan
Follow the timetable of the schedule. Put in place:
• preventive measures
• checks
• corrective actions
that are needed, but not already present.
Review information gathered by monitoring and
maintenance programmes
Refer to and use contingency plans
Review how well the plan is working and make
changes where necessary
Step 1:
Refer to the improvement schedule.
Step 2:
Follow the timetable in the schedule for making improvements.
By following the improvement schedule the water supplier should be able to:
• determine which capital works need to be undertaken and when
• determine whether any new plant is scheduled for installation and when
• put in place monitoring programmes. These should state:
– what is being monitored
– when samples are to be taken
– where samples are to be taken
– who will take the samples
– which laboratory is to be used, or whether the measurement will be carried out by works
or field staff
– what is to happen to the results
• put in place maintenance programmes. These should state:
– what is to be checked and maintained
– how often checks are to be made
– who is to make the checks
– what is to happen to the check results
• put in place staff training programmes. These should state:
– the purpose of the training
– which staff are to be trained
– how often refresher courses are needed.
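The contents of the monitoring, maintenance and training programmes above can be recorded as simple structured records. All field names and values below are illustrative examples only; the Guidelines do not prescribe a data format.

```python
# Illustrative records only; field names and values are examples, not requirements.
monitoring_programme = {
    "determinand": "Free available chlorine",   # what is being monitored
    "when": "Daily, 9 am",                      # when samples are to be taken
    "where": "Plant outlet",                    # where samples are to be taken
    "sampler": "Duty operator",                 # who will take the samples
    "analysis": "Field measurement",            # laboratory, works or field staff
    "results_to": "Supply manager; daily record sheet",
}

maintenance_programme = {
    "item": "Chlorine dosing pump",             # what is checked and maintained
    "frequency": "Monthly",                     # how often checks are made
    "responsible": "Plant technician",          # who makes the checks
    "results_to": "Maintenance log",            # what happens to check results
}

training_programme = {
    "purpose": "Safe operation of disinfection equipment",
    "staff": ["Duty operators"],                # which staff are to be trained
    "refresher_interval_months": 12,            # how often refreshers are needed
}
```

Keeping the records in one place makes it straightforward to reference them from the PHRMP, as Step 2 requires.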
Step 3:
Review information gathered by monitoring and maintenance programmes.
The PHRMP should record how frequently information from monitoring and maintenance
programmes should be reviewed, by whom, and to whom they should report in the event of
something of concern being spotted.
Reviews of this nature are important in helping staff become familiar with levels of
determinands, or conditions, that are normal and satisfactory, and those that are not. These
reviews and alertness to changes, or the occasional result of concern, may provide signs of
possible future problems. Identification of problems at an early stage may allow remedial
actions to be taken before a significant threat to public health develops.
All supply staff have a responsibility for ensuring that good quality water reaches the consumer.
Irrespective of the job a staff member has, if they become aware of a problem this information
must be passed to their manager as soon as possible.
Step 4:
Refer to and use the contingency plans if necessary.
Unlike the other steps in this sequence, contingency plans will not be used on a regular basis.
When a contingency plan has to be used, the actions that need to be taken depend on such
things as the type of hazard that is in the water, its likely concentration, and how far it has
travelled into the distribution system. Consultation with the Medical Officer of Health may be
necessary in assessing the seriousness of the event and what actions need to be taken.
As with other aspects of the PHRMP, it is important to discover why it became necessary to use
the contingency plan, and any shortcomings of the contingency plan itself. Both sets of
information can be used to modify and improve the plan.
Step 5:
Review the operation and performance of the Plan.
This is discussed in step 10 in section 2.2.2.4.
Step 6:
Return to step 1.
The series of steps outlined above needs to be followed on a regular basis. This ensures that:
• the need for improvements to the supply is addressed regularly
• the Improvement Schedule is updated to take account of improvements
• the plan is modified and improved as experience shows where there are weaknesses.
As time goes on, the degree of modification required should diminish as the system becomes
more refined, although major changes to the supply may require the re-identification of events,
causes, preventive measures etc.
For the plan to be of value it must be used, and this is more likely to happen if it is kept current
and can be used by the water supply manager as a guide to the use of resources.
2.2.2.6 Risk analysis
Risk analysis is performed to separate minor risks from major risks, and to provide information
that will help in the evaluation and treatment of risks. Identification of the level of risk
associated with a particular event assists in establishing the priority that should be given to
putting in place preventive measures to reduce the likelihood of the event occurring.
Risk analysis can be undertaken at various levels of refinement: qualitative, semi-quantitative,
quantitative, or a combination of these depending on the circumstances. Which is used will
depend on the information available. Unless the information on which the analysis is based is
very reliable, a set of numbers produced by quantitative calculations may give a false sense of
reliability to the analysis. Should quantitative analysis be undertaken, it is advisable to carry out
a sensitivity analysis to determine how the results vary as the individual assumptions made in
the calculation are varied. This will show the reliability of the calculated risks.
Risk is measured in terms of consequences and likelihood (AS/NZS 2004). Thus, the level of
risk of an event that has a high probability of occurring and which may lead to severe illness and
death is very high. An event that may occur very intermittently, and with very little effect on
public health has a low level of risk associated with it.
To evaluate the level of risk associated with an event, an estimate of how frequently such an
event is likely to occur, and an appreciation of the effects on public health of the event, if it were
to occur, is needed. Where sufficient data are available it may be possible to calculate the
probability of the event occurring and the severity of its consequences. Situations where there
are sufficient data to carry out such calculations for drinking-water supplies are rare. The water
supplier therefore needs to rely on qualitative estimates of likelihood and consequence.
Assistance in evaluating consequence can be gained from understanding the factors that
contribute to it. These include the:
• number of people that are exposed to the hazard(s); the greater the number of people exposed, the more severe the consequences
• nature of the hazard and its likely effect on health, which requires consideration of its concentration in the water, eg, the effects of elevated levels of algal toxins in the water are much more severe than the presence of an organism that may lead to mild diarrhoea
• duration of exposure to the hazard(s); longer exposures may increase the severity of the health effects and increase the number of people suffering these effects.
For most events the water supplier is unlikely to have values for most of these factors. Apart from the population, a broad classification of the hazard, ie, whether it is microbiological or chemical, may be the only guide to the severity of the consequences. The likelihood factor may
therefore best assist the water supplier in estimating the level of risk for the event. Sources of
information that can be of value in doing this are:
• past records
• the water supplier’s own experience
• the experience and practice of the water supply industry as a whole
• published research
• the opinions of specialists and other experts.
The best guidance water suppliers have for estimating the likelihood of an event is from their
own records and staff experience. The international literature may occasionally make comments
about the frequency at which certain events occur. These do provide some guidance, but they
may be an average value, or derived from a single supply, neither of which will necessarily
provide a reasonable estimate of the frequency for the supply in question.
Appendix E of AS/NZS 4360:2004 contains an example of how qualitative levels of risk can be
derived from qualitative estimates of consequence and likelihood. The tables for consequence
and likelihood used in this example can be modified to provide descriptions that are more suited
to water supplies. The following are suggested alternatives. For a given water supply, where the
population is fixed, the descriptors for consequence may be better linked to the percentage of
the population affected and the nature of the effect, eg, mild gastrointestinal upset, severe
diarrhoea etc.
Likelihood scale (likelihood ranking: description)
Rare: May occur only in exceptional circumstances
Unlikely: Could occur
Possible: Might occur at some time
Likely: Will probably occur
Almost certain: Is expected to occur in most circumstances
Consequence scale (consequence ranking: description)
Insignificant: Insignificant
Minor: Minor impact for small population
Moderate: Minor impact for big population
Major: Major impact for small population
Catastrophic: Major impact for big population
This gives the following estimates of risk:

Likelihood       Consequences
                 Insignificant   Minor      Moderate   Major     Catastrophic
Almost certain   High            High       Extreme    Extreme   Extreme
Likely           Moderate        High       High       Extreme   Extreme
Possible         Low             Moderate   High       Extreme   Extreme
Unlikely         Low             Low        Moderate   High      Extreme
Rare             Low             Low        Moderate   High      High
Scales of likelihood and consequence, and a risk matrix, that are better suited to water supply use than these general tables are given in Tables 4.2 and 4.3 of the WHO Guidelines for drinking-water quality (WHO 2004).
2.2.3 Contingency planning
Water supply authorities should identify and assess any local conditions that may threaten the
integrity of their system (refer section 2.2.2.2). It is essential that water suppliers develop
contingency plans to be invoked in the eventuality that an emergency arises. These plans should
consider:
• potential natural disasters (such as earthquakes, volcanic eruptions, algal blooms, droughts and floods)
• accidents (spills in the catchment or recharge area)
• areas with potential backflow problems (including ones with fluctuating or low pressures)
• damage to the electrical supply
• damage to intakes, treatment plant and distribution systems
• human actions (strikes, vandalism and sabotage).
Contingency planning should establish a series of steps and procedures for dealing with emergencies. The plans should clearly specify responsibilities within the water supply authority and with outside authorities for co-ordinating the response. This should include a communications
plan to alert and inform users of the supply, plans for providing and distributing emergency
supplies of water, and liaison with the Medical Officer of Health or other designated officer of
the Ministry of Health. These plans should be developed in liaison with civil defence personnel.
Contact with civil defence should be maintained and the plans updated.
The contingency plans should also cover:
• assignment of responsibilities
• priorities for dealing with multiple problems
• investigation of all probable causes of the emergency
• an assessment of the public health risk arising from the emergency
• an epidemiological investigation if deemed appropriate by the Medical Officer of Health (if a causal relationship between the water supply and illness is suspected but not obvious)
• action required to mitigate any public health risks which may have been revealed by the emergency (this may include initiating legal proceedings if negligence can be proved)
• advising and liaising with the Medical Officer of Health.
Occasional emergency exercises will help the public develop confidence in their water supply at
the same time as the water supplier and cooperating parties learn how to cope.
Chapter 7 of WHO (2003) addresses emergency planning and response issues in water supplies.
WHO (2011a) provides a range of technical notes covering emergencies.
2.2.4 Response to incidents
One of the key measures of success for a risk management system is how well an organisation responds to a risk event. Even with the best risk management system in place, things will go wrong and organisations need to respond. In fact, responding to a number of small events can be positive, as it can help organisations identify areas that need to be improved in order to prevent larger, more serious events from occurring. Risk events have a number of characteristics:
• they can get worse
• they can have wide-ranging impacts
• their effects can be ongoing.
For example, heavy rain may initially cause flooding and resources could be focused on
protecting habitable floors and maintaining road networks. However, if the heavy rain and
flooding starts to contaminate the water being received at the treatment plant there may be a
risk to public health from drinking water, and boiled water notices may need to be issued. The incident is now far worse than initially thought: the communication needs are much greater, the impacts may affect a whole community, it may take days to rectify the situation, and the ongoing investigation into why the event occurred may take several months.
Organisations therefore need systems, communication and responsibility networks and trained
staff to be able to recognise and respond to the changing circumstances that occur in an event
and implement the contingency plans that should have already been developed. Risk
communication is discussed thoughtfully in Chapter 14 of WHO (2001). Water UK (2010a) has
prepared a Technical Guidance Note to help water suppliers prepare. Risk communication is any
purposeful exchange of information about risks between interested parties. More specifically,
risk communication is the act of conveying or transmitting information between parties about a
range of areas including:
• levels of health or environmental risks
• the significance or meaning of health or environmental risks
• decisions, actions or policies aimed at managing or controlling health or environmental risks.
Interested parties include government agencies, corporations and industry groups, unions, the
media, scientists, professional organisations, interested groups, and individual citizens.
2.2.4.1 Incident levels
As risk events can escalate very quickly, become far more complicated and have more widespread effects, organisations often develop a system of incident levels. At the lowest level, eg, minor flooding, the situation may be handled by normal work crews, with management and call centres informed of the extent of the flooding. As the situation worsens, a higher level may be triggered, with more senior staff and specialist staff called in to undertake tasks such as investigating the water catchment. An even higher level may be triggered if it is likely that the water supply may be contaminated, with outside agencies such as the Ministry of Health and Civil Defence informed. On a larger scale, the USEPA has begun to address issues related to
terrorism, see http://cfpub.epa.gov/safewater/watersecurity/index.cfm
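A tiered scheme of the kind described above can be sketched as a simple escalation rule. The three levels and their trigger conditions below are assumptions chosen for illustration, not a prescribed national scheme.

```python
# Illustrative incident-level scheme, loosely following the flooding example
# above. The levels and triggers are hypothetical, for demonstration only.
def incident_level(flooding: bool, catchment_affected: bool,
                   supply_possibly_contaminated: bool) -> int:
    if supply_possibly_contaminated:
        return 3  # notify outside agencies, eg, Ministry of Health, Civil Defence
    if catchment_affected:
        return 2  # call in senior and specialist staff to investigate
    if flooding:
        return 1  # normal work crews; inform management and call centre
    return 0      # routine operations

print(incident_level(flooding=True, catchment_affected=True,
                     supply_possibly_contaminated=False))  # 2
```

In practice each level would also carry defined communication and reporting duties, as described in the contingency planning section above.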
2.2.4.2 Organisation
Initially when an event occurs, the focus is often on putting the immediate situation right.
However, as the event develops and escalates there needs to be increased focus on planning,
communication and logistics. People need to be thinking about what is going to happen in one
hour, four hours, the next day and the next week, and they need to start to put in place plans for
dealing with these situations. Key people need to be communicating with the public, press and
health authorities. Yet others will need to be addressing logistical issues such as organising
emergency staff for the next shift, and the materials and equipment required to rectify the
situation. If these areas are not addressed then the immediate situation may be fixed but the risk
to public health may remain. For example in the situation discussed at the beginning of this
section, if the contaminated supply to the treatment plant is isolated, but boiled water notices
are not communicated to the whole community, then the public will be unaware of the potential
risks and may continue to drink contaminated water.
Some organisations are therefore using the Coordinated Incident Management System (CIMS), as shown in Figure 2.6, to structure the teams that respond to incidents. Under
the CIMS structure an Incident Controller is assigned who has overall responsibility for
managing the incident. Reporting to the Incident Controller are personnel who are responsible
for operations, planning, communications and logistics. Advantages of CIMS are:
• it provides clearly defined roles and responsibilities
• it compartmentalises thinking, so personnel only have to think about their particular tasks rather than trying to tackle the whole incident and missing critical issues in the process
• it provides a common management structure and terminology with emergency organisations such as the Fire Service and Civil Defence.
Figure 2.6: Organisation of incident management teams
[Figure: an organisation chart with the Incident Controller at the top, and the Operations, Planning, Communications and Logistics functions reporting to the Incident Controller.]

2.2.5 Debriefings
Debriefings should be conducted after all incidents and exercises. Their purpose is to use the
experiences and lessons learned during the incident or exercise to make improvements, so that
further incidents can either be prevented from recurring or managed more effectively.
Debriefings should not be seen as a blame laying exercise; rather they should be seen as a
positive step for improving the organisation’s risk management.
Debriefings should involve all participants in the incident or exercise including contractors,
service providers, affected parties and regulatory agencies.
The debriefing process involves:
• description of the events: describing the incident in detail, listing the names of the people involved, the sequence of events, the impacts of the incident and any relevant information. At this stage it is important that only the facts are recorded and assumptions are not made
• corrective action: the immediate actions taken to fix the problem are described, and the people and organisations informed of the incident are noted
• immediate post-incident debriefing: the views of the participants immediately after the incident are recorded. This information may be subjective and may be just one person’s view, but it gives a basis for further investigation during the structured debriefing
• structured debriefing: a root cause analysis is conducted. This involves looking at each event and asking why it happened. The question continues to be asked until the team has drilled down to the root causes of the incident. The analysis considers:
– the physical causes of the incident
– resources, eg, equipment
– available information, both before and during the incident
– human resources, eg, availability of resources and training
– communication, both before and during the incident
– planning and procedures, both to avoid the incident and to respond to the incident
– processes, eg, whether the plans and procedures that are in place were followed
– leadership
• preventive actions: following the debriefing, the actions required to prevent a recurrence or to improve the effectiveness of the responses are identified. Staff are allocated responsibility for actioning these items and a timeline is set.
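The "why?" drill-down used in a structured debriefing can be sketched as follows. The incident chain is a hypothetical example constructed for illustration, not a recorded event.

```python
# Minimal sketch of a root cause analysis: follow each "why?" answer until
# no further cause is recorded. The chain below is hypothetical.
why_chain = [
    ("Boiled water notice issued",
     "Raw water turbidity exceeded plant capacity"),
    ("Raw water turbidity exceeded plant capacity",
     "Heavy rain washed sediment into the intake"),
    ("Heavy rain washed sediment into the intake",
     "No high-turbidity intake shut-off procedure"),
]

def root_cause(chain):
    """Walk the (event, cause) pairs from the first event to the root cause."""
    causes = dict(chain)
    event = chain[0][0]
    while event in causes:
        event = causes[event]
    return event

print(root_cause(why_chain))  # No high-turbidity intake shut-off procedure
```

The preventive action identified at the debriefing would then address the root cause (here, a missing procedure) rather than only the immediate symptom.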
2.2.6 Sanitary surveys
The expression ‘sanitary survey’ is used internationally to cover a wide range of activities. The
following terminology has been used in the DWSNZ:
a) catchment assessment: assessing what may affect the raw water quality
b) sanitary inspection: inspecting the whole drinking-water supply
c) bore head protection: inspecting bore heads to ensure they provide adequate sanitary protection
d) protozoal risk categorisation: considering the catchment in terms of its protozoal risk (Appendix 3 in the DWSNZ).
Sanitary surveys of the catchment, abstraction point, treatment plant, and distribution system
should be undertaken by the water supply authority as part of any programme of risk
management. They should be conducted with sufficient frequency to be useful in interpreting
trends or sudden or significant changes in water quality as revealed by routine monitoring.
The surveys identify potential risks, whilst monitoring can record process performance and
water quality trends, whether contamination is occurring, and the extent and the intensity of
that contamination.
The USEPA defines a sanitary survey (Title 40 CFR 141.2) as an onsite review of the water
source, facilities, equipment, operation, and maintenance of a public water system for the
purpose of evaluating its ability to produce and distribute safe drinking-water. The sanitary
survey should be conducted by qualified persons and identify contamination or deficiencies and
inadequacies in the catchment, treatment plant or distribution system, which could result in
failure to control contamination should it occur. The USEPA (1999) prepared a sanitary survey
guidance manual, which contains a lot of valuable information.
A sanitary survey is indispensable for the proper interpretation of analytical results. No
microbiological or chemical survey, however carefully it is made, is a substitute for a thorough
knowledge of the conditions at the source and points of abstraction, the treatment process and
the distribution system. Sample results represent single points in time; the sanitary survey
provides information to determine whether the analytical results are likely to be typical.
Contamination may be random and intermittent, and if so, it is rarely revealed by occasional
sampling. However, a sanitary survey may identify a potential source of contamination that may
then be investigated by targeted monitoring.
A catchment assessment should review such items as land use, whether road or rail systems pass
through the catchment, disposal of human and animal wastes, storage and use of chemical
contaminants such as pesticides, the presence of existing sanitary landfills and old dumps,
release of nutrients, erosion status, levels and disease carrier status of animals (feral,
agricultural, and domestic), protection of intake structures from human and animal access,
sealing of well casings, protection of wells from flooding, and human access restrictions and
security. These factors should be assessed in relation to climatic and hydrological conditions.
The frequency with which catchment sanitary surveys should be performed will be a function of
factors such as access control, existing risks, size of population served, accessibility of catchment
and seasonal conditions, so cannot be rigidly specified. As a general rule, a thorough survey
should be performed every five years, with several less detailed inspections occurring within that
period, or when any change in land use or water quality is suspected.
2.2.7
Staff and contractors
The successful management of water supply systems depends on having staff and contractors who have the necessary knowledge and ability to manage and operate the system, identify potential risks and propose improvements.
The procedure for developing a programme to ensure that all staff have the necessary skills and
training will typically involve:
1 preparing job descriptions: each employee should have a detailed job description that sets out their duties, key result areas and performance criteria
2 training needs analysis: meetings are normally held with individual staff members to identify gaps between the duties that the staff are required to undertake and their skill level. Training needs are identified and prioritised
3 training programme development: from the training needs analysis the most suitable type of training is determined. Training may be a mix of:
– on-job training
– off-site training
– informal meetings and conferences
– encouragement to belong to professional and technical organisations
– internal advocacy of the knowledge industry through actions such as internal distribution of technical journals and encouragement to attend local interest meetings
– recognition of current competencies
4 development of, and budgeting for, a training programme for water staff
5 auditing of the training programme – to assess whether the programme has been initiated and prove that the required levels of competency have been achieved.
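Step 2, the training needs analysis, is at heart a gap comparison between the skills a job description requires and the skills a staff member currently holds. A minimal sketch, with hypothetical skill names:

```python
# Hypothetical training needs analysis: the gap between required and held
# skills becomes the prioritised training list. Skill names are illustrative.
required = {"chlorination operation", "turbidity monitoring", "PHRMP procedures"}
held = {"chlorination operation"}

training_needs = sorted(required - held)
print(training_needs)  # ['PHRMP procedures', 'turbidity monitoring']
```

The same comparison, run per employee against their job description, yields the inputs to the training programme development in step 3.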
The MoH has prepared a PHRMP Guide on Staff Training, Ref G1.
It is recommended that water suppliers undertake a similar process when engaging contractors.
This would involve:
1 preparing task descriptions
2 identifying minimum required levels of training and experience
3 detailing required training and experience levels in the contract documents
4 auditing to ensure that the contractor’s staff have the required levels of competency.
Water Industry Training has developed a number of national qualifications in partnership with
the water industry. The qualifications have been designed to include practical training and
assessment at the workplace, complemented with theory-based training through accredited
training providers. The qualifications currently available include:
1 National Certificate in Water Treatment – Site Operator: includes drinking-water treatment theory, and practical operation of a range of conventional treatment systems. The qualification requires approximately two years of part-time study and on-job learning
2 National Diploma in Water Treatment – Site Technician: includes quality assurance, safety, and managing and optimising advanced water treatment processes on site. The qualification requires approximately two years of part-time study and on-job learning
3 National Certificate in Water Reticulation: includes trenching technology, safety, reticulation systems and disinfection. The qualification requires approximately one year of part-time study and on-job learning
4 National Diploma in Drinking-water Assessment: includes treatment technology, assessing and implementation of PHRMPs, and communication skills. The qualification requires approximately two years of part-time study and on-job learning.
It is also important for water suppliers to have in place a good sanitation and housekeeping
programme to ensure that the actions of staff and contractors do not contaminate the water
supply. Items typically covered in such a programme include:
• all employees and personnel in the plant must wear clean outer clothing
• employees working in the water processing areas must wash and sanitise their hands before returning to the work area and any time their hands may have become soiled or contaminated
• eating, drinking, smoking, or engaging in any other activity around the water processing areas which may introduce contamination of any kind is prohibited
• an effective hair restraint is required of all employees or personnel in the water processing areas
• no person affected by disease in a communicable form, or while a carrier of such disease, or while affected with boils, sores, or infected wounds, shall work in a water plant in any capacity in which there is any remote possibility of the water supply becoming contaminated by that person, or of a disease being transmitted by that person to other individuals working within the water plant
• only authorised employees and personnel are allowed in the water processing areas
• signs should be posted outlining the above requirements.
2.3 Quality assurance
2.3.1 Key features of quality assurance
The overriding principle of quality assurance is that if the process used to deliver the end
product and the factors that impact upon that process are well understood then it is possible to
implement measures to control the process so that a product of consistently high quality is
achieved.
Whereas risk management focuses more on the actual processes used to deliver the quality end product, quality assurance focuses more on understanding and managing the factors that impact on those processes. In the context of a water supply system, risk management focuses on the processes of collecting, treating and then distributing water and the interrelationships between these processes. But, as well as understanding the risks of the processes, it is equally important to understand the other factors that control and support the provision of safe drinking-water: regulatory requirements, organisational structure and processes, and human and financial resources.
Examples include:
• the Ministry of Health, with their requirements being set out, for example, in the DWSNZ
• other government agencies, with their requirements being set out in legislation and policies such as the Local Government Act and Resource Management Act
• councillors, the community and water users, with their requirements outlined in documents such as by-laws, long-term council community plans and supply contracts
• everyone in the organisation is involved and takes responsibility for ensuring that the part of the process in which they are involved is functioning effectively. Responsibility, decision-making and ownership are delegated as far down the chain of command as possible
• there are enough personnel, and they have adequate experience and training to undertake their tasks
• there is enough equipment, and it is maintained so that it remains accurate and reliable
• standard procedures are in place to provide direction and allocate responsibilities to staff when they are undertaking critical tasks
• monitoring ensures that the systems are working well and provides early warning of possible problems
• surveillance is directed to ensuring that the whole process is right, not merely checking the quality of the product at the end of the process.
2.3.2 Application to drinking-water supplies
The Public Health Grading of drinking-water sources, treatment plants and distribution systems
includes a requirement to have an approved quality management system if the highest grading
(A1 for treatment, a1 for the distribution system) is to be achieved, see Chapter 18, section 18.4
for a discussion on aesthetic guidelines methodology. The scope of such a management system
would cover all aspects, from source to consumer.
The basic structure of a quality management system that is appropriate for a community
drinking-water supply is shown in Figure 2.7.
Figure 2.7: Basic structure for a quality management system
[Figure: a documentation hierarchy with the quality policy at the top, supported in turn by the quality manual, work instructions, supporting documentation, and monitoring records.]
ISO 9001:2000 requires organisations to document the following:
• Quality policy that outlines the organisation’s commitment to meet customer, legal and regulatory requirements. The quality policy is supported by the quality objectives the organisation strives to achieve. Quality objectives must be measurable and communicated throughout the organisation. Quality objectives are often developed as part of the preparation of long-term council community plans.
• Quality manual that includes documented procedures for managing the quality system. It is mandatory that the quality manual include documented procedures for:
– control of documents: they must be legible, identified, reviewed, authorised, distributed and periodically updated
– control of records: they must be legible and easy to identify and retrieve
– planning and conducting internal audits: these must be undertaken regularly for each area covered by the quality system. Audit results must be reported, recorded and follow-up actions verified
– non-conforming product (ie, product that does not meet the quality objectives): when non-conformances occur they must be investigated and actions implemented to prevent recurrences
– preventive actions: the same systems that must be in place for dealing with non-conforming products are required to be in place for dealing with potential problems that have not yet resulted in defective products.
The quality manual is also required to include a description of the interaction of the processes
that make up the quality system. This normally takes the form of flowcharts that may for
example show the various stages of the collection process and the measures taken to control it
and how it interacts with the treatment process. The Ministry of Health’s guide How to Prepare
and Develop PHRMPs for Drinking-water Supplies contains several examples of flowcharts
that can be used to describe the water supply process.
• Work instructions: these cover procedures for undertaking specific tasks for which the organisation considers it is necessary to have a documented procedure in place to control the process. Work instructions normally cover:
– the scope of the procedure, ie, what activities are covered/not covered
– who is authorised to undertake the task, eg, their required qualifications or experience
– the procedures that must be followed
– processes for checking that the work has been completed correctly
– processes for reviewing, authorising, distributing and updating the work instructions.
• Supporting documentation that includes externally sourced documentation such as manufacturers’ manuals, reference standards and operating manuals.
• Records that are kept to demonstrate that the quality system is working correctly. Examples of records normally kept include:
– training records
– machinery calibration and maintenance records
– records from suppliers of materials
– results of tests and measurements undertaken
– details of internal audits and follow-up actions
– meeting minutes
– correspondence.
Documentation can either be paper-based or electronic. Increasingly, organisations are using
databases or websites to publish and store documents, as they are easier to update and provide
staff with better access than paper-based systems.
The quality assurance system should be seen as a living entity. To stay effective the system needs
to be adapted to accommodate items such as changing circumstances, changing requirements,
the identification of new hazards, or identification of improved ways of doing things.
Organisations are also tending to produce integrated management systems that cover risk
management, asset management, health and safety, environmental and financial matters, as
well as quality, all under the same system. In doing so they are recognising that when tasks are
being undertaken, employees do not consider, for example, health and safety in isolation, and
then quality, but they need to consider all of these aspects at the same time. By developing an
integrated management system organisations can simplify the amount of documentation
required and develop documentation that reflects the way that work is actually carried out.
2.4 Quality control
Quality control provides the checks to demonstrate that risk management and quality assurance have produced a product that complies with standards, and provides feedback on the adequacy of risk management. In many situations the DWSNZ have set transgression levels. A transgression may not result in a non-compliance. Using quality control principles, a water supplier will establish control limits, with the aim of triggering some action to prevent the value reaching a transgression level or operational requirement. Control limits, and the actions to be followed when they are reached, should be covered in PHRMPs. The Ministry of Health evaluates the compliance of a drinking-water supply with the DWSNZ on a regular basis and uses this in determining the Public Health Grading of community drinking-water supplies and in preparing its annual report on drinking-water quality in New Zealand.
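The relationship between a control limit and a transgression level can be sketched as a simple classification of each monitoring result. The numeric values below are illustrative only; actual limits come from the DWSNZ and the supply's own PHRMP.

```python
# Sketch of a control-limit check for a monitored determinand. The figures
# in the example are illustrative; real limits come from the DWSNZ/PHRMP.
def check_result(value, control_limit, transgression_level,
                 higher_is_worse=True):
    """Classify a monitoring result against a control limit set below the
    transgression level, so preventive action is triggered first."""
    if not higher_is_worse:  # eg, a minimum disinfectant residual
        value, control_limit, transgression_level = (
            -value, -control_limit, -transgression_level)
    if value >= transgression_level:
        return "transgression: respond and record as required"
    if value >= control_limit:
        return "control limit reached: take preventive action per the PHRMP"
    return "in control"

# eg, turbidity (NTU) with an illustrative control limit of 0.8 set below
# an illustrative transgression level of 1.0
print(check_result(0.9, control_limit=0.8, transgression_level=1.0))
# control limit reached: take preventive action per the PHRMP
```

Setting the control limit inside the transgression level gives the operator time to act before a result would count against compliance.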
This section discusses compliance in general terms. Chapters 6–11 discuss the DWSNZ
compliance criteria and requirements in more detail. Demonstrating compliance with the
DWSNZ requires more than demonstrating that the water quality is satisfactory. Other
requirements, which are described in more detail in section 3.1.1 of the DWSNZ, include
demonstrating:
a) the prescribed number of samples have been taken from the correct places at the prescribed frequencies
b) the samples have been analysed according to approved methods and by a Ministry of Health recognised laboratory
c) compliance requirements have been met for the previous 12-month period
d) the necessary actions have been taken in response to results
e) up-to-date records are kept.
In addition to keeping good records, good quality control practice includes:
a) reviewing results on a regular basis for trends or changes
b) reporting the results to those who need to know – water supply staff, management, health officials, community
c) reviewing and updating PHRMPs and associated documentation such as procedures.
A national database system for drinking-water, Water Information New Zealand (WINZ), serves
a multitude of roles in ensuring water supplies are identified, their water quality assessed and
their risks managed. WINZ is the primary database for managing compliance with DWSNZ and
public health grading, and records supply-specific characteristics, monitoring results, and
responses to transgressions.
References
ANSI/AWWA G100-05. 2005. Water Treatment Plant Operation and Management (1st edition).
Denver CO: AWWA Standard, American Water Works Association.
ANSI/AWWA G410-09. 2009. Standard for Business Practices for Operation and Management.
Denver CO: AWWA Standard. American Water Works Association.
AS/NZS ISO 9001. 2000. Quality Management Systems – Requirements. Standards Australia/
Standards New Zealand.
AS/NZS. 2004. AS/NZS 4360:2004. Risk Management. Standards New Zealand.
AWWA. 1994. Overcoming natural disasters. Articles by Reid, McMullen, Murphy and Shimoda in
Journal American Water Works Association 86(1).
AWWA. Manual M5: Water Utility Management. Denver CO: American Water Works Association.
www.awwa.org
CDC. 2010. When Every Drop Counts: Protecting public health during drought conditions – a
guide for public health professionals. Centers for Disease Control and Prevention, US
Environmental Protection Agency, National Oceanic and Atmospheric Agency, and American Water
Works Association. Atlanta: US Department of Health and Human Services, 56 pp. See
http://www.cdc.gov/nceh/ehs/Publications/Drought.htm
Codex Alimentarius. 1993. Guidelines for the Application of the Hazard Analysis Critical Control
Point (HACCP) System. CAC/GL 18-1993.
DWI. 2012. Health Impacts from Extreme Events Water Shortages. 92 pp.
http://dwi.defra.gov.uk/research/completed-research/reports/DWI70-2-263.pdf
Geldreich EE. 1996. Microbial Quality of Water Supply in Distribution Systems. Boca Raton: CRC
Lewis Publishers.
MoH. Public Health Risk Management Plan Guides. The New Zealand Ministry of Health’s Guides
for drinking-water supplies can be accessed as Word documents on the Ministry of Health website:
http://www.moh.govt.nz/water then select publications and Public Health Risk Management Plans.
First, read: How to prepare and develop PHRMPs.
MoH has prepared a PHRMP Guide on Staff Training, Ref. G1.
MoH. 2005. Drinking-water Standards for New Zealand. Wellington: Ministry of Health. This
version was revised 2008.
MoH. 2008. Guidelines for the Safe Carriage and Delivery of Drinking-water. Wellington: Ministry
of Health. Available online at http://www.moh.govt.nz/water
MoH. 2008. Small Drinking-water Supplies: Public health risk management kit. Available online at
http://www.moh.govt.nz/water.
MoH. Register of Community Drinking-water Supplies and Suppliers in New Zealand. Wellington:
Ministry of Health.
NHMRC. 2012. Australian Drinking Water Guidelines, Community Water Planner – A tool for
small communities to develop drinking water management plans, Community Water Planner,
User Manual. www.nhmrc.gov.au/guidelines/publications/eh52web or
http://www.nhmrc.gov.au/publications/synopses/eh39.htm or
http://www.communitywaterplanner.gov.au/resourceportal/35
NZS. 2000. SNZ HB 4360:2000. Risk Management for Local Government. Standards New Zealand.
USEPA. 1999. Guidance Manual for Conducting Sanitary Surveys of Public Water Systems;
Surface Water and Ground Water Under the Direct Influence (GWUDI). EPA, United States Office
of Water. EPA 815-R-99-016. Environmental Protection Agency. April. See
http://www.epa.gov/safewater/mdbp/pdf/sansurv/sansurv.pdf
Water UK. 2010. Medical Surveillance of Personnel. Technical Guidance Note No.1. 2 pp.
http://www.water.org.uk/home/policy/publications/archive/drinking-water/priciples-of-watersupply-hygiene
Water UK. 2010a. Event and Incident Management. Technical Guidance Note No.10. 5 pp.
http://www.water.org.uk/home/policy/publications/archive/drinking-water/priciples-of-watersupply-hygiene
WHO. 2001. Water Quality – Guidelines, Standards and Health: Assessment of risk and risk
management for water-related infectious disease. Published on behalf of the World Health
Organization by IWA Publishing. This book is available on the internet at:
http://www.who.int/water_sanitation_health/dwq/whoiwa/en/index.html
WHO. 2003. Environmental Health in Emergencies and Disasters: a practical guide.
ISBN 92 4 154541 0. Chapter 7 discusses water supply. Available on the internet at:
http://www.who.int/water_sanitation_health/hygiene/emergencies/emergencies2002/en/
WHO. 2004. Guidelines for Drinking-water Quality 2004 (3rd edition). Geneva: World Health
Organization. http://www.who.int/water_sanitation_health/dwq/guidelines/en/index.html and see
also the addenda (2006).
68
Guidelines for Drinking-water Quality Management for New Zealand 2013
WHO. 2005. Water Safety Plans: Managing drinking-water quality from catchment to consumer.
WHO/SDE/WSH/05.06. 244 pp.
http://www.who.int/water_sanitation_health/dwq/wsp170805.pdf
WHO. 2007. Chemical Safety of Drinking-water: Assessing priorities for risk management.
Thompson T, et al. 160 pp. Geneva: World Health Organization. See:
http://www.who.int/water_sanitation_health/dwq/dwchem_safety/en/
WHO. 2009. Water Safety Plan Manual (WSP Manual): Step-by-step risk management for
drinking-water suppliers. How to develop and implement a Water Safety Plan – A step-by-step
approach using 11 learning modules. 108 pp.
http://www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2011. Water Safety Plan Manual Quality Assurance Tool – User Manual. 28 pp.
http://www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2011a. WHO technical notes on drinking-water, sanitation and hygiene in emergencies.
http://www.who.int/water_sanitation_health/dwq/publications/en/index.html
WHO. 2011b. Guidelines for Drinking-water Quality 2011 (4th edition). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
WHO. 2012. Water Safety Planning for Small Community Water Supplies: Step-by-step risk
management guidance for drinking-water supplies in small communities. 66 pp.
http://www.who.int/water_sanitation_health/publications/2012/water_supplies/en/index.html
WQRA. 2009. Risk Assessment for Drinking Water Sources. Research Report No 78. 65 pp.
http://www.wqra.com.au/project-details/2
Chapter 3: Water sources

3.1 Introduction
Source water is potential raw water, ie, it is natural fresh water that could be abstracted and
processed for drinking purposes.
The chemical composition of natural fresh water is the end result of rainwater that has fallen on
to the land and interacted with the soil, the material in or on the soil, and rocks as it moves
down rivers, or into lakes, or percolates underground. Its overall quality is further modified by
run-off from various land uses (non-point or diffuse sources) and by discharges (point source).
The quality is modified further by biological activity, wind-blown material and evaporation.
The sections in this chapter address the factors that affect the quality of natural fresh waters,
and what can be done to identify and limit their effects, taking into account recent research
findings from New Zealand and abroad.
Half of the chapter discusses groundwater, including compliance issues related to
demonstrating bore water security. Bore water security impacts on both bacterial and protozoal
compliance. The concept of bore water security was originally developed for the DWSNZ as an
alternative approach to monitoring E. coli at the rate required for surface water sources.
A summary of the legislation covering natural fresh water is included in this chapter; see
Appendix 1 for a more detailed discussion of water supply legislation.
Chapter 4 discusses the steps recommended in the selection of raw water sources and
appropriate water treatment processes.
Monitoring surface source waters to determine the number of log credits required for protozoal
compliance is covered in Chapter 8: Protozoa Compliance, section 8.2.
Chapter 17: Monitoring, section 17.2 discusses some aspects of water sampling and testing.
Rainwater is discussed in Chapter 19: Small and Individual Supplies.
General source water risk management issues are discussed in the MoH Public Health Risk
Management Plan Guide PHRMP Ref. S1.1: Surface and Groundwater Sources; also see
Chapter 2: Management of Community Supplies.
Source water quality management is discussed in Chapter 4 of AWWA (1990).
WHO (2003a) is an excellent general text, some of which was used in compiling the WHO
Guidelines for Drinking-water Quality (2004). The chapter titles are shown in Chapter 4:
Selection of Water Source and Treatment, section 4.3.1.
A well-illustrated publication that describes groundwater quality protection very simply was
published by the Vermont Department of Environmental Conservation in September 2005.
The AWWA’s third edition of their manual on groundwater appeared in 2003.
The USEPA (2008) published a guidance manual related to their groundwater rule, see
References.
WHO in 2012 published an excellent book Animal Waste, Water Quality and Human Health.
3.2 Groundwater

3.2.1 Description of a groundwater system
Unlike those affecting surface water, many of the processes that affect the quality of
groundwater occur underground, out of sight, and so cannot be observed directly. Our
understanding of how a groundwater system works is largely obtained by deduction from
indirect observation. The
following sections describe the general characteristics of a groundwater system and the
processes that can affect bore water quality.
Groundwater comprises about 80–90 percent of the world’s freshwater resources. It is
recharged from the surface, predominantly from rainfall, but can also receive leakage from
rivers and lakes. Water seeps down through the soil and unsaturated formation until it reaches
the water table. At this point it moves more horizontally through pores in sediments and
fractures in rock. Aquifers are large areas of formation that act as reservoirs from which
groundwater can be abstracted through a bore for supply.
In the DWSNZ, groundwater is considered to be the water contained in the aquifer; bore water
is the water within, or abstracted from, the bore. This distinction is necessary because earlier
references to secure groundwater led to people talking about secure aquifers. To be called
secure, water that has been abstracted from an aquifer through a bore to become drinking-water
needs to comply with bore water security criteria 1, 2 and 3.
3.2.1.1 Confined and unconfined aquifers
If a saturated layer of relatively impermeable material (an aquitard) overlies an aquifer, the
system is known as a confined aquifer. The aquitard acts as a protective layer, often minimising
or preventing further vertical movement of contaminants into the aquifer. Aquitards can also
reduce the vertical interchange of water between aquifers at different depths. Where an aquitard
is lacking (eg, tapers out) an aquifer may be more vulnerable to contamination from the ground
surface or springs can emerge at the ground surface. Springs can be contaminated directly from
surface sources, and can act as conduits for contaminants to move down into the underlying
groundwater if they dry out during dry periods.
An unconfined aquifer is so called because of the absence of a confining aquitard layer (eg, clay).
In contrast to a confined aquifer, it is relatively vulnerable to contamination from the land
surface. For the purposes of the Drinking-water Standards for New Zealand 2005 (revised
2008) (DWSNZ), when planning a drinking-water quality monitoring programme, unconfined
groundwater systems less than 10 m deep should be regarded as being no safer than surface
sources. Bores drawing from unconfined aquifers greater than 10 m deep may be able to
demonstrate security, but require more monitoring than if drawn from a confined aquifer.
When a bore is sunk, the drillers should collect substrate samples at different depths for
inspection; the record of the strata encountered is called the bore log. When a bore is installed
it is often pump tested to establish the volume of water that it can supply. Bore logs and
pumping test information from
observation bores will often show whether an aquifer is confined, particularly adjacent to the
bore. However, it doesn’t show how extensive the confining layer is, or whether it offers
consistent protection of the aquifer over a wide area. The regional council may have additional
data on file that may help to understand the whole aquifer. The confining layer only protects the
water from what is happening above it; contamination from the surface nearer the recharge area
can still occur.
Knowledge of the water levels in a bore can also indicate whether an aquifer is confined. Note
that for DWSNZ purposes, depth is the length of casing to the shallowest screen, not the total
depth. Bores that are naturally free-flowing (artesian) are generally indicative of confined
aquifer conditions. This upward flow of groundwater that provides some natural aquifer
protection can, however, be reversed during pumping or intermittent use upslope.
USEPA (2008) describes 14 indicators of confinement and the characteristics used to identify
the presence of a confining layer.
3.2.1.2 Groundwater flow
By measuring the depth of the water in a number of bores relative to a common datum (eg, sea
level), the water levels can be contoured to produce a map of the water table (unconfined
aquifer) or piezometric surface (confined aquifer). Groundwater generally
moves at a much slower rate than surface water. It seeps through the pores of sediments or
fractures in rock, down-gradient from areas of high elevations to areas of low elevation.
Eventually it discharges to rivers, lakes, the sea, or through springs.
Groundwater flows in the direction of greatest downhill slope or gradient (ie, perpendicular to
the equal elevation contours on the water table or piezometric map).
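The contouring described above can be illustrated with the classic "three-point problem": fitting a plane through the water levels measured in three bores to estimate the hydraulic gradient and the down-gradient flow direction. The sketch below is a minimal illustration only; the bore coordinates and water levels are hypothetical, and this is not a method prescribed by the DWSNZ or these Guidelines.

```python
# Three-point problem: estimate the water-table slope and flow direction
# from water levels in three bores. All values below are hypothetical.
import math

def plane_gradient(p1, p2, p3):
    """Fit the plane h = a*x + b*y + c through three (x, y, head) points
    and return (a, b), the components of the head gradient."""
    (x1, y1, h1), (x2, y2, h2), (x3, y3, h3) = p1, p2, p3
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    a = ((h2 - h1) * (y3 - y1) - (h3 - h1) * (y2 - y1)) / det
    b = ((x2 - x1) * (h3 - h1) - (x3 - x1) * (h2 - h1)) / det
    return a, b

# Three bores roughly 1 km apart; heads in metres above sea level
# (the common datum).
a, b = plane_gradient((0, 0, 50.0), (1000, 0, 49.0), (0, 1000, 50.5))
slope = math.hypot(a, b)                      # hydraulic gradient, i
bearing = math.degrees(math.atan2(-b, -a))    # flow is down-gradient
print(f"gradient i = {slope:.4f}, flow direction {bearing:.0f} degrees from +x")
```

Flow is taken as the negative of the head gradient because groundwater moves from high head to low head, perpendicular to the equal-elevation contours.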
The slope of the water table (i), the effective porosity of the aquifer (n), and the amount of water
flowing through the pores (flow volume per unit time, Q, divided by the cross-sectional area
through which it moves, A) can be used to determine the average linear velocity of the
groundwater, v, using the D'Arcy equation:

v = Q/(nA) = Ki/n

where K is the hydraulic conductivity of the aquifer.
The velocity, v, is known as the average linear velocity because it describes the gross flow rate
through the aquifer material. Aquifers are not homogeneous but may, for example, consist of
lenses of finer material (clay, silts or sands) alternating with coarser materials (gravels), such as
in Heretaunga (Hawke's Bay), the Canterbury Plains and Waimea (Nelson). These have built up
from braided rivers. Groundwater movement through these systems will be quicker through the
coarser material than through the finer material. Consequently contaminants in the
groundwater can be transported much faster through parts of the aquifer (up to 50 times) than
is indicated by the average linear velocity. In addition, localised flow through the buried
channels can deviate significantly from the presumed down-gradient flow direction.
Consequently, care must be taken in assuming the rate and direction of the groundwater
movement through non-homogeneous aquifers.
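As a simple worked example of the D'Arcy relation above, the sketch below computes the average linear velocity for a hypothetical gravel aquifer. The parameter values are illustrative only, not taken from these Guidelines, and as the preceding paragraph notes the true velocity through preferential paths may be far higher.

```python
# Worked example of the D'Arcy average linear velocity, v = K * i / n.
# Parameter values are hypothetical, for illustration only.

def average_linear_velocity(K: float, i: float, n: float) -> float:
    """Average linear groundwater velocity (same length unit as K per day).

    K: hydraulic conductivity of the aquifer (m/day)
    i: slope of the water table (hydraulic gradient, dimensionless)
    n: effective porosity of the aquifer (dimensionless)
    """
    return K * i / n

# A gravel aquifer with K = 100 m/day, gradient 0.002, effective porosity 0.25:
v = average_linear_velocity(K=100.0, i=0.002, n=0.25)
print(v)  # 0.8 m/day
```

Note that dividing by the effective porosity is what converts the bulk (Darcy) flux Q/A into the faster velocity of the water actually moving through the pore space.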
Tracer tests may be useful in determining the localised groundwater flow rate and direction. An
easily detectable tracer can be introduced into the aquifer through an injection bore and its
progress determined directly by measuring its concentration in samples of groundwater from
down-gradient bores (or possibly indirectly, by geophysical techniques such as surface resistivity
using a salt tracer). Flow direction, velocity, dispersion and attenuation characteristics can be
estimated by measuring spatial and temporal variations of tracer concentrations. However, the
cost of drilling bores is often high, the data interpretation complex and tracer selection critical.
Tracer tests should only be carried out by an experienced hydrogeologist.
The temperature of water in very shallow aquifers (eg, less than about 10–15 m deep) may vary
seasonally but deeper groundwater temperature remains relatively constant. This is why water
from a bore may seem relatively warm in winter or cool in summer.
Deeper groundwater is effectively insulated not only from temperature changes but also from
flushing. Contaminants in an aquifer are not flushed from their source in the same manner or as
quickly as in surface water. Unless contaminants attenuate through die-off (microbial), decay
or adsorption, they will be retained and move through the aquifer system, potentially affecting
the use of the groundwater along its flowpath, probably for a considerable time.
Most groundwater supplies need to be pumped. Risk management issues related to pumping are
discussed in the MoH Public Health Risk Management Plan Guide PHRMP Ref. P4.2:
Treatment Processes – Pump Operation.
3.2.2 The quality of groundwater
Groundwaters are generally of better microbiological quality than surface waters because of the
range of mechanisms active under the ground that can attenuate the microbial contaminants
initially present in the water. Moreover, changes in microbiological quality that occur are not as
large or as rapid as those in surface waters. Although some aspects of the chemical quality of
groundwater may be of concern, its generally better microbiological quality often makes it
preferable to surface water as a source. However,
once a groundwater becomes contaminated by chemicals, it takes a long time before the
contamination is flushed out.
Table 3.1, copied from the WHO Guidelines for Drinking-water Quality (2004), provides a
comparison of the levels of pathogens and indicator organisms found in surface and groundwaters.
Table 3.1: Concentrations of enteric pathogens and indicators in different types of source water

Pathogen or indicator group, per litre | Lakes and reservoirs | Impacted rivers and streams | Wilderness rivers and streams | Groundwater
Campylobacter     | 20–500            | 90–2500              | 0–1100       | 0–10
Salmonella        | –                 | 3–58,000 (3–1000)b   | 1–4          | –
E. coli (generic) | 10,000–1,000,000  | 30,000–1,000,000     | 6,000–30,000 | 0–1000a
Viruses           | 1–10              | 30–60                | 0–3          | 0–2
Cryptosporidium   | 4–290             | 2–480                | 2–240        | 0–1
Giardia           | 2–30              | 1–470                | 1–2          | 0–1

a Should be zero if bore water secure.
b Lower range is a more recent measurement.
Table 3.1 provides an indication of the microbial quality of different water sources. The levels of
microbial contamination in a particular water source will depend, amongst other things, on the
nature of contamination sources in the catchment or recharge zone, and the barriers between
these contamination sources and the water source. New Zealand source waters tend to exhibit
much lower numbers per litre than appear in Table 3.1.
Tests undertaken over a period of time long enough to show seasonal variation are required to
establish the microbial quality of a groundwater source. It is advantageous to consult someone
familiar with the groundwater in the area for guidance about the most appropriate time to
sample. A good reference on sampling groundwaters is A Guide to Groundwater Sampling
Techniques by L Sinton, published by the National Water and Soil Conservation Authority as
Water and Soil Miscellaneous Publication No. 99. Refer also to Sundaram et al (2009).
Groundwaters are not usually in direct contact with faecal material, as surface waters may be,
but rainfall and irrigation provide means by which surface contamination can be carried into the
groundwater. In some countries groundwaters have been contaminated by the very bad practice
of pumping wastes down disused bores. The vulnerability of aquifers to microbial contamination
is increased by (Sinton 2001):
• recharge water coming into contact with microbial contamination
• higher porosity aquifer media, which allow greater penetration and transport of microbes (see section 3.2.4.3)
• shallow aquifer depth
• absence of a confining layer
• light overlying soils and porous subsoil strata, which reduce the efficacy of processes removing microbes in these layers.
In many areas of the world, aquifers that supply drinking-water are being used faster than they
recharge. Not only does this represent a water supply problem, it may also have serious health
implications. In coastal areas, aquifers containing potable water can become contaminated with
saline water if water is withdrawn faster than it can naturally be replaced. The increasing
salinity makes the water unfit for drinking and often also renders it unfit for irrigation. To
remedy these problems, some coastal authorities have chosen to recharge aquifers artificially
with treated wastewater, using either infiltration or injection. Aquifers may also be recharged
passively (intentionally or unintentionally) by septic tanks, wastewater applied to irrigation and
other means. Aquifer recharge with treated wastewater is likely to increase in future because it
can:
• restore depleted groundwater levels
• provide a barrier to saline intrusion in coastal zones
• facilitate water storage during times of high water availability.
If aquifer recharge is haphazard or poorly planned, chemical or microbial contaminants in the
water could harm the health of consumers, particularly when reclaimed water is being used. For
a full discussion, see WHO (2003b).
The layer of unsaturated soil above the groundwater plays a major role in reducing the numbers
of micro-organisms found in groundwaters. Factors affecting the survival of organisms (ie, how
rapidly the organisms die off), and those influencing their transport (ie, how quickly they are
carried through the unsaturated strata) both affect the levels of microbes reaching the
groundwater.
Bacterial survival in soils is improved by (Sinton 2001):
• high soil moisture
• greater penetration into the soil profile
• low temperatures
• low pH values (in the range 3–5)
• high organic matter content
• low numbers of antagonistic soil microflora.
The most important attenuating processes for bacteria in soils are filtration and adsorption
(Sinton 2001). Filtration is most effective in soils with small particle sizes, while some
sedimentation can occur in zones where there is virtually no flow of water. Media providing
large surface areas, such as clays, improve contaminant adsorption. The adsorption process is
enhanced by conditions that minimise electrostatic repulsion between micro-organisms and the
surfaces to which they might adsorb. Increased levels of dissolved solids in the water help
suppress electrostatic repulsion; consequently rainwater, which contains little dissolved
material, assists microbes in penetrating further into the ground.
Filtration and sedimentation, which are influenced by the size of the contaminant, are less
important in the removal of viruses in soils, because viruses are very much smaller than
bacteria.
Processes active during transport in groundwater further attenuate the levels of microbes that
reach the water table. These mechanisms are similar to those active in removal in soils.
However, the larger grain size of aquifer media, the resulting larger pore sizes, and the higher
water velocities in aquifers than in soils make filtration, sedimentation and adsorption less
effective. Transport distances, however, are much greater in aquifers than in soils.
The absence of sunlight (with its UV light) in the groundwater environment is an important
factor leading to the differences in the rates of microbial inactivation in groundwaters and
surface waters.
Organisms can be found at great depths. In karst regions, microbes and invertebrates can be
found in caves and other openings 100 metres or more beneath the surface. Bacteria can exist in
some groundwater thousands of feet below the land surface. However, invertebrates are
typically found within 1 to 10 metres of the surface in unconsolidated materials, in what is called
the hyporheic zone. Within this shallow groundwater zone, many macroscopic invertebrates
have been identified. Furthermore, the species richness and community structure of these
organisms has been shown to change with alterations in groundwater quality. Therefore, the
relative presence or absence of different communities or populations of organisms may reflect
the impact of changes in regional groundwater quality. As a result, the organisms living within
the shallow groundwater zone can serve as indicators of the quality of the groundwater resource.
Macroinvertebrates living in the hyporheic zone, such as oligochaetes, isopods, and ostracods,
have evolved special adaptations to survive in a food-, oxygen-, space-, and light-limited
environment (USEPA 1998). Sinton (1984) described macro-invertebrates observed in a
polluted Canterbury aquifer.
Health-significant chemical determinands may appear in waters as a result of natural processes
as well as human activities. The naturally-occurring chemical determinand that appears most frequently
at potentially health significant concentrations (greater than 50 percent MAV) in drinking-water
sources in New Zealand is arsenic. Groundwaters in geothermal areas often contain arsenic and
boron at concentrations above drinking water MAVs. Arsenic is also detected in groundwaters in
other parts of the country, although generally at lower concentrations than in obviously
geothermal areas (Nokes and Ritchie 2002). Higher arsenic concentrations are often associated
with anaerobic (poorly oxygenated) groundwaters. Although arsenic may appear in association
with iron and manganese, the presence of these metals in a groundwater does not imply the
presence of arsenic. In some groundwaters, the presence of arsenic is thought to arise from the
contaminant being leached from old marine sediments.
Arsenic has been observed to vary substantially with season, particularly in shallow bores (Frost
et al 1993). Measurements should therefore be undertaken under a range of seasonal conditions.
Further, the occurrence of arsenic in groundwaters is not always predictable, and tests for
arsenic should be included in the investigation of any new groundwater source.
Boron is found in association with arsenic in geothermal areas. It can also appear at high
concentrations in the absence of arsenic in some geothermally influenced (hydrothermal)
springs, eg, near Auckland; few of these occurrences result in boron exceeding 50 percent of its
MAV.
High nitrate concentrations occur in drinking-water sources in a number of areas in New
Zealand. Nitrate has several possible sources, all related to human activities, such as fertiliser
application, disposal of wastewater from dairy factory operations, and high grazing densities of
dairy stock. It can also be found at high concentrations on a localised scale due to on-site waste
disposal systems (eg, septic tanks).
There is typically an increased leaching of nitrate from soils with increased rainfall or rising
water table levels. In these cases, the highest nitrate concentrations will be found when the
water table is highest, ie, usually in the winter and spring.
Fluoride is often found overseas as a groundwater contaminant of health significance, but
fluoride in excess of 50 percent of its MAV has been found in only three water supplies in New
Zealand (Ritchie 2004). Slightly elevated levels of fluoride can be found in geothermal areas, and in
some geothermally influenced (hydrothermal) waters.
Pesticides have been found in a number of vulnerable New Zealand groundwaters (Close and
Flintoft 2004; MAF 2006). Pesticides in excess of 50 percent of a MAV have been found in only
two drinking-water supplies (Ritchie 2004). Dieldrin was the detected pesticide in both cases.
Refer to individual datasheets for details.
Manganese is a health-significant determinand, but it can also adversely affect the aesthetic
properties of water. It is a naturally-occurring determinand, which dissolves into groundwater
under oxygen-deficient, low pH conditions. Such conditions often arise in shallow groundwaters
as the result of the respiration of microbes in the sub-surface media through which the water
passes. During respiration organisms withdraw oxygen from the water and return carbon
dioxide as a waste product. The carbon dioxide dissolves to form carbonic acid, thereby
depressing the pH of the water and dissolving the manganese, particularly when the water is
anaerobic. Some source of organic matter, such as peat, may be associated with the appearance
of manganese, as the organic matter provides a source of carbon, which is required as a nutrient
by the microbes.
Iron, which is of aesthetic but not health importance, is also mobilised from minerals under
conditions similar to those that will mobilise manganese, and the two are often found together
in groundwaters. The conditions that lead to the appearance of iron and manganese can be very
localised. New groundwater sources should therefore be tested for both metals, rather than
relying on results from nearby bores as an indicator of likely water quality. Iron and manganese
concentrations may also vary significantly with time, particularly in shallow unconfined
groundwater.
Complexation of iron or manganese with organic matter also present in the groundwater can
inhibit the effectiveness of treatment processes designed to remove these metals from the water.
Calcium and magnesium are two major cations that can occur at high concentrations in
groundwaters that have been in contact with calcareous rocks, such as limestone (chalk) and
marble. These cause water hardness, which can lead to problems of scale formation on hot
surfaces, and difficulty in getting soaps to lather.
Bores sited near the coast may undergo varying degrees of seawater intrusion, depending on the
level of pumping, phase of the tide, distance to the sea, and the ease with which seawater can
intrude into the aquifer. This phenomenon can lead to high concentrations of chloride and
sodium, and elevated concentrations of calcium and magnesium, all of which may adversely affect
the aesthetic properties of the water. It can be very difficult to reverse the process. High levels of
sodium and potassium can also occur if the bore draws from old marine deposits.
3.2.3 Factors affecting groundwater quality

3.2.3.1 The water source
Because contaminants that may have infectious or toxic properties can remain in the aquifer for
a considerable time and affect its use, operators of groundwater-sourced drinking-water
supplies should take every precaution to prevent contamination of the aquifer. Water managers
should assess the potential for contamination arising from possible sources located in the
immediate area of the bore (although some plumes, eg, nitrate plumes, can travel many
kilometres). All possible precautions should be taken to protect the water supply from all
potential impacts. Examples of sources include: stores of hazardous substances, underground
storage tanks such as at petrol stations, effluent discharges, septic tanks, waste ponds, offal pits,
application of pesticides or animal wastes to nearby land. As part of this assessment, advice
should be sought to determine the capture zone of supply bores. This takes into account
groundwater flows, drawdown effects and attenuation characteristics, see section 3.2.4. Note
that contaminants that are not soluble in water will float on the top or sink to the bottom of the
aquifer.
Some groundwater systems receive water from more than one source, and if these are not
consistent, water quality may vary. For example, as observed in parts of Canterbury, an aquifer
may be fed from a river system and from rainwater. The groundwater quality and composition
will vary depending on the preceding and current river flow and rainfall conditions, and may
even depend on the state of the river bed, and the land-uses in the recharge area.
In areas without reticulated sewerage, sewage is usually discharged into septic tanks.
Micro-organisms in the liquid discharge from septic tanks or other systems are likely to enter shallow
unconfined aquifers. Good design and maintenance of septic tanks can reduce the threat of
microbiological contamination of aquifers from this source. Sewage discharges have been known
to find their way directly into aquifers via abandoned bores or soil-absorption or soakage
systems receiving the effluent from septic tanks. New bores should be drilled on higher ground
than where the septic tank discharges, and should be cased to sufficient depth to prevent
contaminated water mixing with the deeper groundwater. Some metal casings can have a
surprisingly short life due to the corrosive effect of the high carbon dioxide content in the
subsoil water. Most states in the USA require a minimum separation of 50 feet between a bore
head and a septic tank discharge, livestock yards and silos, and 250 feet from manure stacks;
see section 3.2.4.3.
If the average travel time of groundwater from a septic tank to the abstraction point is 10 years,
it is likely that all pathogens will be inactivated during their residence in the subsurface, so the
groundwater is probably not at risk. If the average travel time is two years, some groundwater
will take a fast path and arrive in one year or less, and other groundwater will take a slower
path and arrive in three years or more. Because pathogens remain infectious in the subsurface
for a maximum of about one year, the health risk depends on the proportion of groundwater
that arrives most rapidly at the well (USEPA 2008).
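The reasoning above can be sketched numerically. The snippet below is a hypothetical illustration only: following the two-year example in the text, it assumes the fastest flow paths arrive in about half the average travel time, and that pathogens survive underground for at most about one year. Neither figure is a criterion from these Guidelines.

```python
# Hypothetical illustration of the travel-time argument (after USEPA 2008).
# Assumes pathogens remain infectious for at most ~1 year underground, and
# that the fastest paths arrive in about half the average travel time.

SURVIVAL_YEARS = 1.0

def fastest_path_years(average_years: float, spread: float = 2.0) -> float:
    """Crude fastest travel time: the average divided by a spread factor."""
    return average_years / spread

for avg in (10.0, 2.0):
    fastest = fastest_path_years(avg)
    at_risk = fastest <= SURVIVAL_YEARS
    print(f"average {avg:.0f} y -> fastest path ~{fastest:.1f} y; "
          f"pathogens may survive transit: {at_risk}")
```

With a 10-year average the fastest water still spends about five years underground, well beyond the survival window; with a two-year average the fastest fraction can arrive while pathogens are still infectious, which is the case the text flags as a health risk.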
Other possible sources of microbiological contamination include seepage from sewers, landfills,
and land application of domestic and animal effluent.
One potentially major pathway for contamination often overlooked is the conduit provided by a
poorly sealed bore, particularly during a flood or after heavy rain. Contaminants can enter
directly down the bore shaft or down the junction between the casing and the soil. To protect the
groundwater against this source of contamination it is essential to design and construct the bore
head to guard against such entry from the surface (see Figure 3.2 and section 3.2.4.3).
Bores should be secured in this manner regardless of the use made of the groundwater
abstracted from them. Groundwater contamination can persist for a long time and affect a large
area. Contamination can also enter the aquifer when deep piles are drilled through the aquitard
without due protection, somewhat like a puncture.
Groundwater in coastal areas is prone to high salt levels. This is likely to be caused by seawater
intrusion into the aquifer, salt drift, or possibly dissolution of salt deposits. The quality of
groundwater supplies subject to seawater intrusion may change in association with pumping,
other users, water level variation and tidal cycles.
Geological events such as earthquakes and volcanic eruptions have the potential to change the
chemical and microbiological quality of a groundwater. These events can dislocate aquifers so
that previously confined aquifers are no longer protected from the influence of surface events, or
the source aquifer receives groundwater from other, previously separate aquifers. Fissures,
resulting from volcanic eruptions, may also allow the introduction of contaminants into the
source aquifer. See ESR (2012) for a literature review of the impacts of earthquakes on
groundwater quality. Note that following the earthquakes in Christchurch, 20 of the 174 deep
bores needed to be redrilled and 82 needed repairs.
78
Guidelines for Drinking-water Quality Management for New Zealand 2013
Because of these possible effects, the microbiological and chemical quality of groundwater should be checked immediately after a substantial flood, earthquake, or volcanic or geothermal eruption, and periodically for up to a year after a major event, because the effects may take several months or longer to reach the abstraction point.
Changes in water level or pressure can also indicate a change in an aquifer. Checks on water
quality should also be made after other events causing damage to bore heads, such as damage by
a vehicle. WHO (2011a) includes a Technical Note covering cleaning and disinfecting bores.
3.2.3.2 Changes in the unsaturated zone
The composition of the water will probably change during its percolation through the
unsaturated zone. Mineral or organic matter may dissolve from the substrates that the water
passes through. Groundwater in low rainfall regions is generally harder and more mineralised
than water in regions with high annual rainfall, which receives more dilution, although hardness tends to be associated more with limestone areas.
In the unsaturated zone, particulate matter and some micro-organisms may be filtered out, and
dissolved constituents adsorbed or absorbed from the water. The effectiveness of the filtration
process in making the water biologically safe depends on the characteristics of the substrates.
Filtration can be very effective where the soil and rock consists of thick layers of fine particles.
Where soils are thin, eg, overlying some gravel aquifers of the Canterbury Plains, or where the
unsaturated zone is coarse, or fractured (some volcanic areas of Auckland), filtration can be
almost non-existent. In areas where the groundwater table is shallow, the groundwater is
unlikely to be microbiologically safe. USEPA (2008) defines karst, fractured bedrock and gravel
aquifers as ‘sensitive’; the greater the sand content, the less ‘sensitive’ the aquifer.
3.2.3.3 Changes in the aquifer
It is more difficult for contaminants to get into groundwater than into surface water. Once there, however, contaminants are also more difficult to remove. The characteristics of the contaminants, the rate of groundwater flow through the aquifer, and the type of material the aquifer is composed of will all influence whether the contaminants attenuate through die-off, decay, adsorption or dispersion.
Once in the aquifer, the quality of the groundwater may change due to its interaction with the
matrix or by mixing with other groundwaters. Sediments like sand may act as a filter and
remove some types of contaminants from the groundwater as it flows through the aquifer.
Fractured or karst (limestone exhibiting dissolution features) aquifers offer little filtration or
adsorption of contaminants, but due to a localised higher pH, chemical reactions may occur. In
the absence of recontamination, the bacterial quality of the water in an aquifer will usually
improve during storage because of die-off due to unfavourable conditions. Passage through
aquifer media will also reduce levels of viruses and protozoa, although the rate at which they are
inactivated is much slower.
Simultaneously, the rocks may be releasing minerals into solution or exchanging ions with those
in the water.
If iron and manganese are present, the water from the tap may be clear initially while these are
in a reduced state but, once dissolved oxygen is present in the water, they can oxidise to
coloured forms (generally rusty or black), which may be insoluble and settle out.
Deeper aquifers are more likely to contain higher concentrations of minerals in solution because
the water has had more time to dissolve the minerals from the surrounding rock material.
Groundwater may contain significant concentrations of naturally occurring radiological
determinands, notably radon. Section 9.4 of the DWSNZ requires an initial radiological test of
new bores, thereafter testing 10-yearly. See section 11.3 of Chapter 11: Radiological Compliance
for further discussion on monitoring requirements.
Groundwater may go very long periods without E. coli being detected in samples. However, for various reasons, E. coli may be found in consecutive samples, in which case it may be wise to disinfect the bore. One technique is to insert a narrow-diameter pipe down the borehole and pump in a strong chlorine solution in an attempt to kill off any contamination. Alternatively, the strong chlorine solution can be poured down the bore. If the contamination persists after this, a permanent disinfection system will be needed. See WHO (2011a) and the Centers for Disease Control and Prevention (US Government) web page for a helpful procedure (http://www.cdc.gov/healthywater/emergency/safe_water/wells/ was accessed in 2010).
3.2.3.4 Changes in the bore
Changes in water quality and quantity from a bore can result from several processes, notably chemical incrustation, biofouling and corrosion.
a) Chemical incrustation: Disturbance of the chemical equilibria that control the solubility of compounds in the groundwater can result from drawing water from the aquifer. Substances that are only just held in solution can then precipitate.
Precipitation may occur at screen slots where water velocities are high, and disturbance of
the equilibria is greatest. This reduces bore capacity, but increases the water velocity
through the slot because of its smaller size. Fine particles entrained in the higher velocity
water then act to erode the screen. Precipitation may also occur in the aquifer material
around the screen, which cements sand grains together and reduces the flow of water into
the bore.
The substances most commonly associated with chemical incrustation are calcium
carbonate and the insoluble hydroxides of naturally-occurring iron and manganese. Warm
groundwaters can deposit silica.
Actions that will help reduce incrustation problems are:
• use of the maximum possible slot area in the screen; this reduces flow velocity through the slots, and therefore the degree to which chemical equilibria are disturbed
• thorough development of the bore
• use of the minimum pumping rate (because of the influence of flow velocity on chemical equilibria)
• use of a number of small bores rather than one, or a few, large bores
• frequent maintenance and cleaning of the bore; preventive maintenance is preferable to drastic corrective actions.
Mechanical methods such as wire brushing or scraping, or controlled blasting, can be used
to remove incrustation that does develop, but the most effective method is the use of
strong acid, such as hydrochloric acid (requiring careful handling and flushing).
Ultrasonic cleaners may have a role to play too.
b) Biofouling: Biofouling of screens most often occurs as the result of infection of the bore
by iron bacteria. The organisms mainly responsible for biofouling catalyse the oxidation of
soluble iron and manganese in the water, and as a by-product produce slimes containing
large amounts of ferric hydroxide. As well as affecting bore yield, this phenomenon
degrades water quality with respect to taste, odour and organic matter content (which
increases the disinfectant demand of the water).
For biofouling to develop, most micro-organisms need a bore that is open to the atmosphere (for oxygen), sufficient concentrations of iron and/or manganese in the water (an iron level of less than 0.1 mg/L is usually too low for iron bacteria to survive), dissolved organic matter, and bicarbonate ions or carbon dioxide. Some micro-organisms can extract their oxygen requirements from chemicals such as nitrate and sulphate. The problem seems to be worse with intermittently or seasonally used bores. The end-products of this reduction process include ammonia (which will increase the chlorine demand) and hydrogen sulphide (which makes the water smelly).
Steps to prevent problems with iron or manganese bacteria include:
• disinfection of drill rods, bits, and tools to avoid cross-contamination from previous drilling activities (50–200 mg/L FAC)
• preparation of drilling fluid with chlorinated water (initially 50 mg/L FAC, but a minimum of 10 mg/L FAC must be maintained)
• sealing the bore, once completed, to prevent the entry of airborne bacteria.
One proprietary system prevents growth of iron bacteria by starving them of the dissolved
iron they need. The system works by injecting aerated water, degassed of carbon dioxide,
into the aquifer through a field of aeration bores surrounding the production bore. The
increased oxygen level in this water assists naturally-occurring bacteria that oxidise iron
and manganese to remove these metals from the water by precipitating their oxidised
form. Plugging of the surrounding aquifer material does occur, but at a much lower rate
than would result from biofouling.
Treatment of the water with a chemical disinfectant, such as chlorine, is the best approach to
reducing biofouling once it has occurred. Details of the procedure can be found in Driscoll
(1986), AWWA (1987) and the Centers for Disease Control and Prevention (see previous
section). AWWA (2004) discusses the biology, ecology, identification and control strategies.
c) Corrosion: Corrosion of bore materials is a consequence of the chemistry of the water
(either side of the bore) and can result in pinhole corrosion or can affect broad areas of the
material. The rate at which it occurs generally increases with increasing concentrations of
constituents such as carbon dioxide, oxygen, hydrogen sulphide, chloride, sulphate and
the total dissolved solids content of the water. Acidic pH levels (ie, lower levels) are more
corrosive to most materials than an alkaline pH. Corrosion of steel pipes has been known
to occur during storage, even before becoming part of the bore. Mild steel pipes are
generally not recommended, and plastic pipes are being used increasingly. Galvanised
steel pipes have been used for narrow diameter bores in the past, not always successfully.
Electrochemical corrosion can also affect bores (Driscoll 1986). This type of corrosion
results from differences in electrical potential on metal surfaces. The potential difference
may arise between two different metals, or on two different areas on the same metal surface.
To minimise the likelihood of failure of the bore from electrochemical corrosion, care is needed in the selection of the materials used, contacts between different materials, and factors that may lead to potential differences on the surface of the same metal, such as heat-affected areas near welded joints, heated areas near torch-cut slots, work-hardened areas around machine-cut slots, exposed threads, and breaks in surface coatings (eg, paint).
Changes in the bore that can result from corrosion are:
• the development of sand pumping as the result of the enlargement of screen slots or the development of holes in the casing
• structural failure of the bore screen or casing because of their reduced strength
• reduced yield because of the blocking of screen slots by corrosion products
• ingress of poor quality water through corrosion of the casing
• high iron levels in the water.
The ingress of poor quality water is the consequence of greatest concern with regard to the safety of the water. Perforation of the casing has the potential to allow water from shallow
unconfined aquifers, carrying water of unsatisfactory microbiological quality, into the bore
water. A potentially secure groundwater source may, through this mechanism, become
unsafe without the water supplier being immediately aware of the degradation in water
quality.
The regular testing for E. coli required by the DWSNZ for secure supplies may identify a
change in water quality, but these tests are infrequent and episodes of faecal
contamination may be missed. Frequent, regular (or online) monitoring of the conductivity of the bore water provides an additional check for changes in water quality. The test is rapid and inexpensive, and charting the results will make changes in conductivity apparent. Such changes should be taken as a signal that closer investigation of the security of the bore is needed, and that testing of the water's microbiological quality should be increased. Regular monitoring of bore water conductivity is a valuable check on water quality; a significant change in any bore, secure or not, shows that something has changed and may require attention.
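As a minimal sketch of how charted conductivity results might be screened automatically (the function name and the three-standard-deviation rule are assumptions for illustration, not DWSNZ requirements; real alert limits should be set from each bore's own record):

```python
def conductivity_alert(history, new_value, k=3.0):
    """Flag a conductivity reading that departs markedly from the charted
    history, using a simple mean +/- k sample standard deviations rule."""
    n = len(history)
    mean = sum(history) / n
    sd = (sum((x - mean) ** 2 for x in history) / (n - 1)) ** 0.5
    return abs(new_value - mean) > k * sd

readings = [412, 415, 410, 414, 413, 411, 416, 413]   # uS/cm, illustrative
print(conductivity_alert(readings, 414))   # False: within normal variation
print(conductivity_alert(readings, 460))   # True: investigate the bore
```

A flagged reading does not itself prove contamination; it is the trigger for the closer investigation and increased microbiological testing described above.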
Corrosion and its control are complex subjects, and expert advice is valuable in ensuring
the bore is constructed in a way that will minimise corrosion, and in assessing the steps
necessary to minimise corrosion after bore construction. Some discussion on corrosion
processes appears in Chapter 10: Chemical Compliance, section 10.3.4.
3.2.4 Establishing the security of a bore water supply
To demonstrate bacterial compliance of water leaving each treatment plant (or entering the
distribution system), a water supply serving a population of 10,000 or more needs to be
monitored for E. coli daily. Section 4.3.2.1 of the DWSNZ allows compliance to be demonstrated
without E. coli monitoring, subject to certain requirements being met, the main one being a
continuous and adequate residual of chlorine. A concession has also been made for water
supplies using groundwater sources, provided the bore water compliance criteria are satisfied.
An added advantage is that secure bore waters are also considered to satisfy the protozoal
compliance criteria.
There is a range of approaches that could be used to assess groundwater vulnerability (ANZECC 1995, and see 2010 review). These include subjective rating systems, and statistical and process-based methods. An example of a subjective rating system is DRASTIC (Aller et al 1987). This is a
system developed by the US National Water Well Association and the USEPA which rates
vulnerability subjectively, based on seven hydrogeologic setting factors. Such a system, while
useful as a general indication of potential contamination from the ground surface, is not specific
in respect to microbial risk.
An example of a statistical approach is that used for groundwater protection in the Netherlands
(Schijven and Hassanizadeh 2002). It uses the Monte Carlo method in uncertainty analysis of
factors influencing viral transport. This is used to determine the minimum size of bore head
protection zones. Considering the large variation in New Zealand aquifer parameters, such an application is unlikely to be simple without being overly conservative. Information about viral transport in New Zealand is also limited.
Other process approaches such as deterministic models tend to have large data requirements
and be situation specific. A useful summary of such methods and their hybrids is provided by
Focazio et al (2002). While this is an area of continual development, a pragmatic approach is
currently needed.
The current DWSNZ approach, using water dating or water quality variation criteria along with consideration of bore head protection, is based on the reasonable assumption that groundwater isolated from surface contamination is unlikely to contain pathogenic micro-organisms. The current approach is pragmatic: it is easily applied with minimal information and is empirically derived from local data. It has also been supported through public submission.
While the term ‘secure’ bore water is used, it simply denotes a level of risk of microbial contamination low enough that less frequent monitoring is justified. It does not indicate that the water is ‘secure’ from other forms of contamination, or from all risk of microbial contamination via preferential pathways. In this respect the term is a misnomer, and in time an alternative label reflecting the different levels of risk may be introduced.
Sinton (in Rosen and White 2001) collated information from regional councils about microbial
contamination of New Zealand aquifers. In general, contamination was reported from bores
extracting groundwater from less than 30 m below the ground surface. Septic tank discharges
and poor bore construction were implicated in much of the contamination.
Some bore water supplies can never be considered secure. The DWSNZ specifically state that secure status will not be given to bores drawing from unconfined aquifers when the intake depth is:
• less than 10 m below the ground surface, or
• 10–30 m below the ground surface, and there is less than five years’ monitoring data showing that no E. coli contamination exists.
Secure bore waters are often those drawn from confined aquifers. However, when the water table in an unconfined area is at a great depth below the land surface, the water may be free from microbiological contamination even though it is unconfined, because of the time it takes for contaminants to move down to that depth through the aquifer materials. Depth alone, though, does not necessarily lead to freedom from microbiological contamination. Without effective filtration taking place in the soil above and through the aquifer, contaminants may still reach the bore; deep groundwaters in volcanic areas such as Auckland and karst limestone aquifers are examples.
WQRA (2011) gives an example:
A large norovirus outbreak occurred at a newly opened restaurant in Wisconsin, US. The
premises had a private bore located in a fractured dolomite rock aquifer. The bore was
85.3 m deep and cased to 51.8 m. It was located 188 m from the septic leach field. Tracer
tests using dyes injected into the septic system showed that effluent was travelling from both
the septic tanks and infiltration field to the well in 6 and 15 days, respectively. The private
well and septic system were newly constructed and conformed to Wisconsin State Code.
Note that the DWSNZ do not define the procedure for demonstrating whether an aquifer is
confined. Section 4.5.2.1 of the DWSNZ includes three techniques for demonstrating that the
bore water is not directly affected by surface or climatic influences. In effect, this means much
the same thing, but see next paragraph. The water supplier must provide sufficient information
for an experienced groundwater engineer/hydrogeologist/scientist to be able to make that
decision. The sort of data required will include any information already known about the
aquifer, geological information gathered during the drilling (bore log), the depth to the screen,
full details about the screen and casing, results of pump tests (piezometric survey), etc.
Demonstration 3 offers a technique for situations when demonstrations 1 and 2 are not feasible.
In the absence of some of the above, existing bores may need plumbing (depth sounding), or a
CCTV inspection. Ideally, water suppliers should also have information about the recharge area
and relevant land uses.
Despite drawing from a confined aquifer, it may still be possible for the water to fail to satisfy the requirements of bore water security criterion 1. This may be because the:
• bore is too close to the recharge area
• confining layer is not extensive enough (wide, deep or intact) to be effective
• main aquifer is receiving water from a subsidiary aquifer that is not confined or contains ‘young’ water.
It is also possible that the person stating that the bore is drawing from a confined aquifer was wrong, or that conditions have subsequently changed.
To be secure, bore water must meet all three criteria. These show that:
a) activities on the surface or climatic events have no direct influence on the quality of the groundwater (bore water security criterion 1)
b) the bore head provides satisfactory sanitary protection for the bore (bore water security criterion 2)
c) E. coli are absent from the water (bore water security criterion 3).
Once security has been demonstrated, the on-going E. coli monitoring requirements are reduced
substantially, as shown by Table 4.5 in the DWSNZ. Bore water security criteria 1 and 2 need to
be checked at least every five years.
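The security assessment described above is a simple conjunction: all three criteria must hold, and criteria 1 and 2 must be rechecked at least five-yearly. A minimal sketch, using hypothetical field and function names:

```python
from dataclasses import dataclass

@dataclass
class BoreSecurityStatus:
    """Illustrative record of the three bore water security criteria."""
    no_surface_influence: bool             # criterion 1
    bore_head_protected: bool              # criterion 2
    e_coli_absent: bool                    # criterion 3
    years_since_criteria_1_2_check: float

    def is_secure(self) -> bool:
        # All three criteria must hold, and criteria 1 and 2 must have
        # been re-checked within the last five years.
        return (self.no_surface_influence
                and self.bore_head_protected
                and self.e_coli_absent
                and self.years_since_criteria_1_2_check <= 5)

print(BoreSecurityStatus(True, True, True, 3.0).is_secure())   # True
print(BoreSecurityStatus(True, True, True, 6.0).is_secure())   # False
```

Failing any one criterion, or letting the five-yearly recheck lapse, removes the secure status; there is no partial credit between criteria.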
The secure status of a bore water supply indicates that microbiological (both bacterial and
protozoal) contamination of the water is unlikely. However, it provides no indication of the
chemical quality of the water. Groundwaters isolated from surface events become better
protected microbiologically as the time since the water entered the ground increases, because of
processes such as dilution, filtration, adsorption and die-off. Some chemical determinands may
be unaffected by processes that improve the microbiological quality of the water. Other processes, such as the dissolution of minerals containing arsenic, will make the chemical quality worse with extended residence times. Consequently, the time the water is under the ground may bear no
relationship to the chemical quality of the water.
3.2.4.1 Proving the microbiological quality of the water
E. coli is the micro-organism used to indicate the bacterial quality of bore water drawn from a
groundwater source. Although the presence of E. coli in the water shows that faecal
contamination of the water is very likely to have occurred recently, there is no reliable
relationship between the concentrations of E. coli and protozoa in the water. Assurance about
the protozoal quality of the water is obtained by demonstrating that the water quality is not
directly influenced by events above ground, see section 3.2.4.2.
A bore water that has achieved secure status must continue to be monitored for E. coli at the
point where it enters the distribution system, at the frequency stated in Table 4.5 and note 5
(DWSNZ). Samples must be taken prior to any treatment, and E. coli must be <1 per 100 mL in
any sample (bore water security criterion 3). Note that once the bore water is in the distribution
system, it must comply with section 4.4 of the DWSNZ.
The provision of an interim secure status allows a reduced E. coli monitoring frequency during
the proving period and avoids situations in which a water supply, to comply with the DWSNZ,
has to install treatment (primarily for protozoal compliance) during the year in which data have
to be gathered to demonstrate security. However, some bore waters are less likely to be able to
achieve secure status so their monitoring needs to be more extensive. The following situations
apply:
a) Unconfined aquifers <10 m deep: If the depth from the surface to the end of the casing/beginning of the screen is <10 m, the bore water is considered to be equivalent to surface water (with respect to both bacterial and protozoal compliance).
b) Unconfined aquifers >10 m deep: It is possible for water in an unconfined aquifer >10 m deep to satisfy bore water security criterion 1, in which case it can also be given ‘interim secure status’. If bore water security criterion 1 is not, or cannot be, met – refer to (d ii).
c) Water from confined aquifers: There is no minimum depth requirement if drawing from a confined aquifer. But in reality, it is difficult to imagine an aquifer that lies <10 m below the surface being able to satisfy the requirements of being ‘confined’. And unless the confining layer is extensive, it is difficult to imagine it satisfying “water younger than one year not being detectable ... etc”. But if the bore water is drawn from a confined aquifer, and satisfies bore water security criterion 1, then it can be given ‘interim secure status’ and be monitored for E. coli in order to prove bore water security criterion 3.
If the bore water is drawn from a confined aquifer, but does not or cannot satisfy bore
water security criterion 1 – refer to (d ii).
Gaining ‘secure status’ also requires bore water security criterion 2 to be satisfied, and a water supplier would be unwise to start E. coli monitoring before securing the bore head.
The E. coli monitoring requirements are defined in Table 4.5 of the DWSNZ. Bores with
‘interim secure status’ serving >10,000 people (for example) require daily testing for three
months, and if no E. coli is found, monthly testing for the next nine months. If no E. coli is
found during that 12-month period, bore water security criterion 3 has been satisfied. If bore water security criteria 1 and 2 are satisfied, the bore water can be called secure, and
for the next 12 months, E. coli monitoring remains monthly. If still free from E. coli after
12 months, monitoring can be reduced to quarterly. Bore waters given ‘interim secure
status’ and ‘secure status’ are assumed to satisfy protozoal compliance.
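The monitoring progression just described for a bore with interim secure status serving >10,000 people can be sketched as follows. This is illustrative only: Table 4.5 of the DWSNZ is authoritative, and the function name is an assumption.

```python
def e_coli_monitoring_frequency(months_elapsed, any_e_coli_detected=False):
    """Sampling frequency for a bore with interim secure status serving
    >10,000 people, following the progression described in the text."""
    if any_e_coli_detected:
        # E. coli found: follow the response procedure in section 3.2.4.6
        return "follow section 3.2.4.6 response procedure"
    if months_elapsed < 3:
        return "daily"       # first three months of the proving year
    if months_elapsed < 24:
        return "monthly"     # rest of proving year, then first secure year
    return "quarterly"       # thereafter, while still free of E. coli

print(e_coli_monitoring_frequency(1))    # daily
print(e_coli_monitoring_frequency(15))   # monthly
print(e_coli_monitoring_frequency(30))   # quarterly
```

The sketch assumes no E. coli detection at any stage; a single detection interrupts the progression and triggers the response procedure.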
d) Unconfined aquifers >10 m deep, not complying with (or not tested for) bore water security criterion 1: There are two situations:
i) 10–30 m deep: If the depth from the surface to the end of the casing/beginning of
the screen is between 10–30 m the bore water is to be monitored for (and be free
from) E. coli for five years before its secure status can be considered. During this
time it is considered equivalent to surface water for the purpose of both bacterial
and protozoal compliance. Regarding E. coli monitoring, these bore waters will
require weekly, twice weekly or daily testing (depending on the population served)
for the first three months, and if no E. coli is found, monthly testing for the next four years and nine months. If no E. coli is found during that five-year period, bore water
security criterion 3 has been satisfied. If bore water security criterion 2 is still
satisfied, the bore water can be called secure, and monthly E. coli testing continues
for the next 12 months; if still free from E. coli, monitoring can be reduced to
quarterly.
For at least five years, water from these bores will need to be disinfected. Section
5.2.1.1 and Table 5.1a of the DWSNZ explain that these bores will require three
protozoal log credits.
As an example, applying UV disinfection at 40 mJ/cm2 with a validated unit should
be able to achieve bacterial compliance (section 4.3.5) and protozoal compliance
(section 5.16, up to 3 logs). Once these bore waters are given secure status they are
assumed to satisfy protozoal compliance; that means the protozoal disinfection
system can be turned off.
ii) >30 m deep: If the depth from the surface to the end of the casing/beginning of the
screen is >30 m, and the water from the bore does not comply with bore water
security criterion 1, it “must be drawn from a source for which hydrogeological
evidence indicates that the bore water is likely to be secure” for bore water security
criterion 1 to be satisfied, ie, to be granted interim secure status. Those words were
used to cover the situation where the bore log information has been lost (for
example), or the age results (demonstration 1) are marginal or confusing, or the
chemical composition does not quite meet the requirements of demonstration 2.
‘Hydrogeological evidence’ is not to be confused with ‘hydrogeological model’ which
is discussed in demonstration 3 of bore water security criterion 1.
Bores drawing >10 m deep from a confined aquifer, but which do not or cannot
satisfy bore water security criterion 1 may be granted interim security by following
the procedure in the preceding paragraph.
The hydrogeological evidence referred to above must be provided by suitably
qualified independent personnel, and must cover all pertinent aspects related to the
aquifer and the bore (and any nearby bores drawing from the same aquifer),
particularly pump tests. What is hydrogeological evidence?
Section 4.5.2.3 of the DWSNZ allows bores >30 m deep drawing from unconfined aquifers to produce hydrogeological evidence that the bore water is likely to be secure (ie, not directly affected by surface influences). Bores drawing from an unconfined aquifer could still undergo the age test (section 4.5.2.1) or the chemical consistency test. If the bore water satisfies those criteria, that will be good quality supporting evidence. If it ‘just misses’ satisfying those criteria, that could be helpful supporting evidence. If it ‘misses by miles’, the information may suggest there is little point looking for any further hydrogeological evidence.
If the bore is in limestone (karst) or basalt country there is a risk that surface water will reach considerable depths in quite a short time. The same applies in parts of the country, such as Canterbury, where beneath the topsoil the ground consists largely of big stones and boulders. Most old groundwater has lost its dissolved oxygen, so conversely, a bore water with quite a lot of dissolved oxygen (say >4 mg/L) is probably ‘young’, regardless of the depth or whether it is drawn from a confined aquifer. Very old water should never contain E. coli, and the total (heterotrophic) plate count is invariably <5 per mL. Bores with no E. coli but with TPCs >100 per mL probably do not draw from an aquifer where the bore is likely to be secure. Likewise, the temperature of a deep, secure bore water will hardly change during the year when in use, maybe by no more than 0.2°C, ie, often less than a technician’s ability to measure it. Bore water likely to be ‘directly affected by surface influences’ will probably show a distinct and reproducible seasonal pattern of temperature. Water suppliers might say the DWSNZ (or DWAs) do not ask for these tests, but the DWSNZ are minimum requirements: water suppliers should want to know about the quality of their water. It is highly likely that a water supplier has tested the bore for E. coli for some time before approaching a DWA; that information will be an important part of their ‘hydrogeological evidence’ so they should provide it, all of it.
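These rules of thumb can be gathered into a simple screening aid. This is a sketch only, using the thresholds quoted above; the function name is hypothetical, and none of this replaces the DWSNZ criteria or a hydrogeologist's assessment.

```python
def security_screening_flags(do_mg_per_l, tpc_per_ml, seasonal_temp_range_c):
    """Apply the rule-of-thumb indicators from the text as a screening aid:
    dissolved oxygen >4 mg/L suggests young water, a heterotrophic plate
    count >100 per mL suggests surface influence, and a seasonal
    temperature range above about 0.2 degC suggests the bore is not
    isolated from surface events."""
    flags = []
    if do_mg_per_l > 4:
        flags.append("dissolved oxygen high: water probably young")
    if tpc_per_ml > 100:
        flags.append("plate count high: aquifer unlikely to be secure")
    if seasonal_temp_range_c > 0.2:
        flags.append("seasonal temperature pattern: surface influence likely")
    return flags

# A deep, well-isolated bore typically raises none of these flags:
print(security_screening_flags(0.5, 3, 0.1))    # []
# A shallow, surface-influenced bore may raise all three:
print(security_screening_flags(6.0, 250, 1.5))
```

An empty result does not prove security; the flags merely indicate where further hydrogeological evidence is unlikely to support a claim that the bore is secure.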
If the hydrogeological evidence does not exist, or does not indicate that the bore is likely to be
secure, then the procedure in (i) may be followed, as above. That is, the water will be considered
equivalent to surface water and some form of treatment will be required to gain protozoal
compliance, for at least five years. Being deeper than 30 m means 2 log credits will be required
(DWSNZ Table 5.1a).
It is unlikely that a bore will be granted interim secure status if it draws from an unconfined
aquifer, especially if less than 30 m deep.
Section 3.2.4.6 discusses the procedures to be followed if E. coli is found.
The interim status lasts for 12 months only, subject to section 3.2.4.6. Sampling over a period of
at least a year has the advantage of establishing whether there are any seasonal or climatic
trends that would indicate surface influences. During this time, the water supplier must gather
data to demonstrate security in full, ie, as well as the criteria met for interim status, the
groundwater has to be shown not to be directly influenced by events above ground. If the full set
of criteria for showing security cannot be met after the 12-month interim period, the source
reverts to a non-secure status, and appropriate treatment processes must be in place and
operating satisfactorily for the supply to be able to comply with the DWSNZ.
All E. coli samples must be taken upstream of any treatment that is likely to disinfect the supply.
The purpose of these tests is to characterise the quality of the raw water, not determine the
quality that can be produced after treatment. Conversely, if a secure bore water supply receives
any form of treatment or onsite storage that is potentially able to result in microbiological
contamination of the water before entering the distribution system, it will need to achieve
bacterial compliance as for surface waters, eg, section 4.3 of the DWSNZ.
See Chapter 8: Protozoal Compliance, section 8.2.2 for a discussion on the procedure for
determining the protozoal log credit requirement for bore waters.
Guidelines for Drinking-water Quality Management for New Zealand 2013
87
3.2.4.2 Demonstrating groundwater residence time
Water significantly influenced by surface or climatic conditions is most likely to be water that
has only been underground for a short time, or from an aquifer containing a fraction of
‘younger’ water. The opportunity for the levels of disease-causing micro-organisms to be
reduced by die-off, dilution, or filtration as the water moves through the ground is much less in
new than old groundwaters. Being able to show that there are no signs of above-ground
influences therefore provides additional confidence that the acceptable microbiological
(including protozoal and viral) quality of the water already shown by the E. coli testing will
continue.
Bore water security criterion 1 provides two options by which the absence of above-ground
influences can be demonstrated:
a)	showing directly that the water is not new by isotopic (tritium), chlorofluorocarbon (CFC) and sulphur hexafluoride (SF6) measurements
b)	showing indirectly that the water is unlikely to be new by the low variability of its physicochemical characteristics.
A third option, based on hydrogeological modelling, is only available in the event that difficulties
arise with the first two options. These difficulties are discussed below.
Determination of the residence time
The mean residence time of groundwater is the average time the water has been underground,
from the time it leaves the surface to when it arrives at the point of abstraction. During this time
the concentrations of microbiological contaminants will be minimised due to mechanisms
including filtration, dispersion and die-off. The DWSNZ require the estimation of the residence
time to be made by measurements of tritium, CFC (chlorofluorocarbon) and SF6 (sulphur
hexafluoride) concentrations in the water.
Age dating of water yields an average age of the water. Although this is helpful, most
groundwaters are mixtures of water with different ages because of the nature of flow in porous
materials. What one really wants to know is: what is the fraction of the water with age less than
one year? The DWSNZ specify that this fraction must be less than 0.005 percent of the water
present in the aquifer. This young fraction can be determined from a series of samplings for
tritium, CFCs and SF6, separated in time by several years. Single samplings of tritium, CFCs and
SF6 can sometimes be used for less precise estimates of the young fraction, but must be
confirmed by future sampling.
Tritium is a radioactive isotope of hydrogen, which decays with a half-life of 12.4 years (the time
it takes for the number of tritium atoms present in a sample to decrease by 50 percent). It is an
ideal tracer for groundwater because it is a component of the water molecule, and the
information it provides is unaffected by chemical and microbial processes, or reactions between
the groundwater and soil, sediment or aquifer material.
Cosmic rays passing through the atmosphere generate natural background levels of tritium, but
during the 1950s and early 1960s large amounts were produced in the atmosphere by
thermonuclear tests, see Figure 3.1. Tritium is distributed between the atmosphere and bodies
of water such as lakes, rivers and most importantly the oceans, but its concentration changes
with time and location. Concentrations of tritium in rainfall reflect the local tropospheric tritium
concentrations, and allow the input of tritium into land-based hydrological systems to be
assessed. Once rainwater infiltrates into the ground, it is separated from the atmospheric
tritium cycle, and its tritium concentration, which is no longer affected by exchange with the
atmosphere, decreases by radioactive decay. The tritium concentration in the groundwater
therefore depends on the time it has been underground.
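The decay relationship described above can be sketched numerically. This is a minimal illustration only, using the 12.4-year half-life quoted above; the tritium concentrations in the example are hypothetical, not measured values, and real age interpretation uses the mixing models discussed later in this section rather than a simple piston-flow ratio:

```python
import math

HALF_LIFE_YEARS = 12.4  # tritium half-life as quoted in the text


def fraction_remaining(years_underground: float) -> float:
    """Fraction of the original tritium left after a given time underground."""
    return 0.5 ** (years_underground / HALF_LIFE_YEARS)


def apparent_age(measured: float, initial: float) -> float:
    """Unmixed (piston-flow) age implied by a measured/initial tritium ratio."""
    return HALF_LIFE_YEARS * math.log(initial / measured) / math.log(2)


# After one half-life, half the tritium remains.
print(fraction_remaining(12.4))  # 0.5
# Hypothetical example: water retaining a quarter of its recharge tritium
# has been underground for two half-lives (24.8 years).
print(apparent_age(measured=1.0, initial=4.0))  # 24.8
```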
CFCs are a family of entirely man-made contaminants of the atmosphere and hydrological
systems. They are used industrially for refrigeration, air conditioning and pressurising aerosol
cans, and once released become widely distributed in the atmosphere because of their high
chemical stability in this environment. Their use is being increasingly regulated.
Before 1940 CFCs were not present in the atmosphere, but since then their concentration has
increased with their increased use, see Figure 3.1. They are slightly soluble in water and are
therefore present in recharge waters at a concentration that depends on the temperature of the
water and the atmospheric CFC concentration at the time of recharge. Measurement of the CFC
concentration in a groundwater therefore allows the time at which the water entered the ground
to be determined.
CFCs have given reliable age results for waters in the majority of groundwaters analysed in New
Zealand. There are, however, potential sources of error with the measurements. CFC-11 (one
member of the CFC family) is more susceptible to degradation than CFC-12 in underground
environments that are oxygen-deficient; the water may therefore be calculated to be older than
it really is. CFC-12 measurements, on the other hand, are more susceptible than CFC-11
measurements to interference from local sources of contamination. This may make the water
appear younger than it is.
The properties of SF6 make it useful in a number of industries (eg, the electrical and electronics
industries, magnesium manufacturing, refrigeration, air-conditioning, foam production).
Because of its chemical stability it has a lifetime in the atmosphere of the order of 3200 years.
Atmospheric concentrations before 1970 were zero, but its concentration has increased at
approximately 8 percent per annum since then. It is slightly soluble in water and provides
another marker for establishing residence time. Its potency as a greenhouse gas is causing its
use to decrease and its atmospheric concentration will eventually decrease. Natural occurrences
of SF6 have been noted in some deeper aquifers in volcanic areas (Van der Raaij 2003).
Figure 3.1: Atmospheric CFC and tritium concentrations in rain water
Detection of new water
A sample of groundwater does not have a discrete age. Mixing processes underground ensure
that any sample of water consists of a mix of waters of various ages. Different mathematical
models have been developed to account for the distribution of water ages that may arise in a
particular situation. Factors such as the nature of the aquifer (confined or unconfined) and the
way in which the aquifer is recharged (rainfall or river) influence the form of the model.
The DWSNZ require that the new water fraction (water less than one year old) is less than
0.005 percent, based on a number of assumptions. This criterion has been set as a realistic
lower limit for age determinations once accepting likely uncertainty in model fitting parameters
and detection limits.
Sources with <0.005 percent of groundwater less than one year old are considered unlikely to be
contaminated by disease-causing micro-organisms primarily because of die-off and filtration
processes. This is supported by monitoring data to date from groundwater supplies in New
Zealand. Contamination from ‘young’ groundwater entering an aquifer via leakage through
preferential pathways is not accounted for in dating analysis; this makes sanitary bore head
protection a very important tool.
The mathematical models used to estimate residence time distribution require two parameters
to describe realistic groundwater situations: the mean residence time and the dispersion
parameter. The dispersion parameter reflects the degree of mixing and is a measure of the
spread of the times different components of the water have been under the ground.
Adjustment of these two parameters to give the best match to the measurements allows the
distribution of residence times to be determined. The presence of new water can be estimated
from this. Where the analytical results are clear or the hydrogeological situation well
understood, a single dating exercise may be sufficient to allow the mean residence time and the
fraction of new water present to be determined.
Age calculation
Measured CFC concentrations in groundwater are corrected for excess air and used to calculate
corresponding atmospheric concentrations using Henry’s Law and an estimated recharge
temperature. The excess air correction and recharge temperature are calculated from the ratio of
dissolved nitrogen and argon concentrations, which are measured simultaneously with the CFC
concentrations (Heaton and Vogel 1981). The calculated atmospheric concentrations are then
used to calculate the CFC model age of the groundwater (Plummer and Busenberg 2000).
To calculate the mean recharge ages and young fraction, GNS uses an exponential piston flow
model (Zuber 1986). A conservative estimate of the mixing fraction of 90 percent has been used.
This approximates a mainly exponential mixing situation as may occur in an unconfined aquifer.
For a confined aquifer, the mixing fraction is likely to be somewhat lower than this estimate;
however, in this case, the use of the higher mixing fraction does not affect the calculated young
fraction. The mixing fraction estimate may be further refined by additional measurements in
two years’ time. Alternatively, the ages may be calculated using a mixing fraction of 70 percent
in the aquifer (E(70 percent)PM). The error shown represents the uncertainty in estimating the
mixing fraction within the aquifer, and is estimated at ±10 percent. That is, the upper and lower
limits of the error on the mean recharge age are given by E(80 percent)PM and E(60 percent)PM
respectively.
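The young-fraction consequence of the exponential piston flow model can be sketched as follows. The formula is the standard result of the model’s age distribution (a piston-flow delay followed by an exponential mixture), but the mean ages used here are purely illustrative; GNS’s actual calculations also fit measured tracer concentrations against atmospheric input histories:

```python
import math


def young_fraction(mean_age: float, mixing_fraction: float, cutoff: float = 1.0) -> float:
    """Fraction of water younger than `cutoff` years under an exponential
    piston flow model: a piston-flow delay of T*(1 - f) years followed by
    an exponential mixture with mean f*T years (T = mean residence time,
    f = mixing fraction)."""
    piston_delay = mean_age * (1.0 - mixing_fraction)
    if cutoff <= piston_delay:
        return 0.0  # even the youngest water in the mix exceeds the cutoff
    return 1.0 - math.exp(-(cutoff - piston_delay) / (mixing_fraction * mean_age))


# E(90%)PM, as used above for mainly exponential mixing: an illustrative
# 40-year mean age leaves a 4-year piston delay, so no water is < 1 year old.
print(young_fraction(mean_age=40, mixing_fraction=0.9))  # 0.0
# A much younger 5-year mean age gives only a 0.5-year delay and a
# measurable young fraction.
print(young_fraction(mean_age=5, mixing_fraction=0.9))
```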
Variability in physicochemical determinands
All groundwater is recharged from the ground surface at some time, but water that has been in
the ground only a short time is likely to be more variable in quality than older water. The
processes of mixing and dispersion that occur as the water and dissolved constituents travel
through the groundwater system mean that any variability in the quality of water that has
recently entered the ground will tend to decrease with time and distance travelled. Deeper
waters therefore tend to be older and less variable in quality than shallower waters, because of
the time taken to travel to greater depths. Waters contained in confined aquifers (those overlain
by confining layers (aquitards)) also tend to be older than shallow water in unconfined aquifers.
In this case, the water has to travel along the aquifer from the recharge zone, which generally
takes longer than permeating vertically into a shallow unconfined aquifer.
Variability in the composition of groundwaters may arise for any of six reasons:
1	Seasonal recharge: During winter and spring, high rainfall combined with lower evaporation rates and lower plant productivity causes greater leaching of chemicals, such as nitrate, stored in the soil. More new water enters the aquifer from the surface at this time and may therefore show high concentrations of these chemicals.
2	River recharge: High river levels during winter and spring may increase the amount of new water feeding into aquifers. Flood events may also increase the recharge with new water, but this may not be related to season.
3	Intermittent discharge events: Activities that contribute to variability in water quality in this category include septic tanks leaking or overflowing during high rainfall, fertilising of pastures, land application of sewage, or chemical spills. Where bore heads are non-secure, floods may also cause spikes of contamination as the result of floodwaters running down the bore casing.
4	Groundwater abstraction: Changes in pumping regime in the supply bore or neighbouring bores can cause changes in groundwater flow directions or leakage rates. As a result, seawater (if near the coast), and water from nearby rivers, or overlying or underlying aquifers may be drawn into the vicinity of the supply bore.
5	Changes in climate and land use: Gradual, long-term changes in water quality may arise from changes in climate and land use. Although they may not affect the security of the bore water, because changes may be slow enough to allow removal of micro-organisms, they may have important implications for the future water quality and quantity from the aquifer.
6	Earthquakes: Earthquakes can disrupt the confining layers, rupture bores and alter flow paths, see section 3.2.3.1.
Table 3.2 summarises statistics for the conductivity, chloride and nitrate concentration data
used to establish the variability criteria contained in Bore water security criterion 1,
demonstration 2 (section 4.5.2.1 of the DWSNZ). The criteria are maximum allowable values,
and are designed to be conservative. All three determinands must be used. Some supplies not
meeting the criteria for secure status on the basis of chemical variation may still be found to be
secure using the age determination techniques.
Table 3.2: Statistics showing the variability in conductivity, chloride and nitrate-N for
groundwaters

Determinand	Statistic	%
Conductivity	Coefficient of variation	3.0
Chloride	Coefficient of variation	4.0
Nitrate-N, mg/L	Standardised variance	2.5
The coefficient of variation and standardised variance in Table 3.2 are defined as follows:

Coefficient of variation (standard deviation/mean):

	√( [n∑x² − (∑x)²] / [n(n − 1)] ) ÷ (∑x / n)

Standardised variance (standard deviation²/mean):

	( [n∑x² − (∑x)²] / [n(n − 1)] ) ÷ (∑x / n)

where x is a concentration of a determinand from the set of monitoring results and n is the
number of results.
Use of the coefficient of variation with data near the detection limit can create difficulties
because the standard deviation is controlled more by uncertainty in the measurements than true
variability in the determinand. This situation often arises with nitrate measurement. To
minimise this effect of measurement uncertainty in nitrate measurements, the standardised
variance is used as the statistic for nitrate measurements. The value of the standardised
variance, however, is dependent on the units in which it is expressed, and nitrate concentrations
must therefore be expressed as NO3-N in mg/L when making this calculation for assessing
security.
Example
Table 3.3 lists two sets of results from measurement of nitrate in samples from two separate
bores. The statistics calculated from these are tabulated at the bottom.
The first step in deriving the required statistics is to calculate the mean and the standard
deviation for each data set. These functions are available on scientific calculators and in
spreadsheets such as Excel. Although the coefficient of variation is not the required statistic
for nitrate, it is included here to show how it is obtained, and for comparison discussed below.
The coefficient of variation is calculated by dividing the standard deviation by the mean. The
DWSNZ require the statistics to be expressed as percentages, therefore it is multiplied by 100 to
give values of 18 percent and 50 percent, for bores 1 and 2 respectively.
Table 3.3: Example of calculations of coefficient of variation and standardised variance

Sample number	Nitrate, mg/L as NO3-N
	Bore 1	Bore 2
1	0.24	1.6
2	0.21	3.8
3	0.15	2.1
4	0.15	1.7
5	0.21	3.1
6	0.15	1.0
7	0.22	0.9
8	0.21	1.6
9	0.15	2.3
10	0.18	1.1
11	0.21	3.2
12	0.15	4.2

Statistic
Mean	0.186	2.217
Standard deviation	0.034	1.116
Coefficient of variation	0.184	0.503
% Coefficient of variation	18%	50%
Standardised variance	0.006	0.562
% Standardised variance	1%	56%
The statistic to use for checking on the acceptable level of variability in nitrate is the
standardised variance. This is calculated by squaring the standard deviation, and dividing the
result by the mean. Again, it is expressed as a percentage by multiplying by 100. This yields
results of 1 percent and 56 percent for bores 1 and 2 respectively.
For the variation in the nitrate concentration to be considered acceptable, the standardised
variance expressed as a percentage must be less than 2.5 percent. Bore 1 therefore meets this
criterion, but bore 2 does not. Comparison of the coefficients of variation with the standardised
variance shows a relatively small difference between these two statistics for bore 2, where the
nitrate concentrations are well above the limit of detection (ca 0.05 mg/L as NO3-N). However,
in bore 1, where the nitrate concentrations are less than 0.5 mg/L NO3-N and closer to the limit
of detection of the method, there is a very large difference between the two statistics.
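The calculations in Table 3.3 can be reproduced in a few lines. This sketch uses Python’s statistics module, whose stdev function applies the sample (n − 1) formula given above:

```python
import statistics

# Nitrate results (mg/L as NO3-N) for the two bores from Table 3.3.
bore1 = [0.24, 0.21, 0.15, 0.15, 0.21, 0.15, 0.22, 0.21, 0.15, 0.18, 0.21, 0.15]
bore2 = [1.6, 3.8, 2.1, 1.7, 3.1, 1.0, 0.9, 1.6, 2.3, 1.1, 3.2, 4.2]


def cv_percent(data):
    """Coefficient of variation (standard deviation / mean) as a percentage."""
    return 100 * statistics.stdev(data) / statistics.mean(data)


def sv_percent(data):
    """Standardised variance (standard deviation^2 / mean) as a percentage."""
    return 100 * statistics.stdev(data) ** 2 / statistics.mean(data)


print(round(cv_percent(bore1)), round(cv_percent(bore2)))  # 18 50
print(round(sv_percent(bore1)), round(sv_percent(bore2)))  # 1 56
# Bore 1 passes the 2.5% standardised variance criterion; bore 2 does not.
print(sv_percent(bore1) < 2.5, sv_percent(bore2) < 2.5)  # True False
```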
Some secure bore waters can show almost identical chemical composition year after year, so
using chemical consistency can be an effective tool to demonstrate security. This applies in
about one third of bores; residence time should otherwise be used. As an example of the
complexity when using this approach, many of the groundwaters under Christchurch are
considered to be secure, but their chemical composition can vary, depending on the relative
proportions of water originating from rainfall or from the Waimakariri River. In situations
where there may be difficulties in assessing variability in any of the three determinands at low
concentrations, the water supplier should consult with the DWA, who may allow the result to be
disregarded for the determinand of concern. Supporting data that show clearly the reason for
high variability will have to be provided to the DWA for these data sets to be disregarded.
These samples must be collected with care. Ensure that the water in the sample bottle
represents the aquifer, not stale water that has been sitting in the pipe for some time. Sample
the bore head, before any treatment, and not from a tank or the distribution system. Refer also
to Sundaram et al (2009).
A very competent laboratory is needed when using the chemical consistency technique. For
example, as an indication of the likely variation that could be encountered (using somewhat
generalised uncertainties!):
•	a good laboratory technician measuring chloride in 12 replicate samples (ie, in one batch) should obtain a result in the order of 15 ±1 mg/L
•	the same person should achieve something in the order of 15 ±2 mg/L testing the samples in 12 separate batches (see Chapter 17, section 17.5.5)
•	but if different technicians did the test each time, the results could be 15 ±3 mg/L
•	if different laboratories were hired, the results may be something like 15 ±4 mg/L
•	and if these laboratories used different methods of analysis, results could even be 15 ±5 mg/L.
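The point of these generalised uncertainties can be made numerically. Assuming, for illustration only, that each ± figure approximates two analytical standard deviations, analytical noise alone can push the chloride coefficient of variation past the 4 percent criterion in Table 3.2 before any real aquifer variability is added:

```python
# Illustration of why laboratory performance matters for the 4% chloride
# criterion. Assumption (ours, not the Guidelines'): each ± figure above
# approximates two analytical standard deviations.
mean_chloride = 15.0  # mg/L

for plus_minus in (1, 2, 3, 4, 5):
    sd = plus_minus / 2.0  # assumed analytical standard deviation
    cv = 100 * sd / mean_chloride
    verdict = "within" if cv <= 4.0 else "exceeds"
    print(f"±{plus_minus} mg/L -> CV {cv:.1f}% ({verdict} the 4% criterion)")
```

Under this reading, only the single-technician results stay within the criterion: even the between-technician scenario (±3 mg/L) yields a coefficient of variation of 10 percent from measurement noise alone.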
If bore water security criterion 1 has been demonstrated on the basis of the variability of
chemical determinands, these determinands must thereafter be measured annually to check that
they continue to lie within the range found originally. In the event that they do not, the
appropriate statistical parameter should be recalculated with the inclusion of the new data to
determine whether the compliance criterion is still met. If the appropriate compliance criterion
is not met, the DWA should be consulted, and the cause for the increase in the variability
parameter considered. For example, the increase may result from a long-term trend of a change
in the determinand concentration. This is not an indication that the groundwater quality is
responding directly to events above ground. In this case, it may be appropriate to recalculate the
variability parameter using the 12 data collected most recently. Where no significant trend is
evident, the increase in the variability parameter may require a reassessment of the secure
status of the supply.
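The annual recheck described above can be sketched as a small helper that folds each new result into the most recent 12 data and retests the criterion. The new annual value used in the example is hypothetical:

```python
import statistics


def recheck(history, new_value, criterion_pct=2.5, window=12):
    """Recalculate the standardised variance with the newest annual result
    included, using the most recent `window` data, and test the criterion."""
    data = (list(history) + [new_value])[-window:]
    sv_pct = 100 * statistics.stdev(data) ** 2 / statistics.mean(data)
    return sv_pct, sv_pct < criterion_pct


# Bore 1 nitrate history from Table 3.3, plus a hypothetical new annual result.
history = [0.24, 0.21, 0.15, 0.15, 0.21, 0.15, 0.22, 0.21, 0.15, 0.18, 0.21, 0.15]
sv_pct, still_compliant = recheck(history, new_value=0.19)
print(f"standardised variance {sv_pct:.2f}% -> compliant: {still_compliant}")
```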
Approach for demonstrating that a water is not directly affected by surface or climate influences
The following is suggested as an approach for establishing whether a groundwater is directly
affected by surface or climate influences.
Approach 1: Chemical consistency
•	Check whether enough chloride, conductivity or nitrate data are available to determine the variability of all of these determinands, carry out further sampling if necessary, and calculate the statistics.
•	To calculate the statistical parameters for any one of these determinands, at least 12 data points are needed. They may be collected monthly over one year, bimonthly over two years or quarterly over three years.
•	When collecting samples it is important to ensure that the bore is fully purged of stagnant water by pumping out at least three bore-volumes before the samples are taken.
•	Once the data sets are obtained, calculate the appropriate statistical parameters (see Table 3.2 and the related Example above). The lack of surface or climate influences is demonstrated if the requirements of section 4.5.2.1 of the DWSNZ are met.
•	All determinands must meet the criteria for chemical determinand variability if the source is to be regarded as secure. If this requirement cannot be met, the absence of the influence of surface effects must be shown another way (see Approach 2). As discussed in the section on chemical determinand variability, if a dataset for one determinand cannot meet the requirements of section 4.5.2.1 of the DWSNZ, the results for this determinand may only be disregarded following approval by the DWA.
Approach 2: Determine the residence time by tritium, CFC and SF6 analysis
•	Samples for these measurements should ideally be taken between winter and spring, when groundwater systems are most likely to be recharged with new water. Local personnel can collect samples for tritium measurements, but CFC or SF6 sample collection must be managed carefully due to potential air entrainment. Although some correction can be made for additional air, Institute of Geological and Nuclear Sciences (IGNS) staff may be required to take the samples. IGNS should be contacted for analytical charges and charges for sampling personnel. The USGS Reston Chlorofluorocarbon laboratory website (http://water.usgs.gov/lab/) has a lot of further information.
•	Water suppliers should ensure that information provided to the MoH contains the following:
	–	a full description of the procedure used to determine the residence time, which includes the mixing model assumptions, the justification for these assumptions and an interpretation of the data
	–	the percentage of water that has been under the ground for less than one year (rather than an average residence time).
•	Further confirmational dating must be carried out if the analyst specifies it to be necessary.
	–	It is possible that a single measurement is insufficient to determine reliably the percentage of new water in a groundwater. For this reason more than one analysis is recommended. Subsequent residence time determination, eg, after two years, is likely to provide greater confidence, particularly for tritium analysis.
Approach 3: Hydrogeological modelling
•	Should residence time determination not be considered feasible due to the presence of non-meteoric tracers, and Bore water security criterion 1 cannot be demonstrated by chemical variability criteria, then a hydrogeological model can be used to establish the security of the aquifer.
•	For this approach, the water supplier is likely to require the services of a groundwater consultant who can contract to undertake modelling that meets the requirements stated in section 4.5.2.1 of the DWSNZ. The completed modelling must show that contamination by pathogens is very unlikely, to the satisfaction of a person (or persons) deemed (by the Ministry of Health) to be qualified to review the modelling work.
•	The results from hydrogeological models are calculated flowpaths and concentrations of determinands, microbiological or chemical, in the groundwater at a particular location and depth. The accuracies of the outputs from the model depend on:
	–	The appropriate choice of model: A range of models exists, not all of which may be suitable for a given situation. For example, distinctions need to be made between models designed to deal with point source contamination and those intended for non-point source contamination arising over an area. The number of dimensions that the model takes into account also needs to be considered. Where preferential flow paths exist, these should be accounted for in the model. Knowledge of the geology of the area being modelled is needed to assess whether such flow paths are likely.
	–	The accuracy of the input parameters: Values for input parameters must be provided for the model; the number of input parameters will depend on the model’s complexity. Some parameters, such as reaction rates and microbial die-off rates, may be universal, but others, such as adsorption coefficients and hydraulic conductivity, will depend on the soils or aquifer media in the area being modelled. This latter type of parameter must be given either a value obtained from laboratory or field measurements, or a conservative value, ie, a value that is more likely to lead to the model producing a non-secure conclusion if the selected value is inaccurate.
•	The modeller needs to be able to justify the selection of all input parameters, as the modelling will have to be verified as being acceptable by a person deemed qualified by the Ministry of Health. It is suggested that proposed models be discussed at the conceptual stage with persons advised by the MoH to ensure applicability.
•	Comparison of the predicted concentrations of microbial contaminants in the aquifer at the point of abstraction with the levels of contaminants permitted for a secure bore water (less than 1 E. coli per 100 mL) will indicate whether the source is predicted to be secure.
•	The model must be run under different scenarios designed to take account of all likely circumstances that would challenge compliance with Bore water security criterion 1. Models can be evaluated; see USGS (2004).
•	Model auditing guidelines prepared by PDP (2002) for MfE will be used to provide guidance for model review. A conservative approach will be required given the uncertainties inherent in modelling.
USEPA (2008) discusses (in the appendices) several methods for assessing groundwater time of
travel.
3.2.4.3 Establishing adequate bore head protection
A geological log and bore construction details should be available for all supplies being assessed
for bore water security. Without this information the aquifer hydrogeologic setting and the
suitability of bore design and construction cannot be assessed.
Proper bore construction including bore head protection is an essential requirement for
establishing bore water security criterion 2 (DWSNZ section 4.5.2.2). It is also important for the
protection of groundwater quality in the aquifer that the bore intercepts. As a minimum
requirement, bore construction should comply with the Environmental Standard for drilling of
soil and rock (NZS 4411, 2001), unless otherwise agreed by the MoH. Another useful source of
information is Minimum Construction Requirements for Water Bores in Australia, ARMCANZ
(2003).
Good bore head protection is required for all bores used for drinking-water, not just those
demonstrating security. ANZECC (1995) discusses wellhead protection plans. The AWWA (USA)
has produced a thorough Standard for Water Wells (see ANSI/AWWA A100-97, 1997, see full
list on http://www.awwa.org/files/Resources/Standards/StandardsSpreadsheet.xls).
Siting and construction
Assessment of bore design and condition, as well as hydrogeologic setting, should be made by a
person deemed qualified by the Ministry of Health. Sanitary bore head protection should
include an effective casing grout seal to prevent contamination from the ground surface. Where
there is doubt about bore integrity there are a number of techniques, such as casing pressure
tests and down-hole photography, which could be used but are likely to be beyond normal
requirement. In general, above ground visual inspection and bore construction data would
provide sufficient information.
All bores have a limited life and therefore periodic reviews are required. A five-yearly
assessment of bore head conditions, with any changes reported by the supply manager, should
provide appropriate assurance. It is important to develop a protocol for assessment.
The threat of aquifer contamination may be reduced both by proper bore construction (see
Figure 3.2), and by locating bores away from sources of contamination so that the normal
movement of groundwater carries contaminants away from the capture zone of the pumped
bore. Minimisation of the possibility of contamination from potential local sources such as
septic tanks and other waste disposal systems, fertiliser and pesticide stores, underground
petrochemical tanks etc, will assist in reducing the threat of contamination.
Septic tanks and similar potential sources of faecal contamination that discharge faecal matter
at shallow depths need to be sufficiently distant from the bore head that their discharge will not
be captured by the bore. For example (USEPA 2000), 41 states have setback distances (the
minimum distance between a source of contamination and a well) that are less than or equal to
100 feet for sources of microbial contaminants. Five states appear to require setback from all
sewage sources of more than 200 feet. Some require 50 feet from a septic tank and 10 feet from
a sewer line. Some of the differences relate to the nature of the soil, the depth to the
groundwater, and pumping rates.
ECan (2007) presented results of a groundwater modelling study to investigate whether the
separation distances for waste discharges in rules in their proposed Natural Resources Regional
Plan were adequate to mitigate the risks of virus contamination of water in domestic bores, and
to provide a foundation for possible alternative separation distances. Earlier work (ECan 1999)
had found that a separation distance of 50 m would achieve a 3 log reduction of E. coli.
However, viruses can be infectious at much lower concentrations and travel much further in
groundwater than bacteria, and they are now considered to represent the highest health risk of
contaminants in sewage discharges. Note that the USEPA (2000) Ground Water Rule
recommends a 4-log concentration reduction between drinking water supply wells and
contaminant sources. The current study found that much greater distances are required to
reduce the number of viruses, see Table 3.4.
Figure 3.2: Sanitary protection of a typical bore
Table 3.4: Log reductions of viruses in Canterbury’s alluvial aquifers

Log reduction in concentration   Average separation distance (metres)   Standard deviation of separation distance (metres)
2                                  25                                     27
3                                 140                                     67
4                                 389                                    125
5                                 764                                    227
6                                1186                                    337
7                                1594                                    427
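The averages in Table 3.4 can be interpolated for intermediate targets. The sketch below is illustrative only: the function name and the use of linear interpolation are assumptions, and site-specific hydrogeological assessment governs actual setbacks.

```python
# Illustrative sketch only: interpolates the ECan (2007) averages in Table 3.4.
# Actual setback distances must come from site-specific hydrogeological assessment.

LOG_REDUCTION_TO_DISTANCE_M = {2: 25, 3: 140, 4: 389, 5: 764, 6: 1186, 7: 1594}

def separation_distance(log_reduction: float) -> float:
    """Linearly interpolate the average separation distance (metres)
    for a target log reduction in virus concentration."""
    pts = sorted(LOG_REDUCTION_TO_DISTANCE_M.items())
    lo_lr, hi_lr = pts[0][0], pts[-1][0]
    if not lo_lr <= log_reduction <= hi_lr:
        raise ValueError(f"log reduction must be in [{lo_lr}, {hi_lr}]")
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= log_reduction <= x1:
            return y0 + (y1 - y0) * (log_reduction - x0) / (x1 - x0)

# The USEPA (2000) Ground Water Rule recommends a 4-log reduction:
print(separation_distance(4))  # average distance in metres for a 4-log reduction
```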
Factors that influence the required setback distance include:
• the discharge source characteristics (eg, size of the disposal system)
• the volume and concentration of micro-organisms in the discharge
• the nature of the soil in the infiltration zone
• whether there is a confining layer above the aquifer
• the direction of the groundwater flow with respect to the septic tank and bore head
• the conductivity of the aquifer material, which influences the groundwater velocity
• the depth to the aquifer
• the groundwater pumping regime.
ESR (2010) developed the ECan study into national guidelines. Section 7.1.1 of those guidelines includes:
Where regional plan rules require a separation distance from a well, these generally
require that the discharge from the on-site wastewater system must be separated from a
domestic well by between 20–50 m, depending on the particular regional plan. These
separation distances have been imposed generally without substantive scientific basis or
specific consideration of the sensitivity of groundwater to contamination at the location of
the discharge. Where separation distances have been based on some scientific evidence of
contaminant transport, they relate to bacteria.
The development of the Guidelines has shown that, in many instances, bacteria-based
separation distances will be insufficient to protect drinking water quality from viruses
discharged in domestic wastewater, and that the potential for viruses to be present in
groundwater should be recognised in many more situations than is currently the case.
Their Guidelines develop separation distances for different hydrogeological settings and
geology; these are summarised in their Table 8.2. Worksheets are included.
Backflow prevention
The design and construction of a bore water supply should effectively prevent the ingress of
contaminants from the ground surface by using a grout seal. Mixing of hydraulic or aquifer units
of different water quality should also be prevented. The bore head should thereby minimise the
possibility of contamination of the aquifer from the surface: by backsiphoning, by
contaminants passing down the outside of the bore casing due to a poor seal between the casing
and the ground, or through cracks in the bore head or casing.
The possibility of backflow of contaminated water from the treatment plant or distribution
system down into the bore should also be guarded against by the use of a backflow prevention
device. Backflow is defined as a flow that is contrary to the normal intended direction of flow.
Increasingly, sludge and/or fertilisers are being applied to pasture by pumping them into the
flow of water (which is often groundwater) being used for irrigation; this is a classic situation
where contamination by backflow can occur.
One of the requirements in providing a satisfactory bore head is that an effective backflow
prevention device is in place; see section 4.5.2.2 of the DWSNZ. The exact nature of the
mechanism is not specified in the DWSNZ, as it was appreciated that different circumstances or
local requirements may require different approaches to providing this protection. The NZWWA
(2006) Code of Practice Backflow Prevention for Drinking Water Suppliers describes backflow
prevention devices as including reduced pressure backflow devices, double check valves –
testable and non-testable, dual check valves, vacuum breakers and air gap separation. The
NZWWA Code of Practice has other requirements too, such as: ‘The water supplier shall ensure
that those involved in the specifying, installation and monitoring of backflow devices are
appropriately trained to carry out their work.’ The NZWWA Code of Practice covers backflow
technician qualifications and backflow device certifications. The MoH supports the CoP. The
MoH also prepared (in 2001) a PHRMP Guide, D2.4: Backflow Prevention. NZWWA (2006) was
revised in 2013; the 2013 revision includes:
The water supplier should ensure that all groundwater takes from an aquifer have
adequate backflow protection. The backflow protection programme should require that
bores are drilled, constructed and maintained in a manner that avoids any contamination
of, or cross-connection with, groundwater aquifers. This should include ensuring that well
head construction on all bores incorporates a boundary device and, where required, a flow
measuring device. Groundwater takes for irrigation or stock water with direct injection of
chemicals should require, as a minimum, a double check backflow device to protect the
aquifer.
Regional councils (see section 2.5.5.8 in NZS 4411 (2001): Environmental standard for drilling
of soil and rock) require bores to include some form of backflow protection to prevent
contamination of the groundwater system, so the regional council should be contacted first.
Once they have explained their requirements, contact an experienced backflow prevention
company: backflow prevention can be expensive, and the device has to be installed so that it can
be tested and so that the pump is not damaged.
A side effect of using a backflow prevention device is the pressure loss it creates. The loss
depends on the type of device, increasing from single-check valves through dual-check and
double-check valves to reduced-pressure backflow preventers. Single-check valves and dual-check
valves are not approved backflow prevention devices because they cannot be tested. Double-check
valves are available as ‘testable’ and ‘non-testable’. Some regional councils may permit the use
of non-testable double-check valves, particularly if it can be shown that the bore will be used in a
non-hazardous situation.
Devices that offer greater levels of public health protection but greater pressure losses, such as
reduced-pressure backflow preventers, require regular testing and maintenance.
Single-check valves provide a lower degree of protection. Some regional councils may allow the
risk associated with a possible loss of pressure to be reduced by installation of two single-check
valves: one at the pump and the second a little downstream of the bore head.
Maintenance
Once a secure bore head has been constructed, its security needs to be maintained. The status of
the bore head should be reported to the DWA annually. The information required includes the
adequacy of the surface seals and/or caps, the integrity of materials, observations about any
contaminant sources located in the immediate area of the bore (eg, storage of hazardous
compounds, stormwater or effluent discharges, the application of pesticides, or animal wastes
near the bore), and anything else that may be important, such as a change in pressure or flow.
In addition, whenever a bore water sample is taken for testing, the integrity of the bore head
should be checked and recorded. Any problems (eg, vandalism, earthquake or other damage)
should be reported immediately, followed by additional sampling to see if the water quality has
been affected. Where possible, steps should be taken to protect the bore head from accidental
damage. For example, protective barriers (for physical protection rather than protection against
faecal contamination) should be placed around bore heads located where they may suffer
damage by vehicles. This step is designed primarily to ensure security of supply.
Animal control
Where a bore head is located in the same field as stock, it should be fenced off to stop faecal
matter being deposited close to the bore head, and to remove the possibility of damage. The
MoH PHRMP guide for abstraction from bores (PHRMP P1.3 – Groundwater Abstraction –
bores and wells) recommends that the fence keep stock a minimum of 10 m from the bore head,
and certainly no closer than the 5 m stipulated in section 4.5.2.2 of DWSNZ.
3.2.4.4 Multiple bores serving a drinking-water supply
Where a drinking-water supply is sourced from a number of bores, separate monitoring of each
bore can lead to a large number of E. coli samples having to be taken. The DWSNZ
(section 4.5.3) allow water suppliers to reduce the number of monitoring samples taken, if they
can show that all bores are receiving water of the same quality, and that contamination of this
water is very unlikely. If this is done, the bore considered to be the most vulnerable must be
used to represent the bore field.
The water supplier must be able to show that:
• the bores draw from the same aquifer under similar conditions
• any aquitard protecting the source is continuous across the bore field
• each bore head satisfies Bore water security criterion 2
• the chemical character of the water from each bore is similar
• monthly samples from the individual bores contain <1 E. coli per 100 mL for three consecutive months.
These requirements are designed to give confidence that water quality information gathered
from one bore in the field accurately reflects the quality of the water from other bores in the
field. Information from several sources will probably be needed to provide this confidence. A
number of these are discussed below. A preliminary step is to check that the depth the bore
draws from is truly as reported.
Stratigraphic data
Stratigraphic information can be obtained from bore logs. These reveal the nature of the various
geological strata the bore passes through at that location and the thicknesses of these strata. The
nature of the aquifer media in each stratum will allow the permeability of the layer to be
evaluated and potential aquitards identified.
For a small bore field, identification of the same aquitard at each bore location will strongly
support the assumption that it is continuous throughout the field. However, where the bore field
is extensive, and there are substantial distances between locations for which stratigraphic data
are available, the presence of the same aquitard in neighbouring bores does not guarantee that
there are no breaks in the aquitard in the intervening area.
An indication that there are breaks in the aquitard may be gained from pump tests.
Pump tests
Values for the transmissivity (the rate at which water is transmitted through a unit width of an
aquifer under a unit hydraulic gradient) and the storage coefficient (the volume of water an
aquifer releases from or takes into storage per unit surface area of the aquifer per unit change
in head) define the hydraulic characteristics of a water-bearing formation. Their evaluation at
locations across a bore field will help in determining whether the aquifer drawn from at one
bore is the same aquifer intercepted by other bores in the area.
Storage coefficient values will help to determine whether the pumped aquifer is confined or
unconfined, as the coefficient values in confined aquifers are orders of magnitude smaller than
those of unconfined systems, cf 10⁻⁵–10⁻³ (confined) and 0.01–0.3 (unconfined); the units are
dimensionless.
These values can be evaluated by a number of means, but the most reliable is to undertake a
pump test at the bore field. Constant-rate and step-drawdown tests are the two most useful
forms of pump test. Both require observation bores to be sunk at distances out from the
production bore. The reader is referred to texts (eg, Driscoll 1986; Domenico and Schwartz 1998)
for more detailed information on the subject of pump tests.
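As a rough screening aid, the indicative storage coefficient ranges quoted above can be applied directly. This sketch is illustrative: the function name and thresholds are taken only from the ranges in the text, and real classification requires full pump-test interpretation by a hydrogeologist.

```python
# Rough screening sketch using the indicative ranges quoted above.
# Real classification requires full pump-test interpretation by a hydrogeologist.

def classify_aquifer(storage_coefficient: float) -> str:
    """Suggest whether a (dimensionless) storage coefficient looks
    confined (~1e-5 to 1e-3) or unconfined (~0.01 to 0.3)."""
    if 1e-5 <= storage_coefficient <= 1e-3:
        return "likely confined"
    if 0.01 <= storage_coefficient <= 0.3:
        return "likely unconfined"
    return "indeterminate: check test data"

print(classify_aquifer(5e-4))   # likely confined
print(classify_aquifer(0.15))   # likely unconfined
```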
Interpretation of time-drawdown plots, obtained by measuring the levels of drawdown in
production and observation bores over time after a pump test starts, can provide information
about additional aquifer characteristics, such as potential recharge from nearby rivers or springs
and leakage through confining layer(s). Depending on the location of abstraction bores, pump
tests may also indicate intersecting hydraulic effects of drawing water from multiple bores.
Water chemistry measurements
Microbiological water quality data are of little value in helping to assess whether a number of
bores are all drawing from the same aquifer, especially if the water is of generally good
microbiological quality and the levels of indicator organisms are below the limit of detection.
However, most key chemical constituents of a bore water will be present at detectable
concentrations. Comparison of the chemical characteristics and determinand ratios would
indicate similarity.
Several different ways of displaying water quality data can be used, eg, tabulation of the data;
bar graph; pie chart; Piper diagram; Stiff diagram. The most useful diagrams are probably those
developed by Piper (1944) and by Stiff (1951), both of which require the concentrations of the
constituents, which are usually expressed in mass per volume, to be expressed in equivalents per
litre. Piper diagrams are trilinear and more complicated to interpret, but Stiff diagrams produce
a simple pictorial representation of the water quality. Comparison of the shapes of the polygons
that result from this form of analysis allows waters from similar hydrogeologic environments to
be traced over large areas. The use of cluster analysis and/or principal component analysis may
also be useful in illustrating similarities or differences in groundwater chemistry (Wilkinson
et al 1992).
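The conversion from mass per volume to equivalents per litre that Piper and Stiff diagrams require is straightforward. The sketch below uses standard formula weights; the sample values and the function name are hypothetical.

```python
# Sketch: convert major-ion concentrations from mg/L to meq/L,
# the units required for Stiff and Piper diagrams.
# meq/L = (mg/L) / (formula weight / |charge|)

EQUIV_WEIGHT = {          # grams per equivalent = formula weight / |charge|
    "Ca2+": 40.08 / 2,
    "Mg2+": 24.31 / 2,
    "Na+": 22.99,
    "K+": 39.10,
    "HCO3-": 61.02,
    "Cl-": 35.45,
    "SO42-": 96.06 / 2,
}

def to_meq_per_litre(analysis_mg_per_l: dict) -> dict:
    """Convert a water analysis in mg/L to meq/L for each major ion."""
    return {ion: round(mg / EQUIV_WEIGHT[ion], 3)
            for ion, mg in analysis_mg_per_l.items()}

# Hypothetical dilute calcium-bicarbonate water:
sample = {"Ca2+": 20.0, "Mg2+": 4.9, "Na+": 11.5, "HCO3-": 61.0, "Cl-": 17.7}
print(to_meq_per_litre(sample))
```

Plotting the resulting meq/L values as a Stiff polygon, or as percentages on a Piper trilinear diagram, then allows visual comparison between bores.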
Waters that are chemically the same may be from the same aquifer, but marked differences in
water chemistry show that either the aquifer is different or the aquifer is being influenced by
input from another water-bearing formation.
Hydrogeological modelling
Modelling of the aquifer system by groundwater consultants may be undertaken to support
rationalisation of monitoring. Modelling is most applicable where the bore field is extensive. See
section 3.2.4.2, Approach 3, for some discussion on hydrogeological modelling.
3.2.4.5 Changes in security
The classification of a bore water as secure is not necessarily a permanent status. This is
reflected in the on-going checks required within each of the three bore water security criteria
specified in the DWSNZ. Signs that a supply may lose its secure status include:
• extreme events, such as floods or droughts, which may affect groundwater quality
• the aquifer structure being altered by a geological event, such as an earthquake
• a breach in the aquitard from developments in the confined area of an aquifer
• a major change in land use
• a large new bore affecting flow patterns or water levels
• corrosion of the bore casing, or damage or deterioration of the bore head, leading to surface water, or water from a poor-quality aquifer, entering the bore.
Some groundwater supplies may be a mixture of waters from different depths. Droughts, floods,
or periods of excessive drawoff may affect the relative contributions from these sources.
When the water supplier is aware of an event that may affect the secure status of the bore water,
action should be taken, where possible, to minimise the impact of that event on the water
quality.
E. coli should not be detected in a supply classed as secure. However, if it is, the actions that
need to be taken will depend on the number of samples in which the indicator has been found.
The consequences of E. coli detection in the water are given in section 4.5.5 of the DWSNZ and
are discussed in the next section.
3.2.4.6 Response to E. coli detection
Section 4.3.9 and Figure 4.1 of the DWSNZ cover the responses that must be followed when
finding E. coli in any sample of drinking-water entering the distribution system. For bore water
supplies, there are additional requirements.
Section 4.5.5 of the DWSNZ describes the response should E. coli be found in bore water. This
involves:
• additional E. coli testing (confirming bore water security criterion 3)
• checking the chemical consistency (confirming bore water security criterion 1)
• a sanitary inspection of the bore head (confirming bore water security criterion 2).
A secure bore water with only one sample containing E. coli is reclassified as provisionally
secure for the following 12 months. This means a faulty result will not require expensive
treatment to be installed. If E. coli is found in another sample during this 12-month
provisional period, the water must be reclassified immediately as non-secure. If a secure bore
water is classified as provisional more than twice in five years, retention of its secure status will
be at the discretion of the DWA.
If bore water that has been given interim secure status (section 4.5.2.3 of DWSNZ) contains
E. coli in any sample, the 12-month interim sampling regime must recommence. If E. coli is
found in a second sample during the 12-month interim period, the water must be reclassified
immediately as non-secure. Because the bore water no longer complies with the bacterial
compliance criteria, it will need to be disinfected. Non-secure bore waters also require treatment
to satisfy the protozoa compliance criteria in the DWSNZ.
Section 4.5.5 of the DWSNZ also specifies the actions to be followed if E. coli is found in the bore
representing multiple bores drawing from the same field.
3.2.4.7 Grandfathering
This section applies to bores that have been in use for some time and have a monitoring history,
but whose secure status has not previously been determined, or where the secure status may
have been given in error.
Water suppliers may present to the DWA for consideration the full history of a bore’s E. coli
monitoring and test results, along with all other relevant information. If these results indicate
that Bore water security criterion 3 is highly likely to have been satisfied, E. coli samples may be
collected quarterly for compliance monitoring, as per note 5 to Table 4.5 in the DWSNZ.
To be granted full secure status, bore water security criteria 1 and 2 must have been satisfied in
the previous five years.
Some guidance is offered regarding the number and frequency of samples required.
In the normal process, once a bore supply satisfies the requirements for being given interim
secure status, water from the bore must be monitored at least as follows:
a) weekly/twice weekly/daily depending on population (= 13, 26 or 90 samples in three months)
b) if no E. coli is found during three months of monitoring as in a), monitoring can be reduced to monthly, regardless of population (= another nine samples in nine months)
c) if no E. coli is found during nine months of monitoring as in b), bore water criterion 3 is satisfied.
If criteria 1 and 2 are satisfied, the bore supply can be called secure.
That means an interim secure bore needs 22, 35 or 99 E. coli-free samples in 12 months,
depending on population, before it can be called secure.
Thereafter, the secure bore water must continue to be monitored for E. coli indefinitely. For the
first 12 months after secure status is granted, monthly monitoring continues. If no E. coli
is found during that 12-month period, monitoring can become quarterly.
That means a bore that is given interim security and then becomes secure will need at least
34, 47 or 111 E. coli-free samples before being allowed to reduce to quarterly sampling.
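The sample counts above follow directly from the monitoring schedule; a minimal sketch of the arithmetic (the function and frequency-band names are illustrative, not DWSNZ terminology):

```python
# Sketch of the sample arithmetic described above for a bore moving from
# interim secure to secure status, then to quarterly monitoring.
# Initial frequency (weekly / twice-weekly / daily) depends on population served.

SAMPLES_FIRST_3_MONTHS = {"weekly": 13, "twice-weekly": 26, "daily": 90}

def samples_to_secure(frequency: str) -> int:
    """E. coli-free samples needed in 12 months for an interim secure bore
    to be classed secure: 3 months at the initial frequency, then 9 monthly."""
    return SAMPLES_FIRST_3_MONTHS[frequency] + 9

def samples_to_quarterly(frequency: str) -> int:
    """Total E. coli-free samples before monitoring may drop to quarterly:
    the 12 months above plus a further 12 months of monthly sampling."""
    return samples_to_secure(frequency) + 12

for f in ("weekly", "twice-weekly", "daily"):
    print(f, samples_to_secure(f), samples_to_quarterly(f))
# weekly: 22 then 34; twice-weekly: 35 then 47; daily: 99 then 111
```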
For these bore supplies (as per paragraph 1 of this sub-section), a DWA could consider accepting
three years of monthly, nine years of quarterly, or 18 years of six-monthly E. coli-free samples.
These three examples require 36 consecutive E. coli-free samples.
It is more serious for those bores that would normally need a five-year proving period. Ideally
these bore supplies shouldn’t be grandfathered. However, some may have been given secure
status incorrectly, so to avoid having to disinfect for five years, an alternative is suggested.
Normally, to satisfy Bore water security criterion 3, these bores would need at least 70 E. coli-free samples over a five-year period. Therefore, based on the previous paragraph, it is suggested
that six years of monthly, or 18 years of quarterly, or 36 years of six-monthly E. coli-free results
may be an acceptable alternative.
3.3 Surface water
Surface freshwaters (rivers, streams, lakes and impoundments) comprise those natural waters
that are open to the atmosphere and contain only relatively small quantities of dissolved
materials; generally (in New Zealand) much less than 1000 mg/L (Harding et al 2004).
Section 3.2 discusses groundwater. The DWSNZ treat springs as surface water, so they are
discussed in this section. Rainwater (usually roof water) is discussed in Chapter 19: Small and
Individual Supplies. Water UK (2012) summarises helpful ideas for catchment protection.
The convenience of having readily available and accessible sources of water rapidly renewed by
rainfall is offset somewhat by the susceptibility of surface waters to pollution from a variety of
diffuse and point sources. Point sources are clearly identifiable, have specific locations, and are
typically pipes and drains discharging wastes (Davies-Colley and Wilcock 2004).
In most catchments used for water supply, pollution will be from diffuse sources, arising from
land-use activities (urban and rural) that are dispersed across a catchment (Novotny 2003).
Diffuse sources include surface runoff, as well as subsurface drainage, resulting from activities
on land. The main categories of diffuse pollutants are sediment, nutrients and pathogenic
micro-organisms. Other categories of diffuse pollutants are heavy metals (principally from
urban land) and pesticides (mainly from agriculture and horticulture).
A summary of human activities that impinge on the suitability of freshwaters for potable water
is given in Table 3.5. Note that birds may be a significant source of faecal pollution in surface
waters as indicated by standard faecal indicators (eg, E. coli), and shed pathogens (eg, Giardia
cysts, Salmonellae and Campylobacter) (McBride et al 2002).
Table 3.5: Human activities and associated inputs into freshwater ecosystems with human health risks

Activity and associated contaminants:
• Agriculture and horticulture: sediments; nutrients; pesticides and other toxic chemicals and metals; faecal microbial contaminants
• Industry: nutrients; toxic chemicals and metals; oils
• Mining: sediments; toxic chemicals and metals
• Urbanisation, infrastructure and development: sediment; pesticides and other toxic chemicals and metals; oils; faecal microbial contaminants
• Recreation: oils and fuel; toxic chemicals

Health risks associated with these contaminants include: immune and endocrine disruption; retarded physical and cognitive development; blue baby syndrome; foetal malformation and death; nervous system and reproductive dysfunction; behavioural changes; cancers; and waterborne disease.

Modified after Slaney and Weinstein 2004.
3.3.1 Rivers and streams
About half of New Zealand’s drinking-waters are pumped from the ground, with the remainder
coming from surface sources (MoH 2003). Flowing waters (rivers and streams) are thus an
important source of supply in New Zealand and there is a need to ensure that adequate quantity
and quality is maintained in order to provide a reliable and safe source.
Water quantity
It is advantageous to have a good understanding of the flow (or hydrologic) regime of a river
selected for water supply. The regime defines the character of a river: how liable it is to floods or
to long periods of low flow, whether it is useful for water supply, and whether
impoundment is necessary to provide the volumes required (Duncan and Woods
2004). The continuous time-series record of river flow (hydrograph) can be analysed to estimate
the incidence of extreme flows as well as the response of flows to rainfall events.
During low flows there may simply not be enough water for supply, and intakes may even be
above low water levels.
During flood flows, water quality is often poor because sediment is discharged in high
concentrations along with other contaminants, notably faecal micro-organisms; coarse
floating material (logs etc) that can damage intakes is also often present.
Routine monitoring of New Zealand river flows began between 1900 and 1930 for hydroelectric
power generation, and from 1930 for designing flood protection works (Pearson and Henderson
2004). The National Hydrometric Network (Pearson 1998) is a key source of New Zealand
stream and river flow data, and is complemented by monitoring networks operated by regional
and district councils (Pearson and Henderson 2004).
Flow extremes, such as the frequency of floods of a given return period, or the mean annual
seven-day low flow, are useful in this respect and have been summarised for many New Zealand
rivers (eg, Hutchinson 1990).
Flow regimes are perhaps most simply linked to the flow-duration curve, a cumulative frequency
plot of flows that shows the proportion of time during which flow is equal to or greater than
given magnitudes, regardless of chronological order. The overall slope of flow-duration curves
indicates the flow variability of rivers. Clearly rivers that have fairly steady flow (eg, owing to
spring or lake sources) are preferable for supply to highly flow-variable (flashy) rivers. Flow-duration curves at the extremes are often fitted to analytical distribution functions for the
purpose of analysing risk of floods and low flows. For example, annual seven-day low flows are
often well-fitted by a log-normal distribution (Pearson and Henderson 2004). Flow regimes are
affected by climatic cycles, notably the El Niño-Southern Oscillation (ENSO), with stronger
westerly winds and more rainfall in the south and west during El Niño periods and less rainfall
in the south and west and more in the northeast during La Niña periods (Scarsbrook et al 2003).
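A flow-duration curve can be computed from a flow record in a few lines. This sketch uses a hypothetical record and the Weibull plotting position (an assumption; other plotting positions exist):

```python
# Sketch: build a flow-duration curve from a record of flows.
# Each point gives the percentage of time flow equals or exceeds that magnitude.

def flow_duration_curve(flows):
    """Return (exceedance_percent, flow) pairs, flows sorted descending."""
    ordered = sorted(flows, reverse=True)
    n = len(ordered)
    # Weibull plotting position: exceedance probability = rank / (n + 1)
    return [(100 * rank / (n + 1), q) for rank, q in enumerate(ordered, start=1)]

# Hypothetical daily mean flows (m3/s) for a small river:
record = [12.0, 8.5, 30.2, 5.1, 4.8, 6.7, 55.0, 9.9, 7.2, 5.5]
for pct, q in flow_duration_curve(record):
    print(f"{pct:5.1f}% of time flow >= {q:.1f} m3/s")
```

A steep curve from such a record indicates a flashy river; a flat curve indicates the steadier flows (eg, spring- or lake-fed) preferred for supply.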
Water quality
Water that has not been treated and is used for domestic supply is referred to as raw water, in
contrast with treated water that has passed through some form of treatment (eg, filtration,
disinfection). Drinking-water standards and guidelines mostly apply to treated waters.
There are few chemical constituents of water that can cause health problems from a single
exposure, except through massive accidental contamination of a drinking-water supply (WHO
2004). Where short-term exposure to a contaminant does not lead to health impairment, it is
often most effective to focus remedial action on finding and eliminating the source of
contamination, rather than on treating the water for removal of the particular chemical
constituent (WHO 2004).
Most chemicals posing a health risk are of concern only when long-term exposure occurs at
concentrations above the MAV, and where treatment to remove the chemical is not employed
(eg, Table 3.6). At times when flows and velocities are low, dissolved oxygen in small streams
may be very low because of sediment oxygen demand and insufficient reaeration (Wilcock and
Croker 2004). At such times appreciable concentrations of soluble, reduced forms of iron and
manganese may be released from anoxic sediments or from groundwater inflows which, on
contact with air, readily convert to insoluble oxide precipitates that have to be removed during
water treatment because they impart an unpleasant metallic flavour to water and deposit reddish-brown (iron) or black (manganese) stains.
Table 3.6: Some chemical constituents in untreated surface water used for drinking-water
supply that present a potential problem

Constituent of concern               Associated problem
Arsenic (As)                         Cancer, skin lesions
Fluoride (F⁻)                        Mottling of bones and teeth, fluorosis
Nitrate and nitrite (NO₃⁻ + NO₂⁻)    Methaemoglobinaemia for bottle-fed infants. Note that this has
                                     been disputed recently; see Lundberg et al (2004) and
                                     Addiscott et al (2004)
Dissolved organic carbon             Trihalomethanes produced by chlorination may be toxic, carcinogenic
Iron (Fe), manganese (Mn)            Unpleasant taste, discoloration caused by oxide precipitates
Other water quality chemical variables are important with regard to operational requirements
when water is treated for supply. These include: pH (a measure of the aggressiveness of water
with respect to corrosion), alkalinity (capacity to buffer natural waters against pH change), total
hardness (mainly divalent ions like Ca2+ and Mg2+ that are prone to forming precipitates), humic
substances that impart undesirable colour to water and can influence corrosion of copper, and
total dissolved solids (affects palatability when greater than about 1000 mg/L).
Typical New Zealand river and stream waters can be described as being dilute (low total
dissolved solids), soft (having low concentrations of Ca2+ and Mg2+), with neutral to slightly
alkaline pH and weak buffering (low-moderate alkalinity). They may be broadly described as
calcium-sodium-chloride-bicarbonate waters (Close and Davies-Colley 1990). There are some
notable exceptions to this where, for example, pH is low and bicarbonate alkalinity is high, or
total hardness exceeds 100 mg/L as CaCO3. These are generally well-documented through the
regular surveillance programme of surface waters used for drinking-water supplies, operated by
the Ministry of Health (MoH).
Microbial pollutants are generally of greater relevance to New Zealand surface waters than
chemical pollutants. Most microbiological agents of disease (pathogens) are derived from the
faeces of warm-blooded animals including humans. The presence of pathogens in waters is
sporadic, only occurring when waters are polluted by faecal matter from sick individuals or
carriers.
The Freshwater Microbiology Research Programme (McBride et al 2002) involved the
monitoring of 22 river and three lake sites for a suite of pathogen and indicator organisms,
fortnightly for 15 months (1998–2000). Of the 25 sites, five were source waters for treatment as
community drinking-water supplies, of which three were also recreational sites. Pathogenic
viruses and Campylobacter were detected at least once at all sites and there was little difference
between the drinking-water supply sites and the other sites with respect to the occurrence of
pathogens and concentrations of faecal indicator organisms.
The main issue for source waters was the high proportion of samples that contained
Campylobacter (60 percent) and viruses (54 percent) and the ability of drinking-water
treatment to kill (or inactivate) or remove them (McBride et al 2002).
Routine testing for pathogens is seldom conducted: a wide range of pathogens might
conceivably be present, and the tests are expensive at the required detection levels, so testing for
several pathogens in each sample quickly becomes prohibitive. Instead, microbiological
indicators of faecal pollution, such as the bacterium Escherichia coli (E. coli), which is
ubiquitous in faecal matter, are used as indicators of disease risk.
Lowland streams in New Zealand continue to receive discharges from community sewage
schemes, farm oxidation ponds and other point sources (NZWWA 1998; Wilcock et al 1999), but
diffuse sources of faecal pollution are now generally dominant.
Faecal contamination of streams can be very high during floods owing to mobilisation of
contaminated sediments and wash-in from contributing land areas of catchments. For example,
E. coli concentrations of 41,000 MPN/100 mL were measured in a flood event in an agricultural
stream, compared with a pre-flood level of about 100 MPN/100 mL (Nagels et al 2002).
Diffuse faecal microbial pollution from pastoral agriculture may come from runoff (eg, from
farm raceways), livestock accessing unfenced streams, and cattle crossings of streams
(Davies-Colley et al 2004). It is therefore important that key land uses within the catchments of
rivers used for water supply are known, so that the implications for water treatment are
understood. Some median concentrations reported for New Zealand streams and rivers are
shown in Table 3.7.
Sediments are the main reservoir of faecal contamination, with E. coli concentrations
approximately 1000 times base-flow levels (Muirhead et al 2004). These sediment stores are
mobilised by storm flows, which may have much higher (100 times) E. coli concentrations than
base flows (Nagels et al 2002).
108
Guidelines for Drinking-water Quality Management for New Zealand 2013
Table 3.7: Faecal contamination in a range of New Zealand streams and rivers

Region                                    Land use        Median E. coli¹        Reference
                                                          (number per 100 mL)
Whanganui catchment tributaries           100% pasture    830²                   Davies-Colley & Stroud (1995)
(steep hill country)
Waikato hill-country                      Pasture         400                    Donnison et al (2004)
                                          Native forest   100
                                          Pine forest     83
Waikato lowland                           Dairy           280–440                Davies-Colley and Nagels (2002)
Westland lowland                          Dairy           60–1000                Davies-Colley and Nagels (2002)
                                          Native forest   4
Low-elevation rivers throughout           Pasture         700                    Larned et al (2004)
New Zealand

1	Note that these data are from fixed-interval sampling and are generally taken at low flows. Flood-flow
	concentrations may be 100-fold higher.
2	Based on a median faecal coliform concentration of 920 cfu/100 mL, assuming about 90 percent E. coli.
3.3.2
Lakes and reservoirs
Lakes and reservoirs are used to store water during runoff periods for use during other times of
the year. The water in the reservoir is used to supply the needs of municipalities, industrial users
and the farming community, and can also be used to protect aquatic life by maintaining a
continual flow in the stream downstream of the reservoir.
Issues related to land-use and lake management were covered in the Lake Managers’ Handbook
which was published by the Ministry of Works in 1987 and is still widely used and highly
regarded by lake managers and others involved in water management. Aspects of this
publication were updated by MfE in 2002.
An impoundment can range in size and impact from:
•	a weir where the water supplier takes ‘run of flow’; this is usually when the required water volume is small compared with the flow in the river
to:
•	a multi-day retention impoundment; this is usually a stream where winter flows are stored to meet the summer water demand.
A weir does not change the water quality very much. A multi-day retention impoundment can
modify the water quality in the impoundment and downstream, and the quality of the water in
the impoundment can vary with depth (see Chapter 4: Source and Treatment Selection,
section 4.4.1). Off-river storage is discussed in Chapter 12: Pretreatment Processes.
Excessive productivity of phytoplankton (eutrophication) is probably the main water quality
problem in New Zealand lakes and is manifested by algal scums, turbid waters, deoxygenation of
bottom waters, unpleasant tastes and odours, and excessive macrophyte (aquatic weed) growth
(Vant 1987).
Blooms of blue-green algae (cyanobacteria) may release toxins at levels harmful to human
health when critical concentrations (about 15,000 cells/mL for contact recreation) are exceeded
(see Chapter 9).
Phytoplankton blooms can also impart unpleasant taste and odours to water that may require
costly forms of treatment. Faecal contaminants are less problematic in standing waters than in
rivers and streams, because of inactivation by lengthy exposure to sunlight and other
inactivation processes, predation and sedimentation (Auer and Niehaus 1993).
Algal blooms occur in lakes with high nutrient concentrations, such as those in pasture
catchments, during periods of calm, fine weather when high sunlight and stratification allow
algal cells to proliferate. Elevated concentrations of nutrient elements (N and P) are associated
with the intensification of agricultural land use. Furthermore, diffuse runoff from farms contributes inputs of
faecal matter and potentially, pathogens. Thus it is important that impounded waters being used
for drinking-water supply have in place ways of intercepting runoff, such as riparian buffer
zones that trap N, P and faecal microbes; protected wetlands that enhance N removal by
denitrification; and adequate fencing to keep stock away from waters and hence, minimise
inputs of faecal matter (Williamson and Hoare 1987). These issues are explored more fully in
section 3.5.1.
Prevention, through riparian strips, control of land use, etc, is more effective than using algicides
such as copper sulphate. Algicides are poor at removing an established algal bloom; they are
more effective at preventing a bloom if dosed early enough. Risk management issues relating to
algicides are discussed in the MoH Public Health Risk Management Plan Guide PHRMP Ref.
P4.1: Pretreatment Processes – Algicide Application.
Reservoir catchments that are predominantly native or plantation forest are likely to have lower
specific yields (kg/ha/y) of pollutants such as sediment and nutrients (Table 3.8).
Table 3.8: Specific yields (kg/ha/y) for different land uses in New Zealand

Land use                          SS           TN          TP
Intensive dairy                   142          35          1.16
Average grazed pasture            600–2000     4–14        0.3–1.7
Urban development                 200–2000     2.5–11      0.4–1.6
Plantation forest – disturbed     300–2000     0.06–0.8    0.4–8
Plantation forest – undisturbed   500          0.07–0.2    0.15
Native forest                     27–300       2–7         0.04–0.68

Source: Davies-Colley and Wilcock (2004). SS = suspended sediment; TN = total nitrogen; TP = total phosphorus
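The yields in Table 3.8 can be combined with land-use areas to give a first-order estimate of the annual pollutant load delivered to a reservoir (load = specific yield × area). A minimal sketch, using hypothetical catchment areas and the mid-points of the total nitrogen (TN) ranges tabulated above:

```python
# Mid-points of the TN yield ranges from Table 3.8 (kg/ha/y)
TN_YIELD = {
    "average grazed pasture": (4 + 14) / 2,   # 9 kg/ha/y
    "native forest": (2 + 7) / 2,             # 4.5 kg/ha/y
}

def annual_tn_load_kg(areas_ha: dict) -> float:
    """First-order TN load estimate: sum of specific yield x area."""
    return sum(TN_YIELD[use] * area for use, area in areas_ha.items())

# Hypothetical 1000 ha catchment: 600 ha pasture, 400 ha native forest
catchment = {"average grazed pasture": 600, "native forest": 400}
print(f"Estimated TN load: {annual_tn_load_kg(catchment):.0f} kg/y")
```

Such estimates are indicative only; actual loads depend on rainfall, soils and management, and flood events can dominate annual export.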
Rates of water movement in lakes (and reservoirs) are very slow by comparison with rivers and
this permits water composition to change substantially between inflows and outflows as well as
allowing large variations within different parts of a lake (Hamilton et al 2004). Density
stratification related to gradients in water temperature within deeper lakes results in contrasting
water chemistry in the upper (epilimnion) versus lower (hypolimnion) water layers.
The mean water residence time (in days) of a lake is given by:
τ = V/Q
where Q is the outflow (m³/day) and V is the lake volume (m³). τ varies from several hours to a
few days for reservoirs behind dams on rivers, and up to several years for lakes where the lake
catchment is small relative to the lake volume (Hamilton et al 2004).
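As a minimal worked example of this formula (the reservoir figures are hypothetical):

```python
def residence_time_days(volume_m3: float, outflow_m3_per_day: float) -> float:
    """Mean hydraulic residence time tau = V / Q, in days."""
    return volume_m3 / outflow_m3_per_day

# Hypothetical reservoir: 2 million m3 of storage, outflow of 50,000 m3/day
tau = residence_time_days(2_000_000, 50_000)
print(f"tau = {tau:.0f} days")  # tau = 40 days
```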
Phytoplankton are less of a problem in lakes and reservoirs with short residence times (weeks to
months) because cells tend to be washed out faster than they can multiply (Howard-Williams
1987). For flushing to be effective as a means of controlling algal biomass, the lake inflow must
be large enough, and there must be control facilities that allow the inflow to be regulated. Rapid
flushing of lakes may prevent buoyant scums formed by blue-green algae, by creating instability
in the water column (and reducing average light exposure and bicarbonate availability) through
increased circulation. Other methods for reducing nutrient concentrations and thus lowering
algal biomass in lakes include diversion of waters with high nutrient loads, and flocculation to
strip P from the water column by converting it into a solid form that settles (eg, alum was used
to strip P from Lake Okaro, in a trial in 2004).
At times when lakes are thermally stratified, hypolimnion waters may become deficient in
dissolved oxygen (anoxic or anaerobic), causing many constituents to occur in a reduced state
(eg, inorganic N will be predominantly NH4+).
By comparison, the epilimnion is well oxygenated through exchange with the atmosphere, and
constituents are nearly always in an oxidised form (eg, NO3- is the dominant form of
inorganic N). Thus, water drawn from deeper layers may undergo changes associated with
oxidation when passing through a water treatment plant.
The Hayes Creek reservoir (supplying Papakura District Council) has anoxic bottom waters
containing reduced Mn2+ that is readily oxidised to form black precipitates of MnO2 on exposure
to air. To prevent this, an oxygen curtain is deployed upstream of the water intake to oxidise the
Mn before it reaches the treatment plant.
H2S from geothermal sources, or produced by anaerobic metabolism of SO42- in sediments, may
produce the rotten-egg smell associated with many highly reducing environments (Hamilton
et al 2004). A recent example of this was H2S produced during decomposition of drowned
vegetation and soils in the lake formed by the Opuha dam in South Canterbury (Hamilton et al
2004).
Lakes and reservoirs can be aerated purposefully in order to reduce stratification. A common
result of destratification is an improvement in water supply quality (the first artificial circulation
system was used in 1919 in a small water supply reservoir). By introducing oxygen to the
(previously anoxic) hypolimnion, problems caused by reduced iron and manganese, and gases
like H2S, are greatly reduced as well.
The purpose of aeration in lake management is to increase the dissolved oxygen content of the
water. Various systems are available to help do this, either by injecting air, mechanically mixing
or agitating the water, or even injecting pure oxygen. Aeration can increase fish and other
aquatic animal habitats, prevent fish kills, improve the quality of domestic and industrial water
supplies, and decrease treatment costs. In some cases, nuisance algal blooms can be reduced, or
a shift to less objectionable algae species can occur. Risk management issues relating to
destratification are discussed in the MoH Public Health Risk Management Plan Guide PHRMP
Ref. P4.2: Pretreatment Processes – Destratification.
See Chapter 12: Pretreatment Processes, section 12.3.2: Off-river Storage for further
information, including reduction times for selected micro-organisms.
3.3.3
Springs
Springs are sources of emergent groundwater and may have very long or very short path lengths
from their source surface waters. When used for drinking-water supply, springs should provide
a reliable (continuous) supply of water and be of suitable quality. Non-piped water supplies,
such as water collected from bores or springs, may often be contaminated with pathogens and
require special treatment to achieve safe supply (WHO 2004). Even if spring water has reached
the surface from a great depth, it is likely to contain sub-soil water too. Springs can be
contaminated at the point at which water issues from the ground, eg, if animals are permitted to
graze nearby. Waterfowl may also contribute to high levels of E. coli around springs. Springs can
also be contaminated by runoff from the catchment that they drain, or from the soils that
surface water passes through before re-emerging in spring water. For example, springs draining
the Bombay Hills market garden areas have high nitrate concentrations that sometimes exceed
the drinking-water Maximum Acceptable Value (MAV) of 50 mg/L (as NO3) (Wilcock and
Nagels 2001).
Geothermal springs may contain health-significant concentrations of toxic chemicals, eg,
arsenic and mercury, and sometimes elevated levels of fluoride. Geothermal springs are not used for
drinking-water supply in New Zealand but they do influence the chemical composition of
Waikato River, which is a major water supply resource, by contributing significant levels of
arsenic and boron. Median concentrations of As and B are 0.028 mg/L and 0.26 mg/L
respectively, at the nearest upstream site to the Hamilton water treatment intake (Smith 2003).
The drinking-water MAVs for these elements are 0.01 and 1.4 mg/L, respectively (MoH 2005).
Some water supplies may be drawn from hydrothermal springs, where the water temperature is
higher than expected. These should be tested for the chemicals mentioned above.
Risk management issues relating to springs are discussed in the MoH Public Health Risk
Management Plan Guide PHRMP Ref. P1.4: Groundwater Abstraction – Springs.
3.4 Legislation
3.4.1
General
Catchment protection involves firstly defining the boundaries of the catchment and determining
who is responsible for the catchment and the protection of the water quality. A catchment is the
drainage area upstream of the raw water abstraction point, or the aquifer and recharge zone of a
groundwater system. The Resource Management Act 1991 allows (Schedule 2, Part 1, clause 1),
but does not require, provision to be made in a District Plan for ensuring an adequate supply of
water with regard to the subdivision of land. Steps taken to achieve this could include:
•	leak reduction plans to be implemented before increased abstraction is allowed
•	assessment of environmental effects of capital expenditure projects, including assessment of the costs they impose on present and future drinking-water supplies
•	domestic water saving programmes
•	industrial water use audits and water-use efficiency programmes
•	development of alternative supplies
•	restrictions on new subdivisions where the regional and district plans do not provide for an adequate drinking-water supply.
Under the Local Government Act 2002 (Part 7, Subpart 1, sections 124–126) a territorial
authority is obliged to assess the provision of water services and other sanitary services within
its district, and describe the means by which drinking-water is obtained by residents and
communities in the district, including the extent to which water supply is provided within the
district by the territorial authority or other persons. It must also describe whether the water is
potable (section 126(1)(i)(B)), and make an assessment of:
•	any risks to the community relating to the absence in any area of either a water supply or a reticulated wastewater service, or both (section 126(1)(b))
•	the quality and adequacy of supply of drinking-water available within the district for each community (section 126(1)(c))
•	a statement of current and estimated future demands for water services within its district (section 126(1)(d))
•	any issues relating to the quality and adequacy of supply of drinking-water for each community (section 126(1)(d)(i))
•	a statement of the options available to meet the current and future demands for drinking-water (section 126(1)(e))
•	the suitability of each option for the district and for each community within it.
The territorial authority is also to provide:
•	a statement of the territorial authority’s intended role in meeting the current and future demands identified in section 126(1)(d) (section 126(1)(f)), and proposals for meeting those demands, including proposals for any new or replacement infrastructure.
A local government organisation is defined under the Local Government Act 2002 to mean
“a local authority, council-controlled organisation, or a subsidiary of a council-controlled
organisation, that provides water services”. An organisation that provides water services to
communities within its district or region must continue to provide water services and maintain
its capacity to meet its obligations. It must not lose control of, sell, or otherwise dispose of, the
significant infrastructure necessary for providing water services in its region or district unless,
in doing so, it retains its capacity to meet its obligations.
Local government organisations must not close down or transfer a water service unless there are
200 or fewer persons to whom the water service is delivered who are ordinarily resident in the
district, region, or other subdivision; the opinion of the MoH has been made public; and
75 percent or more of the public have agreed.
A local government organisation may only close down a water service under section 131(1)(a) if
it has first reviewed the likely effect of the closure on the public health of the community that
would be affected (section 134(a)(i)) and on the environment in the district of that community
(section 134(a)(ii)); assessed, in relation to each property that receives the water service, the
likely capital cost and annual operating costs of providing an appropriate alternative service if
the water service is closed down (section 134(b)); and compared the quality and adequacy of the
existing water service with the likely quality and adequacy of the alternative service identified in
section 134(b) (section 134(c)).
A local government organisation may enter into contracts for any aspect of the operation of all
or part of a water service for a term not longer than 15 years (section 136(1)).
A local government organisation may only transfer a water service under section 135 if it has
first:
•	developed a draft management plan under which the entity representative of the community would maintain and operate the water service
•	assessed the likely future capital and operating costs for the entity representative of the community to maintain and operate the water service
•	assessed the ability of the entity representative of the community to maintain and operate the water service satisfactorily.
Knowledge of the localised hydrological conditions that contribute towards water quantity and
quality is essential for the design and implementation of a catchment protection scheme.
These conditions include natural inputs such as seasonal rainfall variations or regional geology,
and man-made inputs such as agricultural chemicals, industrial and domestic wastes, erosion
and animal activity. Once all factors contributing to water quality have been identified, the
planning and design of a catchment protection strategy can commence.
The prime objective of a catchment management strategy, or planning for a drinking-water
supply, should be to protect and, if necessary (and achievable), to enhance the quality of source
waters. The rules in the plans define the activities that can take place in the catchment.
Current legislation allows for the protection of the quality and other aspects of the source
waters. The predominant legislation under which this can be achieved is the Resource
Management Act 1991 (RMA), whose key purpose is the sustainable management of natural
and physical resources, and the Health Act 1956 (HA).
Regional councils have responsibility, under the RMA (s30(1)(c)), to control land use in order to
protect the water quality within their respective catchments. Responsibilities include controls
over the use and diversion of source water (RMA s30(1)(e)), discharge of contaminants into the
water (RMA s30(1)(f)), and, in relation to the bed of any water body, the planting of vegetation
on land for the purpose of maintaining and enhancing the water quality of that water body
(RMA s30(1)(g)). Regional plans, district plans and resource consents under the RMA are the main
tools for managing the water quality of source waters.
However, although these tools are available under the RMA, they are frequently not used
effectively. The provisions for drinking-water values in regional plans are an example of this:
only six of 16 regional councils or unitary authorities have a comprehensive approach
to the management of drinking-water catchments. The other councils have either not addressed
the issue, or have done so in a very general way (Ministry for the Environment 2004).
There is currently (2005) no specific requirement in the Resource Management Act for
consent authorities to consider the impact proposed activities may have on source water in a
drinking-water supply catchment. Consequently there is potential for land use activities/
discharges to be consented that reduce water quality at the point of abstraction to below that
which the plant is designed to treat. This presents potential health risks to the community and
may result in significant costs to the supplier in upgrading treatment facilities.
Part 7 (section 126) of the Local Government Act (2002) requires local authorities to undertake
a specific assessment of the quality and adequacy of drinking-water supplies. However there is
no requirement to manage source water quality, which is the aim of the National Environmental
Standard (NES), see next section.
While section 5 of the Resource Management Act refers to social, economic and cultural wellbeing for people and communities, there is no specific requirement for consent applicants to
consider the impact of their proposed activity on community drinking-water supplies. Whilst it
can be argued that the definition of environment in the Resource Management Act includes
public health, there is no specific reference to community drinking-water supplies in the Act.
The Ministry for the Environment has produced a National Environmental Standard under the
Resource Management Act to improve how drinking-water is managed at source. This standard
is intended to complement Ministry of Health legislation and standards for improving drinking-water supply and delivery; see section 3.4.2.
The Health Act 1956 allows the Governor General to declare, by Order in Council, any water
supply source, whether publicly or privately owned and operated, to be under the control of a
territorial authority if this is necessary in the interests of public health (HA section 61(2)). The
Health Act also makes it an offence to create a nuisance or to allow a nuisance to continue (HA
section 30) including allowing a water source to be offensive, liable to contamination, or
hazardous to health (HA section 29(p)).
Catchments dedicated for water supply purposes and under the control (by ownership and/or
declaration) of a territorial authority or regional council, may be controlled simply by the use of
bylaws. The Model General Bylaws for Water Supply define appropriate management controls
for the protection of water quality in such catchments (NZS9201: Chapter 7: 1994). There are
circumstances where specific legislation has been developed that relates to water supply, for
example the Wellington Regional Water Board Act 1972. This Act is an important statute for the
regional council under which it holds large areas of land in the Wellington metropolitan area.
In other situations abstractions for water supply are often only one of many demands on the
water resource. In this case the catchment management strategy or plan will need to be
incorporated within the overall regional (or district) plan process.
The reliability of production of a continuous, adequate, supply of safe water from a large river or
active catchment will be enhanced by use of off-river storage. This offers the ability to choose
when raw water should be abstracted, thus avoiding periods when water treatment may be
difficult, or when the river may be contaminated.
3.4.2 National Environmental Standards (NES)
For the multi-barrier principle to be implemented properly in the management of drinking-water
supplies, the water supplier needs to be able to put barriers to contamination in place
from the water source through to the consumer’s property. In the past, this has been made
difficult for many water suppliers in New Zealand by legislation that separated responsibilities for
catchment management from those of treatment and reticulation of water. The Resource
Management Act (1991) makes regional councils responsible for the management of source
catchments, while health legislation makes water suppliers responsible for the water supply
from the point of abstraction to the consumer.
To ensure that the supply of water for drinking-water production is taken into consideration when
decisions are made regarding activities in catchments, the Ministry for the Environment
developed a national environmental standard for raw public drinking-water, ie, source water. The
original proposal for this standard was that it be a grading standard. This approach did not require
a minimum water quality to be achieved, but it proposed the generation of a grade for the raw
water to assist communities in making decisions about the management of their water resources.
Following public consultation, the form of the NES was revised in early 2005. It is now a
narrative standard. The National Environmental Standard for Sources of Human Drinking
Water came into effect on 20 June 2008. The standard is intended to reduce the risk of
contaminating drinking water sources. See MfE (2009): Draft Users’ Guide.
The standard requires regional councils to ensure that effects on drinking-water sources are
considered when making decisions on resource consents and regional plans. Specifically, the
standard requires that regional councils:
•	decline discharge or water permits that are likely to result in community drinking-water becoming unsafe for consumption following existing treatment
•	be satisfied that permitted activities in regional plans will not result in community drinking-water supplies being unsafe for consumption following existing treatment
•	place conditions on relevant resource consents requiring notification of water suppliers and consent authorities if significant unintended events occur that may adversely affect sources of human drinking-water.
The draft standard was refined following public consultation in 2005 and several key changes
were made based on submissions received. These included:
•	applying the consent component of the NES to water and discharge permits only
•	assigning regional councils (not territorial authorities) the primary responsibility for implementing the majority of the standard (reflecting existing responsibilities and expertise in water quality)
•	increasing the community water supply population threshold for application of the standard from 25 to 500 people, to reduce implementation costs.
The Ministry for the Environment will produce guidance material to assist regional councils and
consent applicants to apply the new standard.
The NES is a regulation, so it is binding and prevails over rules and resource consents. More
details of the standard are available at:
http://www.mfe.govt.nz/laws/standards/drinking-water-source-standard.html
The NES covers emergency notification provisions. Under the NES, emergency notification
refers to the notification (preferably by phone) of authorities when an unintended activity
occurs (this differs from the notification of a consent application under the RMA). One key
difference between emergency notification provisions and previous parts of the regulation is
that they now apply to a smaller population threshold: activities with the potential to affect
registered drinking water supplies that provide 25 or more people with drinking water for 60 or
more days of a calendar year must be notified.
3.5 Mitigation of pollutants and catchment
protection
Water contamination may arise from a variety of sources, including seepage from pipelines,
human and animal effluent, landfill leachate, industrial effluent disposal, use of pesticides and
fertilisers, mining, leakage from underground tanks, transportation accidents, salt water
intrusion, and poorly constructed bores or bore-head protection (see Figure 3.3). Groundwater
contamination usually occurs in a far less conspicuous manner than contamination of surface
waters, and is discussed in section 3.2. This section discusses catchment protection and the mitigation of
surface water contamination. Some nutrient and sediment control practices are discussed in
MfE (2002). Refer also to Appendix 4 of MfE (2009): Activities and Contaminants that may
Contribute to Source Waters.
If a river has the potential to receive contamination that the treatment plant is not designed to
remove, consideration should be given to the use of off-river storage. Off-river storage is
discussed in more detail in Chapter 12: Pre-treatment Processes. Chapter 4: Source and
Treatment Selection also includes some discussion on catchment protection, mainly related to
micro-organisms.
Figure 3.3: Catchment protection
3.5.1
Rural activities
Whilst drinking-water catchments ideally should be devoid of inputs of human and animal
waste, in reality total absence is rare. Typically, therefore, the water treatment process can
benefit from attempts to mitigate such pollution at or near to its source. Wastes from animals
are known to contain nutrients, pathogens, heavy metals and endocrine-disrupting chemicals,
all of which can be transferred to water bodies by the deposition of urine and faecal material
directly to a stream or lake, and via surface and subsurface flow pathways.
A number of mitigation options exist to reduce this transfer, although the research to-date
typically has excluded heavy metals and endocrine-disrupting chemicals, focusing upon
sediment, nutrients and faecal microbes. However, treatment systems that effectively remove
sediment might be expected to also remove metals (eg, cadmium from phosphatic fertilisers;
zinc used for facial eczema treatment). If the water supplier owns the catchment, it is able to
control most land uses and activities. Some suppliers have done this, in some cases converting
from pastoral farming to forestry.
Farming (general)
The effect of agriculture on water quality depends on the size of the catchment relative to the
flow in the river (or the volume of the lake), the type, intensity and management of farming, and
climatic effects.
Problems commonly arise from animal wastes, especially from cowsheds, holding pens, holding
paddocks and yards, and where animals have direct access to water. Problems can also arise
from septic tank wastes, and from the transport, storage and use of pesticides and fertilisers.
Approaches that can be considered for mitigating these effects include:
•	allowing only approved animals
•	specifying stocking rates and grass/fodder length
•	setting standards for fencing
•	installing riparian strips, specifying their size and planting
•	adopting approved fertiliser application rates
•	using approved fertiliser applicators
•	using approved pesticides and application rates
•	using approved pesticide applicators
•	requiring bunded chemical storage areas
•	instituting waste controls and treatment, including for dairy sheds, offal pits, sheep dips etc
•	introducing holding paddock/yard/pen waste controls (pens include buildings for pigs, chickens, saleyards, etc).
A study of the public health issues associated with stock accessing waterways upstream of
drinking-water offtakes in Australia was reported by the Victorian Department of Health (2011);
risks to public health were estimated to be 5 log above tolerable levels. This report includes the
statement that the costs of outbreaks overwhelmingly exceed the costs of their prevention. A key
finding was that the major source of risk posed by Cryptosporidium parvum in typical grazing
water supply catchments arises from pre-weaned calves and lambs. Removing calves and lambs
from the catchment or housing them in hydrologically isolated areas can reduce the risk by
approximately 3 log.
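An n-log reduction corresponds to a surviving fraction of 10^-n, so the approximately 3 log reduction cited above leaves about 0.1 percent of the original risk. A minimal sketch of that conversion:

```python
def fraction_remaining(log_reduction: float) -> float:
    """Fraction of the original risk (or concentration) surviving an n-log reduction."""
    return 10 ** -log_reduction

# The ~3 log reduction cited above for removing pre-weaned calves and lambs
surviving = fraction_remaining(3)                 # 0.001 of the original risk
print(f"{(1 - surviving) * 100:.1f}% reduction")  # 99.9% reduction
```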
WHO (2012) stated that although there are a large number of zoonotic pathogens that affect
humans, five are known to cause illness around the world with high frequency:
Cryptosporidium, Giardia, Campylobacter, Salmonella and E. coli O157. Efforts to control
these pathogens are likely to be effective in controlling other related zoonotic pathogens,
whether known, as-yet-unrecognised or emergent. Domestic animals such as poultry, cattle,
sheep and pigs generate 85 percent of the world’s animal faecal waste, proportionally a far
greater amount than the contribution of the human population. The faecal production rate and
contribution to the environment of these animals can be as high as 2.62 × 10¹³ kg/year. Limiting
zoonotic pathogen-shedding in farm or production facilities for domestic animals should be
118
Guidelines for Drinking-water Quality Management for New Zealand 2013
accomplished by preventing illness in livestock, through minimising exposure to pathogens, by
increasing immunity, by manipulation of the animal gastrointestinal tract microbial ecology and
by managing (including treating) animal waste to reduce the release of zoonotic pathogens into
the environment.
See DWI (2012) for a discussion of the effect of veterinary medicines on water, in which the
usage, treatment regimes, metabolism, environmental fate and toxicity of around 450 active
ingredients in use in the UK were assessed. Twenty-six substances of potential concern were
identified; these were then evaluated using more complex modelling approaches to estimate
exposure levels in raw waters and removal in different drinking-water treatment processes. For
14 of the 26 selected priority veterinary medicines, the estimated intakes from conventional or
advanced treated water were less than 10 percent of the Acceptable Daily Intake (ADI) for all
sections of the population evaluated. It is concluded, therefore, that
these 14 veterinary medicines — albendazole, amoxicillin,* chlortetracycline, chlorsulon,*
cypermethrin, cyromazine, diazinon, enrofloxacin, eprinomectin, lasalocid, salinomycin,
tiamulin, trimethoprim and tylosin — are not a potential risk to consumer health. Very minor
exceedances of the guide value (equivalent to 10 percent of the ADI) in all populations assessed
were found for a further two compounds: halofuginone and tilmicosin. However, these were not
considered to be a potential risk to consumer health. For the remaining 10 compounds (acetyl
salicylic acid,* altrenogest, apramycin, cefapirin,* dicyclanil, florfenicol, lincomycin, luprostiol,*
monensin, sulfadiazine), the worst case predicted exposure levels, based on consumption of
either raw (environmental) water or conventionally treated water were close to or exceeded ADI
values. In some cases the predicted levels of exposure significantly exceeded ADI values. The
highest exceedances of ADI values arose from exposure to water sourced from groundwater.
There is some evidence that the groundwater model used in the study significantly
overestimates actual concentrations in the real environment. In the advanced water treatment
scenario, worst case predicted exposure estimates only exceeded the ADI value for four
compounds (acetylsalicylic acid, florfenicol, lincomycin and luprostiol). All of these ADI
exceedances were related to the groundwater scenario. Those marked * are not on the ACVM
Register as at 2012.
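The study's guide value (10 percent of the ADI) is a simple ratio, and the comparison can be sketched as follows. The function name, ADI and intake figures here are hypothetical illustrations, not values from the DWI study:

```python
def percent_of_adi(estimated_daily_intake, adi):
    # Express an estimated daily intake as a percentage of the
    # Acceptable Daily Intake (ADI); both must be in the same units.
    return 100.0 * estimated_daily_intake / adi

GUIDE_VALUE_PERCENT = 10.0  # the study's guide value: 10 percent of the ADI

# Hypothetical figures for illustration only:
adi = 0.05      # mg/kg body weight/day for some compound
intake = 0.002  # mg/kg body weight/day estimated from drinking-water

pct = percent_of_adi(intake, adi)
print(pct, pct <= GUIDE_VALUE_PERCENT)  # 4.0 True: below the guide value
```

A compound whose worst-case estimated intake stays below this guide value in all population groups was, in the study's terms, not considered a potential risk to consumer health.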
Managing pesticides – an interesting trial
At the request of the UK Government, the Crop Protection Association (CPA) developed a
focused approach to minimising the environmental impacts of pesticides as an alternative to a
proposed pesticide tax. In collaboration with other farming and crop protection organisations,
the CPA prepared a five-year programme of voluntary measures. In April 2001, after public
consultation, the Government accepted this approach as an alternative way forward, now known
as The Voluntary Initiative. Early results in some catchments show that reductions of up to
60 percent are possible (The Voluntary Initiative 2005). Key measures that have been identified
as needing a high level of farmer uptake include:
• Crop Protection Management Plans: a self-assessment that helps farmers review the
potential environmental risks associated with crop protection on their farm
• The National Sprayer Testing Scheme: ensures that spray equipment is correctly maintained
and capable of applying the product accurately, with no leaking joints or drips
• The National Register of Sprayer Operators: has recruited over 20,000 active professional
spray operators, who are being encouraged through continuous professional development to
obtain extra training and information.
Treatment of dairy farm effluent
Historically, the most common form of treatment for dairy farm effluent has been a two-pond
system combining an anaerobic and a facultative pond (Sukias et al 2001). This method is
efficient at removing sediment and biochemical oxygen demand (BOD), but high concentrations
of nutrients and pathogens can remain (Hickey et al 1989), often discharging directly to a
waterway.
Following the introduction of the Resource Management Act in 1991, land treatment of dairy
effluent is now favoured by most regional councils. This approach, relative to the two-pond
system, generally results in a marked reduction in the loss of nutrients and pathogens to
waterways. However, excessive levels of these pollutants can still occur. For example,
Houlbrooke et al (2004a) reported that 2–20 percent of nitrogen and phosphorus applied to
land with dairy effluent is leached directly through the soil profile to enter a water body. Whilst
reducing the propensity for surface runoff, artificial subsurface drains are known to transfer
both nutrients and pathogens to water bodies (Monaghan and Smith 2004; Ross and Donnison
2003). Pollutant transfer via drainage can occur under both grazed and irrigated systems.
The success of land treatment of wastes depends strongly upon soil type. For example, Aislabie
et al (2001) showed poorly drained gley soils to be much less efficient than allophanic and
pumice soils in attenuating bacterial indicators applied in effluent. Generally, soils with a fine
structure and an absence of macropores are more appropriate for receiving and treating effluent
and faecal material deposited by grazing animals.
Improved timing of effluent application to land, ie, through avoiding irrigation of effluent
during wet weather, has been shown to reduce pollutant transfer to waterways (Monaghan and
Smith 2004). Deferred irrigation, which involves effluent storage until a suitable soil water
deficit arises, has resulted in only 1 percent of applied nutrients reaching subsurface drains
(Houlbrooke et al 2004b).
Recent studies using constructed wetlands have shown potential in the treatment of drain flows
under grazed dairy pasture, particularly with respect to nutrients (Tanner et al 2005). This
approach is also applicable to drainage flows generated by the application of effluent to land.
Advanced pond systems are an alternative to the land application of effluent. These consist of
four types of ponds arranged in series (an advanced facultative pond, a high rate pond, algal
settling ponds, and a maturation pond) that result in effluent of a considerably higher quality
than the traditional two-stage oxidation ponds (Craggs et al 2004).
Riparian buffer strips
In addition to subsurface processes, agricultural pollutants can be transferred to waterways by
surface runoff generated under rainfall (Houlbrooke et al 2004b; Collins et al 2005). Riparian
buffer strips are a potential means of attenuating pollutants carried within surface runoff, with
the dense vegetation of the buffer encouraging infiltration of the runoff and deposition of
particulates.
The efficiency of riparian buffers varies with topography, soil type and the magnitude of a rain
event (Parkyn 2004; Collins et al 2004). In addition, soluble nutrients, clay-sized particles, and
free-floating (ie, unattached to soil or faecal material) faecal microbes are less susceptible to
deposition and, therefore, are less readily attenuated than particulates.
Riparian management guidelines are available with respect to control of nutrients and sediment
(Collier et al 1995) and faecal microbes (Collins et al 2005). Some regional councils are also
developing guidelines for their regions, eg, Auckland Regional Council (2001) and Environment
Canterbury (based on ECan 2003). Government departments have also issued guidelines for
managing waterways on farms (MfE/MAF 2001). A summary of recent research in this area is
now available (MAF 2004; MAF 2006a).
Vegetated buffer strips were tested for their effectiveness at removing Cryptosporidium during
rainfall rates of 15 or 40 mm/h sustained for four hours. Buffers were set on slopes of
5–20 percent and soil textures consisted of silty clay, loam or sandy loam. Buffer strips
consisting of sandy loam or soils of higher bulk density achieved a 1 to 2 log reduction/m;
those consisting of silty clay, loam or soils of lower bulk density achieved a 2 to 3 log
reduction/m. Buffer strips made of similar soils removed at least 99.9 percent of
Cryptosporidium oocysts from agricultural runoff when slopes were no more than 20 percent
and the buffer was at least three metres long (Atwill et al 2002, reported in Appendix E of
USEPA 2009).
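The per-metre log reductions above accumulate along the length of the buffer. A minimal sketch (the function name is ours, and uniform per-metre performance is an assumption) reproduces the "at least 99.9 percent over three metres" figure:

```python
def buffer_removal_percent(log_reduction_per_metre, length_m):
    # Total log reduction scales with buffer length, assuming the
    # per-metre rate applies uniformly along the strip.
    total_log = log_reduction_per_metre * length_m
    return 100.0 * (1.0 - 10 ** (-total_log))

# Even at the lower reported rate (1 log/m), a 3 m buffer gives 3 log in
# total, ie, at least 99.9 percent removal of Cryptosporidium oocysts.
print(round(buffer_removal_percent(1.0, 3.0), 1))  # 99.9
```

At the higher reported rate (2 to 3 log/m), the same three-metre buffer would remove a far greater proportion still, which is why buffer length and soil type together dominate performance.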
Natural wetlands
Near-channel saturated areas or wetlands are found extensively in pastoral landscapes in New
Zealand. These typically develop where steep hill slopes cause the convergence of surface and
subsurface flows, or where an impervious layer exists within the soil profile. Such wetlands have
been shown to attenuate nitrate through the process of denitrification, provided that water
moves through a wetland slowly enough (Burns and Nguyen 2002; Rutherford and Nguyen
2004).
Modification of wetland drainage through cattle trampling, or the installation of subsurface
drains or artificial channels, is therefore likely to diminish their pollutant-attenuating
properties. Cattle are attracted to the smaller, shallower areas of wetlands for grazing, and
excluding stock from these areas is likely to yield improvements in wetland bacterial water
quality (Collins 2004). For
guidelines for constructed wetlands treatment systems for dairy farms, see Tanner and
Kloosterman (1997).
Preventing direct deposition to waterways
Where animals have access to freshwaters, faecal contamination arises through the deposition
of faeces directly into waterways. Direct deposition can occur when cattle cross a stream on the
way to or from the milking shed (Davies-Colley et al 2004) and through sporadic incursions into
the water at access points along the stream bank (Bagshaw 2002). Bridges and the fencing of
stream banks are the key mitigation measures for these two processes, although providing
alternative water sources (drinking troughs) can also reduce sporadic incursions, and so reduce
faecal contamination of waterways (Sheffield et al 1997).
Human activity may need to be curtailed too, particularly at impoundments. Water suppliers
will need to decide whether to allow swimming, boating and fishing in the impoundment, and
how close houses, public toilets and car parks should be to the water.
Forestry
Land clearing, planting and felling can cause large increases in silt run-off. These activities can
be controlled by adopting guidelines such as those developed by Environment BOP (2000). ARC
(2007) published Forestry Operations in the Auckland Region: A guideline for erosion and
sediment control.
If the headwaters of a catchment used for water supply are in native bush, the water supplier
should do everything in its power to ensure that the area remains forested.
3.5.2 Urban and transportation pollutants
Urban pollution of waterways is primarily caused by contaminants being washed off streets and
roofs and flushed through the stormwater drainage system to the receiving water. Contaminants
of concern include nutrients, sediment, heavy metals, hydrocarbons, toxic organics and
pathogens.
Urban pollutants are associated primarily with particulate material and this offers the potential
for contaminant entrapment within, for example, stormwater retention ponds, wetlands,
vegetated filter strips and swales, and the addition of filters or screens. Measures such as
minimising imperviousness and retaining natural drainage channels will reduce both the source
and transport of pollutants.
Sewage can be an intermittent pollutant via leaks or when pumping stations break down and
discharge raw sewage to drains (Williamson 1991). Smaller settlements that still use septic tanks
should adopt guidelines (and inspections) related to their design, construction, operation,
maintenance and cleaning.
Water suppliers with urban communities upstream of their water supply intake should ensure
that the appropriate authorities enforce trade waste bylaws and consent conditions relating to
activities such as landfills. Trade waste bylaws should require bunding of stored chemicals. The
Hazardous Substances and New Organisms Act provides guidelines on storing hazardous
substances. The Act has regulations and codes of practice to determine how substances should
be transported, stored and used. The storage of hazardous substances must also comply with the
New Zealand Building Act and the Resource Management Act.
Spills of hazardous substances during transport have the potential to cause serious problems for
water suppliers. Measures that water suppliers may consider include requiring:
• trucking companies/drivers to use approved roads and follow appropriate standards, for
example NZS 5433:1999 Transport of Dangerous Goods on Land, and The Liquid &
Hazardous Waste Code of Practice (NZWWA)
• regional councils to co-ordinate with the Fire Service regarding spills
• see also: Stock Effluent from Trucks: Resource Management Guidelines for Local Authorities,
prepared by the Planning Subgroup for the National Stock Effluent Working Group, and the
three companion documents:
– Volume I: Industry Code of Practice for the Minimisation of Stock Effluent Spillage from
Trucks on Roads. National Stock Effluent Working Group. April 1999
– Volume II: A Practical Guide to Providing Facilities for Stock Effluent Disposal from
Trucks. National Stock Effluent Working Group. Second edition, March 2003
– Volume III: Resource Management Guidelines for Local Authorities. March 2003.
References
Addiscott TM, Benjamin N. 2004. Nitrate and human health. Soil Use and Management 20: 98–104.
Aislabie J, Smith JJ, Fraser R, et al. 2001. Leaching of bacterial indicators of faecal contamination
through four New Zealand soils. Australian Journal of Soil Research 39: 1397–406.
Aller L, Bennett T, Lehr JH, et al. 1987. DRASTIC: A standardised system for evaluation of ground
water pollution potential using hydrogeologic settings. Washington DC: US Environmental
Protection Agency, 130 pp.
ANSI/AWWA. 1987. Disinfection of Wells. ANSI/AWWA Standard C654. Denver CO: American
Water Works Association.
ANSI/AWWA. 1997. AWWA Standard for Water Wells. ANSI/AWWA Standard A100-97.
Denver CO: American Water Works Association.
ARC. 2001. Riparian Zone Management: Strategy for the Auckland region. Technical Publication
148. Auckland Regional Council.
ARC. 2007. Forestry Operations in the Auckland Region: A guideline for erosion and sediment
control. TP223.
http://www.aucklandcouncil.govt.nz/EN/planspoliciesprojects/reports/technicalpublications/Pages
/technicalpublications201-250.aspx
ARMCANZ. 2003. Minimum Construction Requirements for Water Bores in Australia. National
Minimum Bore Specifications Committee, Agriculture and Resource Management Council of
Australia and New Zealand.
ANZECC. 1995. National Water Quality Management Strategy: Guidelines for groundwater
protection in Australia. Australian and New Zealand Environment and Conservation Council,
Agriculture and Resource Management Council of Australia and New Zealand. 94 pp.
http://www.environment.gov.au/water/publications/quality/pubs/nwmqs-groundwaterguidelines.pdf. These Guidelines are discussed in Assessing the Need to Revise the Guidelines for
Groundwater Protection in Australia: A review report 2010, 107 pp.
http://www.ga.gov.au/groundwater/our-capabilities/national-water-quality-managementstrategy.html
Atwill ER, Hou L, Karle BM, et al. 2002. Transport of Cryptosporidium parvum oocysts through
vegetated buffer strips and estimated filtration efficiency. Appl Environ Microbiol 68(11): 5517–27.
Auer MT, Niehaus SL. 1993. Modelling fecal coliform bacteria – I: Field and laboratory
determination of loss kinetics. Water Research 27: 693–701.
AWWA. 1990. Water Quality and Treatment (4th edition). Published for American Water Works
Association by McGraw-Hill Inc.
AWWA. 2003. Groundwater: Manual M21 (3rd edition). Denver CO: American Water Works
Association.
AWWA. 2004. Problem Organisms in Water: Identification and treatment – manual of water
supply practices. AWWA Manual M7. Denver CO: American Water Works Association.
Bagshaw C. 2002. Factors Influencing Direct Deposition of Cattle Faecal Material in Riparian
Zones. MAF Technical Paper No. 2002/19. ISBN No: 0-478-07699-1. ISSN No: 1171-4662.
See: http://www.maf.govt.nz/mafnet/rural-nz/sustainable-resource-use/resourcemanagement/cattle-faecal-material/index.htm
Burns DA, Nguyen ML. 2002. Nitrate movement and removal along a shallow groundwater flow path
in a riparian wetland within a sheep grazed pastoral catchment: results of a tracer study. New
Zealand Journal of Marine and Freshwater Research 36: 371–85.
Busenberg E, Plummer LN. 2000. Dating young groundwater with sulfur hexafluoride: natural and
anthropogenic sources of sulfur hexafluoride. Water Resources Research 36(10): 3011–30.
Centers for Disease Control and Prevention, Department of Health and Human Services,
US Government. www.bt.cdc.gov/disasters/floods and select Keep food and water safe and open
Disinfecting wells after an emergency.
Close ME, Davies-Colley RJ. 1990. Baseflow water chemistry in New Zealand rivers.
1: Characterisation. New Zealand Journal of Marine and Freshwater Research 24: 319–41.
Close ME, Flintoft MJ. 2004. National survey of pesticides in groundwater in New Zealand, 2002.
New Zealand Journal of Marine and Freshwater Research 38: 289–99.
Close ME, Stewart M, Rosen M, et al. 2000. Investigation into Secure Groundwater Supplies.
ESR/IGNS Report No. FW0034, Report to the Ministry of Health, 46 pp.
Collier KJ, Cooper AB, Davies-Colley RJ. 1995. Managing Riparian Zones: A contribution to
protecting New Zealand’s rivers and streams. Volumes 1 (Concepts) and 2 (Guidelines). Wellington:
NIWA/Department of Conservation.
Collins R. 2004. Fecal contamination of pastoral wetlands. Journal of Environmental Quality
33: 1912–18.
Collins R, Donnison A, Ross C, et al. 2004. Attenuation of effluent-derived faecal microbes in grass
buffer strips. New Zealand Journal of Agricultural Research 47: 565–74.
Collins R, Elliott S, Adams R. 2005. Overland flow delivery of faecal bacteria to a headwater pastoral
stream. Journal of Applied Microbiology 99: 126–32.
Collins R, McLeod M, Donnison A, et al. 2005. Surface Runoff and Riparian Management III.
NIWA Client Report 2005-054 to the Ministry of Agriculture and Forestry, Wellington. This
document can be accessed from www.maf.govt.nz.
Craggs RJ, Sukias JP, Tanner CT, et al. 2004. Advanced pond system for dairy-farm effluent
treatment. New Zealand Journal of Agricultural Research 47: 449–60.
Davies-Colley RJ, Nagels JW. 2002. Effects of dairying on water quality of lowland streams in
Westland and Waikato. New Zealand Grasslands Association 64: 107–14.
Davies-Colley RJ, Stroud MJ. 1995. Water quality degradation by pastoral agriculture in the
Whanganui River catchment. NIWA Consultancy Report DoC050/1.
Davies-Colley RJ, Nagels JW, Smith AR, et al. 2004. Water quality impact of a dairy cow herd
crossing a stream. New Zealand Journal of Marine and Freshwater Research 38: 569–76.
Davies-Colley RJ, Wilcock RJ. 2004. Water quality and chemistry in running waters. In J Harding,
P Mosley, C Pearson, et al (eds) Freshwaters of New Zealand, pp 11.1–11.17. Christchurch: New
Zealand Hydrological Society and New Zealand Limnological Society, Caxton Press.
Domenico PA, Schwartz W. 1998. Physical and Chemical Hydrogeology (2nd edition). New York:
Wiley.
Donnison A, Ross C, Thorrold B. 2004. Impact of land use on the faecal microbial quality of
hill-country streams. New Zealand Journal of Marine and Freshwater Research 38: 845–55.
Driscoll FG. 1986. Groundwater and Wells (2nd edition). Johnson Division, St Paul, Minnesota.
Duncan M, Woods R. 2004. Flow regimes. In J Harding, P Mosley, C Pearson, et al (eds)
Freshwaters of New Zealand, pp 7.1–7.14. Christchurch: New Zealand Hydrological Society and New
Zealand Limnological Society, Caxton Press.
DWI. 2012. Desk-based Study of Current Knowledge on Veterinary Medicines in Drinking Water
and Estimation of Potential Levels. 258 pp. http://dwi.defra.gov.uk/research/completedresearch/2000todate.htm
ECan. 1999. Preliminary Modelling Results Regarding Septic Tank Discharges. Environment
Canterbury report no. U99/83, prepared for the Canterbury Regional Council, by Pattle Delamore
Partners Ltd, October.
ECan. 2003. Riparian Management Classification for Canterbury Streams. Report No. U03/43,
Christchurch: Environment Canterbury.
ECan. 2007. Predictions of Virus Movement in Canterbury Alluvial Aquifers. Environment
Canterbury Report No. U07/72, 57 pp. Author: Catherine Moore.
Environment BOP. 2000. Erosion and Sediment Control Guidelines for Forestry Operations.
Environment BOP, Guideline No. 2000/01, 73 pp. Available at:
http://www.boprc.govt.nz/media/29546/Guideline-0001-ErosionSedimentControl.pdf
ESR. 2010. Guidelines for Separation Distances Based on Virus Transport Between On-site
Domestic Wastewater Systems and Wells. ESR Client Report No. CSC1001. 296 pp.
http://www.envirolink.govt.nz/PageFiles/31/Guidelines_for_separation_distances_based_on_viru
s_transport_.pdf
ESR. 2012. Summary of Literature Impacts of Earthquakes on Groundwater Quality. 23 pp. Client
Report FW12013. http://www.esr.cri.nz/publications/Pages/MinistryofHealth.aspx
Focazio MJ, Reilly TE, Rupert MG, et al. 2002. Assessing groundwater vulnerability to
contamination: providing scientifically defensible information for decision makers. US Geological
Survey Circular 1224, 33 pp.
Frost F, Frank D, Pierson K, et al. 1993. A seasonal study of arsenic in groundwater, Snohomish
County, Washington, USA. Environmental Geochemistry and Health 15(4): 209–14. Lovelace
Medical Foundation, Albuquerque, NM, USA.
Hamilton D, Hawes I, Davies-Colley RJ. 2004. Physical and chemical characteristics of lake water. In
J Harding, P Mosley, C Pearson, et al (eds) Freshwaters of New Zealand, pp 21.1–21.20.
Christchurch: New Zealand Hydrological Society and New Zealand Limnological Society, Caxton
Press.
Harding J, Mosley P, Pearson C, et al (eds). 2004. Freshwaters of New Zealand. Christchurch: New
Zealand Hydrological Society and New Zealand Limnological Society, Caxton Press.
Heaton THE, Vogel JC. 1981. Excess air in groundwater. Journal of Hydrology 50: 201–16.
Hickey CV, Quinn JM, Davies-Colley RJ. 1989. Effluent characteristics of dairy shed oxidation ponds
and their potential impacts on rivers. New Zealand Journal of Marine and Freshwater Research
23: 569–84.
Houlbrooke DJ, Horne DJ, Hedley MJ, et al. 2004b. Minimising surface water pollution resulting
from farm-dairy effluent application to mole-pipe drained soils. I: An evaluation of the deferred
irrigation system for sustainable land treatment in the Manawatu. New Zealand Journal of
Agricultural Research 47: 405–15.
Houlbrooke DJ, Horne DJ, Hedley MJ, et al. 2004a. A review of literature on the land treatment of
farm-dairy effluent in New Zealand and its impact on water quality. New Zealand Journal of
Agricultural Research 47: 499–511.
Howard-Williams C. 1987. In-lake control of eutrophication. In WN Vant (ed) Lake Managers
Handbook, pp 195–202. Water and Soil Miscellaneous Publication No. 103. Wellington: Ministry of
Works and Development.
Hutchinson PD. 1990. Regression Estimation of Low Flows in New Zealand. Christchurch:
Hydrology Centre, DSIR Marine and Freshwater, Department of Scientific and Industrial Research,
Publication 22.
Larned ST, Scarsbrook MR, Snelder TH, et al. 2004. Water quality in low-elevation streams and
rivers of New Zealand: recent state and trends in contrasting land-cover classes. New Zealand
Journal of Marine and Freshwater Research 38: 347–66.
Lundberg JO, Weitzberg E, Cole JA, et al. 2004. Opinion – Nitrate, bacteria and human health.
Nature Reviews Microbiology 2: 593–602.
McBride GB, Till D, Ryan T, et al. 2002. Freshwater Microbiology Research Programme. Pathogen
occurrence and human health risk assessment analysis. Wellington: Ministry for the Environment
Technical Publication.
MAF. 2004. Review of Riparian Buffer Zone Effectiveness. MAF Technical Paper No. 2004/05
(prepared by Dr Stephanie Parkyn, NIWA). Available at
http://www.crc.govt.nz/publications/Consent%20Notifications/upper-waitaki-submitter-evidencemaf-technical-paper-review-riparian-buffer-zone-effectiveness.pdf
MAF. 2006. Agricultural Pesticides in New Zealand Groundwater. MAF Technical Paper No:
2006/09. Wellington: MAF Information Bureau.
MAF. 2006a. Pathogen Pathways – best management practices. MAF Technical Paper No:
2006/01. http://maxa.maf.govt.nz/mafnet/publications/techpapers/06-01/
MfE/MAF. 2001. Managing Waterways on Farms: A guide to sustainable water and riparian
management in rural New Zealand. Wellington: Ministry for the Environment and Ministry of
Agriculture and Forestry (published by the Ministry for the Environment). 212 pp. See
http://www.mfe.govt.nz/publications/water/managing-waterways-jul01/managing-waterwaysjul01.pdf or go to http://www.mfe.govt.nz/publications/a-to-z.html
MfE. 2002. Lake Managers Handbook; Land-water Interactions. Prepared for the Ministry for the
Environment by NIWA, 78 pp. http://www.mfe.govt.nz/publications/water/lm-land-waterjun02.pdf
Ministry for the Environment. 2005. Proposed National Environmental Standard for Human
Drinking-water Sources: MfE discussion document. September. It officially came into effect on
20 June 2008.
Ministry for the Environment. 2006. A National Protocol for State of the Environment
Groundwater Sampling in New Zealand. 56 pp.
http://www.mfe.govt.nz/publications/water/national-protocol-groundwater-dec06/nationalprotocol-groundwater-dec06-updated.pdf
Ministry for the Environment. 2009. Draft Users’ Guide: National Environmental Standard for
Sources of Human Drinking Water. Wellington: Ministry for the Environment. 96 pp.
http://www.mfe.govt.nz/publications/rma/nes-draft-sources-human-drinking-water/draft-nessources-human-drinking-water.pdf. Note: Appendix 3 is Resource Management (National
Environmental Standards for Sources of Human Drinking Water) Regulations 2007.
Note: The New Zealand Ministry of Health’s Guides for drinking-water supplies can be accessed as
Word documents on the Ministry of Health website: http://www.moh.govt.nz/water then select
Publications and search for PHRMP.
MoH Public Health Risk Management Plan Guide PHRMP Ref. S1.1. Surface and Groundwater
Sources. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref. P1.3. Groundwater Abstraction:
Bores and wells. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref. P1.4. Groundwater Abstraction:
Springs. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref. P4.1. Pretreatment Processes:
Algicide application. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref. P4.2. Pretreatment Processes –
Destratification. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref. P4.2. Treatment Processes: Pump
operation. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref. D2.4. Distribution System: Backflow
prevention. Wellington: Ministry of Health.
MoH. 2003. Register of Drinking-water Supplies in New Zealand (2003 edition). Wellington:
Ministry of Health.
MoH. 2005. Drinking-water Standards for New Zealand 2005, and the 2008 Revision. Wellington:
Ministry of Health.
Monaghan RM, Smith LC. 2004. Minimising surface water pollution resulting from farm-dairy
effluent application to mole-pipe drained soils. II: The contribution of preferential flow of effluent to
whole-farm pollutant losses in subsurface drainage from a West Otago dairy farm. New Zealand
Journal of Agricultural Research 47: 417–28.
Muirhead RW, Davies-Colley RJ, Donnison AM, et al. 2004. Faecal bacterial yields in artificial flood
events: Quantifying in-stream stores. Water Research 38: 1215–24.
MWD. 1987. Lake Managers Handbook. Water and Soil Miscellaneous Publication No. 103.
Wellington: Ministry of Works and Development.
Nagels JW, Davies-Colley RJ, Donnison AM, et al. 2002. Faecal contamination over flood events in a
pastoral agricultural stream in New Zealand. Water Science and Technology 45(12): 45–52.
Nokes C, Ritchie J. 2002. Survey of Arsenic in New Zealand Coastal Groundwaters. FW0264.
Report to the Ministry of Health, Wellington.
Novotny V. 2003. Water Quality: Diffuse pollution and watershed management. New York: John
Wiley and Sons.
NZS 4411. 2001. Environmental Standard for Drilling of Soil and Rock, 20 pp. This Standard is
available free for downloading on the Standards New Zealand website
http://www.standards.co.nz/default.htm
NZWWA. 1998. Wastewater Treatment Plant Database. Wellington: New Zealand Water and
Wastes Association.
NZWWA. 2006. Backflow Prevention for Drinking Water Suppliers, Code of Practice. Wellington:
Backflow Special Interest Group, New Zealand Water and Wastes Association. Since revised, see:
NZWWA. 2013. Boundary Backflow Prevention for Drinking Water Suppliers. 41 pp. Wellington:
New Zealand Water and Wastes Association, http://www.waternz.org.nz
Parkyn S. 2004. Review of Riparian Buffer Zone Effectiveness. MAF technical paper No. 2004/05
ISBN No: 0-478-07823-4. ISSN No: 1171-4662 (http://www.maf.govt.nz/mafnet/ruralnz/sustainable-resource-use/resource-management/review-riparian-buffer-zoneeffectiveness/index.htm) or go through http://www.mpi.govt.nz/
PDP. 2002. Groundwater Model Audit Guidelines. Prepared for the Ministry for the Environment by
Pattle Delamore Partners Ltd.
Pearson C, Henderson R. 2004. Floods and low flows. In J Harding, P Mosley, C Pearson, et al (eds)
Freshwaters of New Zealand, pp 10.1–10.16. Christchurch: New Zealand Hydrological Society and
New Zealand Limnological Society, Caxton Press.
Pearson CP. 1998. Changes to New Zealand’s national hydrometric network in the 1990s. Journal of
Hydrology (NZ) 37: pp 1–17.
Piper AM. 1944. A graphic procedure in the geochemical interpretation of water analyses.
Transactions, American Geophysical Union 25: 914–23.
Plummer LN, Busenberg E, Böhlke JK, et al. 2000. Chemical and Isotopic Composition of Water
from Springs, Wells, and Streams in Parts of Shenandoah National Park, Virginia, and Vicinity,
1995–1999. US Geological Survey Open-File Report 00-373, 70 pp.
Plummer LN, Rupert MG, Busenberg E, et al. 2000. Age of irrigation water in ground water from the
Eastern Snake River Plain aquifer, South-central Idaho. Ground Water 38(2): pp 264–83.
Plummer LN, Busenberg E. 2000. Chlorofluorocarbons. In PG Cook, AL Herczeg (eds)
Environmental Tracers in Subsurface Hydrology. Boston: Kluwer Academic.
Chapter 15, pp 441–78.
Ritchie J. 2004. Additional Recommendations for the Identification of Priority 2 Chemical
Determinands in New Zealand Drinking-water Supplies (1/7/03 – 30/6/04). Report FW0479, to
the Ministry of Health, Wellington.
Ross C, Donnison A. 2003. Campylobacter and farm dairy effluent irrigation. New Zealand Journal
of Agricultural Research 46: 255–62.
Rutherford JC, Nguyen LM. 2004. Nitrate removal in riparian wetlands: Interactions between
surface flow and soils. Journal of Environmental Quality 33: 1133–43.
Scarsbrook MR, McBride CG, McBride GB, et al. 2003. Effects of climate variability on rivers:
consequences for long term water quality datasets. Journal of the American Water Resources
Association 39(6): 1435–47, plus erratum in 40(2): 544.
Schijven JF, Hassanizadeh SM. 2002. Virus removal by soil passage at field scale and ground-water
protection of sandy aquifers. Water Science & Technology 46(3): 123–9.
Sheffield RE, Mostaghimi S, Vaughan DH, et al. 1997. Off-stream water sources for grazing cattle as a
stream bank stabilization and water quality BMP. Transactions of the American Society of
Agricultural Engineers 40: 595–604.
Sinton L. 1984. The macroinvertebrates in a sewage-polluted aquifer. Hydrobiologia 119: 161–9.
Sinton L. 1986. A Guide to Groundwater Sampling Techniques. National Water and Soil
Conservation Authority, Water and Soil Miscellaneous Publication No. 99.
Sinton LW. 2001. Microbial contamination of New Zealand’s aquifers. In MR Rosen, PA White (eds)
Groundwaters of New Zealand. Wellington: New Zealand Hydrological Society Inc.
Slaney D, Weinstein P. 2004. Water and human health. In J Harding, P Mosley, C Pearson, et al
(eds) Freshwaters of New Zealand, pp 46.1–46.14. Christchurch: New Zealand Hydrological Society
and New Zealand Limnological Society, Caxton Press.
Smith P. 2003. Waikato River Water Quality Monitoring Programme Data Report 2003.
Hamilton: Environment Waikato Technical Report 2004/10.
Stewart MK, Morgenstern U. 2001. Age and source of groundwater from isotope tracers. In
MR Rosen, PA White (eds) Groundwaters of New Zealand. Wellington: NZ Hydrological Society,
pp 161–83.
Stiff HA. 1951. The interpretation of chemical water analyses by means of patterns. Journal
Petroleum Technology 3(10): sections 1, 2, 3.
Sukias JPS, Tanner CC, Davies-Colley RJ, et al. 2001. Algal abundance, organic matter and physico-chemical characteristics of dairy farm facultative ponds: implications for treatment performance.
New Zealand Journal of Agricultural Research 44: 279–96.
Sundaram B, Feitz A, Caritat P de, et al. 2009. Groundwater Sampling and Analysis – A Field
Guide. Commonwealth of Australia, Geoscience Australia, Record 2009/27, 95 pp.
http://www.ga.gov.au/image_cache/GA15501.pdf
Tanner CC, Kloosterman VC. 1997. Guidelines for Constructed Wetland Treatment of Farm Dairy
Wastewaters in New Zealand. NIWA Science and Technology Series No. 48, NIWA, Hamilton.
Available at http://www.niwa.co.nz/sites/default/files/import/attachments/st48.pdf
Tanner CC, Nguyen ML, Sukias JPS. 2005. Nutrient removal by a constructed wetland treating
subsurface drainage from grazed dairy pasture. Agriculture, Ecosystems and Environment
105: 145–62.
The Voluntary Initiative. 2005. H2OK? Water Catchment Protection. UK Water Industry Research
Ltd and Crop Protection Association. 16 pp.
http://www.water.org.uk/home/policy/publications/archive/pollution/pesticides-report/pesticides2.pdf or http://www.water.org.uk/home/policy/publications/archive/environment/pesticidesreport/pesticides-2.pdf
USEPA. 1998. Biological Indicators of Ground Water-Surface Water Interaction: An update. EPA
816-R-98-018. USEPA Office of Ground Water and Drinking Water.
http://www.epa.gov/safewater/sourcewater/pubs/guide_bioind_1998.pdf
USEPA. 2000. National primary drinking water regulations: ground water rule. Federal Register
61(91). 10 May. Proposed Rules. EPA-815-Z-00-002.
http://www.google.co.nz/search?hl=en&q=National%20Primary%20Drinking%20Water%20Regula
tions:%20Groundwater%20Rule.%20Federal%20Register,%20Vol%2061,%20No.%2091.%20Wed,
%20May%2010,%202000.%20Proposed%20Rules.%20EPA-815-Z-00-002&spell=1&sa=X also see
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/index.cfm for Final Rule (2006)
USEPA. 2008. Ground Water Rule: Source assessment guidance manual. EPA 815-R-07-023.
98 pp. http://www.epa.gov/ogwdw/disinfection/gwr/pdfs/guide_gwr_sourcewaterassessments.pdf
USEPA. 2009. Long Term 2 Enhanced Surface Water Treatment Rule: Toolbox guidance manual.
Review Draft. EPA-815-R-09-016. Washington: United States Environmental Protection Agency.
375 pp.
http://www.epa.gov/safewater/disinfection/lt2/pdfs/guide_lt2_toolboxguidancemanual.pdf or go
through http://water.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm or
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/index.cfm
USGS. 2004. Guidelines for Evaluating Ground-Water Flow Models: USGS scientific investigations
report, 2004-5038. US Geological Survey. Available at: http://water.usgs.gov/pubs/sir/2004/5038/
Van der Raaij RW. 2003. Age-dating of New Zealand groundwaters using sulphur hexafluoride.
Unpublished, MSc thesis, Victoria University, Wellington.
Vant WN. 1987. Eutrophication: an overview. In WN Vant (ed) Lake Managers Handbook, pp 151–7.
Water and Soil Miscellaneous Publication No. 103. Wellington: Ministry of Works and Development.
Vermont Department of Environmental Conservation. 2005. An Ounce of Prevention: A
groundwater protection handbook for local officials. 63 pp. Water Supply Division, Vermont
Department of Environmental Conservation Agency of Natural Resources, 103 South Main Street,
Old Pantry Building, Waterbury, VT 05671-0403. Available on the internet at:
http://www.vermontdrinkingwater.org/GWPRS/VTOuncePrevention2005.pdf
Victorian Department of Health. 2011. Public Health Issues Associated with Stock Accessing
Waterways Upstream of Drinking Water Off-takes. 97 pp. Prepared by Water Futures Pty Ltd.
http://docs.health.vic.gov.au/docs/doc/Public-health-issues-associated-with-stock-accessingwaterways-upstream-of-drinking-water-off-takes
Water UK. 2012. Catchment Protection. Technical Guidance Note No. 7. 4 pp.
http://www.water.org.uk/home/policy/publications/archive/drinking-water/priciples-of-watersupply-hygiene
WHO. 2003a. Assessing Microbial Safety of Drinking-water: Improving approaches and methods.
Published on behalf of the World Health Organization and the Organisation for Economic
Co-operation and Development, by IWA Publishing, London. Available at:
http://www.who.int/water_sanitation_health/dwq/9241546301/en/index.html
WHO. 2003b. State of the Art Report: Health risks in aquifer recharge using reclaimed water.
SDE/WSH/03.08. Protection and the Human Environment, World Health Organization Geneva and
WHO Regional Office for Europe Copenhagen, Denmark. Available at:
http://www.who.int/water_sanitation_health/wastewater/wsh0308/en/index.html
WHO. 2004. Guidelines for Drinking-water Quality 2004 (3rd edition). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html see
also the addenda
WHO. 2006. Protecting Groundwater for Health. Geneva: World Health Organization.
ISBN 92 4 154 668 9. 697 pp.
http://www.who.int/water_sanitation_health/publications/protecting_groundwater/en/index.html
WHO. 2011. Guidelines for Drinking-water Quality 2011 (4th edition). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
WHO. 2011a. WHO technical notes on drinking-water, sanitation and hygiene in emergencies.
Technical Note 2: Cleaning and disinfecting boreholes. 4 pp.
http://www.who.int/water_sanitation_health/dwq/publications/en/index.html
WHO. 2012. Animal Waste, Water Quality and Human Health. A Dufour, J Bartram, R Bos, et al
(eds). Published on behalf of the World Health Organization by IWA Publishing, UK. 489 pp.
http://www.who.int/water_sanitation_health/publications/2012/animal_waste/en/
Wilcock RJ, Nagels JW. 2001. Effects of aquatic macrophytes on physico-chemical conditions of
three contrasting lowland streams: a consequence of diffuse pollution from agriculture? Water
Science and Technology 43(5): 163–8.
Wilcock RJ, Nagels JW, Rodda HJE, et al. 1999. Water quality of a lowland stream in a New Zealand
dairy farming catchment. New Zealand Journal of Marine and Freshwater Research 33: 683–96.
Wilcock RJ, Croker GF. 2004. Distribution of carbon between sediment and water in macrophyte
dominated lowland streams. Hydrobiologia 520: 143–52.
Wilkinson L, Hill MJP, Birkenbeuel GK. 1992. SYSTAT. SYSTAT Inc.
Williamson B. 1991. Urban Runoff Data Book: A manual for preliminary evaluation of stormwater
impacts. Water Quality Centre Publication No. 20. ISBN 0-477-02629-X. Hamilton.
Williamson RB, Hoare RA. 1987. Controlling nutrient loads and predicting resulting lake nutrient
concentrations. In WN Vant (ed) Lake Managers Handbook, pp 172–82. Water and Soil
Miscellaneous Publication No. 103. Wellington: Ministry of Works and Development.
WQRA. 2011. HealthStream – Quarterly Public Health Newsletter of Water Quality Research
Australia, June issue. www.wqra.com.au
Zuber A. 1986. On the interpretation of tracer data in variable flow systems. Journal of Hydrology
86(1–2): 45–57.
Zuber A. 1986. Mathematical models for the interpretation of environmental radioisotopes in
groundwater systems. In P Fritz, J Ch Fontes (eds) Handbook of Environmental Isotopic
Geochemistry, Volume. 2, Part B. Amsterdam: Elsevier, 1–59.
Chapter 4: Selection of water source and treatment
4.1 Introduction
Chapter 3: Source Waters discusses general issues relating to the quality of natural fresh water
systems, ie, surface water and groundwater, and measures that can be taken to protect or
enhance their quality.
In a sense, this chapter converts these natural waters into prospective raw or source waters, ie,
water systems that are being considered for processing into drinking-water. It discusses some of
the information that is needed in the planning stages of developing a new water supply, and the
barriers that can be used to protect public health. The chapter finishes with a summary of
matching water treatment processes with raw water quality.
A major consideration when designing a water supply scheme is the nature of the source water
that is to be used. Questions that arise include:
• is it in reasonable proximity to the area to be supplied?
• is the flow sufficient, or will an impoundment be needed?
• is there an indication of the downstream minimum flow requirement?
• how variable is the quality, day-to-day or seasonally?
• what is the worst water quality the treatment plant will have to cope with?
• will the quality of its waters pose special concerns for the efficacy of treatment? For example, might such variations cause non-compliance with the DWSNZ, impair desired plant performance, or lead to excessive treatment costs?
• is the catchment or recharge area vulnerable to contamination (now or in the future), eg, from geothermal areas, mining activities, or urban and agricultural pollutants: faecal microbes, sediment, fertilisers and pesticides?
• what management techniques are available to mitigate contamination, and how might their efficacy vary with, for example, soil type and topography?
Some of the broader aspects are covered in Chapter 3, and those more specifically related to
water supply, in this chapter. Rainwater is covered in Chapter 19: Small and Individual Supplies.
4.2 Identifying potential sources
4.2.1 Quantity, reliability, access
A variety of sources is used for water supply, ranging in size from those needed by single households
(see Chapter 19) to supplies needed for large cities. Each kind of supply can be characterised
according to its raw water quality (Table 4.1) and there are some rules of thumb that can be applied
with regard to the necessary level of treatment for each source type:
• the widely accepted minimum treatment for a non-secure groundwater source is disinfection
• the widely accepted minimum treatment for a surface water source is filtration followed by disinfection. This minimum level should also be applied to a groundwater source that is under the direct influence of surface water, which includes all springs.
Table 4.1: Source water quality

Raw water source | Microbiological quality | Chemical quality | Aesthetic quality
Roof water | Sometimes poor | Usually good, subject to air, roof and paint contaminants | Soft/corrosive so could contain some metals
Unconfined aquifer | Often poor | Can be high in nitrate and ammonium | Variable, can be turbid, discoloured, soft/corrosive. Can be high in iron or manganese
Confined aquifer | Usually good | Usually good. Can be high in carbon dioxide and ammonium | Variable. Can be hard or soft/corrosive and high in iron or manganese. Usually low turbidity
River or stream: controlled or few human/animal impacts | Good to poor | Usually good | Usually good but turbid and discoloured under flood conditions
River or stream: high human and/or animal impacts | Poor. Higher protozoal risk | Often poor | Good to poor. Turbid and discoloured under flood conditions
Lake/reservoir: controlled or few human/animal impacts | Usually good. May contain algae | Usually good | Usually good. May have iron and manganese in deep water. May be coloured water from bush catchments
Lake/reservoir: high human and/or animal impacts | Often poor. Higher protozoal risk | Usually good, may not be good if prone to algal blooms | Good to poor. May have iron/manganese in deep water
The treatment required to produce safe drinking-water depends on the raw water source that is
used. Some natural purification occurs in surface waters as a result of dilution, storage time,
sunlight exposure, and associated physical and biological processes. With groundwater, natural
purification may occur by infiltration of rainfall through soil and percolation through underlying
porous materials such as sand, gravel and joints or fractures in bedrock. Effective treatment
should be provided to ensure safety and consistency in the quality of drinking-water
(MoE 2001).
Rivers and streams
When assessing a possible source of water supply, it is critically important to ensure that the
resource can provide an adequate quantity at all times, so that a reliable supply is assured. For
flowing waters (rivers and streams) it is important to have a good understanding of the flow
regime (see Chapter 3: Source Waters, section 3.3) and a long enough record of stream flows to
provide useful summary statistics, including the mean annual seven-day minimum flow, mean
discharge and mean annual flood, and to generate a flow-duration curve expressing the
proportion of time during which the flow of a stream is equal to or greater than given amounts,
regardless of chronological order. Flow measurement is discussed in the Hydrologists’ Field
Manual (DSIR 1988).
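The summary statistics above can be computed directly from a daily flow record. The following Python sketch is illustrative only: the flow series is synthetic and the function names are ours, not from any standard hydrological package, and a real assessment would use a long gauged record.

```python
# Illustrative sketch: summary flow statistics from a daily flow record.
# The flow series below is synthetic; real analyses use long gauged records.

def seven_day_minimum(flows):
    """Lowest 7-day moving average in a record of daily flows (m3/s)."""
    return min(sum(flows[i:i + 7]) / 7 for i in range(len(flows) - 6))

def flow_duration(flows, flow):
    """Proportion of time a given flow is equalled or exceeded,
    regardless of chronological order (one point on the curve)."""
    return sum(1 for q in flows if q >= flow) / len(flows)

# A year of synthetic daily flows: 4 m3/s baseflow plus recessions after floods.
flows = [4.0 + 20.0 * (0.9 ** (d % 30)) for d in range(365)]

print(round(seven_day_minimum(flows), 2))    # driest week of the record
print(round(flow_duration(flows, 10.0), 2))  # fraction of days flow >= 10 m3/s
```

Evaluating `flow_duration` over a range of flows traces out the full flow-duration curve.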
Specific discharge, or flow per unit area of catchment (L/s/km²), when multiplied by the
catchment area, gives the mean annual flow. Also of interest are extreme low flows, such as the
20-year seven-day minimum flow, and flood flows. These data are provided by continuous level
recording calibrated by field gaugings of the river in question over a sufficient period. In cases
where level and flow-gauging data have not been collected it is possible to estimate flow regimes
by applying measured relationships between rainfall and runoff from gauged basins to ungauged
basins within the same region (Duncan and Woods 2004). An easier way to get mean flows and
mean annual low flows for third and higher order streams throughout New Zealand is to use the
River Ecosystem Classification database that is available from the NIWA website for the
Freshwater Fish Database (via online services). The NZFFD Assistant software is available for
Windows users to download as a compressed zip file.
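The specific discharge calculation mentioned above amounts to a single multiplication; the values in this fragment are hypothetical, chosen only to show the unit handling.

```python
# Hypothetical worked example: mean annual flow from specific discharge.
specific_discharge = 25.0   # L/s per km2 (illustrative, not a measured value)
catchment_area = 120.0      # km2 (illustrative)
mean_flow_L_per_s = specific_discharge * catchment_area
print(mean_flow_L_per_s / 1000)  # mean annual flow = 3.0 m3/s
```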
The regional council should be contacted early in the assessment of a new source to check on
water flows (likely minimum flow requirements and the volume available for allocation) and
trends in land use patterns.
Rivers used for drinking-water supply are generally accessible to the public as well as having
private lands draining to them. This means that water quality is variable and not easily
controlled and that such waters are unprotected from illegal activities and major pollution
events (eg, spills from tanker accidents, discharges of urban, farm and factory wastes).
Monitoring programmes are needed to determine when river sources may be unacceptable for
treatment and to determine the quality of influent water prior to treatment. They should provide
an understanding of average water quality, changing water quality conditions and the
magnitude and frequency of extreme water quality occurrences. Monthly sampling is commonly
chosen for river monitoring networks because it provides useful information about average or
characteristic water quality, and changes in water quality. Results can be used for trend analysis
after sufficient data have been collected (at least five years, or 50–100 data sets) (Ward et al
1990). Some targeted sampling may be needed too, to cover special events such as flood and
drought.
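Once 50–100 monthly results have accumulated, a nonparametric test is often used for trend analysis. The sketch below implements the Mann-Kendall S statistic in plain Python as one common choice; the test selection and the synthetic data are ours for illustration, and a real analysis would also compute the variance of S and allow for seasonality.

```python
# Minimal sketch of the Mann-Kendall trend statistic sometimes applied to
# water quality time series once enough data exist (cf Ward et al 1990).

def mann_kendall_s(series):
    """S > 0 suggests an upward trend, S < 0 a downward trend."""
    s = 0
    for i in range(len(series) - 1):
        for j in range(i + 1, len(series)):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)  # sign of each pairwise difference
    return s

# Five years of monthly samples (60 values) with a slow upward drift:
monthly = [10.0 + 0.05 * m for m in range(60)]
print(mann_kendall_s(monthly))  # strongly positive for a rising series
```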
Lakes and reservoirs
The volume of a lake (or reservoir) is the product of its surface area and its average depth.
Catchment size and rainfall determine the flow of water into a lake and thereby influence
flushing and supply of water. Storage in lakes and reservoirs is usually expressed in terms of
lake level that is measured with a permanent and well-surveyed staff gauge, often to within 1
mm (Hoare and Spigel 1987). Assuming that the lake area varies negligibly with level over the
operating range, then available lake volume is proportional to level. The level of a lake is thus
controlled by the difference between its inflows and outflows, as defined below (Hoare and
Spigel 1987):
surface inflow rate: Qin
groundwater inflow rate: Gin
precipitation rate on lake surface per unit area: P
outflow rate from surface outlet: Q
outflow rate to groundwater: Gout
evaporation rate from lake surface per unit area: E
lake area: A
level: L

in which case:

Qin + Gin + PA = Q + Gout + EA + A dL/dt
dL/dt is the rate of change in water level with time. One advantage of lake storage is that,
because the outflow rate can only increase through an increase in lake level, which absorbs a
large proportion of the inflow, the outflow rate in response to a storm varies much less markedly
than the inflow rate. In other words, in-lake storage has a smoothing effect on outflows in
response to storm events. The corollary is that when the lake level falls below the spill level,
outflow ceases.
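The smoothing effect described above can be demonstrated by stepping the water balance forward in time. Everything in this sketch is assumed for illustration: the lake area, the linear outflow rating (outflow proportional to level above spill) and the storm hydrograph; a real reservoir model would use surveyed stage-storage and outlet rating curves.

```python
# Sketch of the water balance above, stepped daily, showing the smoothing
# effect of lake storage. All numbers are illustrative.

def step_level(level, q_in, q_out, area, dt=86400.0):
    """Advance lake level (m) one time step: A*dL/dt = inflow - outflow (m3/s)."""
    return level + (q_in - q_out) * dt / area

area = 2.0e6   # lake area, m2 (assumed constant over the operating range)
level = 0.5    # m above the spill level
levels = []
for day in range(30):
    q_in = 20.0 if day < 3 else 2.0  # a 3-day storm, then baseflow
    q_out = 4.0 * level              # assumed rating: outflow rises with level
    level = step_level(level, q_in, q_out, area)
    levels.append(level)

# Peak outflow (4 * max level) stays far below the 20 m3/s storm inflow:
print(round(4.0 * max(levels), 2))
```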
It is possible to isolate water storage reservoirs from public access or to prohibit activities like
swimming and other forms of contact recreation so that water quality is maintained at a high
level. In cases where there is some public ownership of catchment land (eg, Hays Creek,
Auckland) control of water quality is not as tight and some additional monitoring may be
required to detect incidents that adversely affect the capacity for treatment to be effective.
Bimonthly sampling is considered an appropriate frequency that will enable trends to be
detected in lakes and reservoirs as well as yielding general water quality information (Ward et al
1990).
Springs and groundwater sources
The supply from groundwater and springs depends on the surface waters that feed them, and
there is often a considerable time lag between changes in the quantity and quality of the surface
water and those of the emergent groundwater being used downstream. Hydraulic changes travel
through an aquifer as a pressure wave, moving much faster than the groundwater and its
constituents. This is particularly so for deep aquifers and groundwater, such as those used to
supply Christchurch, and for spring waters emerging in the Lake Taupo catchment. To have a
dependable supply it is necessary to understand these relationships between surface water
hydrology and the resulting groundwater resource.
Emergent groundwater and spring water may be affected by surface contamination some
distance removed from the point of supply, which may therefore not be apparent. Shallow
groundwater is particularly prone to this sort of contamination where there are intensive
land-use activities in the areas that recharge the groundwater or feed springs. Recent dairy
conversions in the Waitaki River valley rely on the relatively clean water of the Waitaki River to
flood-irrigate pasture, and are causing some deterioration of shallow groundwater in the area
through drainage of polluted surface water. Irrigation of freely-draining soils is a well-known
mechanism for introducing surface contaminants to shallow groundwater and is thought to be
the main mechanism for nitrate contamination in the Waikato and other parts of New Zealand
(Selvarajah et al 1994). Recent irrigation trends in Canterbury, with subsequent intensification
of agricultural activities, are increasing the risk of groundwater contamination and having
uncertain effects on the quantities of some groundwater resources (PCE 2004).
Changes in groundwater quality and quantity are much more gradual than those in surface
waters and, accordingly, quarterly monitoring (ie, at three-month intervals) should be carried
out to provide useful information for water quality time-trend analysis once sufficient data have
been collected. At a rate of four samples per site per year it will be many years before sufficient
information has been gathered for trend analysis, with the consequence that degradation of a
groundwater may only be detected well after contamination has occurred (Ward et al 1990).
Thus, it may be
prudent to monitor surface sources of groundwater and their catchments (eg, for major changes
in land use), as well. Water quality data for major New Zealand aquifers collected for the
National Groundwater Monitoring Programme is available from the Institute of Geological and
Nuclear Sciences.
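The slow accumulation of quarterly data can be made concrete with a trivial calculation, taking the lower bound of 50 results suggested for trend analysis (Ward et al 1990):

```python
# Years needed to accumulate the 50-100 results suggested for trend
# analysis, at the monitoring frequencies discussed in this section.

def years_needed(samples_per_year, results_required=50):
    return results_required / samples_per_year

for name, freq in [("monthly", 12), ("bimonthly", 6), ("quarterly", 4)]:
    print(f"{name}: {years_needed(freq):.1f} years for 50 results")
```

At quarterly frequency, 50 results take 12.5 years to collect, compared with just over four years at monthly frequency.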
Risk management issues related to new supplies are addressed in the MoH Public Health Risk
Management Plan Guide PHRMP Ref: S2: Development of New Supplies.
4.3 Barriers to the transmission of micro-organisms
New Zealand waters generally do not contain chemicals that pose a threat to public health when
used for drinking-water supplies. The main concern is the risk of disease by the transmission of
micro-organisms, see Chapter 5: Microbiological Quality.
Although disinfectants are available that can inactivate nearly all micro-organisms, it has long
been an accepted public health concept that the greater the number of barriers employed, the
safer the water for drinking.
WHO (2004a) stated in section 1.1.1:
Securing the microbial safety of drinking-water supplies is based on the use of multiple
barriers, from catchment to consumer, to prevent the contamination of drinking-water or
to reduce contamination to levels not injurious to health. Safety is increased if multiple
barriers are in place, including protection of water resources, proper selection and
operation of a series of treatment steps, and management of distribution systems (piped
or otherwise), to maintain and protect treated water quality. The preferred strategy is a
management approach that places the primary emphasis on preventing or reducing the
entry of pathogens into water sources and reducing reliance on treatment processes for
removal of pathogens.
Traditionally, the barriers have included (WHO 2004c):
• protection of source water (water used for drinking-water should originate from the highest quality source possible)
• coagulation, flocculation and sedimentation
• filtration
• disinfection
• protection of the distribution system.
See WHO (2003) for a thorough discussion on the protection of water quality. This book
contains the following chapters:
• Chapter 1: Safe drinking water: an ongoing challenge
• Chapter 2: Introducing parameters for the assessment of drinking water quality
• Chapter 3: Assessment of risk
• Chapter 4: Catchment characterisation and source water quality
• Chapter 5: Treatment efficiency
• Chapter 6: Monitoring the quality of drinking-water during storage and distribution
• Chapter 7: Surveillance and investigation of contamination incidents and waterborne outbreaks
• Chapter 8: Analytical methods for microbiological water quality testing.
4.3.1 Protection of water catchments
Drinking-water should not contain any micro-organisms capable of causing disease.
Micro-organisms can enter water supplies at any stage of the collection and distribution cycle.
Any micro-organisms that reach the water source are reduced in number by natural processes
such as storage, settlement and natural solar ultraviolet light. If it can be avoided, sources from
which drinking-waters are drawn should not receive faecal contamination, which is likely to
contain pathogenic micro-organisms.
Natural processes are insufficient to ensure sterile water for public distribution. Production of
microbiologically safe water involves an intensive programme of protection, treatment and
monitoring that builds largely on nature’s already established processes. A reasonable
combination of the following measures should be in place for all modern urban water supplies:
• the original water source should ideally be protected from contamination by human or animal faeces, and the catchment should be protected (see Chapter 3: Source Waters, section 3.5.1)
• the water can be stored to allow settlement and die-off of micro-organisms.
Table 4.2 indicates the percentage removal of faecal coliform bacteria achieved by the processes
indicated. Care must be taken to see percentage removal in context, as the actual numbers of
bacteria may be up to 10⁶/mL. Monitoring for microbiological quality is simply a check that the
barriers are working, and should not be regarded as justification for the removal of any of the
barriers.
Table 4.2: Performance that can be achieved by effective barriers to contamination

Process | Removal of faecal indicator bacteria | Reference
Protection of catchment | Variable | Medema et al (2003); Collins (2005)
Artificial impoundments (3–4 weeks’ storage) | Variable | Medema et al (2003)
Coagulation and sedimentation | 40–90 percent | Medema et al (2003)
Filtration | 99–99.9 percent (a) | Stanfield et al (2003)
Chemical disinfection (b) | >99% with sufficient Ct values (a, c) | Stanfield et al (2003)
UV disinfection | >99%, depends on dose (a, d) | Stanfield et al (1999)

a) Stanfield et al (2003) state these removals for bacteria.
b) Chlorine/chloramine/chlorine dioxide/ozone.
c) This does not necessarily apply to protozoan cysts.
d) Doses of 400 J/m² will reduce vegetative bacteria by 4 to 8 logs (Stanfield et al 1999).
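Chemical disinfection performance turns on achieving a sufficient Ct value, ie, the product of disinfectant residual concentration and contact time. The sketch below shows only the arithmetic; the Ct target used is hypothetical, since actual targets depend on the organism, disinfectant, temperature and pH (see Chapter 15).

```python
# Sketch of the Ct (concentration x contact time) concept.
# The Ct target below is purely illustrative.

def contact_time_minutes(ct_target, residual_mg_per_L):
    """Contact time (minutes) needed to reach a Ct target (mg.min/L)."""
    return ct_target / residual_mg_per_L

# eg, a hypothetical Ct target of 15 mg.min/L at a 0.5 mg/L chlorine residual:
print(contact_time_minutes(15.0, 0.5))  # 30.0 minutes
```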
Faecal material from humans and animals is the most likely source of waterborne pathogens.
Humans and domestic animals should be excluded from water supply catchments wherever
possible, particularly if the water treatment process does not include flocculation,
sedimentation, filtration and disinfection. Section 3.3 discusses the control of surface water
quality.
Chapter 3: Source Waters discusses mitigation of pollutants and catchment protection in a
broader sense, see section 3.5.
4.3.2 Storage and pretreatment
Most pathogenic micro-organisms do not survive long in stored water and significant die-off will
occur: typically more than 90 percent removal of faecal indicator bacteria after a week or two’s
storage. See Chapter 12: Pretreatment Processes, section 12.3.2: Off-river Storage for further
information, including reduction times for selected micro-organisms.
Retention of water in artificial storage systems such as lakes or dams will allow the suspended
material (inorganic and organic, including pathogens) to settle, as its specific gravity is
marginally greater than that of water. In addition, pathogenic micro-organisms do not usually
grow outside the host, as the optimum growth conditions do not prevail. Competition for
nutrients from the normal aquatic flora, predation by native protozoa and, most particularly,
inactivation by solar ultraviolet radiation are important pathogen-removing processes.
The removal of solids by settlement helps remove micro-organisms that are adsorbed to the
solids. This clarification of water will facilitate solar ultraviolet inactivation and subsequent
disinfection. Where it is not possible to store the bulk of water for sufficient time,
pre-disinfection can be used as an alternative to storage to reduce numbers of potential
pathogens. However, prechlorination of water at this stage requires higher levels of chlorine and
may produce hazardous by-products.
4.3.3 Coagulation and filtration
Assisting the natural processes by adding a chemical coagulant or flocculant to aggregate
bacteria and other particles, followed by sedimentation and filtration through graded sand, can
remove up to 80–90 percent of suspended solids. Chemicals such as alum, PAC, iron
compounds and polyelectrolytes may be used to promote aggregation of microbes and other
suspended particles, see Chapter 13. Further, activated carbon may be used to remove (by
adsorption) some taste- and odour-causing compounds and other organic molecules.
It is essential that the removal of micro-organisms and other particulate matter should be as
complete as possible before disinfection such that the need for high disinfectant doses (and the
cost of disinfection) is reduced. This will also limit the production of disinfection by-products.
If the colour and turbidity are not high, effective water treatment can be achieved by using
filtration without chemical coagulation. Chapter 14 covers diatomaceous earth, slow sand,
cartridge, bag and membrane filtration processes.
4.3.4 Disinfection and inactivation
Pathogenic micro-organisms in all water supplies need to be disinfected (inactivated) or
removed, except in groundwaters that comply with the bacterial requirements in section 4.5 of
the DWSNZ. Disinfection processes are discussed in Chapter 15.
Water suppliers must assume that all surface waters contain E. coli (which indicates the
probability of pathogenic bacteria and viruses being present) and protozoal (oo)cysts, and treat
the water accordingly. A discharge that increases the number of E. coli may increase the risk to
public health but not necessarily increase the cost of disinfection – usually the dose will be the
same whether there is 1 E. coli per 100 mL, or (say) 1000 per 100 mL. However, a discharge that
increases the number of protozoal (oo)cysts in the source water may cause the required number
of log credits to increase, and hence the cost of treatment.
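The log credits mentioned above relate directly to the percentage removals quoted for treatment barriers: 99 percent removal is 2 logs, 99.9 percent is 3 logs, and barriers in series add their log reductions. The conversion below is standard arithmetic only; the credits actually assigned to a given process are set by the DWSNZ, not computed this way.

```python
# Sketch relating percentage removal to log reduction, and combining
# barriers in series (conversion only; DWSNZ assigns the actual credits).
import math

def logs_removed(percent_removal):
    """99% -> 2.0 logs, 99.9% -> 3.0 logs, and so on."""
    return -math.log10(1 - percent_removal / 100.0)

def combined_logs(*log_values):
    """Barriers in series: log reductions simply add."""
    return sum(log_values)

print(round(logs_removed(99.9), 1))  # 3.0 log
print(combined_logs(3.0, 1.0))       # eg, filtration + disinfection = 4.0 log
```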
Disinfection can and should inactivate all types of pathogenic, indicator and other
micro-organisms. However, disinfection does not usually inactivate every last cell (or (oo)cyst,
spore or virion) of the micro-organisms present. Rather, disinfection reduces concentrations to
acceptable levels for which disease risk is very low, but not zero (Gerba et al
2003). Note that the term inactivate is used to recognise that disinfecting agents do not
(usually) destroy micro-organisms completely, but merely render them incapable of infection
and growth (Stanfield et al 2003). Microbes are usually still recognisable microscopically after
disinfection, despite being inactivated.
The commonest disinfectant used in water supply is still chlorine, as the gas or hypochlorite, but
other chemical disinfectants such as chloramine, chlorine dioxide, and ozone are also used
(Stanfield et al 2003). Ozone is particularly popular in Europe, apparently because toxic
organochlorine byproducts are not produced with this disinfectant, and because many supplies
sourced from rivers contain organic substances that can be destroyed by ozone. Ultraviolet
radiation (usually by exposure to lamps emitting most energy at 254 nm) is a powerful
disinfecting agent, and is becoming increasingly popular in New Zealand and elsewhere,
particularly for inactivating protozoa, again in part because organochlorine byproducts are
avoided. See Chapter 15. There is increasing interest in natural solar disinfection (SODIS) of
water, particularly in developing countries and situations (eg, disaster zones) where
infrastructure for water disinfection is unavailable or has been damaged, but sunlight is
abundant (eg, McGuigan 1998; WHO 2009, 2011a).
The quality of the water prior to disinfection is important because it can greatly influence the
efficiency (and cost) of disinfection (Sobsey 1989). Both organic matter and suspended particles
(indexed by turbidity measurement) need to be reduced to low concentrations prior to
disinfection. Organic matter will increase the consumption of chemical (oxidising) disinfectants
and therefore the cost. Organic matter also strongly absorbs UV radiation, so reducing the
effective dose to micro-organisms. Turbidity will reduce the efficiency of both chemical and
ultraviolet disinfection. The pH of water is important for the effectiveness of some forms of
disinfection, notably that with chlorine.
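The combined effect of disinfectant concentration and contact time is commonly summarised by the C.t product. A minimal Chick-Watson sketch follows; the lethality coefficient k is purely illustrative, since real values depend on organism, disinfectant, temperature and pH:

```python
import math

def log_inactivation(c_mg_per_L, t_min, k=0.4):
    """Chick-Watson model: surviving fraction N/N0 = exp(-k*C*t).
    Returns the inactivation expressed in log10 units.
    k (L/(mg.min)) is an assumed, illustrative value."""
    ct = c_mg_per_L * t_min        # the C.t product, mg.min/L
    return k * ct / math.log(10)   # convert natural-log units to log10

# Under this model, halving the residual can be offset by doubling
# the contact time, because only the product C.t matters:
assert log_inactivation(1.0, 30) == log_inactivation(0.5, 60)
```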
A number of micro-organisms such as Cryptosporidium and Giardia, and the cyanobacteria,
are resistant to typical chlorination doses and can penetrate some filtration processes.
Maintaining a satisfactory level of disinfectant throughout the distribution system is often important: it allows the disinfection process to continue beyond the treatment plant, protects against minor accidental contamination, and helps limit regrowth (biofilms). The lack of a residual is often cited as an important disadvantage of using UV irradiation as the sole disinfectant, which is therefore best suited to smaller water supplies whose reticulation is in good condition, with excellent safeguards in place against accidental contamination.
4.4 Evaluating the sources

4.4.1 Where to sample
This section discusses the evaluation of potential raw water sources, not source water
monitoring which is discussed in Chapter 8: Protozoa Compliance, section 8.2, and not
sampling techniques which is dealt with in Chapter 17.
A series of sample sites needs to be assessed when evaluating potential new sources. These
should become apparent after conducting a sanitary survey of the catchment.
River systems
Samples should be collected from potential intake sites and from tributaries that may impact on
water quality in the main stream at or near the intake. If the intake is in the lower reaches, it
may be necessary to determine the distance that saline water extends up the river, particularly
during periods of low river flow that coincide with spring tides and/or long periods of onshore
wind. Also, samples should be taken close to the point at which the intake would be located, and
not necessarily in the middle of the river (eg, from a bridge). This is because upstream tributary
inflows can hug the riverbanks for some distance downstream (Rutherford 1994), and so a
sample taken from midstream will not necessarily be representative of the quality of the water
that would be taken. Sampling from each side and the middle will indicate the degree of mixing, which, in large or sluggish rivers, may be minimal.
Lakes and reservoirs
Samples should be collected at different depths for at least a year to determine whether the
water body stratifies, and if it does, to measure the quality of the lower waters.
Ideally a dam should be built a few years before the water treatment plant is designed. This
allows the composition of the impounded water to settle down and gives time to assess its
treatment needs correctly. The water quality of a stream can be quite different after
impoundment, to the extent that different types of treatment may be required or desired.
Otherwise, there may be a similar catchment nearby where the effects of impoundment can be
studied, with the hope that it will give an indication of the probable raw water quality.
The changes that can occur when impounding a stream (see Table 4.3 for some typical values)
include:
• after heavy rain, suspended solids can rise to very high levels in stream water, and remain high for a day or two. Suspended solids entering a reservoir are diluted, so do not reach the same high levels, but may remain elevated for weeks. If the flood water is much colder than the reservoir water it may plunge to an intermediate depth
• stream phytoplankton are mainly attached to pebbles (epiphytic or benthic). In a reservoir the phytoplankton are free-swimming or floating species (planktonic), and these can reach much higher population densities. Nutrient levels in a reservoir can be higher too, because most nutrients are associated with run-off, which in a stream passes with the flood flow, but may be retained in an impoundment
• natural organic matter (commonly measured as UV absorbance at 254 nm after membrane filtration) leaches from the soil during and after rain. In a stream this mostly passes down with (or very soon after) the fresh or flood. However, much of it is retained in an impoundment
• reservoir surface water temperature in summer can reach several degrees higher than stream water flowing through a bush catchment, and stay warmer during early winter
• summer stratification and deoxygenation usually occur in impoundments, giving rise to elevated concentrations of iron, manganese, ammonia, carbon dioxide and hydrogen sulphide, mainly in the deeper water
• the reservoir water will usually be dirtier for the first few years due to scouring of the cleared slopes, and due to the high deoxygenation rates of the newly flooded bottom sediments.
Table 4.3: Effect of impoundment on mean concentrations of some determinands during fairly dry summer/autumn periods

Determinand ('typical values')    Flowing stream   Impounded water (surface)   Impounded water (deeper)
Temperature, °C                   14               19                          11
pH                                7.4              7.1*                        6.2
Turbidity, NTU                    2                5                           10
Total phosphorus, mg/L P          0.005            0.02                        0.2
Iron, mg/L                        0.1              0.3                         10
Manganese, mg/L                   0.01             0.05                        2
Silica, mg/L SiO2                 18               13                          15
Alkalinity, mg/L CaCO3            18               13                          25
UV abs 254, 10 mm, filtered       0.03             0.09                        0.15
Colour, Hazen units               5                25                          30

Based on Ogilvie 1983.
* Can reach pH 8 during the afternoon if algal content is high.
Groundwater
The well should be pump-tested, screened and developed before making decisions about any
water treatment requirements. Samples need to be collected without any aeration, filling the
sample bottle carefully to the top, and allowing several bottle volumes to run through so all air is
displaced; a BOD bottle with its tapered lid is ideal for tests that are affected by aeration,
including the carbon dioxide calculation. Measure the pH as soon as possible. Faulty sampling
can cause a groundwater with a pH of 6.5 and 40 mg/L of carbon dioxide to lose all its carbon dioxide and end up with a pH of about 7.4. This will result in the selection of a completely inappropriate treatment process.
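The sensitivity described above can be checked with a first-dissociation carbonate calculation. The equilibrium constant (25°C) and the alkalinity figure are assumptions chosen for illustration, and alkalinity is approximated as bicarbonate, which is reasonable near neutral pH:

```python
# First dissociation constant of carbonic acid, K1 = [H+][HCO3-]/[CO2],
# assumed value at 25 degrees C.
K1 = 10 ** -6.35

def free_co2_mg_per_L(pH, alkalinity_mg_CaCO3):
    """Free CO2 estimated from pH and total alkalinity."""
    hco3 = alkalinity_mg_CaCO3 / 50_000   # mol/L (equivalent weight of CaCO3 = 50)
    h = 10 ** -pH
    co2 = hco3 * h / K1                   # mol/L, rearranged from K1
    return co2 * 44_000                   # mg/L as CO2 (molar mass 44)

# Same water (alkalinity ~64 mg/L as CaCO3, an assumed figure), two pH readings:
field = free_co2_mg_per_L(6.5, 64)    # true field pH: ~40 mg/L CO2
aerated = free_co2_mg_per_L(7.4, 64)  # pH after CO2 loss in sampling: ~5 mg/L
```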
There are several good references to sampling techniques for groundwater. A recent book is by
Sundaram et al (2009). See Sinton (1986) for an earlier publication.
Deep confined groundwaters usually display a fairly consistent chemical composition, whereas
the composition of shallow unconfined groundwaters can vary markedly throughout a year; see
Chapter 3: Source Waters, section 3.2.
4.4.2 When to sample and how often
An objective of sampling a prospective water source is to discover the degree of contamination
that may:
• regularly be present
• occasionally be present.
Sampling should cover at least a year so that seasonal effects and irregular events can be
assessed. Water treatment plants usually have to be designed to treat the worst quality raw
water. The usual (regular) water quality is discovered best by random sampling in time.
However, the sampling programme should be designed to include an assessment of the impact
of climatic events such as drought and different rain intensities. That way, any otherwise
unforeseen patterns of quality variation may be picked up. See Chapter 9 for discussion related
to cyanobacteria.
To pick out extremes in the data, a productive approach is to estimate the 95th percentile of the water quality variable with reasonable confidence. As a broad generalisation, more than 50 samples are desirable to achieve this. This is shown in Figure 4.1: once more than about 50 samples are taken, the width of the confidence interval decreases only very slowly.
Figure 4.1: Confidence limits on a 95th percentile estimate
Source: McBride 2005.
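The sample-size effect can be illustrated with order statistics: the number of observations falling below the true 95th percentile is binomially distributed, so even the widest interval (from smallest to largest observation) has limited coverage for small n. This is a standard binomial sketch, not the method of McBride 2005:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(K <= k) for K ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def max_coverage(n, p=0.95):
    """Confidence that the interval [smallest, largest observation]
    contains the p-th quantile, for a sample of size n."""
    return binom_cdf(n - 1, n, p) - binom_cdf(0, n, p)

# With 20 samples no interval can reach 90 percent confidence for the
# 95th percentile; with 50 samples it becomes achievable.
assert max_coverage(20) < 0.90
assert max_coverage(50) > 0.90
```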
4.4.3 What to sample for
There are four main reasons for monitoring the quality of the source water:
1 it can indicate whether the water will be suitable after treatment for intended uses
2 it provides the means of assessing the effectiveness of catchment management
3 it can provide an indication of trends or the impacts of events
4 it helps water treatment management operate existing plant more effectively.
Regional councils should monitor land use and water quality to fulfil their responsibilities under the RMA 1991, both to monitor the specific discharge consents that have been issued and to assist in the protection and enhancement of the quality of natural waters. The latter is particularly important for non-point discharges of contaminants into the general environment.
A water supplier should monitor all regional council data that are relevant to the source water,
including consent applications. It may also be appropriate to commission other water quality
monitoring which will assist in the protection of the source.
Until 1995, the Institute of Environmental Science and Research Ltd (ESR) monitored many
source waters and drinking-waters on a three- to five-year surveillance cycle for inorganic and
physical determinands, pesticides and trace organics under contract to the Ministry of Health.
This source water monitoring no longer occurs.
Protozoa monitoring is discussed in Chapter 8: Protozoa Compliance, section 8.2 Source Water.
Monitoring other micro-organisms is discussed in section 4.3: Barriers to the transmission of
micro-organisms. Generally speaking, unless the water is a secure bore water, all waters contain
micro-organisms that need to be inactivated or removed. The disinfectant dose is normally
determined by the disinfectant demand of the water (after treatment), not the number of micro-organisms present. A combination of the two is required for protozoal inactivation.
Risk-based approach
The DWSNZ include MAVs for 115 chemical determinands that may be present in waters and potentially represent a significant health concern to human consumers. This list contains a wide range of chemicals, from both natural and anthropogenic sources, that may be found in source waters. A few of them are produced in disinfection processes or could enter the water from treatment chemicals or materials used in the distribution system or plumbing; these are discussed in Chapter 10: Chemical Compliance and Chapter 15: Treatment Processes: Disinfection.
Many of the determinands will not be present in a given water source so procedures are required
to prioritise analytical assessments. A risk management approach provides a basis for a decision
support framework to prioritise a chemical assessment programme.
The three main criteria for identifying specific determinands of concern to public health in any
particular setting are:
• high probability of consumer exposure from drinking-water
• significant hazard to health
• interference in the treatment process.
Chemicals judged to be more likely to occur and to be highly hazardous to human health should
be given greater priority for risk management than those judged less likely to occur in the
drinking-water and to have lower health hazards. This can be addressed in the PHRMP.
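This prioritisation can be sketched as a simple likelihood-times-hazard ranking. The determinand names and scores below are hypothetical, for illustration only; they are not taken from the DWSNZ:

```python
# Hypothetical screening scores (1 = low, 3 = high); not measured data.
candidates = [
    {"name": "determinand A", "likelihood": 3, "hazard": 3},
    {"name": "determinand B", "likelihood": 2, "hazard": 2},
    {"name": "determinand C", "likelihood": 1, "hazard": 2},
]

# Rank so the chemicals most likely to occur AND most hazardous come first,
# mirroring the priority rule described in the text.
ranked = sorted(candidates,
                key=lambda c: c["likelihood"] * c["hazard"],
                reverse=True)
priority_order = [c["name"] for c in ranked]
```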
The period of exposure should also be considered, because health effects caused by chemicals in
drinking-water generally result from long-term exposure. Few chemicals in NZ drinking-water
have been shown to cause acute health problems in the short term, except through intentional or
accidental contamination on a large scale. In such instances, the water frequently (but not
always) becomes undrinkable due to unacceptable taste, odour or appearance (WHO 2004a).
Risk management strategies for chemicals in drinking-water should also take into account the
broader context. For example, if drinking-water is not the main route of exposure for a chemical,
then controlling levels in water supplies may have little impact on public health (WHO 2004b).
A recent World Health Organization publication (WHO 2004b) provides guidance on a risk-based approach, and detailed background information on chemical contaminants derived from a
wide range of sources. A key component of a risk-based approach is to generate a list of
potential contaminants that might be in the source water, based on knowledge about the natural
mineralogy, anthropogenic activities and hydraulic processes (eg, rainfall, catchment size,
groundwater contribution) operating in the catchment. Integration of this information provides
a robust approach to categorising the likely importance of contaminants of concern.
New Zealand has a number of major landscape-scale activities that may contribute significantly
to downstream contaminants. Table 4.4 provides an overview of some sources and activities that
may give rise to contaminants. This provides a risk-based approach to assess potential
contaminants that may occur in a water supply. The objective of this tabulation is to provide an
indication of sources that may require greater consideration in New Zealand compared with the
generic WHO (2004b) listings. Some of these activities are conducted on a wide scale (eg, pest
control for possums, forestry spraying), others may have cumulative risks from a large number
of small or diffuse inputs (eg, agricultural ponds), while others may be regional (eg, geothermal,
mining).
Table 4.4: Summary of sources that may provide significant chemical contaminants of concern (COCs) to freshwater environments in New Zealand1

Agriculture
Contaminants of concern: Ammonia, nitrate, modern pesticides, Zn, Cu, F, cyanotoxins, DDT, dieldrin
Comment: Cd and F input from fertiliser; Zn from application for facial eczema; Cu from horticultural spraying; DDT and dieldrin from legacy use of pesticides; cyanotoxins from blue-green algal growth in agricultural oxidation ponds

Forestry
Contaminants of concern: Cu, Cr, As, fungicides, PCP
Comment: Cu from forestry Dothistroma spraying; Cu, Cr, As and PCP from old timber treatment sites

Geothermal
Contaminants of concern: Hg, B, As, F, Li
Comment: Geothermal (and some hydrothermal) region input to surface and groundwaters

Pest control
Contaminants of concern: 1080, brodifacoum
Comment: Used widely for possum and rat control

Mining
Contaminants of concern: Gold mining: Cu, Cr, As; coal mining: B, Hg, Cd
Comment: Legacy mining inputs. New Zealand coal is high in B in some areas

Aquaculture
Contaminants of concern: Hg, antibiotics
Comment: Hg is derived from use of some fish meal. Antibiotics added to many feeds

Domestic oxidation ponds
Contaminants of concern: Ammonia, nitrate, various, cyanotoxins
Comment: Microcystin from blue-green algal growth in oxidation ponds

Mineralogy
Contaminants of concern: As, Hg
Comment: Parts of New Zealand have mineralised areas with natural leaching of As and Hg to receiving waters

1 See also Hickey 1995, 2000; Smith 1986; Lentz et al 1998.
The risk-based procedure then involves listing the potential contaminant contributions followed
by the hazard assessment. Table 4.5 illustrates an assessment procedure for an integrated
catchment approach. The contaminants illustrated in this table are those likely to be of
relevance in the New Zealand environment. There may be a range of site-specific contaminants
in some water sources. This approach provides a decision-support basis for monitoring and
surveillance of contaminants. The listing process is designed to be relatively exhaustive in
drawing information from a range of sources to compile the database. The subsequent
procedure involves risk ranking to eliminate contaminants that would not be expected to be
present in significant quantities in the catchment. For completeness, Table 4.5 includes
determinands that may enter the water during and after treatment.
There is no national database available in New Zealand that provides information on point source and diffuse source contaminants or natural water concentrations. Rather, data must be gathered from many sources. Information on contaminants from a wide range of discharges is contained in several publications (eg, Hickey 1995, 2000; Smith 1986; Lentz et al 1998). Wilcock (1989) and Wilcock and Close (1990) reviewed pesticide use, and a risk-based assessment followed (Wilcock 1993). The pesticide use data have more recently been updated (Holland and Rahman 1999).
Most New Zealand water suppliers find that they do not need to monitor their source waters for chemical determinands of health significance. Some aesthetic determinands are measured regularly, and some determinands can impair the performance of the treatment process. Generally, the only raw water monitoring recommended to be conducted on site is for those determinands that water treatment plant operators can do something about. These depend on the treatment process being used (Chapters 12–15) and the likelihood of the determinands being a nuisance in the treatment process. Some fairly common problems are:
• colour (or UV absorbance) and turbidity affect coagulation processes
• natural organic matter and bromide may lead to disinfection by-products
• silt and debris from floods can challenge the solids loading of the treatment plant
• a change in raw water pH can require a pH adjustment either at the coagulation or final stage
• low alkalinity, often during and after heavy rain, may prevent sufficient floc forming in the coagulation process
• free carbon dioxide in bore water can cause metallic corrosion
• iron, and particularly manganese, can be difficult to remove during treatment
• algae can block filters, and cause taste and odour problems
• an increase in the ammonia concentration can increase the chlorine demand
• an increase in the colour or UV absorbance (or decrease in UVT) can affect the UV disinfection efficacy
• low temperatures can affect treatment rates
• C.t values are temperature dependent.
Table 4.5: Prioritising chemical monitoring in drinking-water using limited information

For each chemical, the table records a tick against each of the following questions; a tick is entered only if the answer is yes, and the Chapters 3 to 6 questions are assessed both in theory and site-specifically:

Chapter 3: Is it possible that this chemical is in the raw water source from naturally occurring sources?
Chapter 4: Is it possible that this chemical is in the raw water source from agricultural sources?
Chapter 5: Is it possible that this chemical is in the raw water source from human wastes?
Chapter 6: Is it possible that this chemical is in the raw water source from human settlements and industry?
Chapter 7: Is it possible that this chemical is introduced during water treatment or distribution?
Summary: Does this chemical have a significant 'probability of occurrence'? (tick only if there are any ticks in Chapters 3 to 7)
Attenuation: Consider attenuation factors (see Chapter 8). Is it still possible for the consumer to be exposed to this chemical?
Final list: Should this chemical be kept on the list, based on practical and feasibility of control considerations? (see Chapter 10)

The chemicals assessed are:

Inorganic constituents: Antimony, Arsenic, Barium, Beryllium, Boron, Cadmium, Chromium, Copper, Cyanide, Fluoride, Lead, Manganese, Mercury (total), Molybdenum, Nickel, Nitrate (as NO3), Nitrite (as NO2), Selenium.

Organic constituents (aromatic hydrocarbons): Benzene, Toluene, Xylenes, Ethylbenzene, Benzo[a]pyrene.

Pesticides: Alachlor, Aldicarb, Aldrin/dieldrin, Atrazine, Carbofuran, Chlordane, DDT, 1,2-dibromo-3-chloropropane, 2,4-D, Heptachlor and heptachlor epoxide, Hexachlorobenzene, Lindane, MCPA, Pentachlorophenol, Permethrin, Propanil, Pyridate, Simazine, Trifluralin; chlorophenoxy herbicides other than 2,4-D and MCPA (2,4-DB, Dichlorprop, Fenoprop, MCPB, Mecoprop, 2,4,5-T); 1080; Microcystin.

Disinfectants: Monochloramine, Di- and trichloramine, Chlorine, Chlorine dioxide.

Disinfectant by-products: Bromate, Chlorate, Chlorite; chlorophenols (2-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol); Formaldehyde, MX; trihalomethanes (Bromoform, Dibromochloromethane, Bromodichloromethane, Chloroform).

Chemicals that may give rise to consumer complaints:
• inorganic constituents: Aluminium, Ammonia, Chloride, Copper, Hydrogen sulphide, Iron, Manganese, Sodium, Sulphate, Zinc
• organic constituents: synthetic detergents
• disinfectants and disinfectant by-products: Chlorine, Chloramine, 2-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol
• chemicals not of health significance: Asbestos.

Source: Derived from WHO 2004b.
Upgrade this table following the WHO protocol – see Figure 1 (WHO 2004b).
4.4.4 Effect of recycling washwater
Recycling wastewater from the sedimentation process and filter backwash can modify the composition of the source water, especially if the return is not continuous, so care is needed when collecting raw water samples: the raw water is really source water plus recycle.
Sedimentation tank wastes are usually thickened and dewatered, with only the supernatant
being returned. These practices can save 1–4 percent of the flow.
The recycled water may contain high pathogen densities (the main concern being pathogenic
protozoa) that challenge the filter and result in breakthrough unless certain precautions are
taken. The USEPA found that of the 12 waterborne cryptosporidiosis outbreaks that had occurred at drinking-water systems since 1984, three were linked to contaminated drinking-water from water utilities where recycle practices were identified as a possible cause. This finding resulted in the Filter Backwash Recycle Rule (USEPA 2002).
Section 5.2.1.3 of the DWSNZ outlines the recycling conditions that need to be followed in order to avoid an increase in the protozoal log removal requirement.
The UKWIR (2000) developed a water treatment guidance manual that addresses recycling of
spent filter backwash water. The UKWIR recognised the risk posed by concentrated suspensions
of Cryptosporidium oocysts in spent filter backwash. UKWIR developed the following
guidelines to prevent passing oocysts into finished water:
• backwash water should be settled to achieve a treatment objective of greater than 90 percent solids removal before recycling
• recycle flows should be less than 10 percent of raw water flow, and continuous rather than intermittent
• continuous monitoring of the recycle stream with online turbidimeters should be conducted
• jar tests should be conducted on plant influent containing both recycle streams and raw water to properly determine coagulant demand
• polymers should be considered to assist coagulation if high floc shear or poor settling occurs
• the recycle of liquids from dewatering processes should be minimised, particularly when quality is unsuitable for recycling.
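The quantitative parts of the UKWIR guidance can be captured in a simple screening check. The function below merely restates the numeric bullet points above as a sketch; it is not a substitute for the manual:

```python
def recycle_within_guidance(recycle_flow, raw_flow, solids_removal_percent,
                            continuous=True):
    """True if a recycle stream meets the numeric UKWIR guidance quoted
    in the text: settled to >90 percent solids removal, returned
    continuously, and at less than 10 percent of the raw water flow."""
    return (solids_removal_percent > 90
            and continuous
            and recycle_flow < 0.10 * raw_flow)

# eg, an 8 L/s continuous recycle into a 100 L/s raw flow, settled to 95%:
assert recycle_within_guidance(8, 100, 95)
# an intermittent return fails regardless of the flow ratio:
assert not recycle_within_guidance(8, 100, 95, continuous=False)
```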
Raw waters with a high natural organic matter content may cause recycling problems due to formation of increased levels of disinfection by-products. Trihalomethane formation potential can increase more than 100-fold between the raw water and the recycled water.
Another potential health concern, particularly if water from sludge treatment systems is
returned, is the possibly high level of monomer resulting from the use of polyelectrolyte.
Section 8.2.1.2 of the DWSNZ shows how to address this problem.
Risk management issues related to recycling are covered in the MoH Public Health Risk
Management Plan Guide PHRMP Ref: P4.4: Pre-treatment Processes – Waste Liquor
Reintroduction.
4.5 Selecting appropriate treatment
processes
4.5.1 Intakes
The intake or point of abstraction of a drinking-water supply may be from a bore, spring,
infiltration gallery, lake, reservoir, stream, or river. Careful design and maintenance of the
abstraction process can prevent significant problems in subsequent treatment processes. Design
issues include the adequate testing and development of bores, provision for backflushing infiltration galleries, the use of fine screens to prevent particulate material entering the process, the use of presettling processes to keep silts and sands out of pumps and filters and, where required, avoiding the use of high turbidity water after storms. The degree of any pretreatment needed will depend on the subsequent treatment processes. Pretreatment processes are discussed in Chapter 12.
Design of bore abstraction systems should consider potential yields and how to enhance these;
changes in water quality with time; the potential for easy removal of the pump, and possibly the
screen, for maintenance and cleaning and inspection of the casing.
Specifications for well drilling work should ensure that rigs and equipment are thoroughly
steam cleaned between jobs to minimise the potential for transfer of iron and manganese fixing
bacteria between different locations. If infestations of these bacteria occur, regular treatment of
the bore with acid and chlorine washes may control the problem. Specifications should also
include the logging of the drillings, and the supply of the bore logs to the water supplier. See
Chapter 3: Source Waters, section 3.2 Groundwater for further information.
River intakes must be sited so that they:
• are above the minimum water level (a weir may need to be constructed on small rivers)
• do not accumulate debris
• do not block with gravel
• are not on fish migratory paths
• are preferably upstream of, or on the opposite bank from, discharges or dirty tributaries and, if that is not possible, are far enough downstream for the discharge to be fully mixed.
There must be adequate redundancy of intake pumps to guard against breakdowns, and the pumps must be sited above maximum flood level. A reliable supply of water to the plant inlet is crucial to the whole supply system; failure to supply the inlet can have serious repercussions for the treatment plant.
Valve selection for reservoirs is important. The top valve must be high enough to draw the
required flow of upper (epilimnion) water while the water level is nearly full. A lower valve is
needed so oxygenated water can still be abstracted during dry or high use periods; this means
high dams may require several valves. If there is an insufficient number of valves, anaerobic
water may have to be abstracted when the water level is too low for the top valve to operate.
Anaerobic water can contain very high concentrations of iron and manganese. Anaerobic water
can also have elevated ammonia and sulphide levels, which may challenge the chlorination
equipment.
Risk management issues related to intakes are addressed in:
• MoH Public Health Risk Management Plan Guide PHRMP Ref: P1.1: Surface Water Abstraction – Rivers, Streams and Infiltration Galleries
• MoH Public Health Risk Management Plan Guide PHRMP Ref: P1.2: Surface Water Abstraction – Lakes and Reservoirs.
4.5.2 Treatment selection
General
For any particular source water, there will usually be several treatment options that can produce
drinking-water that complies with the DWSNZ. What is successful overseas may not always be
appropriate for New Zealand. The treatment process is selected after assessing the catchment
and its water quality; see previous sections of this Chapter and Chapter 3: Source Waters.
Procedures for handling quality issues not addressed by the treatment process should be
covered in the PHRMP.
New Zealand surface water sources are often influenced, in both quantity and quality, by the steep topography of the land and the short in-river travel distances. Overlaid on this are sudden weather changes with significant rainfall. Unstable catchments can result in rapid changes in turbidity or solids loadings in the source water, often with equally rapid clearing of these conditions. Most of New Zealand is not subject to prolonged drought, freezing or spring snowmelt.
Community populations in New Zealand are somewhat different from those in more densely populated countries. Our relatively small population and large per capita land area mean that our water supplies are often widely spaced, serving small or very small populations. Much of the overseas technical literature deals with larger plants and is often not relevant to most of our water suppliers. In New Zealand there are:
•	14 communities providing drinking-water to >50,000 people
•	58 communities providing drinking-water to 5000–50,000
•	213 communities providing drinking-water to 500–5000
•	1702 communities providing drinking-water to <500.
Over 96 percent of New Zealand communities have a population of less than 5000. Our large
communities are not large by overseas standards. Drinking-water for populations less than
500 is discussed further in Chapter 19: Small, Individual and Tankered Supplies.
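The "over 96 percent" figure follows directly from the community counts above; a quick arithmetic check, using the counts as listed:

```python
# Community counts by population band, as listed above.
communities = {
    ">50,000": 14,
    "5000-50,000": 58,
    "500-5000": 213,
    "<500": 1702,
}

total = sum(communities.values())
under_5000 = communities["500-5000"] + communities["<500"]
share = 100 * under_5000 / total

print(f"{under_5000} of {total} communities ({share:.1f}%) serve fewer than 5000 people")
```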
Information regarding protection of catchments, pre-treatment, and storage is covered in more detail in section 4.3. Other data, such as water levels, flows and past flood events, can often be obtained from regional councils or local sources.
The information and data the design or process engineer requires to identify and select treatment plant components and configurations (potential water quantity, reliability and continuity) are discussed in section 4.2: Identifying potential sources and section 4.4: Evaluating the sources.
Pre-selection process
The traditional approach to treatment plant design includes obtaining:
•	hydrological data
•	rainfall and other relevant climate data
•	historical raw water quality
•	information about land use that can affect water quality (sanitary survey)
•	results of monitoring water treatment in the same or similar catchments
•	assessment of potential water treatment processes.
Ideally the pre-selection and planning process will allow time for pilot plant studies to test the preferred treatment processes, reducing the cost of the final plant because fewer contingencies need to be incorporated in the design. Some advantages of including a pilot plant stage were discussed in Couper and Fullerton (1995).
Other planning issues that must be considered are:
•	expected design life
•	intake site selection
•	population projections, ie, proposed and future plant capacity
•	long term (or other) security of the catchment or water source
•	planning and resource allocation/management issues, including plant waste disposal.
Due to rapid changes or extremes in raw water quality, some water supplies use more than one source. Examples include Gisborne (upland sources and Waipaoa River), and Wellington’s
Kaitoke supply (river water or pond storage). This offers the option to switch from a source that
becomes difficult to treat, to a cleaner source, either fully or by mixing the two. Two sources may
be needed when the main source is affected by drought or when the abstraction rate is
controlled by an in-stream minimum flow, eg, Waikanae.
If the difficult treatment situations are expected to be short-lived, there may be advantages in
relying on increasing the volume of stored treated water.
Selection options
Figure 4.2 shows the size of some micro-organisms and the suitability of various treatment processes for removing them. The small size of viruses and bacteria shows why disinfection is so important.
As well as the traditional assessment of the source water and catchment, the DWSNZ now require some water supplies to monitor Cryptosporidium in order to determine the source water protozoal risk category; refer to section 5.2.1 of the DWSNZ, and to Chapter 8: Protozoa Compliance, section 8.2 of the Guidelines.
Chapter 8: Protozoa Compliance, section 8.3 discusses the cumulative log credit approach to the
removal or inactivation of protozoa, and discusses the log credits that different treatment
processes can be awarded. This is illustrated with some examples, showing different approaches
for achieving 3 and 4 log removals.
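The cumulative log credit idea can be illustrated numerically. The sketch below uses hypothetical credit values for a two-barrier plant; the credits actually awarded to each process are set by the DWSNZ and discussed in Chapter 8, not by this example:

```python
# Each treatment barrier earns log10 reduction credits; credits add, and
# n log credits correspond to reducing concentration by a factor of 10**n.
# The credit values below are illustrative only -- the credits actually
# awarded for each process are set out in the DWSNZ.
barriers = {
    "coagulation/sedimentation/filtration": 2.5,  # illustrative value
    "UV disinfection": 1.5,                       # illustrative value
}

total_credits = sum(barriers.values())
fraction_remaining = 10 ** -total_credits
percent_removed = (1 - fraction_remaining) * 100

print(f"{total_credits} log credits in total: "
      f"{percent_removed:.2f}% removal/inactivation of (oo)cysts")
```

This also shows why the jump from 3 to 4 log credits matters: each extra log credit reduces the surviving fraction by a further factor of ten.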
As well as source water quality, costs have a major bearing on the choice of treatment process.
Processes such as diatomaceous earth, bag and cartridge filtration usually require less capital to install than coagulation/filtration plants and membrane filtration, so they tend to be used more often in the smaller water supplies. Conventional coagulation/filtration plants usually have lower operating costs, so they tend to be used by the larger water suppliers. Costs are not discussed in these
Guidelines.
The treatment processes chosen for protozoal compliance must also be suitable for dealing with
other impurities, as covered in the following discussion. Generally, for most source waters, the
water treatment process is still selected on basic issues such as colour and turbidity. Whether a source water needs 3 or 4 log removals for protozoal compliance usually dictates the selection of disinfectant or its dose, or the turbidity required from the filters. If a source water is required
to achieve 4 log removals, it may be necessary to include an additional treatment process, over
and above the amount of treatment that just colour and turbidity would require.
Figure 4.2: Micro-organism size and treatability
This section concentrates on the options for the treatment of determinands other than protozoa. Chapter 8 discusses the treatment requirements for protozoal compliance. Chapters 12–15 describe operational aspects of the treatment processes in more detail. Chapter 3 in AWWA (1990) gives a guide to the selection of water treatment processes.
All water sources other than secure bore water need some form of disinfection. Chlorine is still
the most frequently used disinfectant in New Zealand water supplies. At reasonable doses, it is
effective against most bacteria and viruses. The selection of disinfectant will be dependent on the approach adopted to satisfy protozoal compliance, and whether it has been decided
to maintain a chlorine residual in the distribution system. See Chapter 15: Disinfection,
Table 15.3 for a summary of the efficacy of different disinfectants.
Tables 4.6–4.8 only offer guidance; they are not meant to be part of a design manual. The tables
attempt to match potential treatment processes with raw water quality. In some cases the raw
water quality may be such that a combination of processes is needed.
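Conceptually, the matching the tables support is a lookup from raw-water characteristics to candidate processes. A minimal sketch follows; the mappings are hypothetical, abbreviated examples only, and proper process design is still required:

```python
# Abbreviated illustration of matching raw-water characteristics to candidate
# treatment processes. The mappings are hypothetical examples only; the actual
# guidance is in Tables 4.6-4.8.
CANDIDATES = {
    "low colour, low turbidity": ["cartridge filtration", "disinfection only"],
    "coloured surface water": ["coagulation/filtration", "membrane filtration"],
    "hard groundwater": ["softening"],
}

def candidate_processes(characteristics):
    """Return the candidate processes for the listed characteristics, in order."""
    options = []
    for characteristic in characteristics:
        for process in CANDIDATES.get(characteristic, []):
            if process not in options:
                options.append(process)
    return options

print(candidate_processes(["coloured surface water", "hard groundwater"]))
```

A source with several characteristics accumulates candidates from each, mirroring the point above that a combination of processes may be needed.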
For individual supplies, refer to Chapter 19, Table 19.2: Contaminants and treatment methods,
and Table 19.3: Point-of-use devices and their effectiveness against various contaminants.
Table 4.6: Treatment options for typical low colour source waters

| Treatment options | Filtering non-secure bore water | Removing carbon dioxide from bore water | Removing iron or manganese ex bore water | Filtering surface water without much colour (note 3) |
|---|---|---|---|---|
| Cartridge | Yes | | | Yes |
| Aeration | | Yes (note 1) | | |
| Aeration, coagulation and filtration | | | Yes (note 2) | Yes |
| Aeration plus oxidation and/or pH increase | | | Yes | |
| Diatomaceous earth filtration | Yes | | | Yes |
| Slow sand filter | | | | Yes |
| Membrane filtration (MF) | Yes | | | Yes |

1	Check that sodium hydroxide or hydrated lime is not needed too.
2	Check amount of iron and manganese removed by aeration alone: oxidation and/or pH adjustment may be needed.
3	If the turbidity is low enough, disinfection may be the only treatment needed. Ozone may lower the colour.
Table 4.7: Treatment options for source waters with colour that also needs to be removed

| Treatment options | Surface water with low turbidities | Waters with high or a large range of particulate matter | Surface water with large numbers of algae (note 1) | Lowland rivers below industry or intense agriculture |
|---|---|---|---|---|
| Slow sand filter | Yes | | | |
| Coagulation, direct filtration | Yes | | | |
| Coagulation, sedimentation, filtration | Yes | Yes | Yes (note 2) | |
| Coagulation, DAF, filtration | Yes | Yes | Yes (note 2) | |
| Coagulation preceded by microstrainer | | | Yes | |
| Membrane filtration (MF) | Yes | Yes (note 3) | | |
| Coagulation or MF plus activated carbon | | Yes (note 3) | | Yes |
| Coagulation or MF plus ozone | | Yes (note 3) | Yes | Yes |

1	There could be seasonal variation.
2	Pretreatment may be essential, eg, bankside or off-river storage.
3	Coagulation may be needed at times.
Table 4.8: Treatment options for other types of source waters

| Treatment options | Groundwater with high ammonia concentrations | Groundwater with geothermal material | Waters with low colour but glacial flour | Hard water |
|---|---|---|---|---|
| Aeration | At high pH | | | |
| Aeration plus oxidation and/or pH increase | Yes | Possibly | | |
| Diatomaceous earth filtration | | | Possibly | |
| Slow sand filter | | | Possibly | |
| Membrane filtration (MF) | | | Yes | |
| Coagulation and sand filtration | | | Possibly | |
| Softening | | Possibly | Yes | Yes |

Note: Groundwaters containing geothermal water may need specific guidance.
Table 4.9: Options for waters that only require disinfection

| Disinfectant | Bacterial compliance | Protozoal compliance | Residual in the distribution system |
|---|---|---|---|
| Chlorine | Yes | | Yes |
| Chloramine | Yes | | Yes |
| Chlorine dioxide | Yes | Yes | Yes |
| Ozone | Yes | Yes | |
| UV light | Yes | Yes | |
Note that nanofiltration and reverse osmosis systems can also remove bacteria and protozoa.
The DWSNZ do not include any compliance criteria for viruses. All of the above disinfection
processes (except chloramine) inactivate most viruses.
Risk management issues related to design and operation are covered in the MoH Public Health
Risk Management Plan Guide PHRMP Ref: P11: Treatment Processes – Plant Construction
and Operation.
USEPA (2007) is a useful manual covering many aspects of water treatment.
References
AWWA. 1990. Water Quality and Treatment (4th edition). Published for American Water Works
Association by McGraw-Hill Inc.
Collins R, McLeod M, Donnison A, et al. 2005. Surface Runoff and Riparian Management III.
NIWA Client Report 2005-054 to the Ministry of Agriculture and Forestry, Wellington. This
document can be accessed from http://maxa.maf.govt.nz/mafnet/publications/techpapers/0602/page-04.htm
Couper SJ, Fullerton RW. 1995. Pilot Plant Trials: Seeking the most cost-effective solution. Annual
Conference, New Zealand Water and Wastes Association.
DSIR. 1988. Hydrologists’ Field Manual. Hydrology Centre Publication No. 15. Christchurch. (Now
NIWA.)
Duncan M, Woods R. 2004. Flow regimes. In J Harding, P Mosley, C Pearson, et al (eds)
Freshwaters of New Zealand, pp 7.1–7.14. Christchurch: New Zealand Hydrological Society and New
Zealand Limnological Society, Caxton Press.
Gerba CP, Nwachuku N, Riley KR. 2003. Disinfection resistance of waterborne pathogens on the
United States EPA’s contaminant candidate list. Journal of Water Supply Research and Technology
52: 81–94.
Hickey CW. 1995. Ecotoxicity in New Zealand. Australian Journal of Ecotoxicology 1: 43–50.
Hickey CW. 2000. Ecotoxicology: laboratory and field approaches. In KC Collier, M Winterbourn
(eds) New Zealand Stream Invertebrates: Ecology and Implications for Management, pp. 313–43.
Christchurch: New Zealand Limnological Society.
Hoare RA, Spigel RH. 1987. Water balances, mechanics and thermal properties. In WN Vant (ed)
Lake Managers Handbook, pp 41–58, Water and Soil Miscellaneous Publication No. 103.
Wellington: Ministry of Works and Development.
Holland P, Rahman A. 1999. Review of Trends in Agricultural Pesticide Use in New Zealand. MAF
Policy Technical Paper 99/11. 54 pp. See http://www.maf.govt.nz/mafnet/rural-nz/sustainableresource-use/resource-management/pesticide-use-trends/PesticideTrends.PDF or
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.62.1461&rep=rep1&type=pdf.
Lentz M, Kennedy P, Jones P, et al. 1998. Review of Environmental Performance Indicators for
Toxic Contaminants in the Environment: Air, water and land. Environmental performance
indicators. Technical Paper No. 37 Toxic. Wellington: Ministry for the Environment. 105+ pp.
http://www.mfe.govt.nz/publications/air/epi-toxic-review-oct98.html
McBride GB. 2005. Using Statistical Methods for Water Quality Management: Issues, options and
solutions. New York: Wiley.
McGuigan KG, Joyce TM, Conroy RM, et al. 1998. Solar disinfection of drinking water contained in
transparent plastic bottles: Characterising the bacterial inactivation process. Journal of Applied
Microbiology 84: 1138–48.
Medema GJ, Shaw S, Waite M, et al. 2003. Chapter 4. Catchment characterisation and source water
quality, pp 111–58. In Assessing Microbial Safety of Drinking Water. OECD, World Health
Organization.
MoE. 2001. Drinking Water treatment 2001. Ontario, Canada: Ministry of the Environment.
Note: The New Zealand Ministry of Health’s Guides for drinking-water supplies can be accessed as
Word documents on the Ministry of Health website: http://www.moh.govt.nz/water then select
publications and Public Health Risk Management Plans.
MoH Public Health Risk Management Plan Guide PHRMP Ref: S2. Development of New Supplies.
Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref: P1.1. Surface Water Abstraction –
Rivers, Streams and Infiltration Galleries. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref: P1.2. Surface Water Abstraction –
Lakes and Reservoirs. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref: P4.4. Pre-treatment Processes –
Waste Liquor Reintroduction. Wellington: Ministry of Health.
MoH Public Health Risk Management Plan Guide PHRMP Ref: P11. Treatment Processes –
Plant Construction and Operation. Wellington: Ministry of Health.
Ogilvie D. 1983. Changes in Water Quality after Impoundment. Paper presented at the XVth Pacific
Science Congress: Changes in Freshwater Ecosystems.
PCE. 2004. Growing for Food. Wellington: Parliamentary Commissioner for the Environment.
Rutherford JC. 1994. River Mixing. New York: Wiley.
Selvarajah N, Maggs GR, Crush JR, et al. 1994. Nitrate in groundwater in the Waikato region. In
LD Currie, P Loganathan (eds) The Efficient Use of Fertilisers in a Changing Environment:
Reconciling productivity with sustainability, pp 160–185. Occasional report no. 7. Palmerston
North: Fertiliser and Lime Research Centre, Massey University.
Sinton L. 1986. A Guide to Groundwater Sampling Techniques. National Water and Soil
Conservation Authority, Water and Soil Miscellaneous Publication No. 99.
Smith DG. 1986. Heavy metals in the New Zealand aquatic environment: a review. Water and Soil
Miscellaneous Publication No. 100. Wellington: Ministry of Works and Development, 108 pp.
Sobsey MD. 1989. Inactivation of health-related micro-organisms in water by disinfection processes.
Water Science and Technology 21: 179–95.
Stanfield G, Le Chevallier M, Snozzi M. 2003. Chapter 5. Treatment efficiency, pp 159–178. In
Assessing Microbial Safety of Drinking Water. OECD: World Health Organization.
Sundaram B, Feitz A, Caritat P de, et al. 2009. Groundwater Sampling and Analysis – A Field
Guide. Commonwealth of Australia, Geoscience Australia, Record 2009/27, 95 pp.
https://www.ga.gov.au/products/servlet/controller?event=GEOCAT_DETAILS&catno=68901
UKWIR. 2000. Guidance Manual Supporting the Water Treatment Recommendations from the
Badenoch Group of Experts on Cryptosporidium (2nd edition). Report No. 00/DW/06/10. London:
UK Water Industry Research Limited.
USEPA. 2002. Filter Backwash Recycling Rule, Technical Guidance Manual. Office of Ground
Water and Drinking Water, EPA 816-R-02-014. 178 pp.
http://www.epa.gov/ogwdw/mdbp/pdf/filterbackwash/fbrr_techguidance.pdf or go to
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/index.cfm
USEPA. 2007. Simultaneous Compliance Guidance Manual for the Long Term 2 and Stage 2 DBP
Rules. EPA 815-R-07-017. 462 pp.
http://www.epa.gov/ogwdw/disinfection/stage2/pdfs/guide_st2_pws_simultaneouscompliance.pdf
Ward RC, Loftis JC, McBride GB. 1990. Design of Water Quality Monitoring Systems. New York,
NY: Van Nostrand Reinhold. 231 pp.
Wilcock RJ. 1989. Patterns of Pesticide Use in New Zealand. Part 1: North Island 1985–1988. Water
Quality Centre Publication No. 15. Hamilton: DSIR.
Wilcock RJ, Close ME. 1990. Patterns of Pesticide Use in New Zealand. Part 2: South Island
1986–1989. Water Quality Centre Publication No. 16. Hamilton: DSIR.
Wilcock RJ. 1993. Application of land-use data and screening tests for evaluating pesticide runoff
toxicity. Environmental Management 17(3): 365–71.
WHO. 2003. Assessing Microbial Safety of Drinking-water: Improving approaches and methods.
Published on behalf of the World Health Organization and the Organisation for Economic
Co-operation and Development by IWA Publishing, London. Available at:
http://www.who.int/water_sanitation_health/dwq/9241546301/en/index.html
WHO. 2004a. Guidelines for Drinking-water Quality 2004 (3rd ed). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html see
also the addenda
WHO. 2004b. Chemical Safety of Drinking-water: assessing priorities for risk management.
Geneva: World Health Organization. See
http://www.who.int/water_sanitation_health/dwq/cmp/en/
WHO. 2004c. Water Treatment and Pathogen Control: Process efficiency in achieving safe
drinking water. Published on behalf of WHO by IWA Publishing. Available at:
http://www.who.int/water_sanitation_health/dwq/9241562552/en/index.html
WHO. 2009. Scaling Up Household Water Treatment Among Low-Income Populations.
WHO/HSE/WSH/09.02. Geneva: World Health Organization.
http://www.who.int/household_water/research/household_water_treatment/en/index.html
WHO. 2011. Guidelines for Drinking-water Quality 2011 (4th ed). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
WHO. 2011a. Evaluating Household Water Treatment Options: Health-based targets and
microbiological performance specifications. 68 pp.
http://www.who.int/water_sanitation_health/publications/2011/evaluating_water_treatment.pdf
Chapter 5: General microbiological quality

5.1	Introduction
This chapter discusses the microbiological quality of drinking-water in general terms.
Microbiological compliance issues are discussed as follows:
•	Chapter 6: Bacterial Compliance
•	Chapter 7: Virological Compliance
•	Chapter 8: Protozoal Compliance
•	Chapter 9: Cyanobacterial Compliance.
Infectious, water-related diseases are a major cause of morbidity and mortality worldwide.
Newly-recognised pathogens and new strains of established pathogens are being discovered that
present important additional challenges to both the water and public health sectors. Between
1972 and 1999, 35 new agents of disease were discovered and many more have re-emerged.
Amongst these are pathogens that may be transmitted by water (WHO 2003b). The first case of
human cryptosporidiosis was reported in 1976, and by 1985 this ‘new’ pathogen was becoming
more widely recognised.
The microbiological quality of drinking-water, and the likelihood that a pathogen (disease-causing organism) will be transmitted through it, depend on numerous factors. Some of these reflect the characteristics of the pathogen itself, including resistance to environmental conditions such as ultraviolet light, desiccation and temperature.
Many of these pathogens are zoonotic. Zoonoses are diseases caused by micro-organisms of
animal origin that also infect humans. Zoonoses are of increasing concern for human health;
next to pathogens with human-to-human transmission, they pose the greatest challenges to
ensuring the safety of drinking-water and ambient water, now and in the future. See WHO
(2004b) – a 528-page document.
The phenomena of ‘emergence’ and ‘re-emergence’ of infectious diseases are well recognised. Up to 75 percent of emerging pathogens may be of zoonotic origin. WHO (2012) states in Chapter 2
that a pathogen or disease-causing agent is considered ‘emerging’ when it makes its appearance
in a new host population or when there is a significant increase in its prevalence in a given
population. A significant number of emerging and re-emerging waterborne pathogens have been
recognised over recent decades; examples include E. coli O157:H7, Campylobacter, and
Cryptosporidium. Public health scientists are increasingly discovering that the recent
emergence or re-emergence of infectious diseases has an origin in environmental change. These
environmental changes encompass social processes such as urbanisation and creation of
transportation infrastructure, as well as ecologic processes such as land and water use,
biodiversity loss, and climate change.
The frequency with which emerging communicable diseases are identified seems to be increasing. This is well recognised as a consequence of:
•	increasing urbanisation, with the movement of humans to major population centres being matched by the movement of vertebrate and invertebrate species into urban areas as well; the increased socialisation of individuals provides new opportunities for pathogen spread
•	the phenomenal increase in international travel, in particular air travel, which has provided opportunities for pathogens to travel between states with relative freedom and with increased speed and volume
•	a succession of events since the end of the 1990s that have challenged epidemiologists, featuring either novel infections (SARS, H1N1, H5N1) or legacy infections transferred to naïve populations (West Nile Virus in North America, Chikungunya in South Asia and Italy).
Along with trends in animal populations and husbandry, the presence of a given pathogen (eg,
Campylobacter) may vary considerably from time to time, and the intensity of shedding may be
influenced by factors with their own underlying trends, such as the seasonality and changes in
farm control and management practices.
Securing the microbial safety of drinking-water supplies is based on the use of multiple barriers,
from catchment to consumer, to prevent the contamination of drinking-water or to reduce
contamination to levels not injurious to health.
Faecally derived pathogens from contamination by human, animal or bird faeces are the principal concerns in setting health-based targets for microbial safety. Microbial water quality often varies rapidly and over a wide range, and short-term peaks in pathogen concentration may increase disease risks considerably, placing greater reliance on treatment processes.
WHO (2012) stated:
•	Although there are a large number of zoonotic pathogens that affect humans, five are known to cause illness around the world with high frequency: Cryptosporidium, Giardia, Campylobacter, Salmonella and E. coli O157. Efforts to control these pathogens are likely to be effective in controlling other related zoonotic pathogens, whether known, as-yet-unrecognised or emergent.
•	Domestic animals such as poultry, cattle, sheep and pigs generate 85 percent of the world’s animal faecal waste, proportionally a far greater amount than the contribution by the human population. The faecal production rate and contribution to the environment of these animals can be as high as 2.62 × 10¹³ kg/year.
•	Limiting zoonotic pathogen-shedding in farm or production facilities for domestic animals should be accomplished by preventing illness in livestock, through minimising exposure to pathogens, by increasing immunity, by manipulation of the animal gastrointestinal tract microbial ecology, and by managing (including treating) animal waste to reduce the release of zoonotic pathogens into the environment.
As a significant exporter of animal-sourced protein, New Zealand is likely to find zoonotic diseases much more relevant than many other countries do. For a more detailed discussion of waterborne diseases reported in New Zealand, see Chapter 1, section 1.1.3.
The words cyst, oocyst and (oo)cyst appear frequently in this chapter and in Chapter 8:
Protozoal Compliance. The definitions in the Drinking-water Standards for New Zealand
(DWSNZ) are:
•	an oocyst is a thick walled structure within which Cryptosporidium zygotes develop and which serves to transfer the organism to new hosts
•	a cyst is the non-motile dormant form of Giardia which serves to transfer the organism to new hosts
•	(oo)cyst is an abbreviation for cyst and oocyst.
WHO (2004d) covers many water treatment processes suitable for pathogen control. WHO
(2005a) published advice for travellers on how to make drinking-water safe. WHO (2009) is a
143-page publication devoted to just Cryptosporidium.
5.2 Micro-organisms in drinking-water
5.2.1	Introduction
WHO (2004) states:
The human health effects caused by waterborne transmission vary in severity from mild
gastroenteritis to severe and sometimes fatal diarrhoea, dysentery, hepatitis and typhoid
fever. Contaminated water can be the source of large outbreaks of disease, including
cholera, dysentery and cryptosporidiosis; for the majority of waterborne pathogens,
however, there are other important sources of infection, such as person-to-person contact
and food.
Most waterborne pathogens are introduced into drinking-water supplies in human or
animal faeces, do not grow in water, and initiate infection in the gastrointestinal tract
following ingestion. However, Legionella, atypical mycobacteria, Burkholderia
pseudomallei and Naegleria fowleri are environmental organisms that can grow in water
and soil. Besides ingestion, other routes of transmission can include inhalation, leading to
infections of the respiratory tract (eg, Legionella, atypical mycobacteria), and contact,
leading to infections at sites as diverse as the skin and brain (eg, Naegleria fowleri,
Burkholderia pseudomallei).
Of all the waterborne pathogens, the helminth Dracunculus medinensis is unique in that it
is the only pathogen that is solely transmitted through drinking-water.
New Zealand has many water supplies, ranging from fully treated large municipal supplies to small untreated supplies serving a community of, say, fewer than 100 people. Microbiological
guidelines seek to ensure that water supplies are free from disease-causing micro-organisms.
The provision of such a supply is of the utmost importance to the health of any community.
The most common and widespread health risk associated with drinking-water is contamination, either directly or indirectly, by human, animal and occasionally bird faeces, and by the micro-organisms they contain. If the contamination is recent and among the
contributors there are carriers of communicable enteric diseases (diseases of the gut), some of
the micro-organisms that cause these diseases may be present in the water. The degree of risk is
related to the level of disease in the human or animal community at that time. Drinking this
water or using it in food preparation may cause new cases of infection. Those at greatest risk of
infection are infants and young children, people whose immune system is depressed, the sick
and the elderly. Risebro et al (2012) found that “Contaminated small water supplies pose a
substantial risk of infectious intestinal disease to young children who live in homes reliant on
these supplies. By contrast older children and adults do not appear to be at increased risk”.
The pathogenic organisms of concern in New Zealand include bacteria, viruses and protozoa.
The diseases they cause vary in severity from mild gastroenteritis, to severe and sometimes fatal
diarrhoea, dysentery, hepatitis, cholera, typhoid fever and campylobacteriosis.
A 15-month fortnightly survey of microbial health risk indicators and pathogens was carried out
at 25 freshwater recreational and water supply sites distributed throughout New Zealand, for
E. coli, Clostridium perfringens spores, F-RNA bacteriophage, somatic coliphage, human
enteroviruses, human adenoviruses, Cryptosporidium oocysts, Giardia cysts, Salmonella and
Campylobacter (MfE 2002 – the Bad Bugs Report). Viruses and Campylobacter were detected
at all six water supply sites. There was very little difference between the drinking-water supply
sites and the remaining site types with respect to the occurrence of pathogens and the
concentrations of indicator organisms. The main issue for source waters is the high proportion
of samples which contained Campylobacter (60 percent) and viruses (54 percent) and the
ability of drinking-water treatment to inactivate or remove them. The widespread presence of
Campylobacter indicates a need for considerable care with respect to small rural supplies,
which have been implicated in campylobacteriosis previously (Eberhardt-Phillips et al 1997, Till
et al 2008).
While the classical waterborne diseases are caused by organisms originating in the gut of
humans or animals, many organisms found in water are not, or at least not regularly, associated
with the gut. Some of these may under certain circumstances cause disease in humans. They
include the protozoan Naegleria fowleri, and a number of bacteria including Aeromonas,
Klebsiella, Legionella spp, and some species of environmental mycobacteria. Refer to the
individual datasheets for further information.
Infection is the main, but not the only, problem associated with micro-organisms in drinking-water. Certain algae can produce toxins that affect humans and which may remain in the water
even when the algae responsible have been removed, see Chapter 9: Cyanobacterial Compliance.
Other ‘nuisance organisms’ can cause problems of taste, odour or colour, as well as deposits and
corrosion, and while they may not cause disease, they are aesthetically unacceptable. The
organisms concerned include iron, manganese, sulphur and nitrifying bacteria, nematodes,
midges, crustaceans, rotifers and mussels; these are discussed in AWWA (2004).
The supply of safe drinking-water involves the use of multiple barriers to prevent the entry and
transmission of pathogens. The effectiveness of these multiple barriers should be monitored by
a programme based on operational characteristics and testing for microbial indicators of faecal
contamination and, in some circumstances, actual pathogens.
5.2.2	Controlling waterborne infection – historical overview
The value of a wholesome water supply has been recognised, at least in some quarters, for many
centuries. Hippocrates described an association between water supplies and disease
(Hippocrates, cited 1938) and Roman engineers went to great lengths to provide waters suitable
in both quantity and quality for major cities.
Over recent centuries, urbanisation and industrialisation have increased the pressure upon
water supplies and the systems of waste disposal. Thus it was that, by the middle of the
nineteenth century, Britain was affected by major epidemics of cholera and endemic typhoid.
John Snow and William Budd provided irrefutable evidence of the role of water in transmission
of these two diseases. Snow’s case rested very simply on a comparison of cholera incidence
among the customers of three London water companies (Snow 1855). One supplied filtered
water; the second moved the source of its supply to a cleaner area of the River Thames, while the
third persisted in supplying polluted River Thames water. Budd appreciated that the sewer was
merely an extension of the diseased gut (Budd 1856) and applied what are now classical
epidemiological concepts to the investigation of water as a vehicle for spreading typhoid.
Guidelines for Drinking-water Quality Management for New Zealand 2013
163
As a result, filtration of river-derived water became legally required in London in 1859, and this
practice gradually spread throughout Europe. By 1917, Sir Alexander Houston could draw
attention to the effectiveness of London’s systems of water treatment and delivery in stopping
the waterborne transmission of typhoid. In America, he pointed out, it was customary to
consider as normal an annual mortality rate from typhoid of 20 or more per 100,000 of
population (the rate in Minneapolis was 58.7). In London, however, the annual mortality from
typhoid was 3.3 per 100,000 (Houston 1917).
Budd’s relatively simple precautions against faecal-oral transmission of typhoid (use of strong
disinfectants in the water-closet bucket) had been remarkably successful (Budd 1856). A century
later, Hornick’s experiments on volunteers helped to explain the success by showing the disease
to be in some instances relatively difficult to catch (Hornick et al 1966). Around 10⁷ Salmonella
serovar Typhi bacteria caused disease in only 50 percent of his volunteer subjects. Kehr and
Butterfield (1943), however, showed that a small minority of the population (about 1.5 percent)
needed to ingest only a single typhoid organism to contract typhoid, and to protect these people
clearly more elaborate precautions are needed.
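The gap between Hornick's median infectious dose and Kehr and Butterfield's highly susceptible minority can be sketched with a single-hit exponential dose-response model. This is an illustrative model chosen here for the calculation; neither study cited used it.

```python
import math

def p_infection(dose: float, r: float) -> float:
    """Single-hit exponential dose-response model: each ingested
    organism independently causes infection with probability r."""
    return 1.0 - math.exp(-r * dose)

# Choose r so that a dose of 10^7 organisms infects 50% of subjects,
# matching Hornick's volunteer result.
r = math.log(2) / 1e7

# Under this homogeneous model, a single organism infects almost no one
# (about 7 in 100 million)...
print(p_infection(1, r))

# ...far below the ~1.5 percent of people Kehr and Butterfield found
# could be infected by a single organism. Population heterogeneity,
# not the average dose-response, is what demands the more elaborate
# precautions.
```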
When the need to protect drinking-water from faecal material was first recognised, the
techniques available for the isolation of such organisms as Salmonella serovar Typhi and Vibrio
cholerae were quite inadequate for practical purposes. Surrogates or indicators were needed,
and the obvious candidates were common microflora from the gut, and so the use of indicator
organisms became established. Testing water for ammonia was commonly used to indicate the
presence of human wastes. An early consensus developed about the use of the coliform
organisms, and in the early decades of the 20th century the work of Alexander Houston (1917)
and Doris Bardsley (1934), among many others, helped to establish the validity of Escherichia
coli (E. coli) as an indicator of faecal contamination.
Kehr and Butterfield (1943) showed the coliform test to be a useful indicator of S. serovar Typhi
and they concluded that the presence of coliforms (as a bacterial group), even in moderate
numbers, indicated a potential danger. They cited an outbreak in Detroit, Michigan, when on
two successive days mean coliform counts in the water supply of only 3 and 10 per 100 mL were
the indicator for an outbreak of waterborne typhoid. They also noted the very much higher risk
of gastroenteritis associated with this low coliform count. For the eight cases of typhoid
recorded in this outbreak, there were 45,000 cases of gastroenteritis.
Endemic and epidemic cholera and typhoid both still occur, transmitted through contaminated
drinking-water, as demonstrated in Pristina (Yugoslav Typhoid Commission 1964), in South
Africa (Kustner et al) and the cholera outbreak in Peru (Anderson 1991). The latest number of
waterborne cholera cases from the World Health Organization is for the year 2000: 137,071
cases and almost 5000 deaths. It should be noted that this figure does not include any cases
from Bangladesh or Pakistan where cholera is endemic.
Fortunately, in New Zealand indigenous typhoid and cholera are now rare. Most cases are
visitors from overseas or travellers returning to New Zealand. Waterborne disease,
however, remains a constant threat to public health. Twenty cases of typhoid were
notified in New Zealand in 2003, and one case of cholera. Typhoid carriers, and those people
contracting the illness, have the potential to distribute large numbers of pathogens throughout
the country where drinking-water protection and treatment systems are not operational. In
addition there is the environmental risk from pathogens such as Campylobacter, Salmonella,
Cryptosporidium, Hepatitis A and enterohaemorrhagic E. coli (EHEC). During 2003 nearly
15,000 cases of campylobacteriosis, 1400 of salmonellosis, over 800 of cryptosporidiosis,
70 cases of Hepatitis A and 105 EHEC infections were reported in New Zealand (NZPHSR
2003).
5.2.3 Maximum acceptable value (MAV)
The use of a Maximum Acceptable Value (MAV) for E. coli in drinking-water requires an
understanding of the use of microbiological indicator organisms as indicators of the potential
risk of pathogens being present. The MAV of a chemical determinand in drinking-water
represents the concentration that, on the basis of present knowledge, is not considered to cause
any significant risk to the health of the consumer over a lifetime of consumption of the water;
the use of MAVs for microbiological determinands is somewhat different.
The microbiological determinand E. coli is an indicator of recent faecal contamination. The
quantification of E. coli is related to the absence or non-detectability of that micro-organism in a
given volume of water. Such a value, when considered with the method of analysis and
frequency of sampling for a given population, gives a probability that there is no significant risk
of infection from micro-organisms of known health significance at the time of sampling. The
presence of E. coli provides evidence of recent faecal contamination, and detection should lead
to consideration of further action such as further sampling and investigation of inadequate
treatment or breaches in distribution system integrity.
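The link between non-detectability in a given volume and the probability statement above can be sketched with a simple Poisson model of sampling. The assumption that organisms are randomly dispersed in the water is an illustrative one, not a requirement of the DWSNZ.

```python
import math

def detection_probability(conc_per_100ml: float, volume_ml: float = 100.0) -> float:
    """Probability that a sample of the given volume contains at least
    one organism, assuming organisms are Poisson-distributed in the water."""
    expected_count = conc_per_100ml * volume_ml / 100.0
    return 1.0 - math.exp(-expected_count)

# At a true concentration of 1 E. coli per 100 mL, a single 100 mL sample
# detects the contamination only about 63% of the time; it is repeated
# sampling at an appropriate frequency for the population served that
# gives confidence in the supply.
print(round(detection_probability(1.0), 2))  # → 0.63
```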
A MAV is given in the DWSNZ for E. coli as an indicator of the potential presence of pathogenic
enteric bacteria, enteric viruses, and pathogenic protozoa. However, E. coli is not always a good
indicator for viruses or protozoa.
A maximum indicator value (MIV) would be a more appropriate term than a MAV for bacteria,
because E. coli is not monitored for health reasons in itself; it is monitored as an indicator of
faecal contamination, and therefore of the potential presence of pathogenic micro-organisms.
However, for consistency with general (and historical) usage, the term MAV is used throughout
the DWSNZ.
Historically and internationally, the guideline value, maximum contaminant level or MAV etc
seems always to have been ‘less than 1 per 100 mL’, with the unit or test organism changing
from B. coli, to total or presumptive coliforms, to E. coli. Over the years, improved growth media
and incubation conditions have enhanced selectivity, and quality assurance procedures have
reduced the number of false positives and false negatives. But it’s always been ‘less than 1 per
100 mL’. The pattern was probably established with the original test methods over 100 years
ago. It can’t have had anything to do with infective doses, because only indicator organisms have
been tested for, and the ratio of indicator organisms to pathogens would vary wildly. Retaining
the ‘less than 1 per 100 mL’ for compliance testing has probably been more to do with
pragmatism than science; water with ‘less than 1 per 100 mL’ seems not to have caused many
illnesses over the years, ie, it seems to work! Water suppliers interested in more than just
compliance testing are referred to Chapter 6, section 6.3.3.
5.3 Microbial indicators
5.3.1 Introduction
The detection of specific pathogens, including bacteria, viruses, protozoa and parasites, is usually
complex, expensive, time-consuming, and currently often not practically possible. It may take
weeks to determine whether a sample actually contains a particular pathogen. Furthermore,
methods for parasitic cysts or oocysts (eg, Giardia intestinalis,¹² Cryptosporidium hominis¹³
and C. parvum) have recovery efficiencies of typically less than 50 percent, and can be quite
variable.
Therefore, in monitoring microbiological quality, reliance is placed on relatively quick and
simple tests for the presence of indicator organisms. At present this usually involves culturing
the organisms on or in an appropriate growth medium. Selective media, which prevent or retard
the growth of organisms other than those being targeted, are usually chosen. There has been
debate (Sinton 2006) about whether culture techniques detect all the organisms: are those that
do not respond ‘viable but non-culturable’, or are they simply dead or injured beyond repair
(ie, no longer pathogenic or infective)?
In addition to the indicator organisms specifically referred to in the DWSNZ, this section
discusses heterotrophic plate counts (colony counts) that may be used to assess the general
bacterial content of drinking water, and it considers phages. Chapter 7 discusses viruses.
Microbial indicators are micro-organisms that, while not themselves pathogenic, indicate
potential issues of microbiological water quality. The drinking-water industry commonly uses
the following indicator organisms:
• heterotrophic plate count (standard plate count, mesophilic plate count, aerobic plate count)
• total coliforms
• faecal coliforms (thermotolerant coliforms)
• Escherichia coli (E. coli).
An effective indicator organism for detecting faecal contamination of water should:
• always be present when faecal pathogens are present
• be present in faeces in large numbers so that the organisms can still be detected after considerable dilution
• be relatively easy and quick to detect
• survive in water at least as long as waterborne pathogens of faecal origin
• be as sensitive as pathogens to disinfection.
Ideally, tests used to measure the numbers of indicator organisms in a sample must be specific
to that organism, and they should encourage a high proportion of those present in the sample to
grow. It has long been recognised that artificial culture media lead to only a very small fraction
(0.01–1 percent) of the viable bacteria present being detected. Since MacConkey’s development
of selective media for E. coli and coliforms at the beginning of the twentieth century, various
workers have shown these selective agents inhibit environmentally or oxidatively stressed
coliforms (WHO 2001, Chapter 13).
¹² Also known as G. lamblia and G. duodenalis.
¹³ Previously called Cryptosporidium parvum (genotype 1).
No single indicator fulfils all these considerations, nor is any suitable for all cases. All indicators
have disadvantages that must be considered when interpreting test results, and expertise is thus
mandatory in this area. Multiple indicator systems may be needed in certain circumstances.
Nevertheless, if the indicators are satisfactory and monitoring is carried out appropriately, it
should be possible to dispense with the use of complex tests for the specific pathogenic
micro-organisms in all but a few cases. A vast amount of experience has accumulated from the
use and interpretation of tests for indicator micro-organisms, and considerable confidence can
be placed in the results of these tests.
The most important point is that the presence of indicators of faecal contamination implies an
increased risk of disease. For disease to occur, however, the indicators must be accompanied by
pathogenic micro-organisms. The chances of this occurring are determined by the prevalence of
the pathogens in the potential sources (people or animals) and in the catchment from which the
water is drawn.
The occasional failure of indicators to predict disease underlines the prime importance of risk
assessments and maintaining effective multiple barriers from catchment to tap to prevent faecal
material from entering the water supply. Tests for the microbiological quality of water can only
indicate breaches of the integrity of those barriers.
A history of the development of indicator organisms appears in Chapter 13 of WHO (2001).
5.3.2 Bacterial indicators
The DWSNZ use Escherichia coli (E. coli) as the bacterial indicator, with a maximum acceptable
value (MAV) of less than 1 per 100 mL. This is unchanged from earlier editions. Faecal coliforms
(thermotolerant or presumptive coliforms) and total coliforms can be monitored instead of
E. coli, but with the proviso that a positive result for either should be treated as a positive E. coli
result. Given that one can obtain either faecal coliforms or total coliforms in the absence of
E. coli, this option is generally more demanding.
The bacterial compliance criteria in the DWSNZ have made use of the observation that E. coli is
rarely found in drinking-water if the free available chlorine content is at least 0.2 mg/L.
E. coli comes from the family of bacteria known as Enterobacteriaceae and is the most common
bacterium of this group. It is characterised by the possession of the enzymes β-galactosidase and
β-glucuronidase. E. coli is nearly always present in the gut of humans and animals and usually
in high numbers, and it is found in fresh faecal material at densities of more than 10⁹ organisms
per gram. It can survive for considerable periods in water, which is generally similar to some of
the waterborne faecal pathogens.
A few strains of E. coli may be pathogenic in the gut. However, this is irrelevant to the use of
E. coli as an indicator organism. Both pathogenic and non-pathogenic strains of E. coli are
equally important as indicators of faecal contamination, as are animal and human sources.
The arguments for using E. coli are compelling:
• it is a strict indicator of faecal contamination, whereas the faecal coliforms and total coliforms are not
• it is an organism, whereas the other two are groups of bacteria
• it is most usually present when pathogens are present (eg, as found in the New Zealand Freshwater Microbiological Research Programme in fortnightly sampling at five drinking-water abstraction sites over a 15-month period, McBride et al 2002)
• it is routinely associated with health risk effects in water ingestion studies (eg, Dufour 1984)
• it is now amenable to rapid and accurate enumeration, eg, using the Colilert™ MPN system (and acceptable equivalents). Colilert detects both total coliforms and E. coli. However, not too much importance should be placed on its total coliform results; they are essentially just a step on the way to the important result, ie, E. coli.
The absence of E. coli does not necessarily guarantee the absence of faecal contamination
(particularly where multiple barriers are absent as, for example, when reliance is placed on
disinfection alone): absence of evidence does not logically denote evidence of absence. Although
the presence of E. coli is a definite indication of pollution, its absence suggests that pathogenic
bacteria and viruses are probably absent also.
There are some indications that E. coli may grow in favourable environmental conditions,
especially in warm climates (Fujioka et al 1999). E. coli growth has been reported in food
(including E. coli O157:H7) (Doyle 1997), tropical water (Bermúdez and Hazen 1988),
subtropical waters and soil (Hardina and Fujioka 1991), water in animal drinking troughs
(Lejeune et al 2001) and in temperate waters and sediments in water reservoirs near Sydney
(N. Ashbolt, University of New South Wales, personal communication). However, the New
Zealand Freshwater Microbiological Programme did not find evidence of such growth occurring
(McBride et al 2002).
To date, bacteria, including E. coli, have been defined by their biochemical reactions in the
laboratory, rather than being identified by something more specific like DNA. Therefore there
will always be debates about which test methods are the most appropriate, and which produce
false positives or negatives. A recent study illustrates this (DWI 2010). Debate will continue
about incubation temperatures, chlorine stress, and whether lactose fermentation or
galactosidase and glucuronidase reaction is more appropriate.
Some enteric pathogens may occur even when few, if any, E. coli are present. For example,
organisms such as Giardia cysts or oocysts of Cryptosporidium, and some viruses, are relatively
resistant to chlorine disinfection in comparison with the indicators that are generally used. They
may therefore survive a disinfection process that kills the indicator organisms. Likewise, UV
disinfection is not particularly effective against some types of virus.
Clostridium perfringens spores are highly resistant in the environment, and vegetative cells
appear not to reproduce in aquatic sediments (regrowth there can be a problem with traditional
indicator bacteria). It is one of the most resistant micro-organisms in water, with a half-life (time for a
50 percent reduction in concentration) of 60 to >300 days (WHO 2003c). Like protozoa and
some viruses, Clostridium perfringens is more resistant to some disinfection processes (WHO
2001, Chapter 13). Finding Clostridia in water leaving the treatment plant generally indicates
that there is a fault in the chemical or physical treatment that requires investigation and
appropriate remedial action. Clostridium perfringens can be used to detect faecal
contamination of groundwater after the more traditional indicator organisms such as E. coli
have died.
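What a half-life of 60 to more than 300 days implies for persistence can be illustrated by assuming simple first-order die-off. This is a modelling assumption for the sake of the arithmetic; field decay rates vary.

```python
def surviving_fraction(days: float, half_life_days: float) -> float:
    """Fraction of an initial spore population remaining after `days`,
    assuming first-order die-off with the given half-life."""
    return 0.5 ** (days / half_life_days)

# Even at the short end of the reported range (60-day half-life), about
# 1.5% of Clostridium perfringens spores persist after a full year; at a
# 300-day half-life, roughly 43% remain. This persistence is why the
# spores can flag faecal contamination long after E. coli has died off.
print(round(surviving_fraction(365, 60), 3))   # → 0.015
print(round(surviving_fraction(365, 300), 2))  # → 0.43
```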
5.3.3 Pathogenic protozoal indicators
Giardia and Cryptosporidium are two protozoal pathogens that have been implicated in a
number of outbreaks and sporadic disease patterns in New Zealand (as elaborated in Chapter 1:
Introduction, section 1). Giardia spp. and Cryptosporidium spp. are widespread in many New
Zealand water sources; they are endemic in livestock, domestic and feral animals. Therefore
surface waters, including shallow (particularly unconfined) groundwater, must be considered to
be potentially contaminated.
For treated waters, the MAV in the DWSNZ is for infectious pathogenic protozoa. Although new
methods of assessing the infectiousness of protozoa by using human cell cultures have been
developed, they are not yet suitable for routine monitoring of drinking-water. Therefore the
MAV is effectively for total protozoa.
The analytical procedure to be used is based on method 1623 (USEPA 2003). This measures both
Giardia cysts and Cryptosporidium oocysts, without identifying species. Until another method is
developed, it is accepted that this method can be used to indicate total protozoal pathogens. There
is very limited information about the removal and/or inactivation of emerging parasitic protozoa
or opportunistically pathogenic protozoa during water treatment (see section 5.4.5); datasheets
have been prepared for some. In the absence of information, the fate of these protozoal pathogens
is considered similar to that of Giardia and Cryptosporidium during water treatment.
To control pathogenic protozoa, the DWSNZ require that water be treated to ensure their removal
or inactivation, or that secure bore water is used. The level of treatment required for surface
waters and non-secure bore water is determined from the concentration of Cryptosporidium in
the source water, see section 5.2.1 of the DWSNZ, and Chapter 8: Protozoa Compliance,
section 8.2 in the Guidelines. The premise is that Cryptosporidium is known to be very resistant
to treatment processes, and is smaller than Giardia, so is used as an indicator for all pathogenic
protozoa. Thus the level of treatment selected to remove Cryptosporidium should also provide a
level of protection from other less resistant pathogenic protozoa, including Giardia.
When sewage is the source of these pathogens, the anaerobic spore-forming bacterium
Clostridium perfringens appears to be a suitable index for enteric viruses and parasitic protozoa.
Spores of C. perfringens are largely of faecal origin, and are always present in sewage (about
10⁴–10⁵ cfu per 100 mL). Clostridium perfringens is fairly resistant to lower doses of chlorine, so
it has been suggested as an alternative indicator organism for protozoa; spores of Clostridium
perfringens showed the strongest correlation (r = 0.76) with Cryptosporidium in a study on the
River Meuse, a stronger correlation than thermotolerant coliforms or turbidity (WHO 2003c).
Methods for the detection of Giardia and Cryptosporidium in water have advanced
considerably in the last few years. Detecting these protozoa involves the filtration of large
volumes of water as the (oo)cysts are usually present in very low numbers. Methods have been
developed using filtration and immuno-based techniques with monoclonal antibodies for
separation (immunomagnetic separation, IMS) and detection (immunofluorescence assay, IFA)
to determine concentrations of (oo)cysts with confirmation through vital dye staining (DAPI)
and differential interference contrast (DIC) microscopy. However, the recovery success of this
process can be variable, monoclonals may vary in their avidity and specificity to (oo)cysts or
cross-react with other animal species, and the methods are costly. Routine monitoring for
Cryptosporidium and Giardia in treated water is therefore not recommended in the DWSNZ as
the methods do not reliably identify strains that are infective to humans, nor determine if those
detected are infective (Quintero-Betancourt et al 2002). Molecular based methods and tissue
cell culture assays show promise in detecting low level contamination in environmental waters,
differentiating human pathogenic species from those that are not pathogenic and assessing
infectivity but they are still being evaluated.
Instead of routine monitoring of Giardia and Cryptosporidium in treated drinking-waters, the
DWSNZ require that water treatment performance is monitored using a variety of operational
criteria as a substitute for protozoa testing, in order to demonstrate compliance with the Giardia
and Cryptosporidium standard for total pathogenic protozoa. See Chapter 8.
5.3.4 Pathogenic viral indicators
It has been suggested that a significant amount of viral disease in communities may be the result
of low-level viral contamination of water. If this is the case, then viral indicators need to be
sought. Epidemiological evidence is not clear on this point, though there is often a suggestion
that recognised outbreaks of waterborne viral disease are generally associated with the presence
of bacterial indicators in water.
Viral studies usually use polio viruses (mostly derived from live oral vaccines) as indicators
because of their continual seeding into the aquatic environment and their relative resistance to
accepted levels of disinfection. However, they may not be adequate indicators for all viral
diseases likely to be associated with contaminated water.
The understanding of the fate and behaviour of viruses in drinking-water systems is not yet
sufficiently advanced to enable an explicit standard to be made. Refer to Chapter 7 for further
information about viruses. Sections 5.3.7 and 5.4.4 also mention viruses/coliphages.
The bacteriophages (viruses that infect bacteria) of E. coli have been proposed as indicators of
the survival of viral pathogens. Phages are excreted by a certain percentage of humans and
animals all the time whereas viruses are excreted only by infected individuals for a short period
of time. The excretion of viruses depends heavily on variables such as the epidemiology of
various viruses, outbreaks of viral infections and vaccination against viral infections.
Consequently there is no direct correlation between numbers of phages and viruses excreted by
humans. Enteric viruses have been detected in water environments in the absence of coliphages
(WHO 2001, Chapter 13).
Human enteric viruses associated with waterborne diseases are excreted almost exclusively by
humans. Phages used as models/surrogates in water quality assessment are excreted by humans
and animals. In fact, the faeces of animals such as cows and pigs generally contain higher
densities of coliphages than that of humans, and the percentage of many animals that excrete
phages tends to be higher than for humans. Differences between phages and enteric viruses are
also reflected by differences in the efficiency of adsorption-elution techniques for their recovery
(from Chapter 13, WHO 2001).
International collaboration is now leading to meaningful, universally accepted guidelines for the
recovery and detection of phages in water environments (such as those produced by the
International Organisation for Standardisation).
5.3.5 Secondary bacterial indicators
Faecal streptococci are a group of Gram-positive cocci belonging to two genera, Enterococcus
and Streptococcus. The relevant species are linked by common biochemical and antigenic
properties and are found in the faeces of humans and other animals. Many will grow in
6.5 percent sodium chloride solutions and at 45°C. WHO (2001) defines these (and discusses
them further) in Chapter 13 as:
Faecal streptococci (FS) are Gram-positive, catalase-negative cocci from selective media
(eg, azide dextrose broth or m-Enterococcus agar) that grow on bile aesculin agar and at
45°C, belonging to the genera Enterococcus and Streptococcus possessing the Lancefield
group D antigen.

Enterococci include all faecal streptococci that grow at pH 9.6, 10°C and 45°C and in
6.5 percent NaCl. Nearly all are members of the genus Enterococcus, and also fulfil the
following criteria: resistance to 60°C for 30 minutes and ability to reduce 0.1 percent
methylene blue. The enterococci are a subset of faecal streptococci that grow under the
conditions outlined above. Alternatively, enterococci can be identified directly as
micro-organisms capable of aerobic growth at 44±0.5°C and of hydrolysing
4-methylumbelliferyl-β-D-glucoside (MUD, detecting β-glucosidase activity by blue
fluorescence at 366 nm), in the presence of thallium acetate, nalidixic acid and
2,3,5-triphenyltetrazolium chloride (TTC, which is reduced to the red formazan) in the
specified medium (ISO/FDIS 7899-1 1998).
The enterococci test is increasingly replacing faecal streptococci as an indicator, as enterococci
are clearly of faecal origin from warm-blooded animals (OECD/WHO 2003). It is often used in
place of E. coli when monitoring the quality of seawater, including in New Zealand. In Europe
small water supplies are governed by the Council Directive 98/83/EC of 3 November 1998 on
the quality of water intended for human consumption. In England that legislation was
incorporated into The Private Water Supplies Regulations 2009, which requires both
Enterococci and Escherichia coli to be absent in 100 mL.
Enterococcus and Streptococcus occur regularly in faeces but not in such numbers, or so
invariably, as E. coli. Certain species of Enterococcus can be found free-living in soil and thus
their presence in water may be from a non-faecal source (Leclerc et al 1996; Manero and Blanch
1999). Thus while the specificity of this indicator is acceptable, it is less sensitive than E. coli. Its
persistence in water is less than that of E. coli, and it is generally a poorer indicator of the
presence of certain pathogens that die off slowly (eg, viruses).
Until bifidobacteria were suggested as faecal indicators, Clostridium perfringens was the only
obligately anaerobic, enteric micro-organism seriously considered as a possible indicator of the
sanitary quality of water. Clostridium perfringens is a spore-former and may be highly
persistent in the aquatic environment, and it can be found frequently in environmental material,
eg, in soil. So as an indicator, its application is limited to specific circumstances, and the
interpretation of its significance is often difficult.
Despite the first isolation of bifidobacteria in the late 1800s and their very high numbers in
human faeces (11 percent of culturable bacteria), their oxygen sensitivity (as with most other
strict anaerobes) has limited their role as useful faecal indicators in waters (WHO 2001).
An alternative to measuring E. coli, the H₂S test, has been suggested by the WHO (2002). This
test uses less expensive equipment and requires less operator skill, so will be particularly
attractive in poorer countries. Some versions use ambient temperatures for incubation, so it
could be a useful test in remote areas or during emergencies, eg, when there is no electricity.
While the H₂S-producing organisms may not all be coliforms, they are organisms typically
associated with the intestinal tracts of warm-blooded animals.
5.3.6 Indicators of general quality
The heterotrophic plate count (HPC) method uses a standard culture technique to grow a wide
range of aerobic mesophilic bacteria on a non-selective agar medium. The bacteria that grow in
these conditions are almost always present in drinking-water and are therefore an indicator of
overall cleanliness of the water supply system.
The WHO (2003a) published Heterotrophic Plate Counts and Drinking-water Safety: The
significance of HPCs for water quality and human health. A quote from Chapter 1 follows:
HPC testing has a long history of use in water microbiology. At the end of the 19th century,
HPC tests were employed as indicators of the proper functioning of processes (and of sand
filtration in particular) and thereby as indirect indicators of water safety. Use as a safety
indicator declined with the adoption of specific faecal indicator bacteria during the 20th
century. HPC measurements nevertheless continue to figure in water regulations or
guidelines in many countries. HPC measurements are used:
• to indicate the effectiveness of water treatment processes, thus as an indirect indication of pathogen removal
• as a measure of numbers of regrowth organisms that may or may not have sanitary significance
• as a measure of possible interference with coliform measurements in lactose-based culture methods. This application is of declining value, as lactose-based culture media are being replaced by alternative methods that are lactose-free.
Elevated HPC levels occur especially in stagnant parts of piped distribution systems, in
domestic plumbing, in bottled water and in plumbed-in devices, such as softeners, carbon
filters and vending machines. The principal determinants of regrowth are temperature,
availability of nutrients and lack of residual disinfectant. Nutrients may derive from the
water body and/or materials in contact with the water.
Piped water systems of large buildings may incur greater growth than encountered
elsewhere (because of storage tanks, extensive internal distribution networks and
temperature-related growth). The principal health concerns in these networks are
cross-connections and growth of Legionella bacteria, which are not detected by the HPC
test procedures.
Colony counts (heterotrophic plate counts) can be a useful indicator to monitor operational
performance. They represent bacteria that have entered the water supply or that have survived
the treatment processes and are able to grow and produce viable colonies on the growth medium
used for the tests, under specified conditions (eg, incubation time, temperature). Not all bacteria
in water will, however, grow under these test conditions. It is usually not the absolute
concentration of HPC but a change in HPC concentration that is useful to the water industry.
Colony counts are usually determined after incubation at 20–22°C or at 35–37°C. Plate counts
of bacteria able to grow at 20–22°C or at 35–37°C in a standard nutrient medium (heterotrophic
counts) may be relevant to the nutrient status of the water but not the faecal pollution. In
general, the practice in New Zealand is to use 22°C and 35°C.
The count at 22°C will favour many environmental organisms. It has little sanitary value but is
useful in assessing the efficiency of water treatment, specifically the processes of coagulation,
filtration and disinfection, each of which reduces bacterial numbers. It may be used to assess the
cleanliness and integrity of the distribution system and the suitability of water for
manufacturing food and drink where a high count may lead to spoilage.
172
Guidelines for Drinking-water Quality Management for New Zealand 2013
The count at 35°C will include some environmental organisms and also some from faeces. A
significant increase above normal in this count may be an early sign of contamination. For this
reason, in many cases, the only heterotrophic plate count performed is that at 35°C.
Colony counts should only be used as an adjunct to routine monitoring for E. coli. When a large
number of organisms is detected, some form of remedial action is recommended, such as
cleaning of storage tanks or inspection and repair or disinfection of the reticulation system. It
may be useful to identify the dominant organisms present, particularly where there is persistent
bacterial growth in a reticulation system.
These counts are a useful measure of the general quality of a water supply and to some extent of
the standard of treatment or the microbial condition of the distribution system. The numbers
should fall substantially during treatment processes. Generally, well-maintained water supplies
should have little difficulty in obtaining samples with colony counts as follows (using the pour-plate technique with standard plate count agar at 35°C for 48 hours):
•	disinfected supply: < 100 colony-forming units per mL
•	undisinfected supply: < 500 colony-forming units per mL.
New Zealand experience indicates that in a well-run large municipal supply the following counts
can be readily attained:
•	disinfected supply: < 20 colony-forming units per mL
•	undisinfected supply: < 200 colony-forming units per mL.
It is not uncommon to find >1000 colony-forming units per mL in good quality drinking-water
when incubating at 22°C for seven days.
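As a minimal sketch of how such indicative colony counts might be used for screening results, the following uses the cfu/mL figures quoted above (pour plate, 35°C, 48 hours); the function name and the "ok"/"investigate" labels are illustrative only, not part of the Guidelines:

```python
# Indicative colony counts (cfu/mL) quoted in the text above.
GENERAL_LIMITS = {"disinfected": 100, "undisinfected": 500}
LARGE_MUNICIPAL_LIMITS = {"disinfected": 20, "undisinfected": 200}

def hpc_flag(count_cfu_per_ml, supply_type, large_municipal=False):
    """Screen one HPC result against the indicative limits quoted above."""
    limits = LARGE_MUNICIPAL_LIMITS if large_municipal else GENERAL_LIMITS
    return "ok" if count_cfu_per_ml < limits[supply_type] else "investigate"

print(hpc_flag(80, "disinfected"))        # ok: below 100 cfu/mL
print(hpc_flag(150, "disinfected"))       # investigate
print(hpc_flag(150, "undisinfected"))     # ok: below 500 cfu/mL
print(hpc_flag(30, "disinfected", True))  # investigate: well-run large supply should be < 20
```

Consistent with the text, it is the change in counts over time, rather than a single absolute value, that usually matters most in practice.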
5.3.7	Indicators of effectiveness of treatment
The effectiveness of treatment of raw water can be measured by following the progressive
lowering of counts of coliforms, heterotrophic plate counts or E. coli following successive stages
of treatment throughout the plant, leading, in the final stages, to their complete removal. Any
viable bacterium detected after appropriate exposure to disinfectants provides a clear warning of
a failure of treatment and therefore of a potential hazard to consumers.
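The progressive lowering of counts through successive stages can be expressed as a cumulative log10 removal between sampling points. The following sketch shows the arithmetic; the stage names and counts are hypothetical, chosen only to illustrate the calculation:

```python
import math

def log_removal(count_in, count_out):
    """Log10 removal of an indicator between two sampling points."""
    return math.log10(count_in / count_out)

# Hypothetical E. coli counts per 100 mL at successive sampling points.
stages = [("raw water", 5000.0),
          ("post-coagulation/filtration", 50.0),
          ("post-disinfection", 0.05)]

for (name_in, c_in), (name_out, c_out) in zip(stages, stages[1:]):
    print(f"{name_in} -> {name_out}: {log_removal(c_in, c_out):.1f} log removal")

print(f"overall: {log_removal(stages[0][1], stages[-1][1]):.1f} log removal")
```

With these illustrative numbers the plant achieves 2.0 logs across filtration and 3.0 logs across disinfection, 5.0 logs overall.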
When monitoring E. coli, most water suppliers test 100 mL samples, mainly because the MAV is
expressed as <1 per 100 mL. It is tempting to think that a zero result means ‘absence of’ E. coli.
Very large volumes of clean drinking-water can be tested using membrane filtration techniques.
Results of say 1 per 10 L may be possible, ie, 0.01 E. coli per 100 mL. Testing large volumes can
be very useful when investigating treatment or distribution problems, particularly spasmodic
problems, see section 6.3.3 in Chapter 6: Bacterial Compliance.
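The conversion between a large-volume membrane filtration result and the usual per-100 mL basis is simple arithmetic; a sketch follows (the function name is ours):

```python
def per_100ml(organisms_detected, volume_litres):
    """Express a count from `volume_litres` of sample as organisms per 100 mL."""
    volume_in_100ml_units = volume_litres * 10.0  # 1 L = 10 x 100 mL
    return organisms_detected / volume_in_100ml_units

print(per_100ml(1, 10))   # 0.01 per 100 mL, the 1-per-10-L example in the text
print(per_100ml(0, 100))  # 0.0 detected, ie, <1 per 1000 x 100 mL tested
```

As the text notes, a zero in 100 mL only demonstrates <1 per 100 mL; testing larger volumes tightens that bound without ever proving absence.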
Other indicator systems, such as faecal streptococci and, rarely, Clostridium perfringens spores, may also be used; because these organisms are particularly resistant to disinfection, they tend to indicate the efficacy of filtration processes.
While coliphages are common in sewage, somatic coliphage and F-specific RNA coliphages are
found in low numbers in faeces so their presence in water is primarily as an index of sewage
pollution rather than faecal contamination in drinking water (IAWPRC 1991). Nonetheless, a
variety of coliphages (eg, F-RNA coliphage, MS2) have shown potential as model organisms for
monitoring virus removal in drinking water treatment plants (Jofre et al 1995).
5.4 Waterborne pathogens
5.4.1	Testing for specific pathogens
Table 5.1 is a summary of the major waterborne pathogens and their significance in water supplies.
Table 5.1: Waterborne pathogens and their significance in water supplies

Pathogen | Health significance | Persistence in water supplies (a) | Resistance to chlorine (b) | Relative infectivity (c) | Important animal source

Bacteria
Burkholderia pseudomallei | Low | May multiply | Low | Low | No
Campylobacter jejuni, C. coli | High | Moderate | Low | Moderate | Yes
Escherichia coli – Pathogenic (d) | High | Moderate | Low | Low | Yes
E. coli – Enterohaemorrhagic | High | Moderate | Low | High | Yes
Legionella spp. | High | Multiply | Low | Moderate | No
Nontuberculous mycobacteria | Low | Multiply | High | Low | No
Pseudomonas aeruginosa (e) | Moderate | May multiply | Moderate | Low | No
Salmonella typhi | High | Moderate | Low | Low | No
Other salmonellae | High | May multiply | Low | Low | Yes
Shigella spp. | High | Short | Low | Moderate | No
Vibrio cholerae | High | Short | Low | Low | No
Yersinia enterocolitica | High | Long | Low | Low | Yes

Viruses
Adenoviruses | High | Long | Moderate | High | No
Enteroviruses | High | Long | Moderate | High | No
Hepatitis A | High | Long | Moderate | High | No
Hepatitis E | High | Long | Moderate | High | Potentially
Noroviruses and Sapoviruses | High | Long | Moderate | High | Potentially
Rotavirus | High | Long | Moderate | High | No

Protozoa
Acanthamoeba spp. | High | Long | High | High | No
Cryptosporidium parvum | High | Long | High | High | Yes
Cyclospora cayetanensis | High | Long | High | High | No
Entamoeba histolytica | High | Moderate | High | High | No
Giardia intestinalis | High | Moderate | High | High | Yes
Naegleria fowleri | High | May multiply (f) | High | High | No
Toxoplasma gondii | High | Long | High | High | Yes

Helminths
Dracunculus medinensis | High | Moderate | Moderate | High | No
Schistosoma spp. | High | Short | Moderate | High | Yes

Source: Ex WHO 2004a.
Note: Waterborne transmission of the pathogens listed has been confirmed by epidemiological studies and case
histories. Part of the demonstration of pathogenicity involves reproducing the disease in suitable hosts. Experimental
studies in which volunteers are exposed to known numbers of pathogens provide relative information. As most
studies are done with healthy adult volunteers, such data are applicable to only a part of the exposed population, and
extrapolation to more sensitive groups is an issue that remains to be studied in more detail.
a) Detection period for infective stage in water at 20°C: short, up to one week; moderate, one week to one month;
long, over one month.
b) When the infective stage is suspended freely in water treated at conventional doses and contact times.
Resistance moderate, agent may not be completely destroyed.
c) From experiments with human volunteers or from epidemiological evidence.
d) Includes enteropathogenic, enterotoxigenic and enteroinvasive.
e) Main infection route is by skin contact, but can infect immunosuppressed or cancer patients orally.
f) In warm water.
Tests for the presence of specific pathogenic organisms such as Salmonella, Campylobacter or
Cryptosporidium are appropriate for special investigations and in the face of evidence of
outbreaks of waterborne disease. These tests are not recommended for routine monitoring of
water supplies, due to the cost, complexity of testing, and perhaps the interpretation of results.
Under special circumstances they could become Priority 2 determinands.
Promising new techniques based on amplifying and identifying the specific gene or genetic
fragments, for example, polymerase chain reaction (PCR), are revolutionising the monitoring
and investigation of drinking-water supplies. They show particular potential for detection of
pathogens. However, the technology has not yet been developed sufficiently for it to replace
traditional methods and the costs in general are very high.
Monoclonal antibody techniques probably show the most promising applications, but they too require further development and lower costs before the large-scale routine testing required for monitoring water supplies can be carried out.
Waterborne pathogens are discussed in AWWA (1999).
WHO (2011a) includes useful information (Annex 2: Potential biological and chemical hazards
in building water supplies) about the incubation period, clinical symptoms and source of
exposure for many bacteria and viruses.
5.4.2	Bacterial pathogens from faecal contamination
The human bacterial pathogens that can be transmitted orally by drinking-water and which
present a serious risk of disease include Salmonella spp, Shigella spp, enteropathogenic
Escherichia coli, Vibrio cholerae, Yersinia enterocolitica, Campylobacter jejuni, and
Campylobacter coli.
While typical waterborne pathogens are able to persist in drinking-water, most do not grow or
proliferate in water. Micro-organisms (eg, E. coli and Campylobacter) can accumulate in
sediments and be mobilised with increased water flow or water flow fluctuations.
After being excreted in faeces from the body of their host, bacterial pathogens gradually lose
viability and the ability to infect. The rate of decay varies with different bacteria. It is usually
exponential and after a certain period a pathogen will become undetectable. The most common
waterborne pathogens are those that are highly infectious or highly resistant to decay outside
the body. Pathogens with a low persistence, ie, those that do not survive long outside the host,
must rapidly find a new host and are more likely to be spread by person-to-person contact or by
faulty personal or food hygiene than by drinking-water.
If present in drinking-water, faecal contamination and hence the related waterborne bacterial
pathogens are likely to be dispersed widely and rapidly. Outbreaks of waterborne disease are
therefore frequently characterised by an infection across a whole community.
Although bacterial contamination normally can be thought of as a short-term event, there are
examples of long-term after effects. One example was at Queenstown (see Chapter 1). Another
occurred at Walkerton in 2000 (also mentioned in Chapter 1). A seven-year follow-up study of
Walkerton residents showed that many continued to experience long-term adverse health
effects. One of the most severe complications of E. coli O157 infection is HUS (haemolytic
uremic syndrome) and survivors of HUS may have permanent kidney damage, potentially
requiring a kidney transplant later in life. Therefore there has been a particular focus on
children who suffered HUS during the outbreak. The Year 3 follow-up of children who had HUS
showed that 32 percent had microalbuminuria (trace amounts of albumin in the urine)
compared to 5 percent of children who had not had HUS. However by Year 5 these rates had
dropped to 20 percent and 3 percent respectively, and the Year 7 follow-up showed no
worsening of the condition or any overt kidney disease in the children. These findings were
more favourable than had been predicted from previous literature on HUS, but continued
monitoring of kidney function is still deemed desirable in HUS survivors. Also, it was found that
among those who had experienced severe gastroenteritis during the outbreak, 36 percent had
developed Irritable Bowel Syndrome, compared to 28 percent of those who had moderate
gastroenteritis and 10 percent of those who had not been ill (WQRA 2010).
5.4.3	Bacterial pathogens growing in the water supply
Various bacteria that occur naturally in the environment may cause disease opportunistically in
humans. Those most at risk are the elderly, the very young, people with burns or extensive wounds, those undergoing immunosuppressive therapy, or those with acquired
immunodeficiency syndrome (AIDS). Water used by such people for drinking or bathing, if it
contains large numbers of these opportunistic pathogens, can occasionally produce infections of
the skin and of the mucous membranes of the eye, ear, nose and throat. Examples of such agents
are Pseudomonas aeruginosa, species of Klebsiella and Aeromonas, and certain slow-growing
mycobacteria such as Mycobacterium avium.
The World Health Organization (WHO 2004c) has published an excellent book bringing
together a great deal of what is currently known about the mycobacteria. Refer also to the
Datasheet in Volume 3 of these Guidelines.
Legionellosis, caused by the bacterium Legionella pneumophila, can be a serious illness. It
results from inhalation of aerosols in which the causative organisms have been able to multiply
because of warm conditions and the presence of nutrients. The WHO produced Fact Sheet #285
(February 2005) on legionellosis, available on
www.who.int/entity/mediacentre/factsheets/fs285/en/. This was updated in book form (WHO
2007). Refer also to the Datasheet in Volume 3 of these Guidelines.
5.4.4 Viruses
Viruses are among the smallest and most resilient of infectious agents. In essence they are
nucleic acid molecules that can enter cells and replicate in them. The virus particle consists of a
genome, either RNA or DNA, surrounded by a protective protein shell, the capsid. Frequently
this shell is enclosed in an envelope that contains both protein and lipid. Viruses replicate only
inside specific host cells and they are absolutely dependent on the host cells’ synthetic and
energy-yielding apparatus for producing new virus particles. Thus viruses are not known to
multiply in the environment.
The viruses of most significance in relation to drinking-water are those that multiply in human
gut tissues and are excreted in large numbers in the faeces and urine of infected individuals.
Although they cannot multiply outside the tissues of infected hosts, some enteric viruses can
survive in the environment and remain infective for long periods. Human enteric viruses occur
in water largely as a result of contamination by sewage and human excreta. The numbers of
viruses present and their species distribution will reflect the extent to which the population is carrying them.
The different analytical methods currently available can also lead to wide variations in numbers
of viruses found in sewage.
Sewage treatment may reduce numbers by ten to ten thousand-fold, depending on the nature
and degree of treatment. However, even tertiary treatment of sewage will not eliminate all
viruses. As sewage mixes with the receiving water, viruses are carried downstream and the
length of time they remain detectable depends on temperature, their degree of adsorption onto sediments, penetration of sunlight into the water, pH and other factors. Consequently, enteric
viruses can be found in sewage-polluted water at the intakes to water treatment plants.
Proper treatment and disinfection, however, should produce drinking-water that is essentially
virus-free. The occurrence of human viruses in source waters and the effectiveness of various
drinking water treatment approaches are discussed in Chapter 7: Virological Compliance.
5.4.5	Pathogenic protozoa
The majority of protozoa are free-living aquatic organisms of no significance to public health.
Protozoa can be differentiated into three general types: ciliates, flagellates and amoebae. They
generally feed on other micro-organisms such as bacteria, algae, cyanobacteria, or other
protozoa.
Protozoa likely to be found in drinking-waters and of public health significance can be grouped
into those of enteric or environmental origin:
•	enteric protozoa occur widely as parasites in the intestine of humans and other mammals and involve at least two stages (trophozoite and (oo)cyst) in their life cycle (see section 5.4.5.1)
•	some free-living protozoa (FLP) are opportunistic pathogens in humans and are responsible for some serious diseases of the nervous system and the eye (see section 5.4.5.2).
5.4.5.1	Enteric parasitic protozoa
The most prevalent enteric protozoal parasites associated with waterborne disease include
Giardia intestinalis, Cryptosporidium hominis and C. parvum. Toxoplasma gondii,
Entamoeba histolytica, and Balantidium coli have also been associated with waterborne
outbreaks. Cryptosporidium, a coccidian protozoal parasite, was only identified as a human
pathogen in 1976. It can cause diarrhoeal illness in the immunocompetent but with dire
consequences in immunocompromised individuals. The disease is endemic throughout the
world. The incidence of infection is also high, illustrated by the finding that in the USA
20 percent of young adults have evidence of infection by Cryptosporidium. This rate was over
90 percent amongst children under one year old in a Brazilian shanty town (quoted in WHO
2003(b): Emerging issues in water and infectious diseases).
Epidemiological studies often report cases as incidence per (say) 1000 population. Sometimes prevalence is used, being a better indicator of disease burden given the longer duration of cryptosporidiosis. Incidence here is defined as the number of new cases, and prevalence as the number of days with diarrhoea, in a given time period.
Other emerging protozoal parasites of concern include Cyclospora cayetanensis and Isospora
belli. Microsporidia are also emerging pathogens of public health importance and, although
recently classified as fungi, their fate and behaviour in water can be similar to that of the
parasitic protozoa.
The transmissive/infective stages of these parasites are cysts (Giardia, Balantidium,
Entamoeba), oocysts (Cryptosporidium, Cyclospora, Isospora, Toxoplasma) or spores
(Microsporidia). These forms are excreted in faeces of infected hosts as fully infectious agents
(Giardia, Cryptosporidium, Microsporidia, Balantidium) or as immature stages (Cyclospora,
Isospora, Toxoplasma) requiring a short period of development in the environment to reach the
mature stage. They can get into drinking-water supplies by contamination with human or
animal faeces. All are widely dispersed and have been associated with outbreaks of infection
resulting from drinking contaminated water, see datasheets (Volume 3).
Giardia and Cryptosporidium are the most widely reported causes of waterborne parasitic
disease in developed countries. In New Zealand giardiasis and cryptosporidiosis are the third
and fourth most commonly notified diseases, respectively. These organisms cause enteric illness of varying severity, ranging from violent diarrhoea to asymptomatic infection. Immunocompetent people typically recover without intervention. Dehydration is the most frequent symptom requiring attention in severely affected individuals.
The (oo)cysts of Giardia and Cryptosporidium are widespread in environmental waters of New
Zealand especially in water from areas of intensive stock farming and they can occur in high
concentrations. A recent study of measures that can be taken to reduce the numbers of (oo)cysts
in water appears in Victorian Department of Health (2011). Coliforms, faecal coliforms, and
E. coli have been shown to be poor indicators of the presence of pathogenic protozoa in
drinking-water, so Giardia and Cryptosporidium are considered as Priority 1 determinands in
the DWSNZ.
The organisms can survive for a long time in cold water. Medema et al (1997) conducted bench
scale studies of the influence of temperature on the die-off rate of Cryptosporidium oocysts.
Die-off rates were determined at 5°C and 15°C. Both excystation and vital dye staining were used to determine oocyst viability. At 5°C, the die-off rate was 0.010 log10/day, assuming first-order kinetics. This translates to a 0.5 log reduction at 50 days. At 15°C, the die-off rate in natural river water approximately doubled, to 0.024 log10/day (excystation) and 0.018 log10/day (dye staining).
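A worked example of the first-order die-off arithmetic, using only the Medema et al (1997) rate constants quoted above (the function names are ours):

```python
import math

def log10_reduction(rate_log10_per_day, days):
    """Log10 reduction after `days`, assuming first-order kinetics."""
    return rate_log10_per_day * days

def surviving_fraction(rate_log10_per_day, days):
    """Fraction of oocysts still viable after `days`."""
    return 10 ** (-log10_reduction(rate_log10_per_day, days))

# At 5 degC: 0.010 log10/day gives the 0.5 log reduction at 50 days quoted
# in the text, ie, roughly a third of oocysts still viable.
print(log10_reduction(0.010, 50))                # 0.5 log reduction
print(round(surviving_fraction(0.010, 50), 2))   # about 0.32 still viable

# At 15 degC (excystation): 0.024 log10/day, about 1.2 logs at 50 days.
print(round(log10_reduction(0.024, 50), 2))
```

The same arithmetic, run with the Sattar et al (1999) observations below, illustrates why cold source waters can remain a protozoal risk for months.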
Sattar et al (1999) evaluated factors impacting Cryptosporidium and Giardia survival.
Microtubes containing untreated river water were inoculated with purified oocysts and cysts.
Samples were incubated at temperatures ranging from 4 to 30°C; viability of oocysts and cysts
was measured by excystation. At 20°C and 30°C, reductions in viable Cryptosporidium oocysts
ranged from approximately 0.6 to 2.0 log after 30 days. Relatively little inactivation took place
when oocysts were incubated at 4°C.
The significance of waterborne transmission in New Zealand is still not clear. The prevalence of
Giardia and Cryptosporidium infection in livestock, domestic, and feral animals suggests a
significant reservoir for zoonotic transmission. However, information is needed on the presence
of human and animal specific genotypes in water in order to clarify the relative importance of
human or animal derived waterborne infections. The datasheets provide further detailed
descriptions of the enteric protozoa. Chapter 8: Protozoa Compliance also provides further
information relating to Cryptosporidium.
A thorough discussion on the impact of waterborne Giardia and Cryptosporidium
internationally appears in WHO (2012, see Chapter 2 in particular).
5.4.5.2	Opportunistically pathogenic free-living protozoa
Free-living protozoa (FLP) are numerous in open surface waters including water supply sources
but greatest numbers can be found in nutrient-enriched environments where their bacterivorous feeding activities are of great benefit, eg, in biological wastewater treatment systems. FLP are
ubiquitous in aquatic environments with a wide tolerance to environmental conditions ranging
from geothermal waters, thermally polluted waters, to water distribution pipes.
The most well-known free-living, opportunistically pathogenic protozoa are the free-living amoebae, Naegleria, Acanthamoeba and more recently Balamuthia, which cause cerebral or corneal diseases. Infection is opportunistic and usually associated with recreational bathing-water contact or domestic use of water other than drinking. The occurrence of Naegleria and
Acanthamoeba in water is not necessarily associated with faecal contamination.
Naegleria spp have been responsible for nine recorded deaths in New Zealand since 1968: five
cases were confirmed as N. fowleri (Cursons et al 2003). Infection by N. fowleri is strictly
waterborne and can cause a cerebral infection known as primary amoebic meningoencephalitis
(PAM), a rare but usually fatal condition. All cases of death resulting from Naegleria infections
in New Zealand have been associated with swimming in geothermal pools or rivers receiving
geothermal waters. These deaths led to more control of geothermal tourist areas with specific
advice on pool care including exclusion of soil from the water sources and pools, filtration,
disinfection, and rate of water turnover.
Acanthamoeba species are commonly found in soil and water and cause diseases of the central
nervous system (granulomatous amoebic encephalitis GAE) and a disease of the eye called
keratitis. GAE is invariably fatal but no cases of GAE have been reported in New Zealand to date.
However, although GAE is not associated with swimming, a species known to cause the disease
in humans, Acanthamoeba culbertsoni, has been isolated from New Zealand thermal waters. In
contrast, amoebic keratitis does occur in New Zealand and there have been 8 reported cases
since 1995 (Ellis-Pegler 2003). The disease has been associated with people who wear soft
contact lenses. Acanthamoeba spp has been isolated from contact lens washing fluid on several
occasions.
Balamuthia mandrillaris causes GAE in humans and other animals. Little is known about the ecology of Balamuthia. They are present in soil and possibly water but there is no obvious
association of waterborne transmission with those cases reported of Balamuthia infection
(Schuster and Visvesvara 2004).
Both Acanthamoeba and Naegleria as well as other free-living amoebae are known to ingest
bacterial pathogens such as Legionella (Brown and Barker 1999). Legionella spp. have adapted
to replicate inside amoebae and thus the amoebae containing Legionella within their vacuoles
can act as vectors for packets of Legionella infection.
Free-living amoebae can be found in source water and isolated from water distribution pipes.
Their presence is usually associated with thermally polluted waters (eg, Naegleria) or
inadequate disinfection of treated supplies.
Information is increasing on emerging enteric protozoa such as Blastocystis spp, Dientamoeba
fragilis and Endolimax nana. Researchers have recently agreed that Blastocystis spp are
pathogenic, causing intestinal disorders. Datasheets have been prepared for:
•	Acanthamoeba sp.
•	Balantidium coli
•	Blastocystis
•	Cyclospora
•	Cryptosporidium
•	Entamoeba histolytica
•	Giardia intestinalis (lamblia)
•	Isospora
•	Microsporidia
•	Naegleria fowleri
•	Toxoplasma.
5.4.6 Helminths
A variety of human and zoonotic helminth (worm parasite) diseases have been found in New
Zealand, including Fasciola, an economically important zoonotic helminth parasite in cattle.
However, reports of helminth infections in the New Zealand human population occur rarely;
infection is most often associated with recent immigrants or travellers returning from areas
where disease is endemic.
Whilst infective helminth parasites should not be present in drinking-water, the low prevalence
of helminth infection in New Zealand indicates that a Maximum Acceptable Value (MAV) in the
DWSNZ is impractical for these disease organisms. Physical treatment processes used for the
removal of protozoal parasites during drinking-water treatment should also remove helminths if
these are present in the source water, as they are generally excluded by their size. Helminth
infective stages are typically larger and heavier than protozoal (oo)cysts (>20 µm). Care is
needed with microscopic identification of organisms from water supplies as adult worms and larvae are more likely to belong to free-living nematode groups such as Turbatrix or Rhabditis.
The majority of helminths are not typically transmitted through drinking-water. Exceptions are
Dracunculus (Guinea worm) and in some endemic situations, Fasciola spp (liver fluke). Most
helminth infections are acquired through direct faecal-oral contact (eg, Enterobius), ingestion of
faecally contaminated food (eg, Ascaris, Trichuris), or through contact with contaminated soil
or surface water (eg, hookworm, Schistosoma spp). However, helminth parasites can produce
large numbers of transmissive stages (infective egg or larvae) that can sometimes be found in
water, and there have been reports of incidental disease transmission due to consumption of
contaminated water.
Although no MAV is prescribed for helminths in the DWSNZ, precautions should be taken to protect source waters from zoonotic helminth contamination, particularly in rural communities where livestock may constitute a significant reservoir, and to ensure the security of water during and after treatment.
Further information is provided in the helminth and nematode datasheet.
5.4.7	Cyanobacteria (blue-green algae)
Cyanobacterial cells or colonies do not usually cause a health problem in drinking-water. They
will have been removed from properly treated water. They can interfere with water treatment
processes if in large numbers in the raw water, which may lead to other problems.
Their main health problem is the toxins that they can produce. This is discussed in Chapter 9:
Cyanobacterial Compliance. See also the datasheets for cyanobacteria and for cyanotoxins.
5.4.8 Disease from waterborne pathogens
Drinking-water is an important source of infectious agents, particularly ones that cause enteric
infections. Many of the great epidemics of history have been caused by faecal contamination of
drinking-water. While person-to-person contact is an equally important route of transmission, it is common for a population to identify water as the source of disease. The significance of any particular organism
varies with the disease caused under local water supply conditions. Not all individual members
of any population will be susceptible to a pathogenic organism in the water. Waterborne
infections will depend on the following:
•	the concentration of any pathogenic organism in drinking-water
•	the virulence of the strain
•	the amount of water taken in by individuals which has not been adequately disinfected
•	the minimum infectious dose (MID) of the pathogen in question
•	the immune capability or susceptibility of individuals
•	the incidence of the infection in a community, thus determining the number of enteric pathogens that would be shed into a potential receiving water source.
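The factors above combine in quantitative microbial risk assessment (QMRA). A minimal sketch using the standard exponential dose-response model follows; all parameter values (pathogen concentration, volume ingested, and the dose-response parameter r) are purely illustrative and not taken from the DWSNZ:

```python
import math

def infection_probability(conc_per_litre, litres_ingested, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    dose = conc_per_litre * litres_ingested  # expected organisms ingested
    return 1.0 - math.exp(-r * dose)

# Illustrative values only: 0.1 organisms/L in drinking-water, 1 L consumed
# per day, r = 0.004 (an assumed per-organism infectivity parameter).
p_daily = infection_probability(0.1, 1.0, 0.004)
print(p_daily)  # small per-day probability of infection
```

Even a small per-day probability accumulates over a year of exposure, which is why both pathogen concentration and infectivity matter in the list above.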
Paradoxically, if a particular infection has been received repeatedly from a contaminated water
source the community may have become immune to some of the pathogens. This situation
develops in countries where the number of pathogens in water is high and the standard of
drinking-water is low. Conversely, visitors who drink from such water frequently become ill
while the locals have far fewer ill effects. This is a population immunity but it is acquired at the
cost of illness and death among children and is not considered acceptable in developed
countries.
Where indicators of faecal pollution are found in water the population using that water may not
be showing enteric disease. However, the presence of indicators of faecal pollution means that
the likelihood of faecal pathogens occurring in that water is high. Continual vigilance is required
to determine the need for treatment. If an infection occurs in a community, follow-up
epidemiological studies should be carried out such that the source and route of infection can be
determined and treated.
The diseases most frequently associated with water are enteric infections such as infectious
diarrhoea. In many cases the disease is mild and self-limiting. However, a proportion of the
population will suffer more severe outcomes. Several waterborne pathogens such as Vibrio
cholerae, hepatitis E virus and E. coli O157:H7 have high mortality rates.
Since the 1990s, evidence that microbial infections are associated with chronic disease has accumulated. Several waterborne pathogens have been associated with serious sequelae (ie, severe illness or chronic or recurrent disease that appears long after the initial exposure to contaminated water). Examples of sequelae that could potentially be associated with acute waterborne disease include (WHO 2003c):
•	diabetes, which has been linked to Coxsackie B4 virus
•	myocarditis, which has been linked to echovirus
•	Guillain-Barré syndrome, associated with Campylobacter spp
•	gastric cancer, which has been linked to Helicobacter sp
•	reactive arthritis, which has been linked to Klebsiella sp.
5.5	Organisms causing problems other than disease
5.5.1	General
People in the western world demand water that is free from pathogenic organisms, has a
pleasant taste and odour, is colourless and free from toxic chemical substances and corrosive
properties. In addition to the pathogenic micro-organisms discussed, waters may also contain
(AWWA 2004):
•	cyanobacteria (other than those producing cyanotoxins)
•	iron, manganese, sulphur and nitrifying bacteria
•	actinomycetes and fungi
•	large eukaryote organisms such as algae, crustacea, nematodes and protozoa
•	insect larvae, eg, the midge and mosquito, usually in storage tanks.
Intermittent problems can occur when some or all of these organisms get into distribution
systems where their maintenance or growth is encouraged. Excessive quantities of organic
matter will usually support bacteria and fungi that in turn can support protozoa and crustacea.
Many eukaryotes (cellular organisms with a nucleus, ie, not including viruses) and invertebrates
can then feed on bacteria, fungi and protozoa.
Normally, treated water does not contain sufficient nutrient to support the growth of these
organisms. However, the use of any form of filter bed inevitably retains large amounts of organic
matter providing substrate and shelter. The filter is therefore an excellent growth medium for
bacteria and other organisms higher up the food chain, which feed directly or indirectly on the
bacteria. Filter backwashing is used to control this build-up.
Quantitative limits for this heterogeneous group of micro-organisms are not recommended.
5.5.2
Organisms causing taste and odour
Unpleasant tastes and smells can result from compounds that are produced by a range of
eukaryote micro-organisms. These include protozoa and cyanobacteria. Protozoa from the
amoebae and the ciliates are likely to produce odorous compounds. Amoebae of the genera Vannella, Saccamoeba and Ripidomyxa, all of which have bacterial symbionts in their cytoplasm,
can produce geosmin or methylisoborneol (MIB). Other sources of the same compounds are
cyanobacteria and the actinomycetes. Thus it seems likely that the symbionts of the protozoa are
the source of these compounds. Free-swimming ciliates that contain algal symbionts
(zoochlorella), including the genera Stentor and Paramecium, also contribute to odours in
water if they reach high numbers.
See AWWA (2004) for discussion on biology, ecology, identification and control strategies. See
EA (1998) and EA (2004) for some microbiological and chemical methods for assessing taste
and odour problems caused by micro-organisms.
Refer to Chapter 18: Aesthetic Determinands for additional information.
5.5.3
Micro-organisms causing colour
Explosive growths of algae, cyanobacteria and other bacteria can produce unwanted colour in water. Such blooms can be controlled by careful application of copper sulphate to the water. If
pigmented organisms such as cyanobacteria and algae are crushed on filters to the extent that
the cells are disrupted to release pigment, they can create colour. Micro-algae that pass through
filters can cause additional turbidity problems.
5.5.4
Iron and manganese deposits due to bacteria
A wide range of micro-organisms (bacteria, fungi and sometimes protozoa) can be categorised
as chemolithotrophic or photolithotrophic, that is, they are able to oxidise metal salts as part of
their metabolism and in doing so cause problems by encrusting pipes, bores or filters. The
elements involved in this are mostly iron, manganese and sulphur. The problems are usually
identifiable by coloured deposits on equipment. In water containing ferrous or manganese salts,
bacteria able to oxidise the compounds can form rust-coloured or black deposits in tanks and on
the walls of pipes where the water flow is slow. If the water flow increases, the deposits may be
detached to cause colour problems in domestic supplies. The slurry may also contain organic
compounds that can break down and produce odour problems. AWWA (2004) discusses the
biology, ecology, identification and control strategies.
Manganese-oxidising organisms are responsible for deposits in wells and water pipes and they
can reduce yield, clog bore pipes, and reduce flow capacity in water pipes. They may also
damage equipment for measuring water flows and produce black-coloured water that can stain
in the domestic environment. Bacteria may attach to the deposits and, if disturbed, will increase the colony count of the water. Prevention is based on removing iron and manganese from raw water when concentrations exceed 0.1 mg/L iron or 0.04 mg/L manganese.
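As a minimal illustration of the prevention criterion above, the threshold check can be sketched as follows. The function name and structure are hypothetical (not part of the Guidelines); only the two limits quoted above are taken from the text.

```python
# Illustrative check of raw-water iron and manganese concentrations against
# the thresholds quoted above (0.1 mg/L Fe, 0.04 mg/L Mn). Hypothetical code,
# not part of the Guidelines.
IRON_LIMIT_MG_L = 0.1
MANGANESE_LIMIT_MG_L = 0.04

def needs_removal(iron_mg_l: float, manganese_mg_l: float) -> dict:
    """Return which metals exceed the prevention thresholds."""
    return {
        "iron": iron_mg_l > IRON_LIMIT_MG_L,
        "manganese": manganese_mg_l > MANGANESE_LIMIT_MG_L,
    }

print(needs_removal(0.25, 0.02))  # iron exceeds the limit, manganese does not
```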
These chemolithotrophic organisms can impair water quality, but they are usually an
intermittent problem and it is therefore not practical to monitor them routinely because of their
diverse nature and unpredictable occurrence. Consumer concern or operational problems
should be the stimulus for action.
Minimising the problems due to iron and manganese bacteria in groundwater is discussed in
Chapter 3: Source Waters, section 3.2.3.4. Methods for removing iron and manganese from
water are covered in Chapter 18: Aesthetic Considerations, section 18.3.
5.5.5
Corrosion resulting from iron and sulphur bacteria
activity
Iron and steel pipes have always been susceptible to attack by iron and sulphur bacteria. Nowadays iron and steel are often protected with cement or other coatings, or replaced by other materials such as PVC. Microbial corrosion of pipe materials results from:
• depletion of oxygen
• liberation of corrosive metabolites
• production of sulphuric acid
• inclusion of sulphate-reducing bacteria in cathodic processes under anaerobic conditions.
Some micro-organisms in water are associated with the corrosion of cast iron. Other micro-organisms, eg, Pseudomonas aeruginosa and some coliform organisms (not E. coli), can be responsible for the biodeterioration of non-metallic materials such as plastic, rubber and pipe-lining materials, which provide organic nutrients and encourage micro-organism growth. Unchlorinated water, or
water in which the chlorine demand is high and therefore the chlorine residual has disappeared,
supports higher rates of attack than water in which chlorine is still detectable. See AWWA
(2004) for further information.
5.5.6
Large numbers of micro-organisms
It must always be remembered that water prior to treatment contains a heterogeneous population of micro-organisms. These are mostly aerobic heterotrophic bacteria, but their presence may confound the interpretation of a test based on coliform counts, or even restrict coliform growth, and thus yield false results. A case in point has been demonstrated with strains of Aeromonas that produce acid and gas in coliform media at 44.5°C. Control of such micro-organisms is by reduction of the organic carbon source. However, this may not be possible since most catchments have some organic runoff. The less nutrient-rich a water supply is, the better in terms of reducing possible micro-organism growth.
Shallow groundwaters often have large numbers of bacteria that grow on general bacteriological
media at 37°C, often in the absence of faecal coliforms; total or presumptive coliforms can reach
high numbers in these waters too. These are likely to be naturally-occurring soil bacteria. They
may also be bacteria from septic tank overflow that has passed through an extensive drainage
field where the faecal indicator bacteria have died out or been grazed by larger organisms.
See section 5.3.6 for a discussion on heterotrophic bacteria in water supplies, and Chapter 8:
Protozoa Compliance, section 8.5: Challenge Testing, for how measuring the population density
of these bacteria can be helpful.
Ainsworth (2004) and Bartrum (2003) discuss the occurrence and control of these bacteria.
5.5.7
Invertebrate inhabitants of water systems
Problems caused by the presence of invertebrate animals in large water supplies are uncommon
in New Zealand. Poorly operated slow sand filters can give rise to large populations in the
distribution system. Invertebrate animals may be present in shallow wells. Such animals derive
their food from bacteria, algae and protozoa present in the water or on slime growths or
deposits. They include freshwater sponges (the porifera), coelenterates, bryozoans, crustacea,
molluscan bivalves, snails and nematodes. Freshwater mussels have caused major problems
overseas by blocking pipes. The problems caused by many of these, and control strategies, are
discussed in AWWA (2004).
For convenience, the types of organisms can be divided into two groups:
• free-swimming organisms such as the crustacean Paracalliope spp (freshwater hopper), Paranephrops (freshwater crayfish) and copepods
• animals that move along surfaces or are anchored to them, such as isopods (water lice), snails and other molluscs, bryozoans, nematodes and the larvae of chironomids.
In warm weather, slow sand filters can discharge larvae of midges and mosquitoes (eg,
Chironomus and Culex spp) into the water. Such filters may also be heavily infested with adults and larvae of the family Psychodidae. If the top layer of a filter collapses, insect larvae and adults
may be drawn down into the unfiltered water. Penetration of invertebrate animals into water
supplies through a water filtration plant is much more likely when low-quality raw waters and high-rate filtration processes are used, especially if the filter depth is less than a metre, or where large sand grains (say more than 3 mm) have been chosen for
the filters. Pre-chlorination destroys the invertebrates and thereby assists their removal by
filtration, but promotes increased formation of chlorinated organic compounds. Maintaining
chlorine residuals in the distribution system and regularly cleaning mains by flushing can
usually prevent infestation.
The removal of isopods and other crustacea from the distribution system has been effected by
permethrin treatment of water (in parts of the system that have been isolated) at an average
dose not exceeding 0.01 mg/L for 24 to 48 hours. Water treated this way must not be discharged
into watercourses as it will be toxic to fish and other aquatic life. Before using permethrin, the
proposed procedures should be discussed with the Medical Officer of Health.
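The dosing arithmetic above can be sketched as follows. This is an illustrative calculation only: the pipe dimensions are hypothetical examples, it assumes the isolated section is full of water, and it says nothing about the practical and regulatory constraints on permethrin use discussed above.

```python
import math

# Sketch of the arithmetic for dosing an isolated main at an average
# permethrin dose not exceeding 0.01 mg/L (the figure quoted above).
MAX_AVG_DOSE_MG_L = 0.01

def pipe_volume_litres(diameter_m: float, length_m: float) -> float:
    """Water volume of a full circular pipe, in litres (1 m^3 = 1000 L)."""
    return math.pi * (diameter_m / 2) ** 2 * length_m * 1000

def permethrin_grams(volume_l: float, dose_mg_l: float = MAX_AVG_DOSE_MG_L) -> float:
    """Mass of permethrin (g) needed to achieve the given dose in the given volume."""
    return volume_l * dose_mg_l / 1000  # mg -> g

# Hypothetical example: a 150 mm main, 500 m long (~8836 L of water)
vol = pipe_volume_litres(diameter_m=0.15, length_m=500)
print(round(permethrin_grams(vol), 2))  # → 0.09
```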
Note however, that adding permethrin directly to drinking-water for public health purposes is
not recommended by the WHO, as part of its policy to exclude the use of any pyrethroids for
larviciding of mosquito vectors of human disease. This policy is based on concern over the
possible accelerated development of vector resistance to synthetic pyrethroids, which, in their
application to insecticide-treated mosquito nets, are crucial in the current global malaria
strategy.
Renal dialysis units must not be supplied with permethrin-treated water, and those rearing fish
should be warned not to replenish their fish tanks with mains water while it is being treated. The
treated water can be discharged safely to sewers for treatment at sewage works. In
circumstances where such concerns exist, relevant specialist expertise must be sought.
References
Ainsworth R (ed). 2004. Safe, Piped Water: Managing microbial water quality in piped
distribution systems. London: IWA Publishing, for the World Health Organization, Geneva.
http://libdoc.who.int/publications/2004/924156251X.pdf
Anderson C. 1991. Cholera epidemic traced to risk miscalculation. Nature 354: 255.
AWWA. 1999. Waterborne Pathogens (1st edition). AWWA Manual M48. Denver CO: American
Water Works Association.
AWWA. 2004. Problem Organisms in Water: Identification and treatment. Manual of Water
Supply Practices, AWWA Manual M7. Denver CO: American Water Works Association.
Bardsley DA. 1934. The distribution and sanitary significance of B. coli, B. lactis aerogenes and
intermediate types of coliform bacilli in water, soil, faeces and ice-cream. Journal of Hygiene
(Cambridge) 34: 38–68.
Bartrum J, et al (eds). 2003. Heterotrophic Plate Counts and Drinking-water Safety: The
significance of HPCs for water quality and human health. WHO Emerging Issues in Water and
Infectious Diseases Series, London, IWA Publishing.
http://www.who.int/water_sanitation_health/emerging/emergingissues/en/index.html
Bermúdez M, Hazen TC. 1988. Phenotypic and genotypic comparisons of Escherichia coli in pristine
waters. Applied and Environmental Microbiology 54(4): 979–83.
Brieseman M, Hill S, Holmes J, et al. 2000. A series of outbreaks of food poisoning? New Zealand
Medical Journal 113(1104): 54–6.
Brown MRW, Barker J. 1999. Unexplored reservoirs of pathogenic bacteria: protozoa and biofilms.
Trends in Microbiology 7(1): 46–50.
Budd W. 1856. On the fever at the Clergy Orphan Asylum. Lancet ii: 617–19.
Cursons R, Sleigh J, Hood D, et al. 2003. A case of primary amoebic meningoencephalitis: North
Island, New Zealand. The New Zealand Medical Journal 116(1187), 5 pages.
Doyle MP, Zhao T, Meng J, et al. 1997. Escherichia coli O157:H7. In MP Doyle (ed) Food
Microbiology: Fundamentals and frontiers. ASM Press, pp 171–91.
Dufour AP. 1984. Health Effects Criteria for Fresh Recreational Waters. EPA-600/1-84-004.
Cincinnati OH: US Environmental Protection Agency.
DWI. 2010. Significance of Methods and Sample Volumes for E. coli and Total Coliform
Measurements. Executive Summary (8 pp). http://dwi.defra.gov.uk/research/completedresearch/2000todate.htm
EA. 1998. The Assessment of Taste, Odour and Related Aesthetic Problems in Drinking Waters
1998: Methods for the examination of waters and associated materials. Environment Agency.
http://www.environmentagency.gov.uk/static/documents/Research/171_taste__odour_in_water.pdf
EA. 2004. The Microbiology of Drinking Water (2004) – Part 11 – Taste, Odour and Related
Aesthetic Problems: Methods for the examination of waters and associated materials. Environment
Agency. http://www.environmentagency.gov.uk/static/documents/Research/mdwpart112004_859972.pdf
Eberhart-Phillips J, Walker N, Garrett N, et al. 1997. Campylobacteriosis in New Zealand: results of a
case-control study. J Epidemiol Comm Health 51: 686–91.
Ellis-Pegler R. 2003. Primary amoebic meningoencephalitis – rare and lethal. The New Zealand
Medical Journal 116(1187), 3 pp.
Fujioka R, Sian-Denton C, Borja M, et al. 1999. Soil: the environmental source of Escherichia coli
and enterococci in Guam’s streams. Journal of Applied Microbiology Symposium Supplement 85:
83S–89S.
Hardina CM, Fujioka RS. 1991. Soil: The environmental source of E. coli and enterococci in Hawaii’s
streams. Environmental Ecotoxicology and Water Quality 6: 185–95.
Hippocrates. 1938. On airs, waters, and places. Translated and republished in Medical Classics 3:
19–42.
Hornick RB, Woodward TE, McCrumb RF. 1966. Study of typhoid fever in man: 1. Evaluation of
vaccine effectiveness. Transactions of the Association of American Physicians 79: 361–7.
Houston AC. 1917. Rivers as Sources of Water Supply. London: John Bale and Danielsson.
Hunter P, Waite M, Ronchi E. 2003. Drinking Water and Infectious Disease: Establishing the Links.
Boca Raton: CRC Press, and London: IWA Publishing.
IAWPRC. 1991. Bacteriophages as model viruses in water quality control. Water Research 26(5):
529–45.
Jofre J, Olle E, Lucena F, et al. 1995. Bacteriophage removal in water treatment plants. Water
Science and Technology 31(5–6): 69–73.
Kehr RW, Butterfield CT. 1943. Notes on the relation between coliforms and enteric pathogens.
Public Health Report, Washington 58: 589–607.
Küstner HGV, Gibson IHN, Carmichael TR. 1981. The spread of cholera in South Africa. South
African Medical Journal 60: 87–90.
Leclerc H, Devriese LA, Mossel DAA. 1996. Taxonomical changes in intestinal (faecal) enterococci
and streptococci: consequences on their use as indicators of faecal contamination in drinking water.
Journal of Applied Bacteriology 81: 459–66.
LeJeune JT, Besser TE, Merrill NL, et al. 2001. Livestock drinking water microbiology and the
factors influencing the quality of drinking water offered to cattle. Journal of Dairy Science 84(8):
1856–62.
Manero A, Blanch AR. 1999. Identification of Enterococcus spp. with a biochemical key. Applied and
Environmental Microbiology 55(10): 4425–30.
McBride G, Till D, Ryan T, et al. 2002. Freshwater microbiology research report: pathogen
occurrence and human health risk assessment analysis. Wellington: Ministry for the Environment.
http://www.mfe.govt.nz/publications/water/freshwater-microbiology-nov02/
Medema G, Bahar M, Schets F. 1997. Survival of Cryptosporidium parvum, Escherichia coli, faecal
enterococci and Clostridium perfringens in river water: influence of temperature and autochthonous
micro-organisms. Wat Sci Technol 35(11–12): 249–52.
Ministry for the Environment. 2002. Freshwater Microbiology Research Programme, Pathogen
Occurrence and Human Health Risk Assessment Analysis. Available on the internet at:
http://www.mfe.govt.nz/publications/water/freshwater-microbiology-nov02/
Ministry of Health. 2005. Drinking-water Standards for New Zealand 2005. Wellington: Ministry
of Health.
New Zealand Public Health Surveillance Report (NZPHSR) 2003.
OECD/WHO. 2003. Assessing Microbial Safety of Drinking Water. Organisation for Economic
Co-operation and Development and the World Health Organization.
http://www.who.int/water_sanitation_health/dwq/9241546301/en/
Quintero-Betancourt W, Peele ER, Rose JB. 2002. Cryptosporidium parvum and Cyclospora cayetanensis: a review of laboratory methods for detection of these waterborne parasites. Journal of
Microbiological Methods 49: 209–24.
Risebro HL, Breton L, Aird H, et al. 2012. Contaminated Small Drinking Water Supplies and Risk of
Infectious Intestinal Disease: A prospective cohort study. PLoS ONE 7(8): e42762.
doi:10.1371/journal.pone.0042762. Accessed December 2012 at
http://www.plosone.org/article/info:doi/10.1371/journal.pone.0042762?imageURI=info:doi/10.137
1/journal.pone.0042762.t005
Sattar S, Chauret C, Springthorpe V, et al. 1999. Giardia Cyst and Cryptosporidium Oocyst Survival
in Watersheds and Factors Affecting Inactivation. Denver CO: American Water Works Association
Research Foundation.
Schuster RL, Visvesvara GS. 2004. Free-living amoebae as opportunistic and non-opportunistic
pathogens of humans and animals. International Journal for Parasitology 34: 1001–27.
Sinton L. 2006. Viable but non-culturable bacteria – menace or myth? Water and Wastes in New
Zealand 143: 31–8.
Snow J. 1855. On the Mode of Communication of Cholera (2nd edition). London: J Churchill,
162 pp. Reprinted in: Snow on Cholera. New York: Hafner, 1965 (1936).
Till D, McBride G, Ball A, et al. 2008. Large-scale freshwater microbiological study: rationale, results
and risks. Journal of Water and Health 6(4) IWA Publishing.
USEPA. 2003. Long Term 2 Enhanced Surface Water Treatment Rule; Proposed Rule. National
Primary Drinking Water Regulations: 40 CFR Parts 141 and 142, 11 August. Now see USEPA
(2006a).
USEPA. 2006a. National Primary Drinking Water Regulations: Long Term 2 Enhanced Surface
Water Treatment Rule: Final Rule (LT2ESWTR). Federal Register Part II, 40 CFR Parts 9, 141 and
142. Washington: National Archives and Records Administration. See
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04a.pdf
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04b.pdf
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04c.pdf or go to
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/index.cfm or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
Victorian Department of Health. 2011. Public health issues associated with stock accessing
waterways upstream of drinking water off-takes. 97 pp. Prepared by Water Futures Pty Ltd.
http://docs.health.vic.gov.au/docs/doc/Public-health-issues-associated-with-stock-accessingwaterways-upstream-of-drinking-water-off-takes
WHO. 2001. Water Quality – Guidelines, Standards and Health: Assessment of risk and risk
management for water-related infectious disease. Published on behalf of the World Health
Organization by IWA Publishing. This book is available on the internet at:
http://www.who.int/water_sanitation_health/dwq/whoiwa/en/index.html
WHO. 2002. Evaluation of the H2S Method for the Detection of Fecal Contamination of Drinking-water. WHO/SDE/WSH/02.08. Available on the internet at:
http://www.who.int/entity/water_sanitation_health/dwq/rrh2scom.pdf
WHO. 2003a. Heterotrophic Plate Counts and Drinking-water Safety: The significance of HPCs for water quality and human health. 256 pp. Published on behalf of WHO by IWA Publishing.
Available at http://www.who.int/water_sanitation_health/dwq/hpc/en/index.html
WHO. 2003b. Emerging issues in water and infectious diseases. ISBN 92 4 159082 3. Available at
http://www.who.int/water_sanitation_health/emerging/emergingissues/en/index.html
WHO. 2003c. Assessing Microbial Safety of Drinking Water: Improving approaches and methods.
http://www.who.int/water_sanitation_health/dwq/9241546301/en/index.html
WHO. 2004. Guidelines for Drinking-water Quality 2004 (3rd edition). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html (see also the addenda).
WHO. 2004b. Waterborne Zoonoses: Identification, causes and control. Published on behalf of
WHO by IWA Publishing. 528 pp. Available on the internet at:
www.who.int/water_sanitation_health/diseases/zoonoses/en/index.html or
www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2004c. Pathogenic Mycobacteria in Water: A guide to public health consequences,
monitoring and management. Published by IWA Publishing on behalf of the World Health
Organization, ISBN 92-4-156259-5 (WHO), ISBN 1-84339-059-0 (IWA Publishing). Available at:
http://www.who.int/water_sanitation_health/emerging/pathmycobact/en/
WHO. 2004d. Water Treatment and Pathogen Control: Process efficiency in achieving safe
drinking water. 136 pp. www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2005. Legionellosis, History and Overview. Fact Sheet #285 (February 2005). World Health
Organization. Available on www.who.int/entity/mediacentre/factsheets/fs285/en/ updated to WHO
2007.
WHO. 2005a. Preventing Travellers’ Diarrhoea: How to make drinking-water safe.
WHO/SDE/WSH/05.07. World Health Organization, Geneva. Available on:
http://www.who.int/water_sanitation_health/hygiene/envsan/traveldiarrh/en/index.html
WHO. 2007. Legionella and the Prevention of Legionellosis. Geneva: World Health Organization.
J Bartram, Y Chartier, J Lee, et al (eds). ISBN 92 4 156297 8. 276 pp. See:
http://www.who.int/water_sanitation_health/emerging/legionella.pdf
WHO. 2009. Risk Assessment of Cryptosporidium in Drinking Water. WHO/HSE/WSH/09.04. 143
pp. http://whqlibdoc.who.int/hq/2009/WHO_HSE_WSH_09.04_eng.pdf or
http://www.who.int/water_sanitation_health/publications/by_date/en/index.html
WHO. 2011. Guidelines for Drinking-water Quality (4th edition). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
WHO. 2011a. Water Safety in Buildings. Geneva: World Health Organization. 164 pp.
http://whqlibdoc.who.int/publications/2011/9789241548106_eng.pdf
WHO. 2012. Animal Waste, Water Quality and Human Health. A Dufour, J Bartram, R Bos, et al
(eds). Published on behalf of the World Health Organization by IWA Publishing, UK. 489 pp.
http://www.who.int/water_sanitation_health/publications/2012/animal_waste/en/
WQRA. 2010. Health Stream 58 (June). Quarterly Public Health Newsletter of Water Quality
Research Australia. http://www.wqra.com.au
Yugoslav Typhoid Commission. 1964. A controlled field trial of the effectiveness of acetone-dried and
inactivated and heat-phenol-inactivated typhoid vaccines in Yugoslavia. Bulletin of WHO 30: 623–
30. http://whqlibdoc.who.int/bulletin/1964/Vol30/Vol30-No5/bulletin_1964_30(5)_631-634.pdf
Chapter 6: Bacteriological
compliance
6.1
Introduction
The most common and widespread risk associated with drinking-water is microbial contamination, the consequences of which mean that control of microbiological quality must always be of paramount importance (see Chapter 5 for a general discussion). Microbiology
compliance includes:
• bacteria – this chapter
• viruses – Chapter 7
• protozoa – Chapter 8
• cyanobacteria – Chapter 9.
Obviously the entire drinking-water supply cannot be tested for compliance, so monitoring programmes must be designed to yield statistically reliable and practical information (see section 2.4 of Chapter 2, and section 6.2.2). Testing a water supply for verification of
microbiological quality must be designed to ensure the best possible chance of detecting
contamination. Sampling should therefore take account of potential variations of water quality
and increased likelihood of contamination, both at source and during distribution. Faecal
contamination usually will not be distributed evenly throughout a piped distribution system. In
systems where water quality is generally good, the probability of detecting faecal indicator bacteria in the relatively small number of samples collected is therefore reduced.
The chances of detecting contamination in systems reporting predominantly negative results for
faecal indicator bacteria can be increased by the use of more frequent presence/absence (P/A)
testing. P/A testing can be simpler, faster and less expensive than quantitative methods and can
maximise the detection of faecal indicator bacteria. However, P/A testing is only appropriate for
systems where the majority of tests for indicators are negative. Membrane filtration and
multiple tube techniques give a numerical result.
The more frequently a water supply is tested for faecal indicators, the more likely it is that faecal
contamination will be detected. Frequent examination by a simple but reliable method is more
valuable than less frequent testing by a complex test or series of tests. The indicator organism of
choice for faecal contamination is E. coli.
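The value of frequent testing can be illustrated with a simple probability sketch. Assuming, purely for illustration, that contamination (when present) would show up independently in a fixed fraction p of samples — a simplification real supplies need not satisfy — the chance of at least one detection in n samples is 1 − (1 − p)^n:

```python
# Illustrative only: assumes each sample independently has probability p of
# testing positive while contamination is present. Not a DWSNZ calculation.
def detection_probability(p_per_sample: float, n_samples: int) -> float:
    """P(at least one positive result) = 1 - (1 - p)^n."""
    return 1 - (1 - p_per_sample) ** n_samples

# With a 5% chance per sample, compare 3 samples vs 13 samples in a quarter:
print(round(detection_probability(0.05, 3), 3))   # → 0.143
print(round(detection_probability(0.05, 13), 3))  # → 0.487
```

The calculation shows why more frequent examination by a simple, reliable method outperforms occasional, more elaborate testing.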
E. coli monitoring requirements can be replaced or reduced by online measurement of the
disinfection process, confirming that it is continuously operating satisfactorily, see section 6.3.7.
These operational requirements also need to be monitored, implementing remedial actions
when there is a transgression.
Section 5.3 in Chapter 5: Microbiological Quality discusses the bacteriological indicators that
can be used for demonstrating drinking-water compliance and treatment plant efficacy and the
reasons for the choice of E. coli as the sole bacterial indicator in the Drinking-water Standards
for New Zealand (DWSNZ). This chapter addresses questions of compliance with limits set on
this indicator. This includes an explanation of how some statistical issues have been addressed
in determining the compliance rules, especially rare false positive results.
An important feature of the DWSNZ is the distinction between transgressions and non-compliance. For reasons explained in section 6.2.2, a very small proportion of exceedances of
the Maximum Acceptable Value (MAV), ie, transgressions, can be tolerated with the water
supply remaining in compliance with the DWSNZ. Nevertheless, preventive and remedial
actions are required whenever a transgression occurs. Figures 4.1 and 4.2 in the DWSNZ
summarise some of these actions.
The MAV for E. coli is less than 1 per 100 mL (Table 2.1 of the DWSNZ). The multiple tube
technique used to enumerate E. coli reports the most probable number of organisms (or MPN)
per 100 mL. For compliance purposes, an E. coli result of less than 1 MPN per 100 mL is considered equivalent to less than 1 per 100 mL (more correctly, less than 1 CFU per 100 mL, where CFU means colony-forming unit).
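The equivalence described above can be sketched as a minimal compliance check. The code is hypothetical, not part of the DWSNZ: it simply treats any result of 1 or more E. coli per 100 mL, whether reported as MPN or CFU, as a transgression of the MAV of less than 1 per 100 mL.

```python
# Hypothetical sketch: flag E. coli results against the MAV of <1 per 100 mL.
# A result below 1 (reported here as 0) complies; 1 or more per 100 mL is a
# transgression. MPN and CFU results are treated alike for this purpose.
def is_transgression(e_coli_per_100ml: float) -> bool:
    return e_coli_per_100ml >= 1

results = [0, 0, 2, 0, 0]  # organisms per 100 mL
transgressions = sum(is_transgression(r) for r in results)
print(transgressions)  # → 1
```

Whether a supply with a given number of transgressions remains in compliance is a separate question, governed by the statistical rules in section 6.2.2.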
WHO (2004d) discusses treatment processes suitable for pathogen control.
6.2 Monitoring for E. coli
6.2.1
General principles
A microbiologically contaminated drinking-water supply can be a major threat to the health of a
community. The main source of this contamination is human and animal faeces. Not only does
contaminated drinking-water have the potential to cause significant illness in consumers (as
outbreaks, or more commonly, ongoing sporadic cases), it may also be the source of epidemics
of disease that spread within the community and have an effect beyond the immediate area
supplied with the contaminated water. The provision of safe drinking-water requires that a
number of barriers, including treatment processes, be put in place to minimise faecal
contamination of water supplies and any ensuing health effects.
Testing a water supply on a regular basis for E. coli, and monitoring the disinfection process, are
important steps for detecting whether the barriers being used to provide safe drinking-water
and to prevent contamination are likely to have been breached. Note that E. coli monitoring
should not be used to decide when further water treatment should commence, or processes
adjusted, because by the time the alert has been raised by a positive test, a large volume of
contaminated water will have entered the distribution system and may have reached some or
many consumers. Largely for this reason, the DWSNZ have over recent editions, shifted the
emphasis from reliance on compliance monitoring testing more to the implementation of risk
management procedures.
To allow reliable detection of barrier failure it is essential that supplies be monitored sufficiently
often that any breakdown is detected promptly and remedied as soon as possible. Ideally, water
suppliers will have process control monitoring procedures in place that can warn of an
impending breakdown; this should be addressed in the PHRMP.
E. coli compliance monitoring will require regular sampling and testing at a frequency and
number based on population size. The larger the population served by a water supply, the
greater the economic consequence to a community of a contaminated supply. The DWSNZ
explicitly cater for population size (for example, see Tables 4.1, 4.2, 4.3, 4.4 and 4.5).
Sampling should be planned to be as effective as possible. Since only continuous monitoring for
E. coli would give total confidence in the safety of the water (and this is not feasible), sampling
must be targeted to give the maximum information. This will be achieved by focusing sampling
on the water leaving the treatment plant, and in the case of protozoa, relating sample numbers
to the nature of the source water and the number and types of treatment barriers present. The
larger the population served by a supply the greater the impact of treatment failure (in terms of
the community affected, rather than the individuals affected), and the larger and more extensive
the distribution system, the more opportunity there is for a breach in its integrity to occur.
Section 4.4.4 of the DWSNZ refers to the need to collect samples for E. coli analysis on different
days of the week. This may be difficult for some water suppliers due to isolation, availability of
courier services, or the hours the laboratory is open for business. An exemption is permissible, provided the water supplier has conducted a risk analysis showing that sampling on selected days does not bias the results. Drinking-water is delivered seven days a week so suppliers need
to know that the water quality is equally satisfactory on all seven. This is discussed further in
Chapter 17: Monitoring, section 17.2.
If monitoring a water supply for E. coli is to have any significant role in preventing people
becoming ill from drinking contaminated water, it is essential that there is an immediate
response whenever a transgression occurs. As explained in section 6.2.2, a supply can transgress
the MAV, yet the supply can still comply with the DWSNZ; this only happens if there are many
samples tested and very few transgressions found. If the only response is to retest, a delay of
several days may occur before remedial action is taken and the breach of the water treatment
barriers identified. During that time the community may have been exposed to a significant
health hazard from the contaminated water. False positive laboratory results are relatively
uncommon, thus a transgression suggests a breach to a treatment barrier. For a water supply to
be well-managed it is essential that all transgressions be acted upon promptly. Any faecal
material that is indicated to be in the water leaving a treatment plant must be of considerable
concern to the supply operator because its presence is a clear warning of a systems failure. Small
numbers of E. coli in a distribution system may pose less of a threat, especially if there is a
chlorine residual, and accordingly the response may be less intensive, but high counts (eg,
>10 per 100 mL) should be a signal for immediate action.
In all cases where faecal contamination is detected it is very important that a competent person
inspect the source water for possible changes, and the treatment plant and/or the distribution
system for unexpected breaches. Someone who thoroughly knows the system under
investigation should be able to identify problems quickly. Trouble-shooting, whether or not the investigator is familiar with the supply, will always be easier if the system is clearly documented, together with all contingency plans (which should be set out in the PHRMPs).
Abnormalities in the system are much more readily noticed when it is known what should be
there and how the system is designed to perform.
Every follow-up of a positive E. coli test should be recorded: everything that was observed and everything that was done. This greatly assists later review of the event and the
implementation of preventive measures. Repeated systems failure will become apparent sooner,
and problems arising from different people being involved at different times are overcome. If
the remedial action taken to correct a problem is not written down, no-one can be sure that
something was actually done.
6.2.2 Statistical considerations
The aim of a monitoring programme must be to give a high degree of confidence that the
drinking-water supply is free of contamination. The only way to be 100 percent confident that 100 percent of the water is free of E. coli is to submit the entire supply for testing, which is not feasible: there would be none left for drinking! Furthermore, if a small proportion of the water
actually sampled is found to be positive, it may be the result of a false positive phenomenon (eg,
contamination during sampling or processing, or detection of a non-faecal particle, or even
misreporting), rather than a genuine event. Accordingly, practical compliance rules cannot be
derived for 100 percent confidence (ie, certainty) that the supply never transgresses the MAV.
This means that statistical methods must be used to develop the rule, accounting for the
uncertainties. Two main items must be agreed on before those methods can be employed:
1 what percent of the time should the water have no transgressions, even if false positives occur?
2 what level of confidence should be attached to that claim? In other words, what is the appropriate burden-of-proof?
The Ministry of Health has a clear mandate in respect of public health to adopt a precautionary
approach. Accordingly, in addressing the second issue, the level of confidence should be high;
95 percent has been adopted (as is common for precautionary approaches in the public health
field).
For the first issue, the position adopted is that E. coli, turbidity, chemicals, disinfection
C.t values and UV fluence should not transgress for more than 5 percent of the time. In bacterial
compliance criterion 2A, the free available chlorine (FAC) content should not transgress for
more than 2 percent of the time. The latter is the more stringent because this compliance
criterion can be achieved without any E. coli monitoring, and is technologically straightforward.
It is important to take a sufficient number of samples to be able to be confident in the results. It
is also important to recognise the possibility of false positive results and occasional small
exceedances of the MAV (ie, transgressions). The DWSNZ accommodate these contrasting
requirements by using percentile standards, mostly 95 percentiles.
For important variables that cannot be (or are not) monitored continuously, there is always a risk of making one of two errors:
•	failing to detect the proportion of transgressions that actually occur
•	detecting a higher proportion of transgressions than actually occur.
Compliance rules for these percentile standards (Table A1.4 in the DWSNZ, and discussed in
more detail in Appendix 2) are based on a precautionary approach: the DWSNZ guard against the first kind of error (often called the consumer’s risk) by minimising it. This means that the second risk (the producer’s risk) will not be minimised, particularly if the supply is truly borderline for compliance (ie, transgressions actually occur for 5 percent of the time). So the DWSNZ are based on the notion of attaining at least 95 percent
confidence of compliance.
This means that if only 12 bacteriological samples (one per month) are collected in a year and none transgresses the MAV (less than 1 E. coli per 100 mL of sample), it is only possible to be 70 percent confident that the water is microbiologically safe.14 Therefore the desired confidence
cannot be attained. It is only attained for a 95 percentile when one has tested at least
38 samples, of which none transgressed the MAV. For a 98 percentile one would need at least
95 samples (with no transgressions), before attaining the desired confidence.
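These confidence calculations can be sketched in a few lines of code. The following is not the published McBride and Ellis (2001) implementation: it assumes, for illustration only, a Jeffreys Beta(0.5, 0.5) prior for the Bayesian posterior, an assumption that reproduces the quoted figures approximately (about 95 percent confidence for 38 clear samples, and roughly 70–75 percent for 12). The classical binomial calculation behind the more demanding 1995 figures is included for comparison.

```python
import math

def log_beta(a, b):
    """ln B(a, b) via lgamma, used to normalise the posterior density."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def bayes_confidence(n, x, p_max=0.05, prior=0.5):
    """Posterior probability that the true transgression rate is <= p_max,
    given x transgressions in n samples, under a Beta(prior, prior) prior
    (a Jeffreys prior by default -- an assumption made here for illustration).
    Integrated by composite Simpson's rule after substituting p = u**2,
    which removes the integrable singularity of the density at p = 0."""
    a, b = prior + x, prior + n - x
    norm = math.exp(-log_beta(a, b))

    def f(u):
        # density(p) * dp/du, with p = u**2 and dp/du = 2u
        return norm * 2.0 * u ** (2 * a - 1) * (1.0 - u * u) ** (b - 1)

    steps = 2000                      # even number of Simpson intervals
    t = math.sqrt(p_max)
    h = t / steps
    s = f(0.0) + f(t)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * f(i * h)
    return s * h / 3.0

def classical_confidence(n, x, p_max=0.05):
    """Classical confidence of compliance: 1 - P(X <= x) for
    X ~ Binomial(n, p_max), ie, the chance a truly borderline supply
    would have shown more transgressions than were observed."""
    cdf = sum(math.comb(n, k) * p_max ** k * (1 - p_max) ** (n - k)
              for k in range(x + 1))
    return 1.0 - cdf
```

Under these assumptions, bayes_confidence(38, 0) is about 0.95, whereas classical_confidence does not reach 95 percent until the number of clear samples approaches 60, consistent with the 1995 discussion.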
Figures 6.1 and 6.2 summarise all these results. It should be noted that these sampling
requirements (and those in the DWSNZ editions from 2000) represent a relaxation from those
discussed in the 1995 DWSNZ and Guidelines. For example, one needed a minimum of
58 samples (with no transgressions) to achieve 95 percent confidence of compliance with a
95 percentile standard in the 1995 discussion, but only 38 (with no transgressions) in the 2000
DWSNZ. This reduction is because the 1995 set was derived using classical statistical methods,
whereas the present standards use Bayesian methods. It can be shown (McBride and Ellis 2001)
that the classical methods are the most pessimistic of all possible compliance rules, which makes
them somewhat inappropriate.
Figures 6.1 and 6.2 show the results of the calculations from which Table A1.4 in the DWSNZ was derived; see McBride and Ellis 2001 or McBride 2005 for the full details, as summarised in Appendix 2 (in Volume 2 of the Guidelines).
As an example, reference to Table A1.4 shows that the desired 95 percent level of confidence is
obtained when there are 38–76 samples, none of which transgresses the MAV. One
transgression is allowed if there are between 77 and 108 samples. Similarly, if four transgressions
occur, a minimum of 194 complying samples is required. These results can be read from
Figure 6.1, by reading the point at which the curved lines cross the horizontal dashed line, which
is at 95 percent confidence of compliance.
Note that in all cases the allowable proportion of transgressions in the samples is less than the
DWSNZ requires. For example, allowing one transgression in 100 samples is 1 percent, yet the
DWSNZ for Table A1.4 contemplates transgressions for up to 5 percent of the time. This is
precisely because a precautionary stance has been taken to the burden-of-proof; it guards
against the possibility of finding few transgressions when in fact the supply was in breach of the
DWSNZ. So there is a high (~95 percent) probability that the MAV was not exceeded for more
than 5 percent of the time if there is only one transgression in 100 samples, and very close to
100 percent confidence if there are none. In other words, the benefit-of-doubt is in favour of the
consumer, not the supplier. This is as it should be.
Note too that as the number of samples increases, the proportion of allowable transgressions
gets ever closer to 5 percent, eg, for 330 samples, one can have 10 transgressions (over
3 percent). Had a permissive stance been taken, the allowable proportion of transgressions among the samples would always be greater than 5 percent.
14 The situation is worse still if one of those samples is a transgression; the Confidence of Compliance falls to 20 percent (McBride and Ellis 2001; McBride 2005).
Figure 6.1: Confidence of compliance for a 95 percentile, over smaller and larger datasets
Source: McBride and Ellis 2001 and McBride 2005.
Numbers on the graphs are the observed number of transgressions.
Figure 6.2 has been included for historical reasons, and for interest. The 2008 DWSNZ do not
have any instances where 98 percent confidence is required.
Figure 6.2: Confidence of compliance for a 98 percentile
Source: McBride and Ellis 2001 and McBride 2005.15
Numbers on the graphs are the observed number of transgressions.
15 These graphs update the version in the 1995 Guidelines, using Bayesian methods. The 1995 graphs were not Bayesian and so were unduly pessimistic. Furthermore, they contained an error (see McBride and Ellis 2001).
6.3 Microbiological compliance
6.3.1 Introduction
The DWSNZ require that all water supplies be subjected to microbiological monitoring because
microbiological determinands are considered to be Priority 1, ie, determinands of health
significance for all drinking-water supplies in New Zealand.
The micro-organisms of most concern are those that are of faecal origin. However, as it would be
impracticable to test for the presence of all faecal organisms, or even a selection of pathogens
that could be in a contaminated water supply, it has been customary to test for microbiological
compliance using indicator bacteria, as discussed in Chapter 5: Microbiological Quality,
section 5.3.
However, in recent years it has become apparent that the traditional bacterial indicators of
faecal contamination, ie, the faecal coliform or more recently the E. coli bacterium, are not good
indicators for some viruses or for the pathogenic protozoa, in particular Giardia and
Cryptosporidium, which have been found in some New Zealand surface waters and non-secure
bore waters. The protozoa compliance criteria are covered in section 5 of the DWSNZ, and are
discussed in Chapter 8: Protozoa Compliance of the Guidelines.
For bacterial compliance in New Zealand, we rely on monitoring E. coli as per the DWSNZ, and
the implementation of PHRMPs. In the US, the Surface Water Treatment Rule (SWTR) requires
that filtration and disinfection must be provided to ensure that the total treatment of the system
achieves at least a 3-log removal or inactivation of Giardia cysts and a 4-log removal/
inactivation of viruses. In addition, the disinfection process must demonstrate by continuous
monitoring and recording that the disinfectant residual in the water entering the distribution
system is never less than 0.2 mg/L for more than four hours. Rather than using a log removal
approach for bacteria, or a C.t value approach, the USEPA Total Coliform Rule requires that
coliforms be absent.
6.3.2 Methods for detecting and enumerating E. coli
As discussed in Chapter 5, section 5.3, E. coli is now the sole bacterial indicator used in the
DWSNZ. A number of the newer methods for testing for coliforms in water test for total
coliforms and/or E. coli. When these tests are used it is only the E. coli result that is sought.
Total coliforms are of limited interest in their own right, but with one important exception: when
total coliforms are detected in the absence of E. coli, it is important that the source be
investigated as their presence may be indicative of a barrier failure or biofilm development.
The referee methods for testing bacterial compliance are shown in section A2 of the DWSNZ.
Presence/absence tests that have been accepted by the MoH for compliance testing are listed in
WINZ. IANZ accredited laboratories, and the laboratories that are recognised by the MoH for
conducting bacterial compliance testing, can be found on http://www.drinkingwater.org.nz, or
www.ianz.govt.nz.
If a total (or presumptive) coliform method, or a faecal coliform method, is used that does not
explicitly enumerate or detect E. coli, the results must be considered as equivalent to E. coli.
Thus if these test results are positive, the action must be as if the test were for E. coli. Refer also
to Chapter 5: Microbiological Quality, section 5.4.1.
6.3.3 Effective monitoring programmes
Maintaining a safe drinking-water supply is dependent on the presence of multiple barriers to
reduce contamination and the transmission of pathogens. A monitoring programme is designed
to provide an assurance that these barriers are continuing to function and have not been
breached. The need for a large number of samples to be tested if a high level of confidence in the
integrity of a supply is to be maintained is discussed in section 6.2.2. In addition to the
minimum number of samples that are needed for confidence, it is also important that sampling is carried out at a specified frequency, so that successive samples are suitably spaced. This will ensure that breaches of the system are identified soon after they occur. Thus a sampling routine is adopted, eg, once a week, as in Table 4.2a of the DWSNZ.
It should be noted that not all contamination events are random. Occasionally they may be the
result of a cyclic event, eg, management practices at a treatment plant, or an intermittent
discharge upstream of the intake. Thus it is important that a sampling routine is randomised.
This is most readily done by varying the time of day and the day of the week when regular
sampling is performed. Sampling plans must be documented and adhered to; variations may be
approved by the DWA. See Chapter 17: Monitoring, section 17.2 for further discussion.
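One simple way to implement the randomisation described above is to draw the sampling day and time for each week from a seeded random number generator, so that the plan is unpredictable with respect to any weekly cycle yet fully reproducible and documentable. The sketch below is illustrative only; the function name, working hours and slot times are assumptions, not DWSNZ requirements.

```python
import random
from datetime import date, time, timedelta

def weekly_sampling_plan(start, weeks, seed=None):
    """Illustrative sketch: one sampling slot per week, with the day of
    the week and the time of day varied so that cyclic contamination
    events (eg, a weekly upstream discharge) are not systematically
    missed."""
    rng = random.Random(seed)          # seeded, so the plan is reproducible
    plan = []
    for w in range(weeks):
        week_start = start + timedelta(weeks=w)
        day = week_start + timedelta(days=rng.randrange(7))
        # assumed working hours 07:00-17:45, on the quarter hour
        slot = time(hour=rng.randrange(7, 18),
                    minute=rng.choice([0, 15, 30, 45]))
        plan.append((day, slot))
    return plan

# A year of weekly samples; the fixed seed lets the documented plan be
# regenerated exactly for audit or approval purposes.
plan = weekly_sampling_plan(date(2013, 1, 7), weeks=52, seed=42)
```

Because the seed is recorded, the plan can be written into the monitoring documentation in advance while still varying the day and time from week to week.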
How to estimate the sampling frequency for water supplies with varying
population
All water treatment plants and distribution zones are registered to supply a normal or usual
population, which is the population most often found. Some water supply areas experience large
fluctuations in population, such as beach resorts, ski fields and camping grounds. The peak
population must be estimated and submitted to the DWA with the sampling plan. The sampling
frequency should be that required for the higher population, both for the duration of the higher population and for at least two weeks before the population is expected to increase. For water
supplies that are shut down or operate at a very small fraction of the peak rate, this period may
need to be extended to a month. Monitoring before the population increases ensures that there
will be time for any treatment process to settle in, and time to remedy any problems that come
to light.
Monitoring stand-by, out-of-service or intermittent supplies
Scheduled samples do not need to be collected while a normally continuous supply is
interrupted. However, accurate records need to be maintained so the absence of results from
scheduled samples does not result in non-compliance.
Many water suppliers have a water source that is only used occasionally, eg, in the summer,
during a drought, or when there is a problem with the regular source. These supplies do not
need to be included in the routine monitoring schedules. No monitoring is required while a
source or treatment plant is out of service for a period of time; however, the water supplier must
ensure by appropriate monitoring that the source is free of E. coli or that the plant is operating
to its full treatment capability before being placed back on line. Once the source is online,
monitoring should proceed, as a minimum at the rate required by the DWSNZ. Compliance is
based on statistical considerations and intermittent supplies will not be tested as often.
Therefore additional monitoring is recommended while these sources or supplies are operating.
Monitoring occasional low-level contamination
On some occasions a membrane filtration technique can prove useful because, by allowing a much larger sample volume, it lowers the detection limit of the E. coli test (ie, improves its sensitivity). This can prove helpful in understanding what is going on at
some locations such as water treatment plants, service reservoirs or after a mains repair. For
example, Rotorua District Council (Charleson, personal communication) found one bulk water
supply point occasionally returning faecal coliforms at 1 cfu/100 mL when using a 100 mL sample, leading to suspicions that the sampler or laboratory was “having problems”. After
analysing 10 litre samples, counts of about 80 cfu per 10 L (0.8 per 100 mL) were obtained at
the site of concern, as well as lower levels (20–40 per 10 L) at other supply points,
demonstrating that there really was underlying contamination in the source water. Counts at these levels do not constitute transgressions, because the MAV for E. coli is “less than 1 cfu in 100 mL of sample” (Table 2.1 of MAVs, DWSNZ). However, in such
circumstances, it is certainly advisable to investigate the cause and introduce an appropriate
remedial action.
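The arithmetic behind the large-volume counts reported above is a straightforward unit conversion to the cfu per 100 mL basis used by the MAV; the function name below is only illustrative.

```python
def cfu_per_100ml(count, sample_volume_ml):
    """Convert a count from a large-volume sample to cfu per 100 mL."""
    return count * 100.0 / sample_volume_ml

# Rotorua example from the text: 80 cfu in a 10 L (10,000 mL) sample
print(cfu_per_100ml(80, 10_000))   # 0.8 cfu/100 mL, below the MAV of <1
```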
6.3.4 Monitoring drinking-water leaving a treatment plant
The DWSNZ consider that there would usually be a greater potential risk to the community if
the water entering the distribution system were contaminated than there would be from
contamination during distribution. Monitoring the water as it enters the distribution system
after the completion of all treatment steps is thus the most critical phase of the monitoring
programme. Not only must it be frequent but also the frequency should reflect the nature of the
source water and treatment processes and the size of the population drinking the supply (see
Table 4.2a in the DWSNZ for presentation of minimum sampling frequencies). Thus the more
vulnerable the source water to contamination, the more monitoring of the efficiency of the
treatment process and the barriers to contamination there needs to be.
The frequency of E. coli monitoring is risk based. A secure bore water requiring no treatment
needs only occasional (monthly or quarterly) testing, whereas surface water leaving a treatment plant that supplies a population over 10,000 and uses bacterial compliance criterion 1 must be tested daily (Table 4.2a of the DWSNZ). Always bear in mind that the DWSNZ states the
minimum sampling frequencies required in order to demonstrate compliance.
Water supply operators must always be alert to events that could have a major impact on source
water quality or the efficiency of barriers against pathogens. Risk management plans should
include an automatic increase in sampling when events occur that could impact significantly on
source water quality or the treatment process, eg, high rainfall. For example, see the discussion
in Chapter 3: Source Waters, section 3.5.1, that shows how E. coli (and presumably many other
microbes) are stored in stream sediments during low flows, and occasionally flushed out in
much higher concentrations during flood events.
Although there is just the one MAV of less than 1 E. coli per 100 mL, section 4.3 of the DWSNZ
has established five sets of compliance criteria for water leaving the treatment plant. These are
based on the type of disinfection employed, and the more effective the disinfection process, the
fewer samples required for testing. The reduced sampling frequency is an attempt to balance
risk with the costs of compliance.
Compliance criterion 1 (section 4.3.1 of DWSNZ) applies where there is no disinfection or
inadequate disinfection. Also, a water supplier may choose to use solely E. coli testing for
bacterial compliance, provided they have nominated this in their annual monitoring plan.
Sampling frequency is population based and varies from weekly to daily.
Compliance criterion 2 (section 4.3.2 of DWSNZ) applies when chlorine is dosed continuously.
Criterion 2A applies when the free available chlorine (FAC) is monitored continuously. Because
the efficacy of FAC is pH dependent, pH must be monitored online too, so FACE can be
calculated. E. coli testing is not required if the criterion 2A conditions are met. Criterion 2B
applies when the water is considered to be non-continuously monitored. Sampling frequency is
population based and varies from fortnightly to twice weekly.
Compliance criterion 3 (section 4.3.3 of DWSNZ) is the chlorine dioxide equivalent to criterion
2A, where a residual of 0.2 mg/L is considered equivalent to 0.2 mg/L FAC. If there is a chlorine
dioxide residual as well as FAC, their concentrations may be added. Compliance criterion 3 also
applies, with no additional requirements, if chlorine dioxide disinfection satisfies at least
0.25 protozoal log credits (section 5.14 in DWSNZ).
Compliance criterion 4 (section 4.3.4 of DWSNZ) applies when the water is continuously dosed
with ozone, and the continuously monitored C.t is at least 0.5, eg, a residual of 0.05 mg/L
persists for at least 10 minutes. A reduced E. coli sampling frequency is allowed in
acknowledgement of the disinfecting efficacy of ozone, but because there is no residual,
fortnightly sampling for E. coli testing is required, regardless of population. Satisfying the
protozoal compliance requirements by using ozone (section 5.15, 0.25 log credits or more) automatically achieves bacterial compliance, so E. coli monitoring is not compulsory.
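The C.t condition in criterion 4 is a simple product of residual concentration and contact time; the sketch below (function name assumed, not from the DWSNZ) works through the example in the text.

```python
def ct_value(residual_mg_per_l, contact_minutes):
    """C.t = disinfectant residual (mg/L) multiplied by contact time (min)."""
    return residual_mg_per_l * contact_minutes

# DWSNZ example for criterion 4: an ozone residual of 0.05 mg/L
# persisting for 10 minutes gives C.t = 0.5, the minimum required.
assert ct_value(0.05, 10) >= 0.5
```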
Compliance criterion 5 (section 4.3.5 of DWSNZ) applies when UV disinfection is used. If all the
protozoal compliance requirements are met when disinfecting with UV light using a dose
equivalent to 40 mJ/cm2 (section 5.16, DWSNZ), bacterial compliance is automatically
achieved, and no E. coli monitoring is required. For bacterial compliance purposes, UV
appliances must have been validated with MS2 organisms, not for example with T1 (see
section 5.3 of USEPA 2006). If the UV disinfection appliance is not validated, or any other
requirements of section 5.16 are not met, bacterial compliance must be met by using bacterial
compliance criteria 1, 2, 3 or 4.
6.3.5 Monitoring drinking-water from groundwater
a) Demonstrating bore water security
Section 4.5.2 of the DWSNZ specifies the compliance criteria for demonstrating whether bore
water is secure. These are discussed in more detail in Chapter 3: Source Water, section 3.2
Groundwater.
1 Bore water security criterion 1, section 4.5.2.1, covers demonstrating whether groundwater is affected by surface or climatic influences.
2 Bore water security criterion 2, section 4.5.2.2, covers bore head protection.
3 Bore water security criterion 3, section 4.5.2.3, covers demonstrating the absence of E. coli.
The E. coli monitoring requirements depend on the nature of the bore.
If the bore water is from a spring or a groundwater source drawing from an unconfined aquifer
that is less than 10 m below the surface, the water is to be considered equivalent to surface
water. That means one of the bacterial compliance criteria in section 4.3 of the DWSNZ applies,
and one of the protozoal compliance criteria in section 5 applies.
If the bore has satisfied bore water security criterion 1, or is drawing from an unconfined aquifer
at least 30 m deep and there is hydrogeological evidence that the bore water is likely to be
secure, the bore is given ‘interim secure status’. Table 4.5 in the DWSNZ specifies the E. coli
monitoring requirements for interim secure bore water. Bore water security criterion 3 is satisfied if E. coli are absent for 12 months; thereafter sampling can be reduced to the secure bore water rate.
If the bore is drawing water from an unconfined groundwater source that is between 10 and
30 m below the surface, E. coli need to be absent during the five-year monitoring period before bore water security criterion 3 is satisfied, see Table 4.5 in the DWSNZ. During the five-year
proving period, one of the bacterial compliance criteria in section 4.3 of the DWSNZ, and one of
the protozoal compliance criteria in section 5, must be satisfied. Generally, this is most likely to
be achieved by using UV disinfection, or chlorination plus UV.
Section 4.5.5 of the DWSNZ explains the actions to be followed in the event that E. coli are
found during the ‘proving period’.
b) Ongoing monitoring of secure bore water
Once security has been demonstrated, the initial sampling frequency for E. coli testing for all
populations is monthly; this can be reduced to quarterly once a further 12-month period has
passed with all samples containing less than 1 E. coli per 100 mL, see section 4.5.4 and Table 4.5
of the DWSNZ.
Section 4.5.3 offers reduced E. coli monitoring of bores drawing from a common bore field.
Sections 4.3.9 and 4.5.5 of the DWSNZ specify the actions to be followed in the event of E. coli being found. Any detection of E. coli requires an immediate reassessment of the supply’s security
status. As well as a sanitary survey and inspection of the bore head, increased E. coli sampling is
required.
Section 3.2.3.1 in Chapter 3: Source Waters discusses procedures to be followed after events
such as major floods and earthquakes. These should be covered in the PHRMP. Ideally, weekly
samples for E. coli testing for at least four weeks should be collected whenever the bore water may have been affected by damage to the confining layer, bore head or adjacent bores.
If the secure bore water receives treatment that could allow microbiological contamination, or is
stored uncovered, the water leaving the treatment plant (ie, the water entering the distribution
system) must satisfy one of the bacterial criteria in section 4.3. In this situation, proving bore
water security offers little advantage.
If a bore water maintains its secure status, it satisfies the bacterial compliance criteria. If it is
chlorinated so that FAC can be maintained in the distribution system, there are no additional
monitoring requirements for the water leaving the treatment plant such as monitoring FAC
concentration, pH or turbidity.
Once bore water (secure or not) enters the distribution system, the bacterial compliance criteria
in section 4.4 of the DWSNZ apply.
6.3.6 Monitoring drinking-water in the distribution system
The frequency of monitoring of the water in the distribution system will, as for the water leaving
the treatment plant, be related to the population size, so that the larger the population receiving
the water, the more testing is needed; see Table 4.3a in the DWSNZ. There are two reasons for
population-based sampling. One is the number of people at risk from a contaminated supply,
and the other relates to the fact that a distribution system serving a large population will usually
be more extensive than that for a smaller population, thus there is more opportunity for
breaches of the integrity of the system to occur.
It is very important that, when determining the number of samples to be taken for a compliance
monitoring programme, managers look closely at the nature and quality of the distribution
systems, the population base and fluctuations that do or could occur, and events that could
impact on the integrity of the system, eg, very low or very high temperatures (these extremes
tend to occur when the main is shallow or is not even buried), pipework maintenance and
replacement programmes, land use and development, and retention time or distance.
A sampling programme should not be based simply on the minimum number of samples
required for compliance but reflect good management practice (see Chapter 2: Management of
Community Supplies) and be specifically designed for each system. It must be reviewed
regularly to ensure it still meets its objectives and should be responsive to all types of change.
In selecting sampling points for the monitoring of a distribution system it is important that the
points chosen represent the water being supplied to the consumer and give a comprehensive
cover of the network. Points of high draw off should be featured, as should extremities of the
system, where deadends occur, and areas where breaches are more likely, eg, service reservoirs,
low usage areas where the FAC may have dissipated, old pipework, areas of low pressure, or
areas at risk of being excavated.
It is recommended that there be 2–4 times as many sites as the minimum number required, and
that these are rotated on a regular basis. At least one site should be sampled every sample round
in order to indicate trends, especially if FAC is measured at that site as well. The extra sites will
allow good coverage of the distribution system.
Service reservoirs tend to be contaminated more often than water mains, due to both breaches
in structural integrity and to dissipation of chlorine residual in low turnover reservoirs.
Therefore all service reservoirs should be inspected and sampled at least once during the course
of a year, provided they are connected to the supply at the time. If any are only used seasonally,
ie, just satisfying peak summer demand, they should be tested before going back on line.
Water suppliers should consider installing special sample taps off a short link from a watermain,
rather than using consumers’ taps. This will overcome problems such as accidents while
flaming, or obtaining a positive result because the (perhaps dirty) tap was not flamed.
The monitoring plan must be documented, ideally as part of or appended to the PHRMP. The
sampling scheduler facility in WINZ may be helpful in designing the monitoring plan.
The bacterial compliance criteria for water in the distribution system are discussed in
section 4.4 of the DWSNZ. Criterion 6A applies to the situation when only E. coli testing is used.
Criterion 6B is for zones supplying a population of over 500 and the water supplier has chosen
to substitute FAC monitoring for some of the E. coli monitoring; this is discussed further in
section 6.3.7.
The DWSNZ also cover bulk distribution zones. These are the parts of the distribution network
that deliver water from the treatment plant(s) to one or more distribution zones. Usually, but
not necessarily, they are owned and operated by a different water supplier, may or may not include service storage, and serve only a nominal number of consumers directly. A bulk
distribution zone may be identified due to its operational characteristics, or the characteristics
of the water it supplies, by agreement between the water supplier(s) and the DWA. See
section 4.4.7 of the DWSNZ for details.
Section 6.4 and section 17.2 of Chapter 17: Monitoring, Water Treatment and Drinking-water,
cover sampling.
6.3.7 Chlorine testing as a substitute for E. coli
Chlorine inactivation of pathogenic bacteria and viruses requires a combination of sufficient contact time and a sufficient chlorine concentration at the end of that contact time. Drinking-water with a low chlorine demand will maintain the residual for longer.
The hypochlorous acid molecule (HOCl) is a very effective bactericide and virucide. At alkaline
pHs, this dissociates to the hypochlorite ion (OCl-) which is not a very effective bactericide.
Chlorine becomes increasingly less effective as the pH rises above 8, see Chapter 15: Treatment
Processes, Disinfection. The disinfecting power of chlorine in water can be measured by FACE
(the FAC equivalent), which is the FAC concentration that would have the same disinfecting
power as the chlorine solution would have when adjusted to pH 8.
If chlorine is being used correctly and there is evidence that there is adequate chlorine
remaining at the completion of the inactivation step, chlorine monitoring can be used to reduce
the E. coli monitoring frequency required to satisfy bacterial compliance.
For water leaving the treatment plant, FACE concentrations are measured after a contact time
of at least 30 minutes; see DWSNZ section 4.3.2. Because water in the distribution system has had
a much longer contact time, much of it at a pH of less than 8.0, FAC measurements are appropriate
there.
Experience in New Zealand is that water leaving the treatment plant with a FACE of at least
0.2 mg/L is most unlikely to contain E. coli. Likewise, water in the distribution system only very
rarely contains E. coli if the FAC is over 0.2 mg/L. A further advantage in allowing substitution
is that chlorine test results are available immediately, whereas E. coli results take at least
24 hours.
Compliant online chlorine monitoring of water leaving the treatment plant gives a very high
level of confidence in the disinfection process. Therefore bacterial compliance criterion 2A
allows FACE monitoring in lieu of E. coli monitoring, DWSNZ section 4.3.2.1. But because
E. coli monitoring may be completely substituted, the FACE must be at least 0.2 mg/L for
98 percent of the time.
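As an illustration only (not part of the DWSNZ), the 98 percent requirement can be checked
against logged FACE readings with a simple calculation; the function name and readings below
are hypothetical:

```python
# Sketch only: check that FACE >= 0.2 mg/L for at least 98 percent of
# logged readings (the criterion-2A requirement described above).
def face_compliance(readings_mg_per_l, limit=0.2, required_fraction=0.98):
    ok = sum(1 for r in readings_mg_per_l if r >= limit)
    return ok / len(readings_mg_per_l) >= required_fraction

# 100 illustrative readings, one below 0.2 mg/L: 99 percent compliant
readings = [0.25, 0.22, 0.30, 0.21, 0.19] + [0.24] * 95
print(face_compliance(readings))  # True
```

In practice the calculation would run over the full compliance monitoring period, and a
well-run plant would also alarm on individual low readings rather than wait for the percentage
to slip.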
Bacterial compliance criterion 2B specifies the conditions that will allow a reduced level of
E. coli monitoring, DWSNZ section 4.3.2.2. Likewise, bacterial compliance criteria 4 and 5 allow
reduced E. coli monitoring, provided the ozone and UV disinfection processes are compliant, see
sections 4.3.4 and 4.3.5.
Figure A1.1 in DWSNZ shows how much FAC is required to produce 0.2 mg/L FACE at a pH
from 8.0 to 9.0. Figure 17.6 in Chapter 17: Monitoring (in the Guidelines), shows the percent of
undissociated HOCl at a wide range of pHs. These figures are diagrammatic, so it is not possible
to use them to convert mg/L FAC to mg/L FACE accurately.
This can be done more accurately using a spreadsheet, eg, Excel; see Table 6.1 for an example.
Enter the FAC readings in column A and pH in column B, then copy the following formula and
paste it into cell C2 to obtain FACE concentrations (the constant 283 is the water temperature
in kelvin, ie, 10°C). The formula is:
=IF(B2<8,A2,((A2*(1+((10^(-1*(3000/283-10.0686+(0.0253*283))))/10^-8)))/(1+((10^(-1*(3000/283-10.0686+(0.0253*283))))/(10^-B2)))))
Substitution of chlorine tests for E. coli tests cannot be allowed so readily for water in the
distribution system. This is because there is less control over the FAC once the water enters the
distribution system. If a breach in the distribution system occurs, there will be no way of
knowing whether there has been adequate contact time for microbial inactivation to have
occurred.
Table 6.1: Example spreadsheet for converting FAC to FACE

Row 1   Column A: FAC   Column B: pH   Column C: FACE
2       1.40            9.0            0.20
3       1.15            8.9            0.20
4       0.92            8.8            0.20
5       0.74            8.7            0.20
6       0.59            8.6            0.20
7       0.46            8.5            0.19
8       0.40            8.4            0.20
9       0.34            8.3            0.20
10      0.28            8.2            0.20
11      0.24            8.1            0.20
12      0.20            8.0            0.20
13      0.20            7.0            0.20
14      0.35            6.8            0.35
15      0.50            8.3            0.30
16      0.45            9.1            0.05
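The same conversion can be sketched outside a spreadsheet. The Python function below is
illustrative only (the function name is an invention); it uses the HOCl dissociation constant at
10°C (283 K), the temperature assumed by the spreadsheet formula, and treats FACE as equal to
FAC below pH 8, consistent with Table 6.1:

```python
# Sketch only: convert FAC (mg/L) at a given pH to FACE (mg/L), using
# the HOCl dissociation constant at 10 degC (T = 283 K).
T = 283.0  # water temperature, kelvin
PKA = 3000.0 / T - 10.0686 + 0.0253 * T  # ~7.69 at 10 degC
KA = 10.0 ** -PKA

def face(fac_mg_per_l: float, pH: float) -> float:
    """FAC concentration at pH 8 with the same disinfecting power."""
    if pH < 8.0:
        return fac_mg_per_l  # below pH 8, FACE is taken as equal to FAC
    return fac_mg_per_l * (1.0 + KA / 10.0 ** -8.0) / (1.0 + KA / 10.0 ** -pH)

print(round(face(1.40, 9.0), 2))  # 0.20, matching row 2 of Table 6.1
```

The temperature dependence of the dissociation constant means a production implementation
would take the measured water temperature as an input rather than fixing it at 10°C.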
For water supplies serving more than 500 people, DWSNZ section 4.4.2 (compliance
criterion 6B) allows some substitution of E. coli testing of water in the distribution system with
chlorine tests, subject to turbidity constraints and to the FAC being generally >0.2 mg/L. From
1995 the DWSNZ allowed partial substitution only for water supplies serving more than
30,000 people; the success of this substitution has allowed the approach to be extended. The third
addendum to the WHO Guidelines (2008) states in Table 8.27: “a chlorine residual should be
maintained throughout the distribution system. At the point of delivery, the minimum FAC
should be 0.2 mg/L”.
The DWSNZ state in section 3.1.1: “the DWA must assess the competence of the analyst for
commonly-performed plant or distribution system analyses (field tests), refer HDWAA 69ZL
e and f, and 69ZP h; analysts must be certified as competent if carrying out compliance testing”.
Once again, these sampling frequencies are the minimum required to demonstrate compliance;
additional process control testing is recommended. A lot can also be learned about the
distribution system if chlorine is monitored continuously at one or more sites.
Bacterial compliance criterion 7B, section 4.4.2 in the DWSNZ, specifies the conditions that
allow full substitution of E. coli monitoring in bulk distribution zones with online FAC (or
chlorine dioxide) monitoring.
Chlorine dioxide is at least as effective as chlorine at inactivating bacteria, and its
effectiveness is not pH dependent. Bacterial compliance criterion 3 allows water leaving the treatment plant to
be monitored by online chlorine dioxide measurement in lieu of E. coli monitoring, see
section 4.3.3 in the DWSNZ. Bacterial compliance criterion 6B allows some E. coli tests to be
substituted by monitoring the chlorine dioxide residual in the distribution system, see
section 4.4.2 in the DWSNZ.
Experience with chloramine disinfection in New Zealand is limited, so the DWSNZ do not allow
substitution of E. coli monitoring by monitoring chloramine residuals.
Disinfection at the treatment plant with ozone or UV light does not generate a residual that can
be carried into the distribution system, so E. coli substitution is not allowed there.
6.4 Sampling and testing
6.4.1 Sample handling
The consequences arising from obtaining faulty samples are serious (eg, declaring that a secure
bore water is no longer secure), so the sample collection technique must be thorough. Calling a
positive test result a ‘false positive’ or blaming it on a contaminated sample (ie, the sampler), a
frequently used excuse, is not acceptable; the minimum corrective action for this is retraining
the staff concerned.
It is possible to include some quality assurance steps in the sampling process. Some water
suppliers take a bottle of sterile water on the sample collection run and include it as a control
sample with the samples collected. Another technique is to take an empty sterile bottle on the
sample run and fill it back in the laboratory with sterile water for testing with samples collected.
Another approach is to collect one sample in duplicate on every sample run in order to develop a
history of repeatability. Water samplers should always take with them more sample bottles than
required so that if there is any suspicion about the integrity of the bottle-filling step, another
sample can be collected.
Ideally, sample sites should be shown on a sample map, with instructions about how to find
them, and must be able to be recognised unambiguously. If the sample is collected from a house
or other situation where there is more than one tap, the tap to be used must be indicated clearly.
It is important that the samples of water collected for testing are collected and transported
properly. Water samplers must be trained in aseptic technique. If the samples are invalid the
subsequent analysis could be a waste of time, and any reporting is likely to be misleading or not
accepted. All sample collectors should be trained in the correct procedures (which should have
been documented) and should be able to demonstrate their competence. Sample collection is
part of field testing, so the DWA will assess the competence of the sampler. Participating
laboratories should provide detailed sampling procedure instructions.
All water samples must be identified and labelled clearly. Samples to be included in a
monitoring programme should be labelled with a unique number that clearly identifies the
sampling site and can be interpreted by anyone familiar with the system for identification of
New Zealand water supplies. Sample containers must be labelled on the body of the container
not just on a lid, as these may become separated from the water sample during the laboratory
analysis.
Containers used for collecting microbiological samples must either be sterilised by the
laboratory before use or single-use pre-sterilised containers may be used. Laboratory
sterilisation requires either one hour at 170°C in a hot air oven for glass containers or
15 minutes in an autoclave at 121°C. A pressure cooker can be used if there is no alternative, but
the sterilisation time may then need to be extended and an autoclave indicator used.
The sample containers must have securely fitting stoppers or a leak-free sealing system. Sealing
the container must be a straightforward procedure that does not carry a risk of the sample
becoming contaminated. Sample containers should be filled leaving sufficient air space for the
sample to be thoroughly mixed by shaking before it is tested in the laboratory.
Where chlorine is used as a disinfectant for a water supply, it is important that the chlorine
residual in the sample is neutralised by the addition of sodium thiosulphate, so that it does not
continue to act. The thiosulphate must be added to the container before it is sterilised. It is not
acceptable to add the thiosulphate afterwards, as this may lead to contamination of the water
sample. For drinking-waters, 0.1 mL of a 3 percent solution of sodium thiosulphate will
neutralise up to 5 mg/L of FAC in a 120 mL sample.
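For other bottle sizes or chlorine levels, the quoted dose can be scaled. The sketch below
assumes (my assumption, not stated in the Guidelines) that the dose scales linearly with sample
volume and FAC concentration, which follows from the stoichiometry of the neutralisation; the
function name is illustrative:

```python
# Sketch only: scale the quoted thiosulphate dose (0.1 mL of a 3 percent
# solution neutralises up to 5 mg/L FAC in a 120 mL sample) to other
# bottle sizes, assuming linear scaling with volume and FAC.
def thiosulphate_dose_ml(sample_ml: float, fac_mg_per_l: float = 5.0) -> float:
    return 0.1 * (sample_ml / 120.0) * (fac_mg_per_l / 5.0)

print(round(thiosulphate_dose_ml(250.0), 2))  # ~0.21 mL for a 250 mL bottle
```

In practice laboratories simply dose for the maximum FAC expected, since a modest excess of
thiosulphate is harmless to the test.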
Specially dedicated taps off a short link to a water main can overcome problems of access and
flaming. Service reservoirs should also have dedicated taps; if samples have to be collected by
dipping, special sampling equipment that can be sterilised must be used. In choosing taps to
sample from, avoid those that are leaking or have attachments or hoses, unless these are a feature
of the drinking-water system.
There is some debate about flaming taps. Taps in pits, valve chambers, etc (if they have to be
used) should be flamed because they are likely to be contaminated by road dirt, dogs, etc. People
drink directly from taps in dwellings so, in theory, collecting a sample without flaming
represents the drinking-water supply. However, if a fixture contains E. coli (eg, splash from
dirty napkins on the tap in the washhouse) there is a possibility that the result does not reflect
the true condition of the water supply. If taps are unsuitable for flaming then an alternative
surface sterilisation is required, such as spraying with 70 percent alcohol or sodium
hypochlorite solution, but ensure any residue is well and truly flushed off. A study by DWI
(2004) found results for samples taken without prior preparation of the tap showed a number of
failures, mostly for total coliforms. In contrast, the results obtained after disinfection of the tap
– the normal sampling procedure – resulted in only a single failure (for enterococci).
Open the tap and let the water run to waste for several minutes before taking the sample to
represent the water in the system, unless investigating the quality of the first flush or stagnant
water in the pipe. When collecting samples for microbiological testing, fill the container without
prior rinsing. Sample bottles must be kept closed until they are about to be filled. Take care
when opening the container not to contaminate the neck of the container or the inside of the lid
or cap with fingers or to make contact with tap or surrounds. Seal the containers carefully, again
taking care not to contaminate the sample.
Both empty and filled sample containers must be stored in a clean environment. Empty
containers that have not been used should be returned to the laboratory to be resterilised if they
become dirty or there is any concern that the seal may have been broken. Devices such as strips
of autoclave tape on the necks of bottles may be used as indicators of seal integrity.
Samples must be transported to the laboratory as quickly as possible after collection and should
be kept cool and in the dark during transport. Water is not a natural environment for E. coli, so
they are not expected to increase in numbers unless the water contains the required nutrients
and is very warm. Water is such an unattractive environment that E. coli are more likely to die
than grow. Their metabolic rate is slower in cold water allowing them to remain alive longer.
If transport times exceed one hour the samples should be maintained below 10°C but not frozen.
Samples that arrive in the laboratory warmer than 10°C shall not be used for compliance testing
unless the temperature of the water has not increased during transit. This can be demonstrated
by:
1 measuring (and recording) the water temperature at the time of sampling and upon receipt
into the laboratory, or
2 observing that the ice or coolant used in the container (eg, chillibin) to transport the
samples is still frozen and that the sample bottles are packed in a manner that would allow
the water sample to cool.
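The acceptance rules above can be expressed as a simple check. This is a sketch only; the
function and parameter names are hypothetical, and the packing condition in point 2 is reduced
to a single flag:

```python
# Sketch only: acceptance logic for compliance samples arriving warmer
# than 10 degC, per the two demonstrations listed above.
def accept_for_compliance(temp_received_c: float,
                          temp_at_sampling_c: float = None,
                          coolant_still_frozen: bool = False) -> bool:
    if temp_received_c <= 10.0:
        return True  # within the transport temperature requirement
    # Point 1: recorded temperatures show no increase in transit
    if temp_at_sampling_c is not None and temp_received_c <= temp_at_sampling_c:
        return True
    # Point 2: ice/coolant still frozen (bottles packed so samples can cool)
    return coolant_still_frozen

print(accept_for_compliance(12.0, temp_at_sampling_c=14.0))  # True
```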
If the water sample has been collected for other tests as well (but obviously not containing
sodium thiosulphate), do the microbiological tests first. If samples cannot be processed
immediately on their arrival in the laboratory, they must be stored in a refrigerator, at a
temperature not exceeding 5°C. The time the samples are processed should be recorded on the
laboratory work sheet.
If the above temperature requirements are not satisfied, it may still be valid to process the
samples, depending on the bacterial history of the supply and the exact details of sample
temperature and transit time. However, the information must be used to modify the sample
transport technique.
The laboratory results are probably most reliable if the test is performed within six hours of
the sample being collected. Samples more than 24 hours old should be discarded.16 Tests
performed on such samples cannot be interpreted with any confidence, as bacterial counts may
increase, decrease, or remain the same over time. See section 4.3.6 of the DWSNZ. Sometimes
it may be impossible to satisfy all the temperature and time requirements, so there is an
advantage in collecting more than the minimum number of samples specified in the DWSNZ.
6.4.2 Test methods and sources
Bacterial compliance monitoring must be conducted by a laboratory recognised by the MoH for
that work, see IANZ (2007).
The DWSNZ (Appendix A2.1) have specified the referee methods for testing for E. coli, faecal
coliforms and total coliforms. These methods are described in Standard Methods for the
Examination of Water and Wastewater, APHA, 21st edition, 2005 and are already in wide use
in New Zealand laboratories.
16 There may be some exceptional circumstances where this is not possible, such as sampling remote water supplies where the
courier service cannot satisfy the 24-hour requirement. In these circumstances section 4.3.6.1 of the DWSNZ refers readers to
section 3.1.1, which states: “Special procedures may be authorised in writing by the Ministry for small or remote drinking-water supplies”.
Non-referee methods are acceptable for water testing provided that their performance relative
to the referee test is known, that there is provision for checking that the test continues to
perform satisfactorily, and that the method has been approved for compliance testing by the
Ministry of Health. The performance checking can be done either in-house or by regular parallel
testing of samples by laboratories using a referee method.
A report was prepared by NIWA (2005) for the Ministry of Health: A Proposal for Strength of
Agreement Criteria for Lin’s Concordance Correlation Coefficient. A simple test is proposed for
establishing the equivalence of an analytical method with the referee method for E. coli
prescribed in the DWSNZ. A concordance calculator enables the strength-of-agreement to be
calculated.
Presence/absence tests, and tests such as the Colilert and Colisure tests, now have international
recognition and are approved by the MoH as methods for testing water supplies. They have been
available for some years and have been developed to the stage where they are an extremely
useful and simple approach for testing water supplies where ready access to a routine laboratory
is not available.
Laboratories using presence/absence tests will also need to be able to perform, or get another
laboratory to do for them, enumerations when a positive test result occurs. It is essential when
problem solving a positive result that there are bacterial counts to allow an estimation of the
severity of the problem and to monitor subsequent remedial action, DWSNZ sections 3.1.2 and
4.4.6.
Presence/absence (P/A) tests are unsuitable for testing water supplies known to have E. coli
problems, as delays in obtaining quantitative results would make problem-solving unacceptably
slow.
Whatever method is chosen for detection of E. coli or faecal coliforms, the importance of
resuscitating or recovering strains that have been sub-lethally damaged by environmental
stresses or during drinking-water treatment must be considered.
It is not acceptable to call a positive test result a false positive. False positives can occur, but are
rare when using acceptable test methods. If it is believed that some positive P/A test results are
false positives (ie, caused by bacteria other than E. coli), follow this procedure:
culture the bacteria growing in the P/A broth on to a selective medium on which E. coli can be
recognised (eg, EMB agar); isolate and purify each colony type; identify each of the isolates
taxonomically; and inoculate each pure culture into the P/A test medium. The result is to remain
as a transgression unless all of the following conditions are satisfied:
1 none of the cultures tested are E. coli
2 all of the isolates are identified as something other than E. coli
3 at least one of the isolates gives a positive P/A reaction upon retesting.
E. coli can be enumerated by incubation on selective solid media and by incubating a series of
inoculated tubes containing selective broths. The former method involves counting positive
colonies and reporting the results as the number per 100 mL. The latter technique, the multiple
tube technique, reports results as the most probable number (MPN) per 100 mL, and this is
obtained by looking up MPN tables. Standard Methods (APHA 2005) offers a fairly restricted
arrangement of tubes (numbers thereof and volumes), and therefore a correspondingly small
number of MPN tables. The detection limit in their tables is 1.1 MPN per 100 mL, which is
greater than the DWSNZ MAV of <1 per 100 mL.
Standard Methods includes an equation (called Thomas’ simple formula) for calculating the
MPN when different volumes or numbers of tubes are used. Provided a total sample volume of
more than 100 mL is used in the multiple tube technique, the detection limit becomes suitable
for bacterial compliance purposes. However, Thomas’ simple formula produces approximate results. NIWA
has developed a more exact approach using a program called XactMPN (McBride 2003).
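Thomas’ simple formula can be sketched as follows. The formula itself is as given in Standard
Methods (positives × 100, divided by the square root of the product of mL of sample in negative
tubes and mL of sample in all tubes); the tube arrangement in the example line is illustrative
only:

```python
# Sketch of Thomas' simple formula (Standard Methods):
#   MPN/100 mL = positives * 100 / sqrt(mL in negative tubes * mL in all tubes)
import math

def thomas_mpn_per_100ml(positive_tubes: int,
                         ml_in_negative_tubes: float,
                         ml_in_all_tubes: float) -> float:
    return (positive_tubes * 100.0) / math.sqrt(
        ml_in_negative_tubes * ml_in_all_tubes)

# Illustrative only: 11 tubes of 10 mL, 1 positive (100 mL negative, 110 mL total)
print(round(thomas_mpn_per_100ml(1, 100.0, 110.0), 2))  # 0.95
```

As the text notes, this is an approximation; an exact maximum-likelihood approach such as
NIWA's XactMPN program is preferable where precision matters.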
Compliance with bacterial compliance criteria 2A and 7B can be achieved by chlorine residual
monitoring (FACE and FAC respectively) alone. See Chapter 15: Disinfection, section 15.5.1.3 for a
discussion on chlorine measurement.
6.4.3 Laboratory competency
The DWSNZ (section 3.1.1) require that water testing laboratories that test water samples for
compliance are on the Ministry of Health’s Register of Laboratories that have been recognised
by the Ministry as competent for the purpose. See Chapter 1: Introduction, section 1.3.10 for a
summary of some requirements of recognised laboratories.
The Ministry will require laboratories to identify water samples with the unique drinking-water
supply code published in the Register of Community Drinking-water Supplies in New Zealand,
to be using acceptable methods (Appendix 2 of the DWSNZ), to have adequate documented
quality assurance procedures, and to demonstrate that they are competent by satisfactory
performance in an inter-laboratory comparison programme.
It is essential that laboratories have documented quality assurance procedures. This does not
need to be in the form of very detailed manuals but the basic procedures of the laboratory must
be written down. It needs to be quite clear what procedure is being used and exactly how the
tests are carried out. All key activities must be documented and everyone involved in testing,
from sample collector to the person reporting the results, must have a thorough understanding
of their responsibilities and duties, any problems that could arise and how they should be dealt
with. All activities undertaken must be recorded so that it is quite clear, from the time of
collection of the sample to the reporting of the results, who did what and when.
All laboratories, regardless of size, must be able to demonstrate competence. This means they
should be audited independently and, ideally, participate in an inter-laboratory proficiency
programme. In addition, there are a number of other mechanisms for showing competence, eg,
spiked samples, split samples, duplicates, and positive and negative controls, both within the
laboratory and in collaborative tests with other laboratories.
The positive control sample is particularly important. If all the water supply samples give a
negative result, this could be explained by all the samples being free of E. coli, but equally it
could be explained by the test not working. Maybe the incubation temperature was too hot or
cold, or maybe there was an inhibitor in the water samples that caused the test not to work.
With a positive control sample included in the same batch of samples, this problem is resolved:
a) if the control sample gives a positive result, then the negative tests demonstrate the
absence of E. coli in the samples that test negative
b) if the control sample gives a negative result, then the samples giving negative results may
also contain E. coli that did not grow under the conditions of the test, invalidating the
results for that batch of samples.
A negative control sample testing positive suggests contamination of the control sample, of the
media or equipment, or handling, or sample identification. If water supply samples in this batch
also tested positive, interpretation of results will be difficult.
See Chapter 17 for further discussion.
6.5 Transgressions
6.5.1 Response
An important aspect of a drinking-water monitoring programme is the response that is made to
a transgression. When a sample of drinking-water is found to contain E. coli it is essential that
there be an immediate response to identify the possible source of the contamination and to
implement corrective actions. The minimum response recommended is shown by flow diagrams
in section 4 of the DWSNZ, Figures 4.1 and 4.2.
Sampling and testing must continue through this response phase at an elevated level. This
means that sampling should be on at least a daily basis. It is not satisfactory to take a sample
and then wait for the result of the test before further samples are collected. There must be a
series of samples being evaluated over a period of time to give a comprehensive picture of the
extent of the problem. The DWSNZ require that at least three consecutive days must be free of
positive E. coli results before corrective action may be considered to have been successful. This
means three days of tests, not tests three days apart!
Water suppliers’ PHRMPs must also document planned responses to events, other than failures
to satisfy the criteria in the DWSNZ, that will obviously lead to a bacterial transgression or
non-compliance. These will tend to be supply-specific, but will include matters such as dealing with
power cuts, running out of disinfectant, failure of the disinfection system, disinfection
demand exceeding the maximum dose rate, labour problems, breaches of security, and spills of
wastewater or other contamination.
a) Response to finding E. coli in secure groundwater
This topic has already been discussed in section 6.3.5, which refers to sections 4.3.9 and 4.5.5 of
the DWSNZ. Also, read section 3.2 (Groundwater) of Chapter 3: Source Waters for aspects
concerning secure groundwater, water quality and bore head protection.
b) Response to finding E. coli in the water leaving a treatment plant
Water suppliers using bacterial compliance criteria 1 and 2B must monitor E. coli. Water
suppliers using bacterial compliance criteria 4 and 5 in such a manner that protozoal
compliance is not achieved also must monitor E. coli. Remedial actions for when E. coli are
found are covered in section 4.3.9 of the DWSNZ.
The detection of E. coli in samples taken from water leaving the treatment plant is a major
concern to the plant operator as it indicates failure of one or more of the barriers and a major
risk to the community of illness from drinking the contaminated water. For the susceptible
sections of the population such as babies, the elderly, and those with a number of medical
conditions, contaminated drinking-water may, in the absence of major pathogens, still be the
cause of significant illness. Thus the supply authority must respond immediately and effectively
to the detection of E. coli in repeat samples, eg, by additional disinfection and/or issuing a boil
water notice (see Appendix, this chapter).
Other conditions may give rise to the need for a boil water notice, such as an increase in the
turbidity of the final water after heavy rain, indicating a breakdown in the treatment process, or
when the water entering the distribution system is turbid and unchlorinated. Issuing a boil
water notice must be considered at an early stage in the investigation and not seen as a last
resort when all else has failed. The community’s health is paramount and there is a moral
obligation for the water supply authority to alert the community to potential hazards. Boil water
notices should remain in force until the water supply has returned to a satisfactory quality;
however, they are not meant to be a permanent solution to a sub-standard supply.
The response to possible scenarios should be documented in the PHRMP. Firstly, see Figure 4.1
and section 4.3.9 of the DWSNZ.
In attempting to discover the cause, records of the previous day’s turbidity, pH, and FAC levels
in the final water should be examined, as well as the turbidity in the raw water and throughout
the treatment process. Check all records of the operation, inspection of disinfectant dosage, and
check all relevant calibrations.
If E. coli were found in a sample of water leaving the treatment plant the previous day, then that
water may still be in the distribution system today. This needs to be checked, because
contamination events that exceed 24 hours can be serious. Knowledge of the distribution system
will indicate where the extra sample(s) should be collected. The number of additional
distribution system samples collected will depend on the results of the inspection of plant
records, the size of the distribution system, and the number of E. coli present in the sample.
c) Response to a transgression of an operational requirement
Water suppliers using bacterial compliance criteria 2A, 2B, 3, 4 and 5 for water leaving the
treatment plant must monitor parameters related to the performance of the disinfection process
being used. These operational requirement tests can include FAC, chlorine dioxide, and ozone
concentrations, UV intensity, pH, turbidity, temperature and flow. Remedial actions are covered
in section 4.3.9 of the DWSNZ.
Satisfying bacterial compliance criteria 2A and 3 does not require any E. coli monitoring, so
transgressions must be attended to immediately.
A well-managed water treatment plant will have introduced control limits that trigger corrective
actions before reaching transgression level. Potentially useful actions will appear in the PHRMP.
d) Response to finding E. coli in the water in the distribution system or zone
Finding E. coli in one part of a distribution zone should trigger an immediate search for the
source of that contamination. If the level of contamination is high (≥10 E. coli per 100 mL) the
need to warn consumers in the affected areas should be considered. Where the source of the
contamination is found quickly and corrected, there may be no need to alert the community
because the hazard no longer exists. However, the drinking-water assessor should still be
informed because there has possibly been an opportunity for transmission of waterborne
disease. If the source of the contamination is not readily apparent, or is not able to be corrected
immediately, the community must be informed.
As with all systems failures, it is important that the failure and the corrective actions are well
documented and that sampling regimes remain enhanced until there is complete confidence
that the corrective actions have been effective and no recurrence of the failure is likely. This will
require consideration of various possibilities.
Firstly, see Figure 4.2 and section 4.4.6 of the DWSNZ. The distribution system can comprise
three clearly different components; these are discussed separately below.
The water suppliers’ local pipework
Say the laboratory reports that E. coli has been found in a sample or samples collected the
previous day. One of the responses the DWSNZ requires is to resample immediately. This
requires some deliberation:
• if the water leaving the treatment plant also contained E. coli, and all samples collected that
day from the distribution system contained E. coli, then it is highly likely that there is a
large-scale public health problem, due to contaminated water or inadequately disinfected water
passing through the system
• if the water leaving the treatment plant also contained E. coli, but only one sample (of many)
from the distribution system contained E. coli, then the problem may have existed for only a
relatively short period, or the sampling may have just detected the beginning of a large-scale
problem
• if the water leaving the treatment plant did not contain E. coli, and there had been only one
sample collected from the distribution system, then it is possible that the cause was
inadequately disinfected water passing through the system, with the cause (at the treatment
plant) largely diminished by the time the samples were collected; or it could be a spasmodic
contamination event in the distribution system
• if the water leaving the treatment plant did not contain E. coli, and the one sample with
E. coli was one of many collected from the distribution system that day, then the problem
may be either spasmodic, or the sampling detected the end of a larger-scale problem.
Each of these scenarios suggests a different response. The two most practical responses are:
• the minimum resampling should include the sample site that produced the E. coli; if the
contamination was local, this will show whether the problem still persists
• the previous day’s water will now be further through the distribution system and it may still
be contaminated; an understanding of the network will indicate the most likely sample sites
to check this.
There may be other features or knowledge that suggest a different approach. For example:
• if the FAC level in the positive sample was lower than expected, it may indicate that some
dirty water entered the distribution system while it was being repaired, or
• it may indicate that a service reservoir had been releasing water, or
• it may indicate that there had been some water leaving the treatment plant with less FAC
than normal, or no FAC, for a while
• if the total plate count of heterotrophic bacteria at the site where E. coli were found was
higher than usual, it may indicate that contaminated water entered the system; check where
the mains repair gang has been operating, or whether the Fire Service has been using or
testing fire hydrants
• if the FAC level in the positive sample was within the normal range, it is possible that the
contamination was very recent and/or very near the sample site.
For discussion on heterotrophic bacteria, see WHO (2003).
Throughout the above discussion, it is assumed that appropriate backflow prevention is in place.
Guidelines for Drinking-water Quality Management for New Zealand 2013
211
The numbers of E. coli found will also suggest different actions. For example, finding several samples with more than, say, 10 E. coli per 100 mL should prompt a much more intensive and urgent response than finding just one sample with 1 E. coli per 100 mL.
Each water supply is unique, so the response when finding distribution system samples with
E. coli should be based on the characteristics of the supply, with due acknowledgement of
previous episodes. The various scenarios should be addressed in the PHRMP so the procedure is
documented before the event, and valuable time is not lost.
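The graded response described above can be sketched in code. This is an illustrative outline only: the thresholds and category names are assumptions made for the example, not DWSNZ requirements, and each supply's PHRMP should define its own triggers.

```python
# Illustrative sketch only: thresholds and category names are assumptions,
# not DWSNZ requirements. Each supply's PHRMP should define its own triggers.

def response_level(results_per_100ml):
    """Suggest a response urgency from a day's distribution-system
    E. coli counts (per 100 mL). Returns 'urgent', 'elevated' or 'routine'."""
    positives = [c for c in results_per_100ml if c > 0]
    if len(positives) >= 2 and any(c >= 10 for c in positives):
        return "urgent"    # several samples with high counts: intensive response
    if positives:
        return "elevated"  # at least one positive: resample and investigate
    return "routine"       # no E. coli detected

# Example: one low positive among many clear samples
print(response_level([0, 0, 1, 0]))    # elevated
print(response_level([12, 0, 15, 3]))  # urgent
```

A real implementation would also weigh supply-specific factors (FAC levels, recent repairs, previous episodes), as discussed above.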
Service reservoirs
The response will depend on how the reservoir or tank is operated. Some are in constant use
with such a short retention time that the FAC concentration in the water leaving the reservoir is
not much lower than that going in.
Some have a very long retention time so that FAC is rarely found in the water leaving the
reservoir. Others are only used to maintain pressure in hilly areas during periods of peak
consumption. Some have a common inlet/outlet, so some water will be fresh and some old.
Advice on service reservoir design and operations appears in Chapter 16: The Distribution
System.
Collecting samples from service reservoirs can be a challenge and may require special
techniques and equipment. It is recommended that sample taps be included at the design stage
of new reservoirs, and if possible, installed during a shutdown of existing reservoirs.
Finding E. coli in a service reservoir is usually a sign that it is not as secure as it should be. Apart from problems arising from poor design, problems can result from contaminated water entering through cracks in the concrete roof, or through the walls if the reservoir is partly submerged (Kettell and Bennett 1993). Problems can also result from hatches being left open or being prised open by vandals, or from gaps big enough to allow birds or other animals (or their wastes) entry. As well as collecting the samples, the water sampler should also inspect the reservoir.
The PHRMP should include a service reservoir inspection and maintenance programme.
Bulk distribution zones
The response when finding E. coli (compliance criterion 7A) should be as for the water suppliers’
local pipework above, except that the previous day’s water will now be further through the
distribution system and this probably means in another authority’s system. The responses that
should follow discovery of E. coli in a bulk distribution zone should be documented in
agreement with the client(s), before E. coli are found. A minimum requirement must be to
advise the client of the discovery.
If the FAC concentration falls to transgression level (compliance criterion 7B), investigate the cause immediately; see DWSNZ section 4.4.7.5. Possible remedial actions should be anticipated in the PHRMP, and may include: checking records of the FAC leaving the treatment plant, checking chlorine consumption against flow, and recalibrating monitoring equipment.
6.5.2 Record keeping
For each water supply there should be a fully documented description of the microbiological
monitoring programme. The documentation should include details of the treatment plant and
the barriers, the sampling regime and the results of the testing, both routine and non-routine.
The first step in the record keeping process will be to determine how many samples are to be
taken, and when. This is decided after evaluation of the nature of the source water, the type of
treatment process, and the extent and age of the distribution system. Separate calculations must be made for the water leaving the treatment plant and for water in the distribution system. These calculations should be based on a hazard analysis of the system, identifying any critical points in the process and system where enhanced sampling would provide good assurance of process efficiency, monitoring any weak points in the distribution system, and remaining responsive to external factors that could affect efficiency. Sampling points must be identified clearly and evaluated to give comprehensive coverage of the system.
The results of the routine sampling must be kept in an easily accessible form and must include
an automatic alert when transgressions occur. This could be a function of the laboratory
undertaking the tests. The laboratories must be provided with clear instructions regarding to
whom transgressions are to be reported, and how. Once a transgression is notified, the water supplier should follow the procedures documented in the PHRMP and record all outcomes of this response. Water supply managers may wish to include a format for recording the
follow-up procedure in the PHRMP. At the end of a period of non-conformance, the episode
should be analysed and the introduction of procedures to prevent a recurrence considered.
Action plans must allow for contingencies such as the absence of any staff.
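As a minimal sketch of the automatic alert described above, the following shows one way a laboratory or supplier system might flag transgressions. The record fields and the notify callback are hypothetical, for illustration only; the PHRMP should specify the real record format and reporting chain.

```python
# Minimal sketch of an automatic transgression alert for routine results.
# The record fields and the notify() callback are hypothetical; the PHRMP
# should specify the real record format and reporting chain.

def check_results(results, notify):
    """Flag any sample containing E. coli and pass a message to notify().
    Each result is a dict with 'site', 'date' and 'ecoli_per_100ml' keys."""
    transgressions = [r for r in results if r["ecoli_per_100ml"] > 0]
    for r in transgressions:
        notify(f"E. coli transgression at {r['site']} on {r['date']}: "
               f"{r['ecoli_per_100ml']}/100 mL")
    return transgressions

# Usage: collect alert messages in a list (a real system might email or page)
alerts = []
check_results(
    [{"site": "Zone A", "date": "2013-10-01", "ecoli_per_100ml": 0},
     {"site": "Zone B", "date": "2013-10-01", "ecoli_per_100ml": 2}],
    alerts.append)
```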
Where a number of transgressions occur, it is essential that a complete evaluation of the water supply is carried out to determine how the situation can be improved. In extreme cases this may lead to a
recommendation that a source no longer be used or that major improvements to the process and
system be implemented to assure compliance with the DWSNZ. The information for making
such decisions can only come from well-kept records that give a comprehensive overview of all
test results, problems and attempted solutions.
Reporting requirements are covered in section 13 of the DWSNZ.
Appendix: Boil water notices
Water suppliers need to accept that boil water notices may be needed at some stage to address short-term problems. These need to be considered in advance, ideally as part of the Public Health Risk Management Plan (PHRMP). The plan should address:
• the purpose of boil water notices
• which situations should prompt a boil water notice to be issued
• how to handle situations that boil water notices cannot address
• who should initiate, approve, authorise and release a boil water notice
• who (in the water supply authority) should be informed
• who else needs to be told, including those with special needs
• maintaining a current contact list of all involved, including emergency contacts
• each person's responsibilities, including those outside the water supply authority
• what the boil water notice should say
• how all those affected shall be informed of the boil water notice
• how to inform those concerned of progress in dealing with the situation
• when an alternative water supply should be provided, and how to do this
• how and whom to advise that the boil water notice has been withdrawn
• follow-up procedures to assess performance and improvements.
See DWI (2012) for a discussion on the effectiveness of different methods of informing the
public of the need to boil water.
References
APHA. 2005. Standard Methods for the Examination of Water and Wastewater (21st edition).
American Public Health Association, American Water Works Association, Water Environment
Federation.
DWI. 2004. Quality of Drinking Water in Public Buildings. Report No: DWI 6348. 167 pp.
http://dwi.defra.gov.uk/research/completedresearch/reports/DWI70_2_164_public%20buildings.pdf
DWI. 2012. Improving Communication on Cryptosporidium and ‘Boil Water’ Notices: Lessons from
Pitsford. Final report to the Drinking Water Inspectorate. 5 pp.
http://dwi.defra.gov.uk/research/completed-research/2000todate.htm
IANZ. 2007. Supplementary Criteria for Accreditation No. 1.2/2.2: Ministry of Health Register of
Water Testing Laboratories (2nd edition). Auckland: International Accreditation New Zealand,
August. See: http://www.ianz.govt.nz/
Kettell D, Bennett N. 1993. Lyttelton Water Supply: 1992 annus horribilis. New Zealand Water and
Wastes Annual Conference.
McBride GB, Ellis JC. 2001. Confidence of compliance: a Bayesian approach for percentile
standards. Water Research 35(5): 1117–24.
McBride GB. 2003. Preparing Exact Most Probable Number (MPN) Tables Using Occupancy
Theory, and Accompanying Measures of Uncertainty. NIWA Technical Report 121, 63 pp.
http://lib3.dss.go.th/fulltext/Journal/J.AOAC%201999-2003/J.AOAC2003/v86n5(sepoct)/v86n5p1084.pdf
McBride GB. 2005. Using Statistical Methods for Water Quality Management: Issues, Problems
and Solutions. New York: John Wiley & Sons.
Ministry of Health. MoH Register of Community Drinking-Water Supplies and Suppliers in New
Zealand. Wellington: Ministry of Health. Available at: http://www.moh.govt.nz/water then select
Publications and find the Register.
Ministry of Health. MoH Register of Recognised Laboratories. Available at: http://www.moh.govt.nz/water then select
Publications and find the Register.
MoH. 2005. Drinking-water Standards for New Zealand 2005 (revised 2008). Wellington: Ministry of
Health.
NIWA. 2005. A Proposal for Strength of Agreement Criteria for Lin’s Concordance Correlation
Coefficient. Prepared for the Ministry of Health by GB McBride, NIWA Client Report
HAM2005-062. http://www.moh.govt.nz/moh.nsf/pagesmh/4269?Open
USEPA. 2006. Ultraviolet Disinfection Guidance Manual for the Long Term 2 Enhanced Surface
Water Treatment Rule. EPA-815-R-06-007. Washington: United States Environmental Protection
Agency, Office of Water. Available at:
http://www.epa.gov/ogwdw/disinfection/lt2/pdfs/guide_lt2_uvguidance.pdf or go to
http://water.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
WHO. 2003. Heterotrophic Plate Counts and Drinking-water Safety: The significance of HPCs for water quality and human health. 256 pp.
http://www.who.int/water_sanitation_health/dwq/hpc/en/index.html
WHO. 2004. Guidelines for Drinking-water Quality (3rd edition). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html see
also the addenda.
WHO. 2004a. Water Treatment and Pathogen Control: Process efficiency in achieving safe
drinking water. 136 pp. www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2011. Guidelines for Drinking-water Quality 2011 (4th edition). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
Chapter 7: Virological compliance
7.1 Introduction
No maximum acceptable values (MAVs) have been set for human viruses in the Drinking-water
Standards for New Zealand 2005 (revised 2008). It is likely that a MAV or MAVs will be
established in a future edition. This chapter foreshadows such developments.
In the absence of any MAVs for viruses in the current DWSNZ it should be understood that if
they are specifically sought in drinking-water, they should not be detected. If detected, advice
should be sought from the relevant health authorities.
Neither the Australian Drinking Water Guidelines (NHMRC, NRMMC 2011) nor the WHO
(2011) Guidelines include a guideline value for any viruses.
There are more than 140 different types of human enteric viruses that may contaminate potable
source waters. These include several important groups (Hepatitis A virus, Hepatitis E virus, norovirus, enterovirus and adenovirus) that have been associated with waterborne illness and are capable of causing severe, and in some cases fatal, infections. Datasheets have been prepared
for the more important viruses.
Viruses are obligate intracellular parasites, which means they cannot grow or multiply outside
their host. Viruses simply consist of a nucleic acid genome surrounded by a protein capsid and,
in some cases, a lipoprotein envelope. These viruses are very small, ranging from 20–80 nm
(0.02–0.08 micrometres) in diameter; see Figure 4.2 in Chapter 4 to gain a perspective of their
size.
Human enteric viruses are present in the gut, respiratory tract and occasionally urine of an
infected person, and are discharged with body wastes into wastewater and the environment.
Infected people do not always show signs of illness (they are asymptomatic) but they will still
produce viruses in their wastes. Specific viruses or strains of viruses are not always present in a
community at any one time, but representatives of the large groups (eg, adenovirus or
enterovirus) are generally present on most occasions.
Enteric viruses may be found in high numbers in domestic wastewater. Recent New Zealand
studies have shown adenovirus and enteroviruses to be present in concentrations greater than 10,000
infectious virus units per litre of wastewater (DRG 2002). The number of viruses in wastewater
varies with the level of infection in the community but, in general, human viruses will always be
present in wastewater, averaging 100–1000 infectious viruses/L, occasionally reaching very high
levels of >10,000 infectious virus/L (Lewis et al 1986). Wastewater treatment processes that do
not include a disinfection step may be inefficient in removing or inactivating viruses
(<90 percent removal) so viruses may be found in the raw water.
Human enteric viruses cannot multiply in the environment once outside the host. The viruses
are characterised by the ability to survive (ie, retain capability to cause infection) for days, weeks
or longer, in the environment depending on the type of water, season and other factors (Hunter
1997).
A large proportion of the human viruses present in source drinking-waters will normally be
removed or inactivated by well-operated standard drinking-water treatment processes.
Routine monitoring for viruses in treated water and source water is currently impractical in most situations in New Zealand because of the high cost of sampling and analysis, and the difficulty of detecting the full range of viruses that may occur.
Not all viruses pose a health risk to humans. For example, a pesticide product, carpovirusine
(see PMEP), contains the active ingredient codling moth granulosis virus (CpGV). This
substance appears on the NZFSA’s complete database of Agricultural Compounds and
Veterinary Medicines (ACVM) as at 2009 (see https://eatsafe.nzfsa.govt.nz/web/public/acvmregister and select entire register). It is also approved in many other countries to control codling
moth on apples and pears.
National Guidelines for separation distances based on virus transport between on-site domestic
wastewater systems and wells have been developed (ESR 2010a).
7.2 Health significance of human viruses in drinking-water
Section 11.2 of WHO (2004) begins with:
Viruses associated with waterborne transmission are predominantly those that can infect
the gastrointestinal tract and are excreted in the faeces of infected humans (enteric
viruses). With the exception of hepatitis E, humans are considered to be the only source of
human infectious species. Enteric viruses typically cause acute disease with a short
incubation period. Water may also play a role in the transmission of other viruses with
different modes of action. As a group, viruses can cause a wide variety of infections and
symptoms involving different routes of transmission, routes and sites of infection and
routes of excretion. The combination of these routes and sites of infection can vary and
will not always follow expected patterns. For example, viruses that are considered to
primarily cause respiratory infections and symptoms are usually transmitted by person-to-person spread of respiratory droplets. However, some of these respiratory viruses may
be discharged in faeces, leading to potential contamination of water (eg, see influenza
virus datasheet) and subsequent transmission through aerosols and droplets. Another
example is viruses excreted in urine, such as polyomaviruses, which could contaminate
and then be potentially transmitted by water, with possible long-term health effects, such
as cancer, that are not readily associated epidemiologically with waterborne transmission.
Hepatitis A virus, Hepatitis E virus, norovirus, enterovirus, adenovirus and rotavirus may occur
in drinking-water where they are present in the source water and when water treatment does
not remove them completely. Very few human enteric viruses (1–50 virus particles depending
on type) are required to produce an infection in a susceptible person (Hunter 1997; Teunis et al
2008). The symptoms generally attributed to enteric viruses are gastroenteritis and diarrhoea
but they can also cause hepatitis, respiratory, central nervous system, liver, muscular and heart
infections. Some waterborne viruses have also been associated with some forms of diabetes,
chronic fatigue syndrome and dementia (Nwachcuku and Gerba 2004; Ashbolt 2004; Klemola
et al 2008). The major groups of viruses contaminating water are discussed below but may not
represent all the viruses likely to be transmitted by water. It is reasonable to expect that further
important groups of waterborne viruses will be detected in the future and that these will most
likely cause atypical waterborne disease (Nwachcuku and Gerba 2004; Ashbolt 2004).
Norovirus: this group of caliciviruses includes the Norwalk and Norwalk-like viruses. Members
of this group are strongly associated with waterborne outbreaks in many parts of the world.
Symptoms of infection are self-limiting and include vomiting, diarrhoea and nausea over 24–48
hours. Norovirus is quite prevalent in New Zealand and is responsible for a large proportion of
viral gastroenteritis reported to health authorities (ESR 2004). This virus is one of the easiest to
link to a common source outbreak as the symptoms occur rapidly after contact with the virus
(approximately 24 hours). More than 200 people developed acute gastroenteritis in July 2006
when sewage contaminated the water supply (which had inadequate treatment) at a South Island
ski resort. The illness was caused by norovirus genogroup GI-5 (Hewitt et al 2007).
Hepatitis A and E: Hepatitis A and E have a relatively low occurrence in New Zealand (ESR
2004) but induce quite significant symptoms including fever, malaise, anorexia and jaundice.
The disease caused by these viruses is essentially clinically indistinguishable and is generally
self-limiting but has a 1 to 2 percent mortality rate. The infectious doses for these viruses are
relatively low (10–100 viruses) and symptoms do not occur until 10–50 days after infection.
Internationally Hepatitis A and E outbreaks have frequently been associated with water
(Kasorndorkbua et al 2005; Meng 2005; Vasickova et al 2005; Kuniholm et al 2009).
Enteroviruses and adenoviruses: these two different groups represent the viruses that are
most commonly found in contaminated surface water (Ashbolt 2004). These viruses produce a
very broad range of symptoms including respiratory, skin and eye, nervous system, liver, heart
and muscular systems. Gastroenteritis with vomiting and diarrhoea is a less common outcome
of infection with these viruses and is limited to only a few adenovirus and enterovirus types.
Reported waterborne outbreaks of these viruses, other than in swimming pools, are very
infrequent. It is not clear whether lack of reporting is because the dominant symptoms produced
by these viruses are not those traditionally associated with water or food borne disease, or
because such outbreaks are indeed rare (Hunter 1997).
Rotavirus: rotavirus has been detected in sewage, rivers and lakes, and treated drinking-water.
Transmission occurs via the faecal-to-oral route. Cases of infection tend to be sporadic but several
waterborne outbreaks have been reported (Gratacap-Cavallier et al 2000; Parashar et al 2003;
Amar et al 2005). Rotaviruses are responsible for a large proportion of severe episodes of
diarrhoea in small children and infants, and they also cause gastroenteritis in the elderly. They are
responsible for as much as 50 percent of the gastroenteritis in hospitalised paediatric patients
during the cooler months of the year in temperate climates (Parashar et al 2003). Acute
infection is characterised by the abrupt onset of severe watery diarrhoea with fever and vomiting.
Dehydration and metabolic acidosis may develop, resulting in death if untreated. Children aged
6 to 24 months are the most severely affected. Rotaviruses are ubiquitous, infecting over
90 percent of all children up to three years of age internationally (Pérez-Vargas et al 2006; Brooks
et al 2007). New Zealand, however, does not have a human rotavirus surveillance programme so
the prevalence of rotavirus and its disease burden cannot be estimated.
Virus infections resulting from correctly treated water have not been reported in New Zealand
(ESR 2004), although internationally such outbreaks are recognised (Hunter 1997; Hrudey and
Hrudey 2007). Human viruses have been reported to occur at very low levels (0.1–1/100 L) in
conventionally treated drinking-water in many countries (Vivier et al 2004) including New
Zealand (Kim 2005).
Estimations of viral disease risk using standard risk assessment techniques with a high
infectivity virus predict the surprisingly high annual risk of infection of between 1:3 and 1:25
from conventionally treated drinking-water contaminated by viruses at low levels (~1 virus per
100 litres) (Gerba and Rose 1992).
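The kind of screening-level calculation behind such estimates can be illustrated as follows. This is a sketch only: the exponential dose-response model, the infectivity parameter r and the 2 L/day consumption are illustrative assumptions, not the published Gerba and Rose (1992) values.

```python
import math

# Screening-level sketch in the spirit of the risk estimates cited above.
# The exponential dose-response model, the infectivity parameter r and the
# 2 L/day consumption are illustrative assumptions, not the published
# Gerba and Rose (1992) values.

def annual_infection_risk(viruses_per_litre, litres_per_day=2.0,
                          r=0.05, days=365):
    """Daily infection probability P = 1 - exp(-r * dose), compounded
    over a year of independent daily exposures."""
    daily_dose = viruses_per_litre * litres_per_day
    p_daily = 1.0 - math.exp(-r * daily_dose)
    return 1.0 - (1.0 - p_daily) ** days

# ~1 virus per 100 litres, as in the text above
risk = annual_infection_risk(0.01)
print(f"annual infection risk: {risk:.2f} (about 1 in {1 / risk:.0f})")
```

With these illustrative parameters the annual risk falls within the 1:3 to 1:25 range quoted above; the result is sensitive to the assumed infectivity and consumption.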
Viruses and phages are also discussed in Chapter 5: General Microbiological Quality, sections
5.3.4 and 5.4.4. More detailed discussion appears in the datasheets.
7.3 Occurrence of human viruses in source waters
The New Zealand freshwater microbiology study (Till et al 2002) is the most significant study of
human virus occurrence in surface water in New Zealand to date. This study, carried out in collaboration between the Ministries for the Environment, Agriculture and Forestry, and Health, tested recreational water locations on 25 rivers and lakes every two weeks for 15 months.
Human adenovirus and/or enterovirus were detected, by qualitative molecular methods, in
more than 50 percent of the 275 samples collected. These data suggest that human viruses occur quite frequently in surface waters and in a wide range of source water locations and types.
Subsequent culture based studies of virus occurrence in the Waikato River showed that
adenovirus and enterovirus levels are generally low, less than 5 per 100 L, but on some
occasions may be as high as 10 per 100 L (Watercare Waikato River Monitoring studies
2003–2004).
Studies using sensitive qualitative molecular-based virus detection methods suggest that
adenovirus occurrence may be 10 times higher than this level on some occasions in the Waikato
River (Kim et al 2005) although it is not clear whether all of these viruses are able to produce
infections.
International data collated by WHO suggest that typical surface source waters may contain
0–10 viruses per litre (WHO 2004).
7.4 Risk management
The potential for disease outbreaks associated with human virus contamination of source waters, and for carry-over to treated drinking-water, is recognised throughout the developed
world. Approaches to controlling the risks are largely through protection of source water quality
by control of human activity in reservoir catchments, and through adequate treatment and
disinfection of drinking-water. It is now well accepted that bacterial indicators such as E. coli
are not necessarily adequate surrogates of viral occurrence. Human viruses tend to be more
resistant to environmental stresses and water treatment mechanisms than are bacterial
indicators, so the absence of the indicator may not equate with absence of the virus
contaminant.
7.4.1 International approaches
The paucity of knowledge on the specific occurrence of human viruses in source waters, and the
problems of virus detection and regular monitoring, mean that most guideline documents
include only the qualitative requirement that, if tested for, human viruses should not be
detected in treated drinking-water.
Where virus guidelines or standard requirements are in place these are stated either in terms of
virus occurrence, or as water treatment plant virus removal efficiency. Such values are either
derived from acceptable levels of health risk or, pragmatically, reflect virus detection capability.
Recent standard and guideline recommendations have moved towards risk-based evaluation of
water treatment requirements. The USEPA Surface Water Treatment Rule requires that treatment of both filtered and unfiltered water sources is sufficient to remove or inactivate 99.99 percent (4 log) of viruses (USEPA, SWTR). This requirement is principally based on the acceptable (USEPA 1994) level of waterborne illness in a community (one case per 10,000 consumers) and the likely level of viruses in surface water.17
Recent US proposals for surface water disinfection (USEPA 2003/2006a) use the adenovirus
group as the target virus.
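The equivalence of the 99.99 percent and 4-log figures is a simple base-10 conversion, sketched below.

```python
import math

# The 99.99 percent / 4-log equivalence is a simple base-10 conversion.

def log_reduction(percent_removal):
    """log10 reduction corresponding to a given percent removal."""
    return -math.log10(1.0 - percent_removal / 100.0)

def percent_removal(logs):
    """Percent removal achieved by a given log10 reduction."""
    return 100.0 * (1.0 - 10.0 ** -logs)

print(round(log_reduction(99.99), 3))   # 4.0
print(round(percent_removal(4.0), 2))   # 99.99
```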
The WHO Guidelines recognise that water treatment requirements will differ for different
communities, and propose a risk-based approach for setting performance targets for surface
water treatment plants (WHO 2004).
The risk-based approach takes into account a broad range of factors including virus occurrence
and infectivity, water type, community health status and treatment characteristics. Such an
approach requires a detailed knowledge of the water supply, water treatment performance and
community activities and health status.
Approaches to managing viruses in treated water also recognise that the greatest health risk to a community occurs when water treatment conditions are atypical: when source water conditions are unusual, when very high levels of virus occur, or when there is poor performance, or even failure, within the water treatment process.
7.4.2 Virus removal by current water treatment processes
Reduction of virus numbers in water as a result of treatment can occur through either virus
removal or virus inactivation. Each virus type may react somewhat differently to particular
water treatment methods, but the bulk of research to date suggests that some broad
generalisations can be made. WHO (2004a) discusses treatment processes suitable for pathogen
control.
Virus removal can occur by physical association of a virus with other particles. Virus association
with particles may be enhanced by addition of coagulants to form a floc, which can then be
removed by settlement and/or filtration. The extremely small size of viruses means that they are
unlikely to be removed if they are not associated with other particles. Water treatment processes
such as flocculation, sand filtration, microfiltration and ultrafiltration, and prolonged standing
in reservoirs, will result in physical removal of particle-associated viruses. Only reverse osmosis
and dialysis membranes have pore sizes small enough to trap virus particles that are not
associated with larger particles or flocs.
17 In the Drinking-Water Contaminant Candidate List 3 (Draft), Federal Register: 21 February 2008, 73(35); http://www.epa.gov.
The effectiveness of virus removal is affected by those factors that act against particle
association or floc formation, including water condition and pH (LeChevallier and Au 2004).
Virus inactivation occurs through disruption of the external protein coat (capsid), modification
of specific surface sites needed for infection (host receptor recognition sites) or major change to
the nucleic acid (RNA or DNA). Disinfectants such as chlorine, chlorine dioxide, and ozone will
cause disruption of the virus coat and of the exposed nucleic acids (Shin and Sobsey 2003, Tree
et al 2003). Ultraviolet light in the range of 200–310 nm (antimicrobial range) will disrupt the
nucleic acids by causing cross-linking, leaving viruses unable to replicate.
Viruses can also be inactivated by prolonged holding in reservoirs that are exposed to sunlight,
elevated temperature and extremes of pH, eg, lime treatment (Sobsey 1989). Different virus
types and strains will show different levels of resistance to chemical or physical inactivation.
Adenoviruses are considered the virus group most resistant to many disinfection treatments, because of their structure and nucleic acid makeup, and have been used by the
USEPA as a model virus for designing UV criteria for surface water treatment (USEPA
2003/2006a).
The potential for virus inactivation by disinfectants is reduced by the presence of other particles or organic matter that will consume disinfectants, or of light-absorbing or blocking materials that reduce UV penetration (LeChevallier and Au 2004).
Repair of disinfection damage is unlikely in most viruses, as they do not appear to have repair mechanisms. However, it has been suggested that some viruses (such as the double-stranded DNA adenoviruses) may be able to repair their DNA and, if there is no damage to the virus coat, remain able to infect a host cell (Nwachcuku and Gerba 2004).
Water treatment plants will normally include both virus removal and virus inactivation
processes that act as multiple barriers.
Studies conducted in the 1980s using cell cultures to detect cytopathic effects (CPE) to assay
infective viruses indicated that well-operated conventional water treatment removed 2-logs of
viruses, and disinfection with chlorine could readily achieve a further 2-logs reduction. These
studies also indicated that finished water produced by well-operated conventional treatment
and disinfection was usually free of infective viruses as judged by examination of large volumes
by the cell culture/CPE assay method. A target of 4-logs of virus reduction for surface waters
was adopted by the USEPA in 1989 and has become the de facto benchmark for water treatment
in other developed nations even if not formally stated in their standards or guidelines. More
recent research indicates that the definition of a ‘well-operated’ conventional water treatment
needs to be reassessed. Studies indicated that control of the coagulation/sedimentation/
filtration process is critical for pathogen removal, and that implementation of an operational
target of 0.2 NTU or less for turbidity for individual filtration units is needed to ensure optimum
virus removal (2.0 to 2.5 logs). Operation at higher turbidity levels, but still well below the
traditionally accepted figure of 1 NTU, may achieve only minimal virus removal (less than
0.5 log), and greatly diminishes the effectiveness of the filtration barrier. The target of 0.2 NTU
or less also provides enhanced removal of other pathogen classes (taken from the executive
summary of Strategic Review of Waterborne Viruses, Occasional Paper 11, CRC for Water
Quality and Treatment).
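The way log credits from barriers in series add up, and the sensitivity of the filtration barrier to turbidity, can be sketched as follows. The credit values and the turbidity rule are illustrative assumptions based on the figures quoted above, not regulatory credits.

```python
# Sketch of how log-reduction credits from treatment barriers in series add
# up. The 2.0-log credits and the 0.2 NTU rule below are illustrative
# assumptions based on the figures quoted above, not regulatory credits.

def combined_log_reduction(barrier_credits):
    """Total log10 virus reduction across barriers operating in series."""
    return sum(barrier_credits.values())

def filtration_credit(turbidity_ntu):
    """Illustrative filtration credit: full virus removal (about 2.0 logs)
    is only credited when filtered-water turbidity meets the 0.2 NTU
    operational target; above that, removal may drop towards 0.5 log."""
    return 2.0 if turbidity_ntu <= 0.2 else 0.5

credits = {"coagulation/sedimentation/filtration": filtration_credit(0.15),
           "chlorination": 2.0}
print(combined_log_reduction(credits))  # 4.0: meets the 4-log benchmark
```

Run with a filtered-water turbidity of 0.5 NTU instead, and the combined figure drops to 2.5 logs, illustrating why the filtration barrier's operational target matters.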
Waterborne outbreaks of viral disease have been recorded mainly in groundwater systems
where no disinfection or treatment to remove pathogens has been routinely practised. In
instances where viral outbreaks have occurred in disinfected water supplies, investigations have
revealed either a failure in disinfection, or unusually high levels of contamination in source
waters which overwhelmed the disinfectant dose being applied. Therefore, provided that
adequate control and monitoring of treatment and disinfection processes are implemented,
well-operated conventional water treatment and disinfection will provide an effective barrier
against such outbreaks. Epidemiological studies of the possible contribution of pathogens in
conventionally treated drinking-water to endemic gastroenteritis have given mixed results;
however a study of robust design conducted in the USA found no evidence that waterborne
pathogens made a detectable contribution to gastroenteritis in a city served by a stringently
operated conventional water supply system (taken from the executive summary of Strategic
Review of Waterborne Viruses, Occasional Paper 11, CRC for Water Quality and Treatment).
Virus removal and inactivation efficiencies for a range of water treatment processes are
reviewed in the WHO (2004) Guidelines (Chapter 7), and by LeChevallier and Au (2004).
The ESR is conducting a long-term research project for the Ministry of Health with the aim of
generating evidence about the concentrations of viruses in river water and the risks they pose to
public health.
The 2009–2010 report presented data gathered from water samples taken from the Waikato
River in Waikato and the Oreti River in Southland to put together a quantitative microbial risk
assessment. This provided a statistical estimate of the risk of infections arising from various
viruses following consumption of drinking-water, and enabled an estimate of the amount of
treatment required to reduce the risk of viral-related waterborne disease. Key findings were:
•	Enteric viruses were detected in essentially all river water samples taken from the Waikato River at Huntly and the Oreti River at Branxholme, with most water samples containing three or more virus types. Adenovirus (AdV), norovirus-GII (NoV GII) and rotavirus (RoV) were the most frequently detected.
•	The risk of infection is assumed to be independent for each virus.
•	The project predicts that daily consumption of untreated river water will cause most of the population of Huntly or Invercargill to become infected by each of the enteric viruses over the course of one year. This prediction gives a baseline from which to assess the amount of virus treatment required. Water treatment significantly reduced the number of people predicted to become infected.
•	For Waikato River water, if 4-log removal/inactivation of viruses were achieved at the treatment plant, and it is assumed that all the viruses in the source water were infective, it was predicted that 200–300 people in Huntly would become infected with RoV and with AdV during one year. However, if only 10 percent of the viruses in the intake water were infective then the number of infected people would drop to about 30 in one year.
•	Similarly, for Oreti River water, if 4-log removal/inactivation were achieved at the treatment plant, and it is assumed that all the viruses in the source water were infective, it was predicted that 1000–3500 people in Invercargill would become infected with RoV and with AdV during one year. If only 10 percent of the viruses in the intake water were infective, the number of infected people would fall to 150–350 in one year.
•	An assessment for NoV is not yet possible; however if the values are of a similar magnitude to those of AdV and RoV, then some of the outbreaks of NoV seen in the community could arise from drinking-water treated to achieve at least a 4-log removal of viruses.
•	The study has indicated that even at low concentrations, waterborne viruses may have an effect on public health, and that in the absence of water treatment achieving 4-log removal or inactivation of viruses, the effect on health would be significant.
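The risk arithmetic behind findings such as these can be illustrated with a simple dose-response sketch. The exponential model, the dose-response parameter r, the virus concentration and the consumption volume below are all assumptions chosen for illustration; they are not the models or data used in the ESR study.

```python
# Illustrative quantitative microbial risk assessment (QMRA) sketch.
# All parameter values and the exponential dose-response model are
# assumptions for illustration only, not the ESR study's models or data.
import math

def daily_infection_risk(conc_per_litre, log_reduction, volume_l, r):
    """P(infection) from one day's consumption, exponential dose-response model."""
    dose = conc_per_litre * 10 ** (-log_reduction) * volume_l
    return 1 - math.exp(-r * dose)

def annual_infection_risk(daily_risk, days=365):
    """Annual risk, assuming independent daily exposures."""
    return 1 - (1 - daily_risk) ** days

# Hypothetical source water: 10 infective viruses/L, 4-log treatment,
# 1 L/day consumed, assumed dose-response parameter r = 0.1.
p_day = daily_infection_risk(10, 4.0, 1.0, 0.1)
p_year = annual_infection_risk(p_day)
```

With no treatment (log_reduction = 0) the same sketch predicts that nearly everyone is infected within a year, which is consistent with the baseline prediction above.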
No analyses were carried out to determine whether the Huntly or Branxholme treatment plants
reduce viral concentrations sufficiently, or whether the level of disease predicted by the study
would be noticed in the community by medical professionals; it could be present as ‘background’
gastroenteritis.
7.5 Sampling, testing and data interpretation
The determination of virus removal efficiency within a water treatment plant, or occurrence in
treated water, is dependent on the ability to reliably detect and enumerate the viruses.
Determination of the health risk that viruses pose to the community using the water further
depends on the ability to demonstrate or infer that the viruses detected are capable of causing
human infection.
Virus detection and enumeration: No single method allows detection of all virus types and
strains. Traditionally, viruses have been concentrated from water samples using filtration or
adsorption based techniques with subsequent detection by culture in a permissive human or
primate cell line. Many of the virus concentration techniques were developed using poliovirus or
other enterovirus types and it is unclear how effectively these work for other virus types,
particularly norovirus. Most virus concentration and detection methods recover less than
50 percent of the viruses present in a sample (Haramoto et al 2005).
Virus concentration from large volumes of water is laborious and time consuming and adds
significantly to the cost of virus analysis. Not all virus types are culturable in cell lines;
norovirus and hepatitis E virus cannot be cultured routinely and so are not detectable using
traditional culture-based methods. Some virus cell culture lines are susceptible to several virus
types, for example BGM cells will permit enterovirus, adenovirus and reovirus to grow (Lee and
Jeong 2004), thus if all these are present in one sample, they cannot be separated based on
cytopathic effect alone.
Viruses (culturable and non-culturable) can be detected at very low levels using polymerase
chain reaction (PCR) based molecular methods that target specific DNA or RNA sequences in the
viral genome. Virus assays using PCR can target individual viruses or groups
of viruses, and multiple analyses are required to investigate all the relevant viruses in a
particular sample (Greening et al 2002). Recent advances in real-time PCR have made these
methods both rapid and quantitative, and potentially quite routine. PCR-based methods assay
only a small amount of the original sample, whereas culture methods typically use considerably
more; as a result, PCR-based molecular methods are around 10-fold less sensitive than
culture-based methods for virus detection (Lewis et al 2000).
PCR-based methods are highly practical, offering the advantages of rapid turn-around time,
detection of currently unculturable viruses, and lower assay costs than traditional culturing
methods (Lee and Jeong 2004). However, they are still generally too expensive to be used
routinely.
Virus sampling strategies: Relatively few viruses are needed for an infection to occur in a
susceptible person, so low numbers of viruses must be quantified in relatively large volumes of
finished water. For example, if source waters contain 5000 viruses per 100 L it would be
necessary to sample and analyse at the very least 200 L of finished water to demonstrate a 4-log
reduction in viruses. Typically, source water sample volumes should be 10–100 L, partially
treated waters 50–200 L, and finished, disinfected water sample volumes 100–200 L.
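The sample-volume arithmetic in the example above can be checked with a few lines. This is a sketch only, assuming uniform mixing; the function name is ours.

```python
# Check of the sample-volume arithmetic (illustrative sketch).
def min_volume_for_one_virus(source_per_100l, log_reduction):
    """Volume (L) of finished water expected to contain one virus
    after the given log reduction, assuming uniform mixing."""
    finished_per_litre = (source_per_100l / 100.0) * 10 ** (-log_reduction)
    return 1.0 / finished_per_litre

# 5000 viruses per 100 L of source water, demonstrating a 4-log reduction:
vol = min_volume_for_one_virus(5000, 4.0)  # 200 L, matching the text
```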
The current cost of virus analysis may make regular monitoring beyond the means of many
groups responsible for drinking-water treatment.
Specific short-term studies of virus occurrence and inactivation/removal within a plant are
feasible but should be designed carefully to allow adequate interpretation of the data.
Determination of virus infectivity: Molecular methods for virus detection do not
specifically show whether viruses are still infectious. Detection of viruses using a cell-culture
based technique shows that the viruses are infectious and pose a risk of illness to water
consumers. Infectivity of a virus can however be inferred for certain RNA viruses (norovirus,
enteroviruses, Hepatitis A and E) from molecular detection data where the viruses are subjected
to chemical disinfection, but not UV disinfection (Greening et al 2002). Virus viability is
inferred whenever virus nucleic acid is detected because the nucleic acids (single-stranded RNA)
are extremely susceptible to degradation in the environment.
Interpretation of virus detection and occurrence data: Where viruses are detected in
finished drinking-water the response to the data should be based, in consultation with relevant
health authorities, on a risk evaluation incorporating the type and number of virus detected, the
reproducibility of the result, and the health status and vulnerability of the community.
7.6 C.t values
A C.t value is the product of the disinfectant residual concentration (C, mg/L) and the contact
time (t, minutes) required to cause a specified level of inactivation in a micro-organism. The
C.t value is a measure of exposure to the disinfectant and has the unit mg.min/L.
Further discussion appears in Chapter 15: Disinfection Processes, section 15.2.1 (C.t values) and
section 15.2.9 (measuring the contact time).
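As a trivial illustration of the definition (a sketch; the function name is ours):

```python
# Minimal C.t calculation as defined above:
# C.t = residual concentration (mg/L) x contact time (min), in mg.min/L.
def ct_value(residual_mg_per_l: float, contact_time_min: float) -> float:
    return residual_mg_per_l * contact_time_min

# eg, a free chlorine residual of 0.2 mg/L held for 30 minutes:
ct = ct_value(0.2, 30)  # 6.0 mg.min/L
```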
A range of C.t values is given in Appendix C of the Disinfection Profiling and Benchmarking
Technical Guidance Manual (USEPA 1999), including C.t tables for disinfection of viruses by
various disinfectants. These tables are referenced to AWWA (1991) in the text of USEPA (1999)
and in CRC (2005). The 1991 data were obtained using hepatitis A virus and are derived from
experiments conducted by Sobsey and co-workers in the late 1980s (Sobsey et al 1988).
Subsequent publications continue to use the USEPA 1991 tables because research on
disinfectant contact time has apparently not been revisited.
The USEPA Surface Water Treatment Rule (SWTR) required (inter alia) that treatment of both
filtered and unfiltered sources remove or inactivate 4 log (99.99 percent) of viruses. This
requirement was enacted in 1989. The 1991 tables were developed to assist water suppliers
assess the degree of disinfection of viruses being achieved at their water treatment plants.
USEPA’s LT2ESWTR (2003/2006a) includes a table showing the C.t values for disinfecting
viruses using UV light. The proposed UV doses for inactivation of viruses were based on the
dose-response of adenovirus because, among viruses that have been studied, it appears to be the
most UV-resistant and is a widespread waterborne pathogen. Health effects of adenovirus are
described in Embrey (1999).
It is doubtful that this same approach was used in developing the 1991 tables; viruses are simply
referred to collectively, and ‘viruses’ were not defined in the 1991 information provided. Some
viruses require a much higher C.t value than others. Nor is it explained whether the data relate
to studies of single virions or cell-associated virions – the latter require a higher C.t value.
Table 7.1 shows the UV doses that water suppliers must apply to receive credit for up to 4 log
inactivation of viruses. This is taken from Table IV – 21 in USEPA (2003), Table IV.D-5 in
USEPA (2006a), and Table 1.4 in USEPA (2006b). The UV dose requirements in Table 7.1
account for uncertainty in the UV dose-response relationships of the target pathogens but do not
address other significant sources of uncertainty in full-scale UV disinfection applications. These
other sources of uncertainty are due to the hydraulic effects of the UV installation, the UV
reactor equipment (eg, UV sensors), and the monitoring approach. Due to these factors, the
USEPA requires water suppliers to use UV reactors that have undergone validation testing, see
Chapter 8. Clearly, UV disinfection is impractical for achieving 4-log inactivation of
adenovirus; treatment plants using UV disinfection for protozoal inactivation could achieve both
bacterial compliance and 4-log virus inactivation simply by adding chlorination.
Tables 7.2, 7.3, 7.4, 7.5 have been taken from Appendix C of USEPA (1999) and copied from the
1991 publication, ie, they refer to ‘undefined viruses’. The units are mg.min/L. Although ozone is
clearly the most effective disinfectant, it does not leave a residual. Chlorine is more potent than
chlorine dioxide, and chloramine is all but ineffective.
Table 7.1: UV dose requirements for virus inactivation credit¹

Log credit              Virus UV dose (mJ/cm²)
0.5                     39
1.0 (90% removal)       58
1.5                     79
2.0 (99% removal)       100
2.5                     121
3.0 (99.9% removal)     143
3.5                     163
4.0 (99.99% removal)    186

¹	Based on adenovirus studies.
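The percentage figures in Table 7.1 follow from the definition of a log credit (the negative log10 of the fraction of organisms remaining). A small sketch of the conversion, with function names of our choosing:

```python
# Conversion between log credits and percent removal,
# eg, 2.0 log = 99% removal, 4.0 log = 99.99% removal.
import math

def percent_removal(credits: float) -> float:
    """Percent of organisms removed for a given log credit."""
    return 100.0 * (1 - 10 ** (-credits))

def log_credits(percent: float) -> float:
    """Log credit corresponding to a given percent removal."""
    return -math.log10(1 - percent / 100.0)
```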
A free available chlorine content of 0.20 mg/L after 30 minutes retention time is equivalent to a
C.t value of 6. Based on Table 7.2, this would achieve 4 log inactivation of viruses at 10°C. To
achieve 4 log inactivation at 5°C, which requires a C.t value of 8.0, the minimum retention time
should be 40 minutes (0.20 mg/L after 40 minutes retention time; C.t = 8.0). If that retention
time cannot be achieved, the residual free chlorine content should be increased to 0.30 mg/L
(0.30 mg/L after 30 minutes retention time; C.t = 9.0).
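The worked example above amounts to rearranging C.t = C × t. A sketch, with illustrative function names:

```python
# Rearranging C.t = C x t to find the missing quantity (illustrative sketch).
def required_contact_time(target_ct, residual_mg_per_l):
    """Minimum contact time (min) for a target C.t at a given residual."""
    return target_ct / residual_mg_per_l

def required_residual(target_ct, contact_time_min):
    """Minimum residual (mg/L) for a target C.t at a given contact time."""
    return target_ct / contact_time_min

# 4-log virus inactivation at 5°C needs C.t = 8.0 (Table 7.2):
t_needed = required_contact_time(8.0, 0.20)  # 40 minutes
c_needed = required_residual(8.0, 30)        # ~0.27 mg/L (the text rounds up to 0.30)
```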
Table 7.2: C.t values for inactivation of viruses by free chlorine, pH 6–9

Log inactivation    1°C     5°C     10°C    15°C    20°C    25°C
2                   5.8     4.0     3.0     2.0     1.0     1.0
3                   8.7     6.0     4.0     3.0     2.0     1.0
4                   11.6    8.0     6.0     4.0     3.0     2.0
Table 7.3: C.t values for inactivation of viruses by chloramine

Log inactivation    1°C     5°C     10°C    15°C    20°C    25°C
2                   1243    857     643     428     321     214
3                   2063    1423    1067    712     534     356
4                   2883    1988    1491    994     746     497
Table 7.4: C.t values for inactivation of viruses by chlorine dioxide, pH 6–9

Log inactivation    1°C     5°C     10°C    15°C    20°C    25°C
2                   8.4     5.6     4.2     2.8     2.1     1.4
3                   25.6    17.1    12.8    8.6     6.4     4.3
4                   50.1    33.4    25.1    16.7    12.5    8.4
Table 7.5: C.t values for inactivation of viruses by ozone

Log inactivation    1°C     5°C     10°C    15°C    20°C    25°C
2                   0.9     0.6     0.5     0.3     0.25    0.15
3                   1.4     0.9     0.8     0.5     0.40    0.25
4                   1.8     1.2     1.0     0.6     0.50    0.30
DWSNZ section 4.3.2.1 (bacterial compliance) allows E. coli monitoring to be substituted by
online FAC monitoring provided the chlorine contact time is more than 30 minutes, the pH is <8,
and the turbidity is <1 NTU. This represents a C.t value of 6 or more. The DWSNZ also assume that
drinking-water that meets the bacterial compliance requirements should be free from infective
viruses. That means that when the water temperature falls below 10°C, either the FAC level or
the contact time, or both, need to be increased. During an outbreak of norovirus at Cardrona in
2012, samples were found to contain viruses in the absence of E. coli when the chlorine dose was
inadequate.
Some new research has been reported in DWI (2010). Baseline disinfection experiments were
performed in pH 7 and pH 8 demand-free reagent grade water with 0.2 mg/L free chlorine or
1 mg/L monochloramine at 5ºC. These baseline experiments were performed using several
human adenoviruses (HAdV2, HAdV40, and HAdV41), two coxsackieviruses (coxsackievirus
B3 [CVB3] and coxsackievirus B5 [CVB5]), two echoviruses (echovirus 1 [E1] and echovirus
11 [E11]) and murine norovirus (MNV, studied as a surrogate for human norovirus). The most
resistant representative of each virus type for each disinfectant was selected for additional virus
disinfection experiments using three distinct types of source water collected from drinking water
treatment plants. Experiments were performed in source water at pH 7 and 8 using 0.2 and
1 mg/L free chlorine or 1 mg/L and 3 mg/L monochloramine at 5 and 15ºC. Free chlorine and
monochloramine disinfection experiments were then performed for aggregated preparations of
HAdV2 in source water from one drinking water treatment plant. Viral titres before and after
disinfection were determined by virus-specific plaque assays. The efficiency factor Hom model
was used to calculate C.t values (disinfectant concentration in mg/L x exposure in min) required
to achieve 2-, 3- and 4-log10 reductions in viral titres.
In all water types, chlorine and monochloramine disinfection were most effective for MNV, with
3-log10 C.t values at 5°C ranging from <0.02 to 0.03 for chlorine and 53 to 111 for
monochloramine. Chlorine disinfection was least effective for CVB5 for all water types, with
3-log10 C.t values at 5°C ranging from 2.3 to 7.6. Monochloramine disinfection was least effective
for HAdV2 and E11, depending on pH and water type. At 5°C, 3-log10 C.t values for HAdV2
ranged from 1044 to 3308, while those for E11 ranged from 814 to 2288. Overall, chlorine was
much more effective than monochloramine, and disinfection proceeded faster at 15°C and at
pH 7 for all water types. C.t values for chlorine and monochloramine disinfection of aggregated
HAdV2 were 2 and 1.4 times higher than for monodispersed HAdV2, respectively.
The results from this project indicate that a C.t value of 10 (or 20, if incorporating a 2x factor of
safety for aggregated virus) may be needed to achieve a 4-log10 inactivation of CVB5 with free
chlorine at 5ºC, pH 8, which is above the C.t value of 8 recommended in the USEPA’s Guidance
Manual for Compliance with the Filtration and Disinfection Requirements for Public Water
Systems Using Surface Water Sources (Guidance Manual) to achieve a 4-log10 inactivation with
chlorine at 5°C, pH 6–9. The Guidance Manual’s recommended C.t value of 8 included a safety
factor of 3x to account for potential virus aggregation. However, C.t values for the study viruses,
including CVB5, were below the 2-log10 C.t value of 12 reported by the World Health
Organization as an expected performance level for chlorine disinfection of viruses in water at
5°C, pH 7–7.5.
References
Amar CFL, East CL, Grant KA, et al. 2005. Detection of viral, bacterial, and parasitological RNA or
DNA of nine intestinal pathogens in fecal samples archived as part of the English Infectious
Intestinal Disease study – assessment of the stability of target nucleic acid. Diagnostic Molecular
Pathology 14: 90–6.
Ashbolt NJ. 2004. Microbial contamination of drinking water and disease outcomes in developing
regions. Toxicology 198: 229–38.
AWWA. 1991. Guidance Manual for Compliance with the Filtration and Disinfection Requirements
for Public Water Systems Using Surface Water Sources. Denver, CO: American Water Works
Association.
Brooks GF, Carroll KC, Butel JS, et al. 2007. Jawetz, Melnick & Adelberg’s Medical Microbiology.
New York: McGraw Hill.
CRC for Water Quality and Treatment. 2005. Strategic Review of Waterborne Viruses, Occasional
Paper 11. CRC for Water Quality and Treatment, Private Mail Bag 3, Salisbury, South Australia 5108.
See www.wqra.com.au/publications/ or http://water.nstl.gov.cn/MirrorResources/283/index.html
DRG (Disinfection Review Group). 2002. Pilot Study Investigations: Surrogate study results and
recommendations. A Disinfection Review Group report to Watercare Services Ltd.
DWI. 2010. Contaminant Candidate List Viruses: Evaluation of disinfection efficacy
(Project #3134). Executive Summary: http://dwi.defra.gov.uk/research/completedresearch/2000todate.htm
Embrey M. 1999. Adenovirus in Drinking Water, Literature Summary. Final report. Washington
DC: The George Washington University School of Public Health and Health Services, Department of
Environmental and Occupational Health.
ESR. 2004. Annual Summary of Outbreaks in New Zealand: 2003. Report to Ministry of Health
ISSN 1176-3485.
ESR. 2010. Environmental Microbiological Risk Assessment and Management; EMRAM Virus
Project 2009–2010 Progress Report. Client Report (FW10051).
ESR. 2010a. Guidelines for Separation Distances Based on Virus Transport Between On-site
Domestic Wastewater Systems and Wells. ESR Client Report No. CSC1001. 296 pp.
http://www.envirolink.govt.nz/PageFiles/31/Guidelines_for_separation_distances_based_on_viru
s_transport_.pdf
Gerba CP, Rose J. 1992. Estimating viral risk from drinking-water. In: Comparative Environmental
Risk Assessment, chapter 9, pp 117–37. CR Conthern Lewis Publishers.
Gratacap-Cavallier B, Genoulaz O, Brengel-Pesce K, et al. 2000. Detection of human and animal
rotavirus sequences in drinking water. Applied and Environmental Microbiology 66: 2690–2.
Greening G, Hewitt J, Lewis G. 2002. Evaluation of integrated cell culture-PCR (C-PCR) for
virological analysis of environmental samples. Journal of Applied Microbiology 93: 745–50.
Haramoto E, Katayama H, Oguma K, et al. 2005. Application of cation-coated filter method for
detection of noroviruses, enteroviruses, adenoviruses, and torque tenoviruses in the Tamagawa River
in Japan. Applied and Environmental Microbiology 71: 2403–11.
Hewitt J, Bell D, Simmons GC, et al. 2007. Gastroenteritis outbreak caused by waterborne norovirus
at a New Zealand ski resort. Appl Environ Microbiol 73(24): 7853–7.
Hrudey SE, Hrudey EJ. 2007. Published case studies of waterborne disease outbreaks – evidence of
a recurrent threat. Water Environment Research 79: 233–45.
Hunter P. 1997. Viral gastroenteritis in waterborne disease. Epidemiology and Ecology, chapter 28,
pp 222–31. John Wiley & Sons.
Kasorndorkbua C, Opriessnig T, Huang FF, et al. 2005. Infectious swine hepatitis E virus is present
in pig manure storage facilities on United States farms, but evidence of water contamination is
lacking. Applied and Environmental Microbiology 71: 7831–7.
Kim J. 2005. Human Adenovirus in the Waikato River: Implication for water supply and public
health. MSc thesis. University of Auckland Library.
Klemola P, Kaijalainen S, Ylipaasto P, et al. 2008. Diabetogenic effects of the most prevalent
enteroviruses in Finnish sewage. Immunology of Diabetes 1150: 210–12.
Kuniholm MH, Purcell R, McQuillan G, et al. 2009. Epidemiology of hepatitis E virus in the United
States: Results from the Third National Health and Nutrition Examination Survey, 1988–1994. The
Journal of Infectious Diseases 200: 48–56.
LeChevallier MW, Au K-K. 2004. Water Treatment and Pathogen Control: Process efficiency in
achieving safe drinking-water. WHO Drinking-Water Quality Series. WHO, Geneva.
http://www.who.int/water_sanitation_health/publications/en/
Lee HK, Jeong YS. 2004. Comparison of total culturable virus assay and multiplex integrated cell
culture-PCR for reliability of waterborne virus detection. Applied & Environmental Microbiology
70: 3632–6.
Lewis GD, Austin FJ, Loutit MW, et al. 1986. Enterovirus removal from sewage – the effectiveness of
four different treatment plants. Water Research 20: 1291–7.
Lewis G, Molloy SL, Greening GE, et al. 2000. Influence of environmental factors on virus detection
by RT-PCR and cell culture. Journal of Applied Microbiology 88: 633–40.
Meng XJ. 2005. Hepatitis E virus: Cross-species infection and zoonotic risk. Clinical Microbiology
Newsletter 27: 43–8.
Ministry of Health. 2005. Drinking-water Standards for New Zealand 2005. Wellington: Ministry
of Health. Also see the 2008 revision.
NHMRC, NRMMC. 2011. Australian Drinking Water Guidelines Paper 6 National Water Quality
Management Strategy. Canberra: National Health and Medical Research Council, National
Resource Management Ministerial Council, Commonwealth of Australia. 1244 pp.
http://www.nhmrc.gov.au/guidelines/publications/eh52
Nwachcuku N, Gerba CP. 2004. Emerging waterborne pathogens: can we kill them all? Current
Opinion in Biotechnology 15: 175–80.
Parashar UD, Hummelman EG, Bresee JS, et al. 2003. Global illness and deaths caused by rotavirus
disease in children. Emerging Infectious Diseases 9: 565–72.
Pérez-Vargas J, Isa P, López S, et al. 2006. Rotavirus vaccine: early introduction in Latin America – risks and benefits. Archives of Medical Research 37: 1–10.
PMEP (accessed 2011). Pesticide Active Ingredient Information: Biopesticides and biocontrols:
Bioinsecticides. http://pmep.cce.cornell.edu/profiles/index.html.
Shin G-A, Sobsey MD. 2003. Reduction of Norwalk Virus, Poliovirus 1, and Bacteriophage MS2 by
ozone disinfection of water. Appl Environ Microbiol 69(7): 3975–8.
Sobsey MD, Fuji T, Shields PA. 1988. Inactivation of hepatitis A virus and model viruses in water by
free chlorine and monochloramine. Water Science and Technology 20: 385–91.
Sobsey MD. 1989. Inactivation of health-related microorganisms in water by disinfection processes.
Water Science and Technology 21(3): 179–95.
Teunis PFM, Moe CL, Liu P, et al. 2008. Norwalk virus: How infectious is it? Journal of Medical
Virology 80(8): 1468–76.
Till D, McBride G, Ball A, et al. 2008. Large-scale freshwater microbiological study: rationale, results
and risks. Journal of Water and Health 6(4): 443–60.
Tree JA, Adams MR, Lees DN. 2003. Chlorination of indicator bacteria and viruses in primary
sewage effluent. Applied & Environmental Microbiology 69(4): 2038–43.
USEPA. 1994. National Primary Drinking Water Regulations: Enhanced surface water treatment
regulations. 59 FR 38832, 29 July.
USEPA. 1999. Disinfection Profiling and Benchmarking Guidance Manual. EPA 815-R-99. Available
at: http://www.epa.gov/safewater/mdbp/mdbptg.html or go to
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/index.cfm
USEPA. 2003. National Primary Drinking Water Regulations: Long Term 2 Enhanced Surface
Water Treatment Rule; Proposed Rule. 40 CFR Parts 141 and 142, 11 August. Now see USEPA
(2006a) for Final Rule.
USEPA. 2006a. National Primary Drinking Water Regulations: Long Term 2 Enhanced Surface
Water Treatment Rule: Final Rule. (LT2ESWTR). Federal Register Part II, 40 CFR Parts 9, 141 and
142. Washington: National Archives and Records Administration. See
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04a.pdf
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04b.pdf
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04c.pdf or go to
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/index.cfm or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2006b. Ultraviolet Disinfection Guidance Manual for the Final Long Term 2 Enhanced
Surface Water Treatment Rule. Office of Water, EPA 815-R-06-007, November. See:
www.epa.gov/ogwdw/disinfection/lt2/pdfs/guide_lt2_uvguidance.pdf
Vasickova P, Dvorska L, Lorencova A, et al. 2005. Viruses as a cause of foodborne diseases: a review
of the literature. Veterinarni Medicina 50: 89–104.
Vivier JC, Ehlers MM, Grabow WO. 2004. Detection of enteroviruses in treated drinking-water.
Water Research 38(11): 2699–705.
Watercare Services Ltd (personal communication). 2002. Surrogate Study: Mangere wastewater
treatment plant.
Watercare Services Ltd (personal communication). 2003, 2004. Adenovirus and Enterovirus
Monitoring Data, Waikato River at Mercer.
WHO. 2004. Guidelines for Drinking-water Quality 2004 (3rd edition). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html see
also the addenda.
WHO. 2004a. Water Treatment and Pathogen Control: Process efficiency in achieving safe
drinking water. 136 pp. www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2011. Guidelines for Drinking-water Quality 2011 (4th edition). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
Chapter 8: Protozoal compliance
8.1 Introduction
The Maximum Acceptable Value (MAV) for total pathogenic protozoa in drinking-water is less
than 1 infectious (oo)cyst (cysts and oocysts) per 100 litres; see Table 2.1 of the Drinking-water
Standards for New Zealand 2005, revised 2008 (DWSNZ). Note that until the methodology for
determining the viability or infectivity of detected (oo)cysts improves, results are to be reported
as verified (oo)cysts.
Cryptosporidium and Giardia testing of drinking-waters:
•	requires very large volumes to be filtered in order to achieve the sensitivity required
•	requires two or more days to achieve a result
•	requires very skilled laboratory personnel
•	requires very expensive laboratory equipment
•	does not allow large numbers of samples to be processed per day
•	is offered by very few accredited laboratories in New Zealand.
Because it is impractical to demonstrate compliance with the protozoal MAV in the DWSNZ
with statistical rigour, operational requirements for treatment processes known to remove or
inactivate (oo)cysts are used instead. In many situations, when a monitoring test result fails to
satisfy an operational requirement, the point of failure is readily identified; this would not be the
case if testing directly for protozoa.
The operational requirements include:
•	turbidity monitoring (or particle counting) for filtration processes
•	direct integrity testing (for membrane filtration plants)
•	indirect integrity testing (for membranes, bags and cartridges)
•	pressure differential monitoring for bag and cartridge filtration
•	monitoring with UV intensity sensors
•	C.t values for ozone and chlorine dioxide disinfection
•	in several cases, the use of certificated or validated water treatment appliances.
This chapter discusses the compliance issues relating to the removal or inactivation of protozoa
at the water treatment plant. Chapters 12–15 discuss the management and operational aspects
of pre-treatment, coagulation, filtration, and disinfection processes respectively.
If water leaving the treatment plant satisfies the appropriate protozoal compliance criteria, and if
the bacterial compliance criteria for water in the distribution system are satisfied, then protozoa
are considered unlikely to present a health risk in the distribution system.
Some more general aspects of microbiology and related illnesses are discussed in Chapter 5:
General Microbiological Quality.
Water sources and their selection are covered in Chapters 3 and 4, although section 8.2 of this
chapter deals with source water protozoal risk categorisation based on Cryptosporidium
occurrence or catchment characteristics.
Much of the early work was carried out in the UK after various outbreaks of cryptosporidiosis.
Badenoch produced the first major reports in 1990 and 1995, with recommendations, and
Bouchier (1998) updated these.
WHO (2004a) discusses treatment processes suitable for pathogen control. The WHO (in 2006)
produced WHO Guidelines for Drinking Water Quality: Cryptosporidium which discusses many
issues related to this protozoan, and includes an extensive bibliography.
Details of specific protozoa (not just Cryptosporidium and Giardia) are covered in the datasheets.
8.2 Source water
8.2.1
Introduction
The requirements for protozoal compliance with the DWSNZ are based on a cumulative log
credit approach explained in section 5.2 of the DWSNZ and section 8.3 of the Guidelines. To
define the level of treatment required for a water supplier to demonstrate protozoal compliance
with the DWSNZ, the raw water must be categorised with respect to the risk presented by the
concentrations of protozoa in the water. This can be done by either assessing protozoal risk, or
by measuring Cryptosporidium oocyst numbers. The categorisation determines the minimum
number of protozoal log credits the supply’s treatment processes must achieve. Section 5.2.1 of
the DWSNZ specifies how the categories are defined, and the way in which the source water
quality is to be evaluated. The oocyst concentrations that define the risk categories are based on
the boundaries used by the USEPA to define the ‘bin classifications’ contained in their proposed
Long Term 2 Enhanced Surface Water Treatment Rule (USEPA 2003a), and confirmed in their
final rule (USEPA 2006a).
Cryptosporidium and Giardia sampling and testing is discussed in section 8.6.1. The analytical
procedure to be used is based on Method 1623 (USEPA 2005b). See also Source Water
Monitoring Guidance Manual for Public Water Systems for the Long Term 2 Enhanced
Surface Water Treatment Rule (USEPA 2006b). Laboratories conducting protozoa testing for
compliance purposes are to have IANZ accreditation for this work.
Ideally, protozoal categorisation would be based only on the results of monitoring oocysts.
Water suppliers made it abundantly clear during the DWSNZ consultation process that they felt
this would be too expensive; hence the introduction of the catchment risk category approach for
supplies <10,000 population.
In recognition of the relatively high cost of analysing samples for Cryptosporidium, the USEPA
(2003a) explored the use of indicator criteria to identify raw waters that may have high levels of
Cryptosporidium occurrence. Data were evaluated for possible indicator parameters, including
faecal coliforms, total coliforms, E. coli, viruses, and turbidity. E. coli was found to provide the
best performance as a Cryptosporidium indicator in source waters, and the inclusion of other
parameters like turbidity was not found to improve accuracy. As a consequence, the DWSNZ 2005
also adopted some E. coli monitoring of source water. The basis for this was not strong, so the
requirement for E. coli monitoring of source waters was dropped from the 2008 revision.
Because bore waters are often free from microbiological contamination, their protozoal log credit requirement is assessed differently from that for surface water. Protozoal risk categories for bore waters will be one of 0, 2, 3, 4 or 5 log credits; see section 8.2.2. Source water protozoal risk categories for surface supplies will be one of 3, 4 or 5 log credits. Secure bore waters are considered to be free from protozoa. Bore water security is discussed in Chapter 3.
In effect, there is a default for surface waters of 4 logs. With no towns or farms in the catchment
this drops to 3 logs; with excessive farm or human wastes 5 log removals will be required. No
New Zealand source water is expected to need 5 log removals. If a source water is so bad that
5 logs are really needed, there is a huge incentive to change the source or clean up the land use
practices.
One DWSNZ draft attempted to reduce the log credit requirement for lakes/reservoirs, but that
became bogged down in discussions about (oo)cyst settling rates, the effect of sunlight and
predators, water temperature, retention time, stratification, mixing, and depth of abstraction
valve etc.
It was assumed that the quality of springs and very shallow bores may be no better than that of a
surface water passing through <10 m of gravel, hence the 3–5 log removal requirement was
retained. The 10–30 m deep group of bores was allowed 1 log credit requirement less than the
‘default’ on the grounds that the quality would improve as the water percolated through the
extra depth of soil. The DWSNZ Expert Committee couldn’t reduce the log credit requirement
any further for non-secure bore waters because it is well known that die-off of micro-organisms
is less prominent in gravel, limestone and basalt structures due to their relatively rapid
transport rates. It was accepted that any large groundwater user that felt they were being asked
to meet too many log credits would choose to prove their point by monitoring for oocysts – that
would be cheaper than installing extra water treatment plant. The intention was that use of
unconfined shallow bores/springs was to be discouraged; any deeper than 10 m should need only UV disinfection at most.
8.2.2 Approach to categorisation
8.2.2.1 Bore waters
a) Bores drawn from confined aquifers
A bore of any depth drawing from a confined aquifer can be given interim security if it satisfies
bore water criterion 1 and bore water criterion 2 (see section 4.5 of the DWSNZ). Secure and
interim secure bores are deemed to satisfy the protozoal compliance criteria, ie, no protozoal log
credits required. The secure status is gained/maintained so long as bore water criterion 3
(absence of E. coli) is satisfied. Note that it may be difficult to show that bores up to 10 m deep
are drawing from confined aquifers, and it may be difficult for them to satisfy bore water
criterion 1 and bore water criterion 3; however, they are included for completeness.
Water drawn from an aquifer that is considered to be confined but does not satisfy (or has not
been assessed against) bore water criterion 1 and/or bore water criterion 2, is deemed to be
equivalent to water drawn from an unconfined aquifer.
b) Bores drawn from unconfined aquifers (or if status of aquifer unknown)
A bore drawing from an unconfined aquifer more than 10 m below the ground surface can be
given interim security if it satisfies bore water criterion 1 and bore water criterion 2, and some
other conditions: see section 4.5.1 of the DWSNZ. Secure and interim secure bores are deemed
to satisfy the protozoal compliance criteria. Note that bores drawing from unconfined aquifers
are a lot less likely to satisfy bore water criterion 1, and consequently may fail to satisfy bore
water criterion 3.
Chapter 3: Source Waters, section 3.2.4 has further discussion on establishing the security of
bore water supplies.
Tables 5.1a and 5.1b in the DWSNZ have confused some readers. The following applies for the
situations where bore water criterion 1 cannot be or is not satisfied.
Depth (m)   Protozoal log credit requirement
<10         Equivalent to surface water, so 3–5 log removals needed, see section 8.2.2.2
10–30       3 log credits required during the five-year E. coli proving period
30+         If hydrogeological evidence suggests that the bore water is likely to be secure, then interim secure status may be granted, see (a); otherwise 2 log credits required, provided bore water criterion 2 is satisfied
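The depth bands above can be sketched as a simple lookup. This is an illustrative sketch only: the function name and return strings are hypothetical, and the treatment of the exact band boundaries is an assumption.

```python
# Illustrative sketch of the table above: protozoal log credit requirement
# for a bore where bore water criterion 1 is not (or cannot be) satisfied.
# Function and wording are hypothetical, not DWSNZ text; the assignment of
# depths exactly at a band boundary is an assumption.
def bore_log_credit_requirement(depth_m: float) -> str:
    if depth_m < 10:
        return "treat as surface water: 3-5 log removals (section 8.2.2.2)"
    elif depth_m < 30:
        return "3 log credits during the five-year E. coli proving period"
    else:
        # 30 m or deeper: interim secure status possible on hydrogeological
        # evidence; otherwise 2 log credits if bore water criterion 2 is met
        return "interim secure status, or 2 log credits"
```
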
8.2.2.2 Surface waters
There are two approaches for determining the protozoal risk categorisation of surface waters,
see DWSNZ sections 5.2.1.1 and 5.2.1.2, and both require a five-yearly review, see section 8.2.6.
a) Catchment risk category approach (Guidelines section 8.2.3)
This is the default option for supplies serving a population up to 10,000, and is based on
assessing the perceived risk related to the surface water catchment categories as defined in
DWSNZ Table 5.1a. Should a water supplier consider the assignation of the log credit
requirement to be inappropriate, any appeal must be supported by data obtained by monitoring
Cryptosporidium (see b).
b) Measurement of Cryptosporidium oocysts approach (Guidelines section 8.2.4)
This is the default option for supplies serving a population over 10,000, and is based on
matching the mean oocyst concentration with the log credit categories in Table 5.1b of the
DWSNZ. Should the water supplier consider this approach to have led to an inappropriate log
credit requirement, the log requirement based on the perceived risk related to the surface water
catchment categories as defined in section 5.2.1.1 and Table 5.1a may be adopted.
Note that all water suppliers conduct catchment assessments as part of their PHRMP process,
and this is an ongoing process. Catchment assessments are intended to consider all aspects that
may impact on the quality of the raw water and the security of the supply. Information from the
initial catchment assessment will have been used in the selection of the water treatment plant
site and design. Protozoal risk categorisation only considers those activities that may affect the
number of Cryptosporidium oocysts.
8.2.3 Catchment risk category approach
The catchment risk categorisation procedure involves a survey of the catchment. The form for
recording the survey results appears in Appendix 3 of the DWSNZ. The DWA will assign the log
credit requirement once the catchment assessment has been completed. Where appropriate the
assignment process will make use of the Cryptosporidium monitoring results provided by the
>10,000 supplies.
When water is drawn from more than one catchment, the catchment with the greatest protozoal
risk will determine the log credit requirement for the treatment plant.
This risk categorisation approach should provide an evaluation of the catchment that identifies
all likely sources of Cryptosporidium oocysts in the raw water, even if the frequency at which
these events occur is low. Risk assessment is a difficult tool to use meaningfully when the
relationships between activities in the catchment and raw water quality are not understood. The
water supply needs to be safe to drink at all times, therefore the catchment survey must take
into account the conditions most likely to challenge the treatment process.
Scottish Water (2003) devised a scoring system for assessing water supply catchments in
response to Cryptosporidium problems. This publication indicates their weighted assessment of
the various impacts due to Cryptosporidium, and has some relevance to New Zealand
conditions so should provide some useful background reading. Also, drinking water assessors
have produced Guidance Notes, which are more detailed than Appendix 3 of the DWSNZ. These
Guidance Notes are appended to this chapter.
8.2.4 Measurement of Cryptosporidium oocysts approach
Although this approach seems to have the advantage of providing quantitative information for
the categorisation, the mean oocyst concentration will depend on the frequency of sampling and
whether the collection times coincide with episodes of poor or good water quality. Samples
taken too infrequently may miss poor water quality episodes when oocyst counts are high. This
could result in an inadequate level of treatment being provided. Conversely, excess treatment
may be indicated.
To achieve a balance between accuracy and costs, the monitoring programme must comprise at
least 26 samples collected over a 12-month period at approximately equal time intervals to
attempt to ensure representative samples and minimise seasonal bias. The samples must be
tested quantitatively for Giardia cysts and Cryptosporidium oocysts. Subject to laboratory and
delivery services, samples should be taken to cover every day of the week and must cover at least
Monday to Friday three times during the sampling programme; this has been discussed further
in Chapter 17: Monitoring, section 17.2. The results from the monitoring programme must be
reported to the DWA who will assign the log credit requirement.
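A minimal sketch of a schedule meeting the requirements above: 26 samples over roughly 12 months at approximately equal (fortnightly) intervals, rotated through the week so that Monday to Friday are each covered well over three times. The start date and the one-day rotation rule are assumptions for illustration only.

```python
# Sketch of a monitoring schedule meeting the DWSNZ requirements described
# above: 26 samples, ~fortnightly, weekday coverage. Start date and the
# rotation scheme are illustrative assumptions, not DWSNZ prescriptions.
from datetime import date, timedelta

def sampling_schedule(start: date, samples: int = 26) -> list[date]:
    """Fortnightly sampling dates, nudged one day each round so the
    programme rotates through Monday to Friday."""
    dates = []
    for i in range(samples):
        # 14-day base interval plus a rotating 0-4 day shift
        dates.append(start + timedelta(days=14 * i + (i % 5)))
    return dates

schedule = sampling_schedule(date(2013, 1, 7))  # assumed Monday start
weekdays = {d.weekday() for d in schedule}      # 0 = Monday ... 4 = Friday
```
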
The DWA must be informed of the year’s monitoring plan before it starts, and any changes to it
must be agreed with the DWA before the changes are made. This is to avoid samples being taken
intentionally at times when the concentrations of oocysts in the raw water are expected to be low.
The sampling location must meet a number of criteria (see DWSNZ section 5.2.2). These are
designed to ensure that the samples are representative of the water quality entering the first
treatment process for which log credits will be claimed. If water taken from the source at the
point of abstraction does not undergo any changes in quality before treatment, then the
untreated water may be obtained from this location. Where water is drawn from multiple
sources, samples must be taken from the combined flow.
DWSNZ section 5.2.1.3 specifies the requirements for when waste water is recycled to the head
of a treatment plant. Poor quality recycle water, and large or sudden discharges of recycle water
will challenge the treatment process. That is why the DWSNZ require the instantaneous return
rate not to exceed 10 percent of the plant inflow, and the recycle water turbidity is to be
measured to show that the solids/liquid separation process is operating effectively. The DWSNZ
did not specify turbidity limits/durations because that would be too prescriptive, instead
turbidity monitoring was adopted that would show a water supplier when a suitable response is
required, eg, to divert to waste.
If any water supplier measures oocysts with a recovery of around 40 percent in the raw water,
but <10 percent for recycle water, they should attend to their solids/liquid separation process
before wasting too much time collecting dubious data. A water supplier that recycles their
wastes in accordance with the requirements of section 5.2.1.3 of the DWSNZ would be unlikely
to require any more log credits than the raw water requires.
DWSNZ Table 5.1b refers to the mean value; this is the arithmetic mean. Laboratories must achieve a detection limit of better than 0.75 oocysts/10 litres, to ensure that source waters are not misclassified because the sensitivity of the technique is inadequate. When calculating the mean oocyst concentration, results that have been reported as less than the detection limit should be assigned an arbitrary value of zero.
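The mean calculation described above can be sketched as follows; the helper name and the example results are hypothetical, but the rule (less-than-detection-limit results counted as zero, arithmetic mean) is as stated.

```python
# Sketch of the arithmetic mean oocyst calculation: results reported as
# "less than the detection limit" (eg, "<0.75") are assigned zero before
# averaging. Data values are invented for illustration.
def mean_oocysts(results: list) -> float:
    """results: numbers, or strings like '<0.75' for below-detection-limit."""
    values = [0.0 if isinstance(r, str) and r.startswith("<") else float(r)
              for r in results]
    return sum(values) / len(values)

mean_oocysts(["<0.75", 0.5, 1.0, "<0.75"])  # (0 + 0.5 + 1.0 + 0) / 4 = 0.375
```
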
USEPA (2003a, 2006a) stated “Spike data indicate that average recovery of Cryptosporidium
oocysts with Methods 1622 or 1623 in a national monitoring program will be approximately
40 percent. Studies on natural waters for Cryptosporidium using both Method 1623 and a
method (cell culture-PCR) to test for infectivity suggested that 37 percent of the
Cryptosporidium oocysts detected by Method 1623 were infectious”. Consequently, USEPA
accepted the “recommendation that monitoring results should not be adjusted to account for
either recovery or the fraction that is infectious”.
However, for the purpose of protozoal risk categorisation in the DWSNZ, oocysts numbers
should be reported after normalising to a 40 percent recovery rate. Because individual
recoveries can commonly vary from 15–55 percent, the normalising approach was considered to
be fairer and more consistent. Therefore, if a test result of 0.48 oocysts per 10 L was obtained in
a batch that achieved 30 percent recovery, the result to be reported = 0.48 x 40/30 = 0.64.
Conversely, had the recovery been 56 percent, the reported result should = 0.48 x 40/56 = 0.34.
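The normalisation in the worked example can be expressed as a one-line scaling (the function name is an illustrative choice):

```python
# Sketch of the recovery normalisation described above: the measured count
# is scaled to a nominal 40 percent recovery before reporting.
def normalise_to_40pct(measured: float, recovery_pct: float) -> float:
    return measured * 40.0 / recovery_pct

round(normalise_to_40pct(0.48, 30), 2)  # 0.64, as in the worked example
round(normalise_to_40pct(0.48, 56), 2)  # 0.34
```
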
To assist in understanding the relationships between catchment activities and Cryptosporidium
oocyst levels, it would be helpful to collate the following additional information when samples
are collected:
1. weather conditions, or the operation of irrigation systems, in the catchment or recharge zone on the day the sample was collected and on each of the two previous days
2. for surface waters, the turbidity, a description of the source water quality (visual appearance) and how this compared with the water quality during fine weather, and river flow/river height
3. for all sources, the date and time of sampling
4. which sources are in use at the time of sampling if the treatment plant is fed from multiple sources
5. other factors that might influence the level of raw water contamination, such as irregular or seasonal land-use activities, and antecedent weather.
8.2.5 Comparison of protozoal risk assessment and oocyst monitoring data
Water suppliers that consider the original protozoal risk categorisation to be inappropriate are
permitted to use the other approach. It is possible that in doing so, the conclusion from each
approach may be different. Similarly, it may be possible that the five-yearly review produces a
different outcome. Before rejecting one or other of the conclusions, the reasons for such
discrepancies should be identified. The most likely reasons for a discrepancy are:
a) monitoring oocysts has missed high risk but low frequency events
b) the catchment risk assessment has omitted, or underestimated the importance of, a contaminating activity in the catchment, or land use changes have occurred
c) monitoring oocysts has coincided with atypical events
d) the risk assessment has placed too high an importance on a contaminating activity in the catchment.
When considering protection of public health, reasons a) and b) are a primary concern.
Moreover, if b) is the reason for the discrepancy, remedial actions to reduce the risk to supply
may be misdirected because important sources of contamination may have been overlooked.
A review of the weather conditions and the appearance of the source water when samples for Cryptosporidium testing were taken is a helpful place to start when investigating the cause of discrepancies. The five additional pieces of information discussed in the previous section that should be collected when samples are taken will assist in this investigation.
Comparison of this information with what is known about potential sources of contamination in
the catchment may explain the reason for measured oocyst concentrations being lower or higher
than expected on the basis of the risk assessment. For example, high oocyst concentrations in
the absence of rain points to the source of contamination not being reliant on rain to transport
contaminants to the receiving water, eg, stock had direct access to the water source, or human
wastes are entering the source water. Sampling dates/times may help to ascertain whether an
unexpected event, such as an upstream wastewater treatment malfunction, may have
contributed to the poor quality of the raw water.
A review of the protozoal risk categorisation questionnaire to determine which activities
contributed most to the overall risk score may help to answer questions such as:
• to what degree are these likely to have been influenced by rain, and was it raining about the time of sampling?
• are these activities likely to contribute intermittently to poor water quality, and therefore were they likely to have been significant at the times when samples were taken?
Land use can have a large impact on water quality. Check matters such as whether:
• dairy conversions have been occurring
• stock numbers have been as expected
• calving has taken place
• stock has been moved
• animal wastes have been irrigated
• animal waste treatment systems have performed poorly
• riparian strips have been installed or damaged
• pasture was overgrazed before heavy rain fell.
Information acquired during the investigation, or simply from undertaking the risk assessment,
may highlight actions that could be taken to reduce the level of risk to the supply. For example,
high contaminant levels in the raw water may occur on an infrequent basis and consequently
may not become evident from monitoring. These low-frequency events may challenge the
treatment plant’s ability to reduce oocyst concentrations to an acceptable level, even if the
treatment plant is compliant with the DWSNZ. Knowing the cause of these events could provide
a guide to remedial actions needed in the catchment. Water suppliers would be expected to
address such matters in their PHRMPs.
8.2.6 Catchment categorisation review
In the DWSNZ, section 5.2.1.1 Catchment risk category approach includes: “Reassessments must
be made at at least five-yearly intervals”. Section 5.2.1.2 Cryptosporidium monitoring includes:
“The protozoa monitoring programme must be repeated at at least five-yearly intervals”. Taken
literally, this could involve an unnecessarily expensive process. The real intent is explained
below in a) or b).
Also, a source water that receives a higher level of treatment than the minimum requirement
only needs to be reviewed if the source water quality is likely to deteriorate markedly. For
example, some water suppliers have chosen to process a 3-log source water in a 4-log removal
water treatment plant. Even if the source water quality deteriorated, it is highly improbable that
it could become a 5-log source water.
If the review suggests the log credit requirement has increased resulting in the need to upgrade
the water treatment process, the water supplier shall address how and when they will do this in
their PHRMP.
a) Catchment risk category approach
Water suppliers whose source water has a risk of requiring an increase in the number of
protozoal log removals should be looking at their catchment on a regular basis, including
attempting to control catchment land use through the regional council consent process, using
NES where appropriate. These activities should be described in their PHRMP. If such a water
supplier believes the risk has increased, they should check whether there is something they can
do about it, such as modifying their intakes or improving their water treatment process – they
should want to do that to ensure that their drinking water remains safe to drink – they should
not wait until the five-yearly review is due.
Note: paragraph 2 in section 5.2.1.1 of the DWSNZ states “Should the assignation of the log
credit made by the Ministry be considered inappropriate, any appeal (section 1.9) must be
supported by data obtained by monitoring Cryptosporidium (section 5.2.1.2)”. It is reasonable
to assume that this approach may also apply to the five-yearly reviews.
b) Cryptosporidium monitoring
The words “The protozoa monitoring programme must be repeated at at least five-yearly
intervals” can be interpreted more reasonably as:
The protozoa monitoring programme must be repeated in response to:
• a change in catchment activities that indicates a likely increase in oocyst numbers; or
• an intention by the water supplier to employ a protozoal treatment with a reduced protozoal log removal rating; or
• an outbreak of waterborne protozoal infection linked to the water supply that is not explained by a lapse in protozoal treatment.
In the extreme (but fairly common) situation of a bush catchment, a water supplier may need to
do no more than state that “the raw water is still being drawn from a bush catchment with no
agricultural activity or human wastes”. It could be helpful if they operate an ongoing predator
control programme.
If the original log removal requirement had been based on protozoa numbers and the review
(also using protozoa numbers) suggests more log credits are required, then that means the
protozoa risk has increased, OR it was simply ‘bad luck’ when the latest samples were collected
(or good luck with the first). How this is handled will depend on the results. For example:
• if the original results produced mean oocysts of say 0.70 per 10 L and the review mean is 0.80, it could be suggested that sampling continue until a clearer pattern has emerged
• but if the results went from say 0.20 per 10 L to 1.25 per 10 L, there can be little to argue about; the source water has certainly changed from 3-log to 4-log
• if most samples were 'less thans' and one was 'large' for no obvious reason, perhaps it should be suggested that sampling continues until there is more confidence that the 'large' result was really an outlier.
Note: section 5.2.1.2 says in the second paragraph: "If the water supplier considers the Cryptosporidium monitoring option results in an inappropriate log credit requirement, the catchment risk categorisation approach as defined in section 5.2.1.1 and Table 5.1a may be adopted". It is reasonable to assume that this approach may also apply to the five-yearly reviews.
8.3 The cumulative log credit approach
Section 5.2 of the DWSNZ explains the cumulative log credit approach for the removal or
inactivation of protozoa. Editions prior to 2005 did not take account of the additive effect of a
series of treatment processes on protozoa removal.
The cumulative effect of successive treatment processes can be calculated by adding the log credits of the qualifying processes that are in continuous use. The log credit approach allows these effects to be added because, arithmetically, percentage removals cannot simply be summed. See Table A1.2 in the DWSNZ for the conversion table from percentage removal to logarithms. Some examples of the calculations follow.
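Why the logs add while the percentages do not can be illustrated as follows. This is a sketch; the function names are illustrative, and the conversion mirrors the percentage-to-log relationship behind Table A1.2.

```python
# Two processes that each remove 90 percent of oocysts (1 log each)
# combine to 99 percent removal (2 logs): the logs add, the percentages
# do not (90% + 90% is not 180%).
import math

def pct_to_log(pct_removal: float) -> float:
    """Convert a percentage removal (< 100) to a log removal."""
    return -math.log10(1 - pct_removal / 100)

def combined_pct(*pcts: float) -> float:
    """Overall percentage removal of processes applied in series."""
    remaining = 1.0
    for p in pcts:
        remaining *= (1 - p / 100)
    return 100 * (1 - remaining)

pct_to_log(90)        # 1.0 log
combined_pct(90, 90)  # 99.0 percent, ie, 2.0 logs
```
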
8.3.1 Calculation of log credits
Example 1: say the influent contained 1000 'things' per litre and the effluent contained 100:
(1000 – 100) / 1000 = 900 / 1000 = 0.90 = 90% removal
log 1000 – log 100 = 3.0 – 2.0 = 1 log removal
Example 2: say the influent contained 100,000 'things' per litre and the effluent contained 10:
(100,000 – 10) / 100,000 = 99,990 / 100,000 = 0.9999 = 99.99% removal
log 100,000 – log 10 = 5 – 1 = 4 log removal
Example 3: say the influent contained 1000 'things' per litre and the effluent contained 30:
(1000 – 30) / 1000 = 970 / 1000 = 0.97 = 97% removal
log 1000 – log 30 = 3.00 – 1.48 = 1.52 log removal (round to 1.5)
Note: Using a spreadsheet, eg, key in =log10(30) to get the log of 30 (ie, 1.4771)
Example 4: raw water turbidity = 0.96 NTU and settled water = 0.51 NTU:
(0.96 – 0.51) / 0.96 = 0.45 / 0.96 = 0.469 = 46.9% removal
log 0.96 – log 0.51 = –0.0177 – (–0.2924) = –0.0177 + 0.2924 = 0.2747 (round to 0.27) log removal
The negative signs make the arithmetic a little more complex, so percentages have been adopted in the DWSNZ section 5.4.1 for the coagulation/sedimentation process not using rapid gravity (or pressure) granular particle filtration.
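The four examples above can be reproduced with a one-line calculation (the function name is an illustrative choice):

```python
# Log removal as used in examples 1-4 above: the difference of the
# base-10 logs of the influent and effluent concentrations.
import math

def log_removal(influent: float, effluent: float) -> float:
    return math.log10(influent) - math.log10(effluent)

round(log_removal(1000, 100), 2)    # 1.0  (example 1)
round(log_removal(100_000, 10), 2)  # 4.0  (example 2)
round(log_removal(1000, 30), 2)     # 1.52 (example 3)
round(log_removal(0.96, 0.51), 2)   # 0.27 (example 4)
```
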
Section 5.2.1 of the DWSNZ and section 8.2 of the Guidelines explain how the source water
protozoal risk categories are determined. The concept of source water categorisation is quite
simple: the dirtier the water, the greater the amount of treatment needed. This is called a risk-based approach. The result is that once a water source has been tested and categorised, water suppliers
will know how many log credits are required in order to comply with the protozoa criteria in the
DWSNZ. They can then choose a process or combination of processes that suits their particular
requirements. See Chapter 4: Selection of Water Source and Treatment, section 4.5 for
discussion relating to treatment processes other than for protozoa.
8.3.2 Which treatment processes are additive for protozoal compliance?
Section 5.2.3 of the DWSNZ explains which treatment processes can be combined for the
purposes of being awarded log credits. The processes discussed below may be preceded by
qualifying bank filtration (0.5 or 1.0 log credit).
1a) Coagulation-based processes (using traditional rapid granular media filtration)
• Coagulation/sedimentation/filtration (3.0 log credit), or
• Coagulation/direct sand filtration (2.5 log credit).
These processes may be followed by:
• enhanced combined filtration (0.5 log credit), or
• enhanced individual filtration (1.0 log credit), or
• secondary granular filtration (eg, sand or carbon) (0.5 log credit).
These processes may be followed by tertiary filtration:
• cartridge filtration (0.5 log credit), or
• bag filtration (0.5 log credit).
1b) Coagulation-based processes (using membrane filtration)
• Coagulation/sedimentation/rapid granular media filtration (3.0 log credit), or
• Coagulation/direct rapid granular media filtration (2.5 log credit), or
• Coagulation/sedimentation without rapid granular media filtration (0.5 log credit).
These processes (1a and 1b) may be followed by membrane filtration (for log credits, see DWSNZ section 5.11, almost entirely 4-log).
1c) Disinfection following a process that uses coagulation
Steps included in 1a) and 1b) can be followed by:
• chlorine dioxide disinfection (dose-dependent log credit), or
• ozone disinfection (dose-dependent log credit), or
• UV disinfection (dose-dependent log credit).
Note that these disinfectants can be used singly or in combination, up to 3 log credits.
2a) Filtration processes without coagulation (using a single filtration process)
• Diatomaceous earth (2.5 log credit), or
• Slow sand (2.5 log credit), or
• Membrane filtration (for log credit, see DWSNZ section 5.11, most likely 4-log), or
• Cartridge filtration (2.0 log credit), or
• Bag filtration (1.0 log credit).
2b) Any option in step 2a can be followed by:
• Chlorine dioxide disinfection (dose-dependent log credit), or
• Ozone disinfection (dose-dependent log credit), or
• UV disinfection (dose-dependent log credit).
Note that these disinfectants can be used singly or in combination, up to 3 log credits.
3a) Filtration processes (using two filtration processes)
• Diatomaceous earth (2.5 log credit), or
• Slow sand (2.5 log credit).
These processes may be followed by:
• membrane filtration (for log credit: see DWSNZ section 5.11, most likely 4-log), or
• cartridge filtration (0.5 log credit), or
• bag filtration (0.5 log credit).
3b) Any option in step 3a can be followed by:
• Chlorine dioxide disinfection (dose-dependent log credit), or
• Ozone disinfection (dose-dependent log credit), or
• UV disinfection (dose-dependent log credit).
Note that these disinfectants can be used singly or in combination, up to 3 log credits.
4) Disinfection only
• Chlorine dioxide disinfection (dose-dependent log credit), or
• Ozone disinfection (dose-dependent log credit), or
• UV disinfection (dose-dependent log credit).
Note that these disinfectants can be used singly or in combination, up to 3 log credits.
Log credits for combinations not shown above may be obtained by application to the Ministry of
Health. See also DWSNZ section 5.17 and section 8.4.5 of the Guidelines.
When filters are not used in a primary role they do not qualify for the full number of log credits.
For example, coagulation/sedimentation/sand filtration earns 3.0 log credits, and coagulation
plus sedimentation without filtration earns 0.5 log credits, implying that a sand filter used in its
primary role is worth 2.5 log credits. But used as a secondary filter in a process that includes
coagulation, it earns only 0.5 log credits. The same approach has been adopted for when
cartridge and bag filters are used in a secondary role.
Credits for filters used following the primary filter are only awarded if they are finer than the
primary filter. Sand filters do not earn any log credits once all (or almost all) the coagulant has
been removed, eg, after membrane filtration. Sand filters operate mainly by charged particles
(associated with floc) adsorbing to the sand grains, which is how particles that are theoretically
small enough to pass through the filter are removed. Without coagulation there is little
adsorption.
The USEPA (2003a, 2006a) states that since the available data are not sufficient to support the
C.t calculation for an inactivation level greater than 3 log, total disinfection inactivation credits
are limited to less than or equal to 3 log. If one disinfection system is operated such that 3 log
inactivation of Cryptosporidium is being achieved, then it is not likely that a second disinfectant
(also dosed sufficiently to achieve 3 logs) would inactivate any (or many) more oocysts. So
using two disinfectants would not realistically be additive, ie, it wouldn’t deserve 6 log credits.
The second disinfectant is certainly an additional barrier, but beyond 3 log, it is not an additive
one. A second disinfectant doesn’t necessarily improve the inactivation of Cryptosporidium
oocysts in the water (unless of course one of the barriers fails, but while it’s broken down it
earns zero credits).
However, using filtration plus disinfection does constitute two barriers, two completely different
treatment processes, so these are additive.
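As a minimal sketch of the additivity rules described above (the function name and structure are illustrative, not part of the DWSNZ), filtration credits and disinfection credits add together, while the combined disinfection contribution is capped at 3 log:

```python
# Illustrative sketch only, not an official calculation tool. Physical removal
# (filtration) credits and disinfection credits are additive, but the total
# disinfection contribution is limited to 3 log.

def total_log_credits(filtration_credits, disinfection_credits):
    """Combine log credits from removal barriers and disinfectants."""
    return sum(filtration_credits) + min(sum(disinfection_credits), 3.0)

# Cartridge filtration (2 log) plus UV dosed for 1 log: additive, 3 log total.
print(total_log_credits([2.0], [1.0]))    # 3.0
# Two disinfectants each dosed for 3 log are not additive beyond 3 log.
print(total_log_credits([], [3.0, 3.0]))  # 3.0
```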
The following examples illustrate how to match treatment processes to the log credits required
by the source water categorisation:
242
Guidelines for Drinking-water Quality Management for New Zealand 2013
To earn 2 log credits
Non-secure bore waters drawn from an unconfined aquifer more than 30 m below the surface
need only 2 log credits to satisfy the protozoa compliance criteria. This can be achieved by
filtering the water, eg, using diatomaceous earth or cartridge filtration. It can also be achieved
by disinfecting at the appropriate dose of UV, ozone or chlorine dioxide. Note, however, that if UV disinfection is being used for protozoal compliance without chlorine, the UV dose will need to be equivalent to 40 mJ/cm2 in order to achieve bacterial compliance criterion 5 (see section 4.3.5 of the DWSNZ).
To earn 3 log credits
Most surface water supplies, non-secure bore waters less than 30 m deep, and springs should
only need to achieve 3 protozoal log credits (refer section 8.2). Of course, water suppliers may
want the security or back-up of achieving more than the minimum requirement.
a) Source waters without colour and with fairly consistently low turbidity, such as non-secure bore waters and upland streams in catchments without much bush or peaty soil, probably do not need to use a chemical coagulation process. Subject to meeting all the requirements of the DWSNZ, they may choose to:
• dose enough ozone and/or UV light to satisfy the C.t values in Tables 5.6 or 5.7 of the DWSNZ to earn 3 log credits
• use bag filtration (1 log) plus enough ozone and/or UV to earn 2 more log credits
• use cartridge filtration (2 log) plus enough ozone and/or UV to earn 1 more log credit
• use diatomaceous earth filtration (2.5 log) plus a low dose of UV to earn another 0.5 log credits
• use bank filtration (0.5 or 1.0 log credits) plus cartridge filtration (2 log) plus a low UV dose.
b) Source waters that need or choose to use coagulation to remove colour, or where the turbidity is too high for filtration-only systems, may:
• use alum/PAC coagulation with sedimentation/DAF plus rapid granular media filtration so that water leaving each filter is less than 0.30 NTU (3 log)
• use direct filtration (2.5 log) plus enhanced filtration (0.5 log if the combined filtered water turbidity is less than 0.15 NTU)
• use coagulation with direct rapid granular media filtration (2.5 log) plus a low dose of UV, ozone or chlorine dioxide to earn another 0.5 log
• use diatomaceous earth filtration (2.5 log) plus enough ozone (which may remove some of the colour too) to earn another 0.5 log.
To earn 4 log credits
Probably only a few New Zealand source waters will need to achieve 4 log removal. If the water is not highly coloured or turbid, this can be achieved simply by increasing the disinfectant dose in the last four examples in a) or the last two examples in b).
A number of options are available for source waters that are not so clean. Some are:
• use alum/PAC coagulation with sedimentation/DAF plus rapid gravity sand filtration (3 log) plus a fairly low dose of UV and/or ozone to earn another 1 log
• use coagulation with direct filtration (2.5 log) plus enough UV and/or ozone to earn another 1.5 log credits
• use bank filtration (0.5 or 1.0 log) with direct filtration (2.5 log) plus enough UV and/or ozone to earn another 0.5 or 1.0 log credits
• use coagulation, sedimentation plus filtration (3 log) plus enhanced filtration (0.5 log if the combined filtered water turbidity is less than 0.15 NTU), plus a low dose of UV, ozone or chlorine dioxide to earn another 0.5 log
• use membrane filtration, which will probably earn 4 log credits, along with whatever other treatment is necessary to achieve the desired filtered water quality.
These are just examples. As can be seen, many treatment processes can be combined in order to
achieve the required number of log credits.
Note that although chlorine is not effective in the inactivation of Cryptosporidium, it is still a
very effective disinfectant for other micro-organisms. Subject to C.t values, chlorine can
inactivate Giardia cysts. It also allows a residual to pass into and persist through the
distribution system. Its use is still highly recommended.
Water is abstracted from some very dirty rivers overseas, such as the lower reaches of the Rhine. Treatment there could include bank filtration, coagulation, membrane filtration, disinfection with ozone followed by biologically active filters, and then chlorination to maintain a residual in the distribution system. This treatment train could earn 6–9 log credits! It is not expected that any New Zealand source water would need 5 protozoal log credits.
8.4 Compliance
This section offers guidance on compliance issues for each treatment process that earns
protozoal log credits, plus some that don’t. The concept is based on the USEPA (2003a, 2006a)
Long Term 2 Enhanced Surface Water Treatment Rule (known as the LT2ESWTR). Additional
information is available in the Long Term 2 Enhanced Surface Water Treatment Rule, Toolbox
Guidance Manual Review Draft (USEPA 2009). Chapters 12–15 of the Guidelines discuss the
technical and operational aspects of these treatment processes.
Because the approach adopted in the DWSNZ for demonstrating protozoal compliance is a
relatively new concept, some of the background studies that the USEPA used in developing the
log credits and associated criteria have been included in the following sub-sections. Some of the
studies give useful information about treatment performance and efficiency. The World Health
Organization (eg, WHO 2011) is moving in the same direction, being a logical extension of their
long-held belief in the value of multiple barriers.
Section 8.4.5 covers treatment processes not included in the DWSNZ.
Water supplies using any of the following removal or inactivation processes, but not for the
purpose of gaining protozoa log credits, do not need to satisfy the protozoal compliance criteria.
8.4.1
Pretreatment processes
8.4.1.1
Bankside filtration
Process description
Bank filtration is a water treatment process that makes use of surface water that has naturally
infiltrated under the ground via the riverbed or bank(s) and is recovered via a pumping well.
Bank filtrate is water drawn into a pumping well from a nearby surface water source which has
travelled through the subsurface, either vertically, horizontally or both, mixing to some degree
with other groundwater.
It is envisaged that the greatest benefit of this process may be for water suppliers using turbid or
flashy rivers.
Operational aspects of bankside filtration and infiltration galleries are discussed in Chapter 12:
Treatment Processes, Pretreatment, section 12.3.1. Refer also to Chapter 4 of the review draft
LT2ESWTR Toolbox Guidance Manual (USEPA 2009) which discusses issues related to bank
filtration.
DWSNZ criteria
See section 5.3 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the 0.5 or 1.0 log credit. If the bank filtration process is not being
operated for the purpose of protozoal compliance, these requirements do not need to be met.
The DWSNZ (requirement 5 in section 5.3.1) state that for manual sampling:
• there is documented evidence that the turbidity (of the abstracted water) does not exceed 2.0 NTU during the week after a flood.
This does not require on-going turbidity measurement. A survey of turbidity levels (a minimum
of 30 data points) and river flows/depths over a week following a major flood is sufficient; the
rainfall leading up to the flood should be recorded as well.
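A minimal sketch of checking such a survey (the function name and data layout are assumptions for illustration, not part of the DWSNZ):

```python
# Illustrative check of a post-flood turbidity survey: at least 30 data
# points, with no reading of the abstracted water exceeding 2.0 NTU.

def survey_meets_requirement(turbidity_ntu, limit_ntu=2.0, min_points=30):
    """True if the survey has enough data points and none exceed the limit."""
    return len(turbidity_ntu) >= min_points and max(turbidity_ntu) <= limit_ntu

readings = [0.8, 1.2, 1.9] * 10            # 30 readings, all below 2.0 NTU
print(survey_meets_requirement(readings))  # True
```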
Wells near rivers do not always deliver river water. It is possible that they intercept shallow
groundwater (most probably from an unconfined aquifer), and in this case, the process does not
qualify for log credits. Therefore it needs to be demonstrated that the water abstracted by the
bank filtration process is in fact from the river. This can be done by comparing water analyses,
or can be confirmed by a hydrogeological survey.
Further information
Most of the following discussion has been taken from the LT2ESWTR (USEPA 2003a). Most of
the data assessed by the USEPA were from studies of aquifers developed in Dutch North Sea
margin sand dune fields and, therefore, represent optimal removal conditions consistent with a
homogenous, well sorted (by wind), uniform sand filter.
Only granular aquifers are eligible for bank filtration credit. Granular aquifers are those composed of sand, clay, silt, rock fragments, pebbles or larger particles, with minor cement. The aquifer material is required to be unconsolidated, with subsurface samples friable upon touch.
The aquifer at the well site must be characterised to determine aquifer properties. At a
minimum, the aquifer characterisation must include the collection of relatively undisturbed,
continuous, core samples from the surface to a depth equal to the bottom of the well screen. The
proposed site must have substantial core recovery during drilling operations; specifically, the
recovered core length must be at least 90 percent of the total projected depth to the well screen.
Samples of the recovered core must be submitted to a laboratory for sieve analysis to determine
grain size distribution over the entire recovered core length. Each sieve sample must be acquired
at regular intervals over the length of the recovered core, with one sample representing a
composite of each metre of recovered core. Because it is anticipated that wells will range from
15 to 30 metres in depth, a metre sampling interval will result in about 15 to 30 samples for
analysis. Each sampled interval must be examined to determine if more than ten percent of the
grains in that interval are less than 1.0 mm in diameter.
The length of core with more than ten percent of the grains less than 1.0 mm in diameter must
be summed to determine the overall core length with sufficient fine-grained material so as to
provide adequate removal. An aquifer is eligible for removal credit if at least 90 percent of the
sampled core length contains sufficient fine-grained material as defined.
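The eligibility test above can be sketched as follows (the function name and data layout are assumptions for illustration; each element represents one 1 m composite sieve sample):

```python
# Sketch of the aquifer-eligibility test described above. Each element of
# core_intervals is the fraction of grains in one 1 m composite sieve sample
# that are smaller than 1.0 mm in diameter.

def aquifer_eligible(core_intervals, fine_fraction=0.10, min_coverage=0.90):
    """An interval is fine-grained if more than 10 percent of its grains are
    less than 1.0 mm; the aquifer qualifies if at least 90 percent of the
    sampled core length consists of fine-grained intervals."""
    fine = sum(1 for f in core_intervals if f > fine_fraction)
    return bool(core_intervals) and fine / len(core_intervals) >= min_coverage

# 20 m core with 19 fine-grained metres (95 percent): eligible.
print(aquifer_eligible([0.25] * 19 + [0.05]))      # True
# 20 m core with 17 fine-grained metres (85 percent): not eligible.
print(aquifer_eligible([0.25] * 17 + [0.05] * 3))  # False
```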
Cryptosporidium oocysts have a natural affinity for attaching to fine-grained material. The
value of 1.0 mm for the bounding size of the sand grains was determined based on calculations
performed by Harter using data from Harter et al (2000). Harter showed that for groundwater
velocities typical of a bank filtration site (1.5 to 15 m/day), a typical bank filtration site
composed of grains with a diameter of 1.0 mm would achieve at least 1.0 log removal over a
50 foot (approximately 15 m) transport distance. Larger-sized grains would achieve less removal, all other factors
being equal.
A number of devices are used for the collection of groundwater including horizontal and vertical
wells, spring boxes, and infiltration galleries. Among these, only horizontal and vertical wells are
eligible for log removal credit.
Horizontal wells are designed to capture large volumes of surface water recharge. They typically
are constructed by the excavation of a central vertical caisson with laterals that extend
horizontally from the caisson bottom in all directions or only under the riverbed. Groundwater
flow to a horizontal well that extends under surface water is predominantly downward. In
contrast, groundwater flow to a vertical well adjacent to surface water may be predominantly in
the horizontal direction. For horizontal wells, the laterals must be located at least 7.5 m distant
from the normal-flow surface water riverbed for 0.5 log removal credit and at least 15 m distant
from the normal-flow surface water riverbed for 1.0 log Cryptosporidium removal credit. The
groundwater flow path to a horizontal well is the measured distance from the bed of the river
under normal flow conditions to the closest horizontal well lateral.
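The distance rule for horizontal wells can be expressed as a small sketch (the function name is illustrative only):

```python
# Sketch of the lateral-distance rule described above for horizontal wells.

def horizontal_well_credit(distance_m):
    """Cryptosporidium log removal credit for a horizontal well, based on the
    measured distance from the normal-flow riverbed to the closest lateral:
    at least 15 m earns 1.0 log, at least 7.5 m earns 0.5 log."""
    if distance_m >= 15.0:
        return 1.0
    if distance_m >= 7.5:
        return 0.5
    return 0.0

print(horizontal_well_credit(16.0))  # 1.0
print(horizontal_well_credit(9.0))   # 0.5
```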
A spring box is located at the ground surface and is designed to contain spring outflow and
protect it from surface contamination until the water is utilised. Often, localised fracturing or
solution-enhanced channels are the cause of the focused discharge to the spring orifice. These
fractures and solution channels have significant potential to transport microbial contaminants
so that natural filtration may be poor. Thus, spring boxes are not proposed to be eligible for
bank filtration credit.
An infiltration gallery is typically a slotted pipe installed horizontally into a trench and backfilled
with granular material. The gallery is designed to collect water infiltrating from the surface or to
intercept groundwater flowing naturally toward the surface water. The infiltration rate may be
manipulated by varying the properties of the backfill or the nature of the soil-water interface.
Because the filtration properties of the material overlying an infiltration gallery may be designed
or purposefully altered to optimise oocyst removal or for other reasons, this engineered system
is not bank filtration, which relies solely on the natural properties of the system. The protozoal
log removal requirement for river water drawn from a non-qualifying infiltration gallery can be
assessed by sampling for oocysts in the raw water instead of the river. An infiltration gallery that is designed to the requirements of bankside filtration, and performs accordingly, may earn protozoal log credits.
8.4.1.2
Off-river storage
The 2001 draft of the LT2ESWTR acknowledged the benefits of off-river raw water storage: it was intended to give presumptive credits of 0.5 log and 1 log for reservoirs with hydraulic detention times of 21 and 60 days, respectively.
The USEPA (2003a) subsequently concluded that the data they assessed illustrated the
challenge in reliably estimating the amount of removal that will occur in any particular storage
reservoir. Because of this variability and the relatively small amount of available data, they
decided it was too difficult to extrapolate from these studies to develop nationally applicable
criteria for awarding removal credits to raw water storage.
See section 12.3.2 in Chapter 12: Treatment Processes: Pretreatment for further discussion on
off-river storage.
Section 8.2 discusses the Cryptosporidium monitoring requirements for determining the
protozoal risk categorisation of source waters. The benefit of any Cryptosporidium die-off
during off-river storage will be acknowledged by sampling the water arriving at the treatment
plant. Refer also to Chapter 3 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA
2009) which discusses issues related to alternative sources and intakes.
8.4.1.3
Presedimentation (only with chemical coagulation)
The USEPA (2003a) proposed in its draft LT2ESWTR to award a presumptive 0.5 log
Cryptosporidium treatment credit for presedimentation that meets the following three criteria:
1 the presedimentation basin must be in continuous operation and must treat all of the flow reaching the treatment plant
2 the system must continuously add a coagulant to the presedimentation basin
3 the system must demonstrate on a monthly basis at least 0.5 log reduction of influent turbidity through the presedimentation process in at least 11 of the 12 previous consecutive months. This monthly demonstration of turbidity reduction must be based on the arithmetic mean of at least daily turbidity measurements in the presedimentation basin influent and effluent.
Note that in the DWSNZ, the 0.5 log reduction has been equated to 70 percent removal, in order
to make the arithmetic easier; see section 8.3 for how to convert percent removal to log
reduction.
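The conversion between percent removal and log reduction is straightforward; a minimal sketch (function names are illustrative):

```python
import math

# Log reduction = log10(influent / effluent) = -log10(fraction remaining).

def percent_to_log_reduction(percent_removed):
    """Convert percent removal to log reduction."""
    return -math.log10(1.0 - percent_removed / 100.0)

def log_reduction_to_percent(log_reduction):
    """Convert log reduction to percent removal."""
    return 100.0 * (1.0 - 10.0 ** (-log_reduction))

print(round(log_reduction_to_percent(0.5), 1))   # 68.4, rounded to 70 percent
print(round(percent_to_log_reduction(70.0), 2))  # 0.52
```

A 0.5 log reduction corresponds to about 68.4 percent removal, which the DWSNZ round to 70 percent for convenience.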
The criteria were based on an assessment of data relating mean turbidity reduction and the
percent of months when mean aerobic spore removal was at least 0.5 log. Data indicate that
aerobic spores may serve as a surrogate for Cryptosporidium removal by sedimentation,
provided optimal chemical dosage conditions apply. Satisfying the criteria appears to provide
approximately 90 percent assurance that average spore (and hence oocyst) removal will be
0.5 log or greater.
In most parts of the world, presedimentation is usually no more than a pond that has been dug
out between the intake and the plant for the purpose of reducing the gross solids load on the
sedimentation tanks. Sometimes alum is dosed crudely into the presedimentation basin to
enhance settling. To distinguish between these two types of presedimentation, the DWSNZ
discuss the USEPA concept of presedimentation in section 5.4, where it is more commonly
termed sedimentation in New Zealand.
Refer to Chapter 12: Treatment Processes, Pretreatment, section 12.3.3 for a discussion on
operational and performance aspects of presedimentation. Refer also to Chapter 5 of the review
draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) which discusses issues related to
presedimentation.
Section 8.2 discusses the Cryptosporidium monitoring requirements for determining the
protozoal risk categorisation of source waters. The benefit of any Cryptosporidium removal by
using presedimentation (with or without coagulation) will be acknowledged by sampling the
water arriving at the treatment plant.
8.4.1.4
Watershed control
The LT2ESWTR Final Rule (USEPA 2006a) allows a 0.5 log credit for water supplies where an
approved ‘watershed control programme’ has been implemented and carried out. The criteria
required for compliance are not particularly quantitative and consequently fairly difficult to
apply and assess. In developing the DWSNZ, it was decided that it would be more practicable to
incorporate the benefit of any watershed enhancement by monitoring the Cryptosporidium
oocysts at the raw water intake or as the water reaches the water treatment plant.
Protecting source water quality should be standard water supply practice. Chapter 2 of the
LT2ESWTR Toolbox Guidance Manual Review Draft (USEPA 2009) discusses aspects related
to watershed control in detail; it includes a large, useful list of references.
8.4.2 Coagulation processes
8.4.2.1
Coagulation, sedimentation, filtration
Process description
Sometimes called full or conventional treatment, the coagulation, flocculation, sedimentation and filtration process involves dosing a chemical, most commonly aluminium sulphate or PAC (polyaluminium chloride), that forms a floc to which particulate and colloidal matter attaches before separating out in a basin or tank by sedimentation or flotation. Settled water is then passed through rapid gravity (or sometimes pressure) granular (usually sand) filters.
Coagulation and sedimentation without (or prior to) filtration is what the USEPA refers to as
presedimentation. See section 8.4.1.3.
Note that sand filtration without chemical coagulation does not remove protozoa from water
with any reliability, so does not qualify for protozoal log credits. Some very fine media
proprietary filter systems are on the market; they need to be validated and are covered in
section 5.17 of the DWSNZ; see section 8.4.5 of these Guidelines.
Operation of the process is discussed in Chapter 13: Treatment Process, Coagulation. Refer also
to Chapters 5 and 6 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009)
which discuss issues related to sedimentation and lime softening processes.
DWSNZ criteria
• Coagulation, sedimentation without filtration: 0.5 log.
• Coagulation, sedimentation with rapid gravity sand filtration: 3.0 log.
See section 5.4 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for log credits. In the unlikely event that the process is not being
operated for the purpose of protozoal compliance, these requirements do not need to be met;
theoretically, a water supply could achieve 3 log credits for disinfecting with ozone or UV even if
the coagulation, sedimentation and filtration process fails to comply.
Success or failure in satisfying the criteria for this process depends very much on the care taken in collecting and testing samples for turbidity. Turbidity monitoring is discussed in section 8.6.2.1.
Particle counting or particle monitoring can be used instead, but these tests are not easy to
perform, see section 8.6.2.2. The DWSNZ do not include a criterion for the size or number of
particles. Results tend to be instrument specific, so the performance can be assessed either by
the absolute number or the log removal of particles, or a combination. Any water supplier
electing to use particle counting or particle monitoring should contact the Ministry of Health for
further information.
Further information
In its proposed LT2ESWTR, the USEPA (2003a) surveyed studies of the performance of
treatment plants in removing Cryptosporidium, as well as other micron-sized particles (eg,
aerobic spores) that may serve as indicators of Cryptosporidium removal. They concluded that
these studies supported an estimate of 3 log (99.9 percent) for the average Cryptosporidium
removal efficiency in conventional water treatment plants. Nearly all of the filter runs evaluated
in the survey exhibited spikes where filtered water particle counts increased, and pilot work
showed that pathogens are more likely to be released during these spike events.
Full-scale plants in these studies typically demonstrated 2–3 log removal of Cryptosporidium,
and pilot plants achieved up to almost 6 log removals under optimised conditions. In general,
the degree of removal that can be quantified in full-scale plants is limited because
Cryptosporidium levels following filtration are often below the detection limit of the analytical
method. Pilot studies overcome this limitation by seeding high concentrations of oocysts to the
plant influent, but extrapolation of the performance of a pilot plant to the routine performance
of full-scale plants is uncertain. Cryptosporidium removal efficiency in these studies was
observed to depend on a number of factors including: water quality, coagulant application,
treatment rates and optimisation, filtered water turbidity, and the filtration cycle. The highest
removal rates were observed in plants that achieved very low effluent turbidities.
Due to the shortage of data relating to oocysts, the USEPA (2003a) evaluated data provided by
water suppliers on the removal of other types of particles, mainly aerobic spores, in the
sedimentation processes of full-scale plants. Data indicate that aerobic spores may serve as a
surrogate for Cryptosporidium removal by sedimentation provided optimal chemical dosage
conditions apply (Dugan et al 2001).
Data on the removal of spores (Bacillus subtilis and total aerobic spores) during operation of
full-scale sedimentation basins were collected independently and reported by three water
suppliers. A summary of this spore removal data is shown in Table 8.1.
Table 8.1: Mean spore removal for full-scale sedimentation basins

Water treatment plant          Mean spore removal
St Louis                       1.1 log (B. subtilis)
Kansas City                    0.8 log (B. subtilis); 0.46 log (B. subtilis) without coagulant
Cincinnati (lamella plates)    0.6 log (total aerobic spores)
The USEPA (2003a) analysed the relationship between removal of spores and reduction in
turbidity by sedimentation for the three water supplies that provided these data. Results of this
analysis are summarised in Table 8.2, which shows the relationship between monthly mean
turbidity reduction and the percent of months when mean spore removal was at least 0.5 log.
Table 8.2: Relationship between mean turbidity reduction during sedimentation and the percent of months when mean spore removal was at least 0.5 log

Log reduction in turbidity    Percent of months with at least 0.5 log
(monthly mean)                mean reduction in spores
Up to 0.1                     64%
Up to 0.2                     68%
Up to 0.3                     73%
Up to 0.4                     78%
Up to 0.5                     89%
Up to 0.6                     91%
Up to 0.7                     90%
Up to 0.8                     89%
Up to 0.9                     95%
Up to 1.0                     96%
To simplify the arithmetic, the DWSNZ adopted a 70 percent turbidity reduction criterion instead of the 0.5 log in order for the sedimentation process to qualify for the 0.5 log credit; see section 5.4 of the DWSNZ. The 0.5 log credit applies only when the coagulation, sedimentation process is not followed by rapid gravity sand filtration; normally it is, in which case 3 log credits are possible.
When the raw water turbidity is low, most coagulation/sedimentation processes may have some
difficulty achieving 70 percent reduction. For example, if the raw water turbidity averages
3 NTU for a few months of the year, the average settled water turbidity would have to be less
than 0.9 NTU for those months, which could be difficult for some plants to achieve.
One study (Dugan et al 2001) evaluated the ability of conventional treatment to remove
Cryptosporidium under varying water quality and treatment conditions, and assessed turbidity,
aerobic spores, and total particle counts (TPC) as indicators of Cryptosporidium removal.
Under optimal coagulation conditions, oocyst removal across the sedimentation basin ranged
from 0.6 to 1.8 log, averaging 1.3 log, and removal across the filters ranged from 2.9 to greater
than 4.4 log, averaging greater than 3.7 log. Removal of aerobic spores, TPC, and turbidity all
correlated with removal of Cryptosporidium by sedimentation, and these parameters were
conservative indicators of Cryptosporidium removal across filtration. Suboptimal coagulation
conditions (underdosed relative to jar test predictions) significantly reduced plant performance.
Under those conditions, oocyst removal in the sedimentation basin averaged 0.2 log, and
removal by filtration averaged 1.5 log.
Harrington et al (2001) studied the removal of Cryptosporidium by sedimentation and
dissolved air flotation (DAF) using bench scale jar tests and pilot scale conventional treatment
trains. In the bench scale experiments, all run at optimised coagulant doses, mean log removal
of Cryptosporidium was 1.2 by sedimentation and 1.7 by DAF.
Lime softening is a water treatment process that uses precipitation with lime and other
chemicals to reduce hardness and enhance clarification prior to filtration. A single-stage
softening plant, which is used to remove calcium hardness, includes a primary clarifier and
filtration components. The USEPA (2003a) has determined that lime softening plants achieve a
level of Cryptosporidium removal equivalent to conventional treatment plants (ie, average of
3 log).
8.4.2.2 Coagulation, direct filtration
Process description
This process is similar to full or conventional treatment as described in section 8.4.2.1, except
that there is no sedimentation or flotation step. Because all the particulate and colloidal matter
is removed by rapid gravity or sometimes pressure sand filters, this process is only appropriate
for relatively clean raw waters, particularly if they do not experience sudden changes in quality.
Operation of the process is discussed in Chapter 13: Treatment Process, Coagulation.
DWSNZ criteria
See section 5.5 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the 2.5 log credits. In the unlikely event that the process is not
being operated for the purpose of protozoal compliance, these requirements do not need to be
met; theoretically, a water supply could achieve sufficient log credits for disinfecting with ozone
or UV, even if the coagulation and filtration process fails to comply.
Refer to sections 8.6.2.1 and 8.6.2.2 for comments about turbidity monitoring and particle counting.
There have been reports of direct filtration plants failing without the operator noticing. This can
happen when alum is added to a low turbidity raw water (say 0.35 NTU) at the wrong dose or
wrong pH. The filtered water turbidity may be slightly lower than the raw water (say 0.30 NTU)
but little floc has formed so the filters are largely ineffective, allowing protozoa (oo)cysts to pass
through, while the water appears to comply with the DWSNZ. Plants where this has happened
often have raw water with low turbidity but high colour, usually occurring when the water is cold
(say <10°C). It may be advisable at plants where this can happen to include residual aluminium
or UV absorbance monitoring.
Filter backwashing does not leave the filter 100 percent clean. As a result, the filtrate for several minutes after backwashing can have an elevated turbidity, even if the filter
has a slow start mechanism. Modern filter design usually arranges to waste filtered water for the
first 10–20 minutes. The turbidity of this water does not need to be monitored for compliance
purposes. However, it would be useful to monitor it for operational reasons.
Further information
The USEPA (2003a) has concluded that the majority of available data support a lower estimate
of Cryptosporidium removal efficiency for direct filtration plants. Pilot and full-scale studies
demonstrate that sedimentation basins, which are absent in direct filtration, can achieve 0.5 log
or greater Cryptosporidium reduction.
Emelko et al (2000) investigated Cryptosporidium removal during vulnerable filtration periods
using a pilot scale direct filtration system. The authors evaluated different operational
conditions: stable, early breakthrough and late breakthrough. During stable operation, effluent
turbidity was approximately 0.04 NTU and Cryptosporidium removal ranged from 4.7 to
5.8 log. In the early breakthrough period, effluent turbidity increased from approximately 0.04
to 0.2 NTU, and Cryptosporidium removal decreased significantly, averaging 2.1 log. For the
late breakthrough period, where effluent turbidity began at approximately 0.25 NTU and ended
at 0.35 NTU, Cryptosporidium removal dropped to an average of 1.4 log.
8.4.2.3 Second stage filtration
Process description
In the proposed LT2ESWTR, the USEPA (2003a) states that only water treatment plants that
include chemical coagulation and rapid sand or dual media filtration, with or without
sedimentation, qualify for log credits when using secondary filtration. Secondary filtration
(called second stage filtration in the DWSNZ) consists of rapid sand, dual media, granular
activated carbon (GAC), or other fine grain media in a separate filtration stage. The USEPA
(2003a) considered that secondary filtration log credits were appropriate based on the
theoretical consideration that the same mechanisms of pathogen removal will be operative in
both a primary and secondary filtration stage. Therefore, shallow bed, coarse media, high rate
filtration systems cannot comply. A cap, such as GAC or anthracite, on a single stage of filtration
will not qualify for this credit.
The DWSNZ also allow the use of cartridge and membrane filtration when used as secondary
filters, provided they also follow chemical coagulation and rapid sand or dual media filtration,
with or without sedimentation. The rationale is that these filters are fine enough to trap particles
as small as protozoa that pass through the primary filter. It was decided to award these filtration
processes 0.5 log credits also. See section 8.3 for further discussion. Secondary filters that are
coarser than the primary filters will not noticeably enhance further removal of protozoa.
Operation of the rapid sand or dual media filtration process is discussed in Chapter 13:
Treatment Process, Coagulation. Cartridge and membrane filtration are discussed in Chapter 14:
Treatment Process, Filtration. Refer also to Chapter 9 of the review draft LT2ESWTR Toolbox
Guidance Manual (USEPA 2009) which discusses issues related to second stage filtration.
252
Guidelines for Drinking-water Quality Management for New Zealand 2013
DWSNZ criteria
See section 5.6 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the additional 0.5 log credit. Note that sand or carbon grain size
(or their effective sizes and uniformity coefficients), bed depth, and filtration rates are not
specified; log credits are awarded on the ability to achieve a specified filtrate turbidity. Refer to
sections 8.6.2.1 and 8.6.2.2 for comments about turbidity monitoring and particle counting.
In responding to the draft USEPA LT2ESWTR proposal, all commenters opposed setting
regulatory design standards for secondary filters on the basis that water suppliers and states
need the flexibility to determine appropriate treatment designs. Consequently, the USEPA did
not establish filter design criteria in their final rule (USEPA 2006a), but required that states
approve the second-stage filtration design. Similarly, any New Zealand water supplier wishing to
qualify for log credits for second-stage filtration will need to have the filter design checked by a
DWA.
Further information
Data on increased removal resulting from a second stage of filtration are limited, and there is
uncertainty regarding how effective secondary filtration will be in reducing levels of microbial
pathogens that are not removed by the first stage of filtration.
The USEPA (2003a) received data from the City of Cincinnati, Ohio, on the removal of aerobic
spores through a conventional treatment plant using GAC contactors for DBP, taste, and odour
control after rapid sand filtration. During 1999 and 2000, the mean values of reported spore
concentrations in the influent and effluent of the GAC contactors were 35.7 and 6.4 cfu/100 mL
respectively, indicating an average removal of 0.75 log across the contactors. Approximately
16 percent of the GAC filtered water results were below detection limit (1 cfu/100 mL) so the
actual log spore removal may have been greater than indicated by these results.
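The 0.75 log figure can be checked directly from the reported mean spore counts. A minimal sketch (the helper function name is ours, not from the Guidelines):

```python
import math

def log_removal(influent_cfu: float, effluent_cfu: float) -> float:
    """Log10 reduction across a treatment stage (illustrative helper)."""
    return math.log10(influent_cfu / effluent_cfu)

# Mean spore counts reported for the Cincinnati GAC contactors (cfu/100 mL)
lrv = log_removal(35.7, 6.4)
print(round(lrv, 2))  # 0.75
```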
8.4.2.4 Combined filtration
Process description
The enhanced combined filtration category is for water treatment plants that practise
continuous chemical coagulation, with or without sedimentation. Only rapid granular or dual
media filters qualify for log credits when using combined filtration for protozoa removal. The
additional 0.5 log credit is awarded for achieving and maintaining a lower turbidity in the
combined filtered water than required in section 5.3 of the DWSNZ. Refer also to Chapter 7 of
the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) which discusses issues
related to combined and individual filter performance.
DWSNZ criteria
See section 5.7 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the additional 0.5 log credit. Refer to section 8.6.2.1 for
comments about turbidity monitoring and 8.6.2.2 for particle counting.
Further information
In its proposed LT2ESWTR, the USEPA (2003a) reviewed studies that evaluated the efficiency
of granular media filtration in removing Cryptosporidium when operating at different effluent
turbidity levels.
The USEPA considered that plants attempting to meet a turbidity standard of 0.15 NTU in
95 percent of samples will consistently operate below 0.10 NTU in order to ensure continuous
compliance. (This could in effect be their control limit.) Therefore the USEPA compared
Cryptosporidium removal efficiency when effluent turbidity was 0.10 NTU or less with removal
efficiency in the range of 0.11 to 0.20 NTU.
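A criterion of this style (a turbidity limit met in 95 percent of samples) can be expressed as a short sketch; the function and the sample values are invented for illustration, not taken from the DWSNZ:

```python
def meets_turbidity_standard(samples, limit_ntu=0.15, fraction=0.95):
    """True if at least `fraction` of turbidity samples are at or below `limit_ntu`."""
    within = sum(1 for s in samples if s <= limit_ntu)
    return within / len(samples) >= fraction

# Invented monitoring data for a plant operating around a 0.10 NTU control limit
samples = [0.06, 0.08, 0.09, 0.10, 0.09, 0.07, 0.11, 0.08, 0.10, 0.09,
           0.08, 0.07, 0.09, 0.10, 0.08, 0.09, 0.07, 0.08, 0.16, 0.09]
print(meets_turbidity_standard(samples))  # True: 19 of 20 samples (95 percent) comply
```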
Patania et al (1995) conducted pilot-scale studies at four locations to evaluate the removal of
seeded Cryptosporidium and Giardia, turbidity, and particles. Treatment processes, coagulants,
and coagulant doses differed among the four locations. Samples of filter effluent were taken at
times of stable operation and filter maturation.
Emelko et al (1999) used a bench scale dual media filter to study Cryptosporidium removal
during both optimal and challenged operating conditions. Water containing a suspension of
kaolinite clay was spiked with oocysts, coagulated in-line with alum, and filtered. Oocyst
removal was evaluated during stable operation when effluent turbidity was below 0.10 NTU.
Removal was also measured after a hydraulic surge that caused process upset, and with
coagulant addition terminated. These latter two conditions resulted in effluent turbidities greater
than 0.10 NTU and decreased removal of Cryptosporidium.
Dugan et al (2001) evaluated Cryptosporidium removal in a pilot scale conventional treatment
plant. Sixteen filtration runs seeded with Cryptosporidium were conducted at different raw
water turbidities and coagulation conditions. Eleven of the runs had an effluent turbidity below
0.10 NTU, and five runs had effluent turbidity between 0.10 and 0.20 NTU.
The results from these three studies are summarised in Table 8.3. Cryptosporidium removal
when the turbidity was 0.10 NTU or lower was markedly better than when in the 0.11–0.20 NTU
range (mean improvement 0.85 log, minimum improvement 0.5 log).
Table 8.3: Studies of Cryptosporidium removal at different filtrate turbidity levels

Study     Micro-organism     Log removal at up to 0.10 NTU    Log removal at 0.11–0.20 NTU
Patania   Cryptosporidium    4.4                              3.6
Patania   Giardia            4.2                              3.2
Emelko    Cryptosporidium    4.1                              3.6
Dugan     Cryptosporidium    3.7                              2.6
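The stated mean and minimum improvements can be verified from the Table 8.3 values (variable names are illustrative):

```python
from statistics import mean

# (removal at up to 0.10 NTU, removal at 0.11-0.20 NTU) for each row of Table 8.3
rows = [(4.4, 3.6), (4.2, 3.2), (4.1, 3.6), (3.7, 2.6)]
gains = [clean - elevated for clean, elevated in rows]
print(round(mean(gains), 2), round(min(gains), 2))  # 0.85 0.5
```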
8.4.2.5 Individual filtration
Process description
The enhanced individual filtration category is for water treatment plants that practise chemical
coagulation, with or without sedimentation. Only rapid granular or dual media filters qualify for
log credits when using individual filtration for protozoa removal. The additional 1 log credit is
awarded for achieving and maintaining a lower turbidity in the water leaving each filter than
required in section 5.3 of the DWSNZ. Refer also to Chapter 7 of the review draft LT2ESWTR
Toolbox Guidance Manual (USEPA 2009) which discusses issues related to combined and
individual filter performance.
DWSNZ criteria
See section 5.8 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the additional 1.0 log credit. Refer to section 8.6.2.1 for
comments about turbidity monitoring and 8.6.2.2 for particle counting.
One of the criteria that has to be met is that the turbidity of the water leaving any filter does not
exceed 0.30 NTU for more than 1 percent of the time, over the compliance period; 1 percent of a
30-day month is a total of 7.2 hours or an average of 14.4 minutes per day. This criterion implies
that if filters are washed daily, they will need a run-to-waste facility to avoid the period when
filters traditionally produce their dirtiest water.
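The exceedance arithmetic above can be reproduced in a few lines:

```python
# Individual filter criterion: turbidity above 0.30 NTU for no more than
# 1 percent of the compliance period (a 30-day month in this example)
days = 30
allowed_hours = days * 24 / 100              # 1 percent of the month, in hours
per_day_minutes = allowed_hours * 60 / days  # average daily allowance
print(round(allowed_hours, 1), round(per_day_minutes, 1))  # 7.2 14.4
```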
Further information
Refer to the discussion in the Combined Filtration section above.
In its proposed LT2ESWTR the USEPA (2003a) considered that modestly elevated turbidity
from a single filter may not significantly impact combined filter effluent turbidity levels, but may
indicate a substantial reduction in the overall pathogen removal efficiency of the filtration
process. Consequently, water supplies that continually achieve very low turbidity in each
individual filter are likely to provide a significantly more effective microbial barrier. The USEPA
expects that supplies that select this toolbox option will have achieved a high level of treatment
process optimisation and process control, and will have both a history of consistent performance
over a range of raw water quality conditions and the capability and resources to maintain this
performance long-term.
8.4.3 Filtration processes
Filtration processes such as roughing filters and microstrainers are not discussed in this chapter
because they do not earn log credits for the removal of protozoa. Refer to Chapter 12:
Pretreatment Processes.
8.4.3.1 Diatomaceous earth
Process description
Diatomaceous earth filtration is a process in which a precoat cake of filter medium is deposited
on a support membrane and additional diatomaceous earth (DE) is continuously added to the
feed water to maintain the permeability of the filter cake. The process can operate under
vacuum or pressure. Normally there is no upstream coagulation process, so to avoid
uneconomically short filter runs, the process is limited to fairly consistently clean raw waters, ie,
low turbidity and colour.
Operation of the process is discussed in Chapter 14: Treatment Process, Filtration.
DWSNZ criteria
See section 5.9 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the 2.5 log credits. In the unlikely event that the process is not
being operated for the purpose of protozoal compliance, these requirements do not need to be
met; theoretically, a water supply could achieve sufficient log credits for disinfecting with ozone
or UV, even if the filtration process fails to comply.
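The credit arithmetic described here (a non-compliant filter earns no credit, but disinfection credits may still meet the requirement) can be sketched as follows. The 2.5 log figure is the DE filtration credit quoted in this section; the UV credit and the 3.0 log requirement are illustrative assumptions, not figures from the DWSNZ:

```python
def total_log_credit(process_credits: dict) -> float:
    """Sum the protozoa log credits earned by each compliant process."""
    return sum(process_credits.values())

required = 3.0  # assumed catchment log credit requirement, for illustration
with_filter = {"diatomaceous_earth": 2.5, "uv": 3.0}    # UV value is a placeholder
filter_failed = {"diatomaceous_earth": 0.0, "uv": 3.0}  # filter not compliant
print(total_log_credit(with_filter) >= required)    # True
print(total_log_credit(filter_failed) >= required)  # True: UV alone suffices
```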
One of the requirements is that the minimum DE precoat thickness needed for reliable protozoa removal under different raw water conditions is to be determined by turbidity testing. This will involve (where applicable) testing a range of raw water conditions such as after
rain, during droughts, warm and cold water, and when algae are near their maximum numbers.
The tests should also include a period of maximum flow conditions at different precoat loading
rates. While this is happening the water leaving the filters should be run to waste or returned to
the raw water. The results of these trials should be documented.
The DWSNZ also include a clause that exempts some water supplies from meeting various
turbidity requirements. This was added for the very small number of water supplies that have
fine colloidal silica, sometimes called glacial flour, in their raw water. This material is a fraction
of the size of (oo)cysts and most will pass through the filters. Therefore turbidity stops being a
reliable measure of the filter’s ability to remove (oo)cysts. It is possible in the future that the
MoH will conduct a survey of these plants, using particle counters.
Particles trapped in the filter are held rather tenuously, so whenever the filtrate turbidity
exceeds the influent turbidity, the very real risk of a discharge of oocysts must be accepted, and
handled appropriately.
Further information
The USEPA (2003a) considered that a study of DE filtration by Ongerth and Hutton (2001)
supported the findings of earlier studies in showing that a well-designed and operated DE plant
can achieve Cryptosporidium removal equivalent to a conventional treatment plant (ie, average
of 3 log). In developing the DWSNZ it was considered that DE filtration was more like the coagulation, direct filtration process than a full-scale conventional treatment plant because neither includes the sedimentation stage, which has been shown to achieve 0.5 log or greater
Cryptosporidium reduction.
8.4.3.2 Slow sand filtration
Process description
Slow sand filtration is a process involving passage of raw water through a bed of sand at low
velocity, generally less than 0.4 m/h (which compares with say 20 m/h in rapid granular media
filtration) resulting in substantial particulate removal by physical and biological mechanisms.
Removal of microbial pathogens in slow sand filters is complex and is believed to occur through
a combination of physical, chemical, and biological mechanisms, both on the surface
(schmutzdecke) and in the interior of the filter bed.
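The practical effect of these velocities on plant size can be illustrated with a rough sketch; the 100 m³/h design flow is an arbitrary example:

```python
def filter_area_m2(flow_m3_per_h: float, velocity_m_per_h: float) -> float:
    """Plan area required so that flow divided by area equals the filtration velocity."""
    return flow_m3_per_h / velocity_m_per_h

flow = 100.0  # arbitrary example flow, m3/h
print(round(filter_area_m2(flow, 0.4), 1))   # 250.0 m2 for slow sand at 0.4 m/h
print(round(filter_area_m2(flow, 20.0), 1))  # 5.0 m2 for rapid media at 20 m/h
```

The 50-fold velocity difference translates directly into a 50-fold difference in plan area, which is why slow sand filtration is generally restricted to smaller supplies or sites where land is cheap.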
Operation of the process is discussed in Chapter 14: Treatment Process, Filtration.
DWSNZ criteria
See section 5.10 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the 2.5 log credits. In the unlikely event that the process is not
being operated for the purpose of protozoal compliance, these requirements do not need to be
met; theoretically, a water supply could achieve sufficient log credits for disinfecting with ozone
or UV, even if the filtration process fails to comply.
Particles trapped on the sand grains are held rather tenuously, so whenever the filtrate turbidity
exceeds the influent turbidity, the very real risk of a discharge of oocysts must be accepted, and
handled appropriately.
Further information
Hall et al (1994) examined the removal of Cryptosporidium with a pilot scale slow sand
filtration plant. Cryptosporidium removals ranged from 2.8 to 4.3 log after filter maturation,
with an average of 3.8 log (at least one week after filter scraping). Raw water turbidity ranged
from 3.0–7.5 NTU for three of four runs and 15.0 NTU for a fourth run. Filtered water turbidity
was 0.2–0.4 NTU, except for the fourth run which was 2.5 NTU.
Fogel et al (1993) evaluated removal efficiencies for Cryptosporidium and Giardia with a
full-scale slow sand filtration plant. The removals ranged from 0.1–0.5 log for Cryptosporidium
and 0.9–1.4 log for Giardia. Raw water turbidity ranged from 1.3–1.6 NTU and decreased to
0.35 NTU after filtration. The authors attributed the low Cryptosporidium and Giardia
removals to the relatively poor grade of filter media and lower water temperature. The sand had
a higher uniformity coefficient than recommended by design standards, creating larger pore spaces within the filter bed that reduce its biological removal capacity. Lower water temperatures
(1°C) also decreased biological activity in the filter media.
The study by Fogel et al is significant because it indicates that a slow sand filtration plant may
achieve less than 2 log removal of Cryptosporidium while still being in compliance with
filtrate turbidity requirements. This is why the compliance criteria in the DWSNZ include water
temperature monitoring with a lower limit of 6°C.
8.4.3.3 Bag filtration
Process description
The USEPA (2003a) defined bag filters as pressure driven separation processes that remove
particulate matter larger than 1 micrometre using an engineered porous filtration medium
through either surface or depth filtration.
Bag filters are typically constructed of a non-rigid, fabric filtration media housed in a pressure
vessel in which the direction of flow is from the inside of the bag to the outside.
Operation of the process is discussed in Chapter 14: Treatment Process, Filtration. The testing
protocol for the verification of equipment performance is described in EPA/NSF ETV (2005).
Refer also to Chapter 8 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA
2009) which discusses issues related to bag and cartridge filtration.
DWSNZ criteria
See section 5.13 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the 1 log credit. In the unlikely event that the process is not being
operated for the purpose of protozoal compliance, these requirements do not need to be met;
theoretically, a water supply could achieve sufficient log credits for disinfecting with ozone or
UV, even if the filtration process fails to comply.
To obtain 1 log credit, bag filters must be validated to achieve 2 log removals of
Cryptosporidium. This factor of safety (adopted from the USEPA 2003a, 2006a) is applied to the removal credit because:
• the removal efficiency of bag filters over the course of a filter run has been observed to vary by more than 1 log
• bag filters are not routinely direct integrity tested during operation, so there is no means of verifying the removal efficiency of filtration units during routine use.
Validated bags can also earn 0.5 log credits if used (though this is highly unlikely) as secondary filters after a coagulation process, and 0.5 log credits when used as secondary filters after diatomaceous earth or slow sand filters (see Table 5.2 DWSNZ).
The DWSNZ also include a clause that exempts some water supplies from meeting various
turbidity requirements. This was added for the very small number of water supplies that have
fine colloidal silica, sometimes called glacial flour, in their raw water. This material is a fraction
of the size of (oo)cysts and most will pass through the filters. Therefore turbidity stops being a
reliable measure of the filter’s ability to remove (oo)cysts. It is possible in the future that the
MoH will conduct a survey of these plants, using particle counters.
Particles trapped in the bag are held rather tenuously, so whenever the filtrate turbidity exceeds
the influent turbidity, the very real risk of a discharge of oocysts must be accepted, and handled
appropriately.
An investigation is required if, in any day, the pressure drop across the bag filter increases by more than 5 percent of the total allowable. Also, if the pressure differential does not increase
over a reasonable time span, consideration must be given to the possibility that the water is
short-circuiting via faulty seals etc. Refer to Chapter 14: Treatment Processes: Filtration,
section 14.5 for further information.
Performance validation/certification
Manufacturers commonly rate fabric filters by pore size or pore distribution. However, there is
no industry standard for measuring or reporting these characteristics. This lack of
standardisation causes problems for establishing design criteria to ensure that a given bag filter
will effectively remove a given percentage of Cryptosporidium. Furthermore, an oocyst has
different structural characteristics than the markers used to determine pore size; thus, the rate
of rejection may differ for an oocyst versus the test markers used to determine pore size or
molecular weight cutoff. To compensate for these factors of uncertainty for Cryptosporidium
removal, the LT2ESWTR requires bag filters to be challenge tested to determine removal credit.
The log removal validation is based on challenge testing. The equipment supplier or
manufacturer must perform challenge tests before the water supplier purchases the plant.
Certificates of performance are to be supplied. The Medical Officer of Health may also require
challenge tests to check that a treatment (or other) problem has been rectified. Challenge testing
must be conducted on a full-scale filter element identical in material and construction to the
filter elements proposed for use in full-scale treatment facilities.
Water suppliers may adopt the equipment or appliance supplier’s certification provided:
a) it meets one of the following:
• the Membrane Filtration Guidance Manual (USEPA 2005), which contains detailed guidance on developing challenge test protocol and conducting the test for membrane processes that relate to these requirements
• the Protocol for Equipment Verification Testing for Physical Removal of Microbiological and Particulate Contaminants (NSF 2005), which has a chapter for testing bag and cartridge filters (Chapter 4)
• a standard formally recognised by the Ministry of Health as being equivalent
b) an appropriately accredited inspection body performs the testing
c) the tests are made on entire units, including filtration media, seals, filter housing and other components integral to the process
d) the installed equipment is identical (or validated as equivalent) to the equipment tested during the certification process.
Further information
A limited amount of published data is available regarding the removal efficiency of bag filters
with respect to Cryptosporidium oocysts or suitable surrogates. The relevant studies identified
by the USEPA (2003a) in the literature are summarised in Table 8.4.
Table 8.4: Results from studies of Cryptosporidium (or surrogate) removal by bag filters

Organism/surrogate    Log removal    Study
Cryptosporidium       3.0            Cornwell and LeChevallier 2002
Cryptosporidium       0.5 to 3.6     Li et al 1997
4.5 micron spheres    0.5 to 2.0     Goodrich et al 1995
These data demonstrate highly variable removal performance, ranging from 0.5 log to 3.6 log.
Li et al (1997) evaluated three bag filters with similar pore size ratings and observed a 3 log
difference in Cryptosporidium oocyst removal among them. These results indicate that bag
filters may be capable of achieving removal of oocysts in excess of 3 log, but performance can
vary significantly among products, and there appears to be no correlation between pore size
rating and removal efficiency.
Based on available data, specific design criteria that correlate with removal efficiency cannot be
derived for bag filters. The removal efficiency of these proprietary devices can be impacted by
product variability, increasing pressure drop over the filtration cycle, flow rate, and other
operating conditions.
The removal efficiency of some bag filtration devices has been shown to decrease over the course
of a filtration cycle due to the accumulation of solids and resulting increase in pressure drop. As
an example, Li et al (1997) observed that the removal of 4.5 micrometre microspheres by a bag
filter decreased from 3.4 log to 1.3 log over the course of a filtration cycle.
The data in Table 8.4 were generated from studies performed under a variety of operating
conditions, many of which could not be considered conservative (or worst-case) operation.
These considerations led to the challenge testing requirements which are intended to establish a
product specific removal efficiency rather than site-specific.
Only a few bag filtration studies have attempted to correlate turbidity removal with removal of
Cryptosporidium oocysts or surrogates. Li et al (1997) found that the removal efficiency for
turbidity was consistently lower than the removal efficiency for oocysts or microspheres for the
three bag filters evaluated. None of the filters was capable of consistently producing a filtered
water turbidity below 0.3 NTU for the waters evaluated.
8.4.3.4 Cartridge filtration
Process description
The USEPA (2003a) defined cartridge filters as pressure driven separation processes that
remove particulate matter larger than 1 micrometer using an engineered porous filtration
medium through either surface or depth filtration.
Cartridge filters are typically constructed as rigid or semi-rigid, self-supporting filter elements
housed in pressure vessels in which flow is from the outside of the cartridge to the inside.
Although all filters classified as cartridge filters share similarities with respect to their
construction, there are significant differences among the various commercial devices. An
important distinction is the ability to directly test the integrity of the filtration system in order to
verify that there are no leaks that could result in contamination of the filtrate. Any membrane
cartridge filtration device that can be direct integrity tested according to the criteria specified in
the membrane filtration section of DWSNZ (section 5.11) is eligible for protozoal removal credit
as a membrane, subject to the criteria specified in that section.
Operation of the process is discussed in Chapter 14: Treatment Process, Filtration. The testing
protocol for the verification of equipment performance is described in EPA/NSF ETV (2005).
Refer also to Chapter 8 of the LT2ESWTR Toolbox Guidance Manual review draft (USEPA
2009) which discusses issues related to bag and cartridge filtration.
DWSNZ criteria
See section 5.12 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the 2 log credits. In the unlikely event that the process is not
being operated for the purpose of protozoal compliance, these requirements do not need to be
met; theoretically, most water supplies could achieve sufficient log credits for disinfecting with
ozone or UV, even if the filtration process fails to comply.
To obtain 2 log credits, cartridges must be validated to achieve 3 log removals (cyst reduction) of
Cryptosporidium. This 1 log factor of safety is applied to the removal credit for cartridge filters
because:
• the removal efficiency of some cartridge filters has been observed to vary by more than 1 log over the course of operation
• cartridge filters are not routinely subjected to direct integrity testing during operation, so there is no means of verifying the removal efficiency of filtration units during routine use.
Qualifying for 2 log credits means that cartridge filtration may be a particularly suitable process
for use on a non-secure bore water supply. This can then be followed by chlorination in order to
achieve bacterial compliance, and to protect the distribution system.
Validated cartridge filters can also earn 0.5 log credits if used (though this is highly unlikely) as secondary filters after a coagulation process, and 0.5 log credits when used as secondary filters after diatomaceous earth or slow sand filters (see Table 5.2 DWSNZ).
The DWSNZ also include a clause that exempts some water supplies from meeting various
turbidity requirements. This was added for the very small number of water supplies that have
fine colloidal silica, sometimes called glacial flour, in their raw water. This material is a fraction
of the size of (oo)cysts and most will pass through the filters. Therefore turbidity stops being a
reliable measure of the filter’s ability to remove (oo)cysts. It is possible in the future that the
MoH will conduct a survey of these plants, using particle counters.
Particles trapped in the cartridge are held rather tenuously, so whenever the filtrate turbidity
exceeds the influent turbidity, the very real risk of a discharge of oocysts having occurred must
be accepted, and handled appropriately.
An investigation is required if, in any day, the pressure drop across the cartridge filter increases by more than 5 percent of the total allowable. Also, if the pressure differential does not increase over a reasonable time, consideration must be given to the possibility that the water is short-circuiting via faulty seals etc. Refer to Chapter 14: Treatment Processes: Filtration, section 14.5 for further information.
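The 5 percent trigger can be expressed as a simple check; the 150 kPa total is the default maximum headloss quoted in this section, and the daily increases are invented examples:

```python
def needs_investigation(daily_increase_kpa: float, total_allowable_kpa: float) -> bool:
    """Flag a daily differential-pressure rise above 5 percent of the allowable total."""
    return daily_increase_kpa > 0.05 * total_allowable_kpa

# 150 kPa is the default maximum headloss for cartridge filters
print(needs_investigation(10.0, 150.0))  # True: 10 kPa exceeds the 7.5 kPa trigger
print(needs_investigation(5.0, 150.0))   # False
```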
See section 8.4.2.3 for a discussion relating to the use of cartridge filtration when used as
secondary filters.
Performance validation/certification
Manufacturers commonly rate cartridge filters by pore size or pore distribution, usually ‘defined’ as
being absolute or nominal. However, there is no industry standard for measuring or reporting
these characteristics. This lack of standardisation causes problems for establishing design
criteria to ensure that a given cartridge filter will effectively remove a given percentage of
Cryptosporidium. Furthermore, an oocyst has different structural characteristics than the
markers used to determine pore size; thus, the rate of rejection may differ for an oocyst versus
the test markers used to determine pore size or molecular weight cut-off. To compensate for
these factors of uncertainty for Cryptosporidium removal, the LT2ESWTR requires cartridge
filters to be challenge tested to determine removal credit, see section 8.5.
The log removal validation is based on challenge testing. The equipment supplier or
manufacturer must perform challenge tests before the water supplier purchases the plant.
Certificates of performance are to be supplied. The Ministry of Health may also require
challenge tests to check that a treatment (or other) problem has been rectified. Challenge testing
must be conducted on a full-scale filter element identical in material and construction to the
filter elements proposed for use in full-scale treatment facilities – except see d) and e) below.
Water suppliers may adopt the equipment or appliance supplier’s certification provided:
a) it meets one of the following:
• the Membrane Filtration Guidance Manual (USEPA 2005), which contains detailed guidance on developing challenge test protocol and conducting the test for membrane processes that relate to these requirements
• the (oo)cyst reduction conditions of Drinking Water Treatment Units: Health effects, NSF/ANSI 53 (NSF and ANSI 2002a, and subsequent revisions)
• a standard formally recognised by the Ministry of Health as being equivalent, eg, AS/NZS 4348:1995 in conjunction with AS/NZS 3497:1998 (updated 2001)
b) an appropriately accredited inspection body has performed the testing
c) the installed equipment is identical (or validated as equivalent) to the equipment tested during the certification process
d) the tests are made on entire units, including filtration media, seals, filter housing and other components integral to the process; this is usually impracticable for larger units, so see e)
e) a certificated cartridge filter can easily fail due to its assembly, ie, ‘its seals and other components integral to the process’. Using a cartridge that satisfies the challenge test requirements is acceptable if:
• the cartridge is single-open-ended (SOE), plug-in style, sealed in the housing with o-rings
• scaling up to multiple cartridges, the field cartridge is the same diameter and construction as the test cartridge and the cartridge is of uniform construction over its entire length with no joins or joiners; heat-bonded joins are suitable
• an automatic air release valve is installed on the top of the filter housing to release any trapped air
• a default maximum headloss of 150 kPa is set unless the manufacturer can demonstrate that performance is maintained beyond that. Cartridges must be replaced before the terminal pressure drop is reached
• new/replacement cartridges and plants that operate an on/off regime are run to waste for the first five minutes they come online
• all components are made from materials approved for use in water supply, eg, ANSI/NSF Standard 61 or equivalent.
As a result of the above clarification, use of the following template is required.
Template for assessing cartridge filter compliance
5.12.1 Log credit assessment of – ……………………………. cartridge filter
To obtain 2.0 protozoa log credits for cartridge filtration, the following requirements must be
met during periods when the filtered water is being produced.
DWSNZ requirement
Status
Requirement 1: Each cartridge has a certified Cryptosporidium or cyst removal
efficiency of at least 3 log. Water suppliers may adopt the supplier’s certification
provided:
a)
it meets one of the following:
i)
ii)
the Membrane Filtration Guidance Manual (USEPA 2005), which contains
detailed guidance on developing challenge test protocol and conducting
the test for membrane processes that relate to these requirements
the (oo)cyst reduction conditions of Drinking Water Treatment Units:
Health effects, NSF/ANSI 53 (NSF and ANSI 2002a, and subsequent
revisions)
or
a standard formally recognised by the Ministry of Health as being equivalent, eg,
AS/NZS 4348:1995 in conjunction with AS/NZS 3497:1998 (updated 2001).
b)  an appropriately accredited inspection body has performed the testing
c)  the installed equipment is identical (or validated as equivalent) to the equipment tested during the certification process
d)  the tests are made on entire units, including filtration media, seals, filter housing and other components integral to the process. Because this is usually impracticable for larger units, see e)
262
Guidelines for Drinking-water Quality Management for New Zealand 2013
Status: No. Used the alternative e), as follows.
e)  a certificated cartridge filter can fail due to its operation or its assembly, ie, “its seals and other components integral to the process”. Using a cartridge that satisfies the challenge test requirements is acceptable if:
• the cartridge is single-open-ended (SOE), plug-in style, sealed in the
housing with o-rings
• scaling up to multiple cartridges, the field cartridge is the same diameter and
construction as the test cartridge and the cartridge is of uniform construction
over its entire length with no joins or joiners; heat-bonded joins are suitable
• an automatic air release valve is installed on the top of the filter housing to
release any trapped air
• a default maximum headloss of 150 kPa is set unless the manufacturer can
demonstrate that performance is maintained beyond that. Cartridges must
be replaced before the terminal pressure drop is reached
• new/replacement cartridges and plants that operate an on/off regime are run
to waste for the first 5 minutes they come online
• all components are made from materials approved for use in water supply,
eg, ANSI/NSF Standard 61 or equivalent.
Requirements 2, 3 and 4 relate to filtrate monitoring.
Requirement 5 is covered in 1c). Status: NA
Requirement 6: A slow opening/closing valve is fitted ahead of the cartridge filter plant,
and the filtrate passes either through a pressure surge valve or directly to a tank before
any subsequent process or pumping.
Requirement 7: The flow through each housing is measured. A restrictor that maintains
the flow below the certified maximum operating rate is fitted to each housing.
Requirement 8: Differential pressure measurements across the housing are recorded to
confirm that the minimum differential pressure always exceeds the differential pressure
corresponding to a clean filter established during commissioning, and are kept within
the manufacturer’s recommendations.
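The differential-pressure checks in Requirements 7 and 8, together with the 150 kPa default terminal headloss from 1e), can be sketched as a simple record check. This is an illustrative sketch only, not part of the DWSNZ; the function name and the clean-filter baseline value are hypothetical.

```python
# Illustrative only: checking cartridge filter differential-pressure (dP)
# records against the limits described above. The clean-filter baseline is
# established during commissioning (the value below is an example); the
# 150 kPa terminal headloss is the default from requirement 1e).

CLEAN_FILTER_DP_KPA = 12.0     # example commissioning baseline
TERMINAL_HEADLOSS_KPA = 150.0  # default maximum unless manufacturer shows more

def assess_housing_dp(dp_readings_kpa):
    """Return a list of alert strings for a series of dP readings (kPa)."""
    alerts = []
    for i, dp in enumerate(dp_readings_kpa):
        if dp < CLEAN_FILTER_DP_KPA:
            # dP below the clean-filter value suggests a leak, bypass or
            # failed seal rather than a clean cartridge
            alerts.append(f"reading {i}: dP {dp} kPa below clean-filter baseline")
        elif dp >= TERMINAL_HEADLOSS_KPA:
            alerts.append(f"reading {i}: dP {dp} kPa at/above terminal headloss - replace cartridges")
    return alerts

print(assess_housing_dp([15.0, 60.0, 8.0, 151.0]))
```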
Further information
A limited amount of published data is available regarding the removal efficiency of cartridge
filters with respect to Cryptosporidium oocysts or suitable surrogates. The relevant studies
identified by the USEPA (2003a) in the literature are summarised in Table 8.5.
Table 8.5: Results from studies of Cryptosporidium (or surrogate) removal by cartridge filters

Organism/surrogate     Log removal    Study
Cryptosporidium        3.5 average    Enriques et al 1999
Cryptosporidium        3.3 average    Roessler 1998
Cryptosporidium        1.1 to 3.3     Schaub et al 1993
5.7 micron spheres     0.5 to 3.6     Long 1983
These data demonstrated highly variable removal performance, ranging from 0.5 log to 3.6 log.
Results of these studies also show no correlation between the pore size rating established by the
manufacturer and the removal efficiency of a filtration device. In a study evaluating two
cartridge filters, both with a pore size rating of 3 micrometres, a 2 log difference in
Cryptosporidium oocyst removal was observed between the two filters (Schaub et al 1993).
Another study evaluated seventeen cartridge filters with a range of pore size ratings from 1 to
10 micrometres and found no correlation with removal efficiency (Long 1983). It has been noted
that although Cryptosporidium is 4 to 6 microns in size, it can still pass through an absolute
3-micron size filter by deforming and squeezing through (USEPA 2003b).
Based on available data, specific design criteria that correlate to removal efficiency cannot be
derived for cartridge filters. The removal efficiency of these proprietary devices can be impacted
by product variability, increasing pressure drop over the filtration cycle, flow rate, and other
operating conditions. The data in Table 8.5 were generated from studies performed under a variety of operating conditions, many of which could not be considered conservative (or worst-case) operation. These considerations led to the challenge testing requirements, which are intended to establish a product-specific, rather than site-specific, removal efficiency.
8.4.3.5 Membrane filtration
Process description
In their proposed LT2ESWTR, the USEPA (2003a) defined membrane filtration as a pressure or
vacuum driven separation process in which particulate matter larger than 1 μm (micrometre) is
rejected by a nonfibrous, engineered barrier, primarily through a size exclusion mechanism, and
which has a measurable removal efficiency of a target organism that can be verified through the
application of a direct integrity test.
This definition is intended to include the common membrane classifications: microfiltration
(MF), ultrafiltration (UF), nanofiltration (NF), and reverse osmosis (RO). MF and UF are
relatively low pressure membrane filtration processes that are primarily used to remove
particulate matter and microbial contaminants. NF and RO are membrane separation processes
that are primarily used to remove dissolved contaminants through a variety of mechanisms, but
which also remove particulate matter via a size exclusion mechanism. MF and UF are the more
common larger processes. The others tend to be used for individual supplies (eg, point-of-use)
or for special purposes.
MF membranes are generally considered to have a pore size range of 0.1–0.2 micrometres (nominally 0.1 micrometres), although there are exceptions. For UF, pore sizes generally range from 0.01–0.05 micrometres (nominally 0.01 micrometres) or less, decreasing to an
extent at which the concept of a discernible ‘pore’ becomes inappropriate, a point at which some
discrete macromolecules can be retained by the membrane material. In terms of a pore size, the
lower cutoff for a UF membrane is approximately 0.005 μm. Because some UF membranes have
the ability to retain larger organic macromolecules, they have been characterised historically by
a molecular weight cutoff (MWCO) rather than by a particular pore size. Typical MWCO levels
for UF membranes range from 10,000 to 500,000 Daltons, with most membranes used for
water treatment at approximately 100,000 MWCO.
The critical distinction between membrane filtration processes and bag and cartridge filters is
that the integrity of membrane filtration processes can be tested directly. Based on this
distinction, membrane material configured into a cartridge filtration device that meets the
definition of membrane filtration and that can be direct integrity tested according to the criteria
specified in this section is eligible for the same removal credit as a membrane filtration process.
Membrane devices can be designed in a variety of configurations including hollow-fibre
modules, hollow-fibre cassettes, spiral-wound elements, cartridge filter elements, plate and
frame modules, and tubular modules among others.
The generic term module is used to refer to all of these various configurations and is defined as
the smallest component of a membrane unit in which a specific membrane surface area is
housed in a device with a filtrate outlet structure. A membrane unit is defined as a group of
membrane modules that share common valving that allows the unit to be isolated from the rest
of the system for the purpose of integrity testing or other maintenance.
Operation of the process is discussed in Chapter 14: Treatment Process, Filtration. Refer also to
Chapter 14 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) which
discusses issues related to membrane filtration.
DWSNZ criteria
It is possible to earn 3 or more log credits by using membrane filtration as the sole treatment
process. Membrane filtration can also earn its full number of log credits when used in place of
rapid gravity sand filters in a chemical coagulation plant. See section 8.4.2.3 for a discussion
relating to the use of membrane filters used as secondary filters.
Because membrane filters have a pore size range of 0.1–0.2 microns or smaller, other filtration
systems used in a secondary role are not likely to remove the particles that pass through the
membrane filter, so they cannot earn secondary filtration log credits.
See section 5.11 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for log credits. In the unlikely event that the process is not being
operated for the purpose of protozoal compliance, these requirements do not need to be met;
theoretically, a water supply could achieve 3 log credits for disinfecting with ozone or UV, even if
the filtration process fails to comply.
In systems that operate on/off, the filtrate is recycled or wasted until the approved upper control
limits of the indirect integrity monitoring (eg, turbidity or particle counting) are no longer
exceeded. If air routinely affects the online measurement of turbidity and/or particle counting
on restart, and it has been demonstrated that the turbidity and/or particle count on restart is
falsely indicating inadequate performance of the membranes, then on return to service the
turbidity must be less than 0.10 NTU or the particle count below the upper control limit within
15 minutes. If this is not achieved the filtrate must be recycled or wasted until this level of
performance has been achieved.
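The restart logic described above can be expressed as a simple check; the sketch below is illustrative only (the function name and turbidity series are hypothetical), with the 0.10 NTU limit and the 15-minute window taken from the text.

```python
# Illustrative sketch of the restart rule for membrane plants that operate
# on/off: filtrate is recycled/wasted until turbidity falls below 0.10 NTU,
# and this must happen within 15 minutes of return to service.

RESTART_LIMIT_NTU = 0.10
RESTART_WINDOW_MIN = 15

def minutes_to_service(turbidity_by_minute):
    """Return the minute at which filtrate may go to service, or None if
    the 0.10 NTU target is not met within the 15-minute window (filtrate
    must then continue to be recycled or wasted)."""
    for minute, ntu in enumerate(turbidity_by_minute[:RESTART_WINDOW_MIN], start=1):
        if ntu < RESTART_LIMIT_NTU:
            return minute
    return None

# Air-affected restart: turbidity settles below 0.10 NTU at minute 4
print(minutes_to_service([0.45, 0.22, 0.14, 0.08, 0.06]))  # -> 4
```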
Performance validation/certification
The Membrane Filtration Guidance Manual (proposal: USEPA 2003c, and final rule: USEPA
2005a) sets out a procedure for challenge testing, see section 8.5 of these Guidelines. The
requirement for challenge testing is intended to be product-specific such that site-specific
demonstration of Cryptosporidium removal efficiency is not necessary. Once the log removal of
a membrane has been established through a challenge test that meets the requirements of
LT2ESWTR, additional challenge testing is not required unless significant modifications are
made to the membrane process.
The maximum number of protozoal log credits that a membrane filtration process is eligible to
receive depends upon the manufacturer’s certification of the log removal that the filter plant can
deliver, see section 5.11.1 of the DWSNZ. So far, all MF plants in New Zealand have been
assigned 4 log credits. Although some models have validation for up to 7 log removals, these
results have usually been achieved under short term trial conditions, rather than continuously
running with variable raw water quality and operating conditions.
The testing protocol for the verification of equipment performance is described in a 246-page
publication of EPA/NSF ETV (2002), and in the Membrane Filtration Guidance Manual
(USEPA 2005a).
Procedure for older plants
Data from challenge studies conducted prior to promulgation of the DWSNZ 2005 can be
considered in lieu of additional testing. However, the prior testing must have been conducted in
a manner that demonstrates removal efficiency for Cryptosporidium greater than the treatment
credit awarded to the process.
The Membrane Filtration Guidance Manual (USEPA 2005a) states in section 3.15 that as a
general guide, the following challenge test conditions have been identified as potentially yielding
results that do not satisfy the intent of the rule:
• challenge testing conducted on obsolete products. Refer to section 3.14 for guidance on the re-testing of modified membrane modules
• challenge testing conducted on small-scale modules. Small-scale module testing is permitted under the LT2ESWTR if certain criteria are met. Refer to section 3.8 for guidance regarding the testing of small-scale modules
• challenge testing using unacceptable surrogates for Cryptosporidium. The challenge particulate used in a grandfathered test must provide equivalent or sufficiently conservative removal efficiency relative to Cryptosporidium oocysts. Refer to section 3.9 regarding the selection of surrogates for use in challenge testing
• challenge particulate enumeration using unacceptable methodology. The challenge particulate must have been quantified using an acceptable method. Specifically, gross measurements are generally considered unacceptable. Refer to section 3.9 regarding methods for enumerating various challenge particulates
• unavailable quality control release value (QCRV). If non-destructive performance testing was not used to establish a suitable QCRV in a previous study, it may be difficult or impossible to relate the demonstrated removal efficiency to the non-destructive performance test results for untested modules that are produced.
Section 3.15 of the Membrane Filtration Guidance Manual adds that there may also be cases in
which deviations from challenge testing requirements under the LT2ESWTR may not be
significant, such that additional testing would not be required.
Further information
A number of studies have been conducted which have demonstrated the ability of membrane
filtration processes to remove pathogens, including Cryptosporidium, to below detection levels.
A literature review summarising the results of several comprehensive studies was conducted and
reported by the USEPA (2001a) and is presented in Low Pressure Membrane Filtration for
Pathogen Removal: Application, Implementation, and Regulatory Issues.
Many of these studies used Cryptosporidium seeding to demonstrate removal efficiencies as
high as 7 log. The collective results from these studies demonstrate that an integral membrane
module, ie, a membrane module without any leaks or defects, with an exclusion characteristic
smaller than Cryptosporidium, is capable of removing this pathogen to below detection in the
filtrate, independent of the feed concentration.
Although it is not uncommon for a membrane plant (MF or UF) to demonstrate 6 or more protozoal log credits in the challenge test, most are assigned 4 log credits by the US regulatory bodies, so assigning anything other than 4 log credits would require unusual circumstances. The most likely reason for assigning fewer than 4 would be the use of a direct integrity test (DIT) of lower resolution and sensitivity than those commonly used today. More than 4 log credits could conceivably be assigned to NF or RO plants, but so far their use in New Zealand has been limited to very small supplies. In the very improbable event of a New Zealand source water being of such poor quality that it is categorised as needing more than 4 protozoal log credits, the multiple barrier principle would suggest that a membrane plant on its own would offer insufficient confidence that the final product is safe to drink.
8.4.4 Disinfection processes
8.4.4.1 Chlorine dioxide
Process description
The disinfectant chlorine dioxide (ClO2) is made on site and can be dosed into the water supply
to inactivate micro-organisms, including bacteria, viruses, and protozoa such as
Cryptosporidium.
Operation of the chlorine dioxide disinfection process is discussed in section 15.5.3 of
Chapter 15: Treatment Process, Disinfection; C.t values are discussed in section 15.2.1; contact
tanks and hydraulic residence time (t) are discussed in section 15.2.9. Refer also to Chapter 10 of
the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) which discusses issues
related to disinfection using chlorine dioxide.
DWSNZ criteria
See section 5.14 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for log credits. In the unlikely event that the process is not being
operated for the purpose of protozoal compliance, these requirements do not need to be met;
theoretically, a water supply could achieve 3 log credits for disinfecting with ozone or UV, or
could earn 3 log credits using chemical coagulation, sedimentation, filtration, or membrane
filtration, even if the chlorine dioxide process is non-complying.
The chlorite ion (ClO2-) is the predominant by-product when chlorine dioxide is used as a
disinfectant. About 50–70 percent of the chlorine dioxide dosed into the water may be
converted to chlorite. On this basis, the maximum dose that can be used without chlorite
exceeding its 0.8 mg/L MAV is about 1.2–1.6 mg/L. As a result, impracticably long contact times
may be needed to achieve protozoal compliance. Pilot trials to determine the chlorite levels that
will form should be undertaken.
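The arithmetic behind the dose limit above is simply the chlorite MAV divided by the conversion fraction; the sketch below (function and constant names are ours, not from the DWSNZ) reproduces it.

```python
# Worked arithmetic for the paragraph above (an illustrative sketch, not a
# design tool): if 50-70 percent of the ClO2 dose is converted to chlorite,
# the maximum dose keeping chlorite at or below its 0.8 mg/L MAV is
# MAV / fraction, giving roughly the 1.2-1.6 mg/L range stated above.

CHLORITE_MAV_MG_L = 0.8

def max_clo2_dose(chlorite_fraction):
    """Maximum chlorine dioxide dose (mg/L) for a given chlorite yield."""
    return CHLORITE_MAV_MG_L / chlorite_fraction

print(round(max_clo2_dose(0.7), 2))  # -> 1.14 (worst-case 70 percent conversion)
print(round(max_clo2_dose(0.5), 2))  # -> 1.6  (50 percent conversion)
```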
Further information
The C.t table in the DWSNZ for protozoal compliance using chlorine dioxide was taken from
Table IV.D–4 in the LT2ESWTR (USEPA 2006a). It was based on Clark et al (2003) who
employed data from Li et al (2001) to develop equations for predicting inactivation, and used
data from Owens et al (1999) and Ruffell et al (2000) to validate the equations. The following
equation can be used to determine the log credit between the indicated values in Table 5.5 in the
DWSNZ:
log credit = 0.001506 × 1.09116^temp × C.t
C.t values are described in Chapter 15: Disinfection, sections 15.2.1 and 15.2.9.
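The equation above can be applied directly between the tabulated values, subject to the limits noted later in this section (credit limited to 3 log; water temperatures above 25°C set to 25°C). The sketch below is illustrative only: the function name is ours, and C.t is assumed to be in mg·min/L as in the DWSNZ C.t tables.

```python
# Illustrative sketch of the chlorine dioxide interpolation equation,
# with the caps described in the surrounding text applied.

def clo2_log_credit(ct, temp_c):
    """Protozoal log credit for chlorine dioxide at a given C.t (mg.min/L)."""
    temp_c = min(temp_c, 25.0)           # temperatures above 25 degC set to 25
    credit = 0.001506 * (1.09116 ** temp_c) * ct
    return min(credit, 3.0)              # table limited to <= 3 log

print(round(clo2_log_credit(277, 10), 2))  # -> 1.0
```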
Another step in developing the C.t values for Cryptosporidium inactivation involved
consideration of the appropriate confidence bound to apply when analysing the inactivation
data. A confidence bound represents a safety margin that accounts for variability and
uncertainty in the data that underlie the analysis. Confidence bounds are intended to provide a
high likelihood that water supplies operating at a given C.t value will achieve at least the
corresponding log inactivation level in the C.t table. Two types of confidence bounds that are
used when assessing relationships between variables, such as disinfectant dose and log
inactivation, are confidence in the regression and confidence in the prediction. USEPA (2003a,
2006a) discusses these in the LT2ESWTR. The use of confidence bounds probably explains why
the C.t values have increased, particularly at higher temperatures, since DWSNZ (2000).
Since the available data are not sufficient to support the C.t calculation for an inactivation level
greater than 3 log, the use of the C.t table in the DWSNZ is limited to inactivation less than or
equal to 3 log. In addition, the temperature limitation is 1–25°C. If the water temperature is
higher than 25°C, the temperature should be set to 25°C for the log inactivation calculation.
8.4.4.2 Ozone
Process description
The disinfectant ozone (O3) is made on site and can be dosed into the water supply to inactivate
micro-organisms, including bacteria, viruses, and protozoa such as Cryptosporidium.
Operation of the process is discussed in Chapter 15: Disinfection, section 15.5.4. Refer also to
Chapter 11 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) which
discusses issues related to disinfection using ozone.
DWSNZ criteria
See section 5.15 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for log credits. Disinfecting with ozone can cause bromate to exceed
its MAV of 0.01 mg/L, so pilot trials are needed to determine acceptable dosage conditions.
Alternatively, if the raw water bromide content is less than 0.006 mg/L, the bromate
concentration should not exceed the MAV. In the unlikely event that the process is not being
operated for the purpose of protozoal compliance, these requirements do not need to be met;
theoretically, a water supply could achieve 3 log credits for disinfecting with chlorine dioxide or
UV, or could earn 3 log credits using chemical coagulation, sedimentation, filtration, or
membrane filtration, even if the ozone process is non-complying.
Section 5.15.2(6) of the DWSNZ states that flow measurements must be made continuously for
supplies serving more than 500 people. Flow is an important component in calculating C.t. If a
plant is said to be constant flow, the water supplier needs to be able to demonstrate that the flow
is maintained within 10 percent of that flow for 95 percent of the time.
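The "constant flow" demonstration described above can be sketched as a simple check over logged flow readings; this is illustrative only, and the names and example data are hypothetical.

```python
# Illustrative sketch: flow must stay within +/-10 percent of the nominal
# flow for at least 95 percent of the time for a plant to be treated as
# constant flow.

def is_constant_flow(flow_records, nominal_flow):
    """True if >= 95% of records lie within +/-10% of nominal_flow."""
    lo, hi = 0.9 * nominal_flow, 1.1 * nominal_flow
    within = sum(1 for q in flow_records if lo <= q <= hi)
    return within / len(flow_records) >= 0.95

flows = [98, 101, 100, 99, 112, 100, 103, 97, 100, 101,
         100, 99, 102, 100, 98, 101, 100, 99, 100, 100]
print(is_constant_flow(flows, 100))  # -> True (19 of 20 readings within 90-110)
```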
Performance validation/certification
The validation process is reasonably complex, so ozone appliances would be expected to be validated by the manufacturers. The appliance needs to comprise more than one reaction chamber (see Table 8.6).

The residual ozone is measured at a prescribed point in the ozone contactor to validate by challenge testing that it is able to achieve the required inactivation of test organisms. Chapter 15: Treatment Process, Disinfection, section 15.5.4.3 discusses the sampling techniques, test methods and calibration procedure.
Chapter 11 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) discusses
issues related to the measurement of contact time in different types of ozone generators and
contactors.
Section 8.6.2.5 discusses C.t values and how to determine t in an ozone contactor.
Determining ozone C.t
The protozoa log credits for various C.t values at different temperatures are given in Table 5.6 in
the DWSNZ, taken from Table IV.D–3 in the LT2ESWTR (USEPA 2006a). See section 15.2.1 of
the Guidelines for a fuller description of C.t. For ozone, the value used for C depends on the
design of the reaction vessel/contactor.
Contactor type          Value used for C
Turbine                 Cout
Co-current flow         Cout or (Cin + Cout)/2
Counter-current flow    Cout/2
Reactive flow           Cout
C.t can be calculated for an entire ozone contactor or for individual segments. The C.t for the
individual segments can be summed to give a total C.t for all of the segments. C is measured at
the beginning and end of an individual segment or at the end of the segment.
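Summing C.t over contactor segments, using the residual conventions tabulated above, can be sketched as follows. This is illustrative only; the segment data and all names are hypothetical.

```python
# Illustrative sketch: credit each segment with a C value according to the
# contactor-type conventions above, then sum C x t over the segments.

def segment_c(kind, c_in, c_out):
    """Ozone concentration C (mg/L) credited to a segment by contactor type."""
    if kind in ("turbine", "reactive"):
        return c_out
    if kind == "co-current":
        return (c_in + c_out) / 2   # Cout alone is the more conservative option
    if kind == "counter-current":
        return c_out / 2
    raise ValueError(kind)

segments = [                         # (type, C_in, C_out, t in minutes)
    ("counter-current", 0.0, 0.30, 2.0),
    ("reactive",        0.30, 0.20, 3.0),
    ("reactive",        0.20, 0.12, 3.0),
]
total_ct = sum(segment_c(k, ci, co) * t for k, ci, co, t in segments)
print(round(total_ct, 2))  # -> 1.26 (mg.min/L)
```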
Chapter 11.3 of the review draft Ozone Toolbox Guidance Manual (USEPA 2009) describes how
protozoa log credits are calculated for various ozone contactors. These are summarised in
Table 8.6 (which is Table 11.2 in USEPA 2009).
Table 8.6: Methods and terminology for calculating the log inactivation credit when using
ozone
a)  No tracer data

Chambers where ozone is added:
• First chamber (first dissolution chamber): no log-inactivation credit is recommended. Restrictions: none.
• Other chambers (co-current or counter-current dissolution chambers): CSTR* method in each chamber with a measured effluent ozone residual concentration. Restriction: no credit is given to a dissolution chamber unless a detectable ozone residual has been measured upstream of this chamber.

Reactive chambers:
• ≥ 3 consecutive chambers (extended-CSTR zone): extended-CSTR method in each chamber. Restriction: a detectable ozone residual should be present in at least three chambers in this zone, measured via in-situ sample ports; otherwise, the CSTR method should be applied individually to each chamber having a measured ozone residual.
• < 3 consecutive chambers (CSTR reactive chambers): CSTR method in each chamber. Restrictions: none.
Note: CSTR is continuously stirred tank reactor.
b)  With tracer data

Chambers where ozone is added:
• First chamber (first dissolution chamber): no log-inactivation credit is recommended. Restrictions: none.
• Other chambers (co-current or counter-current dissolution chambers): T10 or CSTR method in each chamber with a measured effluent ozone residual concentration. Restriction: no credit is given to a dissolution chamber unless a detectable ozone residual has been measured upstream of this chamber.

Reactive chambers:
• ≥ 3 consecutive chambers (extended-CSTR zone): extended-CSTR method in each chamber. Restriction: a detectable ozone residual should be present in at least three chambers in this zone, measured via in-situ sample ports; otherwise, the T10 or CSTR method should be applied individually to each chamber having a measured ozone residual.
• < 3 consecutive chambers (CSTR reactive chambers): T10 or CSTR method in each chamber. Restrictions: none.
Further information
The C.t table for ozone in the DWSNZ was taken from the LT2ESWTR (USEPA 2006a). It was
based on Clark et al (2002) who used data from studies of ozone inactivation of
Cryptosporidium in laboratory water to develop predictive equations for estimating inactivation
(Rennecker et al 1999, Li et al 2001), and data from studies in natural water to validate the
equations (Owens et al 2000, Oppenheimer et al 2000). The following equation can be used to
determine the log credit between the indicated values in Table 5.6 in the DWSNZ:
log credit = 0.0397 × 1.09757^temp × C.t
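As with chlorine dioxide, the equation can be applied between the tabulated values, subject to the limits noted later in this section (credit limited to 3 log; water temperatures above 25°C set to 25°C). The sketch below is illustrative only; the function name is ours and C.t is assumed to be in mg·min/L as in the DWSNZ C.t tables.

```python
# Illustrative sketch of the ozone interpolation equation, with the caps
# described in the surrounding text applied.

def ozone_log_credit(ct, temp_c):
    """Protozoal log credit for ozone at a given C.t (mg.min/L)."""
    temp_c = min(temp_c, 25.0)          # temperatures above 25 degC set to 25
    credit = 0.0397 * (1.09757 ** temp_c) * ct
    return min(credit, 3.0)             # table limited to <= 3 log

print(round(ozone_log_credit(9.9, 10), 2))  # -> 1.0
```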
Another step in developing the C.t values for Cryptosporidium inactivation involved
consideration of the appropriate confidence bound to apply when analysing the inactivation
data. A confidence bound represents a safety margin that accounts for variability and
uncertainty in the data that underlie the analysis. Confidence bounds are intended to provide a
high likelihood that water supplies operating at a given C.t value will achieve at least the
corresponding log inactivation level in the C.t table. Two types of confidence bounds that are
used when assessing relationships between variables, such as disinfectant dose and log
inactivation, are confidence in the regression and confidence in the prediction. USEPA (2003a,
2006a) discusses these in the LT2ESWTR. The use of confidence bounds probably explains why
the C.t values have increased at higher temperatures and decreased at lower temperatures since
DWSNZ (2000).
Since the available data are not sufficient to support the C.t calculation for an inactivation level
greater than 3 log, the use of the C.t table in DWSNZ is limited to inactivation less than or equal
to 3 log. In addition, the temperature limitation is 1–25°C. If the water temperature is higher
than 25°C, the temperature should be set to 25°C for the log inactivation calculation.
It has been reported that turbidities up to 5 NTU did not affect disinfection (Walsh et al 1980). However, operating at this level of turbidity is likely to be uneconomic. Blakemore (personal communication) noted that at Timaru the ozone demand increased dramatically with increasing turbidity, and a further problem arises at 5 NTU when chlorine is added to maintain FAC in the distribution system. The DWSNZ therefore set a limit of 1 NTU.
8.4.4.3 Ultraviolet light
Process description
UV disinfection is a physical process relying on the transference of electromagnetic energy from
a source (lamp) to an organism’s cellular material.
Operation of the process is discussed in Chapter 15: Treatment Process, Disinfection, section 15.5.5. The Ultraviolet Disinfection Guidance Manual (USEPA 2006c), which runs to over 500 pages, covers every aspect of the use of the UV disinfection process for water supply. Refer also to Chapter 13 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009), which discusses issues related to using UV disinfection.
DWSNZ criteria
See section 5.16 of the DWSNZ for the protozoal compliance requirements that need to be
satisfied in order to qualify for the claimed log credits. In the unlikely event that the process is
not being operated for the purpose of protozoal compliance, these requirements do not need to
be met.
UV disinfection can also be used to achieve bacterial compliance. Section 8.5 describes how UV
appliances can be validated using either MS2 or T1 organisms. If the appliance is installed to
inactivate bacteria as well as protozoa, the validation must have tested MS2 organisms.
Performance validation/certification
UV disinfection systems do not produce a chemical residual, so a direct C.t approach as used for
chlorine dioxide and ozone cannot be used. UV appliances used for protozoal compliance need
to be validated or certified to demonstrate the dose that they are capable of delivering at
different water qualities and flow rates.
The number of log credits claimed must be either (a) or (b):
a)  3.0 log credits for appliances validated against DVGW Technical Standard W294, or öNORM M5873 (Österreichisches Normungsinstitut 2001/2003), or NSF/ANSI 55-2002* (NSF and ANSI 2002b) for Class A systems (for populations up to 5000) that deliver a fixed dose or fluence of 40 mJ/cm2
b)  the number the reactor has been validated to achieve (up to 3 logs) following the procedures and requirements specified in the Ultraviolet Disinfection Guidance Manual or UVDGM (USEPA 2006c).
*  UV disinfection systems that meet this standard can be found on the website http://nsf.com/Certified/DWTU/.
Note that at the time of writing the DWSNZ, appliances covered by (a) mainly delivered a fixed dose (fluence) and claimed 3 logs, whereas appliances covered by (b) can claim 0.25 to 3.0
protozoal log credits, depending on the validated dose and operating conditions. The fixed dose
of 40 mJ/cm2 allows the appliance to be used to inactivate both bacteria and protozoa (oo)cysts,
whereas using the UVDGM, a 12 mJ/cm2 dose will earn 3 protozoal log credits (see Table 8.7
and section 8.5).
In most cases in New Zealand, UV appliances are validated off-site due to the complexities of
onsite validation. Onsite validation is not discussed in these Guidelines. The manufacturer’s
validation is only applicable when the installed appliance is identical to the appliance that was
tested, and the inlet and outlet hydraulic conditions are equal to or better than the conditions
used in the validation process. Appliances will need to be revalidated if they are modified.
Validation testing of UV appliances must determine a range of operating conditions the
appliance can monitor and under which the appliance delivers the required UV irradiance
(dose), as measured by the UV intensity meter (UV sensor), to achieve the target log credit for a
range of flows. These operating conditions must include, at least:
• flow rates
• UV intensity (fluence rate) as measured by a UV intensity sensor
• UV lamp status
• minimum UV transmittance of the water for which the UV appliance has been validated to achieve the target inactivation.
The validation procedure must take account of uncertainties in the disinfection system including
uncertainties related to the velocity distribution, lamp aging and UV intensity sensors.
The validation certificate must:
• be an original, written in English, unique to the model of appliance
• have been written by the certifying authority, and describe the validation procedure
• state the qualifications of the certifying authority that conducted the validation. The validation testing must have third-party verification by an agency accredited to ISO/IEC 17025 (IANZ 2005) or by the New Zealand National Metrology Institute (or accreditation to an equivalent standard accepted by the Ministry of Health). The National DWAs Coordination team maintains a list of agencies that have been accepted by the MoH
• provide a detailed list of the components and dimensions of the appliance that had been validated and relate to the parts comprising the water supplier’s appliance and to its name plate (or data plate) fixed to the appliance
• define what type of UV lamps are installed in the reactor, and clarify the reasons for the choice of the aging/fouling factor, which is equal to the fouling factor multiplied by the aging factor and typically ranges from 0.4 to 0.9 (see section 5.4.6 of the UVDGM)
• show clearly the means by which the appliance was shown to comply with the requirements of the standard that it was tested against; these requirements will include the number of UV intensity sensors, the position of the sensors, the spectral response of the sensors, the variation in lamp UV output and the sensor uncertainty
• contain statements, graphs or tables clearly showing the range of UV transmittance and flow that the validation covered; where appropriate these graphs should also show the target dose (usually measured in mJ/cm2) that was certified for each UV transmittance and flow combination
• contain statements, graphs or tables showing clearly the UV intensity (ie, the sensor reading, usually in mWs/cm2) that is required at a given flow to produce a given target dose
• contain a detailed description of the inlet and outlet hydraulic conditions
• include a description of the measured headloss across the UV appliance
• include a description of the challenge micro-organism used and its dose response curve that was generated as part of the validation
• include a description of how uncertainty arising from the number of experimental data points has been taken into account in setting the target dose and related sensor reading.
The end-user should use the validation report to ensure that all aspects of the validation process
are applicable to the specifics of the proposed application. This review at a minimum should
cover the inlet and outlet piping configurations and the operating range of UV transmittance
and flow to provide the required dose.
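The review described above amounts to checking that every intended operating point lies inside the validated envelope. A minimal sketch of that check follows; the envelope limits (minimum UVT, maximum flow) and the units are invented for illustration and would come from the actual validation certificate.

```python
# Illustrative sketch only: checks that a proposed operating point falls within
# the UVT/flow envelope covered by a (hypothetical) validation certificate.
# The default envelope values are invented for this example.

def within_validated_envelope(flow, uvt, min_uvt=85.0, max_flow=50.0):
    """Return True if the operating point is covered by the validation.

    flow     -- flow through the appliance (m3/h)
    uvt      -- UV transmittance of the water, %T (10 mm)
    min_uvt  -- lowest UVT covered by the validation certificate
    max_flow -- highest flow covered by the validation certificate
    """
    return uvt >= min_uvt and flow <= max_flow

print(within_validated_envelope(flow=40.0, uvt=90.0))  # True: inside envelope
print(within_validated_envelope(flow=60.0, uvt=90.0))  # False: flow exceeds validation
```

In practice the envelope is usually a set of UVT/flow/dose combinations rather than two fixed limits, so the certificate's own tables or graphs remain the authority.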
272
Guidelines for Drinking-water Quality Management for New Zealand 2013
The performance of the equipment when installed should be verified against the validation at the end of lamp life. The time to reach the end-of-lamp-life condition is itself an important performance criterion for a UV system. The end-user should require that the UV equipment pass a performance test at the end of lamp life; typically this will occur after at least 12 months of operation. The appliances will also include a system that continuously monitors lamp status.
Note: The amount of inactivation that is achieved is a function of the amount of UV light that the micro-organisms receive. This is called the UV dose or, more correctly, the fluence. The SI unit of UV dose is J/m2; the unit mJ/cm2 is also used, and one mJ/cm2 is equal to 10 J/m2. The dose is the product of the intensity of the UV light and the time for which the micro-organisms are exposed to it (which is related to flow). Intensity is measured in watts (W) per unit area and time in seconds (s). Consequently the dose is sometimes expressed as mW.s/cm2 or W.s/m2; one mJ/cm2 is equal to 1 mWs/cm2.
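The unit relationships in the note above can be expressed directly as code; the numbers in the usage lines are arbitrary examples, not requirements.

```python
# Unit relationships from the note above:
# 1 mJ/cm2 = 10 J/m2 = 1 mWs/cm2, and dose (fluence) = intensity x exposure time.

def mj_per_cm2_to_j_per_m2(dose_mj_cm2):
    return dose_mj_cm2 * 10.0  # 1 mJ/cm2 = 10 J/m2

def dose_mj_cm2(intensity_mw_cm2, seconds):
    # dose = intensity x time: mW/cm2 x s = mWs/cm2 = mJ/cm2
    return intensity_mw_cm2 * seconds

print(mj_per_cm2_to_j_per_m2(40))  # 400.0 J/m2
print(dose_mj_cm2(5.0, 8.0))       # 40.0 mJ/cm2
```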
UV equipment manufacturers are increasingly claiming that their validation documentation
(and some exceed 300 pages) is commercially sensitive and not available to regulatory bodies.
Instead they offer much abbreviated reports or simply state that the appliance meets
requirements. In conjunction with the major New Zealand importers of UV equipment, the MoH
has developed a template questionnaire to overcome this difficulty. It is reproduced below, and
is followed by explanations.
Template for UV disinfection: evidence of validation

1. The water being disinfected
   a. Water treatment plant
   b. WINZ number
   c. Water supplier
   d. Protozoal log credit categorisation
   e. Target protozoal log credits by UV
   f. For protozoal compliance (5.16) or both protozoal and bacterial (5.16 plus 4.3.5)
   g. Design UVT range, %T (10 mm)

2. The UV disinfection system
   a. Manufacturer
   b. Model
   c. Serial number(s)
   d. Number of reactors
   e. Lamp type: LP, or LPHO, or MP, or other
   f. Importer/New Zealand agent; contact name
   g. Water supplier’s consultants; contact name
   h. Installed by, and date commissioned

3. The validation
   a. ‘Standard’ validated to, under which compliance is sought: i) DVGW; ii) ÖNORM; iii) NSF; iv) UVDGM; v) other (specify)
   b. Relevant validation testing body
   c. Validation certificate/report signed by, and date
   d. The validation testing body has third-party verification by an agency accredited to: i) ISO/IEC 17025 (IANZ 2005); or ii) an equivalent standard accepted by MoH
   e. Challenge micro-organism used for the standard under which compliance is sought – micro-organism(s): Bacillus subtilis, MS-2 coliphage, T1, T7, or other; RED range tested (mJ/cm2)
   f. UVT (%, 10 mm) and flow range covered in validation: UVT; flow (units)
   g. Components in use during validation (lamp, sleeve, ballast, duty sensor, sensor window): part numbers; evidence – i) sighted validation report; ii) declaration (append)
   h. Number of lamps per reactor
   i. Number of sensors per reactor
   j. Change lamps at x hours run time
   k. Do hydraulics of installed appliance meet the validation conditions? Yes/No; comment

4. Dosage control
   Either: the UV Intensity Setpoint Approach –
   a. UVT measured online or in lab?
   b. Define the relationship between flow, UVT and intensity
   Or: the Calculated Dose Approach –
   c. Describe how a DWA can be certain that the appliance is operating within its validated condition: i) a signed declaration from the manufacturer’s senior management (append); ii) other
   d. What is the dose alarm setpoint? What is the maximum validated flow?

5. Standardisation (sensors)
   a. How is the duty sensor checked that it is within specification? Append as required: i) vs reference sensor; ii) other traceable procedure
   b. What tolerance is allowed before remedial action is required?
   c. What action is specified for when the duty sensor is out of specification?
   d. What supporting documentation describes the original reference sensor standardisation (standard; issued by; date)? Append if possible.
   e. How is the reference sensor standardised at the water treatment plant? i) as per UVDGM; ii) other traceable procedure; iii) replaced annually
   f. How is the UVT instrument standardised? i) manual UVT; ii) online UVT
   g. At what frequency?

6. Alarms
   a. How is the UV appliance set to warn of transgressions? UV dose; UV sensor (intensity meter); UVT; turbidity; flow; other (describe)

7. Monitoring
   (water supplier to complete those questions that are not appropriate for the UV vendor to complete)
   a. Where are the results of standardisations recorded?
   b. What remedial action will be followed when the duty sensor is out of specification?
   c. At what frequency will each alarm condition be verified?
   d. Monitoring results have to be reported. Who is responsible? i) the UV people; ii) consultants/contractors; iii) the water supplier
   e. Will the reporting requirements of section 3.2 of DWSNZ be met? i) third party verification; ii) other
Commentary on items in the template
• General: The USEPA’s UVDGM is a guidance manual, but in this document it is treated as though it were a ‘standard’, because referencing it in the DWSNZ effectively makes it a standard.
• 1c: The MoH has allocated a unique identifier to every water source and treatment plant.
• 1d: Protozoal log credit categorisations are determined or signed off by DWAs.
• 2a: Some UV appliances are made by more than one manufacturer – list equivalents.
• 2b: Some models of UV appliances go by more than one name – list equivalents.
• 2f: This question relates only to UV disinfection.
• 3 (general): DVGW and ÖNORM include an expiry date on their certificates. The MoH accepts that if an appliance continues to meet the original validation, the expiry date may be ignored.
• 3a: Some appliance models are validated to more than one standard. Where this is so, record both, but indicate which one applies to this water treatment plant.
• 3e: RED is reduction equivalent dose, a term used in North America. It is the same as REF, reduction equivalent fluence, a term used in Europe. ÖNORM describes it as the “average microbiocidal fluence measured by the biodosimeter according to Annex D in the irradiation chamber, in J/m2”. The UVDGM describes it as ‘see UV dose’, which reads: “the UV energy per unit area incident on a surface, typically reported in units of mJ/cm2 or J/m2. The UV dose received by a waterborne microorganism in a reactor vessel accounts for the effects on UV intensity of the absorbance of the water, absorbance of the quartz sleeves, reflection and refraction of light from the water surface and reactor walls, and the germicidal effectiveness of the UV wavelengths transmitted”. Note that 40 mJ/cm2 = 400 J/m2 = 40 mW-s/cm2 (also written as 40 mWs/cm2).
• 3f: If units are SSK-254/m, convert to %T (10 mm), but enter both.
• 3g: A DWA needs to know that the installed appliance comprises the same parts as the appliance that was validated, hence the need for part numbers. If the part numbers have changed between validation and delivery, some form of verification from the manufacturer will be required. If the delivered parts are not the same, a new validation certificate will be required. If the validation report/certificate is not sighted in New Zealand, the manufacturer needs to provide a signed declaration that the parts supplied are the parts that underwent validation. If the New Zealand agents have the validation report/certificate but it is not available to others, the New Zealand agent/importer needs to provide a signed declaration that the parts supplied are the parts that underwent validation.
• 3i: Under DVGW each MP lamp needs a sensor.
• 3j: The hours run meter must incorporate the effect of on/off switching.
• 3k: Flow patterns can exert a major effect on inactivation efficacy.
• 4: Section 5.16.2 of DWSNZ includes: “The validation certificate must define the operating conditions under which the reactor can deliver the UV dose required by the validation procedure”.
• 4a: Population based – see Table 5.7 in DWSNZ.
• 4b: This can be in the form of an appended graph, equation or table.
• 4c: The Calculated Dose Approach uses a dose-monitoring equation or algorithm to estimate the UV dose, based on operating conditions (typically flow rate, UV intensity and UVT). The dose-monitoring equation is usually developed during validation testing, and is incorporated in the ‘black box’ that controls dosage. Using the Calculated Dose Approach makes it difficult to check that the operating conditions under which the reactor delivered the UV dose in the validation procedure are actually being met.
A declaration from senior management of the manufacturer that the algorithm/program used during validation has been incorporated in the control system, followed by a declaration from senior management of the importer/New Zealand agent that the algorithm/program has not been changed, should suffice.
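Real dose-monitoring equations are fitted during validation and are specific to each reactor, which is precisely why they are hard to audit. Purely as an illustration of the general form (a log-linear function of sensor reading, flow and UVT), a hypothetical sketch follows; the coefficients are invented and carry no regulatory meaning.

```python
import math

# Hypothetical sketch of a Calculated Dose Approach monitoring equation.
# The log-linear form and the coefficients a, b, c, d are invented for
# illustration only; real equations come from the reactor's validation testing.

def calculated_dose(flow_m3h, sensor_mw_cm2, uvt_percent,
                    a=0.0, b=0.8, c=-0.9, d=0.02):
    """Estimate delivered dose (mJ/cm2) from current operating conditions."""
    return 10 ** (a
                  + b * math.log10(sensor_mw_cm2)  # higher intensity -> higher dose
                  + c * math.log10(flow_m3h)       # higher flow -> less contact time
                  + d * uvt_percent)               # clearer water -> higher dose

dose = calculated_dose(flow_m3h=30.0, sensor_mw_cm2=8.0, uvt_percent=90.0)
print(dose >= 12.0)  # compare against the validated target dose for the log credit
```

The point of the sketch is the auditing difficulty the text describes: unless the fitted coefficients are disclosed, an operator cannot independently verify what the ‘black box’ computes.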
• 5a, b, c and d: The manufacturer/importer/agent needs to set these up for the water supplier.
• 5e: The reference sensor must have appropriate documentation, traceable to ISO 17025 or an equivalent standard accepted by MoH.
• 5f: This can be by following the manufacturer’s procedure, or cross-checking against a lab bench instrument. If an online UVT monitor takes more than 15 min to standardise or maintain (eg, for cleaning) it shall not be deemed to have failed to comply with section 5.16.1, part 5a(i)C (which refers to a three-minute period).
• 5g: The manufacturer needs to tell the water supplier how often to standardise the UVT.
• 6 (general): This section refers to alarms that indicate transgressions, non-compliances, failures etc. It is expected that a lower level of alarm will warn the operator of impending transgressions or maintenance requirements etc.
• 6a: Whether these alarms are provided will depend on the dosage control system in operation.
• 6b: If the alarm system is part of the procedure that is involved in indicating compliance, it needs to be sufficiently sensitive and, of course, still ‘alive’. The manufacturer/importer/agent needs to include this facility. An alarm that indicates just ‘on’ or ‘off’ is not sufficient. A more sensitive technique is needed, eg, noting the response when a sensor is removed slightly, or when lamp power is reduced slightly.
• 7 (general comments): The procedure used for monitoring compliance may be provided by the UV appliance manufacturers, the New Zealand importers/agents, consultants/contractors, or the water supplier. But ultimately, it is the responsibility of the water supplier to ensure that the monitoring requirements of the DWSNZ are met.
Section 3.2 of the DWSNZ includes:
Compliance with the DWSNZ requires some determinands not to exceed a certain value
for more than three, five or 15 minutes. This requires accuracy in time measurement
and recording to ensure no short-term transgressions go unrecorded. Generally, for
remote measurements, unless a high-speed communications network is used, this
requires the remote terminal unit to time-stamp the data as it is recorded. The
sampling frequency must be as specified above. Where this cannot be achieved at
present, suitable equipment must be installed and operating as stated in section 69C of
the Act.
The data records may be compressed using a procedure that preserves the accuracy of
the original measurements. Data must be reported as a percentage of the time (or
duration, where required) that the value was exceeded (or met) during the compliance
monitoring period.
In section 5.16, the compliance monitoring period for continuously monitored parameters is
one month; for all other measurement frequencies the compliance monitoring period is one
year. DWAs cannot be expected to analyse thousands of data points – for example, there are
43,200 minutes in a 30-day month. If the data cannot be presented in a summarised format
that readily shows compliance, the DWA will deem it as failing to comply.
• 7e: The accuracy of the procedure used to convert online monitoring data to the compliance report sent to the DWA needs to be verified by an appropriately qualified third party.
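Section 3.2’s requirement to report data as a percentage of time, summarised so a DWA need not sift through 43,200 one-minute readings, can be sketched as follows. The record format (duration, value) pairs and the sample readings are invented for illustration.

```python
# Sketch of summarising time-stamped monitoring data as the percentage of the
# compliance monitoring period for which a determinand met its required value,
# in the spirit of section 3.2 of the DWSNZ. The sample data are invented.

def percent_time_meeting(records, limit):
    """records: list of (duration_minutes, value) pairs.
    Returns the percentage of total time for which value >= limit."""
    total = sum(minutes for minutes, _ in records)
    ok = sum(minutes for minutes, value in records if value >= limit)
    return 100.0 * ok / total

# eg, UV intensity readings against a hypothetical validated setpoint of 8 mW/cm2,
# over the 43,200 minutes of a 30-day month
readings = [(43100, 9.5), (90, 7.9), (10, 6.8)]
print(round(percent_time_meeting(readings, 8.0), 2))  # 99.77
```

A summary of this kind (one percentage per determinand per compliance monitoring period) is the sort of readily checkable format the guidance asks for.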
Calibration and monitoring
The 2005 DWSNZ required the UV dose (fluence) to be not less than the reduction equivalent
dose (RED) target required for the claimed log credit .... The 2008 DWSNZ require the UV
irradiance (intensity or sensor reading) to be not less than the value established by validation
required to achieve the claimed log credit .... The switch from UV dose to UV intensity was
because the dose is ‘what comes out of the lamp’ while intensity (dose delivery) is ‘what hits the
most distant (oo)cyst’, which is more logical operationally. However, the approach used, which
depends on the manufacturer, is not the critical point. What is important is that the monitoring
system indicates that the appliance is being operated within the conditions of its validation.
UV intensity sensors
UV intensity sensors measure the intensity of the germicidal UV light at a specified distance
from a set of UV lamps. Sensors are discussed further in section 8.6.2.6 (this chapter).
A UV disinfection appliance (called a reactor by USEPA) may contain one or more UV lamps.
Appliances may contain one or more UV sensors. Section 6.3.2.2 of USEPA (2006c) states:
UV lamp output differs for each lamp, depending on lamp age and lot. However, a UV
sensor cannot measure lamp output variability unless each lamp has a UV sensor. Water
suppliers that have UV reactors with a UV sensor monitoring more than one lamp should
assess the UV lamp variability every two months for MP lamps or every three months for
LP and LPHO lamps. If all the lamps monitored by a UV sensor are close in age (ie, their
age varies by less than 20 percent), it is not necessary to check the output of each lamp. In
this case, the oldest lamp should be placed in the position nearest the UV sensor.
There should be at least one UV intensity sensor for every 10 LP or LPHO lamps (see DVGW)
and the number and type of intensity sensors must be the same as for the unit that was certified
or validated. These sensors are called duty sensors and are in continuous use.
All UV appliances will have at least one duty UV intensity sensor as an integral part; they all
need to be calibrated against the reference sensor at regular intervals as required by the
standard that they were certified to, or monthly as per section 5.16.2(b) in the DWSNZ,
whichever is the more frequent.
The reference sensor is to be calibrated at least annually, in accordance with the validating
authority, eg, USEPA 2003d/2006c. The calibration must be done by an accredited person or
organisation, see section 5.16.3 of DWSNZ. However, due to the cost of sending a sensor
overseas for recalibration, the DWSNZ allow reference sensors to be used as duty sensors (in
lieu of recalibration) and a new calibrated sensor can be purchased for use as a replacement
reference sensor.
UV intensity sensor measurement uncertainty
Duty sensor calibration is discussed in section 6.4.1.1 of USEPA (2006c). The duty sensor is considered to be operating satisfactorily if it reads within 20 percent of the reference sensor reading (preferably using mean values of several readings). If it is outside the 20 percent allowance, immediately check that the reference sensor is still correct. If it is, then change the duty sensor. USEPA states that it is permissible to continue using the duty sensor with a correction factor; however, this is said not to be energy-efficient.
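The 20 percent acceptance check described above can be sketched as a simple comparison of mean readings; the sample readings are invented for illustration.

```python
# Sketch of the duty-sensor check described above: the duty sensor is accepted
# if its mean reading is within 20 percent of the reference sensor's mean reading.

def duty_sensor_ok(duty_readings, reference_readings, tolerance=0.20):
    duty = sum(duty_readings) / len(duty_readings)
    ref = sum(reference_readings) / len(reference_readings)
    return abs(duty - ref) <= tolerance * ref

print(duty_sensor_ok([8.2, 8.4, 8.3], [9.0, 9.1, 8.9]))  # True: within 20 percent
print(duty_sensor_ok([6.0, 6.1, 5.9], [9.0, 9.1, 8.9]))  # False: check reference, then replace duty sensor
```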
UV transmittance
Some UV disinfection systems automatically adjust the UV dose as the UV transmittance
(measured at 253.7 nm) of the water flowing through the appliance varies. The UV
transmittance requirements in section 5.16.1(5) of the DWSNZ do not apply to these appliances.
Other appliances rely on the UV transmittance of the water not being less than that noted during
the validation. UV transmittance needs to be monitored when using appliances of this type. If
the population served is more than 10,000, UV transmittance is required to be monitored
continuously. Only one UV transmittance meter that monitors the combined water that passes
into or out of the UV disinfection plant is required.
The DWSNZ require less frequent monitoring for systems that serve fewer than 10,000 people,
but the end-user should investigate the cost and benefit of installing a continuous UV
transmittance monitor.
Water suppliers must take care to note the units for UV transmittance on their validation
certificate. The readings are not always reported for the traditional 10 mm path length. Annex C
in ÖNORM (2001) is a table that relates UV transmittance readings at 100, 50 and 10 mm path
lengths; it includes a column headed SSK per metre (spectral attenuation coefficient) which we
call UV absorbance (10 mm) except SSK per metre is numerically 100 times larger. Appendix
A1.5.9 in the DWSNZ shows how to convert UV absorbance to UV transmittance. Examples
follow:
• say the absorbance is 0.0721 (as measured in a 10 mm cell), ie, A(10 mm) = 0.0721
• to convert to transmittance, see DWSNZ Appendix A1.5.9, which states A = –log T
• therefore A = 0.0721 gives T = 0.847, or 84.7%T.
Using ÖNORM: 84.7%T (10 mm) = 43.6%T (50 mm) = 19%T (100 mm) = 7.212 SSK/m (= 0.0721 A (10 mm)).
That is: %UVT = 100 × 10^(–SSK/100), where SSK is SSK/m measured at 254 nm.
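The worked example above can be expressed as code, reproducing the ÖNORM conversions between SSK/m, UV absorbance (10 mm) and %UV transmittance at various path lengths:

```python
# Converting SSK/m (spectral attenuation coefficient at 254 nm) to %UV
# transmittance at a chosen path length, following the worked example above.

def ssk_to_percent_t(ssk_per_m, path_mm=10):
    absorbance_10mm = ssk_per_m / 100.0  # SSK/m is numerically 100 x A(10 mm)
    # Beer-Lambert: absorbance scales linearly with path length
    return 100.0 * 10 ** (-absorbance_10mm * path_mm / 10.0)

print(round(ssk_to_percent_t(7.212), 1))       # 84.7 %T (10 mm)
print(round(ssk_to_percent_t(7.212, 50), 1))   # 43.6 %T (50 mm)
print(round(ssk_to_percent_t(7.212, 100), 1))  # 19.0 %T (100 mm)
```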
USEPA (2006a) states in section 6.4.1.2 that online UVT analysers be standardised at least
weekly by comparing the online UVT measurements with UVT measurements using a bench-top
spectrophotometer. The bench-top spectrophotometer should be maintained and standardised
at the frequency required by the manufacturer. The standardisation monitoring frequency can
be decreased or increased based on the performance demonstrated over a one-year period.
USEPA considers the online reading to be satisfactory if it is within 2 percent of the bench-top
spectrophotometer reading.
Turbidity
Section 5.16.1 of the DWSNZ stipulates the turbidity requirements for the water passing through
the UV appliances.
Flow measurement
The DWSNZ require that all appliances serving more than 500 people have a dedicated flow meter, and the flow through the appliance needs to be limited to not more than the flow for which the appliance was validated. The flow through appliances serving a smaller population also needs to be limited to less than the flow for which the appliance was validated.
Alarms
Monitoring equipment should be connected to alarm devices to notify operators when action is required. All alarm events should be recorded. USEPA (2006c) discusses (section 4.3.3) the use of minor, major and critical alarms.
Records
To avoid misinterpretation over meeting validation requirements, it is strongly recommended
that a detailed diary be kept of plant operations, calibrations, maintenance and replacement of
lamps etc. See Annex G in ÖNORM (2001) for further information.
Further information
The USEPA (2003a, 2006a) considered that a major recent development was the finding that
UV light is highly effective for inactivating Cryptosporidium and Giardia at low doses. Research
prior to 1998 had indicated that very high doses of UV light were required to achieve substantial
disinfection of protozoa. These results were based largely on the use of in vitro assays, which
were later shown to substantially overestimate the UV doses required to prevent infection
(Clancy et al 1998, Bukhari et al 1999, Craik et al 2000). Research using in vivo assays (eg,
neonatal mouse infectivity) and cell culture techniques to measure infectivity has provided
strong evidence that both Giardia and Cryptosporidium are highly sensitive to low doses of UV.
USEPA (2006a) stated that even though microbial repair can occur, neither photorepair nor
dark repair is anticipated to affect the performance of drinking-water UV disinfection.
These studies demonstrated that a dose of 10 mJ/cm2 was able to achieve Cryptosporidium
inactivation of at least 3 log, compared with a typical UV dose for general water supply
disinfection being about 30–40 mJ/cm2 (or 300–400 J/m2 in ISO units). Table 8.7 (taken from
Table 1.4 of USEPA 2006c) shows the relationship between UV dose and available log credits for
protozoa inactivation.
Qian et al (2004) performed a meta-analysis of a number of drinking-water UV efficacy studies
and concluded that doses up to 20 mJ/cm2 are necessary to achieve at least a 3-log
(99.9 percent) reduction of Giardia cysts and Cryptosporidium oocysts, with at least 95 percent
confidence (ie, no more than a 5 percent risk of failing to meet that level of reduction).
Table 8.7: UV dose requirements for Cryptosporidium inactivation credits

Log credit             UV dose (mJ/cm2)
0.5                    2
1.0 (90% removal)      3
1.5                    4
2.0 (99% removal)      6
2.5                    9
3.0 (99.9% removal)    12
3.5                    15 (not applicable in DWSNZ 2008)
4.0 (99.99% removal)   22 (not applicable in DWSNZ 2008)
These doses (rounded to whole numbers) are based on UV light at 254 nm as delivered by a low
pressure mercury vapour lamp. The doses can be applied to other lamps such as medium
pressure through reactor validation testing.
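Table 8.7 lends itself to a simple lookup, for example when checking a proposed design against its target log credit. The rounded doses below are those tabulated above (from Table 1.4 of USEPA 2006c).

```python
# Table 8.7 as a lookup: required 254 nm UV dose (mJ/cm2, rounded to whole
# numbers) for a given Cryptosporidium log credit, per Table 1.4 of USEPA (2006c).

DOSE_FOR_LOG_CREDIT = {
    0.5: 2, 1.0: 3, 1.5: 4, 2.0: 6,
    2.5: 9, 3.0: 12,
    3.5: 15,  # not applicable in DWSNZ 2008
    4.0: 22,  # not applicable in DWSNZ 2008
}

def required_dose(log_credit):
    try:
        return DOSE_FOR_LOG_CREDIT[log_credit]
    except KeyError:
        raise ValueError("No tabulated dose for %s log credits" % log_credit)

print(required_dose(3.0))  # 12 mJ/cm2
```

Remember that these doses assume 254 nm low pressure lamp output; applying them to other lamp types still requires reactor validation testing, as the text notes.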
USEPA (2003a) states that to receive disinfection credit for a UV reactor, manufacturers are
required to demonstrate through validation testing that the reactor can deliver the required UV
dose. The USEPA developed dose requirements for Cryptosporidium that account for the
uncertainty associated with the dose-response of the micro-organisms in controlled
experimental conditions. In practical applications, other sources of uncertainty are introduced
due to hydraulic effects, UV reactor equipment, water quality, and sensor quality. The validation
protocol applies a safety factor to the dose requirements to account for these areas of
uncertainty and variability.
DWI (2010) states that the absorbance of UV light by nitrate (at wavelengths below 240 nm) can
lead to the formation of nitrite by photolysis. This can be managed through the selection of UV
lamp or sleeve type.
8.4.5 Other processes and other log credit determinations
The USEPA included in their LT2ESWTR Final Rule (USEPA 2006a) a section called
Demonstration of Performance, basically to cover processes and procedures not specified in
detail in the LT2ESWTR. Refer also to Chapter 12 of the review draft LT2ESWTR Toolbox
Guidance Manual (USEPA 2009).
As a consequence, the 2008 DWSNZ include a new section, section 5.17: Alternative processes:
treatment compliance criteria, whereby water suppliers may apply to the Ministry of Health to
have other treatment processes assessed for a log credit rating. This approach allows water
suppliers to apply for a log credit rating (or a variation to the prescribed log credits) for a
treatment plant or process:
a) not covered in sections 5.1–5.16 of the DWSNZ
b) that performs demonstrably better than its compliance criteria
c) that performs to a lesser, but reliable, level than specified in its compliance criteria.
a)
Treatment processes not covered in sections 5.1–5.16 of the DWSNZ
Any water supplier or equipment supplier that wishes to use a treatment process not covered in sections 5.1–5.16 of the DWSNZ, and that is considered to be effective in the removal or inactivation of protozoal (oo)cysts, can apply to the Ministry of Health for an assessment to decide whether the process qualifies for any log credits.
If it appears to qualify, then the next step will be to determine the number of log credits allowed,
and the criteria that will need to be satisfied in order to qualify for those log credits.
The application will need to be made before installing the equipment or process. The
information that will be needed with the application will include:
• a description of the quality of the raw water that will be treated
• a detailed description of the treatment process and its limitations
• the intended maximum (and minimum, if relevant) treatment rates
• results from a bench-scale and/or pilot plant challenge test
• the operating parameters that need to be met in order to confirm the claimed log removal
• where possible, a quantitative description of the performance of the full-scale process elsewhere, including details of (oo)cyst removal/inactivation or equivalent, including:
  – a description of the water the process treated
  – the treatment rates or loading rates the data provided relate to
  – monitoring results.
The supporting data supplied must have been generated by organisations accredited by
appropriate agencies acceptable to the Ministry of Health.
As shown by the USEPA (2006a) when assessing treatment processes in developing their
LT2ESWTR, any water treatment plant using a new process should earn fewer log credits than it
achieved in bench-scale or pilot plant challenge tests. Reasons include:
• variations in treatment rate can affect treatment performance
• variable raw water composition can affect treatment requirements and performance
• variable water temperature can affect treatment performance
• whether all plants using the new process will operate similarly
• deterioration in treated water quality as a treatment cycle progresses
• wear and tear/maintenance problems of the process
• the skill level required by operators
• the degree of difficulty in maintaining optimum performance
• the time for the process to recover after a problem has been identified and rectified.
The number of log credits a new process is awarded would probably relate more closely to the
results presented from a full-scale plant, provided it was operating ‘normally’ during the testing.
However, there will be difficulties if the full-scale plant is producing drinking-water, because, in
general, the degree of removal that can be quantified in full-scale plants is limited because
Cryptosporidium oocyst levels following filtration are often below the detection limit of the
analytical method. Due to the shortage of data relating to oocysts, the USEPA (2003a, 2006a)
evaluated data provided by water suppliers on the removal of other types of particles, mainly
aerobic spores, in the sedimentation processes of full-scale plants. Data indicate that aerobic
spores may serve as a surrogate for Cryptosporidium removal by sedimentation provided
optimal chemical dosage conditions apply (Dugan et al 2001). This is discussed further in
section 8.5, along with other monitoring techniques such as particle counting.
Due to the above, a regulatory authority is likely to err on the side of safety. Safety measures
include requiring another treatment to be included while the new process undergoes in situ
evaluation. For example, if a source water needs 3 log removals, the new process could be
evaluated while the water was being disinfected by UV light, equivalent to a 3-log dose. If the
new process is awarded say 2 log credits, the UV disinfection dose rate could then be reduced.
If a new process satisfies the above, compliance criteria specific to that process and site will be
developed.
b)
Treatment that performs demonstrably better than its compliance criteria
The prescribed log credits for treatment processes in the DWSNZ are based on conservative
estimates of mean Cryptosporidium removal efficiencies. Due to site-specific conditions, some
treatment plants may consistently achieve greater Cryptosporidium removal than reflected in the
prescribed log credits. Water suppliers may receive log credits for a water treatment plant or a
treatment process within a plant that is based on demonstration of Cryptosporidium removal
efficiency. Demonstration of performance testing will be specific to a particular site and will depend
on the treatment processes being tested, water quality, plant infrastructure, technical and
management resources, and other factors. Demonstration of performance testing should encompass
the full range of expected operating conditions, and cover at least a year’s continuous operation.
Demonstration of Cryptosporidium removal efficiency will usually not be possible, so indirect
techniques will generally be needed. These may include monitoring the removal of aerobic or
anaerobic spores, or using particle counting, see section 8.5. The supporting data supplied must
have been generated by organisations accredited by appropriate international agencies
acceptable to the Ministry of Health. If the supporting data is satisfactory, compliance criteria
specific to that process and site will be developed.
Treatment plants cannot claim additional log credits by this process if they are already claiming
log credits for individual processes. For example, a coagulation/sedimentation/filtration plant
(DWSNZ section 5.4) cannot claim demonstration of performance log credits if it is also
claiming log credits for enhanced combined filter performance (DWSNZ section 5.7).
Treatment plants claiming 3 log credits for an existing disinfection process cannot increase this by demonstration of performance. If a water supply needs more than 3 log credits for protozoal compliance, a filtration technique should provide the additional log credits, ie, application of the multiple barrier principle.
c)
Treatment that performs to a lesser, but reliable, level than specified in its compliance criteria
Some treatment processes may fail to satisfy the compliance criteria prescribed for that process
by a small margin. It is considered unreasonable to award zero log credits for that process if it
can still demonstrate a measurable, but lesser, consistent removal of Cryptosporidium. This
option is only available to processes that fail to satisfy their compliance criteria by a small
margin. It does not apply to plants that cannot cope with peak flows, turbid water after heavy
rain, cold water, power failures or other conditions that a process or plant would be expected to
handle effectively.
Also, some treatment processes may be validated for a higher log credit than the source water
requires. For example, source water that requires 3 log removals may be treated by a membrane
filtration plant that is validated to remove 4.0 logs if it achieves a specified particle removal rate.
A water supplier may be able to demonstrate 3.5 log removals (for example) if it achieves a
slightly less demanding particle removal rate.
Demonstration of Cryptosporidium removal efficiency will usually not be possible, so indirect
techniques will generally be needed. These may include monitoring the removal of aerobic or
anaerobic spores, or using particle counting, see section 8.5. The supporting data supplied must
have been generated by organisations accredited by appropriate international agencies
acceptable to the Ministry of Health. If the supporting data is satisfactory, compliance criteria
specific to that process and site will be developed.
Guidelines for Drinking-water Quality Management for New Zealand 2013
283
Demonstration of performance testing should encompass the full range of expected operating
conditions, and cover at least a year’s continuous operation.
8.5 Challenge testing
Normally it would be expected that the manufacturer of the treatment process would conduct
the challenge test. However, for some treatment processes, there is no reason why a water
supplier cannot arrange to have this done in New Zealand.
8.5.1 Using Cryptosporidium oocysts
Although use of Cryptosporidium as the challenge particulate offers the advantages of directly
measuring removal efficiency, and eliminates issues regarding the appropriateness of a
surrogate, it may not be practical or feasible due to economic considerations (particularly in
large plants), or to health concerns about working directly with the pathogen. Thus the use of
surrogates may be the most viable option for challenge testing.
The use of other organisms and molecular markers is discussed in the Membrane Filtration
Guidance Manual (USEPA 2003c, 2005a).
8.5.2 Using microspheres
Microspheres can be used for measuring the removal efficacy of filtration processes. EPA/NSF
ETV (2002) describes the technique in detail, in their sections 12.3.3, 12.4.2 and 14.9. For
testing involving microscopic enumeration, fluorescent microspheres and an optical microscope
equipped with ultraviolet illumination are used.
8.5.3 Using naturally occurring bacteria
Rice et al (1996) stated:
Monitoring for indigenous spores of aerobic spore-forming bacteria represents a viable
method for determining treatment plant performance. Comparison of spore levels in
source water and filter effluents provides an indication of biological particle removal
efficiency.
Section 8.4.1.2 Challenge Particulate in the LT2ESWTR Toolbox Guidance Manual Review Draft
discusses the use of surrogates for challenge testing, examples being P. diminuta and
S. marcescens (USEPA 2009).
The heterotrophic plate count may be a suitable (and cost-effective) test. A membrane filtration
laboratory method is needed so large volumes can be filtered. Trials would be needed for each
water supply to see how much water needs to be filtered. With low turbidity water it should be
possible to filter 5–10 litres to ensure a reasonable number of bacteria grow. Also, the optimum
incubation temperature and time will need to be determined. Some may need to be incubated at
22°C (or even room temperature if an incubator is not available) for 72 hours. An example follows:
• say 25 mL of raw water is filtered through the membrane and incubated, and 200 colonies are counted = 8000 CFU/L
• say 5 L of treated water is filtered through the membrane and incubated, and 40 colonies are counted = 8 CFU/L
• that equals 99.9 percent removal (3 log), which should give confidence that the treatment process was removing the larger protozoa effectively.
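The arithmetic in the example above can be sketched as a short calculation (the function names are illustrative only, not part of the Guidelines):

```python
import math

def cfu_per_litre(colonies: int, volume_filtered_l: float) -> float:
    """Convert a membrane-filtration colony count to CFU per litre."""
    return colonies / volume_filtered_l

def log_removal(raw_cfu_per_l: float, treated_cfu_per_l: float) -> float:
    """Log10 removal of heterotrophic bacteria across the treatment process."""
    return math.log10(raw_cfu_per_l / treated_cfu_per_l)

# Values from the worked example: 25 mL raw water with 200 colonies,
# 5 L of treated water with 40 colonies
raw = cfu_per_litre(200, 0.025)     # 8000 CFU/L
treated = cfu_per_litre(40, 5.0)    # 8 CFU/L
print(round(log_removal(raw, treated), 1))  # 3.0, ie, 99.9 percent removal
```

A 3 log removal of the much smaller bacteria gives confidence, but is only an indirect indication, of protozoal removal.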
8.5.4 Bag and cartridge filter challenge
Bag and cartridge filter manufacturers commonly rate their products by pore size or pore
distribution. However, there is no industry standard for measuring or reporting these
characteristics. This lack of standardisation causes problems for establishing design criteria to
ensure that a given bag or cartridge filter will effectively remove a given percentage of
Cryptosporidium. Furthermore, an oocyst has different structural characteristics than the
markers used to determine pore size; thus, the rate of rejection may differ for an oocyst versus
the test markers used to determine pore size or molecular weight cutoff.
To compensate for these factors of uncertainty for Cryptosporidium removal, the DWSNZ
require bag or cartridge filters to be challenge tested, a process in which a known quantity of
Cryptosporidium oocysts (or an acceptable surrogate) is added to the filter influent and the
effluent concentration is measured to determine the removal capabilities of the filter. This
testing is product-specific, not site-specific, meaning it does not have to be tested at every water
supply seeking removal credit. Instead, a manufacturer (or independent third party) must
challenge test each of its products in order to obtain a 2 or 3 log Cryptosporidium removal
rating. Details for the challenge test that relate specifically for bag and cartridge filtration appear
in section 8 of the review draft Toolbox Guidance Manual (USEPA 2009). More general requirements are the same as for membrane filtration.
8.5.5 Membrane filter challenge
The material below has been taken from the Membrane Filtration Guidance Manual (USEPA
2003c, 2005a). The introduction has been copied in full. The chapters’ titles are listed to
indicate the information that is available in the Guidance Manual (about 330 pages).
3.1 Introduction
The LT2ESWTR requires that any membrane filtration system used to comply with the
Cryptosporidium treatment requirements of the rule undergo challenge testing. The primary
purpose of this challenge testing is to establish the log removal that an integral membrane can
achieve.
Under the LT2ESWTR, the maximum removal credit that a membrane filtration system is
eligible to receive is the lower of the two values established as follows:
• the removal efficiency demonstrated during challenge testing; or
• the maximum log removal that can be verified by the particular direct integrity test used during the course of normal operation.
The requirement for challenge testing under the LT2ESWTR is intended to be product-specific
such that site-specific demonstration of Cryptosporidium removal efficiency is not necessary.
Once the log removal of a membrane has been established through a challenge test that meets
the requirements of LT2ESWTR, additional challenge testing is not required unless
significant modifications are made to the membrane process (as discussed in
section 3.14). The rule specifies criteria for the following aspects of challenge testing:
• full-scale vs small-scale module testing
• appropriate challenge particulates
• challenge particulate concentrations
• test operating conditions
• calculation of removal efficiency
• verifying characteristic removal efficiency for untested modules
• module modifications.
The discussion of challenge testing applies similarly to microfiltration, ultrafiltration,
nanofiltration, and reverse osmosis, except as otherwise noted.
Although the primary focus of challenge testing as required under the LT2ESWTR is
demonstration of Cryptosporidium removal, the general framework for challenge testing
developed in this guidance manual may be adapted for use in establishing removal efficiencies
for other microbial pathogens of concern, including bacteria, viruses, and other protozoa such as
Giardia.
Chapter 3 is organised into sections that describe the various issues to be considered in the
design and implementation of a challenge test.
• Section 3.2: Summary of challenge testing requirements
• Section 3.3: Test organisation qualification
• Section 3.4: General procedure for developing a challenge test protocol
• Section 3.5: Module specifications
• Section 3.6: Non-destructive performance testing
• Section 3.7: Selection of modules for challenge testing
• Section 3.8: Small-scale module testing
• Section 3.9: Target organisms and challenge particulates
• Section 3.10: Challenge test solutions
• Section 3.11: Challenge test systems
• Section 3.12: Sampling
• Section 3.13: Analysis and reporting of challenge test results
• Section 3.14: Re-testing of modified membrane modules
• Section 3.15: Grandfathering challenge test data from previous studies
8.5.6 UV appliance challenge
The validation protocol in the UV Guidance Manual (USEPA 2003d, 2006c) builds on well-established protocols used in Europe and North America: see also DVGW Technical Standard W294, ÖNORM M5873 (Österreichisches Normungsinstitut 2001/2003), and NSF/ANSI 55-2002 (NSF and ANSI 2002b) for Class A systems.
A UV disinfection appliance manufacturer typically delivers a UV appliance to a test facility. Test
personnel inspect the UV appliance and document features of the design that impact dose
delivery and monitoring (eg, appliance dimensions and sensor properties). The UV appliance is
installed within a biodosimetry test stand with inlet and outlet piping that should result in equal
or worse dose delivery than with the appliance installed at the treatment plant site. The UV
appliance is operated under various test conditions of flow, UVT, and lamp power. The test
condition of UVT is typically obtained using a UV-absorbing compound injected into the flow
upstream of the UV appliance. A challenge micro-organism is injected into the flow upstream of
the UV appliance. The concentration of viable challenge microorganisms is measured in samples
collected at the appliance’s inlet and outlet. The results are used to calculate the log inactivation
of the challenge microorganism achieved by the UV appliance.
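The log inactivation calculated from the inlet and outlet samples follows directly from the ratio of viable challenge organism concentrations. A minimal sketch (the concentrations are hypothetical; units would typically be PFU/mL for a challenge phage):

```python
import math

def log_inactivation(inlet_conc: float, outlet_conc: float) -> float:
    """Log10 inactivation of the challenge micro-organism across the UV appliance."""
    return math.log10(inlet_conc / outlet_conc)

# Hypothetical biodosimetry result: 1.0e6 PFU/mL injected upstream,
# 1.0e3 PFU/mL of viable organisms recovered at the outlet
print(round(log_inactivation(1.0e6, 1.0e3), 1))  # 3.0 log inactivation
```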
The UV dose-response of the challenge micro-organism present in the inlet sample is measured
using a bench-scale device termed a collimated beam apparatus. The UV dose-response curve is
used to relate the log inactivation observed through the appliance to a UV dose value termed the
Reduction Equivalent Dose (RED). A safety factor is applied to the results to account for any
bias and random uncertainty associated with the validation of the UV appliance and the online
monitoring approach used to indicate dose delivery both during validation and during operation
at the water treatment plant. Lastly, a validation report is prepared that describes the UV
appliance tested, the test protocol, the test results, and the inactivation credits that can be
assigned to the UV appliance under given conditions of flow, UVT, and lamp output. Refer also
to section 8.4.4.3.
Section 5.3 of USEPA (2006) UV Disinfection Guidance Manual for the Final LT2ESWTR
(UVDGM) allows different test organisms to be used for validation, the choice being dependent
on the target pathogen. Table 5.2 in UVDGM shows the delivered UV dose required to inactivate
a range of micro-organisms.
Successfully challenging with MS2, for example, means the appliance is validated to deliver a 40 mJ/cm² dose, which is the same dose specified in DVGW, ÖNORM and NSF; this means the appliance is validated for bacterial disinfection (and of course protozoal compliance). Successfully challenging with T1, for example, means the appliance is validated to achieve 3 log inactivation of Cryptosporidium and Giardia when delivering a dose of at least 12 mJ/cm².
The validation certificate must state the conditions, ie, UVT, and in this context, particularly the
flow rate. A UV disinfection appliance has now been marketed in NZ with validation using both
MS2 and T1. That particular appliance has been validated to achieve 3 log inactivation of
Cryptosporidium and Giardia at 80 USGPM, but is validated for bacterial disinfection at only
50 USGPM.
So, if this appliance is installed at a water treatment plant that achieves bacterial compliance by
dosing with chlorine (say), protozoal compliance can be achieved at a flow up to 80 USGPM. But
if the UV appliance is meant to achieve bacterial and protozoal compliance, the flow rate cannot
exceed 50 USGPM.
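The flow-rate logic for that dual-validated appliance can be sketched as follows (the 80 and 50 USGPM figures are from the example above; the function itself is illustrative, not a Ministry of Health procedure):

```python
# Validated maximum flows (USGPM) for the appliance described above
VALIDATED_FLOW = {
    "protozoal": 80,  # 3 log Cryptosporidium/Giardia (T1 validation)
    "bacterial": 50,  # 40 mJ/cm2 dose (MS2 validation)
}

def max_compliant_flow(uv_provides_bacterial_compliance: bool) -> int:
    """Maximum flow at which the UV appliance keeps all claimed compliance."""
    if uv_provides_bacterial_compliance:
        # Both validations must hold, so the lower validated flow governs
        return min(VALIDATED_FLOW.values())
    return VALIDATED_FLOW["protozoal"]

print(max_compliant_flow(False))  # 80: chlorine covers bacteria, UV covers protozoa
print(max_compliant_flow(True))   # 50: UV must cover both
```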
8.5.7 General
USEPA (2005a) notes: although the primary focus of challenge testing as required under the
LT2ESWTR is demonstration of Cryptosporidium removal (or inactivation), the general
framework for challenge testing developed in the membrane filtration guidance manual may be
adapted for use in establishing removal efficiencies for other microbial pathogens of concern,
including bacteria, viruses, and other protozoa such as Giardia.
8.6 Sampling and testing for protozoa and substitute compliance tests
8.6.1 Giardia and Cryptosporidium testing
The log credits derived in the LT2ESWTR (USEPA 2006a) for the various treatment processes
used for protozoa removal or inactivation were based on the use of Methods 1622 and 1623
(USEPA 2001b and 2001c). Method 1623 was also used when categorising raw waters according
to the number of Cryptosporidium present. The latest version of Method 1623 appears in
USEPA (2005b).
The USEPA developed Method 1622 (detects Cryptosporidium) and 1623 (detects
Cryptosporidium and Giardia) to achieve higher recovery rates and lower inter- and intra-laboratory variability than previous methods. These methods incorporate improvements in the
concentration, separation, staining, and microscope examination procedures.
Specific improvements include the use of more effective filters, immunomagnetic separation
(IMS) to separate the oocysts and cysts from extraneous materials present in the water sample,
and examination based on immunofluorescence assay (IFA), 4′,6-diamidino-2-phenylindole
(DAPI) staining results, and differential interference contrast (DIC) microscopic analysis for
determination of oocyst concentrations.
The performance of these methods was tested through single-laboratory studies and validated
through round robin studies. To assess method recovery, matrix spike samples were analysed on
five sampling events for each plant. The protozoa laboratory spiked the additional sample with a
known quantity of Cryptosporidium oocysts and Giardia cysts (the quantity was unknown to
the laboratory performing the analysis) and filtered and analysed both samples using Methods
1622/23. Recovery averaged 43 percent for Cryptosporidium with a relative standard deviation
of 47 percent (Connell et al 2000).
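Recovery and its relative standard deviation are calculated from the matrix spike results as below. This is a sketch only; the five recovery values are hypothetical (chosen to average 43 percent, matching the Connell et al figure), and the function names are not from the method documents:

```python
import statistics

def percent_recovery(oocysts_detected, oocysts_spiked):
    """Matrix spike recovery as a percentage."""
    return 100.0 * oocysts_detected / oocysts_spiked

def relative_std_dev(recoveries):
    """Relative standard deviation (percent) across spiking events."""
    return 100.0 * statistics.stdev(recoveries) / statistics.mean(recoveries)

# Hypothetical detected counts from five spiking events, each spiked with 100 oocysts
recoveries = [percent_recovery(d, 100) for d in [30, 55, 40, 62, 28]]
print(round(statistics.mean(recoveries)))  # 43 (percent mean recovery)
print(round(relative_std_dev(recoveries)))  # 35 (percent RSD for these values)
```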
Although Methods 1622 and 1623 have several advantages over the earlier method, they also
have some of the same limitations. These methods do not determine whether a cyst or oocyst is
viable or infectious, and both methods require a skilled microscopist and several hours of
sample preparation and analyses.
The minimum sample size for raw water categorisation purposes is 10 litres. The USEPA has
prepared draft guidance for sampling and testing (USEPA 2006b and 2006d). The final version
appears as USEPA (2006b).
The filters that have been approved in LT2ESWTR (USEPA 2003a) for use with Methods 1622
and 1623 (USEPA 2005b) are the Pall Gelman Envirochek™ HV filter, the IDEXX Filta-Max™ foam filter, and the Whatman Nucleopore CrypTest™ filter (product subsequently withdrawn
from the market).
Methods 1622 and 1623 include fluorescein isothiocyanate (FITC) as the primary antibody stain
for Cryptosporidium detection, DAPI staining to detect nuclei, and DIC to detect internal
structures. For the purpose of raw water categorisation, water suppliers must report total
Cryptosporidium oocysts as detected by FITC as determined by the colour (apple green or
alternative approved stain colour), size (4–6 microns) and shape (round to oval). This total
includes all of the oocysts identified as described here, less atypical organisms identified by
FITC, DIC, or DAPI (eg, possessing spikes, stalks, appendages, pores, one or two large nuclei
filling the cell, red fluorescing chloroplasts, crystals, spores, etc).
Methods 1622 and 1623 require matrix spike samples; one matrix spike sample must be
analysed for each 20 source water samples. The volume of the matrix spike sample must be
within 10 percent of the volume of the unspiked sample that is collected at the same time, and
the samples must be collected by splitting the sample stream or collecting the samples
sequentially. The matrix spike sample and the associated unspiked sample must be analysed by
the same procedure. Matrix spike samples must be spiked and filtered in the laboratory.
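The two quantitative quality control rules in this paragraph (one spike per 20 source water samples, and spiked volume within 10 percent of the unspiked volume) can be sketched as simple checks (illustrative helper names, not part of Methods 1622/1623):

```python
def matrix_spikes_required(source_samples: int) -> int:
    """One matrix spike sample per 20 source water samples (rounded up)."""
    return -(-source_samples // 20)  # ceiling division

def spike_volume_acceptable(spiked_l: float, unspiked_l: float) -> bool:
    """Spiked volume must be within 10 percent of the unspiked sample volume."""
    return abs(spiked_l - unspiked_l) <= 0.10 * unspiked_l

print(matrix_spikes_required(24))           # 2 spikes needed for 24 samples
print(spike_volume_acceptable(10.8, 10.0))  # True: within 10 percent of 10 L
print(spike_volume_acceptable(11.5, 10.0))  # False: 1.5 L difference exceeds 1.0 L
```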
Laboratories must also meet the quality control requirements in Methods 1622 and 1623. For
compliance testing, laboratories need to be accredited by IANZ.
8.6.2 Alternatives to Giardia and Cryptosporidium testing
Because it is impractical to use Cryptosporidium or Giardia testing of water treatment plants to
demonstrate compliance with the protozoal MAV in the DWSNZ, various operational
performance requirements are used instead. These include turbidity (or particle counting),
direct integrity testing, indirect integrity testing, pressure differential, UV intensity, and C.t
values for ozone and chlorine dioxide disinfection.
8.6.2.1 Turbidity measurement
The water industry has used turbidity as a marker of consistency and quality of water effluent
from water treatment plant filters. In the DWSNZ, turbidity monitoring is an operational
requirement for bacterial and protozoal compliance. An increase in turbidity measurement is
perceived as deterioration in the performance of the treatment process, with a potential for the
breakthrough of pathogens from the filters, or a reduction in disinfection efficacy. Conversely, as
a generalisation, the lower the turbidity, the lower the pathogen risk.
Nephelometry is the only method of determination to be used for turbidity measurements. It is a
method-defined parameter that can detect the presence of a wide variety of particles in water
(eg, clay, silt, mineral particles, organic and inorganic matter, and micro-organisms).
Turbidity is not a direct measurement of suspended particles in water. Turbidimeters detect the
intensity of light scattered from particles at one or more angles to an incident beam of light. The
angular distribution of scattered light depends on a number of conditions, including the
wavelength of the incident light, as well as particle size, shape, and composition. It is difficult to
correlate the turbidity with the number or concentration of particles in suspension. The results
are expressed in nephelometric turbidity units (NTU). Other methods of measurement use
different principles of measurement and yield results in units that cannot be converted to NTU.
The design of nephelometric instruments should take into account the physics of scattered light.
Small particles less than one-tenth of the light wavelength will scatter light uniformly in both
forward and backward directions. As the particle size approaches and exceeds the wavelength of
the incident light, more light is transmitted in the forward direction. Because of this intensity
pattern, the angle at which the light is measured is a critical factor; the current international
standards (eg, USEPA Method 180.1 and ISO 7027) have determined the most appropriate angle
to be 90 degrees.
Turbidimeters with scattered light detectors at 90° to the incident beam are called
nephelometers. Hach instruments satisfy USEPA Method 180.1. The Great Lakes Instrument
model Accu4 Turbidity system operates in accordance with GLI Method 2 and ISO 7027 (1999).
Two types of online turbidimeters may have the capability to measure low levels of turbidity:
conventional, and laser turbidimeters. Conventional turbidimeters typically use a tungsten lamp or a light-emitting diode (LED) as a light source; laser turbidimeters use a laser light source.
Because laser turbidimetry is a relatively new technique, its effectiveness as a monitoring tool is
still being evaluated. Recent research indicates that laser turbidimeters are more sensitive than
conventional turbidimeters and may perform comparably to particle counters. Manufacturer
specifications indicate that laser turbidimeters may have increased sensitivity in excess of two
orders of magnitude over conventional turbidimeters. There are cases where laser turbidimeters are measuring drinking-water quite successfully at less than 0.05 NTU, whereas it becomes increasingly difficult to maintain reliable readings from conventional turbidimeters below 0.20 NTU. Some laser
turbidimeters measure in mNTU (milliNTU). Laser turbidimetry is covered by USEPA Method
10133.
Research indicates laser turbidimeters can be optimised to measure very low turbidities. Since
most microfiltration and ultrafiltration systems produce filtrate water consistently in the range
of 0.03 to 0.07 NTU as measured by conventional turbidimeters, laser turbidimeters or particle
counters (see section 8.2.2.2) may be better suited for monitoring membrane filtrate.
Currently, there are four USEPA-approved analytical methods for the measurement of turbidity.
These are as follows:
• USEPA Method 180.1, Determination of Turbidity by Nephelometry (USEPA 1993a)
• Great Lakes Instrument Method 2 (called GLI 2 or USEPA Method 180.2) (USEPA 1993b)
• Hach FilterTrak Method 10133 (also called USEPA Method 10133) (USEPA 2002)
• Standard Method 2130B, Turbidity – Nephelometric Method (American Public Health Association (APHA) 2005, Method 2130B).
The DWSNZ also allow turbidimeters that comply with ISO 7027 to be used for compliance
monitoring, and any other system approved by the USEPA for drinking-water compliance
monitoring. Some ISO 7027 instruments read turbidity as formazin nephelometric units (FNU)
which are equivalent to NTU although derived from different measurement techniques.
Guidance on the installation, standardisation, operation, and maintenance of online
turbidimeters is usually provided by the manufacturer but is also provided in the Guidance
Manual for Compliance with the Interim Enhanced Surface Water Treatment Rule: Turbidity
Provisions (USEPA 1999). Section 3.2 of DWSNZ covers some general compliance requirements
for continuous monitoring. Appendix A2.4 of the DWSNZ discusses standardisation and
verification of online, bench top and portable turbidimeters.
Light sources
Tungsten lamps fitted with monochromators and filters, diodes and lasers may be used as
sources of monochromatic radiation. However, some older apparatus fitted with tungsten
lamps, but without monochromators or filters, are still in use (polychromatic sources) and,
while the reproducibility of such apparatus may be less than that of apparatus providing
monochromatic radiation, they can be used for the daily control and monitoring of turbidity at
waterworks and treatment plants. Results cannot, however, be compared when using different
apparatus. Lamps used by ISO 7027 instruments must emit incident light at 860 nm with a spectral bandwidth of less than 60 nm. The detector and filter system (if used) conforming to USEPA 180.1 must measure between 400 and 600 nm.
Tungsten light sources are generally more sensitive to small particles but sample colour
interferes, particularly as the turbidity increases; LED light sources are not as sensitive to small
particles but are not likely to have colour interferences. However, colour in good quality
drinking-water should have a minimal effect on turbidity measurements. Some laboratory and
portable turbidimeters using the tungsten filament lamp employ a ratio optical system to
compensate for colour.
Bubble trap
Not all process turbidimeters use bubble traps. Some use baffles or positive pressure to reduce
bubbles. Some use vacuum. Bubble traps have varied efficiencies depending on the installation
and the instrument. Bubble traps should not be added to some models; check with the
manufacturer. Generally, a slow flow rate will assist in better bubble removal, and hence more
accurate results. However, slow flow rates are not always ideal, ie, particles may settle out.
Conversely, high flow rates may scour build-up off pipe surfaces.
With reference to the membrane filtration process, turbidimeters are subject to air entrainment
error. Any air bubbles introduced into the system during production, backwashing, chemical
cleaning, or integrity testing may artificially increase the turbidity reading. After backwash or
chemical cleaning (particularly if air is used in the process), turbidity measurements may not be
representative of filtrate quality until any entrained air is purged from the system. This purge
time will vary between different filtration systems and their respective operations. Bubble traps
may be used with conventional and laser turbidimeters to minimise or eliminate this error.
Sampling (ETV 2002)
The method for collecting grab samples shall consist of running a slow, steady stream from the
sample tap, triple-rinsing a dedicated sample beaker in this stream, allowing the sample to flow
down the side of the beaker to minimise bubble entrainment, double-rinsing the sample vial
with the sample, carefully pouring from the beaker down the side of the sample vial, wiping the
sample vial clean, inserting the sample vial into the turbidimeter, and recording the measured
turbidity. In the case of cold water samples that cause the vial to fog preventing accurate
readings, the vial shall be allowed to warm up by partial submersion in a warm water bath for
approximately 30 seconds.
It is possible to have extremely good correlation between online and laboratory turbidimeters.
Correct technique is extremely important with both standardisation and sample measurement.
Sample cells are a well-known source of error. Scratches, stray light, dirt, fingerprints,
orientation, etc contribute to these errors.
Standardisation and verification
Standardisation of turbidimeters (sometimes called calibration) comprises three components:
a) standardisation (or primary standardisation)
b) verification (or secondary or check standardisation)
c) zero calibration (or zero check).
Primary standardisation
Bench top turbidimeters may be used for compliance testing of manual samples in laboratories
recognised by the Ministry of Health. The turbidimeter must be standardised and used
according to the conditions of their accreditation. Otherwise standardisation of bench top and
portable instruments should be performed to manufacturer’s recommendations. With the
availability of stabilised formazin standards (StablCal) any errors from making or diluting
standards have been reduced significantly. However, care of the cells still must be observed
rigidly.
Appendix A2.4 of the DWSNZ states that standardisation of bench top turbidimeters used as
field instruments, and portable and online turbidimeters must be undertaken by personnel
approved to do so by the DWA, and in accordance with the instrument manufacturer’s specified
procedures and frequency or three-monthly whichever is more frequent. Standardisation must
be performed using traceable standards such as StablCal (Hach) or PrimeTime (HF Scientific)
(or other MoH-approved stabilised formazin preparation); or AMCO-AEPA-1 styrene
divinylbenzene microsphere suspensions (Advanced Polymer Systems). Alternatively, user-diluted formazin preparations may be used provided:
1 the calibration point is 20 NTU or greater
2 the 4000 NTU formazin preparation is obtained from a quality certified manufacturer
3 the dilution is done immediately prior to use for calibration.
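The dilution itself is a straightforward C1·V1 = C2·V2 calculation from the 4000 NTU stock. A minimal sketch (the function is illustrative; it simply enforces the 20 NTU minimum calibration point stated above):

```python
def stock_volume_ml(target_ntu: float, final_volume_ml: float,
                    stock_ntu: float = 4000.0) -> float:
    """Volume of formazin stock needed for a user-diluted standard (C1*V1 = C2*V2).
    The DWSNZ require the calibration point to be 20 NTU or greater."""
    if target_ntu < 20:
        raise ValueError("calibration point must be 20 NTU or greater")
    return target_ntu * final_volume_ml / stock_ntu

# A 20 NTU standard made up to 100 mL needs 0.5 mL of 4000 NTU stock
print(stock_volume_ml(20, 100))  # 0.5
```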
For field testing, the quality assurance procedures associated with standardisation and
verification must be approved by the DWA.
The StablCal, PrimeTime (both stabilised formazin preparations) and AMCO-AEPA-1 standards
can only be used before their expiry dates, where applicable.
The user-diluted standard must be made from a stock 4000 NTU formazin standard diluted
with low turbidity water. ‘Turbidity-free’ water should be prepared as specified by the
instrument manufacturer, or by APHA (2005), or in the method of determination being
followed. Formazin is the only standard that can be prepared reproducibly from traceable raw
materials. The particle size distribution of formazin is 0.01 to 10 μm, similar to the particle size
distribution found in most natural water samples.
Only formazin, stabilised formazin, and styrene divinylbenzene (SDVB) standards are accepted
for reporting by the USEPA. SDVB standards are microscopic beads with narrow size
distribution. However, due to the mono-dispersed nature of the size distribution, incident light
may over-scatter into the forward direction and result in inaccurate calibrations. Therefore,
SDVB standards are instrument-specific, and some manufacturers may recommend not using
them.
Why standardise at 20 NTU? (ie, when using formazin suspensions)
Making an accurate low-level turbidity standard is extremely difficult and fraught with error. The relationship between nephelometric detector response and turbidity is highly linear in the range of 0 to 40 NTU, provided colour is absent or very low. This linearity means only two points are needed for standardising over this range. Criteria are:
• 20 NTU is the midpoint
• the standard is prepared easily with a high degree of accuracy
• accuracy is maintained from the standard to the lowest measurement levels because the relationship between nephelometric light scatter and turbidity is linear
• errors due to stray light are negligible at 20 NTU and do not affect the level of accuracy of the standard curve.
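Because the response is linear over 0 to 40 NTU, a blank plus a single 20 NTU standard fixes the whole calibration line. A minimal sketch, assuming hypothetical raw detector-signal units (this is not a manufacturer's procedure):

```python
def two_point_calibration(blank_signal: float, std_signal: float,
                          std_ntu: float = 20.0):
    """Return a function mapping detector signal to NTU, assuming the
    nephelometric response is linear between 0 and 40 NTU."""
    slope = std_ntu / (std_signal - blank_signal)
    return lambda signal: slope * (signal - blank_signal)

# Hypothetical readings: blank gives 2.0 signal units, 20 NTU standard gives 402.0
to_ntu = two_point_calibration(blank_signal=2.0, std_signal=402.0)
print(to_ntu(12.0))   # 0.5 NTU for a low-turbidity sample
print(to_ntu(402.0))  # 20.0 at the standard
```

Accuracy at low turbidities is carried down from the 20 NTU point by this linearity, which is why the difficult task of making a low-level standard is avoided.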
Verification of process instruments
Verification that the performance of portable instruments has not changed since standardisation
must be carried out daily, or each time the instrument is switched on.
Verification of online turbidimeters must be carried out at least weekly using the manufacturer’s
secondary or check standard. If the instrument reading is outside the limits specified for the
check standard, then that instrument must be restandardised using the standardisation method;
or replaced. Table 8.8 summarises some methods for the standardisation and verification of
turbidimeters.
Check standards have been developed by manufacturers to duplicate light scatter. The devices
are instrument specific and are usually traced back to formazin. Their use for standardisation is
not acceptable for compliance reporting by the USEPA (or in the DWSNZ), ie, must not be used
for standardisation. However, they can be used for verification, ie, QA purposes.
Table 8.8: Summary of some methods for standardisation and verification of turbidimeters

Standard | Type | Particle size | Comments
User-prepared formazin 4000 NTU | Standardisation | 0.01 to 10.0 μm | May be difficult to provide traceability. Lower dilution limit is 2 NTU.
Commercially prepared formazin 4000 NTU | Standardisation | 0.01 to 10.0 μm | Test performed by manufacturer to ensure complete reaction and therefore able to provide traceability. Lower dilution limit is 2 NTU.
Stabilised formazin (eg, StablCal, PrimeTime) | Standardisation or verification | 0.01 to 10.0 μm | Traceable. Standards ready to use down to 0.10 NTU.
SDVB | Standardisation or verification | 0.1 to 1 μm | Not recommended for standardisation. Can be used for verification below 1.0 NTU. Instrument specific.
Optomechanical check standard | Verification | NA | Instrument-specific standards. Verification down to 0.5 NTU.
These devices, along with other check standards (eg, latex or gel) need to be referenced back to a
formazin standard on a regular basis. It is recommended to standardise each turbidimeter, then
to insert the check standard and record the value for the specific turbidimeter. Check standards
need to be assigned new values after each comparison with the standard because they may
have become scratched, or because the optical system of the instrument may have changed, eg,
due to dust or lamp aging.
Verification that the performance of the instrument has not changed since standardising must
be carried out on:
• online turbidimeters: weekly, or after any interruption to continuous reading
• manual turbidimeters: daily, or each time it is switched on.
If the value displayed is outside the specified limits (commonly ±10 percent) of the reference
value previously established from standardisation, eg, more than 1.98 NTU from a reference
value of 19.8 NTU, then a new primary calibration is needed.
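The ±10 percent verification check described above can be sketched as follows (a minimal illustration; the tolerance and reference value come from the example in the text, and the function name is our own):

```python
def needs_recalibration(reading_ntu, reference_ntu, tolerance=0.10):
    """Return True if a verification reading falls outside the allowed
    band around the reference value (commonly +/-10 percent), in which
    case a new primary calibration is needed."""
    allowed_deviation = tolerance * reference_ntu
    return abs(reading_ntu - reference_ntu) > allowed_deviation

# Example from the text: reference 19.8 NTU, so the limit is +/-1.98 NTU
print(needs_recalibration(21.9, 19.8))  # outside the band -> True
print(needs_recalibration(20.5, 19.8))  # within the band -> False
```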
Guidelines for Drinking-water Quality Management for New Zealand 2013
293
Verification of laboratory and field instruments
If a check standard is not available, the instrument should be verified against a laboratory
instrument that has been standardised recently. This will involve checking the range of
turbidities that the samples will fall into. This is not the preferred method because section 5
(page 2–11) of APHA (2005) says to report to the nearest 0.05 NTU if the turbidity is in the 0–1
NTU range; this is not very practical, particularly if the water being monitored has a turbidity
less than 0.20 NTU. However, ISO 7027 allows readings less than 0.99 NTU to be reported to
the nearest 0.01 NTU, which is more appropriate for the newer instruments.
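The two reporting conventions mentioned above differ only in the rounding increment; a minimal sketch (the function name is ours):

```python
def round_to_increment(value_ntu, increment):
    """Round a turbidity reading to the nearest reporting increment,
    eg, 0.05 NTU (APHA 2005, for the 0-1 NTU range) or 0.01 NTU
    (ISO 7027, for readings below 0.99 NTU)."""
    steps = round(value_ntu / increment)
    return round(steps * increment, 2)

print(round_to_increment(0.137, 0.05))  # APHA-style reporting -> 0.15
print(round_to_increment(0.137, 0.01))  # ISO-style reporting  -> 0.14
```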
Zero check
Some instruments automatically ‘fix’ the zero point. This is sometimes done by turning off the
incident light. Others require a ‘turbidity-free’ blank. Although it is impossible to produce
‘turbidity-free’ water, it is possible to produce a blank with a turbidity of about 0.02 NTU; when
standardising with 20/40 NTU formazin the small error at the zero point is insignificant.
Current thinking is that zero turbidity is impossible: due to molecular scatter in particle-free
water, the lowest achievable turbidity is thought to be 0.010 to 0.015 NTU.
Monitoring and reporting
The DWSNZ stipulate that for online monitors:
• the signal averaging time is to be one minute or less
• where discrete readings are recorded, the interval between readings is not to be more than one minute.
Rounding of readings from manual instruments is discussed above. Online
instruments report readings ‘as is’ where compliance requires that a maximum turbidity is never
exceeded. Where compliance is related to the percentage of time, turbidimeters report the
percentage of time the turbidity is non-complying, rather than absolute values.
Ensure that the analyser 4–20 mA output span matches the PLC/SCADA input scaling. Check
that the SCADA display mimics the instrument display. See Chapter 17: Monitoring,
section 17.4.4 for a discussion on reporting results and storing data, and section 17.6 for how to
compare test results against a MAV or operational requirement.
Turbidity measurement
With practice, it is not unrealistic to expect reasonably good correlation between most good
laboratory turbidimeters and process turbidimeters if based on the same primary standard.
When measuring low-level turbidity in the laboratory, care must be taken to eliminate errors.
Sample cells are the biggest source of error in turbidity measurements. Cells should be matched
or indexed, clean, and scratch free. The appropriate silicone oil should be applied to mask any
scratches that the eye cannot see.
Bubbles can be removed by applying a vacuum to the cell. A syringe with a rubber bung is an
easy solution to this.
To avoid any problems with condensation on the cell, allow the sample to reach room
temperature; this will also assist with bubble dissipation.
Measurement of low level turbidity
An important aspect of awarding additional protozoa removal credits for lower finished water
turbidity is the performance of turbidimeters when measuring turbidity below 0.3 NTU or even
below 0.10 NTU. The following paragraphs from the proposed LT2ESWTR (USEPA 2003a)
summarise results from several studies that evaluated low level measurement of turbidity by
different online and bench top instruments. The USEPA believes that results from these studies
indicate that currently available turbidity monitoring equipment is capable of reliably assessing
turbidity at levels below 0.10 NTU, provided the instruments are approved for compliance
monitoring, are well-calibrated and well-maintained.
A performance evaluation (USEPA 1998) was carried out to address concern regarding the
ability to reliably measure low turbidity levels. The study involved distribution of different types
of laboratory-prepared standard solutions with reported turbidity values of 0.150 NTU or
0.160 NTU. The data indicated a positive bias for all instruments when compared against a
reported true value. Online instruments in this study had a larger positive bias and higher
standard deviation (RSD approximately 50 percent). The positive bias is consistent with
previous studies (USEPA 1998) and suggests that error in turbidimeter readings may be
generally conservative (ie, water supplies will operate at lower than required filtered water
turbidity levels).
Letterman et al (2001) evaluated the effect of turbidimeter design and calibration methods on
inter-instrument performance, comparing bench top with online instruments and instruments
within each of those categories from different manufacturers. Reported filtered water turbidity
values ranged from 0.05 to 1.0 NTU. The results were consistent with those of the earlier study,
specifically the positive bias of online instruments. Letterman et al found generally poor
agreement among different online instruments and between bench-top and online instruments.
The authors observed that results were independent of the calibration method, though certain
experiments suggested that analyst experience might have had some effect on turbidity readings
from bench-top instruments.
Sadar (1999) conducted an intra-instrument study of low level turbidity measurements among
instruments from the same manufacturer. This study was performed under well-controlled
laboratory conditions. Intra-instrument variation among different models, and between bench
top and online instruments, occurred but at significantly lower levels than the Letterman et al
inter-instrument study. Newer instruments also tended to read lower than older instruments,
which the author attributed to a reduction in stray light and lower sensitivities in the newer
instruments. Sadar also found a generally positive bias when comparing online with bench-top
and when comparing all instruments with a prepared standard.
The American Society for Testing and Materials (ASTM) has issued standard test methods for
measurement of turbidity below 5 NTU by online (ASTM 2001) and static (ASTM 2003)
instrument modes. These standards are not used very often in New Zealand. The methods
specify that the instrument should permit detection of turbidity differences of 0.01 NTU or less
in waters having turbidities of less than 1.00 NTU (ASTM 2001) and 5.0 NTU (ASTM 2003),
respectively. Inter-laboratory study data included with the method for a known turbidity
standard of 0.122 NTU show an analyst relative deviation of 7.5 percent and a laboratory relative
deviation of 16 percent (ASTM 2003).
In summary, the data collected in these studies indicate that currently available monitoring
equipment can reliably measure turbidity at levels of 0.10 NTU and lower. This requires
rigorous standardisation and verification procedures, as well as diligent maintenance of
turbidity monitoring equipment (Burlingame 1998, Sadar 1999). Systems that pursue additional
protozoal credit for lower finished water turbidity must develop the procedures necessary to
ensure accurate and reliable measurement of turbidity at levels of 0.10 NTU and less.
8.6.2.2 Particle counting and particle monitoring
A simple conversion factor relating particle counting and turbidity measurements is not possible
because the two techniques differ fundamentally in terms of discernment. Particle counting
measures two characteristics of particulates: numbers of particles and particle size. Samples
with identical clarity can be distinguished on the basis of these two features; one sample may
contain many small particles, whereas another may contain few large particles. Turbidity, on the
other hand, cannot distinguish between two samples of identical clarity and different particulate
composition.
Online particle counters use a laser-based light scattering technique to count particles and
group them according to size.
Particle monitors operate on the principle of light obstruction; however, rather than
counting particles and grouping them by size, particle monitors measure particulate water
quality on a dimensionless scale relative to an established baseline. The instrument measures
fluctuations in intensity of a narrow light beam that is transmitted through the sample. The
monitor does not count particle sizes, but provides an index (ranging from 0 to 9999) of the
water quality. No calibration is required for this instrument since the output is a relative
measurement of water quality. The potential advantages of this monitor are its low cost and ease
of operation compared with particle counters, but little information has been published
regarding the use of particle monitors in potable water treatment applications.
Particle counters convey information about particle size. Any significant increase in the number
of particles exceeding 3 micrometres may indicate that a breach has occurred, allowing the
passage of Cryptosporidium oocysts. Any particle counters that are used for the purpose of
filtrate monitoring to satisfy the continuous indirect integrity monitoring requirement should be
calibrated to detect particles in the size range of Cryptosporidium oocysts (ie, 3 to 8
micrometres).
Although Adham et al (1995) determined that particle counting was the most sensitive of the
three common methods of continuous indirect integrity monitoring (ie, conventional turbidity
monitoring, particle counting, and particle monitoring), particle counting instruments have a
number of well-established operational problems that potentially can distort both the accuracy
and precision of their measurements.
Either particle counting or particle monitoring may be used for compliance with the continuous
indirect integrity monitoring requirements of the DWSNZ, ie, for membrane, cartridge or bag
filtration. They can also substitute for monitoring the turbidity of water leaving a filter where a
coagulation process is used. Due to the range of instruments, and the small amount of use of the
technique in New Zealand, details have not been prescribed in the DWSNZ. Water suppliers
wishing to use particle counting need to submit their monitoring plan to the DWA for approval.
Both the International Organisation for Standardisation (ISO) and the American Society for
Testing and Materials (ASTM) have published standards relating to particle counting
techniques, as follows:
• ISO 11500 – Hydraulic fluid power – Determination of particulate contamination by automatic counting using the light extinction principle (1999)
• ISO 11943 – Hydraulic fluid power – Online automatic particle counting systems for liquids – Methods of calibration and validation (1999)
• ASTM F658-00a – Standard Practice for Calibration of a Liquid-Borne Particle Counter Using an Optical System Based Upon Light Extinction (2000).
In addition, there are some relevant references on the use of particle counters in water
treatment applications that may serve as a useful source of additional information:
• Fundamentals of Drinking Water Particle Counting. AWWARF 2000
• Particle Count Method Development for Concentration Standards and Sample Stabilisation. American Water Works Association Research Foundation, AWWARF 2000.
The former publication was prepared by Erica Hargesheimer, and is available from NZWWA. It
is very thorough and covers virtually everything that is needed for deciding whether to use
particle counting, and how to use it, once installed.
Advantages of particle counting and particle monitoring are generally similar, and include the
following:
• more sensitive to smaller integrity breaches than conventional turbidimeters
• widely used in surface water treatment plants (particle counters)
• absolute (as opposed to relative) measure of water quality (particle counters)
• ability to yield information regarding test resolution (particle counters).
Limitations of particle counting and particle monitoring include:
• imprecision between instruments at low particulate concentrations
• susceptible to air entrainment error
• susceptible to coincidence and clogging error at higher particle concentrations
• more expensive instrumentation (particle counters) than conventional turbidimeters
• only relative measure of water quality (particle monitors)
• more operation and maintenance support needed than conventional turbidimeters.
Online particle sensors must have capabilities for measurement of particles as small as
2 microns and have a coincidence error of less than 10 percent. The resolution should be at least
10 percent at 10 microns. Flow control shall be within 5 percent of the designed rate.
The particle counter must be delivered to the water supplier pre-standardised, and subsequent
standardisations shall be performed at least every 18 months. The particle counter manufacturer
or an independent third party shall provide data and methods demonstrating that the online
particle sensors meet these criteria.
APHA (2005) includes method 2560 on the use of particle counters and size distribution, with a
section on quality control.
Two time intervals are important with online particle counters:
• the count time or interval, which is the time the instrument takes to analyse a sample
• the count frequency, which is the time between samples.
Count times are typically 30–60 seconds, and count frequencies are typically 5–6 minutes
(AWWARF 2000a).
There are many variables related to particle counters. Ideally the following should be satisfied:
• the particle counter is an optical particle counter
• standardisation to be done in accordance with the manufacturer’s specification, but not less than every 18 months
• resolution to be 10 percent at 10 microns or better
• sensitivity to be 2 microns
• range from 2 to 400 microns
• flow cell size must be not less than 700 microns
• the flow cell must be of the volumetric type
• coincidence to be 10 percent at 16,000 particles per mL
• must be operated with an appropriate quality assurance programme
• minimum two channels (most affordable for small systems)
• flow control shall be at the manufacturer’s specified flow rate ±5 percent or better
• plumbed to minimise interference due to air bubbles
• must be properly maintained (cleaned) to the manufacturer’s specification
• tubing must be replaced with approved tubing annually.
8.6.2.3 Direct integrity test (membrane filtration)
Because it is impractical to use Cryptosporidium testing of membrane filtration plants to
demonstrate compliance with the protozoal MAV in the DWSNZ, a performance requirement is
used instead. Direct integrity testing is described in the Membrane Filtration Guidance Manual
(USEPA 2003c, 2005a), in Chapter 4 (50 pages). However, the supplier or manufacturer of the
plant, in accordance with the conditions of their validation, will dictate the test to be used by any
membrane filtration plant owner.
The following has been taken from the introduction to direct integrity testing in the Membrane
Filtration Guidance Manual.
In order for a membrane process to be an effective barrier against pathogens and other
particulate matter, the filtration system must be free of any leaks or defects resulting in an
integrity breach. Thus, it is critical that operators are able to demonstrate the integrity of this
barrier on an ongoing basis during system operation. Direct integrity testing represents the
most accurate means of assessing the integrity of a membrane filtration system that is currently
available.
A direct integrity test is defined as a physical test applied to a membrane unit in order to identify
and isolate integrity breaches. The removal efficiency of a membrane filtration process must be
verified routinely during operation using direct integrity testing. This must be applied to the
physical elements of the entire unit, including membranes, seals, potting material, associated
valves and piping, and all other components which could result in contamination of the filtrate
under compromised conditions.
There are two general classes of direct integrity tests commonly used in membrane filtration
facilities: pressure-based tests and marker-based tests. The pressure-based tests are based on
bubble point theory and involve applying a pressure or vacuum to one side of a membrane
barrier and monitoring for parameters such as pressure loss or the displacement of air or water
in order to establish whether an integrity breach is present.
The various pressure-based tests include the pressure- and vacuum decay tests, the diffusive
airflow test, and the water displacement test.
Marker-based tests use either a spiked particulate or molecular marker to verify membrane
integrity by directly assessing removal of the marker, similar to a challenge test.
The direct integrity test used must meet the specified performance criteria for resolution,
sensitivity, and frequency. Thus, a water supply may use an appropriate pressure- or
marker-based test or any other method that both meets the performance criteria and is approved
by the state. The performance criteria for direct integrity tests are summarised as follows:
• resolution: the direct integrity test must be responsive to an integrity breach of the order of 3 micrometres or less
• sensitivity: the direct integrity test must be able to verify a log removal equal to or greater than the removal credit awarded to the membrane filtration
• frequency: a direct integrity test must be conducted on each membrane unit at a frequency of no less than once every 24 hours of operation.
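These three criteria can be expressed as a simple pass/fail check; the following is a sketch under our own naming, not a prescribed implementation:

```python
def direct_integrity_test_ok(resolution_um, test_lrv, credited_lrv,
                             hours_since_last_test):
    """Check a direct integrity test against the three performance
    criteria: resolution (responsive to breaches of about 3 micrometres
    or less), sensitivity (verifiable log removal value >= the awarded
    removal credit), and frequency (tested at least once every 24 hours
    of operation)."""
    resolution_ok = resolution_um <= 3.0
    sensitivity_ok = test_lrv >= credited_lrv
    frequency_ok = hours_since_last_test <= 24
    return resolution_ok and sensitivity_ok and frequency_ok

print(direct_integrity_test_ok(3.0, 4.5, 4.0, 20))  # meets all criteria -> True
print(direct_integrity_test_ok(3.0, 3.5, 4.0, 20))  # sensitivity short -> False
```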
In addition to the performance criteria, the rule also requires the establishment of a control
limit for the direct integrity test that is indicative of an integral membrane unit capable of
achieving the removal credit awarded. If the results of the direct integrity test exceed this limit,
the rule requires that the affected membrane unit be taken offline for diagnostic testing and
repair.
See Chapter 14: Filtration Processes, section 14.4.5 for a discussion on some operational aspects
of direct integrity testing.
8.6.2.4 Indirect integrity testing (continuous)
Indirect integrity testing is used for membrane, bag and cartridge filtration. Note that bag and
cartridge filtration does not currently undergo direct integrity testing for compliance with the DWSNZ.
The various indirect integrity monitoring methods are not physical tests applied specifically to a
filter, but involve monitoring some aspect of filtrate quality as a surrogate measure of integrity.
These are discussed in the 17 pages of Chapter 5 of the Membrane Filtration Guidance Manual
(USEPA 2003c, 2005a). The information therein also applies to bag and cartridge filtration.
Most direct integrity tests require the membranes to be taken offline, so direct tests are limited
to periodic application. A failed direct integrity test indicates that an integrity breach occurred at
some time between the most recent direct test in which integrity has been verified and the failed
test, but indicates nothing about integrity over the period between direct test applications.
Currently, continuous direct integrity monitoring techniques are not available.
Continuous indirect integrity testing is defined as monitoring some aspect or component of
filtrate quality that is indicative of the removal of particulate matter. The DWSNZ specify
turbidity monitoring of the filtrate as the default methodology for continuous indirect integrity
monitoring.
Turbidity was selected because it is an accepted monitoring technology within the water
treatment industry, and it is used as both a relative and an absolute indicator of water quality.
However, because particle counting is more sensitive than turbidity monitoring, the DWSNZ
contain a provision to allow it as an alternative.
Other surrogate measures of integrity are possible, such as dissolved solids or conductivity for
the indirect integrity monitoring requirements for nanofiltration and reverse osmosis.
This chapter of the Membrane Filtration Guidance Manual is divided into the following sections:
• Section 5.2: Turbidity monitoring
• Section 5.3: Particle counting and particle monitoring
• Section 5.4: Other indirect monitoring surrogates
• Section 5.5: Data analysis and reporting.
Note that a non-continuous indirect method (eg, a silt density index (SDI) test) has limited value
for integrity monitoring, offering neither the ability to directly test the membranes nor the
advantage of online monitoring. As a result, non-continuous indirect methods do not satisfy the
indirect integrity monitoring requirements of the LT2ESWTR and are not addressed in this
chapter.
8.6.2.5 C.t values for chlorine dioxide and ozone
The C.t value is the residual concentration of disinfectant (C, in mg/L) multiplied by the
hydraulic residence time (t, in minutes). The origin of the concept is discussed in Chapter 15: Disinfection,
section 15.2.1. Refer to Chapter 15: Disinfection, section 15.2.9 for a discussion about contact
tanks, mixing conditions and measuring the residence time.
The procedure for measuring the hydraulic residence time (t) is described in Chapter 15:
Disinfection, section 15.2.9.
For chlorine dioxide (ClO2), this involves continuous measurement of the residual chlorine
dioxide (measured as ClO2) at a predetermined sample site, somewhere upstream of the first
consumer. When used for protozoal compliance, the test method must not include FAC. The
time the ClO2 has been in contact with the water up to that point is calculated. The product of
the two must exceed the value in the C.t table for the relevant water temperature. If the contact
tank also acts as a storage tank, the contact time must be adjusted for when the tank is not full.
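The compliance arithmetic described above is just a product and a table lookup; a minimal sketch (the required C.t value of 13 mg·min/L is an illustrative placeholder, not a value taken from the DWSNZ tables):

```python
def ct_complies(residual_mg_per_l, contact_time_min, required_ct):
    """Compare the achieved C.t (residual concentration multiplied by
    contact time) against the required value from the C.t table for
    the relevant water temperature and target log credit."""
    achieved_ct = residual_mg_per_l * contact_time_min
    return achieved_ct >= required_ct

# Illustrative only: a 0.5 mg/L residual held for 30 minutes, against a
# hypothetical required C.t of 13 mg.min/L
print(ct_complies(0.5, 30, 13))  # 15 >= 13 -> True
```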
For ozone, this involves continuous measurement of the residual ozone at a point in the ozone
contactor that has been validated by challenge testing as being able to achieve the required level
of inactivation (log credit) of test organisms.
Chapter 11 of the review draft LT2ESWTR Toolbox Guidance Manual (USEPA 2009) discusses
issues related to the measurement of contact time in different types of ozone generators and
contactors.
If no tracer study data are available for determining t, the USEPA recommends using the
continuous stirred tank reactor (CSTR) approach or the Extended-CSTR approach. The t10/t
ratios are based on baffle characteristics from hydraulic studies of clearwells and basins.
The guidance manual presents three methods for calculating C.t:
1. t10: calculates C.t through a contactor assuming hydraulic conditions similar to plug flow and can be used with or without tracer study data; t10 is the time by which only 10 percent of the water has passed through the contactor. Even in well-baffled contactors, the t10 is most often less than 65 percent of the average hydraulic detention time through the contactor, and generally underestimates the true C.t achieved.
2. CSTR: calculates log inactivation credit using the hydraulic detention time. It is applicable to contactors that experience significant back mixing or when no tracer study data are available.
3. Extended CSTR: a combination of the CSTR and segmented flow analysis approaches. It uses the hydraulic detention time for the contact time and incorporates the ozone decay rate to calculate the concentration. It is not applied to chambers into which ozone is introduced.
These methods differ in the level of effort associated with them and, in general, the ozone dose
required to achieve a given level of inactivation. Selecting the appropriate method(s) to use
depends on the configuration of the ozone contactor and amount of process evaluation and
monitoring that a water supplier wishes to undertake. Combinations of two or more methods
may also be used. For example, contactors with multiple segments may have one or two
segments with their C.t calculated using either the t10 or CSTR methods, while the C.t for the
remaining segment is calculated using the Extended-CSTR approach. The t10 and CSTR are the
simplest methods.
The sample site and collection requirements are described in USEPA (2009).
The ozone contactor will have been rated in validation tests by the manufacturer or supplier for
the water being treated, so the flow, ozone concentration at the monitoring point, and the
general operating conditions must be maintained at no worse than the conditions that applied in
the validation.
The product of the residual concentration and the contact time must exceed the value in the C.t
table for the relevant water temperature and target log credit.
8.6.2.6 UV intensity measurement
UV intensity sensors are photosensitive detectors that measure the UV intensity at a point within
the UV reactor. Sensors are used to indicate dose delivery by providing information related to UV
intensity at different points in the reactor. The measurement responds to changes in lamp output
due to lamp power setting, lamp aging, lamp sleeve aging, and lamp sleeve fouling. Depending on
sensor position, UV intensity sensors may also respond to changes in UV absorbance of the water
being treated. UV intensity sensors are composed of optical components, a photodetector, an
amplifier, a housing, and an electrical connector. The optical components may include monitoring
windows, light pipes, diffusers, apertures, and filters. Monitoring windows and light pipes are
designed to deliver light to the photodetector. Diffusers and apertures are designed to reduce the
amount of UV light reaching the photodetector, thereby reducing sensor degradation that is
caused by UV energy. Optical filters are used to modify the spectral response such that the sensor
only responds to germicidal wavelengths (200 to 300 nm).
The proposed Ultraviolet Disinfection Manual (USEPA 2003d) stated that UV appliances with MP
lamps should be equipped with one UV intensity sensor per lamp. USEPA (2006c) requires a
minimum of one UV sensor per UV reactor; the actual number should be identical to that in the
UV reactor that was, or will be, validated.
UV sensors are photosensitive detectors that measure UV intensity. UV sensors used in
drinking-water UV applications, particularly those with MP or other polychromatic lamps,
should be germicidal. Germicidal sensors are defined as having the following properties:
• a spectral response (ie, UV intensity measured at various wavelengths) that peaks between 250 and 280 nanometers (nm)
• less than 10 percent of its total measurement is due to light above 300 nm when mounted on the UV reactor and viewing the UV lamps through the water that will be treated.
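Given a measured spectral response, the two germicidal-sensor properties above amount to a peak-location check and a fraction-above-300-nm check; a sketch using invented sample data:

```python
def is_germicidal(spectral_response):
    """spectral_response: dict mapping wavelength (nm) to relative
    sensor response. Checks that the response peaks between 250 and
    280 nm and that less than 10 percent of the total response is due
    to light above 300 nm."""
    peak_nm = max(spectral_response, key=spectral_response.get)
    total = sum(spectral_response.values())
    above_300 = sum(r for nm, r in spectral_response.items() if nm > 300)
    return 250 <= peak_nm <= 280 and above_300 / total < 0.10

# Invented example response (relative units at selected wavelengths)
response = {240: 0.3, 254: 1.0, 270: 0.8, 290: 0.2, 310: 0.1}
print(is_germicidal(response))  # peak at 254 nm, ~4% above 300 nm -> True
```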
Section 5.16.3(2) of the DWSNZ specifies the requirements for calibration of UV intensity sensors.
Further information is included in section 8.4.4.3 (this chapter). Annexes A and B of ÖNORM
(2001) also cover sensor requirements.
8.6.2.7 Pressure differential
There are no generally accepted direct integrity tests for bag and cartridge filtration. The best
technique for assessing the efficiency in removing particles the size of protozoa is particle
counting; at present this is difficult and expensive. The commonest indirect integrity test,
turbidity, is often of little practical value in demonstrating bag and cartridge compliance
because:
• often the raw water has a very low turbidity, for example spring water with a turbidity of say 0.4 NTU. So an operational requirement of 0.5 NTU for compliance purposes could be achieved even if the filter is ruptured or being by-passed!
• some raw waters have a high turbidity due to very fine particles, for example due to silica from glacial flour. These particles may be 10–100 times smaller than protozoa, and will mostly pass through bag or cartridge filters, so are not a good indicator of the filter’s ability to remove protozoa.
Therefore another operational requirement is needed to show whether the filter is likely to be
performing satisfactorily. Experience has shown that monitoring the pressure differential across
the filter housing is a good indicator of performance. The gauge readings and electrical signals
from differential pressure transmitters can be verified by using a pressure meter. See
Chapter 14: Treatment Processes, Filtration, Figure 14.4 for a recommended layout.
Sections 5.12 and 5.13 of DWSNZ specify the pressure differential monitoring requirements.
8.6.2.8 Microscopic particulate analysis
DWSNZ 2000 included microscopic particulate analysis (MPA) as a method for assessing the
efficacy of cartridge, bag, diatomaceous earth and slow sand filters in removing protozoa. The
test has not been mentioned in subsequent DWSNZ. However, MPA can still be a useful tool,
either quantitatively or qualitatively. A good description of its usefulness appears in a paper by
Hancock (1999).
The methodology is described in Vasconcelos (1992), Harris et al (1996) and Hancock (1999).
Analysing raw and filtered samples allows the log reduction of organisms to be calculated.
Large volumes, say 2000 litres, of sample can be passed through a filter to trap the organisms.
This can be arranged in various ways, such as monitoring an entire filter run.
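The log reduction mentioned above is simply log10 of the ratio of raw to filtered counts; a minimal sketch with invented counts:

```python
import math

def log_reduction(raw_count, filtered_count):
    """Log10 reduction of organisms across a filter, calculated from
    matched raw and filtered sample counts (per unit volume)."""
    return math.log10(raw_count / filtered_count)

# Invented example: 5000 organisms/L in the raw water, 5/L after filtration
print(round(log_reduction(5000, 5), 1))  # -> 3.0 (a 3-log reduction)
```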
8.7 Transgressions
8.7.1 Response
Refer to Chapter 17: Monitoring, section 17.6 for a discussion on how to compare a test result
against a MAV or operational requirement.
In the DWSNZ, Figure 5.1: Response to turbidity transgression in water after treatment, and
Figure 5.2: Response to disinfectant (chlorine dioxide, ozone, UV) transgression in drinking-water
leaving a treatment plant, clarify at what stage the DWA is to be consulted, and at what
stage routine monitoring can be resumed.
The DWSNZ state that well-managed water supplies will have established a control limit for
each MAV or operational requirement. Control limits are discussed in Chapter 17: Monitoring.
The preventive actions that are to be considered when a control limit is being approached or has
been reached are to be documented in the PHRMP. The purpose of control limits and the
preventive actions is to avoid reaching the transgression level of the MAV or operational
requirement, thereby reducing the risk of non-compliance.
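The relationship between a control limit and the transgression level can be sketched as a simple two-tier check (the thresholds here are illustrative placeholders, not DWSNZ values):

```python
def assess_reading(value, control_limit, transgression_level):
    """Classify an operational reading against a control limit (set
    below the transgression level so that preventive action can be
    taken early) and the transgression level itself."""
    if value >= transgression_level:
        return "transgression: follow PHRMP response, consult the DWA"
    if value >= control_limit:
        return "control limit reached: take documented preventive action"
    return "normal operation"

# Illustrative thresholds for a filtered-water turbidity reading (NTU)
print(assess_reading(0.18, control_limit=0.20, transgression_level=0.30))
print(assess_reading(0.24, control_limit=0.20, transgression_level=0.30))
```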
Table 5.2 (in DWSNZ) lists the protozoal inactivation or removal processes. Operational aspects
of each of these are discussed in Chapters 12–15 of the Guidelines, where relevant. These
chapters offer water suppliers some guidance relating to preventive and remedial actions that
may be appropriate for inclusion in their PHRMPs. However, there are so many different raw
water qualities, treatment processes, modes of operation, and staffing arrangements that it is
not possible to cover all contingencies in these Guidelines.
Water suppliers’ PHRMPs must also document planned responses to events, other than failure to
satisfy the criteria in the DWSNZ, that will obviously lead to a protozoal transgression or
non-compliance. These will tend to be supply-specific but will include matters such as dealing with
emergencies such as power cuts, earthquakes and floods, as well as running out of coagulant or
disinfectant, failure of the filtration or disinfection system, disinfectant demand exceeding the
maximum dose rate, labour problems, breaches of security, and spills of wastewater or other
contamination. Apart from the obvious, these situations may be detected during catchment
assessments (ie, what affects the source water), sanitary inspections (of the water supply), or
bore head protection inspections.
Some general comments may offer helpful advice for when a control limit is reached or when a
transgression occurs:
• unusual weather or water temperatures may have caused raw water conditions to change to the extent that the treatment process is no longer effective (includes algal growth)
• upstream activities such as discharges, gravel extraction, or changes in land use may have modified the raw water quality
• the raw water is possibly being extracted from the wrong depth or position
• screening or other pretreatment may need inspection or maintenance
• bore head protection or screening may have failed
• recycled wash water or sludge supernatant may be affecting the treatment process
• new plant may not have been commissioned correctly
• the water demand may be too high for parts of the treatment process to cope
• various components of the treatment process may require more maintenance or replacement
• the flow through the plant may be unbalanced, resulting in some components being overloaded
• the dose of a coagulant, polyelectrolyte, diatomaceous earth, disinfectant or UV may be incorrect
• hydraulic operations may cause a surge through filters, which may then discharge particles
• chemical supplies, spares, and other consumables may be purchased or delivered too late
• monitoring equipment may not be standardised correctly, or may be inappropriate
• water storage tanks may have been interfered with, or have cracked or split
• alarm systems, or the response to them, may be inadequate
• there may be insufficient backup to cover unusual events
• staff may need further training
• a standby electricity generator may be needed
• water may need to be diverted to waste briefly.
Further information is available in the Ministry of Health’s Public Health Risk Management
Plan Guides, which can be accessed at www.moh.govt.nz/water by clicking on Publications, then
on Public Health Risk Management Plans, then selecting the relevant Guide. These have been
referenced in the treatment chapters.
8.7.2 Reporting
Section 13 of the DWSNZ lists the general compliance criteria for records.
All compliance monitoring programmes must be documented, giving details of sample sites,
sample collection techniques, sample handling or storage, tests to be conducted, methods used,
times and dates. Any variations to the programme or procedure should be noted.
Details relating to instrument calibration, maintenance, and replacements should be recorded.
Results from participating in interlaboratory testing programmes should be retained, along with
reports of investigations into the cause of any unsatisfactory test results.
Water suppliers will need to consider how to store test results, particularly those generated by
online instruments, which produce a large amount of data each year. Ideally, online instruments
will report only transgressions, or the percentage of time (or samples) that complies. Guidance
is offered in Chapter 17: Monitoring, section 17.4.4.
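Reducing a stream of online-instrument readings to a percent-of-time compliance figure can be sketched as follows. This is an illustrative calculation only; the function name, the 0.5 NTU limit and the sample readings are assumptions, and the applicable operational limits are those in the DWSNZ:

```python
def percent_compliant(readings, limit):
    """Percentage of online-instrument readings at or below an
    operational limit (eg, turbidity in NTU)."""
    if not readings:
        raise ValueError("no readings supplied")
    ok = sum(1 for r in readings if r <= limit)
    return 100.0 * ok / len(readings)

# one hour of 1-minute turbidity readings checked against a 0.5 NTU limit
turbidity = [0.18, 0.21, 0.55, 0.19] * 15
print(round(percent_compliant(turbidity, 0.5), 1))  # prints 75.0
```

Storing only the compliance percentage and any transgression events, rather than every reading, greatly reduces the volume of data to be archived.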
References
Adham SS, Jacangelo JG, Laine J-M. 1995. Low-pressure membranes: assessing integrity. J AWWA
87(3): 62.
AS/NZS 4348. 1995. Water Supply – Domestic type water treatment appliances – Performance
requirements.
AS/NZS 3497. 1998 (updated 2001). Drinking water treatment units – Plumbing requirements.
APHA et al. 2005. Standard Methods for the Examination of Water and Wastewater (21st edition).
Washington, DC: American Public Health Association, American Water Works Association, and
Water Environment Federation.
ASTM. 2000. F 658-00a – Standard Practice for Calibration of a Liquid-borne Particle Counter
using an Optical System Based upon Light Extinction. West Conshohocken, PA: American Society
for Testing and Materials.
ASTM. 2001. Standard Test Method for Online Measurement of Turbidity below 5 NTU in Water.
D–6698–01. West Conshohocken, PA: American Society for Testing and Materials.
ASTM. 2003. Standard Test Method for Determination of Turbidity below 5 NTU in Static Mode.
D–6855–03. West Conshohocken, PA: American Society for Testing and Materials.
AWWARF. 2000a. Fundamentals of Drinking Water Particle Counting. American Water Works
Association Research Foundation. American Water Works Association. Prepared by Erica
Hargesheimer.
AWWARF. 2000b. Particle Count Method Development for Concentration Standards and Sample
Stabilisation. American Water Works Association Research Foundation. ISBN 1583210423,
American Water Works Association.
Bouchier I. 1998. Cryptosporidium in Water Supplies. Third Report of the Group of Experts to:
Department of the Environment, Transport and the Regions & Department of Health (UK).
http://dwi.defra.gov.uk/research/bouchier/index.htm
Bukhari Z, Hargy T, Bolton J, et al. 1999. Medium-pressure UV for oocyst inactivation. J AWWA
91(3): 86–94.
Burlingame G, Pickel M, Roman J. 1998. Practical applications of turbidity monitoring. J AWWA
90(8): 57–69.
Clancy J, Bukhari Z, Hargy T, et al. 1998. Inactivation of Cryptosporidium Parvum Oocysts using
Medium-pressure Ultraviolet Light in Bench and Pilot Scale Studies. San Diego, CA: Proceedings
1998 American Water Works Association Water Quality Technology Conference, November 1–4.
Clark R, Sivaganesan M, Rice E, et al. 2002. Development of a C.t equation for the inactivation of
Cryptosporidium oocysts with ozone. Wat Res 36: 3141–9.
Clark R, Sivaganesan M, Rice E, et al. 2003. Development of a C.t equation for the inactivation of
Cryptosporidium oocysts with chlorine dioxide. Wat Res 37(11): 2773–83.
Connell K, Rodgers C, Shank-Givens H, et al. 2000. Building a better protozoa data set. J AWWA
92(10): 30–43.
Cornwell D, LeChevallier M. 2001. Treatment Options for Removal of Giardia, Cryptosporidium,
and Other Contaminants in Recycled Backwash Water. Denver, CO: American Water Works
Association Research Foundation.
Craik S, Finch G, Bolton J, et al. 2000. Inactivation of Giardia muris cysts using medium-pressure
ultraviolet radiation in filtered drinking water. Wat Res 34(18): 4325–32.
Dugan N, Fox K, Owens J, et al. 2001. Controlling Cryptosporidium oocysts using conventional
treatment. J AWWA 93(12): 64–76.
DVGW W294. Deutsche Vereinigung des Gas- und Wasserfaches (German Association for Gas and
Water). This is the German standard for UV disinfection.
DWI. 2010. Guidance on the Use of Ultraviolet Irradiation for the Disinfection of Public Water
Supplies. 16 pp. Drinking Water Inspectorate, DEFRA, London.
http://www.dwi.gov.uk/stakeholders/guidance-and-codes-of-practice/
Emelko M, Huck P, Slawson R. 1999. Design and Operational Strategies for Optimising
Cryptosporidium Removal by Filters. Proceedings of 1999 Water Quality Technology Conference.
Denver, CO: American Water Works Association.
Emelko M, Huck P, Douglas I, et al. 2000. Cryptosporidium and Microsphere Removal During Low
Turbidity End-of-run and Early Breakthrough Filtration. Proceeding of Water Quality Technology
Conference. Denver, CO: American Water Works Association.
Enriquez C, Gerba C. 1999. Evaluation of the Osmonics Water Purification Filters FGF, FAP and
SXE in their Ability to Remove Cryptosporidium parvum Oocysts. Final Report. Tucson, AZ:
University of Arizona.
EPA/NSF ETV. 2005. Protocol for Equipment Verification Testing for Physical Removal of
Microbiological and Particulate Contaminants. Prepared by NSF International, Ann Arbor, MI for
USEPA. 40 CFR 35.6450. See: http://www.epa.gov/etv/pubs/059205epadwctr.pdf
Fogel D, Isaac-Renton J, Guasparini R, et al. 1993. Removing Giardia and Cryptosporidium by slow
sand filtration. J AWWA 85(11): 77–84.
GLI. No date. GLI Method 2 (ie, re turbidity measurement: USEPA GLI 2). Milwaukee, WI: Great
Lakes International Inc. See USEPA 1999.
Goodrich J, Li S, Lykins B. 1995. Cost and Performance Evaluations of Alternative Filtration
Technologies for Small Communities. Proceedings of the Annual Conference of the American Water
Works Association. Denver, CO.
Hach Method 10133. Determination of Turbidity by Laser Nephelometry. Revision 2.0, 7 January
2000. Available at: http://www.hach.com (search for 10133).
Hall T, Pressdee J, Carrington E. 1994. Removal of Cryptosporidium Oocysts by Water Treatment
Processes. Report No. FR0457, Bucks, UK: Foundation for Water Research.
Hancock CM. 1999. Using indigenous microbiota to monitor water quality. Water Supply 17(2): 91–4.
Harrington G, Chen H, Harris A, et al. 2001. Removal of Emerging Waterborne Pathogens. Denver,
CO: American Water Works Association Research Foundation.
Harris SI, Hancock CM, Vasconcelos J. 1996. Microscopic Particulate Analysis for Filtration Plant
Optimisation. Seattle, Washington: USEPA, Region X, EPA/910/R/96/01.
Harter T, Wagner S, Atwill E. 2000. Colloid transport and filtration of Cryptosporidium parvum in
sandy soils and aquifer sediments. Environ Sci Technol 34: 62–70.
ISO. 1999. International Standard 7027 – Water Quality – Determination of turbidity. Geneva,
Switzerland: International Organisation for Standardisation.
ISO. 1999. Hydraulic Fluid Power: On-line automatic particle counting systems for liquids –
methods of calibration and validation. Geneva, Switzerland: International Organization for
Standardisation, 11943.
Letterman R, Johnson C, Viswanathan S, et al. 2001. A Study of Low Level Turbidity
Measurements. Denver, CO: American Water Works Association Research Foundation.
Li S, Goodrich J, Owens J, et al. 1997. Reliability of surrogates for determining Cryptosporidium
removal. J AWWA 89(5): 90.
Li H, Finch G, Smith D, et al. 2001. Sequential Disinfection Design Criteria for Inactivation of
Cryptosporidium Oocysts in Drinking Water. Denver, CO: American Water Works Association
Research Foundation.
Long WR. 1983. Evaluation of cartridge filters for the removal of Giardia lamblia cyst models from
drinking water systems. Journal of Environmental Health 45(5): 220.
MoH Public Health Risk Management Plan Guide PHRMP Ref. S1.1. Surface and Groundwater.
Wellington: Ministry of Health. Available on the Ministry of Health website:
http://www.moh.govt.nz/water then select publications and Public Health Risk Management Plans.
Ministry of Health. 2000. Drinking-water Standards for New Zealand 2000. Wellington: Ministry
of Health.
Ministry of Health. 2005. Drinking-water Standards for New Zealand 2005. Wellington: Ministry
of Health.
NSF/ANSI 53-2002 (plus Addenda 1 and 2). Drinking-water Treatment Units Health Effects. Ann
Arbor, MI. Note that NSF continually updates their standards. The latest version is NSF/ANSI
53-2009e: Drinking Water Treatment Units: Health effects, 23rd edition, 120 pp.
NSF/ANSI 55-2002. Ultraviolet Microbiological Water Treatment Systems. Ann Arbor MI.
Ongerth J, Hutton P. 2001. Testing of diatomaceous earth filtration for removal of Cryptosporidium
oocysts. J AWWA 93(12): 54–63.
ÖNORM. 2001. Plants for the Disinfection of Water using Ultraviolet Radiation: Requirements and
testing. ÖNORM 5873-1 (this is for low pressure lamps). Osterreichisches Normungsinstitut, A-1021
Wien. See http://www.on-norm.at
ÖNORM. 2003. Plants for the Disinfection of Water using Ultraviolet Radiation: Requirements and
testing. ÖNORM 5873-2 (this is for medium pressure lamps). Osterreichisches Normungsinstitut,
A-1021 Wien. See http://www.on-norm.at
Oppenheimer J, Aieta E, Trussell R, et al. 2000. Evaluation of Cryptosporidium inactivation in
natural waters. Denver, CO: American Water Works Association Research Foundation.
Owens J, Miltner R, Slifko T, et al. 1999. In Vitro Excystation and Infectivity in Mice and Cell
Culture to Assess Chlorine Dioxide Inactivation of Cryptosporidium Oocysts. In Proceedings of the
Water Quality Technology Conference of the American Water Works Association. Denver, CO.
Owens J, Miltner R, Rice E, et al. 2000. Pilot-scale ozone inactivation of Cryptosporidium and other
micro-organisms in natural water. Ozone Sci Eng 22(5): 501–17.
Patania N, Jacangelo J, Cummings L, et al. 1995. Optimisation of Filtration for Cyst Removal.
AWWARF. Denver, CO, 178 pp.
Qian SS, Donnelly M, Schmelling DC, et al. 2004. Ultraviolet light inactivation of protozoa in drinking
water: a Bayesian meta-analysis. Water Research 38: 317–26.
Rennecker J, Marinas B, Owens J, et al. 1999. Inactivation of Cryptosporidium parvum oocysts with
ozone. Wat Res 33(11): 2481–8.
Rice EW, Fox KR, Miltner RJ, et al. 1996. Evaluating treatment plant performance using indigenous
aerobic bacterial endospores. J AWWA 88(9): 122–30.
Ruffell K, Rennecker J, Marinas B. 2000. Inactivation of Cryptosporidium parvum oocysts with
chlorine dioxide. Wat Res 34(3): 868–76.
Sadar M. 1999. Turbidimeter Instrument Comparison: Low-level sample measurements technical
information series. Hach Company. ap/dp 4/99 1ed rev1 D90.5 Lit No. 7063.
Schaub S, Hargett H, Schmidt M, et al. 1993. Reverse Osmosis Water Purification Unit: Efficacy of
cartridge filters for removal of bacteria and protozoan cysts when RO elements are bypassed. Fort
Detrick, Frederick, MD: US Army Biomedical Research and Development Laboratory.
Scottish Water. 2003. The Cryptosporidium (Scottish Water) Directions 2003. Available on:
http://www.scotland.gov.uk/Publications/2004/01/18727/31490
USEPA. 1993a. Method 180.1, Determination of Turbidity by Nephelometry. Appears as Appendix B
of Guidance Manual, Turbidity Provisions, USEPA 1999. Cincinnati: United States Environmental
Protection Agency. Available on the internet at:
http://www.epa.gov/OGWDW/mdbp/pdf/turbidity/app_b.pdf or go to
http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/mdbptg.cfm
USEPA. 1993b. Method 180.2, Turbidity GLI Method 2. Appears as Appendix D of Guidance
Manual, Turbidity Provisions, USEPA 1999. Cincinnati: United States Environmental Protection
Agency. Available on the internet at: http://www.epa.gov/OGWDW/mdbp/pdf/turbidity/app_d.pdf
or go to http://water.epa.gov/lawsregs/rulesregs/sdwa/mdbp/mdbptg.cfm
USEPA. 1998. Water Supply Performance Evaluation Study #41. National Exposure Research
Laboratory. Office of Research and Development.
USEPA. 1999. Guidance Manual for Compliance with the Interim Enhanced Surface Water
Treatment Rule: Turbidity provisions. EPA 815-R-99-010. Document is available for download
at: http://www.epa.gov/safewater/mdbp/mdbptg.html or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/mdbp/implement.cfm
USEPA. 2001a. Low-Pressure Membrane Filtration for Pathogen Removal: Application,
implementation, and regulatory issues. EPA 815-C-01-001. Document is available for download at:
http://www.epa.gov/safewater/disinfection/lt2/compliance.html or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2001b. Method 1622: Cryptosporidium in Water by Filtration/IMS/FA. EPA–821–R–01–
026, April. Now see USEPA 2005b.
USEPA. 2001c. Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA. EPA
821–R–01–025, April. Now see USEPA 2005b.
USEPA. 2002. Hach Filter Trak Method 10133: Determination of turbidity by laser nephelometry.
Revision 2.0. Fed Regist 67(209), 40 CFR §141.74, 29 October. US Environmental Protection
Agency. Description can be obtained from Hach Co, Loveland, CO, or the New Zealand agents. See
Hach (above) for internet address.
USEPA. 2003. Long Term 2 Enhanced Surface Water Treatment Rule – Toolbox Guidance Manual
[Draft]. EPA 815-D-03-009. United States Environmental Protection Agency, Office of Water,
Washington. 333 pp. (See USEPA 2009 for latest version).
http://www.epa.gov/safewater/disinfection/lt2/pdfs/guide_lt2_toolbox.pdf or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2003a. Long Term 2 Enhanced Surface Water Treatment Rule; Proposed Rule. National
Primary Drinking Water Regulations: 40 CFR Parts 141 and 142, August 11. Now see USEPA 2006a
for Final Rule.
USEPA. 2003b. Small Drinking Water Systems Handbook: A guide to ‘packaged’ filtration and
disinfection technologies with remote monitoring and control tools. EPA/600/R-03/041. Office of
Research and Development. Water Supply and Water Resources Division, United States
Environmental Protection Agency.
USEPA. 2003c. Membrane Filtration Guidance Manual. (Proposal draft). EPA 815-D-03-008.
Washington: United States Environmental Protection Agency. Now refer to USEPA 2005a.
USEPA. 2003d. Ultraviolet Disinfection Guidance Manual. (Draft). EPA-815-D-03-007.
Washington: United States Environmental Protection Agency. Now refer to USEPA 2006c.
USEPA. 2005a. Membrane Filtration Guidance Manual. EPA 815-R-06-009. Washington: United
States Environmental Protection Agency, Office of Water. Available at:
http://www.epa.gov/ogwdw/disinfection/lt2/pdfs/guide_lt2_membranefiltration_final.pdf or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2005b. Method 1623: Cryptosporidium and Giardia in Water by Filtration/IMS/FA.
Washington, DC: US Environmental Protection Agency, Office of Water. EPA-815-R-05-002.
Document is available for download at: www.epa.gov/nerlcwww/1623de05.pdf or go to
http://www.epa.gov/microbes/online.html
USEPA. 2006a. National Primary Drinking Water Regulations: Long term 2 enhanced surface
water treatment rule: final rule. (LT2ESWTR). Federal Register Part II, 40 CFR Parts 9, 141
and 142. Washington: National Archives and Records Administration. See
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04a.pdf
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04b.pdf
http://www.epa.gov/fedrgstr/EPA-WATER/2006/January/Day-05/w04c.pdf or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2006b. Source Water Monitoring Guidance Manual for Public Water Systems for the Long
Term 2 Enhanced Surface Water Treatment Rule. Washington, DC: USEPA, Office of Water
(4601M). EPA 815-R06-005. Available on the internet at:
http://www.epa.gov/ogwdw/disinfection/lt2/pdfs/guide_lt2_swmonitoringguidance.pdf or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2006c. Ultraviolet Disinfection Guidance Manual for the Long Term 2 Enhanced Surface
Water Treatment Rule. EPA-815-R-06-007. Washington: United States Environmental Protection
Agency, Office of Water. Available at:
www.epa.gov/safewater/disinfection/lt2/pdfs/guide_lt2_uvguidance.pdf
USEPA. 2006d. Microbial Laboratory Guidance Manual for the Final Long-Term 2 Enhanced
Surface Water Treatment Rule. Office of Water, EPA 815-R06-006, February. See:
http://www.epa.gov/safewater/disinfection/lt2/compliance.html or go to
http://www.epa.gov/lawsregs/rulesregs/sdwa/lt2/compliance.cfm
USEPA. 2009. Long Term 2 Enhanced Surface Water Treatment Rule: Toolbox Guidance Manual.
Review Draft. EPA-815-D-09-001. 340 pp. Washington: United States Environmental Protection
Agency. Go into http://www.epa.gov/safewater/disinfection/lt2/ and then enter: toolbox guidance
manual review draft in the ‘search’ box. See
www.epa.gov/safewater/disinfection/lt2/pdfs/guide_lt2_lt2eswtr_toolboxgm_revdraft_060809.pdf
or http://nepis.epa.gov/Exe/ZyPURL.cgi?Dockey=P10045O1.txt
USEPA. Regularly updated. Analytical Methods Approved for Compliance Monitoring under the
Enhanced Surface Water Treatment Rule. See
http://www.epa.gov/safewater/methods/pdfs/methods/methods_swtrules.pdf or go to
http://www.epa.gov/microbes/online.html
Vasconcelos J, Harris SI. 1992. Consensus Method for Determining Groundwaters under the Direct
Influence of Surface Water Using Microscopic Particulate Analysis (MPA). Port Orchard, WA:
USEPA, Manchester Environmental Laboratory. Report 910/9-92-029.
Walsh DS, et al. 1980. Ozone inactivation of floc associated viruses and bacteria. J Environ Eng Div
ASCE 106: 711–26.
WHO. 2004. Guidelines for Drinking-water Quality 2004 (3rd edition). Geneva: World Health
Organization. Available at: www.who.int/water_sanitation_health/dwq/gdwq3/en/print.html see
also the addenda.
WHO. 2004a. Water Treatment and Pathogen Control: Process efficiency in achieving safe
drinking water. 136 pp. www.who.int/water_sanitation_health/publications/en/index.html
WHO. 2006. Cryptosporidium – WHO Guidelines for Drinking Water Quality; EHC
Cryptosporidium, draft 2. 138 pp.
http://www.who.int/water_sanitation_health/gdwqrevision/cryptodraft2.pdf
WHO. 2009. Risk Assessment of Cryptosporidium in Drinking Water. WHO/HSE/WSH/09.04.
143 pp. http://whqlibdoc.who.int/hq/2009/WHO_HSE_WSH_09.04_eng.pdf
WHO. 2011. Guidelines for Drinking-water Quality 2011 (4th edition). Geneva: World Health
Organization. Available at:
http://www.who.int/water_sanitation_health/publications/2011/dwq_guidelines/en/index.html
Appendix: Guidance Notes: Interpretation of
Catchment Protozoal Risk Category
To be read in conjunction with section 8.2.
This guidance document aims to provide additional information to enable interpretation of
Table 5.1a in situations where it is not entirely clear which log credit category is most
appropriate.
Although the DWSNZ refer to either Cryptosporidium monitoring or Catchment Assessment as
the ‘standard approach’, depending on the population served, it should be noted that the
Cryptosporidium monitoring option is considered the more accurate and therefore the
preferred method for assigning log credits to source waters. Table 5.1a is necessarily
conservative and its use may lead to overdesign of capital works for treatment and extra ongoing
running costs for a community.
Part 1: Surface waters
Water from forest, bush, scrub or tussock catchments with
no agricultural or human activity
3 Log
This category does not present interpretation difficulties. It is generally clear which catchments
fit into this category. No further guidance is considered necessary.
Water from pastoral catchments that always has low
concentration of cattle, sheep, horses or humans in immediate
vicinity or upstream
4 Log
Water from pastoral catchments with frequent high concentrations of
cattle, sheep, horses or humans, or a waste treatment outfall nearby
or upstream
5 Log
The 4 and 5 log surface water categories have caused some interpretation difficulties,
particularly the descriptions ‘frequent high’ and ‘always low’. Unfortunately, defining these
terms with absolute numbers or stock densities (eg, stock units, or numbers of animals per
unit area) does not in itself establish the associated risk.
Preliminary results from raw water Cryptosporidium monitoring in New Zealand catchments
indicate that there are likely to be very few catchments in New Zealand that present a protozoal
risk great enough to require 5 log protozoal removals. It is on the basis of this information that
the following guidance is provided. The aim is to ensure that water suppliers are not
unreasonably burdened with the requirement for 5 log treatment (as the precautionary
approach in Table 5.1a appears to direct) when the risk may not warrant it.
These guidelines contain a list of ‘alert criteria’ that should be considered in situations where it
is difficult to determine whether a catchment fits into the 4 or 5 log category. The alert criteria
are land use activities and discharges known to be associated with greater protozoal
contamination. Interpretation difficulties generally arise in situations where the influence of
animals in the catchment is unclear. For example, Table 5.1a states that the 5 log category
should be assigned in situations where a waste treatment outfall is ‘nearby or upstream’. The
proximity of the point source discharge to the intake is critical and therefore these guidelines
propose that distance from the intake be considered prior to the 5 log category being
automatically assigned to such sources.
If none of the alert criteria are met, the catchment should be assigned the 4 log category.
Alert criteria (surface waters in 4 or 5 log categories)
Point source18 discharges
• Waste treatment discharge to water (eg, sewage outfall, meatworks effluent discharge) up to 1000 m upstream of intake or 100 m downstream.
• Stormwater discharges (via discharge pipe) up to 1000 m upstream of intake or 100 m downstream.
Non-point source19 discharges
• Animal effluent disposal (to land) up to 500 m upstream of intake (eg, dairy shed effluent, pig effluent, truck effluent disposal).
• Animal (cattle, sheep, deer, horse) access to waterway (no fencing or riparian boundary) within 1000 m upstream of intake.
• Sewage disposal to land (if there are concerns about the effectiveness of treatment) within 500 m upstream of intake.
• Large numbers of feral animals within 500 m of intake.
18 Point source = a stationary point of pollution, such as a discharge pipe.
19 Non-point source = diffuse pollution sources (ie, without a single point of origin or not introduced into the receiving water from a specific outlet).
• Any other activity that results in high concentrations of animals being present (other than on
hard stand areas where effluent is collected for treatment), eg, livestock sale yards, animal
transport company holding yards within 500 m of intake, farm practices such as ‘sacrifice’ or
‘wintering off’ paddocks used at high stocking rates to protect other pasture during wet
periods, strip grazing of livestock herds.
Where contaminating land uses are present, take into consideration the slope of the land to
determine the likelihood of impact on the intake area.
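The distance thresholds in the alert criteria above lend themselves to a simple screening check. This is an illustrative sketch only: the function names, activity labels and data structure are assumptions, and the thresholds are copied from the bullets above (note that the feral-animal criterion is 500 m from the intake in any direction, and slope and mitigating factors still require judgement):

```python
# distance thresholds (metres) taken from the alert criteria above
POINT_SOURCE_UPSTREAM_M = 1000   # waste treatment or stormwater discharge upstream
POINT_SOURCE_DOWNSTREAM_M = 100  # same discharges downstream of the intake
NON_POINT_M = {
    "effluent_disposal": 500,    # animal effluent to land, upstream
    "stock_access": 1000,        # unfenced stock access to waterway, upstream
    "sewage_to_land": 500,       # land disposal of sewage, upstream
    "feral_animals": 500,        # large feral animal numbers, any direction
}

def point_source_alert(distance_m: float, upstream: bool) -> bool:
    """True if a point-source discharge is within its alert distance."""
    limit = POINT_SOURCE_UPSTREAM_M if upstream else POINT_SOURCE_DOWNSTREAM_M
    return distance_m <= limit

def non_point_alert(activity: str, distance_m: float) -> bool:
    """True if a non-point-source activity is within its alert distance."""
    return distance_m <= NON_POINT_M[activity]

print(point_source_alert(800, upstream=True))   # prints True
print(non_point_alert("stock_access", 1200))    # prints False
```

A source that triggers any alert, with no mitigating factors, is a candidate for the raw water Cryptosporidium monitoring discussed below rather than automatic 5 log assignment.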
Mitigating factors for surface waters
These are factors that may reduce the likelihood of high levels of protozoa contamination
reaching a treatment plant intake despite factors above being present. Mitigating factors
include, but may not be limited to:
• the flow rate of the river or stream relative to the discharge rate of a point source – high river flows coupled with low discharge rates will help dilute contaminant levels
• a well designed, managed and maintained riparian strip
• impoundment of some description before abstraction – the longer the residence time the greater the reduction in microbial contaminants
• for lakes and reservoirs – the distance between the shore and intake
• the level to which wastewater or meatworks effluent has been treated before discharge.
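The first mitigating factor is a dilution argument, which follows from a simple mass balance at the mixing point. A minimal sketch (the function name, flows, concentrations and units are illustrative assumptions, and full mixing is assumed):

```python
def mixed_concentration(river_flow: float, river_conc: float,
                        discharge_flow: float, discharge_conc: float) -> float:
    """Fully-mixed downstream concentration from a mass balance:
    (Qr*Cr + Qd*Cd) / (Qr + Qd). Flows in m3/s, concentrations in
    organisms per litre (any consistent units will do)."""
    total_flow = river_flow + discharge_flow
    return (river_flow * river_conc + discharge_flow * discharge_conc) / total_flow

# a 0.01 m3/s outfall at 200 organisms/L entering a clean 10 m3/s river
print(round(mixed_concentration(10.0, 0.0, 0.01, 200.0), 2))  # prints 0.2
```

The example shows why a high river flow relative to the discharge rate is mitigating: the same outfall entering a river a tenth the size would produce roughly ten times the downstream concentration.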
Response if one or more of ‘alert criteria’ met and no mitigating factors
Drinking water assessors should not immediately assign a 5 log categorisation to water sources
that meet one or more of the alert criteria and have no mitigating factors in place. Water
suppliers with sources in this category (or in circumstances where the supplier disagrees with
the log credit category assigned as a result of the assessment) should be strongly
encouraged/advised to undertake raw water Cryptosporidium monitoring to confirm the
appropriate log credit.
It should be noted that any drinking water supply eligible for DWAP is not likely to be
recommended for subsidy funding if 5 log credits have been assigned to it without confirmation
by raw water Cryptosporidium monitoring.
Part 2: Bore water supplies
Bore water drawn from >10 m deep: the bore water section of Table 5.1a is clear.
Bore water drawn from <10 m deep: Table 5.1a is difficult to apply to groundwater <10 m deep.
Source waters in this category are directed into the surface water section of the table, but these
categories are unhelpful in assessing the risk associated with groundwaters that have no
hydraulic link20 to a surface water source. The surface water categories of Table 5.1a should be
directly applied to groundwaters that are known to be hydraulically linked to a nearby surface
water source.
Part 2 of this guidance document aims to provide additional guidance on assigning a log
removal category for bore water drawn from <10 m deep.
20 Hydraulic link = this is when surface water and groundwater are directly linked. When water is pumped from a
well that is hydraulically connected to a nearby stream, it reduces the flow in the stream. Where turbidity
increases in groundwater after heavy rain has increased the turbidity in nearby surface water, this is an indication
of a hydraulic link.
The DWA should request that the water supplier provides information about land use activities
and discharges occurring in two zones around the well: a 5–50 m ‘inner’ buffer zone and a
50–250 m ‘wider’ buffer zone. The water supplier also needs to provide information on the soil
and sub-surface materials. This type of information may be available through the regional
council. The DWA should then consider the impact of activities occurring within the two zones
and determine the applicable log removal category by reference to Tables A1 and A2 below. An
additional set of mitigating factors (outlined below) can be considered for bore waters drawn
from <10 m deep that fall into the 5 log category (and may enable them to be reduced to 4 log).
‘Impact of activity’ descriptions
None: no sources of animal or human faecal contamination.
Low: livestock and/or feral animal and/or human faecal contamination sources are present but
do not meet the level(s) of intensity described under the ‘high’ impact description below.
High: one or more of the following land use activities is present:
• on-site sewage disposal (eg, septic tanks, soakage/boulder pits)
• offal pits
• effluent storage ponds / effluent spraying / effluent disposal by border dyke irrigation
• high stocking rates21 (eg, ‘sacrifice’ or ‘wintering off’ paddocks used at high stocking rates to protect other pasture during wet periods, strip grazing of livestock herds)
• any other activity that results in high concentrations of animals being present (other than on hard stand areas where effluent is collected for treatment), eg, livestock saleyards, animal transport company holding yards.
Note: Drinking water suppliers should endeavour to exclude high risk sites when selecting a
location for a new drinking water source. The Resource Management (National Environmental
Standards for Sources of Human Drinking Water) Regulations 2007 should be consulted where
new contaminating land uses are proposed that may impact on existing water sources.
Obtaining a higher quality source water may be a better option than investing in more extensive
treatment.
21 Definitions for stocking rates vary, but generally rates above 3.5–4.0 cows per hectare are considered high; the
average stocking rate for dairy cattle in New Zealand in 2010 was 2.8 cows per hectare (DairyNZ). Stocking rates
are calculated over the area of the farm. Farm practices that concentrate large numbers of animals into small
areas are of more significance in terms of contaminant runoff and leaching into groundwater.
Table A1: Log removal requirements for bore waters <10 m deep based on land use activities and soil / sub-surface material permeability

Fenced exclusion zone (5 m) | Impact of activity in inner buffer zone (5–50 m; refer to definitions above) | Impact of activity in wider buffer zone (50–250 m; refer to definitions above) | Soil permeability (refer to Table A2 below) | Sub-surface permeability (refer to Table A2 below) | Log treatment requirement
Yes | None | None | N/A | N/A | 3
Yes | None | Low | Low | Low | 3
Yes | None | Low | Low | High | 4
Yes | None | Low | High | Low | 3
Yes | None | Low | High | High | 4
Yes | None | High | Low | Low | 4
Yes | None | High | Low | High | 5
Yes | None | High | High | Low | 4
Yes | None | High | High | High | 5
Yes | Low | Low | Low | Low | 3
Yes | Low | Low | Low | High | 3
Yes | Low | Low | High | Low | 4
Yes | Low | Low | High | High | 4
Yes | Low | High | Low | Low | 4
Yes | Low | High | Low | High | 5
Yes | Low | High | High | Low | 4
Yes | Low | High | High | High | 5
Yes | High | N/A | Low | Low | 4
Yes | High | N/A | Low | High | 4
Yes | High | N/A | High | Low | 5
Yes | High | N/A | High | High | 5
No | N/A | N/A | N/A | N/A | 5
Table A2: Sub-surface media and soil categories

High filtration* media/soils
Sub-surface media: Pumice sand; Clay
Soils: Raw and recent soils; Semiarid soils; Pumice soils; Allophanic soils

Low filtration media/soils (some of these are more accurately described as ‘medium’ filtration materials, but for the purposes of categorising for Table A1 they are considered ‘low filtration’)
Sub-surface media: Gravel; Alluvial sand; Coastal sand; Sandstone; Fractured rock; Silt; Ash; Peat
Soils: All other types of soils

* A ‘high filtration’ media/soil indicates that water that passes through the media will have been subjected to a high level of filtration. Not to be confused with soils that are ‘free draining’, meaning that liquids pass through the media easily.
Mitigating factors for bore waters <10 m deep in 5 log category
Bore waters <10 m deep that Table A1 identifies as requiring 5 log treatment may be reduced to
4 log if the DWA considers that sufficient mitigating factors are in place that reduce the
likelihood of the high risk activity causing contamination of the source. The following are
examples of mitigating factors that should be considered:
On-site sewage disposal: systems that incorporate newer treatment technology are far less
likely to present a risk to groundwater than older-style systems. Consider also the nature of the
disposal field (boulder pits and soakage pits present a much higher risk than trickle irrigation
systems).
Groundwater flow: good information on groundwater flow may alleviate concerns about
some potentially contaminating land-use activities (that have been identified within the buffer
zone) if the direction of groundwater flow shows the land use will not impact on water drawn in
by the well.
Soil thickness: the greater the soil thickness, the greater the removal of oocysts.
Soil type: allophanic and pumice soils are extremely efficient in removing microbes from water
permeating through them. If it is determined that these types of soil are present, a 3 log
requirement can be assigned to the system.
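The adjustments described above can be expressed as a small helper. This is a hypothetical illustration only; the function name and arguments are assumptions, and the judgement on whether mitigating factors are sufficient rests with the DWA:

```python
def adjusted_log_requirement(table_requirement,
                             allophanic_or_pumice_soil=False,
                             sufficient_mitigating_factors=False):
    """Apply the mitigating-factor adjustments to a Table A1 log requirement.

    table_requirement: log treatment requirement read from Table A1.
    allophanic_or_pumice_soil: these soils are extremely efficient at removing
        microbes, so a 3 log requirement can be assigned where they are present.
    sufficient_mitigating_factors: the DWA's judgement that factors such as
        modern on-site sewage treatment, favourable groundwater flow direction
        or greater soil thickness reduce the likelihood of contamination.
    """
    if allophanic_or_pumice_soil:
        return 3
    if table_requirement == 5 and sufficient_mitigating_factors:
        return 4
    return table_requirement
```

Note that the 5-to-4 reduction applies only to the 5 log category; a 3 or 4 log requirement from Table A1 is unchanged unless allophanic or pumice soils are present.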
Drinking water assessors should not immediately assign a 5 log categorisation to any
groundwater source. Water suppliers with sources in this category (or in circumstances where
the supplier disagrees with the log credit category assigned as a result of the assessment) should
be strongly encouraged/advised to undertake raw water Cryptosporidium monitoring to
confirm the appropriate log credit.
It should be noted that any drinking water supply eligible for DWAP is unlikely to be
recommended for subsidy funding if it has been assigned 5 log credits that have not been
confirmed by raw water Cryptosporidium monitoring.
Part 3: Five-yearly reassessment of protozoa risk category
The DWSNZ require that water suppliers serving more than 10,000 people redo their protozoa
monitoring programme every five years, and that suppliers serving fewer than 10,000 people
redo their catchment assessment every five years. A supplier serving fewer than 10,000 people
that elected to undertake protozoa monitoring to establish its initial log credit requirement may
choose to do catchment assessments for the subsequent five-yearly reassessments. Suppliers in
this category may remain on the original log credit (determined by protozoa monitoring) if the
catchment assessment confirms that no changes have occurred in the catchment that would
have altered the protozoa risk. See section 8.2.6.
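The five-yearly reassessment rules can be summarised in code. This is a sketch under stated assumptions: the function names and the boolean input are illustrative, not terms from the DWSNZ.

```python
def five_yearly_task(population):
    """Which reassessment a supplier must redo every five years.

    Suppliers serving more than 10,000 people redo their protozoa monitoring
    programme; smaller suppliers redo their catchment assessment.
    """
    if population > 10_000:
        return "protozoa monitoring programme"
    return "catchment assessment"

def reassessed_log_credit(original_credit, catchment_changed):
    """Outcome for a small supplier whose original log credit came from
    protozoa monitoring and who reassesses by catchment assessment.

    The original credit is retained only if the assessment confirms that
    no changes have occurred in the catchment that would have altered the
    protozoa risk; otherwise the credit must be re-established.
    """
    if catchment_changed:
        return None  # log credit requirement must be re-established
    return original_credit
```

For example, a supply serving 800 people that was originally assigned 4 log by monitoring keeps that credit while successive catchment assessments confirm no change in catchment risk.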
Chapter 9: Cyanobacterial compliance
9.1
Introduction
This chapter provides a large amount of information on cyanobacteria and cyanotoxins because
of the increasing number of supplies that encounter difficulties with these micro-organisms, and
because many water suppliers may have little understanding of how to manage them. Although
prepared primarily for use in relation to drinking-water supplies, the information should also be
of use to those managing recreational waters.
The Ministry for the Environment and the Ministry of Health published the New Zealand
Guidelines for Cyanobacteria in Recreational Fresh Waters – Interim Guidelines. This
document contains material that is also relevant to the management and sampling of
drinking-water sources and should be consulted for additional guidance.
In addition, Water Quality Research Australia (formerly the CRC for Water Quality and
Treatment) produces many reports and technical notes relevant to managing cyanobacteria in
both recreational waters and drinking-water sources; these should also be consulted.
The reports can be found at: http://www.wqra.com.au
A recent (2009) Guidance Manual was published for the Global Water Research Coalition
(GWRC) by Water Quality Research Australia and SA Water – see references.
For those who do not wish to read the full text, but are concerned with information to support
the requirements of the DWSNZ, the following sections are those of greatest importance:
• Compliance with the DWSNZ: see section 9.4
• Sampling: see section 9.5
• Transgressions: see section 9.6
• Risk management: see section 9.7
• Refer also to the datasheets for cyanobacteria and for the cyanotoxins, in Volume 3.
Over recent years, water supplies in some parts of New Zealand have experienced an increase in
the number of cyanobacterial blooms affecting their water sources. These events have the
potential to introduce into the water toxins that can have acute and, if their concentrations are
high enough, fatal consequences for consumers. Experience of such events in New Zealand is
still relatively limited, and consequently this section provides substantial detail to assist water
suppliers in dealing with cyanobacteria. In preparing this section, extensive use has been made
of Toxic Cyanobacteria in Water: A guide to their public health consequences, monitoring and
management, edited by Chorus and Bartram and published on behalf of the World Health
Organization in 1999. Cyanobacteria may also be referred to as blue-green algae, or as harmful
algal blooms (HABs). A publication in 2008 provides holistic coverage of cyanobacteria
(Hudnell 2008), with a chapter on cyanotoxin removal during drinking-water treatment
(Westrick 2008).
Cyanobacteria are primarily aquatic organisms with many characteristics of bacteria. As their
metabolism is based on photosynthesis, they have also been termed blue-green algae. They may
grow as filaments or colonies that are readily visible and can be identified (to the genus level)
under a microscope.
Cyanobacteria are not, of themselves, a health hazard, but the toxins they produce (called
cyanotoxins) are. For this reason Chorus and Bartram (1999) recommended that public health
management be focussed on the cyanotoxins, and that cyanobacteria in drinking-water be
managed as a chemical problem. The presence of cyanobacteria can be regarded as a trigger for
monitoring for cyanotoxins.
Cyanobacteria inhabit all natural waters and become a problem only when they increase to
excessive numbers (water blooms). Concern about the effects of cyanobacteria on human health
has grown in many countries in recent years for a variety of reasons. These include cases of
poisoning attributed to toxic cyanobacteria and awareness of contamination of water sources
(especially lakes) resulting in increased cyanobacterial growth. Cyanobacteria also continue to
attract attention in part because of well-publicised incidents of animal poisoning.
Outbreaks of human poisoning attributed to toxic cyanobacteria have been reported in several
countries including Australia, following exposure of individuals to contaminated drinking water,
and the UK, where army recruits were exposed while swimming and canoeing. However, the
only proven human fatalities associated with cyanobacteria and their toxins have occurred in
Brazil (see section 9.1.2).
A diagram for rapidly assessing the level of risk to health presented by a cyanobacterial bloom,
taking into account the treatment processes in place, is given in Figure 9.1. The figure assumes
that the treatment processes are working properly and that they are capable of treating the
levels of toxin or cell concentrations in the raw water. If either of these assumptions is invalid,
the absolute levels of risk may be markedly different.
The purpose of this chapter is to provide:
• general information on cyanobacteria, the factors that control bloom formation, and their toxins and health significance
• advice on how the risk they present to consumers can be evaluated
• discussion on meeting the cyanotoxin compliance requirements of the DWSNZ
• guidance on how the public health risk associated with cyanotoxins can be managed.
9.1.1
Algal bloom development
Cyanobacteria are members of the phytoplankton community (‘phytoplankton’ means small,
free-floating plants; cyanobacteria, however, are actually bacteria, with no defined nucleus,
rather than plants, which do have a defined nucleus) and of the bottom-dwelling organisms
living on the surface of the sediments and stones in most water-bodies. The right combination of
environmental conditions, particularly high nutrient levels, may cause their excessive growth
(bloom formation), leading to blue, brown or greenish discolouration of water through the high
population density of suspended cells, and to the formation of surface scums. Such
accumulations of cells may lead to high toxin concentrations.
Some key factors affecting bloom development are:
a) Eutrophication
High levels of nutrients, usually phosphorus and nitrogen, can cause increases in natural
biological production in rivers, lakes and reservoirs. These conditions can result in visible
cyanobacterial or algal blooms, surface scums, floating plant mats and aggregations of plants
attached to underwater surfaces. The levels of phosphorus in the water often limit the growth of
cyanobacteria, but in a substantial number of lakes in New Zealand the dissolved nitrogen
concentration is reported to be the limiting factor, despite some cyanobacteria being able to fix nitrogen.
Some lakes are naturally eutrophic, but in most the excess nutrient input is of