
IMPROVE Aerosol Quality Assurance Project Plan (QAPP)
Quality Assurance
Guidance Document
Revision 0.0
IMPROVE:
Interagency Monitoring of Protected
Visual Environments
Quality Assurance Project Plan
OAQPS Category 1 QAPP
March 2002
LIST OF ACRONYMS AND ABBREVIATIONS
ACE       name of Crocker Nuclear Laboratory data acquisition program
BLM       Bureau of Land Management
C         Celsius
CAA       Clean Air Act
cm2       square centimeter
CIRA      Cooperative Institute for Research in the Atmosphere, Colorado State University
CNL       Crocker Nuclear Laboratory
COTR      Contracting Officer's Technical Representative
DQO       Data Quality Objective
DRI       Desert Research Institute, University of Nevada
EC        elemental carbon
EPA       US Environmental Protection Agency
EPA OAR   EPA Office of Air and Radiation
fid       flame ionization detector
FLM       Federal Land Manager
FTP       file transfer protocol
HIPS      Hybrid Integrating Plate/Sphere
L         liters
LIPM      Laser Integrating Plate Method
IC        Ion Chromatography
IMPROVE   Interagency Monitoring of Protected Visual Environments
L/min     liters per minute
m         meter
m3        cubic meter
MDL       minimum detectable limit
min       minute
Mm-1      inverse megameters (a unit of visibility)
MQL       minimum quantifiable limit
NAAQS     National Ambient Air Quality Standards
ng        nanograms
NIST      National Institute of Standards and Technology
NPS       US National Park Service
OC        organic carbon
OMC       organic mass from carbon analysis
OMH       organic mass from hydrogen analysis
PESA      Proton Elastic Scattering Analysis
PIXE      Particle Induced X-Ray Emission
PM2.5     particulate matter (with aerodynamic diameter less than 2.5 µm)
PM10      particulate matter (with aerodynamic diameter less than 10 µm)
QA        quality assurance
QAPP      quality assurance project plan
QC        quality control
RACE      name of CNL elemental analysis data reduction program
RENO      re-normalization parameter (elemental analysis)
RTI       Research Triangle Institute
SOP       standard operating procedure
TC        total carbon
TOR       Thermal Optical Reflectance Carbon Combustion Analysis
µg        micrograms
µm        micrometers
USFS      US Forest Service
XRF       x-ray fluorescence
LIST OF TABLES
Table 1. IMPROVE Project Organization: External Responsibilities......................4—1
Table 2. IMPROVE Project Organization..........................................................4—10
Table 3. IMPROVE Standard Operating Procedures and Technical
Instructions..................................................................................4—29
Table 4. IMPROVE Measurement Quality Objectives of Precision, Accuracy,
and Minimum Quantifiable Limit...................................................4—31
Table 5. Methods for determining Analytical Precision........................................4—34
Table 6. Flow Rate and Analytical Accuracy......................................................4—37
Table 7. IMPROVE Network Data Recovery and Completeness, 1988-1998.
...................................................................................................4—45
Table 8. IMPROVE Sampler Modules and Filter Media: Module (Filter),
Analysis Performed, and Parameters Measured..............................5—3
Table 9. IMPROVE Module Sample Media........................................................5—9
Table 10. Days the filter is in shipment, in storage at the site, or in the sampler.....5—17
Table 11. Cahn Microbalance Manufacturers Performance Specifications...........5—22
Table 12. Size and Load Correction for Fine Particles with PIXE or XRF..........5—29
Table 13. Preparation of Anion Calibration Standards........................................5—32
Table 14. Preparation of Cation Calibration Standards.......................................5—33
Table 15. Possible QC Failures and the Associated Corrective Actions..............5—46
Table 16. Fraction of cases with concentrations above the MDL or MQL. All
samples collected in 1999............................................................5—64
Table 17. IMPROVE Sample Codes for Invalid Samples. .................................5—86
Table 18. IMPROVE Validation Flags. .............................................................7—24
LIST OF FIGURES
Figure 1. IMPROVE organizational chart.............................................................4—2
Figure 2. CNL Organizational Chart for IMPROVE. .........................................4—10
Figure 3. Monitoring Sites: Growth by Year for IMPROVE, Protocol, and
non-Protocol sites using IMPROVE samplers from 1988 to
2002. ..........................................................................................4—19
Figure 4. Monitoring Sites: Current and Projected Sites for 2002. ......................4—20
Figure 5. Recovery and completeness of IMPROVE data, 1988-1998. .............4—45
Figure 6. IMPROVE Sampler Modules: A (Teflon), B (Nylon), C (Quartz),
and D (Teflon)...............................................................................5—2
Figure 7. Sampler Location with Respect to Trees and Solid Barriers...................5—5
Figure 8. IMPROVE Sampler Control Module....................................................5—7
Figure 9. IMPROVE PM2.5 Sampler Module. .....................................................5—8
Figure 10. IMPROVE Network Field Sample Log Sheet...................................5—12
Figure 11. IMPROVE Sampler Handling of Filters.............................................5—14
Figure 12. CNL Cyclotron and North Cave.......................................................5—21
Figure 13. Setup of Hybrid Integrating Plate / Sphere Apparatus........................5—23
Figure 14. XRF Analysis System. ......................................................................5—25
Figure 15. Setup of PIXE/PESA Analysis System..............................................5—26
Figure 16. PESA Spectra Criteria for a Clean Blank and Network Sample.........5—27
Figure 17. Flowchart for Controls Filter Processing............................................5—40
Figure 18. IMPROVE Sampler Field Calibration Log Sheet...............................5—51
Figure 19. Setup of CNL Reference Integrating Sphere Apparatus.....................5—53
Figure 20. Median values of the 8 carbon fractions for laboratory blanks,
secondary filters, and field blanks for samples collected between
March 1995 and May 2001.........................................................5—60
Figure 21 Distributions of S, SO4, NO3, OC, and EC for low concentrations. ...5—62
Figure 22. IMPROVE Sample Handling Programs and Database.......................5—77
Figure 23. IMPROVE Database: Logs Entry Screen..........................................5—78
Figure 24. PRE Program Schematic...................................................................5—81
Figure 25. POST Program Schematic................................................................5—83
Figure 26. IMPROVE Database: Auxdata Screen. ............................................5—85
Figure 27. IMPROVE Sampler Consistency Plot of Flow History. .......................7—2
Figure 28. IMPROVE Quality Assurance Plots: PIXE vs. XRF Comparison. .....7—12
Figure 29. Site Specific Time Plot of Trace Element Concentrations. ..................7—14
Figure 30. IMPROVE Quality Assurance Plots: ZN vs. MDL for Possible
Outliers. ......................................................................................7—15
Figure 31. IMPROVE Quality Assurance Plots: Flow Rate................................7—17
Figure 32. IMPROVE Quality Assurance Plots: Elemental Sulfur vs. Ionic
Sulfate.........................................................................................7—18
Figure 33. IMPROVE Quality Assurance Plots: Babs vs. LAC. ...........................7—19
Figure 34. IMPROVE Quality Assurance Plots: OMH vs. OMC. ......................7—20
Figure 35. IMPROVE Quality Assurance Plots: MF vs. MT. .............................7—21
Figure 36. IMPROVE Quality Assurance Plots: Select East and West Plots.......7—22
Figure 37. IMPROVE Quality Assurance Plots: MDL Plot. ...............................7—23
Figure 38. IMPROVE Sampler Cyclone Collection Efficiency: Relationship
between Diameter and Flow Rate. .................................................8—2
1. TITLE AND APPROVAL SHEET
The Interagency Monitoring of Protected Visual Environments (IMPROVE) project
began in 1987 and continues to protect visibility in Class I Wilderness Areas from
existing and future man-made air pollution. This quality assurance project plan (QAPP)
was developed by Crocker Nuclear Laboratory (CNL) to document the quality
assurance (QA) and quality control (QC) activities of the IMPROVE project. The
following signatures of key officials indicate agreement with the procedures specified
within this plan and a commitment to communicate the requirements of this plan to
personnel working on this project.
Crocker Nuclear Laboratory
Robert Flocchini, Financial Manager and CNL Director
Date
Lowell Ashbaugh, Project Manager
Date
Robert Eldred, Quality Assurance Manager
Date
Patrick Feeney, Network Operations Manager
Date
Steven Ixquiac, Laboratory Manager
Date
Peter Beveridge, Field Manager
Date
Research Triangle Institute
Eva Hardison, Laboratory Manager
Date
Desert Research Institute
Judith Chow, Laboratory Manager
Date
Environmental Protection Agency
Marc Pitchford, EPA Project Officer
Date
Dennis Mikel, EPA Quality Assurance Manager
Date
National Park Service (Contract Agency)
Mark Scruggs, Branch Chief, Research and Monitoring,
NPS Air Resources Division
Date
2. TABLE OF CONTENTS
List of Acronyms and Abbreviations....................................................................... i
List of Tables..........................................................................................................iii
List of Figures ....................................................................................................... iv
1. Title and Approval Sheet................................................................................1–1
2. Table of Contents.........................................................................................2—1
3. Distribution List............................................................................................3—1
4. Project Management ....................................................................................4—1
4.1 Project/Task Organization....................................................................4—1
4.1.1 Roles and Responsibilities........................................................4—1
4.1.2 Key QA Personnel ......................................................4—14
4.2 Problem Definition / Background........................................................4—17
4.3 Project/Task Description ...................................................................4—25
4.4 The Scope of this Quality Assurance Plan...........................................4—29
4.5 Data Quality Objectives.....................................................................4—30
4.6 Measurement Quality Objectives........................................................4—31
4.6.1 Precision Objectives..............................................................4—31
4.6.2 Accuracy Objectives .............................................................4—36
4.6.3 Minimum Quantifiable Limit Objectives..................................4—37
4.6.4 Recovery Rate and Completeness..........................................4—43
4.6.5 Representativeness and Comparability...................................4—46
4.7 Special Training and Certification.......................................................4—46
4.7.1 Purpose / Background...........................................................4—46
4.7.2 Training.................................................................................4—46
4.7.3 Certification...........................................................................4—47
4.8 Documents and Records....................................................................4—47
4.8.1 Structure of the Quality Assurance Project Plan......................4—47
4.8.2 Contents of the Data Report Packages...................................4—47
4.8.3 Process of Producing Final Datasets ......................................4—48
5. Data Generation and Acquisition...................................................................5—1
5.1 Sampling Process Design (Experimental Design)...................................5—1
5.1.1 IMPROVE Sampler Configuration...........................................5—1
5.1.2 IMPROVE Sampler Site Selection..........................................5—3
5.2 Sampling Methods...............................................................................5—6
5.2.1 IMPROVE Sampler................................................................5—6
5.2.2 Sampling Media ......................................................................5—9
5.2.3 Sampling Operations .............................................5—10
5.3 Sample Handling and Custody...........................................5—13
5.3.1 Purchasing Filters and Acceptance Testing.............5—15
5.3.2 Load Clean Filters.................................................5—16
5.3.3 Transport and On-Site Storage..............................5—17
5.3.4 Handling Onsite.....................................................5—17
5.3.5 Receiving Boxes and Log Sheet Information...........5—18
5.3.6 Downloading Nylon and Quartz Filters...................5—18
5.3.7 Downloading Teflon Filters ....................................5—19
5.3.8 IC and TOR..........................................................5—20
5.3.9 XRF, PIXE, PESA, and HIPS ..............................5—20
5.4 Analytical Methods............................................................5—21
5.4.1 Gravimetric Methodology......................................5—21
5.4.2 HIPS Methodology ...............................................5—23
5.4.3 XRF Methodology................................................5—24
5.4.4 PIXE and PESA Methodology..............................5—26
5.4.5 Ion Chromatography Methodology........................5—29
5.4.6 Thermal Optical Reflection Methodology ...............5—36
5.5 Quality Control..................................................................5—37
5.5.1 IMPROVE Sampler Quality Control......................5—37
5.5.2 Gravimetric Quality Control...................................5—38
5.5.3 HIPS Quality Control............................................5—41
5.5.4 XRF, PIXE, and PESA Quality Control.................5—42
5.5.5 Ion Chromatography Quality Control.....................5—43
5.5.6 Thermal Optical Reflection Quality Control............5—43
5.5.7 Field Blanks and Secondary Filters........................5—44
5.5.8 Site Collocation.....................................................5—46
5.5.9 Possible QC Failures.............................................5—46
5.6 Instrument/Equipment Testing, Inspection, and Maintenance ...5—48
5.6.1 IMPROVE Sampler Testing and Maintenance .......5—48
5.6.2 Gravimetric Maintenance.......................................5—48
5.6.3 HIPS Maintenance................................................5—48
5.6.4 XRF, PIXE, and PESA Maintenance.....................5—49
5.6.5 Ion Chromatography Maintenance.........................5—49
5.6.6 Thermal Optical Reflection Maintenance ................5—49
5.7 Instrument/Equipment Calibration and Frequency.................5—50
5.7.1 IMPROVE Sampler Calibration............................5—50
5.7.2 Gravimetric Calibration..........................................5—52
5.7.3 HIPS Calibration...................................................................5—52
5.7.4 XRF, PIXE, and PESA Calibration .......................................5—53
5.7.5 Ion Chromatography Calibration............................................5—54
5.7.6 Thermal Optical Reflection Calibration...................................5—55
5.8 Calculation of Concentration, Uncertainty, and MDL/MQL................5—58
5.8.1 Definitions .............................................................................5—58
5.8.2 Concentration and Artifact.....................................................5—58
5.8.3 Uncertainty in Concentration..................................................5—62
5.8.4 MDL and MQL....................................................................5—63
5.8.5 Volume Precision..................................................................5—65
5.8.6 Equations for Gravimetric Mass (Teflon Filter) .......................5—65
5.8.7 Equations for Elements (Teflon Filter).....................................5—66
5.8.8 Equations for Optical Absorption (Teflon Filter).....................5—69
5.8.9 Equations for Ions (Nylon Filter)............................................5—69
5.8.10 Equations for Carbon (Quartz Filter).....................................5—71
5.9 Inspection/Acceptance of Supplies and Consumables.........................5—73
5.10 Non-direct Measurements .................................................................5—76
5.11 Data Management .............................................................................5—76
5.11.1 Database..............................................................................5—77
5.11.2 Programs..............................................................................5—80
5.11.3 Field Data ............................................................................5—85
6. Assessment and Oversight..........................................................................6—88
6.1 Types of Assessments........................................................................6—88
6.2 Assessment Frequency ......................................................................6—88
6.3 Acceptance Criteria...........................................................................6—88
6.4 Assessment Schedules.......................................................................6—88
6.5 Assessment Personnel........................................................................6—88
6.6 Assessment Reports ..........................................................................6—90
6.7 Implementation of Response Actions..................................................6—90
6.8 References ........................................................................................6—92
6.9 Reports to Management.....................................................................6—93
7. Data Validation and Usability........................................................................7—1
7.1 Data Review, Verification, and Validation.............................................7—1
7.2 Verification and Validation Methods.....................................................7—1
7.2.1 XRF Validation.......................................................................7—3
7.2.2 PIXE/PESA Verification..........................................................7—4
7.2.3 Ion Chromatography Verification.............................................7—5
7.2.4 Thermal Optical Reflection Verification....................................7—5
7.2.5 Elemental Data Verification....................................................7—11
7.2.6 Flow Rate Data Validation.....................................................7—16
7.2.7 Species Comparisons Between Modules................................7—17
7.2.8 Regional Data Review ...........................................................7—21
7.2.9 Final Data Review and Validation..........................................7—23
7.2.10 Data Validation Flags............................................................7—23
7.3 Reconciliation with User Requirements...............................................7—24
8. Appendix.....................................................................................................8—1
8.1 IMPROVE Sampler Technical Properties ............................................8—1
3. DISTRIBUTION LIST
Crocker Nuclear Laboratory (CNL)
Robert Flocchini, Financial Manager and CNL Director
Lowell Ashbaugh, Program Manager
Robert Eldred, Quality Assurance Manager
Patrick Feeney, Network Operations Manager
Steven Ixquiac, Laboratory Manager
Peter Beveridge, Field Manager
Research Triangle Institute (RTI)
Eva Hardison, Laboratory Manager
Desert Research Institute (DRI)
Judith Chow, Laboratory Manager
Environmental Protection Agency (EPA)
Marc Pitchford, EPA Project Officer
Dennis Mikel, EPA Quality Assurance Manager
National Park Service (Contract Agency)
Mark Scruggs, Branch Chief, Research and Monitoring, Air Resources Division
4. PROJECT MANAGEMENT
4.1 Project/Task Organization
4.1.1 Roles and Responsibilities
This section identifies individuals and organizations working on the IMPROVE
(Interagency Monitoring of Protected Visual Environments) project and discusses their
specific responsibilities. The organization for the IMPROVE Aerosol Monitoring
Network is shown in Figure 1. Three distinct activity groups (field, laboratory, and
data compilation and reporting) exist within the Program. The
chart shows the quality control (QC) and management organization; the gray boxes
within the chart represent external Quality Assurance (QA) organizations.
The responsibilities for the participating organizations are summarized in Table 1.
Table 1. IMPROVE Project Organization: External Responsibilities.
Organization    Responsibilities
EPA             Provide external quality assurance, provide site operators.
FWS             Provide site operators.
NPS             Administer contracts, provide site operators.
USFS            Provide site operators.
States, Tribes  Provide site operators.
CNL             Provide field support, process filters, ship quartz filters to DRI and nylon filters to RTI, provide analysis of Teflon filters for mass, elemental concentrations, and optical absorption, process filter data from all three contractors, perform data validation through Level 2, provide final data to CIRA.
DRI             Provide analysis of quartz filters for carbon.
RTI             Provide analysis of nylon filters for ions.
CIRA            Supervise additional Level 2 validation, maintain website, report data.
Agencies: Environmental Protection Agency (EPA), US Fish and Wildlife Service
(FWS), National Park Service (NPS), US Forest Service (USFS), Crocker Nuclear
Laboratory, University of California, Davis (CNL), Desert Research Institute (DRI),
Research Triangle Institute (RTI).
[Figure 1 shows the IMPROVE organizational chart: the IMPROVE Steering Committee; the OAQPS Quality Assurance Coordinator (D. Mikel) with the Monitoring and Quality Assurance Group (MSR or TSA, as required); National Park Service contract administration and technical oversight (M. Scruggs); field QA assessments by ORIA R&IE (E. Braganza) and laboratory QA by ORIA NAREL (M. Clark); field activities at Crocker Nuclear Lab for sampler installation, calibration, and maintenance (P. Beveridge) and for field coordination, filter compilation, shipping, and receiving (S. Ixquiac); site operators provided by the National Park Service, US Forest Service (R. Fisher), US Fish & Wildlife Service (S. Silva), EPA, and state agencies and tribal governments; laboratory activities for mass and elemental analysis at Crocker Nuclear Lab (B. Perley), ion analysis at Research Triangle Institute (E. Hardison), and carbon analysis at Desert Research Institute (J. Chow); and data compilation, analysis, and reporting, with filter data compilation and Level II data validation at Crocker Nuclear Lab (P. Feeney), Level II data validation, data reporting, and website hosting at Colorado State University CIRA (D. Fox), and data analysis at the National Park Service (W. Malm).]

Figure 1. IMPROVE organizational chart.
Below are descriptions of each organization.
4.1.1.1 The IMPROVE Steering Committee
Determines the policy and supervises the network. The committee has four voting
members representing the Federal Land Managers (one for each FLM), four voting
members representing the states, and one voting member representing the EPA. These
representatives provide oversight to the entire program and meet at least annually to
discuss any issues that concern the program. In addition, they will interact with the
Quality Assurance Coordinator (QAC) and the organizations involved in field,
laboratory, analysis, and reporting activities. The voting members of the committee
are representatives from:
• U.S. EPA
• National Park Service
• Bureau of Land Management
• U.S. Forest Service
• U.S. Fish and Wildlife Service
• NESCAUM (Northeast States for Coordinated Air Use Management)
• STAPPA (State and Territorial Air Pollution Program Administrators)
• WESTAR (Western States Air Resources Council)
• MARAMA (Mid-Atlantic Regional Air Management Association)
There is currently one associate member, the State of Arizona.
The chairman of the IMPROVE Steering Committee is the EPA Project Officer. He
provides final approval of the quality assurance (QA) and quality control (QC)
measures outlined in this document; reviews and approves the Quality Assurance
Project Plan (QAPP) and subsequent revisions; and ensures that the QAPP is
implemented.
4.1.1.2 OAQPS QA Coordinator
At the top of the QA structure is the OAQPS QAC. It is the QAC's responsibility to
ensure that QA is implemented throughout the program and to oversee the work
performed by the following QA organizations:
• Office of Air Quality Planning and Standards (OAQPS), Raleigh, NC.
  - Quality Assurance Coordinator (QAC)
  - Monitoring and Quality Assurance Group (MQAG)
• Office of Radiation and Indoor Air (ORIA).
  - National Air and Radiation Environmental Laboratory (NAREL), Montgomery, AL.
  - Radiation and Indoor Air Environments National Laboratory (R&IE), Las Vegas, NV.
The QAC will meet periodically with NAREL and R&IE, the two supporting agencies,
and the IMPROVE Steering Committee to discuss QA issues as they arise throughout the
program. Assessment reports will be given annually to the QAC by the supporting
agencies and other program participants as noted in Table 2-2. In addition to NAREL
and R&IE, the Monitoring and Quality Assurance Group (MQAG) may also perform a
management system review (MSR) or technical system audits (TSAs) on any of the
agencies in the QA or monitoring system.
The QAC’s primary responsibilities are to:
• Ensure that the methods and procedures used in making air pollution measurements are adequate to meet the program's objectives and that the resulting data are of satisfactory quality.
• Develop a Quality Management Plan.
• Periodically review the field and laboratory Quality Assurance Project Plan (QAPP).
• Evaluate the performance of organizations making chemical speciation measurements through mechanisms such as technical systems audits, performance evaluations, and management systems reviews.
4.1.1.3 National Air and Radiation Environmental Laboratory
NAREL in Montgomery, Alabama, is one of two ORIA laboratories that will have a
major role in the quality assurance system for the speciation program. NAREL will fill a
specific and vital role in the laboratory quality assurance aspects of the program. These
duties are briefly listed below; details are presented in the NAREL QA Management
Plan (NAREL, date):
• Review SOPs and QAPP.
• Perform Performance Evaluation (PE) Round Robin.
• Perform on-site laboratory Technical System Audits (TSAs).
• Forward audit report to OAQPS.
• Maintain a repository of PE samples that could be used as double-blind PEs by states, diagnostic tools by laboratories, etc.
4.1.1.4 Radiation and Indoor Air Environments National Laboratory
R&IE is the other ORIA laboratory that will have a major role in the quality assurance
system for the speciation program. The Las Vegas, Nevada, office will fill specific and
vital roles in the field quality assurance aspects of the program. These are briefly listed
below:
• Provide the major portion of the field QA for this program, consisting of field TSAs; site visits; and flow, temperature, and pressure sensor checks in the field.
• Provide observations of ongoing work to document conformance with specified field SOPs and QAPPs.
4.1.1.5 National Park Service
The National Park Service (NPS) is the key operational agency of the IMPROVE
Program. The agency is responsible for implementing the technical direction of the
Steering Committee; operating a majority of IMPROVE sites; issuing and administering
all IMPROVE contracts; performing final QA on all data; performing data analyses; and
distributing the data, analysis results, and project information through the IMPROVE
Web site.
The responsibilities of the NPS include:
• Participate in the IMPROVE Steering Committee.
• Issue and administer the following IMPROVE support contracts:
  - Prime Contractor: Operational support, mass and elemental speciation (current contract to Crocker Nuclear Laboratory - CNL).
  - Ion filter analysis contract (current contract to Research Triangle Institute - RTI).
  - Carbon filter analysis contract (current contract to Desert Research Institute - DRI).
  - Data quality assurance, analysis, and reporting contract (current contract to the Cooperative Institute for Research in the Atmosphere (CIRA), Colorado State University).
• Provide technical oversight to all aspects of the program in response to the IMPROVE Steering Committee.
• Perform detailed data analyses including the preparation of scientific papers and presentations.
• Operate samplers at NPS sites.
  - Receive and store sampler shipping box.
  - Perform weekly sample change.
  - Return sampler shipping box to CNL.
  - Consult with CNL concerning problems.
  - Perform calibration and maintenance as directed by CNL.
4.1.1.6 U.S. Forest Service
The U.S. Forest Service (USFS) is a member of the IMPROVE Steering Committee
and serves as one of the agencies that will perform the field work for the program. The
field operators at USFS sites will be USFS employees or contractors who will:
• Operate samplers at USFS sites.
- Receive and store sampler shipping box.
- Perform weekly sample change.
- Return sampler shipping box to CNL.
- Consult with CNL concerning problems.
- Perform calibration and maintenance as directed by CNL.
4.1.1.7 U.S. Fish and Wildlife Service
The U.S. Fish and Wildlife Service (USFWS) is a member of the IMPROVE Steering
Committee and serves as one of the agencies that will perform the field work for the
program. The field operators at USFWS sites will be USFWS employees or
contractors who will:
• Operate samplers at USFWS sites.
- Receive and store sampler shipping box.
- Perform weekly sample change.
- Return sampler shipping box to CNL.
- Consult with CNL concerning problems.
- Perform calibration and maintenance as directed by CNL.
4.1.1.8 EPA-OAQPS
The U.S. Environmental Protection Agency (EPA) is a member of the IMPROVE
Steering Committee. The QA responsibilities are discussed in sections 4.1.1.2, 4.1.1.3,
and 4.1.1.4. The EPA is also responsible for providing field operators at eight former
CASTNet (speciation) sites. They will provide contractors who will:
• Operate samplers at former CASTNet sites.
- Receive and store sampler shipping box.
- Perform weekly sample change.
- Return sampler shipping box to CNL.
- Consult with CNL concerning problems.
- Perform calibration and maintenance as directed by CNL.
4.1.1.9 State Agencies and Tribal Governments
State agencies and tribal governments perform the field work for the state and tribal
sites. The field operators will be state or tribal government employees, or contractors,
who will:
• Operate samplers at state and tribal sites.
- Receive and store sampler shipping box.
- Perform weekly sample change.
- Return sampler shipping box to CNL.
- Consult with CNL concerning problems.
- Perform calibration and maintenance as directed by CNL.
4.1.1.10 Coordinating and Elemental Contractor
Crocker Nuclear Laboratory (CNL) is the coordinating laboratory for all field
operations and speciation laboratory work. CNL will coordinate filter pre-sampling,
shipping, sampling, and post-sampling activities. CNL will perform gravimetric and
elemental analysis on the Module A and D Teflon filters. Using the information from the
contract laboratories and from its own analytical laboratory, CNL will calculate
concentrations and uncertainties for all reported parameters (a simplified illustration of
this calculation is sketched after the following list). Some of the QA activities of CNL
are briefly listed below:
• Track and keep records of all samples as they move through the program.
• Participate in all TSAs and MSRs.
• Analyze the PEs when received from the NAREL laboratory.
• Perform gravimetric, absorption, and elemental analysis on all sample filters and blanks.
• Coordinate with the other contract laboratories (DRI and RTI) and ensure that good laboratory practices and QA procedures are followed.
• Maintain adequate internal documentation and quality control.
• Perform Level 0, Level 1, and some Level 2 validation of the data.
• Perform precision and bias analyses on collected data.
• Perform annual calibrations, adjustments, and major repairs of the field samplers.
• Install instrumentation at new monitoring sites.
• Coordinate the manufacturing of IMPROVE samplers.
• Perform scientific analyses of IMPROVE data as directed by the NPS, and prepare scientific papers and presentations.
• Perform research as directed by the NPS.
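The concentration calculation mentioned above combines each laboratory's reported filter loading (micrograms per filter) with the air volume sampled by the corresponding IMPROVE module; the governing equations, including the artifact and uncertainty terms, are given in Section 5.8. The sketch below is only an illustration of that idea, not the IMPROVE production algorithm; the function name, nominal flow rate, and artifact value are assumptions chosen for the example.

```python
# Minimal illustrative sketch (not the IMPROVE production code) of converting
# a reported filter loading to an ambient concentration. Section 5.8 of this
# QAPP gives the actual equations for concentration, artifact, and uncertainty.

def concentration_ng_m3(loading_ug, artifact_ug, flow_l_min, duration_min):
    """Ambient concentration (ng/m3) from a filter loading (ug/filter).

    loading_ug   : analyte mass reported for the sample filter, ug
    artifact_ug  : artifact correction (e.g., median field blank loading), ug
    flow_l_min   : sampler flow rate, L/min (a nominal ~23 L/min is assumed here)
    duration_min : sample duration, min (a 24-hour sample is 1440 min)
    """
    volume_m3 = flow_l_min * duration_min / 1000.0           # liters to cubic meters
    return (loading_ug - artifact_ug) * 1000.0 / volume_m3   # micrograms to nanograms

# Example: 10.0 ug of sulfate on the filter, 0.2 ug artifact correction,
# 22.8 L/min for a 24-hour sample -> roughly 298 ng/m3.
print(round(concentration_ng_m3(10.0, 0.2, 22.8, 1440.0), 1))
```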
4.1.1.11 Research Triangle Institute (RTI)
Under contract to the NPS, RTI will perform ion chromatography on all IMPROVE
Module B nylon filters. Specifically, RTI will:
• Receive all Module B nylon filters and associated files with sample identification information.
• Perform ion chromatography on all sample filters and blanks.
• Report all results to CNL as micrograms per filter.
• Maintain good laboratory practices, internal documentation, and quality control.
4.1.1.12 Desert Research Institute (DRI)
Under contract to the NPS, DRI will analyze all IMPROVE Module C quartz filters for
carbon. Specifically, DRI will:
• Prefire all Module C quartz filters and forward them to CNL.
• Receive all Module C filters and associated files with sample identification information.
• Perform carbon fraction analyses on all sample filters and blanks.
• Report all results to CNL as micrograms per filter.
• Maintain good laboratory practices, internal documentation, and quality control.
• Perform scientific analyses of IMPROVE data as directed by the NPS, and prepare scientific papers and presentations.
4.1.1.13 Cooperative Institute for Research in the Atmosphere (CIRA)
Under contract to the NPS, the Cooperative Institute for Research in the Atmosphere
at Colorado State University performs the following data management, analyses, and
reporting functions for IMPROVE:
• Receive Level-2 validated data from CNL and perform additional and independent Level-2 data validation.
• Maintain all IMPROVE data, reports, and program documentation on the IMPROVE Web site.
• Develop, maintain, and host the IMPROVE Web site.
• Perform scientific analyses of IMPROVE data as directed by the NPS, and prepare scientific papers and presentations, including video and CD-ROM presentations.
• Perform visibility research as directed by the NPS.
4.1.1.14 Organizational Structure of Crocker Nuclear Laboratory
The organizational chart for CNL is provided in Figure 2. The responsibilities for the
various CNL positions are defined in Table 2.
[Figure 2 shows the CNL organizational chart: the CNL Director and Financial Manager (Robert Flocchini), supported by Fiscal Support (Deborah Cecil) and the Computer Support Manager (Danny Shadoan); the Program Manager (Lowell Ashbaugh); the Network Operations Manager (Patrick Feeney) and the Quality Assurance Manager (Robert Eldred); and staff responsible for sampler QA/QC, maintenance, and tests (Peter Beveridge); HIPS, XRF, and PIXE/PESA analyses (Brian Perley); data processing, data validation, and database management (Paul Wakabayashi); sample collection, sample handling, gravimetric mass, and laboratory tests (Steven Ixquiac); and quality assurance, quality control, and SOP documentation (Lee Portnoff).]

Figure 2. CNL Organizational Chart for IMPROVE.
Table 2. IMPROVE Project Organization.
Role: Financial Manager and CNL Director
Responsibilities:
• Determining that the research program adheres to its budget.
• Determining that the program interacts with other CNL programs properly.
• Overseeing personnel performance reviews.
• Representing CNL in any fiscal inquiries.
Role: Project Manager
Responsibilities:
• Determining that all program objectives and contractual requirements are being met on schedule and within budget.
• Preparing cost and budget analysis of the program.
• Preparing reviews, work plans, and revisions to work plans in accordance with contract requirements.
• Overseeing program reviews, approving program work plans, and approving revisions to work plans in accordance with contract and COTR requirements.
• Representing CNL in any technical inquiries.
• Serving as point of contact with the COTR and IMPROVE committee chairman.
• Reviewing all quality assurance procedures with the QA Manager and ensuring overall compliance.
• Acting as Principal Investigator for research aspects of the program.
• Preparing the annual report.
• Preparing other reports as needed.

Role: Quality Assurance Manager
Responsibilities:
• Reviewing the quality assurance procedures of all aspects of the program, in cooperation with the Program Manager.
• Verifying that all quality assurance procedures are being met.
• Reviewing quality assurance documentation by staff members.
• Supervising all quality assurance/quality control studies, including those conducted at the Davis field station.
• Validating the analytical results of gravimetric, HIPS, XRF, PIXE, and PESA.
• Validating the final data set.
• Conducting annual internal audits of CNL operations.
• Preparing the quality assurance annual report.
• Preparing other reports as needed.
• Performing management tasks as assigned by the Program Manager.
Role: Network Operations Manager
Responsibilities:
• Monitoring program activities and advising the Program Manager on all progress, needs, and problems, including technical, budgetary, and staffing considerations.
• Evaluating staff performance, in cooperation with the Program Manager.
• Periodically reviewing field operations, sample handling, sample analysis, and sampler testing with the various managers.
• Coordinating with the QA Manager on standard operating procedures and special tests.
• Coordinating work with the external contractors for ions and carbon.

Role: Laboratory Manager
Responsibilities:
• Directing the activities of the workers in the sample handling laboratory.
• Coordinating with Site Operators and responding to sampler problems.
• Maintaining a smooth flow of filters through the laboratory.
• Purchasing filter supplies.
• Overseeing gravimetric measurements and maintaining the quality assurance records.
• Coordinating with external contractors on the transport of filters.
• Reviewing the quality assurance procedures with the Operations and QA Managers.

Role: Computer Support Manager
Responsibilities:
• Maintaining the CNL computer network and all computer workstations.
• Advising CNL personnel on matters concerning computer usage.
• Coordinating with the Network Operations Manager to optimize the sample handling program.
Role: Analytical Manager
Responsibilities:
• Preparing the instruction files for the analyses.
• Operating the HIPS system and providing the QA data to the QA Manager.
• Operating the XRF system and providing the QA data to the QA Manager.
• Scheduling the PIXE/PESA analytical sessions and overseeing the operation.
• Setting up the PIXE/PESA system and presenting the QA data to the QA Manager.
• Maintaining records of quality assurance procedures as advised by the QA Manager.

Role: Data Processing Manager
Responsibilities:
• Supervising the entry of the field data in the database.
• Calculating artifact values and analytical precisions.
• Calculating concentrations, uncertainties, and minimum detectable limits.
• Performing data validation procedures.
• Maintaining the database.
• Preparing seasonal summaries.
• Communicating with the IMPROVE website webmaster.
• Maintaining records of quality assurance procedures as advised by the QA Manager.

Role: Field Manager
Responsibilities:
• Coordinating the annual site visits.
• Maintaining records of all field equipment.
• Maintaining records of site descriptions.
• Maintaining all samplers in storage at Davis.
• Overseeing all tests at the Davis field station.
• Overseeing fabrication of new samplers.

Role: Documentation Specialist
Responsibilities:
• Maintaining the Standard Operating Procedures (SOPs).
• Preparing and maintaining the QAPP.
• Coordinating printed and HTML documentation for the particulate monitoring program.
• Assisting in preparing quarterly and annual data documents.
4.1.2 Key QA Personnel
4.1.2.1 Steering Committee Chairman - Dr. Marc Pitchford
Dr. Pitchford will serve as the chairman of the IMPROVE Steering Committee. He will
schedule Steering Committee meetings, call them to order, facilitate the meetings, and
distribute the minutes. He will coordinate with all project participants to ensure that the
program develops and maintains an adequate quality system.
4.1.2.2 EPA Quality Assurance Coordinator – Mr. Dennis Mikel
Mr. Mikel of EPA will oversee the quality assurance aspects of the IMPROVE
Program. He will be designated as the QAC and will:
• Coordinate the input to the EPA Quality Assurance Annual Report (QAAR).
• Ensure that updated QAPPs are in place for all environmental data operations associated with the program.
• Ensure that technical systems audits (TSAs), management system reviews (MSRs), audits of data quality, and data quality assessments occur on the appropriate schedule, and conduct or participate in these audits.
• Coordinate the ORIA QA activities.
The QAC has the authority to carry out these responsibilities and to bring to the
attention of the Steering Committee any issues related to these responsibilities.
4.1.2.3 National Park Service – Dr. William Malm
As the IMPROVE administrative agency, the NPS will be responsible for
ensuring that the direction of the IMPROVE Steering Committee is carried out. The
NPS activities are directed by Dr. William Malm. NPS responsibilities include:
• Prepare and maintain the QAPP through NPS IMPROVE contractors.
• Coordinate and review all QA procedures and documentation for all aspects of the program.
• Verify all QA procedures are met by the NPS/IMPROVE contractors.
• Plan and review QA/QC studies.
4.1.2.4 Crocker Nuclear Laboratory – Quality Assurance Manager – Dr. Robert
Eldred
As directed by the NPS and CNL program manager, Dr. Eldred of CNL is
responsible for the QA aspect of the field and laboratory programs and will:
• Review and maintain the quality assurance procedures for the field and laboratory aspects of the program, in cooperation with the CNL program manager and NPS.
• Verify that all field and laboratory QA procedures are met.
• Review QA documentation by staff members.
• Supervise field-related QA/QC studies, including those conducted at the field station.
• Validate the analytical results of gravimetric, HIPS, X-Ray Fluorescence (XRF), Particle Induced X-ray Emission (PIXE), and Proton Elastic Scattering Analysis (PESA).
• Validate the aerosol data set to Level-1.
• Conduct annual internal audits of CNL operations.
• Prepare the CNL QA Annual Report (QAAR) and forward the electronically formatted report to the OAQPS QAC.
• Prepare other reports as needed.
4.1.2.5 NAREL – Montgomery, Alabama Team Leader – Mr. Michael Clark
Mr. Clark of NAREL is responsible for overseeing laboratory QA activities of the
IMPROVE Program and will:
• Implement and oversee the laboratory IMPROVE QA policy within the team.
• Oversee analysis of the QA samples.
• Report the QA data to the OAQPS QAC.
• Oversee the preparation of QA samples for the contract laboratories (CNL, DRI, and RTI).
• Lead the MSRs and TSAs on the contract laboratories.
4.1.2.6 R&IE – Las Vegas, Nevada, Branch Manager – Mr. Emilio Braganza
Las Vegas personnel will oversee the field QA activities. Mr. Braganza, the R&IE
branch manager, will:
• Participate in the development of data quality requirements (field) with the appropriate QA staff.
• Perform field TSAs and performance audits.
• Verify that all required field activities are performed and ensure that measurement quality standards are met as required in the QAPP.
• Send a QA report to the OAQPS QAC for inclusion in the EPA Quality Assurance Annual Report (QAAR).
4.1.2.7 Research Triangle Institute – E. Hardison
As directed by the NPS, RTI is responsible for the laboratory QA aspects of Module B
filter analyses, including:
• Prepare and maintain all ion chromatography laboratory SOPs.
• Verify that all laboratory QA procedures are met.
• Review QA documentation by staff members.
• Validate to Level-0 all Module B ion analysis data.
• Prepare the ion component of the QA Annual Report (QAAR) and forward the electronically formatted section to CNL.
4.1.2.8 Desert Research Institute – Dr. Judith Chow
As directed by the NPS, DRI is responsible for the laboratory QA aspects of Module
C filter analyses, including:
• Prepare and maintain all carbon laboratory SOPs.
• Verify that all laboratory QA procedures are met.
• Review QA documentation by staff members.
• Validate to Level-0 all Module C carbon analysis data.
• Prepare the carbon component of the QA Annual Report (QAAR) and forward the electronically formatted section to CNL.
4.1.2.9 Cooperative Institute for Research in the Atmosphere – Dr. Douglas Fox
As directed by the NPS, CIRA is responsible for the QA aspects of the validation,
maintenance, and distribution of the final data set and overall program documentation,
including:
• Prepare and maintain SOPs for Level-2 validation.
• Prepare and maintain SOPs for data, reporting, and documentation distribution.
• Verify that all data and documentation posted on the IMPROVE Web site meet all QA standards.
• Prepare QA documentation for the QA Annual Report (QAAR).

4.2 Problem Definition / Background
Section 169A of the 1977 amendments to the Clean Air Act (CAA) established as a
national goal the “prevention of any future, and the remedying of any existing,
impairment of visibility in mandatory Class I Federal Wilderness Areas, which
impairment results from manmade air pollution.” Mandatory Class I Federal Wilderness
Areas are national parks greater in size than 6000 acres, international parks, and
wilderness areas greater in size than 5000 acres, all of which were in existence on
August 7, 1977. There are 156 such areas. The responsibility of protecting the air
quality at these areas was given to the Federal Land Managers (FLMs): the National
Park Service (NPS), the US Forest Service (USFS), the US Fish and Wildlife Service
(FWS), and the Bureau of Land Management (BLM).
The NPS Visibility Monitoring Program started in 1978 without particulate
measurements. A program administered by the Las Vegas office of the EPA began
monitoring particulate concentrations in 1979 at several Class I NPS areas in the Rocky
Mountain region. CNL was the contractor for this program. The protocols for remote
area sampling were developed at this time. Using the samplers and protocols from the
EPA network, the NPS Visibility Monitoring Program added a particulate component in
1981.
In 1985, the EPA established Federal Implementation Plans for states without approved
visibility provisions in the State Implementation Plan. To assist states in meeting CAA
objectives, in 1987, Federal Land Managers joined with the EPA in a collaborative
monitoring program called IMPROVE. The IMPROVE committee originally consisted
of representatives of the four federal land managers and the EPA. At a later time,
representatives for four regional-state agencies were added.
The 1990 amendments to the CAA reaffirm the importance of visibility protection.
Section 169B includes provisions for the EPA to conduct visibility research with the
National Park Service and other federal agencies, to develop an interim findings report
on the visibility research, to develop a Report to Congress on expected visibility
improvements due to implementation of other air pollution programs, and to provide
periodic reports to Congress on trends in visibility improvements.
In 1991, three organizations were formally added to IMPROVE: the State and
Territorial Air Pollution Program Administrators, the Western States Air Resources
Council, and the Northeast States for Coordinated Air Use Management. A fourth
state organization, the Mid-Atlantic Regional Air Management Association, was added
later. Ten sites in the Eastern US were added to the IMPROVE network in 1991.
In 1997, the EPA published proposed amendments to the 1990 regulations (62 FR
41138) to set forth a program to address regional haze visibility impairment. EPA also
established a secondary National Ambient Air Quality Standard (NAAQS) for
particulates with an aerodynamic diameter less than or equal to a nominal 2.5
micrometers (PM2.5) as part of a final decision on revision of the existing NAAQS for
particulate matter under Section 109(d) of the CAA. IMPROVE sites began PM2.5
measurements a decade earlier.
In 1999 and 2000, the total number of sites in the IMPROVE network increased to
143. Of these, 110 are under the supervision of the IMPROVE committee. These
represent 155 of the 156 mandatory Class I Wilderness Areas. The remaining 33 sites
are under the supervision of FLMs, states, and tribes. Almost all are at remote sites.
These 33 follow the same protocol as the IMPROVE sites and are denoted as
“Protocol sites.” In addition, the sampling frequency changed from WednesdaySaturday to 1-day-in-3. These changes required the design and fabrication of the
Version II IMPROVE sampler. The growth of the IMPROVE network is shown in
Figure 3. All sites were operated with IMPROVE samplers. Those listed as
IMPROVE are under the direct supervision of the IMPROVE steering committee.
Protocol sites use the same full sampler with all four modules and operate for the entire
year. Non-Protocol sites have either fewer than four modules or operate for less than
12 months each year. All non-Protocol sites were phased out in 2000. (An additional
non-Protocol site at Mauna Loa Observatory is maintained for the NPS on the same
contract, but is not considered part of the IMPROVE network.)
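The 1-day-in-3 schedule referred to above means that every site samples on every third day of a common network-wide calendar. As a rough illustration only (the reference date below is a placeholder assumption, not the actual anchor date of the IMPROVE schedule, which is issued by the network), such a schedule can be generated as follows:

```python
from datetime import date, timedelta

# Hypothetical anchor date for the 1-day-in-3 cycle; the real IMPROVE schedule
# is distributed by the network and may use a different anchor date.
REFERENCE_DATE = date(2000, 1, 1)

def is_sample_day(d):
    """Return True if date d falls on the every-third-day sampling schedule."""
    return (d - REFERENCE_DATE).days % 3 == 0

# Example: sampling days during the first week of the (assumed) cycle.
week = [REFERENCE_DATE + timedelta(days=i) for i in range(7)]
print([d.isoformat() for d in week if is_sample_day(d)])
# -> ['2000-01-01', '2000-01-04', '2000-01-07']
```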
A map of the IMPROVE network in 2002 is shown in Figure 4. Sites 1-110 are the
IMPROVE sites representing the Class I Wilderness Areas. Sites 111-163 are
IMPROVE Protocol sites. Sites 145-163 are scheduled for installation in early 2002.
Six sites are involved in an intercomparison with Speciation Trends Network samplers.
In the East, these are Washington DC (#119) and Dolly Sods Wilderness (#8). In the
Northwest, these are Seattle (#202) and Mount Rainier National Park (#80). In the
Southwest, these are Phoenix (#201) and Superstition Wilderness (#44). The Seattle
and Phoenix sites are for this intercomparison only. The other four are part of the
network.
[Figure 3 is a bar chart of the number of monitoring sites per year from 1988 ('88) to 2002 ('02), with the y-axis (number of sites) ranging from 0 to 180 and the sites divided into Class I Protocol, Other Protocol, and Non-Protocol categories.]

Figure 3. Monitoring Sites: Growth by Year for IMPROVE, Protocol, and non-Protocol sites using IMPROVE samplers from 1988 to 2002.
[Figure 4 is a map of the monitoring network with numbered site markers. Legend: IMPROVE Sites (1-110), Protocol Sites (111-163), QA Sites (201-202).]

Figure 4. Monitoring Sites: Current and Projected Sites for 2002.
OP indicates the agency providing the site operator.
n    site name                  IMPROVE: Represented Class I Areas                        ST   OP
1    Acadia                     Acadia                                                    ME   NPS
2    Moosehorn                  Moosehorn, Roosevelt Campobello                           ME   FWS
3    Lye Brook                  Lye Brook                                                 VT   FS
4    Great Gulf                 Great Gulf, Presidential Range-Dry River                  NH   FS
5    Brigantine                 Brigantine                                                NJ   FWS
6    Shenandoah                 Shenandoah                                                VA   NPS
7    James River Face           James River Face                                          VA   FS
8    Dolly Sods                 Dolly Sods, Otter Creek                                   WV   FS
9    Mammoth Cave               Mammoth Cave                                              KY   NPS
10   Great Smoky Mtns           Great Smoky Mtns, Joyce Kilmer-Slickrock                  TN   NPS
11   Shining Rock               Shining Rock                                              NC   FS
12   Cohutta                    Cohutta                                                   GA   FS
13   Linville Gorge             Linville Gorge                                            NC   FS
14   Swanquarter                Swanquarter                                               NC   FWS
15   Cape Romain                Cape Romain                                               SC   FWS
16   Okefenokee                 Okefenokee, Wolf Island                                   GA   FWS
17   St. Marks                  St. Marks                                                 FL   FWS
18   Chassahowitzka             Chassahowitzka                                            FL   FWS
19   Everglades                 Everglades                                                FL   NPS
20   Breton                     Breton                                                    LA   FWS
21   Sipsey                     Sipsey                                                    AL   FS
22   Seney                      Seney                                                     MI   FWS
23   Boundary Waters            Boundary Waters                                           MN   FS
24   Voyageurs                  Voyageurs                                                 MN   NPS
25   Isle Royale                Isle Royale                                               MI   NPS
26   Mingo                      Mingo                                                     MO   FWS
27   Upper Buffalo              Upper Buffalo                                             AR   FS
28   Hercules-Glades            Hercules-Glades                                           MO   FS
29   Caney Creek                Caney Creek                                               AR   FS
30   Wichita Mountains          Wichita Mountains                                         OK   FWS
31   Big Bend                   Big Bend                                                  TX   NPS
32   Guadalupe Mountains        Guadalupe Mountains, Carlsbad Caverns                     TX   NPS
33   Bandelier                  Bandelier                                                 NM   NPS
34   San Pedro Parks            San Pedro Parks                                           NM   FS
35   Wheeler Peak               Wheeler Peak, Pecos                                       NM   FS
36   Salt Creek                 Salt Creek                                                NM   FWS
37   White Mountain             White Mountain                                            NM   FS
38   Bosque del Apache          Bosque del Apache                                         NM   FWS
39   Chiricahua                 Chiricahua NM, Chiricahua W, Galiuro                      AZ   NPS
40   Saguaro                    Saguaro                                                   AZ   NPS
41   Petrified Forest           Petrified Forest                                          AZ   NPS
n    site name                  IMPROVE: Represented Class I Areas                        ST   OP
42   Gila                       Gila                                                      NM   FS
43   Mount Baldy                Mount Baldy                                               AZ   FS
44   Tonto                      Superstition                                              AZ   FS
45   Sierra Ancha               Sierra Ancha                                              AZ   FS
46   Ike's Backbone             Mazatzal, Pine Mountain                                   AZ   FS
47   Sycamore Canyon            Sycamore Canyon                                           AZ   FS
48   Grand Canyon, Hance        Grand Canyon                                              AZ   NPS
49   Bryce Canyon               Bryce Canyon                                              UT   NPS
50   Canyonlands                Canyonlands, Arches                                       UT   NPS
51   Zion                       Zion                                                      UT   NPS
52   Capitol Reef               Capitol Reef                                              UT   NPS
53   Great Sand Dunes           Great Sand Dunes                                          CO   NPS
54   Mesa Verde                 Mesa Verde                                                CO   NPS
55   Weminuche                  Weminuche, La Garita, Black Canyon of Gunnison            CO   FS
56   White River                Maroon Bells, West Elk, Eagles Nest, Flat Tops            CO   FS
57   Rocky Mountain             Rocky Mountain                                            CO   NPS
58   Mount Zirkel               Mount Zirkel, Rawah                                       CO   FS
59   Badlands                   Badlands                                                  SD   NPS
60   Wind Cave                  Wind Cave                                                 SD   NPS
61   Theodore Roosevelt         Theodore Roosevelt                                        ND   NPS
62   Lostwood                   Lostwood                                                  ND   FWS
63   Medicine Lake              Medicine Lake                                             MT   FWS
64   UL Bend                    UL Bend                                                   MT   FWS
65   Bridger                    Bridger, Fitzpatrick                                      WY   FS
66   Yellowstone                Yellowstone, Grand Teton, Teton, Red Rock Lakes           WY   NPS
67   North Absaroka             North Absaroka, Washakie                                  WY   FS
68   Jarbidge                   Jarbidge                                                  NV   FS
69   Craters of the Moon        Craters of the Moon                                       ID   NPS
70   Sawtooth                   Sawtooth                                                  ID   FS
71   Sula                       Anaconda-Pintler, Selway-Bitterroot                       MT   FS
72   Glacier                    Glacier                                                   MT   NPS
73   Monture                    Bob Marshall, Mission Mountains, Scapegoat                MT   FS
74   Gates of the Mountains     Gates of the Mountains                                    MT   FS
75   Cabinet Mountains          Cabinet Mountains                                         MT   FS
76   Starkey                    Eagle Cap, Strawberry Mountain                            OR   FS
77   Hells Canyon               Hells Canyon                                              OR   FS
78   Mount Rainier              Mount Rainier                                             WA   NPS
79   White Pass                 Goat Rock, Mt Adams                                       WA   FS
80   Snoqualmie Pass            Alpine Lakes                                              WA   FS
81   North Cascades             North Cascades, Glacier Peak                              WA   NPS
IMPROVE QAPP
Project Management
Section—Page: 4—23 of 52
Revision: 0.0
Date: March 2002
82
Pasayten
Pasayten
WA
FS
n
83
84
site name
Olympic
Three Sisters
ST
WA
OR
OP
NPS
FS
85
86
Mount Hood
Crater Lake
OR
OR
FS
NPS
CA
CA
OR
CA
CA
CA
CA
CA
CA
CA
CA
CA
CA
CA
CA
AK
AK
CA
AK
VI
HI
HI
CA
CA
NPS
NPS
FS
NPS
NPS
NPS
FS
FS
FS
NPS
FS
NPS
FS
FS
NPS
NPS
FWS
FS
FWS
NPS
NPS
NPS
FS
FS
ST
AK
WA
WY
NV
AZ
CA
DC
AZ
87
88
89
90
91
92
93
94
95
96
97
98
99
100
101
102
103
104
105
106
107
108
109
110
Lava Beds
Redwood
Kalmiopsis
Lassen Volcanic
Point Reyes
Pinnacles
San Gabriel
San Rafael
Bliss
Yosemite
Hoover
Sequoia
San Gorgonio
Agua Tibia
Joshua Tree
Denali
Tuxedni
Trinity
Simeonof
Virgin Islands
Hawaii Volcanoes
Haleakala
Dome Land
Kaiser
IMPROVE: Represented Class I Areas
Olympic
Three Sisters, Mount Jefferson, Mount
Washington
Mount Hood
Crater Lake, Diamond Peak, Mountain Lakes,
Gearhart Mtn
Lava Beds, South Warner
Redwood
Kalmiopsis
Lassen Volcanic, Caribou, Thousand Lakes
Point Reyes
Pinnacles, Ventana
San Gabriel, Cucamonga
San Rafael
Desolation, Mokelumne
Yosemite, Emigrant
Hoover, John Muir
Sequoia, Kings Canyon
San Gorgonio, San Jacinto
Agua Tibia
Joshua Tree
Denali
Tuxedni
Marble Mountain, Yolla Bolly Middle Eel
Simeonof
Virgin Islands
Hawaii Volcanoes
Haleakala
Dome Land
Kaiser, Ansel Adams
n
111
112
114
116
117
118
119
120
site name
Trapper Creek-Denali
Columbia River Gorge
Brooklyn Lakes
Great Basin
Indian Gardens
Death Valley
Washington DC
Saguaro west
Agency for Protocol Site
National Park Service
US Forest Service
US Forest Service
National Park Service
National Park Service
National Park Service
National Park Service
State of Arizona
IMPROVE QAPP
Project Management
Section—Page: 4—24 of 52
Revision: 0.0
Date: March 2002
121
122
123
n
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
Queen Valley
Organ Pipe
Hillside
site name
Meadview
Presque Isle
Old Town
Proctor Research
Casco Bay
Bridgton
Cape Cod
Quabbin Reservoir
Martha’s Vineyard
Mohawk Mountain
Spokane Reservation
Connecticut Hill
MK Goddard
Arendtsville
Quaker City
Bondville
State of Arizona
State of Arizona
State of Arizona
Agency for Protocol Site
State of Arizona
Micmac tribe
Penobscot tribe
State of Vermont
State of Maine
State of Maine
State of Massachusetts
State of Massachusetts
Wampanoags tribe
State of Connecticut
Spokane tribe
EPA (former CASTNet)
EPA (former CASTNet)
EPA (former CASTNet)
EPA (former CASTNet)
EPA (former CASTNet)
AZ
AZ
AZ
ST
AZ
ME
ME
VT
ME
ME
MA
MA
MA
CT
WA
NY
PA
PA
OH
IL
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
Livonia
Cadiz
Sikes
Addison Pinnacle
Columbia Gorge
Blue Mounds
Great River Bluffs
Niobrara River
Omaha Reservation
North Platte
Nebraska NF
Viking Lake
Lake Sugema
Sac and Fox
Cedar Bluff
El Dorado Springs
Tallgrass
Cherokee
Ellis
Flathead
Fort Peck
Northern Cheyenne
Cloud Peak
EPA (former CASTNet)
EPA (former CASTNet)
EPA (former CASTNet)
State of New York
US Forest Service
State of Minnesota
State of Minnesota
State of Nebraska
Omaha Tribe
State of Nebraska
State of Nebraska
State of Iowa
State of Iowa
Sac and Fox Nation
State of Kansas
State of Missouri
State of Kansas
Cherokee Nation
State of Oklahoma
Confederated Salish and Kootenai Tribes
Fort Peck Tribes
Northern Cheyenne Tribe
State of Wyoming
IN
KY
LA
NY
WA
MN
MN
NE
NE
NE
NE
IA
IA
KS
KS
MO
KS
OK
OK
MT
MT
MT
WY
IMPROVE QAPP
Project Management
Section—Page: 4—25 of 52
Revision: 0.0
Date: March 2002
163 Thunder Basin
201 Phoenix
202 Seattle
4.3
State of Wyoming
EPA (Speciation Trends collocated)
EPA (Speciation Trends collocated)
WY
AZ
WA
Project/Task Description
The primary objective of the IMPROVE Visibility Monitoring Program is to monitor the
concentrations of particles affecting visibility in Class I Wilderness Areas. With the
Regional Haze Regulation, the specific species for this primary objective have been
defined as those used to calculate reconstructed extinction: sulfate (sulfate ion or
elemental sulfur), nitrate, soil, organic matter, light-absorbing (elemental) carbon, and
coarse mass. The last is the difference between PM10 mass and PM2.5 mass.
This primary objective affects only the IMPROVE sites at Class I Wilderness Areas
and only a portion of the species measured.
Secondary objectives include:
1. Monitoring the species used to calculate reconstructed extinction at the
Protocol sites not in Class I Areas, in order to understand the transport of
visibility-impairing species into Class I Areas. In many cases, this information
will be used by the states in developing SIPs.
2. Monitoring trace elements in order to understand possible sources of the
visibility-impairing species.
The original objectives of the particulate component of the Visibility Monitoring
Program as stated in 1987 are still appropriate. These were:1
1. To establish the background visibility levels necessary to assess impacts of
potential new sources through the development and operation of a long-term
background monitoring network,
2. To determine the sources and levels of reasonably attributable visibility
impairment through the design and implementation of special source attribution
studies, and to characterize the relationship between visibility impairment and
various atmospheric particulate constituents.
3. To collect data useful for assessing progress toward the national visibility goal,
and determine the existing sources of particles producing visibility impairment.
1
Joseph, D. B., J. Metsa, W.C. Malm, M. Pitchford, “Plans for IMPROVE: A federal
program to monitor visibility in Class I areas,” in Visibility Protection: Research and
Policy Aspects, P.S. Bhardwaja, ed, 1987.
IMPROVE QAPP
Project Management
Section—Page: 4—26 of 52
Revision: 0.0
Date: March 2002
4. To promote the development of improved visibility monitoring technology and
the collection of comparable visibility data by the evaluation of candidate
monitoring methods and the development of quality assurance, data
processing and documentation procedures.
The operation of the IMPROVE network is performed by three contractors. CNL has
been the prime contractor since the start of the program in 1987. Two additional
contractors perform part of the sample analysis. Research
Triangle Institute (RTI) is responsible for the analysis of the nylon filters and Desert
Research Institute (DRI) is responsible for the analysis of the quartz filters. CNL is
responsible for all other project activities. The quality assurance objectives of RTI and
DRI are included in this plan. In order to meet these objectives, the particulate
contractors for IMPROVE must perform the following tasks.
Tasks that CNL is responsible for include:
1. Designing and fabricating the samplers used in the network.
2. Supervising the selection of sites and installing the samplers.
3. Overseeing the field operation at all ambient monitoring stations by:
• Providing training and support to IMPROVE site operators,
• Maintaining flow rates, calibrations, site records, and sampler equipment
on annual site visits.
4. Acquiring and acceptance testing the Teflon and nylon filters:
• Running acceptance tests on the collection properties of each new batch
of Teflon and nylon filters,
• Performing a collocated test for each new batch of Teflon filters to
determine if there is an organic artifact,
• Performing elemental analysis on a set of filters from each new batch,
• Shipping a set of nylon filters from each new batch to RTI and analyzing
the results to verify that the artifact values are acceptable.
5. Handling all samples by:
• Loading filters into cassettes and cartridges, and shipping to site operators
along with the field log sheet,
• Receiving samples from site operators,
• Logging the collection parameters from memory card into the computer
data base and checking the data for consistency,
• If the collection information is not available on memory cards, logging the
collection parameters from the field log sheets into the computer data
base twice, and checking the data for consistency,
• Logging all comments from the field operator in the data base,
• Logging all sample-handling actions into the computer data base to track
actions at laboratory workstations, performing immediate quality
assurance checks, and verifying complete data entry,
• Distributing the filters to the appropriate analytical custodian: Module A
Teflon to the CNL Analytical Manager, Module B nylon to the Ions
Contractor, Module C quartz to the Carbon Contractor, and Module D
Teflon to the archives. All relevant information is sent electronically with
the samples to improve efficiency and maintain quality control.
6. Performing and/or supervising all filter analysis, by:
• Analyzing all Teflon filters for gravimetric mass,
• Analyzing a set of gravimetric controls at the beginning of every morning
and afternoon session,
• Analyzing the Module A Teflon filters by Hybrid Integrating Plate/Sphere
(HIPS), X-ray Fluorescence (XRF), Particle Induced X-ray Emission
(PIXE), and Proton Elastic Scattering (PESA).
• Reanalyzing a set of filters from the previous quarter at the beginning,
middle, and end of the analytical session,
• Reviewing all quality control and quality assurance data from the various
analytical methods: gravimetric analysis, HIPS, XRF, PIXE, and PESA
by CNL; Ion Chromatography (IC) by the Ions Contractor; and Thermal
Optical Reflection (TOR) by the Carbon Contractor.
7. Processing and validating data by:
• Relating samples to the flow calibration and field log databases
• Preparing a flow file database that includes sample id, start time, duration,
module flow rates, module volumes, temperature, calibration constants
dates, and status code
• Generating a mass database to compare pre- and post-sampling filter mass values, as
well as descriptive statistics for mass gains of control and field blank filters
• Compiling databases from external contractor analysis
• Monitoring analytical precision through calibration standards and replicate
analysis
• Performing Level I and Level II data validation
• Preparing quarterly data summaries
Tasks that RTI is responsible for include:
1. Analyzing the laboratory blanks for new nylon batches when sent by CNL.
2. Analyzing the samples sent by CNL.
3. Calibrating the analytical system and running quality assurance standards.
4. Performing replicate analyses as specified by their SOP's.
5. Sending the analytical results for samples and field blank to CNL.
Tasks that DRI is responsible for include:
1. Prefiring the quartz filters and shipping to CNL.
2. Analyzing laboratory blanks as specified by their SOP's.
3. Analyzing the samples sent by CNL.
4. Calibrating the analytical system and running quality assurance standards.
5. Performing replicate analyses as specified by their SOP's.
6. Sending the analytical results for samples, field blank, and secondary filters to
CNL.
The IMPROVE Standard Operating Procedures and Technical Instructions for
operations by CNL are listed in Table 3. RTI and DRI have separate SOP's for the
Ion Chromatography and Thermal Optical Reflectance analyses. The SOP's are
available on the IMPROVE web site at the following URL's:
CNL SOP's: http://www2.nature.nps.gov/ard/vis/sop/index.html
DRI SOP's: http://www2.nature.nps.gov/ard/vis/casop/ and
http://www2.nature.nps.gov/ard/vis/qffsop/.
Table 3. IMPROVE Standard Operating Procedures and Technical Instructions
SOP 101 Procurement and Acceptance Testing
TI 101A Filter Procurement and Acceptance Testing
TI 101B Sampler Construction and Testing
TI 101C Sampler Wiring Diagrams
TI 101D Filter Cassette Construction
SOP 126 Site Selection
SOP 151 Installation of Samplers.
TI 151A Installation of Controller Module
SOP 176 Calibration, Programming, and Site Documentation
TI 176A Calibration of Audit Devices
TI 176B Final Flow Rate Audit Calculations
TI 176C Flow Rate Audits and Adjustment
SOP 201 Sampler Maintenance by Site Operators
TI 201A IMPROVE Aerosol Sampler Operations Manual
TI 201B Forms for Flow Audits by Site Operators
SOP 226 Annual Site Maintenance
SOP 251 Sample Handling
SOP 276 Optical Absorption Analysis
SOP 301 X-Ray Fluorescence Analysis
TI 301A XRF Calibration
SOP 326 PIXE and PESA Analysis
TI 301A PIXE Calibration
SOP 351 Data Processing and Validation
SOP 376 Data Archiving and Reporting
4.4
The Scope of this Quality Assurance Plan
This Quality Assurance Plan is intended to cover the aerosol collection and analysis
program of IMPROVE. It includes the operation of the network by CNL, the analyses
of the Teflon filters by CNL, the analyses of the nylon filters by the ions contractor
(RTI), and the analyses of the quartz filters by the carbon contractor (DRI). It does not
include further quality assurance and maintenance of the IMPROVE web site by
another contractor to the NPS, CIRA. It also does not include the system audits to be
performed by EPA OAR.
This QAPP is written to address the objectives of the 110 IMPROVE sites operated
for the IMPROVE steering committee to monitor regional haze at Class I Wilderness
Areas. The objectives of the IMPROVE Protocol sites operated for FLM, states,
tribes, and EPA may differ from those of the IMPROVE sites and from each other.
Operationally, the IMPROVE Protocol sites are treated identically to the IMPROVE
sites.
4.5
Data Quality Objectives
The Primary Objective of IMPROVE is to track reconstructed extinction. The
coefficient of extinction in Mm-1 is defined as
b_{ext} = 3 f(RH)[\text{Sulfate}] + 3 f(RH)[\text{Nitrate}] + 4[\text{OMC}] + 10[\text{LAC}] + 1[\text{Soil}] + 0.6[\text{CM}] + 10    (Equation 1)
where [Sulfate] = (4.125)[S] or (1.375)[SO4=],
[Nitrate] = (1.29)[NO3-]
[OMC] = (1.4)[OC] {OC = sum of organic fractions}
[LAC] is elemental carbon = sum of elemental fractions
[Soil] = sum of Al, Si, K, Ca, Ti, Fe plus oxides
[CM] = [PM10] – [PM2.5] {coarse mass}
f(RH) = relative humidity factor.
The last term of 10 represents the contribution of Rayleigh scattering from air molecules.
The relative humidity factor f(RH) is specific for each site and month. It varies from 1 to
5, with an average value of 2.6. Generally the annual average is from 1.5 to 3 in most
of the west, 3.0 to 3.6 in the east, and 3.6 to 4.2 in most of Oregon and Washington
and a few other marine sites.
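As an illustration of Equation 1 and the species definitions above, the following Python sketch computes reconstructed extinction from assumed species concentrations (in µg/m3) and a hypothetical site/month f(RH); it is illustrative only, not the IMPROVE processing code.

    def reconstructed_extinction(sulfate, nitrate, omc, lac, soil, cm, f_rh):
        """Return bext in Mm-1 per Equation 1 (the final 10 is the Rayleigh term)."""
        return (3.0 * f_rh * sulfate + 3.0 * f_rh * nitrate + 4.0 * omc
                + 10.0 * lac + 1.0 * soil + 0.6 * cm + 10.0)

    # Hypothetical example: sulfate from elemental S, organic mass from OC
    sulfate = 4.125 * 0.5          # assumed S = 0.5 ug/m3
    omc = 1.4 * 1.2                # assumed OC = 1.2 ug/m3
    bext = reconstructed_extinction(sulfate, nitrate=0.4, omc=omc, lac=0.2,
                                    soil=0.8, cm=3.0, f_rh=2.6)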
The extinction in Mm-1 is converted to deciviews using the formula
dv = 10 \ln\left(\frac{b_{ext}}{10}\right)    (Equation 2)
The deciview extinction is calculated for each sample and ranked for the calendar year.
Annual averages are calculated for the 20% best days (lowest dv) and 20% worst days
(highest dv). The values are combined for specific five-year periods (2000-2004,
2005-2009, etc), by averaging the five annual values. The trends in the five-year values
will be used to measure progress for SIPs. The requirement is that neither the best
days nor the worst days may show increasing trends. The first SIP comparison will take
place in 2018.
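The deciview conversion and the best/worst-day grouping described above can be sketched as follows; this is a hedged illustration that uses a simple 20% cut rather than the exact regulatory procedure.

    import math

    def deciview(bext):
        # Equation 2: dv = 10 ln(bext / 10), with bext in Mm-1
        return 10.0 * math.log(bext / 10.0)

    def best_worst_annual_means(dv_values):
        # Rank the sampled days for a calendar year and average the 20% lowest
        # (best) and 20% highest (worst) deciview values.
        ranked = sorted(dv_values)
        n20 = max(1, round(0.2 * len(ranked)))
        best = sum(ranked[:n20]) / n20
        worst = sum(ranked[-n20:]) / n20
        return best, worst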
A 10-percent change in bext corresponds to a 1 deciview change. The primary Data
Quality Objective (DQO) for IMPROVE is to be able to measure a 5-percent change
in bext in 5 years. The effect of individual components on bext depends on the site.
Generally, in the eastern U.S., sulfate dominates bext. In most of the western U.S.,
sulfate, organics, and soil play major roles. In a few Class I Areas near west coast
urban areas, nitrate is important. Coarse mass can be important on a few samples per
year at some sites, but is generally not important. LAC (elemental carbon) is rarely
important. The DQO for IMPROVE will therefore also require the ability to measure a
5-percent change in five years in each of the major components: sulfate, organic
carbon, and soil.
4.6
Measurement Quality Objectives
The Measurement Quality Objectives (MQO's) should either reflect the DQO or be
sufficiently stringent so that the DQO will be easily met. The most important single
factor in calculating trends is consistency over several decades. It is essential that no
biases be introduced in either collection or analysis over this time period. One area that
may be a problem is consistency in filter media. Protocols are in place to monitor
any changes.
The usual factors in determining MQO's are precision, accuracy, minimum quantifiable
limit (MQL), completeness, representativeness, and comparability. The objectives of
precision, accuracy, and minimum quantifiable limit are summarized in Table 4.
Table 4. IMPROVE Measurement Quality Objectives of Precision, Accuracy, and
Minimum Quantifiable Limit.
Method | Parameters | Precision* | Accuracy | MQL
Gravimetric | Mass | ± 5 µg | ± 5 µg | 300 ng/m3
HIPS | Optical Absorption | ± 5% | ± 5% | 0.2 Mm-1
XRF | Elements Fe to Pb | ± 5% | ± 5% | 0.05 - 0.18 ng/m3
PIXE | Elements S to Mn | ± 5% | ± 5% | 1 - 4 ng/m3
PIXE | Element Na | ± 5% | ± 10% | 20 ng/m3
PESA | Elemental H | ± 5% | ± 5% | 4 ng/m3
IC | NO3, SO4, NH4 | ± 5% | ± 5% | 10 - 30 ng/m3
IC | NO2, Cl | ± 5% | ± 5% | 60 - 100 ng/m3
TOR | Organic Carbon | ± 5% | ± 5% | 250 ng/m3
TOR | Elemental Carbon | ± 10% | ± 5% | 100 ng/m3
* The precision objectives are for large concentrations. They exclude the statistical
uncertainty for XRF, PIXE, and PESA analyses and the uncertainty in artifact
subtraction for IC and TOR analyses. These factors are characterized by the MQL.
4.6.1
Precision Objectives
Precision is a measure of mutual agreement among individual measurements of the same
property. The overall precision will be determined by collocated sampler
intercomparisons. Collocated sampler intercomparisons involving pairs of
measurements can be used in three ways. The first way is to calculate the precision
directly for parameters where the concentrations are generally sufficiently large so that
counting statistics and uncertainties in blank subtraction are negligible. The second use
of collocated comparisons is verify that the uncertainties in the database adequately
characterize the measured differences. This will be used for trace elements and
variables with lower concentrations. Finally, the third use of collocated measurements is
to monitor the individual parameters used in the calculation of individual concentrations,
such as the volume uncertainty and the analytical calibration uncertainty.
The standard procedure is to calculate the precision for a given variable directly. The
uncertainty for a given concentration has components that are proportional to the
concentration (flow rate and analytical calibration uncertainties), and components that
are independent of concentration or depend on the square root of the concentration
(artifact and statistical uncertainties). The precision can be defined either as the absolute
precision or relative precision. The absolute precision equation requires that the
uncertainty be predominantly independent of concentration, which is never valid for
aerosol measurements. The relative precision equation requires that the uncertainty be
predominantly proportional to the concentration. Fortunately, there are many cases where this
is valid. The equation for the paired relative precision is given by:
P_{rel}^{2} = \frac{2}{n} \sum \frac{(x_i - y_i)^2}{(x_i + y_i)^2}    (Equation 3)
where x and y are the paired measurements.
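A direct implementation of Equation 3 for a set of collocated pairs might look like the following sketch (a hypothetical helper, not part of the CNL software):

    def paired_relative_precision(x, y):
        # Equation 3: P_rel^2 = (2/n) * sum((xi - yi)^2 / (xi + yi)^2)
        n = len(x)
        s = sum((xi - yi) ** 2 / (xi + yi) ** 2 for xi, yi in zip(x, y))
        return (2.0 * s / n) ** 0.5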
The collocated measurements should be used to estimate relative precision only when
the concentrations are sufficiently large so that the uncertainty in artifact subtraction or
the statistical uncertainty are negligible. Examples of where relative precision is
appropriate are: sulfate, nitrate, and sulfur at all except a few very clean sites, the major
soil elements at many sites, carbon at many sites, zinc and lead at many sites. The range
of validity can be extended to other sites and variables by selecting only those cases
where the concentration is large. For example, sodium, chlorine, and chloride can be
characterized if only samples with major marine influence are considered. The precision
calculation is rarely appropriate are for most trace elements. For example, the selenium
precision would be different for most Eastern sites than for most Western sites simply
because the concentrations are much larger. A precision calculation that depends on
the ambient concentrations is of little value in characterizing the system.
The second way to use collocated measurements is to verify the uncertainties calculated
for each measurement based on both collection and analysis. With this method, the
results are independent of the ambient concentrations. This method compares the
differences in concentrations with the predicted uncertainties in the two concentrations.
The appropriate statistic is the goodness-of-fit (chi-square) parameter, which compares
the measured differences with the calculated uncertainties of the concentrations. For a
comparison using two collocated samplers, the pairwise goodness-of-fit parameter, χ2,
is given by:
\chi^{2} = \frac{1}{n} \sum \frac{(x_i - y_i)^2}{\sigma_{x}^{2} + \sigma_{y}^{2}}    (Equation 4)
where x and y are the paired concentrations and σx and σy are the corresponding
calculated uncertainties. If the value of χ2 is near unity, then the calculated uncertainties
correctly predict the measured differences. Values significantly larger than unity for
identical samplers indicate that the calculated uncertainties are too small: either
the precision estimates in collection and analytical calibration are too small or there are
other sources of uncertainty that should be included in the calculation. Values
significantly smaller than unity indicate that the uncertainty estimates are too large. If
the value of χ2 differs significantly from unity, the parameters used to calculate individual uncertainties
should be re-evaluated.
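A sketch of the pairwise goodness-of-fit statistic of Equation 4, again as an illustration rather than the operational code:

    def pairwise_chi_square(x, y, sigma_x, sigma_y):
        # Equation 4: chi2 = (1/n) * sum((xi - yi)^2 / (sx_i^2 + sy_i^2))
        n = len(x)
        return sum((xi - yi) ** 2 / (sxi ** 2 + syi ** 2)
                   for xi, yi, sxi, syi in zip(x, y, sigma_x, sigma_y)) / n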
IMPROVE will have collocated samplers at approximately 4% of the sites. Rather than
having a complete collocated sampler at six sites, there will be a single collocated
module at twenty-four sites. These will be equally divided between the four module
types.
In the database, it is necessary to calculate the uncertainty of each concentration. This
will depend on the precision for various components of the concentration calculation,
such as volume and analytical calibration. The uncertainty in a given concentration is a
combination of several factors:
• The precision in the volume (primarily from the precision in the flow rate). This
enters as a relative uncertainty (e.g. 3%) for every parameter.
• The calibration and normalization precision of the analytical instrument. This
enters as a relative uncertainty (e.g. 3%) for all methods except gravimetric
analysis.
• The statistical counting uncertainty for analytical methods that count events in a
spectral peak. This enters as a constant uncertainty (e.g. 5 µg) divided by the
volume for all elemental parameters.
• The standard deviation of the field blanks (ions) or secondary filters (carbon),
when an artifact is subtracted from the measurement. This includes any constant
uncertainty in the analysis plus the variability of the artifact. This enters as a
constant uncertainty (e.g. 5 µg) divided by the volume.
• The standard deviation of the gravimetric control filters. This enters as a
constant uncertainty (e.g. 5 µg) divided by the volume.
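One common way to combine such components is addition in quadrature; the sketch below assumes that form (the document's own equations are given in Section 5.8), and the relative and constant values shown are placeholders rather than adopted IMPROVE numbers.

    def concentration_uncertainty(conc_ng_m3, volume_m3,
                                  rel_flow=0.03, rel_cal=0.03,
                                  stat_mass_ng=0.0, blank_sd_mass_ng=0.0):
        # Relative terms scale with the concentration; constant (mass-like) terms
        # are divided by the sample volume before being combined in quadrature.
        rel_part = (rel_flow * conc_ng_m3) ** 2 + (rel_cal * conc_ng_m3) ** 2
        const_part = (stat_mass_ng / volume_m3) ** 2 + (blank_sd_mass_ng / volume_m3) ** 2
        return (rel_part + const_part) ** 0.5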
The relative precision in the flow rate can be estimated from flow audits and from an
examination of possible sources of variability.
The relative precision in analytical calibration and normalization can be estimated by
replicate analysis of the same filter. For analytical methods that have inherent
uncertainty in counting statistics, such as elemental analysis, caution must be exercised to
avoid commingling calibration precision and statistical uncertainty. The calibration
precision can only be estimated from variables in which the statistical uncertainty can be
neglected. For example, with x-ray analyses, the calibration precision can only be
estimated from a few elements that are present in high areal concentrations.
The method for determining analytical precision for each method is listed in Table 5.
Table 5. Methods for determining Analytical Precision
Method | Precision Method
Gravimetric | standard deviation of control filters
HIPS | replicate analyses conducted in previous session
XRF | replicate analyses conducted in previous session
PIXE | replicate analyses conducted in previous session
PESA | replicate analyses conducted in previous session
IC | standard deviation of QA standards, replicate analysis of eluent
TOR | analysis of additional punch in filter
The collocated samplers can check the validity of the various components of the precision.
For example, if the χ2 calculated in the above equation is consistently greater than one,
then the flow rate estimate of 4% is too low. The measurements can then be used to
adjust the parameters used in the uncertainty calculations.
In general, it is not possible to define the precision of the concentration of a given
variable as a single number. For most variables, the precision obtained by collocated
sampling at one site and season cannot be extended to a different site and season.
IMPROVE has tried to overcome this limitation by calculating the uncertainty for every
concentration and providing this uncertainty in the database. The parameters that are
involved are derived from collocated measurements, the actual possible mechanisms for
error propagation, and from other tests. It is recommended that the data users use the
calculated uncertainties rather than a fixed fractional or absolute uncertainty for the
parameter.
Internal Redundancy. It should be noted that speciated sampling with independent
modules and redundant measurements permits a modified form of precision comparison
through intercomparisons for each sample. Although this involves some assumptions not
required in direct collocated intercomparisons, the advantage is that it can be performed
on every sample rather than on some much smaller number, such as 4%. The
comparison of equivalent parameters using independent modules is performed for every
sample during the data validation procedure. The sulfur-sulfate comparison is especially
helpful because the concentrations are generally large enough so that the uncertainties
from counting statistics and artifact subtraction are negligible. In addition to being used
to monitor the calibration/normalization precisions of both IC and PIXE, the comparison
performs several additional functions.
• It provides a quality control check for Modules A and B for every sample.
• It monitors the flow rate calibration for Modules A and B.
• It provides a measure of accuracy for the measurement of sulfur and sulfate by
independent analytical methods.
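For example, if essentially all fine-particle sulfur is present as sulfate, the Module B sulfate should be close to three times the Module A sulfur (96/32 by weight). A simple per-sample check along these lines (the tolerance shown is hypothetical) can flag flow or calibration problems:

    def sulfur_sulfate_flag(s_module_a, so4_module_b, tolerance=0.10):
        # Ratio should be near 1 when [SO4] ~ 3 x [S]; a large departure flags the sample.
        ratio = so4_module_b / (3.0 * s_module_a)
        return abs(ratio - 1.0) > tolerance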
Two checks between Modules A and C are possible. Organic mass can be estimated
from elemental hydrogen measured on the Teflon A filter, with the assumption that the
hydrogen is all from either organic matter or ammonium sulfate. The estimate is too
small if the sulfate is acidic or marine and too large if there is significant nitrate relative to
sulfate. The first comparison is between organic mass by hydrogen and organic mass
by carbon. The second is between the coefficient of absorption measured on the Teflon
A filter with light-absorbing carbon measured on the quartz C filter. While less precise
than the sulfur-sulfate comparison, these do provide a useful quality control check
between Modules A and C for every sample.
Thus the internal comparisons allow a check of all three PM2.5 modules for every
sample. The emphasis here is on quality control, while collocated precision tests are
strictly quality assurance. However, the inter-module comparisons do provide very
robust data sets for quality assurance.
Precision MQO's and the overall DQO. There are two possible ways to characterize
the uncertainty of the mean of a given set of measurements: (1) calculate the standard
error of the concentrations, and (2) calculate the uncertainty by propagating the
individual measurement uncertainties. For IMPROVE, the statistic of concern is the
average value of the reconstructed extinction parameter for the 20% worst days or 20%
cleanest days. For sampling every third day, each group corresponds to about 24
samples. In practice, the standard error of the values in the group is about an order of
magnitude larger than the uncertainty that would be obtained by propagation of the 24
individual uncertainties. Consider the specific case of sulfur concentrations at one site
for the calendar year 1998. The 24 lowest concentrations in ng/m3 were: 44, 44, 67,
67, 119, 122, 143, 149, 151, 163, 200, 205, 213, 217, 221, 221, 222, 222, 224,
232, 234, 234, 242, 242. The corresponding uncertainties (between 5 and 7%) were:
3, 3, 5, 5, 7, 8, 9, 8, 9, 10, 11, 11, 12, 13, 12, 12, 12, 12, 12, 15, 13, 13, 13, 14.
The average is 175 ng/m3. The standard error is 14 ng/m3 (8%). However, the
propagated uncertainty is only about 2 ng/m3 (1%). For this same data set, the 24 highest
concentrations gave the same general result that the standard error (in this case, 14%)
was much larger than the propagated precision (in this case, also 1%). The appropriate
estimate of the uncertainty should be the standard error. Note that in both cases, the
standard error is associated with the variation in the ambient concentrations and is not
affected in any significant way by the reasonable measurement uncertainty of 5%. This
example shows that precision MQO's of about 5% are sufficient to not adversely
affect the overall DQO of calculating trends based on groups of data.
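The arithmetic of this example can be reproduced directly; the values below are the ones listed above.

    low = [44, 44, 67, 67, 119, 122, 143, 149, 151, 163, 200, 205,
           213, 217, 221, 221, 222, 222, 224, 232, 234, 234, 242, 242]
    unc = [3, 3, 5, 5, 7, 8, 9, 8, 9, 10, 11, 11,
           12, 13, 12, 12, 12, 12, 12, 15, 13, 13, 13, 14]

    n = len(low)
    mean = sum(low) / n                                    # ~175 ng/m3
    var = sum((v - mean) ** 2 for v in low) / (n - 1)
    standard_error = (var / n) ** 0.5                      # ~14 ng/m3 (8%)
    propagated = sum(u ** 2 for u in unc) ** 0.5 / n       # ~2 ng/m3 (1%)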
4.6.2
Accuracy Objectives
Accuracy is the agreement between a sample reading and the true value of the sample.
Accuracy is determined by comparing instruments to reference standards traceable to
the National Institute of Standards and Technology (NIST). The reference standard for
flow rate and each analytical method is listed in Table 6.
Table 6. Flow Rate and Analytical Accuracy
Method | Accuracy Reference
Flow Rate | NIST-traceable Dry-Cal Nexus DC-2 Flow-Calibrator
Gravimetric | NIST-traceable Class 1.1 (Class M) mass standards
HIPS | Reference Integrating Sphere system, which is itself calibrated using NIST-traceable absorption standards
XRF | 17 commercial NIST-traceable elemental standards
PIXE | 27 commercial NIST-traceable elemental standards
PESA | mylar standards of known area weighed on a balance calibrated by NIST-traceable standards
IC | NIST-traceable solutions of each ion
TOR | CH4 gas, CO2 gas, samples spiked with KHP and sucrose, none NIST-traceable
4.6.3
Minimum Quantifiable Limit Objectives
The Minimum Quantifiable Limits (MQL's) are defined differently for the elemental
analysis than for the other analytical methods.
For elemental analysis, the important quantity in the absence of analytical interferences is
the analytical minimum detectable limit (MDL). For the x-ray and PESA analysis the
counting spectrum has inherent background or noise. The ability to identify an element
depends on the real counts being greater than the uncertainty in the background noise.
The standard 3s definition is that the number of real counts must be greater than 3.29
times the standard deviation of the background in the spectrum. If the concentration is
below the MDL, the element cannot be identified. For the elements, there is no external
background or artifact to subtract. For most elements the MQL is set equal to the
MDL. For a few elements, there is interference from other elements. (The most
important cases are Pb interfering with As and Ti interfering with V.) For these
elements, an MQL is calculated based on both the background and the concentration of
the interfering element. In most cases, the uncertainty in concentrations at the MQL is
approximately 50%.
For all non-elemental analyses, the concentrations involve some sort of subtraction. For
gravimetric analysis, this is the pre-weight from the post-weight. For the others, the
subtraction is the artifact from the sample measurement. The MQL is equal to a factor
times the standard deviation of controls, field blanks, or secondary filters. Because the
MQL for elements corresponds to an uncertainty of approximately 50%, the factor
chosen for non-elemental parameters is 2. Thus, at the MQL, all parameters have an
uncertainty of approximately 50%.
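A sketch of the non-elemental MQL calculation described above (a factor of 2 times the standard deviation of the blanks, controls, or secondary filters, expressed as a concentration for a nominal sample volume; the names are illustrative):

    def mql_non_elemental(blank_masses_ug, nominal_volume_m3, factor=2.0):
        # MQL = factor x standard deviation of the blank masses, per unit volume.
        n = len(blank_masses_ug)
        mean = sum(blank_masses_ug) / n
        sd = (sum((b - mean) ** 2 for b in blank_masses_ug) / (n - 1)) ** 0.5
        return factor * sd / nominal_volume_m3      # ug/m3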
The IC and carbon analyses also have an analytical MDL below which the instrument
cannot report any value. In practice, this MDL is much smaller than the MQL
determined from blanks. In general, the parameters on all samples, including field
blanks, are above the analytical MDL. In all cases, any concentration above the MQL
has an analytical value above the analytical MDL. Therefore the analytical MDL's are
not relevant to the quality of the data and do not have any associated MQO's.
Note that the major components of reconstructed extinction are rarely below the MQL.
Thus MQL's play only a minor role in the primary objective. However, they are
important in the examination of trace elements.
4.6.3.1 IMPROVE Sampler
Sampler airflow precision is monitored in three ways.
(1) The history of flow rates for each transducer is examined at least once a month to
monitor for changes in performance.
(2) The comparison of equivalent measurements by different modules is performed
once a quarter to verify that the flow rates of various modules agree. The most
precise test is the comparison of sulfur from Module A with sulfate from Module
B. A less precise comparison is between organic matter by hydrogen from
Module A with organic by carbon by Module C. A rough comparison is between
PM2.5 mass from Module A and PM10 mass from Module D.
(3) Field flow calibrations are conducted at least every six months, and when there is
a potential problem with flow measurements. A field calibration is performed
annually by a CNL technician during annual maintenance. The Site Operator
performs a field calibration six months after the annual maintenance using a
standard calibration device mailed from UCD. The calibration device is an orifice
meter with a meter to measure the pressure drop across the orifice. The meter for
the standard computer-based device is a transducer similar to that in the Version
II sampler. For a system with manual readout, the meter is a magnehelic similar to
that used in the Version I sampler. All calibrations consist of comparison of the
system transducer readings with the flow rates of the calibration device at four
different flow rates covering the normal range encountered. If the correlation
between the calibration and system flow rates has an R2 value less than 0.99, then
the calibration is redone. If the nominal flow rate (flow rate with a normal clean
filter) is not within 5% of the desired nominal values, the site operator may be
asked to adjust the nominal flow rate and perform a new calibration.
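The acceptance criteria in item (3) amount to a least-squares comparison of the transducer readings against the calibration device, as in this sketch (illustrative only, not the CNL field software):

    def calibration_acceptable(device_flows, transducer_readings,
                               nominal_flow, desired_nominal):
        # Compute R^2 of the linear relation between device and transducer flows
        # at the audited set points, then apply the 0.99 and 5% criteria above.
        n = len(device_flows)
        mx = sum(device_flows) / n
        my = sum(transducer_readings) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(device_flows, transducer_readings))
        sxx = sum((x - mx) ** 2 for x in device_flows)
        syy = sum((y - my) ** 2 for y in transducer_readings)
        r_squared = sxy ** 2 / (sxx * syy)
        nominal_ok = abs(nominal_flow - desired_nominal) <= 0.05 * desired_nominal
        return r_squared >= 0.99 and nominal_ok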
If the calibration results are not within 5% of the previous calibration results, the past
data are reviewed to determine when the change occurred. This is determined by
examining the comparisons between similar concentrations measured by separate
modules (sulfur vs. sulfate, organic mass by hydrogen vs. by carbon). If no change date can
be identified, the samples collected between the two calibrations will be flagged as QD
(questionable data). The new calibration is applied to all samples collected after that
change.
Sampler airflow accuracy is maintained by referencing all field calibration devices to a
DryCal Nexus DC-2 Flow Calibrator that is certified NIST traceable. The results are
verified using a dry gas meter. The field manager maintains a set of calibration orifice
meters for field and mail calibrations. All calibration devices are calibrated at UCD
using the same reference flow calibrator. The calibration of each calibration device is
verified before and after each calibration.
4.6.3.2 Gravimetric
The CNL weight laboratory uses multiple balances. Using a system of cross checks,
the results of all balances are adjusted to be identical. At the beginning of every
morning and afternoon session, a Laboratory Technician calibrates all balances and
checks each balance using an NIST-traceable test weight. If the test weight differs from
the expected by more than 2 µg, the balance is recalibrated.
Gravimetric precision is monitored through a system of control filters. A blank filter is
weighed on each balance after the calibration. After the afternoon measurement, it is
loaded into a cassette and stored for 42 days, the typical turnaround time for ambient
filters. Another filter that was originally weighed 42 days earlier is reweighed on both
balances at the same time every morning and afternoon.
The control values are used in four ways. (1) They determine the relationship between
all balances. A regression equation is determined from the paired masses of the control
filters. The Laboratory Manager monitors the history of the regression equation. (2)
The mass artifact associated with being in the cassette for 42 days is monitored by the
mean of the mass gain after 42 days. (3) The precision of the balances is monitored
using the standard deviation of the difference between morning and afternoon. (4) The
precision of the mass differences for ambient samples is monitored using the standard
deviation of the difference after 42 days.
The typical monthly average and standard deviation of the morning vs. afternoon
comparison has been 0 ± 1 µg. The typical monthly average and standard deviation of
the 42-day controls was about 4 ± 4 µg with the Version I cassettes. With the Version
II cassettes, the 42-day comparison is the same as the morning-afternoon comparison:
0 ± 1 µg. Thus with the Version II cassettes, there is no artifact gain from contact with
the cassettes.
Balance accuracy is limited by digital voltmeter linearity, torque motor linearity,
precision, calibration weight accuracy, and range of agreement. Variations in the
electromagnetic field within the laboratory also affect balance accuracy and precision.
A Bell 620 Gaussmeter monitors magnetic flux in the weighing room. Accuracy is
evaluated twice daily as part of a standard calibration procedure. Calibration weights
are certified traceable to NIST standards.
The uncertainty of the measurements will be significantly increased if the mass is
recorded before the balance has reached stability. The standard methodology is to wait
for a preset period after the filter is placed on the pan before recording the mass. This
has two problems. First, for most samples stability is reached well before this time.
Efficiency with a large number of samples is important. Second, there are occasionally
samples, generally with elevated NaCl, that take much longer to reach stability. The
methodology for IMPROVE is to monitor the stability of the mass measurement by
computer, and record the mass only after stability is reached. In the CNL system, the
mass is measured once every 17 seconds by the computer. If the mass value differs by
more than 1 µg since the last check, the computer will wait another 17 seconds and
repeat the measurement. The computer will record the mass after the stability criterion
of less than 1 µg per 17 seconds is reached.
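The stability logic can be sketched as a simple polling loop; read_mass is a hypothetical callable standing in for the balance interface, not the actual CNL acquisition program.

    import time

    def record_stable_mass(read_mass, interval_s=17.0, criterion_ug=1.0):
        # Accept a reading only after successive readings, taken interval_s apart,
        # differ by less than the stability criterion.
        previous = read_mass()
        while True:
            time.sleep(interval_s)
            current = read_mass()
            if abs(current - previous) < criterion_ug:
                return current
            previous = current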
The goal of IMPROVE is to achieve a mass precision of 4 µg based on the standard
deviation of control filters after 42 days. Currently the mass precision is 1 µg. The
uncertainty calculations are based on a conservative precision of 5 µg.
4.6.3.3 Optical Absorption Analysis (HIPS)
The CNL HIPS system is calibrated using a set of 20 control filters. One control filter
is selected as the standard and the remaining 19 as QC checks. The 20 control filters
are loaded Teflon filters representative of the range of mass loadings and composition in
the IMPROVE network. The transmission and reflectance values of the control filters
are determined using the CNL Reference Integrating Sphere system. The accuracy of
the sphere system is maintained by using NIST traceable reflection standards.
Optical absorption accuracy is limited by detector system settings and laser and
detector stability. Settings are verified before each analysis and the system is allowed to
warm up for at least two hours prior to analysis. The HIPS is calibrated using the
designated standards filter and the accuracy is verified using the remaining 19 control
filters. If the results from the control filters are not within 5% of historical values, the
system is corrected and recalibrated.
Optical absorption precision is determined by reanalysis of samples. Immediately after
the calibration and the verification by the control filters, the Analytical Manager
performs replicate analysis of three trays (approximately seventy five samples) from the
previous session. Only after the reanalysis results are within 5% of the previous results,
is the system ready for normal analysis. The reanalysis trays are reanalyzed after every
1000 samples. The controls and reanalysis trays are reanalyzed at the end of the
session.
4.6.3.4 Elemental Analysis (XRF, PIXE, PESA)
The total uncertainty in the concentration of an element is the quadratic sum of the
analytical normalization precision (4%), the volume precision (3%) and the statistical
uncertainty. The normalization and volume terms are proportional to the concentration.
The statistical uncertainty is based primarily on the number of counts in the spectrum
and background and increases slightly with concentration. The equation is given in
Section 5.8.
The goal is to achieve a 4% relative analytical normalization precision for all three
analytical methods. This goal is currently being achieved.
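A sketch of the quadratic combination described above, using the 4% and 3% figures as stated and a per-element statistical term (the exact equation is given in Section 5.8):

    def elemental_uncertainty(conc, stat_unc, rel_norm=0.04, rel_vol=0.03):
        # Quadratic sum of normalization, volume, and statistical components.
        return ((rel_norm * conc) ** 2 + (rel_vol * conc) ** 2 + stat_unc ** 2) ** 0.5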
Replicate analysis of twenty-five to fifty samples from the previous session is performed
immediately after the completion of calibration. Only after the reanalysis concentrations
of the major elements are within 5% of the previous results, is the system ready for
normal analysis.
XRF accuracy is traceable to a set of 17 Micromatter™ thin film standards (PbGaAs,
CuS, CaF2, Ti, V, Cr, Fe, Ni, Cu, Zn, GaP, GaAs, Se, CsBr, RbI, SrF2, Pb). The
standards are NIST-traceable. Cross sections for elements not included are derived by
interpolation. The accuracy for individual elements is maximized by deriving cross
sections from all standards.
PIXE/PESA accuracy is traceable to a set of 27 Micromatter™ thin film standards (Pb
on Kapton, CuS on Kapton, NaCl, Mg, two Al, SiO, GaP, CuS, KCl, KI, CaF2, Ti,
V, Cr, Mn, Fe, Ni, Cu, ZnTe, GaAs, two Se, CdSe, CsBr, Au, Pb). The standards
are NIST-traceable. Cross sections for elements not included are derived by
interpolation. The accuracy for individual elements is maximized by deriving cross
sections from all standards.
PESA accuracy is further verified against six 1/8 mil mylar squares mounted in analysis
slides for PESA calibration. The areal densities are determined by weighing on an
NIST-traceable balance.
4.6.3.5 Ion Chromatography
The concentration of all variables is determined from the measured mass, the median
artifact from designated field blanks, and the volume. The uncertainty includes volume
uncertainty, analytical normalization uncertainty, and the standard deviation of the field
blanks. The equations are given in Section 5.8.
The analytical normalization uncertainty of the IC measurement is a combination of the
precision of the calibration, the precision in the desorption process, the precision in
collecting an aliquot, and the precision in the measurement of the eluent. The last
component, the precision of the measurement of the eluent is determined by reanalysis
of the eluent after every 20 analyses. The precision is much less than 1% and is
therefore negligible. The best estimate of the total analytical precision is the standard
deviation of the difference between the measured values of the QA standards and the
predicted values. The goal is to achieve a 4% relative analytical precision for all species
measured by ion chromatography. This goal is currently being achieved.
4.6.3.6 Thermal Optical Reflection
The concentration of all variables is determined from the measured mass, the median
artifact from secondary filters, and the volume. The uncertainty includes volume
uncertainty, analytical normalization uncertainty, and the standard deviation of the
secondary filters. The equations are given in Section 5.8.
The relative analytical precision of TOR is determined by the analysis of a second punch
in the sample. This precision depends on the normalization of the analysis, and on the
uncertainty in the temperature and in the reflectance. For total carbon, there is no
temperature/reflectance separation and the precision depends solely on the
normalization of the analysis. The separation into organic and elemental carbon also
involves the uncertainty in the reflectance measurement. Because the concentrations of
elemental carbon are much less than for organic carbon, the effect of the uncertainty in
the reflectance measurement on elemental carbon is larger than on organic carbon.
Most of the uncertainty for organic carbon is from the normalization. Most of the
uncertainty for elemental carbon is from the reflectance measurement. The goal is to
achieve a 5% relative analytical precision for total carbon and for total organic carbon,
and 10% for total elemental carbon. Relative analytical precisions for total and organic
carbon of 4% and for elemental carbon of 7-10% are currently being achieved. The
relative precisions of the individual temperature species range from about 13% for O2,
O3, and O4, to 40% for O1 and E3. No goal for the precision of the temperature
species has been established.
From Carbon Contractor: DRI SOP #2-204.6 Revised June, 2000
The precision of this analysis has been reported to range from 2 to 4%.2 For
analysis of actual ambient and source filters, homogeneity of the deposit is
most important for reproducible results. For homogeneous deposits
containing >10 µg/filter of total carbon, precision is generally 5% or less; for
inhomogeneous deposits replicates may deviate by as much as 30%. The
precision of carbonate analysis results is approximately 10%.
The precision of the laser-dependent split between organic and elemental
carbon fractions depends upon how rapidly the laser is increasing at the time
of the split and whether the split falls in the middle of a large carbon peak or
not. Typically, relative laser split times are reproducible within 10 seconds
and deviations in calculated splits are less than 5% of the total measured
carbon.
The accuracy of the thermal/optical reflectance method for total carbon
determined by analyzing a known amount of carbon is between 2 to 6%.3
Accuracy of the organic/elemental carbon split is between 5 and 10%.
… The calibration gases are traceable to NIST standards. The gases are
assayed for exact concentrations by the gas supplier to two decimal places.
The assay value is obtained from the tag on the cylinders.
4.6.4
Recovery Rate and Completeness
Recovery rate and completeness are both defined as the ratio of the number of valid
samples divided by the number of possible samples. Recovery rate considers only the
PM2.5 Teflon filter while completeness considers all four filters. Since the PM2.5 Teflon
filter contributes the fine mass and most of the fine species, the recovery rate is
2
Johnson, R.L., J.J. Shah, R.A. Cary, and J.J. Huntzicker (1981). "An Automated
Thermal-Optical Method for the Analysis of Carbonaceous Aerosol." In Atmospheric
Aerosol, Source/Air Quality Relationships, E.S. Macias and P.K., Hopke, eds. ACS
Symposium Series, No. 169. American Chemical Society, Plenum Press, New York, NY.
3
Rau, J.A. (1986). “Residential Wood Combustion Aerosol Characterization as a
Function of Size and Source Apportionment Using Chemical Mass Balance Modeling.”
Thesis, Oregon Graduate Center.
appropriate if these variables are being considered. However, for a full characterization
of reconstructed extinction, it is necessary to have valid measurements from all four
filters. This is represented by completeness.
Recovery Rate. Prior to implementation of regional haze regulations, a sample was
considered valid if the elemental concentrations from the PM2.5 Teflon filter (Module A)
were valid, independent of the validity of the remaining filters. Recovery rate is defined
as the ratio of samples with valid elemental concentrations divided by the number of
possible samples. A recovered sample has at least sulfur, soil elements, trace elements,
the coefficient of absorption, and a secondary estimate of organic matter. Because the
Module A Teflon filter has a central role in the data validation procedures, if the Module
A Teflon filter is invalid, then the entire sampling period is considered invalid, and data
from the other filters is not included in the database. Thus the recovery rate indicates
the upper limit for the number of samples in the database.
Completeness. For regional haze regulations, the parameter of interest is the
reconstructed extinction, which is calculated from the following PM2.5 concentrations:
sulfate and soil from Module A, organic and elemental carbon from Module C, and
nitrate from Module B. (Sulfate could also be calculated from Module B.)
Reconstructed extinction also has a component calculated from the coarse mass, which
is the difference between PM10 mass (from Module D) and PM2.5 mass (from Module
A). The most conservative definition for completeness is to require that all
parameters above have valid measurements for the sample to be valid.
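Both statistics reduce to simple counts over the validity flags of the four filters, as in this sketch (the per-sample flag structure is hypothetical):

    def recovery_and_completeness(samples):
        # samples: list of dicts of module validity flags,
        # e.g. {"A": True, "B": True, "C": False, "D": True}
        possible = len(samples)
        recovered = sum(1 for s in samples if s["A"])
        complete = sum(1 for s in samples if all(s[m] for m in "ABCD"))
        return recovered / possible, complete / possible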
Figure 5 and Table 7 indicate the historical annual recovery rates and completeness for
the entire network. The recovery rate and completeness for year 2001 were 91% and
89% respectively.
Objectives. The objective for recovery rate for the total IMPROVE network is 90% of
all possible samples. The objective for completeness is 82%.
[Figure 5 is a line plot of Recovery Rate and Completeness (70% to 100%) versus seasonal year, 1988 through 1998.]
Figure 5. Recovery and completeness of IMPROVE data, 1988-1998.
Table 7. IMPROVE Network Data Recovery and Completeness, 1988-1998.
The years start in March.
year | Recovery Rate | Completeness | Comments
1988 | 88% | 74% | carbon missing in database
1989 | 90% | 77% | carbon missing in database
1990 | 91% | 72% | Teflon filter problems
1991 | 94% | 80% | Teflon filter problems
1992 | 93% | 79% |
1993 | 95% | 87% |
1994 | 92% | 85% |
1995 | 91% | 86% | Federal Furlough
1996 | 92% | 85% |
1997 | 94% | 87% |
1998 | 93% | 78% | nylon filter problems
Comment 1. In 1993, a decision was made to report carbon as 8 fractions rather than
4. Approximately 5% of the samples were lost in the main database in 1988 and 1989.
The concentrations are present in a separate database.
Comment 2. There was an organic artifact on approximately 7% of the Teflon filters for
a batch used between October 1990 and November 1991. For samples where organic
mass estimated from elemental hydrogen significantly exceeded the mass from organic
carbon, the PM2.5 mass was invalidated. This reduced the completeness but not the
recovery rate.
Comment 3. The Federal Furlough in December 1995 and January 1996 accounted
for a 2% decrease in the recovery rate and completeness for 1995.
Comment 4. Approximately 10% of the completeness losses in 1998 were associated
with a manufacturing flaw in the nylon filters, which drastically increased the pressure
drop across the filters. This reduced the completeness but not the recovery rate.
4.6.5
Representativeness and Comparability
Representativeness is based on the relationship between monitoring objectives and the
geographical location of monitoring stations. The locations of the sites have been
determined according to IMPROVE site selection criteria. These criteria ensure that
sites avoid non-representative meteorology, avoid local sources of pollution, avoid
obstructions, and represent Class I Wilderness Areas within the same location and
elevation.
Comparability is defined as the measure of confidence with which one data set can be
compared to another. An important factor is that all operations are conducted by one
contractor (CNL), and all analyses for a given filter are conducted by a single
contractor.
4.7
Special Training and Certification
4.7.1
Purpose / Background
This section describes the specialized training requirements necessary to complete the
project and summarizes the procedures that ensure specific training skills can be
verified, documented, and updated as necessary.
4.7.2
Training
Site Operators are trained in equipment operations, sample collection, and log
recording. A training session is conducted during new site installation and repeated
during the annual maintenance. This training session consists of all the steps needed to
change filters, plus how to change the date and time. The operators are given a general
summary on how to calibrate the sampler. (The operators are given a detailed step-by-step manual whenever they are asked to calibrate the sampler.) The Field Manager
keeps current records of trained Site Operators.
The Laboratory Manager trains Laboratory Technicians in sample handling and
gravimetric analysis at the time of employment. This is done in a series of sessions, each
one on a specific station in the sample handling procedure. Maintaining a record of
training is unnecessary as the laboratory manager closely oversees all operations.
University regulations require that CNL staff be certified in radiation safety by
Environmental Health and Safety. Records are maintained by Environmental Health and
Safety. This has no impact on the quality of the IMPROVE data.
4.7.3
Certification
There are no specific certification requirements for the IMPROVE project.
4.8
Documents and Records
The following sections describe the structure of the QAPP, the contents of the data
report packages, and the process of producing final datasets.
4.8.1
Structure of the Quality Assurance Project Plan
This document establishes a process to ensure early detection of problems and timely
feedback when corrective actions are necessary. It also provides a blueprint of all
operations and coordination of the entire project. Since it is a working document, it will
be revised frequently to incorporate changes to the program. The Documentation
Specialist will update this document to reflect current procedures. Document revisions
will be approved by the Quality Assurance Manager and distributed to members of the
QAPP distribution list.
4.8.2 Contents of the Data Report Packages
All original data is archived at CNL. Monthly, quarterly, and annual reports are
compiled for internal quality assurance and EPA review. Quarterly data reports are
available for public distribution.
Monthly Progress and Financial Report
The Monthly Progress and Financial Report contains the following items:
• Actual and estimated cumulative expenditures
• List of sites in operation
• Monthly gravimetric precision
• HIPS/PESA/PIXE/XRF precision
Quarterly Data Report
CNL prepares quarterly data summaries after the data for the season has passed Level
I and Level II validation. This report will not be updated even if the data are changed
after the submission of the dataset. The report is made available on the CNL
IMPROVE web site. The Quarterly Data Report contains the following items:
• Major element and trace element 24-hour concentrations in ng/m3
• Fine mass and its major components 24-hour concentrations in µg/m3
• Fine mass and its major components 24-hour concentrations in µg/m3 and percent of fine mass
• Cursory statistical distribution of concentrations in ng/m3
• Concentration maps for the total network
Annual Report
The Annual Report contains the following items:
• Yearly summaries of data collected at each site
• Seasonal maps of selected variables
• Data interpretation
• A discussion of progress toward defined goals
• Suggestions for modification of goals and objectives
• Recommendations for new goals and objectives
The Annual Quality Assurance Report to the COTR contains the following items:
• Sample recovery rates
• Flow calibrations (mail and during site visits)
• Third-party audits
• Analytical precisions for each of the analytical methods
• Level II validation summaries
• Artifact (blanks) measured and used
Additionally, an Annual Site Configuration Document details operational sites during the
year.
4.8.3 Process of Producing Final Datasets
IMPROVE data pass two levels of validation before becoming a part of the final
dataset.
Level I Validation
Level I validation involves a review of field operations, instrumentation performance,
and questionable data coded for evaluation.
• The quality assurance manager reviews the field operations database for problems that occurred during the season.
• The data processing programmer presents the quality assurance manager a report on flow rates for the season. This includes a comparison of flow rates measured by the two system methods. The report discusses any cases with questionable validity.
• The data processing manager presents the quality assurance manager a report on cases with questionable validity based on collection parameters.
• The quality assurance manager reviews the QA documentation accompanying the external contractor data.
• The data processing manager presents the quality assurance manager a report on field blanks, secondary filters, chosen artifact values and analytical precision.
• The network operations manager presents a report on the calibration and controls for gravimetric and HIPS analyses for the season.
• The elemental analysis manager presents a report on the calibration and reanalysis samples for the season and on the comparison of PIXE and XRF.
Level II Validation
Level II validation verifies sample concentrations by comparing them with expected values. The data processing manager or programmer reports the individual concentrations and means for a standard set of parameters for each site. Any cases in which fine mass exceeds the PM10 are identified. This information is used for decisions based on inter-module comparisons.
The data processing manager or programmer creates approximately 350 correlation plots for each season. A comparison is made for two groups of sites: all sites and selected western sites. These are used to assess the overall agreement of the compared variables. A correlation plot is also made for each site, in order to identify individual outliers. Outliers are examined for flow rate problems, transcribing errors, and analytical errors. The compared variables are:
• Sulfur by PIXE on A (Teflon) (times 3) vs. sulfate by IC on B (Nylon)
• Organic mass by hydrogen on A (Teflon) vs. organic mass by carbon on C (Quartz)
• Optical absorption on A (Teflon) vs. light-absorbing carbon on C (Quartz)
• Fine mass vs. reconstructed mass on A (Teflon)
• Fine mass on A (Teflon) vs. PM10 mass on D (Teflon)
• Fine mass vs. hydrogen on A (Teflon)
The data processing manager investigates any discrepancies between Modules A and B
using the sulfur-sulfate plots; verifies that the two analytical methods agree; and identifies
any outlying pairs to either correct the data or invalidate the sample.
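As a minimal sketch of this kind of inter-module screening (not the project's actual validation code), the following assumes hypothetical lists of co-sampled concentrations in ng/m3; the factor of 3 converts elemental sulfur to sulfate, and the 25% tolerance is purely illustrative:

    # Sketch: flag sulfur/sulfate pairs whose ratio departs from the group median.
    # The input lists and the tolerance are hypothetical illustrations.
    from statistics import median

    def flag_sulfur_sulfate_outliers(sulfur_ngm3, sulfate_ngm3, tolerance=0.25):
        """Return indices of samples whose 3*S vs. sulfate ratio is anomalous."""
        ratios = []
        for s, so4 in zip(sulfur_ngm3, sulfate_ngm3):
            if s is None or so4 is None or so4 <= 0:
                ratios.append(None)           # missing or invalid pair
            else:
                ratios.append(3.0 * s / so4)  # 3*S should track sulfate
        valid = [r for r in ratios if r is not None]
        typical = median(valid)
        return [i for i, r in enumerate(ratios)
                if r is not None and abs(r - typical) > tolerance * typical]

    # Example: the third pair departs from the typical ratio and would be examined
    # for flow rate problems, transcribing errors, or analytical errors.
    print(flag_sulfur_sulfate_outliers([300, 420, 800], [940, 1290, 1200]))  # [2]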
The data processing manager investigates any discrepancies between Modules A and C
using the organic and light-absorbing carbon plots; verifies that the two analytical
methods agree; and identifies any outlying pairs to either correct the data or invalidate
the sample.
The data processing manager investigates any discrepancies for the fine mass using the
reconstructed mass plots. The quality assurance manager identifies any outlying pairs to
either correct the data or invalidate the sample.
The data processing manager investigates any discrepancies for the PM10 mass using the fine vs. PM10 mass plots; identifies any samples in which the fine mass exceeds the PM10;
and either corrects the data or invalidates the sample. The quality assurance manager
identifies any sample with unusually large PM10 values and flags these samples for later
analysis by PIXE.
When an anomalous point is encountered above, time plots of both variables on the
same plot are prepared to determine which point disagrees with nearby values.
Whenever a value is changed in the database, a note is attached to that location
indicating the change and the reason. When a sample is invalidated, the concentrations
are flagged as invalid, but the original parameters are retained in the database.
After correcting individual concentrations, the data processing manager prepares maps
of the mean concentrations for ten key variables: PM2.5 mass, PM10 mass, sulfur,
nitrate, organic carbon, light-absorbing carbon, soil, the coefficient of absorption, zinc,
and lead. These are examined for anomalous means for a single site in comparison to
neighboring sites. These maps are compared to those for prior seasons, and any discrepancies are investigated.
Any changes to the data after the dataset is released are documented in a file on the
FTP site maintained at Fort Collins.
The data processing manager prepares a summary of the Level I and Level II validation results and presents it to the quality assurance manager. If acceptable, the data will be considered validated. The data are sent to the contract representative, who is responsible for updating the database on the IMPROVE web site. The data summaries are posted on the CNL IMPROVE web site.
The following information is permanently retained:
Level I validation of sampler flow rate calibration
• Calibration log sheets for calibration devices
• Calibration log sheets for samplers
• Mail calibration log sheets
• Flow calibrations database (with historic values)
• Annual maintenance log books
Level I validation of collection
• Field log sheets
• Database information from field log sheets
• Field operations database
• Flow rate time plots for each module
• Tray instruction files
Level I validation of sample handling
• All chain of custody log sheets
• Twice-a-day relative humidity measurements
Level I validation of gravimetric analysis
• Log book of twice-daily calibrations and quarterly master calibrations
• Computer database of gravimetric control blanks
• Hard copy and computer copy of monthly average control blanks
Level I validation of absorption analysis
• Log book of maintenance
• Log book of HIPS control filters
Level I validation of elemental analysis
• Elemental analysis manager's journal
• Calibration log sheets (XRF and PIXE/PESA)
• Reanalysis plots (XRF and PIXE/PESA)
• Computer logs of analysis (XRF and PIXE/PESA)
• Handwritten operator logs of analysis (XRF and PIXE/PESA)
• Cyclotron operator logs (PIXE/PESA)
• XRF vs. PIXE comparison plots
• Computer journal file with analytical parameters (XRF and PIXE/PESA)
Level I validation of external contractor analysis
• External contractors' quality assurance documents
• Hard copy of data report with QA results
• Original electronic file
• Computer files for dynamic field blanks and secondary filters
• Computer and hard copy of seasonal artifacts, precision, and analytical precision
• Time plots of seasonal artifacts and precision
Level II validation of collection
• Hard copy of selected data (one page per site) with revisions in ink
• Comparison plots (by site and overall) with revisions in ink
• Time plots for each site
• Maps for season
• Overall comparison plots with final data (after revisions)
• Computer command file with changes
5. DATA GENERATION AND ACQUISITION
5.1 Sampling Process Design (Experimental Design)
The IMPROVE network experimental design is devised to provide data for Regional
Haze Regulations and to show baseline, temporal, and spatial trends in ambient air
quality in Class I Wilderness Areas throughout the United States. To accomplish this
we focus on obtaining a complete chemical signature of the composition of the airborne particles.
Twenty-four hour samples are collected every third day from midnight to midnight.
Sampling is continuous throughout the year.
5.1.1 IMPROVE Sampler Configuration
The standard IMPROVE sampler has four sampling modules, as shown in Figure 6. It
is designed to obtain a complete signature of the composition of the airborne particles
that affect visibility. Modules A (Teflon), B (Nylon), and C (Quartz) collect fine particles (0-2.5 µm). Module D (Teflon) collects PM10 particles (0-10 µm). Module A (Teflon) provides most of the fine particle data, with PM2.5 mass, elements H and Na-Pb, and the coefficient of optical absorption.
nylon filter to remove acidic gases, is used primarily for nitrate. Module C (Quartz),
with tandem quartz filters, measures carbon in eight temperature fractions. The
properties of the four modules are summarized in Table 8. The inlets are normally 0.6m
apart. Selected sites have an additional PM2.5 module for quality assurance.
Figure 6. IMPROVE Sampler Modules: A (Teflon), B (Nylon), C (Quartz), and D (Teflon). Modules A, B, and C collect PM2.5 (A: mass, elements, and absorption; B: sulfate and nitrate ions, with a carbonate denuder; C: organic and elemental carbon) and Module D collects PM10 mass; each module has an electrical connection to the controller.
Table 8. IMPROVE Sampler Modules and Filter Media: Module (Filter), Analysis Performed, and Parameters Measured.

Module (Filter) | Analysis Performed | Parameters Measured
A (Teflon) | Gravimetric, HIPS | PM2.5 mass, coefficient of absorption (babs)
A (Teflon) | PESA | H
A (Teflon) | PIXE, XRF | Na, Mg, Al, Si, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Ga, As, Se, Br, Rb, Sr, Zr, Pb
B (Nylon) | Ion Chromatography | Nitrate, Nitrite, Sulfate, Chloride
C (Quartz) | TOR combustion | Carbon in 8 temp. fractions
D (Teflon) | Gravimetric | PM10 mass

Hybrid Integrating Plate/Sphere (HIPS); Particle Induced X-ray Emission (PIXE); Proton Elastic Scattering Analysis (PESA); X-ray Fluorescence (XRF); Ion Chromatography (IC); Thermal Optical Reflectance (TOR)
5.1.2 IMPROVE Sampler Site Selection
The siting criteria are given in SOP 126. A document was prepared for the 1999-2000
expansion and provided to persons responsible for site selection. Site selection was
done by the local FLM, the state and/or local air quality agency, and the national or
regional FLM. This document is summarized below.
5.1.2.1 Site Selection Criteria
Site selection is based on the following criteria. Significant variances from any criterion
are documented and approved by the IMPROVE steering committee before the site is
installed.
1. The site represents all the Class I Wilderness Areas in the cluster. This requirement
applies only to 110 sites used to monitor regional haze at the Class I Wilderness
Areas.
a. The distance between the site and the closest portion of all Class I Areas must not be greater than 100 km.
b. The elevation of the site lies between the highest and lowest elevations of all Class I Areas in the cluster. Exceedances of 100 feet or 10% are considered to meet the criterion. Exceptions were made for exceedances if agreed to by the states and FLMs and accepted by the IMPROVE steering committee.
2. The site must avoid small valleys with non-representative meteorology. Valleys with
towns or other emission sources are definitely to be avoided. Valleys without
emission sources, but with significant inversions, should also be avoided.
3. The site must avoid all local sources of pollution.
a. Automotive Sources: vehicle usage and distance between road and sampler.
   <10,000 vehicles per day: >25 m between road and sampler.
   10,000-20,000 vehicles per day: >50 m between road and sampler.
   20,000-40,000 vehicles per day: >75 m between road and sampler.
   >40,000 vehicles per day: >100 m between road and sampler.
b. Combustion Sources
Avoid any areas influenced by diesel generator emissions, wood smoke, or
incinerators.
c. Dust Sources
At least 400m from a large potential source of dust, such as a landfill,
agricultural operations, or an unpaved road with more than 400 cars per day.
4. The site should avoid large obstructions, such as trees or buildings. In the standard
setup, the inlet will be approximately 3.5m (11 feet) above the bottom of the shelter.
The sampler could be placed on a platform to clear obstructions, as well as to be
above any snow pack. Raising the height of the inlet by increasing the length of the
stack beyond the standard 2m is not recommended, although theoretical
calculations and tests show no significant loss of particles on the wall of a stack up
to 5m. (For a 1% loss of particles larger than 0.3 µm, the stack length would have
to be over 250m.)
a. Restrictions in airflow are defined in items b and c below. If possible, there
should be unrestricted airflow in all directions. If all directions are not possible,
the minimum arc of unrestricted airflow is 270°, with the predominant wind
direction in the unrestricted 270°.
b. Within 10m of the sampler, any solid barriers or trees should be at least 1m
below the inlet, as shown on the left side of Figure 7. In general, a pole or
meteorological tower will not be a solid barrier. We will set as a guideline that a
solid barrier is any object that subtends more than 10°.
c. Beyond 10m of the sampler, the solid barriers or trees should not be higher than 30° above the horizontal with respect to the inlet, as shown on the right side of Figure 7 (a simple check of criteria b and c is sketched after Figure 7).
5. The site must have electrical power (120 Volt, 60 Hertz, 20 Amperes).
6. The site must be accessible for a weekly sample change in all but the most severe
weather conditions.
Figure 7. Sampler Location with Respect to Trees and Solid Barriers (within 10 m: barriers at least 1 m below the inlet; beyond 10 m: barriers no more than 30° above the inlet).
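The geometric criteria in items 4.b and 4.c above can be checked directly from a barrier's horizontal distance and its height relative to the inlet. The following is an illustrative sketch only (the function and its inputs are hypothetical); the 1 m and 30° thresholds are taken from the criteria above:

    # Sketch: evaluate siting criteria 4.b and 4.c for a single solid barrier.
    # distance_m and height_above_inlet_m are hypothetical example inputs.
    import math

    def barrier_acceptable(distance_m, height_above_inlet_m):
        """True if a solid barrier or tree meets the obstruction criteria."""
        if distance_m <= 10.0:
            # Within 10 m, the barrier must be at least 1 m below the inlet.
            return height_above_inlet_m <= -1.0
        # Beyond 10 m, the barrier must be less than 30 degrees above the inlet.
        angle_deg = math.degrees(math.atan2(height_above_inlet_m, distance_m))
        return angle_deg < 30.0

    print(barrier_acceptable(8.0, -1.5))   # True: 1.5 m below the inlet within 10 m
    print(barrier_acceptable(20.0, 15.0))  # False: about 37 degrees above the inlet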
5.1.2.2 Site Selection Process
Once potential sites have been found to meet the above siting criteria, the local FLM, or other persons leading the initial search, send photos, sketches, and siting
information for each potential site to CNL. CNL distributes summaries to all parties
involved in the selection.
FLMs, air quality agencies, and CNL make a joint decision on where to locate the sampling site. If significant disagreements exist between the concerned parties, CNL prepares a summary for the IMPROVE steering committee that discusses each siting alternative and the tradeoffs between them. The IMPROVE steering committee will
work with the parties to reach a decision.
5.1.2.3 Site Authorization
The local FLM completes the necessary paperwork required to use the site,
install power, and build structures. This includes obtaining any needed permission to
use the property, preparing and submitting any Environmental Impact Reports, and
obtaining any needed authorization to install and use electrical power.
5.1.2.4 Site Preparation
Once the specific location of an individual site has been agreed upon, the site must be
prepared for installation of the sampling equipment. This primarily involves providing a
structure and adequate electrical power. Local FLM managers prepare the site by:
• Supervising the installation of the shelter, or another agreed upon alternative.
• Supervising the installation of the required electrical power (120 Volt, 60 Hertz, 20 amps) at the site.
• Notifying CNL field manager of approximate date when the site will be ready for sampler installation.
• Filling out and returning the site information summary sheet.
• Receiving and recording shipment of samplers.
• Arranging for transportation of equipment to the site before CNL personnel arrive.
5.2 Sampling Methods
5.2.1 IMPROVE Sampler
Particulate samples are collected with an IMPROVE particulate sampler operating with
an airflow rate of 22.8 L/min at standard temperature and local pressure for module
types A, B, and C. Type D module airflow rates are 16.9 or 19.1 L/min (depending
upon inlet stack diameter).
The principle of operation is based on the physical behavior of particles in a vortex
acted upon by centrifugal force. (See Section 8.1: IMPROVE Sampler Technical
Properties.)
In September of 1999, IMPROVE sites began conversion to the version II sampler.
The new version of the IMPROVE sampler measures the flow rate and volume more
accurately, is more flexible, and is easier to operate than the original sampler, while
retaining the same collection characteristics.
The IMPROVE controller is shown in Figure 8. It includes a microprocessor and a
keypad with display. The flow rate and temperature readings are taken every minute, and a 15-minute average is recorded on a removable memory card. The
memory card is shipped between the site and the central laboratory with each shipping
container.
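As an illustration of how these flow records translate into a sampled air volume, the sketch below sums hypothetical 15-minute average flow rates (L/min) into cubic meters; at the nominal 22.8 L/min, a full 24-hour sample corresponds to roughly 32.8 m3:

    # Sketch: total sampled volume from 15-minute average flow-rate records.
    # flows_lpm is a hypothetical list of 96 fifteen-minute averages in L/min.
    def sampled_volume_m3(flows_lpm, interval_min=15):
        """Sum flow averages (L/min) over fixed intervals and convert liters to m3."""
        return sum(f * interval_min for f in flows_lpm) / 1000.0

    flows_lpm = [22.8] * 96                          # a full 24-hour day at nominal flow
    print(round(sampled_volume_m3(flows_lpm), 1))    # about 32.8 m3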
Figure 8. IMPROVE Sampler Control Module (display screen, keypad with cord and storage pocket, and removable memory card).
PM2.5 Sampler (Module A, B, and C)
The PM2.5 module is shown in Figure 9. Each PM2.5 module contains a cyclone (to
separate out particles larger than 2.5 µm), 4 solenoids, a critical orifice flow controller, 2
flow gauges, an inlet stack, and associated electronics. The Nylon module (B) contains
a denuder to remove nitric acid vapor. Each module measures 16” x 12” x 7” and
weighs 40 pounds. The air stream at the filters goes vertically up. Four vacuum pumps
provide airflow through the filters. All the filters are pre-loaded into cassettes and the
cassettes into cartridges at the CNL sample laboratory. Each module has a separate
color-coded cartridge.
Figure 9. IMPROVE PM2.5 Sampler Module (inlet stack with compression sleeve, annular denuder in Module B only to remove nitric acid, inlet tee, cyclone, cartridge with four filter cassettes, cassette manifold, solenoid manifold with four solenoid valves raised by a motor drive and hand wheel, hose to the critical orifice and pump, electronics enclosure, and connectors for the pump hose and controller line).
PM10 Sampler (Module D)
The PM10 module is the same as a PM2.5 module, except that the inlet and cyclone are
replaced by a commercial PM10 inlet and the air stream at the PM10 filters goes
vertically down.
5.2.2 Sampling Media
The filter media are summarized in Table 9.
Table 9. IMPROVE Module Sample Media.

Module (Filter) | Media | Diameter | Supplier
A or D (Teflon) | Teflo | 25 mm | Pall Gelman
B (Nylon) | MAGNA Nylon | 25 mm or 37 mm | Osmonics Inc.
C (Quartz) | QAT-UP quartz-fiber | 25 mm | Pall Gelman
Teflon filters
Teflo Filters (for Gravimetric, HIPS, PIXE, PESA and XRF analysis) are supplied in
packs of fifty. The filters are 25mm stretched Teflon mesh filters with pore size 3.0µm,
and a rigid olefin ring support. Filters are purchased in a single lot for an entire year.
Nylon filters
MAGNA Nylon filters (for Ion Chromatography analysis) are supplied in sheets of 33
cm by 18.2 ft. The filters are 25mm or 37mm with a 1.0µm pore size. Nylon filter
sheets are purchased in a single lot for an entire year. Filter media is contained in plastic
bags and stored under refrigeration.
Quartz filters
QAT-UP quartz-fiber filters (for TOR combustion analysis) are supplied in boxes of
approximately one hundred. The quantity in each box varies because filters with holes
or manufacturing defects are discarded. Filter media is supplied by Pall Gelman and
pre-fired at Desert Research Institute (DRI). DRI purchases 25 mm Pallflex 2500 QAT-UP quartz filters and pre-fires them at 900°C for at least four hours in accordance with in-house procedures. The contractor procedures include testing a
portion of the pre-fired filters for contamination. DRI ships the carbon filters to CNL in
a cooled container. Filter media is contained in plastic and stored in a freezer.
Collection Mask
The collection mask fits underneath the fine Teflon filter in the cassette and reduces the
area of collection. The purpose is to improve the sensitivity of the XRF, PIXE and
PESA analyses by concentrating the particles. The decrease in the area of collection is
limited by filter clogging: in regions of high concentrations the flow rate can be reduced
below the acceptable range. Therefore the mask is used only at western sites outside of
Southern California.
The collection masks are prepared from the inert paper spacers that Nuclepore
Corporation provides between Nuclepore 47 mm 8 µm polycarbonate filters. The
spacers have a very light coating of Apiezon-L grease. Many years of experience have
shown that this paper will not transfer mass to the filters. The spacers are retained in
their original factory containers and labeled and sealed until required as stock for
preparing the collection masks.
The masks are prepared using specially machined double-action cutter punches. The
cutter punches simultaneously cut the 25 mm outer diameter and the desired inner
collection diameter. Both punch and cutter are concentric, centering the collection area
in the 25 mm mask.
5.2.3 Sampling Operations
The instructions for changing samples are detailed in the Version II IMPROVE Sampler
Operating Procedures Manual. The instructions include a troubleshooting guide to
diagnose and fix common sampler problems.
CNL is responsible for supervising the sample changes by the Site Operator. The
controller reads the sensors and displays the values for the operator to record on the log
sheet, shown in Figure 10. The operator removes the cartridges with exposed filters
and installs the new cartridges. Every third week the operator removes one cassette
from the old cartridge and places it in the new cartridge. The controller then displays
the new flow values for recording.
The sample changing procedure is outlined in the sampler manual titled: Version II
IMPROVE Sampler Operating Procedures Manual for use in the IMPROVE
Monitoring Network. The general duties for the Site Operator are:
• To receive a shipping box containing resealable, labeled bags, each containing filters for one week and the corresponding log sheet. Each bag contains a color-coded cartridge for each module; each box contains a data storage memory card.
• To verify the correct dates on the boxes and bags, and to initiate the sample changing sequence via the keypad of the control unit.
• To record Final Readings for each of the filters, and to log additional information including date, time, temperature reading, operator's initials, and comments on any anomalous events (e.g., pump noises, extreme sampler pressure values, equipment malfunction, missed sample changes, or sample interference such as forest fires).
• To remove the old cartridges and replace them with the new cartridges.
• To record Initial Readings of the newly installed filters.
If there are any questions or difficulties with regular sample maintenance, Site Operators
call the CNL Laboratory Manager. Operators also call if there are any equipment
malfunctions, improper labeling of cassettes, or if readings are outside of site-specific,
acceptable ranges.
The telephone number for the CNL sample-handling laboratory is provided both on the
sampler manual and on the Field Sample Log sheet. We provide a person for
telephone response from 8 a.m. to 5 p.m. every workday. A voice-mail system is
available at other times. The Site Operators receive prompt and courteous responses
to any questions or problems, from procedures to sampler malfunctions.
Figure 10. IMPROVE Network Field Sample Log Sheet. The log sheet (example shown for site ACAD1, pre-weighed 12/25/1999, to be installed on 01/18/2000) provides, for each module (A-D) and each sample date in the three-week cycle, spaces to record the final and initial readings (cassette number, MxVac, Vac, Mag, and elapsed time), along with the operator's initials, date, time, current temperature, and comments. The laboratory help telephone number, (530) 752-1123, is printed on the sheet.
5.3 Sample Handling and Custody
The filter preparation and sample handling is done in a building constructed in 1992 by
the University to support the aerosol research of Crocker Nuclear Laboratory. It
consists of an administrative room, a research and testing room, and a sample handling
and gravimetric laboratory. The CNL Annex houses the IMPROVE network staff and
researchers, in addition to sample handling and gravimetric analysis equipment.
A flow diagram of the sample-handling process is shown in Figure 11.
Figure 11. IMPROVE Sampler Handling of Filters. The flow diagram traces each module's filters (A: PM2.5 Teflon, B: PM2.5 nylon, C: PM2.5 quartz, D: PM10 Teflon) from purchase and acceptance testing (with pre-firing of the quartz filters by the contractor), through pre-weighing of the Teflon filters at Davis, loading of clean filters into cassettes with ID labels, leak testing, shipment by U.S. Mail, storage and sampling at the site, and return shipment, to log review, transfer of filters to Petri dishes or slide mounts, gravimetric, absorption, XRF, and PIXE/PESA analysis at Davis, shipment to the ions and carbon contractors, and archiving in slide trays.
5.3.1 Purchasing Filters and Acceptance Testing
See SOP TI 101A for further detail.
The project manager estimates network filter requirements annually. Estimates include
filters for network site sampling, quality control and quality assurance, and planned
special studies. The project manager sends purchase orders for A (Teflon) filters to the
supply vendors. Filter testing is done upon receipt of the order, prior to accepting the
new filter lots for network use. Subcontractors are responsible for quality assurance
testing of the B (Nylon) and C (Quartz) filters.
Teflon filters
A single major order is initiated within existing procurement guidelines approximately six
months prior to the requirement to use the lot. The purchase request specifies, "All
Teflon filters must be from the same lot." Each pack contains lot ID number and initials
of the lab inspector. Filter batches are checked for contamination. Filter lots are
logged into the computer. Filter media is contained in sealed packs and stored at room
temperature.
We perform an acceptance test for each batch of Teflon filters approximately once per
year. One percent of the filters are examined for organic and elemental artifacts.
Most of the test filters are used in a large collocated array. All test filters are examined
by XRF and PIXE/PESA.
Nylon filters
A single major order is initiated within existing procurement guidelines approximately six
months prior to the requirement to use the lot. The purchase request specifies, "All
Nylon filters must be from the same lot." Each batch contains a batch number. Filter
media is contained in plastic and stored in a refrigerator. Filters for use in the
IMPROVE network are prepared in batches of ninety to one hundred from the nylon
substrate sheets using a sterilized 25mm or 37mm punch. Ten samples from each lot
are sent to the analysis contractor for testing.
Quartz filters
The carbon contractor, DRI, is responsible for pre-firing the Quartz filters and shipping
them to CNL. Each box contains a lot number and documentation showing date of
pre-firing, date of acceptance test, lot IDs, and number used from box for controls.
Filter batches are checked for contamination. Filter lots are logged into the computer.
The filters are stored under refrigeration. The contractor performs an acceptance test
for each batch of filters prior to shipment.
5.3.2 Load Clean Filters
The sample handling computer code maintains a record of what should be loaded into
each cassette, including field blanks and secondary quartz filters. The computer also
maintains a record of every action, the time, and the name of the technician. Following
the computer screen, the clean filters are loaded into the appropriate cassette within the
sampling cartridge. Each cassette is identified with a unique tag: five-digit alphanumeric
site code, sample date, and module type. The labels are prepared by the computer and
placed on the cartridge next to the appropriate cassette. Because the filter is weighed
just before being loaded in the cassette, there is no need for any additional identification
of the clean filter.
Each cartridge is identified with a color-coded label to indicate module type (A B C or
D). The cassette tags on the cartridge also indicate the correct module and date.
The loading procedure is initiated by running the PRE program. When PRE is started, it
checks the LOGS database and creates a list, LOGS QUEUE, of sites requiring boxes
of filters. It downloads the site configuration information from the SITE database, then
checks the WEIGHTS database to determine the sample date of the last filter weighed
for each site. PRE next determines the next four sample dates required at each site. The
site name, configuration, and sample date information is placed in a queue and is used to
drive the pre-weights program for gravimetric analysis.
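A minimal sketch of the date arithmetic behind this queue, assuming a hypothetical last-weighed date and the network's fixed every-third-day sampling cycle (the real PRE program reads this information from the WEIGHTS and SITE databases):

    # Sketch: compute the next four sample dates for a site on the 3-day cycle.
    # last_weighed is a hypothetical value; PRE obtains it from the WEIGHTS database.
    from datetime import date, timedelta

    def next_sample_dates(last_weighed, count=4, cycle_days=3):
        """Return the next `count` sample dates after the last weighed filter."""
        return [last_weighed + timedelta(days=cycle_days * n)
                for n in range(1, count + 1)]

    for d in next_sample_dates(date(2000, 1, 19)):
        print(d.isoformat())   # 2000-01-22, 2000-01-25, 2000-01-28, 2000-01-31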
The four cartridges for a given sample change are placed in a resealable bag that is labeled with the site code and date of installation, which is generally a Tuesday. This is generally different from the sample dates on the labels attached to the cartridges. The field
logsheet is included in the bag. The three bags for a given cycle are placed inside
shipping boxes. Prior to shipment, the laboratory manager checks the contents and
dates of each box against the correct information listed by the computer.
A cassette shipping container was specifically designed to transport a three week supply
of filter cassettes between UC Davis and the sampling sites. On the front there is a
mailer with a reversible prepaid mailing label. The UC Davis address is on one side of
the mailer, while the site address is on the other. Each shipping box is assigned to a
given site. For each site there are two shipping boxes in the system. A few sites with
shipment problems have three boxes.
5.3.3 Transport and On-Site Storage
Each shipping box is identified with a site ID and with the dates the cartridges are to be
installed. The container is received two to three weeks prior to the first installation date.
Each site normally has two shipping boxes.
All shipments are by First Class US Mail. The boxes are not shipped cool.
The site operators store the shipping boxes in a clean location, either near the office
or close to the field site. Acceptable storage locations are either indoor office space or
onsite sampler housing units.
The number of days that the clean and exposed filters are in shipment, in storage at the site, or in the sampler is listed in Table 10.
Table 10. Days the filter is in shipment, in storage at the site, or in the sampler.

filter in cycle | clean filter | exposed filter
filter 1 | 15 days | 22 days
filter 2 | 18 days | 19 days
filter 3 | 21 days | 16 days
filter 4 | 24 days | 13 days
filter 5 | 27 days | 10 days
filter 6 | 30 days | 7 days
filter 7 | 33 days | 4 days

5.3.4 Handling Onsite
Each shipping box contains the filters for three weekly changes. All three installation dates are labeled on the box. Inside the box are three labeled sealed bags, one for each change day. In each bag are four cartridges, one for each module, and one field log sheet. The operator stores the shipping container in a clean location out of direct sunlight until needed. The operator always brings the box with the clean filters. On two out of every three weeks, this box also contains the empty bag with the date of last week's installation. Every third week, the operator must bring a second box, so that one box holds the exposed filters and the other the clean filters.
At the site, the operator first records the pertinent collection information from the controller screen, removes the current cartridges from the sampler units, and seals each cartridge inside the empty plastic bag labeled with the previous week's change date.
After all four cartridges are present, the operator inserts the completed field log sheet,
reseals the bag, and places the bag back inside the shipping box. The box is retained in
the same clean location described in the previous paragraph.
When all three weeks of cassettes have been used, the box will be closed and mailed
back to CNL via First Class US Mail, using the reversible mail label on the front of the
box. On this change day, the operator is reminded by the controller screen and by a
label on the box to remove the memory card from the controller and place it in the box.
The operator then inserts a new memory card from the new box into the controller.
On one-third of the change days, the sampler is operating at the time of the change. This adds
a slight complication. The sampling is interrupted during the few minutes of the change.
The operator must shift one cassette from the current cartridge to the new cartridge.
This cassette is clearly identified and is the only cassette that can be removed from the
cartridge without a special tool. The new cartridge has an empty spot for the cassette
to be inserted. When the change is completed, the sampler automatically resumes
sampling. This change day never contains the first or last bag in a shipping box.
The log sheet has the initials of the site operator. The time of sample change is written
on the log sheet and recorded on the memory card.
5.3.5 Receiving Boxes and Log Sheet Information
When the US Postal Service delivers exposed cassettes from the site to CNL, a lab technician arranges the boxes in alphabetical order on the countertop, opens each box, removes the log sheets, and checks that the dates on the log sheets match the dates on the filter cassettes. The lab manager downloads the collection data from the memory card. The computer checks the memory card data for errors and allows the lab manager to delete or correct specific values. The lab manager then compares the field log sheet with the data from the memory card, displayed to look like the log sheet. If the memory card is missing or damaged, the data from the log sheet are entered in the computer. If there are any errors or problems, the problem is described on a designated note card and the box is placed on the problem samples shelf. If necessary, the site
operator is called.
Lab technicians clean the outside of the shipping boxes and verify that a 1-1/2" X 3"
white tag is mounted on the right side of the box to allow postal authorities to affix
required postage. Prepaid mail labels are reversed, exposing the 'send to’ address.
Shipping boxes are placed in the gravimetric laboratory on the shelf reserved for non-Teflon filter downloading.
5.3.6 Downloading Nylon and Quartz Filters
Boxes are processed in the order that the memory cards were entered. Lab technicians
initiate the CONTRACTOR program from the downloading station in the gravimetric
laboratory. This program displays the order of each sample for placement in Petri
dishes in preparation for shipment to the analysis contractors. Sample order is
displayed in sets of fifty and users may view tray order for B (Nylon) or C (Quartz)
media. The computer maintains a record of every action, the time, and the name of the
technician.
Lab technicians check the numbered Petri dish trays for each contractor against the
display on the downloading terminal. The filter identification label on the outside of each Petri dish and its position in the tray should match those in the CONTRACTOR display.
If not, they contact the lab manager to locate any misplaced or disordered Petri dishes
within the laboratory.
Lab technicians open the box containing the site and sample dates indicated on the
downloading terminal as being next in the queue. They transfer the nylon filter to a Petri
dish and transfer the accompanying Sample Identification label from the cassette to the
Petri dish. They sort the Petri dish into the numbered Petri tray in the order required by
the CONTRACTOR program queue.
When the numbered tray is full, they carefully transfer the Petri dishes, in order, into a
white Petri shipping tray. They label the tray with the site and sample date of the first
and last filters in the box and the filter type: B (Nylon) or C (Quartz). Lab technicians
again verify that the filters in the tray are in the order dictated by the CONTRACTOR
queue and place the Petri tray in the refrigerator for storage.
If the filter is missing or the deposit area is visibly damaged, they annotate the field log sheet and change the status code to XX (invalid). They add the comment on the log sheet to the LOGS database and change the status of the filter to XX by editing the log sheet through the LOGSIN program. (The codes are explained in Table 17.)
5.3.7 Downloading Teflon filters
The A (Teflon) filters are weighed and placed in slide mounts for non-destructive
analysis. After all analyses of the A (Teflon) filters are completed, the filters are
archived in a clean environment. No minimum period of archiving has been established.
The Teflon filters from Modules A and D are generally unloaded and post-weighed at the Cahn 31 balance. The post-weighing order is driven by a queue reflecting the order in which the logs were entered. Post-weighing shall not occur until the balance is calibrated and controls have been run.
The post-weighing procedure is initiated by running the POST program. When POST is started, it checks the LOGS database and creates a list, POSTQUEUE, of boxes to be downloaded. It acquires the configuration and the site, sample date, and pre-weight
data and places this information in the POSTQUEUE to drive the POST program for
gravimetric analysis.
If the deposit appears non-uniform, the technician contacts the lab manager and makes a comment on the log sheet. The lab manager shall decide whether to declare the Teflon filter invalid or questionable. The comment on the log sheet is added to the LOGS database, and the status of the filter is changed by editing the log sheet through the LOGSIN program.
Processing instructions are given by the POST program in the message box in the upper
right corner of the screen. A highlighted box, the active box, will indicate the location in
which data is currently being collected. Filters are sequentially weighed and then set
into slide mounts and placed into slide trays. Slide mounts are labeled with site code,
date of sample, and tray position. Trays remain in the weighing room until collection for
XRF and PIXE/PESA analysis.
5.3.8 IC and TOR
The quartz and nylon filters are shipped to the respective contractors approximately
every 10 days, in batches of approximately 400 filters. All filters in a given batch were
collected in the same quarter. Lab Technicians generate a text and database file of tray
position with the CONTRACTOR program. The file is named with the site name and
sample date of the first filter in the tray. The text files are printed and included with the
filters as a shipment inventory. Copies are filed at CNL. Database files are emailed to
the contractor representative and samples are shipped via UPS or Federal Express.
5.3.9 XRF, PIXE, PESA, and HIPS
At the end of each quarter, filters mounted in slides and positioned in slide trays are
transported from the sample gravimetric laboratory to the cyclotron building. The
cyclotron is shown in Figure 12. The PIXE and PESA analyses are done on beam line 1A. Tray positions are checked and samples are visually inspected for filter placement, sample uniformity, and mask pattern. Samples are first analyzed by HIPS and XRF and then moved to the North Cave for PIXE/PESA analysis.
Figure 12. CNL Cyclotron and North Cave (76-inch cyclotron in the main vault, 40-inch switching magnet, beam lines 1A and 1B, and pumping station in the North Cave).
5.4 Analytical Methods
5.4.1 Gravimetric Methodology
Gravimetric mass is measured with Cahn 25, Cahn 30, and Cahn 31 microbalances
modified with a zero area bail and vertical counterweight. These microbalances are
very sensitive weight and force measurement instruments. The balance is a force to
current converter, in which weight is directly proportional to current flowing in a torque
motor. The specifications for the balances are listed in Table 11.
Table 11. Cahn Microbalance Manufacturers Performance Specifications.

Instrument | Cahn 30 | Cahn 31 | Cahn 25
Range | A200 | A250 | A20
Capacity | 1.5 g | 1.5 g | 1.5 g
Weight range / sensitivity | 200 mg / 1 µg | 250 mg / 1 µg | 20 mg / 1 µg
Tare (Mechanical) | 1.5 g | 1.5 g | 1.5 g
Tare (Electrical) | 250 mg | 250 mg | 20 mg
Accuracy | 0.0012% | 0.0012% | 0.005%
Precision (Ultimate) | 0.1 µg | 0.1 µg | 0.1 µg
Precision (% of load) | 0.0001% | 0.0001% | 0.00015%
A computer-controlled algorithm determines a stable reading by weighing the filter every
eight seconds until consecutive readings have a mass difference of 1 µg or less. The
Cahn 30 and 31 algorithms compare two consecutive stable readings; the Cahn 25
compares three.
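A minimal sketch of this stability test, with a hypothetical read_balance_ug() callable standing in for the balance interface (n_agree would be 2 for the Cahn 30 and 31 and 3 for the Cahn 25):

    # Sketch: repeat 8-second weighings until the last readings agree within 1 ug.
    # read_balance_ug is a hypothetical callable returning the current mass in ug.
    import time

    def stable_mass_ug(read_balance_ug, n_agree=2, tolerance_ug=1.0, interval_s=8.0):
        """Return a mass once the last n_agree readings differ by no more than 1 ug."""
        readings = []
        while True:
            readings.append(read_balance_ug())
            recent = readings[-n_agree:]
            if len(recent) == n_agree and max(recent) - min(recent) <= tolerance_ug:
                return recent[-1]
            time.sleep(interval_s)   # wait before the next weighing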
A segregated laboratory area is used to control human traffic and to stabilize the
temperature and relative humidity of the weighing environment. Polonium strip ionization
units are used to reduce electrostatic effects in the weighing cavity and on individual
filters. These protect against noisy readout, drift and sudden readout shifts caused by
electrostatic charge. The microbalances are continuously powered and kept at a
consistent temperature to avoid calibration drift.
Gravimetric analysis of IMPROVE filters applies the difference method to determine the
mass of the collected aerosol. The pre-weight of each filter is measured prior to being
loaded into a cassette and sent into the field for sampling. Once exposed and returned
to the sample handling room, the filter is removed from the cassette, and the post-weight
of the filter is measured. The mass of the aerosol is determined by calculating the
difference between the pre-weight and the post-weight.
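A minimal sketch of the difference method, using hypothetical pre-weight, post-weight, and sampled-volume values (weights in mg, volume in m3, result in µg/m3):

    # Sketch: aerosol mass by difference and the resulting mass concentration.
    # The numeric values below are hypothetical examples, not network data.
    def mass_concentration_ugm3(pre_weight_mg, post_weight_mg, volume_m3):
        """Collected mass (post-weight minus pre-weight) divided by sampled volume."""
        collected_ug = (post_weight_mg - pre_weight_mg) * 1000.0   # mg -> ug
        return collected_ug / volume_m3

    print(round(mass_concentration_ugm3(10.822, 10.911, 32.8), 2))   # about 2.71 ug/m3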
The weighing procedures are directed by sample handling computer codes, the PRE
and POST programs described in Section 5.11.2. Because of the arrangement of the
sample handling in separate stations, the pre-weighing and post-weighing are conducted
on separate balances. The Cahn 31 and one of the Cahn 30 balances are used for
post-weights and the Cahn 25 and the other Cahn 30 for pre-weights. A protocol of
systematic cross-checks ensures that all balances give identical masses and the mass
measurements at the time of pre-weight are identical to those of post-weight. (Even
with a single balance, a protocol to ensure there is no shift between pre-weight and
post-weight would be required.) The output of each balance is entered directly into the
computer database, with the technician directing, via keyboard, the data collection and
storage.
5.4.2 HIPS Methodology
The coefficient of optical absorption babs is measured on all PM2.5 Teflon filters using the
Hybrid Integrating Plate/Sphere Analysis method (HIPS). A schematic of the HIPS
system is shown in Figure 13.
Figure 13. Setup of Hybrid Integrating Plate/Sphere Apparatus (HeNe laser and collimator, slide changer holding the filter, integrating sphere with reflectance radiometer detector and reflection standard, and opal glass integrating plate with transmittance radiometer detector).
Prior to spring of 1994, all analyses were done with an integrating plate system, which
was called Laser Integrating Plate Method (LIPM). This required analysis before and
after collection. The HIPS system combines an integrating sphere with the same
integrating plate. This reference system uses NIST-traceable reflectance standards.
The reflectance measured by the integrating sphere replaces the initial integrating plate
measurement. Thus the coefficient of absorption can be determined from the exposed
filter with the simultaneous measurement by two detectors. The HIPS system is
calibrated to match the UCD reference integrating sphere system. There are three
advantages of the HIPS system:
• Better quality control is possible with the ability to do true replicate analyses.
• The integrating sphere method accounts for all light in the system. Since the HIPS system is calibrated to match the integrating sphere, the measurements are more accurate.
• The HIPS system requires less time, since only one analysis is needed.
The coefficient of optical absorption (Mm-1) for either the HIPS or the integrating plate methods is determined from the intensity of transmitted light, T, the intensity of reflected light, R, the area of deposit (cm2), and the volume (m3) sampled by:

babs = 100 * (area / volume) * loge(I1 / I2)        (Equation 5)
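A direct transcription of Equation 5, with hypothetical intensity readings, deposit area, and sampled volume (area in cm2, volume in m3, result in Mm-1):

    # Sketch: coefficient of absorption from Equation 5; all inputs are hypothetical.
    import math

    def b_abs(i1, i2, area_cm2, volume_m3):
        """babs (Mm-1) from the intensity ratio, deposit area, and sampled volume."""
        return 100.0 * (area_cm2 / volume_m3) * math.log(i1 / i2)

    print(round(b_abs(1.00, 0.85, 2.2, 32.8), 1))   # about 1.1 Mm-1 for these inputs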
5.4.3 XRF Methodology
Module A (Teflon) filters are analyzed using the CNL XRF system, shown in Figure 14.
The XRF system uses a General Electric grounded anode diffraction type x-ray tube,
with a molybdenum anode. The x-rays produced by the tube are collimated and
directed onto an aerosol sample. The sample deposit absorbs the Mo x-ray energy and re-emits the energy as x-rays characteristic of the elements present on the sample. The x-rays are detected by high-resolution SiLi detectors with pulsed optical feedback to provide high count rate capabilities. The data from the detector are processed by standard pulse processing electronics and accumulated in a dual ported analog to digital converter connected to an Ethernet local area network. The data are recorded by a data acquisition program (ACE) and analyzed by a data reduction program (RACE).
Figure 14. XRF Analysis System (x-ray tube with Mo anode, x-ray collimators, Teflon filter with deposit, and Si(Li) detector).
IMPROVE network sample analyses by XRF are performed four times a year in
analysis run sessions lasting approximately three weeks, with a single break to refill the
detectors with liquid nitrogen. Following the break, the system is recalibrated, and a tray previously analyzed is reanalyzed to verify that no system parameters have changed significantly.
Each analysis session begins roughly one month after the end of the quarter, pending the
return of the filters and completion of gravimetric analysis and level I collection
validation.
The Acquisition of Composition by Elements program, ACE, controls the automatic
analysis system and collects the data generated during each analysis, storing it as a data
file. ACE is run on a MicroVAX terminal and, during analysis, continually displays the
status of the XRF system and of the current analysis, including the real time XRF
spectrum.
The Rapid Analysis of Composition by Elements program, RACE, analyzes the data.
RACE opens the files created by ACE and analyzes the x-ray spectrum. Analysis
involves subtracting the background radiation and the ‘blank’ spectrum (the x-rays due
to the filter substrate), leaving the spectrum due to the deposits on the filter. RACE also
locates the peaks in the spectrum, fits each peak with Gaussian curves to calculate its
integrated area, identifies the element responsible for each peak, deconvolutes any
interfering peaks, and uses a calibration table to determine the concentration in ng/cm2
of each element. RACE saves these data in a second data file for later reference.
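The areal densities reported by RACE (ng/cm2) relate to airborne concentrations (ng/m3) through the deposit area and the sampled air volume; a minimal sketch with hypothetical values:

    # Sketch: convert an areal density on the filter to an ambient concentration.
    # The deposit area and sampled volume below are hypothetical example values.
    def concentration_ngm3(areal_density_ngcm2, deposit_area_cm2, volume_m3):
        """Total mass on the deposit divided by the volume of air sampled."""
        return areal_density_ngcm2 * deposit_area_cm2 / volume_m3

    print(round(concentration_ngm3(150.0, 2.2, 32.8), 1))   # about 10.1 ng/m3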
The collection and processing of data are controlled by a single terminal running ACE,
displaying the real-time acquisition of x-rays, and displaying the data reduction produced by RACE. X-ray corrections are made to the concentration to compensate for the loss of x-rays as they leave the particles. Table 12 shows that the corrections for the elements heavier than iron are negligible.
The XRF system was involved as a NIST (National Institute of Standards and
Technology) collaborative laboratory for the development of standard reference
material SRM 1833 multi-element x-ray standard.
5.4.4 PIXE and PESA Methodology
Module A (Teflon) filters are analyzed by Proton Induced X-ray Emissions (PIXE) and
Proton Elastic Scattering Analysis (PESA) techniques using the 4.5 MeV Protons
produced by the 76" Cyclotron. In the north cave of CNL, the cyclotron beam
terminates at several target chambers. Beam line 1A is used for analysis of all normal
network filters. Electronics to run the detectors, both particle for PESA and x-ray for
PIXE, are located in the north cave. The analysis station is operated remotely from the
Computer Room using automated filter changers.
A schematic of the PIXE/PESA system is shown in Figure 15. The x-rays are detected
by a high-resolution SiLi detector with pulsed optical feedback to provide high count
rate capabilities. The PESA system uses a surface barrier detector (Ultra™ Ion Implanted Silicon Charged Particle Detector, made by EG&G Ortec™) to detect protons forward scattered 30° by their passage through the sample deposit.
Figure 15. Setup of PIXE/PESA Analysis System (proton beam and beam collimator, Teflon filter with deposit, PESA detector, PIXE x-ray detector, and beam current monitor).
The PESA detector measures the protons from elastic collisions between incident
protons and particles in the deposit. By positioning the detector location at 30° off the
beam line in the forward direction, maximum definition of the peak produced by
scattering from hydrogen atoms is obtained. Since Teflon filters contain very low
concentrations of hydrogen, the measured value is solely due to the hydrogen in the
deposit. Spectra of a blank filter and of a network sample are shown in Figure 16. The
large peak labeled “C, F, O” is the sum of all protons scattered from elements heavier
than H, primarily the C and F from the Teflon and the C and O from the sample. The
background under the H peak is determined from the sample spectrum without
reference to the spectrum of the clean blank. The spectrum of the clean blank is used to
monitor any H contamination in the filter substrate. The minimum detectable limit for
hydrogen of around 5 ng/m3 is similar to those for PIXE.
Figure 16. PESA Spectra for a Blank Teflon Filter and a Network Sample (square root of counts vs. channel number; both spectra show the large peak from C, F, and O, while the H peak appears in the network sample).
The system is calibrated using Mylar films of known density. Rectangles of known area
are cut from a large roll of uniform mylar and weighed on an NIST-traceable balance.
The data from the detector are processed by standard electronics and accumulated in a
dual ported analog to digital converter connected to an Ethernet local area network.
The number of x-rays and protons detected depends on the number of protons in the cyclotron beam. For PIXE-PESA this is determined by integrating the charge deposited by the beam in a Faraday cup behind the sample. These data are used for quality control of the PIXE and PESA systems.
The electronics to support the PIXE/PESA system are located in the North Cave and the Computer Room of Crocker Nuclear Laboratory.
The data from the detectors are processed by standard pulse processing electronics and
accumulated in a dual ported analog to digital converter connected to an Ethernet local
area network.
ACE (Acquisition of Composition by Element) runs the automated acquisition system
for PIXE and PESA analysis. ACE controls the sample changer, using instruction files
called ‘tray files’ that indicate which positions in each analysis tray contain valid
samples. ACE collects the data generated during each analysis, including the x-ray spectra, the integrated charge, and the detector live time. This information, along with the sample identification data from the tray file, is saved as an output binary file. The real-time x-ray and particle spectra, detector live time, and integrated charge are displayed on the MicroVAX console for quality control.
The program RACE (Rapid Analysis for Composition by Element) identifies the
elemental peaks and calculates the elemental concentrations from the spectral files
produced by ACE. RACE opens the file for the analysis and a file with the spectrum of
a system blank. It first subtracts the background x-rays due to scattering of x-rays or
protons using the system blank and a smoothing subroutine. The remaining spectrum,
due to the deposits on the filter, is processed to locate elemental peaks. The peaks are
identified and matched to a series of Gaussian peaks. Using the energy calibration, each
peak is labeled by element and x-ray line. The concentration (areal density in ng/cm2)
for each peak is calculated from the analysis parameters and the cross section for the
specific line as determined previously by the standards. The program calculates the
uncertainty in the concentration from the number of counts in the peak and background.
It also calculates the minimum detectable limits for all possible elements in the table from
the counts in the background. A subroutine resolves any possible interferences. The concentrations and errors of all identified species, and the minimum detectable limits of all elemental species, are stored in separate binary files for later reference.
The major steps performed by RACE for spectral analysis are:
• Remove most of the background by subtracting a system blank.
• Remove the remaining background by subtracting a smooth function calculated as the lesser of the counts in the channel and an average counts per channel over a specified width.
• Use a peak search routine to identify the peaks and valleys by comparing the shape of the spectrum near a given channel to what would be expected if there were a peak at that channel.
• Fit each segment of the spectrum to one or more Gaussian peaks. Using this information, the centroid of the peak, the energy calibration, the integrated charge, and a "cross section table", calculate the elemental concentration represented by each peak.
• Calculate the uncertainty of the concentration from the uncertainty in the Gaussian area.
• Calculate the minimum detectable limits for all elements in the cross section table as 3.29 times the square root of the background counts under the region that would have been occupied by a peak of normal width (see the sketch following this list).
• Resolve interferences between the various lines, using preset criteria.
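To make the background and detection-limit steps concrete, the following sketch (illustrative only, not the RACE code itself; the function names, window width, and synthetic spectrum are hypothetical) applies the lesser-of-channel-and-local-average background rule and the 3.29-times-square-root minimum detectable limit in Python.

    import numpy as np

    def smooth_background(counts, width=11):
        """For each channel, take the lesser of the channel counts and the
        average counts per channel over a window of the specified width."""
        local_avg = np.convolve(counts, np.ones(width) / width, mode="same")
        return np.minimum(counts, local_avg)

    def minimum_detectable_limit(background, lo, hi):
        """MDL in counts: 3.29 times the square root of the background counts
        in the channels a peak of normal width would occupy (lo..hi)."""
        return 3.29 * np.sqrt(background[lo:hi + 1].sum())

    # Synthetic 1024-channel spectrum standing in for a blank-subtracted sample.
    spectrum = np.random.default_rng(0).poisson(5.0, 1024).astype(float)
    background = smooth_background(spectrum)
    net_counts = spectrum - background      # counts attributed to elemental peaks
    print(minimum_detectable_limit(background, 400, 410))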
IMPROVE sample analyses by PIXE are performed four times a year in analysis run
sessions lasting approximately three days. Each analysis session begins roughly two
months after the end of the quarter, pending the return of the filters and completion of
gravimetric, HIPS, and XRF analysis.
X-ray matrix corrections for particle size and filter loading are made to compensate for
the absorption of the x-rays by the particles. The same corrections are used for PIXE
and XRF. The size correction compensates for absorption by the particle containing the
emitting atom. For a given element, it depends on the size distribution and composition
of the particle containing the element. A fixed correction for each element was
calculated using reasonable assumptions. The loading correction compensates for
absorption by another particle that the x-ray must pass through to reach the detector.
The value for a given sample depends on the loading; this is estimated from the sum of
elements. The net correction is the product of the two terms. The size correction for
fine particles and an average loading correction are given in Table 12. The total
correction exceeds 3% only for elements lighter than sulfur.
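As a minimal illustration (not the production analysis code) of how the factors in Table 12 below might be applied, the sketch multiplies a measured areal density by the product of the size and loading corrections; only the tabulated factors come from the document, and the function name and example value are hypothetical.

    # Size and loading correction factors from Table 12 (fine particles).
    SIZE = {"Na": 1.07, "Al": 1.17, "Si": 1.09, "S": 1.01, "K": 1.02, "Ca": 1.02, "Fe": 1.01}
    LOAD = {"Na": 1.12, "Al": 1.05, "Si": 1.03, "S": 1.02, "K": 1.01, "Ca": 1.01, "Fe": 1.00}

    def corrected_areal_density(element, measured_ng_cm2):
        """Apply the net matrix correction (size * load); elements from Ni
        through Pb use 1.00 for both factors."""
        net = SIZE.get(element, 1.00) * LOAD.get(element, 1.00)
        return measured_ng_cm2 * net

    # Example: a sodium reading of 100 ng/cm2 becomes roughly 120 ng/cm2.
    print(corrected_areal_density("Na", 100.0))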
Table 12. Size and Load Correction for Fine Particles with PIXE or XRF.
element      Na     Al     Si     S      K      Ca     Fe     Ni to Pb
size         1.07   1.17   1.09   1.01   1.02   1.02   1.01   1.00
load         1.12   1.05   1.03   1.02   1.01   1.01   1.00   1.00
5.4.5
Ion Chromatography Methodology
From Ions Contractor: RTI SOP Revised July 31, 2000
SUMMARY OF METHOD
Nylon filters for collection of anions do not require pretreatment. Exposed
filter samples are extracted by a method appropriate for the analyte(s) of
interest. Nylon filters to be analyzed for anions only will be extracted with the
eluent used for IC analysis, a dilute sodium carbonate/sodium bicarbonate
buffer. Nylon filters to be analyzed for anions and cations will be extracted
with deionized water.
Sample extracts are passed through a column of ion chromatographic resin
consisting of polymer beads coated with quaternary ammonium active sites.
During passage through the column, anion separation occurs due to the
different affinities of the anions for the active resin sites. Following
separation, the anions pass through a suppressor column which exchanges all
cations for H+ ions. An eluent which yields a low conducting acid is used.
Species are detected and quantified as their acids (e.g., HCl) by a conductivity
meter.
Cations in solution are separated when passed through an ion
chromatographic column containing surface-sulfonated ion exchange resin;
separation is caused by differing affinities of the cations for the active sites on
the resin. After separation, the cations pass through a suppressor column
which exchanges all anions for OH- ions. Species are detected and quantified
as their hydroxides by a conductivity meter. The eluent is hydrochloric acid
which yields deionized water when passed through the suppressor column.
INTERFERENCES
Large amounts of anions eluting close to the ions of interest will result in an
interference. No interferences have been observed in nylon filter samples
analyzed to date. If interferences are observed, several steps to increase
separation can be taken, such as reducing eluent strength and/or flow rate or
replacing the columns.
APPARATUS
Anions: Ion chromatograph (Dionex Model DX-500 with LC20
chromatography module, one IP25 isocratic pump, one GP50 gradient pump,
two CD20 conductivity detectors, a Dionex AS40 automated sampler and
PeakNet Control Windows 95 Workstation) with Dionex AG12A anion guard
column, Dionex AS12A anion separator column, and Dionex ASRS-ULTRA
anion self-regenerating suppressor column.
Cations: Ion chromatograph (Dionex Model DX-500 with LC20
chromatography module, two IP20 isocratic pumps, two CD20 conductivity
detectors, a Dionex AS40 automated sampler and PeakNet software) with
Dionex CG12A cation guard column, Dionex CS12A cation separator column, and a cation self-regenerating suppressor column.
REAGENTS
ACS reagent grade chemicals are used for the preparation of all solutions.
Chemicals used for the preparation of calibration standards are dried at
105°C for 2 hours and cooled in a desiccator immediately before use.
Anion Analysis
• Eluent, 0.0017 M NaHCO3/0.0018 M Na2CO3: Dissolve 2.8562 g NaHCO3 and 3.8156 g Na2CO3 in 20 liters deionized water.
• Regenerant, 0.025 N H2SO4: Add 500 mL 1 N H2SO4 to a Nalgene carboy and dilute to 20 L with deionized water.
• Mixed Stock Solution, 1000 mg/L NO2-, NO3-, and SO42-, and 200 mg/L Cl-: Dissolve 1.4998 g NaNO2, 1.6305 g KNO3, 1.8142 g K2SO4, and 0.3297 g NaCl in 1 liter deionized water.
• Standard Solution A: Dilute 10 mL mixed stock solution to 100 mL with eluent (100 mg/L NO2-, NO3-, and SO42-, and 20 mg/L Cl-).
• Standard Solution B: Dilute 10 mL Standard Solution A to 100 mL with eluent (10 mg/L NO2-, NO3-, and SO42-, and 2 mg/L Cl-).
• Calibration Standards: Using Standard Solutions A and B, prepare calibration standards with eluent in 100 mL volumetric flasks as shown in Table 13. Preparation of standards in eluent eliminates the water dip which interferes with chloride quantitation. Prepare fresh calibration standards weekly.
• Quality Assurance Stock Solutions: Purchase from CPI International, GFS (Columbus, OH), and/or EM Science (Gibbstown, NJ).
Table 13. Preparation of Anion Calibration Standards
Standard                NO2-, NO3-, SO42-    Cl-        mL of Standard
                        mg/L                 mg/L       Solution/100 mL
STANDARD SOLUTION A
1                       10.0                 2.0        10.0
2                        5.0                 1.0         5.0
3                        3.0                 0.6         3.0
4                        2.0                 0.4         2.0
STANDARD SOLUTION B
5                        1.0                 0.2        10.0
6                        0.5                 0.1         5.0
7                        0.2                 0.04        2.0
8                        0.1                 0.02        1.0
9                        0.05                0.01        0.5
NOTE: Higher concentration standards can be prepared from Standard A or the mixed stock solution if needed.
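The concentrations in Table 13 follow directly from the dilution volumes. The short sketch below (an illustrative check, not part of the RTI SOP) computes each standard's concentration as the parent-solution concentration times the fraction of the 100 mL flask it occupies.

    # Parent solutions: Standard Solution A and B concentrations in mg/L
    # (anions NO2-/NO3-/SO42-, and Cl-), as given in the reagent list above.
    PARENTS = {"A": (100.0, 20.0), "B": (10.0, 2.0)}

    def standard_concentration(parent, ml_added, flask_ml=100.0):
        anions, chloride = PARENTS[parent]
        factor = ml_added / flask_ml
        return anions * factor, chloride * factor

    print(standard_concentration("A", 10.0))  # Standard 1: 10.0 mg/L anions, 2.0 mg/L Cl-
    print(standard_concentration("B", 2.0))   # Standard 7: 0.2 mg/L anions, 0.04 mg/L Cl-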
Cation Analysis
• Concentrated Eluent Stock Solution, 1 N Methanesulfonic Acid (MSA): Dilute 96.1 g (~65 mL) of 99% MSA to 1 liter with deionized water.
• Eluent, 20 mN Methanesulfonic Acid: Dilute 20 mL 1 N MSA to 1 liter using deionized water.
• Calibration Standard Stock Solutions, 1000 mg/L each NH4+, Na+, K+, Ca2+, and Mg2+.
• Standard Solution A: Add 10 mL of each stock solution (NH4+, Na+, K+, Ca2+, and Mg2+) to a 100 mL volumetric flask, dilute to the mark with deionized water, and mix thoroughly (100 mg/L NH4+, Na+, K+, Ca2+, and Mg2+).
• Standard Solution B: Dilute 10 mL Standard Solution A to 100 mL with deionized water (10 mg/L NH4+, Na+, K+, Ca2+, and Mg2+).
• Calibration Standards: Using Standard Solutions A and B, prepare calibration standards with deionized water in 100 mL volumetric flasks as shown in Table 14. Prepare fresh calibration standards weekly.
• Quality Assurance Stock Solutions: 1000 mg/L each NH4+, Na+, K+, Ca2+, and Mg2+, NIST traceable; purchase from CPI International and/or EM Science (Gibbstown, NJ).
Table 14. Preparation of Cation Calibration Standards
Standard                NH4+, Na+, K+, Ca2+, Mg2+    mL of Standard
                        mg/L                         Solution/100 mL
STANDARD SOLUTION A
1                       25.0                         25.0
2                       10.0                         10.0
3                        3.0                          3.0
STANDARD SOLUTION B
4                        1.0                         10.0
5                        0.3                          3.0
1 ppm SOLUTION
6                        0.1                         10.0
NOTE: Higher concentration standards can be prepared from Standard A or the mixed stock solution if needed.
SAMPLE HANDLING
RTI will provide chain-of-custody documentation with all sample shipments to
track and ensure that filter samples are collected, transferred, stored, and
analyzed by authorized personnel; sample integrity is maintained during all
phases of sample handling and analysis; and an accurate written record is
maintained of sample handling and treatment from the time of its collection,
through the laboratory analytical process, to the eventual relinquishing of all
data to the NPS.
FILTER EXTRACTION PROCEDURE
NOTE: Nylon filters to be analyzed for anions only will be extracted with the
eluent used for IC analysis, a dilute sodium carbonate/sodium bicarbonate
buffer. Filters to be analyzed for anions and cations will be extracted with
water. The anion eluent produces a large sodium peak in the cation
chromatogram that precludes quantitation of the sodium ion in the filter
extract and interferes with the quantitation of ammonium ion.
To extract the filters, the analyst will do the following:
(1) Remove filters to be extracted from the freezer and allow them to
equilibrate to room temperature.
(2) Using gloved hands and tweezers, place each filter in a Nalgene
Monovette that has been labeled with the sample I.D. (The label is
carefully taped near the top of the Monovette to prevent loss during
sonication.)
(3) Add exactly 10 mL of extraction solution (0.3 mM NaHCO3/2.7 mM
Na2CO3 or deionized water) using a repipet. (The repipet will be
calibrated against a 10 mL Class A volumetric pipet.)
(4) Push up the Monovette plunger so that no air space remains above
the filter.
(5) Place the batch of Monovettes in an epoxy-coated wire test tube rack,
expose them to ultrasonic energy in a bath for 30 minutes, and then
allow them to sit at room temperature overnight.
(6) Record the date of extraction on the RTI Sample Filter Processing
Form.
Refrigerate samples that will not be analyzed the following day. Just prior to analysis, re-sonicate the refrigerated filters and extracts for thirty minutes; this also serves to bring the samples to room temperature.
IC PROCEDURE
1. Fill the eluent reservoirs with the eluent.
2. Start the eluent flow, activate the self-regenerating suppressor, and
allow the baseline to stabilize.
3. Inject two eluent blanks to flush the system and to ensure that the
system is operating properly.
4. Anions: Using the calibration schedule, perform the daily multipoint calibration over the range 0.05 to 25.0 ppm NO2-, NO3-, and SO42-, followed by the quality control/quality assurance (QC/QA) samples listed below:
• QC sample containing concentrations of Cl-, NO2-, NO3-, and SO42- typical of those found in the mid-range of actual filter extract concentrations.
• QC sample containing concentrations of Cl-, NO2-, NO3-, and SO42- typical of those found at the lower end of actual filter extract concentrations.
• Commercially prepared, NIST-traceable QA sample containing known concentrations of Cl-, NO2-, NO3-, and SO42-.
Cations: Using the calibration schedule, perform the daily multipoint calibration over the appropriate range followed by the quality assurance/quality control (QC/QA) samples listed below:
• QC sample with NH4+, Na+, K+, Ca2+, and Mg2+ concentrations typical of the higher concentrations found in actual filter extracts.
• QC sample with NH4+, Na+, K+, Ca2+, and Mg2+ concentrations typical of the lower concentrations found in actual filter extracts.
• Commercially prepared, NIST-traceable QA sample containing known concentrations of NH4+, Na+, K+, Ca2+, and Mg2+.
If the observed values for sulfate differ by more than 10 percent from the
known values, identify and correct the problem before analyzing samples.
5. Load the sample extracts into the autosampler vials according to the
schedule prepared for that day. Typically, fifty field samples are
analyzed per day. The daily schedule includes, at a minimum, 3
duplicate samples, 2 spiked samples and 5 QA/QC samples.
6. Begin the analysis run, occasionally checking to ensure that the system is
operating properly.
7. Examine the data at the end of the run. If the concentration of any ion
exceeds the upper end of its calibration curve, dilute the sample
appropriately and include with the samples to be analyzed the following
day.
CALCULATIONS AND DATA REDUCTION
Peak areas are entered into the computer automatically by the PeakNet
software where calculations are performed using a quadratic fit to the
calibration data. The quadratic fit yields the following:
y_i = a*x_i^2 + b*x_i + c
where:
y_i = the calculated sulfate concentration, mg/L
x_i = the instrument response
Anions: Initially, the calibration curves from 0.05 to 10.0 ppm NO2-, NO3- and
SO42- (0.01 to 2.0 ppm Cl-) are used for the calculation of the extract anion
concentrations. All nitrite, nitrate, and sulfate concentrations that exceed 10
ppm and all chloride concentrations that exceed 2 ppm are recalculated with
the 25.0 ppm NO2-, NO3- and SO42- (5 ppm Cl-) standard added to the
calibration curve. If a recalculated nitrite, nitrate or sulfate concentration
exceeds 25 ppm or a recalculated chloride concentration exceeds 5 ppm, the
extract is diluted appropriately (usually 5-fold) to bring the ion concentration
into the calibration range and reanalyzed.
Cations: Initially, the calibration curve from 0.05 to 10.0 ppm is used for the
calculation of the extract NH4+, Na+, K+, Ca2+, and Mg2+ concentrations. All
cation concentrations that exceed 10 ppm are recalculated with the 25.0 ppm
standard added to the calibration curve. If a recalculated cation
concentration exceeds 25 ppm, the extract is diluted appropriately (usually 5-fold) to bring the cation concentration into the calibration range and
reanalyzed.
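The following sketch illustrates the quadratic calibration described above using NumPy rather than the PeakNet software; the response and concentration values are made up for illustration only.

    import numpy as np

    # Hypothetical calibration points: instrument response (peak area) versus
    # known standard concentration in mg/L.
    response = np.array([0.8, 4.1, 8.3, 16.9, 42.0, 84.5])
    known_mg_per_l = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])

    # Quadratic fit y = a*x^2 + b*x + c with x = response, y = concentration.
    a, b, c = np.polyfit(response, known_mg_per_l, deg=2)

    def concentration(x):
        """Calculate an extract concentration (mg/L) from an instrument response."""
        return a * x ** 2 + b * x + c

    # A result above the top standard would be recalculated with the 25 ppm
    # standard added to the curve, or the extract diluted and reanalyzed.
    print(round(concentration(20.0), 3))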
5.4.6
Thermal Optical Reflection Methodology
From Carbon Contractor: DRI SOP #2-204.6 Revised June, 2000
The DRI thermal/optical carbon analyzer is based on the preferential
oxidation of organic carbon (OC) and elemental carbon (EC) compounds at
different temperatures. It relies on the fact that organic compounds can be
volatilized from the sample deposit in a helium (He) atmosphere at low
temperatures, while elemental carbon is not oxidized and removed. The
analyzer operates by: 1) liberating carbon compounds under different temperature and oxidation environments from a small sample punch (normally 0.536 cm2) taken from a quartz fiber filter; 2) converting these compounds to carbon dioxide (CO2) by passing the volatilized compounds through an oxidizer (heated manganese dioxide, MnO2); 3) reducing the CO2 to methane (CH4) by passing the flow through a methanator (hydrogen-enriched nickel catalyst); and 4) quantifying the CH4 equivalents with a flame ionization detector (FID).
The principal function of the optical (laser reflectance) component of the
analyzer is correction for pyrolysis of organic carbon compounds to elemental
carbon. Without this correction, the organic carbon fraction of the sample
would be underestimated and the elemental carbon fraction would be
overestimated by including some pyrolyzed organic carbon. The correction
for pyrolysis is made by continuously monitoring the filter reflectance (via a
helium-neon laser and a photodetector) throughout an analysis cycle. This
reflectance, largely dominated by the presence of light absorbing elemental
carbon, decreases as pyrolysis takes place and increases as light absorbing
carbon is liberated during the latter part of the analysis. By monitoring the
reflectance, the portion of the elemental carbon peak corresponding to
pyrolyzed organic carbon can be accurately assigned to the organic fraction.
The correction for pyrolytic conversion of organic to elemental carbon is
essential for an unbiased measurement of both carbon fractions, as discussed
in Johnson et al. (1981).4
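A conceptual sketch of that correction is shown below. It is not the DRI analyzer software; the array names and synthetic signals are hypothetical, and the split is simply taken at the point where the reflectance first returns to its initial value after oxygen is introduced.

    import numpy as np

    def oc_ec_split(carbon_ug, reflectance, o2_start_index):
        """carbon_ug: carbon evolved in each time step (µg);
        reflectance: laser reflectance at each time step;
        o2_start_index: step at which oxygen is added to the carrier gas.
        Carbon evolved before the reflectance recovers is assigned to OC."""
        initial_r = reflectance[0]
        recovered = np.nonzero(reflectance[o2_start_index:] >= initial_r)[0]
        split = o2_start_index + (recovered[0] if recovered.size else 0)
        return carbon_ug[:split].sum(), carbon_ug[split:].sum()   # (OC, EC)

    # Tiny synthetic example: O2 added at step 5; reflectance dips from
    # pyrolysis and recovers at step 7, so steps 5-6 are reassigned to OC.
    r = np.array([1.0, 0.9, 0.8, 0.7, 0.7, 0.7, 0.9, 1.0, 1.05, 1.05])
    c = np.array([0.5, 0.6, 0.4, 0.3, 0.2, 0.8, 0.9, 1.2, 0.7, 0.1])
    print(oc_ec_split(c, r, 5))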
Carbonate carbon can be determined by measuring the CO2 evolved upon
acidification of the sample punch before the normal carbon analysis
procedure.
Seven temperature fractions as well as the pyrolysis correction are
individually determined and can be reported. Values routinely reported
include total organic carbon, organic carbon evolved at temperatures greater than 120ºC (high-temperature organic carbon), total elemental carbon, elemental carbon evolved at temperatures greater than 550ºC (high-temperature elemental carbon), total carbon, and carbonate.
5.5
Quality Control
5.5.1
IMPROVE Sampler Quality Control
New Samplers
New samplers are tested at CNL for leaks, wiring problems, and faulty parts. The
sampler is completely assembled in the lab, and then shipped to the site. Before the first
samples are installed, a CNL technician verifies the correct installation, photographs and
records details of the site, and leak checks and calibrates the sampler.
4
Johnson, R.L., J.J. Shah, R.A. Cary, and J.J. Huntzicker (1981). "An Automated
Thermal-Optical Method for the Analysis of Carbonaceous Aerosol." In Atmospheric
Aerosol, Source/Air Quality Relationships, E.S. Macias and P.K., Hopke, eds. ACS
Symposium Series, No. 169. American Chemical Society, Plenum Press, New York, NY.
Biannual Flow Rate Calibration
Every six months the Site Operators conduct a flow calibration. The Field Manager sends a calibrated calibration device, a set of calibration cassettes in cartridges, and a memory card. The operator installs the cartridge, installs the calibration device, and initiates the calibration sequence. A microprocessor calculates the calibration parameters for the module and compares them with the previous parameters. The operator records the calibration results on a Sampler Field Calibration Form. If the values have changed by more than 5%, the Site Operator performs a full calibration.
Verification of Contents of Shipping Box
The last step prior to sending the cassettes into the field for sampling is to check that the
right number and type of filters are being sent. The Lab Technician initiates the
LEAKCHECK program. This program displays the type and configuration of filters
and cassettes expected in each shipping box. It also displays the site and sample date
for each filter. Technicians verify filter type, site and sample date.
5.5.2
Gravimetric Quality Control
The sample handling and gravimetric laboratory is climate controlled, and the entry is situated so that there is no pass-through foot traffic. The laboratory floor and work surfaces are vacuumed weekly with a high-efficiency (HEPA) vacuum cleaner. All counter surfaces are cleaned once per week with reagent grade alcohol, and the areas in front of the electrobalances are cleaned daily. The entire CNL Annex building is on the same central air supply system. However, the air is separately filtered prior to entering the sample handling room, and the room is maintained at positive pressure to reduce fugitive dust levels.
Balance accuracy and precision are checked and maintained twice daily during regular calibrations. Additionally, the microbalances must maintain a zero balance within ±1 µg. If the zero drifts beyond this range, Laboratory Technicians recalibrate the instruments.
Lab Technicians record Gaussmeter values as part of the standard calibration procedure. Balances are recalibrated if the magnetic field changes by more than 5 gauss.
The sample handling system program also records ambient room temperature and
relative humidity with each sample.
Temperature and relative humidity control is through a central heating/air conditioning
unit used for the entire CNL Annex building. To reduce dust levels in the sample
handling room, the room is over-pressured, with the inlet air passing through a high
efficiency filter to reduce daily dust build up. The set temperature is 22.2º C with an
allowed twenty-four hour variation of ± 3º C. Relative humidity is set at 35% with an
allowed twenty-four hour variation of ±5%. The temperature and relative humidity are
measured and recorded in the data base with every post-weight measurement. When
the temperature or relative humidity range is exceeded, analysis is discontinued until the
proper range is achieved.
Controls Procedure
The balances are cleaned and calibrated at the beginning of the morning and afternoon
work sessions. For the pre-collection and post-collection balance, a new clean filter is
weighed in the morning and re-weighed in the afternoon. After the afternoon analysis,
the control filter is loaded into a cassette and stored in a clean location for 42 days, the
length of time a filter sent to the field remains in a cassette. The 42-day old archived
filter is also weighed in the morning and re-weighed in the afternoon. A flow diagram
for control filters is shown in Figure 17.
The difference of afternoon minus morning, named "reweight," is recorded in the computer database and used to monitor each balance. The difference between the pre-collection mass and the post-collection mass, named "control," is recorded in the computer database and used to monitor the dual-balance system.
The data quality objective for the laboratory "reweight" is a mean difference of zero, with a standard deviation of 2 µg. The historic mean and standard deviation for the network (1988-2000) are 0 ± 1 µg. The data quality objective for the laboratory "control" is a mean of 3 µg, with a standard deviation of 5 µg; this is the historic value for the IMPROVE network using masked Version I cassettes. Periodic tests with blank filters stored in Petri dishes for 42 days give a mean and standard deviation of 0 ± 2 µg, indicating that the Version I cassettes caused a small and variable mass increase through contact with the filters. The mean and standard deviation for Version II cassettes are 0 ± 2 µg, showing no effect from being in the cassettes. The mean is monitored to assure that there is no drift in the system over time. (The standard deviation of 5 µg corresponds to 150 ng/m3, which is small compared to the network average of 6000 ng/m3.) The primary advantage of the control protocol over a field blank protocol is that the laboratory has complete knowledge of the history of every sample.
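A minimal sketch of how the "reweight" and "control" series could be monitored against these objectives is shown below; it is not the laboratory database code, and the tolerance and example values are hypothetical.

    import statistics

    def check_series(values_ug, target_mean_ug, target_sd_ug, mean_tolerance_ug=1.0):
        """Compare a series of mass differences (µg) with its objective."""
        mean = statistics.fmean(values_ug)
        sd = statistics.stdev(values_ug)
        ok = abs(mean - target_mean_ug) <= mean_tolerance_ug and sd <= target_sd_ug
        return round(mean, 2), round(sd, 2), ok

    # Hypothetical recent "reweight" values (afternoon minus morning, µg);
    # objective: mean of zero with a standard deviation of 2 µg.
    reweights = [0.4, -0.7, 1.1, 0.0, -0.3, 0.6, -1.2, 0.2]
    print(check_series(reweights, target_mean_ug=0.0, target_sd_ug=2.0))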
[Flowchart: at 8:00 a.m., a new filter is taken from stock, labeled, pre-weighed, and placed in a labeled Petri dish, while the oldest archived control filter is removed from its cassette, post-weighed, and placed in a labeled Petri dish; at 1:00 p.m., the new filter is re-weighed (RePre weight), mounted in a labeled cassette, and archived on the cassette shelf for six weeks, while the archived filter is re-weighed (RePost weight), returned to a labeled Petri dish, archived for four weeks, and then returned to the Teflon filter stock.]
Figure 17. Flowchart for Controls Filter Processing.
The control filters facilitate determination of the following:
• Any change in the equivalency of the two balances. The Cahn 31 reconstructed weights (from applying the calibration equation to the Cahn 25 weights) should match the Cahn 31 measured weights within the quoted precision of the balance (± 2.5 micrograms). If the weights do not correlate well, the problem is noted and rectified before measurements of real filters are performed.
• Any change in either electrobalance between morning and afternoon. Controls are run in the afternoon instead of the evening since the pre-weighed filters are mailed out at 2:00 p.m. daily. If controls indicate a potential problem with the electrobalance, the filters can be removed from the shipping queue and reweighed once the problem is resolved.
• Any shift in readings between the pre-weights and post-weights for an ambient sample. As pre- and post-weights are performed six weeks apart, a drift or shift in either balance could lead to erroneous gravimetric measurements. Control filters provide a daily record of balance consistency.
• The uncertainty of the analysis. The difference between the morning weights and the afternoon re-weights provides an estimate of the precision of each electrobalance.
• The value and uncertainty of any artifact associated with storage in a cassette for 42 days. The artifact was small with the Version I cassettes (3 µg or 100 ng/m3) and is zero with the Version II cassettes. The precision for ambient mass concentrations is based on the standard deviation of the mass differences for Version I cassettes (5 µg). (This corresponds to 160 ng/m3, which is less than 3% of the average network sample of 6000 ng/m3.)
5.5.3
HIPS Quality Control
The HIPS (Hybrid Integrating Plate and Sphere System) is calibrated using a set of 20
standard filters that were previously calibrated by the CNL reference integrating sphere
system, as discussed in Section 5.7.3. These filters span the range of typical loadings of
the IMPROVE network. One filter is used as the primary standard, while the other 19
are used as controls.
Prior to analysis, the HIPS system is warmed up for at least two hours. This allows the
laser and detectors to stabilize. Detector systems are set to levels documented in the
calibration logbook.
At the beginning of the analysis session, the system is calibrated using the primary standard filter, and the remaining 19 standard filters are run as controls. If the measured control values differ by more than 5% from the expected values, the entire system is examined to determine the cause of the discrepancy. The transmission and reflection values are also determined for an internal control filter that is kept in the arm that moves the slides into position. The values for this internal control filter can be determined at any time by the spectroscopist; they are recorded at the beginning and end of every tray of samples, and generally once in the middle. A tray of samples from a previous session is reanalyzed after every 1000 samples, or sooner if the value of the internal control filter changes significantly. If the reanalysis values differ by more than 5% from the expected values, the system is recalibrated and the previous 1000 samples are rerun. At the conclusion of the session the standards and reanalysis trays are run again.
5.5.4
XRF, PIXE, and PESA Quality Control
The XRF or PIXE systems are calibrated whenever there is a change in the physical
configuration or in the detectors or when a problem is observed in the routine calibration
checks. XRF uses a set of 17 Micromatter™ thin film standards (PbGaAs, CuS, CaF2, Ti, V, Cr, Fe, Ni, Cu, Zn, GaP, GaAs, Se, CsBr, RbI, SrF2, Pb) and 2 clean filters (blanks) representative of the filter lot used for IMPROVE aerosol sampling. PIXE uses a set of 27 Micromatter™ thin film standards (Pb on Kapton, CuS on Kapton, NaCl, Mg, two Al, SiO, GaP, CuS, KCl, KI, CaF2, Ti, V, Cr, Mn, Fe, Ni, Cu, ZnTe, GaAs, two Se, CdSe, CsBr, Au, Pb) and 2 clean filters (blanks) representative of the filter lot used for IMPROVE aerosol sampling. All thin
film standards are NIST-traceable. A cross section table is created from this
information. A smooth curve is determined in order to (1) minimize any error in the
standard, and (2) to determine the cross section of elements not represented. By
combining multiple standards, the accuracy of the cross section table is better than that
using a single standard for each element.
The detector is filled with liquid nitrogen at least 4 hours before starting an analysis. Major systems such as the beam collimation system, cooling system, slide
advancement mechanism, data interface devices, wiring and power systems are
checked for proper function in accordance with SOPs.
The same set of 17 standards is run at the beginning of each session to verify the calibration. If any measured values differ by more than 5% from the expected values, the entire system is examined to determine the cause of the discrepancy. A single renormalization constant for all elements is determined to make the measured values agree with the expected values; the renormalization itself must also be less than 5%. This methodology simulates recalculating the cross section table each session, with the added control that the cross section cannot change from the historical value by more than 5%.
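As a minimal sketch (not the CNL analysis code) of this renormalization step, the fragment below averages the expected-to-measured ratios of the check standards and tests whether the resulting constant is within 5% of unity; the standard values are hypothetical.

    def renormalization(expected, measured):
        """expected, measured: parallel lists of areal densities (ng/cm2)
        for the calibration-check standards."""
        ratios = [e / m for e, m in zip(expected, measured)]
        constant = sum(ratios) / len(ratios)       # single constant for all elements
        return constant, abs(constant - 1.0) <= 0.05

    # Hypothetical example: measured values running about 2% low.
    print(renormalization([100.0, 250.0, 60.0], [98.0, 245.5, 58.7]))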
At the beginning of each XRF or PIXE session, samples from a previous session are reanalyzed. For XRF this is done twice, once with the low current used with the
standards and once with the high current used for normal filters. If the reanalysis values
differ by more than 5%, the system is recalibrated and samples rerun. The reanalysis
tray is rerun after every 1000 samples. Again, if the reanalysis values differ by more
than 5%, the system is recalibrated and samples rerun. If necessary all 1000 samples
will be reanalyzed. At the end of the session, the standards and reanalysis tray are rerun
a final time.
5.5.5
Ion Chromatography Quality Control
See Section 5.4.5 for methodology.
From Ions Contractor: RTI SOP Revised July 31, 2000
If the calibration correlation coefficient does not exceed 0.998, stop the analysis and identify the problem.
Analyze QC samples (see Section 5.4.5) at the beginning of every analytical run. Compare the results with those obtained during previous QC tests. If the observed concentration of any ion differs from the known value by more than 10%, stop the analysis until the problem is identified and corrected.
Analyze a duplicate sample, a QA/QC sample, and a spiked sample after at
least every 20 field samples.
5.5.6
Thermal Optical Reflection Quality Control
From Carbon Contractor: DRI SOP #2-204.6 Revised June, 2000
System blanks are performed at the beginning of each week to ensure that the system is not introducing bias in the carbon results and that the laser signal is not temperature dependent.
… the calibration peak at the end of each analysis run serves as a regular standard; the integrated area under the calibration peak serves as a measure of analyzer performance. In addition, the daily injections of two calibration gases further serve as standards. Primary standards in the form of NIST-traceable spiked filter punches do not exist.
… Replicates of analyzed samples are performed at the rate of one per group
of ten samples. The replicate is selected randomly and run immediately after
a group of ten is completed. The µg/cm2 values for organic carbon, elemental carbon, and total carbon are compared with the original run. The values should fall within the following criteria:
Range             Criteria
< 10 µg/cm2       < ±1.0 µg/cm2
> 10 µg/cm2       < 10% of the average of the 2 values
Notice that the criteria converge at 10 µg/cm2. Replicates that do not fall within the above criteria must be investigated for analyzer or sample anomalies. Analyzer anomalies include poor response (as reflected in the calibration peak areas) or poor laser signals affecting the split between organic carbon and elemental carbon. Typical sample anomalies include inhomogeneous deposits or contamination during analysis. Inconsistent replicates for which a reason cannot be found must be rerun.
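The replicate acceptance test above can be written compactly as follows (an illustrative sketch, not the DRI software; here the 10 µg/cm2 threshold is applied to the average of the two values, which is where the two criteria converge).

    def replicate_acceptable(original_ug_cm2, replicate_ug_cm2):
        """Apply the replicate criteria: within 1.0 µg/cm2 at low loadings,
        within 10% of the average of the two values at high loadings."""
        average = (original_ug_cm2 + replicate_ug_cm2) / 2.0
        difference = abs(original_ug_cm2 - replicate_ug_cm2)
        if average < 10.0:
            return difference < 1.0
        return difference < 0.10 * average

    print(replicate_acceptable(6.2, 6.9))    # True: difference of 0.7 µg/cm2
    print(replicate_acceptable(24.0, 27.5))  # False: difference of 3.5 > 10% of 25.75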
5.5.7
Field Blanks and Secondary Filters
Field blanks and secondary filters are used to estimate the artifact on ambient filters.
The Teflon field blanks have never shown concentrations of elements, absorption coefficient, or gravimetric mass sufficient to justify their subtraction as artifacts. However, they are monitored to verify that the concentrations remain at negligible levels.
The nylon field blanks are used to estimate the artifacts of the ionic species. The blank
values are generally small compared to those on ambient filters, but are not negligible.
The carbon artifact is estimated from secondary filters. Field quartz blanks are used as
a check on the secondary filters. In the past, the field blanks and secondary filters have
been very similar. This relationship is monitored for changes.
The field blank cassette is identical to the normal cassette and placed in the same
cartridge. The controller does not allow any air to pass through the field blank. No
special treatment for field blanks is needed at the site. The field blanks go through
normal sample handling and analysis.
Secondary filters are collected only for the quartz module. As discussed in Section 5.8.2, both secondary filters and field blanks reach a saturation artifact through contact with the atmosphere without having ambient organic gases drawn through them. Because all quartz filters, ambient and secondary, are already saturated, they do not acquire any additional artifact from organic gases in the air stream. The secondary filters do not acquire any gases that may have volatilized from the primary filter. Thus it is not necessary to subtract or add the secondary filter value to or from the primary filter. In fact, a statistically more robust artifact is obtained by combining all secondary filters. For this same reason, both secondary filters and field blanks have similar values. With the Version I sampler, there was better control of the secondary filters in the field than was possible with field blanks, making the secondary filter a more reliable estimator. While this is no longer true for the Version 2 sampler, the artifact is still calculated from the secondary filters, and the field blanks are used as a quality assurance check.
If all that is needed is a small number of secondary filters, compared to ambient filters,
then there are two possible methods to collect secondary filters. One method is to
collect secondary filters at all sites and analyze only a small fraction. The second
method is to collect secondary quartz filters at a few selected sites and analyze all of
them. The second method is used for two reasons. The primary reason is better quality
control. If all secondary filters at these few sites are analyzed, it is possible to identify
and correct any swapping of filters during sample handling. For all other sites, there is
only a primary filter and thus no possibility of swapping. A second reason is the cost of
purchasing, prefiring, handling, and temporarily archiving 20,000 unused secondary
quartz filters a year.
Field blanks for all four modules are prepared at a rate of at least 2% of the normal field
samples. This provides about 100 field blanks per quarter for 160 sites. Past data has
shown that the field blank concentrations are independent of site. This is reasonable
since the site operator has little influence on the blanks. The influence is even less with
the Version 2 protocols than with the Version 1 protocols. The blanks are loaded into
the cassettes and cartridges in the central laboratory. The protocol specifies that when
not in the sampler, the cassettes are to be capped and kept in the sealed bag and
shipping box. The operator loads the cartridge as a single unit whether or not there is a field blank.
If there is no difference between sites, then the number of field blanks should be based
on the total field blanks needed to calculate the artifact, not on the fraction of ambient
samples. The important statistic is the uncertainty in the artifact, which is the standard
error in the mean field blank. Since the artifact is calculated every quarter, the standard
error is the standard deviation divided by the square root of the number of field blanks
in a quarter. The number of field blanks is most important for nitrate. (The ambient
concentrations of sulfate are so much higher than those for nitrate, that the sulfate artifact
is much less significant. No artifact is subtracted from Teflon filters.) Consider the case
of summer 2001, when there were 82 nylon field blanks, corresponding to 107 full sites.
For nitrate, the mean, standard deviation, and standard error of the field blanks were
0.94 µg, 0.42 µg, and 0.05 µg, respectively. The mean mass on ambient filters after
subtracting the artifact was 8.62 µg. Ninety-eight percent of the ambient samples had a
mass greater than 0.79 µg, after subtracting the artifact. The standard error of 0.05 µg
is 0.5% of the average sample and less than 6% for 98% of all samples. The conclusion
is that with the usual standard deviation in field blanks, 100 samples is sufficient.
Increasing this number would have a negligible effect on the data.
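The arithmetic behind these figures is simple enough to verify directly; the sketch below reproduces the summer 2001 nitrate numbers quoted above (82 blanks, 0.42 µg standard deviation, 8.62 µg mean ambient mass, 0.79 µg lower bound for 98% of samples).

    import math

    n_blanks, sd_ug = 82, 0.42
    standard_error = sd_ug / math.sqrt(n_blanks)          # ~0.046 µg, quoted as 0.05 µg
    print(round(standard_error, 3))

    mean_ambient_ug, low_end_ug = 8.62, 0.79
    print(round(100 * standard_error / mean_ambient_ug, 2))   # ~0.5% of the average sample
    print(round(100 * standard_error / low_end_ug, 1))        # <6% for 98% of all samples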
The quartz cassettes have a secondary quartz filter at six sites in the network, which
represents 4% of all sites. This provides about 183 secondary filters per quarter.
Again past data indicates that there are no significant differences in the quartz artifact
between sites.
5.5.8
Site Collocation
To verify network precision, approximately 10 sites will have a collocated PM2.5 module. The collocated module will be serviced by the same personnel as the other modules. This
module can be operated with any of the three filter media. The flow rate will be
recalibrated whenever the filter types are changed. Filters from the collocated modules
will be analyzed in the same manner as the normal filters.
5.5.9
Possible QC Failures
If procedures or equipment show anomalous results of any kind, immediate corrective
action is taken. These events are detailed in Table 15.
Table 15. Possible QC Failures and the Associated Corrective Actions.
Field Instrument

IMPROVE Sampler
• Site not serviced: If sample days remain, change the cassettes regularly and record the time missed on the log sheet; if no sample days remain, change to the current cassette and record the days missed on the log sheet.
• Equipment malfunctions: Use the troubleshooting guide in the manual to determine the corrective action. Call the CNL Laboratory Manager if further assistance is needed (the Laboratory Manager will fix the problem or send replacement units with repair instructions).
• Display values outside operating range: Record the values on the log sheet. Call the Laboratory Manager to determine the cause.
• New cartridge missing or mislabeled: Call the Laboratory Manager to receive a new box of cartridges or to determine the correct labeling.

Field Documentation

IMPROVE Sampler
• Log sheet missing: Call the Laboratory Manager to receive a new log sheet.
• Log sheet improperly completed or incorrect: Laboratory staff search out the correct values and flag the data as questionable. If the values are unknown, the sample is discarded.

Laboratory Instrumentation

Gravimetric
• Calibration test weight changes more than 1 µg since the last time weighed: The Lab Manager is informed and recalibrates the balance.
• Control weight differs by more than 1 µg between balances: The Lab Manager is informed, recalibrates the balance, and reweighs the controls.
• Sample weight exceeds 1 mg: The computer gives a warning; the Lab Manager determines whether the reading is valid, based on sample density and site history.
• Sample weight is negative: The computer gives a warning; if the sample is a field blank the Lab Manager accepts the sample, otherwise the Lab Manager verifies that a sample was successfully taken.

HIPS
• System precision is not within 5%: The operator performs a system recalibration.

XRF/PIXE/PESA
• System precision is not within 5%: The operator performs a system recalibration.

Ion Chromatography

Thermal Optical Reflection
5.6
Instrument/Equipment Testing, Inspection, and Maintenance
5.6.1
IMPROVE Sampler Testing and Maintenance
When a malfunction occurs, the Laboratory Manager will first try to isolate the problem
over the telephone. When possible, he will suggest methods for remedying the problem.
The components in the version II sampler are generally easily removable. If the
problem is a malfunction of any of these removable components, the Laboratory
Manager will ship a new component to the site via express mail and have the defective
item returned to Davis. The next step is to ship a complete new module. As a final
resort, a staff member will travel to the site to make appropriate repairs.
On annual site visits, a Field Technician checks all sampler functions, including solenoid
action, vacuum pressure, keypad function, and electronic control. The Technician also
cleans the inlet head, inlet stack, and internal cyclone, and resets the system clocks. Any worn or damaged parts are replaced. The flow rates are calibrated and, if necessary, adjusted.
Equipment testing is performed at the CNL Field Test Station on the UCD campus.
Any modification of the sampler is tested before being used in the network.
5.6.2
Gravimetric Maintenance
Every Friday, the sample handling room is thoroughly cleaned, after ensuring that all filters have been protected against contamination. To reduce fugitive dust levels, all
surfaces are cleaned with a high efficiency HEPA vacuum. The floors are cleaned with
a mild cleaning solution, if necessary. Finally, all work surfaces are cleaned with reagent
grade alcohol (or another reagent grade solvent, if necessary) and Kimwipes™. This
procedure reduces the possibility of contamination should a filter fall to the work
surface.
To check the stability of the balance, reweigh the last 20 archived control filters and develop a regression line comparing the re-weight values to the original values. If the standard deviation is greater than 3 micrograms and the r2 is not better than 0.995, the balance should be carefully inspected and submitted for maintenance.
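A hedged sketch of this stability check is given below; the filter masses are invented, and treating the quoted standard deviation as the scatter of the re-weights about the regression line is an assumption.

    import numpy as np

    original_ug = np.linspace(2480.0, 2560.0, 20)                       # hypothetical masses
    reweight_ug = original_ug + np.random.default_rng(1).normal(0.0, 1.5, 20)

    slope, intercept = np.polyfit(original_ug, reweight_ug, 1)
    residual_sd = np.std(reweight_ug - (slope * original_ug + intercept), ddof=2)
    r_squared = np.corrcoef(original_ug, reweight_ug)[0, 1] ** 2

    needs_inspection = residual_sd > 3.0 and r_squared < 0.995
    print(round(residual_sd, 2), round(r_squared, 4), needs_inspection)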
5.6.3
HIPS Maintenance
No special maintenance is required. Instrumentation is cleaned and repaired if precision
check is not within data quality objectives.
5.6.4
XRF, PIXE, and PESA Maintenance
No special maintenance is required. Instrumentation is cleaned and repaired if precision
check is not within data quality objectives.
5.6.5
Ion Chromatography Maintenance
From Ions Contractor: RTI SOP 1989
Analytical instrumentation used in this project will be carried through
preventative maintenance procedures and schedules as recommended by the
manufacturer.
Dionex Ion Chromatographs require a minimum of maintenance if operated
at pressures less than 800 psi. During the following periods, perform the
maintenance listed:
Daily (all operating days): Check for leaks at all valves and column fittings
(at normal operating pressure). Cycle the injection valve by repeated
switching between INJECT and LOAD and rinse with DI H2O. Check the
meter Zero and Cal adjustments. Wipe up liquid spills and salted-out chemicals. Check the drip trays.
Weekly: Compare standard chromatograms to check that no significant changes in column efficiency have occurred. Check all air and liquid lines for crimping or discoloration.
Monthly: Check column resolution by measuring percent resolution of the
NO3 and SO4 peaks. Oil each pump.
5.6.6
Thermal Optical Reflection Maintenance
From Carbon Contractor: DRI SOP #2-204.6 Revised June, 2000
Regular maintenance for the analyzer involves daily checking of compressed
gas supplies, cleaning the punching tool and tweezers between each sample
with dry KimWipes (Kimberly-Clark Corporation), and backing up data files
on a daily basis. Checks of laser adjustments (physical and electrical) are
made at least monthly; analyzer calibrations are performed every six months.
All calibrations and repairs must be recorded in the log book. Additionally, all
repairs must be recorded in the maintenance log book.
5.7
Instrument/Equipment Calibration and Frequency
5.7.1
IMPROVE Sampler Calibration
The sampler flow rates are adjusted and calibrated at the time of installation and during annual maintenance. A six-month calibration is conducted by the site operator. A calibration log sheet is shown in Figure 18.
The calibration device is an orifice meter, which consists of an orifice and meter to
measure the pressure drop across the orifice. The orifice is contained in a probe that is
inserted at the base of the inlet stack. The calibration system is calibrated at Davis using
a DryCal Nexus DC-2 Flow Calibrator that is certified NIST traceable. The log of the
meter reading, Mo, is regressed against the log of the flow rate for a set of four flow
rates covering the normal range of the device.
log(Q) = a_o + log[(T + 273)/293] + b_o * log(M_o)                (Equation 6)
At the time of installation, the nominal flow rates are adjusted to provide a flow rate of
23 L/min at 20°C with a typical filter in the cassette. Before any later re-adjustment, a
4-point calibration is performed. The equation for adjustment is:
M_o = [23 / (F(elev) * 10^a_o)]^(1/b_o)                (Equation 7)
where F(elev) corrects for atmospheric pressure at a given elevation, and a_o and b_o are constants for the calibration meter. The technician adjusts the orifice diameter until the calibration meter gives the desired reading.
The flow rate calibration compares the calibration device pressure drop and the
pressure drop of each system transducer for four airflow settings covering the expected
range. If the regression of the logs of these four points yields a correlation coefficient
(R2) less than 0.99, the system is recalibrated. The equation is:
log(Q) = a_o + log[P(sea level)/P(site)] + log[(T + 273)/293] + b_o * log(M_o)                (Equation 8)
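The three relations can be combined in a few lines, as in the sketch below; the constants a_o and b_o and the example readings are placeholders, not actual network calibration values.

    import math

    def flow_lpm(m_reading, a_o, b_o, temp_c, p_sea_over_p_site=1.0):
        """Equation 8; Equation 6 is the same with the pressure term omitted."""
        log_q = (a_o + math.log10(p_sea_over_p_site)
                 + math.log10((temp_c + 273.0) / 293.0)
                 + b_o * math.log10(m_reading))
        return 10.0 ** log_q

    def target_reading(a_o, b_o, f_elev, q_nominal=23.0):
        """Equation 7: the meter reading corresponding to the nominal flow rate."""
        return (q_nominal / (f_elev * 10.0 ** a_o)) ** (1.0 / b_o)

    a_o, b_o = 0.95, 0.52                         # placeholder calibration constants
    print(round(target_reading(a_o, b_o, f_elev=0.93), 2))
    print(round(flow_lpm(4.0, a_o, b_o, temp_c=20.0), 1))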
[Form: the field calibration log sheet records the site name, elevation, F(elev), date of calibration, sampler serial number, field technician, audit device number and constants (a_o, b_o), temperature, the nominal audit reading M_o = Q_o / (F(elev) * 10^a_o), and, for each module (A, B, and C at Q_o = 23.0 lpm; D at Q_o = 16.9 lpm), the magnehelic and vacuum gauge readings at each calibration flow rate, the regression coefficients and r2 for log(flow) versus log(M) and for flow versus the gauge reading (G), and the maximum vacuum.]
Figure 18. IMPROVE Sampler Field Calibration Log Sheet.
5.7.2
Gravimetric Calibration
Laboratory Technicians perform calibration and control measurements twice daily.
Additional calibrations are performed if the zero or test weight differs from the actual value by more than 1 µg. Calibration weights are traceable to NIST standards, and standard test weights are used to verify the accuracy of calibrations. Control measurements assure agreement between the Cahn 31 and Cahn 25 balances and demonstrate the effect that cassettes have upon filter mass.
Calibration weights:
• 200.000 mg class 1.1 (Class M) mass to calibrate the Cahn 31
• 20.000 mg class 1.1 (Class M) mass to calibrate the Cahn 25
• Tare weight: 50.000 mg class 1.1 (Class M) mass to test the Cahn 31
Calibration weights are stored in plastic containers, cleaned weekly and handled with
nylon forceps.
A calibration log database is maintained for each balance; C25CALIB for the Cahn 25,
and C31CALIB for the Cahn 31. All calibrations are recorded in these databases.
Significant events concerning the balance and any balance maintenance other than
routine procedures are recorded in the log of the lab manager.
On a random basis, but at least semiannually, the laboratory supervisor shall request a
comparison of the normal calibration standards with a master set of reference standard
masses maintained by the laboratory supervisor. After calibration, measure these
200.000, 50.000, and 20.000 mg standards and report their masses to the supervisor.
The results are used to verify the integrity of the electrobalance and the standard masses
used in daily calibrations.
5.7.3
HIPS Calibration
A set of 20 network filters representing a typical range of samples is used as standards,
with one selected as the primary standard and 19 as controls. These filters span the
range of typical loadings of the IMPROVE network. The values of R and T for each
standard were previously measured using the CNL reference integrating sphere system
shown in
Figure 19. The standards are periodically remeasured and any damaged standards
replaced. These standards are run at the beginning of each analytical session. The gains
of the detector are adjusted to give the same values of R and T for the primary
reference. The remaining 19 controls are used to verify that the calibration is valid over
a typical range of filters. If the measured control values differ by more than 5% from the
expected values, the entire system is examined to determine the cause of the
discrepancy.
The transmission and reflection values are determined for an internal control filter that is
kept in the arm that moves the slides into position. The values for this internal control
filter can be determined at any time by the spectroscopist. They are recorded at the
beginning and end of every tray of samples.
[Figure: transmittance-mode and reflectance-mode configurations of the He-Ne laser, beam expander, light baffles, particle-loaded filter, and detector.]
Figure 19. Setup of CNL Reference Integrating Sphere Apparatus.
5.7.4
XRF, PIXE, and PESA Calibration
The XRF system is calibrated whenever there is a change in the physical configuration
or in the detectors or when a problem is observed in the routine calibration checks.
XRF uses a set of 17 Micromatter™ thin film standards (PbGaAs, CuS, CaF2, Ti, V, Cr, Fe, Ni, Cu, Zn, GaP, GaAs, Se, CsBr, RbI, SrF2, Pb) and 2 clean filters (blanks) representative of the filter lot used for IMPROVE aerosol sampling. All
thin film standards are NIST-traceable. A cross section table is created from this
information. A smooth curve is determined in order to (1) minimize any error in the
standard, and (2) to determine the cross section of elements not represented. By
combining multiple standards, the accuracy of the cross section table is better than that
using a single standard for each element.
The PIXE system is calibrated whenever there is a change in the physical configuration
or in the detectors or when a problem is observed in the routine calibration checks.
PIXE uses a set of 27 Micromatter™ thin film standards consisting of Pb on Kapton,
CuS on Kapton, NaCl, Mg, two Al, SiO, GaP, CuS, KCl, KI, CaF2, Ti, V, Cr, Mn,
Fe, Ni, Cu, ZnTe, GaAs, two Se, CdSe, CsBr, Au, and Pb, and 2 clean filters (blanks)
representative of the filter lot used for IMPROVE aerosol sampling. All thin film
standards are NIST-traceable. A cross section table is created from this information.
A smooth curve is determined in order to (1) minimize any error in the standard, and (2)
to determine the cross section of elements not represented. By combining multiple
standards, the accuracy of the cross section table is better than that using a single
standard for each element.
The PESA system is calibrated using a series of mylar standards of known hydrogen
content.
Both systems are calibrated using a set of 30-40 commercial and NIST elemental
standards whenever the configuration is changed. The system calibration for each
system is verified using 15-20 elemental standards at the beginning of the session; a tray
of filters from a previous session is also reanalyzed. The analysis is performed only after
the precision requirements are met. Scatter plots of the major elements are prepared
and checked for consistency. If the calibration and reanalysis are within the accepted
4%, the regular analysis is allowed to proceed.
5.7.5
Ion Chromatography Calibration
See Section 5.4.5 for methodology. After flushing the system with eluent:
From Ions Contractor: RTI SOP Revised July 31, 2000
Anions: Using the calibration schedule, perform the daily multipoint calibration over the range 0.05 to 25.0 ppm NO2-, NO3-, and SO42-, followed by the quality control/quality assurance (QC/QA) samples listed below:
• a QC sample containing concentrations of Cl-, NO2-, NO3-, and SO42- typical of those found in the mid-range of actual filter extract concentrations;
• a QC sample containing concentrations of Cl-, NO2-, NO3-, and SO42- typical of those found at the lower end of actual filter extract concentrations;
• a commercially prepared, NIST-traceable QA sample containing known concentrations of Cl-, NO2-, NO3-, and SO42-.
Cations: Using the calibration schedule, perform the daily multipoint calibration over the appropriate range, followed by the quality control/quality assurance (QC/QA) samples listed below:
• a QC sample with NH4+, Na+, K+, Ca2+, and Mg2+ concentrations typical of the higher concentrations found in actual filter extracts;
• a QC sample with NH4+, Na+, K+, Ca2+, and Mg2+ concentrations typical of the lower concentrations found in actual filter extracts;
• a commercially prepared, NIST-traceable QA sample containing known concentrations of NH4+, Na+, K+, Ca2+, and Mg2+.
If the observed values for sulfate differ by more than 10 percent from the
known values, identify and correct the problem before analyzing samples.
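The daily multipoint calibration and the 10 percent sulfate check can be illustrated with a minimal sketch; the linear calibration model, the variable names, and the response values are assumptions for illustration, not the RTI procedure itself.

    # Sketch: fit a linear anion calibration (detector response vs. standard ppm) and
    # verify that a NIST-traceable QA sulfate sample agrees with its known value to 10%.
    from statistics import linear_regression   # Python 3.10+

    std_ppm  = [0.05, 0.5, 2.5, 10.0, 25.0]        # multipoint SO4 standards (ppm)
    response = [410, 4120, 20650, 82400, 206300]   # illustrative peak areas

    slope, intercept = linear_regression(std_ppm, response)

    def ppm_from_response(area):
        return (area - intercept) / slope

    qa_known = 5.0                                 # certified sulfate value (ppm)
    qa_measured = ppm_from_response(41300)         # illustrative QA sample response
    if abs(qa_measured - qa_known) / qa_known > 0.10:
        raise RuntimeError("Sulfate QC differs by more than 10%; correct before analyzing samples")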
5.7.6 Thermal Optical Reflection Calibration
From Carbon Contractor: DRI SOP #2-204.6 Revised June, 2000
Four standards are used in calibrating the carbon analyzers: 5% nominal CH4
in He, 5% nominal CO2 in He, KHP, and sucrose. Only the calibration gases
are used on a daily basis as analyzer performance monitors. KHP and sucrose
are used in conjunction with the two gases semiannually to establish the
calibration curve of each analyzer.
… The calibration procedures for the carbon analyzers are of three types: the
end-of-run calibration peak, the manual calibration injections of CH4/He and
CO2/He, and instrument calibration using KHP, sucrose, and the two
calibration gases.
The end-of-run calibration consists of a set quantity of CH4/He calibration
gas, which is automatically injected by the Carbon program. All FID readings
during the analysis run are normalized to this peak to minimize the effects of
FID performance and electronic drift over time. The end-of-run calibration
occurs automatically at the end of each analysis run and requires no operator
intervention. The integrated calibration peak counts should be checked by the
operator immediately after each run to ensure that the analyzer is operating
satisfactorily.
The manual calibration injections are performed at the beginning and ending
of each analysis day, and serve to verify proper analyzer performance.
… Instrument calibration, performed twice a year or when a new calibration
gas cylinder is started, establishes the calibration slope used in converting
counts to µg of carbon, as explained in the next section. Instrument
calibration involves spiking prefired quartz punches with various amounts of
the 1800 ppm KHP and sucrose solutions and injecting various volumes of the
CO2 and CH4 gases.
A clean blank quartz punch is baked in the analyzer oven at 800°C for 10
minutes using option 4 from the main menu of the carbon program. After the
punch has cooled to less than 50°C, the KHP or sucrose solution is injected
onto the punch using a 20 µl syringe. The following volumes are used:
• 5 µl KHP or sucrose solution
• 10 µl KHP or sucrose solution
• 15 µl KHP or sucrose solution (do twice)
• 20 µl KHP or sucrose solution
• no injection (as a system blank)
• 20 µl acidified DDW only (check of background level of DDW)
The sample port is sealed and the spiked filter punch is pushed to 1 cm from
the sample oven. In this position the punch will experience a temperature of
35 to 40°C due to the heat from the oxidation oven. Allow the punch to dry
thoroughly; the punch will turn from translucent to opaque as it dries. The
punch must be dry to avoid water vapor effects on the FID. The
(OC/elemental carbon) analysis option from the main menu is selected and
started. The integrated peak counts for all seven temperature fractions for
the sample peak and the calibration peak are recorded.
The CO2 and CH4 calibrations are run using the calibration options from the
main menu. The following volumes are injected:
• 100 µl CO2 or CH4 gas (use 1000 µl syringe)
• 250 µl CO2 or CH4 gas (use 1000 µl syringe)
• 500 µl CO2 or CH4 gas (use 1000 µl syringe)
• 1000 µl CO2 or CH4 gas (do once with 1000 µl syringe and once with 2500 µl syringe)
• 2000 µl CO2 or CH4 gas (do with 2500 µl syringe)
Again, the integrated peak counts are extracted manually from the tabular
printouts.
Calibration values are plotted as actual µg carbon vs. the ratio of the
integrated sample peak counts to the calibration peak counts. Obvious
outliers are identified and rerun. Linear regression is performed on each set
of calibration data (separate calculations for KHP, sucrose, CH4/He, and
CO2/He). The calibration derived from the CO2 injections typically has
a slightly different slope and does not fit as well. The slope (m) is calculated from:

m = \frac{\sum_i x_i y_i}{\sum_i x_i^2}    (Equation 9)
and the standard deviation (σ) is calculated by:

\sigma = \sqrt{\frac{1}{n-1} \cdot \frac{\sum_i (y_i - m x_i)^2}{\sum_i x_i^2}}    (Equation 10)

where:

x_i = (injected carbon peak area) / (calibration peak area)

and:

y_i = calculated carbon in spiked filter or manual injection (µg)
Note that this is a special form of the regression formula, which ensures that the curve passes through the origin.
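A minimal sketch of Equations 9 and 10 follows; the data points (x = injected peak area divided by calibration peak area, y = micrograms of carbon) and the function names are illustrative, not values from the Carbon program.

    # Sketch of Equations 9 and 10: slope and standard deviation of a calibration
    # regression forced through the origin.
    from math import sqrt

    def calibration_slope(x, y):
        return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

    def calibration_sigma(x, y, m):
        n = len(x)
        return sqrt(sum((yi - m * xi) ** 2 for xi, yi in zip(x, y))
                    / ((n - 1) * sum(xi * xi for xi in x)))

    # Illustrative KHP spikes: peak-area ratios and corresponding micrograms of carbon.
    x = [0.42, 0.85, 1.30, 1.71]
    y = [9.0, 18.0, 27.0, 36.0]
    m = calibration_slope(x, y)
    s = calibration_sigma(x, y, m)
    # The new slope is accepted only if it is within 10% of the previous calibration.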
The resulting slope is compared to previous calibration results. New values should differ from previous calibrations by no more than ±10% if no major analyzer changes have been made.
The new slope for each analyzer (derived from combined CH4, KHP, and
sucrose data) is placed into the CARBON.DAT file for each analyzer; this file
contains analyzer parameters, which are read into the Carbon program when
it is first started. The date and version number in the CARBON.DAT file are also updated.
Calibration data and plots are retained in file folders in the file cabinet with
raw analysis data.
5.8 Calculation of Concentration, Uncertainty, and MDL/MQL
The concentration, uncertainty, and MDL/MQL are calculated for every parameter on
every sample and included in the database.
5.8.1 Definitions
Variables calculated externally to the ambient sample:
B = artifact mass (ng) from controls, field blanks, or secondary filters
σfb = standard deviation of the controls, field blanks, or secondary filters
σa = absolute component of the analytical precision
fa = fractional component of the analytical precision
fv = fractional volume precision = fractional flow rate precision
Variables measured or calculated for each ambient sample:
A = mass measured on ambient sample (ng)
V = volume (m3)
σstat = analytical uncertainty associated with counting statistics
c = concentration (ng/m3)
σ(c) = uncertainty of c (ng/m3)
5.8.2 Concentration and Artifact
Define A as the mass of a given parameter measured on the ambient sample, and B as
the artifact mass determined from measurements of controls, field blanks, or secondary
filters. For elemental concentrations, the measured quantity is the areal density
(ng/cm2), so A and B are determined by multiplying the areal density by the deposit
area. For carbon, the measured value on a punch is multiplied by the ratio of deposit
area divided by punch area. For some parameters, the artifact B is zero. The
concentration is given by:
c = \frac{A - B}{V}    (Equation 11)
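A minimal sketch of Equation 11 and the conversions described above follows; the deposit area, punch area, and mass values are illustrative assumptions.

    # Sketch of Equation 11: c = (A - B) / V, in ng/m3.
    def concentration(A_ng, B_ng, volume_m3):
        return (A_ng - B_ng) / volume_m3

    # Elements: A = areal density (ng/cm2) * deposit area (cm2); no artifact (B = 0).
    c_fe = concentration(120.0 * 3.5, 0.0, 32.8)

    # Carbon: the punch value (ng) is scaled by deposit area / punch area, and the
    # quarterly artifact mass B (ng) is subtracted before dividing by the volume.
    A_oc = 18500.0 * (3.5 / 0.5)
    c_oc = concentration(A_oc, 10000.0, 32.8)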
Artifact is defined as any increase or decrease of material on the filter that positively or
negatively biases the measurement of ambient concentration. The five major types of
artifact are
• contamination of the filter medium (lab blanks);
• contamination acquired by contact with the cassettes, in transportation, or in handling (field blanks);
• adsorption of gases before and during collection that increase the mass
measured on the filter (quartz secondary filters);
• volatilization during collection and in handling;
• fall-off during handling after collection.
The first three are positive artifacts and the last two negative. The contamination of the
filter medium is determined by analysis of laboratory blanks. The sum of the two
contamination artifacts is determined by analysis of field blanks and secondary filters.
Quartz artifact: The general consensus is that an organic carbon artifact is produced by
the adsorption of organic gases in the air onto certain points of the quartz matrix. The
following information concerning the organic carbon artifact on quartz filters with the
IMPROVE protocol is based on comparison of laboratory blanks, field blanks, and
secondary filters, plus laboratory tests. In addition, two types of filter cassettes were used: those for the Version 1 samplers and those for the Version 2 samplers.
• Field blanks and secondary filters are consistently higher than laboratory blanks for all organic fractions. This is shown for each of the carbon temperature species in Figure 20. The figure is based on 1068 field blanks and 1882 secondary filters for the Version 1 cassettes, and 273 field blanks and 560 secondary filters for the Version 2 cassettes. There is no increase as long as the blank filters are not exposed to the atmosphere. Thus, the major part of the artifact is acquired on the quartz substrate after prefiring.
• Temporarily placing the filters in the cassettes does not increase the artifact. Comparison between the two cassettes in Figure 20 shows a significant difference for elemental carbon, but not for organic carbon. Thus, the organic artifact is not associated with contact with the cassette. However, there is significant elemental artifact associated with long-term contact with the Version 1 cassette. There is only a very small elemental artifact with the Version 2 cassette.
• Filters left open to the atmosphere for one week have an organic artifact value that is similar to that of both field blanks and secondary filters. Thus exposure to the atmosphere is the key factor for organic carbon.
• Leaving the filters exposed for longer than one week does not increase the organic artifact value. This indicates that there is a saturation limit independent of the amount of available organic gas.
• Keeping the cassettes in a sealed bag increases the time required to reach saturation. After one week in a bag, the artifact is one-half that of saturation.
• Field blanks and secondary filters for a given cassette type have approximately the same measured carbon values. This is shown for each of the carbon temperature species in Figure 20. In the IMPROVE protocol, the filters are in sealed bags for six weeks and in the samplers for one week. This is sufficiently long for field blanks to reach saturation in the organic artifact. Because the quartz filter artifact saturates, the organic gases that pass through the secondary filter during collection do not add to the artifact.
• The low value of the elemental artifact for the Version 2 cassettes indicates that organic gases in the atmosphere have no significant effect on the elemental artifact.
• For the Version 2 sampler, the field blanks and secondary filters are almost equal, while for the Version 1 samplers, the field blanks are slightly higher. The probable reason is that with the protocols for the Version 2 sampler we have much better control of the field blanks than we did with the Version 1 sampler. Some field blanks could have been exposed by the field operator. Thus, the secondary filter should be a slightly better estimate of the artifact on an actual filter than the field blank.
[Figure 20 charts: two panels, "Version 1 Samplers 3/95 to 8/00" and "Version 2 Samplers 1/00 to 5/01", showing concentrations (ng/m3) for lab blanks, all field blanks, and secondary filters by carbon temperature fraction (O1, O2, O3, O4, OP, E1, E2, E3).]
Figure 20. Median values of the 8 carbon fractions for laboratory blanks, secondary
filters, and field blanks for samples collected between March 1995 and May 2001.
Negative Artifacts. No correction is made for the volatilization and fall-off of particles
collected on any filter. Tests comparing collocated filters with and without transport
indicated no loss on Teflon filters. Similar tests have not been conducted for the nylon
and quartz filters; however the close agreement between sulfur on Teflon and sulfate on
nylon suggests that there is no loss of sulfate particles on nylon. Volatilization could
occur for nitrate and low temperature organics. The nitrate particles are tightly bound in
the nylon filter and should not volatilize. Some of the nitrate will volatilize from the
Teflon. The only effect is on the mass measurements at a few sites where nitrate
contributes more than 10% of the mass. Consider summer 1999 as an example. At
San Gorgonio in Southern California the average ammonium nitrate on the nylon filter
was 30% of the average gravimetric mass on the Teflon filter. If 50% of the nitrate had
volatilized from the Teflon filter, then the actual mass concentration would have been
15% higher than measured. For all sites outside California, ammonium nitrate
accounted for less than 3% of the mass; with a 50% loss of nitrate, the change in mass
would be 1%.
Verification of artifact subtraction by distribution of concentrations: The distribution of
concentration is routinely used to verify that a chosen artifact is reasonable. The
concentrations for the variable are ordered and the behavior for the most lightly loaded
samples are examined. With a large number of samples, it can be assumed that the
minimum concentrations will be approximately zero. An artifact is considered
acceptable if less than 2% of the concentrations are negative and if the minimum
concentration is reasonable. If more than 1% to 2% of concentrations are negative, the
artifact used is probably an overestimate of the true artifact. Similarly, if the minimum
concentration is a large positive number, the artifact used is probably an underestimate
of the true artifact. Examples for sulfur, sulfate, nitrate, organic carbon, and elemental
carbon are shown in Figure 21, based on all samples collected in 1999. The average of
the four quarterly artifacts for this period was 24 ng/m3 for NO3, 53 for SO4, 310 for
OC, and 31 for EC. Sulfur has no artifact and shows that the ambient concentrations
do approach zero. Sulfate and nitrate had zero concentrations, but none negative. Both
methods had less than 0.4% equal to zero. Approximately 0.3% of the OC and 1% of
the EC have zero or negative concentrations. These are all in the range of acceptable
artifact. If a variable were to fail the test, the artifacts would be re-examined.
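A minimal sketch of this distribution screen follows; the 2% threshold comes from the text, while the function name and the example concentrations are illustrative.

    # Sketch: order the concentrations and check the fraction that is negative and the
    # size of the minimum, as described above for verifying the chosen artifact.
    def artifact_check(concentrations, max_negative_fraction=0.02):
        ordered = sorted(concentrations)
        fraction_negative = sum(1 for c in ordered if c < 0) / len(ordered)
        minimum = ordered[0]
        acceptable = fraction_negative < max_negative_fraction
        return acceptable, fraction_negative, minimum

    ok, frac_neg, minimum = artifact_check([0.0, 2.0, 14.0, 55.0, 310.0, 980.0])
    # If the check fails (too many negatives, or a large positive minimum), the
    # quarterly artifact values are re-examined.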
[Figure 21 charts: two panels plotting concentration (ng/m3) against fraction of cases (0.0% to 1.5%) for S, SO4, NO3, OC, and EC.]
Figure 21. Distributions of S, SO4, NO3, OC, and EC for low concentrations.
5.8.3 Uncertainty in Concentration
The uncertainty in the concentration is a quadratic sum of the collection uncertainty, the analytical uncertainty, and the uncertainty in the artifact. The collection uncertainty is
characterized solely by the flow rate precision. The analytical uncertainty equation for a
given parameter follows the constraints of the analytical method. The elemental analyses
include a statistical component. All methods except gravimetric analysis include a
fractional calibration/normalization precision. Gravimetric analysis explicitly includes an
absolute precision (independent of the magnitude of the mass difference). Ions and
carbon are the only variables for which an artifact is subtracted and the only ones that
include an uncertainty in the artifact. The artifact uncertainty is determined from the
standard deviation in the field blank. The standard deviation includes both the variance in the artifact and any absolute analytical uncertainty.
The statistical analytical uncertainty is appropriate whenever the measurement is based
on the number of counts from a detector. This excludes gravimetric and HIPS analyses.
Counting statistics are not provided for IC and TOR. The statistical uncertainty is
included explicitly for PIXE, XRF, and PESA.
It is assumed that the nonstatistical component of the analytical uncertainty may be
separated into a constant mass per filter (σa) and a fractional precision (fa). Gravimetric
analysis has only a constant component, which is determined from control filters. The
elemental analysis methods have only fractional precision components, which are
determined from replicate measurements. IC and TOR may have both components.
The fractional precision component is determined from replicate analysis. The fractional
component includes the uncertainty in the system calibration and in the normalization of
the individual sample. At present, the fractional precision does not include the
uncertainty in preparing an aliquot. Any constant uncertainty is included in the standard
deviation in the field blank or secondary filter.
5.8.4 MDL and MQL
The MDL indicates the analytical minimum detection limit for a given parameter. Below
this limit the analytical system cannot detect the parameter and thus cannot report a
concentration. The MDL is the important quantity for the elemental methods (XRF,
PIXE, and PESA). This is based on the number of counts in the background under the
peaks, and is the concentration corresponding to 3.29 times the uncertainty in the
background. The elemental MDL approximately corresponds to a relative uncertainty
of 50% in the measured areal density. Since there is no artifact subtracted, and the
collection uncertainty is negligible compared to 50%, the uncertainty for the
concentration is also 50% at the MDL.
For non-elemental concentrations, the appropriate quantity is the MQL rather than the
analytical MDL. If s is the uncertainty at low concentrations (i.e., where the fractional
uncertainties in collection and analysis are negligible), the MQL can be defined as either
2s or 3s. Although elemental MDLs are defined as 3.29 times the uncertainty in the
background, the uncertainty at the MDL is actually 50%. Therefore for consistency, the
MQL is defined as 2s. Because the MQL is provided with each concentration, the
user has the option of discarding a concentration if it is less than 3s by multiplying the
provided MQL by 1.5. For gravimetric mass, the coefficient of absorption, and the ion
parameters, the concentration is set equal to zero if it is less than the MQL. (Retaining
the negative concentrations would not provide additional information and could lead to
additional confusion.) For the eight carbon parameters, the concentration is left as
calculated, even if below the MQL. The reason is that the information is needed to
calculate total OC and total light-absorbing carbon.
The analytical MDL is of no importance for a single gravimetric measurement, since a
pre-weight and a post-weight are always measured. However, there are a few cases
where the mass difference is less than twice the uncertainty in the mass difference. This
important statistic is the MQL, defined as 2s. The same is true for the coefficient of
absorption.
An analytical MDL can be determined for IC and TOR, but plays no role in the
statistical significance in the concentrations. Because IC and TOR require an artifact to
be subtracted, the important statistic is the MQL, based on the standard deviation of the
field blanks, not the smaller analytical MDL. Therefore, the MQL is always reported
rather than the analytical MDL. For these methods, a statistically significant
measurement (i.e., above the MDL) may result in a statistically insignificant and even
negative concentration. An example is the IMPROVE parameter chloride ion. Every
measurement on an ambient sample or a field blank collected in 1999 was above the
analytical MDL. However, after artifact subtraction and defining s as the standard
deviation of the field blanks, less than 2% of the samples at non-marine sites had a
concentration above the 2s level. There are a few samples, for other variables, where
the measured value is less than the analytical MDL and a zero is reported by the
laboratory. A more sensitive analytical system would not have helped, since the
concentration (after artifact subtraction) is well below the 2s level. Even in this case,
the MQL is reported.
Summary: For most elemental concentrations from XRF, PIXE, and PESA, the MDL
is reported. This is defined as 3.29 times the uncertainty of the background in the
spectrum, and corresponds to an uncertainty of 50%. For non-elemental parameters,
the MQL is included in the database instead of the analytical MDL. The MQL is
defined as twice the standard deviation of the control filters, field blanks, or secondary
filters. Again, this corresponds to an uncertainty of 50%.
A statistic that is monitored is the fraction of cases with concentrations above the
MDL/MQL. This depends on the relationship between the MDL/MQL and the
ambient concentrations. If the fraction changes significantly for a season, then the entire
analysis process is re-examined. The fractions are listed in Table 16.
Table 16. Fraction of cases with concentrations above the MDL or MQL. All samples
collected in 1999.
range            parameter
99% to 100%      PM2.5, PM10, H, Si, S, K, Ca, Fe, Zn, Br, SO4=
90% to 99%       Pb, NO3-, OC, O4, E1
80% to 90%       Ti, EC, O3, OP, E2
60% to 80%       Al, Se, Sr, O2
40% to 60%       Na, V, Cr, Mn, Cu, As
10% to 40%       Cl, Cl-, Ni, Rb, O1, E3
less than 10%    Mg, P, Ga, Zr, Hg, Y, NO2-
5.8.5 Volume Precision
The volume is the product of the average flow rate and the sample duration. The
fractional precision of the volume is the quadratic sum of the fractional precisions of flow
rate and duration. Since the fractional precision of the duration is always much smaller
than that of the flow rate, it can be safely neglected.
The precision in the measured flow rates is less than 3%, as estimated from internal and
third-party audits. A volume precision of 3% is used for all IMPROVE data.
5.8.6 Equations for Gravimetric Mass (Teflon Filter)
The artifact is determined by the average of the mass gain by the control filter and
verified by the field blanks. The average mass gain of the control filters in Version I
cassettes for the history of the network is 3 µg. This corresponds to 100 ng/m3, which
is small compared to the average gravimetric mass concentration of 6000 ng/m3.
Therefore no artifact has been subtracted. With the Version II cassettes, the average
mass gain of the controls is 0 µg.
The analytical precision for gravimetric analysis is independent of the magnitude of the
measured mass difference, so that there is only a constant term. This is determined from
control filters weighed daily in the laboratory and verified by field blanks. The standard
deviation of the control filters in Version I cassettes for the history of the network is 5
µg. This value has been used for the constant analytical precision for all IMPROVE
data. With the Version II cassettes, the standard deviation of the controls is 1.5 µg.
However, the 5 µg precision value will be retained. The uncertainty in ng/m3 is given by:
\sigma(c) = \sqrt{\left(\frac{5000}{V}\right)^2 + (c \cdot f_v)^2}    (Equation 12)
The analytical MDL is the uncertainty of the mass difference with two mass
measurements of clean filters performed on the same day. In the IMPROVE system,
this is monitored by weighing the same blank filter immediately after the system
calibration in the morning and in the afternoon. The standard deviation of all reweights
over the history of the network is ± 1 µg. Using an estimate of 2s, the MDL is 2 µg.
This corresponds to a concentration of 60 ng/m3. This MDL is not used because it
does not reflect the variation of the mass difference for filters in a cassette for 42 days.
In the IMPROVE system, all filters are in a cassette for approximately 42 days. Any
artifact collected from the cassette is monitored by laboratory controls. A clean control
filter is weighed, stored in a cassette in the laboratory for 42 days, and then reweighed.
This process is repeated daily. With the Version I cassettes, there was a small and
variable gain in mass: the standard deviation of control filters was ± 5 µg. Using an
estimate of 2s, the MQL is 10 µg. With the Version II cassettes, there appears to be
no gain from contact with the cassette: the standard deviation of control filters is 0 ± 1
µg, which is identical to that of comparison between morning and afternoon. However,
the MQL of 10 µg will be retained for consistency.
The MQL concentration in ng/m3 is given by:
MQL = \frac{10000}{V}    (Equation 13)
Since the typical volume is 33 m3, the typical MQL is 300 ng/m3. If the mass difference
is less than 10 µg, the concentration and uncertainty are both set equal to zero in the
database. The MQL is retained in the database. In 1999, 0.36% of the mass
concentrations were in this category.
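The gravimetric calculations in Equations 12 and 13, together with the zeroing rule, can be sketched as follows; the 3% flow precision, the 5 µg and 10 µg values, and the example concentration come from the text, while the names are illustrative.

    # Sketch of Equations 12 and 13 for gravimetric mass (Teflon filter), in ng/m3.
    from math import sqrt

    F_V = 0.03                         # fractional volume (flow rate) precision

    def mass_uncertainty(c, volume_m3):
        return sqrt((5000.0 / volume_m3) ** 2 + (c * F_V) ** 2)    # Equation 12

    def mass_mql(volume_m3):
        return 10000.0 / volume_m3                                 # Equation 13

    V, c = 33.0, 6000.0                # typical volume (m3) and mass concentration
    sigma_c = mass_uncertainty(c, V)
    if c < mass_mql(V):                # below the MQL, concentration and uncertainty
        c, sigma_c = 0.0, 0.0          # are set to zero; the MQL itself is retained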
5.8.7 Equations for Elements (Teflon Filter)
XRF, PIXE, and PESA measure the areal density (mass per unit area) for the elements.
The x-ray or proton energy spectrum has a series of Gaussian peaks superimposed on a
background. Each element has a unique set of peaks at fixed energies. The
background in the spectrum is calculated using the spectrum from a blank filter. (The
blank is run for a longer time than the sample to minimize the uncertainty in the
background. The same blank spectrum is used for the entire analytical session, unless
there is a change in configuration.) The areal density is calculated from the number of
counts in the peak after background subtraction, the cross section for the element, the
quantity of incident beam, and the live time of the detector. A correction for x-ray
attenuation in the sample matrix is applied; this is significant only for the lighter elements
(Na through Si). No significant artifact for any element is observed on field blanks.
Any minor contamination in the filter substrate would be removed by the spectral
background methodology. The concentration is equal to the areal density multiplied by
the area of the deposit on the filter and divided by the volume:
c = \text{areal density} \cdot \frac{\text{deposit area}}{V} = k \cdot N \cdot \frac{\text{deposit area}}{V},    (Equation 14)
where N is the number of counts in the spectrum after background subtraction and k is
a constant calculated from sample normalization factors (the quantity of incident beam
and the live time of the detector) and from the cross section for the element (calculated
from the standards).
The sources of uncertainty for an elemental concentration are
• the collection uncertainty (flow rate precision), f_v
• the fractional precision in normalization, f_a
• the statistical uncertainty from the number of counts in the spectrum, σ_stat
The uncertainty for a given concentration is given by:
\sigma(c) = \sqrt{(f_v^2 + f_a^2) \cdot c^2 + \left(\sigma_{stat} \cdot \frac{\text{deposit area}}{V}\right)^2}    (Equation 15)
The fractional analytical precision (f_a) is the same for all elements and is determined by
replicate analyses conducted in sequential analytical sessions approximately 3 months
apart. Because the statistical term is always present, only elements where the statistical
term is negligible can be used. The chosen elements are S for PIXE, Fe for XRF, and
H for PESA. Historically, the replicate precision for these elements has been 4% for
the duration of the IMPROVE program. Rather than allow this value to vary each
season based on the individual analysis run, a precision of 4% has been used
throughout. When combined with a volume uncertainty of 3%, the total fractional
uncertainty for elemental concentrations is 5%.
The statistical uncertainty is calculated separately for each peak from the number of
counts in the peak and in the background under the peak. Assuming a Poisson
distribution, N counts in the peak, and Nb background counts under the peak, the
statistical precision is given by
\sigma_{stat} = k\sqrt{N + 2N_b},    (Equation 16)
where k is the same proportional constant used to calculate concentration.
The minimum detectable limit (MDL) is based on the background under the peaks in the
spectrum and is calculated separately for each analysis. The MDL for each element is
calculated from the uncertainty (standard deviation) of the background at the location of
the primary peak. The MDL is defined as the concentration at which the number of
valid counts equals 3.29 times the uncertainty in the background.5 The uncertainty is
equal to the square root of the background counts under the peak. The equation for the
number of counts is:
N_{mdl} = 3.29\sqrt{N_b}    (Equation 17)

5 Currie, L.A., "Detection and Quantification in X-ray Fluorescence," in X-Ray Fluorescence Analysis of Environmental Samples, T.G. Dzubay, ed., Ann Arbor, Ann Arbor Science, 1977.
The number of background counts is calculated using the normal full-width-half-maximum of the peak, which is standard for x-ray methods.6
The equation for the MDL in ng/m3 is:
MDL = k \cdot \left(3.29\sqrt{N_b}\right) \cdot \frac{\text{deposit area}}{V},    (Equation 18)
where k is the same constant used in the concentration equation.
Occasionally, the element is observed with a concentration slightly below the MDL; in
this case, the reported concentration is retained. By combining the MDL and
uncertainty equations, the relative analytical uncertainty for concentrations equal to the
MDL is approximately 50%.
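A minimal sketch of Equations 14 through 18 follows; f_a = 4% and f_v = 3% are the values given in the text, while k, the counts, and the areas are illustrative assumptions.

    # Sketch of Equations 14-18 for one element measured by XRF, PIXE, or PESA.
    from math import sqrt

    F_V, F_A = 0.03, 0.04              # volume and normalization fractional precisions

    def elemental_quantities(k, N, N_b, deposit_area_cm2, volume_m3):
        """k: counts-to-areal-density constant; N: peak counts after background
        subtraction; N_b: background counts under the peak."""
        scale = deposit_area_cm2 / volume_m3
        c = k * N * scale                                        # Equation 14
        sigma_stat = k * sqrt(N + 2 * N_b)                       # Equation 16
        sigma_c = sqrt((F_V ** 2 + F_A ** 2) * c ** 2
                       + (sigma_stat * scale) ** 2)              # Equation 15
        mdl = k * 3.29 * sqrt(N_b) * scale                       # Equations 17 and 18
        return c, sigma_c, mdl

    # Illustrative values: a well-resolved peak with a modest background.
    c, sigma_c, mdl = elemental_quantities(k=0.8, N=2500, N_b=400,
                                           deposit_area_cm2=3.5, volume_m3=32.8)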
To keep non-elemental MQLs equivalent to elemental MDLs, the non-elemental MQLs are defined as 2s, which is the concentration with an uncertainty of 50%.
The MDL is incorrect if there is a significant interference from another element. The
most important interference is between As and Pb, whose primary peaks cannot be
resolved by XRF. In this case, it is necessary to examine the secondary peaks, which
can be resolved. The cross section of the Pb secondary line is 80% of that of the primary line, while that of the As secondary line is 20% of that of the primary As line.
Thus the interference is decided primarily by the Pb secondary line. Using the Pb
concentration from the secondary peak, the expected number of Pb x-rays in the
primary peak can be estimated. If the number of x-rays in the primary peak is larger
than this, then the primary peak is assumed to have x-rays from both Pb and As. The
uncertainty for As therefore depends on the concentration of Pb as well as the counts in
the spectrum. The MDL for As is not appropriate. The MDL for As is replaced by the
MQL corresponding to a relative uncertainty of 50% in the recalculated As
concentration.
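The interference logic can be sketched as follows; the 80% secondary-to-primary ratio for Pb is the value quoted above, and the function name and counts are illustrative assumptions.

    # Sketch: estimate the As counts in the shared As/Pb primary peak by predicting
    # the Pb contribution from the resolved Pb secondary peak.
    def as_counts_in_primary(primary_counts, pb_secondary_counts,
                             pb_secondary_to_primary=0.80):
        expected_pb_primary = pb_secondary_counts / pb_secondary_to_primary
        excess = primary_counts - expected_pb_primary
        return max(excess, 0.0)    # counts attributed to As; zero if Pb explains the peak

    as_counts = as_counts_in_primary(primary_counts=1200.0, pb_secondary_counts=880.0)
    # The As MDL is then replaced by an MQL corresponding to 50% relative uncertainty
    # in the recalculated As concentration.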
Two elements that suffer from interference are aluminum and chlorine. The Al peak is a
shoulder on the larger Si peak and Cl is a shoulder on the S peak. For most samples in
the network, the concentration of Al is 45% of that of Si. In many cases, Al is not
observed although the predicted value of 0.45*Si is somewhat larger than the
interference-free MDL. Unfortunately, the correct MQL cannot be calculated for either
Al or Cl. The MDL is retained, but is not always correct.
6 Campbell, J.A., "Instrumentation, Fundamentals, and Quantification," in Particle Induced X-Ray Emission Spectrometry (PIXE), S.A.E. Johannson, ed., New York, Wiley, 1995.
5.8.8 Equations for Optical Absorption (Teflon Filter)
The parameter determined by HIPS is the coefficient of optical absorption in (Mm)-1,
rather than concentration. The variables determined by the HIPS system are the intensities of reflected light (R) and transmitted light (T). If the volume is in m3 and the deposit
area is in cm2, and babs is in Mm-1, the equation is given by:

b_{abs} = 100 \cdot \left(\frac{\text{area}}{V}\right) \cdot \log_e\left(\frac{1-R}{T}\right)    (Equation 19)
No significant artifact for the coefficient is observed on field blanks. Therefore, no
artifact is subtracted.
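A minimal sketch of Equation 19 follows; the reflectance, transmittance, area, and volume values are illustrative assumptions.

    # Sketch of Equation 19: coefficient of optical absorption (Mm-1) from HIPS
    # reflectance (R) and transmittance (T), with area in cm2 and volume in m3.
    from math import log

    def b_abs(R, T, deposit_area_cm2, volume_m3):
        return 100.0 * (deposit_area_cm2 / volume_m3) * log((1.0 - R) / T)

    example = b_abs(R=0.35, T=0.35, deposit_area_cm2=3.5, volume_m3=32.8)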
The uncertainty in the coefficient of absorption is determined by replicate measurements
of control filters, which span the range from blank to heavily loaded. Based on the
standard deviations of these controls, there is no relative precision term. The equation
in Mm-1 is given by:
\sigma(b_{abs}) = \sqrt{\left(2.25 \cdot \frac{\text{area}}{V}\right)^2 + (f_v \cdot b_{abs})^2},    (Equation 20)
where the factor of 2.25 is based on the standard deviation of the controls. The
minimum detectable limit is defined as twice the precision in the measurement for a
sample with low absorption. The approximate expression for the MDL in Mm-1 is given
by:
MDL = 4.5 \cdot \frac{\text{area}}{V}    (Equation 21)

5.8.9 Equations for Ions (Nylon Filter)
The concentration is given by:
c = \frac{A - B}{V},    (Equation 22)
where A is the measured parameter mass, B is the median of the field blanks, and V is
the volume for the sample. The artifact is determined for each ion species by the
median of the mass of the field blanks. This value is applied to all sites for each quarter.
The analytical uncertainty in the measured value A can be expressed as the quadratic
sum of a constant component, σ_a, and a fractional component, f_a:
\sigma(A) = \sqrt{\sigma_a^2 + (A \cdot f_a)^2}    (Equation 23)
This equation is valid for both field blanks and ambient samples. The standard deviation
of field blanks will include this uncertainty plus the variation in the artifact. Thus, the
standard deviation for a set of field blanks will be given by:
\sigma_{fb} = \sqrt{\sigma_a^2 + (B \cdot f_a)^2 + [\sigma(B)]^2}    (Equation 24)
The uncertainty in the concentration is the quadratic sum of the collection, analytic, and
artifact uncertainties. The uncertainty is given by:
\sigma(c) = \sqrt{\left(\frac{\sigma(A)}{V}\right)^2 + \left(\frac{\sigma(B)}{V}\right)^2 + (c \cdot f_v)^2},    (Equation 25)
where the first term is the uncertainty in the measurement of the ambient sample, the
second is the uncertainty in the artifact, and the third term is the uncertainty in the
volume. Combining the three equations to eliminate A, the uncertainty in concentration
is given by:
\sigma(c) = \sqrt{\left(\frac{\sigma_{fb}}{V}\right)^2 + (f_v^2 + f_a^2)\,c^2 + 2 f_a^2 \left(\frac{B}{V}\right) c},    (Equation 26)
The first term, containing σ_fb, is independent of concentration and dominates for low concentrations. The second term, (f_a² + f_v²)c², dominates for high concentrations. In this case, the uncertainty is proportional to the concentration. If f_a is 4% and f_v is 3%, then the uncertainty for large concentrations approaches 5%. The third term is never dominant. The constant analytical uncertainty σ_a drops out of the final equation.
The fractional term f_a is determined from replicate analyses and from the comparison with QA/QC standards. The precision of the IC measurement is a combination of the precision of the calibration, the precision of the desorption process, the precision in collecting
an aliquot, and the precision in the measurement of the eluent. The calibration precision
is monitored by producing a solution from Quality Assurance standards and comparing
the measured and predicted concentrations. The precision in desorbing the sample and
collecting an aliquot are not monitored. The precision of the measurement of the eluent
is determined by reanalysis of the sample eluent after every 20 analyses. This precision
is much less than 1% and is therefore negligible. The best estimate of the total analytical
precision is the standard deviation of the difference between the measured values of the
QA standards and the predicted values.
The MQL is defined as twice the standard deviation of the field blanks. The equation is:
MQL = \frac{2\,\sigma_{fb}}{V}    (Equation 27)
At this concentration, the uncertainty in the concentration is 50%.
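The ion calculations in Equations 22, 26, and 27 can be sketched as follows; f_a = 4% and f_v = 3% are the values quoted above, while the artifact, field-blank standard deviation, and sample masses are illustrative assumptions.

    # Sketch of Equations 22, 26, and 27 for one ion species on the nylon filter.
    from math import sqrt

    F_V, F_A = 0.03, 0.04

    def ion_concentration(A_ng, B_ng, volume_m3):
        return (A_ng - B_ng) / volume_m3                          # Equation 22

    def ion_uncertainty(c, B_ng, sigma_fb_ng, volume_m3):
        return sqrt((sigma_fb_ng / volume_m3) ** 2
                    + (F_V ** 2 + F_A ** 2) * c ** 2
                    + 2 * F_A ** 2 * (B_ng / volume_m3) * c)      # Equation 26

    def ion_mql(sigma_fb_ng, volume_m3):
        return 2.0 * sigma_fb_ng / volume_m3                      # Equation 27

    V = 32.8
    c = ion_concentration(A_ng=4200.0, B_ng=800.0, volume_m3=V)
    sigma_c = ion_uncertainty(c, B_ng=800.0, sigma_fb_ng=450.0, volume_m3=V)
    mql = ion_mql(sigma_fb_ng=450.0, volume_m3=V)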
5.8.10 Equations for Carbon (Quartz Filter)
The artifact is determined for each temperature species by the median of the mass of the
secondary field blanks. This value is applied to all sites for each quarter.
The analytical uncertainty in the measured value A can be expressed as the quadratic
sum of a constant component, σ_a, and a fractional component, f_a:
\sigma(A) = \sqrt{\sigma_a^2 + (A \cdot f_a)^2}    (Equation 28)
This equation is valid for field blanks, secondary filters, and primary filters. The
standard deviation of field blanks and secondary filters will include the constant term
plus the variation in the artifact. Thus, the standard deviation for a set of field blanks will
be given by:
\sigma_{fb} = \sqrt{\sigma_a^2 + [\sigma(B)]^2 + (A \cdot f_a)^2}    (Equation 29)
The uncertainty in the concentration is the quadratic sum of the collection, analytic, and
artifact uncertainties. The uncertainty is given by:
\sigma(c) = \sqrt{\left(\frac{\sigma(A)}{V}\right)^2 + \left(\frac{\sigma(B)}{V}\right)^2 + (c \cdot f_v)^2},    (Equation 30)
where the first term is the uncertainty in the measurement of the ambient sample, the
second is the uncertainty in the artifact, and the third term is the uncertainty in the
volume. Combining the three equations to eliminate A, the uncertainty in concentration
is given by:
\sigma(c) = \sqrt{\left(\frac{\sigma_{fb}}{V}\right)^2 + (f_v^2 + f_a^2)\,c^2 + 2 f_a^2 \left(\frac{B}{V}\right) c},    (Equation 31)
The first term, containing σ_fb, is independent of concentration and dominates for low concentrations. The second term, (f_a² + f_v²)c², dominates for high concentrations. In this case, the uncertainty is proportional to the concentration. If f_a is 4% and f_v is 3%, then
the uncertainty for large concentrations approaches 5%. The third term is never
dominant. The constant analytical uncertainty σ_a drops out of the final equation.
The fractional term f_a is determined from replicate analyses. This is determined for all
eight temperature fractions, plus for the total organic and elemental fractions. For a
single pair of measurements, if the mean is assumed to be the true value, the square of
the precision for a given pair is given by:
P_i^2 = \frac{1}{2}\left(\frac{\text{difference}_i}{\text{mean}_i}\right)^2    (Equation 32)
Because cases with low mean concentrations will dominate the RMS average of the
precision, a more realistic result is obtained by using the slope of the regression line
comparing the difference and the mean. Because of the factor of 2 above, the precision
is 2 times the slope.
The median and standard deviation of the secondary filters for each of the eight species
is determined each quarter. The relative precision of the eight fractions has been held
fixed throughout the duration of the IMPROVE network. It is re-evaluated annually.
The MQL is defined as twice the standard deviation of the secondary filters. The
equation is:
MQL = \frac{2\,\sigma_{fb}}{V}    (Equation 33)
At this concentration, the uncertainty in the concentration is 50%.
Historically, the total organic carbon and the total light-absorbing carbon have not been
included explicitly in the IMPROVE database. The user has thus had to calculate the
concentrations from the eight fractions. Unfortunately, the uncertainties cannot be
calculated from the uncertainties in the eight fractions. The reason is that the eight
fractions contain temperature uncertainties that are not present in the totals. For this
reason, it is necessary to estimate the uncertainty in the total organic carbon, total light-absorbing carbon, and total carbon separately from the eight fractions. Based on the
replicate analyses, the following uncertainties were chosen for the total organic carbon
and total light-absorbing carbon:
\sigma(OC) = \sqrt{(120)^2 + (0.05 \cdot OC)^2}    (Equation 34)

\sigma(LAC) = \sqrt{(34)^2 + (0.07 \cdot LAC)^2}    (Equation 35)
Beginning in May 2001, total organic carbon and total light-absorbing carbon will be
explicitly included in the database. These equations will be used for all past data. For
future data, the replicate analyses will be evaluated annually to determine if different
parameters should be used in the equation.
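A minimal sketch of Equations 34 and 35 follows; the example concentrations are illustrative.

    # Sketch of Equations 34 and 35: uncertainties assigned to total organic carbon
    # and total light-absorbing carbon (all quantities in ng/m3).
    from math import sqrt

    def sigma_oc(oc):
        return sqrt(120.0 ** 2 + (0.05 * oc) ** 2)     # Equation 34

    def sigma_lac(lac):
        return sqrt(34.0 ** 2 + (0.07 * lac) ** 2)     # Equation 35

    uncertainty_oc = sigma_oc(1500.0)
    uncertainty_lac = sigma_lac(250.0)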
5.9 Inspection/Acceptance of Supplies and Consumables
The IMPROVE aerosol samplers rely on consistent, high quality filter media.
Consistency throughout the lots of filters is important both to ensure optimal flow rate and to provide stable artifact values. The quality of the filters refers to their artifact levels
and lack of manufacturing flaws. High artifact levels, especially if not consistent, may
lead to erroneous recorded concentrations. Manufacturing flaws may lead to leakage,
or may simply slow the filter handling processes. To prevent surprises, and to ensure
the filters are appropriate for use, extensive testing of each lot of filters is undertaken
prior to use of that lot in the network. The following four sections describe acceptance
testing procedures for the four filter media used in the IMPROVE aerosol sampling
network.
Teflon Filters
Consistency in mass, areal density, and composition are the primary requirements for
Teflon filters being used in the IMPROVE network. Roughly one percent of the new
filters are selected throughout the lot for acceptance testing. A maximum of one filter is
taken from each box. When certified, the entire lot is accepted for use in network
operations and payment to the vendor is authorized.
Randomly selected filters are visually inspected for unevenly stretched Teflon, holes, and
poorly attached or damaged olefin rings. Unevenly stretched Teflon manifests itself as
color variation on the filter. The less stretched portions are thicker, and appear brighter
white than the thinner, more tightly stretched sections. Unevenly stretched surfaces are
inevitable during production, but the number of visually uneven filters should be kept below 5%. Uneven stretching leads to local variations in pressure drop across the filter, and therefore local variations in areal loading. Fortunately, the analysis procedures compensate for these small differences in areal loading.
Holes in the stretched Teflon mesh are unacceptable. No visual holes should appear on
the filters. If more than 1% of the filters have manufacturing defects, the lot is
considered unacceptable. No filters with holes are used for network operations.
Generally, damaged filters are removed from the stock, as they are encountered, and
stored in a bin for return to the manufacturer.
The semi-rigid olefin ring constitutes the support structure for the Teflon membrane.
The ring is heated and melted onto stretched Teflon mesh to form a filter. Poorly
attached olefin rings occur when the heating is inadequate to melt the entire surface of
the ring so that the Teflon is attached loosely, if at all. This detachment can allow
airflow around instead of through the filter, resulting in invalid data. Similarly, for weakly bonded Teflon and olefin rings, the sample may be collected but may become non-analyzable due to detachment and subsequent deformation of the stretched Teflon.
Detachment and weak bonding can be visually discerned and are another manufacturer
defect. If more than 1% of the filters in the lot show these defects, the lot should be
returned.
Teflon filters are screened for two other problems, namely warped or damaged olefin
rings. If these occur in more than 1% of the filters, acceptance of the filter lot is
refused. Warped olefin rings are created by excess heating during manufacture. These
filters do not lie flat on a surface and are difficult to mount into cassettes. Damaged
olefin rings often have uneven strips of olefin hanging off them or large dents that allow
air to leak into the filter cassettes. Either defect, in quantities over 1% of the filters in the
lot is sufficient to invalidate the entire lot.
Ten filters are weighed to determine if the weights are within usual Teflon filter ranges
(25 to 40 mg). If the weights fall outside the range, the filters are still acceptable, but
the tare weights on the balances and the programs for data processing must be slightly
changed to process the lot.
The ten weighed filters are used to test the consistency of the areal density of the
stretched Teflon substrate. A 2.2 cm2 area is cut from each filter, then weighed. The
average areal density (mass of the Teflon punch divided by the 2.2 cm2 punch area) and standard deviation are calculated and compared with current standards.
The average areal density of the stretched Teflon should be within 20% of 0.39 mg/cm2.
The standard deviation between the tested filters should be no greater than 30% of the
average areal density, though larger deviations are acceptable.
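The areal-density acceptance test can be sketched as follows; the 0.39 mg/cm2 target, the 20% tolerance, and the 30% spread limit come from the text, while the punch masses are illustrative assumptions.

    # Sketch: areal-density acceptance check for a new Teflon filter lot.
    from statistics import mean, stdev

    PUNCH_AREA_CM2 = 2.2
    TARGET_MG_PER_CM2 = 0.39

    punch_masses_mg = [0.84, 0.86, 0.88, 0.83, 0.87, 0.85, 0.86, 0.84, 0.89, 0.85]
    densities = [m / PUNCH_AREA_CM2 for m in punch_masses_mg]

    average = mean(densities)
    within_20_percent = abs(average - TARGET_MG_PER_CM2) / TARGET_MG_PER_CM2 <= 0.20
    spread_acceptable = stdev(densities) <= 0.30 * average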
Half of the filters selected for quality assurance procedures are analyzed by the HIPS,
XRF, PESA and PIXE systems for elemental artifacts. The spectrum obtained during
each analysis should be a Bremsstrahlung spectrum showing no elemental peaks.
If there are elemental peaks in the spectra greater than twice the minimum detectable limit, the filters are considered contaminated. In this case, the project manager reviews
the data, contacts the supplier, forwards the data to the technical division, and requests
a new lot of filters.
The remaining half of the filters are tested for organic artifact by collecting four sets of
side-by-side samples in an array of four to six collocated samplers. The filters are
analyzed by XRF, PESA and PIXE following standard analysis procedures.
The elemental data are plotted for each set of side-by-side samples. The mean
concentrations and standard deviations of the elemental data are calculated. The
precision of the side by side samples, 100*(mean standard deviation / mean
concentration), is calculated.
Outliers in OMH, points more than three standard deviations from the mean, are the
primary concern. OMH is a derived measurement that assumes all sulfur is ammonium
sulfate, and that none of the hydrogen measured is from nitrate (OMH = 13.75*(H - 0.25*S), where H is the hydrogen concentration and S is the sulfur concentration). If
outliers are observed, the samples should be examined with a microscope and scanning
electron microscope to determine whether surface contamination is visible. Filters
having organic artifact generally contain irregular patches of transparent material. This
material, unidentified but associated with the manufacturing process, is difficult to see on
unused filters. However, once aerosol sampling occurs, the transparent patches are
easily spotted as no airflow, and thus little aerosol deposition, occurs in these regions.
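The OMH screen can be sketched as follows; the OMH formula and the three-standard-deviation criterion come from the text, while the hydrogen and sulfur concentrations are illustrative assumptions.

    # Sketch: compute OMH for each collocated acceptance sample and flag values more
    # than three standard deviations from the mean.
    from statistics import mean, stdev

    def omh(h_ngm3, s_ngm3):
        return 13.75 * (h_ngm3 - 0.25 * s_ngm3)

    samples = [(310.0, 820.0), (305.0, 805.0), (318.0, 830.0), (298.0, 790.0)]  # (H, S)
    values = [omh(h, s) for h, s in samples]
    center, spread = mean(values), stdev(values)
    outliers = [v for v in values if spread > 0 and abs(v - center) > 3 * spread]
    # Outlier samples are examined under a microscope / SEM for surface contamination.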
The Quality Assurance Manager determines whether the new lot is acceptable as free of
artifact. If there are no outlier points (points more than three standard deviations from
the mean), and the elemental concentrations obtained compare well with previous
measurements at the testing site, the filters are considered acceptable.
Nylon Filters
Consistency in pressure drop and composition are the primary requirements for nylon
filters being used in the IMPROVE network. CNL purchases Gelman MAGNA nylon
filter material in 33 cm by 18.2 ft. rolls layered in thin sheets of inert spacer paper.
When certified, the entire lot is authorized for use in network operations.
For the certification tests, 20 filters are punched from different sheets randomly chosen
from the lot. To verify the filters in the lot will have uniform thickness and thus similar
flow characteristics, the pressure drop across these filters is measured. If the readings
for any filter are more than three standard deviations from the mean, the lot is rejected.
A change in the average pressure drop will result in a change in the nominal flow rate of
Module B, unless the critical orifices of each sampler are modified. If the average flow
rate for the lot being tested varies by more than 2% from the average flow rate of the
current lot, then all the IMPROVE samplers must be recalibrated at the time the new lot
filters are installed.
Additionally, ten filters are sent for analysis by the ion contractor to verify that there are
no abnormal artifacts. Results are compared to the acceptance results for the lot
currently in use. If the measured concentrations for the two lots are within 5%, and the
standard deviation is low, the filters are accepted for use.
Quartz Filters
The carbon contractor is responsible for acceptance testing. Batches are much smaller
than for the Teflon and nylon filters. Two filters are analyzed using the standard TOR
analysis. To pass the acceptance test, there must be less than 1.5 mg of organic
carbon, 0.5 mg of elemental carbon, and 2.0 mg of total carbon. If the filters do not
pass the acceptance test, the entire lot is discarded.
5.10 Non-direct Measurements
Data required from non-direct measurements include site operator observations made
each week. These observations are recorded on log sheets and entered into the
database comments field.
5.11 Data Management
To handle the large number of samples, we have developed a computerized sample handling system. The system is based on having several workstations, each with a
specific responsibility, and includes the gravimetric and absorption analyses.
Approximately 150 steps are involved in sample processing. The system components
are shown below and in Figure 22.
• The computer keeps a log of every action at each workstation including the
identification of the technician, date, and other appropriate information.
• The computer keeps an audit trail on every filter. At any time we will be able to
determine the status of any filter.
• The computer performs immediate quality assurance checks of all data entered
and notes any problems.
• The computer verifies that all steps for a given filter are properly made.
• The computer system expedites and improves the data reduction and data
validation processes of the final data set.
• The detailed standard operating procedures incorporated in the computer
system are included in the IMPROVE procedures manual.
[Figure 22 diagram: components include field log sheets and field data, a data input program, LOGSIN, the primary LOGS database, the PRE, POST, CHANBCS, LASER, and TRAY programs, the WEIGHTS, IONS, CARBON, SO2, LASER, and ELEMENTAL databases, AUXDATA, a data output program, and the output data files.]
Figure 22. IMPROVE Sample Handling Programs and Database.
5.11.1 Database
LOGS
The controlling database is the LOGS database. This database is organized to include
all data for a single site/sample date in one record. Each record includes all the
information from each module at a site for one sample date. Site, sample date, pump
vacuum for each module, initial and final second transducer readings for modules A, B,
and C, initial and final first transducer readings for all four modules, start time, current
temperature, elapsed time for each module, status of each sample, person entering the
data, date the samples returned from the field, quarter, comments on status, and area of
the sample are fields included in the LOGS database. Since the LOGS database is the
primary database, it is protected and input, other than through the programs listed in
Figure 22, is restricted.
The logs entry screen is shown in Figure 23.
Figure 23. IMPROVE Database: Logs Entry Screen.
WEIGHTS
The WEIGHTS database, tied to the LOGS database through the PRE and POST
programs, is organized to include mass and identification data for each filter in a single
record. Each record contains a filter identification number, site, sample date, channel,
status, pre-weight, post-weight, sample weight, pre-weight time and date, post-weight
time and date, electrobalance used for pre-weight, electrobalance used for post-weight,
person entering the data, expected start time, expected sample duration, quarter, and
analysis tray position.
PRETARE
The PRETARE database, tied to the WEIGHTS database through the PRE program, is
organized to include the tare weight for conversion of Cahn 25 weights to Cahn 31
equivalent weights, the correlation between the Cahn 31 weights and the Cahn 31
equivalent weights for the 20 control filters weighed to derive the equation, and the
starting date for the new equation in a single record. New equations are derived and
entered by the lab manager.
C25CALIB, C31CALIB
The C25CALIB and C31CALIB databases are accessed through the PRE or POST
programs, and are organized to include the name of the balance operator, the date and
time of calibration of the balance, the zero weight value, the calibration weight value, the
test weight value (tare weight for the Cahn 25), and the magnetic field reading. These
are recorded whenever the balance is calibrated. C25CALIB contains Cahn 25
calibration data. C31CALIB contains Cahn 31 calibration data.
CONTROLS
The CONTROLS database is accessed through the PRE or POST programs, and is
organized to include the following information for each balance (Cahn 25 and Cahn 31):
filter identification name; Pre-weight date; Post weight date; # of days between Pre and
Post weighing of the filter; Pre, Post, re-Pre and re-Post weights; Post minus Pre-weight; re-Post minus re-Pre weight; Post minus re-Post weight; Pre minus re-Pre
weight; differences between the Pre, Post, re-Pre and re-Post weights measured on the
Cahn 31 and Cahn 25; name of the balance operator responsible for each
measurement; relative humidity during each measurement; and temperature of the room
during each measurement. These are calculated and recorded when the control filters
are weighed.
IONS
The IONS database, tied to the LOGS database through the CHANBCS program, is
organized to include ionic species and filter identification for each B (Nylon) filter
sample. All data for each filter is recorded in a single record. Each record contains the
site, sample date, start time, status, Cl concentration, NO2 concentration, NO3
concentration, and SO4 concentration. All concentrations are in ng/m3.
CARBON
The CARBON database, tied to the LOGS database through the CHANBCS
program, is organized to include the eight measured carbon channels, five composite
carbon variables, and filter identification for each quartz filter sample. All data for each
filter is recorded in a single record. Each record contains the site, sample date, start
time, status, replicate markers, five composite carbon variables, four fields of organic
carbon concentrations, one field for carbon concentrations derived during pyrolysis of
the filter, and three fields of elemental carbon concentrations. All concentrations are in
ng/m3.
LASER
The LASER database, tied to the LOGS database through the LASER program, is
organized to include transmission and reflectance measurements, calculated uncorrected
absorption values, and filter identifications for each A (Teflon) filter. This data, for each
filter, is recorded in a single record. Each record contains status, site, sample date, start
time, channel, elapsed time, flow rate, transmission, reflectance, absorption values
uncorrected for sample density, measurement errors, measurement minimum detectable
limit, initials of person collecting data, and date and time of measurement.
PIXE
The PIXE database, tied to the LOGS database through the TRAY program, contains
the elemental concentrations derived from XRF and PIXE, and filter identification for
each A (Teflon) filter. For each filter, this data is arrayed in a single record. XRF
elements recorded include species from Sulfur through Zirconium. PIXE elements
recorded include species from Sodium through Lead. For each species, the following
are recorded in ng/m3: elemental concentration, error in concentration, and minimum
detectable limits.
OUTPUT DATA FILES
The OUTPUT DATA FILES are site and quarter specific. The files are stored on the
FTP site maintained by Fort Collins. Not all collected species are reported in the
OUTPUT DATA FILES. Only forty-two elemental, ionic, and carbon channel species
are reported for each sample.
5.11.2 Programs
PRE
The PRE program accesses the LOGS database to develop a list of filters to be
uploaded into cassettes. Next, it queues the filters for weighing, and records pre-weight
values from the electrobalance to the WEIGHTS database. Finally, it records the site
code, sample date, and start time to the LOGS database, thereby completing the
initialization of the records for each filter being sent to the field. If the pre-weight/post-weight method is replaced by beta-gauge measurements of mass, PRE will be replaced by a program called UPLOAD, which will be identical to PRE except that it will not access the WEIGHTS database. A simplified schematic of the functioning of PRE follows in Figure 24.
Figure 24. PRE Program Schematic (balance calibration and control weighings, construction of the pre-weight queue from the LOGS protocols, weighing of at least two "A" filters and, at many sites, two "D" filters and, rarely, a field blank for each sample date, and recording of the pre-weights and status to the WEIGHTS and LOGS databases).
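The logic of Figure 24 can be summarized in the following sketch. It is illustrative only: the actual PRE program is a dBase application, and the function and field names used here are hypothetical.

    # Highly simplified sketch of the PRE weighing loop shown in Figure 24.
    # The real program operates on the LOGS, WEIGHTS, SITE, CONTROLS, and
    # C25CALIB/C31CALIB tables; the names below are placeholders.
    def run_pre(logs, protocols, weigh):
        """Queue un-weighed filters from LOGS, record pre-weights, mark LOGS done."""
        weights = []                                   # stands in for WEIGHTS.DBF
        queue = [rec for rec in logs if not rec.get("pre_done")]
        for rec in queue:
            n_filters = protocols.get(rec["site"], 2)  # at least two "A" filters
            for position in range(n_filters):
                pre_weight = weigh(rec["site"], rec["date"], position)
                weights.append({"site": rec["site"], "date": rec["date"],
                                "position": position, "pre_weight": pre_weight})
            rec["pre_done"] = True                     # completion noted in LOGS
        return weights

    # Stand-in balance reading used only to make the sketch executable.
    logs = [{"site": "ACAD1", "date": "1996-06-01"}]
    print(run_pre(logs, {"ACAD1": 2}, lambda site, date, position: 0.0))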
LEAK CHECK
The LEAK CHECK program is a quality assurance and data accounting program. It
accesses the LOGS database to check for agreement between the sampling protocol
displayed in the box, and the protocol described in the LOGS database. Once the
contents of each shipping box have been verified, the LEAK CHECK program adds
the current date and the box identification to a temporary file of boxes not yet returned
from the field. This provides a quick reference of which boxes were shipped and when
they were shipped. (Note: The name of the program was defined when the Version I
cassettes were used and were checked for leaks. The Version II cassettes do not leak,
so there is no actual leak check.)
LOGSIN
The LOGSIN program records data from the field log sheets for exposed filters in the
LOGS database. Through this program, temperature gauge voltages, pump vacuum
voltages, two transducer voltages, elapsed sampling time, and filter status are recorded
in the LOGS database for each filter.
POST
The POST program accesses the LOGS database to develop a list of exposed filters to
be downloaded from cassettes. Next, it queues the exposed filters for weighing, and
records post-weight values from the electrobalance into the WEIGHTS database.
Finally, it displays the tray and position in which the filter should be mounted, and notes
the record is complete in the LOGS database. If the pre-weight/post-weight method is
replaced by beta-gauge measurements of mass, POST will be replaced by a program
called DOWNLOAD, which will be identical to POST except that it will not access the
WEIGHTS database. A simplified schematic of the functioning of POST follows in
Figure 25. A new program, BETAWTS, will be used to collect beta-gauge
measurements of mass for transfer to the WEIGHTS database, should a Beta-gauge
system be installed.
Figure 25. POST Program Schematic (balance calibration and control weighings, construction of the post-weight queue from LOGS, weighing of the filters displayed on the screen, recording of post-weights and calculated mass and filter position to WEIGHTS, and mounting of the weighed filters in the prepared site slide frames).
CHANBCS
The CHANBCS program accesses the LOGS database to organize the B (Nylon) and
C (Quartz) filters for analysis. Organization is by date and time the log sheet was
entered into the LOGS database. The display shows the site name, sample date, and
status of each filter. Filters are organized for shipment to the analysis contractor
according to the CHANBCS queue. When four complete boxes of Nylon or Carbon
filters are available, CHANBCS creates a file containing the organization information to
send to the contractor with the filters. This file is used for quality assurance to compare
the LOGS database with the actual filters, and is used by the contractors as a template
for reporting their data to CNL.
LASER
The LASER program accesses the LOGS database to organize a quarterly analysis
queue. Organization is by site, sample date within the requested quarter, and start time.
The LASER program collects data from the hybrid integrating plate system (HIPS),
associates it with the filter identification information from the LOGS database, and
stores the resulting record in the LASER database. The database, when complete,
contains status, site name, sample date, start time, channel, elapsed time, flow rate,
transmission, reflectance, name of operator collecting data, and date and time collected.
The LASER database also includes the calculated values of filter absorption
uncorrected for loading effects, with error and minimum detectable limits for all values.
TRAY
The TRAY program accesses the LOGS database to organize an analysis queue by
quarter for XRF and PIXE. Organization is by site, by sample date in quarter
requested, and by start time. The files created by the TRAY program are used to
queue the XRF and PIXE/PESA analyses.
AUXDATA
The AUXDATA program accesses the secondary databases for two purposes: to
allow quality assurance of the data, and to create final ASCII data files. AUXDATA
organizes by site, sample date, and start time. Using AUXDATA, the quality assurance
specialist can edit the status, site name, and sample date in the secondary data files, as
necessary, to correct inaccurate data entries. AUXDATA also calculates error values
for the ions and carbon data, and selects data from the other databases to include in the
final data files. The final ASCII data files contain the following: site, sample date, start
time, sample duration, flow rate, Babs, and concentrations, errors and minimum
detectable limits of PM2.5 mass, PM10 mass, H, Na, Mg, Al, Si, P, S, Cl, K, Ca, Ti, V,
Cr, Mn, Fe, Ni, Cu, Zn, As, Pb, Se, Br, Rb, Sr, Zr, Mo, SO4, Cl, NO2, NO3, and
organic and elemental carbon channels.
The Auxdata screen is shown in Figure 26.
Figure 26. IMPROVE Database: Auxdata Screen.
5.11.3 Field Data
The initial step after the box is received is to read the field data on the memory cards
using the program READFLSH. The computer will process the information and look
for possible errors in data transmission, such as wrong date or abnormal values. The
lab manager has the option of deleting or correcting the data. When acceptable, the
data are transferred to the LOGS database. The computer then displays what the field
log sheet should contain using the program LOGSIN. The lab manager compares the
field log sheets with the information from the memory card to check for further errors.
If the memory card is missing or damaged, the data from the field log sheets is entered
with the LOGSIN program. This program provides a screen identical to the log sheet,
with blank spaces to enter the data. Once the data are saved, the values are transferred
to the LOGS database.
The lab manager verifies the date of installation, the sample dates on the filter identification tags, any omissions in the log sheet entry, anomalous vacuum readings, and the sample runtime duration, and reviews any comments on the log sheet. Corrections can be made at this
time, and if necessary the Site Operator is contacted. Corrections are made by drawing
a single line through the existing information using a fine red pen, and entering the correct
information. Comments are also made using a fine red pen.
Invalid samples are noted on the log sheets and a problem code sticker is placed on the
bag containing the invalid sample. Samples that are not valid are removed from the
system, placed in Petri dishes, individually labeled and placed in the problem filter
archive. A code is entered in the database for this sample. The codes for flagging
invalid samples are given in Table 17.
Table 17. IMPROVE Sample Codes for Invalid Samples.
Code  Definition
NS    Site Not Serviced or filter not changed at proper time (Technician error resulting in a sample period of less than 18 or greater than 24 hours).
OL    Off Line (the sampler was taken off line).
EP    Equipment Problem (Mechanical problem with sampler or associated equipment resulting in an invalid sample due to problems with flow rate, elapsed time, pump function, or vacuum pressure).
PO    Power Outage (Lack of power resulting in a sample time less than 18 hours).
BI    Incorrect Cartridge Installation (Inserted cartridge with less than four cassettes, installed cartridge upside down, installed cartridge in wrong module).
XX    Other (Torn filter, broken cassettes, contaminated filters, other).
An additional preliminary code of “QD” (Questionable Data) can be assigned during sample handling to alert the data processing manager to give special consideration to this sample during the data validation process (see Section 7.2). The filter is processed normally.
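The codes in Table 17, together with the preliminary QD flag, can be collected in a simple lookup table; the sketch below is a convenience illustration only and is not part of the IMPROVE software.

    # Invalid-sample codes from Table 17, plus the preliminary "QD" flag;
    # a lookup table for illustration only.
    INVALID_SAMPLE_CODES = {
        "NS": "Site Not Serviced or filter not changed at proper time",
        "OL": "Off Line (the sampler was taken off line)",
        "EP": "Equipment Problem with sampler or associated equipment",
        "PO": "Power Outage resulting in a sample time less than 18 hours",
        "BI": "Incorrect Cartridge Installation",
        "XX": "Other (torn filter, broken cassettes, contaminated filters, other)",
    }
    PRELIMINARY_CODES = {"QD": "Questionable Data; review during data validation"}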
A second check of the collection data is made at this time. If the data were entered from the log sheets rather than the memory card, the lab manager re-enters the log sheets to ensure that no entry errors were made and that no conditions that could result in invalid samples were missed by the first entry. To perform the
second entry of the log sheets, the lab manager initiates the DOUBLELOGS program
to access the LOGS database. The previously entered log sheets appear on the screen,
in the order entered. The lab manager reenters each log sheet. The computer then
compares the newly entered sheet with the original and will highlight any differences.
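The double-entry check amounts to a field-by-field comparison of the two entries. The sketch below is a minimal illustration, assuming each log sheet entry is available as a dictionary of field values; the actual DOUBLELOGS program works directly on the LOGS table.

    # Minimal sketch of a double-entry comparison in the spirit of DOUBLELOGS
    # (hypothetical interface).
    def compare_entries(first_entry, second_entry):
        """Return the fields whose values differ between the two entries."""
        differences = {}
        for field in set(first_entry) | set(second_entry):
            if first_entry.get(field) != second_entry.get(field):
                differences[field] = (first_entry.get(field), second_entry.get(field))
        return differences

    # Any field returned here would be highlighted for the lab manager to resolve.
    print(compare_entries({"elapsed_time": 1435, "vacuum": 7.2},
                          {"elapsed_time": 1435, "vacuum": 7.3}))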
The lab manager then checks the data in the LOGS database using the program
AUXDATA. Any questionable entries are highlighted on the screen. The lab manager
then verifies whether the memory card was read correctly or the log sheet information
was entered correctly.
The shipping box is then moved to the downloading station. Using the information from
the memory card or field log sheet, all information for the box is displayed for the
technician. The filters for ions and carbon analysis are placed in Petri dishes for
shipment to the analysis contractors. The computer maintains a record of the location of
every filter in the contractors’ boxes. At the same time, the lab technician checks the A
and D (Teflon) filters, while remaining in their cassettes, for visible damage or unusual
deposits. Any unusual condition is recorded on the log sheets and, if necessary, a
problem code tag is filled out and attached to the bag containing the suspect filter.
6. ASSESSMENT AND OVERSIGHT
6.1 Types of Assessments
The assessment of the IMPROVE Network will be performed by the EPA Office of
Air and Radiation (EPA OAR), in a manner consistent with the assessment of the
Speciation Trends Network (STN). The following types of assessments are expected
to be performed:
• Management systems reviews
• Network reviews
• Technical systems audits
• Performance evaluations
• Audits of data quality
• Data quality assessments
6.2 Assessment Frequency
The assessment frequency will be determined by EPA OAR.
6.3 Acceptance Criteria
Assessments will be based on the acceptance criteria to be developed by EPA OAR.
6.4 Assessment Schedules
The initial assessments for the network will be conducted in calendar year 2001.
6.5 Assessment Personnel
The assessment personnel will be provided by EPA OAR. The field audits will be
conducted by the EPA Office of Radiation and Indoor Air in Las Vegas.
Lead assessors should have assessment, technical, and quality system experience.
Other assessment team members also may have such experience, or they may have only
technical experience and currently be receiving assessment and QA training. Lead
assessors should have knowledge and understanding of the applicable environmental
statutes and regulations. They should be familiar with EPA management systems and
with the organizational and operating procedures for environmental data collection.
Lead assessors should have a working knowledge of the technical assessment
techniques for examining, questioning, evaluating, and reporting environmental data
operations and for following up on response actions. They need to understand the
assessment planning process. They also need technical understanding of the
IMPROVE Network. In general, they need to be able to evaluate the IMPROVE
Network scope of work, its management system structure, and its operating procedures
and to judge the IMPROVE Network adequacy compared to this QAPP.
Assessment team members should be familiar with technical assessment concepts and
techniques and with the structure and operating procedures for environmental data
collection. They should have technical knowledge of the IMPROVE Network.
Depending on the scope of the technical assessment, assessors may need to meet
additional qualifications, including health and safety requirements.
Technical specialists, who have specialized knowledge of the IMPROVE Network and
basic knowledge of assessment techniques and procedures, may participate in
assessments. They may need basic training in assessment techniques and procedures.
Under the direct supervision of the lead assessor, they may help prepare the technical
portions of assessment checklists and may conduct the technical portions of an
assessment. They can verify findings and observations that are made by other
assessment team members concerning any specialized technical aspects of the
IMPROVE Network.
Three general standards for assessors are as follows:
• The assessors assigned to conduct a specific assessment should collectively
possess adequate professional proficiency for the tasks required. This standard
places responsibility on the assessors' organization to ensure that the assessment
is conducted by assessors who collectively have the technical knowledge and
assessment skills necessary for the assessment. This standard applies to the
assessors as a group, not necessarily to every individual assessor.
• The assessors should be free from personal and external barriers to
independence, organizationally independent, and able to maintain an
independent attitude and appearance. This standard places responsibility on the
assessors' organization and on individual assessors to maintain independence so
that assessment findings will be both objective and viewed as objective by
knowledgeable third parties.
• The assessors should use due professional care in conducting the assessment
and in preparing related reports. This standard places responsibility on the
assessors' organization and on individual assessors to follow all applicable
standards in conducting assessments. Assessors should use sound professional
judgment in determining the standards that are to be applied to the assessment.
The authority and independence of assessors, and the limits on their authority, must be
clearly defined in the organization's quality documents. Assessment personnel should
have sufficient authority, access to programs and managers, and organizational freedom
to
• Identify and document problems that affect quality;
• Identify and cite noteworthy practices that may be shared with others to
improve the quality of their operations and products;
• Propose recommendations (if requested) for resolving problems that affect
quality;
• Independently confirm implementation and effectiveness of solutions; and
• When problems are identified, provide documented assurance (if requested) to
line management that further work performed will be monitored carefully until
the deficiencies are suitably resolved.
Prior to an assessment, it is important to establish whether the assessors have the
authority to stop or suspend work if they observe conditions that present a clear danger
to personnel health or safety or that adversely affect data quality. If not, assessors need
to know what communication they may be required to have with the authorized official
who can stop work. Safety is paramount; no assessments will be made in any unsafe
conditions.
6.6 Assessment Reports
The product of an assessment is a written report. The objective of the report is to
communicate assessment findings to the proper levels of management and the assessed
organization. The report must include
• Assessment/review title and number and any other identifying information;
• The lead assessor, assessment team members, and the management and key
personnel of the assessed organization;
• Background information about the IMPROVE Network activity being
assessed, the purpose and date(s) of the assessment, the particular parameter
evaluated, and a brief description of the assessment process;
• Summary and conclusions of the assessment and proposed response actions;
and
• Attachments and appendices that include all evaluation and finding information.
6.7 Implementation of Response Actions
After an assessment, any necessary response actions should be timely and effective. In
certain cases, it may be necessary to perform response actions as quickly as possible.
Such cases may include adverse impacts on data quality and threats to personnel health
and safety. Verbal approval from responsible parties suffices under these conditions.
The EPA OAR will provide a response action form to document any nonconformances
that require actions and the resolution of them. This form includes the signatures of the
individual identifying the need for response action, and the individual responsible for
implementing the response action. The problem requiring response action, the
proposed response action, and the approach for evaluating the response action should
be described.
Response actions encompass immediate actions to eliminate problems such as errors in
calibrations, weighing, and other internal procedural problems and long-range response
actions instituted to improve overall data quality. Management of the assessed
organization responsible for the assessed activities is responsible for ensuring that
effective and timely response actions occur. The response actions should address the
following:
• Measures to correct each nonconformance,
• Identification of all root causes for significant deficiencies,
• Determination of the existence of similar deficiencies,
• Response actions to preclude recurrence of like or similar deficiencies,
• Assignment of response action responsibility, and
• Completion dates for each response action.
Management of the assessed organization should implement the response actions and
provide objective evidence to EPA of the effectiveness of the correction. Once such
objective evidence is received, the assessment will be closed unless a reassessment is
planned. In some cases, the assessment team may be needed to confirm the successful
implementation of response actions.
6.8 References
American Society for Quality. 1994. Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs. ANSI/ASQC E4-1994. Milwaukee, WI.
U.S. Environmental Protection Agency. 1994. Guidance for Preparing, Conducting, and Reporting the Results of Management Systems Reviews. Draft EPA Publication No. EPA QA/G-3. Washington, DC.
U.S. Environmental Protection Agency. 1998a. Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II: Part 1, Ambient Air Quality Monitoring Program Quality System Development. EPA Publication No. EPA-454/R-98-004. Washington, DC.
U.S. Environmental Protection Agency. 1998b. SLAMS/NAMS/PAMS Network Review Guidance. EPA Publication No. EPA-454/R-98-003. Washington, DC.
U.S. Environmental Protection Agency. 1998c. Guidance for Data Quality Assessment: Practical Methods for Data Analysis. EPA Publication No. EPA QA/G-9. Washington, DC.
U.S. Environmental Protection Agency. 1999. Guidance on Technical Audits and Related Assessments for Environmental Data Operations. EPA Publication No. EPA QA/G-7. Washington, DC.
U.S. General Accounting Office. 1994. Government Auditing Standards. Washington, DC.
6.9 Reports to Management
This section describes the type, content, distribution, and frequency of submission of
quality assurance (QA) reports for the IMPROVE Network. Regular QA reporting to
management serves the following needs:
• Documentation of adherence to schedules for delivery of equipment, data, and reports
• Documentation of deviations from approved QA and standard operating procedures (SOPs), and the impact of these deviations on data quality
• Analysis of the potential uncertainties in decisions based on the data.
Self-Assessment Results—IMPROVE samplers will be subject to internal technical
system audits, audits of data quality, and performance audits by CNL. If problems are
found, corrective action will be initiated immediately and documented. Results of self-assessments will be summarized in the next annual QA report to management.
Independent Assessment Results—External technical system audits, audits of data
quality, and performance audits will be conducted by the EPA Office of Air and
Radiation. If problems are found, corrective action will be initiated immediately, and
documented. Results of independent assessments will be summarized in the next annual QA
report to management.
Data Set Review—Following the Level 2 data validation by CNL, the concentration
data will be sent to the NPS and the state agencies for review.
Quality Assurance Report to Management—The annual QA report to management
will include an executive summary of all activities, a detailed summary of accuracy and
precision achieved, and discussion of outstanding problems and audit results.
7. DATA VALIDATION AND USABILITY
7.1 Data Review, Verification, and Validation
Data are accepted if they meet the data quality objectives listed in Table 4. The
validation process in Section 7.2 attempts to recover as much data as possible and
includes potential adjustments based on calibration results, audits, and supporting field
information. The adjustments are rejected if they do not meet the data quality
objectives. The data include status flags if necessary to qualify or describe the data.
7.2 Verification and Validation Methods
Samplers are installed, tested and maintained in accordance with QA and QC guidelines
specified in Section 5.6.1. Flow rate audits remain within 5% of target values. If flow
rates slightly exceed these values, audit values may be used to back-calculate an
accurate flow rate. If data still does not meet the data quality objectives, it is rejected.
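As a worked illustration of the back-calculation (the numbers below are assumed, not taken from the network), the indicated field flow can be scaled by the ratio of the audit standard to the sampler's reading during the audit:

    # Hedged sketch of back-calculating a corrected flow rate from an audit
    # result; the specific correction procedure used by IMPROVE is not
    # reproduced here.
    def corrected_flow(indicated_flow, audit_indicated, audit_true):
        """Scale the indicated flow by the audit ratio (true / indicated)."""
        return indicated_flow * (audit_true / audit_indicated)

    # Assumed numbers: audit standard 22.8 L/min, sampler reading 24.0 L/min
    # during the audit, indicated field flow 23.5 L/min.
    print(round(corrected_flow(23.5, 24.0, 22.8), 1))   # roughly 22.3 L/min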
Prior to acceptance for network use, all filter lots are tested for contamination, as
specified in Section 5.3.1. Filters that do not pass acceptance are rejected.
After collection, log sheets are reviewed and each sample is coded with a status, as
specified in Section 5.11.3. Data with codes that indicate questionable data are flagged
and comparisons are made to determine data validity.
All equipment maintains precision and accuracy within data quality objectives, as
specified in Sections 5.5 and 5.7. If data does not meet the data quality objectives, it
is rejected.
Flow rate consistency checks are done upon entry of the field log sheet data, and at
weekly, monthly and quarterly intervals for each site. If the initial flow rate is not within
5% of the nominal value, there may be a problem with the sample, or the sampler.
These anomalous data are carefully checked to determine the cause and resolution of
the problem, and corrective action is taken if necessary.
Consistency plots of the flow history at a site provide a visual reference for sampler
behavior, useful in resolving problems by displaying the chronic or sporadic behavior of
the sampler flow. A typical consistency history is shown in Figure 27.
Figure 27. IMPROVE Sampler Consistency Plot of Flow History (A channel flows at PINN1, December 1996 through September 1997: initial, average, and final Mag and Vac flow readings plotted against the nominal value and 10% above/below reference lines).
Unusual differential masses are generally resolved before entering the database. If no
resolution is found, the A (Teflon) mass data is flagged, but is entered into the database.
Later quality assurance procedures will deal with the problem. For D (Teflon) filters,
unresolved large negative masses are changed in status to XX and removed to the
problem file, while unresolved large positive masses are flagged, but kept and entered
into the database for further analysis.
The secondary quality assurance on gravimetric data occurs quarterly during
construction of the PIXE instruction files. The differential masses are sorted by
magnitude, and extremely large or negative masses are re-weighed for verification.
Other possibilities such as sample misidentification are considered, and the data are
corrected if necessary. Also, the A (Teflon) gravimetric data, 2.5 µm cut point, are compared to the D (Teflon) gravimetric data, 10 µm cut point. If the A (Teflon)
gravimetric mass is the same size or larger than the corresponding D (Teflon) mass, both
are re-weighed and flagged if no resolution occurs. No data are removed from the
database at this time. Filters with unresolved mass problems are analyzed normally.
Verification of mass data occurs after acquiring and processing the elemental data.
Reconstructed mass values are generated from the output of the elemental
analysis, and these values are compared to the measured mass values. The measured
and reconstructed mass should correlate well, with the reconstructed mass being
between 85% and 100% of the measured mass. The percentage is site dependent and
is generally reflected in historical data. If the percentage is substantially different from
past values, or is out of range, there may be a problem with the weight measurement or
the elemental analysis. The sampler flow, the sample duration, and the deposit area are
carefully verified, as well as the calibration values and re-analysis data from the PIXE
run. The data manager and quality assurance manager investigate and the data is either
accepted or rejected.
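A minimal sketch of this ratio screen follows, using the nominal 85-100% window; the site-specific historical range described above is not modeled.

    # Sketch of the reconstructed-vs-measured mass screen.  Samples whose
    # ratio falls outside the nominal 0.85-1.00 window are returned for
    # investigation by the data manager and quality assurance manager.
    def flag_mass_ratio(measured, reconstructed, low=0.85, high=1.00):
        flagged = []
        for i, (m, r) in enumerate(zip(measured, reconstructed)):
            if m <= 0 or not (low <= r / m <= high):
                flagged.append(i)
        return flagged

    # Assumed example values; the second sample is flagged.
    print(flag_mass_ratio([10.0, 8.0, 12.0], [9.1, 8.3, 10.5]))   # -> [1]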
If all but one or two of the measured mass values at a site correlate well with the
reconstructed mass values, the measured mass for these one or two points is considered
suspect. If the A (Teflon) gravimetric mass is larger than the D (Teflon) gravimetric
mass for the site and date in question, the A (Teflon) gravimetric mass is assumed
invalid and is flagged for deletion. If no clear decision can be made, the data manager is
consulted and a resolution is reached.
Three analytical systems are used for elemental analysis of the Teflon filters. Module A
(Teflon) filters are analyzed in quarterly batches. All elemental analyses occur during
two quarterly analysis runs. Upon completion of these procedures, but before deleting
invalid data, the data manager is consulted for approval. If approved, the data are
deleted, and the deletions recorded in a log file for later reference. Only one deletion
file for PIXE, PESA and XRF data is created for each quarter, and this is also saved
for later reference. All of the quality assurance plots are reviewed by the data manager
and archived for later reference.
7.2.1 XRF Validation
At the start of the analysis run, a tray of elemental standards is analyzed to provide data
on the functioning of the XRF system, and to ensure the detector is working properly.
The seventeen elemental standards purchased from Micro Matter Co. are the same
type used for PIXE standards, and are analyzed in the PIXE system every twelve
analysis runs to confirm the correlation between the XRF and PIXE systems shown by
the re-analysis data. The elemental standards include single, double, and multiple
element thin film standards and range in concentration from 20 to 60 micrograms per
square centimeter. These standards are analyzed to ensure the detector is working
properly, and to calibrate the acquisition system. The same standards are used for each
quarterly analysis to provide continuity. Included in the standards tray are several
unexposed filters, blanks, for determining the background spectra. The spectra of each
blank are scrutinized for contaminants, and the cleanest blank is selected for use as the
background subtract value. As a second check, this background subtract value is used
on the re-analysis samples. If the spectra are not under- or over-subtracted, the blank is
saved.
Once the standards have been deemed acceptable, two trays of samples analyzed
during the previous run are re-analyzed. The data from this re-analysis is plotted against
the data from the previous run. From the comparison of the standards, a re-normalization parameter (RENO) is determined. With the RENO applied to the re-analysis, the re-analyzed filters are compared to the data set taken during the previous
run. This is a calibration value that removes the effect of variations in the x-ray beam
between analysis runs. The re-analysis tray is run at the start and end of each ten-day
analysis period, roughly every thousand samples, or whenever the XRF system is shut
off and turned back on. The analysis is done in ten-day segments since the XRF system
must be shut down and the detector refilled with liquid nitrogen on that time scale.
Once the XRF system has been turned off, it requires a four-hour warm up period to
allow the detector and the x-ray source to stabilize before it can be calibrated. Before
continuing the analyses, the re-analysis tray must be analyzed and a new re-normalization parameter calculated. Generally, just before the XRF system is shut
down, the re-analysis tray is analyzed to verify the current re-normalization factor.
Two systems are used to collect and analyze the spectra from the detector. ACE
collects the data from the detector and associates it with an analysis number. RACE
processes the data from the detector into a file containing the sample identification, the
quantity of each species found, the minimum detectable limits for each species, and the
location and probable identification of peaks in the x-ray spectrum. It also does
subtraction of the background value for the substrate. Users enter the run calibration
values, and determine the optimal blank subtract. The output data is saved in files
according to site and quarter or month, with the data for each sample given by the operator and
the analysis instruction file. Once the data have been collected, they are stored pending
PIXE analysis of the samples.
7.2.2 PIXE/PESA Verification
At the start and end of each analysis run, a tray of standards is analyzed. Thirty are
elemental standards purchased from Micro Matter Co., six are Mylar blanks for PESA
calibration, and two are unused (blank) Teflon filters. The elemental standards include
single, double, and multiple element thin film standards and range in concentration from
20 to 60 micrograms per square centimeter. These standards are analyzed to ensure
the detectors are working properly, to normalize the two PIXE detectors, and to
calibrate the acquisition system. The same standards are used for each quarterly
analysis to provide continuity. Every twelve analysis runs, the PIXE standards are
analyzed in the XRF system and the PIXE system, and the results are compared to
confirm the correlation between the two systems shown by the re-analysis data. The
areal densities of the elements on the standards are measured and compared to the quoted values. The average of the ratios of the measured and quoted values is the re-normalization factor, RENO, for the analysis run; it negates differences between runs due to slight changes in the proton beam. If the RENO is not between 0.9 and 1.1,
with a standard deviation of less than 5%, the system is not considered adequate for
sample analysis and the run is stopped until the problem can be rectified.
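The RENO acceptance test can be written down directly. The sketch below assumes the measured and quoted areal densities of the standards are available as parallel lists, and interprets the 5% criterion as a relative standard deviation of the individual ratios.

    import statistics

    # Sketch of the PIXE RENO acceptance check: RENO is the mean of the
    # measured/quoted ratios for the standards; the run is acceptable only if
    # 0.9 <= RENO <= 1.1 and the spread of the ratios (interpreted here as the
    # relative standard deviation) is less than 5%.
    def reno_acceptable(measured, quoted):
        ratios = [m / q for m, q in zip(measured, quoted)]
        reno = statistics.mean(ratios)
        rel_std = statistics.stdev(ratios) / reno
        return reno, (0.9 <= reno <= 1.1 and rel_std < 0.05)

    # Assumed areal densities in micrograms per square centimeter (measured vs. quoted).
    print(reno_acceptable([41.0, 39.5, 60.8], [40.0, 40.0, 60.0]))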
Once the standards have been deemed acceptable, a tray of samples analyzed during
the previous run is re-analyzed. The data from the re-analysis is plotted against the data
from the original analysis. From this comparison, an independent re-normalization
factor is determined. The RENO generated from the standards tray analysis and that
from the re-analysis tray should agree. If the values are not within 5%, the proton beam
is checked for stability, and both trays are re-analyzed.
The re-analysis tray is analyzed every thousand samples, roughly three times during a
standard analysis run, and whenever the proton beam is re-tuned. Generally, the same
proton beam is used for an entire analysis run, so the re-analysis tray is run only three
times. Re-normalization factors for the entire run are determined after the run is
completed by doing best-fit analysis after averaging all the collected RENO values.
The unexposed Teflon filters, blanks, in the standards tray are used to determine the
PIXE background spectra. The spectra of each blank are scrutinized for contaminants,
and the cleanest blank is selected for use as the background subtract value. As a
second check, this background subtract value is tested on the re-analysis samples. The
blank subtract should eliminate the background spectra that exist between elemental
peaks, but should not eliminate any peaks.
Two systems are used to collect and analyze the spectra from the detectors. ACE
collects the data from each detector and associates it with an analysis number. RACE
processes the data from each detector into separate files containing the sample
identification, the quantity of each species found, the minimum detectable limits for each
species, and the location and probable identification of peaks in the x-ray spectrum. It
also does subtraction of the background value for the substrate. Users enter the run
calibration values, and determine the optimal blank subtract. The output data is saved in
files according to site and quarter or month, with the data for each sample given by the trayfile.
7.2.3 Ion Chromatography Verification
7.2.4 Thermal Optical Reflection Verification
From Carbon Contractor: DRI SOP #2-204.6 Revised June, 2000
Level I validation is performed by manually checking the tabular and
thermogram printouts the day after the analysis is performed. The laboratory
supervisor or a designated technician is responsible for checking the data.
The following items are checked on the tabular data:
• The filter ID is correct.
• For calibration injection runs, the carrier gas type is He/O2 in the
morning and He only in the afternoon, and the injection gas type is
reversed between morning and afternoon injections.
• The analysis date is correct.
• The punch area is correct; errors in entry require that the calculated
carbon concentrations be recalculated by hand.
• The deposit area is correct; errors in entry require that the calculated
carbon concentrations be recalculated by hand.
• The calibration peak area is in the correct range.
• The initial and final FID baselines are within 3 counts of each other; excessive FID baseline drift is cause for reanalysis. NOTE: Some very
heavily loaded filters will have an FID baseline drift greater than 3
counts no matter which carbon analyzer the sample is run on, but
typically a FID baseline drift greater than 3 counts signals either a
problem with the run or with the carbon analyzer.
• The lower laser split time and the upper laser split time are within 10
seconds of each other. If the times differ by more than 10 seconds,
check that the lower split organic carbon (OC) and upper split OC
differ by no more than 5%. OC values which differ by more than 5%,
unless due to a small change in laser signal resulting from an
extremely clean or very dark sample, require reanalysis.
• Calculated carbon values for calibration injection runs are within 10%
of the current mean value for the injected gas type on that analyzer.
• Acceptance runs for prefired quartz filters result in < 1.5 µg/cm2 OC, < 0.5 µg/cm2 elemental carbon, and < 2.0 µg/cm2 total carbon. Filters
which exceed these levels must be refired.
Items which are found to be okay are underlined in red. Items which have
problems are circled in red.
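Several of the tabular checks above lend themselves to automated screening. The sketch below encodes the FID drift, laser split time, and prefired-blank limits quoted above; the record layout is hypothetical and the sketch is not part of the DRI SOP.

    # Illustrative screening of a few of the Level I tabular checks; field
    # names are hypothetical, thresholds are those quoted in the text.
    def screen_carbon_run(run):
        problems = []
        if abs(run["fid_final"] - run["fid_initial"]) > 3:               # counts
            problems.append("FID baseline drift > 3 counts")
        if abs(run["laser_split_upper_s"] - run["laser_split_lower_s"]) > 10:
            problems.append("laser split times differ by > 10 s")
        if run.get("is_prefired_blank"):
            if (run["oc_ug_cm2"] >= 1.5 or run["ec_ug_cm2"] >= 0.5
                    or run["tc_ug_cm2"] >= 2.0):
                problems.append("prefired blank exceeds acceptance limits; refire")
        return problems

    print(screen_carbon_run({"fid_initial": 100, "fid_final": 105,
                             "laser_split_lower_s": 300, "laser_split_upper_s": 304}))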
The thermograms are checked for the following:
• The initial FID baseline is flat, indicating that the analyzer has been
thoroughly purged before analysis began.
• The final FID baseline prior to the calibration peak is within 3 counts
of the calculated FID baseline; excessive drift is cause for reanalysis.
• The laser signal during the first 90 seconds appears near the bottom of
the graph (no reflectance); an excessively high initial laser is an
indication that the internal reflectance of the quartz rod is too high,
either due to too many internal cracks or a complete fracture of the
rod. High initial laser signals should result in a physical inspection of
the analyzer.
• The initial laser line drawn on the thermogram matches the laser
signal immediately after the rod is pushed in. A laser line which is too
low is an indication that the sample was not pushed into the oven in
time; a laser signal which exceeds the calculated initial laser signal is a
symptom of physical coupling between the sample boat and the push
rod, although some automobile emission samples also show this
characteristic; a spike or a number of jumps in the laser signal
indicates that the operator had difficulty in decoupling the boat from
the push rod. All of these problems are grounds for reanalysis if
severe.
• The laser signal should dip below the initial laser line until oxygen is
introduced at 550°C, at which point the signal should rise steeply.
• The laser signal at the end of the analysis is flat; if the laser signal dips
as the oven begins to cool, the boat is physically coupled to the push
rod and the laser signal during the rest of the analysis is suspect.
• The temperature readings reflect stable and smooth temperatures at
each level and quick transitions between levels.
Problems or deviations from normal should be circled in red. If the sample
punch taped to the thermogram is not white, it is also circled.
If examination of the tabular and thermogram printouts results in a decision
that a sample should be reanalyzed, write "Rerun" in red on the printouts and
prepare a reanalysis list. This list should be posted immediately after the
validation is complete, and those samples should be rerun as soon as they can
be conveniently fit into the current day's analyses.
Evidence of persistent analyzer problems must be resolved, either by
physically examining the analyzer or reviewing the problems with the analyzer
operator.
The following steps are followed to create a dBase file containing carbon data
and to perform Level "I" validation on it:
• Obtain copies of the latest version of the summary file from the
directory corresponding to the desired project. These files are called
CPEAKS.n, where n is the carbon analyzer number. These files may
be either copied from the backup files or copied directly from the
carbon analyzer computers. The latter method is recommended, as it
guarantees that the summary files retrieved are the latest versions.
These files are updated at the end of each analytical run, so the latest
version is necessary to insure that all of the analyses are included.
• Copy the files to the working directory.
… The final dBase file name is specified in the analysis list posted in the
carbon room.
After the INPCARBN program produces the dBase output file, the program
will alert the operator that it is ready to print the contents of that file.
Although the INPCARBN program is capable of printing the carbon data file,
it is recommended that Microsoft Excel be used to print the file. If the file is
opened in Excel, the uncertainty columns deleted, and the page setup set to
landscape mode and fit to one page wide by many pages long, a good printout
will be produced on a laserjet printer. If you must use the INPCARBN
program to print the carbon data file, a wide carriage dot matrix printer is
recommended.
After the printout is produced, immediately label the top of the printout with
the file name and printout date. This can also be accomplished by using a
labeling header in Excel.
Begin validation by matching the filters listed on the analysis list with the
filters listed on the dBase printout. There must be at least one entry on the
printout for every filter listed on the analysis list.
Flag field and lab blanks as the list is reviewed by placing "b1", "b2", or
"b6" in the second column of the printout. Because the dBase printout is
sorted by ID number, replicates and reruns will be grouped together.
Indicate missing data by writing the missing filter ID in the margin with an
arrow drawn to the appropriate place of insertion. Scan the printout for
unusual IDs which may have been mistyped during analysis. Generally these
will appear at the beginning or end of the printout due to the sorting process.
Make sure that all samples listed on a rerun list appear on the printout.
Resolve all missing data. If a large amount of data is missing because of
analysis in the incorrect subdirectory, it is generally easier to retrieve the
summary file from that incorrect subdirectory, trim the unnecessary data from
that file using a word processor, combine the remaining data with the
CPEAKS.n file, and rerun the INPCARBN program. If only a few data points
are missing, it is generally not too much trouble to simply write the correct
values from the daily folders on the printout and add those values manually to
the dBase files at the same time the flags and other corrections are made.
Scan the deposit area column for incorrect entries. Circle the incorrect
entries to insure that corrected values replace those currently in the database.
Scan the filter IDs for multiple entries of ID numbers. Under normal
conditions, the only times multiple entries should occur are reruns and
replicates. All multiple entries must be flagged to indicate the reason for their
existence.
Scan for missing runs. The most common example is the first run being
aborted or lost for some reason, and the only entry in the dBase file is the
second run. An entry for the first run must be inserted, flagged as invalid, and
labeled as to the reason it was invalid. All punches taken from the filters
MUST be accounted for.
Scan the OC and elemental carbon columns looking for unusually high or low
values. At this time make sure that the field blanks and/or lab blanks are all
close to one another. Circle any possible outliers for further investigation.
Finally, pull the analysis folders and go through the analysis summaries and
thermograms one by one. At this time, resolve all circled items and all missing
flags. Determine if analyses flagged by the operator are legitimate. These
flags are determined by the operator at the end of the analysis run… If the
temporary flag is not warranted, draw a line through the flag to indicate it
should be removed. If the sample should be rerun, add it to a rerun list. If the
analysis has some anomaly but still appears to be legitimate, either flag or add
notes to the comments field as appropriate... All samples flagged as invalid
must have an entry in the comments field to describe the reason that the
sample is invalid.
… All operator-generated flags must be either converted to standard analysis
flags or removed. The (current flags) are temporary flags only and are not
recognized as legitimate analysis flags at DRI.
After all thermograms have been reviewed and all possible reruns have been
identified, post the rerun list in the carbon room and have the reruns done as
soon as possible.
Review the data from the reruns, looking for inconsistencies. Insure that the
reasons for the rerun have been addressed. Mark the printout with the new
values for manual insertion into the dBase file. Previous runs must be flagged
as invalid or the reruns flagged as replicates.
Finally, all comments, flags, insertions, and other changes made to the
printout are entered into the dBase file. After all changes are made, generate
a new printout. Label the new printout with the file name and printout date.
Forward a copy of the printout and the dBase file on disk to the person
putting the final report together.
A Microsoft FoxPro program named CARBVAL exists which will greatly
assist in the validation process. When you use the CARBVAL program, you
must specify if you are validating a data file which contains IMPROVE data
or not. There are several validation steps that the CARBVAL program
performs which are specific to IMPROVE.
The CARBVAL program will perform the following validations:
• Identify records where the deposit area (deparea) is not equal to the
deposit area entered by the user.
• Identify records with duplicate runids.
• Identify records before or after which there are gaps in the runid sequence.
• Identify filters with more than one original run.
• Identify records where elemental carbon concentration is greater than
organic carbon concentration.
• Identify blank filters where organic carbon concentration is greater than 3.95*deparea.
• Identify blank filters where elemental carbon concentration is greater than deparea.
• Identify records where organic carbon concentration divided by the
total carbon concentration is less than specified value (default is 0.75).
• Identify voided samples that do not have a value in the comments
column.
• Identify replicate records incorrectly classified as ‘r1’ or ‘r5’.
• Identify samples where a ‘m2’ flag was not present in both the original
and replicate runs.
• Identify replicate runs where OC, EC, or TC values are not within 10% or the deparea (in µg/filter) of the original run.
• Identify records where filter channel value is not equal to ‘C’.
• Identify records where filter type is not primary or secondary.
• Identify secondary filters where EC > 3.8 or OC > 18.
• Identify secondary filters where EC or OC values are > corresponding primary filter values.
• Identify records where ‘samdate’ or ‘week’ values are empty.
• Identify records with incorrect values in the ‘day’ field.
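Two of the CARBVAL checks above are shown in a minimal sketch (the actual program is written in FoxPro, and the record layout here is hypothetical); the replicate tolerance is interpreted as the larger of 10% of the original value or the deposit area in µg/filter.

    # Sketch of two CARBVAL-style checks (hypothetical record layout).
    def ec_exceeds_oc(rec):
        """Flag records where elemental carbon exceeds organic carbon."""
        return rec["ec"] > rec["oc"]

    def replicate_disagrees(original, replicate, deparea):
        """Flag OC/EC/TC replicate values outside the tolerance, taken here as
        the larger of 10% of the original value or deparea (ug/filter)."""
        flags = []
        for key in ("oc", "ec", "tc"):
            tolerance = max(0.10 * original[key], deparea)
            if abs(replicate[key] - original[key]) > tolerance:
                flags.append(key)
        return flags

    print(ec_exceeds_oc({"oc": 12.0, "ec": 3.0}))                         # False
    print(replicate_disagrees({"oc": 20.0, "ec": 5.0, "tc": 25.0},
                              {"oc": 23.0, "ec": 5.2, "tc": 28.2}, 0.5))  # ['oc', 'tc']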
7.2.5 Elemental Data Verification
Once the PIXE data have been collected, quality assurance procedures for the
complete elemental analysis of the samples begins. This is usually done during the PIXE
analysis run to reduce the chances of having a run that goes beyond standard
specifications. First, the XRF, then the PIXE and PESA data are re-examined to verify
the calibration and blank subtract values used during the analysis runs. Parallel data sets
are created, the XRF database having only XRF data, the PIXE database containing
PIXE data, and the PESA database containing the PESA data. A plotting program is used
to compare the following species from each database for each sample: S, Ca, K, V, Ti,
Mn, Fe, Ni, Cu, Zn, As, Pb, Se, Br, Sr, Rb, and Zr. Iron is the most important
comparison because both PIXE and XRF have good sensitivity. A comparison for Fe
is shown in Figure 28. If the species do not correlate well, the data manager and PIXE
manager are consulted. The calibrations and re-analysis data are checked, and a
resolution is reached.
Figure 28. IMPROVE Quality Assurance Plots: PIXE vs. XRF Comparison (iron: FE versus PFE, ng/m3; N = 1758, R^2 = 0.996, slope = 0.99).
If the species all show good correlation, the following correlation plots are created:
• gravimetric mass versus reconstructed mass (MF vs. RCMA)
• gravimetric mass versus hydrogen (MF vs. H)
• silicon versus iron (Si vs. Fe)
• copper versus minimum detectable limit for copper (Cu vs. MDL)
• zinc versus minimum detectable limit for zinc (Zn vs. MDL)
• trace element concentration versus time
As these species should correlate well, poorly correlated data points are identified and
scrutinized. If the stray points appear due to regional effects (i.e. seen at most of the
sites in a region on or near that date), the data is considered valid. If the stray points do
not appear to correlate to anything, it is possible that the data are incorrect.
A series of time plots of trace element concentration at each site are generated for the
following species: Se, Br, V, Pb, Ni, and As. This is shown in Figure 29.
The data manager generates site-specific plots showing concentration versus time, involving data from the current quarter as well as the previous four quarters. If a step in
the data is seen between quarters, the elemental analysis system is suspect, and the
PIXE and XRF calibrations and re-analyses are carefully reviewed by the data manager
and quality assurance group until a resolution is reached. Slow trends in the data are
due to regional effects.
Figure 29. Site Specific Time Plot of Trace Element Concentrations (selenium at ACAD1, in nanograms per cubic meter, June through August 1996).
Zinc versus the minimum detectable limit for zinc is plotted to verify that no zinc
contamination was present on the filters. An example is shown in Figure 30. Zinc
contamination, from the anodized sampler parts, would be in the form of large particles,
if it existed. The percentage of zinc in a regional aerosol is generally constant, so outlier
points having large zinc concentrations on the plot of Zn vs. MDL would tend to
indicate contamination. An outlier is defined as being greater than three standard
deviations from the correlation line. The data for each outlier point is flagged for review
by the Data Manager.
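The three-standard-deviation screen can be expressed as residuals from a least-squares fit; the sketch below is illustrative only and uses an ordinary least-squares line in place of whatever fitting routine the plotting program actually uses.

    import statistics

    # Sketch of the outlier screen: fit a least-squares line and flag points
    # whose residual exceeds three standard deviations of the residuals.
    def flag_outliers(x, y, n_sigma=3.0):
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
                 / sum((xi - mean_x) ** 2 for xi in x))
        intercept = mean_y - slope * mean_x
        residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
        sigma = statistics.stdev(residuals)
        return [i for i, r in enumerate(residuals) if abs(r) > n_sigma * sigma]

    # Indices returned for a Zn vs. Zn MDL data set would be flagged for the
    # data manager's review.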
Figure 30. IMPROVE Quality Assurance Plots: Zn vs. MDL for Possible Outliers (Zn MDL versus Zn concentration, both in nanograms per cubic meter, with one possible outlier marked).
Copper versus the minimum detectable limit for copper is plotted, like zinc, to verify that
no copper contamination from the brass fittings on the sampler is present on the filter.
As in the Zn vs. MDL plot, if outliers are found, they are flagged for review by the data
manager.
The Si vs. Fe plots show site specific correlation, as both elements are primarily due to
soils, and the ratios of these elements in the soils near a site are constant from year to
year. Thus, this plot may be used as a secondary check of the XRF and PIXE
calibrations, since Si is a PIXE element, while Fe is an XRF element. Poor correlation
in the Si vs. Fe plot, unless historically common, would indicate a problem with the XRF
or PIXE system calibrations and would result in review of these calibrations by the data
manager.
The gravimetric mass versus H plot is meant to prove that PESA data was collected for
all the samples. The plot also verifies the functioning of the PESA system since the ratio
of organic matter to fine mass is roughly constant at four to five percent. Higher or
lower ratios may indicate improper calibration of the PESA system, or invalid
gravimetric mass data.
The gravimetric mass versus reconstructed mass plot is done as further quality
assurance of the gravimetric data. Reconstructed mass, RCMA, is generated from data
collected during analysis and does not include organic or nitrate contributions to the
measured mass. RCMA generally correlates well with the fine gravimetric mass, MF,
and stray data points are typically due to invalid gravimetric mass data. The measured
and reconstructed mass should correlate well, with the reconstructed mass being
between 85% and 100% of the measured mass. The percentage is site dependent and
is generally reflected in historical data. If the percentage is substantially different from
past values, or is out of range, there may be a problem with the sampler or the elemental
analysis. In that case, the sampler flow, the sample duration, and the deposit area would be
carefully verified, as would the calibration values and re-analysis data from the PIXE
run, until the discrepancy is resolved. Invalid gravimetric data would be flagged for deletion.
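As a simple illustration of this range check, a sketch follows (the 85% and 100% bounds are the generic values quoted above; in practice the site's historical range would be used):

    def check_reconstructed_mass(mf, rcma, low=0.85, high=1.00):
        """Return True if the reconstructed mass (RCMA) is within the
        expected fraction of the fine gravimetric mass (MF)."""
        if mf <= 0:                  # missing or invalid gravimetric mass
            return False
        return low <= rcma / mf <= high

    # A sample whose RCMA is only 78% of MF would be reviewed further
    print(check_reconstructed_mass(mf=5000.0, rcma=3900.0))   # False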
7.2.6
Flow Rate Data Validation
Quality assurance of the flow rate data for each sampler module is performed upon receipt
of the elemental, ion, and carbon data. One species is selected from each analysis
procedure, and the concentration of the species is plotted against the uncertainty in the
concentration. An example is shown in Figure 31. The uncertainty is a function of the
volume of air sampled, so if the flow rate is incorrect, the data will not fall on the
correlation line. Outlier points are data points that are located more than three standard
deviations from the correlation line. For each outlier point, the flow rate and elapsed
time are reviewed. If necessary, the data manager is consulted to reach a resolution.
The species concentrations plotted against their uncertainties are as listed for each
sampler module:
A (Teflon) PM2.5 - Sulfur versus the uncertainty
B (Nylon) PM2.5 - Sulfate versus the uncertainty
C (Quartz) PM2.5 - Light absorbing carbon (LAC) versus the uncertainty
D (Teflon) PM10 - Total mass versus the uncertainty
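These pairings could be organized as in the following sketch, which reuses the same three-standard-deviation screen described above (the dictionary keys, variable names, and flag_outliers helper are illustrative assumptions, not the operational code):

    # species plotted against its uncertainty for each sampler module
    FLOW_QA_SPECIES = {
        "A (Teflon, PM2.5)": "S",    # sulfur from elemental analysis
        "B (Nylon, PM2.5)": "SO4",   # sulfate from ion chromatography
        "C (Quartz, PM2.5)": "LAC",  # light absorbing carbon from TOR
        "D (Teflon, PM10)": "MT",    # total gravimetric mass
    }

    def review_flow_rates(data, flag_outliers):
        """data maps module -> (concentrations, uncertainties) arrays.
        Returns modules with points more than 3 sigma off the line,
        whose flow rates and elapsed times are then reviewed."""
        suspect = {}
        for module, species in FLOW_QA_SPECIES.items():
            conc, unc = data[module]
            mask = flag_outliers(conc, unc)
            if mask.any():
                suspect[module] = (species, mask)
        return suspect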
[Figure: flow correlation plot for the A channel, plotting the sulfur uncertainty (IMPROVE* S error) against the sulfur concentration (IMPROVE* S).]
Figure 31. IMPROVE Quality Assurance Plots: Flow Rate.
7.2.7
Species Comparisons Between Modules
Data are further verified through elemental or species comparisons between modules.
The B (Nylon), C (Quartz), and D (Teflon) modules measure one or more species that
are also measured by module A (Teflon). This overlap allows verification of data
through inter-comparison of samplers and analysis procedures. The following sampler
module comparisons provide valuable information on the quality of the reported data:
A (Teflon) versus B (Nylon)
Quality assurance for the A and B modules consists of comparison of the measured
concentration of sulfur and sulfate. Sulfur concentrations are reported through elemental
analysis, while sulfate concentrations are derived through ion chromatography analysis.
Since both modules sample simultaneously and have the same flow and aerosol size cut
point, the collected data should correlate. An example is shown in Figure 32.
B96TOT, N = 1340, R = 0.996, R^2 = 0.991
Y = 1.0009*X + (-47.7366), Slope error = 0.004, Intcpt. error = 15.459
Mean X = 2624.55, Error = 90.68, Std Dev. = 3319.47
Mean Y = 2579.09, Error = 90.76, Std Dev. = 3322.33
Mean Y/Mean X = 0.983
[Figure: scatter plot of IMPROVE* SO4 versus IMPROVE* S3.]
Figure 32. IMPROVE Quality Assurance Plots: Elemental Sulfur vs. Ionic Sulfate.
Any data more than three standard deviations from the correlation line are considered to
be outlier points. All outlier points are carefully reviewed for flow rate entry errors, or
analytical errors. Corrections are made and unresolved outlier data are flagged for
review by the data manager and quality assurance group.
A (Teflon) versus C (Quartz)
Quality assurance for the A and C modules involves correlation plots of four species,
two from each analytical technique.
The first correlation plot is of Babs and the measured concentration of light absorbing
carbon (LAC). Babs values are determined through hybrid integrating plate system
(HIPS) analysis, while LAC concentrations are derived through thermal optical
reflectance (TOR) analysis. Since both Babs and LAC are measurements of light
absorbing carbon, and both modules sample simultaneously and have the same flow and
aerosol size cut point, the two measurements should agree within reason. An example is
shown in Figure 33.
B96TOT, N = 1415, R = 0.894, R² = 0.800
Y = 1.1625*X + (-90.6985), Slope error = 0.023, Intcpt. error = 10.443
Mean X = 382.08, Mean Y = 353.46, Mean Y/Mean X = 0.925
[Figure: scatter plot of IMPROVE* LACN (ng/m³) versus IMPROVE* LRNC (ng/m³).]
Figure 33. IMPROVE Quality Assurance Plots: Babs vs. LAC.
The second correlation plot is of the concentration of organic mass from hydrogen
analysis (OMH) and the concentration of organic mass from carbon analysis (OMC).
OMH concentrations are determined by assuming that all sulfur is in the form of
ammonium sulfate, no hydrogen is associated with nitrates, and the remaining hydrogen
measured by PESA is from organic compounds. OMC concentrations are derived
through thermal optical reflectance (TOR) analysis. Although OMH is merely an
approximation of organic carbon, since both modules sample simultaneously and have
the same flow and aerosol size cut point, the two measurements correlate well. An
example is shown in Figure 34.
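A sketch of the OMH calculation as described above (the 0.25 factor is the hydrogen-to-sulfur mass ratio of ammonium sulfate, 8/32; the organic-mass-per-hydrogen factor is left as a parameter because its value is not given here):

    def omh(h_total, s, org_mass_per_h):
        """Organic mass from hydrogen analysis: remove the hydrogen bound
        in ammonium sulfate, assume none is associated with nitrate, and
        scale the remaining (organic) hydrogen to organic mass."""
        h_organic = h_total - 0.25 * s         # hydrogen not in (NH4)2SO4
        return org_mass_per_h * max(h_organic, 0.0)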
B96TOT, N = 1382, R = 0.915, R^2 = 0.838
Y = 0.9693*X + (-307.4456), Slope error = 0.017, Intcpt. error = 60.524
Mean X = 2868.28, Error = 66.36, Std Dev. = 2466.79
Mean Y = 2472.89, Error = 64.49, Std Dev. = 2397.46
Mean Y/Mean X = 0.862
[Figure: scatter plot of IMPROVE* OMCN versus IMPROVE* OMH.]
Figure 34. IMPROVE Quality Assurance Plots: OMH vs. OMC.
For quality assurance, any data more than three standard deviations from the correlation
line are considered to be outlier points. All outlier points are carefully reviewed for flow
rate entry errors, or analytical errors. Corrections are made and unresolved outlier data
are flagged for review by the data manager and quality assurance group.
A (Teflon) versus D (Teflon)
Quality assurance for the A and D modules consists of comparison of the PM2.5 mass
concentration and the PM10 mass concentration. This procedure is done to verify that
no PM2.5 mass values are larger by two standard deviations than the corresponding
PM10 mass values, and as another check of the sampler flow rates. Although the ratio
of PM2.5 mass to PM10 mass is fairly consistent at most sites, the correlation plot is
meant only to verify that no PM2.5 mass values are larger than the corresponding PM10
mass values. An example is shown in Figure 35.
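A minimal sketch of this check, assuming each mass value carries a reported one-standard-deviation uncertainty that can be combined in quadrature (the argument names are illustrative):

    def fine_exceeds_pm10(mf, mt, mf_unc, mt_unc):
        """True when the PM2.5 mass (MF) exceeds the PM10 mass (MT)
        by more than two combined standard deviations."""
        combined_sigma = (mf_unc ** 2 + mt_unc ** 2) ** 0.5
        return mf > mt + 2.0 * combined_sigma

    # MF = 10200 and MT = 9800 ng/m3 with 300 ng/m3 uncertainties: within 2 sigma
    print(fine_exceeds_pm10(10200.0, 9800.0, 300.0, 300.0))   # False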
B96TOT, N = 1365, R = 0.860, R^2 = 0.740
Y = 1.3973*X + (2864.5045), Slope error = 0.033, Intcpt. error = 361.599
Mean X = 9034.26, Error = 198.33, Std Dev. = 7327.46
Mean Y = 15488.18, Error = 264.80, Std Dev. = 9783.34
Mean Y/Mean X = 1.714
[Figure: scatter plot of IMPROVE* MT versus IMPROVE* MF.]
Figure 35. IMPROVE Quality Assurance Plots: MF vs. MT.
7.2.8
Regional Data Review
Most sites in the IMPROVE network fall into one of two groups, according to the
sampling conditions and the historical data. The sites east of the Mississippi River
historically have high humidity in the summer, relatively larger mass loadings, and
proportionally higher sulfur concentrations. The sites west of the Mississippi River
historically have low humidity, relatively lower mass loadings, and proportionally
higher soil concentrations. Sites not included in either group are placed in the All Sites
group, though this group is less effective for quality assurance than the Eastern or
Western groups.
For each group, Eastern, Western, and All Sites, the following correlation plots are
created: gravimetric mass versus reconstructed mass (MF vs. RCMA), gravimetric
mass vs. hydrogen (MF vs. H), sulfate versus sulfur (BSO4 vs. S), and organic mass
from carbon versus organic mass from hydrogen (OMC vs. OMH). Examples are
shown in Figure 36.
B96TOT, N = 694, R = 0.974, R^2 = 0.948
Y = 0.9014*X + (391.8582), Slope error = 0.011, Intcpt. error = 75.633
Mean X = 5786.12, Error = 126.67, Std Dev. = 3337.06
Mean Y = 5607.22, Error = 114.49, Std Dev. = 3016.06
Mean Y/Mean X = 0.969
B96TOT, N = 366, R = 0.967, R^2 = 0.935
Y = 0.9110*X + (521.8891), Slope error = 0.018, Intcpt. error = 324.292
Mean X = 15992.17, Error = 461.85, Std Dev. = 8835.63
Mean Y = 15091.05, Error = 422.03, Std Dev. = 8074.00
Mean Y/Mean X = 0.944
[Figure: scatter plots of SELECT EASTERN* RCMA versus SELECT EASTERN* MF and SELECT WESTERN* RCMA versus SELECT WESTERN* MF.]
Figure 36. IMPROVE Quality Assurance Plots: Select East and West Plots.
Reviewing these data as part of a group having similar characteristics enhances
recognition of differences. Since these sites are historically similar, differences noted in
the current data may be due to sampler calibration problems, or to changes in the
aerosol sources or removal mechanisms near the site. These possibilities are
investigated, and the problem resolved and recorded. Since the IMPROVE network is
concerned with regional aerosols, the addition of local sources must be noted and
reported with the final data set.
Site summary review
The data for each site are recorded on quarterly summary output sheets. Recorded
species concentrations are compared to their associated minimum detectable limits
(MDLs) to verify the validity of the data. The MDLs are compared with historical site
MDLs to ensure they are reasonable. Samples having unusually large MDLs are
reviewed for sampler calibration or elemental analysis problems. An example is shown
in Figure 37.
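One way such a review could be automated is sketched below; the factor of three is an illustrative threshold, not an IMPROVE criterion:

    import statistics

    def unusually_large_mdl(current_mdl, historical_mdls, factor=3.0):
        """Flag an MDL well above the historical MDLs for the site,
        prompting a check of the sampler calibration and elemental analysis."""
        return current_mdl > factor * statistics.median(historical_mdls)

    print(unusually_large_mdl(0.08, [0.02, 0.025, 0.03, 0.022]))   # True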
[Figure: time series of Se MDL (ng/m³, 0 to 0.09) for samples from 3/1/95 through 2/28/96.]
Figure 37. IMPROVE Quality Assurance Plots: MDL Plot.
7.2.9
Final Data Review and Validation
All missing data is noted and flagged for verification of invalid status. Once all data
corrections have been entered, and the data have been processed to their final form, the
archived information for the quarter is submitted to the quality assurance group.
Any remaining problems are resolved, and the final data set is agreed upon.
7.2.10 Data Validation Flags
All valid data is flagged as "NM" or normal. All missing data is entered as -99 and
given one of the flags listed in Table 18. This may affect all data for the period, or may
affect only selected variables.
Table 18. IMPROVE Validation Flags.
Flag  Definition
NS    Site Not Serviced: the site was not serviced or the filter was not changed at the proper time. (The sample operated less than 18 hours or operated during multiple periods.)
OL    Off Line: the sampler was taken off line.
EP    Equipment Problem: mechanical problem with the sampler or associated equipment resulting in an invalid sample due to problems with flow rate, elapsed time, pump function, or vacuum pressure.
PO    Power Outage: lack of power resulting in a sample time less than 18 hours.
BI    Incorrect Cartridge Installation: the cartridge was inserted with fewer than four cassettes, installed upside down, or installed in the wrong module.
XX    Other: torn or contaminated filter, broken cassette, or other problem.
UN    Unresolved: concentration failed the Level II data validation parameters for an unresolved reason.
NA    Not Available: the specific analysis was not specified.
If the flow rate differs from the nominal flow rate for the module by more than 10%, the
concentrations are retained and a flag is assigned. A flag of "LF" is assigned for a flow
rate that differs from the nominal by more than 10% but less than 20%. A flag of "RF" is
assigned for a flow rate that differs from the nominal by more than 20%. A change in flow
rate affects the cut point but not the accuracy of the concentration. (See Figure 38.)
Note that these flags are redundant, in that the user could make the same determination
using the provided flow rates.
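This flagging rule can be written directly as a short sketch (the function name and return convention are illustrative):

    def flow_rate_flag(measured, nominal):
        """Assign the flow-rate validation flag described above."""
        deviation = abs(measured - nominal) / nominal
        if deviation > 0.20:
            return "RF"   # differs from nominal by more than 20%
        if deviation > 0.10:
            return "LF"   # differs by more than 10% but not more than 20%
        return None       # within 10%: no flow-rate flag

    print(flow_rate_flag(19.8, 22.8))   # "LF" (about 13% below nominal)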
7.3
Reconciliation with User Requirements
The IMPROVE network is designed to show baseline, temporal, and spatial trends in
reconstructed extinction in Federally mandated Class I Wilderness Areas, as required by
the Regional Haze Regulation. In addition, the network provides the chemical composition
of the aerosol in order to examine the sources of visibility degradation. There are
numerous users of the data, including the EPA, the various FLMs, the states, and the
tribes. Users' requirements may differ according to the agency.
The reconciliation of IMPROVE results will be performed in the various assessment
reports listed in Section 6.6, based on the MQOs listed in Table 4 (page 4—31).
8. APPENDIX
8.1
IMPROVE Sampler Technical Properties
Effect of Flow Rate on Cyclone Cut Point
The particle capture efficiency of the IMPROVE cyclone of the fine modules was
characterized by the CNL staff at the Laboratory of Energy-Related Health Research
aerosol facility on the Davis campus by two independent methods, both using
polystyrene latex spheres. One method used fluorescent particles and a fluorometer,
while the other used a Single Particle Aerodynamic Relaxation Time analyzer.
The collection efficiency of the IMPROVE cyclone was characterized at the Health
Sciences Instrumentation Facility at the University of California at Davis. The efficiency
was measured as a function of particle size and flow rate using two separate methods:
PSL and SPART. The PSL method uses microspheres of fluorescent polystyrene latex
particles (PSL) produced by a Lovelace nebulizer and a vibrating stream generator and
analyzed by electron micrographs. The SPART method uses a mixture of PSL particles
produced by a Lovelace nebulizer and analyzed by a Single Particle Aerodynamic
Relaxation Time (SPART) analyzer. The aerodynamic diameter for 50% collection,
d50, was determined for each flow rate.
The results are shown in Figure 38. The left plot shows the aerodynamic diameter for
50% collection efficiency as a function of flow rate. The solid symbols are from PSL
and the open symbols from SPART. The right plot shows the efficiency as a function of
particle diameter at a flow rate of 22.8 L/min.
The best-fitting straight line in the left plot is based on measurements for both methods
for flow rates between 18 and 24 L/min. The equation is:
d50 = 2.5 − 0.334 * (Q − 22.75)        (Equation 36)
where d50 is the 50% cut point in µm and Q is the flow rate in L/min.
To maintain a constant cut point of 2.5 µm, it is necessary to maintain a constant volume
flow rate of 22.8 L/min.
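For illustration, Equation 36 can be evaluated directly (a sketch; the coefficients are those of the fitted line above):

    def cyclone_d50(flow_lpm):
        """Aerodynamic 50% cut point (micrometers) of the IMPROVE cyclone
        as a function of volume flow rate (L/min), per Equation 36."""
        return 2.5 - 0.334 * (flow_lpm - 22.75)

    # At the nominal 22.8 L/min the cut point is essentially 2.5 um;
    # a flow about 10% low (roughly 20.5 L/min) raises it to about 3.25 um.
    print(cyclone_d50(22.8), cyclone_d50(20.5))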
[Figure: left panel, aerodynamic diameter for 50% collection efficiency versus flow rate (18 to 24 L/min) for the PSL and SPART methods; right panel, collection efficiency at 22.8 L/min versus aerodynamic diameter (µm).]
Figure 38. IMPROVE Sampler Cyclone Collection Efficiency: Relationship between
Diameter and Flow Rate.
Flow Control by a Critical Orifice.
The flow rate through the IMPROVE sampler is maintained by a critical orifice. The
diameter is adjustable. As long as the pressure after the orifice is less than 52% of the
pressure in front of the orifice, the airflow will be critical (limited to the speed of
sound) and will not be affected by small changes in pump performance.
The volume flow rate is used in IMPROVE, as opposed to mass flow rate, because the
ambient concentrations and particle size cut vary with volume, not mass. The volume
flow rate increases as the air pressure decreases. Since there is negligible pressure drop
across the inlet, the volume flow rate does not change between the inlet and the cyclone.
The pressure, however, will decrease across the cyclone and filter. If this pressure drop
is ∆P, then the inlet/cyclone flow rate is (1 − ∆P/P) times the flow rate at the front of the
critical orifice.
The flow rate through a critical orifice depends on the diameter of the orifice and the
absolute temperature of the air entering the orifice. The equation for the inlet flow rate
is:
Q = Qo * (1 − ∆P/P) * √((T + 273)/293)        (Equation 37)
where Q = inlet flow rate, Qo = constant, ∆P/P = relative decrease in pressure across
the filter, and T = temperature in ºC.
The pressure drop ∆P is produced either by the pressure drop of a clean filter or by the
filter load. To remove the effect of the clean filter, each critical orifice is adjusted during
calibration to give the desired flow rate with a typical clean filter appropriate for the
module. The important pressure quantity is then the variation, δP, about the nominal
pressure drop of the clean filter used in calibration, ∆Pnom.
δP = ∆P − ∆Pnom        (Equation 38)
The δP associated with variations of clean filters can be negative or positive, and will
affect all measurements for a sampler period equally. The δP associated with filter
loading will be positive and will increase over the sampling period.
The mean annual temperature for the network is about 15ºC. In order to have the mean
annual flow rate at 22.8 L/min, the critical orifices are adjusted to provide a flow rate of
23 L/min at 20ºC with a typical filter in the cassette. The constant Qo is then given by:
Qo = 23 * (1 − ∆P/P)^(−1)        (Equation 39)
where ∆P/P is the relative decrease in pressure across the filter.
Substituting, assuming there is no variation in atmospheric pressure at the site, the flow
rate, relative to 20°C, is:
δP  T + 273

Q = 23 * 1 −
*
P − ∆P 
293

(Equation 40)
Variations in temperature with site, month, and time of day affect the collection cut point
but not the volume calculation.
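A sketch evaluating Equation 40 as reconstructed above (the square-root temperature dependence follows that reconstruction; argument names are illustrative, and pressures may be in any consistent units):

    import math

    def sampler_flow(delta_p_dev, p_after_filter, temp_c):
        """Flow rate (L/min) for a critical orifice set to 23 L/min at 20 C:
        delta_p_dev is the deviation of the filter pressure drop from its
        nominal value, and p_after_filter is the pressure downstream of the
        filter (P minus delta-P), in the same units."""
        return 23.0 * (1.0 - delta_p_dev / p_after_filter) * math.sqrt((temp_c + 273.0) / 293.0)

    # With no extra filter loading, at the 15 C network-average temperature
    print(sampler_flow(0.0, 95.0, 15.0))   # about 22.8 L/min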
Flow Rate Through an Orifice Meter
An orifice meter consists of a restriction in the air path and a device to measure the
pressure drop across the restriction. The calibration devices and the cyclone are both
used as orifice meters.
The flow rate through an orifice meter depends on the pressure drop across the
restriction and the square root of the density of the air. Because the density is
proportional to the pressure and absolute temperature, the flow rate can be written:
Q = Q1 * (δP)^β * √(Po/P) * √((T + 273)/293)        (Equation 41)
Where: Q1, β, and Po are constants. For laminar flow, β= 0.5. We express this
equation in parameterized form using the first transducer reading (across the cyclone),
M, for the pressure drop:
Q = 10^a * M^b * √(P(sea level)/P(site)) * √((T + 273)/293)        (Equation 42)
We arbitrarily define all pressures relative to the standard pressure at sea level and all
temperatures relative to 20°C. Thus, the parameters, a and b, are always calculated
relative to 20°C and Davis. The value of b should be similar to that of β, around 0.5.
The advantage in expressing the parameters relative to sea level is that all modules
should have parameters with similar values independent of the site elevation.
Because of the difficulties in measuring the ambient pressure, we have chosen to use an
average pressure based on the elevation of the site. We will write the pressure and
temperature functions as F(elev) and f(T):
F(elev) = √(P(sea level)/P(site))        (Equation 43)
f(T) = √((T + 273)/293)        (Equation 44)
Thus, the flow rate Q, in terms of the first transducer voltages:
Q = 10^a * M^b * F(elev) * f(T)        (Equation 45)
Pressure - Elevation Relationship.
The ambient pressure enters into the equations for the calibration device and the
transducer across the cyclone as the square root of the pressure. Because of the
difficulties of measuring the ambient pressure at each sample change, we have chosen to
use an average pressure based on the elevation of the site. The actual pressure is used
only in calibrating the calibration devices at Davis.

P = P(sea level) * exp −

2
 Z
 Z  
+
 

8437  26621  
(Equation 46)
It is convenient to define an elevation factor that is the square root of the pressure at sea
level divided by the pressure at the site. This factor is expressed as:
F(elev) = √(P(sea level)/P(site)) = exp[Z/16874 + (Z/37648)²]        (Equation 47)
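A sketch combining Equations 44, 45, and 47 to compute the flow rate from the first transducer reading (the square-root temperature factor follows the reconstruction of Equation 44; the calibration parameters a and b and the example values are placeholders, and Z is assumed to be the site elevation in meters):

    import math

    def elevation_factor(z_m):
        """F(elev) from Equation 47."""
        return math.exp(z_m / 16874.0 + (z_m / 37648.0) ** 2)

    def temperature_factor(temp_c):
        """f(T) from Equation 44, relative to 20 C."""
        return math.sqrt((temp_c + 273.0) / 293.0)

    def transducer_flow(m, a, b, z_m, temp_c):
        """Flow rate from the first transducer reading M (Equation 45):
        Q = 10**a * M**b * F(elev) * f(T)."""
        return (10.0 ** a) * (m ** b) * elevation_factor(z_m) * temperature_factor(temp_c)

    # Placeholder calibration parameters for a site at 1600 m and 15 C
    print(transducer_flow(m=2.0, a=1.2, b=0.5, z_m=1600.0, temp_c=15.0))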