The Evolution of Part 75 Performance Test Procedures and Specifications


Russell S. Berry and Stephen K. Norfleet

RMB Consulting & Research, Inc., 5104 Bur Oak Circle, Raleigh, NC 27612


Charles E. Dene

Electric Power Research Institute, 3412 Hillview Ave., Palo Alto, CA 94304


Introduction

Over the past five years, the utility industry has been continuously monitoring emissions of sulfur dioxide (SO2), nitrogen oxides (NOx) and carbon dioxide (CO2) from fossil fuel-fired boilers. Continuous monitoring of emissions from these sources has been performed in accordance with regulations promulgated by the Environmental Protection Agency’s (EPA’s) Acid Rain Division (ARD) in Title 40 of the Code of Federal Regulations Part 75 (40 CFR 75).

Since EPA’s promulgation of 40 CFR Part 75 on January 11, 1993, utilities have installed, and continue to operate, approximately 1,500 continuous emission monitoring systems (CEMS) on fossil fuel-fired boilers and stationary gas turbines. Approximately 260 of these CEMS are installed on "Phase I" sources; the remaining CEMS are installed on "Phase II" sources. For most Phase I sources, CEMS were required to be installed by November 15, 1993, and since January 1, 1995, Phase I sources have been required to use the CEMS data to account for SO2 emissions under ARD’s allowance trading program. For Phase II sources, CEMS had to be installed by January 1, 1995 (in most cases), and to date, the Phase II CEMS have only been used to measure and report SO2, NOx and CO2 emissions to EPA. Phase II sources will not be required to account for SO2 allowances until January 1, 2000.

As part of ARD’s CEMS requirements, ongoing quality assurance and quality control (QA/QC) activities were specified to ensure the quality of data submitted to EPA. Through the utility industry’s implementation of Part 75 CEMS, extensive experience has been gained and an enormous amount of CEMS performance data has been gathered. In addition, as experience levels have increased, more utilities have begun reviewing their CEMS programs to assess the cost and benefit of each activity.

To support member efforts to reduce CEMS costs, the Electric Power Research Institute (EPRI) has initiated a project to enhance CEMS equipment performance and to develop more cost-effective QA/QC procedures for Part 75 sources, while maintaining or improving the accuracy and reliability of the data. EPRI is evaluating changes that would reduce overly burdensome aspects of the current QA/QC requirements, as well as changes to CEMS designs and equipment that would allow the use of alternative QA/QC procedures. EPRI is also considering procedural changes that may result in more cost-effective operation, maintenance, recordkeeping and reporting activities not specifically dictated by EPA. As alternative procedures are developed, consideration is being given first to changes that can be implemented with existing CEMS equipment and second to changes that may be possible through the enhancement of existing equipment.

The History of Part 75 QA/QC

Understanding the history behind the Part 75 QA/QC requirements is essential to determining which aspects of the existing requirements may be improved and what types of alternatives may be acceptable. Over the last 20+ years, CEMS performance specifications, standard operating procedures and quality assurance requirements have continually evolved, culminating in the current Part 75 CEMS requirements for utilities. As with many EPA regulations, the current Part 75 requirements evolved from similar regulations written earlier by other EPA programs.

Subparts A and D of 40 CFR Part 60 were proposed on August 17, 1971 and promulgated on December 23, 1971. These regulations contained EPA’s first CEMS requirements, including initial certification tests, daily drift checks, 15-minute cycle time limits, brief operation and maintenance expectations, 1-hour averaging, common and multiple stack requirements, deadlines, span value requirements, reporting procedures, calculation procedures and allowances for waivers and alternatives. However, at the time these rules were promulgated, there were no CEMS performance specifications; consequently, the requirement to monitor emissions was deferred until specifications could be developed.

Performance Specification Tests (PSTs) 1, 2 and 3 for opacity, SO2/NOX, and diluent (CO2/O2) monitors, respectively, were proposed on September 11, 1974 and promulgated as Appendix B to Part 60 on October 6, 1975. Following the promulgation of these PSTs, monitoring was required at Subpart D facilities, but even then, the monitoring data were not used to determine source compliance, and only limited ongoing QA/QC requirements existed. PST requirements added opacity monitor design and performance specifications, monitor location requirements, additional analyzer range and span requirements, relative accuracy test audits (RATAs), 7-day drift tests, and PST reporting requirements.

A few years later, EPA’s Subpart Da regulations were proposed on September 18, 1978 and promulgated on June 11, 1979. Subpart Da contained more stringent CEMS requirements, including data availability requirements and missing data supplementation procedures. Furthermore, this was the first EPA rule to require the use of CEMS data to demonstrate compliance. Following the promulgation of Subpart Da, revisions to PSTs 1, 2 and 3 were proposed on October 10, 1979, re-proposed on January 26, 1981 and promulgated on March 30, 1983 for PST 1 and May 25, 1983 for PSTs 2 and 3.

On March 14, 1984, ongoing QA/QC procedures for CEMS were proposed and finally promulgated as Appendix F on June 3, 1987. The Appendix F procedures were EPA’s initial QA/QC program requirements and were applicable to all Part 60 monitoring systems required for determining compliance with an emission standard. Appendix F included QA Plan requirements, out-of-control limits, quarterly accuracy assessments and data assessment reporting requirements.

Note that during the development of Appendix F and all prior CEMS requirements, EPA did not have any instrumental Reference Methods, and CEMS stability, reliability and accuracy were persistent concerns. All of the monitoring specifications and test procedures were developed based on information obtained using EPA’s "wet chemistry" methods and older versions of the CEMS technology now being implemented for Part 75. The Part 75 requirements (proposed on December 3, 1991 and promulgated on January 11, 1993) were the first to incorporate instrumental Reference Method test procedures and the performance capabilities of current CEMS technologies. The use of these procedures, in conjunction with improvements in CEMS technology, was reflected in significantly more stringent CEMS performance limits. In general, drift and relative accuracy limits were reduced by 50 percent from the Subpart Da requirements to the Part 75 requirements.

While the performance requirements have changed significantly, many of the Part 75 QA/QC procedures still closely emulate the procedures developed for Part 60 sources. For example, the linearity test procedures evolved from the Part 60 Appendix F Cylinder Gas Audit (CGA) and "optional" Relative Accuracy Audit (RAA) requirements. The three-run requirement originated in the CGA/RAA procedures to allow for the averaging of three Reference Method 6 (the original wet chemistry SO2 method) runs when calculating RAA results. Furthermore, at the time these procedures were developed, many CEMS used strip charts, and three test runs were necessary to average out errors associated with the test methods and with reading measurements from a strip chart. As with this example, other current QA/QC requirements also contain procedures that reflect past concerns and that are no longer appropriate or applicable.

Suggested Part 75 QA/QC Modifications

CEMS operating experience suggests that there are a number of ways the existing Part 75 QA/QC requirements could be modified to reduce costs without impacting system reliability or data accuracy. To investigate possible modifications, RMB recently conducted a study for EPRI evaluating the Part 75 QA/QC performance specifications (EPRI GC-111911). The data clearly showed some QA/QC elements to be either unnecessary or outmoded. While the specific results of the EPRI study are not discussed here, the following paragraphs outline some changes that could be made to reduce the CEMS QA/QC burden with no significant impact on data quality.

Reduce Number of RATA Runs. Utilities should not be required to complete nine runs when performing a RATA but should instead be able to demonstrate compliance with the relative accuracy standard using fewer runs. As experienced source testers know, the outcome of a RATA can frequently be determined after three runs. This practical knowledge is borne out by an evaluation of RATA data; almost without exception, if the RATA criterion is met using the results from the first three, four, five or six test runs, performing additional test runs has no impact on the RATA results and consequently no benefit. Thus, if at any point, beginning with the relative accuracy results calculated after the third run, the relative accuracy requirements are satisfied, testing should be allowed to stop and the RATA should be considered complete. Of course, a minimum of nine runs would still be required in order to discard any test runs.

While clearly shown by a review of RATA data, the possibility of reducing the number of RATA runs can also be appreciated intuitively. The RATA is intended to serve as a combined check of system bias and variability and is based on the assumption that each pair of reference method and CEMS values is statistically independent. RATA data, however, actually represent little more than a snapshot of CEMS performance; thus, because the values are related, any perceived bias should be relatively consistent and easily recognizable from an analysis of the first few paired data sets. As for CEMS variability, because of the nature of the standard deviation calculation and the change in the t-value, passing the relative accuracy test is much more difficult with fewer runs.

The sliding scale of difficulty associated with passing the variability component is illustrated by the simulated RATA data in the following table. A standard deviation of only about 3.5% can be absorbed by the relative accuracy standard with three runs, whereas over 11.5% can be accommodated using nine runs.

Simulated RATA data illustrating the increased difficulty of satisfying the variability component of the relative accuracy test with fewer runs

Run     CEMS    Reference       Standard               Confidence     Relative
Number  Value   Method Value    Deviation    t-value    Coefficient    Accuracy (%)
  1      103        100             --          --          --             --
  2       97        100            4.24       12.71       38.12          38.12
  3      103        100            3.46        4.303       8.61           9.61
  4       92        100            5.32        3.182       8.46           9.71
  5      110        100            6.82        2.776       8.47           9.47
  6       88        100            8.08        2.571       8.49           9.65
  7      114        100            9.35        2.447       8.64           9.64
  8       84        100           10.53        2.365       8.81           9.93
  9      117        100           11.56        2.306       8.89           9.77
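
To make the early-termination logic concrete, the following Python sketch (ours, for illustration; not part of the EPRI study) recomputes the table above after each run and stops as soon as the result satisfies an assumed 10 percent relative accuracy standard. The t-values are the two-sided 97.5th percentile values for n-1 degrees of freedom used in the relative accuracy calculation.

    # Recomputes the simulated RATA table above and stops as soon as the
    # relative accuracy meets an assumed 10 percent standard (illustrative;
    # the applicable Part 75 standard depends on the parameter being tested).
    import math
    import statistics

    # Two-sided t-values (97.5th percentile) for n-1 degrees of freedom
    T_VALUES = {2: 12.706, 3: 4.303, 4: 3.182, 5: 2.776,
                6: 2.571, 7: 2.447, 8: 2.365, 9: 2.306}

    cems = [103, 97, 103, 92, 110, 88, 114, 84, 117]
    ref = [100] * 9

    for n in range(2, 10):
        d = [c - r for c, r in zip(cems[:n], ref[:n])]  # paired differences
        sd = statistics.stdev(d)                        # sample standard deviation
        cc = T_VALUES[n] * sd / math.sqrt(n)            # confidence coefficient
        ra = (abs(statistics.mean(d)) + cc) / statistics.mean(ref[:n]) * 100
        print(f"n={n}: SD={sd:.2f}  CC={cc:.2f}  RA={ra:.2f}%")
        if n >= 3 and ra <= 10.0:
            print(f"Standard met after {n} runs; testing could stop here.")
            break

Run on the simulated data, the sketch stops after the third run at 9.61 percent relative accuracy; as the table shows, the six additional runs would only confirm that result.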

Simplify Linearity Check. An analysis of Part 75 QA/QC data indicated that it is not necessary to perform three injections at each level in order to establish analyzer linearity. Triplicates are wholly unnecessary, particularly since experience shows that using the average of the replicate injections at each level almost invariably makes the test easier to pass. If linearity is demonstrated during the first round of injections at each level, little benefit is achieved by repeating the injections.

It is also unnecessary to include a high-level calibration gas injection in the linearity check, since the same cylinder gas used for the regular daily calibrations is typically used as the high-level gas. Only the low- and mid-level gas injections are meaningful; a high-level gas injection would only serve to duplicate the daily calibration results. Not surprisingly, given that the performance of the analyzers at the high level is well established by the daily calibrations, the data demonstrate that linearity failures are considerably more often associated with the low- and mid-level gas injections, further illustrating the unnecessary nature of the high-level gas injections.

EPA should consider simplifying the existing linearity procedures. As it now stands, the linearity test can result in unnecessary missing data during the time required to perform the triplicate injection series for each level, or it may consume half a technician's workday trying to avoid the missing data. The recommended procedure would be to conduct a calibration error test and then inject a low- and a mid-level linearity gas only once each. If the calibration error test is passed and the low- and mid-level linearity results satisfy the Part 75 criteria, then the CEMS passes the linearity test.
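
The recommended procedure is simple enough to express in a few lines. The following Python sketch is a minimal illustration using hypothetical function names; the 5.0 percent and 5 ppm limits reflect the Part 75 Appendix A linearity specifications for SO2/NOx concentration monitors.

    # A sketch (hypothetical function names) of the recommended simplified
    # linearity test: a passing calibration error test plus one low- and one
    # mid-level injection. The 5.0% and 5 ppm limits reflect the Part 75
    # Appendix A linearity specifications for SO2/NOx monitors.
    def linearity_error_pct(reference_ppm, measured_ppm):
        """Linearity error as a percentage of the reference gas tag value."""
        return abs(reference_ppm - measured_ppm) / reference_ppm * 100.0

    def passes_simplified_linearity(cal_error_passed, low, mid,
                                    pct_limit=5.0, abs_limit_ppm=5.0):
        """low and mid are (reference, measured) pairs from single injections."""
        if not cal_error_passed:        # the calibration error test comes first
            return False
        for reference, measured in (low, mid):
            # A level passes if it meets either the percentage criterion or
            # the alternative absolute-difference criterion.
            if (linearity_error_pct(reference, measured) > pct_limit
                    and abs(reference - measured) > abs_limit_ppm):
                return False
        return True

    # Example: single injections at roughly 30% and 60% of a 250 ppm span
    print(passes_simplified_linearity(True, low=(75.0, 73.2), mid=(150.0, 147.8)))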

Eliminate 7-Day Drift Test. EPA should consider eliminating the 7-day calibration drift test, since the crux of the test is essentially performed on an ongoing basis through the regular daily calibrations. When the 7-day drift test was originally conceived, years prior to Part 75, ensuring the general stability of the analyzers was much more meaningful, particularly when the daily drift specifications were two to four times higher than the current calibration error specifications. But given the current performance of Part 75 CEMS equipment, the 7-day drift test adds little to the monitoring program and often becomes little more than an exercise for CEMS technicians in predicting changes in the weather and other factors that might influence CEMS drift.

The 7-day drift test was carried over into the Acid Rain Program based on the assumption that "comprehensive and thorough calibration testing at certification is particularly important" and that "meticulous testing at certification can pay large rewards over the life of the monitoring program" (Federal Register, January 11, 1993, Volume 58, No. 6, p. 3641). Analyzers exhibiting the poor general stability that a 7-day drift test would expose, however, would likely have difficulty passing the calibration drift test on a regular, daily basis. The wealth of CEMS operating experience gained by utilities has also served to resolve many of the emissions monitoring difficulties and analyzer problems that the 7-day drift test was initially designed to catch. Thus, given the combination of daily calibrations and growing utility CEMS experience, the 7-day drift test could be eliminated entirely without impacting CEMS data quality.
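
For reference, the daily test that renders the 7-day check redundant amounts to a one-line calculation, sketched below in Python with illustrative gas and response values; the 2.5 percent-of-span limit is the Part 75 calibration error specification for SO2/NOx pollutant concentration monitors.

    # The daily calibration error determination that already exercises the
    # analyzer stability the 7-day drift test was meant to establish. Gas and
    # response values are illustrative; the 2.5%-of-span limit is the Part 75
    # specification for SO2/NOx pollutant concentration monitors.
    def calibration_error_pct(reference, response, span):
        """Calibration error expressed as a percentage of the analyzer span."""
        return abs(reference - response) / span * 100.0

    span = 250.0                             # ppm, analyzer span value
    checks = [(0.0, 1.5), (200.0, 197.0)]    # (gas tag value, analyzer response)

    for reference, response in checks:
        ce = calibration_error_pct(reference, response, span)
        status = "pass" if ce <= 2.5 else "fail"
        print(f"ref={reference:6.1f} ppm  response={response:6.1f} ppm  "
              f"CE={ce:.2f}%  {status}")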

Eliminate Flow Interference Check. The Agency should also investigate eliminating the daily flow interference checks required under Appendix B of Part 75 for stack volumetric flow monitors. The flow interference check is intended to supplement the two-point calibration test required of all CEMS analyzers. In general, while the details are not specified in Appendix B, a differential pressure flow monitor must determine daily whether the probe and system lines are sufficiently free of obstructions, and thermal, ultrasonic or acoustic flow monitors must perform a daily check to determine whether the sensors are sufficiently clean to prevent velocity sensing interference. The mechanics of the flow interference check vary. For ultrasonic or acoustic flowmeters, it typically involves checking for deterioration of signal strength and sometimes checking transducer blower operation. For differential pressure flowmeters, it usually involves a purge cycle. For thermal flowmeters, it conventionally involves comparing a single sensor's output to the average output or checking the temperature responses of co-located sensors.

While the intent of the flow interference check is to supplement the daily calibration, common CEMS operating experience suggests, and the data show, that the problems resulting in flow interference check failures also trigger calibration error test failures. Whenever a flow interference check is failed, a calibration failure is almost invariably reported as well, suggesting that the flow interference checks are superfluous. Furthermore, while the merit of the proposed flow-to-load test may be questionable, the application of the flow-to-load test would tend to further highlight any malfunction such as plugging or fouling, making the flow interference checks even more unnecessary.
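
To illustrate the point, a flow-to-load style consistency check can be expressed in a few lines. In the following Python sketch, the baseline ratio and the 15 percent tolerance are illustrative assumptions rather than regulatory values; a plugged or fouled flow sensor would appear as a persistent deviation from the baseline ratio.

    # A sketch of a flow-to-load style consistency check. The baseline ratio
    # and 15% tolerance are illustrative assumptions, not regulatory values.
    def flow_to_load_flags(flows_scfh, loads_mw, baseline_ratio,
                           tolerance_pct=15.0):
        """Flag hours whose flow/load ratio deviates from the baseline."""
        flags = []
        for flow, load in zip(flows_scfh, loads_mw):
            if load <= 0:
                continue                     # skip offline or invalid hours
            deviation = abs(flow / load - baseline_ratio) / baseline_ratio * 100.0
            flags.append(deviation > tolerance_pct)
        return flags

    # A plugged or fouled sensor shows up as a persistent deviation (last hour)
    print(flow_to_load_flags(flows_scfh=[5.10e6, 5.00e6, 6.30e6],
                             loads_mw=[100.0, 98.0, 100.0],
                             baseline_ratio=5.05e4))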