ETV Quality & Management Plan

EPA Report No: EPA/600/R-98/064
May 1998

ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
ACKNOWLEDGMENTS

The first draft of this document was developed by a team of writers consisting of the following quality assurance staff members from the U.S. Environmental Protection Agency Office of Research and Development National Risk Management Research Laboratory and National Exposure Research Laboratory: Sam Hayes, Lora Johnson, Ann Kern, and Jeff Worthington. Subsequent revisions included significant input in the form of comments from members of the Environmental Technology Verification Team. Verification partners also provided comments. The team of writers was joined by the following EPA staff in development of this final document: Nancy Adams, Penelope Hansen, Linda Porter, and Shirley Wasson. The HTML version of this document was prepared in July 1998 by Jeremiah McBurrows, an apprentice in the Minority Research Apprenticeship Program of the U.S. Environmental Protection Agency and the University of Cincinnati.

TABLE OF CONTENTS

DOCUMENTS AND GENERAL TERMS
1.0 MANAGEMENT AND ORGANIZATION
1.3 ETV customer identification and ETV customer needs/expectations/work objectives
1.4 Potential verification partners
1.5 Management resolution for verification partner quality constraints
1.7 Authority to stop work for safety and quality considerations
2.0 QUALITY SYSTEM AND DESCRIPTION
2.1 Authorities and conformance to E4 quality standard
2.4 Quality expectation for products and services
2.5 Quality procedures documentation
2.7 Management system reviews (MSRs)
3.0 PERSONNEL QUALIFICATION AND TRAINING
3.1 Personnel training and qualification procedures
3.2 Formal qualifications and certifications
3.3 Technical management and training
4.0 ETV VERIFICATION PARTNER SELECTION
4.1 Planning and control of selection process
4.2 Technical and quality requirements
4.3 Quality specification/conformance
4.4 Peer review of assistance agreements
4.5 Conformance of verification testing efforts
5.2 Preparation, review, approval, and distribution
5.3 Records storage and obsolete records
6.0 COMPUTER HARDWARE AND SOFTWARE
6.2 Scope of ETV computer hardware/software procedures
6.4 Measurement and testing equipment configurations
6.5 Change assessments - configurations, components, and requirements
6.6 ETV website roles and responsibilities
7.1 Systematic planning process
8.0 IMPLEMENTATION OF WORK PROCESSES
9.1 Numbers and types of assessments
9.3 Personnel qualifications, responsibility, and authority
10.1 Annual review for quality improvement
10.2 Detecting and correcting quality system problems
10.3 Cause and effect relationship
10.5 Quality improvement action

COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA
1.1 Systematic planning of the verification test
1.2 Systematic planning for verification testing
2.0 DESIGN OF TECHNOLOGY VERIFICATION TESTS
3.0 IMPLEMENTATION OF PLANNED OPERATIONS
3.1 Implementation of planning
3.3 Field and laboratory samples
3.4 Data and information management
5.0 ASSESSMENT AND VERIFICATION OF DATA USABILITY

Annual progress report
Directors of Quality Assurance
E4
ETV assistance agreement
ETV coordinator
ETV verification statement
ETV verification report
EPA line management
EPA pilot manager
EPA pilot quality managers
EPA review/audit reports
ETV team
ETV test objective
ETV webmaster
Evaluation contractor
Generic verification protocol
Laboratory director
Management system review
Office of Research and Development Assistant Administrator
Quality and Management Plan for the Pilot Period
Raw data
Records
Stakeholder groups
Standard operating procedures
Test/QA plan
Test measurement
Verification
Verification partners
Verification partner manager
Verification partner quality manager
Verification partner quality management plan
VP review/audit reports
Verification Strategy
ABBREVIATIONS AND ACRONYMS
Background

The Environmental Technology Verification Program (ETV) has been established by the
Environmental Protection Agency (EPA) to evaluate the performance characteristics of
innovative environmental technologies across all media and to report this objective
information to the permitters, buyers, and users of environmental technology. ETV has
evolved in response to the following mandates:
To comply with these directives, EPA's Office of Research and Development (ORD) has established a five-year pilot program to evaluate alternative operating parameters and determine the overall feasibility of a technology verification program. ETV began in October 1995 and will be evaluated through October 2000, at which time the Agency will prepare a report to Congress containing the results of the pilot program and recommendations for its future operation.

Program Description

The thesis that independent performance verification more rapidly moves new technology into use will be tested by EPA's five-year pilot program. ETV funds and operates twelve pilot projects, each operated by a third-party organization under the auspices of EPA. These "partner organizations" include private-sector testing, evaluation, and research companies, state technology evaluation programs, federal laboratories, and industry associations. For the most part, each pilot is focused on a different environmental, industry, or technology sector (e.g., air pollution control technology, drinking water systems, field monitoring devices, industrial coatings products). By design, all pilots are operated in a somewhat different manner in order to test various methods for both technical and operational efficiency and effectiveness in verifying performance. Management techniques are in place to assure that constant evaluation of alternative methods occurs and results in continuous improvement of processes throughout the pilot period. Because credible information is the ultimate product of ETV, the highest appropriate quality assurance procedures will be used throughout the program.

The EPA's Office of Research and Development implements an Agency-wide quality system to assure that activities conducted in EPA research laboratories, other EPA research facilities or locations, or at facilities being operated on behalf of or in cooperation with the EPA are supported by data of known and acceptable quality for their intended use. Individual research laboratories develop laboratory-specific quality management plans. The National Risk Management Research Laboratory (NRMRL) and the National Exposure Research Laboratory (NERL) are implementing ETV in conformance with such plans.

Program and Quality Management Documents

The second major program management document being used by ETV to guide its operation, after the ETV Verification Strategy, is the ETV Quality and Management Plan (QMP) which follows this introduction. Under development for over a year, the ETV QMP uses the structure, policies, and standards of the American National Standard ANSI/ASQC E4-1994, "Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs." This document "...describes a basic set of mandatory specifications and non-mandatory guidelines by which a quality system for programs involving environmental data collection and environmental technology can be planned, implemented and assessed."* As of February 1996, all cooperative agreements entered into by EPA concerning environmental technology must be in conformance with the provisions of E4. This requirement is expected to be extended to EPA contracts in 1998.#
The ETV Quality and Management Plan

EPA's verification program is organizationally complex, involving numerous outside organizations through its extensive stakeholder process, partner organizations that bear most of the quality assurance responsibilities, and testing and consulting companies hired by partner organizations to conduct field and laboratory work. Within EPA, the program is coordinated through ORD's ETV Team, consisting of staff from ten Branches located in six Divisions of two Laboratories, NRMRL and NERL, along with quality assurance staff in each of the laboratories' physical locations. Finally, EPA program offices and regions are increasingly involved in outreach activities, as are states and other Federal agencies through the White House Environment and Technology Working Group. The ETV QMP is designed to play a major role in clearly delineating the roles and responsibilities of all of these diverse and important players.

All partner organizations will use this document and its parent, the E4 standard, to create quality management plans that assure appropriate levels of data collection, quality outputs, and customer responsiveness. These plans will be submitted to EPA for review and approval by pilot managers and quality assurance staff. It is not the purpose of this document to require that partner organizations create wholly new operating procedures solely for use under ETV. Most organizations selected by EPA as cooperators already have many of the procedural and process elements required by E4 incorporated into their existing management systems. Other requirements found in this document will be new or different. Cooperators should address all appropriate elements of the ETV QMP either specifically in their ETV plan or include appropriate and adequately detailed references to existing documents.

The ETV QMP will be reviewed on an annual basis throughout the pilot period (and beyond, if the program is extended) to incorporate lessons learned from the experiences of the pilots and feedback from customer groups. The addition of new policies and the elimination or modification of ineffective procedures will be discussed with all participants, and modifications to partner QMPs may be required. The ETV QMP follows the general outline of the ANSI/ASQC E4-1994 document.

____________________
*American Society for Quality; American National Standard Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs; Milwaukee, Wisconsin; 1994.
#All ETV pilots are executed under cooperative agreements with the exception of the Site Characterization and Monitoring Pilot and the Industrial Coatings and Coatings Equipment Pilot, which utilize Interagency Agreements with the Departments of Energy and Defense.

Part A of the ETV Quality and Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program. Part B of the ETV Quality and Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data.

1.0 MANAGEMENT AND ORGANIZATION

1.1 ETV quality policy

The Office of Research and Development shall establish and implement a quality policy to ensure that the Environmental Technology Verification (ETV) program produces the type and quality of program outputs needed and expected by ETV clients.
The EPA Office of Research and Development's (ORD) quality policy for the Environmental Technology Verification (ETV) program is established as follows:
- The quality system for the overall ETV program seeks to be consistent with industry consensus standards.
- Each verification partner shall implement a valid and approved quality system. As of February 1996, the Agency's required quality system for cooperative agreements is ANSI/ASQC E4.
- Each verification test will be performed according to planned, documented, and pre-approved test/QA plans.
- All technical statements in ETV verification reports shall be supported by the appropriate data.
1.2 Organization structure

The relevant organizations, functional responsibilities, levels of accountability and authority, and lines of communication shall be formally defined in the quality system and approved by the EPA laboratory directors responsible for the quality of work performed by or on behalf of each EPA laboratory. The overall organizational structure of the ETV program graphically presents lines of accountability, authority, and communication. The general functional responsibilities for the major organizational units are specified in the structure.
1.2.1 Assistant Administrator for ORD and the USEPA Administrator responsibilities:
1.2.2 ORD laboratory directors responsibilities:
1.2.3 Division directors and branch chiefs responsibilities:
1.2.4 ETV coordinator responsibilities:
1.2.5 ETV team responsibilities:
1.2.6 EPA pilot managers responsibilities:
1.2.7 Verification partners responsibilities:
1.2.8 Stakeholders group responsibilities may include the following:
1.2.9 ETV directors of quality assurance responsibilities:
1.2.10 EPA pilot quality managers responsibilities:
Tables available on the ETV website present a current listing of the pilots that are either underway or soon to be awarded. Included in the tables are the EPA pilot managers, EPA pilot quality managers, verification partner managers, and verification partner quality managers. The tables contain their names, geographic locations, ORD laboratory or company affiliations, and phone numbers.

1.3 ETV customer identification and ETV customer needs/expectations/work objectives

The ETV coordinator, pilot managers, and partners are responsible for coordinating the identification of customers and communicating the needs of the internal and external customers to ensure that ETV work products satisfy their needs.

1.3.1 As identified in the ETV Verification Strategy, external customers (i.e., outside EPA) include, but are not limited to:
In a general sense, needs and expectations of external customers include:
For each pilot, needs and expectations of external customers are defined and documented in the minutes of stakeholders meetings. The process to define these pilot-specific needs and expectations includes:
NOTE: Not all pilots are structured the same. For example, the independent pilot uses an advisory committee, an expert review, and an alliance group in determining pilot-specific needs and expectations.

1.3.2 Internal customers of the ETV program are those EPA staff responsible for execution of the ETV program in accordance with the expectations of Congress and the Administration. These customers include EPA and ORD senior managers who expect conformance with management and quality policies of the Agency. Other EPA staff, such as EPA technical experts in the regions and headquarters, will benefit incidentally from the program in the following areas:
1.4 Potential verification partners

EPA line management shall oversee the selection of ETV verification partners. Each ETV pilot seeks to evaluate a wide variety of verification partnerships, using both interagency and cooperative agreements. When cooperative agreement holders conduct the verification testing, they are competitively solicited using the Request for Application (RFA) process, whereby notice of EPA's intent to issue an RFA is published, typically in the Commerce Business Daily (CBD).

1.5 Management resolution for verification partner quality constraints

When necessary, appropriate EPA management shall negotiate acceptable measures of quality and success when constraints of time, costs, or other problems affect the verification partner's capability to fully satisfy customers' needs and expectations. When constraints of time, costs, or other problems significantly affect the verification partner's capability to fully satisfy the ETV's quality system needs and expectations, EPA pilot managers negotiate with the verification partner by the following procedure to establish acceptable measures of quality and success:
1.6 Resources

The laboratory directors shall provide adequate resources to the ETV directors of quality assurance, EPA pilot managers, and EPA pilot quality managers to enable them to plan, implement, assess, and improve the overall ETV program and quality system effectively. Laboratory directors take the following actions to achieve the above policy:
1.7 Authority to stop work for safety and quality considerations

The verification partner shall stop unsafe work and work of inadequate quality, or shall delegate the authority to do so to others. The following procedures are necessary to stop unsafe work and work of inadequate quality:
2.0 QUALITY SYSTEM AND DESCRIPTION

A quality system shall be planned, established, documented, implemented, and assessed as an integral part of an ETV management system for environmental technology verification programs defined by ETV quality policy. Development and subsequent endorsement of this plan by the ETV coordinator and EPA line management are evidence that the ETV quality system is planned, established, documented, implemented, and assessed as an integral part of an EPA ETV management system.
2.1 Authorities and conformance to E4 quality standard

The ETV quality system shall address applicable parts of E4 and shall include the organizational structure, policies and procedures, responsibilities, authorities, resources, and guidance documents. The authority for developing appropriate quality systems for ETV is USEPA Order 5360.1. The requirement for assistance agreement holders is found in the Federal Register, CFR Parts 30 & 33, February 15, 1996. This plan complies with ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Agency standard applicable to assistance agreements. E4 is comparable to the International Standards Organization (ISO) 9000 standards series, as shown in the comparison table provided in Annex B-5 to E4. The ETV quality system addresses each applicable individual "specification" provided in the published quality standard, ANSI/ASQC E4-1994, using the policies and procedures in this plan, as appropriate. Verification partners develop quality system descriptions to be consistent with ANSI/ASQC E4-1994 (and/or ISO 9001) and with this document.
2.2 Quality system documents

The ETV quality system shall be described in a QMP that is reviewed and approved by the ETV coordinator and EPA line management. The ETV quality system is described in this quality and management plan.
2.3 Quality system scope

The ETV quality system description shall identify in general terms those items, programs, or activities to which it applies. This quality system description applies to the following:
2.4 Quality expectation for products and services

The ETV quality system shall include provisions to ensure that products or results of the environmental programs defined by the ETV program are of the type and quality needed and expected by ETV clients. The preeminent products of the ETV program are the environmental technology verification reports and statements issued by EPA and the verification partner. Provisions to ensure that these products and other results of the ETV program are of the quality expected include:
2.5 Quality procedures documentation

Following approval of the ETV QMP, management elements of the quality system shall be implemented as described. Verification partners must operate the ETV pilots under a written and EPA-approved quality management plan that is based on E4 and/or the provisions of this plan.
The EPA pilot manager is responsible for obtaining a copy of the verification partner's quality management plan, as specified in the RFA, for their own review, and for forwarding the document to the EPA pilot quality manager for review and approval prior to planning technology tests.

2.6 The ETV quality system description shall define when and how controls are to be applied to specific technical or technology testing efforts and shall outline how these efforts are planned, implemented, and assessed.

2.6.1 ETV program controls include:
2.6.2 Pilot-specific controls include:
Pilot-specific procedures for planning, implementation, and assessment are described in the verification partner's quality system. Procedures for planning, implementing, and assessing the overall ETV quality system are detailed in part A sections 7.0, 8.0, and 9.0 and in part B.

2.7 Management system reviews (MSRs)

At regular intervals (at least annually) the ETV quality system shall be reviewed and its description updated, if appropriate, to reflect changes in the organization as well as changes in ETV quality policy. The ETV directors of quality assurance perform an internal MSR of the program in accordance with the process as outlined in part A section 9.0.

3.0 PERSONNEL QUALIFICATION AND TRAINING

3.1 Personnel training and qualification procedures

Personnel performing work shall be trained and qualified based on appropriate requirements prior to the start of the work or activity.

3.1.1 EPA pilot managers are selected based on:
3.1.2 EPA pilot quality managers are selected based on:
3.1.3 Key participants working directly for or on behalf of the verification partner in support of the pilot and/or individual test operations are selected by the verification partner and evaluated by the EPA during the RFA process. RFA evaluation criteria for key personnel will vary, but typically include a consideration of the following:
The verification partner's documented quality management plan will address training and qualification procedures for pilot personnel.

3.2 Formal qualifications and certifications

The need to require formal qualification or certification of personnel performing certain specialized activities shall be evaluated and implemented where necessary. ETV program management, quality management, and pilot management require no formal qualification or certification other than, where applicable:
Formal qualification or certification of personnel performing specialized activities for each pilot or for specific test/QA plans is addressed on a pilot-specific or test/QA plan-specific basis. Verification partners maintain records of the qualification or certification of such personnel. NOTE: Requirements for formal qualifications or certification may be based on applicable federal, state, or local requirements associated with a particular test. Examples of possible certifications include, but are not limited to, drinking water plant operator certification, professional engineering registration, and certification of industrial hygienists.

3.3 Technical management and training

Appropriate technical and management training, which may include classroom and on-the-job training, shall be performed and documented. EPA line management is responsible for appropriate technical and management training for staff working on the ETV program. Such training will be documented in each individual's training file. Verification partners are responsible for personnel training and qualification procedures for each pilot or for specific test/QA plans. Verification partners maintain the training records (available for review by EPA). The ETV team will be trained at meetings occurring at least twice a year to develop policy and to share information and lessons learned. The directors of quality assurance provide training on the requirements of the ETV QMP during the periodic workshops organized by the ETV coordinator.

3.4 Retraining

When job requirements change, the need for retraining to ensure continued satisfactory job proficiency shall be evaluated. The need for retraining EPA staff is evaluated on an annual basis by the appropriate line management. Verification partners are responsible for retraining for each pilot or for specific test/QA plans.

3.5 Personnel job proficiency

Evidence of personnel job proficiency shall be documented and maintained for the duration of the technology test or activity affected, or longer if required.

3.5.1 EPA pilot managers - The existing performance standards of the EPA pilot managers may already include tasks consistent with the following items. These items should be considered for specific identification in the performance standards:
NOTE: Evaluations are the responsibility of the appropriate supervisor and are not a record of the ETV program.

3.5.2 EPA pilot quality managers - The existing performance standards of the EPA pilot quality managers may already include tasks consistent with the following items. These items should be considered for specific identification in the performance standards:
NOTE: Evaluations are the responsibility of the appropriate supervisor and are not a record of the ETV program.

3.5.3 Verification partner staff - Verification partners document and maintain records (such as annual performance reviews) of personnel job proficiency for work performed directly in support of the verification partner's ETV activities. NOTE: Evaluations are the responsibility of the verification partner and are not a record of the ETV program.

4.0 ETV VERIFICATION PARTNER SELECTION

4.1 Planning and control of selection process

Funding of assistance associated with the ETV program shall be planned and controlled to ensure that the quality of verification tests is known, documented, and meets technical requirements and acceptance criteria of the clients. The ETV program is designed to investigate ways to facilitate the verification and use of environmental technology and exists solely for the benefit of industry and the user community. Two pilots operate under interagency agreements; however, the ETV program primarily operates by securing the cooperation of verification partners through open competition utilizing the agency's assistance agreement program. Agency procedures for advertising, reviewing, and awarding assistance agreements are followed during the selection process. The procedures governing this process are available from GAD and are not discussed in this section. The procedures used to plan and control the selection of verification partners for ETV are listed below. Planning to select verification partners requires:
4.2 Technical and quality requirements

Assistance solicitation documents shall contain information clearly describing the technical and quality requirements associated with the verification testing. Technical and quality requirements expressed in the RFA include technical evaluation criteria for technical skills and experience of staff members, and demonstrated experience in the development of quality systems relevant to ETV. Cooperative agreements require that a verification partner, once selected, develops and submits to EPA for approval a pilot quality management plan, consistent with E4, prior to conducting technical activities.

4.3 Quality specification/conformance

Assistance solicitation documents shall specify the ETV quality requirements for which the verification partner is responsible and how the verification partner's conformance to client's requirements shall be verified. ETV quality requirements for which the verification partner is responsible are specified in the RFA and in this ETV QMP. During verification partner selection, the applicants' proposals and written responses to the requirements are reviewed for conformance to the RFA specifications. After a verification partner is selected, the EPA pilot quality manager reviews and the EPA pilot manager approves written quality system documents (e.g., pilot QMP) for conformance to the EPA and ETV quality policies and procedures.

4.4 Peer review of assistance agreements

Assistance award documents shall be reviewed for accuracy and completeness by qualified personnel prior to award. Peer review is an integral part of EPA's project planning, implementation, and assessment process. RFA packages are internally peer reviewed prior to their issuance. Responses to the RFA undergo a peer review process which supports the award of the assistance agreement.

4.5 Conformance of verification testing efforts

Appropriate measures shall be established to ensure that the verification testing efforts satisfy all terms and conditions of the assistance agreement. Verification partners shall have a demonstrated capability to meet all terms and conditions. Once a verification partner has been selected, measures to ensure continued conformance to terms and conditions in the assistance agreement are implemented as described in part A, sections 8, 9, and 10.

5.0 RECORDS

Procedures shall be established, controlled, and maintained for identifying, preparing, reviewing, approving, revising, collecting, indexing, filing, storing, maintaining, retrieving, distributing, and disposing of pertinent quality documents and records. Such procedures shall be applicable to all forms of documents and records, including printed and electronic media. Measures shall be taken to ensure that users understand the documents to be used. Records requiring control shall be identified.
Records to which this policy applies include:
- ETV Verification Strategy
- ETV QMP (this document)
- cooperative agreement and interagency agreement records
- verification partners' quality management plans
- minutes of stakeholder meetings
- generic verification protocols (how a given type of technology is verified)
- test/QA plans (procedures for an individual test, including SOPs)
- raw data (all written and electronic data generated when tests are conducted)
- ETV verification reports (comprehensive reports on a technology verification project)
- ETV verification statements (summary statement for an individual technology test)
- annual pilot progress reports
- EPA review/audit reports
- verification partner review/audit reports

__________________________
Information in this section applies to both electronic and printed records, as well as original records developed on behalf of the ETV program that are required to demonstrate the quality of information and data provided in ETV verification reports.

TABLE 5.1 Records Management Scheme
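Table 5.1 itself assigns, for each of the record types above, a preparer, the reviewers, an approving authority, and a distribution list (see part A section 5.2). As a minimal illustrative sketch only, with hypothetical role assignments rather than the actual Table 5.1 entries, such a scheme can be represented as a simple data structure:

```python
# Minimal sketch of the record-control scheme that Table 5.1 codifies.
# The example entry below is hypothetical; the authoritative assignments
# are those given in Table 5.1 for each controlled record type.
from dataclasses import dataclass, field

@dataclass
class ControlledRecord:
    record_type: str              # e.g., "test/QA plan"
    preparer: str                 # responsible for preparing and updating
    reviewers: list[str]          # review for technical/quality conformance
    approver: str                 # authorized to release, or "N/A"
    distribution: list[str] = field(default_factory=list)  # final-copy recipients

example = ControlledRecord(
    record_type="test/QA plan",
    preparer="verification partner",
    reviewers=["EPA pilot manager", "EPA pilot quality manager"],
    approver="EPA pilot manager",
    distribution=["vendor", "testing organization"],
)
```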
5.2 Preparation, review, approval, and distribution

Sufficient records shall be specified, prepared, reviewed, authenticated, and maintained to reflect the achievement of the required quality for completed work and/or to fulfill any statutory requirements. Documents used to perform work shall be identified and kept current for use by personnel performing the work. Documents, including revisions, shall be reviewed by qualified personnel for conformance with technical requirements and quality system requirements and approved for release by authorized personnel. Table 5.1 lists the pertinent quality records for ETV, the person(s) responsible for preparing and updating these records, the reviewers, those given approval authority for each record type, and the distribution plan. Where a procedure is not applicable (e.g., a document is not subject to approval), N/A is entered in Table 5.1. All reviewers and approving officials receive copies of the records they review/approve; the Distribution column in Table 5.1 lists only those individuals who receive final copies, in addition to the reviewers and approving official. For revised documents, these same review, approval, and distribution pathways are followed. Unless otherwise noted, material placed on the ETV website is available for public inspection, comment, and use.

5.3 Records storage and obsolete records

Obsolete or superseded documents shall be identified and measures shall be taken to prevent their use, including removal from the work place and from the possession of users when practical. Maintenance of records shall include provisions for retention, protection, preservation, traceability, and retrievability. While in storage, records shall be protected from damage, loss, and deterioration. Retention times for records shall be determined based on contractual and statutory requirements, or, if none are stated, as specified by the ETV coordinator and EPA line management. Obsolete records should be clearly marked as such. These records may be retained in the workplace for historical reference, or they may be removed to archival storage. ETV will follow ORD's Records Management Policy, Part 003 (see Appendix A), which addresses requirements for indexing, filing, maintaining, retrieving, and disposing of documents and records from all extramural financial agreements. The current minimum requirement is that all records be kept for seven years after the final payment on a cooperative agreement or interagency agreement.

6.0 COMPUTER HARDWARE AND SOFTWARE

Computer software and computer hardware configurations used in the ETV program shall be installed, tested, used, maintained, controlled, and documented to meet users' requirements and shall conform to this quality policy and applicable consensus standards and/or data management criteria. At the program level, ETV does not expect to develop software. At the pilot level, if verification partners intend to develop software to support their ETV process (or an individual test/QA plan), the partner should have procedures in place as specified here. If the verification partner uses only commercial software for office operations (e.g., word processing software, spreadsheet software), it is unlikely that the partner would need specific procedures for assessing software quality. Part A, sections 6.2 through 6.6, apply only to software and software/hardware configurations developed specifically for the ETV program.
6.1 The following are the ETV program procedures that ensure that each pilot controls the quality of all computer hardware/software configurations for the program:
6.2 Scope of ETV computer hardware/software procedures

Computer software and computer hardware/software configurations covered by ETV's quality policy include, but are not limited to:
Computer software and computer hardware/software configurations covered by this quality and management plan include all agreed upon, pilot-specific applications or configurations. These include, but are not limited to:
6.3 Configuration testing

Computer hardware/software configurations shall be tested prior to actual use and the results shall be documented and maintained. On a pilot level, the verification partner conducts tests of the computer hardware/software configuration using a standard set of testing conditions. NOTE: The verification partner is required to have a system to document all testing of computer hardware/software configurations, as required by part A section 6.1. A test data set or a standard set of testing conditions should be developed on a pilot- or test/QA plan-specific basis. Maintenance testing should be easily tracked and retrievable.
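By way of illustration only (this plan does not prescribe a mechanism, and the function names, data set, and tolerance below are hypothetical), testing against a standard set of conditions can amount to replaying a fixed test data set through the configuration and recording a pass/fail result for each case:

```python
# Illustrative sketch: replaying a standard test data set through a
# hardware/software configuration and documenting the outcome, in the
# spirit of part A section 6.3. `process` stands in for the configuration
# under test; the cases and tolerance are hypothetical.

def run_configuration_test(process, test_cases, tolerance=1e-6):
    """Run each (input, expected) pair through the configuration and
    record a result that can be filed with the pilot's test records."""
    results = []
    for case_id, (x, expected) in enumerate(test_cases, start=1):
        actual = process(x)
        results.append({
            "case": case_id,
            "input": x,
            "expected": expected,
            "actual": actual,
            "passed": abs(actual - expected) <= tolerance,
        })
    return results

# Example: a configuration expected to apply a fixed unit conversion.
cases = [(1.0, 1.145), (10.0, 11.45)]
report = run_configuration_test(lambda reading: reading * 1.145, cases)
print(all(r["passed"] for r in report))  # True if every case passes
```

The same retained results then satisfy the documentation requirement above, since each run leaves a retrievable record of what was tested and how it performed.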
6.4 Measurement and testing equipment configurations

Computer hardware/software configurations integral to measurement and testing equipment that are calibrated for a specific purpose do not require further testing unless:
- the scope of the software usage changes, or
- modifications are made to the hardware/software configuration.

On a pilot level, verification partners perform the following procedures (as provided in the verification partner's quality system). Whenever computer hardware/software configurations integral to measurement and testing equipment are calibrated for a specific purpose, further testing is not normally performed unless the scope of the software usage changes or modifications are made to the hardware/software configuration. In the event either of the above mentioned changes occurs, the verification partner retests the changes as described in part A sections 6.1 and 6.3. Retesting is documented to the same extent as the original application/configuration.

6.5 Change assessments - configurations, components, and requirements

Changes to hardware/software configurations, components, or program requirements shall be assessed to determine the impact of the change on the technical and quality objectives of the ETV program supported. The verification partner is responsible for assessing the changes, determining the need for testing, and reporting the assessments to the pilot manager.

6.6 ETV website roles and responsibilities

The ETV website shall be operated in such a way that it serves all ETV participants and customers through prompt and accurate posting of ETV information and documents. The pilot managers, or alternate(s) designated in writing by the pilot manager, are responsible for sending the following information to the ETV webmaster:
7.0 PLANNING

7.1 Systematic planning process

A systematic planning process shall be established, implemented, controlled, and documented to:
7.1.1 The systematic planning process established for ETV is conducted as follows:
- identify, revise, and/or clarify the technical and quality goals of the work to be accomplished
- translate the technical and quality goals into written specifications that will be used to produce the desired result
- consider any cost and schedule constraints within which test activities are required to be performed
- develop qualitative measures of performance by which the results will be accepted
- determine testing priorities and evaluate customer satisfaction
7.1.2 Implementation of the systematic planning process

Planning is accomplished through frequent meetings among participants and through posting initial planning documents and stakeholders meeting minutes on the ETV website. Procedures for planning at the pilot and test level are addressed in Part B. Procedures for implementing the planning process are detailed below:
7.1.3 Systematic planning process controls include:
7.1.4 Systematic planning process documentation includes the ETV Verification Strategy, the ETV QMP, the verification partners' QMPs, and test/QA plans.

7.2 Planning document review

All planning documentation shall be reviewed and approved for implementation by authorized personnel before the specific work commences. Such documentation includes but is not limited to test/QA plans and generic verification test protocols. Planning document review is discussed in part A section 5.2.

8.0 IMPLEMENTATION OF WORK PROCESSES

8.1 Implementation

Work shall be performed according to approved planning and technical documents. The planning for the implementation of the EPA management and quality work processes is contained in part A section 7.0. The individual ETV pilot work is performed according to planning documents written by the pilot. All technology verification work shall occur according to protocols and test/QA plans developed and agreed upon by EPA, the verification partner, and the vendor. The authors, reviewers, and approvers of these documents are specified in part A section 5.0, Table 5.1. The approved protocols and test/QA plans shall be present on the site of testing, and the work shall be implemented in accordance with them. During the work phase, modifications to plans and procedures shall be documented, and the modifications shall be incorporated into the final protocols and test/QA plans. The authors, reviewers, and approvers of changes to these documents are the same as for the original documents and are specified in part A section 5.0, Table 5.1. Verification partners are responsible for implementing their work processes in accordance with their quality systems.

8.2 Procedures

Procedures shall be developed, documented, and implemented for appropriate routine, standardized, special, or critical operations. Operations needing procedures shall be identified. The form, content, and applicability shall be addressed, and the reviewers and approvers shall be specified. Procedures for the overall operation of the ETV program are contained in the ETV Verification Strategy, the ETV QMP, and in other appropriate EPA policies (e.g., contractual, records management). The individual ETV pilots shall identify and document those operations in their pilots requiring procedures as discussed in Part B. Procedures shall be written in a format that can be readily comprehended by the user and shall contain sufficient detail and clarity to ensure that results are achieved effectively. Appropriate operations documents, authors, reviewers, and approvers are specified in part A section 5.0, Table 5.1. Implementation of work shall be accomplished with a level of management oversight and inspection commensurate with the importance of the program and the intended use of the results, and shall include the routine measurement of performance against established technical and quality specifications. EPA line management has responsibility for oversight of verification work processes as discussed in part A section 1.0. Verification partner oversight and responsibilities for the verification work processes are given in the individual pilot QMPs.

9.0 ASSESSMENT AND RESPONSE

9.1 Numbers and types of assessments

Assessments shall be planned, scheduled, and conducted to measure the effectiveness of the implemented management and quality systems. Several types of assessments are available for this purpose. Management shall determine during the planning stage the appropriate types of assessment activities.
Assessments shall include an evaluation to determine and verify whether technical requirements, not just procedural compliance, are being implemented effectively. The assessments shown in Table 9.1 and the minimum frequency are commensurate with the importance of the ETV program and the intended use of the verification results. Management assessments shall be used to measure the effectiveness of the implemented management systems and technical systems. Performance assessments shall be used to evaluate performance of the pilot technical operations. Data assessments shall assess reported data quality. Verification partners perform self-assessments in accordance with the individual pilot management plans, and EPA performs independent assessments of verification partners.
(Also, see part B section 4.2 for information regarding assessment frequency.)

9.2 Procedures

Assessments shall be performed according to written and approved procedures, based on careful planning of the scope of the assessment and the information needed. Assessment results shall be documented and reported to management. Management shall review the assessments. Assessments shall be planned according to the scope of the assessment and the information needed. Suitable written procedures for planning and conducting audits shall be contained in the operating manuals of EPA quality teams, the operating and quality manuals of the verification partners, and EPA guidance documents. Assessments are based on interviews, on the physical examination of objective evidence, and on the examination of the documentation of past performance. Results are documented in audit reports and reviewed by appropriate management.

9.3 Personnel qualifications, responsibility, and authority

Personnel conducting assessments shall have the appropriate technical or management skills to perform the assigned assessment. Management shall determine and document the level of competence, experience, and training necessary to ensure the capability of personnel conducting assessments. The responsibilities and authorities of personnel conducting assessments shall be clearly defined and documented, particularly in regard to authority to suspend or stop work in progress upon detection and identification of an immediate adverse condition affecting the quality of results or the health and safety of personnel. EPA and verification partner management determines and documents the level of competence, experience, and training of their respective audit personnel during hiring and periodic performance reviews. Qualified audit personnel, as listed in Table 9.1, have access to the appropriate management personnel and documents required to perform their audit duties. They are organizationally independent of the program or pilot they are auditing. They have the responsibility and authority to:
If auditors identify a severe problem affecting verification quality, EPA pilot managers have the authority to request of the verification partner manager that work be stopped until the problem is addressed. If auditors identify a problem where the health and safety of personnel are in danger, they have the responsibility to bring it to the immediate attention of appropriate EPA management, verification partner management, and onsite testing personnel.

Responses to adverse conclusions from the findings and recommendations of assessments shall be made in a timely manner. Conditions needing corrective action shall be identified and the appropriate response made promptly. Follow-up action shall be taken and documented to confirm the implementation and effectiveness of the response action. When the recommendations and conclusions from the findings of assessments are adverse, a response from the auditee detailing the corrective action shall be expected within 10 working days of receiving the audit report. Auditors shall follow up with appropriate documentation to confirm the implementation and effectiveness of the response.

10.0 QUALITY IMPROVEMENT

10.1 Annual review for quality improvement

A quality improvement process shall be established and implemented to continuously develop and improve the ETV Quality System. The ETV coordinator and EPA directors of quality assurance review the quality and management plan annually and recommend improvements to the plan. The EPA directors of quality assurance recommend improvements and negotiate improvements with the ETV team during the annual meeting and through the ETV website.

10.2 Detecting and correcting quality system problems

Procedures shall be established and implemented to prevent as well as detect and correct problems that adversely affect quality during all phases of technical and management activities. EPA pilot managers and EPA pilot quality managers report problems in any of the following areas to EPA line management and the ETV directors of quality assurance:
EPA line managers respond promptly to address correction of the quality problem.

10.3 Cause and effect relationship

When problems are found to be significant, the relationship between cause and effect and the root cause shall be determined. The following are general procedures; specific procedures are found in the individual verification partners' written quality systems. When problems are significant, the quality manager determines and documents the relationship between cause and effect, and when possible, determines and documents the root cause of the problem. The quality manager provides this information to the appropriate project managers so corrective action can be authorized and implemented. A significant problem is any problem requiring:
NOTE: The verification partner quality managers, in accordance with their quality systems, continually review and assess their projects for conformance with their quality documents. At the program level, assessment reports from the individual projects are monitored and evaluated by the ETV directors of quality assurance for trends or recurring problems that are indicative of significant problems affecting the ETV program as a whole. Any such situation is immediately communicated to the ETV coordinator. The ETV coordinator shares the information and any corrective actions with the EPA pilot managers.

10.4 The root cause should be determined before permanent preventative measures are planned and implemented. To guard against implementing ineffective changes, EPA personnel ensure when possible that root causes are determined before preventative measures are planned and implemented.

10.5 Quality improvement action

Appropriate actions shall be planned, documented, and implemented in response to findings in a timely manner. In the event that a significant problem is identified that requires a structural change to the ETV program, the ETV coordinator will initiate discussions with appropriate EPA line management to correct the deficiency.

COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA

Part A of the ETV Quality and Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program. Part B of the ETV Quality and Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data. The work of the ETV program at the pilot level is to verify the performance of commercial-ready technologies. As discussed in part A section 4.0, the planning process begins with the Statement of Work (SOW) contained in the Request for Applications (RFA). The successful applicant becomes the verification partner for the pilot.

1.1 Systematic planning of the verification test

All work involving the generation, acquisition, and use of environmental data shall be planned and documented. The type and quality of environmental data needed for their intended use shall be identified and documented using a systematic planning process. The test-specific planning must involve the key users and customers of the data. EPA pilot managers should guide planning activities and ensure that participants are informed of and understand completely the requirements of each test. The programmatic planning for verification of commercial-ready technologies is discussed in part A section 7.1.1. This section continues the discussion of systematic planning at the pilot level. Verification partners, working with the EPA pilot managers, begin a systematic process to plan the individual pilot tests. Systematic planning may be accomplished through any demonstrated technique, including the data quality objectives process (EPA QA/G-4) and the observational method. The planners perform the following actions:
The protocols and test/QA plans describe the experimental approach, with clearly stated test objectives and associated quality objectives for the related measurements.

1.2 Systematic planning for verification testing
1.2.1 Planning personnel

The verification partner shall coordinate test planning among the participating organizations, including EPA, the stakeholders, the vendors, and any testing organizations and laboratories participating in the test. The verification partner, with the concurrence and oversight of the EPA pilot manager, shall identify the planning roles of the various players, and shall conduct planning activities by shared communication via teleconference, video conference, and in-person meetings, as appropriate, and within the constraints of the budget.

1.2.2 Purpose, scope and objectives

The purpose of this testing is to verify the performance of commercial-ready technologies. Another objective is to develop an efficient method for testing commercial-ready technologies. Many of the pilot tests accomplish this objective by preparing generic verification protocols whereby the performance of similar technologies can be verified in the future using the same protocol. The characteristics of individual technologies and the specifics of individual tests are covered in the test/QA plan that incorporates the generic verification protocol by reference.

1.2.3 Data to be collected and design of experiment

During planning of the technology verification test, the process, environmental, laboratory, response, and QA data to be collected are identified. Also identified are testing organizations, test personnel, skill levels, methods, procedures, and equipment unique to each verification test. Planning is integrated into design as discussed in part B section 2.0.

1.2.4 Documentation and reporting

Records generated during the pilot tests are listed in part A section 5.0. Records consist of both paper and electronic records. Electronic methods for storing, retrieving, analyzing, and reporting the data are generally commercially available programs for word processing, spreadsheet, or database processing, or commercial software developed especially for data collection and processing on a specific instrument or piece of equipment. Pilots may also develop software/hardware configurations, as appropriate, in their technology verification tests. The use of computer hardware and software is discussed in part A section 6.0. Paper records such as field notebooks, bench sheets, field data sheets, custody sheets, and instrument printouts are part of the raw data test record and kept with the study records.

1.2.5 Assessments

The assessment tools and minimum frequencies of assessments for the verification tests are identified in part A section 9.0. The definitions of the assessment tools and suggested frequencies are given in part B section 4.0.

1.2.6 Constraints, suspension of work, waste minimization and disposal

Verification partners work under the constraints of time and resources communicated to them by the ETV coordinator and the EPA pilot manager. When constraints are determined by the verification partner to affect quality, the resolution of the problem proceeds as described in part A section 1.5. Circumstances under which work can be suspended are discussed in part A section 1.7. If waste is generated as part of the verification testing, the verification partner seeks to minimize the amount, and disposes of it in accordance with applicable local, state, and federal laws.
2.0 DESIGN OF TECHNOLOGY VERIFICATION TESTS

2.1 Design process

The design shall incorporate those activities pertaining to verification of performance identified during the planning process, establish test specifications, and identify appropriate controls. The design shall include the elements described below.

2.1.1 Design technique

In designing technology performance verification operations, designers use verification testing design techniques, including statistical methods, as appropriate. The design takes into account constraints of time, scheduling, and resources.

2.1.2 Field and laboratory equipment and methods

During the design process, the appropriate field and laboratory equipment that was identified during planning for the testing of the technology verification performance is incorporated. Appropriate test methods and operating parameters are specified.

2.1.3 Sampling and analysis

The design process produces a testing plan based upon the data quality objectives for the verification of the technology performance.
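For instance (a sketch only, not a procedure this plan prescribes), one common output of a data quality objectives calculation of the kind referenced in part B section 1.1 (EPA QA/G-4) is the minimum number of measurements a test must collect. The simplified normal-approximation formula below uses hypothetical inputs:

```python
# Illustrative sketch: a simplified sample-size calculation of the kind
# the DQO process can produce. All numeric inputs are hypothetical.
import math
from statistics import NormalDist

def minimum_samples(sigma, delta, alpha=0.05, beta=0.20):
    """Approximate number of measurements needed to detect a true
    difference `delta`, given measurement standard deviation `sigma`,
    false-positive rate `alpha`, and false-negative rate `beta`:
    n = (z_{1-alpha} + z_{1-beta})^2 * sigma^2 / delta^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    z_beta = NormalDist().inv_cdf(1 - beta)
    n = ((z_alpha + z_beta) ** 2) * sigma ** 2 / delta ** 2
    return math.ceil(n)

# Example: sigma = 2.0 units, and the test must detect a 1.5-unit difference.
print(minimum_samples(sigma=2.0, delta=1.5))  # -> 11
```

A calculation of this sort is how the data quality objectives, rather than convenience or habit, end up determining the sampling effort written into the testing plan.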
2.1.4 Assessments

Assessments incorporated into the design include self-assessments (internal audits) by the verification partner and independent assessments by EPA. The assessments identified in the planning process are incorporated into the design. The type and minimum number of assessments are identified in part A section 9.0. A suggested schedule of assessments is given in part B section 4.0.

2.1.5 Validating, reporting, securing, and archiving data

Data are validated as indicated under Audits of Data Quality in part A section 9.0. Data are reported in ETV verification reports and ETV verification statements. Data records are stored as discussed in part A section 5.0 and in Appendix A.

2.2 Generic verification protocols and test/QA plans: planning documents from the design process

Planning documents from the design process include generic verification protocols and test/QA plans. Writing planning documents is generally a lengthy process involving iterations of review and revision. Authors should be knowledgeable of the activity and the equipment described in the planning documents. Two types of planning documents have been identified as the core documentation needed for operation of an ETV pilot: the generic verification protocol and the test/QA plan. The generic verification protocol is meant to promote uniform testing for a single pilot and, therefore, is considered a more general document. The test/QA plan contains the specific information needed to conduct a verification test.

2.2.1 Generic verification protocols provide the necessary framework for development of the more detailed test/QA plan. The specific content and level of detail given in generic verification protocols will vary greatly between pilots. For some pilots, the generic verification protocol may be so detailed that the test/QA plan may require very little additional information. Conversely, other pilots may use the generic verification protocol to describe the general procedures that guide the pilot. Given the highly variable nature of the generic verification protocol, no specific format has been proposed. The issues that may be addressed in the generic verification protocol are the following:
The QA/QC section of the generic verification protocol typically describes the activities that verify the quality and consistency of the work. Preparation and use of appropriate QA procedures such as QC samples, blanks, split and spiked samples, and performance evaluation (PE) samples to verify performance of the technology being tested can be described. Criteria for success can be included. The frequency of calibrations and QC checks, and the rationale for them, can be described. Procedures for reporting QC data and results can be given. Who or what organization is responsible for each QA activity, and who has the responsibility for identifying and taking corrective action, can be specified. However, if these items vary between tests within a given pilot, the more appropriate document in which to describe them may be the test/QA plan. The protocol may cite documents or procedures that explain, extend, and/or enhance the protocol, such as related procedures, the published literature, or methods manuals. The specific location of any reference not readily available from a full citation in the reference section should be given (as in a facility-specific standard operating procedure) or attached to the protocol.
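As one concrete illustration (the numbers and the 80-120% acceptance window below are hypothetical; actual criteria for success belong in the protocol or test/QA plan), a spiked-sample QC check reduces to a percent-recovery calculation compared against the plan's acceptance limits:

```python
# Illustrative sketch: percent-recovery check on a spiked sample, one of
# the QC activities a generic verification protocol might call for.
# The acceptance window is a hypothetical criterion for success.

def percent_recovery(spiked_result, unspiked_result, spike_amount):
    """Percent recovery of a known spike added to a sample."""
    return 100.0 * (spiked_result - unspiked_result) / spike_amount

def spike_within_criteria(spiked_result, unspiked_result, spike_amount,
                          low=80.0, high=120.0):
    """Flag the result for corrective action if recovery falls outside
    the acceptance window defined in the test/QA plan."""
    r = percent_recovery(spiked_result, unspiked_result, spike_amount)
    return low <= r <= high, r

ok, r = spike_within_criteria(spiked_result=14.2, unspiked_result=4.5,
                              spike_amount=10.0)
print(f"recovery = {r:.1f}% ({'pass' if ok else 'corrective action'})")
# recovery = 97.0% (pass)
```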
The generic verification protocol is incorporated by reference. One reference document available for writing test/QA plans is EPA QA/G-5, Guidance for Quality Assurance Project Plans. If an additional level of detail is required for describing test activities, for example the operation of an instrument, a standard operating procedure may be written and attached to the test/QA plan. The following topics, from EPA QA/G-6, Guidance for Development of Standard Operating Procedures (SOPs), may be included (or a reference provided) in the standard operating procedure:
3.0 IMPLEMENTATION OF PLANNED OPERATIONS

3.1 Implementation of planning

Environmental data operations shall be implemented according to the approved planning documents. Deviations shall be documented, reported to management, and evaluated by management. Approved changes shall be made and distributed to test personnel to replace previous versions of the documents.

Technology performance verifications are implemented according to the generic verification protocols and test/QA plans prepared during planning. During implementation, changes are incorporated, reviewed, and approved according to the scheme discussed in part A section 5.0. Test personnel have access to the approved planning documents, approved changes to planning documents, and all referenced documents. The final protocols are posted on the ETV web page for future use in similar technology verifications.

All implementation activities are documented. Suitable documents include bound notebooks, field and laboratory data sheets, spreadsheets, computer records, and output from instruments (both electronic and hardcopy). All documentation is developed as described in the planning documents. All implementation activities are traceable to the planning documents and to test personnel.

Only qualified and accepted services and items shall be used in the performance verification operations. Acceptance shall be identified on the items themselves and/or in documents traceable to the items. Tools, gauges, instruments, and other sampling, measuring, and testing equipment used for activities affecting quality shall be controlled as required and, at specified intervals, calibrated to maintain accuracy within specified limits. Documentation of calibration shall be maintained and shall be traceable to the equipment. Periodic preventive and corrective maintenance of equipment shall be performed, and equipment shall be recalibrated prior to use.

ETV program services are delivered by the verification partners. The verification partners are accepted via the request for application, proposal, and assistance agreement process discussed in part A section 4.0. Qualified and accepted services and items used in testing are provided for in the verification partners' quality systems. The pilot quality management plan contains provisions for acceptance of services and items and for documentation of acceptance. Control of equipment, calibration to maintain accuracy within specified limits, maintenance, and documentation are the responsibility of the verification partner. The verification partner verifies that the tools, gauges, instruments, and any other sampling, measuring, and testing equipment used for activities affecting quality are controlled as required by the planning documents and calibrated at specified intervals to maintain accuracy within specified limits. Equipment found to be out of specification is not used without documented repair and reassessment of performance. All maintained and repaired equipment is recalibrated as necessary before it is used for measurement work (a sketch of such a calibration-status check follows below). Oversight is the responsibility of EPA during the pilot period and is conducted through review and acceptance of the verification partners' quality system documents and the pilot quality management plan, and through independent audits.
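As an illustration of the calibration control described above, the following minimal sketch gates equipment use on calibration status; the instrument tag, interval, and dates are illustrative assumptions, since actual intervals and limits come from the planning documents.

# Sketch of a calibration-status gate for test equipment. Interval and
# field names are illustrative assumptions; actual intervals and limits
# are set in the planning documents.
from datetime import date, timedelta

class Instrument:
    def __init__(self, tag, last_calibrated, interval_days, in_spec=True):
        self.tag = tag                      # traceable equipment identifier
        self.last_calibrated = last_calibrated
        self.interval = timedelta(days=interval_days)
        self.in_spec = in_spec              # set False when found out of spec

    def calibration_due(self, on=None):
        on = on or date.today()
        return on >= self.last_calibrated + self.interval

    def usable(self, on=None):
        """Equipment may be used only if in spec and within its interval."""
        return self.in_spec and not self.calibration_due(on)

flow_meter = Instrument("FM-07", date(1998, 1, 15), interval_days=90)
if not flow_meter.usable(on=date(1998, 5, 1)):
    print(f"{flow_meter.tag}: recalibrate before use")  # 106 days elapsed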
3.3 Field and laboratory samples

Handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples shall be performed according to required specifications, protocols, or procedures to prevent damage, loss, deterioration, artifacts, or interference. Sample chain of custody shall be tracked and documented.

If samples for analysis are taken in the field, they are to be handled according to procedures in the verification partners' quality systems and the pilot quality management plan. The oversight responsibility of EPA during the pilot phase is to determine that the approved systems and plans contain adequate procedures for handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples to prevent damage, loss, deterioration, artifacts, or interference. The verification partner provides adequate chain of custody procedures, if they are required.

3.4 Data and information management

Data or information management, including transmittal, storage, validation, assessment, processing, and retrieval, shall be performed in accordance with the approved instructions, methods, and procedures. ETV program records and the procedures for handling them are listed in part A section 5.0.

4.0 ASSESSMENT AND RESPONSE

Management system review - Audit of a quality system for conformance to a quality management plan.

Technical systems audit - Qualitative onsite audit of the physical setup of the test. The auditors determine the compliance of testing personnel with the test/QA plan.

Performance evaluation audit - Quantitative audit in which measurement data are independently obtained and compared with routinely obtained data to evaluate the accuracy (bias and precision) of a measurement system.

Audit of data quality - Qualitative and quantitative audit in which data and data handling are reviewed and data quality and data usability are assessed.

Activities performed during technology verification performance operations that affect the quality of the data shall be assessed regularly, and the findings reported to management, to ensure that the requirements stated in the generic verification protocols and the test/QA plans are being implemented as prescribed. The types and minimum frequency of assessments for the ETV program are listed in part A section 9.0. The pilot tests will have at minimum the following types and numbers of assessments:
Additional assessments may be provided for in individual test/QA plans. Assessments by the verification partner will remain steady throughout the pilot period, but independent assessment by EPA will decrease, in keeping with the policy of preparing the pilots to eventually operate independently.

Appropriate corrective actions shall be taken, and their adequacy verified and documented, in response to the findings of the assessments. Data found to have been taken with nonconforming equipment shall be evaluated to determine the impact on data quality. The impact and the action taken shall be documented.

Assessments are conducted according to procedures contained in the verification partners' quality systems or the quality procedures available to EPA personnel, as discussed in part A section 9.0. Findings are provided in audit reports. Responses to adverse findings are required within 10 working days of receipt of the audit report. Follow-up by the auditors and documentation of the response are required.

5.0 ASSESSMENT AND VERIFICATION OF DATA USABILITY

Data obtained during verification tests shall be assessed, verified, and qualified according to their intended use (as verification performance data). Any limitations on this intended use shall be expressed (quantitatively to the extent practicable) and shall be documented in the ETV verification report.

Audits of data quality are used to validate data at the frequency cited in Table 9.1 and are documented in the data audit report. The goal of an audit of data quality is to determine the usability of test results for reporting technology performance, as defined during the design process. Validated data are reported in the ETV verification reports and ETV verification statements, along with any limitations on the data and recommendations for limitations on data usability.

Any data obtained from sources that did not use a quality system equivalent to the E4 standard shall be assessed according to approved and documented procedures. Existing data may be used for planning, subject to the individual rules set up by each pilot. Data collected outside the ETV test and used for verification are subject to rigorous scrutiny according to the procedure in Appendix C.

ETV verification reports containing data and reporting the results of technology verification performance shall be reviewed independently (i.e., by others than those who produced the data or the reports) to confirm that the data and results are presented correctly. These reports shall be approved by management prior to release, publication, or distribution. The procedure for ETV verification report and ETV verification statement review and approval is given in part A section 5.0. ETV verification reports are peer-reviewed, and during the pilot phase ETV verification statements are signed by the EPA laboratory directors.

REFERENCES

Guidance for the Preparation of Standard Operating Procedures (SOPs) for Quality Related Documents, EPA/600/R-96/027. Washington, DC: U.S. Environmental Protection Agency, 1995.

Guidance for Quality Assurance Project Plans, EPA QA/G-5. Washington, DC: U.S. Environmental Protection Agency, 1998.

Simes, G. F., Preparation Aids for the Development of Category II Quality Assurance Project Plans, EPA/600/8-91/004. Cincinnati, OH: U.S. Environmental Protection Agency, 1991.

Guidance for the Data Quality Objectives Process, EPA QA/G-4, EPA/600/R-96/055. Washington, DC: U.S. Environmental Protection Agency, 1994.

Guidance for Data Quality Assessment, EPA QA/G-9, EPA/600/R-96/084. Washington, DC: U.S. Environmental Protection Agency, 1996.
USEPA Office of Research and Development, Environmental Technology Verification Program Verification Strategy, EPA/600/K-93/003. Washington, DC: U.S. Government Printing Office, 1997.

American Society for Quality Control, Energy and Environmental Quality Division, Environmental Issues Group. American National Standard: Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, ANSI/ASQC E4-1994. American Society for Quality, 1994.

ENTICE White Paper, 1994
Science Advisory Board Review Report, 1995
ETV Team and Duties Memorandum, 1995
Budget Memoranda, Fiscal Year 1995
Budget Memoranda, Fiscal Year 1996
Budget Memoranda, Fiscal Year 1997

APPENDIX A

EPA SERIES NO. 003A
U.S. EPA RECORDS CONTROL SCHEDULE

SERIES TITLE: Grants and Other Program Support Agreements
PROGRAM: All Programs, except Superfund Site Specific and Wastewater Construction and State Revolving Fund Grants
EPA SERIES NO: 003A
NARA SCHEDULE NO: N1-412-94-2/1 (Use this number to retire records to the FRC)
APPLICABILITY: Agency-wide

IDENTIFYING INFORMATION:

DESCRIPTION: Includes records that document all types of agreements with other Federal, State, or local government agencies, universities, and other institutions to which EPA is a party and which support EPA's environmental programs (other than Superfund site-specific and wastewater construction grants). Specific types of agreements include assistance agreements, grants, cooperative agreements, interagency agreements, and other types of program support agreements administered by Headquarters or EPA regions and which provide for research, demonstration projects, training, fellowships, investigations, surveys, studies, or other types of program support activities.

Includes: Supporting documentation - Specific types of records include documentation of significant actions and decisions, justifications, cost estimates, scopes of work, correspondence, applications, pre-award reviews, funding decisions, award documentation, commitment notices, transmittal correspondence, agreements, agreement oversight activities, non-compliance/dispute documentation, audit records, closeout documentation for completed agreements, and reports and evaluations resulting from agreements.

Excludes: Final products and deliverables, Superfund site-specific grants and agreements, and wastewater construction grants, which are scheduled separately.

ARRANGEMENT: Arranged by agreement.
FUNCTIONS SUPPORTED: Program operations

SPECIFIC LEGAL REQUIREMENTS: Varies according to program; 40 CFR 30, 31, 35 Subparts A, H, and P, 40, 45-47

DISPOSITION INFORMATION:

FINAL DISPOSITION: Disposable
TRANSFER TO FRC PERMITTED: Yes
FILE BREAK INSTRUCTIONS: Break files immediately after closeout of the agreement.
DISPOSITION INSTRUCTIONS: Keep inactive materials in office at least 1 year after file break, then retire to FRC. Destroy 7 years after file break. If record copy is in microform, break file upon completion of microform quality assurance check. Retire one silver and one diazo copy to the FRC along with finding aids and indexes. Destroy 7 years after file break. Retain one or more sets for office use. Destroy any Agency microform copies when superseded or no longer needed.

APPLICATION GUIDANCE:

REASONS FOR DISPOSITION: The retention period for supporting documentation has been extended because the records are needed in the event of a claim against the Agency. The statute of limitations on such claims is 7 years. Final products and deliverables are covered in EPA 258A.

AGENCY-WIDE GUIDANCE: Final products and deliverables are permanent records and are scheduled as EPA 258A. Agreement closeout is when the Agency determines that all administrative actions and required work are completed (submission of the final expenditure report, SF 269 - Financial Status Report, by the recipient) or when the agreement is terminated or annulled and any disputes settled. Final closeout documentation may consist only of an internal Agency memo. The Grants Administrator (also called the Grants Management Officer), Grant Project Officer, and Financial Management Officer are responsible for the record copies of grant agreement records and for implementing the disposition. Records can include unique program files maintained by the grant project officer or client or technical representative. All other copies may be destroyed when no longer needed.

The following offices and managers are responsible for maintaining a complete record set and dispositioning documents as designated below:

Grants Management Officer (Grants Specialist) - Record copy of applications; reviews and amendments related to the application; administrative review checklist; certifications; agreements and any amendments; award documentation; requests for deviations; stop work orders; documentation relating to termination actions, disputes and appeals, annulments, and audits; legal opinions; financial status reports; increases and decreases; correspondence and other related documents.

Program Office (Project Officer) - Record copy of documents used for day-to-day technical direction of the grant or interagency agreement, such as draft and final products and deliverables; work plans and progress reports; draft documents and comments provided or other records of technical direction. Copies of applications, awards, amendments, and other administrative and financial documents.

Financial Management Officer - Record copy of reimbursement requests, payment vouchers, payment files, and federal cash transaction reports; copies of financial status reports and other related documents.

See EPA 274A for Unsuccessful Grant Application Files. This item does not include Superfund site-specific grants, which are scheduled as EPA 001A, or Wastewater Construction and State Revolving Fund Grants, which are covered in EPA 232A. Contracts are covered under EPA 020A, EPA 055A, EPA 202A, and EPA 258A. The Grants Information and Control System is scheduled as EPA 575A.
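As a worked illustration of the disposition instructions above, the following sketch computes the key dates in the retention timeline for a record copy; the closeout date is an illustrative placeholder.

# Sketch of the retention timeline in the disposition instructions above:
# file break at agreement closeout, at least 1 year in office, then
# retirement to the FRC, with destruction 7 years after the file break.
from datetime import date

def disposition_dates(closeout: date):
    file_break = closeout  # files break immediately after closeout
    earliest_frc_retirement = file_break.replace(year=file_break.year + 1)
    destroy_after = file_break.replace(year=file_break.year + 7)
    return file_break, earliest_frc_retirement, destroy_after

fb, frc, destroy = disposition_dates(date(1998, 5, 31))
print(f"file break:          {fb}")
print(f"retire to FRC after: {frc}")      # keep in office at least 1 year
print(f"destroy after:       {destroy}")  # 7-year statute of limitations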
PROGRAM OFFICE GUIDANCE/DESCRIPTIVE INFORMATION: Previous schedule items combined into this schedule were for the following programs: Federal Activities, Water, Solid Waste, Emergency and Remedial Response, Toxic Substances, Mobile Source, Air and Hazardous Waste, Regional Administrator, Research and Development, Pesticides, Radiation, and Information and Resources Management. Specific item numbers are cited below.

CUSTODIAL INFORMATION:
CONTROL INFORMATION:

RELATED ITEMS: EPA 001A, EPA 020A, EPA 055A, EPA 202A, EPA 232A, EPA 258A, EPA 274A, EPA 575A

PREVIOUSLY APPROVED BY NARA SCHEDULE NOS: NC1-412-75-6/1, NC1-412-76-1/III/14 and 20, NC1-412-76-9/25, NC1-412-77-1/8 and 9, NC1-412-77-4/1, NC1-412-77-5/11, NC1-412-78-10/6b, NC1-412-82-12/11, NC1-412-85-6/8 and 15, NC1-412-85-7/8, NC1-412-85-12/6, NC1-412-85-14/7, NC1-412-85-17/2, NC1-412-85-18/2, NC1-412-85-19/4, NC1-412-85-23/4a, NC1-412-85-25/5a and b, NC1-412-85-26/I/4, N1-412-86-1/8, N1-412-86-3/7

_________________________________________________________
APPENDIX B

The following measures of success are excerpted from the November 1997 ETV meeting: What Constitutes Success for ETV?

Timing
Cost of Operation, Testing, Participation

Customer Satisfaction
Effects
APPENDIX C

ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
EXISTING DATA: POLICY AND PROCESS

Background

The Environmental Technology Verification program was established by the U.S. EPA to verify the performance of commercial-ready technologies for their ability to monitor, prevent, control, or clean up pollution. Verification is accomplished by the evaluation of objectively collected, quality-assured data, which are provided to potential purchasers and permitters as an independent and credible assessment of the performance of a technology. Data are collected and evaluated in partnership with independent third-party verification partners chosen from the public sector (such as states), the private sector (such as non-profit research institutions), federal laboratories, and others. During the pilot phase (1994-2000), EPA provides oversight of the verification partner to assure the credibility of the process and data, and retains the authority for the verification process and decision (except in the case of an independent pilot). After the pilot phase, responsibility and authority transfer to the verification partner.

The ETV program seeks to identify optimal methods to verify environmental technologies without compromising quality. Stakeholder groups, consisting of representatives of major verification customer groups, advise and assist EPA and the verification partners in this effort. One consistent and urgent request has been that existing data, i.e., data collected prior to the ETV program, be used for ETV verification. This suggestion is reinforced by the programs of individual states, as well as those of other countries, that routinely consider previously collected data in the verification of vendor claims for a technology. The purpose of this document is to establish a guideline whereby the ETV program may use these "historical," "existing," or "secondary" data to increase and enhance the scope of individual pilot projects.

POLICY

Currently, under the U.S. ETV program, the verification partner and the technology developers typically plan and execute the tests that provide the objective and quality-assured data by which the environmental technologies are evaluated. Existing data are used to support test plan development. Measurements and data are collected in a demonstration of the technology by the developer, under the direction of the verification partner, and overseen by EPA. Reports are peer-reviewed and verification statements are issued. In this closely monitored scenario, the origin and quality of the data upon which the verification statement rests are generally known and documented, so the possibility of verification decision error is minimized. The consequences of a serious verification decision error can include verification of fraudulent claims, litigation, and loss of credibility for the ETV program, the verification partners, and EPA.

Compelling arguments exist for using certain qualified existing data to replace some or all of the verification testing for a given technology. Some technologies are time-consuming and expensive to evaluate. Due to resource constraints, demonstrations can, at best, show the performance of the technology under only limited conditions. A test may provide only one small performance snapshot in time, as opposed to several years of performance data collected by the developer or its customers under a full range of conditions.
Limited resources may require that testing focus on only one component of a technology rather than its full range of capability. Before reaching the commercially viable stage of development, these technologies may have been tested numerous times with acceptably reproducible results.

Judicial precedent provides an argument for the defensible use of existing data. In Daubert v. Merrell Dow Pharmaceuticals, Inc., the Supreme Court in 1993 adopted a new standard for the admissibility of scientific evidence. The Court held that Federal Rule of Evidence 702 requires that, when presented with proposed scientific testimony, the district court must make a preliminary assessment of whether the reasoning or methodology underlying the testimony is scientifically valid, and therefore reliable. The Court declined to adopt a definitive checklist or test, but noted several factors a court should consider: (1) does the theory or technique involve testable hypotheses; (2) has the theory or technique been subject to peer review and publication; (3) are there known or potential error rates, and are there standards controlling the technique's operation; and (4) is the method or technique generally accepted in the scientific community? The court must also consider the relevance or fit of the proposed testimony by determining whether the reasoning and methodology can properly be applied to the facts at issue.

The Clean Air Act Credible Evidence Revisions (see Federal Register, Vol. 62, No. 36, February 24, 1997) provide precedent within the Agency for defensible consideration of existing data for verification use. These revisions clarify that data from methods that are not EPA Standard Reference Methods can be used in enforcement actions and for compliance certification. Conversely, emission sources will be able to use any credible evidence (ACE) in contesting allegations of noncompliance in enforcement actions. As the rule states, it "exemplifies EPA's common sense approach to environmental protection, which encourages smarter, cheaper and more flexible means of achieving environmental goals without compromising the fundamental health and environmental protections provided by federal environmental laws." It follows that if EPA can use ACE for enforcement actions, such evidence can be considered for verification.

Other precedent within the Agency exists at the Office of Air Quality Planning and Standards (OAQPS). OAQPS uses secondary data, defined as data utilized for a purpose other than that for which they were initially collected, in its regulatory efforts. To focus its quality assurance (QA) efforts effectively within the constraints of available resources, OAQPS concentrates its consideration of secondary data according to category of project. The QA activities associated with evaluating secondary data are conducted to assure that the data will be adequate and sufficient for their planned secondary use.

Recognizing, therefore, that it is neither prudent nor cost-effective to ignore existing data, the ETV program establishes by this document a consistent process to evaluate these data for the extent of their credibility and usability in the verification decision. Data to be considered for use to replace verification testing undergo a rigorous process of evaluation using stringent criteria. The following guidelines are used to qualify existing data for verification purposes (detailed procedures follow in the "process" section of this document):
Recognizing that useful data exist which will not qualify for verification under these guidelines, and responding to customer needs, individual pilots may establish individual evaluation criteria by which existing data may be considered. These data may not be used directly for verification, but may be used, for example, to support planning or to augment verification testing. No ETV program-wide guidelines are necessary for the use of existing data for purposes other than verification.

PROCESS

Identifying and Qualifying the Data

The vendor proposes the data to be evaluated. EPA and the verification partner shall (with input from the stakeholder group, as applicable) identify for the vendor the procedures and acceptance criteria used in the pilot demonstrations to evaluate technology performance. These procedures and criteria are the same as those used for other technologies evaluated by the verification partner. The data requirements are developed by EPA, the verification partner, and interested stakeholders for the pilot, and are not specific to the existing data.

The vendor and verification partner perform the initial evaluation. The vendor shall provide the verification partner with the detailed protocols and test plans used to develop the existing data. The vendor shall identify those data that he believes will meet the acceptance criteria, qualify those data, and submit the data along with detailed evidence that the data meet the requirements of the pilot project. The evidence shall be submitted to EPA and the verification partner in a detailed report. The report shall show how the data verify the performance of the technology, identify data that were excluded, explain how and why they were excluded, and address other requirements specific to the pilot project. The vendor shall be prepared to provide all of the raw data.

The verification partner shall review the planning documents to determine whether they meet the requirements of those being used by the verification partner for evaluation tests of other technologies. At a minimum, the existing data protocols and test plans shall require the same level of QA/QC, replicate tests, data treatment, and reporting as that required by the verification partner in its technology demonstrations. The verification partner shall conduct a detailed review of the vendor's data report to determine whether the data adequately evaluate the performance of the technology. The verification partner has access to the raw data and works through a reasonable random sample (a suggested 10% of the data). A recommended method for evaluation of the data is tracing a random selection of data points from the raw data set to the final report (a sketch of such a sampling step appears at the end of this appendix).

Minimum General Acceptance Criteria
Specific Acceptance Criteria

In addition to the general acceptance criteria, the specific pilot project stakeholders may impose specific acceptance criteria, which must be as stringent as the acceptance criteria for the data collected during verification testing.

Convening the Data Evaluation Panel

If the verification partner determines that the report does not adequately evaluate the performance of the technology, the vendor is notified and no further action is required. If the verification partner determines that the vendor's report does adequately evaluate the performance of the technology, then a data evaluation panel (DEP) is appointed. The verification partner enlists the services of three qualified reviewers to serve on the DEP. During the pilot phase of the ETV program, the DEP will generally consist of one person from EPA, one person from the verification partner, and one outside expert in the technology being evaluated. The DEP must contain members who are credible, experienced, knowledgeable, and qualified in the technical areas critical to the technology being evaluated. The members of the DEP must be objective and have no real or perceived conflict of interest with the commercial developer of the technology they are evaluating. DEP members must be independent; they cannot have been involved in the collection of the data being evaluated.

Evaluation of the Data by the DEP

The DEP reviews and agrees on the acceptance criteria and determines their applicability to the data to be evaluated. The evaluation shall follow the procedures and criteria developed by the verification partner and EPA for other technology verifications conducted in the pilot project. The verification partner provides a written summary of its review to the DEP. When the verification partner submits the data to the DEP, the data cease to be proprietary. The DEP reviews and evaluates the data using the applicable acceptance criteria. The DEP determines whether the data were gathered following appropriate test protocols similar to the protocol used for verification testing. It ensures that the data were gathered following written test plans developed using a similar protocol. Planning must have included specific test objectives, experimental design, criteria for data quality, QA/QC procedures followed and reported, number of samples or frequency of sampling, and sampling and analytical procedures. The DEP must determine that the data quality meets or exceeds the minimum data quality requirements of the verification testing conducted during the pilot. The quality and usability of the existing data shall be evaluated against clearly defined data quality requirements based on those of the ETV pilot project. The data shall be sufficient to evaluate the performance of the technology.

Recommendations for Acceptance of Data for Verification Role

The DEP shall prepare a report on its findings. At a minimum, the report must address the following:
The DEP provides a written statement of the performance of the technology as shown by the data, a statement of how well the data meet the acceptance criteria, and a data acceptance recommendation.

Review and Acceptance of Recommendation by Verification Partner and EPA

The EPA reviews the report, determines whether to accept the data acceptance recommendation, and signs the verification statement.

____________________

@ Testing entities having a quality system modeled after the American National Standards Institute/American Society for Quality Control (ANSI/ASQC) Standard E4-1994, Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, or the International Organization for Standardization (ISO) 9000 standards, Quality Management and Quality Assurance Standards: Guidelines for Selection and Use, are likely to have appropriate quality systems. Other similar quality systems may be accepted at the discretion of the reviewers.

____________________
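The following is a minimal sketch of the raw-data tracing step described under "Identifying and Qualifying the Data" above: drawing a reproducible random sample (the suggested 10%) of records to trace from the raw data set to the final report. The record identifiers, sample-size handling, and seed are illustrative assumptions.

# Minimal sketch of the raw-data tracing step described in the Process
# section: draw a reproducible random sample (the suggested 10%) of raw
# data records to trace through to the final report. Record IDs and the
# seed are illustrative placeholders.
import math
import random

def tracing_sample(record_ids, fraction=0.10, seed=42):
    """Return a sorted, reproducible random sample of record IDs."""
    rng = random.Random(seed)          # fixed seed so the audit is repeatable
    n = max(1, math.ceil(fraction * len(record_ids)))
    return sorted(rng.sample(record_ids, n))

raw_records = [f"RUN-{i:04d}" for i in range(1, 251)]  # 250 hypothetical records
for rec in tracing_sample(raw_records):
    # Each sampled record would be traced from the raw data sheets through
    # the calculations to the value reported in the ETV verification report.
    print(rec)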