The FAA and EDS manufacturers should maintain the expertise needed to fully implement configuration-management concepts. Several commercially available software-based configuration-management tools are applied in technology-intensive industries.
The panel believes that the use of software-based configuration-management tools would facilitate tracking changes to the configuration of a CI, a CSCI, or a system baseline.
However, software-based configuration-management tools developed internally by manufacturers may also be appropriate, provided they enable version control. The FAA and the manufacturers of explosives-detection equipment should implement and maintain expertise in software-based configuration-management tools as part of their management plan.
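To make the version-control idea concrete, the following is a minimal sketch of how such a tool might record baseline changes to a CI or CSCI. It is an illustration only: the class names, fields, and hash-chaining scheme are assumptions, not features of any FAA or manufacturer system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib

@dataclass(frozen=True)
class BaselineEntry:
    """One versioned change to a configuration item (CI) or CSCI."""
    item_id: str        # e.g., "CSCI-detection-sw" (hypothetical identifier)
    version: str        # e.g., "2.3.1"
    change_summary: str
    approved_by: str
    timestamp: str

class ConfigurationBaseline:
    """Append-only change history, hash-chained so tampering is evident."""
    def __init__(self) -> None:
        self._entries: list[tuple[BaselineEntry, str]] = []

    def record(self, entry: BaselineEntry) -> str:
        prev = self._entries[-1][1] if self._entries else "0" * 64
        digest = hashlib.sha256(f"{prev}|{entry}".encode()).hexdigest()
        self._entries.append((entry, digest))
        return digest

    def current_version(self, item_id: str) -> str | None:
        for entry, _ in reversed(self._entries):
            if entry.item_id == item_id:
                return entry.version
        return None

baseline = ConfigurationBaseline()
baseline.record(BaselineEntry("CSCI-detection-sw", "2.3.1",
                              "tightened alarm threshold", "QA lead",
                              datetime.now(timezone.utc).isoformat()))
print(baseline.current_version("CSCI-detection-sw"))  # -> 2.3.1
```

The hash chain is one simple way to make a change history auditable; a commercial tool would add access control and richer approval workflow on top of the same idea.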
The FAA has defined and prioritized the threats to aviation security and has defined an operational test protocol for the radiographic x-ray scanners used for screening carry-on baggage and for passenger-screening metal-detecting portals. However, it has not developed performance requirements that are testable in airports or at manufacturing sites, nor adopted an airport performance-verification protocol for automated explosives-detection equipment. Developing such performance requirements and test protocols would allow for clear communication among the FAA, the manufacturers, and end users. The test plan outlined in a National Research Council report (NRC) could serve as a model for a test protocol appropriate for use in an airport or at a manufacturing site.
Such a protocol should include several tiers of testing. The FAA should require a wide variety of tests for maintaining EDS certification, including qualification testing and periodic verification testing of detection performance levels using a secondary standard bag set, frequent monitoring of critical system parameters using test articles, and continuous self-diagnosis of subsystem parameters.
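One way to see how these tiers relate is to lay them out with their test objects and cadences. The sketch below does so in Python; the cadences are illustrative assumptions, not FAA requirements.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestTier:
    name: str
    test_object: str   # what the tier tests with
    cadence: str       # assumed cadence, for illustration only

# Tiers paraphrased from the protocol above; cadences are assumptions.
PROTOCOL = [
    TestTier("qualification testing", "secondary standard bag set", "before deployment"),
    TestTier("verification testing", "secondary standard bag set", "periodic"),
    TestTier("critical-parameter monitoring", "test articles", "frequent, e.g. daily"),
    TestTier("subsystem self-diagnosis", "built-in instrumentation", "continuous"),
]

for tier in PROTOCOL:
    print(f"{tier.name}: tested with {tier.test_object} ({tier.cadence})")
```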
In the sections that follow, a protocol is discussed to provide guidelines for developing performance-verification procedures for certified and noncertified explosives-detection equipment. Certification testing determines the integrated performance of all of the operational subsystems of equipment under examination. During certification, explosives-detection equipment is tested as a monolithic entity—without testing individual components. The panel believes that this mode of testing results in a limited amount of information regarding how modifications to operational subsystems and components could affect the performance of the explosives-detection equipment.
Testing of appropriate parameters, however, could provide such information. Early identification of operational subsystems will increase the efficiency and rigor of performance verification, particularly with respect to factory testing.
For x-ray CT-based equipment, examples of operational subsystems include the illuminator—that is, the x-ray generator composed of a high-voltage generator and an x-ray source—and the detector, which measures the modulation in the x-ray profile exiting the scanned bag.
The relationship between component parameter values, the performance of operational subsystems, and, ultimately, overall detection performance must be known for such tests to be effective. A variety of test objects is used to evaluate the performance of explosives-detection equipment.
These objects range from test articles that measure a critical system parameter (an example is described in Appendix F), to materials that simulate explosives, to the primary standard bag set, which contains real explosives and is used to determine the detection performance of explosives-detection equipment (see the accompanying table). The ability to test with a simulated explosive, that is, without the special safety-related handling permits and precautions that real explosives require, is attractive.
However, the nature of a simulated explosive is dependent on the explosive being simulated. The secondary standard bag set consists of representative passenger bags, some containing simulants representing explosives at threat quantity; it simulates the primary standard bag set but requires no special safety-related handling permits or precautions. Currently, only explosives are used during certification testing of explosives-detection equipment, and the availability of FAA-validated secondary standards is limited. The FAA certification process, however, provides an ideal opportunity to correlate, for a particular piece of explosives-detection equipment, technical test data obtained using explosives with data obtained using secondary standards.
Furthermore, it is the opinion of the panel that the FAA is responsible for ensuring the availability of appropriate secondary standard materials and should continue to work toward that goal. Prior to FAA validation of secondary standard materials developed by equipment manufacturers, the FAA should require manufacturer validation of such materials as discussed in paragraph 2.
Furthermore, the FAA should validate, or arrange for independent validation of, all simulants to be used for qualification testing and verification testing, regardless of who developed them, according to the guidelines presented in Annex II of Detection of Explosives for Commercial Aviation Security (NRC). For qualification and verification testing, the FAA should work with EDS manufacturers and users to develop a secondary standard bag set for each specific technology or technology class.
The secondary standard bag set should be developed, retained, and controlled by the FAA personnel responsible for the conduct of the certification test and evaluation, and it should be used in periodic verification testing. The secondary standard bag set should be controlled to assure reliable test results, as is done by the FAA for the primary standard bag set. It is important that the FAA periodically (on the order of the lifetime of the simulants) verify the condition, configuration, and performance of the secondary standard bag set.
All data generated by the use of the secondary standard bag set should be collected, analyzed, reported, and maintained by the FAA. A proposed test protocol based on the secondary standard bag set is presented in Appendix F. The performance of explosives-detection equipment may be inferred by testing critical system parameters using a test article or by continuously monitoring subsystem parameters (see the accompanying table). A critical system parameter is a parameter that is fundamental to the performance of explosives-detection equipment.
Such a parameter is directly related to the ability of explosives-detection equipment to detect an explosive. For the case of a CT-based imaging system, these parameters include the macrosystem transfer function, spatial resolution, and background noise. Test articles exist to test critical system parameters for many explosives-detection technologies as a result of their use in other applications—for example, CT in the medical field.
In addition to the critical system parameters, there are subsystem parameters: measurable parameters that indicate the operational consistency of explosives-detection equipment (e.g., x-ray tube potential and current).
There is a rich history in the medical imaging community of testing subsystem parameters associated with specific components of medical CT systems. Most manufacturers perform tests of the x-ray tube potential and the x-ray tube current. Inappropriate calibration of these two subsystem parameters can cause errors in the output of the complete CT system, particularly with respect to any quantitative determination of the x-ray attenuation coefficient, a critical system parameter used to characterize the objects being imaged. Testing, monitoring, and evaluating the accuracy of the x-ray tube potential and the x-ray tube current are as important in CT-based explosives-detection equipment as they are in a medical CT system.
Tests on other subsystem parameters, such as voltage and noise levels associated with the x-ray detector—a separate component from the illuminator—are also incorporated in most factory test programs for medical CT systems and, therefore, could also be considered for CT-based explosives-detection equipment. For monitoring the performance of EDSs, the FAA should work with manufacturers to develop a set of critical system parameters and their tolerances that could be monitored frequently and recorded to track changes in performance during normal operations or to verify performance after maintenance or upgrading.
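As a concrete illustration of such frequent monitoring, the sketch below checks recorded subsystem readings against tolerance bands. The nominal values and tolerances are hypothetical; real ones would come from the manufacturer and the FAA-agreed parameter set.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ParameterSpec:
    name: str
    nominal: float
    tolerance: float   # allowed deviation from nominal, same units

# Hypothetical nominals and tolerances, for illustration only.
SPECS = [
    ParameterSpec("x-ray tube potential (kV)", 140.0, 2.0),
    ParameterSpec("x-ray tube current (mA)", 200.0, 5.0),
]

def out_of_tolerance(spec: ParameterSpec, measured: float) -> bool:
    """True when a reading falls outside its accepted range."""
    return abs(measured - spec.nominal) > spec.tolerance

readings = {"x-ray tube potential (kV)": 143.1,
            "x-ray tube current (mA)": 199.2}
for spec in SPECS:
    value = readings[spec.name]
    if out_of_tolerance(spec, value):
        print(f"ALERT: {spec.name} = {value}, outside "
              f"{spec.nominal} +/- {spec.tolerance}")
```

Logging every reading, not just the alerts, is what allows drift to be tracked during normal operations and performance to be verified after maintenance or upgrading.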
Certification testing determines the integrated performance of all operational subsystems of explosives-detection equipment. During certification, explosives-detection equipment is tested as a monolithic entity, without testing individual subsystems or components. The panel concluded that this mode of testing yields only limited information about how modifications to operational subsystems and components could affect the performance of the explosives-detection equipment.
Testing of appropriate parameters, however, can provide such information. Recommendation: The FAA should verify critical system parameters during certification testing. Note that validation of critical and subsystem parameter ranges may require monitoring the correlation of these ranges with equipment performance over time, even after deployment. In this context, system performance pertains to the ability of the equipment to detect the relevant explosive compositions and configurations.
Therefore, parameter values measured outside of accepted ranges should trigger testing the equipment with the secondary standard bag set in the field. The accompanying figure illustrates such a verification testing process. The FAA needs qualification testing, verification testing, and monitoring protocols to verify the performance of deployed EDSs, because it is not practical to duplicate certification testing in airports or at manufacturing sites, and because there is a need to determine that an EDS is operating properly on a daily basis.
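The escalation rule just described, that out-of-range parameters trigger a field test with the secondary standard bag set, can be sketched as follows. The pass criteria and parameter ranges are placeholders, and `run_bag_set_test` stands in for an actual field test.

```python
def verify_equipment(readings, specs, run_bag_set_test):
    """Sketch of the escalation logic: parameter drift triggers a
    secondary-standard bag set test; a failed bag set test takes the
    unit out of service pending corrective action.
    `run_bag_set_test` is a hypothetical callable returning (p_d, p_fa)."""
    drifted = [name for name, (low, high) in specs.items()
               if not low <= readings[name] <= high]
    if not drifted:
        return "in service: all parameters within accepted ranges"
    p_d, p_fa = run_bag_set_test()       # field test with the bag set
    if p_d >= 0.95 and p_fa <= 0.30:     # placeholder pass criteria
        return f"in service: {drifted} drifted, but bag set test passed"
    return f"out of service pending corrective action; drifted: {drifted}"

specs = {"tube_kv": (138.0, 142.0), "tube_ma": (195.0, 205.0)}  # hypothetical
readings = {"tube_kv": 143.1, "tube_ma": 200.0}
print(verify_equipment(readings, specs, lambda: (0.97, 0.22)))
```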
The first step in the development of a qualification, verification, and monitoring testing protocol to verify the performance of deployed bulk explosives-detection equipment is the realization that field performance requirements cannot easily be related directly to the FAA certification standard. To relate the two, the FAA must first define a primary standard. For example, the bag set used for certification testing of an EDS may serve as this primary standard.
Given the assumption that the primary standard bag set accurately reflects the "real threat," there is a level of uncertainty or risk involved in modeling the primary standard bag set with a secondary standard bag set. The next logical step, testing critical parameters with test articles, introduces further uncertainty, as does continuous monitoring of appropriate subsystem parameters (see the accompanying figure on monitoring and verification testing for certification maintenance). As the figure indicates, the further removed a test method is from the real threat, the greater the uncertainty about the ability of the test method to effectively measure the performance of the EDS.
The practicality of conducting performance verification, however, increases along with the degree of uncertainty. In the context of performance verification, practicality reflects test duration and difficulty and the likelihood that the test will not disrupt airport and airline operations. That is, the less obtrusive the performance-verification test, the more practical it is. Furthermore, the figure indicates that each successive level of performance verification depends on the previous level.
For example, the quality of the primary standard bag set, and analogously the secondary standard bag set, depends on the understanding of the real threat. Similarly, the efficacy of testing a critical parameter with a test article depends on the secondary standard bag set, which is needed to resolve a potential problem detected by such a test.
In this light, it is apparent that the FAA needs to develop a performance-verification protocol for airport testing of an EDS that incorporates more than one level of testing. For example, daily diagnostic testing of critical parameters of an EDS could be augmented by monthly testing with simulated explosives hidden in luggage or an annual check with explosives; a scheduling sketch follows below. Regardless of the performance-verification protocol established, external developments such as emerging explosives-detection technologies and changing threats to aviation security should be reflected in appropriate changes to primary standards, secondary standards, and diagnostic tests.
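A layered cadence like the example above could be tracked with logic as simple as the following. The intervals mirror the daily/monthly/annual example in the text but are fixed-day simplifications, and the level names are made up.

```python
from datetime import date, timedelta

# Intervals follow the daily/monthly/annual example above (simplified).
LEVELS = {
    "critical-parameter diagnostics": timedelta(days=1),
    "simulant bags hidden in luggage": timedelta(days=30),
    "check with real explosives": timedelta(days=365),
}

def tests_due(last_run: dict, today: date) -> list:
    """Return the verification levels whose interval has elapsed."""
    return [level for level, interval in LEVELS.items()
            if today - last_run[level] >= interval]

last_run = {
    "critical-parameter diagnostics": date(2024, 1, 31),
    "simulant bags hidden in luggage": date(2024, 1, 1),
    "check with real explosives": date(2023, 6, 1),
}
print(tests_due(last_run, date(2024, 2, 1)))
# -> the daily and monthly levels are due; the annual check is not yet
```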
Figure: Schematic representation of the relationship between various levels of performance-verification test objects and the "real threat," and the relative practicality and degree of uncertainty associated with them.
The first option, testing critical system parameters with test articles, would be useful for daily calibration and diagnostic testing of an EDS. This approach, however, will provide only inferential information regarding the capacity of the EDS for detecting explosives. The adequacy of this information for predicting detection performance depends on the proper choice and understanding of the critical parameters measured.
This type of testing will not provide a quantitative measure of performance relative to the certification criteria (i.e., detection and false-alarm probabilities). Therefore, the panel recommends that the second, more rigorous approach, testing with the secondary standard bag set, be used periodically to yield the performance probabilities P_D and P_FA and thereby correlate performance-verification testing more directly with certification specifications. In addition to monitoring the detection performance of an EDS, it is important to monitor the false-alarm rate and the baggage throughput rate of the system.
These quantities can be monitored continuously as passenger baggage passes through the system. However, if the false-alarm rate is determined in this manner to be unacceptably high, the system should not be taken off-line.
Rather, this situation warrants testing the system with the secondary standard bag set to determine if the false-alarm rate is within specifications. If the system is found—with the secondary standard bag set—to have a false-alarm rate that is outside of specifications, the FAA, the user, and the manufacturer should develop a plan to correct this problem.
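A sketch of that continuous monitoring follows: a rolling false-alarm rate is estimated over the live baggage stream, and exceeding the specification flags the unit for a secondary-standard bag set test rather than removing it from service. The window size and specification rate are hypothetical.

```python
import random
from collections import deque

class FalseAlarmMonitor:
    """Rolling alarm-rate estimate over the most recent passenger bags.
    Exceeding the spec flags the unit for a bag set test; per the text,
    it does not take the unit off-line."""
    def __init__(self, spec_rate: float = 0.25, window: int = 1000):
        self.spec_rate = spec_rate            # hypothetical specification
        self.outcomes = deque(maxlen=window)

    def record_bag(self, alarmed: bool) -> None:
        self.outcomes.append(alarmed)

    def needs_bag_set_test(self) -> bool:
        if len(self.outcomes) < self.outcomes.maxlen:
            return False                      # wait for a full window
        rate = sum(self.outcomes) / len(self.outcomes)
        return rate > self.spec_rate

monitor = FalseAlarmMonitor()
random.seed(0)
for _ in range(2000):                         # simulated baggage stream
    monitor.record_bag(random.random() < 0.30)    # ~30% alarm rate
print(monitor.needs_bag_set_test())           # almost surely True here
```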
Similarly, if the baggage throughput rate drops to an unacceptably low level, the FAA, the user, and the manufacturer should develop a plan to correct this problem.

Noncertified explosives-detection equipment comprises bulk explosives-detection equipment that has not met FAA certification requirements and trace-detection devices, for which there are no FAA-defined certification criteria.
As a result of the Federal Aviation Reauthorization Act of 1996 (Public Law 104-264), noncertified explosives-detection equipment has been, and will continue to be, deployed by the FAA in U.S. airports. The panel believes that the FAA has the responsibility for determining the performance capabilities of all equipment deployed under that law and for establishing a plan for maintaining the determined level of performance in the field.
To date, the FAA has not established formal performance specifications for noncertified explosives-detection equipment. The FAA has, however, developed a baseline test that is used to determine which noncertified bulk explosives-detection equipment will be deployed (FAA). The FAA tests explosives-detection equipment against the same categories and amounts of explosives that are used in certification testing, as well as amounts greater than and less than those used during certification testing, to determine a performance baseline for that equipment.
A field test protocol for the deployed explosives-detection equipment may involve simulants for each of the categories in a secondary standard bag set, the expectation being that the average P_D and P_FA would be similar to those obtained when explosives were used. For bulk noncertified systems and devices, the panel recommends such a field test protocol. Note that the bags that compose the primary and secondary standard bag sets will not necessarily be representative of those being processed at any one airport.
It is likely that the daily P_FA for actual passenger baggage at a particular airport will vary from that determined using the secondary standard bag set. Therefore, the daily P_FA should not be used for comparison against certification requirements for the purpose of disqualifying a deployed system from service.
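For reference, this is how P_D and P_FA would be estimated from one pass of a standard bag set; the counts below are made up for illustration.

```python
def detection_stats(threat_alarmed: int, threat_total: int,
                    clean_alarmed: int, clean_total: int):
    """Point estimates of the detection probability (P_D) and the
    false-alarm probability (P_FA) from one bag set pass."""
    p_d = threat_alarmed / threat_total   # alarms on bags containing threats
    p_fa = clean_alarmed / clean_total    # alarms on threat-free bags
    return p_d, p_fa

# Made-up counts for a hypothetical secondary standard bag set run.
p_d, p_fa = detection_stats(threat_alarmed=47, threat_total=50,
                            clean_alarmed=28, clean_total=150)
print(f"P_D = {p_d:.2f}, P_FA = {p_fa:.2f}")   # P_D = 0.94, P_FA = 0.19
```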
The FAA's protocol for evaluating trace explosives-detection devices uses extremely small, known quantities of each category of explosives on a variety of substrates, with some substrates intentionally left uncontaminated. Several trace explosives-detection devices were shown by the FAA to be capable of finding explosive materials on the surface of various types of carry-on luggage. These tests, however, do not distinguish between the capabilities of the machine and the ability of the operator to locate and adequately sample the contaminated surface. In contrast to bulk detection methods, trace-detection systems are more likely to suffer from false negatives (i.e., failing to alarm on explosive material that is actually present).
Furthermore, for trace-detection devices, sample collection is so operator dependent that the performance of a trace device is directly tied to the performance of the human operator. The panel determined that a performance-verification testing protocol for trace explosives-detection devices should separately test each of the three tasks involved in trace detection.
In the preceding discussions, specific recommendations were made regarding performance verification and configuration management. However, performance verification and configuration management are elements of a broader quality system. A quality system structure should be put in place for oversight, validation, verification, and management of configuration management and performance-verification activities throughout the life cycle of explosives-detection equipment.
Every stakeholder must have a quality system in place that includes oversight, validation, verification, and procedures for configuration management and performance verification covering that stakeholder's responsibilities throughout the life cycle of an EDS. Each stakeholder must have a documented and auditable quality system that governs its specific responsibilities in the manufacture, deployment, operation, maintenance, and upgrading of EDSs.
To provide effective aviation security, each of the stakeholders should have a quality system in place. As the regulator of U.S. aviation security, the FAA must ensure that the quality systems of all stakeholders are adequate to their roles. For example, the manufacturers of explosives-detection equipment need a quality system that deters changes that unintentionally degrade performance without stifling innovative product improvements. Air carriers, airports, and other end users need a quality system that ensures that baggage-handling facilities operate smoothly and that proper detection performance can be demonstrated whenever the FAA requests it.
In addition, to ensure confidence in its testing protocols, procedures, data handling, and test objects, the FAA must have its own quality system. Any of the quality standards considered by the panel could provide the framework for developing quality systems that meet individual stakeholders' needs.
Because there is already a global movement toward using the ISO 9000 series of quality system standards, the FAA should base both its in-house quality system and its requirements for other quality systems on these standards. The FAA should accept the quality system of any stakeholder that has the attributes described above: the system must be documented and auditable, and it must cover the stakeholder's responsibilities throughout the life cycle of an EDS.
It is the opinion of the panel that diligent adherence by the FAA to an auditable quality system (e.g., one based on the ISO 9000 series) would strengthen confidence in its certification and test activities. Finally, an auditable quality system would provide a mechanism to determine the FAA's conformance to its own management and test and evaluation plans. The FAA should implement and document its own quality system under which precertification activities, certification testing, test standards development and maintenance, and testing for maintaining certification are conducted.
For a quality system to be effective in a manufacturing environment, the critical manufacturing steps must be defined. An ISO audit, for example, verifies that a manufacturer is following its quality system but does not validate that the quality system results in the production of a certifiable EDS.
The FAA should ensure that each stakeholder periodically receives an audit of its quality system, including where applicable a configuration audit, by an independent third-party auditor. The FAA also purchases noncertified equipment for airport demonstrations and operational testing. As purchaser, the FAA should require that the manufacturers of noncertified equipment have in place a quality system that meets the same standards as the quality systems of manufacturers of certified equipment.
When noncertified equipment is purchased by the FAA for airport demonstrations, operational testing, or airport deployment, the FAA should require that its manufacturers demonstrate the same level of quality system, covering both manufacturing and upgrading, as is required of EDS manufacturers.
The FAA should ensure that the manufacturers of noncertified explosives-detection equipment purchased by the FAA periodically receive an audit of their quality system, including a configuration audit, by an independent third-party auditor.