Environmental Stress Screening (ESS) is an often-misunderstood tool of the reliability practitioner. ESS must be thought of as a process rather than a test: there is no accept/reject criterion, and failures are welcomed.
ESS Programs, which are applied during the development and production phases, can yield significant improvements in field reliability and reductions in maintenance costs. Application during development can reap significant savings in test time and costs as a result of eliminating or reducing the number of latent defects prior to qualification tests. The benefits for the manufacturer include: insights into the sources of reliability problems in the product or process, better control of rework costs, decreased warranty costs and the opportunity to determine corrective actions that eliminate the sources of reliability problems from the product or process.
This START sheet addresses the steps necessary to implement an effective ESS Program.
ESS is a process in which environmental stimuli, such as rapid thermal cycling and random vibration, are applied to electronic items in order to precipitate latent defects to early failure. An equally important and inseparable aspect of the screening process is the item's electrical testing, performed as part of the screen, to detect and properly identify the defects that have been precipitated to failure.
Contrary to popular belief ESS does not increase the inherent reliability of a product. The inherent reliability of a product is driven primarily by the design. ESS is not a substitute for but an integral part of a sound reliability program conducted during the design and development phases.
There are three phases in the development of an ESS Program:
ESS Planning: Identify the equipment to be screened, develop quantitative goals for the ESS Program, and describe initial screens.
ESS Implementation: Identify the organizational elements that will be responsible for conducting the screening activity, and the Failure Reporting And Corrective Action System (FRACAS) to be used for documenting failures.
ESS Monitoring: Continuously monitor the screening process to ensure that it is both technically and cost effective.
Historically, two basic approaches have been used in applying stress screens. In one approach, the customer explicitly specifies the screens and screening parameters to be used. In the second and preferred approach, the contractor develops a screening program that is tailored to the product.
The tailored approach requires: 1) an estimate be made of the initial part and manufacturing type latent defects present in the equipment, 2) a determination of the maximum allowable latent defects present in the equipment after ESS, and 3) the development of screens that have a sufficient screening strength based on 1) and 2). A block diagram depicting this approach is found in Figure 1.
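The three tailoring steps reduce to a simple relationship between incoming and allowable remaining defects. A minimal sketch (the function name and a worked example are illustrative, not taken from MIL-HDBK-344):

```python
def required_screening_strength(initial_defects, allowable_remaining):
    """Fraction of incoming latent defects the screen must remove.

    SS = (D_in - D_out) / D_in, where D_in is the estimated number of
    latent defects entering the screen and D_out is the maximum number
    allowed to remain afterward.
    """
    if initial_defects <= 0:
        raise ValueError("initial defect estimate must be positive")
    return 1.0 - allowable_remaining / initial_defects

# Example: 100 estimated incoming defects, at most 5 may remain
# -> the screen set must achieve a combined strength of 0.95.
ss = required_screening_strength(100, 5)
```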
Typical latent defects include cold solder joints, broken/damaged wires, loose hardware, ESD damage, and handling damage. The best way to estimate the initial number of latent defects is from company experience data. This experience data includes history on other programs, qualification test results, and field failure data. MIL-HDBK-344 ESS of Electronic Equipment also presents a method for determining initial latent defects based on system complexity.
Figure 1. ESS Program Sequence of Events
No ESS program is perfect and some latent defects may remain in equipment after screening. MIL-HDBK-344 presents the following equation for determining the maximum allowable failure rate due to latent defects remaining after screening:

FR = (1/Required MTBF - 1/Inherent MTBF) / Safety Margin

where:
FR = failure rate from latent defects
Required MTBF = the customer-required MTBF
Inherent MTBF = the predicted MTBF
Safety Margin = derating factor, typically 1.5 to 2
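This failure-rate budget can be sketched directly in code. The function below is a simple reading of the MIL-HDBK-344 relationship between required MTBF, inherent (predicted) MTBF, and the safety margin; the names and example numbers are illustrative:

```python
def allowable_defect_failure_rate(mtbf_required, mtbf_inherent,
                                  safety_margin=2.0):
    """Allowable failure rate (failures/hour) from latent defects.

    The budget is the gap between the required failure rate and the
    inherent (design-predicted) failure rate, derated by a safety margin.
    """
    if mtbf_inherent <= mtbf_required:
        raise ValueError("inherent MTBF must exceed required MTBF")
    return (1.0 / mtbf_required - 1.0 / mtbf_inherent) / safety_margin

# Example: required MTBF 1000 h, predicted MTBF 2000 h, margin of 2
# -> latent defects may contribute at most 2.5e-4 failures/hour.
fr = allowable_defect_failure_rate(1000.0, 2000.0, 2.0)
```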
The calculation of a stress screen's strength is relatively straightforward and is detailed in MIL-HDBK-344. The selection of which stress screens to apply, and at what assembly level, is the more difficult question to tackle. Cost is a prime driver here, and the goal is to select the most effective screen for the least cost in the minimum time. Environmental limitations of the equipment and the test equipment available must also be considered.
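Screening strength reflects that a latent defect must be both precipitated to failure by the stress and detected by the accompanying electrical test. A minimal sketch of this bookkeeping (a simplified reading of the MIL-HDBK-344 concepts, with illustrative names):

```python
def screening_strength(precipitation_eff, detection_eff):
    """Probability a latent defect is both precipitated and detected.

    precipitation_eff: probability the applied stress precipitates
                       the defect to a detectable failure.
    detection_eff:     probability the electrical test detects it.
    """
    return precipitation_eff * detection_eff

def defects_remaining(incoming_defects, strength):
    """Expected latent defects escaping a screen of the given strength."""
    return incoming_defects * (1.0 - strength)

# Example: strong precipitation (0.9) but modest test coverage (0.8)
# yields a screen strength of 0.72; 100 incoming defects -> ~28 escape.
ss = screening_strength(0.9, 0.8)
escapes = defects_remaining(100, ss)
```

This is why Figure 2's trade-off matters: a cheap unpowered screen at the assembly level may precipitate defects well but detect them poorly, while higher assembly levels buy detection efficiency at greater cost per flaw.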
Figure 2, which is based on MIL-HDBK-344, provides guidance for initial screen selection and placement.
Level of Assembly

Assembly level:
• Cost per flaw precipitated is lowest (unpowered screens).
• Test detection efficiency is relatively low.

Higher levels of assembly:
• Higher test detection efficiency than assembly level.
• Cost per flaw significantly higher than assembly level.
To have an effective ESS program, management must be committed to providing the time and resources needed to support it adequately. The roles of all participants must be clearly defined. Daily meetings are usually required when first implementing an ESS program, as the process of moving from paper concepts to physical tests can be daunting. It is not uncommon to revise ESS plans due to implementation issues.
A FRACAS forms the backbone of an effective ESS program. It provides the data needed to identify, track and resolve deficiencies. In addition to failures of the equipment under test, failures of the test equipment, environmental equipment and test software should be included in the FRACAS. At the heart of any FRACAS is a database used for data retention, analysis, and reporting.
The type and timeliness of data entered into the database needs to be closely managed. As a minimum, the following data should be captured upon failure:
• Location of failure
• Test being performed
• Date and time
• Part number and serial number
• Circumstances of interest
• Individual who observed the failure
Upon completion of the ensuing failure analysis, the following additional data should be captured:
• Failure root cause
• Corrective action taken
• Date and serial number of corrective action cut-in
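The failure-time and post-analysis fields above can be sketched as a single record type for the FRACAS database. Field names are illustrative; a real FRACAS schema would be tailored to the program:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class FailureRecord:
    # Captured at the time of failure:
    location: str
    test: str
    part_number: str
    serial_number: str
    observed_by: str
    circumstances: str = ""
    timestamp: datetime = field(default_factory=datetime.now)
    # Populated after failure analysis:
    root_cause: str = ""
    corrective_action: str = ""
    corrective_action_cutin: str = ""  # date and serial number of cut-in

rec = FailureRecord(location="Cell 3", test="random vibration",
                    part_number="PN-100", serial_number="SN-007",
                    observed_by="J. Smith")
```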
Ideally, the failure data should be captured in real time. As a minimum, data should be captured at the end of each production shift, to ensure that pertinent details of the failure are not forgotten. To facilitate quick data entry, the FRACAS database must be user-friendly and available to all who will be involved with testing conducted during ESS.
The FRACAS database should also have an easy to use reporting capability. Reports from the database will be the primary source of insight into an ESS program. Timely, detailed reports will indicate the effectiveness of the current program, provide a means to evaluate refinements in the program, and keep management satisfied that the ESS program is working.
Statistical Process Control (SPC) and Pareto charts are the primary tools for monitoring ESS performance. Prepared from data contained in the FRACAS database, these reports are used to monitor key ESS parameters against established requirements. Typical key ESS parameters include Incoming Latent Defects, Stress Screening Strength, Latent Defects Remaining, Defect Trend Analyses, and Field Failures Observed.
The SPC chart presents a graphical comparison of actual results to requirements. The expected statistical variation due to sample size is calculated using a Poisson distribution. This variation, commonly expressed as ±3 standard deviations, is plotted along with the actual results.
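For Poisson-distributed counts the variance equals the mean, so the ±3-standard-deviation band follows directly from the expected count. A minimal sketch of these control limits (function name is illustrative):

```python
import math

def poisson_control_limits(expected_count):
    """Upper and lower control limits for a Poisson-distributed count.

    For a Poisson distribution, variance = mean, so sigma = sqrt(mean).
    The lower limit is clamped at zero since counts cannot be negative.
    """
    sigma = math.sqrt(expected_count)
    ucl = expected_count + 3.0 * sigma
    lcl = max(0.0, expected_count - 3.0 * sigma)
    return lcl, ucl

# Example: expecting 25 defects per lot -> limits of 10 and 40;
# an observed count outside that band signals a process shift.
lcl, ucl = poisson_control_limits(25)
```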
Pareto charts are used to display breakdowns of failure causes and are useful for showing defect frequency. Often, the actual results reported on a Pareto are guard banded by the high and low expected results.
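The Pareto ordering itself is simple to compute from FRACAS failure-cause data. A minimal sketch that ranks causes by frequency and tracks the cumulative percentage (names and example data are illustrative):

```python
from collections import Counter

def pareto(failure_causes):
    """Rank failure causes by frequency with cumulative percentages.

    Returns a list of (cause, count, cumulative_percent) tuples,
    most frequent cause first -- the rows of a Pareto chart.
    """
    counts = Counter(failure_causes)
    total = sum(counts.values())
    rows, cumulative = [], 0
    for cause, n in counts.most_common():
        cumulative += n
        rows.append((cause, n, 100.0 * cumulative / total))
    return rows

# Example: six observed failures from the FRACAS database.
rows = pareto(["solder", "solder", "wire", "esd", "solder", "wire"])
```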
The ESS process is a closed-loop process and relies upon information obtained through monitoring to evaluate and improve the implemented screens. It is only through this feedback that the ESS program can remain balanced in terms of effective latent defect removal and cost. As stated earlier, ESS is not a test and should not be viewed as having rigid requirements. With time, new key parameters may surface and previous key parameters may no longer be of concern. Changes in manufacturing techniques may eliminate some latent defects and introduce new ones. To remain effective, the ESS program must evolve.
For Further Study
MIL-HDBK-344 was the primary source for this sheet.
Department of Defense. MIL-HDBK-344A, Environmental Stress Screening of Electronic Equipment. Washington, D.C., August 1993.
Institute of Environmental Sciences and Technology. Management and Technical Guidelines for the ESS Process. Mount Prospect, IL, 1999.
Rome Laboratory. RL-TR-91-300, Evaluation of Quantitative Environmental Stress Screening Methods. Rome, NY, November 1991.
Rome Laboratory. RL-TR-94-233, Environmental Stress Screening Process Improvement Study. Rome, NY, December 1994.
About the Author
* Note: The following information about the author(s) is the same as what was on the original document and may no longer be correct.
John P. Farrell is a Senior Engineer with IIT Research Institute, where he has worked on projects implementing proven reliability techniques and solutions. He is the primary interface for answering technical inquiries. He staffs both the technical inquiry telephone line and the on-line inquiry form available at the RIAC web site. Mr. Farrell is also part of the core team supporting the release of the PRISM system reliability assessment tool.
Before joining IITRI, he spent 15 years in Specialty Engineering with Lockheed Martin Corporation working in the areas of reliability, maintainability, and logistics for fire control and IR sensor systems. He received the Lockheed Martin APEX Award in 1998 for superior job performance. While at Lockheed Martin he planned, developed and implemented a number of ESS, Reliability Growth Test and Environmental Qualification Test programs.
Mr. Farrell holds a B.S. in Computer Science (minor in Electrical Engineering) from the State University of New York and has completed some post-graduate studies at Rensselaer Polytechnic Institute. He also graduated from the GE Manufacturing Studies program.