Notes on:
Benchmarking Commercial Reliability Practices (RAC)
Benchmarking Commercial Reliability Practices, Ned Criscimagna, Reliability Analysis Center, Rome NY, 1995. (140 pages)
I DON'T PUBLISH NOR SELL THIS BOOK -- CONTACT RAC FOR INFO ON OBTAINING IT!
This report presents the results of a business "benchmarking" exercise in which 41 companies from seven industries were surveyed to determine their current reliability practices; 11 of these companies were then interviewed.
As with all business benchmarking exercises, the results do not prove whether or not any particular practice is optimal; rather, they give a measure of what industry currently practices and what it expects to practice in the near future. The actual report is relatively brief at 32 pages; the remainder is bulked up with a survey of reliability books in the RAC library and details about the surveys and interviews.
The summary results of the benchmark (page 29) are that the following are most universally implemented or deemed important (there was a note elsewhere stating that in some cases items deemed important were not actually practiced):
- "Completely analyze all failures, regardless of when or where they occur in development, to identify the root cause of failure and determine the necessary corrective action, including redesign and revision of analytical tools and models.
- "Avoid dedicated reliability demonstration testing. If required, demonstrations should focus on new components or assemblies, or the integration of old items in a new way. Emphasize engineering development testing to understand the and validate the design process and models. Accelerated testing should be used to age high reliability items and to identify their failure mechanisms.
- "Assign responsibility for reliability to an Integrated Product Team (also referred to as a Product Development Team). Give the team the authority to determine the reliability requirements and to select the design, analysis, test, and manufacturing activities needed to achieve that reliability."
Topic coverage: (*** = emphasized; ** = discussed with some detail; * = mentioned)
| *** | Dependability | | Electronic Hardware | * | Requirements |
| | Safety | | Software | * | Design |
| | Security | | Electro-Mechanical Hardware | * | Manufacturing |
| | Scalability | | Control Algorithms | * | Deployment |
| | Latency | | Humans | * | Logistics |
| | Affordability | *** | Society/Institutions | | Retirement |
Other topics: dependability math, data, common-cause failures, dependability assessment
Abstract:
The Reliability Analysis Center (RAC) conducted a fact-finding study project to benchmark the reliability practices used by commercial industry. The project was performed for the Office of the Under Secretary of Defense for Economic Security, Weapon Support Improvement Group.
The project consisted of four distinct tasks: a literature search, a survey of the reliability practices of a wide range of commercial companies, personal interviews of a smaller group of companies, and an analysis of the data collected. Based on the results of these tasks, areas of commonality and divergence among commercial reliability practices were identified, as well as the general commercial approach to designing, developing, and manufacturing reliable products. Four benchmarks and eight Keys to Success were derived from the findings and conclusions. Insights were gained into the motivations behind commercial companies' approaches and their use of military specifications, standards, and handbooks.
The report includes a discussion of Defense Acquisition Reform, a bibliography, summaries of selected documents, and the results of the survey and interviews. In the case of the surveys and interviews, information was treated on a not-for-attribution basis.
1.0 INTRODUCTION 1
  1.1 Background 1
  1.2 Benchmarking 2
  1.3 Organization of Report 4
2.0 TECHNICAL APPROACH 5
  2.1 Overview 5
  2.2 Literature Search 5
  2.3 Surveys 6
    2.3.1 Rationale for Surveys 6
    2.3.2 Selection and Qualification of Companies 7
    2.3.3 Development of Survey Form 9
  2.4 Interviews 10
3.0 RESULTS 13
  3.1 Literature Search Results 13
  3.2 Survey Results 13
  3.3 Interview Results 16
  3.4 Overall Discussion 17
  3.5 General Findings 20
    3.5.1 The Environment 20
    3.5.2 The Practices 21
  3.6 Specific Findings 22
4.0 CONCLUSIONS 27
  4.1 The Practices 27
    4.1.1 Reliability Tasks 27
    4.1.2 Approach to Achieving Reliability 28
  4.2 The Contracting Environment 28
  4.3 The Benchmarks 29
  4.4 Other Keys to Success 29
5.0 AREAS REQUIRING ADDITIONAL RESEARCH 31
APPENDIX A: DEFENSE ACQUISITION REFORM
APPENDIX B: LITERATURE SEARCH RESULTS
APPENDIX C: SURVEY FORMS
APPENDIX D: INTERVIEW QUESTIONS AND NOTES
APPENDIX E: TERMS AND DEFINITIONS
APPENDIX F: RAC PRODUCT ORDER FORM
Philip Koopman: koopman@cmu.edu