Carnegie Mellon University
18-849b Dependable Embedded Systems
Spring 1999
Author: Adrian Drury
Social and legal concerns are necessary considerations in embedded system design and implementation. It is generally acknowledged that no system can be "perfect," but at what point is a system "safe enough" to be saleable? Questions like these motivate safety decisions and liability assessments. Responsibilities in these areas vary from industry to industry, as evidenced by the varying "value of a life" - how much money will be spent to save one more person. Additionally, the threat of legal action in the event of the failure of an embedded system impacts design and financial decisions.
Social and legal concerns are closely related to a variety of other topics in the discipline of dependable embedded systems. Ethics is most closely related because ethical concerns often motivate choices made in the social and legal spheres. Profits and business models are also closely related -- unfortunately, social responsibility is often seen to be at odds with increasing profits. Safety-critical systems, and ultra-dependable systems in general, are closely related to social and legal concerns because of the ramifications of failure. Additionally, shoddy spares, customer circumvention and dependability standards are part of legal concerns.
There are a variety of motivating factors for this topic as it relates to dependable embedded systems. The primary motivating factor is that engineers design systems that are used in society daily. Sometimes, dependable embedded systems are systems that people depend on without even knowing they exist. How many consumers know about all the computers in a modern automobile, or elevator, or jet engine? A slight malfunction in the operation of any of those could result merely in inconvenience, or at worst, in the deaths of hundreds of people. Engineering as a profession, then, can be characterized as the "...[application] of scientific knowledge to the public good." [Cohen94] Working for the public good implies some level of involvement with and awareness of society, which is also an obligation of engineers. One complicating aspect of this topic is that it can be difficult to enforce social responsibility. In an altruistic world, there would be no need for enforcement. Unfortunately, economic considerations sometimes take precedence over social responsibility. Legal recourse by "society" is then the balancing factor against a lack of social responsibility.
Social responsibility can be thought of as positive ethics in action. During the design of a system, an engineer can make thousands of decisions about even the smallest aspects of a system. While it could be argued that every choice an engineer makes has an ultimate social effect, at that point it degenerates into a question of where to draw the boundary. There is, however, a kernel of truth in the children's story about how, for want of a nail, a kingdom was lost. One hopes that, in general, positive ethical choices throughout the design process will at least marginally increase the social good of the system, or at a minimum limit its potential harm.
Profits and business models are a very important topic related to social and legal responsibility. While it may be unfortunate that social responsibility is often seen to be at odds with increasing profits, it is often true. When considering profits, it is important to raise the issue of cost/benefit analysis. Usually, social responsibility is seen to be at odds with increasing profits because a cost/benefit analysis has been performed and has shown that for a certain change in cost, the resulting change in benefit is "small enough" to be discounted. That can apply to both increasing benefit and decreasing cost. The cost to increase the benefit by a desired amount (fewer people killed, less toxic chemical release, etc.) is deemed "too high." Similarly, the effects of cutting costs (more people killed, more toxic chemicals released, etc.) are deemed "insignificant enough" to warrant the cost reduction. [Perrow84] discusses a hypothetical example in which a corporation can save fifty million dollars per year by not installing a piece of safety equipment. That savings means the company does not have to raise the price of twenty million items by one dollar each, and can avoid a thirty million dollar cut in dividends. However, the risk is that one more worker will die per year. Is the benefit of saving one life worth fifty million dollars? Or conversely, is the benefit of saving fifty million dollars worth one human life? Business models of today, such as corporations obligated to maximize shareholder return, often force decisions in circumstances such as the example above.
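To make the arithmetic of the example concrete, the following sketch (in Python) tallies the savings against the expected cost of the added risk, using the dollar figures above and the $8 million value of a life from [Kahn86] discussed below. The expected-value framing and the variable names are illustrative assumptions, not part of Perrow's analysis.

    # Hedged sketch of the cost/benefit arithmetic in Perrow's hypothetical.
    # Dollar figures come from the example in the text; the $8M value of a
    # life is the [Kahn86] estimate. The expected-value framing is an
    # illustrative assumption, not Perrow's own calculation.
    price_increase_avoided = 1 * 20_000_000   # $1 on twenty million items
    dividend_cut_avoided = 30_000_000         # thirty million dollar cut
    annual_savings = price_increase_avoided + dividend_cut_avoided  # $50M/year

    extra_deaths_per_year = 1                 # one more worker per year
    value_of_life = 8_000_000                 # [Kahn86] estimate, in dollars

    expected_harm = extra_deaths_per_year * value_of_life

    print(f"Annual savings:        ${annual_savings:,}")
    print(f"Expected cost of risk: ${expected_harm:,}")
    # A pure cost/benefit analysis installs the equipment only if the
    # expected harm exceeds the savings -- here it does not.
    print("Install safety equipment?", expected_harm > annual_savings)

On this framing the analysis "justifies" omitting the equipment, which is precisely the tension between profit and social responsibility described above.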
Many of the comparative terms used in discussing this field, and in this paper, are qualitative terms like "small enough," "too high," or "insignificant enough." The quantitative values of these terms vary from industry to industry, product to product, and engineer to engineer. It is issues like these that make social considerations in dependable embedded systems difficult -- what does the term "enough" really mean in the context of safety and risk? When is a product safe enough? When the budget runs out? Today, safe enough sometimes means when the cost to make it any safer is more than the expected cost of dealing with its harmful effects (cost/benefit analysis again). Is this a socially responsible answer? Is this a positively ethical answer?
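Stated as an explicit decision rule, that implicit definition of "safe enough" might look like the sketch below; the function and argument names are hypothetical, chosen only to make the stopping rule visible.

    def safe_enough(marginal_cost_of_next_fix, expected_cost_of_remaining_harm):
        # The implicit stopping rule described above: stop adding safety
        # once the next improvement costs more than the expected cost of
        # the harm it would prevent. Names here are hypothetical.
        return marginal_cost_of_next_fix > expected_cost_of_remaining_harm

    # Using the [Kahn86] value of a life: a $10M fix that prevents one
    # expected death ($8M) is "not worth it" under this rule.
    print(safe_enough(10_000_000, 8_000_000))   # True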
The "cost of a life" is integrally related to social responsibility and cost/benefit analysis. [Khan86] discusses the results of a variety of studies of the value of a life and concludes that a value of a life of about $8 million is accurate. It is important to remember that the value of a life is also affected by the public perception of the safety of, or risks associated with, a particular product or industry. Nuclear power plants can have millions of dollars spent to prevent the death of a worker. Airlines can spend a million dollars to prevent a crash. Yet improving road and highway conditions is limited to a value of a few tens of thousands of dollars per person. The widely differing values are a reflection of how society in general views the risks associated with each. Nuclear power plants are seen as much riskier than driving, for example, because people have no control over them, and the possible catastrophe is large, although the probability of problems is low. Airplanes are similar in this regard - people have no control over them, and when a plane crashes, the loss of life is usually significant. On the other hand, when driving, people feel they have control over the situation they are in and feel that it is a low risk activity. The stark reality is, however, that more than 41,000 people were killed in automobile accidents in 1997 in the United States [NHTSA97], while there were only 621 fatalities in U.S. general aviation in 1998 [NTSB98].
Engineers increasingly need to be aware of the legal aspects of dependable embedded system design. Courts are increasingly holding individuals, such as engineers, responsible for decisions that affect society. Social responsibility and legal accountability go hand in hand. Arguably, the best remedy is not to end up in court in the first place. Socially responsible design decisions help with that, although there are certain industries, such as the automobile industry or the nuclear power industry, in which legal liability issues seem inevitable.
In industries where increased risk is part and parcel of everyday activities, there is likely to be industry or government regulation. Being able to demonstrate that applicable regulations and standards were followed, and that the user was aware of the risk, is sometimes enough to avoid legal liability. Currently, however, financial liability is not necessarily correlated with culpability. Whether this furthers the goal of social responsibility depends on the particular case or award in question.
Contracts are used in a wide variety of situations to ensure mutual accountability between parties. For any product, there are explicit contracts with contractors and subcontractors. There can be contracts with suppliers, and contracts with employees, consultants, and a variety of other personnel. Contracts lay out the responsibilities of the parties involved and the consequences of failure to perform the duties specified. However, legal recourse through lawsuits in the event of a breach of contract, or a perceived breach of contract, is becoming more and more common. This raises questions about the language and intent of the original contract, a notoriously fuzzy area. There are certainly cases at both ends of a continuum - from frivolous lawsuits to class action suits over a particularly egregious violation or gross societal disregard.
Shoddy spares and customer circumvention are two other aspects of legal concern. If parties other than the original manufacturer can supply spares for a system, the issue of shoddy spares is an important one. If the cost of spares is deemed "too high," a gray market of unqualified or lower-quality spares may develop. The legal ramifications of this issue can be serious for all parties involved, even if a shoddy spare never fails. Customer circumvention is an issue in any dependable embedded system, and more so in safety-critical or ultra-dependable systems. Customer circumvention is generally done to prolong the operation of a system when some part of it has degraded or failed and its use should be discontinued. Often in safety-critical systems, such an event will cause the system to automatically halt operation or enter a simpler wait state until maintenance can be performed. Circumvention can be particularly dangerous in such an event, especially if the user does not know why the system has halted. Legal liability in such a case can hinge on whether the design of the system made it "difficult enough" to circumvent its internal safety systems, and whether the user was aware of the risk of doing so.
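As a hedged illustration of the halt-and-wait behavior described above, the sketch below shows a minimal fail-safe controller that refuses commands after a fault and records why it halted, so an operator is less tempted to circumvent it blindly. The states, method names, and fault messages are assumptions made for illustration, not a design prescribed by any standard.

    from enum import Enum, auto

    class State(Enum):
        NORMAL = auto()
        SAFE_HALT = auto()   # simpler wait state until maintenance

    class FailSafeController:
        # Minimal sketch of halt-on-degradation behavior. States, method
        # names, and the recorded reason are illustrative assumptions.
        def __init__(self):
            self.state = State.NORMAL
            self.halt_reason = None

        def report_fault(self, reason):
            # On detected degradation or failure, stop normal operation
            # and record why, so the halt is visibly deliberate.
            self.state = State.SAFE_HALT
            self.halt_reason = reason

        def command(self, action):
            if self.state is State.SAFE_HALT:
                # Refuse to operate rather than continue silently; a user
                # who sees the reason is less likely to bypass the interlock.
                raise RuntimeError(f"halted pending maintenance: {self.halt_reason}")
            print(f"executing {action}")

    ctrl = FailSafeController()
    ctrl.command("start motor")                     # runs normally
    ctrl.report_fault("brake sensor failed self-test")
    try:
        ctrl.command("start motor")                 # refused, with reason
    except RuntimeError as err:
        print(err)

Reporting why operation stopped addresses the circumvention risk noted above: an unexplained halt invites a workaround, while an explained one helps establish that the user was aware of the risk.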
The actions and decisions of an engineer with respect to social responsibility are affected by his or her morals and sense of ethics, and also by external considerations such as profit and economics. Engineers should try to work for the public good in all that they do -- if for no other reason than that socially irresponsible choices can have negative legal consequences. In fields such as dependable system design, the social aspects of a design must be approached carefully because of the greater effect of the design on society. The social responsibility of engineers is likely to increase as time goes on. That increase, combined with legal liability, may make the prices of certain technologies or products rise significantly to cover certain levels of risk. Some technologies or products may never be commercialized, or may be withdrawn from the market, because of the associated cost and legal liability.
[Cohen94] Cohen, Stephen and Grace, Damian. Engineers and Social Responsibility: An Obligation to Do Good. IEEE Technology and Society Magazine, Fall 1994, pp. 12-19.
Good discussion of social responsibility as it relates to engineering as a profession.
[Perrow84] Perrow, Charles. Normal Accidents: Living with High-Risk Technologies. Basic Books, 1984. ISBN 0-465-05144-8.
Excellent book about risks of technology and accidents inherent in the world in which we live.
[Kahn86] Kahn, Shulamit. Economic Estimates of the Value of Life. IEEE Technology and Society Magazine, June 1986, pp. 24-31.
Analysis of studies of the value of a human life. Also has analysis of the factors relating to the value of a life.
[NHTSA97] http://www.nhtsa.dot.gov/people/ncsa/ovrfacts.html Visited April 30, 1999.
Good information about highway fatalities.
[NTSB98] http://www.ntsb.gov/aviation/Table1.htm Visited April 30, 1999.
Good information about air fatalities, broken down by category.