Article from the book Values and Ethics for the 21st Century

Technology and the Burden of Responsibility

In the techno-lifeworld, responsibility has become a pervasive theme and a new obligatory form of the good. This is differentially manifested in the law, through strict liability for technological products; in Christian theologies that emphasize responding to God in a secular age; in engineering, through professional responsibility for public safety, health, and welfare; in the social responsibility of scientists to consider the implications of their research; and in increased philosophical attention to responsibility as a concept in ethics. The experience of living with the expanded powers of the techno-lifeworld calls for enlarged measures of responsibility, thus inserting into human experience special and unique burdens that can be formulated as a duty plus respicere: a duty to exercise more conscious reflection in human action than has ever previously been the case.

Over the course of the five hundred years after 1500, traditional hand- and human-based technics was transformed through a scientific exploitation of previously unseen forces into what is now known as modern technology. Such technology is complicit in all the most basic problems facing humanity in the opening decades of the 21st century—nuclear (weapons and power plants), chemical (environmental pollution), medical (life-extension and body hybridization), biological (loss of biodiversity, biotechnology), informational (overload, privacy, and virtual reality), climatological (global transformations of sky, sun, ocean, and earth), and more. Despite on-going efforts to address such challenges with scientific research and technological innovation, responses remain fundamentally ethical in character. Technological fixes require ethical reflection concerning which of the available design options to pursue. Yet so overwhelmed are we with conflicting crises and divergent interest-group arguments for different solution paths that it is often difficult to think. How can we begin to assess the techno-human condition in which we now live and move and have our being?

In the presence of this dynamism of challenges, there has been a promiscuous, polymorphous invocation of the concept of ethical responsibility. Scientists have obligations for the responsible conduct of research. Physicians must be responsible to their patients. Engineers are responsible for protecting public safety, health and welfare in the design of structures, products, processes, and systems. Entrepreneurs have responsibilities to commercialize science and technology for public benefit, and the public itself is asked to responsibly support science and technology. People are counseled to practice responsible sex. Consumers are admonished to be responsible users of the artifacts and opportunities saturating the techno-lifeworld. Governments must be responsible to their citizens, corporations to their investors, schools to their students.

So diversely referenced, what does responsibility mean? Appeal to responsibility is all but absent in traditional ethical discourse, whether that discourse is centered on virtue, rights, contract, utility, or duty. Insofar as responsibility is present in received moral theory it awaits disclosure or interpretation. Indeed, in English the abstract noun “responsibility” (although not the adjective “responsible”) is only a few hundred years old, and it has emerged to cultural and ethical prominence in diverse legal, religious, engineering, scientific, and philosophical contexts precisely through progressive engagements with technology. One way to try to define the meaning of responsibility therefore appropriately begins with a review of this history.

The contraction and expansion of legal liability

The legal term for responsibility is “liability.” Liability is differentially manifest in two types of law: criminal and civil. Criminal law deals with those offenses prosecuted and punished by the state in order to protect a public interest. Civil law includes breaches of explicit or implicit contract in which injured parties sue for compensation or damages.

Criminal liability was once construed to follow simply from a transgression of the external forum of the law—doing something the law proscribed or not doing something it prescribed. But as it developed in Europe under the influence of a Christian theology of sin, which stresses the importance of inner consent, criminal liability became appreciative of the internal forum of intent. One result is a distinction between unintended transgressions (accidental homicide) and intentional acts (first-degree murder). Another has been a historical contraction of criminal liability, insofar as punishments for the former are less severe than for the latter.

In contrast to the contraction in criminal liability, civil liability has expanded in scope as a result of progressive delimitations on the requirements for intentionality. Civil liability can be incurred by contract or by what is called “strict liability.” In the case of explicit or implicit contract, intentional fault or negligence (a kind of failure of intention) must be proved. In the case of strict liability there need be no fault or negligence per se. In strict liability, a person can be responsible for harm caused by an action whether intended or not.

The concept of strict or no-fault liability as a special kind of tort for which the law provides redress developed in parallel with modern industrial technology. In pre-modern Roman law, an individual could sue for damages only when losses resulted from intentional interference with person or property, or from negligence. By contrast, in the English common law case of Rylands v. Fletcher, decided on appeal by the House of Lords in 1868, John Rylands was held liable for damages caused by an industrial undertaking despite its unintentional and non-negligent character. Rylands, a mill owner, had constructed a water reservoir to support his mills. Water from the reservoir inadvertently leaked through an abandoned mine shaft and flooded Thomas Fletcher’s adjacent mine. Although he admitted that Rylands did not and perhaps could not have known about the abandoned mine shaft, Fletcher sued for damages. The eventual ruling in Fletcher’s favor was based on the idea that the building of a reservoir, which raised the water above its “natural condition,” in itself posed a hazard for which Rylands was liable. Unintended consequences were no excuse.

Today the most common kinds of civil liability are just such no-fault or prima facie liabilities related to “non-natural” industrial workplaces and engineered products in which artifacts in themselves, independent of intent, pose hazards. In the United States one key case establishing this principle was that of Greenman v. Yuba Power Products, Inc., decided on appeal by the California Supreme Court in 1963. In the words of Justice Roger Traynor, writing for the majority:

A manufacturer is strictly liable in tort when an article he places on the market … proves to have a defect that causes injury to a human being … The purpose of such liability is to ensure that the costs of injuries resulting from defective products are borne by the manufacturers … rather than by the injured persons who are powerless to protect themselves.

The expansion of legal liability is thus coordinate with, and in response to, issues engendered by the expansive presence of technological products whose consequences users find increasingly difficult to appraise.

Religious responsibility in a secular age

The term “responsibility” itself comes from the Latin respondere, “to promise in return” or “to answer.” As such it readily applies to what is the primordial experience of the Judeo-Christian-Islamic tradition: a call from God that human beings must accept or reject.

The discovery and development of religious responsibility again parallels increased appreciation of the ethical issues emerging in association with science and technology. It is in opposition to notions of secularization and control over nature, for instance, that the Swiss Protestant theologian Karl Barth (1886–1968) distinguished between worldly and transcendent relationships. God is the wholly other, the one who cannot be reached by scientific knowledge or influenced by technological power. There is a radical difference between the human attempt to reach God (which Barth terms religion) and the human response to God’s divine revelation (faith). In his Church Dogmatics (1932) Barth goes so far as to identify goodness with responsibility in the sense of responding to God.

Catholics have been no less ready to make responsibility central to their understanding of moral theology. For Canadian Jesuit Bernard Lonergan (1904–1984), “Be responsible” is a transcendental precept coordinate with the duties to “Be attentive,” “Be intelligent,” and “Be reasonable.”

Responsibility also plays a prominent role in the documents of Vatican II. At one point, after referencing the achievements of science and technology, Gaudium et spes (1965) adds that, “With an increase in human powers comes a broadening of responsibility on the part of individuals and communities” (no. 34). Later, this same document on the church in the modern world states, “We are witnesses of the birth of a new humanism, one in which man is defined first of all by his responsibility toward his brothers and toward history” (no. 55).

One sustained effort to articulate a general Christian ethics of responsibility can be found in H. Richard Niebuhr’s The Responsible Self (1963). Niebuhr contrasts the Christian anthropology of the human-as-answerer to the secular anthropologies of human-as-maker and human-as-citizen. For human-as-maker, moral action is essentially consequentialist and technological. For human-as-citizen, morality becomes deontological in character. With human-as-answerer, the tension between consequentialism and deontology is bridged by responsiveness to a complex reality, by an interpretation of the nature of this reality—and by attempts to fit in, to act in harmony with what is already going on. “What is implicit in the idea of responsibility is the image of man-the-answerer, man engaged in dialogue, man acting in response to action upon him” (Niebuhr 1963, 56). Niebuhr’s ethics of responsibility exhibits affinities with an ecological ethics.

This feature of Niebuhr’s theology of responsibility also suggests a weakness. Niebuhr wrote in an age that was becoming increasingly secular, in which belief in and the experience of God are, in Charles Taylor’s (2007) description, more and more simply one option among many—and not the easiest to affirm. More persuasive calls come from experiences of strictly this-worldly flourishing. In these cases, however, responsive commitments have to be “mobilized” as environmental or libertarian movements, utilizing methods analogous to those deployed by engineers to design and bring into existence large-scale material constructions. Calls of this form are likewise often experienced after the manner of that new archetype, the telephone: as an electronic interruption to be picked up or not, as we will.

Engineering responsibility for public safety, health and welfare

Technologists and engineers, as inventors of such commercially crucial communication devices as telephones and computers, are more subject than others in the techno-lifeworld to both external (legal, economic) and internal (ethical, professional) constraints. Indeed, since the early 20th century engineers, especially in the United States where they largely work outside explicit state control and as employees of private corporations, have attempted to formulate guidelines for professional conduct as an internal ethics of responsibility—precisely because of the technological powers they wield.

Engineering associations aspire to the formulation of codes of conduct similar to those found in medicine and law. Yet unlike medicine, with its ideal of health, or law and its ideal of justice, it is unclear precisely what general ideal could serve in engineering as the basis for a professional ethics. The original engineer (Latin ingeniator) was the builder and operator of battering rams, catapults, and other “engines of war.” Engineering was originally military engineering. As with all military personnel, the engineers’ behavior was primarily dictated by the duty of obedience to hierarchical authority.

The 18th-century emergence of civil engineering in the design and construction of public works such as roads, water-supply and sanitation systems, and other non-military infrastructure did not initially alter this situation. Civil engineers were loyal members of whatever social institutions they served. But as the technological powers in the hands of engineers grew, and the number of engineers increased, tensions mounted between subordinate engineers and their superiors. The manifestation of this tension is what Edwin Layton (1971) called the “revolt of the engineers,” which occurred during the late 19th and early 20th centuries. It was in association with this revolt and its aftermath that “responsibility” entered the engineering ethics vocabulary.

One influential effort at formulating engineering responsibility led to the technocracy movement and the failed idea that engineers, more than politicians, should wield political power. Henry Goslee Prout, a former military engineer who became general manager of the Union Switch and Signal Company, speaking to the Cornell Association of Civil Engineers in 1906, described the profession in just such leadership terms:

The engineers more than all other men, will guide humanity forward … On the engineers … rests a responsibility such as men have never before been called upon to face (quoted from Akin 1977, 8).

At the height of this dream of expanded engineering responsibility—following his successful leadership, as Secretary of Commerce, of the response to the great 1927 floods on the Mississippi River—Herbert Hoover became the first engineer to be elected president of the United States. The same period witnessed the creation of a political technocracy movement that fielded its own candidates for elective office. The ideology of technocracy sought to make engineering efficiency an ideal analogous to those of medical health and legal justice.

The problem with this ideal was two-fold. First, the elevation of efficiency to ideal status tends to undermine democracy. The major totalitarianisms of the mid-20th century (communism and fascism) often justified themselves by appeals to efficiency. Second, the ideal of efficiency itself, as a ratio of outputs over inputs, is context-dependent; efficiency is subject to multiple interpretations, depending on how the inputs and outputs themselves are defined.

Influenced in part by the communist and fascist contaminations of efficiency, during World War II another shift took place in the engineering conception of responsibility: not from company and client loyalty to technocratic efficiency, but from private to public loyalty. Beginning in the late 1940s, professional codes of engineering ethics in the United States increasingly made protection of public safety, health, and welfare a paramount responsibility. Having failed to formulate a technical ideal as the basis of responsibility, engineers emphasized a commitment to safety, health, and welfare in the public realm—even though in many instances their relevant expert knowledge was quite limited (Mitcham 2009).

With engineering under attack as a cause of environmental pollution, for the design of defective consumer goods, and as too willing to feed at the trough of the defense contract, one American engineer, Frank Collins, summed up the situation in the mid-1970s as follows:

Unlike scientists, who can claim to escape responsibility because the end results of their basic research cannot be easily predicted, the purposes of engineering are usually highly visible. Because engineers have been claiming full credit for the achievements of technology for many years, it is natural that the public should now blame engineers for the newly perceived aberrations of technology (Collins 1973, 448).

In other words, engineers may have oversold their responsibilities and are justly being chastened.

For Collins, the responsibilities of engineers are in fact quite limited. They have no general responsibilities, only specific or special ones:

There are three ways in which the special responsibility of engineers for the uses and effects of technology may be exercised. The first is as individuals in the daily practice of their work. The second is as a group through the technical societies. The third is to bring a special competence to the public debate on the threatening problems arising from destructive uses of technology (Collins 1973, 449).

This debate, formalized in various technology-assessment methodologies and governmental agencies, can be interpreted as a means of subordinating engineers to the larger social order. Yet the issue of responsibility has so intensified that engineers now commonly and consciously debate the scope of their responsibilities relative to issues not previously acknowledged.

Science and social responsibility

The debate with regard to responsibility has been equally pronounced in science. Efforts to define the responsibility of scientists have involved a refinement of the Enlightenment view that science has the best handle on truth and is thus essentially and under all conditions beneficial to society. From the Enlightenment perspective, the primary responsibility of scientists is simply to pursue and extend their disciplines. Using the knowledge they produce, scientists then have a responsibility to educate the public about the nature of reality—to speak the truth to traditional authorities and drive superstition from public affairs.

Historically this Enlightenment responsibility found expression in Isaac Newton’s hope for science as theological insight, Voltaire’s belief in its comprehensive utility, and Baruch Spinoza’s thought that in science one possesses something pure, unselfish, self-sufficient, and blessed. A classic manifestation was the great French Encyclopédie (1751–1772), which sought “to collect all the knowledge that now lies scattered over the face of the earth, to make known its general structure to the men among whom we live, and to transmit it to those who will come after us.” Such a project, wrote Denis Diderot, demands “intellectual courage.” In the words of Immanuel Kant, Sapere aude, Dare to know.

Questioning of this tradition has roots in the Romantic critique of scientific epistemology and industrial practice. Only after World War II, however, did scientists themselves begin to have any serious questions of their own. Since then one may distinguish four overlapping phases. Simplifying somewhat, in the first (1945–1965), scientists recognized the potentially adverse unintended consequences of some of their work and tried to help society to adjust accordingly. In the second (1965–1985), some scientists aspired to transform the inner character of science itself. In a third (1985–2000), there was a renewed defense of science and affirmation of its value while recognizing the need for better internal professional self-regulation. More recently (2000–present), science has become a battleground of competing interpretations of responsibility and policy interests.

Phase One: Recognizing Responsibilities. In December 1945, the first issue of the Bulletin of the Atomic Scientists began with a statement of goals for the newly formed Federation of Atomic (later American) Scientists. Members should “clarify … the … responsibilities of scientists in regard to the problems brought about by the release of nuclear energy” and “educate the public [about] the scientific, technological, and social problems arising from the release of nuclear energy.” Previously scientists would have described their responsibilities as restricted to doing good science, not falsifying experiments, and cooperating with other scientists. Now, because of the potentially disastrous implications of at least one branch of science, scientists felt their responsibilities enlarged. They were called on to take into account more than the procedures of science; they had to respond to a transformed situation.

The primary way the atomic scientists responded over the next decade to the new situation created by scientific weapons technology was to work to place nuclear research under civilian control in the United States and, further, to subordinate national control to international control. They did not, however, oppose the exceptional growth of science. As Edward Teller wrote in 1947, the responsibility of the atomic scientists was not just to educate the public and help people establish a civilian control that would “not place unnecessary restrictions on the scientist;” it was also to continue to pursue scientific progress. “Our responsibility,” in Teller’s words, “is [also] to continue to work for the successful and rapid development of atomic energy” (Teller 1947, 355).

Phase Two: Responsible Questioning. During the mid-1960s and early 1970s, however, there emerged a second-stage questioning of scientific responsibility. Initially this questioning arose in response to growing recognition of the problem of environmental pollution—a phenomenon that cannot be imagined as alleviated by any simple de-militarization of science or increases in democratic control. Some of the worst environmental problems are caused precisely by democratic availability and use—as with pollution from automobiles, agricultural chemicals, and aerosol sprays, not to mention the mounting burden of consumer waste disposal. Rachel Carson’s Silent Spring (1962) was an early statement of the problem that called for an internal transformation of science and technology themselves. But an equally focal experience during this second-stage movement toward an internal restructuring of science was the Asilomar Conference of 1975, which addressed the dangers of recombinant DNA research.

After Asilomar, it turned out that the danger of recombinant DNA research was not as immediate or great as feared, and some members of the scientific community became resentful of post-Asilomar agitation. Increased possible consequences nevertheless further broadened the scope of what could be debated as the proper responsibility of scientists. Robert L. Sinsheimer, for instance, himself a respected biological researcher and chancellor of the University of California, Santa Cruz, argued that modern science was based on two faiths. One is “a faith in the resilience of our social institutions … to adapt the knowledge gained by science … to the benefit of man and society more than the detriment”—a faith that “is increasingly strained by the acceleration of technical change and the magnitude of the powers deployed” (Sinsheimer 1978, 24). But even more telling is a faith

in the resilience, even in the benevolence, of Nature as we have probed it, dissected it, rearranged its components in novel configurations, bent its forms, and diverted its forces to human purpose. The faith that our scientific probing and our technological ventures will not displace some key element of our protective environment, and thereby collapse our ecological niche. A faith that Nature does not set booby traps for unwary species (Sinsheimer 1978, 23).

This kind of argument points toward subsequent affirmations and promotions of critical science (Ravetz 1971), stewardship science (Lowrance 1985), post-normal science (Funtowicz and Ravetz 1993), and mode-2 knowledge production (Gibbons et al. 1994). In each case the idea is that science can no longer be pursued without some degree of reflexivity or self-consciousness about its assumptions and social contexts—especially the ways in which its products become engaged with social, political, and economic contexts.

Phase Three: Re-emphasizing Ethics. The attempt to transform science from within was overtaken in the mid-1980s by a new external criticism not of scientific products (knowledge) but of scientific processes (methods). A number of high-profile cases of scientific misconduct raised questions about whether public investments in science were being wisely spent. Were scientists simply abusing the public trust? Some economists also began to question whether, even insofar as scientists did not abuse the public trust, but followed responsible research practices, science contributed as much to economic progress as had previously been assumed.

The upshot was that the scientific community undertook a self-examination of its ethics and its efficiency. Ethics education, or education in what became known as the responsible conduct of research (RCR), became a required part of science education programs, especially at the graduate level in the biomedical sciences. Efficiency in grant administration, management, and accountability became an issue for critical assessment, so that since the 1990s scientists have increasingly been understood to possess social responsibilities that include the promotion of ethics and efficiency in the processes of doing science. When supported with public funding, science has also increasingly been required to justify itself, in terms used by the US National Science Foundation, with reference not just to intellectual merit but also to broader impacts.

Accordingly, scientists have attempted to re-emphasize the importance of science to national healthcare, the economy, environmental management, and defense. In the face of the AIDS epidemic, biomedical research is presented as the only answer. Computers, biotechnology and nanotechnology have been offered as gateways to new competitive advantages and the creation of whole new sectors of jobs. The understanding of such phenomena as global climate change is argued to depend on computer models and the science of complexity. Finally, especially since the suicide attacks of September 11, 2001, new claims have been made for science as a means to develop protection against the dangers of international fundamentalist terrorism. The social responsibility of science is defended as the ethically-guided production of knowledge that addresses a broad portfolio of social needs, from the promotion of health to the defense of civilization. Appropriately enough, during this same period sociologists and historians have begun to reconceptualize science in terms of its social construction and to emphasize the extent to which boundaries have broken down between science and technology so that the two have merged into something better termed “technoscience.”

Phase Four: Policy Battles. Insofar as technoscience is seen as socially constructed, its social, political, and economic engagements readily become contested. Since the turn of the century technoscience has increasingly become a policy battleground. Scholars of science-technology-society (STS) relations have criticized technocratic positivism in science-policy formation. Fundamentalist Christians have charged atheist scientists with using biological evolution and human embryonic stem-cell research to promote a secular humanist agenda. Neoconservative economists and politicians have charged climate modelers with promoting socialist ideologies under the subterfuge of claims to sound science and proposals for dramatic changes in energy production and use. Scientists, progressive politicians, and ecological economists have retaliated with exposés of science distorted in the name of corporate and conservative political interests. In such circumstances distinctions that had been central to the practice of social responsibility in and with science—distinctions such as those between facts and values, scientists and politicians, omission and commission—appear increasingly frail if not philosophically indefensible.

Responsibility in philosophy

The turn to responsibility in philosophy, like that in theology, exhibits two faces: first, a reaction to the challenge posed by the dominance of scientific and technological ways of thinking; second, an attempt to take into account the rich and problematic complexity of technoscientific practice. The first is prominent in Anglo-American analytic discourse, the second in European phenomenological traditions.

According to Richard McKeon (1957) the concept of responsibility has diverse philosophical roots, one of which is the Greek analysis of causality (or imputability) and punishment (or accountability) for actions. As McKeon initially noted: “Whereas the modern formulation of the problem [of responsibility] begins with a conception of cause derived from the natural sciences and raises questions concerning the causality of moral agents, the Greek word for cause, aitia (like the Latin word causa), began as a legal term and was then extended to include natural motions” (McKeon 1957, 8–9). But it was in efforts to defend moral agency against threats from various forms of scientific materialism that the term became prevalent in analytic philosophy. For instance, H. L. A. Hart’s (1968) distinctions between four kinds of responsibility—role, cause, liability, and capacity—are all related to issues of accountability as they occur in a legal framework, where they can be used to articulate a theory of punishment that meets challenges posed by modern psychology.

McKeon’s general thesis is that the term “responsibility” appeared in late 18th and early 19th century moral and political discourse—as an abstract noun derived from the adjective “responsible”—in coordination with the expansion of democracy. But there are also numerous historical connections between the rise of democracy and the development of modern technology. On the theoretical level, the possessive individualism of homo faber, developed by Thomas Hobbes and John Locke, prepared the way for democracy and the new industrial order. On the practical level, democratic equality and technology feed off each other.

But the connection goes deeper. According to McKeon, responsibility was introduced into the political landscape because of a breakdown of the old social order based on hierarchy and duty, and the inability of a new one to function based strictly on equality and self-interest. Whereas the former was no longer supported by the scientific world view, the latter led to the worst exploitative excesses of the Industrial Revolution. To meet this crisis, there developed the ideal of relationship, in which individuals not only pursued their own self-interest but tried to recognize and take into account the interests and actions of others. Responsibility became respectability.

Something similar was called for by industrial technology. Good artisans, who dutifully followed the ancient craft traditions, were no longer enough, yet neither should they just be turned loose to invent as they pleased. Thomas Edison invented an electric vote-tallying device for a state legislature, only to discover that the legislators preferred the traditional non-automatic method; in response he resolved to eschew inventing what he merely thought someone needed without first consulting the relevant potential users about what they wanted. (Marketing had not yet been invented.) The new artisan must learn to respond to a variety of factors—the material world, the economy, consumer demand, and more. This is what turns good artisans into responsible inventors and engineers. As their technological powers increase, so will their need to respond to an increasing spectrum of factors, to take more into account. Thus arises what may be described as a duty plus respicere, to enlarge an agent’s circumspection (Mitcham 1994).

Another argument to this effect is provided by John Ladd who, in considering the situation of physicians, argues that the expansion of biomedical technology has increased the private practitioner’s dependence on technical services and undermined professional autonomy. Moral problems concerning physicians and society can no longer rest on an ethics of roles but involve the ethics of power, “the ethical side of [which] is responsibility” (Ladd 1981, 42).

The metaphysical elaboration of this concept of responsibility has taken place primarily in European philosophical traditions. Lucien Lévy-Bruhl’s treatise L’Idée de responsabilité (1884) is its starting point. As subsequently echoed by his student McKeon, Lévy-Bruhl begins by sketching the history of various aspects of the idea from antiquity to the late 19th century, and he is astonished that a concept so basic to modern morality should never have been subjected to systematic analysis. Following Lévy-Bruhl, responsibility can be described as manifest in a variety of ways across the whole spectrum of phenomena. There is responsibility or responsiveness at the level of physical matter, as atoms and molecules interact or respond to each other. Living organisms are further characterized by a distinctive kind of interaction with or responsiveness to their environments and each other.

Drawing on a similar ontological interpretation (although without reference to Lévy-Bruhl), Hans Jonas has explored implications for science and technology. Responsibility was not a central category in previous ethical theory, Jonas argues, because of the narrow scope of pre-modern scientific knowledge and technological power. “The fact is that the concept of responsibility nowhere plays a conspicuous role in the moral systems of the past or in the philosophical theories of ethics.” This is because “responsibility … is a function of power and knowledge,” which “were formerly so limited” that consequences at any distance “had to be left to fate and the constancy of the natural order, and all attention focused on doing right what had to be done now” (Jonas 1984, 123).

All this has decisively changed. Modern technology has introduced actions of such novel scale, objects, and consequences that the framework of former ethics can no longer contain them … No previous ethics had to consider the global condition of human life and the far-off future, even existence, of the race. These now being an issue demands … a new conception of duties and rights, for which previous ethics and metaphysics provide not even the principles, let alone a ready doctrine (Jonas 1984, 6 and 8).

The new principle thus made necessary by scientific knowledge and technological power is responsibility—especially responsibility toward the future. For Jonas, “responsibility today” is summarized in the statement that “care for the future of mankind is the overruling duty of collective human action in the age of a technical civilization” (Jonas 1984, 136).

Power conjoined with reason carries responsibility with it. This was always self-understood in regard to the intra-human sphere. What is not yet fully understood is the novel expansion of responsibility to the condition of the biosphere and the future survival of mankind (Jonas 1984, 138).

What for Jonas functions as a deontological principle, Caroline Whitbeck argues, can also name a virtue. When children are described as reaching “an age of responsibility,” it indicates they have become able to “exercise judgment and care to achieve or maintain a desirable state of affairs” (Whitbeck 1998, 37). Acquiring the ability to exercise such judgment is to become responsible in the sense of acquiring a virtue. In this way, discussions of responsibility have also been influenced by feminist arguments for an ethics of care or relationship that would complement the more common utilitarianism or deontology. At the same time, the term “responsibility” continues to name distributed obligations to practice such a virtue, derived either from interpersonal relationships or from special knowledge and powers. “Since few relationships and knowledge are shared by everyone, most moral responsibilities are special moral responsibilities, that is, they belong to some people and not others” (Whitbeck 1998, 39).

But is the notion of responsibility delimited by and thereby moderated according to social role really adequate in a technological world where all to some degree exercise the powers of technoscience through their support for modern scientific education and research or the utilization of and dependence on technological products, processes and systems? Is it not the case that all citizens in technoscientific society have become in some sense engineers and thereby unavoidably assumed responsibilities for public safety, health, and welfare?

Responsibility generalized

Traditional technics has been transformed into technology; the transformed process of making has in turn transformed the traditional lifeworld into what may be termed a techno-lifeworld. A key feature of the transformation in making is the conscious engagement with dimensions of reality unattended to in traditional technics.

Traditional technics engaged the material world through the unaided senses of touch and sight, of hand and eye coordination, taking into account only what is available to direct experience. In such a world responding to relevant aspects of phenomena did not need to be consciously conceptualized as responsibility; artisans naturally respond to fire with care, skillfully learn how to assess and stack stone or mold clay into stable configurations, and have been taught from the remembered injury or death of others to avoid breathing or ingesting poisonous substances.

Modern engineering and technology, by contrast, introduce into the making activity an engagement with phenomena via mathematically analyzable forces in a sensorium extended through instrumentation into chemical composition at the level of atoms and molecules; engineers depend on conceptually analyzed materials and calculated centers of gravity, pressures, flows, and resistances. Technology further thinks out its makings through systematic design or miniature construction that takes into account more than traditional technics was ever able to experience.

In the techno-lifeworld so constructed by the rational taking into account of more than directly experienced phenomena, it is not surprising that moral behavior likewise must move beyond the primacy of anything approaching natural intuitions. Moral conduct too has to become more conscious, more rational, and take more into account. Such is the burden of responsibility in the presence of technology. Consider three simple but archetypical examples:

First, giving birth: In the natural state in which many children die young it is not just permissible but also virtuous for humans to desire children in order to reproduce the species. In the techno-lifeworld, where the large majority of children survive and live to old age, the natural virtue gives rise to overpopulation. The desire for children must submit to a consciousness of the long-term consequences of unfettered reproduction in order to bring a natural desire under the guidance of rationally determined limits.

Second, eating: In the natural world where evolution and adaptation have over long periods of time established a balance between human tastes for available foods and the requirements of human activity, eating can be cultivated into a quotidian and festive art. Eating is disciplined without much thinking into healthy patterns by natural availabilities as well as daily work. When people become food rich through industrial agriculture and technologically manufactured chemical attractions while at the same time being freed from requirements of physical labor, the healthy meal becomes dependent on scientifically guided nutrition and dietary labeling. Healthy eating increasingly requires scientific research and conscious discipline.

Third, dying: In the natural state in which human death is easily defined by pulmonary or cardiac arrest, there is little difficulty in determining when a life has ceased. In the techno-lifeworld, where highly technologized medicine is able to intervene and provide artificial prolongation of pulmonary and cardiac functioning, humans have to develop a definition of death that depends on the instrumental appraisal of brain functioning. Death necessarily becomes a concept more than an experience.

Efforts to come to terms with such new dimensions of responsibility can be found not only in philosophy but in popular culture in the form of comic book super-heroes. Spider-Man is one peculiarly poignant example. Having been bitten by a radioactive spider in a science laboratory, Peter Parker becomes the possessor of great powers that bring with them great responsibilities. The result, as Parker later reflects, is that “the choice to lead an ordinary life is no longer an option.” Such is the burden of the new world of responsibility in the presence of technological powers.

Acknowledgements

Earlier iterations of this argument have appeared in Mitcham (1987 and 2005).

Bibliography

Akin, William E. 1977. Technocracy and the American Dream: the Technocrat Movement, 1900–1941. Berkeley: University of California Press.

Collins, Frank. 1973. “The Special Responsibility of Engineers.” In The Social Responsibility of Engineers, Annals of the New York Academy of Sciences 196 (10), edited by Harold Fruchtbaum, 448–450.

Funtowicz, Silvio O., and Jerome R. Ravetz. 1990. Uncertainty and Quality in Science for Policy. Dordrecht, Netherlands: Kluwer Academic Publishers.

Hart, H. L. A. 1968. Punishment and Responsibility: Essays in the Philosophy of Law. New York: Oxford University Press.

Jonas, Hans. 1984. The Imperative of Responsibility: In Search of an Ethics for the Technological Age. Chicago: University of Chicago Press.

Ladd, John. 1981. “Physicians and Society: Tribulations of Power and Responsibility.” In The Law-Medicine Relation: A Philosophical Exploration, edited by Stuart F. Spicker, Joseph M. Healey and H. Tristram Engelhardt, 33–52. Dordrecht, Netherlands: D. Reidel.

Layton, Edwin. 1971. The Revolt of the Engineers. Cleveland: Press of Case Western Reserve University. Second edition, with a new introduction, Baltimore: Johns Hopkins University Press, 1986.

Lévy-Bruhl, Lucien. 1884. L’Idée de responsabilité. Paris: Hachette.

Lowrance, William W. 1985. Modern Science and Human Values. New York: Oxford University Press.

McKeon, Richard. 1957. “The Development and the Significance of the Concept of Responsibility.” Revue Internationale de Philosophie 11 (1): 3–32.

Mitcham, Carl. 1987. “Responsibility and Technology: The Expanding Relationship.” In Technology and Responsibility, edited by Paul T. Durbin, 3–39. Boston: D. Reidel.

Mitcham, Carl. 1994. “Engineering Design Research and Social Responsibility.” In Ethics of Scientific Research, edited by Kristin S. Shrader-Frechette, 153–168. Lanham, MD: Rowman and Littlefield.

Mitcham, Carl. 2005. “Responsibility: Overview.” In Encyclopedia of Science, Technology, and Ethics, 1609–1616. Detroit: Macmillan Reference.

Niebuhr, H. Richard. 1963. The Responsible Self. San Francisco: Harper and Row.

Ravetz, Jerome R. 1971. Scientific Knowledge and its Social Problems. Oxford: Clarendon Press.

Sinsheimer, Robert L. 1976. “Recombinant DNA: On Our Own.” Bioscience 26 (10): 599.

Sinsheimer, Robert L. 1978. “The Presumptions of Science,” Daedalus 107 (2): 23–35.

Taylor, Charles. 2007. A Secular Age. Cambridge, MA: Harvard University Press.

Teller, Edward. 1947. “Atomic Scientists Have Two Responsibilities,” Bulletin of the Atomic Scientists 3 (12): 355–356.

Whitbeck, Caroline. 1998. Ethics in Engineering Practice and Research. New York: Cambridge University Press.
