Article from the book There’s a Future: Visions for a Better World

Paradise Lost? Paradise Regained? Nanotechnology, Man and Machine

Estimated reading time: 25 minutes
The history of the past five hundred years is as much a chronicle of mankind using machines as of mankind being changed by machines. When the British brought railways to India, it was to control a distant colony through the efficient movement of goods and troops. But the railways also spurred the breaking down of caste barriers, since all travelers had to occupy common small spaces. Nanotechnology, a frontier research area, operates at the atomic and molecular scale from which physical and biological properties arise. This will lead to the disappearance of the distinctions between the man-made and the living that we see today. Man and machine will fuse, raising some of the most difficult evolutionary and ethical questions of our history.

“Not well conceived of God; who, though his power Creation could repeat, yet would be loth Us to abolish, lest the Adversary Triumph”
John Milton, Paradise Lost, Book 9

Eve has eaten the fruit. Adam must now decide whether to join her in sin or live without her. He cannot imagine living without Eve. Is God’s warning a bluff or could Satan possibly be triumphant?

Nanotechnology gives the human ability to manipulate, control, and process at atomic and molecular dimensions. This capability has been gained rapidly in the past decades, and our knowledge and engineering at this dimensional scale continue to accelerate. Man can now change material properties for use in the physical environment quite effectively. Some examples: lighter-weight materials with increased strength used in vehicles or turbines; improved energy conversion or storage efficiencies, as in batteries and photovoltaics; increased safety through improved filtration of water, from heavy metals to bacteria; improved pollution control via catalysis for combustion effluents; communication ability that changes the feeling of time and space through mobile instruments; and an internet and computing environment that has made information more open and pervasive. Not long ago, typewriters abounded, often manned by pools of typists; information for most needs was accessed by phone, hand-written mail, or a personal visit to an office; and commerce involved transactions with carbon copies. The openness of information, easy communication while mobile, commerce of all kinds executed electronically — these are all physical changes that have taken place in a short period of about one human reproduction cycle, and they have changed how man conducts private and public life. Increasingly, the world practices high-frequency living, not unlike the money folks of Wall Street. The dependence and possessiveness that man feels for mobile connectivity and the Internet even hint that man and machine may be more enjoined than man and man. Man is being changed by machine.

This change, through creations in matter based on building blocks from across the periodic table and their use in the physical environment, is only the beginning. In the carbon-centric living world, we live longer because diagnoses are more sensitive: instrumentation such as MRI in its many forms, NMR, PET, or CT scans probe physically unreachable spaces; smaller versions probe localized regions; miniature cochlear implants and artificial joints made with strong materials aid the daily living of the elderly; and medicine is now more specific to the cause of the ailment because we have very precise imaging capabilities in many signaling forms — mechanical, electrical, magnetic, and optical across the breadth of the electromagnetic spectrum. These multidimensional eyes are capable of looking down through brain and body cross-sections, to tumors, and even down to single molecules and single atomic bonds. Arguably, a synthetic cell has been created. A better understanding of biological interactions, including the genetic underpinnings, has been important to these developments.

Nanotechnology gives the human ability to manipulate, control, and process at atomic and molecular dimensions. This capability has been gained rapidly in the past decades, and our knowledge and engineering at this dimensional scale continues to accelerate.

Life and living are vastly more complex, and we barely understand them. Both digital interactions, such as in the genetic code, and analog interactions, such as ionic triggering in synaptic interactions, underlie the fundamental behavior of the networks of interactions that make life stable in the presence of energy flow. Answers to difficult questions, such as what consciousness is, what the basis of various mental diseases is, or how we make decisions, are also getting within reach as instrumentation such as functional MRI, neural implants, and non-invasive electrical, magnetic, and optical tools helps us probe the brain. Learning from the living world, robots swim like fish, jump like insects, fly autonomously, drive cars like humans, almost achieve mammalian stability in walking and running over difficult terrain, recognize simple instructions by gesture or voice, and perform repetitive tasks better. Information machines are becoming capable of learning the rules or algorithms of response, action, behavior, and natural law by recognizing the patterns in data, unsupervised. Neural implants allow the brain to directly control physical actions in prosthetics. These changes, in the living, and in machines learning from the living, have harnessed the ability to elucidate and mediate atomic- and molecular-scale interactions through the practices of biology and nanotechnology. Man’s outer and inner workings are being understood by man and machine cooperatively.

This pattern of rapid change remains unending. Man-machine cooperation can certainly extend, with humans harnessing machines to ameliorate diseases or health defects, or simply for repetitive menial tasks. With the pace of learning, and machines learning on their own, the day is not far, possibly by mid-century, when distinguishing between man and machine may amount to philosophical nit-picking. Machines, for all practical purposes, would pass Turing’s test of intelligent behavior, in which an object behind a curtain cannot be reliably identified as machine or human through a conversation in language. The machine would be capable of building intelligence, drawing deterministic logical conclusions, asking questions, probing connections both short and long, and reasoning with confidence of the non-deterministic variety: emotional, contextual, and ready for surprises. This machine would have human-like behavior. If achieved, this non-carbon form of machine would also be capable of becoming a third form, a silicon-human, where the “persona” of the finite life of the carbon-based form is imbued into the non-carbon-based form to continue the “living.” Man and machine may fuse. Or, as most proponents in the artificial intelligence community argue, man is a machine, so now machine forms fuse.

This is the “Garden of Eden” of the mid-twenty-first century. Is this cataclysmic? Paradise lost? Paradise regained? Should Adam join Eve in this sin or live without her? Is Adam’s reasoning rational, that God may be bluffing, for He would never kill him, nor would He want Satan to triumph? Such is the dilemma of the future as nanotechnology and biology progress. Like Adam, we will have to find our way. Do we find a path that is not black or white but gray? Is gray, weighing the odds, too far from the practices of Wallace-Darwinian evolution through probabilistic events of mutation and the survival of the fittest?

This trajectory raises many questions and challenges for the world we inhabit and for our place in the universe. It goes beyond simple scientific, economic, social or cultural changes and the issues they raise. It affects humanity and its belief in its own uniqueness at its very core. Francis Bacon’s remark, “Nature, in order to be commanded, must be obeyed,” is particularly true for this path that technology may take. We must not destroy what we cannot create.

Even predicting the past is difficult. Questions such as when humans first acquired language, how many different migrations took place from Africa, or even when and whether the events described in ancient texts — the Rigveda or the Egyptian Book of the Dead, let alone older stories such as the Kesh temple hymn or the Epic of Gilgamesh — took place are continuously subject to revision. Predicting the future is even more uncertain. Nature follows a non-deterministic, probabilistic path under random and deterministic influences. With this caveat, let us explore the possibilities of the future given the developments in the science and engineering of the nanoscale as they relate to the physical world, the life sciences, and through them to humanity and the world. Much of nanotechnology’s progress has been in the physical world, the one we create through the diversity of materials, natural and created, at our disposal. But much is starting to happen in the natural living world, and the time is nigh when the two will increasingly connect.

Scientists and engineers appear to be at their creative best between the mid-twenties and late thirties of their lives, perhaps because during these years they are up to date in the technical wizardry of their profession and have not yet accumulated the baggage of the other responsibilities of profession and life.

What might an infant born today see as an active professional, and what questions might she have to grapple with? I will call this the creative cycle time, and it is the timescale that I will explore.

One of the spectacular successes of physical engineering with reduction in dimensions is the variety of compact tools we employ in daily life — phones that provide near-instantaneous communication connectivity and quick answers to queries of a static nature (finding directions, places for activities) and ease of commerce or physical planning for the joys of living (financial transactions, buying and selling necessities, planning travel, or listening to personal musical favorites). Miniaturization has also benefited humans through healthcare. We spend less time in hospitals as a consequence of the reduced invasiveness of procedures, e.g., endoscopic procedures. Small, sensitive instruments, rapid data transfer, robots, and cell phones provide clever ways to make diagnosis and treatment, even physical procedures, possible remotely. This has relevance to both affluent and deprived communities. Miniaturization reduces costs and allows an expert to be more distant and more diversely connected with the community. Inexpensive diagnostic kits, even paper-based ones (Vella et al. 2012), coupled to the transmission of test results via camera-based cell phones to specialists, let one reach the remotest of communities, since mobile phones reach into all communities. One just needs to focus on inexpensive test kits, utilizing nanoscale sensitivities, that provide a number of common tests simultaneously, and on their availability through the rural stall. Such procedures are particularly useful for common ailments — malaria, cholera, malnutrition, and others. Advanced hospital instruments — NMR, MRI, CT and PET scanners, confocal microscopes, even X-rays — that are useful for the more challenging diseases of cancer, brain, joints, tuberculosis, and others are also subject to miniaturization (Sun et al. 2010; Spector 2010). But, because these do require specialized knowledge to operate, they may be made available in district hospitals.

The rich benefit from these trends too; after all, healthcare is a large economic cost to society. Self-tests and less use of expensive infrastructure and human expertise are constructive avenues for reducing these healthcare costs. There are a number of procedures that machines do better. Artificial joints require careful surface preparation and alignment and are best installed robotically. Hard-to-reach places and small features, e.g., the prostate, are best handled by machines such as da Vinci. These robots will proliferate. Machines such as Dr. Watson of IBM answer questions based on accumulated facts fed to them and an ability to parse natural language. Such machines, even today, should be capable of much of the systematizing that a physician does. Machines should be able to analyze rapidly, draw inferences employing reasoning systems to determine the information needed (genotype, phenotype, microbiomic, and epigenetic), acquire it, and act on it using probabilistic reasoning just as the specialist does. With the learning capabilities that they will acquire, machines will become adept assistants to physicians initially, then replacements for most common tasks, and finally specialists. These are examples where the machine acquires more and more capability as smaller elements proliferate and provide the means to acquire and assimilate data and to sort out the important patterns buried in it.

Another consequence, particularly beneficial to the infrastructure that makes our life easy and social, is a plethora of tools that collect time-sequenced data — sensors that acquire information on bridges, on traffic density and patterns, on geological activity and the environment (as exemplified in tsunami prediction), or the time-stamped health records of the living, and that look for patterns and trigger activity as a safety response.

The smallness itself still has considerable intellectual distance to go. We will put nature’s rules to work more efficiently in the physical world. We will exploit fundamental physical phenomena, such as that an electron carries a single quantum of charge, or that magnetic flux has a corresponding flux quantum, to create forms of digitization that are much more efficient, where information is coded right at the source of the data in a more efficient form. Indeed, we should be able to exploit chaos and fluctuations, by coupling to nonlinearities, to achieve new devices with much higher sensitivity than is currently possible, for uses such as noise-activated mechanical sensors or transmission bands for more efficient data movement. We will learn to transmit energy across the electromagnetic spectrum efficiently and wirelessly by using nonlinearities, as well as we do along metal wires — balls of energy being transmitted much as tsunamis move long distances without losing their rise and fall. We may even practice rudimentary forms of teleportation, and certainly secure forms of communication using the principles of quantum mechanics. We will learn to make medical instruments smaller and smaller, in the process allowing negligibly invasive surgery in which the physician will be able to see what she is doing while the organ, e.g., the heart, is still working in the midst of the procedure.

These changes, in the living, and in machines learning from the living, have harnessed the ability to elucidate and mediate the atomic and molecular scale interactions through biology and nanotechnology practices. The day is not far off when the distinguishability between man and machine may even be philosophical nit-picking.

Easily visible as one consequence of this large accumulation of data through all these means is the increasing loss of privacy, and the ability of private enterprises to trawl, and of governments to acquire, data legally and illegally, just because it is there or because of intent, and because most individuals do not know how to protect themselves. This has a bipolar effect. Financial transactions can be followed, and the roles of corporations and individuals in financial events identified; the center of communications, such as in a network of terror, can be identified. But, in the same way, transactions can be faked, and legitimate democratic protests squelched.

The common consequence underlying these activities is the generation of large amounts of data. Even within a decade, this data agglomeration, across many different collections, will reach yottabytes or more. A yottabyte is 1 followed by twenty-four 0s of bytes, a byte being 8 binary digits, or bits. A yottabyte corresponds to about 80 binary digits (2^80 bytes is a yobibyte) in the number representation that machines employ. As a reference point, today’s (2012) thumb drives carry at most 32 gigabytes, i.e., 32 followed by nine 0s of bytes. Data, as discussed here, has information buried in it, or even, at a higher level, knowledge, which I interpret as the connections represented in the information. Much of this data is superfluous and irrelevant. The same temperature, say a room temperature of 20°C, can be written as 20 in decimal form (to one-degree-Celsius accuracy), or as 10100 in the binary form that the logic of digital electronics employs. Digital electronics uses computational engines with precision in lumps of 64 bits these days, or 128 bits in rare instances, to represent data accurately. After all, data can be very large, theoretically infinite, and it needs to be represented accurately. Much of this precision is of no use in the case of this temperature. On the other hand, for other forms of data we do not have enough precision. A hundred billion galaxies, each with a hundred billion stars, amount to 1 followed by twenty-two 0s of stars in our universe. Let us assume that in a picture each star is just one spot, i.e., a pixel. Now suppose we want to store several pieces of data: the intensity of light as a function of wavelength at each of these pixels. We need to identify the object, so there is data for that, and let us say we wish to capture, at nanometer precision, the optical spectrum over a reasonable wavelength range — about 1 followed by nine 0s of data points, because this information helps us identify some of the materials in the star and what happens to the light as it traverses the universe before reaching us. We have just formed 1 followed by thirty-one 0s of data points, where each star is just a pixel and only a limited part of the electromagnetic spectrum has been captured. Immense data, and buried in it is information related to events that happened in the star millions to billions of years ago, all for only an instantaneous snapshot! This is more than ten million yottabytes of data, and we haven’t really described much of what is happening in the universe in time or with any precision, since each star is only a pixel. One could look at this problem another way, in our own vicinity. Suppose we want to use the techniques of measuring strain, through deformation, to assess when deformations accelerate, leading to the catastrophic failure of a bridge. Let us collect data through about 10 000 autonomous sensors on the bridge, sample them every second over a year (a year has about 32 million seconds), measure expansion in three directions, and record some ten characteristics, such as temperature, at each identifiable sensor. This is 10 000 x 32 000 000 x 3 x 10, or about 1 followed by thirteen 0s of values. In a country like the USA, there is about one bridge every fraction of a square mile, i.e., about 10 million bridges, and this creates 100 billion billion (1 followed by twenty 0s) pieces of data. Just data collection on static bridges amounts to this much in one country.
Imagine what the data looks like when four cameras at every intersection produce 10 million pixels of data each, every second, in a country like England, which likes to keep an eye out, or with the eavesdropping and snooping that goes on at the hands of companies and governments.
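The arithmetic behind these estimates is easy to check. Here is a short back-of-envelope sketch using the figures quoted above; the bridge count and per-sensor channel counts are the rough estimates from the text, not measured values:

```python
# Back-of-envelope check of the data volumes quoted in the text.

# Stars: ~100 billion galaxies, each with ~100 billion stars, one pixel per star.
stars = 100e9 * 100e9                  # ~1e22 stars
spectral_points = 1e9                  # ~nanometer-resolution optical spectrum
star_data = stars * spectral_points    # ~1e31 data points

YOTTABYTE = 1e24                       # 1 followed by twenty-four 0s of bytes
print(star_data / YOTTABYTE)           # ~1e7, i.e., "ten million yottabytes"

# Bridges: 10,000 sensors sampled every second for a year (~32 million seconds),
# 3 strain directions and ~10 recorded characteristics per sensor.
per_bridge = 10_000 * 32_000_000 * 3 * 10   # ~1e13 values per bridge
bridges = 10_000_000                        # rough count for a country like the USA
total = per_bridge * bridges
print(total)                                # ~1e20, i.e., "100 billion billion"
```

The point is not the exact digits but the scaling: each extra factor — spectral resolution, sampling rate, sensor count — multiplies the total, which is why agglomerated collections reach yottabytes so quickly.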

The challenge in this proposition is that while data is sacrosanct, as all scientists and engineers are taught, data is not information, nor is it knowledge, which puts information in perspective. Data is subject to errors. A decade ago, we found that measurements of ocean temperature had a systematic error over a particular period because of the specific satellite sensor approach employed. How does one then compare this data to another set acquired through another approach, and to the accumulation of such data? If one measures the temperature of the solid surface of the earth to the fifth decimal place every mile, is it as meaningful as first-decimal-place data every 0.1 mile, or as data with no decimal digits so long as we measure it at the solid surface and also all the way down a mile into the oceans? The last is as much data as the second, which is 100 times more than the first. However, the last provides depth data in the ocean, an area three times that of the solid surface, whose energy movement through ocean currents is central to earth’s energy flow and hence to global warming. Any weather prediction based on the first, even with its immense precision, is certifiably wrong.

This discussion points to an essential point: data is not information, and information alone is not knowledge. Each is a higher, more actionable form that one can work with more efficiently. One can create a lot of meaningless data, e.g., by higher precision, where the extra accuracy is meaningless if all the other inaccuracies — of the model, and of other data and information — are large. We are interested in actionable inference through knowledge. This needs a different perspective. We need to answer queries such as: what is important? What is important in making robust judgments that one can act on? When we cross a street with traffic, there are numerous judgments that we make. What will the driver do? The answer probably depends on what kind of person is driving the vehicle. We cannot judge that rapidly, so we employ heuristics: visual cues, whether the driver is a man or a woman, the vehicle the person is driving, the region one can read from the license plate, and the traits we associate with these cues. We start making guesses, sometimes right, sometimes wrong, in which a number of characteristics are projected onto the person and the vehicle to make a judgment. As a person who lives in Ithaca, a small upstate New York town, really more a village, I behave differently in September, when all the new young folks from the urban areas of Long Island and New Jersey arrive, than in March or April, by which time they have some experience of the expectations of a rural college-town community. The influence of environment and time thus also enters the decision-making.

We will learn better ways to handle such incomplete problems as we move away from the deterministic style of data processing toward a non-deterministic approach of information manipulation and knowledge extraction. The proliferation of nanoscale physical elements, and of data, will force a new direction in information science — a move toward the finding of short- and long-range connections, the development of a theory of networks, irreducible representations, patterns, and robust answers, as close to the point of data collection as possible, so that nanoscale developments can be harnessed robustly at very low energy. What this implies is that hardware, until now deterministic and binary-digitized, will place much more emphasis on probabilistic approaches. Machines will analyze the gathered data autonomously to learn from it, see whether predictions come true, and thus learn which collections of data connect to which inferences. Such machines will be inherently safer — less prone to misbehavior even if the people operating them are incompetent. Intelligence will gradually be imbued into the machine as a result of the greater confidence and robustness accrued from predictions that increasingly come true. Information and knowledge extraction and accumulation at the source, agglomeration of this knowledge from multiple streams, learning and acting on it, and evolving, i.e., changing oneself based both on the knowledge and on changes taking place within oneself, will be the themes of physical machines operating with increasing knowledge efficiency and reduced energy consumption in executing inferences and conducting tasks. Computers, robots, and the like will become difficult to recognize as the static forms they currently have change. They will be able to talk to us, ask questions to get data, extract information from it, and thus build the knowledge that will make them smarter.
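One way to picture confidence accruing from predictions that come true is a simple Bayesian update, in which each fulfilled prediction raises the machine's belief in the rule it has inferred. A minimal sketch; the prior and likelihood numbers are illustrative assumptions, not from the text:

```python
# A machine's confidence in an inferred rule, updated as predictions come true.
# Illustrative likelihoods: a valid rule predicts correctly 90% of the time,
# an invalid one succeeds half the time by chance.

def update(prior, p_true_if_valid=0.9, p_true_if_invalid=0.5):
    """One Bayes update after observing a prediction that came true."""
    evidence = prior * p_true_if_valid + (1 - prior) * p_true_if_invalid
    return prior * p_true_if_valid / evidence

confidence = 0.5             # start undecided about the rule
for _ in range(10):          # ten predictions in a row come true
    confidence = update(confidence)

print(round(confidence, 3))  # confidence approaches 1.0
```

Each success multiplies the odds in the rule's favor, which is the probabilistic, non-deterministic style of inference the paragraph describes, as opposed to a single deterministic pass over the data.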

This change in itself raises many profound questions. When is this machine, in its actions, distinguishable from a human? In its intellectual capabilities, in probing, in finding connections, in finding patterns, in drawing conclusions, it should have capabilities that will exceed the human median. After all, even simpler data-mining and pattern-recognition approaches have allowed Dr. Watson to win a challenge of answering factual questions, and computers to win against chess champions. Can this machine be a liar, a conniver, a conservative, a liberal, a warmonger? Can it have emotions? It certainly appears to me that this should be possible. These are characteristic responses based on accumulated experiences and innate tendencies, all mathematically representable and therefore machine-programmable to extract from experiences.

The other major characteristics of the living are metabolism and replication. We need energy flow to be dynamically stable. We reproduce. The former is certainly true of the machine: it needs energy to work, and if the energy is reduced, it can employ mechanistic techniques to shut parts of itself down or to slow itself to work with less. These are quite elementary tricks that a machine learns easily. Reproduction is harder, but certainly possible. Hardware can program components to create new versions of hardware. The separation of hardware and software is a construct that has arisen because, at least until now, it has eased the process of creating machines. My belief is that if it is possible in theory, it is possible in practice, and that this will be a main approach within the creative cycle period.

This theory of a self-replicating, energy-consuming machine — a self-reproducing automaton — was originally described by von Neumann at the start of the modern computing era. The automaton requires a few different parts. One collects the resources of materials and energy to process and to execute the production. Another duplicates instructions by passively copying them and passing them on to the first part. Both these parts receive instructions that control them in the action and the copying, and this controller keeps its received copy of the instructions for its own cumulative knowledge. Finally, the automaton has another unit that contains the specification — the building code, the design principles — that makes the other three perform as a self-reproducing unit. In present computing machines, the last is the software, and the machine processes data under control guided by the software. Software can reproduce, but we normally do not reproduce hardware. There is no fundamental barrier to it, however.
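The essential trick in von Neumann's scheme is that the same description is used twice: interpreted by the constructor to build the offspring's body, and copied uninterpreted to become the offspring's own description. A toy sketch of the four parts, a caricature rather than a faithful model:

```python
# Toy sketch of von Neumann's self-reproducing automaton.
# The description plays the role of the software: it is both executed
# (to build a body) and duplicated verbatim (to equip the offspring).

class Automaton:
    def __init__(self, description):
        self.description = description       # the "building code"

    def construct(self):
        """Universal constructor: build a body by interpreting the description."""
        return Automaton.__new__(Automaton)  # a bare body, no description yet

    def copy_description(self):
        """Copier: duplicate the description without interpreting it."""
        return str(self.description)

    def reproduce(self):
        """Controller: run the constructor and the copier, assemble the child."""
        child = self.construct()
        child.description = self.copy_description()
        return child

parent = Automaton("BUILD COLLECTOR; BUILD COPIER; BUILD CONTROLLER")
child = parent.reproduce()
grandchild = child.reproduce()   # the offspring reproduces in turn
print(grandchild.description == parent.description)  # prints True
```

The separation the essay describes is visible here: `description` is the knowledge (as DNA is in the cell), while `construct`, `copy_description`, and `reproduce` are the machinery that executes and duplicates it.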

In biology, this software and hardware machinery is intertwined. Proteins, and there are many of them, perform the specific tasks of taking apart and assembling what the living needs. Quite a bit of the body is protein: collagen throughout the body, other proteins for blood vessels or lung tissues, others for transporting products or metabolizing sugar, and so on. The ribosome, itself built of several RNA chains and proteins and aided by transfer and messenger RNAs, is the controller essential to the synthesis of proteins — highly complex, long-chain molecules whose action, reaction, and behavior we are still trying to understand. The linear DNA code is what this machinery converts into proteins: a transcription step synthesizes messenger RNA, and a translation step at the ribosome then creates the protein from the messenger RNA. Biology, as described here and as currently understood, is certainly an automaton. The physical and knowledge parts of the machine are separated within the cell into the DNA, carrying the knowledge (the software code), and the rest, the physical machinery executing the code. So one can see that the Emergent Machine certainly takes on many characteristics that we usually associate with the life sciences. The questions this brings up are diverse.

One set of important questions will relate to learning and to awareness of what we observe all around us. How does this system’s dynamics relate to information manipulation, its organization, and its long-term existence through itself and through reproduction? What causes certain conditions to be more stable than others? In a flowing stream (Dyson 2007), when a child pokes a stick into an eddy, the eddy is lost. Remove the stick and the eddy forms again. The child disturbs it again; again it forms. This is the fascinating game of life, where life exists as a resilient pattern in energy flow. Imagine the network of airplanes and airports that moves human beings around. There is a snowstorm in Chicago. Suddenly, travelers everywhere in the USA are affected, even those travelling from Miami to San Francisco, with flights cancelled and delayed. It could be because the plane that was to arrive by way of Chicago did not come, or that it was rerouted to a needier route, or some other set of connected effects. But the system is adaptive: once the storm passes, the network that moves humans around is restored. Nature is a complex dynamic system, we are a complex dynamic organization, and so is this connection of machines.

These characteristics are related to stability, appearing as emergent phenomena in the presence of energy flow. With the flow of energy, machines learn and evolve, increasing their capability and efficiency, and in the process achieving robustness in their inference capabilities in the presence of uncertainty. In this process they become autonomous systems capable of understanding, elucidating, even predicting outcomes of complex problems with some confidence: how do all the interconnected causes, effects, and relationships lead to global warming? How do different actions lead to different economic effects that cascade through society? Our Emergent Machines will be able to tackle these.

This complex-system theme, i.e., one where a large number of interconnected parts come with their strengths and weaknesses of beliefs and of connections, exemplifies the complexity of the world in which we dwell. With the nanoscale, it is this large interconnectedness of heterogeneous components that comes to pass. There is a diversity of sensors measuring medical, environmental, human, financial, economic, and social characteristics, connected together in a global system. This complex system needs to be robust, so that even as elements break in use and new ones are generated and connected into the network, sanity and predictable behavior prevail. This robust operation of interconnected physical systems, central to our comfortable living, will be one of the achievements of the coming period.

Examples of biological complexity that we now understand include gene regulatory networks controlling cellular differentiation in development and tissue repair; the reestablishment of tissue structure and function following a substantial loss of tissue mass; and the processing of visual information in the cortex. The principles these point to are the rewiring of modules (genes), high connectivity of components (adaptability and fault tolerance), feedback and feed-forward, and gradient-driven processes. That is, we are starting to understand the commonality of principles between the biological world and the principles needed as physical machines become as complex as biological machines are. For example, 100 neurons can now easily be assembled on a semiconductor chip. This gives us the rudimentary capability to explore the interesting, emergent behaviors that arise even in such small-scale assemblies.
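A hint of what even a hundred highly connected units can do is given by a classic toy model: a network of 100 threshold neurons with Hebbian feedback weights stores a pattern as an attractor and restores it from a corrupted input. A minimal Hopfield-style sketch; the model and its parameters are illustrative, not from the text:

```python
import random

random.seed(1)
N = 100
pattern = [random.choice((-1, 1)) for _ in range(N)]   # the stored "memory"

def recall(x):
    """One synchronous update with Hebbian weights w_ij = p_i * p_j."""
    overlap = sum(p * v for p, v in zip(pattern, x))
    return [1 if p * overlap > 0 else -1 for p in pattern]

# Corrupt 20 of the 100 units, then let the network settle.
noisy = pattern[:]
for i in random.sample(range(N), 20):
    noisy[i] = -noisy[i]

recovered = recall(noisy)
print(recovered == pattern)   # prints True: the attractor restores the memory
```

The fault tolerance comes from the high connectivity: no single unit holds the memory, so flipping a fifth of them still leaves the collective state inside the attractor's basin, much like the adaptability the biological examples above exhibit.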

These instances exemplify the capability made feasible by molecular- and atomic-scale control of the physical. Over the past decade and longer we have applied several such techniques to understand and control atomic-scale phenomena through atomic manipulation, for example using small cantilevers to pick up atoms and molecules, to place them elsewhere on a surface, and to construct physical objects from them. We know how to trap single molecules using optical tweezers so that we may characterize them in detail.

We know how to measure many of their properties with exquisite precision through ultra-sensitive quantum-interference devices. We have also used similar techniques to build microscopic worlds in which to assess behavior. A laboratory-on-chip today allows us to carefully observe chemicals, macromolecules, cells, bacteria, and more in a small environment under controlled conditions. This lets us understand how bacteria communicate and collectively respond (quorum sensing), how different chemicals affect living organisms (safety), how proteins fold, and how some of the very complex phenomena of living happen.

It is not too difficult to visualize how these same techniques will let us experiment in parallel with the randomness and mutations inherent in nature's processes, understand the causes and effects, and select the most beneficial of the mutations. That is, instead of using the lifetime cycle, as our living world does in its Wallace-Darwinian survival of the fittest, we conduct these processes in parallel, faster, selecting the most useful outcomes for ourselves. We will be able to explore and utilize this evolutionary process through this post-Wallace-Darwin synthesis machine.
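The mutate-in-parallel-and-select loop can be sketched abstractly. The binary "genomes," the fitness function, and the single-flip mutation model below are illustrative assumptions, not any real assay:

```python
import random

def evolve_parallel(pop_size=64, generations=40, seed=7):
    """Toy parallel evolution: mutate an entire population at once and
    keep the fittest variants, rather than waiting out one organism's
    lifetime per change as natural selection must."""
    rng = random.Random(seed)
    target = [1] * 20                       # stand-in for a desired trait
    fitness = lambda g: sum(a == b for a, b in zip(g, target))
    # Random starting population of binary "genomes".
    pop = [[rng.randint(0, 1) for _ in range(20)] for _ in range(pop_size)]
    for _ in range(generations):
        # Every genome is mutated in parallel (one random site flipped)...
        variants = []
        for g in pop:
            m = g[:]
            m[rng.randrange(len(m))] ^= 1
            variants.append(m)
        # ...and selection keeps the fittest of parents plus variants.
        pool = sorted(pop + variants, key=fitness, reverse=True)
        pop = pool[:pop_size]
    return max(fitness(g) for g in pop)

best = evolve_parallel()
print(best)  # fitness of the best surviving variant (20 is the maximum)
```

The point of the sketch is the compression of time: many lineages are varied and judged simultaneously each generation, which is what the essay's "post-Wallace-Darwin synthesis machine" would do with real molecules rather than bit strings.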

Its power is the ability of a machine to do compact, systems-scale experiments on many interlinked active components: many changed genes coupled together, metabolic engineering, photosynthetic proteins, and numerous others.

Constructing small and large molecules will be a precursor to this Evolution Machine exploring and constructing answers to a variety of society's problems and diseases, with evolution as a constructionist addendum to the optimization practiced by engineering. Genetic changes will become practical corrections for diseases with genetic causes, e.g., color blindness or even blindness. It would be possible to endow characteristics man does not currently possess, e.g., sensitivity of the eye to the infrared region of the electromagnetic spectrum, as in most nocturnal animals, or to a lower- or higher-frequency audio spectrum in the ear, as in bats and dogs. It will be capable of building complex genomes with properties useful to society. The use of enzymes in detergents is a good example of the utility of protein approaches to manufacturing at very low cost. Vaccines are another important one, though low volume and high cost. But such examples are very few. Today's protein synthesis uses solid-phase synthesis, peptide ligation, in vitro translation, non-ribosomal pathways, and cell-based systems, all slow and low-volume, except in the case of detergent enzymes. Scalable, controlled methods of protein synthesis through these Evolution Machines would enable new classes of proteins to be produced at large volumes and low cost, both for biomedical use, e.g., diagnostics, therapeutics, and vaccines, and for industrial use, e.g., catalysts and self-healing materials.

Microorganisms, the simplest of organisms, are also likely to be the easiest to modify robustly through multiple experimental evolutionary genetic changes. Modify the photosynthetic machinery of an organism to couple strongly to fuel production through metabolic engineering, and we will be able to improve the efficiency of biofuels. This modification will again require multiple genomic changes, so that the system as a whole remains robust, balancing its multiple existing pathways, while still gaining efficiency in a photosynthesis process currently based on photosystems I and II, which convert light energy to chemical energy in plant molecules.

If the genetic language has truly been decoded, and the synthetic cell created (Gibson 2010), a useful engineering direction is to manipulate and program cells so that they can be made into efficient living foundries: factories at cell scale. These would be particularly useful for high-volume, rapid vaccine creation, and possibly for finding new methods, such as proteins and autoimmune approaches, to fight the emerging antibiotic-resistant diseases.

Many cellular functions are carried out by organelles, which perform functions within a cell similar to those of organs in the body. As examples: mitochondria make adenosine triphosphate (ATP), the cell's energy-transport chemical; spliceosomes, complexes of specialized RNA and protein subunits, splice the RNA transcripts from which proteins are made; the Golgi apparatus packages proteins inside cells before moving them; and the cytoskeleton, a cellular scaffolding, generates the forces behind cell locomotion and muscle contraction. What we do not understand is how this machinery inside the cell works in detail. In our present device technologies, we employ large-scale in vivo sensing and drug or gene delivery to modify cells. We should be able to design ultra-compact nanoscale systems to understand processes at the cellular level: how nuclear pores work, measuring the ionic cellular machinery as it operates, and following the longer-timescale metabolic pathways. This would lead to engineered particle-based approaches for redesigning and augmenting cell functions, including ATP generation and extending the electromagnetic range over which cells respond, thus enabling the tracking and manipulation of individual cells. The result would be techniques for restoring lost functions to cells, tissue and wound repair, bionic blood transfusions, cell-based sensors, and camouflage. Specificity in these approaches will let us tackle the myriad cancer forms and tumors effectively.

One of the spectacular successes of physical engineering through reduction in dimensions is the variety of compact tools we employ in daily life. Miniaturization has also benefited human healthcare.

Evolution Machines will also allow us to explore methods for modifying plants. Plants, including single- and multi-celled algae, are perhaps a simpler and safer avenue for evaluating bioengineering before embarking on the more ambitious task of complex organisms. The Evolution Machine gives us a chance to modify plants for increased topsoil creation (a natural way to sequester carbon); to produce energy efficiently by creating plants in which the enzymatic conversions are easier; to develop food-source plants that use less water; to modify plants that provide humans with a diverse diet; to create plants that clean up fertilizer run-off in aquatic systems; and to allow energy conversion through changed photosynthetic processes.

One would be able to explore further, beyond the immediate problems humanity faces. Given a functional need, the Evolution Machine would allow one to create methods of manufacturing that are most fit to survive, in the same way an organism is, with properties that are useful. It would be scalable: it could produce one product or millions. It would be adaptive: both product and process would work within the constraints. It would be repairable: self-healing, correcting errors as it encountered them. And it would be self-building: it would build the tools and assemblies needed for its tasks.

These streams of speculation, of the machine as a parallel of living form and of the ability to mimic and engineer nature to create new life forms, naturally lead to the more profound question: where does the inanimate end and the animate begin? The next fifty years will confound us by eliminating this as a normative question.

It will simply not be possible to distinguish between man-made and nature-evolved forms in the creative cycle time.

In this past decade, neural implants, prosthetics, and wired and wireless coupling for disease amelioration have demonstrated important successes in coupling information processing to the living world. Implants can provide rudimentary gray-scale vision in some eye diseases;2 depression can be controlled by optogenetics (Deisseroth 2010), the local exposure of genetically targeted brain cells to light; epileptic seizures appear to follow many hours of precursor electrical activity that can be monitored (Litt 2001), so presumably one can act before the event; cochlear implants improve hearing; and prosthetics allow people to walk and, under neural control, to perform simple tasks such as peeling bananas or drinking from a glass. Many of these are based on nerve signaling. The use of simple mathematical correlations in prosthetics is now well accepted. These are all examples of man-machine fusion in very rudimentary form. But all this will change dramatically when the young of today grow up with integrated technical knowledge of the physical and the living worlds, and the symbiosis proceeds apace. They will know how to exploit rational trans-differentiation, using gene regulatory networks and gene-interfering approaches to reprogram a differentiated cell for use in physical machines. The physical machines themselves will know how to incorporate themselves into a cell and work with the complex forms created together through the fusion.
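The correlation approach used in prosthetic decoding can be sketched on synthetic data. The linear rate-to-speed tuning model and every parameter below are illustrative assumptions, not measurements from any real device:

```python
import random

def decode_by_correlation(trials=200, seed=3):
    """Toy correlation-based prosthetic decoding: fit a linear map from a
    neuron's firing rate to intended movement speed on synthetic training
    trials, then invert the fitted map to decode a new firing rate."""
    rng = random.Random(seed)
    # Assumed tuning for the synthetic data: rate = 10 + 2 * speed + noise.
    speeds = [rng.uniform(0.0, 5.0) for _ in range(trials)]
    rates = [10.0 + 2.0 * s + rng.gauss(0.0, 0.5) for s in speeds]
    # Least-squares slope from the sample covariance and variance.
    n = trials
    ms, mr = sum(speeds) / n, sum(rates) / n
    cov = sum((s - ms) * (r - mr) for s, r in zip(speeds, rates)) / n
    var_s = sum((s - ms) ** 2 for s in speeds) / n
    slope = cov / var_s
    intercept = mr - slope * ms
    # Decode: invert the fitted map to estimate speed from a new rate.
    decode = lambda rate: (rate - intercept) / slope
    return decode(20.0)  # a 20 Hz rate should decode to a speed near 5

print(round(decode_by_correlation(), 2))
```

Real decoders correlate many neurons against many movement variables at once, but the principle is this simple statistical mapping between firing and intent.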

Using the Emergent Machine's advanced hardware and advances in neural assemblies, it will be possible to make radio-telepathy, the use of wireless connections to neurons for human activity, feasible. Through the learning of neural processes, the effects of brain-based diseases will be removed and cognition repaired, bringing relief to individuals and families suffering the consequences of debilitating conditions such as autism, Parkinson's, or Alzheimer's. Such non-invasive and invasive brain-computer interfaces, taking advantage of brain plasticity, would be a major engineering accomplishment for humans.

One fascinating thought along this speculative path: if we really do have a good model in a machine of the human construct, its emotions, personality, and experiences, then we also have the means to have it live in a machine, where it would be indistinguishable in its responses, as in the Turing test, from the real form (Dyson 2007). In the case of Alzheimer's disease, loss of brain connection or function, or even death, one could "live" in a silicon-human form. The program is installed in a robotic body, and it takes over the life and responsibilities of the human.

This leaves us with breathtaking questions. When does it end? That is not a question I can comfortably answer. I do believe that two facts of life will not change. Science will remain unpredictable, with new ideas and opportunities constantly arising. Engineering, with us as tool-making animals, will remain central to exercising our creativity. It is just that this creativity is slowly moving to the domains of physical and biological complexity. Fascinating questions will remain. Among those likely still standing after the period I have discussed are:

  1. Will we understand why life is so immensely complicated? Will we understand living systems in a deep sense: stable, complex, dynamic? Will we understand emotions: romantic love or sadness? Will we understand the development of skills in infants or the entangled play of moods, emotions, learning and understanding in the species? Will we be able to construct a mathematical predictive model of these? For example, will we be able to provide a complete mapping of the DNA “software” to the species? Today, we cannot even answer the question: why does a simple genetic modification change the species completely?
  2. The capacity to feel and relate to other people gives humans unique capabilities when integrated with their intellectual activity. Will physical machines have this consciousness, free will, self-awareness, emotions, feelings, personalities, etc., the emergent characteristics we associate with carbon life? In what way will they differ from carbon life? Will they suffer diseases such as depression or alcoholism?
  3. Are we humans going to get smarter, or will most of technology be used for repair? Who will be smarter? Man or machine, or will it be silicon-human?
  4. Will we be able to send an Emergent Machine, or just the code of the automaton (the genetic code for the living, or the physical machine's code, or the silicon-human's code), to outer space, where it will create itself in machine form, thus achieving the movement of synthetic life through the universe? And through this, will we finally find out whether we are alone, or that there is nothing really unique about us?
  5. Will our belief in the Wallace-Darwin theory of natural selection as the basic mechanism of evolution, that life adapts itself to the given conditions on the planet, continue to stand? Or will our beliefs have to change towards the Gaia hypothesis, that life does not just adapt to the conditions around it but changes them so as to survive and perpetuate? How does technology fit into this? Where will social justice, our species' emergent intellectual outcome, fit into this man-machine fusion? Is the silicon-human a new species in this evolution?
  6. If the human being's life becomes longer, how will the human body change? The design of a body meant to live longer cannot be the same as one appropriate for today's seventy-to-eighty-year span.

And most important of them all: will it be paradise lost? Or will it be paradise regained? Only how we handle the perennial questions of sin and virtue, and the gray in between, with our collectively acquired wisdom on this little speck of a planet in this giant universe, will unravel that. Only ethics, not discussed here, can fill the ever-widening gap between technology and human needs. Are science, engineering, and technology here to serve human needs, or to take an emergent path of their own? Only our actions, we Adams and Eves, will determine this future course as we transit through the gray regions of living.


  1. See the minimally invasive surgical procedures with machine operating under physician control listed at
  2. See describing vision restoration in two blind patients.


Deisseroth, Karl. 2010. “Controlling the Brain with Light.” Scientific American, October 20.

Dyson, Freeman J. 2007. A Many Colored Glass: Reflections on the Place of Life in the Universe. Charlottesville, VA: University of Virginia Press.

Gibson, D., et al. 2010. “Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome.” Science, May 20. DOI: 10.1126/science.1190719.

Litt, Brian, et al. 2001. “Epileptic Seizures May Begin Hours in Advance of Clinical Onset: a Report on Five Patients.” Neuron 30 (1): 51–64.

Specter, Michael. 2010. “A Deadly Misdiagnosis: Is It Possible to Save the Millions of People Who Die from TB?” The New Yorker, November 15.

Sun, Nan, et al. 2010. “Palm NMR and One-Chip NMR.” IEEE Journal of Solid-State Circuits 46 (1).

Vella, Sarah J., et al. 2012. “Measuring Markers of Liver Function Using a Micropatterned Paper Device Designed for Blood from a Finger Stick.” Analytical Chemistry.
