Is the brain a computer?

Ecological psychology as an alternative

It is becoming ever more evident to the scientific community that the answer to complex problems cannot be found in simplistic solutions. Although we all value simple solutions, the history of science shows that an excess of reductionist fervor (very useful for explaining the movement of “spherical cows in a vacuum” and “free-falling stones”) can make us lose sight of what exactly it is we wanted to understand. And in fact, the adaptive behavior of human beings is a paradigmatic example of this situation.

Traditionally, the scientific study of behavior has been based on the “reduction” of organisms to machines. This is a simple response to a complex problem, and it has dramatic consequences for our understanding of psychology and biology, which is why it is worth reflecting on the concept of a machine. Machines are artificial devices, built by a designer to carry out a set of predefined functions in a set of preselected contexts. One thing common to designed machines (at least those designed by humans so far) is that they are built by composing simple elements with clearly defined functions that are independent of their context: a gear makes the same mechanical coupling regardless of the other gears surrounding it, although the function performed by the mechanical system as a whole depends on how these elementary functions are combined. These simplifying design principles make it feasible to build the complex artificial devices to which we have become accustomed.
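
This design principle can be sketched in a few lines of code (the functions and their names are purely illustrative, not taken from the text): each component has a fixed, context-independent behavior, and the function of the whole machine depends only on how the components are coupled.

```python
# Each component is a simple function whose behavior is fixed and
# independent of context, like a gear that always makes the same
# mechanical coupling.

def double(x):
    return 2 * x

def increment(x):
    return x + 1

def compose(f, g):
    """Couple two components: the output of g feeds the input of f."""
    return lambda x: f(g(x))

# The same parts, combined differently, yield different machines:
machine_a = compose(double, increment)  # computes (x + 1) * 2
machine_b = compose(increment, double)  # computes (x * 2) + 1

print(machine_a(3))  # 8
print(machine_b(3))  # 7
```

Note that `double` and `increment` behave identically wherever they appear; only the composition determines what the overall system does.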


Machine design is based on the composition of simple elements with clearly defined functions independent of the context / Image: PIXNIO

The computer is the machine par excellence, as it represents the ultimate abstraction of the physical machine. That abstraction of a machine’s functioning is what we call an algorithm: a sequence of transformations the computer must apply to the input data to generate the output data specified by the designer. A computer is therefore a machine capable of emulating the operation of any conceivable machine by implementing an algorithm that reproduces the relevant functions of the original design. From this standpoint, a machine’s operation consists of a mapping from information inputs to information outputs, and the physical substrate that constitutes the machine is irrelevant.
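
The substrate-independence described above can be illustrated with a minimal sketch (the thermostat is an illustrative choice, not an example from the text): a physical machine abstracted as an algorithm, that is, as a pure mapping from input data to the output data specified by the designer.

```python
# A mechanical thermostat emulated as an input-output mapping.
def thermostat(temperature, setpoint):
    """Map a sensor reading (input) to a control command (output)."""
    return "heat_on" if temperature < setpoint else "heat_off"

# The mapping alone defines the machine's operation; whether it runs
# on silicon, on relays, or by pen and paper is, from this standpoint,
# irrelevant.
print(thermostat(18.5, 20.0))  # heat_on
print(thermostat(21.0, 20.0))  # heat_off
```

Any device that realizes this same mapping counts, under the computational metaphor, as the same machine.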

The “buts” of the brain-computer metaphor

The reductionist schools have compared the mind to a computer, and perception and action to the sensors and effectors that connect that formal mind with the changing physical world. This metaphor has been so successful that today it seems only natural to think that way. For example, the possibility, frequently explored in cyberpunk fiction, of “uploading our consciousness to the cloud” encapsulates the dualism between an abstract, immaterial mind (in which our personal identity resides) and a physical body that is completely irrelevant. Another example of this reductionist approach is treating the eye as a camera, rather than as an organ in constant development: structurally and functionally plastic, context-dependent, and integrated into the perceptual system of an organism that actively and intentionally explores its environment.



This reductionist machine metaphor for explaining the relationship between behavior, embodiment and psyche has serious limitations. Comparing a living system with a machine assumes that living beings were built by a designer with the same cognitive and operational limitations as humans. However, we know beyond a shadow of a doubt that biological systems are organized in the opposite way: as systems that are open to interaction and strongly interlinked through interactions at all organizational scales, from the molecular to the organic/systemic level. Similarly, the machine metaphor assumes that the purpose of the artificial system is external to the system itself, since it has been defined a priori by the designer to fulfill a set of functional requirements. This contradicts the very notion of the agent as an autonomous unit of action and perception. The machine metaphor leaves no room for intentionality, one of the basic properties of all organisms.

So the metaphor affords us simple but also simplistic solutions, and renders the most fundamental properties of living organisms invisible.

How, then, can we explain organisms and their behaviors with scientific rigor, but without reducing them to a mere mapping of inputs onto outputs? From the approach of Ecological Psychology (a minority branch of psychology, based on the work of James Gibson, whose classic “The Perception of the Visual World” has recently been rediscovered), what defines an organism is the particular way in which structure and function are interlinked so as to interact adaptively with the environment.

Functionally, the ecological approach sees the identity of an organism as constituted through its intentional and active engagement with the environment; in other words, through its ways of interacting, explained in the first person. Structurally, an organism is built as a complex physical system, formed by processes at multiple, closely interlinked spatial and temporal scales. The ecological approach does not seek to reduce the psychological dimension to the physical laws that govern its materiality, as this would mean eliminating the intentional perspective. Its aim is rather to understand how the material constitution of the organism enables intentional adaptive behavior in specific ways. This is the challenge the theory sets out to tackle, by debunking the “mind-computer” analogy that has so hindered scientific progress towards a unifying cognitive theory.

Jorge Ibáñez

Universidad Autónoma de Madrid

References

  • James Gibson (1950). The Perception of the Visual World. Buenos Aires, 1974.