
Wednesday, August 13, 2014

The Information Philosopher

The Freedom section of Information Philosopher is now a book.
Click for information about <i>Free Will: The Scandal in Philosophy</i>.
What is information? How is it created? Why is it a better tool for examining philosophical problems than traditional logic or linguistic analysis? Has information philosophy actually solved any problems?
What is information?
The simple definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver. A message that is certain to tell you something you already know contains no new information.
If everything that happens was certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event can in principle be known by a super-intelligence with access to such a fixed totality of information (Laplace's Demon).
The total amount of mass and energy in the universe is a constant. A fundamental law of nature is the conservation of mass and energy.
But information is neither matter nor energy, though it needs matter to be embodied and energy to be communicated. Information can be created and destroyed. It is the modern spirit, the ghost in the machine, the mind in the body. It is the soul, and when we die, it is our information that perishes. The matter remains. Information is a potential objective value, the ultimate sine qua non.
How is information created?
Ex nihilo, nihil fit, said the ancients: nothing comes from nothing. But information is no (material) thing. Information is physical, but it is not material. Information is a property of matter. We can create something (immaterial) from nothing!
But we shall find that it takes a special kind of energy (free or available energy, with negative entropy) to do so. Information is not constant. Where matter and energy are conserved quantities in physics, information is not conserved. We know that information is being created because the universe began some thirteen billion years ago in a state of minimal information. The "Big Bang" was formless radiation, pure energy, no material particles. How matter formed into information structures like the galaxies, stars, and planets is the beginning of a story that will end with understanding how human minds emerged to understand our place in the astrophysical universe.
We identify three fundamental processes of information creation - the purely material, the biological, and the mental. The first was the "order out of chaos" when matter formed from radiation and the expansion of the early universe led to the gravitational attraction of randomly distributed matter into highly organized galaxies, stars, and planets. The expansion - the increased space between material objects - drives the universe away from thermodynamic equilibrium (maximum entropy) and creates negative entropy, a quantitative measure of the order that is the basis for all information.
A second kind of information creation was when the first molecule on earth replicated itself and went on to duplicate its information exponentially. Accidental errors in the duplication provided variations in reproductive success. Most important, besides creating information structures, biological systems are also information processors. They use information to guide their actions.
The third process of information creation, and the most important to philosophy, is human creativity. Almost every philosopher since philosophy began has considered the mind as something distinct from the body. We can now explain that distinction. The mind is the immaterial information in the brain. The brain, part of the material body, is a biological information processor. As some philosophers have speculated, the mind is software in the brain hardware.
The most important information created in a mind is a recording of an individual's experiences (sensations). Recordings are played back (automatically and perhaps mostly unconsciously) as a guide to evaluate future actions (volitions) in similar situations. The particular past experiences reproduced are those stored in the brain located near elements of the current experience.
Sensations are recorded as the mental effects of physical causes.
They are stored as retrievable information in the mind of an individual self. Recordings include not only the five afferent senses but also the internal emotions - feelings of pleasure, pain, hopes, and fears - that accompany the experience.
Volitions are mental causes of physical effects. They begin with 1) reproduction of past experiences that are similar to the current experience as thoughts about possible actions and the (partly random) generation of other alternative possibilities for action. They continue with 2) evaluation of those freely generated thoughts and a willful selection (sometimes habitual) of one of those actions.
Volitions end with 3) the sensations coming back to the mind indicating that the self has caused the action to happen (or not). This feedback is recorded as further retrievable information, reinforcing the knowledge stored in the mind that the individual self can cause this kind of action (or not).
Why is information better than logic and language for solving philosophical problems?
The theory of communication of information is the foundation of our "information age." To understand how we know things is to understand how knowledge represents the material world of embodied "information structures" in the mental world of immaterial ideas. All knowledge starts with the recording of experiences. The experiences of thinking, perceiving, knowing, feeling, desiring, deciding, and acting may be bracketed by philosophers as "mental" phenomena, but they are no less real than other "physical" phenomena. They are themselves physical phenomena.
They are just not material things.

All science begins with information gathered from experimental observations, which are mental phenomena. So all knowledge of the physical world rests on the mental. All scientific knowledge is shared information and as such science is immaterial and mental, some might say fundamental. Recall Descartes' argument that the experience of thinking is that which for him is the most certain.
The analysis of language, particularly the analysis of philosophical concepts, which dominated philosophy in the twentieth century, has failed to solve the most ancient philosophical problems. At best, it claims to "dis-solve" some of them as conceptual puzzles. The "problem of knowledge" itself, traditionally framed as "justifying true belief," is recast by information philosophy as the degree of isomorphism between the information in the physical world and the information in our minds. Psychology can be defined as the study of this isomorphism.
We shall see how information processes in the natural world use arbitrary symbols (e.g., nucleotide sequences) to refer to something, to communicate messages about it, and to give the symbol meaning in the form of instructions for another process to do something (e.g., create a protein). These examples provide support for both theories of meaning as reference and meaning as use.
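The codon-to-protein example above can be sketched as a lookup table. This is a toy Python illustration (only five of the 64 standard codons are included, and `translate` is our own hypothetical helper): an arbitrary symbol acquires "meaning" as an instruction to another process.

```python
# Toy subset of the standard genetic code (5 of 64 codons), illustrating
# how an arbitrary symbol (a nucleotide triplet) is given "meaning" as an
# instruction to add a particular amino acid.
GENETIC_CODE = {
    "AUG": "Met",   # methionine, also the start codon
    "UUU": "Phe",   # phenylalanine
    "UGG": "Trp",   # tryptophan
    "GAA": "Glu",   # glutamate
    "UAA": "STOP",  # stop codon: an instruction, not a residue
}

def translate(mrna: str) -> list[str]:
    """Read the message three symbols at a time until a stop codon."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        residue = GENETIC_CODE[mrna[i:i + 3]]
        if residue == "STOP":
            break
        peptide.append(residue)
    return peptide

print(translate("AUGUUUGAAUAA"))  # ['Met', 'Phe', 'Glu']
```

The dictionary lookup is the "reference" theory of meaning; the ribosome-like loop that acts on each symbol is the "use" theory.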
Note that just as language philosophy is not the philosophy of language, so information philosophy is not the philosophy of information. It is rather the use of information as a tool to study philosophical problems, some of which are today yielding tentative solutions. It is time for philosophy to move beyond logical puzzles and language games.
What problems has information philosophy solved?
Why has philosophy made so little progress? Is it because philosophers prefer problems, while scientists seek solutions? Must a philosophical problem solved become science and leave philosophy? The information philosopher thinks not. But in order to remain philosophy, interested philosophers must themselves examine the proposed information-based solutions and consider them as part of the critical philosophical dialogue.
The full story of cosmic, biological, and mental information creation involves learning some basic physics, particularly quantum mechanics and thermodynamics, along with some information theory. The information philosopher website provides animated visualizations of the most basic concepts that you will need to become an information philosopher.
When you are ready to consider them, the proposed solutions are presented on the Information Philosopher website.
It turns out that the methodology of information philosophy can be productively applied to some outstanding problems in physics. Philosophers of science might take an interest in the proposed information-based solutions to these problems.
The Fundamental Question of Information Philosophy
Our fundamental philosophical question is cosmological and ultimately metaphysical.
What are the processes that create emergent information structures in the universe?
Given the second law of thermodynamics, which says that any system will over time approach a thermodynamic equilibrium of maximum disorder or entropy, in which all information is lost, and given the best current model for the origin of the universe, which says everything began in a state of thermodynamic equilibrium some 13.75 billion years ago, how can it be that living beings are creating and communicating vast amounts of new information every day?
Why are we not still in that original state of equilibrium?
Broadly speaking, there are four major phenomena or processes that can reduce the entropy locally, while of course increasing it globally to satisfy the second law of thermodynamics. Three of these do it "blindly," the fourth does it with a built-in "purpose," or "telos."
  1. Universal Gravitation
  2. Quantum Cooperative Phenomena (e.g., crystallization, the formation of atoms and molecules)
  3. "Dissipative" Chaos (Non-linear Dynamics)
  4. Life
None of these processes can work unless they have a way to get rid of the positive entropy (disorder) and leave behind a pocket of negative entropy (order or information). The positive entropy is either conducted, convected, or radiated away as waste matter and energy, as heat, or as pure radiation. At the quantum level, it is always the result of interactions between matter and radiation (photons). Whenever photons interact with material particles, the outcomes are inherently unpredictable. As Albert Einstein discovered ten years before the founding of quantum mechanics, these interactions involve irreducible ontological chance.
Negative entropy is an abstract thermodynamic concept that describes energy with the ability to do work, to make something happen. This kind of energy is often called free energy or available energy. In a maximally disordered state (called thermodynamic equilibrium) there can be matter in motion, the motion we call heat. But the average properties - density, pressure, temperature - are the same everywhere. Equilibrium is formless. Departures from equilibrium are when the physical situation shows differences from place to place. These differences are information.
The second law of thermodynamics is then simply that isolated systems will eliminate differences from place to place until the various properties are uniform. Natural processes spontaneously destroy information. Consider the classic case of what happens when we open a perfume bottle.
Ludwig Boltzmann derived a mathematical formula for entropy as a summation of the probabilities of finding a system in each of its possible states. When every state is equally probable, entropy is at a maximum, and no differences (information) are visible. The formula for negative entropy is just the maximum possible entropy minus the actual entropy (when there are differences from place to place).
Claude Shannon derived the mathematical formula for information and found it to be identical to the formula for negative entropy - a summation of the probabilities of all the possible messages that can be communicated.
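The identity between Shannon's formula and negative entropy can be made concrete in a few lines. A minimal sketch (the helper name `shannon_entropy` and the example distributions are ours, purely illustrative):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p), in bits; the same form as Boltzmann's entropy."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Four equally probable states: maximum entropy, no visible differences.
h_max = shannon_entropy([0.25] * 4)           # exactly 2.0 bits

# An uneven distribution: differences from place to place.
h_actual = shannon_entropy([0.7, 0.1, 0.1, 0.1])

# Negative entropy = maximum possible entropy minus actual entropy.
negentropy = h_max - h_actual
print(h_max, round(negentropy, 3))
```

The uneven distribution carries visible differences, and the positive quantity `negentropy` measures exactly the "order" those differences represent.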
Because "negative" entropy (order or information) is such a positive quantity, we chose many years ago to give it a new name - "Ergo," and to call the four phenomena or processes that create it "ergodic," for reasons that will become clear. But today, the positive name "information" is all that we need to do philosophical work.
Answering the Fundamental Question of Information Philosophy
How exactly has the universe escaped from the total disorder of thermodynamic equilibrium and produced a world full of information?
It begins with the expansion of the universe. If the universe had not expanded, it would have remained in the original state of thermodynamic equilibrium. We would not be here.
To visualize the departure from equilibrium that made us possible, remember that equilibrium is when particles are distributed evenly in all possible locations in space, and with their velocities distributed by a normal law - the Maxwell-Boltzmann velocity distribution. (The combination of position space and velocity or momentum space is called phase space). When we open the perfume bottle, the molecules now have a much larger phase space to distribute into. There are a much larger number of phase space "cells" in which molecules could be located. It of course takes them time to spread out and come to a new equilibrium state (the Boltzmann "relaxation time.")
When the universe expands, say grows to ten times its volume, it is just like the perfume bottle opening. The matter particles must redistribute themselves to get back to equilibrium. But suppose the universe expansion rate is much faster than the relaxation time. The universe is out of equilibrium, and it will never get back!
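The perfume-bottle argument above can be checked with a back-of-envelope calculation. Assuming an ideal gas for simplicity (an assumption not made in the text), the maximum-entropy gain from a ten-fold volume increase is N k ln(V2/V1):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def max_entropy_gain(n_particles, volume_ratio):
    """Entropy an ideal gas can gain when each particle's positional
    phase space grows by volume_ratio: delta-S = N * k * ln(V2/V1)."""
    return n_particles * K_B * math.log(volume_ratio)

# One mole of gas, perfume-bottle style, expanding into ten times the volume:
delta_s = max_entropy_gain(6.022e23, 10.0)
print(f"{delta_s:.2f} J/K")  # about 19 J/K
```

Until the particles "relax" into the new volume, the actual entropy lags behind this new maximum, and that gap is the negative entropy the expansion creates.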
In the earliest moments of the universe, material particles were not yet stable. Pure radiation energy was in equilibrium at extraordinarily high temperatures. When material particles appeared, they were blasted back into radiation by photon collisions. As the universe expanded, the temperature cooled, the space per photon increased, and the mean free time between photon collisions increased, giving particles a better chance to survive. The expansion also red-shifted the photons: the average energy per photon decreased, eventually reducing the number of high-energy photons that destroyed the matter, and quarks and electrons became more common. The mean free path of photons remained very short, however, because they were being scattered by collisions with electrons.
When temperatures continued to decline, quarks combined into nuclear particles, protons and neutrons. When the temperature declined further, to a few thousand degrees about 400,000 years after the "Big Bang," the electrons and protons combined to make hydrogen atoms.
At this time, a major event occurred that we can still see today, the farthest and earliest event visible. When the electrons combined into atoms, the electrons could no longer scatter the photons as easily. The universe became transparent to the photons. Some of those photons are still arriving at the earth today. They are now red-shifted and cooled down to the cosmic microwave background radiation. While this radiation is almost perfectly uniform, it shows very small fluctuations that may be caused by random differences in the local density of the original radiation or even by random quantum fluctuations.
These fluctuations mean that there were slight differences in density of the newly formed hydrogen gas clouds. The force of universal gravitation then worked to pull relatively formless matter into spherically symmetric stars and planets, the original order out of chaos (although this phrase is now most associated with the work on deterministic chaos theory and complexity theory, as we shall see).
How information creation and negative entropy flows appear to violate the second law of thermodynamics
In our open and rapidly expanding universe, the maximum possible entropy (if the particles were "relaxed" into a uniform distribution among the new phase-space cells) is increasing faster than the actual entropy. The difference between maximum possible entropy and the current entropy is called negative entropy. There is an intimate connection between the physical quantity negative entropy and abstract immaterial information, first established by Leo Szilard in 1929.
As pointed out by Harvard cosmologist David Layzer, the Arrow of Time points not only to increasing disorder but also to increasing information.
Two of our "ergodic" phenomena - gravity and quantum cooperative phenomena - pull matter together that was previously separated. Galaxies, stars, and planets form out of inchoate clouds of dust and gas. Gravity binds the matter together. Subatomic particles combine to form atoms. Atoms combine to form molecules. They are held together by quantum mechanics. In all these cases, a new visible information structure appears.
In order for these structures to stay together, the motion (kinetic) energy of their parts must be radiated away. This is why the stars shine. When atoms join to become molecules, they give off photons. The new structure is now in a (negative) bound energy state. It is the radiation that carries away the positive entropy (disorder) needed to balance the new order (information) in the visible structure.
In the cases of chaotic dissipative structures and life, the ergodic phenomena are more complex, but the result is similar, the emergence of visible information. (More commonly it is simply the maintenance of high-information, low-entropy structures.) These cases appear in far-from-equilibrium situations where there is a flow of matter and energy with negative entropy through the information structure. The flow comes in with low entropy but leaves with high entropy. Matter and energy are conserved in the flow, but information in the structure can increase (information is not a conserved quantity).
Information is neither matter nor energy, though it uses matter when it is embodied and energy when it is communicated. Information is immaterial.
This vision of life as a visible form through which matter and energy flow was first seen by Ludwig von Bertalanffy in 1939, though it was made more famous by Erwin Schrödinger's landmark 1944 essay What Is Life?, where he claimed that "life feeds on negative entropy."
Both Bertalanffy and Schrödinger knew that the source of negative entropy was our Sun. Neither knew that the ultimate cosmological source of negative entropy is the expansion of the universe, which allowed ergodic gravitational forces to form the Sun. Note that the positive entropy leaving the Sun becomes diluted as it expands, creating a difference between its color temperature and its energy-content temperature. This difference is information (negative entropy) that planet Earth uses to generate and maintain biological life.
Note that the 273K (the average earth temperature) photons are dissipated into the dark night sky, on their way to the cosmic microwave background. The Sun-Earth-night sky is a heat engine, with a hot energy source and cold energy sink, that converts the temperature difference not into mechanical energy (work) but into biological energy (life).

When information is embodied in a physical structure, two physical processes must occur.
Our first process is what John von Neumann described as irreversible Process 1.
The first process is the collapse of a quantum-mechanical wave function into one of the possible states in a superposition of states, which happens in any measurement process. A measurement produces one or more bits of information. Such quantum events involve irreducible indeterminacy and chance, but less often noted is the fact that quantum physics is directly responsible for the extraordinary temporal stability and adequate determinism of most information structures.
We can call the transfer of positive entropy, which stabilizes the new information from Process 1, Process 1b.
The second process is a local decrease in the entropy (which appears to violate the second law of thermodynamics) corresponding to the increase in information. Entropy greater than the information increase must be transferred away from the new information, ultimately to the night sky and the cosmic background, to satisfy the second law.
Given this new stable information, to the extent that the resulting quantum system can be approximately isolated, the system will deterministically evolve according to von Neumann's Process 2, the unitary time evolution described by the Schrödinger equation.
The first two physical processes (1 and 1b) are parts of the information solution to the "problem of measurement," to which must be added the role of the "observer."
The discovery and elucidation of the first two as steps in the cosmic creation process casts light on some classical problems in philosophy and physics, since it is the same two-step process that creates new biological species and explains the freedom and creativity of the human mind.
The cosmic creation process generates the conditions without which there could be nothing of value in the universe, nothing to be known, and no one to do the knowing. Information itself is the ultimate sine qua non.

The Three Kinds of Information Emergence
Note there are three distinct kinds of emergence:
  1. the order out of chaos when the randomly distributed matter in the early universe first gets organized into information structures. This was not possible before the first atoms formed, about 400,000 years after the Big Bang. Information structures like the stars and galaxies did not exist until about 400 million years after the Big Bang. As we saw, gravitation was the principal driver creating information structures.
    Nobel prize winner Ilya Prigogine discovered another ergodic process that he described as the "self-organization" of "dissipative structures." He popularized the slogan "order out of chaos" in an important book. Unfortunately, the "self" in self-organization led to some unrealizable hopes in cognitive psychology. There is no self, in the sense of a person or agent, in these physical phenomena.
    Both gravitation and Prigogine's dissipative systems produce a purely physical/material kind of order. The resulting structures contain information. There is a "steady state" flow of information-rich matter and energy through them. But they do not process information. They have no purpose, no "telos."
    Order out of chaos can explain how these structures exert downward causation on their atomic and molecular components. But this is a gross kind of downward causal control. Explaining life and mind as "complex adaptive systems" has not been successful. We need to go beyond "chaos and complexity" theories to teleonomic theories.
  2. the order out of order when the material information structures form self-replicating biological information structures. These are information processing systems. In his famous essay, "What Is Life?," Erwin Schrödinger noted that life "feeds on negative entropy" (or information). He called this "order out of order."
    This kind of biological processing of information first emerged about 3.5 billion years ago on the earth. It continues today on multiple emergent biological levels, e.g., single-cells, multi-cellular systems, organs, etc., each level creating new information structures and information processing systems not reducible to (caused by) lower levels and exerting downward causation on the lower levels.
    And this downward causal control is extremely fine, managing the motions and arrangements of individual atoms and molecules.
    Biological systems are cognitive systems, using internal "subjective" knowledge to recognize and interact with their "objective" external environment, communicating meaningful messages to their internal components and to other individuals of their species with a language of arbitrary symbols, taking actions to maintain themselves and to expand their populations by learning from experience.
    With the emergence of life, "purpose" also entered the universe. It is not the pre-existent "teleology" of many idealistic philosophies (the idea of "essence" before "existence"), but it is the "entelechy" of Aristotle, who saw that living things have within them a purpose, an end, a "telos." To distinguish this evolved telos in living systems from teleology, modern biologists use the term "teleonomy."
  3. the pure information out of order when organisms with minds generate, store (in the brain), replicate, utilize, and then externalize some non-biological information, communicating it to other minds and storing it in the environment. Communication can be by hereditary genetic transmission or by an advanced organism capable of learning and then teaching its contemporaries directly by signaling, by speaking, or indirectly by writing and publishing the knowledge for future generations. This kind of information can be highly abstract mind-stuff, pure Platonic ideas, the stock in trade of philosophers. It is neither matter nor energy (though embodied in the material brain), a kind of pure spirit or ghost in the machine. It is a candidate for the immaterial dualist "substance" of René Descartes, though it is probably better thought of as a "property dualism," since information is an immaterial property of all matter.
    The information stored in the mind is not only abstract ideas. It contains a recording of the experiences of the individual. In principle every experience may be recorded, though not all may be reproducible/recallable.
The negative entropy (order, or potential information) generated by the universe expansion is a tiny amount compared to the increase in positive entropy (disorder). Sadly, this is always the case when we try to get "order out of order," as can be seen by studying entropy flows at different levels of emergent phenomena.
In any process, the positive entropy increase is always at least equal to, and generally orders of magnitude larger than, the negative entropy in any created information structures, to satisfy the second law of thermodynamics. The positive entropy is named for Boltzmann, since it was his "H-Theorem" that proved entropy can only increase overall - the second law of thermodynamics. And negative entropy is called Shannon, since his theory of information communication has exactly the same mathematical formula as Boltzmann's famous principle:
S = k log W
where S is the entropy, k is Boltzmann's constant, and W is the number of ways (microstates) in which the given state of the system can be realized.
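As a minimal numeric sketch of Boltzmann's principle (with log taken as the natural logarithm, as in statistical mechanics; the 100-particle system is a toy example of ours, not from the text):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(w):
    """S = k log W, where W counts the microstates of the macrostate."""
    return K_B * math.log(w)

# Toy macrostate: 100 two-state particles whose 2**100 arrangements are
# all equally probable (maximum entropy, no visible differences).
s = boltzmann_entropy(2 ** 100)
print(s)  # on the order of 1e-21 J/K
```

Because the logarithm turns the multiplication of microstate counts into addition, the entropies of independent systems simply add, which is why the same formula serves Shannon's additive measure of information.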


Material particles are the first information structures to form in the universe. They are quarks, baryons, and atomic nuclei, which combine with electrons to form atoms and eventually molecules, when the temperature is low enough. These particles are attracted by the force of universal gravitation to form the gigantic information structures of the galaxies, stars, and planets.

Microscopic quantum mechanical particles and huge self-gravitating systems are stable and have extremely long lifetimes, thanks in large part to quantum stability. Stars are another source of radiation, after the original Big Bang cosmic source, which has cooled down to about 3 kelvin (3 K) and shines as the cosmic microwave background radiation.

Our solar radiation has a high color temperature (5000K) and a low energy-content temperature (273K). It is out of equilibrium and it is the source of all the information-generating negative entropy that drives biological evolution on the Earth. Note that the fraction of the light falling on Earth is less than a billionth of that which passes by and is lost in space.
A tiny fraction of the solar energy falling on the earth gets converted into the information structures of plants and animals. Most of it gets converted to heat and is radiated away as waste energy to the night sky.
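The "less than a billionth" claim above can be checked with simple geometry: the Earth intercepts the Sun's output in proportion to its cross-sectional area divided by the area of a sphere one astronomical unit in radius. A hedged back-of-envelope check (round illustrative figures, not precise astronomical values):

```python
import math

R_EARTH = 6.371e6  # Earth's radius, m (approximate)
AU = 1.496e11      # Earth-Sun distance, m (approximate)

# Cross-section of the Earth over the surface of a sphere of radius 1 AU.
fraction = (math.pi * R_EARTH ** 2) / (4 * math.pi * AU ** 2)
print(f"{fraction:.2e}")  # roughly 5e-10: under one part in a billion
```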

Every biological structure is a quantum mechanical structure. DNA has maintained its stable information structure over billions of years in the constant presence of chaos and noise.

The stable information content of a human being survives many changes in the material content of the body during a person’s lifetime. Only with death does the mental information (spirit, soul) dissipate - unless it is saved somewhere.
The total mental information in a living human is orders of magnitude less than the information content and information processing rate of the body. But the information structures created by humans outside the body, in the form of external knowledge like this book, and the enormous collection of human artifacts, rival the total biological information content.

The Shannon Principle
In his development of the mathematical theory of the communication of information, Claude Shannon showed that there can be no new information in a message unless there are multiple possible messages. If only one message is possible, there is no information in that message.
We can simplify this to define the Shannon Principle. No new information can be created in the universe unless there are multiple possibilities, only one of which can become actual.
An alternative statement of the Shannon principle is that in a deterministic system, information is conserved, unchanging with time. Classical mechanics is a conservative system that conserves not only energy and momentum but also conserves the total information. Information is a "constant of the motion" in a determinist world.
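The Shannon Principle stated above can be expressed in a few lines of code. A minimal sketch (the helper name `shannon_entropy` is ours, not Shannon's):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p), in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# One possible message: certainty, so no new information.
print(shannon_entropy([1.0]))       # 0.0

# Two equally possible messages: one full bit of new information.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```

A distribution with a single possible outcome has exactly zero entropy: multiple possibilities, only one of which becomes actual, are required before any information can be created.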
Quantum mechanics, by contrast, is indeterministic. It involves irreducible ontological chance. An isolated quantum system is described by a wave function ψ which evolves according to the unitary time evolution of the linear Schrödinger equation,
i ℏ d | ψ > / dt = H | ψ >.
But isolation is an ideal that can only be approximately realized. Because the Schrödinger equation is linear, a wave function | ψ > can be a linear combination (a superposition) of another set of wave functions | φn >,
| ψ > = Σn cn | φn >,
where the cn coefficients squared are the probabilities of finding the system in the possible state | φn > as the result of an interaction with another quantum system.
|cn|2 = |< φn | ψ >|2.
Quantum mechanics introduces real possibilities, each with a calculable probability of becoming an actuality, as a consequence of one quantum system interacting (for example colliding) with another quantum system.
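The probability rule just described can be sketched numerically. A minimal illustration, assuming a toy two-state superposition (the labels `phi1`/`phi2` and the chosen coefficients are ours, purely illustrative):

```python
import math
import random

# Toy two-state superposition |psi> = c1|phi1> + c2|phi2>.
c = [complex(1 / math.sqrt(2), 0), complex(0, 1 / math.sqrt(2))]

# Each probability is the squared magnitude of a coefficient.
probs = [abs(cn) ** 2 for cn in c]
assert abs(sum(probs) - 1.0) < 1e-12  # normalization: possibilities sum to 1

# An interaction actualizes exactly one possibility, at random.
outcome = random.choices(["phi1", "phi2"], weights=probs)[0]
print([round(p, 3) for p in probs], outcome)
```

The weighted random choice stands in for the irreducible chance of the quantum interaction: the probabilities are calculable in advance, but which possibility becomes actual is not.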
It is quantum interactions that lead to new information in the universe - both new information structures and information processing systems. But that new information cannot subsist unless a compensating amount of entropy is transferred away from the new information.
And it is only in cases where information persists long enough for a human being to observe it that we can properly describe the observation as a "measurement" and the human being as an "observer." Following von Neumann's "process" terminology, we might complete his admittedly unsuccessful attempt at a theory of the measuring process with the anthropomorphic
Process 3 - a conscious observer recording new information (knowledge) in a human mind.

In less than two decades of the mid-twentieth century, the word information was transformed from a synonym for knowledge into a mathematical, physical, and biological quantity that can be measured and studied scientifically.
In 1929, Leo Szilard connected an increase in thermodynamic (Boltzmann) entropy with any increase in information that results from a measurement, solving the problem of "Maxwell's Demon," a thought experiment suggested by James Clerk Maxwell, in which a local reduction in entropy is possible when an intelligent being interacts with a thermodynamic system.
In the early 1940s, digital computers were invented, by Alan Turing, Claude Shannon, John von Neumann, and others, that could run a stored program to manipulate stored data.
Then in the late 1940s, the problem of communicating digital data signals in the presence of noise was first explored by Shannon, who developed the modern mathematical theory of the communication of information. Norbert Wiener wrote in his 1948 book Cybernetics that "information is the negative of the quantity usually defined as entropy," and in 1949 Leon Brillouin coined the term "negentropy."
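Shannon's measure makes the earlier claim precise: a message certain in advance carries zero information. A minimal sketch (the function name is ours, not Shannon's notation):

```python
import math

def shannon_entropy(probs):
    """Shannon's measure: H = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain outcome carries no information; a fair coin carries one bit.
print(shannon_entropy([1.0]))       # no surprise: zero bits
print(shannon_entropy([0.5, 0.5]))  # 1.0
```

The entropy is largest when all outcomes are equally likely, and zero when one outcome is certain, which is exactly why a deterministic universe could create no new information.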
Finally, in the early 1950s, inheritable characteristics were shown by Francis Crick, James Watson, and George Gamow to be transmitted from generation to generation in a digital code.

Information is Immaterial
Information is neither matter nor energy, but it needs matter for its embodiment and energy for its communication.
A living being is a form through which passes a flow of matter and energy (with low entropy). Genetic information is used to build the information-rich matter into an information-processing structure that contains a very large number of hierarchically organized information structures.
All biological systems are cognitive, using their internal information structure to guide their actions. Even some of the simplest organisms can learn from experience. The most primitive minds are experience recorders and reproducers.
In humans, the information-processing structures create new actionable information (knowledge) by consciously and unconsciously reworking the experiences stored in the mind.
Emergent higher levels exert downward causation on the contents of the lower levels, ultimately supporting mental causation and free will.
When a ribosome assembles 330 amino acids in four symmetric polypeptide chains (globins), each globin traps an iron atom in a heme group at the center to form the hemoglobin protein. This is downward causal control of the amino acids, the heme groups, and the iron atoms by the ribosome. The ribosome is an example of Erwin Schrödinger's emergent "order out of order," life "feeding on the negative entropy" of digested food.
Notice the absurdity of the idea that the random motions of the transfer RNA molecules, each holding a single amino acid, are carrying pre-determined information about where they belong in the protein being built.
Determinism is an emergent property and an ideal philosophical concept, unrealizable except approximately in the kind of adequate determinism that we experience in the macroscopic world, where the determining information is part of the higher-level control system.
The total information in multi-cellular living beings can develop to be many orders of magnitude more than the information present in the original cell. The creation of this new information would be impossible for a deterministic universe, in which information is constant.
Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.
Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.
And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.
Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.
Since the 1950's, the science of human behavior has changed dramatically from a "black box" model of a mind that started out as a "blank slate" conditioned by environmental stimuli. Today's mind model contains many "functions" implemented with stored programs, all of them information structures in the brain. The new "computational model" of cognitive science likens the brain to a computer, with some programs and data inherited and others developed as appropriate reactions to experience.

The Experience Recorder and Reproducer
The brain should be regarded less as an algorithmic computer with one or more central processing units than as a multi-channel and multi-track experience recorder and reproducer with an extremely high data rate. Information about an experience - the sights, sounds, smells, touch, and taste - is recorded along with the emotions - feelings of pleasure, pain, hopes, and fears - that accompany the experience. When confronted with similar experiences later, the brain can reproduce information about the original experience (an instant replay) that helps to guide current actions.
Information is constant in a deterministic universe. There is "nothing new under the sun." The creation of new information is not possible without the random chance and uncertainty of quantum mechanics, plus the extraordinary temporal stability of quantum mechanical structures.
It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no "surprises" in the universe. Information would be conserved and a universal constant, as some mathematicians mistakenly believe. Information philosophy requires the ontological uncertainty and probabilistic outcomes of modern quantum physics to produce new information.
But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog.
Moreover, the "correspondence principle" of quantum mechanics and the "law of large numbers" of statistics ensure that macroscopic objects can normally average out microscopic uncertainties and probabilities to provide the "adequate determinism" that shows up in all our "Laws of Nature."
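The averaging that the law of large numbers describes can be shown with a toy simulation (the numbers here are illustrative, not physics):

```python
import random

random.seed(0)  # reproducible toy run

# Each "particle" contributes a random microscopic fluctuation in [-1, 1].
# The macroscopic mean over n of them tends to shrink roughly like 1/sqrt(n):
# this is the law of large numbers providing "adequate determinism".
def mean_fluctuation(n):
    return abs(sum(random.uniform(-1, 1) for _ in range(n)) / n)

for n in (10, 1_000, 100_000):
    print(n, mean_fluctuation(n))   # the mean fluctuation tends to shrink as n grows
```

With a handful of particles the average still jitters noticeably; with a hundred thousand it is nearly indistinguishable from zero, which is why macroscopic laws look deterministic.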
Information philosophy explores some classical problems in philosophy with deeper and more fundamental insights than is possible with the logic and language approach of modern analytic philosophy.
By exploring the origins of structure in the universe, information philosophy transcends humanity and even life itself, though it is not a mystical metaphysical transcendence.
Information philosophy uncovers the providential creative process working in the universe
to which we owe our existence, and therefore perhaps our reverence.
It locates the fundamental source of all values not in humanity ("man the measure"), not in bioethics ("life the ultimate good"), but in the origin and evolution of the cosmos.
Information philosophy is an idealistic philosophy, a process philosophy, and a systematic philosophy, the first in many decades. It provides important new insights into the Kantian transcendental problems of epistemology, ethics, freedom of the will, God, and immortality, as well as the mind-body problem, consciousness, and the problem of evil.
In physics, information philosophy provides new insights into the problem of measurement, the paradox of Schrödinger's Cat, the two paradoxes of microscopic reversibility and macroscopic recurrence that Josef Loschmidt and Ernst Zermelo used to criticize Ludwig Boltzmann's explanation of the entropy increase required by the second law of thermodynamics, and finally information provides a better understanding of the entanglement and nonlocality phenomena that are the basis for modern quantum cryptography and quantum computing.

Information Philosophers, as do all who would make an advance in knowledge, stand on the shoulders of giant philosophers and scientists of the past and present as we try to make modest advances in the great philosophical problems of knowledge, value, and freedom. In the left-hand column of all pages are links to nearly three hundred philosophers and scientists who have made contributions to these great problems. Their web pages include the original contributions of each thinker, with examples of their thought, usually in their own words, and where possible in their original languages as well.

Traditional philosophy is a story about discovery of timeless truths, laws of nature, a block universe in which the future is a logical extension of the past, a primal moment of creation that starts a causal chain in which everything can be foreknown by an omniscient being. Traditional philosophy seeks knowledge in logical reasoning with clear and unchanging concepts. Its guiding lights are thinkers like Parmenides, Plato, and Kant, who sought unity and identity, being and universals.
In traditional philosophy, the total amount of information in the conceptually closed universe is static, a physical constant of nature. The laws of nature allow no exceptions; they are perfectly causal. Everything that happens is said to have a physical cause. This is called "causal closure." Chance and change - in a deep philosophical sense - are said to be illusions.
Information philosophy, by contrast, is a story about invention, about novelty, about biological emergence and new beginnings unseen and unseeable beforehand, a past that is fixed but an ambiguous future that can be shaped by teleonomic changes in the present.
Its model thinkers are Heraclitus, Protagoras, Aristotle, and Hegel, for whom time, place, and particular situations mattered.
Information philosophy is built on probabilistic laws of nature. The fundamental challenge for information philosophy is to explain the emergence of stable information structures from primordial and ever-present chaos, to account for the phenomenal success of deterministic laws when the material substrate of the universe is irreducibly chaotic, noisy, and random, and to understand the concepts of truth, necessity, and certainty in a universe of chance, contingency, and indeterminacy.
Determinism and the exceptionless causal and deterministic laws of classical physics are the real illusions. Determinism is information-preserving. In an ideal deterministic Laplacian universe, the present state of the universe is implicitly contained in its earliest moments.
This ideal determinism does not exist. The "adequate determinism" behind the laws of nature emerged from the early years of the universe when there was only indeterministic chaos.
In a random noisy environment, how can anything be regular and appear determined? It is because the macroscopic consequences of the law of large numbers average out microscopic quantum fluctuations to provide us with a very adequate determinism.
Information Philosophy is an account of continuous information creation, a story about the origin and evolution of the universe, of life, and of intelligence from an original quantal chaos that is still present in the microcosmos. More than anything else, it is the creation and maintenance of stable information structures that distinguishes biology from physics and chemistry.
Living things maintain information in a memory of the past that they can use to shape the future. Some get it via heredity. Some learn it from experience. Others invent it!
Information Philosophy is a story about knowledge and ignorance, about good and evil, aboutfreedom and determinism.

There is a great battle going on - between originary chaos and emergent cosmos. The struggle is between destructive chaotic processes that drive a microscopic underworld of random events versus constructive cosmic processes that create information structures with extraordinary emergent properties that include adequately determined scientific laws -
despite, and in many cases making use of, the microscopic chaos.
Created information structures range from galaxies, stars, and planets, to molecules, atoms, and subatomic particles. They are the structures of terrestrial life from viruses and bacteria to sentient and intelligent beings. And they are the constructed ideal world of thought, of intellect, of spirit, including the laws of nature, in which we humans play a role as co-creator.
Based on insights into these cosmic creation processes, the Information Philosopher proposes three primary ideas that are new approaches to perennial problems in philosophy. They are likely to change some well-established philosophical positions. Even more important, they may reconcile idealism and materialism and provide a new view of how humanity fits into the universe.

The three ideas are
  • An explanation or epistemological model of knowledge formation and communication. Knowledge and information are neither matter nor energy, but they require matter for expression and energy for communication. They seem to be metaphysical.
    Briefly, we identify knowledge with actionable information in the brain-mind. We justify knowledge by behavioral studies that demonstrate the existence of information structures implementing functions in the brain. And we verify knowledge scientifically.
  • A basis for objective value beyond humanism and bioethics, grounded in the fundamental information creation processes behind the structure and evolution of the universe and the emergence of life.
    Briefly, we find positive value (or good) in information structures. We see negative value (or evil) in disorder and entropy tearing down such structures. We call energy with low entropy "Ergo" and call anti-entropic processes "ergodic."
    Our first categorical imperative is then "act in such a way as to create, maintain, and preserve information as much as possible against destructive entropic processes." Our second ethical imperative is "share knowledge/information to the maximum extent." Like love, our own information is not diminished when we share it with others.
    Our third moral imperative is "educate (share the knowledge of what is right) rather than punish." Knowledge is virtue. Punishment wastes human capital and provokes revenge.
  • A scientific model for free will and creativity informed by the complementary roles of microscopic randomness and adequate macroscopic determinism in a temporal sequence that generates information.
    Briefly, we separate "free" and "will" in a two-stage process - first the free generation of alternative possibilities for action, then an adequately determined decision by the will. We call this two-stage view our Cogito model and trace the idea of a two-stage model in the work of a dozen thinkers back to William James in 1884. This model is a synthesis of adequate determinism and limited indeterminism, a coherent and complete compatibilism that reconciles free will with both determinism and indeterminism.
    David Hume reconciled freedom with determinism. We reconcile free will with indeterminism.
    Because it makes free will compatible with both a form of determinism (really determination) and with an indeterminism that is limited and controlled by the mind, the leading libertarian philosopher Bob Kane suggested we call this model "Comprehensive Compatibilism."
    The problem of free will cannot be solved by logic, language, or even by physics. Man is not a machine and the mind is not a computer.
    Free will is a biophysical information problem.
All three ideas depend on understanding modern cosmology, physics, biology, and neuroscience, but especially the intimate connection between quantum mechanics and the second law of thermodynamics that allows for the creation of new information structures.
All three are based on the theory of information, which alone can establish the existential status of ideas, not just the ideas of knowledge, value, and freedom, but other-worldly speculations in natural religion like God and immortality.
All three have been anticipated by earlier thinkers, but can now be defended on strong empirical grounds. Our goal is less to innovate than to reach the best possible consensus among philosophers living and dead, an intersubjective agreement between philosophers that is the surest sign of a knowledge advance in natural science.
This Information Philosopher website aims to be an open resource for the best thinking of philosophers and scientists on these three key ideas and a number of lesser ideas that remain challenging problems in philosophy - on which information philosophy can shed some light.
Among these are the mind-body problem (the mind can be seen as the realm of information in its free thoughts, the body an adequately determined biological system creating and maintaining information); the common sense intuition of a cosmic creative process often anthropomorphized as a God or divine Providence; the problem of evil (chaotic entropic forces are the devil incarnate); and the "hard problem" of consciousness (agents responding to their environment, and originating new causal chains, based on information processing).
Philosophy is the love of knowledge or wisdom. Information philosophy (I-Phi or ΙΦ) quantifies knowledge as actionable information.
What is information that merits its use as the foundation of a new method of inquiry?
Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is the modern spirit, the ghost in the machine. It is the stuff of thought, the immaterial substance of philosophy.
Over 100 years ago, Bertrand Russell, with the help of G. E. Moore, Alfred North Whitehead, and Ludwig Wittgenstein, proposed logic and language as the proper foundational basis, not only of philosophy, but also of mathematics and science. Their logical positivism and the variation called logical empiricism developed by Rudolf Carnap and the Vienna Circle have proved to be failures in grounding philosophy, mathematics, or science.
Information is a powerful diagnostic tool. It is a better abstract basis for philosophy, and for science as well, especially physics, biology, and neuroscience. It is capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), and idealism itself.
Information philosophy is not a solution to specific problems in philosophy. I-Phi is a new philosophical method, capable of solving multiple problems in both philosophy and physics.
It needs young practitioners, presently tackling some problem, who might investigate that problem using this new methodology. Note that, just as the philosophy of language is not linguistic philosophy, I-Phi is not the philosophy of information, which is mostly about computers and cognitive science.
The language philosophers of the twentieth century thought that they could solve (or at least dissolve) the classical problems of philosophy. They did not succeed. Information philosophy, by comparison, has now cast a great deal of light on some of those problems. It needs more information philosophers to make more progress.


To recap, when information is stored in any structure, two fundamental physical processes occur. First is a "collapse" of a quantum mechanical wave function, reducing multiple possibilities to a single actuality. Second is a local decrease in the entropy corresponding to the increase in information. Entropy greater than that must be transferred away from the new information structure to satisfy the second law of thermodynamics.
These quantum level processes are susceptible to noise. Information stored may have errors. When information is retrieved, it is again susceptible to noise. This may garble the information content. In information science, noise is generally the enemy of information. But some noise is the friend of freedom, since it is the source of novelty, of creativity and invention, and of variation in the biological gene pool.
Biological systems have maintained and increased their invariant information content over billions of generations, coming as close to immortality as living things can. Philosophers and scientists have increased our knowledge of the external world, despite logical, mathematical, and physical uncertainty. They have created and externalized information (knowledge) that can in principle become immortal. Both life and mind create information in the face of noise. Both do it with sophisticated error detection and correction schemes. The scheme we use to correct human knowledge is science, a two-stage combination of freely invented theories and adequately determined experiments. Information philosophy follows that example.
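As a toy illustration of error detection (this parity example is ours, far simpler than the schemes life or science actually uses), a single extra bit suffices to catch any one bit flipped by noise:

```python
# Toy error *detection*: a single parity bit catches any one flipped bit.
# (This scheme is illustrative; real cells and real channels use far richer codes.)

def add_parity(bits):
    """Append one bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(word):
    """True if the stored word still has even parity."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])
assert check_parity(word)       # stored intact: passes

word[2] ^= 1                    # noise flips one bit
assert not check_parity(word)   # corruption detected
print("single-bit error detected")
```

Detection alone cannot repair the damage; correcting errors requires redundancy beyond one bit, which is exactly the kind of sophistication that genetic machinery and scientific replication supply.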

If you have read this far, you probably already know that the Information Philosopher website is an exercise in information sharing. It has seven parts, each with multiple chapters. Navigation at the bottom of each page will take you to the next or previous part or chapter. Teacher and Scholar links display additional material on some pages, and reveal hidden footnotes on some pages. The footnotes themselves are in the Scholar section.
Our goal is for the website to contain all the great philosophical discussions of our three main ideas, plus preliminary solutions for several classic problems in philosophy and physics, with primary source materials (in the original languages) where possible.
Philosophers who would like to develop their expertise in information philosophy should inquire into support possibilities by writing Bob Doyle, the founder of information philosophy.
Support options include online training sessions by Skype or Google Hangouts, perhaps published to YouTube.
Preferences will be given to current graduate students in philosophy or science - physics, biology, psychology, especially - and current post-docs.
All original content on Information Philosopher is available for your use, without requesting permission, under a Creative Commons Attribution License.
Copyrights for all excerpted and quoted works remain with their authors and publishers.

Critique of Ethical Violence - Giving an Account of Oneself, by Judith Butler

Critique of Ethical Violence - Giving an Account of Oneself
By Judith Butler, translated by Yang Hyo-sil / Ingansarang / August 2013




The question begins with Althusser.
That is, with what drove him to despair: the fate of Cesare Borgia. Borgia, the model for Machiavelli's The Prince, collapsed from illness just as he stood on the verge of the great task of unifying a long-divided Italy, and because of that accidental disease the vast historical current toward Italian unification was cut short. "How can this be! So great a history undone by a mere illness!" Machiavelli grieved in earnest. For Althusser, too, this was a shock, for it toppled in an instant the Marxist law of history he had believed in. Marx says that history necessarily follows laws: that the history of humanity has developed by necessity from primitive communism to today's capitalism, and will by the same necessity arrive at a communist society. But the case of Cesare showed something entirely different: that a small accident like an illness can bring down even a vast historical necessity. So Althusser came to revise Marx's materialist view of history and, following the ancient poet Lucretius, to found an aleatory materialism of contingency.

Now the question: is history contingency, or necessity?
Some say these are two sides of the same coin. Consider a creditor waiting for a debtor. The creditor knows exactly what time and by what route the debtor comes home, and so waits quietly around a corner of the alley. The debtor, never imagining the creditor knows that much, heads home that day as lightheartedly as on any other - when suddenly, from the turn in the alley, the creditor steps out as if he had been waiting all along! To the debtor, the creditor's appearance is an accident suffered out of nowhere, a stroke of bad luck; but not to the creditor, for whom the debtor's appearance is necessity. One could settle the matter comfortably by saying that chance and necessity depend on how you look at them, but it cannot be left at that, because determining whether something is truly chance or necessity is possible only for a god-like cognition. Ordinary beings like us, lacking such cognition, cannot clearly distinguish what is contingent from what is necessary. It is at precisely that level that Althusser speaks of an aleatory materialism. Those who took Althusser's theory seriously and established it as a social theory were Ernesto Laclau and Chantal Mouffe. Combining contingent determination with Gramsci's theory of hegemony, the two wrote a book, Hegemony and Socialist Strategy - a major revision of a Marxism grounded in the universal and the necessary. They held that construction from above is more powerful than determination from below, and that hegemony too can change contingently. Through them, in other words, the "contingency" first raised by Althusser came all the more to the fore. And now they pose an important question: if contingency so abounds, what is to become of universality? Can it be established at all?

As you know, postmodernism smashed universality. The end of grand narratives: it granted privilege to the local and the particular. From the moment postmodernism appeared there were voices warning that it would end up propping up consumerism and reinforcing conservatism. That prophecy came true, and postmodernism has been sinking along with the financial panic set off by the 2008 subprime crisis. Francis Fukuyama popped the champagne too soon. The end of history that Hegel spoke of has not yet arrived, and the time has come to think about "universality" once again. Above all, look at what the end of grand narratives has brought. As Samuel Huntington predicted, racism and the conflict between rich and poor grow ever fiercer, because even the minimal ideological frame that once restrained them has disappeared. The United States strives to keep its hegemony, China strives to become the hegemon in its turn, and Japan prepares to cry out once more for a second Greater East Asia Co-Prosperity Sphere. It is a time that urgently demands a "universality" by which all sides can seek a path of coexistence by mutual consent.

One current of recent philosophy lies precisely in founding such a universality, and Judith Butler is one of its representative thinkers. As her major work Gender Trouble, already introduced in Korea, shows, she takes sexual identity to be not determined but constructed. In this way she embraces the contingent: "formation in process" is the key. On just that basis she is also exploring how to establish "universality," and the newly published Critique of Ethical Violence connects this constructivist approach to the establishment of the subjectivity called "I." The original title of Critique of Ethical Violence is Giving an Account of Oneself. Personally, I think the title itself condenses what Judith Butler is trying to do in this book.

Why, then, this preoccupation with explaining oneself? The answer comes easily once we keep "the founding of universality" in mind. Founding universality is, to put it simply, a kind of conversation. And what is the most basic starting point for any conversation? Self-introduction, of course - explaining oneself. Whether the meeting is ceremonial or social, every encounter necessarily involves a process of explaining myself to the other, for only then can we approach a level of mutual understanding. That level of mutual understanding might be called "universality." In that sense, giving an account of myself is the groundwork needed to found a reciprocal universality with the other. To arrive at a universality transparent enough for smoother mutual understanding, the "giving an account of myself" with which it all begins must go well. That is why she dwells on it.

Here a misunderstanding easily follows.
When we say "I give an account of myself," is the "I" we explain something fixed, or something contingent?
Probably many will assume it is a fixed I. But no. Across three parts, Judith Butler discusses what it really means to speak about oneself, and Part 1 shows that the "I" of which I speak is not the I that I have long experienced and known. That is, when we explain ourselves, we do not explain our true selves. More precisely, in the very moment of explaining we are made, and we explain only the self formed in that process. In other words, we do not reveal, "ta-da!", a self we had always possessed; rather, a self that had not yet attained definition first becomes distinct through that explanation. Our identity is formed in the very moment of the account. That is the core of what Part 1 says. Our identity, then, is not a fixed and immutable self but a contingent being, varying in manifold ways in the moment we speak about ourselves. And yet we take ourselves to be quite fixed and immutable - just as everyone thinks, "I am such-and-such a person."

Have you ever wondered
why we understand the possibilities of our own being so narrowly?

Judith Butler reveals the reason.
Here enters the book's title, "Critique of Ethical Violence," which seems to pay homage to Kant's "Critique" series and is explained in detail in Part 2, "Against Ethical Violence." On reflection, the phrase sounds a little odd. How can ethics and violence be joined? Was ethics not originally made, as an agreement among people, precisely to eliminate violence between them? Yet Judith Butler says that such ethics acts upon us violently - hence the title of Part 2, "Against Ethical Violence." What, then, is the ethics at issue here? And how does it come to be violent toward us?

To see this we must return to the moment of explaining oneself. Recall once more the moment of explaining yourself to another. How does our consciousness unfold then? Can we, in that moment, explain ourselves as we are? Do we really reveal ourselves transparently? Probably not. As said above, we cannot show ourselves as we are. For one thing, among people there always exist certain rules of self-introduction, and the act of explaining ourselves always follows those rules. I cannot explain myself just as I please; even if I try, the other demands that I follow the rules. Willy-nilly, I can only explain myself according to rules or forms that already exist outside me. In one sense these are called politeness, or consideration. Thus explaining myself cannot help importing external rules. And not rules only: the very act of explaining myself must always lean on the outside, because the moment I explain myself in language, my words are seized and instantly transformed from a discourse welling up out of my own life into a discourse in the language received by the other. It is newly translated not within the time I inhabited but upon the temporal frame the other inhabited - not on the ground of my life's direct experience but on the experiential frame of the life that other has lived. In other words, however transparently we explain ourselves to each other, so long as we borrow the external forms of language, of grammar, or of etiquette, we can only proceed as people speaking different languages do, listening through translation. I deliberately used the word "grammar" alongside "language" because there is something I want to stress through it: the existence of the "norm" that grammar calls to mind. Etiquette, too, strongly implies that norm. What I want to say is that between us there exists a "norm" as the indispensable middle term through which we introduce ourselves and understand one another.

The moment we explain ourselves, the norm arrives.
And it becomes a frame to which I must fit myself. I am manifold, but the fixed frame of the norm, like the bed of Procrustes in Greek myth, cuts off whatever does not fit the frame. It is violent. The manifold I shrinks to what the norm allows, and since, as said above, my identity is formed in the moment of utterance, at some point that narrowed figure comes to be accepted as my own. So I myself come to accept my being as something narrow and fixed. It is in just this sense that the ethics of external norms acts on us as violence.

Judith Butler explains this through Nietzsche.
Nietzsche says that our giving an account of ourselves arose above all from the juridical system. That is, explaining oneself emerged from proving, when one had injured another, that one had had no such intent. This remark of Nietzsche's strongly suggests that the self we know is not the self wholly as it is, but one formed compulsorily from outside - and accepted, unconsciously, just as given. For a defense before the court inevitably focuses less on revealing me as I am than on fashioning a me that others can better understand. Since explaining myself began from the outset as a defense and has now become second nature, we have defined ourselves, as a kind of self-protection in the face of the ethical violence bearing down on us, as narrow and fixed, immutable beings.

In the end we live taking a distorted self-portrait for the truth, and with such a distorted image we cannot found the transparent universality Judith Butler hopes for. A broken mirror cannot reflect each other's true faces. So, against ethical violence, we must move toward a "giving an account" that can disclose and accept ourselves more fully. That escape, that slippage, is what Part 3, "Responsibility," discusses. For those who may wonder at responsibility appearing in Part 3, let me add that it lies on the same line as Nietzsche's claim that the account of oneself arose under the juridical system - for it was because of the law that responsibility emerged. As the importance of the human individual grew in the modern era, punishment too came to be imposed only for what could be attributed to the individual, no longer by collective guilt as in the Middle Ages. "Responsibility" thus arose as the ground on which the individual must bear that punishment. In a word, responsibility is the completion of "my being-me." Judith Butler explains responsibility this way:

To take responsibility for oneself is to avow the limits of any self-understanding, and to establish these limits not only as a condition of the subject but also as the predicament of the human community (p. 146).

"Responsibility" is the terminus of explaining myself and, along the way, making myself. There the image called "I" is produced - so why does Judith Butler bring it in as the final part? It has to do precisely with the establishment of that image of "I" - more exactly, with exposing the true character of that process. Responsibility presupposes the settled image of the self as we see it; but is that really us as we are? Citing the psychoanalyst Laplanche and Foucault's theory of confession, Judith Butler smashes this settled image of the self to pieces - such a thing, she says, does not exist at all.

What is startling is Laplanche's theory. He asserts that from infancy, when the shape of our self is first formed, we build our self not by ourselves but in response to the other. Put simply, the self called "I" is not made at my own will but is formed by the reactions of others, the result of mutual adjustment. The other has entered us primordially and deeply, presiding even over the formation of our self - that is Laplanche's claim. So, just as in Nietzsche's remark that "I" was first born when the law asked who I was, there is from the start no true self of ours. If anything exists, it is only an "I" formed by fitting itself, occasion by occasion, within interactions with countless others. Foucault's theory of confession proves this further. Foucault says that confession does not reveal an inner truth; rather, the inner truth is formed through the very act of confession. Confession, that is, is a bodily practice that forms one's self. In this way Foucault says that our self-expression through confession "dissolves one's interiority and reconstitutes oneself in the exteriority of the self." This joins directly with Laplanche's theory that our self has, from the beginning, been formed by the other.

Thus both Laplanche and Foucault show that our self is not fixed but has been transformed and revised, time after time, by others. What we now know as the truth about ourselves is likewise no more than something that hardened at some moment of our life through some occasion with another. Can there be, then, a settled self on which "responsibility" stamps its seal with a bang? It is impossible. Citing Laplanche, Foucault, and Levinas, what Judith Butler argues in the chapter on responsibility is that I am a being ceaselessly in formation - that I cannot be univocally fixed at any point in time and can vary endlessly, like a Feynman path. As Foucault said, "to give a definitive account of oneself is an effort forever doomed to fail."

Judith Butler thinks it important to regard my being as thus always open to revision. Of course not every part of our being is revisable, but she holds that regarding myself as a being open to all change and revision is a precious first step toward a more transparent universality, because only by accepting myself this way can I step aside, as far as possible, from ethical violence and listen to the other with an open heart. Not insisting on my being-me is, above all, the first stair toward mutual accommodation. What finally emerges here is what explaining myself really means. It is not, as we might hastily think from the phrase, showing my "being-me" to others; the truth is that it is an act of tearing myself down that much more and opening myself to the other. Simply put, to explain is to make a blank space within me - or to grow more ears. Emptying myself, embracing the other anew, and making myself over again: that is the true face of giving an account of oneself.

In these days when mouths eager to speak abound and ears willing to listen are few - am I really the only one who treasures an argument like Judith Butler's?