The Evolution of Complexity - Abstracts.

Causality as Constraint

By Alicia Juarrero
  • Prince George's Community College
  • Largo, Maryland 20772-2199
  • Tel: (202) 342-612
  • E-Mail:
  • Abstract:

    I am confident that a careful analysis of the concept of "constraint" can shed light on many of the key concepts on which the conference will focus (self-organization, selection, adaptation, variety, hierarchy, control, and metasystem transitions). From a philosopher's point of view, a mechanistic understanding of causality is to blame for many of those puzzles. Although I am not a Wittgensteinian, I am confident that clarifying the source of the misunderstandings might help dissolve many of them.

    While evolutionary theorists puzzle over such issues as morphogenesis, development versus selection, and measures of complexity, philosophers agonize over those of causality, self-referential paradoxes, and meaning. And behind but spanning both science and philosophy lurk the seemingly unbridgeable differences in understanding the concept of "information."

    It is the claim of this paper that the concept of constraint may serve as the hinge that articulates the relationships among many of these concepts. In particular, when dealing with hierarchical systems that are self-referential and display inter-level effects, the notion of causality itself must be reconceptualized in terms other than that of the billiard-ball, collision conception which is the legacy of mechanism. Understanding causality *as* the workings of constraint reveals new questions leading to interesting new avenues of research, and renders other questions moot.

    Depew & Weber focus on the question of whether the world exhibits real propensities which may not only provide the objective basis for probability theory but also be the source of self-organization and complexity. Without disagreeing with their linking of propensities and probabilities, I wish to suggest that "propensities," in the sense used by Popper, Ulanowicz, and Depew & Weber, are just one version (the bottom-up, enabling version) of a more general notion of constraint. Enabling constraints are responsible for the emergent properties of metasystem transitions. In turn, physical stability, chemical efficiency, and fitness are the other side of the constraint coin: measures of top-down selectionist constraints which wholes exert on their components. The "supervenience" which complex systems display with respect to their microstates can also better be understood as a version of "weak constraint satisfaction," in the sense used in neural network theory. In addition, one must keep in mind that evolving systems exhibit diachronic as well as synchronic constraints.

    The word "information" is ambiguous, even within the discipline of information theory. Constraints take systems away from the equilibrium of noise and are thus required to create form and information; in this sense "information" is a measure of the reduction of uncertainty (a decrease in entropy). As Lila Gatlin notes, "information density is a measure of all the constraints placed upon a sequence of symbols." On the other hand, Weaver himself notes that "this word 'information' in communication theory relates not so much to what you do say as to what you can say." On this view, since randomness equates with unpredictability and constant novelty, potential message variety will be at its maximum at equilibrium. Lila Gatlin redescribes Shannon's two divergences as kinds of constraints: the first amounting to deviation from equiprobability (an efficient but expensive form of redundancy which limits message variety), the other to deviation from independence (a form of conditional probability which does not suffer from the limitations of the first). Gatlin argues that the explosion of phenotypes that occurred with the appearance of the vertebrates took place precisely because vertebrates managed to maintain one kind of constraint constant while allowing another to expand. If this is so, constraints must do double duty: they must decrease entropy in the sense of noise while simultaneously increasing entropy in the sense of expanding potential message variety. How is this possible?
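    The two kinds of constraint Gatlin distinguishes can be given a minimal numerical sketch. The function names, the four-letter alphabet, and the toy sequences below are illustrative assumptions, not taken from the abstract: the first quantity measures deviation from equiprobability (bias in single-symbol frequencies), the second deviation from independence (dependence of each symbol on its predecessor).

    ```python
    import math
    from collections import Counter

    def entropy(probs):
        """Shannon entropy in bits of a probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def deviation_from_equiprobability(seq, alphabet):
        """Gap between the maximum entropy log2|A| and the observed
        single-symbol entropy (Gatlin's first kind of constraint)."""
        counts = Counter(seq)
        h1 = entropy([counts[a] / len(seq) for a in alphabet])
        return math.log2(len(alphabet)) - h1

    def deviation_from_independence(seq, alphabet):
        """Gap between the single-symbol entropy and the conditional
        entropy of each symbol given its predecessor (Gatlin's second
        kind of constraint), estimated from adjacent pairs."""
        m = len(seq) - 1
        firsts = Counter(seq[:-1])
        pairs = Counter(zip(seq, seq[1:]))
        h_first = entropy([c / m for c in firsts.values()])
        h_pair = entropy([c / m for c in pairs.values()])
        # H(next | current) = H(current, next) - H(current)
        return h_first - (h_pair - h_first)

    alphabet = "ACGT"
    varied = "ACGTACGTTGCAACGTGATC"       # near-equiprobable, little order
    constrained = "AAAACCCCAAAACCCCAAAA"  # biased frequencies, strong order

    for label, s in [("varied", varied), ("constrained", constrained)]:
        d1 = deviation_from_equiprobability(s, alphabet)
        d2 = deviation_from_independence(s, alphabet)
        print(f"{label}: D1={d1:.3f} bits, D2={d2:.3f} bits")
    ```

    On this toy data the biased, ordered sequence scores high on both measures while the varied one scores near zero, which is one way to see how a system could hold the first constraint fixed while letting the second grow, preserving message variety while still departing from noise.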

    In this paper I explore in detail how the second kind of constraint, deviation from independence, accomplishes this apparent contradiction, limiting and at the same time increasing degrees of freedom. I also examine why not only the pragmatics but the logic of explanation must be contextual, both temporally and environmentally.