Principia Cybernetica Web

What is complexity?

Complexity has turned out to be very difficult to define. The dozens of definitions that have been offered all fall short in one respect or another, classifying as complex something we would intuitively see as simple, or denying an obviously complex phenomenon the label of complexity. Moreover, these definitions are either only applicable to a very restricted domain, such as computer algorithms or genomes, or so vague as to be almost meaningless. Edmonds (1996) gives a good review of the different definitions and their shortcomings, concluding that complexity necessarily depends on the language that is used to model the system. Still, I believe there is a common, "objective" core in the different concepts of complexity. Let us go back to the original Latin word complexus, which signifies "entwined", "twisted together". This may be interpreted in the following way: in order to have a complex you need two or more components, which are joined in such a way that it is difficult to separate them. Similarly, the Oxford Dictionary defines something as "complex" if it is "made of (usually several) closely connected parts". Here we find the basic duality between parts which are at the same time distinct and connected.

Intuitively then, a system would be more complex if more parts could be distinguished, and if more connections existed between them. More parts to be represented means more extensive models, which require more time to be searched or computed. Since the components of a complex cannot be separated without destroying it, the method of analysis or decomposition into independent modules cannot be used to develop or simplify such models. This implies that complex entities will be difficult to model, that eventual models will be difficult to use for prediction or control, and that problems will be difficult to solve. This accounts for the connotation of "difficult" which the word "complex" has acquired in later periods.
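
To make the intuition of "more parts, more connections" a little more tangible, here is a minimal sketch in Python (my own illustration, not part of the original text): a system is represented as a set of parts and a set of connections, and both are simply counted. The example systems are invented for the purpose.

    # Minimal illustrative sketch: a system as a set of parts (distinctions)
    # and a set of connections between them. Counting both gives a crude
    # indication of how much a model of the system has to represent.

    def parts_and_connections(parts, connections):
        """Return (number of distinguishable parts, number of connections)."""
        return len(parts), len(connections)

    # Hypothetical examples: an isolated component versus a small "complexus"
    # whose parts cannot be separated without destroying its structure.
    isolated = parts_and_connections({"A"}, set())
    entwined = parts_and_connections({"A", "B", "C"},
                                     {("A", "B"), ("B", "C"), ("A", "C")})

    print(isolated)  # (1, 0)
    print(entwined)  # (3, 3)

Mere counting is of course far too crude to serve as a definition; it only illustrates why a larger, more connected system demands a more extensive model.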

The aspects of distinction and connection determine two dimensions characterizing complexity. Distinction corresponds to variety, to heterogeneity, to the fact that different parts of the complex behave differently. Connection corresponds to constraint, to redundancy, to the fact that different parts are not independent, but that knowledge of one part allows the determination of features of the other parts. Distinction leads in the limit to disorder, chaos or entropy, as in a gas, where the position of any gas molecule is completely independent of the positions of the other molecules. Connection leads to order or negentropy, as in a perfect crystal, where the position of a molecule is completely determined by the positions of the neighbouring molecules to which it is bound. Complexity can only exist if both aspects are present: neither perfect disorder (which can be described statistically through the law of large numbers), nor perfect order (which can be described by traditional deterministic methods) is complex. Complexity can thus be said to be situated between order and disorder, or, using a recently fashionable expression, "on the edge of chaos".
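
The interplay of variety and constraint can be given a rough numerical illustration. The sketch below (again my own, not part of the original text) uses conditional Shannon entropy as a stand-in for the variety that remains once connections are taken into account: a perfectly ordered sequence scores near 0 bits per symbol, a perfectly random one near 1 bit, and a partly constrained sequence, in which both distinction and connection are present, falls in between. The sequences and the 0.8 "stickiness" parameter are arbitrary choices.

    # Illustrative sketch: conditional entropy as a measure of variety not
    # removed by constraint. Pure order and pure disorder sit at the extremes;
    # the partly constrained case lies in between.
    import math, random
    from collections import Counter

    def block_entropy(seq, k):
        """Shannon entropy (bits) of the distribution of overlapping k-blocks."""
        blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
        n = len(blocks)
        return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())

    def conditional_entropy(seq):
        """H(next symbol | previous symbol) = H(pairs) - H(single symbols)."""
        return block_entropy(seq, 2) - block_entropy(seq, 1)

    random.seed(1)
    ordered = "AB" * 5000                                            # crystal-like: full constraint
    disordered = "".join(random.choice("AB") for _ in range(10000))  # gas-like: no constraint
    sticky = ["A"]                                                   # partly constrained Markov chain
    for _ in range(9999):
        prev = sticky[-1]
        sticky.append(prev if random.random() < 0.8 else ("B" if prev == "A" else "A"))
    sticky = "".join(sticky)

    print(round(conditional_entropy(ordered), 2))     # about 0.0 bits: pure order
    print(round(conditional_entropy(disordered), 2))  # about 1.0 bits: pure disorder
    print(round(conditional_entropy(sticky), 2))      # about 0.7 bits: in between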

The simplest way to model order is through the concept of symmetry, i.e. invariance of a pattern under a group of transformations. In a symmetric pattern, one part of the pattern is sufficient to reconstruct the whole. For example, in order to reconstruct a mirror-symmetric pattern, such as a human face, you need to know only one half and then add its mirror image. The larger the group of symmetry transformations, the smaller the part needed to reconstruct the whole, and the more redundant or "ordered" the pattern. For example, a crystal structure is typically invariant under a discrete group of translations and rotations. A small assembly of connected molecules is a sufficient "seed", out of which the positions of all other molecules can be generated by applying the different transformations. Empty space is maximally symmetric or ordered: it is invariant under any possible transformation, and any part, however small, can be used to generate any other part.
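
As a toy illustration of this redundancy (my own sketch, not an example from the text), the snippet below rebuilds a mirror-symmetric pattern from one half, and a one-dimensional "crystal" from a single repeated motif; the strings used are invented.

    # Toy illustration: a symmetric pattern can be regenerated from a part
    # plus the symmetry transformations, so most of the pattern is redundant.

    def from_half(left_half):
        """Rebuild a mirror-symmetric pattern from its left half."""
        return left_half + left_half[::-1]

    def from_seed(motif, copies):
        """Rebuild a periodic 1-D 'crystal' by translating a small seed motif."""
        return motif * copies

    print(from_half("abc"))        # abccba: half the information suffices
    print(from_seed("Na-Cl ", 5))  # a periodic lattice grown from one cell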

It is interesting to note that maximal disorder too is characterized by symmetry, not of the actual positions of the components, but of the probabilities that a component will be found at a particular position. For example, a gas is statistically homogeneous: any position is as likely to contain a gas molecule as any other position. In actuality, the individual molecules will not be evenly spread. But if we look at averages, e.g. the centers of gravity of large assemblies of molecules, then because of the law of large numbers the actual spread will again be symmetric or homogeneous. Similarly, a random process, such as Brownian motion, can be defined by the fact that all possible transitions or movements are equally probable.
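
A quick simulation sketch (once more my own illustration, with arbitrary numbers) shows this statistical symmetry: the individual positions are irregular, but the coarse-grained occupation of equal cells comes out nearly uniform.

    # Illustrative simulation: drop molecules uniformly at random into a 1-D
    # box divided into equal cells. No single position is determined by any
    # other, yet the coarse-grained density is nearly the same everywhere.
    import random

    random.seed(2)
    N_MOLECULES, N_CELLS = 100_000, 10

    counts = [0] * N_CELLS
    for _ in range(N_MOLECULES):
        counts[random.randrange(N_CELLS)] += 1

    print(counts)  # each cell holds roughly N_MOLECULES / N_CELLS molecules

By the law of large numbers the relative fluctuations shrink as the number of molecules grows, so the probability distribution, unlike the individual positions, is invariant under any rearrangement of the cells.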

Complexity can then be characterized by a lack of symmetry or "symmetry breaking", by the fact that no part or aspect of a complex entity can provide sufficient information to actually or statistically predict the properties of the other parts. This again connects to the difficulty of modelling associated with complex systems.

Edmonds (1996) notes that the definition of complexity as midpoint between order and disorder depends on the level of representation: what seems complex in one representation may seem ordered or disordered in a representation at a different scale. For example, a pattern of cracks in dried mud may seem very complex. When we zoom out and look at the mud plain as a whole, though, we may see just a flat, homogeneous surface. When we zoom in and look at the different clay particles forming the mud, we see a completely disordered array. The paradox can be elucidated by noting that scale is just another dimension characterizing space or time (Havel, 1995), and that invariance under geometrical transformations, such as rotations or translations, can be similarly extended to scale transformations (homotheties).

Havel (1995) calls a system "scale-thin" if its distinguishable structure extends only over one or a few scales. For example, a perfect geometrical form, like a triangle or circle, is scale-thin: if we zoom out, the circle becomes a dot and disappears from view in the surrounding empty space; if we zoom in, the circle similarly disappears from view and only homogeneous space remains. A typical building seen from the outside has distinguishable structure on 2 or 3 scales: the building as a whole, the windows and doors, and perhaps the individual bricks. A fractal or self-similar shape, on the other hand, has infinite scale extension: however deeply we zoom in, we will always find the same recurrent structure. A fractal is invariant under a discrete group of scale transformations, and is as such orderly or symmetric on the scale dimension. The fractal is somewhat more complex than the triangle, in the same sense that a crystal is more complex than a single molecule: both consist of a multiplicity of parts or levels, but these parts are completely similar.
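
Scale symmetry can be made concrete with the middle-thirds Cantor set, a standard self-similar example (my own choice of illustration, not one used in the text): the construction is invariant under the discrete scale transformation that magnifies its left third by a factor of three, so zooming in reproduces the whole structure.

    # Illustrative check of discrete scale invariance for the middle-thirds
    # Cantor set: magnifying the left third by 3 reproduces the construction
    # one level earlier, so the same structure recurs at every scale.
    from fractions import Fraction

    def cantor(depth):
        """Intervals of the middle-thirds Cantor construction after `depth` steps."""
        intervals = [(Fraction(0), Fraction(1))]
        for _ in range(depth):
            nxt = []
            for a, b in intervals:
                third = (b - a) / 3
                nxt += [(a, a + third), (b - third, b)]
            intervals = nxt
        return intervals

    whole = cantor(3)
    left_zoomed = [(3 * a, 3 * b) for a, b in cantor(4) if b <= Fraction(1, 3)]
    print(whole == left_zoomed)  # True: the zoomed-in left third equals the whole

A circle, by contrast, loses all distinguishable structure after a few zooms in either direction, which is what makes it scale-thin.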

To find real complexity on the scale dimension, we may look at the human body: if we zoom in we encounter complex structures at least at the levels of complete organism, organs, tissues, cells, organelles, polymers, monomers, atoms, nucleons, and elementary particles. Though there may be superficial similarities between the levels, e.g. between organs and organelles, the relations and dependencies between the different levels are quite heterogeneous, characterized by both distinction and connection, and by symmetry breaking.

We may conclude that complexity increases when the variety (distinction) and dependency (connection) of parts or aspects increase, and this in several dimensions. These include at least the ordinary three dimensions of spatial, geometrical structure, the dimension of spatial scale, the dimension of time or dynamics, and the dimension of temporal or dynamical scale. In order to show that complexity has increased overall, it suffices to show that, all other things being equal, variety and/or connection have increased in at least one dimension.

The process of increase of variety may be called differentiation, the process of increase in the number or strength of connections may be called integration. We will now show that evolution automatically produces differentiation and integration, and this at least along the dimensions of space, spatial scale, time and temporal scale. The complexity produced by differentiation and integration in the spatial dimension may be called "structural", in the temporal dimension "functional", in the spatial scale dimension "structural hierarchical", and in the temporal scale dimension "functional hierarchical".

It may still be objected that distinction and connection are in general not given, objective properties. Variety and constraint will depend upon what is distinguished by the observer, and in realistically complex systems determining what to distinguish is a far from trivial matter. What the observer does is pick out those distinctions which are somehow the most important, creating high-level classes of similar phenomena, and neglecting the differences which exist between the members of those classes (Heylighen, 1990). Depending on which distinctions the observer makes, he or she may see the variety and dependency (and thus the complexity of the model) as larger or smaller, and this will also determine whether the complexity is seen to increase or decrease.

For example, when I noted that a building has distinguishable structure down to the level of bricks, I implicitly ignored the molecular, atomic and particle structure of those bricks, since it seems irrelevant to how the building is constructed or used. This is possible because the structure of the bricks is independent of the particular molecules out of which they are built: it does not really matter whether they are made out of concrete, clay, plaster or even plastic. On the other hand, in the example of the human body, the functioning of the cells critically depends on which molecular structures are present, and that is why it is much more difficult to ignore the molecular level when building a useful model of the body. In the first case, we might say that the brick is a "closed" structure: its inside components do not really influence its outside appearance or behavior (Heylighen, 1990). In the case of cells, though, there is no pronounced closure, and that makes it difficult to abstract away the inside parts.

Although there will always be a subjective element involved in the observer's choice of which aspects of a system are worth modelling, the reliability of models will critically depend on the degree of independence between the features included in the model and the ones that were not included. That degree of independence will be determined by the "objective" complexity of the system. Though we are in principle unable to build a complete model of a system, the introduction of the different dimensions discussed above helps us at least to get a better grasp of its intrinsic complexity, by reminding us to include at least distinctions on different scales and in different temporal and spatial domains.

Copyright © 1996 Principia Cybernetica

Author
F. Heylighen

Date
Dec 9, 1996
