Stuart A. Kauffman
Investigations
Chapter 9: The Persistently Innovative Econosphere
Oxford University Press, 2000, pp. 211-241
Contents
General Competitive Equilibrium and Its Limitations
Rational Expectations and Its Limitations
Natural Rationality Is Bounded
Technology Graphs and Economic Webs
IT IS NO ACCIDENT that the words for economics and
ecology have the same Greek root, “house.” Ecology and economics are, at root, the same. The economy of Homo habilis and Homo
erectus, the stunning flaked flint tools of the Magdalenian culture of the
magnificent Cro-Magnon in southern France 14,000 years ago when the
large beasts had retreated southward from the glaciation, the invention and
spread of writing in Mesopotamia, the Greek agora, and today’s global economy
are all in the deepest sense merely the carrying on of the more diversified
forms of trade that had their origins with the first autonomous agents and
their communities over four billion years ago.
Economics has its roots in agency and the emergence of
advantages of trade among autonomous agents. The advantages of trade predate the human
economy by essentially the entire history of life on this planet. Advantages of trade are found in the metabolic
exchange of legume root nodule and fungi, sugar for fixed nitrogen carried in
amino acids. Advantages of trade were
found among the mixed microbial and algal communities along the littoral of the
earth’s oceans four billion years ago. The
trading of the econosphere is an outgrowth of the trading of the biosphere.
Economics has considered itself the science of
allocation of scarce resources. In doing so, it shortchanges its proper domain. Indeed, if we stand back and squint, it
is easy to see the most awesome feature of an economy and its roots in
autonomous agents: The most awesome feature of the econosphere, as of the biosphere
- both built by communities of autonomous agents in their urgent plunging,
lunging, sliding, gliding, hiding, trading, and providing - has been a
blossoming diversity of molecular and organismic species and of novel ways of
making a living that has persistently burgeoned into the adjacent possible. From tens of organic molecular species to tens
of trillions; from one or a few species of autonomous agents to a standing
diversity of some hundred million species and a total diversity some hundred to
thousandfold larger of those creatures come and gone.
Homo erectus
had fire and early tools. Homo habilis traded stone axe parts 1.6
million years ago. The diversity of
Cro-Magnon goods and services in the south of France some 14,000 years ago may
have numbered in the several hundreds to a few thousands. Today, surf the web and count the diversity of
goods and services, the ways of making a living; it is in the millions.
Neither the biosphere nor the econosphere is merely
about the distribution of limited resources; both are expressions of the
immense creativity of the universe, and in particular, of autonomous agents as
we exapt molecularly, morphologically, and technologically in untold,
unforetellable ways persistently into the adjacent possible. Jobs and job holders jointly coevolve into
existence in the econosphere in an ever-expanding web of diverse complexity.
One of the most striking facts about current economic
theory is that it has no account of this persistent secular explosion of
diversity of goods, services, and ways of making a living. Strange, is it not, that we have no theory of
these overwhelming facts of the biosphere and econosphere? Strange, is it not, that we pay no attention
to one of the most profound features of the world right smack in front of our
collective nose? And consistent with
this strangeness, the most comprehensive theory, the rock foundation of modern
economics, the beautiful “competitive general equilibrium” theory of Arrow and
Debreu, does not and cannot discuss this explosion, perhaps the most important
feature of the economy.
General Competitive Equilibrium and Its Limitations
So we begin with an outline of competitive general
equilibrium as the cornerstone conceptual framework of modern economics. Ken Arrow is a friend. As one of the inventors of the framework, Ken
is more at liberty to be a critic than the two generations of economists who
have followed in his footsteps. My best
reading of Ken’s view, which I share, is that competitive general equilibrium
is, at present, the only overarching framework we have to think about the
economy as a whole. Yet Ken suspects
that that framework is incomplete; I agree.
Competitive general equilibrium grows out of a conceptual framework in
which the core question is how prices form such that markets clear. Recall the famous supply-and-demand curves for a single
good (Figure 9.1). As a function of increasing price, plotted on
the x-axis, supply, plotted on the y-axis, increases from low to high. More companies are willing to create widgets
as the price per widget increases. On
the other hand, demand, where demand is also plotted on the y-axis, decreases
from high to low as prices increase. Fewer customers are willing to buy widgets as
the price per widget increases.
As the figure shows, the supply-and-demand curves
cross at some price. At that price, the
markets “clear,” that is, all the widgets supplied are purchased. The price at which markets clear is the
“equilibrium price.”
For a single good, the problem is simple. But consider bread and butter. Since many of us like butter on our bread, the
demand for butter depends not only on the price and hence the supply of butter,
but also on the price of bread, hence on the supply of bread. And vice versa, the demand for bread depends
upon the price of butter and hence on the supply of butter. For thousands of goods, where the demand for
any one good depends upon the price of many goods and the supply of any one
good depends upon the price of many goods, it is not so obvious that there is a
price for each good such that all markets clear.
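Here is a minimal numerical sketch of what “markets clear” means once goods are coupled; it is my illustration, not part of the text, and the linear supply and demand functions and their coefficients for “bread” and “butter” are invented. Each good’s demand depends on both prices; the clearing prices are the pair at which both excess demands vanish.

```python
# Hypothetical linear supply/demand for two coupled goods (bread, butter).
# All coefficients are invented for illustration only.
import numpy as np

# Supply of each good rises with its own price: S_i = a_i * p_i
a = np.array([2.0, 3.0])          # supply slopes for bread, butter

# Demand for each good falls with its own price and also depends on the
# other good's price (they are complements): D = d0 - B @ p
d0 = np.array([10.0, 12.0])       # demand intercepts
B = np.array([[1.5, 0.5],         # bread demand falls with bread AND butter prices
              [0.4, 2.0]])        # butter demand falls with butter AND bread prices

# Market clearing: a_i * p_i = d0_i - (B @ p)_i  =>  (diag(a) + B) @ p = d0
p = np.linalg.solve(np.diag(a) + B, d0)
print("clearing prices:", p)
print("excess demand:", d0 - B @ p - a * p)   # approximately zero at equilibrium
```

With only two goods a direct solve suffices; the text’s point is that with thousands of mutually coupled goods the existence of such a price vector is no longer obvious.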
But worse, the supply or demand for bread today may be
different than the supply or demand for bread tomorrow. And still worse, the supply or demand for
bread tomorrow may depend on all sorts of odd contingent facts. For example, if severe cold kills the winter
wheat next month, the supply of bread will drop; if a bumper crop of winter wheat comes available globally because of weather or suddenly improved irrigation and farming practices worldwide, the supply will go up.

FIGURE 9.1 (not reproduced) Textbook supply and demand curves for a single good, the supply curve increasing as a function of price, the demand curve decreasing as a function of price. The crossing point is the equilibrium price at which supply equals demand and markets clear. For an economy with many coupled goods, the requirements for market clearing are more complex (see text).
Arrow and Debreu made brilliant steps. First, they consider a space of all possible
dated and contingent goods. One of their
examples of a “dated contingent good” is “1 ton of wheat delivered in Chicago
on May 2, 2008, under the condition that the average rainfall in Nebraska for
the six preceding months has been 10 percent less than normal for the past
fifty years and that the Boston Red Sox won the World Series the previous
year.”
In the Arrow-Debreu theory, we are to imagine an
auctioneer, who at a defined beginning of time, say, this morning, holds an
auction covering all possible dated contingent goods. All suppliers and customers gather at this imaginary
auction, bidding ensues for all possible dated contingent goods, with values
calculated under different hypotheses about the probabilities that the
different dated contingencies will come about. At the end of an imaginary hour of frantic
bargaining, the auction closes. All
participants now have contracts to buy or sell all possible dated contingent
goods, each at a fixed price. Everybody
hustles home to watch Good Morning America. And, wondrously, however the future
unfolds, whether there’s rain, sun, or snow in Nebraska, the dated contingent
contracts that are appropriate come due, the contracts are fulfilled at the
preestablished price for each contract, and all markets clear.
It is mind-boggling that Arrow and Debreu proved these
results. The core means of the solution
depends upon what mathematicians call “fixed-point theorems.” A beginning case is your hair, particularly
for males, where short hair makes the fixed point easy to see. When you comb your hair in a normal fashion,
there is a point roughly on top of your head, slightly to the back, where a
roughly circular swirl of hair occurs (ignoring baldness) around a fixed point
where typically a bit of scalp shows through.
A general theorem considers hair on a spherical
surface and combing the hair in any way you want. You cannot avoid a fixed point. More generally, replace hair by arrows, with
tails and heads, where each arrow is a line with an arrow head at one end,
drawn on the surface of the sphere. The
arrows may bend if you like. Each arrow can be
thought of as mapping the point on the sphere at the tail of that arrow to the
point of the sphere at the tip of that arrow. So the arrows are a mapping of the surface of
the sphere onto itself. For a continuous
mapping, such that there is a mapping from each point on the sphere, a general
fixed-point theorem proves that there must be at least one point on the surface
of the sphere that maps onto itself - that point is a fixed point under the
arrow mapping.
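A one-dimensional caricature of the fixed-point idea may help; it is my sketch, not the machinery Arrow and Debreu actually used, and the particular map below is invented. Any continuous map of the interval [0, 1] into itself must cross the diagonal, and that crossing is a fixed point, which can be read as a price the price-adjustment map sends back to itself.

```python
# Find a fixed point p* = f(p*) of a continuous map of [0, 1] into itself
# by bisection on g(p) = f(p) - p. The map f is an invented stand-in for a
# price-adjustment rule; its values all lie inside [0.3, 0.8].
def f(p):
    return 0.3 + 0.5 * p * p

def fixed_point(f, lo=0.0, hi=1.0, tol=1e-10):
    g = lambda p: f(p) - p
    # g(0) > 0 and g(1) < 0 because f stays inside [0, 1], so a root
    # (a fixed point of f) exists by the intermediate value theorem.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

p_star = fixed_point(f)
print(p_star, f(p_star))   # the two numbers agree: p* is a fixed point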
The wonderful Arrow-Debreu general competitive
equilibrium theorems depend on such a fixed point. In a space where all possible dated contingent
goods can be prestated and all possible markets for trade of all such goods
exist, a fixed point also exists that corresponds to a price at which all such
markets clear, however the future unfolds. Arrow and Debreu won the Nobel Prize for their
work, and won it deservedly. It is a
beautiful theory. Yet there are
important critiques of general competitive equilibrium. For example, the theorem depends on “complete
markets,” that is, markets to trade all possible dated contingent goods, and
fine economists have raised issues about how well the theory works if markets
are incomplete, as indeed they are.
I pose, however, a wider set of issues. The
overarching feature of the economy over the past million years or so is the
secular increase in the diversity of goods and services, from a dozen to a
dozen million or more today. Nowhere
does general competitive equilibrium speak about this. Nor can the theory speak about the growth in
diversity of goods and services, for it assumes at the outset that one can
finitely prestate all possible dated contingent goods. Then the theory uses complete markets and a
fixed-point theorem to prove that a price exists such that markets clear.
But we have seen grounds to be deeply suspicious of
the claim that we can finitely prestate all possible exaptations - whether they
be new organic functionalities or new goods - that arise in a biosphere or an
econosphere, such as Gertrude learning to fly in her terrified leap from the
pine tree 60 million years ago last Friday or the engineers working on the
tractor suddenly realizing that the engine block itself could serve as the
chassis.
I do not believe for a moment that we can finitely
prestate all possible goods and services. Indeed, economists intuitively know this. They distinguish between normal uncertainty
and “Knightian uncertainty.” Normal
uncertainty is the kind we are familiar with in probability theory concerning
flipping coins. I am unsure whether in 100
flips there will be 47 heads and 53 tails. Thanks to lots of work, I can now calculate
the probability of any outcome in the finitely prestated space of possible
outcomes.
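For the coin example the outcome space is prestated, so every probability is calculable; a two-line computation (my illustration) gives the chance of exactly 47 heads in 100 fair flips.

```python
from math import comb

# Probability of exactly 47 heads in 100 fair flips: C(100, 47) / 2**100.
# The space of 2**100 outcomes is prestated, so the calculation is routine.
p_47 = comb(100, 47) / 2**100
print(p_47)   # roughly 0.067
```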
Knightian uncertainty concerns those cases where we do
not yet know the possible outcomes. Knightian uncertainty has rested in an
epistemologically uncomfortable place in economics and elsewhere. Why? Because we have not realized that we cannot
finitely prestate the configuration space of a biosphere or an econosphere; by
contrast, Newton, Laplace, Boltzmann, Einstein, and perhaps Bohr have all more
or less presupposed that we can finitely prestate the configuration space of
any domain open to scientific enquiry. After
all, as I have noted, we can and do prestate the 6N configuration space for a
liter of N gas particles.
From this point of view, the wonderful Arrow-Debreu
theory is fundamentally flawed.
Moreover, general competitive equilibrium, seen as a
culmination of one central strand of economic theory, is too limited. Insofar as economics is concerned with
understanding the establishment of prices at which markets clear, general
competitive equilibrium was a masterpiece. But insofar as economics is or should
be concerned with how and why economies
increase the diversity of goods and services, the reigning theory is a
nonstarter. And since the growth in
wealth per capita over the past million years is deeply related to the growth
in the diversity of technology and goods and services, contemporary economics
is clearly inadequate.
We need a theory of the persistent coming into
existence of new goods and services and extinction of old goods and services, rather
like the persistent emergence of new species in an ecosystem and extinction of
old species. In the previous chapter, we
discussed ecosystems as self-organized critical. We discussed the biosphere and econosphere as
advancing into the adjacent possible in self-organized critical small and large
bursts of avalanches of speciation and extinction events. We discussed the power law distribution of
extinction events in the biological record. And we discussed the power law distribution of
lifetimes of species and genera.
But the econosphere has similar extinction and
speciation events. Consider my favorite
example: The introduction of the automobile drove the horse, as a mode of
transport, extinct. With the horse went
the barn, the buggy, the stable, the smithy, the saddlery, the Pony Express. With the car came paved roads, an oil and gas industry, motels, fast-food restaurants, and
suburbia. The Austrian economist Joseph Schumpeter called these “gales of creative destruction,” where old goods die and
new ones are born. One bets that Schumpeterian
gales of creative destruction come in a power law distribution, with many small
avalanches and few large ones. More, if
species and genera have a power law distribution of lifetimes, what of firms? Firms do show a similar power law distribution
of lifetimes. Most firms die young, some
last a long time. We may bet that
technologies show similar avalanches of speciation and extinction events and
lifetime distributions.
The parallels are at least tantalizing, and probably
more than that. While the mechanisms of
heritable variation differ and the selection criteria differ, organisms in the
biosphere and firms and individuals in the econosphere are busy trying to make
a living and explore new ways of making a living. In both cases, the puzzling conditions for the
evolutionary cocreation and coassembly of increasing diversity are present. The biosphere and econosphere are persistently
transforming, persistently inventing, persistently dying, persistently getting
on with it, and, on average, persistently diversifying. And into a framework of such a diversifying
set of goods and services we must graft the central insights of general
competitive equilibrium as the approximate short-timescale mechanism that
achieves a rough-and-ready approximate clearing of markets at each stage of the
evolution of the economy.
A rough hypothetical biological example may help
understand market clearing in a more relaxed formal framework than general
competitive equilibrium. Consider two
bacterial species, red and blue. Suppose
the red species secretes a red metabolite, at metabolic cost to itself, that aids the replication rate of the blue species. Conversely, suppose the blue species secretes
a different blue metabolite,
at metabolic cost to itself, that increases
the replication rate of the red species. Then the conditions for a mutualism are
possible. Roughly stated, if blue helps
red more than it costs itself, and vice versa, a mixed community of blue and
red bacteria may grow. How will it
happen? And is there an optimal
“exchange rate” of blue-secreted metabolite to red-secreted metabolite, where
that exchange rate is the analogue of price?
Well, it can and does happen. Here is the gedankenexperiment: Imagine an ordered
set of blue mutant bacteria that secrete different amounts of the blue
metabolite that helps the red bacteria. Say
the range of secretion is from 1 to 100 molecules per second per blue
bacterium, with metabolic cost to the blue bacteria proportional to the number
of molecules secreted. Conversely,
imagine mutant red bacteria that secrete from 1 to 100 of the red molecules
valuable to the blue bacteria, at a similar cost proportional to the number of
molecules secreted.
Now create a large, square petri plate with 100 rows
and columns drawn on the plastic below to guide your experimental hands. Arrange the rows, numbered 1 to 100 to
correspond to blue bacteria secreting 1 to 100 molecules a second. Arrange the columns, numbered 1 to 100 to
correspond to the red bacteria secreting 1 to 100 molecules a second. Into each of the 100 x 100 cells on your
square petri plate, place exactly one red and one blue bacterium with the
corresponding secretion rates. Thus, in
the upper left, bacteria that are low blue and low red secretors are coplated
onto each square. On the lower left,
high blue and low red secretors are coplated. On the upper right, low blue and high red
secretors are coplated. And in the lower
right corner, high red and high blue bacteria are coplated.
Go for lunch, and dinner, and come back the next day. In general, among the 10,000 coplated pairs of
bacteria, while all 10,000 colonies will have grown, a single pair will have
grown to the largest mixed red-blue bacterial colony. Say the largest mixed red-blue colony
corresponds to red secreting 34 molecules per second, blue secreting 57
molecules per second.
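A toy simulation of the petri-plate gedankenexperiment is easy to write down; the growth functions below are my inventions (benefit from the partner’s secretion saturates, cost of one’s own secretion is linear), so the winning pair will not reproduce the 34 and 57 of the text, only the logic of finding it.

```python
# Toy version of the 100 x 100 petri-plate experiment. Growth functions are
# invented for illustration, not taken from the text.
import numpy as np

rates = np.arange(1, 101)                      # secretion rates, molecules per second
red, blue = np.meshgrid(rates, rates)          # red rate varies by column, blue by row

def growth(benefit_from_partner, own_secretion):
    return 1.0 + 0.3 * np.log1p(benefit_from_partner) - 0.01 * own_secretion

g_red  = growth(blue, red)                     # red is helped by blue's secretion
g_blue = growth(red, blue)                     # blue is helped by red's secretion

# The mixed colony's long-run growth is limited by its slower member.
colony = np.minimum(g_red, g_blue)

i, j = np.unravel_index(np.argmax(colony), colony.shape)
print("fastest mixed colony: red secretes", red[i, j],
      "and blue secretes", blue[i, j], "molecules per second")
```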
This gedankenexperiment is important, for the exchange
ratio of red and blue molecules is the analogue of price, the ratio of trading
of oranges for apples. And there exists
a ratio, 34 red molecules to 57 blue molecules per second,
that maximizes the growth of the mixed red-blue bacterial colony. Since the fastest growing mixed red-blue
colony will exponentially outgrow all others and dominate our gedankenexperiment
petri plate, this red-blue pair establishes “price” in the system at 34 red to
57 blue molecules. Further, in the
fastest growing red-blue colony, where red secretes 34 molecules and blue
secretes 57 molecules per second, both the red and blue bacteria in that mixed
colony are replicating at the identical optimum rate. As discussed in chapter 3, using a rough
mapping of biology to economics, that rate of replication of a bacterium
corresponds to economic utility and an increased rate of replication
corresponds to increased economic utility. The red and blue bacteria not only establish
price, but they also share equally the
advantages of trade present along the
Pareto-efficient contract curve in the Edgeworth box discussed in chapter 3.
Mutualists in the biosphere have been hacking out
rough price equilibria for millions of years and have done so without foresight
and without the Arrow-Debreu fixed-point theorems. Indeed, critters have been hacking out rough
price equilibria even as exaptations and new ways of living have come into
existence and old ways have perished. Presumably,
these rough biological price equilibria are reached because in the short and
intermediate term they optimize the fitness of both of the mutualists. And the markets clear, in the sense that all
the 34 red molecules are exchanged for 57 blue molecules per second. But it’s a self-organized critical world out
there, with small and large avalanches of speciation and extinction events in
the biosphere and econosphere, and equilibrium price or no, most species and
technologies, job holders and jobs, are no longer among us to mumble about
advantages of trade.
I confess I am happier with this image of prices
established in local, rough-and-ready ways at many points in an ecosystem or
economy than with the beautiful fixed-point theorems of general competitive
equilibrium. Bacteria and horseshoe
crabs keep establishing rough price equilibria in their mutualisms without a
prespecified space of ways of making a living. If they can do it, so can we
mere humans. Getting on with it in the
absence of predefined configuration spaces has been the persistent provenance
of autonomous agents since we stumbled into existence.
Rational Expectations and Its Limitations
Actually, there has been a major extension of general
competitive equilibrium called “rational expectations.” Like general competitive equilibrium, this
theory too is beautiful but, I think, deeply flawed.
Rational expectations grew, in part, out of an attempt
to understand actual trading on stock exchanges. Under general competitive equilibrium, little
trading should occur and stock prices should hover in the vicinity of their
fundamental value, typically understood as the discounted present value of the
future revenue stream from the stock. But, in fact, abundant trading does occur, and
speculative bubbles and crashes occur. Rational expectations theory is built up
around another fixed-point theorem. Rational expectations theory assumes a set of
economic agents with beliefs about how the economy is working. The agents base their economic actions on
those beliefs. A fixed point can exist
under which the actions of the agents, given their beliefs about the economy, exactly
create the expected economic behavior. So,
under rational expectations one can understand bubbles. It is rational to believe that prices are
going above fundamental value and thus to invest, and the investments sustain
the bubble for a period of time.
Meanwhile, Homo economicus has been thought to
be infinitely rational. In the
Arrow-Debreu setting, such infinitely rational agents bargain and
achieve the best equilibrium price for each dated contingent good. In rational expectations, the agents figure
out how the economy is working and behave in such a way that the expected
economic system is the one that arises. The
theories and actions of the agents self-consistently create an economy fitting
the theories under which the agents operate.
But beautiful as these fixed-point theorems are, there
are two troubles in the rational expectations framework. First, the beautiful fixed points may not be
stable to minor fluctuations in agent behaviors. Under fluctuations, the economic system may
progressively veer away from the fixed point into a feared conceptual
no-man’s-land. Second, achieving the
fixed points seems to demand excessive rationality to fit real human agents. So it appears necessary to extend rational
expectations.
One direction was broached thirty years ago, when
economist Herb Simon introduced the terms “satisficing” and “bounded
rationality.” Both seem sensible but
have been problematic. Satisficing
suggests that agents do not optimize but do well enough; yet it has been hard
to make this concept pay off. It has
also been hard to make advances with the concept of bounded rationality for the
simple reason that there is one way, typically, of being infinitely smart and
indefinitely many ways of being rather stupid. What determines the patterns of bounded
stupidity? How should economic theory
proceed?
Natural Rationality Is Bounded
I suspect that there may be a natural extension to
rational expectations applicable to human and any strategic agents, and I
report a body of work suggested by me but largely carried out by Vince Darley
at Harvard for his doctoral thesis. Two
virtues of our efforts are to find a natural bound to infinite rationality and
a natural sense of satisficing.
The core ideas stated for human agents are these:
Suppose you have a sequence of events, say, the price of corn by month, and
want to predict next month’s price of corn. Suppose you have data for twenty months. Now, Fourier invented his famous
decomposition, which states that any wiggly line on a blackboard can be approximated
with arbitrary accuracy by a weighted sum of sine and cosine waves of different
wavelengths and phase offsets, chosen out of the infinite number of possible
sine and cosine functions with all possible wavelengths.
Now, you could try to “fit” the data on the corn
prices with the first Fourier “mode,” namely the average price. But presumably if the twenty prices vary a fair amount, that average will not predict the twenty-first month’s price very well. You have “underfit” the data. Or you could use twenty or more Fourier modes,
all different wavelengths, with different phase offsets, and you would,
roughly, wind
up drawing a straight line between the
adjacent pairs of points in the twenty-period series. This procedure will not help too much in
predicting the twenty-first period. You
have “overfit” the data by using too many Fourier modes.
Typically, optimal prediction of the twenty-first
period price will be achieved by using two to five Fourier modes, each of
different wavelength and different phase offset. As is well known in the art, you have neither
underfit nor overfit your data.
This fact suggests that the optimal prediction of a
short sequence of data is obtained by a model of intermediate complexity - a few
Fourier modes, neither a single one nor very many. The sense of bounded rationality Vince and I
want to advocate is that optimal prediction of a limited time series is
achieved with models using only a few Fourier modes, or their analogs in other
basis sets - models of modest, or bounded, complexity.
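A small experiment in the spirit of this argument, sketched here with an invented synthetic “corn price” series and invented noise levels: fit the first k Fourier modes to twenty monthly prices and see how well each k predicts a held-out twenty-first value. Averaged over many random series of this kind, an intermediate k tends to win, though the exact numbers depend entirely on the assumptions.

```python
# Under- versus over-fitting a short price series with Fourier modes.
# The synthetic series and all parameters are invented for illustration.
import numpy as np

def fourier_design(tt, k, period=20.0):
    cols = [np.ones_like(tt, dtype=float)]               # mode 0: the average
    for m in range(1, k + 1):
        cols.append(np.sin(2 * np.pi * m * tt / period))
        cols.append(np.cos(2 * np.pi * m * tt / period))
    return np.column_stack(cols)

rng = np.random.default_rng(0)
t = np.arange(21, dtype=float)
errors = {0: [], 2: [], 9: []}                            # underfit, moderate, overfit
for _ in range(500):
    phase = rng.uniform(0, 2 * np.pi)
    y = 10 + 2 * np.sin(2 * np.pi * t / 10 + phase) + rng.normal(0, 0.5, 21)
    for k in errors:
        X = fourier_design(t[:20], k)
        coef, *_ = np.linalg.lstsq(X, y[:20], rcond=None)
        pred = fourier_design(t[20:21], k) @ coef
        errors[k].append(abs(pred[0] - y[20]))

for k, errs in errors.items():
    print(f"{k} modes -> mean out-of-sample error {np.mean(errs):.2f}")
```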
The rest of the theory Vince and I have developed goes
to show that agents who have theories of one another and act selfishly based on
those theories will typically create a persistently changing pattern of
actions. Therefore, they persistently
create a nonstationary world in which only the relatively recent past has valid
data. Thus, there is always only a
limited amount of valid data on which to base theories, and the agents, in turn,
must always build models of intermediate, bounded complexity to avoid over- or
underfitting the meager valid data.
Natural rationality is, in this sense, bounded. It is bounded because we mutually create
nonstationary worlds. What happens is
that the agents act under their theories. But in due course some agent acts in a way
that falsifies the theories of one or more other agents. These agents either are stubborn or change
their theories. If they change their
theories of the first agent, then typically they also change their actions. In turn, those changes disconfirm the theory
of the first agent, and perhaps still other agents. So the agents wind up in a space of coevolving
theories and actions with no fixed-point, stable steady states, which means
that past actions are a poor guide to future actions by an agent since his
theories, and hence his action plans, have changed. But this means that the agents mutually create
a “nonstationary” time series of actions (nonstationary just means that the statistical
characteristics of the time series keep changing because the agents keep
changing their theories and actions). In
turn, the agents typically have only a modest amount of relatively recent data
that is still valid and reliable on which to base their next theories of one
another. Given only a modest amount of
valid and reliable data, the agents must avoid overfitting or underfitting that
smallish amount of data, so they must use theories of intermediate complexity -
for example, four Fourier modes to fit the data, not one or twenty.
Vince and I want to say that natural rationality is
bounded to models of intermediate complexity because we collectively and
persistently create nonstationary worlds together. In the agent-based computer models Vince has
created for his thesis, just this behavior is seen. Indeed, we allow agents to evolve how much of
the
past history of the interactions they will pay
attention to and how complex their models of one another will be - one, four,
or fifty Fourier modes. Agents evolve in
a history and complexity space to find currently optimal amounts of history and
complexity to use to optimally predict their neighbors. In our little world, the agents evolve to use
a modest history, ignoring the distant past, and only modestly complex theories
of one another.
We have found evidence of a further, perhaps generic,
property that appears to drive such systems to settle down, then
change in a sudden burst. As the system
of agents and actions settles down to some repeatable behavior, an increasingly
wide range of alternative theories, simple and very complex, fit the same data.
But the complex theories, with many
Fourier modes, attempt to predict fine details of the repeatable behavior. As those theories become more complex, they
are more fragile because they can be disconfirmed by ever more minor
fluctuations in the repeatable behavior. Sooner or later such a fluctuation happens,
and the agents with the complex disconfirmed theories change theories and
actions radically, setting up a vast avalanche of changes of theories and
actions that sweeps the system, driving the collective behavior far from any
repeatable pattern. In these new circumstances,
only a small subset of theories fits the current facts, so the diversity, and
complexity, of theories in the population of agents
plummets, and the system finds its way back to some repeatable pattern of
behavior.
In short, there appears to be not only a bounded
complexity in our rationality, but a fragility-stability cyclic oscillation in
our joint theories and actions as well. In these terms, the system of agents and
theories never settles down to a fixed-point equilibrium in which markets
clear. Instead, the system repeatedly
fluctuates away from the contract curve then returns to new points in the
vicinity of the contract curve. Hence,
in precisely the sense of repeatedly fluctuating away from a contract curve
then returning to its vicinity, the system does not achieve an optimizing price
equilibrium, but satisfices.
The bounded complexity issues would seem to apply to
any coevolving autonomous agents that are able to make theories of one another
and base actions on those theories. The
tiger chasing the gazelle and the starfish predating the trilobite are, we
suppose, Popperian creatures able to formulate hypotheses about their worlds
that may sometimes die in their stead. Presumably
all such autonomous agents, under persistent mutation and selection, would opt
for changeable models of one another of bounded complexity.
While these steps are only a beginning to go beyond
rational expectations in economics, they seem promising. Whatever natural, or unnatural, games autonomous
agents are playing as they and we coevolve in a biosphere or econosphere,
nonstationarity arises on many levels. Here we see it at the level of the agents’
theories of one another and the actions based on those theories. Perhaps this is just part of how the world
works. Given the semantic import of yuck
and
yum, and the reality of natural games for fox
and hare, for E. coli and paramecium, these changing theories and
actions are part of the fabric of history of the market, the savannah, and the
small pond.
Natural rationality is bounded by the very
nonstationarity of the worlds we cocreate as we coexapt.
Technology Graphs and Economic Webs
Life takes its unexpected turns. I have been an academic scientist, a
biologist, for thirty years at the University of Chicago, the National
Institutes of Health, the University of Pennsylvania, then twelve stunningly
exciting years at the Santa Fe Institute. After thirty years, I’ve written the canonical
hundred or more scientific articles, was fortunate enough to receive a
MacArthur Fellowship, during whose five years my IQ went up and then slumped
back to normal as the funding ended, invented and patented this and that, and
published two previous books of which I am proud, Origins of Order and At
Home in the Universe, both by Oxford University Press.
I thought Origins and At Home were
largely about the challenge of extending Darwinism to an account of evolution
that embraced both self-organization and natural selection in some new, still
poorly understood marriage. One hundred
and forty years after Darwin, after all, we still have only inklings about the
kinds of systems that are capable of adaptation. What principles, if any, govern the coevolutionary
assembly of complex systems such as ecosystems or British common law, where a
new finding by a judge alters precedent in ways that ricochet in small and
large avalanches through the law? If new
determinations by judges did not have any wider impact, the law could not
evolve. If every new determination
altered interpretation of precedents throughout the entire corpus of common law,
the law also could not evolve.
My rough bet is that systems capable of coevolutionary
construction, such as British common law, can evolve and accumulate complexity
because they are somehow self-organized critical, and a power law distribution
of avalanches of implications of new precedent ricochet in the law and in other
complex coevolving systems to allow complexity to accumulate. Indeed, based on self-organized criticality,
and more particularly on the analysis of the NK fitness landscape model discussed
in Origins and At Home and the “patches” version of the NK model
discussed in At Home, I am rather persuaded that adapting systems can
best exploit the trade-off between exploitation and exploration at a rough
phase transition between order and chaos. Here power law distributions of small and
large avalanches of change can and do propagate through the system as it
adapts.
So saying, and having published Origins and
then At Home, I was rather surprised to find business people approaching
me. The consulting companies of
McKinsey, Coopers and Lybrand, Anderson, and Ernst and Young began
visiting the Santa Fe Institute to learn about the “new sciences of
complexity.” In due course, Chris Meyer
at the Center for Business Innovation at Ernst and Young asked me if I might be
interested in forming a fifty-fifty partnership with E and Y to discover if
complexity science, that nuanced term, could be applied in the practical world.
I found myself deeply intrigued. Was the work of my colleagues and myself mere
theory or did it have application in real biospheres and econospheres? Why not plunge in and try my best to find out,
to do it right, even knowing how early was the stage of the
science we had been inventing.
Bios Group Inc., the partnership with Ernst and Young,
is now just three-and-a-half years old. We
have grown to over seventy people, heading, we hope, for a hundred. Our annual revenues are running at $6 million.
We have a hopeful eye on $7 to $8
million this year with clients ranging from Texas Instruments, for whom we
invented a novel adaptive chip, to the U.S. Marine Corps with its concern for
adaptive combat, to Unilever, the NASDAQ stock market, Honda, Boeing, Johnson
and Johnson, Procter & Gamble, Kellogg, Southwest Airlines, the Joint
Chiefs of Staff, and others. We have
spun out a biotechnology company, CIStem Molecular, that aims to clone the
small cis-acting DNA regions that control turning genes on and off in
development and disease; a European daughter company, Euro-Bios; as well as
EXA, a company spun out with NASDAQ to make tools for financial markets. I’m deeply glad to be chairman of the board
and chief scientist of Bios, to be working with a very creative group of colleagues,
and to be finding routes in the practical world where our ideas do, in fact,
apply.
I mention Bios and my involvement because some of the
science we have done bears on diverse aspects of practical economics and even
begins to suggest pathways beyond the limitations of the Arrow-Debreu theory. I begin with Boeing, which came to Bios
wondering how to design and build airplanes in a year rather than seven years. Out of what modular parts and processes,
wonder Boeing folks, might it be possible to assemble a family of related
aircraft for a diversity of markets?
The obvious approach was to invent “Lego World.” As founding general partner and chief
scientist, I duly authorized the expenditure of $23.94 to buy a largish box of
Lego parts. (In truth, I fib. I actually
won the Lego box at our first Bios Christmas party.)
Most of the readers of this book will be familiar with
Lego. It is a construction game
consisting of snap-together, plastic parts, based on square blocks that graduate
in size, for example, 1 x 1, 1 x 2, 2 x 3, and 3 x 4 blocks. The blocks can be assembled into wonderfully
complex structures, as many delighted children and adults have discovered.
But what might Lego World be? What, indeed. Well, consider a large pile of Lego blocks on
a bare wooden table. Consider these
blocks the “primitive parts.” Now
consider all the primitive construction or deconstruction operations, pressing
two
parts together or adding a primitive part to a
growing assemblage, or taking a part off another part or off an assemblage.
Consider the pile of unassembled bare Lego parts as
the founder set, and place in “rank 1” all the unique Lego objects that can be
constructed from the founder set in a single construction step. Thus, a 1 x 3 can be attached in a specific
overlapping way to a 2 x 4. Now place in
rank 2 all the unique Lego objects that can be constructed from the founder set
in two construction (or deconstruction) steps.
Similarly, consider ranks 3, 4, 5, ..., 20, 100, 10,000, 11,343,998, ...
A set of primitive parts and the transformations of
those parts into other objects is a “technology graph.” In fact, a technology graph is deeply similar
to a chemical-reaction bipartite graph from a founder set of organic molecules,
where the molecules are the objects and the reaction hyperedges linking
substrate and products are the transformations among the objects. The graph is “bipartite” because there are two
types of entities, nodes and hyperedges, representing objects and transformations.
The first thing to notice about the Lego World
technology graph is that it might extend off to infinity, given an infinite
number of primitive Lego parts.
The second thing to notice is that within Lego World
an adjacent possible relative to any actual set of primitive and more complex
Lego structures is perfectly definable. The
adjacent possible is just that set of unique novel objects, not yet constructed,
that can be constructed from the current set of Lego objects in a single
construction step. Of course, within the
limited world of Lego we can think of the technologically adjacent possible
from any actual. A Lego economy might
flow persistently from simple primitive objects into the adjacent possible,
building up ever more complex objects.
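A stripped-down sketch of a technology graph and its adjacent possible may make this concrete; it is mine, with a deliberately crude “grammar” in which the only construction operation is concatenating two already-built objects, a stand-in for snapping two Lego assemblies together.

```python
# A toy technology graph. Objects are strings; the single construction step
# is concatenating two already-built objects. "Rank" here counts rounds in
# which anything already built may be reused, a simplification of counting
# individual assembly steps.
from itertools import product

def adjacent_possible(actual):
    """Objects buildable from the actual set in one step but not yet built."""
    return {a + b for a, b in product(actual, repeat=2)} - actual

founder = {"A", "B"}                      # the primitive parts
actual = set(founder)
for rank in range(1, 4):
    new = adjacent_possible(actual)
    print(f"rank {rank}: {len(new)} new objects, e.g. {sorted(new)[:4]}")
    actual |= new                         # advance into the adjacent possible

print("total distinct objects after three ranks:", len(actual))
```

Even in this trivial world, each rank contains more new objects than the one before: the actual set keeps enlarging its own adjacent possible.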
A third feature is that we might consider specific
Lego machines, made of Lego parts, each able to carry out one or more of the primitive
gluing or ungluing operations. Lego
World could build up the machine tools to build other objects including other
tools.
Indeed, in Origins of Order and At Home in
the Universe, I borrowed theoretical chemist Walter Fontana’s algorithmic
chemistry and defined a mathematical analogue of Lego World, namely a “grammar
model” of an economy. In that model, binary
symbol strings represented goods and services, as do Lego objects in Lego
World. In a grammar model, “grammar”
specifies how symbol strings act on symbol strings, rather like machines on
inputs, to produce new symbol strings. In
Lego World, the grammar is specified by the ways primitive blocks can be
attached or unattached and by any designation of which Lego objects can carry
out which primitive construction operations. The grammar in question may be simple and
“context-insensitive” or a far richer “context-sensitive” grammar in which what
objects can be added in what ways to different objects depends upon the small
or
large context surrounding those blocks. In short, in a context-sensitive grammar, the
objects and transformation rules are sensitive to the context of the objects
and previous transformations themselves.
Before proceeding with current uses of Lego World and
its intellectual children, notice that Lego World, like the grammar models in Origins
and At Home, can become the locus of an economy, in which the sets
of goods and services can expand over time and in which speciation and
extinction events occur. In Origins and
At Home, I built upon a suggestion of economist Paul Romer, and
specified that each symbol string - or here, each Lego object - has some
utility to a single consumer. The
utility of each object to the consumer is subjected to exponential discounting
over time. A Lego house today is worth
more than a Lego house tomorrow and still more than a Lego house two days from
now. And for simplicity’s sake, the
total utility of a bundle of goods is the sum of their discounted utilities to
the consumer.
Next, I invoked a “social planner.” The task of the social planner is to plan a
pattern of production activities over time that optimizes the discounted
happiness of the consumer. A standard
approach is to adopt a finite planning horizon. The social planner thinks ahead, say, ten
periods, finds that pattern of construction activities over time that creates
the sequence of symbol-string goods, or Lego objects, that
maximizes the time-discounted happiness of the consumer. Then, the social planner initiates the
first-period plan, making the first set of objects. Next, the planner considers a ten-period
planning horizon from period 1 to period 11, deduces the optimal
second-period plan, taking account of the newly considered eleventh period, and
carries out the second-period plan.
Because the total utility to the consumer is a simple
sum of the discounted utilities of all the possible goods in the economy, finding
the optimal plan at any period is just a linear programming problem, and the
results are a fixed ratio of construction activities of all the objects
produced at that period. The fixed ratio
of the activities is the mirror of price, relative to one good, taken as the
arbitrary “currency” or “numeraire.”
Over time, the model economy ticks forward. At each period, in general, only some of the
possible goods and services are constructed. The others do not make the consumer happy
enough. Over time, new goods and
services come into existence, and old ones go out of existence in small and
large avalanches of speciation and extinction events.
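A miniature version of the planner’s one-period step can be written as a linear program; this sketch is mine, using scipy’s linear-programming routine, and the objects, utilities, part requirements, and parts budget are all invented.

```python
# One period of a toy "social planner" problem as a linear program.
# Objects, utilities, part requirements, and the parts budget are invented.
from scipy.optimize import linprog

objects   = ["brick pair", "wall", "house"]
utility   = [1.0, 3.0, 8.0]      # utility to the single consumer this period
discount  = 0.9                   # the same bundle next period is worth 0.9 as much
parts_use = [[2, 6, 20]]          # primitive parts consumed per unit of each object
parts_max = [40]                  # parts available this period

# linprog minimizes, so negate the utilities to maximize them.
res = linprog(c=[-u for u in utility],
              A_ub=parts_use, b_ub=parts_max,
              bounds=[(0, None)] * len(objects))

print("production this period:", dict(zip(objects, res.x.round(2))))
print("utility now:", -res.fun,
      "  same bundle one period later:", -res.fun * discount)
```

The ratios in the optimal production plan, relative to one good taken as numeraire, play the role of prices, as described above.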
Thus, a grammar model, or a physical instantiation of
a grammar model such as Lego World, is a toy world with a technology graph of
objects and transformations. With the
addition of utilities to the different objects for one consumer or a set of
consumers and a social planner - or more generally, with a set of utilities for
the objects that may differ among consumers and with different costs and
scaling of costs with sizes of production runs of different Lego objects - a
market economy
can be constructed. With defined start-up costs, costs of borrowing
money, and bankruptcy rules, a model economy with an evolving set of goods and
services can be created and studied.
In general, such economies will advance persistently
into the adjacent possible. And because
the number of unique Lego objects in each rank is larger than the number in the
preceding rank, the diversity of opportunities and objects tends to increase as
ever more complex objects are constructed.
More generally, we need to consider “complements” and
“substitutes.” Screw and screwdriver are
complements; screw and nail are substitutes. Complements must be used together to create
value; substitutes replace one another. Rather obviously, the complements and
substitutes of any good or service constitute the economic niche in which that
good or service lives. New goods enter
the economy, typically, as complements and substitutes for existing goods. There is just no point in inventing the
channel changer before the television set is invented and television
programming is developed.
An economic web is just the set of goods and services
in an economy, linked by red lines between substitutes and green lines between
complements.
As we have seen, over the past million years, and even
the past hundred years, the diversity of the economic web has increased. Why? Because,
as in Lego World, the more objects there are in the economy, the more
complement and substitute relations exist among those objects, as well as
potential new objects in the adjacent possible. If there are N objects, the number of
potential complement or substitute relations scales at least as N squared
since each object might be a complement or substitute of any object. Thus, as the diversity of the objects in the
web increases, the diversity of prospective niches for new goods and services
increases even more rapidly! The very
diversity of the economic web is autocatalytic.
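The arithmetic behind that claim is simple enough to spell out, with purely illustrative numbers: with N goods there are roughly N squared ordered pairs, each a potential complement or substitute relation, so candidate niches outrun the goods themselves.

```python
# Potential pairwise complement/substitute relations grow roughly as N**2.
for n in [10, 100, 1_000, 1_000_000]:
    pairs = n * (n - 1)            # ordered pairs of distinct goods
    print(f"{n:>9,} goods -> {pairs:>18,} potential pairwise relations")
```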
If this view is correct, then diversity of goods and
services is a major driver of economic growth. Indeed, I believe that the role of diversity
of goods and services is the major unrecognized factor driving economic growth.
Jane Jacobs had made the same point in
her thoughtful books about the relation between economic growth and economic
diversity of cities and their hinterlands. Economist Jose Scheinkman, now chairman of
economics at the University of Chicago, and his colleagues studied a number of
cities, normalized for total capitalization, and found that economic growth
correlated with economic diversity in the city. In a similar spirit, microfinancing of a
linked diversity of cottage businesses in the third world and the first world
seems to be achieving local economic growth where more massive efforts at
education and infrastructure, Aswan dams and power grids, seem to fail.
Indeed, in the same way in an ecosystem, organisms
create niches for other organisms. I
suspect, therefore, that over the past 4.8 billion years, the growth of diversity
of species is autocatalytic, for the number of possible niches increases more
rapidly than the number of species filling
niches. And in the linking of spontaneous
and nonspontaneous processes, the universe as a whole advances autocatalytically
into its adjacent possible, driven by the very increase of diversity by which
novel displacements from equilibrium come into existence, are detected, are coupled
to, and come to drive the endergonic creation of novel kinds of molecules and
other entities. Economic growth is part
and parcel of the creativity of the universe as a whole.
Think of the Wright brothers’ airplane. It was a recombination between an airfoil, a
light gasoline engine, bicycle wheels, and a propeller. The more objects an economy has, the more
novel objects can be constructed. When
we were working with rough stone in the Lower Paleolithic, we pretty much
mastered everything that could be done until pressure flaking came along. Most forms of simple stone tools that could be
made were made. Today, the adjacent
possible of goods and services is so vast that the economy, stumbling and
lunging into the future adjacent possible, will only construct an ever smaller
subset of the technologically possible.
The economy is ever more historically contingent... As the biosphere is ever more historically
contingent... As, I suspect, the
universe is ever more historically contingent.
We are on a trajectory, given a classical
6N-dimensional phase space, where the dimensionality of the adjacent possible
does seem to increase secularly and the universe is not about to repeat itself
in its nonergodic flow.
A fourth law?
I now discuss an algorithmic model of the real
economic web, the one outside in the bustling world of the shopping mall, of
mergers and acquisitions. While
powerful, however, no algorithmic model is complete, for neither the biosphere
nor the econosphere is finitely prestatable. Indeed, the effort to design and construct an
algorithmic model of the real economic web will simultaneously help us see the
weakness of any finite description.
It all hangs on object-oriented programming.
A case in point is the recent advent of Java, an
object-based language, which, as of February 1998, had a library of some
eighty thousand Java objects. Goodness
knows how fast this library of objects is growing. Among the Java objects are “carburetor”
objects, “engine block” objects, and “piston” objects, and objects come with
“functional” descriptors, such as “is a,” “has a,” “does a,” “needs a,” “uses a.”
Both implicitly and, with modest work, explicitly, the
“piston” object can discover that it fits into the cylinder hole in the “engine
block” object to create a completed piston in a cylinder. The “carburetor” object can discover that it
is to be located on top of the “engine block” object, connected to certain “gas
line” objects in certain ways.
The physical engine block and piston, in reality, are
complements, used together to create value. Thus, the representations of the engine block
and piston as
algorithmic Java objects, together with
algorithmic “search engines” to match the corresponding “is a,” “has a,” “does
a” functions of complements and even substitutes, can as a matter of principle
- and practicality - create an image of the real complements and substitutes in
the real economic web.
In a fundamental sense, an appropriate set of Java
objects, together with search engines for complements and substitutes matching
“is a,” “has a,” “does a” constitutes a grammar of objects and linkings or
transformations among objects. The
grammar may be context independent or context sensitive or richer.
In short, properly carried out, Java objects and the
proper search engines can create a technology graph of all the objects and
transformed objects constructible from any founder set of objects. The Java objects are like Lego World. And Lego World, stated in terms of building
simple combinatorial objects, is logically similar to a set of objectives that
must be achieved by a military force to carry out its total objective. Entities and operations are deeply similar, as
we will explore further below. Technology
graphs concern objects and actions, things and objectives, products and
processes in a single framework.
Therefore, in principle, we have begun to specify a
means to characterize large patches of the global economic web. Let each of very many firms, at different
levels of disaggregation, create Java objects proper
to their activities, building materials, partially processed materials,
partially achieved objectives, and objectives including products. Let these Java objects be characterized by
appropriate “is a,” “has a,” “does a” lists, with adequate search engines looking
for complements and substitutes. The
result is a distributed web of Java objects linked functionally in at least
many or most of the ways that are actually in use as complements and
substitutes creating the millions of different goods and services in the
current economy.
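A sketch of the kind of matching such a search engine might do follows; the descriptor scheme and the objects are invented stand-ins of my own, not Bios Group’s tools or any actual Java library. Each object is described by what it “is,” what it “has,” and what it “needs,” and two objects are complements when one supplies what the other needs.

```python
# Toy complement/substitute matcher over objects described by simple
# "is a" / "has a" / "needs a" sets. Objects and descriptors are invented.
catalog = {
    "engine block": {"is": {"engine part"}, "has": {"cylinder hole", "rigid frame"},
                     "needs": {"piston", "carburetor"}},
    "piston":       {"is": {"engine part"}, "has": {"piston"}, "needs": {"cylinder hole"}},
    "carburetor":   {"is": {"engine part"}, "has": {"carburetor"}, "needs": {"gas line"}},
    "gas line":     {"is": {"fuel part"},   "has": {"gas line"}, "needs": set()},
}

def complements(a, b):
    """a and b are complements if either supplies something the other needs."""
    return bool(catalog[a]["has"] & catalog[b]["needs"] or
                catalog[b]["has"] & catalog[a]["needs"])

def substitutes(a, b):
    """a and b are substitutes if they offer the same things."""
    return a != b and catalog[a]["has"] == catalog[b]["has"]

names = list(catalog)
pairs = [(a, b) for i, a in enumerate(names) for b in names[i + 1:] if complements(a, b)]
print("complement pairs:", pairs)
```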
Much is of interest about such data on the real
economic web. Among other features, any
such graph has graph-typical characteristics. Some goods are central to the web, the car,
computer, and so forth. Others are
peripheral, such as the hula hoop and pet rock. Presumably, location of its products in the
web structure has a great deal to do with the strategic position of a firm.
But there is more, for the economic web states its own
adjacent possible. Given the Queen Mary
and an umbrella, the umbrella placed in the smoke stack of the Queen Mary is in
the adjacent possible. Not much use. But what about a small umbrella on the back
of a Cessna 172 that opens upon landing? Ah, an air brake is a possible
new good in the adjacent possible a few steps from here.
And still more. What are the statistics of the transformation
of an economic web over time as new goods and services arise in the niches
afforded by existing goods and services, and drive old goods and services
extinct? No one makes Roman siege
engines these days. Cruise missiles do
the job better.
I believe such object-based economic web tools will
come into existence in the
near future. Indeed, Bios Group is involved in inventing
and making them. And I believe that such
tools will be very powerful means of coordinating activities within supply
chains and within the larger economy when linked by automated markets.
But I do not believe any such algorithmic tool can be
complete. Consider the case of the
engineers discovering that the engine block is so rigid that the block itself
can serve as the chassis for the tractor they are trying to invent. That exaptation seems a genuine discovery. Now imagine that we had had Java object models
of all the parts that were to go into the tractor: engine block objects,
carburetor objects, piston objects. If,
among the properties of the engine block - the proud “is a” “has a” “does a”
features - we had not listed ahead of time the very rigidity of the engine
block or if that rigidity was not deducible from the other listed properties,
then my vaunted economic web model with its algorithmically accessible adjacent
possible could not have ever come up with the suggestion: Use the engine block,
due to its rigidity, as the chassis.
You see again that unless there is a finite
predescription of all the potentially relevant properties of a real physical
object, our algorithmic approach can be powerful, but incomplete. Yet I cannot see how to construct such a
finite predescription of all the potentially relevant properties of a real
physical object in the real universe.
The world is richer than all our dreams, Horatio.
I must say to Arrow and Debreu, “Gentlemen, the set of
goods and services is not finitely prestatable, so fixed-point theorems are of
limited use.”
And to my economist colleagues: Consider the economy
as forever becoming, burgeoning with new ways of making a living, new ways of
creating value and advantages of trade, while old ways go extinct. This too is the proper subject for your study,
not just allocation of scarce resources and achievement of market-clearing
prices. The economy, like the biosphere,
is about persistent creativity in ways of making a living.
I find it intriguing to note certain parallels from
our prior discussion of autonomous agents and propagating organization. At the level of molecular autonomous agents, I
made the point repeatedly that work is the constrained release of energy and
that autonomous agents do carry out work to construct the constraints on the
release of energy such that the energy is released along specific channels and
such that specific couplings of nonequilibrium energy sources to propagating
organization arise. Think then of the
role of laws and contracts, whose constraints enable the linked flow of
economic activities down particular corridors of activities. The web of economic activities flows down
channels whose constraints are largely legal in nature. The coming into existence of the enabling constraints
of law is as central to economic development and growth as any other aspect of
the bubbling activity.
My first purpose in investing in an entire box of Legos
was to explore and define concepts of “robust constructibility.” We have succeeded, but run into fascinating
problems of a general phase transition in problem solvability. In turn, this very phase transition suggests
that in a coconstructing biosphere or econosphere rather specific restrictions
arise and are respected by critters and firms, creatures and cognoscenti.
Recall the Lego founder set, and the rings, rank 1,
rank 2, ..., rank 11,983, ..., each containing the unique Lego objects first
constructible from the founder set in a number of steps equal to the rank of
that ring. Suppose a given Lego house is
first constructible in twenty steps, hence, lies in rank 20. Now, it might be the case that there is a
single construction pathway from the founder set to the Lego house in twenty
steps. It might also be the case that
there are thousands of construction pathways to the Lego house in twenty steps.
In the latter case, intuitively,
construction of the Lego house is robust. If one way is blocked, say because 1 x 3 blocks
are temporarily used up, then a neighboring pathway will allow the Lego house
to be constructed without delay, that is, in twenty steps, using other block
sizes.
A related sense of robust constructibility concerns
how the number of ways to construct the Lego house increases if we take more
than the minimum twenty steps, say, twenty-one, twenty-two, twenty-three,...
steps. The number of ways may not increase at all, may increase only slowly, or may increase hyperexponentially. If the number of ways increases very rapidly,
it might be worth using twenty-two steps to make the Lego house, for it would
be virtually impossible to block construction even if several types of building
blocks and machines were temporarily broken.
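For readers who like to see the bookkeeping, here is a minimal sketch, in Python and entirely my own illustration rather than anything specified in the text, of how one might count construction pathways over a toy technology graph. The founder set, the recipes, and all object names are invented; the point is only that "rank" and "number of pathways at a given number of steps" are directly computable quantities.

```python
from itertools import count

# A toy "technology graph": every buildable object has one or more recipes,
# each recipe a set of prerequisites that must already exist. All names invented.
FOUNDER = frozenset({"brick_1x2", "brick_1x3", "brick_2x4"})
RECIPES = {
    "wall":    [{"brick_2x4", "brick_1x2"}, {"brick_2x4", "brick_1x3"}],  # two alternative recipes
    "roof":    [{"brick_1x2", "brick_1x3"}],
    "chimney": [{"brick_1x2"}],
    "house":   [{"wall", "roof"}],
    "house_with_chimney": [{"house", "chimney"}, {"wall", "roof", "chimney"}],
}

def pathways(target, steps):
    """Count construction sequences that finish `target` on exactly the last of `steps` steps."""
    def extend(built, left):
        if target in built:
            return 1 if left == 0 else 0
        if left == 0:
            return 0
        total = 0
        for obj, recipes in RECIPES.items():
            if obj in built:
                continue
            for recipe in recipes:
                if recipe <= built:              # every prerequisite is on hand
                    total += extend(built | {obj}, left - 1)
        return total
    return extend(FOUNDER, steps)

def rank(target):
    """The ring the object lies in: the fewest steps in which it is first constructible."""
    return next(k for k in count(1) if pathways(target, k) > 0)  # assumes target is reachable

for obj in ("house", "house_with_chimney"):
    r = rank(obj)
    print(obj, "- rank:", r, "| minimal pathways:", pathways(obj, r),
          "| pathways with one extra step:", pathways(obj, r + 1))
```

An object reachable by many pathways of minimal length, and by many more with a step or two to spare, is robustly constructible in the senses just described.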
But recall Boeing’s question. They wanted to build a family of related
objects. Hence, let us define still
another related sense of robust constructibility.
Consider a family of Lego objects, a house, and a
house with a chimney. Now consider each
of the many ways from the founder set to build the Lego house. For each such way to build the Lego house,
consider how to change construction minimally in order to build the Lego house
with the chimney. Perhaps the chimney
can just be added to the completed Lego house. More likely, it would be necessary to
partially deconstruct that completed Lego house, then
go on to construct the house with the chimney. So there is a last branch point during
construction on the way both to the Lego house and the Lego house with the
chimney.
The branch point object and/or operation that is simultaneously on the way to the house and the house with
the chimney is an interesting intermediate-complexity object or operation
because it is polyfunctional. It can be
used in at least two further ways.
When we build a house, we all know that boards and
nails are primitive objects and the completed house is the finished object. But some intermediate objects, say, framed
windows and framed walls, are commonly used. Why? Because they are intermediate objects that are polyfunctional. The technology graph and its branch points are
identifying the intermediate-complexity polyfunctional objects for us.
But things are more subtle. It might be the case that from the last branch
point on a way to make both the house and the house with the chimney there is
only a single pathway forward to the house and there is only a single pathway
forward to the house with the chimney. Not
robust. Stupid
stopping spot. Either pathway can
readily be blocked. Suppose instead we
consider an intermediate object three steps prior to the last branch point on
the way outward from the founder set. Ah, perhaps there are thousands of ways to
complete the house and to complete the house with the chimney. Any single blockage or small set of blockages
is readily overcome. The house, or the house with the chimney, can be built without delay even if all 1 x 2 blocks are temporarily out of stock. Now this is a smart, robust, intermediate-complexity polyfunctional object. And it may cost no more to stockpile such
smart intermediate objects!
So, here is a new view of process design and inventory
control.
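A hedged sketch of that view: given a toy bill of materials, with entirely hypothetical object names, one can mechanically list the intermediate objects that sit on the construction pathways of more than one final product. Those shared, polyfunctional intermediates are the natural candidates for stockpiling.

```python
# A hypothetical mini bill of materials: each object built from one recipe of parts.
RECIPES = {
    "framed_wall":        {"board", "nail"},
    "framed_window":      {"board", "glass", "nail"},
    "house":              {"framed_wall", "framed_window", "board"},
    "house_with_chimney": {"framed_wall", "framed_window", "board", "brick"},
}
PRIMITIVES = {"board", "nail", "glass", "brick"}

def parts_needed(obj):
    """Every object, intermediate or primitive, appearing anywhere below obj."""
    if obj in PRIMITIVES:
        return {obj}
    needed = set()
    for part in RECIPES[obj]:
        needed |= {part} | parts_needed(part)
    return needed

targets = ["house", "house_with_chimney"]
shared = set.intersection(*(parts_needed(t) for t in targets)) - PRIMITIVES
print("Polyfunctional intermediates, candidates for stockpiling:", sorted(shared))
# framed_wall and framed_window lie on the pathways to both final products,
# so a blocked step on one product line can borrow from the stockpile of the other.
```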
Bios colleague Jim Herriot has made a delightful
little Java computer model to show technology graphs in action. The program shows a “chair” object, a “seat”
object, a “back” object, and a “leg” object. In addition, there are “foam” and “padding”
objects, two “attachment” objects, a set of “screw” objects, “nail” objects,
“wood” objects, a “saw” object, a “hammer” object, and a “screwdriver” object. Each object comes with its own characteristic
set of "is a," "has a," "does a" features.
The program assembles coherent technology graphs and
chair-assembly pathways as follows: An object tries a connection, shown by a
black line, to another object. In
effect, the chair object extends a line to the screw object as it says, “I need
a lean-on! I need a lean-on!” The screw object responds, “I do twist holds,
I do twist holds!” There is no match. After many random tries, the “chair” object
extends a black line to the "back" object. "I need a lean-on, I need a lean-on," says the
“chair” object. “I do lean-ons, I do
lean-ons,” cries the “back” object with cybernetic joy. The black line becomes a yellow line as a
contract is signed between the “chair” and “back” objects. In a similar way, the “back” object needs
“padding” and “wood” objects, and either the complementary pair “nail and
hammer” objects or their substitutes, “screw and screwdriver” objects, to carry
out an “attachment” operation.
In due course, the conditions are met to begin
construction of partial objects on the way to the entire chair. Legs, then backs, begin to be assembled as
screws and nails are used up. Eventually, a seat is constructed too, and the
first chair triumphantly follows.
All goes well until all the screws are used up. The seat, having relied on screws and
screwdrivers to attach padding, goes nuts and looks about frantically, asking
the screwdriver to work on nails. No
luck. Eventually, the seat tries nails
and hammers jointly, and that complementary pair works. More chairs are constructed, then nails run
out and failures propagate throughout the system.
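Herriot's program is a Java model that I have only described; the following is a much-simplified Python re-sketch of the matching idea, not his code, with illustrative object names and features. Objects advertise what they "do" and what they still "need," and a contract is signed whenever a random proposal matches.

```python
import random

# Each object advertises what it "does" and what it still "needs."
# The names loosely follow the chair example in the text but are illustrative.
OBJECTS = {
    "chair": {"does": set(),        "needs": {"lean-on", "sit-on", "stand-on"}},
    "back":  {"does": {"lean-on"},  "needs": {"padding", "attachment"}},
    "seat":  {"does": {"sit-on"},   "needs": {"padding", "attachment"}},
    "leg":   {"does": {"stand-on"}, "needs": {"attachment"}},
    "foam":  {"does": {"padding"},  "needs": set()},
    "nail+hammer":       {"does": {"attachment"}, "needs": set()},
    "screw+screwdriver": {"does": {"attachment"}, "needs": set()},
}

def assemble(objects, rng=random.Random(0)):
    """Propose random connections; sign a 'contract' whenever a need meets a 'does'."""
    contracts = []
    unmet = [(name, need) for name, spec in objects.items() for need in spec["needs"]]
    while unmet:
        name, need = rng.choice(unmet)            # an object shouts out a need
        provider = rng.choice(list(objects))      # a random partner answers
        if need in objects[provider]["does"]:     # the black line turns yellow
            contracts.append((name, need, provider))
            unmet.remove((name, need))
    return contracts

for name, need, provider in assemble(OBJECTS):
    print(f'"{name}" needs "{need}"; "{provider}" does "{need}" - contract signed')
```

Run with limited stocks of screws and nails, such a model exhibits exactly the behavior described: substitution when one attachment route is exhausted, and propagating failure when both are.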
Not a metaphor, the technology graph. Rather, a new tool to understand the general
principles of robust constructibility, the structure of economic webs, a
knowledge-management tool for a firm, a new hunk of basic science. Indeed, one of the interesting features of
technology graphs is that they constitute the proper conceptual framework to
consider process and product design simultaneously. As far as I can tell, we have not had such a
conceptual framework before.
Nor is the technology graph limited to manufacturing. The same general principles apply, for
example, in military or other logistic operations. Technology graphs, in these other contexts,
become the sequential set of requirements needed to meet subobjectives that
robustly culminate in the achievement of an overall objective. Taking hill 19 after diverting gracefully from
orders to take hill 20 is logically related to making the house with the
chimney after starting to make the house without the chimney.
A Phase Transition in Problem Solvability
Part of the basic science of technology graphs stems
from generic phase transitions in problem solvability in many combinatorial
optimization or satisficing problems in biology and economics. I turn now to discuss these generic phase
transitions.
I begin with a metaphor. You are in the Alps. A yellow bromine fog is present. Anyone in the fog for more than a microsecond
will die. There are three regimes: the "dead," the "living dead," and the "survivable."
The “dead”: The bromine fog is higher than Mont Blanc.
Unfortunately, everyone dies.
The “living dead”: The bromine fog has drifted lower
and Mont Blanc, the Eiger, and the Matterhorn jut into the sunlight. Hikers near these three peaks are alive.
But consider that even mountains are not fixed. Plate tectonics can deform the mountainous
landscape. Or, in the terms of the last
chapter, the mountainous fitness landscape of a species or a firm or an armed
force can change and persistently deform due to coevolution when other species
or firms or adversaries change strategies. If the mountainous landscape deforms, Mont
Blanc, the Eiger, and the Matterhorn will eventually dip into the bromine fog. As this happens, perhaps new peaks jut into
the sunshine. But those peaks will
typically be far away from Mont Blanc, the Eiger, and the Matterhorn.
Alas, the hikers near those three initial peaks will
die as they are dipped into the lethal fog and are too far from the newly
emerged sun-drenched peaks to reach them. This "isolated peaks regime" is the living
dead regime.
But there is a third regime, a phase transition away.
The survivable regime: Let the bromine fog drift
lower. More and more peaks jut into the
sunshine. At some point, some magical
point, as more peaks emerge into the sunshine, quite suddenly, a hiker can
walk all the way across the Alps in the sunshine.
This is a phase transition from the isolated peaks
regime. A connected web - mathematically, a percolating web - of "solutions" has emerged suddenly
as the fog lowers. Now consider hikers
striding across the Alps, knapsacks and hearts full. If plate tectonics rather slowly deforms the
landscape, then whenever hikers are about to be dipped into the lethal bromine
fog, they can take a sideways step in some direction and remain in the
sunshine.
The percolating web of solutions regime is
persistently survivable. In fitness
landscape terms, if you are evolving on a fitness landscape that is deforming
and are in your survivable regime, you can continue to exist by stepping
somewhere from wherever you happen to find yourself.
This phase transition is generic to hard combinatorial
optimization problems. A case in point
is the well-known job shop problem. The
job shop problem posits
M machines and O objects. The idea is to build the O objects on
the machines. Each object requires being
on each machine in some set order for some fixed period of time. Perhaps object 1 must be on machine 13 for 20
minutes, then machine 3 for 11 minutes, then machine 4 for 31 minutes. In turn, object 2 must be on machine 1 for 11
minutes, then machine 22 for 10 minutes, and so on.
A schedule is an assignment of objects to machines
such that all objects are constructed. The total length of time it takes to construct
the set of objects is called the “makespan.” We may consider, for each schedule, a
definition of neighboring schedules, such as swapping the order in which two
objects are assigned to a given machine. Given the set of possible schedules, the neighborhood
relation between schedules, and the makespan of each
schedule, there is a makespan fitness landscape over the space of schedules of
the job shop problem.
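A minimal sketch of these definitions, with illustrative jobs and durations rather than data from the text: the makespan of a dispatch order on each machine, and a neighboring schedule obtained by swapping the order of two objects on one machine.

```python
import copy
import random

# Each job is an ordered list of (machine, minutes); numbers are illustrative.
JOBS = {
    1: [(13, 20), (3, 11), (4, 31)],
    2: [(1, 11), (22, 10), (3, 17)],
    3: [(3, 9),  (13, 14), (1, 22)],
}

def makespan(machine_order):
    """machine_order: {machine: [job, job, ...]} - the dispatch order on each machine.
    Returns the time to finish all jobs, or None if the orders deadlock."""
    job_step = {j: 0 for j in JOBS}               # next operation index per job
    job_free = {j: 0 for j in JOBS}               # time each job becomes free
    mach_free = {m: 0 for m in machine_order}     # time each machine becomes free
    mach_pos = {m: 0 for m in machine_order}      # next position in each machine's order
    done, total_ops = 0, sum(len(ops) for ops in JOBS.values())
    while done < total_ops:
        progressed = False
        for m, order in machine_order.items():
            if mach_pos[m] >= len(order):
                continue
            j = order[mach_pos[m]]
            if job_step[j] < len(JOBS[j]) and JOBS[j][job_step[j]][0] == m:
                start = max(job_free[j], mach_free[m])
                finish = start + JOBS[j][job_step[j]][1]
                job_free[j] = mach_free[m] = finish
                job_step[j] += 1
                mach_pos[m] += 1
                done += 1
                progressed = True
        if not progressed:
            return None                            # the dispatch orders are infeasible
    return max(job_free.values())

def swap_neighbor(machine_order, rng=random.Random(0)):
    """A 1-mutant neighbor: swap two adjacent jobs in one machine's dispatch order."""
    new = copy.deepcopy(machine_order)
    m = rng.choice([m for m, o in new.items() if len(o) > 1])
    i = rng.randrange(len(new[m]) - 1)
    new[m][i], new[m][i + 1] = new[m][i + 1], new[m][i]
    return new

order = {13: [1, 3], 3: [3, 1, 2], 4: [1], 1: [2, 3], 22: [2]}
print("makespan:", makespan(order))
print("neighbor makespan:", makespan(swap_neighbor(order)))
```

Repeatedly accepting swaps that do not worsen the makespan is then simply a walk on the fitness landscape just described.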
We want to minimize makespan. But to keep our mountainous landscape metaphor, where high peaks are good, let us think in terms of efficiency, the inverse of makespan. Schedules with low makespan then correspond to points of high efficiency on the job-shop fitness landscape. Clearly, short makespan makes the problem
hard, long makespan makes the problem easy. So long makespan is like the bromine fog being
low, while short makespan is like the bromine fog being high.
Does the phase transition occur in typical job shop
problems as makespan is tuned from long to short? Figure 9.2 [HHC:
not displayed] shows the results Vince Darley obtained. The figure plots makespan, short to long, on
the x-axis and the number of schedules at a given makespan on the y-axis.
As you can see, for long enough makespan, as makespan
decreases there are roughly a constant number of schedules at each makespan. But at a critically short makespan, the number
of solutions starts to fall abruptly. The corner where the curve starts to turn is
the phase transition between the survivable regime for longer makespans and the
isolated peaks/living dead regime for shorter makespans. (I cheat slightly: the sharpness of the corner increases as the size of the job shop problem increases in numbers of machines, M, and objects, O.)
There are a number of direct tests for this phase
transition. In the
survivable regime, at longer makespans and lower efficiency than the phase
transition makespan, start with a given schedule at a given makespan. Now
examine all “nearby” schedules and test if any is of
an equal or better makespan. Continue to
“walk” across the space of schedules via neighbors to test if there is a
connected web of schedules with makespans at least as good as our initial
schedule’s makespan. Either the web
percolates across the space of solutions or it does not. If the web percolates, then the initial
schedule was in the survivable regime. If
only isolated regions of neighboring schedules of the same approximate makespan
are found, then you are in the isolated peaks regime.
Figure 9.2 [HHC: not displayed]
The transition from the isolated peaks regime to the survivable regime in a job shop problem as makespan increases. Robust survivable operations occur on the horizontal region of the curve near the phase transition region, where the curve bends sharply downward as makespan decreases. The phase transition becomes sharper as the size of the job shop problem increases.
A second test looks at the "Hausdorff dimensionality" of the acceptable solutions at a given makespan or better. The Hausdorff dimension is computed by taking an initial schedule at a given makespan, counting how many of its 1-mutant neighbor schedules have the same or better makespan, and then doing the same for its 2-mutant neighbor schedules. The Hausdorff dimension of the acceptable set of schedules at that makespan and point in the job shop space is the ratio of the logarithm of the number of acceptable 2-mutant schedules to the logarithm of the number of acceptable 1-mutant schedules. In effect, the Hausdorff dimension shows how
rapidly - in how many dimensions of the job-shop
schedule space - acceptable schedules of a given makespan or better are
growing. In the survivable regime, the Hausdorff dimension, on average, is greater than 1.0. In the isolated peaks regime, averaged over the job shop space, the Hausdorff dimension is less than 1.0. At the phase transition, the dimensionality is 1.0.
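The calculation itself is simple enough to sketch. The toy below uses a fixed random fitness function over bit strings as a stand-in for the job-shop schedule space - an assumption for illustration, not Darley's computation: count the acceptable 1-mutant and 2-mutant neighbors of a point and take the ratio of their logarithms.

```python
import math
import random
from itertools import combinations, product

rng = random.Random(0)
N = 12
# A toy surrogate for the job-shop schedule space: length-N bit strings, each with a
# fixed random fitness (think of higher fitness as shorter makespan).
FIT = {s: rng.random() for s in product((0, 1), repeat=N)}

def flips(s, k):
    """All k-mutant neighbors of the bit string s."""
    for idx in combinations(range(N), k):
        t = list(s)
        for i in idx:
            t[i] = 1 - t[i]
        yield tuple(t)

def hausdorff_estimate(s):
    """log(# acceptable 2-mutant neighbors) / log(# acceptable 1-mutant neighbors),
    where 'acceptable' means at least as fit as the starting point s."""
    f0 = FIT[s]
    n1 = sum(FIT[t] >= f0 for t in flips(s, 1))
    n2 = sum(FIT[t] >= f0 for t in flips(s, 2))
    if n1 <= 1 or n2 <= 1:
        return 0.0    # an isolated point: essentially nowhere acceptable to step
    return math.log(n2) / math.log(n1)

start = tuple(rng.randint(0, 1) for _ in range(N))
print("local dimension of the acceptable set:", round(hausdorff_estimate(start), 2))
```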
The phase transition I have just noted is generic for
many or most hard combinatorial optimization problems. It is not true for all fitness landscapes. For example, it would not hold on a conical
Fujiyama landscape. But the Fuji
landscape corresponds to a simple, typically linear, optimization problem. Hard combinatorial optimization problems are
multipeaked due to conflicting constraints.
A further interesting connection relates the
statistics of search on the job shop landscape to learning curves in economics.
Learning curves, well known in arenas
from airplane manufacture to diamond cutting to cigar manufacture, show that
every time the total output of a plant is doubled, the cost per unit falls by a
roughly constant percentage, typically 5 to 10 percent. If the logarithm of the cost per unit is
plotted on the y-axis and the logarithm of the cumulative number of units produced
is plotted on the x-axis, one gets a typical straight-line power law sloping downward to the right.
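In symbols, with $c_1$ the cost of the first unit and $\alpha$ the learning exponent - a standard way of writing the relation just described, not a formula from the text:

$$ c(n) = c_1\, n^{-\alpha}, \qquad \frac{c(2n)}{c(n)} = 2^{-\alpha}, \qquad \log c(n) = \log c_1 - \alpha \log n, $$

so a 5 to 10 percent reduction per doubling corresponds to $2^{-\alpha}$ between about 0.95 and 0.90, that is, $\alpha$ roughly between 0.07 and 0.15.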
The fascinating thing is that this feature probably
reflects the statistics of search for fitter variants on rugged, correlated
fitness landscapes such as the job shop problem. Typically, in such problems, every time a
fitter 1-mutant variant is found, the fraction of 1-mutant variants that are
still fitter falls by a constant fraction, while the improvement achieved at
each step is typically a constant fraction of the improvement achieved at the
last step. These properties yield the
learning curve. My colleagues Jose Lobo,
Phil Auerswald, Karl Shell, Bill Macready, and I have published a number of
papers on the application of the statistics of rugged landscapes and learning
curves.
But there is another even more important point. We can control the statistical structure of
the problem spaces we face such that the problem space is more readily
solvable. We can, and do, tune the
structure of the problems we solve. The
capacity to tune landscape structure shows up in the job shop problem. In most cases, improving the structure of the
problem space requires relaxing conflicting constraints to move the problem
into the survivable regime at the makespan, or efficiency, you require. For a specific case, in my statement of the
job shop problem, I asserted that each of the O objects must pass through the M machines in some fixed order.
Now simple observation and experience tells you that
you can put on your shirt and pants in either order, but you had better put on
your socks before your shoes. In other
words, some steps in life are permutable, others are not. Suppose in our
job shop problem, a fixed fraction, P, of
the steps in the construction of each of the O objects were jointly
permutable. As P increases and
more steps are permutable, the conflicting constraints in the total problem are
reduced. But this in turn means that the
entire space of solutions improves, that is, the entire space shifts toward
lower makespan, or higher efficiency.
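A back-of-the-envelope count, my own illustration under the assumption that the permutable steps may be interleaved freely while the remaining steps keep their fixed relative order: an object with n construction steps, m of them jointly permutable, then admits

$$ \frac{n!}{(n-m)!} $$

distinct processing orders instead of a single rigid route; with n = 10 and m = 4 that is 5,040 admissible orders. This is the sense in which permutability relaxes conflicting constraints.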
And now the relation to the bromine fog metaphor can
be stated clearly. As the number of
permutable steps increases, the conflicting constraints are reduced. The entire makespan-efficiency landscape is
lifted higher, the peaks are higher, and the landscape, with fewer conflicting
constraints, is smoother. All in all,
the result is that the percolating web of solutions where the hikers can walk
all across the Alps occurs at a higher efficiency and shorter makespan.
In terms of Figure 9.2 [HHC: not displayed], the capacity to permute steps in the job shop problem shifts the phase transition point leftward, toward shorter makespan. Equivalently, if one wants to shift the curve in Figure 9.2 to the left, purchase machines that are polyfunctional, so that the same machine can be used for more than one job. That too reduces conflicting constraints.
There is another view of this generic phase transition
between solvable and nonsolvable that again highlights the role of having
alternative ways of doing things, robustly constructible strategies that are
not easily blocked. In addition, the
same simple model, the Ksat model discussed in the previous chapter, begins to
account for at least the following anecdotal observation, which is apparently
typical: Colleagues at Unilever noted to us that if they have a plant that
manufactures different types of a product, say, toothpaste, then the plant does
well when the diversity of products grows from three to four to ten to twenty
to twenty-five, but at twenty-seven different toothpastes, the plant suddenly
fails. So rather abruptly as product
diversity in a given plant increases, the system fails.
Why? Presumably
it is the same phase transition in problem solvability.
Consider again the Ksat problem, taken as a model for
community assembly in the last chapter based on the work of Bruce Sawhill and
Tim Keitt. Figure 8.13 repeats
the Ksat problem and shows again the phase transition.
Recall that a Ksat problem consists of a logical statement with V variables and C clauses in conjunctive normal form, a conjunction of disjunctive clauses: (A1 v A2) and (A3 v A4) and (not A2 v A4). As we discussed in the previous chapter, such an expression with C clauses, V variables in total, and K variables per clause is satisfiable if there is an assignment of true or false to each of the V variables such that the expression as a whole is true.
The clause structure makes it clear that as the number of alternative ways, K, of carrying out a task, or making a clause true, increases, it becomes easier to satisfy the combined expression. More generally, as noted in the last chapter
and shown in Figure 8.13, there is a phase transition in the probability that a
random Ksat expression with V variables, K per clause, and C clauses
can be satisfied by some assignment of true or false to the V variables. The phase transition occurs along the horizontal axis, labeled C/V, the ratio of clauses to variables. Obviously, as C/V increases, conflicting constraints increase. The phase transition from easily solvable to virtually impossible to solve occurs at a point on the C/V axis equal to ln 2 times 2 raised to the K power, roughly 0.69 x 2^K.
Hence, as K increases, the
phase transition shifts outward to greater C/V values.
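A first-moment estimate, my gloss rather than anything derived in the text, shows where that scale comes from. A random assignment fails to satisfy a given clause of K literals with probability $2^{-K}$, so the expected number of satisfying assignments of a random formula is

$$ E = 2^{V}\left(1 - 2^{-K}\right)^{C}, $$

and setting $E = 1$ gives the crossover

$$ \frac{C}{V} = \frac{\ln 2}{-\ln\left(1 - 2^{-K}\right)} \approx 2^{K} \ln 2 \approx 0.69 \times 2^{K}. $$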
But Figure 8.13 gives us an intuitive understanding of
Unilever's problem; in fact, the problem is far more
general than Unilever’s assembly plants. Think of each clause, (A1
v A2), et cetera, as a way to make one of the toothpaste products, and
think of the conjunction, (A1 v A2) and (A3 v A4) and...,
as the way to make all twenty-seven toothpastes. Then as the number of clauses, hence
toothpaste products, increases for a given plant with V variables to use in
making these different products, all of a sudden the conjoint problem will have
so many conflicting constraints that it will cross the phase transition from
solvable to insolvable.
Moreover, for any given number of clauses and V
variables, if a given assignment of true or false to the V variables
satisfies the Ksat problem, we can ask if any of the 1-mutant neighbor
assignments of true and false to the V variables that change the truth
value assigned to one of the V variables also
satisfy the Ksat problem. Thus, we can
study the phase transition from a survivable percolating web of solutions regime when C/V is lower, to an isolated peaks regime as C/V increases, to virtual impossibility for high C/V.
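A small sketch of that test on a random Ksat instance, with illustrative sizes chosen well below the threshold so that solutions exist: find a satisfying assignment by brute force and count how many of its single-flip neighbors still satisfy the formula.

```python
import random
from itertools import product

rng = random.Random(1)
V, K, C = 12, 3, 30                      # illustrative sizes; C/V = 2.5, below ~0.69 * 2**3

def random_formula():
    """C clauses, each a disjunction of K literals: (variable index, negated?)."""
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(V), K)] for _ in range(C)]

def satisfies(assign, formula):
    return all(any(assign[v] != neg for v, neg in clause) for clause in formula)

formula = random_formula()
solutions = [a for a in product((False, True), repeat=V) if satisfies(a, formula)]
if solutions:
    a = rng.choice(solutions)
    ok_neighbors = sum(
        satisfies(a[:i] + (not a[i],) + a[i + 1:], formula) for i in range(V)
    )
    print(f"{len(solutions)} solutions; from one of them, {ok_neighbors} of {V} "
          "single-flip neighbors still satisfy the formula")
else:
    print("unsatisfiable at this C/V")
```

Raising C toward the threshold thins out both the solutions and their satisfying neighbors, which is the isolated peaks regime in miniature.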
Thus, just as in the job shop problem, as product
diversity increases for a fixed plant, a phase transition from survivable to
unsurvivable will occur because the conflicting constraints will increase with C/V.
The resulting landscape becomes more
rugged, the peaks lower, the yellow bromine fog rises from the survivable to
the living dead to the dead regime, covering Mont Blanc and all hikers in the
Alps (Figure 9.3).
But this takes us back to the technology graph and
robust constructibility. Recall from
above that the Lego house and Lego house with a chimney might be robustly
constructible from wisely chosen intermediate-complexity objects on the pathway
to both houses, with thousands of ways to get there. At no extra cost, we might choose to stockpile
that intermediate-complexity polyfunctional object rather than another choice.
But intermediate-complexity polyfunctional objects are
just what allow multiple pathways, multiple
permutations of construction steps to our two final objects. Hence, these same smart intermediate objects
reduce the conflicting constraints in the fitness landscape over the
construction space to make our desired set of objects, or our set of
toothpastes. Lowering the conflicting
constraints makes the efficiency peaks of the fitness landscapes higher, hence,
allows survivable operations at a higher level of product diversity.
Figure 9.3 [HHC: not displayed]
Transition from the isolated peaks regime to the survivable regime as the diversity of goods produced in a given facility decreases from high to low. The figure suggests a bound on the diversity that can be manufactured in a given facility, because conflicting constraints increase as diversity increases. Robust operations occur on the horizontal part of the curve just before the curve drops sharply as the diversity of goods increases. The same concepts should apply to the complexity of a military campaign that can be waged, in terms of the diversity of weapon systems (variables) and subobjectives (clauses). Robust operations should occur in the survivable regime.
Thus, by use of the technology graph to design both products and processes, we can choose a family of products and
construction pathways with highly redundant intermediate objects. That choice makes the problem space easy to
solve rather than hard to solve. We have
thereby tuned the statistical structure of our problem space into a survivable
regime. Furthermore, we can test whether
our choice of construction pathways to the house and/or house with a chimney is
robustly survivable or in the living dead - isolated peaks regime. We need merely use the technology graph to
test for percolating sets of 1-mutant neighboring pathways of construction of
the same objects and the average Hausdorff dimension of such pathways.
No need to operate in the isolated peaks regime. Indeed, if you face loss of parts and
machines, you had best locate back from the phase transition, deep enough into
the survivable regime to survive. And if
you are a military force fighting against an enemy whose strategy changes persistently
deform your payoff landscape and whose efforts are to destroy your capacity to
fight, you had best operate even further back from the phase transition in the
survivable regime. Indeed, the conjunctive normal form in Figure 9.3 is a rough image of the complexity of a campaign
you can fight - the number of clauses that must be jointly satisfied to meet
your objectives, where each clause is a
subobjective and there are K alternative ways to meet that objective
using V weapon systems.
Just as warfare and the economy as a whole have much
in common, warfare and the biosphere have much in common. If you are a species coevolving with other
species, you had best operate back from the phase transition well into the
survivable regime.
There is a message: If you must make a living, for
God’s sake, make your problem space survivable!
This brings us back to a point made in early chapters.
Recall the no-free-lunch theorem proved
by Bill Macready and David Wolpert. Given
a family of all possible fitness landscapes, on average, no search algorithm
outperforms any other search algorithm. Hill
climbing is, on average, no better than random search in finding high peaks,
when averaged over all possible fitness landscapes.
The no-free-lunch theorem led me to wonder about the
following: We organisms use mutation, recombination, and selection in
evolution, and we pay a twofold fitness cost for sex and recombination to boot. But recombination is only a useful search
procedure on smooth enough fitness landscapes where
the high peaks snuggle rather near one another.
In turn, this led me to wonder where such nice fitness
landscapes arise in evolution, for not all fitness landscapes are so blessedly
smooth. Some are random. Some are anticorrelated.
In turn, this led me to think about and discuss
natural games, or ways of making a living. Since ways of making a living evolve with the
organisms making those livings, we arrived at the conclusion that the winning games are the games the winners play. This
led me to suggest that those ways of making a living that are well searched out
and exploited by the search mechanisms organisms happen to use - mutation,
recombination, and selection - will be ways of making a living that are well
populated by organisms and similar species. Ways of making a living that cannot be well searched
out by organisms and their mutation and recombination search procedures will not be
well populated.
So we came to the reasonable conclusion that a
biosphere of autonomous agents is a self-consistently self-constructing whole, in
which agents, ways of making a living, and ways of searching for how to make a
living all work together to co-construct the biosphere. Happily, we are picking the problems we can
manage to solve. Of course, if we could
not solve our chosen ways to make livings, we would be dead.
And there is, I think, a molecular clue that the
biosphere is persistently coconstructing itself in the survivable regime for a
propagating set of lineages. We have
just characterized the survivable percolating web regime where the fitness landscape
of each creature deforms due to the adaptive moves of other creatures, but
there are always neighboring ways of surviving. Genetically, those neighboring
ways are one or a few mutations or
recombinations away from where the species population is right now. If the biosphere has coconstructed itself such
that most species are in a survivable regime, then as coevolution occurs, most
species will persist but may well transform, for example, to daughter species. One would guess that this mildly turbulent process
is rather continuous, perhaps with some self-organized critical bursts on a
power law scale.
The “molecular clock hypothesis” seems to fit these
facts. If one compares hemoglobins from
humans, chimps, horses, whales, and so on, in general, the longer ago we
diverged from one another in the evolutionary record, the more amino acid
mutations distinguish the hemoglobins of the two
species involved. Our hemoglobin is very
similar to chimp hemoglobin and quite different from the whale. So good is this correlation that, within given
protein families, it is argued that mutations accumulate with time like clockwork, hence the molecular clock hypothesis, in which the number of amino acid differences can be taken as a surrogate for the time since the most recent common ancestor.
Different protein family clocks seem to
run at different rates.
There is evidence that the clock does not run quite
smoothly. John Gillespie, a population
biologist now at the University of California at Davis, showed some years ago
that amino acid substitutions seemed to come in short bursts that accumulate
over long periods of time to a rough molecular clock that “stutters.” Gillespie argued that fitness landscapes were
episodically shifting and the bursts of amino acid substitutions were adaptive
runs toward nearby newly formed peaks. I
agree and suggest that the near accuracy of the molecular clock data over
hundreds of millions of years and virtually all species strongly suggests that
the biosphere has coconstructed itself such that species, even as they speciate
and go extinct, are, as lineages, in the persistently survivable regime.
We as organisms have, in fact, constructed our ways of
making a living such that those problem spaces are typically, but not always,
solvable as coevolution proceeds. And,
on average, the same thing holds for the econosphere. As old ways of making a living go extinct, new
ones persistently enter. We too, it
appears, have coconstructed our econosphere such that our ways of making a
living, and discovering new ways of making a living, are manageable, probably
in a self-organized critical manner, with small and large speciation and
extinction events.
And there is a corollary: If you are lucky enough to
be in the survivable regime, you can survive by being adaptable. What is required to be adaptable as an organism
or organization? We discussed this in
the last chapter. A good guess is that
an organism or organization needs to be poised in an ordered regime, near the
edge of chaos, where a power law distribution of small and large avalanches of
change propagates through the system such that it optimizes the persistent
balance between exploration and exploitation on ever-shifting, coevolving
fitness landscapes.
Laws for any biosphere extend, presumably, to laws for
any economy. Nor
should that be surprising. The economy is based on advantages of trade. But those advantages accrue no more to humans
exchanging apples and oranges than to root nodules and fungi exchanging sugar
and fixed nitrogen such that both make enhanced livings. Thus, economics must partake of the vast
creativity of the universe. Molecules,
species, and economic systems are advancing into an adjacent possible. In all cases, one senses a secular trend for
diversity to increase, hence for the dimensionality of the adjacent possible to
increase in our nonergodic journey.
Perhaps again we glimpse a fourth law.