Until now we have worked with the
Standard Model of Market Economics with its iconic 'X' marking
the spot where willingness to pay by consumers equals the
willingness to provide by producers. If all costs and
benefits are internalized in market price all is well. If
not then externalities exist the benefits and costs of which are
not captured by market price. Such a market failure then
justifies governmental action to assure a socially optimum
outcome maximizing net social benefits through law - including
property rights, rules and regulations and/or fiscal policy -
tax and spend. Choice is made at the 'margin',
e.g., where the additional or marginal cost of reducing
pollution by one unit is exactly equal to the additional or
marginal social benefit of doing so. This model resulted
from the Marginalist Revolution of the 1870s that successfully
married Newtonian calculus of motion and Bentham's calculus of
human happiness. It displaced the Classical School of economics, which was primarily concerned with the distribution of national wealth among classes of people - landlords, capitalists, workers, etc. - replacing that concern with a focus on the allocative efficiency of the atomized consumer and producer. Its origins lay in
mechanics and the laws of motion as understood in the middle of
the Industrial Revolution.
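To make the marginal condition concrete, here is a minimal numerical sketch in Python. The cost and benefit functions and all numbers are invented for illustration, not drawn from any real market: abatement is increased one unit at a time until the marginal cost of a further unit would exceed the marginal social benefit.

```python
# Hypothetical illustration of choice 'at the margin': find the abatement
# level where the marginal cost of reducing one more unit of pollution
# equals the marginal social benefit of doing so. All numbers are invented.

def marginal_cost(q):
    """Cost of abating one more unit rises as abatement q increases."""
    return 2.0 + 0.5 * q

def marginal_benefit(q):
    """Benefit of abating one more unit falls as abatement q increases."""
    return 10.0 - 0.3 * q

# Search unit by unit for the point where MC first reaches MB.
q = 0
while marginal_cost(q) < marginal_benefit(q):
    q += 1

print(f"Net social benefit is maximized at roughly {q} units of abatement")
# Analytically: 2 + 0.5q = 10 - 0.3q  =>  q = 8 / 0.8 = 10 units.
```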
Economics & Biology through Time
As we have seen the first modern
school of economics was in fact the French Physiocrats of the
late 18th century. It was they who gave us the terms:
economist, laissez faire and laissez passer. Their
economic metaphor was not mechanics but biology, specifically
farming. With the beheading of the Physiocrats by Madame Guillotine during the French Revolution, the discipline shifted to the beat of mechanical efficiency in manufacturing cum Adam Smith, including the division and specialization of labour.
The emergence of a mathematical and geometric hedonic
(pleasure/pain) economics with the Marginalist Revolution
spurred a countermovement in the United States where the
biological metaphor took root. In the first
instance, Agricultural Economics broke off from the mainstream
in the 1880s. In the second instance the work of
Richard Theodore Ely (1854–1943), a founder of the American Economic Association, led to a distinct school of
American Institutionalism which dominated American economics
until the 1950s.
The leaders of the school, which did not share a common model, included John R. Commons,
whose work highlights the evolutionary nature of Law especially
property law and how economic behaviour shifts and changes in
response to legal changes (see his 1924 Legal Foundations of Capitalism). Thorstein Veblen, on the other
hand, focused on the relationship between culture and economics
(see his 1899 Theory of the Leisure Class) as well as the
evolutionary nature of economics which recently led to the
formation of a distinct sub-discipline called Evolutionary
Economics. Wesley C. Mitchell was the third pillar of the school, focusing on the collection and analysis of real-world statistics and founding the National Bureau of Economic Research in 1920. His efforts fed the creation of the System of National Income Accounting in 1945. The biological
metaphor of American Institutionalism was, however, swept away
during the Red Scare of the 1950s. The School's support for collective action, and especially for unions, was viewed with suspicion - if not Communist, then at least that of 'fellow travelers'.
This also marked the emerging
dominance of mathematical
economics which was relatively safe from ideological
contamination by Marxists.
In this regard, the philosophy
of science itself was re-directed in this period through the
ideological efforts of James Conant, President of Harvard
University. His effort came to fruition with his protégé’s
most influential work, Thomas Kuhn’s 1962 The Structure of Scientific Revolutions. Kuhn argues, in effect, that
good paradigms like good fences make good neighbours so stick to your
field of research and leave politics and other such things
alone (Fuller
1992). According to Kuhn, specialization in the natural sciences leads to the progressive incommensurability of knowledge between scientific fields, each with an Invisible College of perhaps 40 or 50 people in the entire world who can understand your work. This conclusion has
significant implications for interdisciplinary studies such as
ecology.
With Ecological Economics another
attempt is afoot to root economics once again in biology rather
than mechanics. Why should the outcome be different this
time? Arguably because the philosopher Immanuel Kant has
finally been proved wrong. Kant, whose relevance to ecology was raised in 0. Introduction to this course, argued that there would never be a Newton of a blade of grass! By this he meant that the
sheer complexity of living things and ecologies would defy
mathematical treatment. Biology as a science was, in fact, until the Genomics Revolution essentially descriptive and imprecise about the actual processes of life. Genomics began some fifty years ago with the discovery by Watson and Crick of
the DNA double helix. Their discovery that it could split into
complementary strands established the physical basis for the
encoding and transmission of genetic information within an
individual organism and between generations. In this regard,
the New York Times on June 13, 1953 ran an article
entitled “Clue to Chemistry of Heredity is Found” calling DNA “a
substance as important to biologists as uranium is to nuclear
physicists.” (Overbye 2003). Today molecular biology finds
expression in what is now called 'bioinformatics', a new school of both biology and programming based on an alphabet of four characters (the DNA bases) rather than the two digits of binary code. One of the leaders in the field
is Stuart Kauffman currently at the University of Calgary who
has also attempted to root economics in biology.
The relationship actually works
both ways. Thus Kauffman, in his eulogy of the growing
diversity and complexity of life, draws on a root planted by
Adam Smith (1723-1790) with his observation that the division
and specialization of labour is limited by the extent of the
market. With respect to natural selection, Darwin himself
recognized a debt to economist Thomas Malthus (1766-1834) and his observation that the
food supply grows arithmetically while human population grows
exponentially. Furthermore, as we will see, Kauffman’s explanation of
mutualism or co-evolution in molecular biology is based on the
advantages of trade which conceptually links to yet another of
Smith’s immediate successors, David Ricardo (1772-1823). And Kauffman draws a parallel between survival of
the fittest in biology and business failure in economics where
the ‘survivor principle’ was coined by 1982 Nobel Prize winning
economist George Stigler. The economic principle, however, lacks
a determinant mechanism of selection. When asked which firms are
successful, Stigler answers those that survive, no matter why.
Ecological economics views the
human economy as part of the biosphere itself - not separate and
distinct. Humanity, following its nature, is simply part of the terrestrial biosphere and, like other species, is enframing and enabling Nature to serve its purpose.
The difference is that humanity is the only species that
grows its own food. By doing so it generates a surplus
that fosters increasing division and specialization of labour
among its members at a rate unmatched in Nature and has achieved
global dominance. It is literally the highest link in the
food chain. Before examining some of Kauffman's parallels
between biology and economics it is important to note some
weaknesses of ecological economics as an emerging
sub-discipline.
First, it is new and it is
awfully complex mathematically at its bleeding edge.
There is no simple X marks
the spot graphic that captures its complexity at a glance.
The theoretical implications of the Genomics Revolution for
economics are only now being
adduced. Put another way, it is only recently that new
biological metaphors capable of supporting an ecological
economics have
emerged. Kauffman’s
intellectual affinity to economics as well as his debt and
contribution to it is apparent throughout his work (1995, 2000). In this
regard, he recommends a series of very sophisticated
mathematical techniques for application in economics. Their
sophistication is such that I am not qualified to judge their
internal workings or technical merits. I have, as always, strong
epistemic reservations about low grade social scientific data
fueling ever more sophisticated mathematical models, i.e.,
garbage in garbage out. Such low quality evidence should not be
confused with that generated, without human mediation, in the
natural & engineering sciences including biology.
The mathematical complexity of
ecological 'hard science' is daunting enough. For example, phenomena exhibit widely varying growth rates, like so many rounds of compound interest moving in different directions. Complex computer simulations, with key assumptions holding at least some factors constant, are run against available data drawn from widely different sources - geographic and disciplinary - of varying quality. This is at least as complex as
the financial securitization done by physicists and
mathematicians hired by the financial community that led directly to the Great Recession of 2008. Black swans are swimming out there, but the false concreteness of numbers seems sound and certain. In fact we never have enough
knowledge or understanding let alone wisdom to make decisions in
business and life but rather rely ultimately on 'informed'
intuition.
Stuart Kauffman’s Persistently Innovative Econosphere
Nonetheless, a number of Kauffman’s metaphors have, I believe, significance for mainstream economics and any future
ecological economics. I will examine four of them: the autonomous agent, co-evolution, the adjacent possible and comparative advantage. I
will briefly consider two other ideologically commensurate
concepts shared by biology and economics – division &
specialization of labour and natural selection.
i - Autonomous Agents
Kauffman’s central concept is the autonomous agent
(2000, 49-79). This is a Kantian-like entity with
natural purpose acting on its own behalf in an environment and
able to reproduce itself through “thermodynamic work cycles”
(2000, 49). For Kauffman, such work cycles involve, in Heideggerian
fashion, the enframed linkage
of endergonic (energy requiring) and exergonic (energy
releasing) chemical reactions whereby:
the coherent organization of … constraints on the release of
energy … constitutes the work by which agents build further
constraints on the release of energy that in due course
literally build a second copy of the agent itself…” (2000, 72)
Kauffman thus takes Kant down from the cellular to the
molecular level where he finds autocatalytic sets of
“self-reproducing molecular systems” (Kauffman 2000, 130). In
effect, he finds the origin of life in chemistry. He argues that
life is the inevitable outcome of some threshold concentration
of organic chemicals widely dispersed throughout astronomical
space. While this may be so, like Kant asserting there would
never be a Newton for a blade of grass, Kaufman concludes that
while linking exergonic and endergonic reactions is essential to
definition of an autonomous agent, life itself is a “mysterious
concatenation of matter, energy, information, and something more
…” (2000, 47).
In the biosphere there is also a hierarchy of autonomous
agents. Kauffman points to the evolutionary transition from
single-cell organisms without nuclei, prokaryotes, to
eukaryotes, i.e., single-cell organisms with a nucleus plus
mitochondria in animals or plastids in plants using chlorophyll.
He concludes that:
eukaryotic cells are symbionts of two or more earlier separate
autonomous agents that contributed the mitochondria, the
plastids, and perhaps the nuclear structure of eukaryotes into a
single novel reproducing entity, the eukaryotic cell. (Kauffman
2000, 120)
Life, of course, has burgeoned far beyond single-celled
creatures. Kauffman notes there are some 265 different cell
types in the human body (2000, 182). Each is an
autonomous agent. Each, however, collectively combines to form a
higher order agent – an organ - that, in turn, forms a
functioning part of a yet higher order agent – the individual
human being. Kauffman takes this hierarchy up from the geosphere
of chemistry to the biosphere to the noösphere and beyond to the
universe itself. I characterize this process as the increasing diversity and complexity of autocatalytic systems pursuing Kantian natural purpose. This process is also active in what
Kauffman calls the econosphere where there are similarly higher
and lower order autonomous agents like the individual and the
firm. He argues that humanity exhibits the same basic pattern of behaviour as all life - making a living:
The parallels are at least tantalizing, and probably more than
that. While the mechanisms of heritable variation differ and the
selection criteria differ, organisms in the biosphere and firms
and individuals in the econosphere are busy trying to make a
living and explore new ways of making a living. (2000,
216)
ii - Co-evolution
The mechanism driving increasing diversity and complexity
is co-evolution defined as the mutual evolutionary influence of
two species (molecular, organic or economic) that become
dependent on each other. Each exerts selective pressures on the
other, thereby affecting each other’s evolution. This often
involves morphological co-construction, e.g., the shape of an
orchid flower matching the bill of the hummingbird. Co-evolution
and co-construction apply in both symbiotic and predator/prey
relationships between autonomous agents.
Kauffman argues that the primary mechanism of molecular
evolution is not the template model of sequentially constructing
DNA step-by-step up the ladder. Rather it is through coconstruction of its segments by sets of mutually dependent
autocatalytic molecules that then integrate the parts into a new
coherent living whole. This catches the Kantian sense that “each
part is reciprocally means and end to every other. This involves
a mutual dependence and simultaneity that is difficult to
reconcile with ordinary causality” (Grene & Depew 2004, 94).
Given an ever changing fitness landscape, autonomous agents
constantly adapt, adjust and evolve or go extinct, e.g., out of
business, sometimes in avalanches of change. They do so by
experimenting with mutations called preadaptations or
exaptations which:
… in an appropriate environment [are] a causal consequence of a
part of an organism that had not been of selective significance
[but] might come to be of selective significance and hence be
selected. Thereupon, that newly important causal consequence
would be a new function available to the organism.” (Kauffman
2000, 130)
Arguably, in a knowledge-based economy, research & development
(R&D) plays a commensurable role. It should be noted, however,
that the concept of the self-organizing universe based on
coevolution was first (to my knowledge) put forward by Erich Jantsch in Design for Evolution (1975) and then The
Self-Organizing Universe (1980).
There are at least two other important characteristics of life on a fitness landscape. First, having reached a peak of fitness, if the rate of mutation, change or experimentation becomes too rapid, i.e., crosses some threshold, then “the
population ‘melts’ off the fitness peak and wanders away across
the fitness landscape” (2000, 155). This is arguably
the case with the ‘de-industrialization’ of traditional First
World economies. Second, among the many border or transition states identified by Kauffman as characteristic of life, one of the most intriguing is that life exists on the quantum/classical frontier.
… it is probably of more than passing interest that real living
entities, cells, do straddle the classical and quantum boundary.
One photon hitting a visual pigment molecule can beget a neural
response. In short, real living systems straddle the quantum
classical boundary. If there is a tendency of coevolving
autonomous agents to increase the diversity of alternative
events that can occur, then living entities must eventually hit
the Heisenberg uncertainty limit and abide at least partially in
the quantum realm. (2000, 149)
iii - Adjacent Possible
But from where do preadaptations and exaptations come?
According to Kauffman, using chemical reaction charts as his
model, they come from the ‘adjacent possible’ consisting “of all
those molecular species that are not members of the actual, but
are one reaction step away from the actual” (2000,
142). Extended to the noösphere, it is those thoughts and ideas
which are candidates for application at the next level of
ideological evolution. Economic and biological systems expand or
explore the adjacent possible as quickly as possible subject to
timely selection of the fit and unfit, e.g., going out of
business. If selection takes too long, then fitness may decline
or simply melt away. Arguably, this explains
‘de-industrialization’ of some First World Nation-States. They
maintained existing plant and equipment, e.g., in steel
production, until fully depreciated through voluntary (and
sometimes involuntary) quotas on imports from developing Asian
producers who were investing in the best new technologies
emerging from the adjacent possible. The fitness of the West
fell, at least in terms of the traditional manufacturing-based
economy.
A characteristic of the chemical adjacent possible is that
its size (the number of possibilities) increases exponentially
faster than the increase in the diversity, complexity and number
of autonomous agents. For example, a doubling in diversity may
result in a fourfold or greater increase in the size of the
adjacent possible, i.e., the number of new possible forms just
one step away from becoming actual. This, Kauffman argues, is
one reason for the proliferation and diversification of life.
The same may be said for knowledge itself. From this conclusion
he argues there may be a fourth law of thermodynamics involving:
a tendency for self-constructing biospheres [and econospheres]
to enlarge their workspace, the dimensionality of their adjacent
possible, perhaps as fast, on average, as is possible ...
(Kauffman 2000, 244)
This means an exponential increase in the ways and means by
which autonomous agents make a living is the inevitable outcome
of increased diversity and complexity. The transition from an
agricultural- to a manufacturing-based economy demonstrates such
an exponential increase in job opportunities, not just in number
but in the kinds of jobs. New niches appear.
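The arithmetic behind this claim can be illustrated with a toy combinatorial model (my own illustration, not Kauffman's formalism): if each candidate in the adjacent possible is a pairwise combination of existing species, the candidate pool grows roughly with the square of diversity, so doubling diversity more than quadruples it.

```python
# Toy model (not Kauffman's formalism): suppose each new possible 'reaction'
# or product is a pairwise combination of existing molecular species.
# Then the adjacent possible grows roughly as N*(N-1)/2, i.e. faster than N.

from math import comb

for n_species in (10, 20, 40, 80):
    adjacent = comb(n_species, 2)   # candidate pairwise combinations
    print(f"{n_species:3d} species -> {adjacent:5d} one-step possibilities")

# Doubling diversity from 10 to 20 species raises the candidate pool from
# 45 to 190 - a better than fourfold increase, echoing the text's example.
```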
Kauffman is, however, critical of contemporary economics
for its treatment of complements and substitutes in what he calls the technological adjacent possible. Quite simply, the Standard Model offers no explanation for the emergence of complements or substitutes or for the increasing diversity and complexity of new goods and services, e.g., the book versus the
DVD. Kauffman uses the classic example of the automobile
replacing not just the horse but also the network of goods and
services associated with it. He points out the new web of
complements that followed the innovation or emergence of the
automobile. These included paved roads, garages, gasoline
stations, parking lots, car insurance, the drive-in, then the
drive-thru, etc. Such ‘Kauffman webs’ are, at least in part,
commensurate with Paul David’s “network externalities effects”
in economics (David 1990, 356). Kauffman would have us, however,
look much deeper into the adjacent possible for complements and
substitutes to enhance economic fitness.
iv - Comparative Advantage
If the production function is the most elegant contribution
to thought by economics, i.e., Y = f (K, L, N), then the theory
of comparative advantage is one of its most obscure. When
challenged by mathematician Stanislaw Ulam to “name me one
proposition in all of the social sciences which is both true and
non-trivial,” the Nobel Prize winning economist Paul Samuelson
responded with the theory of comparative advantage because:
That it is logically true need not be argued before a
mathematician; that it is not trivial is attested by the
thousands of important and intelligent men who have never been
able to grasp the doctrine for themselves or to believe it after
it was explained to them. (Samuelson 1969)
This obscurity partially results because the theory engages
a complex web of economic ideas including absolute advantage,
division and specialization of labour, exchange, factor
endowments, opportunity cost, production possibility frontiers,
relative prices and trade. Furthermore, it would more accurately
be called the theory of comparative cost rather than of
advantage. And, of course, some of its results appear
counter-intuitive.
Semantic obscurity has led to the theory finding general expression as a numeric example such as that first used by David Ricardo to demonstrate the theory in his 1817 book On the Principles of Political Economy and Taxation. In his case, the example concerned cloth and wine production in England and Portugal. In summary, comparative advantage means that mutually
beneficial exchange is possible whenever relative production
costs differ prior to trade. One of its counter-intuitive
deductions, however, is that if a country enjoys an absolute
advantage in the production of all goods and services, i.e., can
produce all of them cheaper than anyone else, it is still better
off trading with other countries. The theory was used by Ricardo
to counter arguments favouring protective tariffs and trade
barriers which, intuitively, promise national prosperity. It
continues to serve this free-trade purpose.
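A worked numeric example in Python makes the logic visible. The labour-cost figures are those commonly quoted from Ricardo's cloth/wine illustration and should be treated as illustrative rather than historical data.

```python
# Ricardo-style illustration of comparative advantage. Labour hours needed
# to produce one unit of each good (figures commonly quoted from Ricardo's
# 1817 cloth/wine example; treat them as illustrative).
hours = {
    "England":  {"cloth": 100, "wine": 120},
    "Portugal": {"cloth": 90,  "wine": 80},
}

# Opportunity cost of one unit of cloth, measured in units of wine forgone.
for country, cost in hours.items():
    opp_cost_cloth = cost["cloth"] / cost["wine"]
    print(f"{country}: 1 cloth costs {opp_cost_cloth:.2f} wine")

# England: 1 cloth costs about 0.83 wine; Portugal: about 1.12 wine.
# Portugal is absolutely cheaper in BOTH goods, yet cloth is relatively
# cheaper in England, so England should specialize in cloth and Portugal in
# wine; any exchange rate between roughly 0.83 and 1.12 wine per cloth
# leaves both countries able to consume more than they could in isolation.
```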
The theory of comparative advantage, in effect, separates
consumption from production. Without trade, a nation can only
consume what it produces. With trade, it is able to consume more
than it produces. Put another way, by specializing in what it
does best, a nation can afford to buy more of what it does
worst. For Kauffman, and biology in general, the advantages of
trade are old news:
Economics has its roots in agency and the emergence of
advantages of trade among autonomous agents. The advantages of
trade predate the human economy by essentially the entire
history of life on this planet. Advantages of trade are found in
the metabolic exchange of legume root nodule and fungi, sugar
for fixed nitrogen carried in amino acids. Advantages of trade
were found among the mixed microbial and algal communities along
the littoral of the earth’s oceans four billion years ago. The
trading of the econosphere is an outgrowth of the trading of the
biosphere. (2000, 211)
To demonstrate the advantages of trade,
he uses a
biological example that, to my mind at least, is intuitive:
Consider two bacterial species, red and blue. Suppose the red
species secretes a red metabolite, at metabolic cost to itself,
that aids the replication rate of the blue species. Conversely,
suppose the blue species secretes a different blue metabolite,
at metabolic cost to itself, that increases the replication rate
of the red species. Then the conditions for a mutualism are
possible. Roughly stated, if blue helps red more than it costs
itself, and vice versa, a mixed community of blue and red
bacteria may grow. How will it happen? And is there an optimal
“exchange rate” of blue-secreted metabolite to red-secreted
metabolite, where that exchange rate is the analogue of price?
(2000, 216-17)
How it will happen and at what rate is determined by co-evolution. The benefits of trade lead each to adjust to the
other until optimal growth is achieved by both. Without each other's help, each individually would be less fit. In such a
symbiotic relationship there is also the potentiality for the
emergence of a higher order autonomous agent, e.g., prokaryotes
coevolving into eukaryotes, or European Nation-States coevolving
as the European Union.
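A minimal simulation sketch (invented parameters, not Kauffman's own model) shows the condition he describes: if the growth each species gains from its partner's secretion exceeds the metabolic cost of its own secretion, the mixed community outgrows either species alone.

```python
# Minimal sketch of the red/blue mutualism (invented parameters, not
# Kauffman's model). Each species pays a metabolic cost to secrete a
# metabolite that boosts the other's growth; if the benefit received
# exceeds the cost paid for both, the mixed community grows faster than
# either species would alone.

def grow(pop_red, pop_blue, steps=50):
    cost = 0.02        # growth forgone to secrete the metabolite
    benefit = 0.05     # growth gained per unit share of the partner's secretion
    base = 0.03        # intrinsic growth rate of each species
    for _ in range(steps):
        red_rate = base - cost + benefit * (pop_blue / (pop_blue + pop_red))
        blue_rate = base - cost + benefit * (pop_red / (pop_blue + pop_red))
        pop_red *= (1 + red_rate)
        pop_blue *= (1 + blue_rate)
    return pop_red, pop_blue

red, blue = grow(100.0, 100.0)
print(f"after 50 steps: red={red:.0f}, blue={blue:.0f}")
# With equal shares the benefit term (0.05 * 0.5 = 0.025) exceeds the cost
# (0.02), so both populations end larger than if each grew alone at 'base'.
```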
v - Division & Specialization of Labour
Kauffman in his eulogy of the
growing diversity and complexity of life draws on a root
planted by Adam Smith (1723-1790) with his observation that the
division and specialization of labour is limited by the extent
of the market.
vi - Natural Selection
With respect to natural selection,
Darwin himself recognized a debt to economist Thomas Malthus
(1766-1834), one of Smith’s immediate successors, and his
observation that the food supply grows arithmetically while
human population grows exponentially. And Kauffman draws a
parallel between survival of the fittest in biology and business
failure in economics where the ‘survivor principle’ was coined
by 1982 Nobel Prize winning economist George Stigler. The
economic principle, however, lacks a determinant mechanism of
selection. When asked which firms are successful, Stigler
answers those that survive, no matter why. It should also be
noted that Kauffman’s explanation of mutualism or co-evolution in
molecular biology is based on the advantages of trade which
conceptually links to yet another of Smith’s immediate successors, David Ricardo (1772-1823).
4.1 Population, Urbanization
& Bio-diversity
It was Thomas Malthus's 1798 An Essay on the Principle of Population that led economics to become known as 'the dismal science'. The human food supply grows arithmetically while the human population grows geometrically. His insight inspired
Darwin and has haunted our civilization ever since. What
Malthus held constant, however, was technological change - both
instrumental (GMO) and organizational (agribusiness). With the human planetary population rising from 7 billion today to 9 billion by 2050, the question of how to feed all these new mouths is no longer a technical one (see 3.2 Agriculture) but rather a political economic one.
It is a question to be framed inclusive of quality of life and
our growing environmental footprint as a species.
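Malthus's contrast is easy to make concrete. The following sketch uses hypothetical numbers; note that raising the food increment each generation, i.e., allowing technological change, is exactly what postpones or removes the crossing point Malthus feared.

```python
# Malthus's contrast in miniature (hypothetical numbers): food supply grows
# arithmetically (a fixed amount per generation) while population grows
# geometrically (a fixed percentage per generation).

food = 1000.0           # units of food, starts comfortably ahead
population = 100.0      # people
food_increment = 100.0  # arithmetic: +100 units each generation
growth_rate = 0.25      # geometric: +25% each generation

for generation in range(1, 21):
    food += food_increment
    population *= (1 + growth_rate)
    if population > food:
        print(f"population overtakes food supply in generation {generation}")
        break
# With these numbers the crossing comes in generation 15; letting food grow
# geometrically too (technological change) can remove the crossing entirely.
```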
We
are the only species on the planet that raises its own food.
The resulting surplus allows us to grow our population
essentially as we choose. Over the last four centuries of the Scientific Revolution population growth has fostered increasing division and specialization of labour among us, beginning our species' baby steps towards the stars, planets
and asteroids. It has extended our grasp to the genetic
elements of Life itself which we are beginning to enframe and
enable to serve human purpose.
It has also allowed us to wage devastating wars against each other
and against Nature. War remains the greatest cause of food scarcity during an era of global surplus. We are the highest link in the food chain.
No other species eats us and survives long. Our only real
enemy is ourselves. So what is the human condition,
environmentally, economically in the second decade of the 21st
century?
Urbanization
Globalization of the economy is
either cause or effect of massive urbanization of the last
century. This forms another distinct sub-discipline: Urban Economics. From hunter-gatherers wandering the land in search of food 10,000 years ago, we began to settle in villages, towns and then cities around which agriculture produced a surplus.
Farmers could grow and raise much more food than they could eat.
Division and specialization of labour took off and civilization
- living in cities - began. This pattern persisted
until about 200 years ago when the factory system emerged with
the Industrial Revolution fuelled for the first time not by
fluctuating winds and water or the muscles of animals and men
but steam. Labour flowed from the country-side into
the cities to man the factories. An increasingly small
population remained on the land to feed the cities.
Roughly speaking, in the Canada of 1900 some 60% of the population was rural; today the farm population is 3% or so.
Traditionally the City is seen as a
machine but increasingly it is viewed as an ecology. This is
sometimes called the human-built environment or an arcology
marrying ‘architecture’ including urban design and planning to
‘ecology’. The City is thus a living thing. It is planted,
grown, cultivated and maintained by its citizens – a word
literally meaning ‘those who live in cities’.
The City is thus a web of mutually
supportive relationships between physical structures such as
buildings, roads, water and power lines and its inhabitants. As
the City grows its citizens evolve ever more complex webs of
transactions called business, policing, health, education,
welfare, recreation and culture. This division & specialization
of labour generates the gains from trade that were, and remain,
the primary reason for civilization – living in cities. Its
different parts, ideally, co-evolve building on each other’s
strengths and compensating for weaknesses increasing overall
civic competitiveness, fitness and sustainability. Thus the
hummingbird’s bill co-evolves with the orchid to perfectly fit
its flower.
As noted by Adam Smith long ago,
division and specialization of labour is limited by the extent
of the market. Thus as the City grows, niches unfold opening up
new and different ways of earning a living. Increasing diversity
of opportunity combined with tolerance of difference and the
willingness of citizens – old and new – to grasp emerging
opportunities define the dynamism of a City. The adjacent
possible – the realm where possibilities one step away from
being realized reside – expands, particularly during spurts of
economic growth. Creativity, inventiveness and imagination are
required to see them and courage and confidence to grasp them.
For the first time in human history the majority of the human population (some 3.3 billion people) now lives in cities - 'civilized' in the literal sense. The pattern above, however, arguably describes cities in the already urbanized world. For the rest, the situation varies dramatically.
Robert Kaplan, in his 1994 essay and later in a book called
The
Coming Anarchy, describes the state of slums in Turkey, West
Africa, Brazil and elsewhere in the world reporting on the
social costs of rapid urbanization in the developing world.
In fact by 2030 some 5 billion will live in cities, the vast
majority in the developing world. Most, however, arrive before municipal infrastructure is in place. This has led to self-provision of traditional municipal services by major building complexes and clusters, e.g., in São Paulo in Brazil and Bangalore in India.
As noted in
0.
Introduction
a global society in which there is
contiguous urban development separated only by natural barriers
– mountains, oceans and deserts - has been called the Ecumenopolis – the World City - by urban planner Constantinos Doxiadis;
its global reality is visible in a composite photograph of “The
World at Night”
published by NASA in the year 2000. We see
a World City whose shimmering lights soar out into the infinite
blackness of space. However, there is no global economics, no generally
accepted model for managing the planet.
(Related concepts: cluster theory, the 'urban heat island effect' and urban micro-climates.)
Gender, Geography,
Religion & Boys will be Boys
That there are 7 billion human beings on the planet now and will be 9 billion by 2050 cannot be disputed. As the saying goes: Demography is destiny!
This increase is the single greatest environmental threat facing
the planet and its biosphere - nothing fails like success!
No one can morally ask who should die and who should never be
born as a solution. And it is not just numbers and
morality that are daunting but also the gender, geographic,
religious and age distribution of projected population growth.
The single most effective means of
human population control is the empowerment of women. As
your authors suggest educated working women can control their
reproductive cycle and represent arguably the best economic
investment that can be made in so-called developing countries.
Unfortunately this natural asset is actually shrinking in
relative terms just where such an investment has potentially the
highest rates of return - environmentally and economically.
Thus in Asia there are conservatively some
160 million missing girls, missing because they were
selected out by, among other things, 'industrialized' ultrasound
scanning. When we consider boys below,
the implications of this demographic reality for global
peace and security will be exposed.
So population growth will
concentrate in the developing or former Third World. It
will increasingly be housed in rapidly growing urban centres;
there will, relatively speaking, be fewer and fewer girls and
women. Why? Excepting Leninist China the
primary force is religion. In all mainstream
religions - Judaism, Christianity, Hinduism, Islam, Shinto -
there is either recent history or continuing practice of legal
misogyny and sexual apartheid. They also espouse: Go forth
and multiply in God's name. In Canada it was not until the 1929 Persons Case that women were recognized in law as 'persons'. Until then a woman was in law the 'chattel' or moveable property of a male - husband, father, brother, whatever. This was also the tradition in ancient Greece and Rome.
The legal, social and economic
equality of women attained in the last quarter of the 20th
century is arguably one of the greatest achievements of Western
civilization. It is not shared, however, around the world
in spite of U.N. declarations, treaties and conventions.
Arguably the 9/11 attack was motivated by fear and/or hatred of
the provocative female freedom enjoyed in the secular West. The
Taliban too fear female liberation. In west Africa, and
elsewhere especially in the Islamic world, fear of female
sexuality, as opposed to fertility, leads to so-called 'female
circumcision'.
If, however, there are at least 160 million
missing girls in Asia then by deduction there are 160 million extra boys. In China, where the one child policy resulted in a surplus of boys, the problem of the 'little emperors' has arisen.
Where before the typical family had three boys and three girls, and the future of the family - especially the elderly - lay with the success of at least one child (generally but not always a boy), afterwards the family's future rested on one little head on whom all blessings are bestowed. Needless to say, swelled heads can result. The more important question is: when these boys reach manhood, where will they find wives? Hormones blazing, such young men traditionally went to war or became engaged in criminal gang activity. Arguably it will be where gender bias against women is greatest that
future
wars with their devastating effects on the
environment will be fought and where street gang cultures will
flourish best. Where the gender balance is more equitable -
demographically, legally, politically, socially - there will
tend to be peace, order and good government.
It should be noted, however, that
in the West the highest birth rates are among religious
fundamentalists - Jewish, Christian, Hindu and Islamic - with
traditional views about the status of women in human society.
Secularists are not demographically replacing themselves. Is demography destiny?
Biodiversity
On the one hand increased urbanization frees
up habitat - unless converted to agriculture or other uses. It also
reduces the demand for 'bush meat' and thereby inhibits
reduction of biodiversity. On the other hand, urban concentration
generally means higher wealth and a growing appetite for exotic
animals as pets and animal parts, e.g., rhino
horn, as medicinal agents or aphrodisiacs.
4.2 Climate Change:
Contrasting Sciences
It is of course climate
change that has become the focal concern about humanity's
ecological footprint on this planet. That climate changes is
in the nature of things. Whether caused by asteroids, comets,
earthquakes, volcanoes or the changing intensity of the Sun, the
climate of planet Earth always has and always will change
through time. Unlike the discount rate and present value, which look forward to future events, assessment of climate change has traditionally looked backward, based on one's personal experience and/or the remembered or recorded experience of one's ancestors. It is only some 20 generations since
the Scientific Revolution began instrumental measurement of
climate and its change. It is arguably only 5 generations since we developed the technology to use proxies to
estimate past climate conditions, e.g.,
tree rings, ice cores, etc.
In that time we have come to
learn of the major forces creating Earth's climate and
contributing to its change: the ozone layer blocking deadly UV radiation from the Sun; the balance of gases in the atmosphere; the great ocean currents exchanging warm southern for cold northern waters; and volcanoes causing, so far, only short term change - like Mount Tambora's 1815 eruption making 1816 the year without a summer, or Krakatoa's 1883 eruption that caused a roughly five year volcanic winter. Roughly
during the same period the human population soared almost 18
fold from 400 million in 1500 to 7 billion today. This is
a species whose emergence some 200,000 years ago - then numbering in the thousands - contributed to the extinction of much of the planet's megafauna. There is little doubt
that it has an effect on the
environment and climate when numbering in the billions.
The question is how much and in what direction? Is
it life affirming with respect to other life forms on this
planet or life denying in order to feed its own increasing
numbers? Concerning problems of population control see
above
4.1 Population, Urbanization & Bio-diversity.
During the last 50 years - about
two and a half generations - new instrumentation has been added
- near and far Earth satellites sensing and measuring everything from gravity variations on the planet to, most recently, solar observatories reporting on our single most important energy source - the Sun. It was thus only
in 1958, a year after Sputnik and the beginning of the Space
Race, that the Van Allen Belts of charged particles flowing
along the Earth's magnetic torus were discovered by Explorer 1.
First came theories, then instrumental proof.
As an
example of new intelligence being gathered consider this October
9, 2011 report on the BBC
Ultraviolet light shone on cold winter conundrum
http://www.bbc.co.uk/news/science-environment-15199065.
Variations in solar UV apparently triggered recent cold winters
in northern Europe. Who would have guessed? Who could have guessed without the instruments now in place observing the Sun? As for rogue asteroids crashing into our little
blue marble, repeating the sad tale of the dinosaurs, there is a growing international enterprise to survey the skies.
This includes Hubble and other orbiting observatories.
We are beginning a new
age of discovery about our home world and its environment.
It is also increasingly international in
nature including the International Space Station - the third
brightest object in the sky after the Sun and Moon. Human
access to the station is, at the moment, in the hands of the
Russian Federation. The European Union has been active in
launching numerous scientific observatories and interplanetary
missions. And then there are the burgeoning Chinese, Indian and Japanese space programs which have sent instruments to the Moon, comets and asteroids and back. The privatization of the American space industry promises some surprises as well as forecast developments, including space ecology tours - see the Earth from space in zero g's - and, of course, hotels.
Nonetheless we are just at
the beginning of a new age of enlightenment - the Earth as seen
from Space.
i - Ozone Depletion
The contrasting success of
the 1987 Montreal Protocol to the 1985
Vienna Convention for the
Protection of the Ozone Layer
and the 1997 Kyoto Protocol
to the 1992 United Nations
Framework Convention on Climate Change (Rio or Earth Summit)
demonstrates the varying authority of contrasting sciences.
Thus by 1985 there was hard data, subject to testing, that ozone in the atmosphere was depleting, potentially exposing Earth to devastating ultraviolet radiation from the Sun. A hole had appeared in the ozone layer over the Antarctic continent and was growing. It represented a clear and present danger to all of humanity in a very short timeframe. The worst case scenario was such holes opening and closing over population centres such as NYC, in effect torching the human population.
The Vienna Convention of 1985 came into force in 1988, acting as a framework for international efforts to protect the ozone layer. It did not include any legally binding targets. However, the 1987 Montreal Protocol on Substances That Deplete the Ozone Layer established such goals, including mechanisms to encourage the phasing out of production of substances responsible for ozone depletion. Some 196 countries have ratified the Protocol and the ozone layer is expected to recover by 2050.
ii - Global Warming
By contrast the 1997 Kyoto
Protocol to the 1992
United Nations Framework Convention on Climate Change (UNFCCC) has
been less successful. The UNFCCC is an international
environmental treaty produced at the United Nations
Conference on Environment and Development (UNCED) - the
Earth Summit - held in Rio de Janeiro from June 3 to 14, 1992.
Its objective was to stabilize greenhouse gases at a level that
would prevent human change of Earth's climate, specifically its
current 'normal' temperature distribution around the world.
It too, like the Vienna Convention on ozone, did not include any legally binding targets. The 1997 Kyoto Protocol on Climate Change, like the 1987 Montreal Protocol, did.
The Protocol laid out greenhouse
gas emission reductions for developed countries and established
mechanisms including emissions trading, the clean development
mechanism and joint implementation agreements. Most
industrialized countries and some central European economies in
transition agreed to reduce emissions by an average of 6 to 8%
below 1990 levels by 2008–2012. The developing world, including India and China, however, was and remains exempt.
The United States was required to
reduce emissions 7% below 1990 levels, but the Senate never ratified the treaty after President Clinton signed it, and the Bush administration explicitly rejected the protocol in 2001.
There were fundamental economic reasons for doing so including
enormous costs of complying while developing countries did not.
The post-1997 eruption of China as the world's factory for
almost everything and India the global answering service makes
the decision almost prescient in hindsight. The
decision was not, however, just about economics but also about
the arguably differing authority of contrasting sciences.
When it comes to global
warming we are dealing with a different form of science than the
Montreal Protocol. Arguably Montreal is based on old style
reductive science in which experimental testing can confirm
results. Modeling of complex systems such as climate change arguably represents the outer limits of contemporary science
for a number of reasons. First, traditional reductive
science of controlled experimental conditions gives way to 'real
world' integrative inter-disciplinary sciences such as ecology
and climatology. They are in a sense 'synthetic sciences'.
As in the 'soft' or human sciences experimental testing is
severely limited. Second, modeling requires
evidence of varying quality drawn from widely different
sub-disciplines within physics, chemistry and biology. The
interdisciplinary correlation of potentially incommensurate
findings across disciplinary borders remains problematic.
Third, modeling as in
economics requires holding certain factors constant while
examining effects caused by changes in others. Such
assumed 'constants' may or may not in fact remain constant
in real world situations. Black carbon and sulphates, currently ignored in most climate models, are a case in point. Fourth,
models are generally tested through computer simulations
comparing forecast outcomes against available data sets.
The model is then re-calibrated to generate outcomes that more
closely approximate evidence presented in such data sets.
This puts a premium on the quality and length of data sets which
are themselves often the result of manipulating or massaging
observational data accumulated in many different places by many
different observers over differing time frames. This was
one issue raised during the 2009 Climategate
Controversy.
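The re-calibration loop just described can be pictured as simple parameter fitting: a model parameter is adjusted until simulated output best matches the observational record, which is why the quality and length of that record matter so much. The sketch below is a deliberately crude caricature with invented numbers, not a climate model.

```python
# Caricature of model calibration (not a climate model): adjust a single
# parameter until simulated output best matches an observed data series.
# The fit is only as good as the observations it is tuned against.

observations = [0.10, 0.18, 0.33, 0.41, 0.52]   # e.g. temperature anomalies

def simulate(sensitivity, steps=5):
    """Toy model: anomaly grows linearly with an assumed sensitivity."""
    return [sensitivity * (t + 1) for t in range(steps)]

def error(sensitivity):
    sim = simulate(sensitivity)
    return sum((s - o) ** 2 for s, o in zip(sim, observations))

# Brute-force re-calibration over candidate parameter values.
best = min((error(s / 100), s / 100) for s in range(1, 51))
print(f"best-fit sensitivity = {best[1]:.2f}, squared error = {best[0]:.4f}")
```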
Fifth and finally,
that such modeling can be off the mark was demonstrated in a
field arguably less complex than climate change - investment
finance. Many of the same statistical and mathematical
techniques developed originally in physics were used to generate
so-called Collateralized Debt Obligations (CDOs) probabilistically
engineered to secure a positive rate of return in the sub-prime
mortgage market. When a key assumption (that
regional housing markets would not all go down at the same time)
was broken the outcome was anything but secure for the entire
financial industry. That physicists have since moved on from Wall Street is reflected in a recent independent review of the Climategate data series by physicists (see 4.4 Links).
This critique is not intended
to reject such modeling but rather to recognize its inherent
limitations and treat its findings with reasonable doubt
compared to traditional experimental science. It is,
however, no coincidence that the Rio Declaration introduces for
the first time, to my knowledge, the 'precautionary principle'
as a norm of international environmental law - jus cogens
(See
2.3 Property Right -
The Commons (Natural & Artificial) & International Law).
To quote the
Rio Declaration:
Principle 15
In order to protect the
environment, the precautionary approach shall be widely applied
by States according to their capabilities. Where there are
threats of serious or irreversible damage, lack of full
scientific certainty shall not be used as a reason for
postponing cost-effective measures to prevent environmental
degradation.
Findings from such modeling indicated that human activity, particularly burning fossil fuels, is affecting the Earth's climate. Based on these findings a global effort called the Kyoto Protocol was developed to slow human-induced global warming. In many ways this effort is based on the precautionary principle, given the 'lack of full scientific certainty'. Unlike ozone
depletion with its nearly immediate and devastating impact
around the world,
global warming is an externality distant in time (fifty to one
hundred years or 2 1/2 to 5 generations) with benefits
and costs unevenly distributed through geopolitical space,
i.e., some will win and some will lose from global warming.
One response to the winner/loser outcome has been introduction
of an additional argument in favour of united action - 'global
weirding'. In this view global warming involves not
just a rise in average temperature but also increasingly severe
weather patterns around the world.
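The phrase 'distant in time' can be made concrete with ordinary discounting arithmetic. With illustrative numbers, even modest discount rates shrink damage occurring 50 to 100 years out to a small fraction of its face value today, which is one reason the choice of discount rate in assessments such as the Stern Review is so contested.

```python
# Present value of a distant climate damage (illustrative numbers only).
# PV = damage / (1 + r)**years  -- the standard discounting formula.

damage = 1_000_000_000      # $1 billion of damage assumed to occur in future
for rate in (0.01, 0.03, 0.05):
    for years in (50, 100):
        pv = damage / (1 + rate) ** years
        print(f"r={rate:.0%}, {years:3d} years out -> PV = ${pv:,.0f}")

# At 5% a $1B damage a century away is worth under $8M today; at 1% it is
# worth about $370M. The choice of discount rate largely drives the answer.
```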
In fact the Protocol exempted
'developing' countries from formal abatement obligations placing
the burden on developed world economies. Given the rapid
economic rise of the developing world especially China, India
and Brazil since 1997 this exemption arguably threatens the
entire project. Thus China has now surpassed the United
States as the largest single source of global warming gases.
Given China's growing dependency on coal fired generation of
electricity its contribution to global warming is likely to grow
not diminish in the near future. Attempts to extend abatement obligations to developing countries have been firmly rejected. Is it equitable that developing
countries should sacrifice growth and development by abating
green house gases when the rise of the developed world was
fuelled by generating such gases through burning fossil fuels in
the past?
The economic cost to the
developed world of this global effort is huge! The economic costs of inaction, according to some experts, are much higher. The 2006
Stern Review on the Economics
of Climate Change published by the U.K.
Office of Climate Change lays out the potential damage to the
economy of inaction. The review, like virtually all of
climate change evidence, is the result of synthetic science
involving both individual physical and engineering sciences as
well as the soft sciences, i.e., the social sciences and
humanities. It assumes over the relevant timeframe no
other climate changing events, e.g., solar cooling,
volcanoes, etc. These are 'black swan' events,
i.e., very low probability but high impact. Only the global warming effects of human behaviour are considered.
Tipping points are another device presented in much of the literature. A marginal effect such as human-induced global
warming according to this narrative tips the balance of, for
example, melting of the ice caps, cessation of the Gulf Stream,
eruption of undersea and tundra-based methane deposits, etc.
These too are black swan events but ones generally presented to
sustain the global effort.
Three grand strategies have
been proposed to tackle global warming. These are geoengineering,
adaptation and mitigation. Geoengineering envisages
massive projects ranging from seeding the oceans with iron
particles to stimulate plankton growth and thereby absorb green
house gases to space 'umbrellas' to cut solar heating of the
planet. Adaptation involves preparing for the likely
consequences of global warming such as rising sea levels.
Population could, for example, be gradually moved to higher
ground. Mitigation involves steps to minimize the effects
of global warming by, for example, abating generation of green
house gases. To date geoengineering and adaptation have
been effectively ruled out and mitigation has become the chosen
strategy.
During negotiations of the
Kyoto Protocol a search began for cost effective mitigating solutions.
Two major alternatives were considered: emission charges and
national cap and trade schemes. It was determined
that emission charges would lead to economic leakage from jurisdictions imposing such charges to those that did not, placing abating countries at a competitive disadvantage.
It could also lead to a 'race to the bottom'. Cap and trade, on the other hand, would serve to nationalize revenue flows.
Each nation ratifying the Protocol would be assigned a cap on how much green house gas it could produce. The cap could
then be used to permit countries exceeding their target to cover
their excess by buying capacity from nations that
succeeded in keeping output below their assigned cap. The
most extensive cap and trade system has been adopted by the
European Union in which national caps have in a number of member
states been auctioned off to corporate citizens who in turn are
allowed to trade between themselves.
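A minimal ledger sketch (hypothetical countries, caps and prices) shows the basic accounting of cap and trade: parties over their assigned cap buy allowances from parties under theirs, while total emissions remain bounded by the sum of the caps.

```python
# Minimal cap-and-trade ledger (hypothetical countries and numbers).
# Each party gets an assigned cap; those over their cap buy allowances
# from those under it, so total emissions stay within the sum of the caps.

caps      = {"A": 100, "B": 80, "C": 60}    # assigned amounts (Mt CO2e)
emissions = {"A": 110, "B": 70, "C": 55}    # actual emissions

price = 20.0   # $ per tonne, set here by assumption rather than by auction

for country in caps:
    balance = caps[country] - emissions[country]   # + means surplus to sell
    cash = balance * price
    side = "sells" if balance > 0 else "buys"
    print(f"{country}: {side} {abs(balance)} Mt, cash flow {cash:+,.1f} $M")

total_cap = sum(caps.values())
total_emitted = sum(emissions.values())
print(f"total emitted {total_emitted} Mt vs total cap {total_cap} Mt")
```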
4.3 Sustainability,
Competitiveness & Fitness
'Sustainability' is a term with
many meanings to many people. It can mean sustaining our
current life style and/or standard of living into the future.
This is analogous to the long-run outcome of perfect competition
in economics - the steady state. It is an equilibrium end
state. It can mean curtailing our current standard of
living to ensure future generations have resources available to
sustain their needs into their future. It can mean
curtailing our current standards in order to sustain the
remaining 'natural' environment and other species. Such
curtailment, to be effective, would, however, require what
futurists of the 1970s called 'the spontaneous dawning of
awareness' by virtually the entire human population. It
can also, as suggested in the Brundtland Report, simply mean meeting the needs of the present without compromising the future. This last definition sounds like Bain's
definition of economic conservation - wise use (See
3.4 Mining).
Sustainability, however, must
compete with the other great buzz phrase of our age:
competitiveness. The term arose as a sports metaphor.
In sports, it is the opposing team that is the challenge. The
playing field, the environment itself, is generally fixed,
invariant and subsidiary to the consciousness of players at
play. The competitiveness of sports brings the sense of
win/lose against an opponent and winner take all.
Nation-States have adopted competitiveness as a norm which finds
expression in comparative advantage and the benefits of trade.
Specializing in what one does best allows one to trade for what one does worst.
Shifting to biology, however,
offers yet another term that arguably captures the sense of
sustainability and extends it: fitness. Natural selection involves not just an opponent but also new
invariants and affordances thrown up by an ever changing
environment. In this sense Darwinian fitness is not simply
bodily strength, intelligence, vigor or bravery vis-à-vis
rivals. Rather, fitness is a compounded result of the mutual
relationship between an organism and its environment including
symbiotic as well as predator/prey relationships. In fact, symbionts can significantly enhance fitness,
i.e., the
probability one will survive and leave descendants.
The fitness
landscape is constantly changing, altered and distorted by
perpetual adaptation by competitors and symbionts as well as
environmental variation and change such as increased heat or
cold, wet or dry and the rise and fall of mountains, etc.
Shifting to a biological metaphor expands focal attention to
include the environment and symbionts, dimensions the sports
analogy cannot capture. The fitness of biology brings a sense of
survival/reproduction in an environment increasingly enframed and enabled by human technology and populated by many more symbionts
than predators. The sports metaphor is hostile and aggressive; the
biological,
cooperative and coevolutionary. In effect,
most Nation-States, especially smaller ones, have opted for
coevolution with other Nation-States in the guise of trading
blocs such as NAFTA and the European Union. Division and
specialization of knowledge remains limited by the extent of the
market and most Nation-States are not large enough in population
and/or natural resources to specialize in everything. They can
no longer independently reproduce.
Fitness also implies flexibility in
the face of environmental change. Overspecialization can
be deadly if the environment changes suddenly, e.g.,
a bird flu epidemic halting international trade for months.
Fitness also suggests conservation or sustainability of
resources (and knowledge generally lost by outsourcing work to
other countries) to meet future unexpected, possibly Black Swan
events.
4.4 Links
Black, R., "Global warming 'confirmed' by independent study", BBC Online, October 20, 2011.
Brundtland, G.H., Report of the World Commission on Environment and Development: Our Common Future, United Nations, 1987.