The Competitiveness of Nations in a Global Knowledge-Based Economy
6.1.1 Personal & Tacit Knowledge
6.2 As Input
6.2.1 Codified & Tooled Capital
6.2.2 Personal & Tacit Labour
6.2.3 Toolable Natural Resources
6.3 As Output
6.3.1 The Person
6.3.2 The Code
6.3.3 The Tool
6.4 Reconciliation

Epithet
There are four classes of idols which beset men’s minds. To these for distinction’s sake I have assigned names—calling the first class Idols of the Tribe; the second, Idols of the Cave; the third, Idols of the Market-Place; the fourth, Idols of the Theatre.

Francis Bacon (1560–1626), Novum Organum, Aphorism 39 (1620).
* Index & Epithet not in published dissertation
6.0 KNOWLEDGE AS FORM
1. Whether generated by Science or Design (efficient cause), knowledge must assume form to exist (formal cause). In summary, I will argue that knowledge takes three forms: personal & tacit, codified and tooled knowledge. These become inputs to the economic process as codified & tooled capital, personal & tacit labour and toolable natural resources. Inputs, in turn, are transformed in production into final knowledge outputs as the Person, the Code and the Tool. Such outputs satisfy the elemental human need to know (material cause). In doing so, I defy Francis Bacon’s warning about erecting new “Idols of the Market-Place” (Bacon 1620). I do so, however, by throwing down lesser idols, ones currently the subject of debate concerning the knowledge-based economy.
1. Current public policy debate about the knowledge-based economy focuses primarily on two forms of knowledge: tacit and codified, with passing reference to ‘local’ knowledge, which I will subsume under tacit. In this debate, codified knowledge generally refers to knowledge that can be written down, recorded and easily transmitted to others, while tacit knowledge can be neither recorded nor easily transmitted. ‘Local’ knowledge refers to that which exists in a specific location, such as on the shop floor, and which cannot be recorded nor easily transferred to other locations, e.g., team efficiencies and economies (Cambrosio & Keating 1988, 244; Alder 1998).
2. Both codified and tacit knowledge are recognized as factors affecting the production function of the firm and nation-state (OECD 1996; Malhotra 2000; ANSI/GKEC 2001). Both, however, are subject to widely varying and very thin interpretations with significantly different policy implications (Cowan, David & Foray 2000, 212-213).
Indeed, references to ‘tacitness’ have become a platform used by some economists to launch fresh attacks upon national policies of public subsidization for R&D activities, and equally by other economists to construct novel rationales for governmental funding of science and engineering research and training programs. (Cowan, David & Foray 2000, 212-213)
3. I will now examine the nature and origin of both forms of knowledge – tacit and codified – and add to the mix tooled knowledge, by which I mean knowledge fixed in matter as
function. That knowledge can be embedded in matter as function is evidenced by the widespread industrial practice of reverse engineering.
6.1.1 Personal & Tacit Knowledge
1. The term ‘tacit knowledge’ derives from the work of chemist turned philosopher Michael Polanyi, especially his 1958 book, Personal Knowledge: Towards a Post-Critical Philosophy (M. Polanyi 1962a). It should be noted that the 1962 edition is referenced herein. This second edition was published in the same year as the first edition of Thomas Kuhn’s The Structure of Scientific Revolutions, which went through three editions (1962, 1970 and 1996). While Kuhn makes only one reference to Polanyi, Fuller argues that many of Polanyi’s insights were subsequently and inappropriately attributed to Kuhn. He concludes that “it is not hard to see that Kuhn owed more to Polanyi than the appreciative footnote to his magnum opus, Personal Knowledge, would suggest” (Fuller 2000, 140).
2. In fact, between 1950 and 1970, Polanyi published, in addition to his book, some 13 articles defining his philosophy of science and his theory of knowledge (1950, 1952, March 1957, August 1957, 1960-61, 1961, 1962, October 1962, 1965, 1967, 1968, 1969, 1970). Like Kuhn, Polanyi was concerned with defending Science from political and commercial interference. Ideologically, however, he went much further than Kuhn with “The Republic of Science” (1962b) making a direct public policy comparison between the scientific process and the perfectly competitive market of the Standard Model, i.e., let Science do it without interference from government.
3. Polanyi also strongly objected to the received philosophy of science, logical positivism/empiricism (LPE), on two grounds: reality and understanding. For LPE, reality was a metaphysical concept without epistemological content. Polanyi put it this way:
The modern ideal of science is to establish a precise mathematical relationship between the data without acknowledging that if such relationships are of interest to science, it is because they tell us that we have hit upon a feature of reality. My purpose is to bring back the idea of reality and place it at the centre of a theory of scientific enquiry. (M. Polanyi 1967, 177)
4. For Polanyi, “reality in nature is a thing that may yet manifest itself inexhaustibly, far beyond our present ken” (M. Polanyi 1967, 192). It is indeterminacy that defines reality, not certainty. For Polanyi this has significant scientific and philosophical implications:
Modern antimetaphysical philosophies, like pragmatism, operationalism, positivism, and logical positivism, have tried to spell out the implications of asserting a proposition to be true. But if the truth of a proposition lies in its
bearing on reality, which makes its implications indeterminate, then such efforts are foredoomed. They have in fact failed, and must fail, for the indeterminate cannot be spelt out without making it determinate. It can be known in its indeterminate condition only tacitly, by those tacit powers by which we know more than we can tell. (M. Polanyi Oct. 1962, 612)
5. Accordingly, under LPE there can also be no understanding. For Polanyi, however,
understanding is taken to include the kind of practical comprehension which is achieved in the successful performance of a skill. This being allowed for, understanding may be recognized as the faculty, cast aside by a positivistic theory of knowledge, which the theory of tacit knowing acknowledges as the central act of knowing. In this sense the practice of skills, the diagnosing of physiognomies, the performance of tests, the use of tools and probes, and the meaningful uttering of denotative words, are so many acts of understanding complex entities. (M. Polanyi Oct. 1962, 605)
6. Polanyi’s epistemology is explicitly rooted in gestalt psychology (M. Polanyi Oct. 1962, 605). It is also comprehensive in that “the theory of tacit knowing establishes a continuous transition from the natural sciences to the study of the humanities” (M. Polanyi Oct. 1962, 606). He even proposes a “tacit coefficient” to measure this transition (M. Polanyi Oct. 1962, 605). This led, however, to criticism of tacit knowledge “as psychological, not logical, in character” (M. Polanyi Oct. 1962, 612). This highlights ongoing tension between philosophy and psychology as the preferred path to human knowledge and understanding. As previously noted, Kuhn began with cognitive psychology but this was considered ‘metaphysical’ by his critics and he retreated to sociology; Polanyi did not retreat.
7. Three central concepts define and delineate Polanyi’s epistemology: subsidiary/focal knowledge, indwelling and displacement. First, according to Polanyi, we know in an integrated stereoscopic manner invoking a combination of subsidiary and focal knowledge. Thus we know “subsidiarily the particulars of a comprehensive whole when attending focally to the whole which they constitute” (M. Polanyi Oct. 1962, 601). It is subsidiary knowing that is called “tacit, so far as we cannot tell what the particulars are, on the awareness of which we rely for attending to the entity comprising them” (M. Polanyi Oct. 1962, 601). In fact, to the degree that we focus on the whole, its parts cannot be known at the same time in themselves. In very gestalt fashion, Polanyi concludes:
We may call the bearing which a particular has on the comprehensive entity to which it contributes its meaning, and can then say that when we focus our attention wholly on a particular, we destroy its meaning. (M. Polanyi Oct.1962, 601)
Polanyi’s focal/subsidiary knowledge can be relationally expressed in aesthetics as figure/ground or melody/note, in Grene’s biology as invariant/affordance and, for my purposes, as Science by
Design. Arguably, Polanyi would include all these examples, including my own, as “variants of the same organismic process” (M. Polanyi Oct. 1962, 610).
8. Critically, Polanyi concludes it is
appropriate to extend the meaning of “tacit knowing” to include the integration of subsidiary to focal knowing. The structure of tacit knowing is then the structure of this integrative process, and … we shall say that, ultimately, all knowledge has the structure of tacit knowledge. (M. Polanyi Oct. 1962, 602)
9. The integrative or constructionist power of tacit knowing, as defined by Polanyi, is also apparent, as previously noted, with respect to the subsidiary or background role played by ideology and technology in our daily lives. If technology cum Heidegger (1955) tacitly enframes and enables us as physical beings within a human built environment then ideology (inclusive of religion) tacitly enframes and enables us as mental beings within a network of local, regional, national and global communities of ideas. It is this enframing and enabling of minds within systems of ideas that forms, in part at least, the noösphere.
10. Second, according to Polanyi, the ultimate in tacit knowledge is the human body. Everything we do in, and know of, the world is through our bodies – seeing, hearing, touching, tasting, smelling. The body, however, is normally transparent to the mind in its doings and knowings. This transparency Polanyi calls “indwelling”.
Tacit knowing … appears as an act of indwelling by which we gain access to a new meaning. When exercising a skill we literally dwell in the innumerable muscular acts which contribute to its purpose, a purpose which constitutes their joint meaning. Therefore, since all understanding is tacit knowing, all understanding is achieved by indwelling. (M. Polanyi Oct. 1962, 606)
11. Indwelling characterizes not just physical performance but also aesthetic distancing and ‘objective’ scientific observation. Polanyi concludes that “it bridges the gap between the I-It and the I-Thou, by rooting them both in the subject’s I-Me awareness of his own body, which represents the highest degree of indwelling” (M. Polanyi Oct. 1962, 606).
12. Third, indwelling has a powerful corollary that Polanyi uses to treat experimental instrumental science: displacement. And it is here that Polanyi meets Heidegger. A characteristic of human being is displacement of sensation from point of contact to distant source. Thus, in the use of a hand tool such as a hammer: “the impact that their handle makes on our hands and fingers is not felt in itself at the place where it happens, but as an impact of our instrument where it hits its object” (M. Polanyi Oct. 1962, 607). This displacement allows humans to indwell in their tools and technology in what I call existential phenomenology. I will have more to say about this below.
13. Conspicuous by its absence in all of Polanyi’s epistemology, however, is any reference to codified knowledge. He treats language, but only as an example of tacit knowing. Fixation of semiotic code into material form does not arise anywhere in his work. The opposition, if any, in this very dyadic relationship is between focal and subsidiary knowledge, not between tacit and codified.
14. Equally conspicuous by its absence is the term ‘personal’ in discussion of ‘tacit knowledge’ in the current debate. It is clear from Polanyi’s usage that tacit knowledge is ‘personal knowledge’. Put another way, personal knowledge is living knowledge, knowledge fixed in an individual natural person. Whence it comes – demonstration, experience, experimentation, intuition or reading – does not change its personal nature. As will be demonstrated, other forms of knowledge – codified and tooled – take on meaning or function only when mediated by a natural person. I therefore insist upon the phrase ‘personal & tacit knowledge’ to highlight its root in the natural person. If, from time to time, I slip and use ‘tacit’ alone, I ask the reader to implicitly add ‘personal’. This slippage is, as I will argue below, reflective of an ideological bias of the Standard Model of economics that I call ‘capitalization of labour’.
15. The question remains, however: what physical form does personal & tacit knowledge take? In fact, it comes in two distinct forms. The first is the matrix of neurons that fixes memories (knowledge) as part of one’s voluntary wetware, i.e., that part of the nervous system subject to conscious control, specifically to recall. Memories can usually be described and codified, i.e., spoken and transcribed into language or drawn as a picture.
16. The second is reflexes (part of one’s involuntary wetware), composed of “the connected set of nerves concerned in the production of a reflex action” (OED, reflex, n, 6 b). Reflexes refer to the memory of our limbs and digits of how to do something, e.g., ride a bicycle. Etymologically it is relevant that the word ‘reflex’ derives from ‘reflect’ in the sense of ‘to remember’. Knowledge is fixed in one’s body parts and nervous system. This may be the fine practiced motor skills of a brain surgeon or those of a professional bricklayer. What they share is that such knowledge is tacit, i.e., not subject to articulation and codification – spoken, transcribed or drawn. It can, however, sometimes be transferred through demonstration, repetition and practice.
17. Ultimately, all knowledge is personal & tacit. A Code or a Tool always leads back to a Person acting as agent to decode or activate it. Personal & tacit knowledge is also one-dimensional, a monad: it is known by only one mind. It is the sum of what an individual knows. If one is what one knows then personal & tacit knowledge is the definition of the individual
human being. And, only the individual can ‘know’. Books and computers do not know that they know, nor does any other species on this planet. Companies, corporations and governments or, in Common Law, ‘legal persons’ cannot know (Graf 1957). Only the solitary flesh and blood ‘natural person’ can know.
6.1.2 Codified Knowledge
1. Codified knowledge, as a term, does not have a seminal authorial source. Rather it refers to the encoding of knowledge in written language, symbols (including mathematical symbols), sounds or pictures. In effect, the knowledge of one person is fixed in a communications medium and subsequently – distant in Time and/or Space – is decoded and assimilated by another human mind into personal & tacit knowledge. It is thus extra-somatic knowledge (Sagan 1972) acting as “a completely new ‘genetic’ system dependent on cultural transmission” (Waddington 1960, 149).
2. Codified knowledge is semiotic in nature, i.e., conveyed in signs and symbols. As noted by Husserl, writing, and codified knowledge in general, “makes communications possible without immediate or mediate personal address; it is, so to speak, communication become virtual. Through this, the communalization of man is lifted to a new level” (quoted in Ihde 1991, 46). In this sense, codified knowledge is two-dimensional, engaging two minds – the author and a distant reader/receiver. Such knowledge, however, begins and ends as personal & tacit knowledge within the human mind.
3. There are four qualifications to this definition. First, speech or oral language is codified knowledge. Oral or pre-literate cultures, as will be seen, create and maintain knowledge through the mnemonics of chant, incantation, poetry and fable. In general, however, I ignore spoken language unless fixed in material form, i.e., recorded in a material matrix – written or otherwise.
4. Second, codified knowledge is restricted to ‘human-readable’ (analogue) code. I therefore exclude machine-readable (digital) code including machine/molecule-readable versions of the genomic ‘code of life’ or the “autobiography of a species” (Ridley 1999). The distinction is between semiotic or symbolic knowledge communicated from one human mind to another versus the operating instructions of a machine or a molecule. As will be seen, machine/molecule-readable code is a form of ‘soft-tooled’ knowledge.
5. Third, my focus is primarily on the matrix or communications medium into which knowledge is fixed rather than its content. In this sense, to paraphrase McLuhan, ‘the matrix is the message.’ Beginning with the telegraph then the photograph, telephone, sound and video
recording, the number and form of communication media have exploded since the 19th century and continue to do so. In effect, there has been an avalanche of speciation of communication technologies.
6. Fourth, codified knowledge is both an intermediate good, i.e., an input to the production process, and a final consumption good, e.g., books, magazines, motion pictures, reports and sound recordings. Machine-readable code is always an intermediate and never a final output of the knowledge-based economy. Human-readable code can be either.
7. In effect, codified knowledge consists of four overlapping levels. The first is the personal knowledge of the author. The second is the semiotic code itself – alphabet, icons, pictographs, pictures, sounds, etc. – into which knowledge is translated. The third is the material matrix or communications medium into which knowledge is fixed and then transmitted. The fourth is the personal & tacit knowledge reconstituted by the decoding human mind – the reader/receiver. Each has epistemological and legal implications. In codified knowledge ideology meets technology. Code structures ideas and their expression while the matrix structures how we send and receive knowledge from others. I will now consider some of the epistemic implications of Code and the changing nature of its matrix as revealed by three sets of authors.
6.1.2.1 Innis, McLuhan & Réalisme fantastique
1. Through his study of communications, Harold Innis identified a fundamental relationship between a culture and its dominant communications matrix (Innis 1950, 1951). According to Innis, a civilization is limited in space but extensive in time, i.e., it has duration, to the extent its matrix is durable, e.g., stone, clay or parchment. Alternatively, it is extensive in space but limited in time to the extent its communications matrix is non-durable but easily transported, e.g., papyrus and paper. Using this hypothesis Innis explained the rise and fall of empires. Five examples will demonstrate Innis’ style of inductive analysis.
2. First, acid-based paper is cheap and lightweight and has been used for about 150 years. This corresponds roughly to the 19th century European colonial expansion. Books, newspapers, periodicals and other records fixed in this medium are, however, now disintegrating in libraries and archives around the world (The Economist, February 27, 1987: B-1). Meanwhile, parchment and vellum from the thirteenth century have not ‘self-destructed’. This implies that the European colonial empires would be short-lived in Time because the dominant communications medium was light and easily transportable. In fact, the British Empire ‘on which the Sun never
set’ was, in historical terms, the most extensive in space, but also one of the shortest empires in duration.
3. Second, the dominant communications medium today, leaving aside for the moment the Internet, is television, which spans the world in an instant, i.e., it is extensive in Space. Television takes the average citizen around the world to spaces and places of which his ancestors never knew. A question, however, has arisen about television’s impact on attention span. Some argue that children do not read as well as before because attention span has been reduced by TV, i.e., the medium, while extensive in Space, has reduced the psychological duration of Time.
4. Third, new communications technologies have arguably made the entertainment industry the largest sector of final demand in the knowledge-based economy. But this industry is peculiar in a number of ways:
(a) its hardware – including direct broadcast satellites, fiber optics, magnetic recording technologies and the compact disc player – is based upon aluminum, silicon and iron oxide, i.e., stone, that, theoretically, should endure more than a century. On the other hand, content, such as television programs, circles the globe in an instant;
(b) production of the medium is separated from production of the message. Thus “home entertainment” hardware is dominated by Asian producers while programming is dominated by the American entertainment industry, i.e., Hollywood. This international division of labour – medium from message – suggests a new culture unlike any in history, i.e., a global culture; and,
(c) like previous communications revolutions, e.g., the printing press, the new communications media are being accompanied by a breakdown of old ways of communicating, e.g., declining literacy, and a heightened sense of societal “dis-ease”.
5. Fourth, behind the scenes lurks a new nervous system, a new communications matrix, encircling planet Earth – the World Wide Web, the WWW or ‘the Web’. In less than a decade, the Web has affected business, economics, education, entertainment, health care, information, news and the nature of work itself. Among the many characteristics of the Web, consider three:
(a) the Web is economically bifurcated into intermediate and final knowledge goods and services. For example, the ‘consumer’ Internet is paralleled by the ‘B2B’ or the business-to-business Internet that globally links producers and suppliers with significant cost reductions for firms. In effect, it has reduced transactions costs of doing business (Coase 1992) and shifted the borderline between the firm and the marketplace, e.g., outsourcing;
(b) mechanical and electronic devices are increasingly being plugged into the Web. From automobiles, ships, trucks and trains to home air conditioning, computers, heating, lighting and security systems to microwave ovens, refrigerators, toasters, toilets and TV sets, all are now being attached to the Web, permitting two-way communication not just between people but also between machines. The Web therefore carries both human-readable and machine-readable code; and,
(c) creation, distribution and duplication costs on the WWW approach zero, raising questions about copyright and facilitating new forms of authorship, as noted by copyright lawyer David Nimmer (1998, 521-522).
6. Fifth, and finally, contemporary recording technologies provide artists, celebrities and ‘historic’ events with something that only literary and visual artists enjoyed in the past – life after death. This is a life not as a ghost on another plane, but as a shadow on the silver screen. There may never again be a Richard Burton, but his image, his voice, his body language and his performance will now endure like the plays of Shakespeare, part of our social genetic, the extra-somatic knowledge that is the stuff of culture. This link with the past, or re-ligio, distinguishes the Arts from other knowledge domains. In the natural & engineering sciences, for example, Kuhn observed that normal science has, in effect, no history at all. Everything, so to speak, is ‘new’ (Kuhn 1996). In the Arts, however, the images and words of cultures and civilizations long buried by the sands of time enrich and inspire contemporary creators (Boulding July 1986).
7. Innis’ colleague, Marshall McLuhan, extended the linkage between medium of communication and duration of civilization with his aphorism “The Medium is the Message”. McLuhan recognized that the material matrix affects both reception of the message and the fabric of society itself. From the hot, focused matrix of the printing press with its linear phonetic alphabet (the first engine of mass production) to the cool, passive medium of television with its cascade of images and sounds, McLuhan believed a major transformation in consciousness, of knowing, was underway: “the transition to the electronic phase of simultaneous or acoustic man” (McLuhan 1978).
8. A similar conclusion was reached in France by Pauwels and Bergier with their 1960 publication of Le Matin des magiciens (The Morning of the Magicians). This text began a new strain in French philosophy, or rather metaphysics, called réalisme fantastique (fantastic realism). Its sense is: “Can you believe what some people really believe!” Beginning with what the Nazis did, Pauwels and Bergier asked what they thought they were doing. The answer tore at the roots of European rationalism. The earth is not round, nor is it flat, but rather it is a
crucible with the Chinese and Europeans held to their respective sides of the planet by solar radiation. The earth has had many moons and with each a great race arose, but in the dark between moons inferior races were spawned, the people of Zog. How could the leadership of arguably the most advanced nation in Europe believe such things in the twentieth century? The alarming answer, of course, is that they did! The need to know can be satisfied in many ways, not all of them rational, and in the competitiveness of nations in a global knowledge-based economy one must never forget this fact of life.
9. With respect to the world of the 1960s, Pauwels and Bergier concluded that exposure of children to vastly expanded audio-visual examples of life role models and opportunities coded in motion pictures, radio and television programs would engender a psychic mutation reminiscent of McLuhan’s “simultaneous or acoustic man”. Their final chapter is in fact entitled Rêverie sur les mutants, or dreams about the mutants (Pauwels & Bergier 1960, 607).
10. A ‘simultaneous or acoustic’ mind is not the focused linear consciousness of the previous literate or textual mind. Where the literate mind acts like the eye focusing on detail, the acoustic mind is an ambient consciousness awash in images and sounds and aware of context, gestalt and pattern. The addition of moving images and recorded sounds to the stock of codified knowledge has brought, according to McLuhan, pattern recognition to the foreground of contemporary consciousness. This was recently highlighted in Congressional testimony by the U.S. Defense Secretary, Donald Rumsfeld, when he contrasted the psychological effect of seeing pictures of prisoner abuse at Iraq’s Abu Ghraib prison in May 2004 with that of reading the file in January (Rumsfeld 2004).
11. Three examples demonstrate the ‘iconic’ effects of the new communications matrix and its cybercode. First is invention of the computer icon, the window and the mouse by scientists at Xerox PARC in the 1970s. The shift from text to graphics in Western culture arguably began with Xerox’s computer ‘icon’. They also introduced ‘WYSIWYG’ as the standard, i.e., what you see on the computer screen is what you get out of the printer. A user interacts more effectively through gestalt-like icons than through the temporal linearity of text. While Xerox failed to exploit its inventions, they were picked up first by Apple Computer and then by Microsoft. Second is the transition in the early 1990s from the Internet (text-based) to the World-Wide Web (graphic-based) with the first graphic ‘browser’.
In a mere decade, strands of ‘The Web’ have been spun out from a handful of obscure physics labs into seven million Web sites and tens of millions of workplaces and homes around the world. It has catapulted the high-technology industry to unimagined heights, given meteoric rise to electronic commerce, revolutionized research, and made phrases such as ‘download’
and ‘home page’ part of everyday conversation. (Ottawa Citizen, December 24, 2000)
12. Third was the launch of Windows 95 with its shift from text-based DOS to a graphics interface. Home and office computing took off. In short order, Microsoft became one of the largest business enterprises in the world and Bill Gates, the world’s richest man. It must be noted, however, that there has been a ‘Kuhnian loss’ in transition (Fuller 1992, 272). Much of the new media does not require literacy and, with the cultural shift to acoustic space, there has been a decline in attention span and literacy, as noted above.
6.1.2.2 Thomas Shales & the Re-Decade
1. Perhaps the most succinct statement of the impact of new forms of codified knowledge is by cultural critic Thomas Shales in his 1986 Esquire article “The ReDecade”. Through the new recording technologies, especially video, consumers now have nearly universal visual access to the styles and tastes of all historic periods, as presented on television and in motion pictures. Does one want to watch gangster movies or musicals of the 1930s, or witness the French Revolution or Moses on the mountain? Does one want to replay it, time after time, or erase it to capture images and sounds of another Time and Space?
2. This access to the fashions and styles of all historic periods produced what Shales called the ReDecade, a decade without a distinctive style of its own, a decade characterized by the pervasive stylistic presence of all previous periods of history. The impact of this phenomenon on consumer behavior is, at least in the short term, confusion and disorientation. Time has become a significant dimension of consumer behavior, and, more importantly, of self-consciousness. As noted by Shales:
It does seem obvious that here in the ReDecade ... the possibilities for becoming disoriented in time are greater than they have ever been before. And there's another thing that's greater than it has ever been before: accessibility of our former selves, of moving pictures of us and the world as we and it were five, ten, fifteen years ago. No citizens of any other century have ever been provided so many views of themselves as individuals or as a society. (Shales 1986, 72)
3. As a prequel, art critic Robert Hughes, in his book and television program The Shock of the New (1981), pointed out that since the turn of the twentieth century modern abstract art has been increasingly concerned with the fourth dimension, Time, in contrast with the traditional dimensions of Space and perspective. Thus abstract painting may be viewed as a precursor to the increasing disorientation in Time characteristic of the ReDecade.
4. It is not yet clear what will be the long term impact of the ReDecade on consumer behavior. It is likely, however, that there will be a growing market for historic fashions, period piece furniture and reproductions as well as other cultural durables from all historical human cultures. In effect, Shales’ ReDecade is an “overlapping temporal gestalten” (Emery & Trist 1972, 24), i.e., the Present is an amalgam of anachronisms of the Past, and, given the contemporary prominence of science fiction, of the Future. Durable goods, however, constitute a different form of knowledge – tooled knowledge to be discussed below.
6.1.2.3 William Gibson & Cybercode
1. Two years before Thomas Shales published “ReDecade”, William Gibson (1984) published his first novel, Neuromancer. This Hugo Award-winning science fiction novel changed the way the computer/communications industry saw itself and the way the public saw the industry. This was eleven years before Windows 95. Using a manual typewriter Gibson coined the term ‘cyberspace’ and created a prescient vision of what would become the Web. With his text Gibson created a new literary genre: ‘cybergothic’. For my purposes, however, he defined ‘cybercode’ – including text, graphic icons, sounds and moving holographic images that have been, or shortly will be, codified on the Web. Gibson extended this vision with Count Zero (1986), Mona Lisa Overdrive (1988) and Virtual Light (1993). His most recent novel, however, is Pattern Recognition (2003). Set in the present, it is not science fiction but rather a novel about contemporary ‘cool hunting’, mysterious ‘footage’ on the Web and the global design and marketing industry. Again, Gibson addresses Code, but this time fashion code.
2. My subsequent text will be punctuated with references to Gibson’s vision of cybercode and his futuristic projections about intellectual property. In the global knowledge-based economy Gibson’s vision is film noir, not documentary. Nonetheless, it portends a possible future. From a Canadian perspective, I see Gibson as the new McLuhan of codified knowledge. For now, however, I turn to tooled knowledge, i.e., knowledge fixed as function in a material extra-somatic matrix.
6.1.3 Tooled Knowledge
1. The term ‘tooled knowledge’ is not currently part of the debate about the knowledge-based economy. The term itself appears in Joseph Schumpeter’s classic History of Economic Analysis, wherein he refers to economics as “a recognized field of tooled knowledge” (Schumpeter 1954, 143). It is in this sense that a former professor, Gilles Paquet, called economists tool-bearing animals with their heads as toolboxes. My usage, however, will be
quite different. I will be dealing not with the manipulation of ideas (ideology) but rather with knowledge tooled into matter, knowledge embodied as physical functioning things (technology).
2. My usage will also be different from that in the philosophy of technology, including Baird’s ‘thing knowledge’ (Baird 2004) and Ihde’s ‘instrumental realism’ (Ihde 1991). My focus is economics, i.e., satisfying the unlimited human want, need and desire to know with limited means. Its objective lens is the final consumer, not the scientist, technologist or instrument-maker. In common with the philosophy of technology, however, tooled knowledge is three-dimensional, connecting one mind to another through the hands, e.g., through reverse engineering. This is in keeping with Aldrich’s observation that “technological intelligence does not come to rest in the eye or the ear. Its consummation is in the hand” (Aldrich 1969, 382).
3. I restrict myself to works of technological rather than aesthetic intelligence because aesthetic works are semiotic or symbolic in nature, intended to be decoded by another human mind. Tooled knowledge, on the other hand, is functional, taking two related forms: ‘hard-tooled’ and ‘soft-tooled’. Hard-tooled knowledge breaks out into three types: sensors, tools and toys. Soft-tooled breaks out into four: computer and genomic code, mathematics, standards and techniques.
4. Before describing each I need to expand on the relation between knowledge and technology. I draw my introduction from Michael Polanyi and my main argument from Heidegger. For Polanyi, tools, and technology in general, are extensions of our bodies “forming part of ourselves, the operating persons. We pour ourselves into them and assimilate them as parts of our own existence” (M. Polanyi 1962a, 59).
5. For Heidegger, technology (or tooled knowledge) is how the human ecology is enframed and its members enabled. As a species, we order things in our environment into standby mode as a ‘standing-reserve’ (Heidegger 1955, 17) awaiting activation – the furnace, TV, computer, car, train, airport, etc. Whatever is part of this standing-reserve “no longer stands over against us as object” (Heidegger 1955, 17). It is no longer ‘other’. It becomes an existential phenomenological extension of human being. Like ideology, technology becomes subsidiary or tacit, present but in the background, out of focal consciousness, yet ready at hand. And, like Polanyi in 1958, Heidegger used the hammer as the quintessential example, but some thirty years earlier, in his Sein und Zeit (Being & Time) of 1927.
6. Heidegger’s interpretation of Aristotelian causality is also radically different from the conventional deriving from Latin translation rather than Greek original. Thus ‘cause’ derives from the Latin verb cadere, “to fall,” meaning “that which brings it about that something falls
out as a result in such and such a way” (Heidegger 1955, 7). The original Greek, aition, however, means “that to which something else is indebted” (Heidegger 1955, 7). For Heidegger the four causes – material, formal, efficient and final - are “all belonging at once to each other, of being responsible for something else” (Heidegger 1955, 7). This is similar to Kant’s view of living things in which “each part is reciprocally means and end to every other” (Grene & Depew 2004, 94). In biology, the “something else” is a living entity with natural purpose; in the case of technology, it is an artifact imbued with human purpose.
7. Similarly, Heidegger interprets final cause, or telos, in an unconventional manner. Usually translated as ‘aim’ or ‘purpose’, he argues telos originally meant in the Greek that “which gives bounds, that which completes” (Heidegger 1955, 7). For Heidegger, technology thus represents “modes of occasioning” in which all four causes are at play “bringing-forth” something – natural or human-made – to completion (Heidegger 1955, 10). And it is “through bringing-forth, [that] the growing things of nature as well as whatever is completed through the crafts and the arts come at any given time to their appearance” (Heidegger 1955, 10). This is reminiscent of Kauffman’s ‘adjacent possible’ (2000). I now turn to the hard and soft appearance of tooled knowledge.
6.1.3.1 Hard-Tooled Knowledge
1. By ‘hard’ I mean tooled knowledge as a physical artifact designed to:
· monitor the world of matter and energy (a sensor); or
· manipulate, shape or animate matter and energy (a tool or toy).
2. In summary, the purpose of sensors is measurement; the purpose of tools is manipulation; and, the purpose of toys is pleasure. Sensors and tools are located on the production-side of the economic equation; toys, on the consumption-side. Sensors and tools are utilitarian, i.e., they serve a higher purpose; toys are non-utilitarian, i.e., they have no purpose other than themselves. Collectively, sensors, tools and toys constitute ‘instruments’.
3. Another distinction must be made between ‘wetware’ and ‘dryware’. Living things can, using genomics or traditional cross-breeding, be designed to serve a utilitarian purpose, e.g., gene therapy (BBC News April 2002), or a non-utilitarian one, e.g., genetically engineered fish that glow in the dark (Shaikh 2002). These constitute wetware, i.e., ‘living’ tooled knowledge. Traditional instruments are constructed out of inanimate matter, usually minerals, and constitute dryware. Both, however, are hard-tooled knowledge. Using this distinction, plastics are a cross-over, i.e., they are organically-based but generally derived from non-living sources, e.g.,
petroleum. The borderline between wetware and dryware is becoming increasingly obscure as the sciences of genomics, proteomics and nanotechnology mature. Thus, in theory, the genetic code used by marine organisms to produce biosilicate shells may eventually be used to make silicon chips for computers.
4. The three – sensors, tools and toys – can, from time to time, be one and the same. For example, a sensor may be active or passive. An active sensor monitors changes in nature by initiating such changes, e.g., a synchrotron or subatomic particle accelerator. Thereby the sensor becomes a tool. Furthermore, to the degree that normal science involves puzzle solving (Kuhn 1996, 35-42) then scientific instruments can, with no disrespect, be considered playthings or toys of scientists. Play-like behaviour is a generally recognized characteristic of creativity in all knowledge domains. In this regard, the search for knowledge-for-knowledge-sake is non-utilitarian, i.e., it has no objective other than itself. To this extent, all scientific instruments can be considered toys. In effect, scientific instruments are designed to produce new knowledge which, to the scientist, is like the pleasure of a toy. As will be seen, this relates to the subordination of Sensation to Reason as in Timothy Findley’s “intellectual priapism” (Findley 1999, 258).
5. Similarly, new scientific instruments – the foundation of experimental research – may subsequently become industrial tools, e.g., the scanning electron microscope, ion implantation and the synchrotron (Brooks 1994, 480). They may also become toys intended for amusement or entertainment, e.g., the cathode display tube developed to monitor laboratory experiments became a standardized tool of science and industry and then the television set in the family room.
6.1.3.1.1 Sensors
1. As a sensor or ‘probe’ (M. Polanyi 1962a, 55), tooled knowledge extends human touch, taste, sight, hearing and smell. It monitors the world of matter and energy above (macroscopic), at (mesoscopic), or below (microscopic) the threshold of our natural senses. The resulting ‘readings’, when organized, structured and systematized, become codified knowledge that can be shared as a statement of objective, empirical fact.
2. To the degree they measure above and below the threshold of our natural senses, scientific instruments realize a Platonic ideal: “belief in a realm of entities, access to which requires mental powers that transcend sense perception” (Fuller 2000, 69). Furthermore, the
‘language’ of sensors realizes the Pythagorean ideal by reporting Nature by the numbers. My term ‘sensor’ corresponds to Baird’s ‘measuring instruments’ (Baird 2004).
3. The effects of sensors can be profound, for example: “the idea of a world governed by precise mathematical laws was transmitted… through Galileo’s and Huygen’s conversion of the mechanical clock into an instrument of precision” (Layton 1974, 36). Or, consider the impact on our “image” of the world (Boulding 1956) of Galileo’s innovative use of the telescope resulting in “artificial revelation” (Price 1984, 9).
4. To the degree that the natural sciences are about acquiring knowledge of the physical world, all scientific instruments are sensors, i.e., their primary purpose is to monitor, not manipulate. That scientific instruments embody knowledge is noted by Shapin who reports:
much empirical work has addressed the embodied nature of scientific know-how and the embodied vectors by which it travels, whether that embodiment is reposed in skilled people, in scientific instruments, or in the transactions between people and knowledge-making devices. (Shapin 1995, 306)
With respect to the latter, he notes the emergence of new non-human actors including cyborgs – part human and part machine (Shapin 1995, 313).
5. The history, philosophy and sociology of science are replete with allusions to scientific instruments. Experimental science was, is now, and probably always will be, rooted in such tooled knowledge (Price 1984). For example, CERN’s Large Hadron Collider will begin operation in 2006 while the recently upgraded Fermi National Accelerator Lab’s “Tevatron” is already sensing Nature at levels beyond the sensitivity of previous instruments. The ‘Canadian Light Source’ synchrotron at the University of Saskatchewan is an example of increasingly common sensor/tool crossovers serving both research science and industry. These are ‘Big Science’. The size and complexity of such instruments, the range and diversity of knowledge embodied and the costs associated with their design, construction and operation may limit future revolutions in physics (Fuller 1992, 252). Without doubt, they impose a strong path dependency on the road to future knowledge (Rosenberg 1994, 1-6).
6. It has also been argued that new sub-disciplines, i.e., new categories of knowledge, within the natural sciences and related technological disciplines emerge in response to new instruments (Price 1984). This conclusion is reinforced by Rosenberg’s findings about their interdisciplinary impact in bringing together scientists from different disciplines and thereby mitigating incommensurability (Rosenberg 1994, 156).
7. Beyond the knowledge embodied in scientific sensors and the new knowledge they produce, their epistemological importance lies in providing consistent, objective evidence about the state of the physical world. Such evidence is objective in the sense that collection is not mediated by a human subject. Instruments extend the human senses beyond the subjectivity of the individual observer. Once calibrated and set in motion, a clock – atomic or otherwise – ticks at a constant rate until its energy source is exhausted. Again, such measurement is ideally achieved without mediation by a human subject.
8. In this regard it is important to note that sensors pattern the modern way of life. The simple household thermometer is an example. It tells us when we have a fever and when to seek medical intervention. In turn, a medical thermometer is used to monitor the progress of such intervention (Shapin 1995, 306-307). Put another way:
By encapsulating knowledge in our measuring instruments, these methods minimize the role of human reflection in judgment. They offer a kind of “push-button objectivity” where we trust a device and not human judgment. How many people check their arithmetic calculations with an electronic calculator?... Putting our faith in “the objectivity” of machines instead of human analysis and judgment has ramifications far and wide. It is a qualitatively different experience to give birth with an array of electronic monitors. It is a qualitatively different experience to teach when student evaluations – “customer satisfaction survey instruments” - are used to evaluate one’s teaching. It is a qualitatively different experience to make steel “by the numbers,” the numbers being provided by analytical instrumentation. (Baird 2004, 19)
6.1.3.1.2 Tools
1. If sensors extend the human senses then tools extend the human grasp. Tools are designed with human purpose. They have an in-built aim, i.e., they are teleological (Layton 1988, 90-91). We thus recognize a tool by its purpose (M. Polanyi 1962a, 56). Put another way, a tool is created when “a function couples purpose with the crafting of a phenomenon. A function is a purposeful phenomenon” (Baird 2004, 123).
2. The teleological nature of tooled knowledge is, in a sense, atavistic, an epistemological throwback to medieval animism, i.e., when objects and natural phenomena were believed possessed of soul. This was generally displaced by mechanistic causality, the episteme of the first Scientific Revolution of the mid-17th century (Foucault 1973). This provided a “description of reality in terms of a world of precision, free of all considerations based upon value-concepts, such as perfection, harmony, meaning, and aim” (Layton 1988, 90). While this displacement is appropriate in the geosphere, it is inappropriate in the world of human-made things where “the sciences of the artificial” rule (Herbert Simon quoted in Layton 1988, 91).
3. Purpose is inherent in a tool. It is designed to do a job; it is not valued in-and-of-itself like a work of art, but rather for what and how well it can do that job. The knowledge to make a tool becomes embedded in it. It becomes tooled knowledge. If intended to do a job in the weightlessness of outer space then its shape, size and tolerances will be different than if designed to do the same job under terrestrial gravity or the enormous pressures of the ocean’s depths. Put another way: “Material agency is revealed in our mechanical contrivances… Much as we control concepts through the exercise of our literary skills, we control material agency through the exercise of our making skills” (Baird 2004, 47).
4. Tools are located on the production-side of the economic equation. They are intermediate goods used to produce final goods and services purchased by consumers (excepting the handyman). They are utilitarian - valued for what they can do, not for what they are in-and-of-themselves.
5. A final distinction must be drawn between specific purpose and general purpose tools, or what David calls ‘general purpose engines’ (David 1990). A specific purpose tool has but one purpose, e.g., a hammer or a drill press. A general purpose tool has multiple applications which “give rise to network externality effects of various kinds, and so make issues of compatibility standardization important for business strategy and public policy” (David 1990, 356). Modern general purpose tools also generate “techno-economic regimes” involving a web of related installations and services. Such is the case, for example, with the internal combustion engine. When embodied in an automobile it requires manufacturing plants, refineries, service stations, parking lots, car dealerships, roads, insurance and so on. In temporal succession, general purpose tools include the printing press, steam engine, electric dynamo, internal combustion engine, radio-television, the computer and genomics.
6. Such techno-economic regimes display path dependency. Specifically, once introduced, all subsequent additions, changes and/or improvements to a general purpose tool must conform to existing standards. The example of 110-volt versus 220-volt electricity used in North America and Europe, respectively, is a case in point. Any electric appliance – new or old – must be tooled to operate at the appropriate voltage; otherwise it will not function.
6.1.3.1.3 Toys
1. If sensors are for measuring and tools are for manipulating then toys are for pleasure. Sensors and tools are located on the production-side of the economic equation. They serve as inputs in the production of final goods and services. In the case of sensors, monitoring
information may be used either as an input to the production of knowledge or the production of other goods and services. Toys are final goods and services. They are appreciated for their own sake, not for any contribution to the production of other things. In this sense, toys are non-utilitarian, pleasure-giving devices. This includes the pleasure of learning, i.e., knowledge as a final consumption good. It also includes the aesthetic experience of works of art – physical artifacts appreciated for their own sake that embody the knowledge of the artist in making an artwork ‘work’. I am, however, compelled to use the word ‘toy’ because there is no other word in English denoting a work valued in-and-of-itself with no other purpose or utilitarian value. One plays with a toy; one works with a tool.
2. If, cum Bentham, pleasure is the only objective of life then tooled knowledge, like personal & tacit and codified knowledge, reflects the full spectrum of human wants, needs and desires subject to cultural, legal and financial constraints. Aesthetic distancing, morality and scientific objectivity are not epistemological constraints in economics. As toys, tooled knowledge has extended the human playpen to the globe and beyond; it has extended our sense of time and place beyond the dreams of previous generations. In this sense, it is ‘natural’ that one of the first adopters of new computer/communications technologies such as the video recorder and the WWW was, is and will be the sex or ‘XXX’ industry.
6.1.3.2 Soft-Tooled Knowledge
1. An instrument, as a physical artifact, must be activated by a human operator if it is to fulfill its function. Operation of an instrument – sensor, tool or toy – is generally associated with tacit and/or codified knowledge in the form of computer and genomic programs, mathematics, standards and techniques. In summary, computer programs are machine-readable code used to operate instruments – sensors, tools and toys. Genomic programs are molecular/machine code read by machines to analyze and/or synthesize biological compounds and living organisms (Hood 2002). Standards are codified knowledge physically designed into an instrument, defining its operational properties, e.g., a 110 or 220 volt electric razor. Mathematics is the language in which such standards are usually set and in which most instruments are calibrated. Techniques are the personal & tacit knowledge required if use and application of an instrument is to attain the intended purpose.
2. Soft-tooled knowledge is tied to hardware. In effect, without the other, software has no purpose and hardware has no function. Soft-tooled knowledge exists on both sides of the economic equation – consumption and production.
6.1.3.2.1 Computer & Genomic Programs
1. The purpose of tooled knowledge is manipulation of the natural world. A computer program, while codified and fixed in a communications medium, is intended to be decoded by a machine, not by a human mind. It is intended to manipulate the flow of electrons in a circuit. In turn, such circuits may activate other machines and/or machine parts, e.g., industrial robots in steel mills, auto plants and fabricating industries. The distinction between ‘machine readable’ and ‘human readable’ fuelled the 1970s debate about software copyright (Keyes & Brunet 1977). Recognition of software copyright in 1988 was a break with a long legal tradition restricting copyright to ‘artistic works’ (Chartrand 1997a). For my purposes, this distinguishes computer programs as soft-tooled, i.e., machine-readable rather than human-readable knowledge.
2. Similarly, a genomics program, while codified and fixed in a communications medium, is intended to be decoded by machines and molecules, not by a human mind. It is intended to manipulate the chemical bonds of atoms and molecules to analyze or synthesize biological compounds and living organisms with intended or designed characteristics. Such code is being used in a rapidly increasing range of scientific instruments (Hood 2002). Compared to the cost of ‘Big Science’ in physics, however, instrumentation costs in genomics remain relatively low while instrument capabilities are accelerating rapidly.
3. As with software copyright, legal questions are arising about genomic copyright. There are two levels of concern. First, copyright logically adheres to genomic databases as documentation – hard-copy, electronic or fixed in any future matrix. Second, copyright may, or may not, be determined by the courts to adhere to gene segments themselves. The question in law is originality. Naturally occurring sequences, according to some, are facts of nature and hence copyright cannot adhere. In the case of original, i.e., artificial, sequences, however, there appears no reason for copyright not to adhere, as it does with computer programs. Whether this is appropriate is another question.
4. Genomic programs, however, involve not just sensors and tools but also toys. In the fine arts, one author – David Lindsay (Lindsay 1997) – has tried to copyright his own DNA with the U.S. Copyright Office (without success) and mounted a web page: ‘The Genome Copyright Project’. Since his initial effort in 1997 a private firm – the DNA Copyright Institute – has appeared on the World Wide Web (DNA Copyright Institute 2001). It claims to: “provide a scientific and legal forum for discussion and research, as well as access to valid DNA Profiles, among other Services, as a potential legal tool for deterrence and resolution of situations where there is suspected DNA theft and misappropriation.”
5. Steve Tomasula speculatively writes about the rabbit Alba, the first mammal genetically engineered as a work of art, in “Genetic Arts and the Aesthetics of Biology” (Tomasula 2002). He compares incipient gene artists with Marcel Duchamp (1887-1968). While the above remain speculative, the fact is that Mike Manwaring, a graduate student at the University of Utah, created the first piece of genetic art in 2002: a version of the Olympic Rings entitled “the living rings” made from nerve cells (BBC News On-Line, January 15, 2002). And at least one geneticist, Willem Stemmer, vice president for research and development at Maxygen, has considered transposing genomic code into music to create ‘DNA ditties’ and thereby gain copyright protection (Fountain 2002).
6.1.3.2.2 Mathematics
1. The Pythagorean concept of a cognate relationship between mathematics and the physical world is, perhaps, the single most important inheritance from the ancient world reflected in the material well-being of contemporary society. It finds fullest expression in ‘the calculus’, i.e., the mathematics of motion and change through Time. The following is a short history of its development. As will be seen, the ability of a knowledge domain and/or its component disciplines to achieve mathematical articulation tends to raise its epistemological status from a Mechanical to a Liberal Art.
2. If the computer represents a ‘general purpose engine’ (David 1990) then mathematics is a general purpose concept, i.e., a mental general purpose tool. It serves as the most effective interface yet discovered (or invented) between mind and matter, between user and instrument, between human-readable and machine-readable forms of expression. In this regard, it is important to remember that music was the only ‘fine art’ admitted to the classical and medieval Liberal Arts curriculum. Balance, harmony, proportion and resonance are critical mathematical elements that Pythagoras expressed with the music of a string - halves, quarters, thirds, fourths, fifths, etc. - all audible properties of the string. The conceptual metaphor is employed in a number of disciplines. For example, in cosmology, Jeff Weeks recently explained fluctuations in readings of the physical dimensions of the universe by comparing them with the sound waves of musical harmonics (Roberts 2004).
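In modern notation (a restatement, not Pythagoras’ own), the audible properties of the string reduce to simple integer ratios: for fixed tension, frequency varies inversely with sounding length, so stopping the string at simple fractions yields the consonances:

\[
f \propto \frac{1}{L}: \qquad \tfrac{1}{2}L \Rightarrow 2:1 \text{ (octave)}, \quad \tfrac{2}{3}L \Rightarrow 3:2 \text{ (fifth)}, \quad \tfrac{3}{4}L \Rightarrow 4:3 \text{ (fourth)}.
\]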
3. For the ancient Greeks (and the humanist Renaissance), balance, harmony, proportion and resonance were everything. They capture the ancient Greek meaning of kosmos – the right placing of the multiple parts of the world (Hillman 1981, 28). They are inherent in the music of the spheres, i.e., astronomy, and in the design of cities and the human ecology in general:
The polis is the place of art... The magus, the poet who, like Orpheus and Arion is also a supreme sage, can make stones of music. One version of the myth has it that the walls of Thebes were built by songs, the poet's voice and harmonious learning summoning brute matter into stately civic forum. The implicit metaphors are far reaching: the “numbers” of music and of poetry are cognate with the proportionate use and division of matter and space; the poem and the built city are exemplars both of the outward, living shapes of reason. (Steiner 1976)
4. In temples and public buildings, the ancient Greeks used the proportions of the human form for their columns. According to Marcus Vitruvius in the 1st century B.C.E., the Doric column represents the proportions of a man; the Ionian column, those of a mature woman; and, the Corinthian column, those of a young maiden (Vitruvius 1960, 103-104). Thus in ancient Greece and in the Renaissance ‘man was the measure of all things’. The human form provided the standard of measurement, e.g., how many ‘hands’ high is a horse?
5. But beyond the human lay the universal forms of the circle, square, triangle and their variations, e.g., the parabola. Captured in Euclid’s Elements, two-dimensional space was reduced to the mathematics of such universal forms - their balance, harmony, proportion and resonance. Archimedes moved the cognitive relationship between number and nature into the three-dimensional world of volume. The Greeks measured curved forms through ‘exhaustion’, whereby the area measured is expanded to account for successively more and more of the required space. In astronomy this method was extended to the celestial motion of the stars and planets. In effect, motion for the ancient Greeks was geometric exhaustion applied, step by step, through time. Ancient Greek mathematics was thus essentially concerned with spatial relationships, finding its fullest expression in Euclidean and Archimedean geometry and the astronomy of Ptolemy.
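In modern notation (the Greeks argued by double contradiction rather than by limits), exhaustion of the circle by inscribed regular polygons runs:

\[
A_n = \tfrac{1}{2}\, n r^2 \sin\!\left(\tfrac{2\pi}{n}\right) \longrightarrow \pi r^2 \quad \text{as } n \to \infty,
\]

each additional side accounting for more of the required space.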
6. After the fall of Rome, the works of the ancient Greek mathematicians were, for the most part, lost to the West. Only gradually were they recovered from Byzantine and Arab sources. In the interim, medieval guilds held a monopoly over tooled knowledge, or the ‘mysteries’ (Houghton 1941, 35), and operated without mathematical theory, applying ‘rules of thumb’ and ‘magic numbers’. Even after recovery of the Greek and Roman classics, guild masters and apprentices worked in the vernacular and did not have access to the ‘theoretical’ works, in Greek and Latin, of Archimedes, Euclid, Ptolemy or Vitruvius. The breakdown of the guilds and the introduction of craft experimentation near the end of the medieval period, however, led to new forms and types of mathematics and instruments - scientific and musical - all calibrated to provide a mathematical reading of physical reality (Zilsel 1945).
7. In the early 15th century, the mathematical laws of perspective were discovered (or rediscovered) by the architect Filippo Brunelleschi (1377-1446). In accounting, innovation of the double-entry ledger by Luca Pacioli (1445-1515) facilitated the commercial revolution, first in the Mediterranean and then around the world. The need for improved navigation led to an intensive search for new methods and instruments to calculate longitude. The Royal Observatory was established in Greenwich in 1675 specifically for this purpose. It was not, however, until 1761 that John Harrison, “a working-class joiner” (BBC News Online, August 3, 2003), created his H4 ‘watch’, which proved sufficiently accurate and sturdy under the stresses of 18th-century sea travel to permit reliable calculation of longitude. The spirit of playful fascination with new instruments and devices in the 17th and 18th centuries, especially those intended to measure longitude, is captured in Umberto Eco’s novel The Island of the Day Before (Eco 1994).
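The principle behind Harrison’s ‘watch’ is simple arithmetic: the Earth turns 15 degrees per hour, so longitude reduces to comparing local solar time with the time kept at a reference meridian. With invented times for illustration:

\[
\text{longitude} = 15^{\circ}/\text{hour} \times (t_{\text{Greenwich}} - t_{\text{local}}),
\]

so a ship observing local noon while the chronometer reads 14:00 Greenwich time lies \(15 \times 2 = 30^{\circ}\) west of Greenwich.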
8. Beyond the astronomical mathematics of Kepler and Galileo, it was cannon fire that provided the impetus for the development of a true mathematics of motion. In fact, the mathematics of cannon fire (and its patronage) provided the opportunity for many of the experiments of Galileo (Hill 1988), which are generally recognized as the beginning of the first Scientific Revolution. Mechanics began to drive mathematics.
9. In the 1670s, what was known as ‘the geometry of infinitesimals’, i.e., geometric exhaustion, achieved a breakthrough with the independent invention of ‘the calculus’ by Newton (1643-1727) and Leibniz (1646-1716). Calculus provided a true mathematics of motion - changing spatial position through Time expressed in algebraic rather than geometric terms. This breakthrough was then extended by Newton in his three laws of motion, which arguably served as the foundation stone of modern natural science. By the middle of the 18th century, in France, ‘scientific’ engineering emerged with a requirement for formal training in calculus (Kranakis 1989, 18).
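In Leibnizian notation, the new algebraic mathematics of motion expresses position, its rate of change and the rate of change of that rate, which Newton’s second law then binds to force:

\[
v(t) = \frac{dx}{dt}, \qquad a(t) = \frac{dv}{dt} = \frac{d^{2}x}{dt^{2}}, \qquad F = ma.
\]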
1. A quarter of a century before Adam Smith published The Wealth of Nations (Smith 1776), the French military changed its weapons purchasing policy, imposing strict standards for the production of parts and final weapons systems, e.g., artillery (Alder 1998). Standards were codified into mechanical drawings and mathematically defined tolerances subject to various physical forms of testing. Previously, production had been a craft activity with each part and weapon a unique artifact. This change meant that parts became interchangeable, e.g., bayonets. This had
a significant impact on the performance of the French revolutionary armies of Napoleon (Alder 1998, 536).
2. Standardized parts production was the first step towards ‘mass production’. It was followed early in the next century by the introduction, in England, of the first machine tools, which guided and later replaced the worker’s hand to assure standards in production. The use of such machines led Charles Babbage to extend Smith’s theory of the division and specialization of labour to include payment only for the skill level actually required at each stage of production, thereby encouraging a reduction of skill requirements, i.e., craftspersons could be replaced by semi-skilled labourers (Rosenberg 1994, 32). This is the same ‘de-skilling’ that continues in the natural sciences with the introduction of new instruments, e.g., the directly readable spectrometer (Baird 2004), and in industry generally.
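Babbage’s extension of Smith is essentially arithmetic. With invented numbers for illustration: a master craftsman at 10 pence per hour performing three one-hour stages costs more than three specialists each paid only the wage the skill of their stage commands:

\[
C_{\text{craft}} = 3 \times 10 = 30, \qquad C_{\text{divided}} = 10 + 6 + 4 = 20.
\]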
3. It was not in Europe, however, that the system came to fruition. Arguably due to a shortage of skilled craftsmen and a predominantly low-end ‘mass’ market (rather than an upscale, highly differentiated or ‘eccentric’ aristocratic one), it was in the United States that the system developed into ‘the American System’ (Hounshell 1983). Specifications and standards were designed into machines (machine tools) that were, in many cases, simply unknown elsewhere, e.g., in England. Development of the British Enfield rifle in the late 1850s is a case in point. To the British, who had carried on the old craft method of production, the idea of interchangeable parts for rifles initially seemed impossible until American machine tools and workers demonstrated how it could be done (Ames & Rosenberg 1968).
4. The American System, however, was not restricted to the military. It was extended to other industries in the United States including tableware such as knives and forks (Ames & Rosenberg 1968, 36). When standardized parts production was married to the moving assembly line introduced by Henry Ford in 1913, the modern system of mass production began. This combination became known as ‘Fordism’ or the “Fordist regime” (David 1990, 356).
5. If standardized parts and the assembly line began mass production, it was the innovation of “techno-economic regimes formed around general purpose engines” (David 1990, 355) that completed the transformation of traditional into modern life-styles. The steam engine, railway, internal combustion engine, electric generator and computer require standardization not only of internal components but also of external connectors (Alder 1998, 537). As previously noted, general purpose tools, once innovated, establish a ‘path dependency’, i.e., standards and specifications established at the onset become ‘locked in’ and all subsequent improvements, innovations or adjustments must comply. In a manner of speaking, the path dependency of
general purpose tools corresponds to ‘tradition’ for the medieval craftsman who inherited and was limited by ‘best practices’ established in a distant past.
6. The importance of standards is recognized in an emerging sub-discipline called metrology (O’Connell 1993). To anticipate the discussion of technique below, such networks of shared standards produce what O’Connell calls ‘societies’ or what I call ‘technical subcultures’, including:
a society of health care facilities that share the same measure of body composition, a society of laboratories that share the same electrical units, and a society of weapons that share the same electrical and dimensional standards. (O’Connell 1993, 131)
7. In this regard, at the international level, engineering standardization began with the International Electrotechnical Commission (IEC) in 1906. The broader-based International Federation of the National Standardizing Associations (ISA) was set up in 1926 and, after the Second World War, the International Organization for Standardization (ISO) was established in 1947. Today the ISO has forty distinct fields of standardization ranging from Environment to Image Processing to Domestic Equipment. In most fields, mathematically defined standards are codified and then designed into hard-tooled knowledge to ensure compatibility anywhere in the world (Alder 1998, 537).
1. The French word ‘technique’ was introduced into English in 1817. Among its several meanings is: “a body of technical methods (as in a craft or in scientific research)” (MWO, technique, n, 2a). Quite simply, such methods involve the effective use and application of hard-tooled knowledge - as sensor, tool or toy. Such use requires personal & tacit knowledge about a new instrument, its codification into operating manuals and then transfer of the instrument to a final user who, in turn, must decode the manual and develop the knowledge necessary to become skillful in its use.
2. Hard-tooled knowledge acts like a nucleating agent around which technique develops as a routinized pattern of behaviour, or an institution in the tradition of the ‘old’ Institutional Economics (e.g., Commons 1924, 1934, 1950). In this regard Price has called the instrument/technique relationship an ‘instrumentality’, i.e., the nucleus plus orbiting behaviour (Price 1984, 15). For my purposes, the instrument is hard-tooled knowledge while the methods associated with its use constitute soft-tooled knowledge. They are, in economic terms, ‘tied goods’, like the punch cards required to run an old-style mainframe computer.
3. In genomics, Cambrosio & Keating have documented this nucleating role of instruments in “Art, Science, and Magic in the Day-to-Day Use of Hybridoma Technology”. They define scientific technique as an “embedded system of practices”. They highlight how much technique can only be learned by doing and/or through instruction, i.e., it cannot be fully codified (Cambrosio & Keating 1988, 258).
4. Similarly, Rosenberg writes of “instrument-embodied technique” (Rosenberg 1994, 156). He observes that shared use of specialized instruments serves “to bring members of different disciplines together” countering the tendency towards incommensurability between scientific disciplines and sub-disciplines (Rosenberg 1994, 156).
5. Technique, of course, brings us full circle back to personal & tacit knowledge. Thus an instrument - any instrument, such as Heidegger’s hammer - becomes one with us in action, “part of ourselves” (M. Polanyi 1962a, 59). Like the arrow and bow of a Zen master who has practiced archery for forty years, the instrument becomes transparent; only the target is seen (Suzuki 1959).
1. Tooled knowledge exhibits four characteristics: design, density, fixation and vintage. By way of introduction: design refers to the synthesis of knowledge drawn from different domains, disciplines, sub-disciplines and specialties, e.g., biology, chemistry and physics, to create an instrument, i.e., a sensor, tool or toy. Density refers to the operational opacity (or transparency) of the resulting instrument. Fixation refers to embedding knowledge into a functioning material matrix. Vintage refers to when that knowledge is embedded, fixed or frozen into a matrix. I will examine each in turn.
1. As a verb, ‘design’ means “to create, fashion, execute, or construct according to plan; to have as a purpose” (MWO, design, v, 1). As a noun, it means deliberate purposive planning; the arrangement of elements or details in a product or work of art; the creative art of executing aesthetic or functional designs (MWO, design, n, 1a). Critically, engineers use the word design “in framing membership criteria for the professional grades of engineering societies” (Layton 1974, 37). More generally, however, in Design
we have come to recognize the processes which bring about creative advances in science, the new paradigms as processes of human design, comparable to artistic creation rather than logical induction or deduction which work so well within a valid paradigm... the norms of artistic design (are) “inherent in the specific psychic process, by which a work of art is represented” and thus in the creative act, not in the created object - in the process, not the structure. (Jantsch 1975, 81)
2. From the dictionary definition I extract the terms ‘arrangement’ and ‘purpose’ in order to distinguish tooled from codified knowledge. Both codified and tooled knowledge are extra-somatic, i.e., fixed outside a natural person. The purpose of codified knowledge, however, is transmission of knowledge between natural Persons while the purpose of tooled knowledge is measurement and manipulation of the natural world.
3. With respect to arrangement, codified knowledge involves manipulating an alphabet, grammar, syntax and vocabulary, i.e., a language, including mathematics, to communicate with other human beings. Arrangement of tooled knowledge, however, involves the coordination of different forms and types of matter and energy to subsequently and artificially manipulate or animate the natural world. This may include synthesizing bits of biological, chemical, cultural, economic, electric, electronic, ergonomic, mechanical and/or organizational knowledge into a single working device or instrument. Put another way:
The term “design” covers the mutual employment of the material and the propositional, as well as hybrid forms such as drawings, computer simulations, and material models. However, design must be understood to embrace material knowledge as well as ideational knowledge. The “design paradigm” is the most promising recent development in the epistemology of technology, but it must not lose track of this central insight about design. Thought and design are not restricted to processes conducted in language. (Baird 2004, 149)
4. As an example, consider the common electric hand drill. Functionally it makes a hole. Without a drill one can use a simpler tool like a spike. This requires knowledge of materials technology, e.g., balsam won’t work well. One either pounds away or rotates the spike with little control or effect unless one spends a very long time developing the tacit knowledge of how to do so. If instead one mounts a bit and turns a crank handle to drive a hardened, specially shaped shaft (embodying knowledge of mechanics as well as materials technology), then the operator can achieve much more control and effect. One has invented the hand drill. If one powers the crank by electricity (knowledge of electric motors), then at the push of a button one hand can achieve even more control and effect. If one then computerizes the button, one frees the hands, body and mind of the operator. One has invented a computerized machine tool that embodies knowledge streams of materials technology, mechanics, electricity and computing - all in one.
5. Layton, quoting Herbert Simon, defines the “sciences of the artificial” as involving synthesis or what I call ‘design’ rather than analysis or ‘reduction’. Furthermore: “the engineer
is concerned with how things ought to be - ought to be, that is, in order to attain goals, and to function” (Layton 1988, 90-91).
6. Michael Polanyi also recognized the artificial nature of tooled knowledge. He observed that a machine can be smashed but the laws of physics continue to operate in its parts. He concluded that “physics and chemistry cannot reveal the practical principles of design or co-ordination which are the structure of the machine” (M. Polanyi 1970).
7. Put another way, in another context, by another author: “technology is about controlling nature through the production of artifacts, and science is about understanding nature through the production of knowledge” (Faulkner 1994, 431). In Aristotle’s Nicomachean Ethics “art is identical with a state of capacity to make, involving the true course of reasoning” (McKeon 1947, 427). The connection between the Arts and tooled knowledge is captured in the aesthetic term elegant, i.e., “ingenious simplicity and effectiveness” (OED, elegant, a, 5a). Put another way: “design involves a structure or pattern, a particular combination of details or component parts, and it is precisely the gestalt or pattern that is of the essence for the designer” (Layton 1974, 37).
8. This gestalt is generally expressed in visual rather than verbal terms. In fact, the earliest expression of engineering knowledge in the West takes the form of design portfolios and the “natural units of study of engineering design resemble the iconographic themes of the art historian” (Layton 1976, 698). In the experimental sciences, this is also increasingly true. Quoting Ackerman, Ihde observes:
Visual thinking and visual metaphors have undoubtedly influenced scientific theorizing and even the notation of scientific fact, a point likely to be lost on philosophers who regard the products of science as a body of statements, even of things. Could the modern scientific world be at its current peak of development without visual presentations and reproductions of photographs, x-rays, chromatographs, and so forth? ... The answer seems clearly in the negative. (Ihde 1991, 93)
9. There is, however, a Western cultural bias towards ‘the Word’ and away from ‘the image’ - graven or otherwise (Chartrand 1992a). This has contributed to the epistemological suppression of tooled knowledge relative to ‘scientific’ knowledge, which is usually presented in a documentary format (the article or book), while tooled knowledge appears first as an artifact that must then be transliterated into written formats that “savour of the antiquarian” (Price 1965, 565-566).
10. Another connection between tooled knowledge and the Arts is found in the expression “from art to science” (Cambrosio & Keating 1988, 256). This transition has been documented in
biotechnology (Hood 2002) and engineering (Schön 1983) with respect to experimental techniques or protocols. Such protocols generally begin as the unique tacit knowledge of a single researcher. This is called ‘magic’ by Cambrosio & Keating. Over time, this tacit knowledge becomes embodied in an experimental piece of hardware, i.e., tooled knowledge. This stage they call ‘art’ because operation of the prototype requires a high level of tacit knowledge or skill. In turn, the prototype may be commercially transformed into a standardized instrument requiring less skill of its operator who, in effect, is transformed from a scientist into a technician (Rosenberg 1994, 257-258). This, according to Cambrosio & Keating, is the ‘science’ stage, when the standardized instrument can be routinely used in the ongoing search for new knowledge. The original protocol, however, becomes effectively embodied in the now standardized, calibrated scientific instrument. Put another way:
In the language of technology studies, these instruments “de-skill” the job of making these measurements. They do this by encapsulating in the instrument the skills previously employed by the analyst or their functional equivalents. (Baird 2004, 69)
11. In summary, design refers to the synthesis of different forms of knowledge - cultural, economic and organizational as well as scientific. Tooled knowledge is thus synthetic and integrative rather than analytic and reductive. Through design it enfolds or integrates many different forms of knowledge, including economic knowledge, into an instrument that is efficient (technically and economically), that works and performs its function. In this sense, tooled knowledge achieves the ancient Greeks’ kosmos: “the right placing of the multiple parts of the world” (Hillman 1981, 28). The world is in harmony; the world works. In more prosaic terms: “Development of the design is coordinated and iterative, and the end product succeeds in integrating all of the necessary knowledge” (Faulkner 1994, 432).
1. Among its several meanings, the word density means “the degree of opacity of a translucent medium” (MWO, density, n, 3a). With respect to tooled knowledge, density refers to the operational opacity (or transparency) of an instrument. The more tooled knowledge is embodied in an artifact, relative to its function, the denser, the more opaque, the instrument becomes, i.e., it requires less and less personal & tacit or codified knowledge to operate. In other words, the denser an instrument becomes, the more ‘user friendly’ it is.
2. At one extreme are ‘one-offs’, customized instruments common in the natural sciences. A particle accelerator or synchrotron is unique. No two are alike; the personal & tacit and codified knowledge required to maintain and operate such an instrument is large. It requires a great deal of what
is called ‘local knowledge’ (Alder 1998, 537; Faulkner 1994, 445). In this sense the covers are kept off the machine. It is transparent, requiring constant looking inside and tinkering to make it function correctly. Its operation involves the “craft of experimental science” (Price 1984). This sense is captured by an anecdote told by Professor Tom Steele of the Department of Physics & Engineering Physics about the Canadian Light Source synchrotron at the University of Saskatchewan. A problem with vacuum containment baffled staff until a visiting vacuum specialist offered to check it out. He walked around the circuit twice and then pointed out where the problems lay. At first staff laughed, but then instrumentation confirmed the expert’s findings. He required no instruments, no measurements, just experience, i.e., personal & tacit knowledge.
3. At the other extreme is the consumer ‘black box’ - push the button and it operates itself. The leading edge of black box tooled knowledge today is voice-activated computer control. Just a verbal command and the tooled knowledge works. The black box hides its ‘thing-ness’ (Baird 2004, 146).
4. Between the extremes are many shades of grey. Standardized research instruments like scanning electron microscopes or MRI scanners require highly trained technicians to operate. They can do so, however, without the detailed personal & tacit and codified knowledge available to an experimental scientist. This again involves a ‘de-skilling’ of the operator and a transfer of knowledge into the instrument (Baird 2004, 69).
5. The process of standardizing experimental scientific instruments by replacing manual with automatic control is well documented (Cambrosio & Keating 1988; Hood 2002; Price 1984; Rosenberg 1994). It involves conversion of a transparent scientific sensor into an opaque industrial tool that, in turn, may become a black box toy in final consumption, e.g., the cathode ray tube as TV.
6. The impact of soft-tooled knowledge in this process, especially standardization, cannot be overestimated:
For all the diversity of our consumer cornucopia, the banal artifacts of the world economy can be said to be more and more impersonal, in the sense that they are increasingly defined with reference to publicly agreed-upon standards and explicit knowledge which resides at the highest level of organizations, rather than upon local and tacit knowledge that is the personal property of skilled individuals. (Alder 1998, 537)
1. Tooled knowledge is fixed in a functioning material matrix as a sensor, tool or toy. Fixation is a condition for intellectual property rights such as patents and copyrights. I will
discuss the nature of such intellectual property rights in more detail below. For now it is sufficient to ask whether tooled knowledge can be extracted from such a matrix. The answer is yes - through reverse engineering. In effect, “engineers learn the state of the art not just by reading printed publications, going to technical conferences, and working on projects for their firms, but also by reverse engineering others’ products” (Samuelson & Scotchmer 2002, 70-71).
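The logic can be suggested by a toy sketch in Python (wholly illustrative): the knowledge fixed inside a ‘black box’ is recovered by probing its behaviour rather than by reading any documentation.

    # Toy illustration of reverse engineering: recover knowledge fixed in an
    # opaque artifact purely by observing its input-output behaviour.
    def black_box(radius):
        # Stands in for a sealed instrument; its internals are assumed unreadable.
        return 3.14159 * radius ** 2

    # Probing with known inputs exposes the embedded constant and the square law.
    constant = black_box(1.0)                      # -> 3.14159
    assert abs(black_box(2.0) - 4 * constant) < 1e-9
    print("recovered constant:", constant)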
1. Vintage refers to the temporal coefficient (historical date or time) at which existing knowledge is embedded, embodied or tooled into a matrix. Unlike design, density and fixation, vintage has been the subject of formal economic investigation. Robert Solow (1960) considered the question of the distribution of capital equipment, including new and old technologies, and asked why different vintages coexist. Subsequently, Solow introduced the concept of ‘embodied technological change’ (1962).
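A stylized statement of the vintage model (a simplification, not Solow’s exact formulation) makes the point: capital built at date v embodies the technology of that date, so newer machines are more productive, yet older vintages remain in use so long as they cover their operating costs:

\[
Q_v(t) = B e^{\lambda v}\, K_v(t)^{\alpha}\, L_v(t)^{1-\alpha}, \qquad Q(t) = \int_{-\infty}^{t} Q_v(t)\, dv,
\]

where \(\lambda\) is the rate of embodied technical progress.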
2. Like codified knowledge - the writing hand, having written, moves on - tooled knowledge exists at a given moment of time, a given state of the art. Once embedded, it is ‘frozen’ (Boulding 1966, 6) and subject to update only with more effort and cost than revising a written document. Vintage thus refers to the state of the art current when knowledge is tooled into matter. Furthermore, excepting the military and the natural sciences, that state is also subject to economic constraints (M. Polanyi 1960-61, 404).
3. One further vintage distinction can be drawn: technical versus functional obsolescence. On the one hand, a given product or process embodying tooled knowledge may be displaced by one that is faster and/or more cost-effective. The old is now technically obsolete. It can continue, however, to perform the same or similar function. On the other hand, a given product or process may be displaced because the function it performs is no longer required. The old is now functionally obsolete. An example is hydrogen re-fuelling stations for zeppelins.
1. Knowledge takes three forms – personal & tacit, codified and tooled. Knowledge is fixed in a person as neuronal bundles of memories and as the trained reflexes of nerves and muscles. As code it is fixed in a medium of communication or matrix that allows knowledge to cross Time and Space until another person reads or decodes it and thereby adds it to his or her personal & tacit knowledge. Knowledge is tooled into a functioning physical matrix as an instrument such as a sensor, tool or toy, or more generally, as a work of technological
intelligence. The knowledge tooled into an extra-somatic matrix remains a functionless artifact, however, until someone makes it work by pushing the right buttons and using it in the right way. This requires, of course, personal & tacit knowledge that comes with practice, talent and technique. Thus, once again, we can conclude that all knowledge is ultimately personal and tacit.