The Competitiveness of Nations
in a Global Knowledge-Based Economy
6.0 Form & Fixation
Epigraph

Human knowledge and human power meet in one; for where the cause is not known the effect cannot be produced. Nature to be commanded must be obeyed; and that which in contemplation is as the cause is in operation as the rule.

Francis Bacon (1561–1626), The Great Instauration, Aphorism 3 (Columbia World of Quotations, #5138)

HHC © last revised December 2004, Draft in Progress
6.01 Qubits identified so far – the etymological WIT, psychological PSI, epistemological IMP and pedagogic PED - are abstractions. Such abstractions, to find expression, must assume a form, i.e., must be reified. Form, according to Francis Bacon, is “the real or objective conditions on which a sensible quality or body depends for its existence” (OED, form, n, 4 c).
6.02 In summary introduction, knowledge, it will be argued, assumes three forms: personal & tacit, codified and tooled knowledge. These three, however, begin and end with personal knowledge. Some personal knowledge can be coded (in words, numbers and/or ideographs) and fixed in a medium of communication to be decoded by another person. Some personal knowledge, however, always remains tacit, finding expression, if at all, only in a work of tooled knowledge and/or in the process of making it. A code or a work always leads back, however, to a person who must either: (a) decode the communication; (b) activate a work of technological intelligence, i.e., know what buttons to push; or, (c) attain an aesthetic experience from a work of art.
6.03 Put another way, personal & tacit knowledge is somatic, i.e., fixed in a person. Code and works of tooled knowledge are extra-somatic and fixed in forms external to a person. Code and works exist, however, only in the presence of a person who can decode or use them. Code in a lost script like Minoan Linear A is not knowledge; it is ignorance unless someone can read it. Similarly, a work of tooled knowledge is a useless artifact until someone knows how it works. Knowledge is thus reduced to “permanent bodily form” fixed in a person, a code or a work of tooled knowledge, but it is upon the person that knowledge ultimately depends. I will examine each in turn and then reconcile them.
6.04 To date discussion of ‘knowledge’ in a knowledge-based economy has primarily focused on two forms: tacit knowledge and codified knowledge with some reference to
‘local’ knowledge (Cambrosio & Keating 1988, 244) that can be subsumed under tacit. Both tacit and codified knowledge are now recognized as factors affecting the production function of firms and nation-states (OECD 1996; Malhotra 2000; ANSI/GKEC 2001). Both, however, are subject to widely varying interpretation in the hands of different analysts with significantly different and important policy implications (Cowan, David & Foray 2000, 212-213). [A]
Indeed, references to ‘tacitness’ have become a platform used by some economists to launch fresh attacks upon national policies of public subsidization for R&D activities, and equally by other economists to construct novel rationales for governmental funding of science and engineering research and training programs. (Cowan, David & Foray 2000, 212-213)
6.05 For my purposes, I define tacit knowledge in keeping with the work from which the term derives, Michael Polanyi’s 1958 Personal Knowledge: Towards a Post-Critical Philosophy. It should be noted that the second edition of 1962 is referenced in my text. That edition was published the same year as the first edition of Thomas Kuhn’s The Structure of Scientific Revolutions. While Kuhn makes only one reference to Polanyi, they share a very important concept: incommensurability. For Kuhn, it is the incommensurability of specialized scientific knowledge resulting in distinct paradigms; for Polanyi, it is the incommensurability of what subsequently became known as tacit versus codified knowledge (M. Polanyi 1962a, 174). Fuller has argued that many of Polanyi’s insights were subsequently attributed to Kuhn. Indeed, “it is not hard to see that Kuhn owed more to Polanyi than the appreciative footnote to his magnum opus, Personal Knowledge, would suggest” (Fuller 2000, 140).
6.06 It is clear from Polanyi’s usage that tacit knowledge is ‘personal knowledge’. Put another way, personal knowledge is living knowledge, knowledge fixed in an individual. Whence it comes – demonstration, experience, experimentation, intuition or reading – does not change its personal nature, nor does the fact that some of it can be codified while other parts can be tooled into matter and energy.
6.07 In effect, personal knowledge comes in two forms. The first is the mental matrix of neurons that fixes memories (knowledge) as part of one’s voluntary wetware, i.e., that part of the nervous system subject to conscious control, specifically, to recall. Memories can generally be described and codified, i.e., spoken and transcribed into language or drawn as a picture.
6.08 The second form is what Polanyi calls ‘tacit’ knowledge: knowledge in performance, fixed in reflexes (part of one’s involuntary wetware) composed of “the connected set of nerves concerned in the production of a reflex action” (OED, reflex, n, 6 b). Reflexes refer to the memory of our limbs and digits about how to do something, e.g., ride a bicycle. Etymologically, it is relevant that the word ‘reflex’ derives from ‘reflect’ in the sense of ‘to remember’. Such memories
or knowledge is fixed in one’s body parts and nervous system. This can involve the fine practiced motor skills of a surgeon or a professional bricklayer. What they share is that such knowledge remains tacit, i.e., it is not subject to articulation and hence to codification - spoken, transcribed or drawn. It can be gained, however, through repetition and practice following its demonstration and leading to enhanced performance at a task. It is, I argue, such ‘knowing by doing’ that Polanyi means by tacit knowledge (M.Polanyi 1962a, 175). The classic example in the philosophies of science and technology is the hammer (Heidegger 1927; Polanyi 1962a, 174-75).
6.09 Ultimately, however, all knowledge is personal. A code or a work always leads back to a person who can either decode or activate it. To further distinguish personal, codified and tooled knowledge, I consider personal and tacit knowledge to be one-dimensional, a monad: it is known (and fully knowable) only by one Mind. It is the sum of what an individual knows and, to coin a phrase, ‘if one is what one knows’ then personal and tacit knowledge defines individuality.
6.10 Codified knowledge, as a term, does not have a seminal source. In general, it means the use of a written language, symbols (including mathematical symbols), sounds or pictures to encode the knowledge of one or more persons into a material matrix that subsequently – distant in time and/or space – may be decoded and assimilated as personal knowledge by another human being. The important function of writing is that “it makes communications possible without immediate or mediate personal address; it is, so to speak, communication become virtual. Through this, the communalization of man is lifted to a new level” (Husserl quoted in Idhe 1991, 46). In this sense, codified knowledge is two-dimensional engaging at least two Minds – the author and a distant reader/receiver. Such knowledge begins and ends as somatic personal & tacit knowledge.
6.11 Four qualifications constrain my definition. First, technically, speech qualifies as codified knowledge but is ignored except when fixed, i.e., recorded in a material matrix - written or otherwise. Second, codified knowledge is restricted to ‘human-readable’ (analogue) as opposed to both machine-readable (digital or binary) and molecule-readable forms such as the genomic ‘code of life’ or “autobiography of a species” (Ridley 1999). I include in this category the codes of its emerging sub-disciplines such as proteomics, organic nanotechnology and molecular biology in general. Manipulation of gene lines, as works of technological intelligence, began with the domestication of plants and animals for a human food supply and includes the primordial incest taboo. The distinction is between semiotic knowledge coded by one person and then decoded by another, as opposed to the operating instructions for a machine or a molecule. As will be argued below, machine- and molecule-readable code is more effectively treated as ‘soft-tooled’ knowledge. Third, my focus is on the matrix or communications medium rather than the content, i.e., language, symbol, sound or image. In this sense, to paraphrase McLuhan: ‘the matrix is the message.’ Fourth,
codified knowledge is both an intermediate producer good, i.e., an input to the production process, and a final consumption good, e.g., books, magazines, motion pictures and sound recordings, no matter the matrix of fixation.
Harold Innis & Marshall McLuhan
6.12 Through his study of communications, Innis identified a fundamental relationship between culture and communications media or the matrix (Innis 1950, 1951). A culture is limited in space, but extensive in time, i.e. it has duration, to the extent its matrix is durable, e.g., stone, clay or parchment. Alternatively, a culture is extensive in space, but limited in time, to the extent its communications matrix is non-durable but easily transported, e.g., papyrus and paper. Using this hypothesis Innis explains the rise and fall of empires. Five examples demonstrate Innisian inductive analysis of a knowledge-based economy.
6.13 First, acidic paper – cheap and lightweight – has been used for more than 150 years. Books, newspapers, periodicals and other written records fixed in this matrix, however, are now disintegrating in libraries and archives around the world (The Economist, February 27, 1987: B-1). Meanwhile, parchment and vellum from the thirteenth century have not ‘self-destructed’. From an Innisian perspective, this implies that European expansion and colonization of the last century and a half should have been short-lived because the dominant communications medium was cheap and easily transportable. In fact, the British Empire, “on which the Sun never sets”, was, in historical terms, the most extensive in space, but shortest in duration, of any empire in history.
6.14 Second, the dominant communications medium today, in spite of the Internet, remains television, which spans the world in an instant, i.e. it is extensive in space. Television takes the average citizen around the world to spaces and places of which his ancestors never knew. A question, however, has arisen about television's impact on attention span. Some argue that children do not read as well as they did before TV because their attention span has been reduced, i.e. the medium, while extensive in space, has effected a reduction in the psychological duration of time.
6.15 Third, the new communications technologies (essentially new matrices to fix and transmit codified knowledge) have arguably made the entertainment industry the largest sector of final demand in the knowledge-based economy. But this industry is peculiar in a number of ways. First, the hardware, including direct broadcast satellites, fibre optics, magnetic recording technologies, and the compact disc player, is based upon silicon and iron oxide, i.e., stone, that, theoretically, should endure for more than a century. On the other hand, their contents, such as television programs, circle the globe in an instant. Second, production of the medium or matrix is separated from production of the message. Thus “home entertainment” hardware is dominated by Asian producers while programming is dominated by the American entertainment industry, i.e., Hollywood. This international division of medium and message suggests a new culture unlike any in human history, i.e., a global culture. Third, like previous communications revolutions, e.g., the printing press, the
new communications media are being accompanied by a breakdown of old ways of communicating, e.g., declining literacy and a heightened sense of societal “dis-ease”.
6.16 Fourth, behind the scenes lurks a new nervous system, a new matrix, encircling planet Earth – the World Wide Web, the WWW or ‘the Web’, for short. In less than a decade, the Web has affected economics, education, entertainment, health care, information, news and the nature of work. Among the many significant knowledge characteristics of the Web, I will consider three. First, the Web is economically bifurcated into intermediate and final knowledge goods and services. Thus the ‘consumer’ Internet is partnered by the ‘B2B’ or business-to-business Internet that globally links producers to suppliers with dramatic cost and other implications for firms. Second, mechanical and electronic devices are increasingly being ‘plugged’ into the Web. Automobiles, ships, trucks and trains; home air conditioning, computers, heating, lighting and security systems; and microwave ovens, refrigerators, toasters, toilets and TV sets are all being attached to the Web, permitting two-way communication not just between people but also between machines. The Web therefore carries both human-readable and machine-readable code.
6.17 Third, distribution costs of knowledge on the WWW approach zero, as do duplication, reproduction or copying costs. In considering this new matrix and the nature of authorship, the noted copyright lawyer David Nimmer observes that:
These questions simply adumbrate in miniature the completely unanticipated vistas that a world of interactive authorship might show us. Most, if not all, doctrines of copyright law are destined to become inapplicable, anachronistic, or at least severely distended, in such a brave new world. For the High Priesthood of copyright to even contemplate such potentialities might require the utmost in retooling. (Nimmer 1992, 521-522)
6.18 Fifth, contemporary recording technologies provide artists, celebrities and ‘historic’ events with something that only literary and visual artists enjoyed in the past - life after death. This is a life not as a ghost on another plane, but as a shadow on the silver screen. There may never again be a Richard Burton, but his image, his voice, his body language and his performance will now endure like the plays of Shakespeare, part of our social genetic, the extra-somatic knowledge that is the stuff of culture. It is this characteristic of the Arts, maintenance of a collective linkage with the past, which distinguishes knowledge in the Arts from other sectors of the economy. It is also one reason why Art has displaced religion in the secular West, i.e., it provides an alternative secular re-ligio, or linking back. In other sectors, new knowledge often displaces the old. In the Arts, the images and words of cultures and civilizations, long buried by the sands of time, enrich and inspire contemporary creators (Boulding July 1986).
6.19 Innis’ colleague, Marshall McLuhan, extended the observed linkage between medium of communication and duration of civilization into his famous aphorism “The Medium is the Message”. McLuhan recognized that the material matrix both affects reception of the message and shapes the fabric of society itself. From the hot,
focused matrix of the printing press with its linear phonetic alphabet (the first engine of mass production) to the cool, passive medium of television with its cascade of images and sounds, McLuhan believed a major transformation in consciousness, of knowing, was underway: “the transition to the electronic phase of simultaneous or acoustic man” (McLuhan 1978).
6.20 A ‘simultaneous or acoustic’ Mind is not the focused linear consciousness of the previous literate or textual Mind. Where the literate Mind acted like the eye focusing on detail, the acoustic Mind is an ambient consciousness awash in images and sounds and aware of the context, of the gestalt, of the pattern. In a way, McLuhan’s ‘acoustic’ Mind is similar to Jaynes’ ‘bicameral mind’ with its active right lobe (Jaynes 1975). The addition of images to the stock of codified knowledge has brought, according to McLuhan, pattern recognition to the foreground of contemporary consciousness. This was recently highlighted in Congressional testimony by the U.S. Defense Secretary, Donald Rumsfeld, when he contrasted the psychological effect of seeing pictures of prisoner abuse at Iraq’s Abu Ghraib prison in May 2004 with his ‘reading’ of the file in January of that year:
It is the photographs that gives one the vivid realization of what actually took place. Words don't do it. The words that there were abuses, that it was cruel, that it was inhumane -- all of which is true - that it was blatant, you read that and it's one thing. You see the photographs and you get a sense of it and you cannot help but be outraged. (Rumsfeld 2004)
6.21 Three examples demonstrate the effect of the new matrix. First is the invention of the computer icon, the window and the mouse by scientists at Xerox PARC in the 1970s and early 1980s. The shift in western culture from text to graphics arguably began with Xerox’s innovation of the computer ‘icon’. A computer user interacts more effectively by image than text. While Xerox failed to successfully exploit its innovations, they were picked up first by Apple and then by Microsoft.
6.22 Second is the transition, beginning in 1990, of the text-based Internet to the graphics-based World Wide Web with the first graphical ‘browser’.
In a mere decade, strands of ‘The Web’ have been spun out from a handful of obscure physics labs into seven million Web sites and tens of millions of workplaces and homes around the world. It has catapulted the high-technology industry to unimagined heights, given meteoric rise to electronic commerce, revolutionized research, and made phrases such as ‘download’ and ‘home page’ part of everyday conversation. (Ottawa Citizen, “Web revolution began 10 years ago tomorrow”, December 24, 2000)
6.23 Third is the launch of Windows 95. With a shift from a text-based DOS interface to graphics, home and office computing took off. In short order, Microsoft became one of the largest business enterprises in the world and Bill Gates, the world’s richest man.
6.24 It must be noted, however, that there has been a ‘Kuhnian loss’ in transition (Fuller 1992, 272). Much of the new media does not
require literacy. Accordingly, with the cultural shift to an acoustic space there has been an apparent decline in attention span and literacy, as noted above.
6.25 Perhaps the most succinct statement of the impact of new types of codified knowledge was made by cultural critic Thomas Shales in his 1986 article ‘The ReDecade’. Through the new recording technologies, especially video, consumers now have nearly universal visual access to the styles and tastes of all historic periods, at least as presented on television and in motion pictures. Does one want to watch the gangster movies or musicals of the 1930s or witness the French Revolution or Moses on the mountain? Does one want to
replay it, time after time, or erase it to capture the images and sounds of another time and place?
6.26 This access to the fashions and styles of all historic periods produced what Shales called the ReDecade, a decade, for him, without a distinctive style of its own, a decade characterized by the pervasive stylistic presence of all previous periods of history. The impact of this phenomenon on consumer behavior is, at least in the short term, confusion and disorientation. Time has now become a significant dimension of consumer behavior, and, more importantly, of one’s self-image. As noted by Shales:
It does seem obvious that here in the ReDecade ... the possibilities for becoming disoriented in time are greater than they have ever been before. And there's another thing that's greater than it has ever been before: accessibility of our former selves, of moving pictures of us and the world as we and it were five, ten, fifteen years ago. No citizens of any other century have ever been provided so many views of themselves as individuals or as a society. (Shales, 1986: 72)
6.27 Similarly, the art critic Robert Hughes, in his book and television program entitled The Shock of the New (1981), has pointed out that since the turn of the twentieth century modern abstract painting has been increasingly concerned with the fourth dimension, time, in contrast with the traditional dimensions of space. Thus abstract painting can be viewed as a precursor to the increasing disorientation in time characteristic of the ReDecade.
6.28 It is not yet clear what will be the long term impact of the ReDecade on consumer behavior. It is likely, however, that there will be a growing market for historic fashions, period piece furniture and reproductions as well as other consumer cultural durables drawn from historical human cultures. Such ‘durables’, however, constitute a different form of knowledge – tooled knowledge.
6.29 The term ‘tooled knowledge’ is not currently part of the debate about the knowledge-based economy. The term itself appears in Joseph Schumpeter’s classic History of Economic Analysis, wherein he refers to economics as “a recognized field of tooled knowledge” (Schumpeter 1954: 143). My usage, however, will be quite different. I will be dealing not with manipulation of ideas but
rather with knowledge tooled into matter, or knowledge embodied as physical functioning things. My usage will also be different from the ‘thing knowledge’ (Baird 2004) and ‘instrumental realism’ (Idhe 1991) proposed in the philosophy of technology. My focus is on knowledge satisfying human wants, needs and desires. Its objective or lens is the final consumer, not the scientist, technologist or instrument-maker. In common with the philosophy of technology, however, is a sense of tooled knowledge as three-dimensional, connecting one Mind to another Mind through the hands, e.g., through reverse engineering. This is in keeping with Aldrich’s observation that: “technological intelligence does not come to rest in the eye or the ear. Its consummation is in the hand” (Aldrich 1969, 382).
6.30 Restricting myself to works of technological rather than aesthetic intelligence, tooled knowledge takes two related forms: ‘hard-tooled’ and ‘soft-tooled’. Hard-tooled knowledge breaks out into three types: sensors, tools and toys. Soft-tooled breaks out into four: computer and genetic code, mathematics, standards and techniques. I will examine each by form and type.
6.31 By ‘hard’ I mean tooled knowledge as a physical artifact, specifically an artifact designed to:
· monitor activity in the world of matter and energy (a sensor); or
· manipulate, shape or animate matter and energy (a tool or toy).
6.32 In summary, the purpose of sensors is measurement; the purpose of tools is manipulation; and, the purpose of toys is pleasure. Sensors and tools are located on the production-side of the economic equation; toys, on the consumption-side. Sensors and tools are utilitarian; toys, non-utilitarian, i.e., they have no other purpose than themselves. Collectively, sensors, tools and toys constitute ‘instruments’. Accordingly, the term ‘instrument’ should be read in context.
6.33 Another distinction must be made between ‘wetware’ and ‘dryware’. Living things can, using genomics or traditional cross-breeding, be designed to serve a utilitarian purpose, e.g., gene therapy (BBC News April 2002), or, a non-utilitarian one, e.g., genetically engineered fish that glow in the dark (Shaikh 2002). These constitute wetware, i.e., ‘living’ tooled knowledge. Traditional instruments are constructed out of inanimate matter, usually minerals, and constitute dryware. Both are hard-tooled knowledge. Using this distinction, plastics are a cross-over, i.e., they are organically-based but generally derived from non-living sources, e.g., petroleum. The threshold between wetware and dryware will probably become increasingly obscure as the sciences of genomics, proteomics and nanotechnology mature. Thus, in theory, the genetic code used by marine organisms to produce biosilicates may eventually be used to make silicon chips for computers.
6.34 The three – sensors, tools and toys – can, from time to time, be one and the same. For example, a sensor may be active or
passive. An active sensor monitors changes in nature by initiating such changes, e.g., a synchrotron or subatomic particle accelerator. Thereby the sensor becomes a tool. Furthermore, to the degree that normal science involves puzzle solving (Kuhn 1996, 35-42) then scientific instruments can, with no disrespect, be considered playthings or toys of scientists. Play-like behaviour is a generally recognized characteristic of creativity in all knowledge domains. In this regard, the search for knowledge-for-knowledge-sake is non-utilitarian in purpose, i.e., it has no objective other than itself. To this extent, all scientific instruments can be considered toys. In effect, scientific instruments are designed to produce new knowledge which, to the scientist, is like the pleasure derived from a toy. This relates to the subordination of Sensation to Reason as in “intellectual priapism” (Findley 1999, 258).
6.35 Similarly, new scientific instruments – the foundation of experimental research – may subsequently become industrial tools used in economic production, e.g., the scanning electron microscope, ion implantation and the synchrotron (Brooks 1994, 480). They may also become toys intended for amusement or entertainment, e.g., the cathode display tube developed to monitor laboratory experiments became a standardized tool of science and industry and then evolved into the television set in the family living room.
i) Sensors
6.36 As a sensor or ‘probe’ (M. Polanyi 1962a, 55), tooled knowledge extends the human senses of touch, taste, sight, sound and smell. It monitors the world of matter and energy existing above (macroscopic), at (mesoscopic), or below (microscopic) the threshold of our natural senses. The information or ‘readings’ generated, when organized, structured and systematized, become codified knowledge that can be shared as a statement of objective, empirical fact.
6.37 To the degree they measure phenomena above and below the threshold of our natural senses, scientific instruments realize a Platonic ideal: “belief in a realm of entities, access to which requires mental powers that transcend sense perception” (Fuller 2000, 69). Furthermore, the ‘language’ of sensors realizes another ancient Greek ideal, that of Pythagoras, by reporting about nature in numbers. My term ‘sensor’ corresponds to Baird’s concept of ‘measuring instruments’ (Baird 2004).
6.38 The effects of sensors can be profound, for example: “the idea of a world governed by precise mathematical laws was transmitted… through Galileo’s and Huygen’s conversion of the mechanical clock into an instrument of precision” (Layton 1974, 36). Or, consider the impact on our “image” of the world (Boulding 1956) of Galileo’s innovative use of the telescope resulting in “artificial revelation” (Price 1984, 9). [B]
6.39 To the degree that the natural sciences are about acquiring knowledge of the physical world then, to that degree, all scientific instruments are sensors, i.e. their primary purpose is to monitor, not manipulate. That scientific instruments embody knowledge is alluded to by Shapin when he reports: “much empirical work has addressed the embodied nature of scientific know-how and the embodied vectors by which it travels, whether that embodiment is
reposed in skilled people, in scientific instruments, or in the transactions between people and knowledge-making devices” (Shapin 1995, 306). With respect to the latter category, he notes the emergence of new non-human actors including cyborgs – part human and part machine (Shapin 1995, 313).
6.40 The history, philosophy and sociology of science are replete with allusions to the role of scientific instruments. Experimental science was, is now, and probably always will be, rooted in tooled knowledge (Price 1984). For example, CERN’s Large Hadron Collider will begin operation in 2006 while the recently upgraded Fermi National Accelerator Lab’s “Tevatron” is already sensing nature at levels beyond the sensitivity of previous instruments. The ‘Canadian Light Source’ synchrotron at the University of Saskatchewan is an example of increasingly common sensor/tool crossovers serving both research science and industry. These are ‘Big Science’. The size and complexity of such instruments, the range and diversity of knowledge embodied and costs associated with their design, construction and operation may, as suggested by Fuller, limit a future ‘scientific revolution’ in physics (Fuller 1992, 252) but without doubt, they impose a strong path dependency on the road to future knowledge (Rosenberg 1994, 1-6).
6.41 It has also been argued that new sub-disciplines, i.e., new categories of knowledge, within the natural sciences and related technological disciplines emerge in response to new instruments (Price 1984). [C] This conclusion is reinforced by Rosenberg’s findings about the interdisciplinary impact of scientific instruments in bringing together scientists from different disciplines and mitigating the incommensurability problem (Rosenberg 1994, 156). [D]
6.42 Beyond the knowledge embodied in scientific sensors and the new knowledge they generate, their social and metaphysical importance lies in the fact that they generate consistent objective evidence about the state of the physical world. Such evidence or measurement is objective in the sense that its collection is not mediated by a human subject. Instruments extend the human senses beyond the subjectivity of the individual observer. Once calibrated and set in motion, a clock – atomic or otherwise – will tick at a constant rate until its energy source is exhausted. Again, such measurement is ideally achieved without mediation by a human subject.
6.43 In this regard it is important to note that sensors also pattern the modern way of life. The simple household thermometer is an example. It tells us when we have a fever and when to seek medical intervention. In turn, a medical thermometer is used to monitor the progress of such intervention (Shapin 1995, 306-307). [E] Put another way:
By encapsulating knowledge in our measuring instruments, these methods minimize the role of human reflection in judgment. They offer a kind of “push-button objectivity” where we trust a device and not human judgment. How many people check their arithmetic calculations with an electronic calculator?
... Putting our faith in “the objectivity” of machines instead of human analysis and judgment has ramifications
far and wide. It is a qualitatively different experience to give birth with an array of electronic monitors. It is a qualitatively different experience to teach when student evaluations – “customer satisfaction survey instruments” - are used to evaluate one’s teaching. It is a qualitatively different experience to make steel “by the numbers,” the numbers being provided by analytical instrumentation. (Baird 2004, 19)
ii) Tools
6.44 If sensors extend the human senses then tools extend the human grasp. They can be considered extensions of our own bodies “forming part of ourselves, the operating persons. We pour ourselves into them and assimilate them as parts of our own existence” (M. Polanyi 1962a, 59). Tools are the means by which humanity animates nature. They move and change nature to suit human purposes and ends. Empirically, before art, culture or language, there was tool making. Tools provide prima facie evidence of the arrival of our species: artifacts left by ancestors some two and a half million years ago (Schuster 1997).
6.45 Using its opposable thumb, humanity reached out to shape the material world to compensate for its elemental frailty – no great size, no claws or talons and tiny canine teeth. To eat and survive predation, the human brain reached out with finger-thumb coordination to grasp and shape parts of the world into tools with which to then manipulate other parts, e.g., to kill game or plant seeds. It appears, from the fossil record, that the opposable thumb preceded, and in a path-dependent manner contributed to, the subsequent and extraordinarily rapid evolutionary growth and development of the human brain itself.
6.46 In this regard, the word ‘concept’ derives from the Latin concipere, ‘to conceive’, which in turn derives from capere, ‘to take’, and, as I understand it, colloquially meant ‘to grasp firmly with the hand’ or, in Sicilian, ‘to steal’. Thus a concept is a grasping and manipulation of the world – inner or outer – using mental tools, the evolutionary descendants of the finger and thumb exercises of prehistoric humanity.
6.47 As noted, matter is tooled to extend the human grasp of the physical world in order to shape and mold it to serve human purposes. In this sense tools have an in-built aim or purpose, i.e., they are teleological (Layton 1988, 90-91). We recognize a tool by its purpose (M. Polanyi 1962a, 56). [F] Or, put another way, a tool is created when “a function couples purpose with the crafting of a phenomenon. A function is a purposeful phenomenon” (Baird 2004, 123).
6.48 The teleological nature of tooled knowledge is atavistic, an epistemological throwback to a time before the Scientific Revolution when medieval animism ruled, i.e., when objects and natural phenomena were believed possessed of purpose. This was effectively displaced by the mechanistic causality of the initial Scientific Revolution of the mid-17th century which provided a “description of reality in terms of a world of precision, free of all considerations based upon value-concepts, such as perfection, harmony, meaning, and aim” (Layton 1988, 90). While this
displacement is appropriate for understanding the natural world, it is inappropriate in the world of human-made things, that is, in “the sciences of the artificial” (Layton 1988, 91).
6.49 Purpose and value are inherent in a tool. It is designed to do a job; it is not valued in-and-of-itself, like a work of art, but rather for what and how well it can do that job. The knowledge required to make a tool becomes embedded in it, i.e., it becomes tooled knowledge. For example, if it is intended to do a job in the weightlessness of outer space then its shape, size and tolerances will be very different than if designed to do the same job under conditions of terrestrial gravity or the enormous pressures of the ocean’s depths.
Material agency is revealed in our mechanical contrivances… Much as we control concepts through the exercise of our literary skills, we control material agency through the exercise of our making skills. (Baird 2004, 47)
6.50 Tools are located on the production-side of the economic equation. They are intermediate goods used to produce final goods and services that are purchased by consumers (excepting the handyman). In this sense, they are utilitarian in that they are valued for what they can do, not for what they are in-and-of-themselves.
6.51 A final distinction can be drawn between specific purpose and general purpose tools, or what David calls ‘general purpose engines’ (David 1990). A specific purpose tool has but one primary purpose, e.g., a hammer or a drill press. A general purpose engine is one that has multiple applications and which “give rise to network externality effects of various kinds, and so make issues of compatibility standardization important for business strategy and public policy” (David 1990, 356). Modern general purpose engines also generate “techno-economic regimes” involving a web of related installations and services. Such is the case with the internal combustion engine. For example, if embodied in an automobile it requires manufacturing plants, refineries, service stations, parking lots, car dealerships, roads, insurance, etc. In temporal succession, general purpose engines include the printing press, steam engine, electric dynamo, internal combustion engine, radio-television, the computer and, arguably, genomics.
6.52 Such techno-economic regimes display path dependency. Specifically, once introduced, all subsequent additions, changes and/or improvements to a general purpose engine must conform to existing standards. The example of the 110-volt versus 220-volt current used in North America and Europe, respectively, serves as a case in point. Any electric appliance – new or old – must be tooled to operate using the appropriate current. Otherwise it will not function.
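By way of illustration only (a hypothetical sketch in Python; the regions, voltages and function name are my own assumptions, not drawn from David or Alder), such lock-in can be expressed as a simple compatibility test that an appliance either passes or fails:

    # An appliance built to one regional voltage standard will not function under
    # another. The regions and voltages are illustrative simplifications.
    REGIONAL_STANDARD_VOLTS = {"North America": 110, "Europe": 220}

    def appliance_functions(design_volts: int, region: str) -> bool:
        """True only if the design voltage matches the regional standard."""
        return REGIONAL_STANDARD_VOLTS.get(region) == design_volts

    print(appliance_functions(110, "North America"))   # True
    print(appliance_functions(110, "Europe"))          # False: locked out by the standard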
iii) Toys
6.53 If sensors are for measuring and tools are for manipulating then toys are for pleasure. Sensors and tools are located on the production-side of the economic equation. They serve as inputs in the production of final goods and services. In the case of sensors, monitoring information may be used either as an input to the production of knowledge or the production of other goods and services. Toys are final goods and services. They are appreciated for
their own sake, not for any contribution to the production of other things. In this sense, toys are non-utilitarian, pleasure-giving devices. This includes the pleasure of learning, i.e., knowledge as a final consumption good. It also includes the aesthetic experience of works of art. They are appreciated for their own sake; they are physical artifacts that embody the knowledge of the artist in making an artwork ‘work’. I am, however, compelled to use the word ‘toy’ because there appears to be no word in the English language denoting a work valued in-and-of-itself with no other purpose or utilitarian value. One plays with a toy; one works with a tool.
6.54 If, according to Bentham, pleasure is the only objective of life, then tooled knowledge, like tacit and codified knowledge, must reflect the full spectrum of human wants, needs and desires subject to cultural, legal and financial constraints. Aesthetic distancing, morality and scientific objectivity are not epistemological constraints in economics. As toys, tooled knowledge has extended the human playpen to the globe and beyond; it has extended our sense of time and place beyond the dreams of previous generations.
6.55 An instrument, as a physical artifact, must be activated or otherwise used by a human operator if it is to fulfill its function. Operation of an instrument – sensor, tool or toy – is generally associated with tacit and/or codified knowledge in the form of computer and genomic programs, mathematics, standards and techniques. In summary introduction, computer programs are machine-readable code used to operate most modern instruments – sensors, tools and toys. Genomic programs are molecule-readable code intended to analyze or synthesize biological compounds and living organisms. Standards are codified knowledge physically designed into an instrument defining its operational properties, e.g., a 110 or 220 volt electric razor. Mathematics is the language in which standards are set and in which most instruments are usually calibrated. Techniques are tacit and/or codified knowledge defining the manner of use and application of an instrument to attain its intended purpose.
6.56 Soft-tooled knowledge is tied to hardware. In effect, the software has no purpose and the hardware has no function without the other. Soft-tooled knowledge exists on both sides of the economic equation – consumption and production.
i) Computer & Genomic Programs
6.57 The purpose of tooled knowledge is manipulation of the natural world. A computer program, while codified and fixed in a communications medium, is intended to be decoded by a machine not by a Mind. It is intended to manipulate the flow of electrons in a circuit. In turn, such circuits may activate other machines and/or machine parts, e.g., industrial robots in steel mills, auto plants and other fabricating industries. The distinction between ‘machine readable’ and ‘human readable’ forms of expression fuelled the 1970s debate about software copyright (Keyes & Brunet 1977). Recognition of software copyright in 1988 represented a break with a long legal tradition restricting copyright to ‘artistic works’ (Chartrand
1997a). For my purposes, this distinguishes computer programs as soft-tooled, i.e., they are machine-readable rather than codified knowledge which is human-readable.
6.58 Similarly, a genetic program, while codified and fixed in a communications medium, is intended to be decoded by molecules, not by a Mind. It is intended to manipulate the chemical bonding of atoms and molecules to analyze or synthesize biological compounds and living organisms with intended or designed characteristics. Such code is read by a rapidly increasing range of scientific instruments or molecular technologies (Hood 2002).
6.59 As with computer program copyright, legal questions are arising about genetic program copyright. There are two levels of concern. First, copyright logically adheres to genomic databases and other documentation - hard-copy, electronic or fixed in any future matrix. Second, copyright may or may not adhere to gene segments themselves. The question in law appears to be originality. Naturally occurring sequences, according to some, are ‘facts of nature’ and hence copyright cannot adhere. In the case of ‘original’ sequences, however, i.e., those created through human ingenuity - a.k.a. artificial - there appears to be no reason for copyright not to adhere, as with computer software.
6.60 Genomic programs, however, involve not just sensors and tools but also toys. In the fine arts, one author - David Lindsay (Lindsay 1997) - has tried to copyright his own DNA with the U.S. Copyright Office (without success) and mounted a web page: ‘The Genome Copyright Project’. Since his initial effort in 1997, a private firm - the DNA Copyright Institute - has appeared on the World Wide Web (DNA Copyright Institute 2001). It claims to: “provide a scientific and legal forum for discussion and research, as well as access to valid DNA Profiles, among other Services, as a potential legal tool for deterrence and resolution of situations where there is suspected DNA theft and misappropriation.”
6.61 Steve Tomasula speculatively writes about the rabbit Alba, the first mammal genetically engineered as a work of art, in “Genetic Arts and the Aesthetics of Biology” (Tomasula 2002). He compares incipient gene artists with Marcel Duchamp (1887-1968). While the above remain speculative, the fact is that Mike Manwaring, a graduate student at the University of Utah, created the first piece of genetic art in 2002: a version of the Olympic Rings entitled “the living rings” made from nerve cells (BBC News On-Line, January 15, 2002). And at least one geneticist, Willem Stemmer, vice president for research and development at Maxygen, is considering transposing genomic code into music to create ‘DNA ditties’ and thereby gain copyright protection (Fountain 2002).
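How such a transposition might work can be sketched in a few lines of Python. This is an illustrative assumption only: the mapping of the four bases to four pitches and the function name are mine, not Stemmer’s actual scheme.

    # Transpose a DNA sequence into (hypothetical) musical notes. The mapping of
    # the four bases to four pitches is an arbitrary assumption for illustration.
    BASE_TO_NOTE = {"A": "A4", "C": "C4", "G": "G4", "T": "E4"}

    def dna_to_ditty(sequence: str) -> list:
        """Return the note names corresponding to each base in the sequence."""
        return [BASE_TO_NOTE[base] for base in sequence.upper() if base in BASE_TO_NOTE]

    print(dna_to_ditty("GATTACA"))   # ['G4', 'A4', 'E4', 'E4', 'A4', 'C4', 'A4']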
ii) Mathematics
6.62 The Pythagorean concept of a cognate relationship between mathematics and the physical world is, perhaps, our single most important inheritance from the ancient world as it affects the material well-being of contemporary society. It finds its fullest expression in ‘the calculus’, i.e., the mathematics of motion and change through time. The following is a short history of its development. As will be seen, the ability of a knowledge domain and/or its component
disciplines to achieve mathematical articulation has tended to raise it from the epistemological status of a Mechanical to a Liberal Art.
6.63 If the computer represents a ‘general purpose engine’ (David 1990) then mathematics is a general purpose concept, i.e., a mental general purpose tool. It serves as the most effective interface yet discovered (or invented) between mind and matter, between user and instrument, between human readable and machine-readable forms of expression. In this regard, it is important to remember that music was the only ‘fine art’ admitted to the classical and medieval Liberal Arts curriculum. Balance, harmony, proportion and resonance are critical mathematical elements that Pythagoras expressed with the music of a string – halves, quarters, thirds, fourths, fifths, etc. All are audible properties of a string. The conceptual metaphor is one employed in a number of disciplines. For example, in cosmology, Jeff Weeks and his team recently explained fluctuations in readings about the physical dimensions of the universe by comparing them with the sound waves of musical harmonics (Roberts 2004).
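As an aside, the arithmetic of these string divisions can be sketched numerically (a minimal illustration in Python, assuming an idealized string of uniform tension and density whose frequency varies inversely with its sounding length; the 220 Hz fundamental is an arbitrary choice):

    # An idealized string: frequency is inversely proportional to the sounding
    # length (constant tension and mass per unit length assumed).
    FUNDAMENTAL_HZ = 220.0   # illustrative open-string pitch (an assumption)

    # Fraction of the string left free to vibrate -> interval produced.
    divisions = {
        "whole string": 1.0,     # unison (1:1)
        "three-quarters": 3/4,   # perfect fourth (4:3)
        "two-thirds": 2/3,       # perfect fifth (3:2)
        "half": 1/2,             # octave (2:1)
    }

    for name, length in divisions.items():
        frequency = FUNDAMENTAL_HZ / length
        print(f"{name:>15}: {frequency:7.2f} Hz  (ratio {1/length:.3f} : 1)")

Halving the string doubles the frequency (the octave, 2:1); two-thirds of the string yields the fifth (3:2); three-quarters, the fourth (4:3).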
6.64 For the ancient Greeks (and the humanist Renaissance), balance, harmony, proportion and resonance were everything. They capture the ancient Greek meaning of kosmos – the right placing of the multiple parts of the world (Hillman 1981, 28). They are inherent in the music of the spheres, i.e., astronomy, and in the design of cities (Steiner, 1976). [G]
6.65 Similarly, in temples and public buildings, the ancient Greeks used the proportions of the human form for their columns. According to Marcus Vitruvius, writing in the 1st century before the Common Era, the Doric column represented the proportions of a man; the Ionic column, those of a mature woman; and the Corinthian column, those of a young maiden (Vitruvius 1960, 103-104). Thus in ancient Greece (and during the Renaissance): ‘man was the measure of all things’. The human body and form provided the standard of measurement, e.g., how many ‘hands’ high is a horse?
6.66 But beyond the human form lay the universal forms of the circle, square, triangle and variations on their themes, e.g., the parabola. Captured in Euclid’s Elements, two-dimensional space was reduced to the mathematics of these universal forms – their balance, harmony, proportion and resonance. Archimedes moved the cognitive relationship between numbers and nature into the three-dimensional world of volume. Measuring different forms of space was resolved by the Greeks through ‘exhaustion’, whereby the area to be measured is approximated by simpler figures that account for successively more and more of the required space. In astronomy this method was extended to the celestial motion of the stars and planets. In effect, motion to the ancient Greeks was geometric exhaustion applied, step by step, through time. Ancient Greek mathematics was thus essentially concerned with spatial relationships, finding its finest expression in Euclidean and Archimedean geometry and the astronomy of Ptolemy.
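A modern numerical sketch of exhaustion (my illustration, not the Greek procedure itself) makes the idea concrete: the area of a unit circle is approached from below by inscribed regular polygons with ever more sides, ninety-six being the figure at which Archimedes stopped.

    import math

    # Approximate the area of a unit circle from below by inscribed regular
    # polygons: the area of an n-gon inscribed in a circle of radius r is
    # (1/2) * n * r^2 * sin(2*pi/n).
    def inscribed_polygon_area(sides: int, radius: float = 1.0) -> float:
        return 0.5 * sides * radius ** 2 * math.sin(2 * math.pi / sides)

    for sides in (6, 12, 24, 48, 96):   # 96 sides was Archimedes' stopping point
        print(f"{sides:3d}-gon: {inscribed_polygon_area(sides):.6f}   (pi = {math.pi:.6f})")

With ninety-six sides the polygon already comes within about 0.07 percent of the circle’s area.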
6.67 After the fall of Rome, the works of the ancient Greek mathematicians were, for the most part, lost to the West. Only gradually were they recovered from Byzantine and Arab sources. In the interim, medieval guilds held a monopoly of tooled knowledge,
or the ‘mysteries’ (Houghton 1941, 35), and operated without mathematical theory, applying ‘rules of thumb’ and ‘magic numbers’. Even after recovery of the Greek and Roman classics, guild masters and apprentices worked in the vernacular and did not have access to the ‘theoretical’ works, in Greek and Latin, of Archimedes, Euclid, Ptolemy or Vitruvius. The breakdown of the guilds and the introduction of craft experimentation at the end of the medieval period, however, led to new forms and types of mathematics and instruments – scientific and musical – all calibrated to provide a mathematical reading of physical reality (Zilsel 1945).
6.68 In the early 15th century, the mathematical laws of perspective were discovered (or rediscovered) by the architect Filippo Brunelleschi (1377-1446). In accounting, the codification of the double-entry ledger by Luca Pacioli (c. 1445-1517) facilitated the commercial revolution first in the Mediterranean and then around the world. The need for improved navigation led to an intensive search for new methods and instruments to calculate longitude. The Royal Observatory was established in Greenwich in 1675 specifically for this purpose. It was not, however, until 1761 that the H4 ‘watch’ created by John Harrison, “a working-class joiner” (BBC News Online, August 3, 2003), proved sufficiently accurate and sturdy, under the stresses of 18th century sea travel, to permit reliable calculation of longitude. The spirit of playful fascination with new instruments and devices in the 17th and 18th centuries, especially those intended to measure longitude, is captured in Umberto Eco’s novel The Island of the Day Before (Eco 1994).
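The arithmetic that a reliable sea-going clock made possible is itself simple and can be sketched as follows (a simplified model that ignores the equation of time and other corrections; the example times are hypothetical):

    # The Earth turns 360 degrees in 24 hours, i.e., 15 degrees of longitude per
    # hour. Comparing local solar noon with the time kept by a chronometer still
    # set to the reference port (e.g., Greenwich) gives longitude directly.
    DEGREES_PER_HOUR = 360.0 / 24.0   # 15 degrees per hour

    def longitude_degrees(local_hours: float, reference_hours: float) -> float:
        """Negative values are west of the reference meridian (simplified model)."""
        return (local_hours - reference_hours) * DEGREES_PER_HOUR

    # Hypothetical example: local noon occurs while the chronometer reads 16:00
    # Greenwich time, so the ship lies 4 hours, i.e., 60 degrees, west of Greenwich.
    print(longitude_degrees(12.0, 16.0))   # -60.0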
6.69 Beyond the astronomical mathematics of Kepler and Galileo (the latter taking the telescope, invented by Hans Lipperhey in 1608, and changing the way we see the universe), it was cannon fire that provided the terrestrial impetus for development of a true mathematics of motion. In fact, the mathematics of cannon fire (and its patronage) provided the opportunity for many of the experiments of Galileo (Hill 1988) which are generally recognized as the beginning of the Scientific Revolution. Mechanics began to drive mathematics.
6.70 In the 1670s, what was previously known as ‘the geometry of infinitesimals’ achieved a breakthrough with the invention of ‘the calculus’, independently by Newton (1643-1727) and Leibniz (1646-1716). Calculus provided a true mathematics of motion – changing spatial position through time expressed in algebraic rather than geometric terms. Arguably, the invention of the calculus was possible because of the prior invention of the mechanical clock (Layton 1974, 36). This breakthrough was then extended by Newton in his three laws of motion which arguably served as the foundation stone of modern natural science. By the middle of the 18th century, in France, ‘scientific’ engineering emerged with its requirement for formal training in calculus (Kranakis 1989, 18).
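The core idea, recovering motion (velocity) from a record of changing position, can be sketched numerically. A minimal illustration, using Galileo's law of falling bodies and a finite-difference stand-in for the derivative; the time step and sample times are arbitrary:

    # Galileo's law of fall: position s(t) = (1/2) * g * t^2. Velocity is the
    # rate of change of position, approximated here by a central difference.
    G = 9.81   # gravitational acceleration in m/s^2

    def position(t: float) -> float:
        return 0.5 * G * t ** 2

    def velocity(t: float, dt: float = 1e-6) -> float:
        """Finite-difference stand-in for the derivative ds/dt."""
        return (position(t + dt) - position(t - dt)) / (2 * dt)

    for t in (1.0, 2.0, 3.0):
        print(f"t = {t:.0f} s: numerical v = {velocity(t):6.3f} m/s, exact g*t = {G * t:6.3f} m/s")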
iii) Standards
6.71 A quarter of a century before Adam Smith published his analysis of the division and specialization of labour in The Wealth of Nations (Smith 1776), the French military changed its weapons purchasing policy, imposing strict standards for the production of
parts and final weapons systems, e.g., artillery (Alder 1998). Standards were codified into mechanical drawings and mathematically defined tolerances subject to various physical forms of testing. Previously, production was a craft activity with each part and weapon a unique artifact. This change meant that parts became interchangeable, e.g., bayonets. This had a significant impact on the performance of the armies of revolutionary and Napoleonic France (Alder 1998, 536). [H]
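What ‘mathematically defined tolerances’ buy in practice can be sketched as a simple acceptance test (the part, dimensions and limits below are illustrative assumptions, not Alder’s historical values): a part is interchangeable only if every measured dimension falls within the specified limits.

    # A part is interchangeable only if every measured dimension lies within the
    # limits laid down by the standard. All values below are illustrative.
    STANDARD_LIMITS_MM = {
        "socket_diameter": (21.8, 22.2),
        "blade_length": (448.0, 452.0),
    }

    def within_tolerance(measurements: dict) -> bool:
        """True if each measured dimension falls inside its specified limits."""
        return all(low <= measurements[name] <= high
                   for name, (low, high) in STANDARD_LIMITS_MM.items())

    bayonet = {"socket_diameter": 22.05, "blade_length": 449.3}
    print(within_tolerance(bayonet))   # True: the part can replace any other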
6.72 Standardized parts production was the first step towards ‘mass production’. It was followed early in the next century by the introduction, in England, of the first machine tools to guide, and later to replace, a worker’s hand to assure standards in production. The use of such machines led Charles Babbage to extend Smith’s theory of the division and specialization of labour to include payment only for the skill level actually required at each stage of production, thereby encouraging a reduction of skill requirements, i.e., craftspersons could be replaced by semi-skilled labourers (Rosenberg 1994, 32). [I] This is, of course, similar to the ‘de-skilling’ that continues to occur in the natural sciences with the introduction of new instruments, e.g., the directly readable spectrometer (Baird 2004).
6.73 It was not in Europe, however, that the system came to fruition. Arguably due to a shortage of skilled craftsmen and a predominantly low-end ‘mass’ market (rather than an upscale highly differentiated aristocratic one), it was in the United States that the system developed into what became known as ‘the American System’ (Hounshell 1983). In this system, specifications and standards became designed into machines (machine tools) that were, in many cases, simply unknown elsewhere, e.g., in England. Development, in the late 1850s, of the British Enfield rifle is a case in point where initially the idea of interchangeable parts for rifles was considered next to impossible until American machine tools and workers demonstrated how it could be done (Ames & Rosenberg 1968). [J]
6.74 The American System, however, was not restricted to the military. It was extended to most manufacturing industries in the United States including, for example, tableware such as knives and forks (Ames & Rosenberg 1968, 36). [K] When standardized parts production was married to the moving assembly line, innovated by Henry Ford in 1913, the modern system of mass production effectively began. This combination became known as ‘Fordism’ or the “Fordist regime” (David 1990, 356).
6.75 If standardized parts and the assembly line began mass production, it was innovation of “techno-economic regimes formed around general purpose engines” (David 1990, 355) that completed the transformation of traditional into modern life-styles. The steam engine, railway, internal combustion engine, electric generator and computer require standardization not only of internal components but also external connectors (Alder 1998, 537). [L] As previously noted, general purpose engines, once innovated, establish a ‘path dependency’, i.e., standards and specifications established at the onset become ‘locked in’ and all subsequent improvements, innovations or adjustments must comply. In a manner of speaking, the path dependency of general purpose engines corresponds to
‘tradition’ for the medieval craftsman who inherited and was limited by ‘best practices’ established in a distant past.
6.76 The importance of ‘standards’ in the production of stand-alone artifacts and technical networks is recognized in an emerging sub-discipline called metrology (O’Connell 1993). To anticipate discussion of technique, such networks produce what O’Connell calls ‘societies’ or what I call ‘technical subcultures’ including “a society of health care facilities that share the same measure of body composition, a society of laboratories that share the same electrical units, and a society of weapons that share the same electrical and dimensional standards” (O’Connell 1993, 131).
6.77 In this regard, at the international level, engineering standardization began with the International Electrotechnical Commission (IEC) in 1906. The broader based International Federation of the National Standardizing Associations (ISA) was set up in 1926 and, after the Second World War, the International Organization for Standardization (ISO) was established in 1947. Today the ISO has forty distinct fields of standardization ranging from Environment to Image Processing to Domestic Equipment. [M] In most fields, mathematically defined standards are codified and then designed into hard-tooled knowledge to ensure compatibility anywhere in the world (Alder 1998, 537).
iv) Techniques
6.78 The French word ‘technique’ was introduced into English in 1817. Among its several meanings is: “a body of technical methods (as in a craft or in scientific research)” (MWO, technique, n, 2a). Quite simply, such methods involve the effective use and application of hard-tooled knowledge - as sensor, tool or toy. Such use requires acquisition of tacit knowledge about a new instrument, its codification into operating manuals, and then transfer of the instrument to a final user who, in turn, must decode the manual and develop the necessary tacit knowledge to become skillful in its use.
6.79 In a way, hard-tooled knowledge is a nucleating agent around which routinized patterns of human behaviour develop. In the tradition of the ‘old’ Institutional Economics (e.g., Commons 1924, 1934, 1950), a routinized pattern of collective human behaviour is an ‘institution’. In this regard Price has called the instrument/technique relationship an ‘instrumentality’, i.e., the nucleus plus the orbiting behaviours (Price 1984, 15). For my purposes, the instrument is hard-tooled while the methods associated with its use constitute soft-tooled knowledge. They are, in economic terms, ‘tied goods’ like the punch cards required to run an old-style mainframe computer. [N]
6.80 In genomics, Cambrosio & Keating have documented the nucleating role of instruments in their study: “Art, Science, and Magic in the Day-to-Day Use of Hybridoma Technology”. They define scientific technique as an “embedded system of practices”. They highlight how much of technique can only be learned by doing and/or through instruction, i.e., it cannot be fully codified and much remains tacit (Cambrosio & Keating 1988, 258). [O]
6.81 Similarly, Rosenberg writes about “instrument-embodied technique” (Rosenberg 1994, 156). He also notes that shared use of specialized instruments serves “to bring members of different disciplines together” countering the tendency towards incommensurability between scientific disciplines and sub-disciplines (Rosenberg 1994, 156). [P]
6.82 Discussion of technique brings us full circle back to tacit knowledge, back to personal knowledge. In Heidegger’s existential phenomenology the hammer is one with us in action. It becomes transparent as ‘other’, or as Polanyi put it, tools form “part of ourselves, the operating persons. We pour ourselves out into them and assimilate them as parts of our own existence” (M. Polanyi 1962a, 59). [Q] It appears coincidental that Heidegger’s hammer - the basis of contemporary philosophy of technology - is paralleled in the philosophy of science by Polanyi’s hammer and ‘tacit knowledge’ which has become part of the lexicon of the knowledge-based economy (American National Standards Institute and the Global Knowledge Economics Council 2001). Both were German-speaking but I can find no reference that they knew each other or each other’s work. Heidegger published his article ‘The Question Concerning Technology’ in 1949; Polanyi published his first edition of Personal Knowledge in 1958. Heidegger had, however, first mentioned the hammer metaphor in his 1927 book Being and Time (Idhe 1991).
6.83 Tooled knowledge exhibits four characteristics: design, density, fixation and vintage. By way of introduction, design refers to the synthesis of different sub-domains of knowledge, e.g., biology, chemistry and physics, to create an instrument, i.e., a sensor, tool or toy. Density refers to the operational opacity (or transparency) of the resulting instrument. Fixation refers to embedding knowledge into a functioning material matrix. Vintage refers to the temporal coefficient (historical date) at which current knowledge is embedded, fixed or frozen. I will examine each in turn.
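As a purely illustrative device - not a formalization proposed by any source cited here - the four characteristics can be read as the fields of a simple record describing a given instrument. The class, the 0-to-1 density scale and the example values below are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class TooledKnowledge:
    """Illustrative record of the four characteristics of tooled knowledge."""
    design: str      # sub-domains of knowledge synthesized into the instrument
    density: float   # operational opacity: 0.0 (transparent one-off) to 1.0 (black box)
    fixation: str    # the functioning material matrix in which the knowledge is embedded
    vintage: int     # year in which the state of the art was frozen into the matrix

# hypothetical examples spanning the density spectrum discussed below
synchrotron = TooledKnowledge(
    design="physics + electrical + civil engineering",
    density=0.1, fixation="custom accelerator hardware", vintage=1995)
mri_scanner = TooledKnowledge(
    design="physics + electronics + ergonomics",
    density=0.6, fixation="standardized clinical scanner", vintage=2003)
television = TooledKnowledge(
    design="electronics + industrial design",
    density=0.95, fixation="consumer 'black box'", vintage=2004)
```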
i) Design
6.84 The word ‘design’, as a verb, entered the language in the 14th century, meaning: “to create, fashion, execute, or construct according to plan; to have as a purpose” (MWO, design, v, 1). As a noun, it entered the English language in 1588 meaning: “deliberate purposive planning; the arrangement of elements or details in a product or work of art; the creative art of executing aesthetic or functional designs” (MWO, design, n, 1a). Critically, for our immediate purposes, engineers use the word design “in framing membership criteria for the professional grades of engineering societies” (Layton 1974, 37). [R] More generally, however, in Design
we have come to recognize the processes which bring about creative advances in science, the new paradigms as processes of human design, comparable to artistic creation rather than logical induction or deduction which work so well within a valid paradigm... the norms of artistic design (are) “inherent in the specific psychic process, by which a work of art is represented” and thus in the creative act, not in the created object - in the process not the structure. (Jantsch 1975, 81)
6.85 From the dictionary definitions I extract the terms ‘arrangement’ and ‘purpose’ in order to distinguish tooled from codified knowledge. With respect to purpose, both codified and tooled knowledge are extra-somatic, i.e., fixed outside a natural person. The purpose of codified knowledge, however, is transmission of knowledge between natural persons while the purpose of tooled knowledge is measurement and manipulation of the natural world.
6.86 With respect to arrangement, codified knowledge involves manipulating an alphabet, grammar, syntax and vocabulary, i.e., a language (including mathematics) intended to communicate with other natural persons. Arrangement of tooled knowledge, however, involves the coordination of different forms and types of matter and energy to subsequently and artificially manipulate or animate the natural world. This may include synthesizing specific bits of biological, chemical, cultural, economic, electric, electronic, ergonomic, mechanical and/or organizational knowledge into a single working device or instrument. Put another way:
The term “design” covers the mutual employment of the material and the propositional, as well as hybrid forms such as drawings, computer simulations, and material models. However, design must be understood to embrace material knowledge as well as ideational knowledge. The “design paradigm” is the most promising recent development in the epistemology of technology, but it must not lose track of this central insight about design. Thought and design are not restricted to processes conducted in language. (Baird 2004, 149)
6.87 As an example, consider the common electric hand drill. Functionally it makes a hole. Without a drill one can use a simpler tool like a spike. This requires knowledge of materials technology, e.g., balsam won’t work well. One either pounds away or rotates the spike with little control or effect unless one spends a very long time developing the tacit knowledge of how to do so. If instead one mounts a bit and turns a crank handle to drive a hardened, specially shaped shaft (embodied knowledge of gears as well as bits), then the operator can achieve much more control and effect. One has invented the hand drill. If one powers the crank by electricity (knowledge of electric motors), then at the push of a button one hand can achieve more control and effect. If one then computerizes the button, one frees the hands, body and mind of the operator. One has invented a computerized machine tool that embodies knowledge streams of materials technology, mechanics, electricity and computers - all in one tool.
6.88 Layton, quoting Herbert Simon, defines the “sciences of the artificial” as involving synthesis or what I call ‘design’ rather than analysis or what I call ‘reduction’ as in the natural sciences. Furthermore: “the engineer is concerned with how things ought to be - ought to be, that is, in order to attain goals, and to function” (Layton 1988, 90-91). [S]
6.89 Polanyi also recognized the artificial nature of tooled knowledge. He observed that a machine can be smashed but the laws of physics and chemistry continue to operate in its parts. He concluded that: “physics and chemistry cannot reveal the practical principles of design or co-ordination which are the structure of the machine” (M. Polanyi 1970). [T]
6.90 Put another way, in another context, by another author: “technology is about controlling nature through the production of artifacts, and science is about understanding nature through the production of knowledge” (Faulkner 1994, 431). In Aristotle’s Nicomachean Ethics “art is identical with a state of capacity to make, involving the true course of reasoning” (McKeon 1947, 427). The connection between the Arts and tooled knowledge is captured in the aesthetic term elegant, i.e., “ingenious simplicity and effectiveness” (OED, elegant, a, 5a). This term, of course, is also applied in mathematics. Put another way: “design involves a structure or pattern, a particular combination of details or component parts, and it is precisely the gestalt or pattern that is of the essence for the designer” (Layton 1974, 37). [U]
6.91 This gestalt is generally expressed in visual rather than verbal terms. In fact, the earliest expression of engineering knowledge in the West takes the form of design portfolios and the “natural units of study of engineering design resemble the iconographic themes of the art historian” (Layton 1976, 698). [V] Even in the natural sciences, this is true. Quoting Ackerman, Idhe observes:
Visual thinking and visual metaphors have undoubtedly influenced scientific theorizing and even the notation of scientific fact, a point likely to be lost on philosophers who regard the products of science as a body of statements, even of things. Could modern scientific world be at its current peak of development without visual presentations and reproductions of photographs, x-rays, chromatographs, and so forth? ... The answer seems clearly in the negative. (Idhe 1991, 93)
6.92 There is, however, a Western cultural bias towards ‘the Word’ and away from ‘the image’ – graven or otherwise (Chartrand 1992a). This has contributed to the epistemological suppression of tooled knowledge relative to ‘scientific’ knowledge, which is usually presented in a documentary format (the article or book), while tooled knowledge appears first as an artifact that must then be transliterated into written descriptions that “savor of the antiquarian” (Price 1965, 565-566). [W]
6.93 Another connexion between tooled knowledge and the Arts is found in the expression “from art to science” (Cambrosio & Keating 1988, 256). This transition has been documented in biotechnology (Hood 2002) and engineering (Schön 1983) with respect to experimental techniques or protocols. Such protocols generally begin as the unique tacit knowledge of a single researcher. This is called ‘magic’ by Cambrosio & Keating. Over time, this tacit knowledge becomes embodied in an experimental piece of hardware, i.e., tooled knowledge. This stage they call ‘art’ because operation of the prototype requires a high level of tacit knowledge or skill. In turn, the prototype may be commercially transformed into a standardized instrument requiring less skill of its operator who, in effect, is transformed from a scientist into a technician (Rosenberg 1994, 257-258). [X] This, according to Cambrosio & Keating, is the ‘science’ stage when the now standardized instrument can be routinely used in the ongoing search for new knowledge. The protocol, however, has effectively become embodied in a standardized, calibrated scientific instrument. Put another way:
In the language of technology studies, these instruments “de-skill” the job of making these measurements. They do this by encapsulating in the instrument the skills previously employed by the analyst or their functional equivalents.” (Baird 2004, 69)
6.94 In summary, design refers to the synthesis of different forms of knowledge – cultural, economic, organizational as well as scientific. Tooled knowledge is thus synthetic and integrative rather than analytic and reductive. Through design it enfolds or integrates many different forms of knowledge, including economic knowledge, into an efficient instrument (technically and economically) that works and performs its function. In this sense, tooled knowledge achieves what the ancient Greeks called kosmos: “the right placing of the multiple parts of the world” (Hillman 1981, 28). When this is achieved the world is in harmony; the world works. In more prosaic terms:
Development of the design is coordinated and iterative, and the end product succeeds in integrating all of the necessary knowledge. (Faulkner 1994, 432)
ii) Density
6.95 Among its several meanings, the word density refers to “the degree of opacity of a translucent medium” (MWO, density, n, 3a). With respect to tooled knowledge, density refers to the operational opacity (or transparency) of an instrument. The more tooled knowledge embodied in an artifact, relative to its function, the denser, the more opaque, the instrument becomes, i.e., it requires less and less tacit or codified knowledge to operate. In other words, the denser an instrument, the more ‘user friendly’ it becomes.
6.96 At one extreme are ‘one-offs’, customized instruments common in the natural sciences. A particle accelerator or synchrotron is unique. No two are alike, and the tacit and codified knowledge required to maintain and operate one is large. It requires a great deal of what is called ‘local knowledge’ (Alder 1998, 537; Faulkner 1994, 445). In this sense it is very transparent, requiring constant looking inside and tinkering to make sure it functions correctly. Its operation involves the “craft of experimental science” (Price 1984).
6.97 At the other extreme is the consumer ‘black box’ – push the button and it operates itself. The leading edge of black box tooled knowledge, today, is voice activated computer control. Just a verbal command and the tooled knowledge works. The black box hides its ‘thing-ness’ (Baird 2004, 146). [Y]
6.98 Between the extremes are many shades of grey. Standardized research instruments like scanning electron microscopes or MRI scanners require highly trained technicians to operate. They can do so, however, without the detailed tacit and codified knowledge available to an experimental scientist. This again involves a ‘de-skilling’ of the operator and a transfer of knowledge into the instrument (Baird 2004, 69).
6.99 The process of standardizing experimental scientific instruments by replacing manual with automatic control is well documented (Cambrosio & Keating 1988; Hood 2002; Price 1984; Rosenberg 1994). It involves conversion of a transparent scientific sensor into an opaque industrial tool that, in turn, may become a black box toy in final consumption, e.g., the cathode ray tube as TV.
6.100 The impact of soft-tooled knowledge in this process, especially standardization, cannot be overstated:
For all the diversity of our consumer cornucopia, the banal artifacts of the world economy can be said to be more and more impersonal, in the sense that they are increasingly defined with reference to publicly agreed-upon standards and explicit knowledge which resides at the highest level of organizations, rather than upon local and tacit knowledge that is the personal property of skilled individuals. (Alder 1998, 537)
iii) Fixation
6.101 Tooled knowledge is fixed in a functioning material matrix as a sensor, tool or toy. Fixation is a condition for intellectual property rights such as patents. I will discuss the nature of such intellectual property rights below in 8.0 Rights to Know. For now it is sufficient to ask whether tooled knowledge can be extracted from such a matrix. The answer is yes: through ‘reverse engineering’. In effect, “engineers learn the state of the art not just by reading printed publications, going to technical conferences, and working on projects for their firms, but also by reverse engineering others’ products” (Samuelson & Scotchmer 2002, 70-71). [Z]
iv) Vintage
6.102 Vintage refers to the temporal coefficient (historical date or time) at which existing knowledge is embedded, embodied or tooled into a matrix. Unlike design, density and fixation, vintage has been the subject of formal economic investigation ever since Robert Solow (1960) considered the distribution of capital equipment, including new and old technologies, and why different vintages coexist. Subsequently, Solow introduced the concept of ‘embodied technological change’ (1962).
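A worked rendering may help. The following is a standard textbook statement of the vintage/embodiment idea rather than Solow’s original notation, and is offered only as a sketch: investment of each date carries the technology of that date, so that

```latex
% Effective ("quality-adjusted") capital aggregates surviving vintages,
% each weighted by the technology embodied at its installation date v:
J(t) \;=\; \int_{-\infty}^{t} e^{\lambda v}\, e^{-\delta (t - v)}\, I(v)\, dv,
\qquad
Y(t) \;=\; F\bigl(J(t),\, L(t)\bigr),
```

where I(v) is gross investment of vintage v, λ the rate of embodied technical progress, δ the rate of depreciation, J(t) the effective capital stock and L(t) labour. Vintages of different dates coexist within J(t), but each carries the weight e^{λv} fixed at installation; raising that weight afterwards requires new investment, which is the economic sense in which tooled knowledge is ‘frozen’ at its vintage.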
6.103 Like codified knowledge where the hand having written moves on, tooled knowledge exists at a given moment of time – a given state of the art. Once embedded, it is ‘frozen’ (Boulding 1966, 6) and subject to update with significantly more effort and cost than revising a written document. Vintage thus refers to the state of the art current when knowledge is tooled into matter. Furthermore, and excepting defense and the natural sciences, it is subject to economic constraints (M. Polanyi 1960-61, 404). [AA]
6.104 One further vintage distinction can be drawn: technical versus functional obsolescence. On the one hand, a given product or process embodying tooled knowledge may be displaced by one that is faster and/or more cost-effective. The old is now technically obsolete. It can continue, however, to perform the same or similar function. On the other hand, a given product or process may be displaced because the function it performs is no longer required (for whatever reasons). The old is now functionally obsolete. An example is hydrogen re-fuelling stations for zeppelins.
6.105 Knowledge takes three forms – personal & tacit, codified and tooled. Knowledge is fixed in a person as neuronal bundles of memories and as the trained reflexes of nerves and muscles. It is fixed as code in a medium of communication or matrix that allows knowledge to cross time and space until another person reads or decodes it and thereby adds it to his or her personal & tacit knowledge. Knowledge is tooled into a functioning physical matrix as an instrument such as a sensor, tool or toy, or more generally, as ‘a work’ of technological intelligence. The knowledge tooled into an extra-somatic matrix remains a meaningless artifact until someone makes it work by pushing the right buttons and using it in the right way. This requires, of course, personal & tacit knowledge that comes with practice and experience. Thus all knowledge is ultimately personal and tacit.
6.106 These three forms of knowledge constitute the primary knowledge triad as form. Abstract knowledge qubits – the etymological WIT, psychological PSI, epistemological IMP and pedagogic PED – are initially reified as one or more of these three forms – personal & tacit, codified and/or tooled knowledge. In the next chapter, I will demonstrate that this primary triad spawns secondary and tertiary triads of knowledge as the inputs and outputs of a knowledge-based economy.
6.0 Form & Fixation
[A] “But, more than having become merely another overly vague bit of fashionable economic jargon, ‘tacit knowledge’ now is an increasingly ‘loaded’ buzzword, freighted with both methodological implications for micro-economic theory in general, and policy significance for the economics of science and technology, innovation, and economic growth. Indeed, references to ‘tacitness’ have become a platform used by some economists to launch fresh attacks upon national policies of public subsidization for R&D activities, and equally by other economists to construct novel rationales for governmental funding of science and engineering research and training programs.” (Cowan, David & Foray 2000, 212-213)
[B] “The dramatic new evidence that altered completely the nature of cosmology did so, not by any intellectual prowess on Galileo’s part, but by revealing new evidence, the existence of which had never been suspected. The telescope was not devised to seek such evidence, and it was not used primarily to gain more. Its purpose was to inject each new telescope owner into this world of what can only be called “artificial revelation”. The term is not used lightly. Galileo was not so conceited as to think that he was brighter than all previous authorities; he knew that he had been presented with decisive new evidence of the structure of nature.” (Price 1984, 9)
[C] “Such is the power of instrumentalities, old and new, that they are probably also the chief agent for the sociological and substantive disaggregation of the chief scientific and technological disciplines into their constituent subdisciplines and invisible colleges. Scientists and engineers seem to be bound together in their invisible colleges, not so much by any communality of their paradigms, ways of thought, and cognitive training, as by a guild-like communality of the tools and instrumentalities that they use in their work.” (Price 1984, 15)
[D] “It appears also that instrumentation requirements sometimes serve as a powerful device for bringing together research scientists from separate disciplines. X-ray crystallography played such a role in the development of molecular biology, precisely because it is, in effect, an instrument-embodied technique.” (Rosenberg 1994, 156)
[E] “Think, for example, of the physical knowledge embodied in a thermometer. To contest that knowledge would be to fight on many fronts against many institutionalized activities that depend upon treating the thermometer as a “black box.” Intercalating science or technology into larger and larger networks of action is what makes them durable. When all the elements in a network act together to protect an item of knowledge, then that knowledge is strong and we come to call it scientific. The central modern scientific phenomenon to which attention is directed is thus metrology - the development of standards and their circulation around the world.” (Shapin 1995, 306-307)
[F] “Take for example the identification of a thing as a tool. It implies that a useful purpose can be achieved by handling the thing as an instrument for that purpose. I cannot identify the thing as a tool if I do not know what it is for - or if knowing its supposed purpose, I believe it to be useless for that purpose.” (Polanyi 1962a, 56)
[G] “The polis is the place of art... The magus, the poet who, like Orpheus and Arion, is also a supreme sage, can make stones of music. One version of the myth has it that the walls of Thebes were built by songs, the poet's voice and harmonious learning summoning brute matter into stately civic forum. The implicit metaphors are far reaching: the “numbers” of music and of poetry are cognate with the proportionate use and division of matter and space; the poem and the built city are exemplars both of the outward, living shapes of reason. And only in the city can the poet, the dramatist, the architect find an audience sufficiently compact, sufficiently informed to yield him adequate echo. Etymology preserves this link between “public”, in the sense of the literary or theatrical public, and the “republic” meaning the assembly in the space and governance of the city.” (Steiner 1976)
[H] “… the fact that the soldier could [now] choose any bayonet and still fit it on to the muzzle of his gun - even though the two pieces of metal had been manufactured several hundred kilometres apart - testifies to the fact that technical knowledge had been taken out of the domain of private and local knowledge, and moved up to a more general level of organization… It is no accident that these mass interchangeable bayonets proved eminently suitable for the mass army fielded by the French régime during the Revolutionary wars.” (Alder 1998, 536)
[I] “A central point for Babbage is that an extensive division of labor is itself an essential prerequisite to technical change. This is so for two related reasons. First of all, technical improvements are not generally dependent upon a few rarely gifted individuals, although the more “beautiful combinations” are indeed the work of the occasional genius (p. 260). Rather, and secondly, inventive activity needs to be seen as a consequence as well as a cause of the division of labor. This is so because “The arts of contriving, of drawing, and of executing, do not usually reside in their greatest perfection in one individual; and in this, as in other arts, the division of labor must be applied” (p. 266; emphasis Babbage’s).” (Rosenberg 1994, 32)
[J] “Before 1854, British gunmaking was concentrated in a large but complicated structure of handicraft firms, mainly located in Birmingham, and producing firearms to individual order or in very small batches. American gunmakers were at this time engaged in mass production of both civilian and military weapons. These weapons had interchangeable parts, a fact which the British found to be almost unbelievable. This production required a number of machines which were virtually unknown in Britain before the hearings. The Enfield Arsenal was equipped almost entirely with machinery of American design and manufacture; and American workers were brought to England to introduce the machines and to train English workers in their use.” (Ames & Rosenberg 1968, 827-828)
[K] “Nineteenth-century English observers frequently noted that American products were designed to accommodate the needs of the machine rather than the user. Lloyd, for example, noted of the American cutlery trade that “where mechanical devices cannot be adjusted to the production of the traditional product, the product must be modified to the demands of the machine. Hence, the standard American table-knife is a rigid, metal shape, handle and blade forged in one piece, the whole being finished by electroplating - an implement eminently suited to factory production.” (Ames & Rosenberg 1968, 36)
[L] “Today, artifacts travel with increasing ease over much of the globe. Transformers adapt personal computers to local currents; bicycle parts are sized in metric dimensions (even in the USA!); quantitative standards for copper, wheat and air pollution are monitored by international agencies; and digital high-definition television is coming. In factories from Thailand to Tennessee to the Czech Republic, digitally controlled machine tools can be programmed (and reprogrammed) to produce functionally identical artifacts in short production runs. For all the diversity of our consumer cornucopia, the banal artifacts of the world economy can be said to be more and more impersonal, in the sense that they are increasingly defined with reference to publicly agreed-upon standards and explicit knowledge which resides at the highest level of organizations, rather than upon local and tacit knowledge that is the personal property of skilled individuals. This is true even though the heyday of Fordist mass production is said to be over. Flexible production depends on standards of production as much as, perhaps even more than, Fordism: in part because shared values and common standards enable congeries of independent producers to pool their efforts and simultaneously compete against one another.” (Alder 1998, 537)
[M] Current ISO Categories
01 Generalities. Terminology. Standardization. Documentation
03 Sociology. Services. Company organization and management. Administration. Transport
07 Mathematics. Natural Sciences
11 Health care technology
13 Environment. Health protection. Safety
17 Metrology and measurement. Physical phenomena
19 Testing
Analytical chemistry, see 71.040
21 Mechanical systems and components for general use
23 Fluid systems and components for general use
Measurement of fluid flow, see 17.120
25 Manufacturing engineering
27 Energy and heat transfer engineering
29 Electrical engineering
31 Electronics
33 Telecommunications. Audio and video engineering
35 Information technology. Office machines
37 Image technology
39 Precision mechanics. Jewellery
43 Road vehicles engineering
45 Railway engineering
47 Shipbuilding and marine structures
49 Aircraft and space vehicle engineering
53 Materials handling equipment
55 Packaging and distribution of goods
59 Textile and leather technology
61 Clothing industry
65 Agriculture
67 Food technology
71 Chemical technology
73 Mining and minerals
75 Petroleum and related technologies
77 Metallurgy
79 Wood technology
81 Glass and ceramics industries
83 Rubber and plastic industries
85 Paper technology
87 Paint and colour industries
91 Construction materials and building
93 Civil engineering
95 Military engineering
97 Domestic and commercial equipment. Entertainment. Sports
International Organization for Standardization
List of ISO Fields, Geneva, 2003
[N] HHC: The economic term ‘tied good’ requires explanation. An example is the old ‘punch card’ computer. The computer could not operate without such cards which, technically, were an output of the pulp, paper and publishing industries, sequentially. The computer and cards were tied goods in production of computational results. Similarly, there can only be a market for audio-visual software, e.g., records and tapes, if there is a market for home entertainment hardware, e.g., cameras, record players, TV sets, etc. They are tied goods in consumption fitting hand in glove. In this regard, it is likely, but not proved, that the home entertainment center (HEC) is the third most expensive consumer durable purchased by the average consumer after house and car. Similarly, private collections of audio-visual software including phonographs, photographs and video tapes constitute an enormous stock of American cultural wealth. (Chartrand 2000, 24)
[O] “At the beginning of this study, Cambrosio undertook a comparison of several different experimental protocols for the production of hybridomas. He had not yet been able to attend a fusion experiment but relied, to a great extent, on his previous biological training. While one might expect that it would be relatively easy to determine variations between the protocols, this was true only in a “mechanical” or literal sense; to the untrained eye, the protocols appeared to be arbitrary lists of instructions lacking any overall sense. The situation changed fundamentally when he was able to attend a training session in the technique. Once these instructions were embodied in a series of gestures, they became confounded with other factors such as the manual skills of a given person or that person’s degree of familiarity with a piece of equipment. The comparison between protocols now became possible, each line of instruction evoking shapes, colors, time spans, and gestures that could be compared.” (Cambrosio & Keating 1988, 249)
[P] “It appears also that instrumentation requirements sometimes serve as a powerful device for bringing together research scientists from separate disciplines. X-ray crystallography played such a role in the development of molecular biology, precisely because it is, in effect, an instrument-embodied technique. In a very different way the increasing reliance on supercomputers is serving to bring members of different disciplines together. The impetus in this case is, to a considerable degree, the high cost of the technology and, consequently, the small number of locations where users need to convene.” (Rosenberg 1994, 156)
[Q] In Zen-like terms of a monk transcending technique (Suzuki 1959), Polanyi notes:
“Our subsidiary awareness of tools and probes can be regarded now as the act of making them form a part of our own body. The way we use a hammer or a blind man uses his stick, shows in fact that in both cases we shift outwards the points at which we make contact with the things that we observe as objects outside ourselves. While we rely on a tool or a probe, these are not handled as external objects. We may test the tool for its effectiveness or the probe for its suitability, e.g. in discovering the hidden details of a cavity, but the tool and the probe can never lie in the field of these operations; they remain necessarily on our side of it, forming part of ourselves, the operating persons. We pour ourselves out into them and assimilate them as parts of our own existence. We accept them existentially by dwelling in them.” (Polanyi 1962a, 59)
[R] “… not only in after-dinner speeches, which are not necessarily to be taken seriously, but also in framing membership criteria for the professional grades of engineering societies, a matter which engineers take with deadly seriousness. The professional engineer is usually considered the creative practitioner, the “real” engineer. In the definition of such a person, the “ability to design” has been almost universally acknowledged as the crucial test, though in practice only the most professionally oriented societies have actually adopted it. It is interesting to note that “ability to design” and “reasoned state of capacity to make” are very similar, both in form and in substance.” (Layton 1974, 37)
[S] Referencing Herbert Simon, Layton writes: “… there are a body of sciences associated with practice, which he terms the “sciences of the artificial”… He argues for engineering that: “We speak of engineering as concerned with ‘synthesis,’ while science is concerned with ‘analysis.’ Synthetic... and more specifically, prospective artificial objects having desired properties - are the central objective of engineering activity and skill. The engineer is concerned with how things ought to be - ought to be, that is, in order to attain goals, and to function.” Simon concludes that sciences of the artificial, such as “engineering science,” have certain characteristics that distinguish them from natural sciences.” (Layton 1988, 90-91)
[T] “… a machine can be smashed and the laws of physics and chemistry will go on operating unfailingly in the parts remaining after the machine ceases to exist. Engineering principles create the structure of the machine which harnesses the laws of physics and chemistry for the purposes the machine is designed to serve. Physics and chemistry cannot reveal the practical principles of design or co-ordination which are the structure of the machine…Consequently, and the consequences reach far beyond the example at hand, the meaning of the higher level cannot be accounted for by reductive analysis of the elements forming the lower levels. No one can derive a machine from the laws of physics and chemistry… At each consecutive level there is a state which can be said to be less tangible than the one below it.” (Polanyi 1970)
[U] “Design is clearly distinct from philosophy, including natural philosophy. It is, as both Aristotle and modern engineers have held, an attribute of a human being which may be expressed in an object but which is not identical with the object itself. At the outset, design is an adaptation of means to some preconceived end. This I take to be the central purpose of technology… Design involves a structure or pattern, a particular combination of details or component parts, and it is precisely the gestalt or pattern that is of the essence for the designer.” (Layton 1974, 37)
[V] “Indeed, it is the oldest part of engineering knowledge to be recorded; the early engineering and machine books are in the nature of portfolios of design, and there is a deep kinship between engineering design and art, running back to the artist-engineers of the Renaissance and earlier. The natural units of study of engineering design resemble the iconographic themes of the art historian.” (Layton 1976, 698)
[W] “… only … science is already injected in documentary form in a way that mirrors the content of the science. The similar mirroring process in technology gives rise to the artifacts and processes, and it is necessary to transform this evidence into written form through the medium of descriptions which savor of the antiquarian.” (Price 1965, 565-566)
[X] “One essential aspect of this expansion in use has been modification of design so that instruments can be employed by people with lower levels of training. Often, in fact, it has proven worthwhile to redesign to lower performance ceilings in order to permit the substitution of automatic control for control by a highly trained operator.” (Rosenberg 1994, 257-258)
[Y] “Black-boxed instruments also hide their thing-y-ness. This is one of the ironies that confronts thing knowledge. Instruments, when they are working, connect seamlessly with theory; they provide information, data that can be enfolded into the propositional life of theory. This is why epistemology has been able to carry on under the illusion of knowledge solely as a play of ideas. The materiality of instruments only surfaces in their making and breaking. One needs to appreciate this essentially Heideggerian point to recognize that and to see how material knowledge complements knowledge borne by ideas.” (Baird 2004, 146)
[Z] “Reverse engineering is fundamentally directed to discovery and learning. Engineers learn the state of the art not just by reading printed publications, going to technical conferences, and working on projects for their firms, but also by reverse engineering others’ products. Learning what has been done before often leads to new products and advances in know-how. Reverse engineering may be a slower and more expensive way for information to percolate through a technical community than patenting or publication, but it is nonetheless an effective source of information. Of necessity, reverse engineering is a form of dependent creation, but this does not taint it, for in truth, all innovators, as the saying goes, “stand on the shoulders of giants” as well as on the shoulder of other incremental innovators. Progress in science and the useful arts is advanced by dissemination of know-how, whether by publication, patenting or reverse engineering.” (Samuelson & Scotchmer 2002, 70-71).
[AA] “A technology claiming acceptance irrespective of economic considerations is meaningless. Indeed, any invention can be rendered worthless and altogether farcical by a radical change in the values of the means used up and the ends produced by it. If the price of all fuels went up a hundredfold, all steam engines, gas turbines, motor cars, and aeroplanes would have to be thrown on the junk heap. Strictly speaking, a technical process is valid, therefore, only within the valuations prevailing at one particular moment and at one particular time… By contrast, no part of science can lose its validity by a change in the current relative value of things.” (Polanyi 1960-61, 404)