21. Hans Moravec, Mind Children: The Future of Robot and Human Intelligence (Cambridge, Mass.: Harvard University Press, 1988).
22. Vernor Vinge, “The Coming Technological Singularity: How to Survive in the Post-Human Era,” VISION-21 Symposium, sponsored by the NASA Lewis Research Center and the Ohio Aerospace Institute, March 1993. The text is available at

An emulation of the human brain running on an electronic system would run much faster than our biological brains. Although human brains benefit from massive parallelism (on the order of one hundred trillion interneuronal connections, all potentially operating simultaneously), the reset time of the connections is extremely slow compared to contemporary electronics.
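The speed contrast this passage draws can be made concrete with round numbers. Both figures below are assumptions chosen for illustration, not values given in the note.

```python
# Rough comparison of switching speeds, using assumed round numbers:
# interneuronal connections reset on the order of ~5 ms (~200 Hz),
# while contemporary electronic logic switches at gigahertz rates.
biological_reset_hz = 200          # assumed ~5 ms connection reset time
electronic_clock_hz = 2 * 10**9    # assumed ~2 GHz electronics

speedup = electronic_clock_hz // biological_reset_hz
print(speedup)  # 10000000, i.e. a roughly ten-million-fold speed advantage
```

Under these assumptions, the serial speed gap is about seven orders of magnitude, which is why an electronic emulation could in principle run much faster than the biological original despite the brain's parallelism.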
28. See notes 20 and 21 in chapter 2.
29. See the appendix, “The Law of Accelerating Returns Revisited,” for a mathematical analysis of the exponential growth of information technology as it applies to the price-performance of computation.
30. In a 1950 paper published in Mind: A Quarterly Review of Psychology and Philosophy, the computer theoretician Alan Turing posed the famous questions “Can a machine think? If a computer could think, how could we tell?” The answer to the second question is the Turing test. As the test is currently defined, an expert committee interrogates a remote correspondent on a wide range of topics such as love, current events, mathematics, philosophy, and the correspondent's personal history to determine whether the correspondent is a computer or a human. The Turing test is intended as a measure of human intelligence; failure to pass the test does not imply a lack of intelligence. Turing's original article can be found at

A comparison of cerebral-cortex gene-expression profiles for humans, chimpanzees, and rhesus macaques showed a difference of expression in only ninety-one genes associated with brain organization and cognition. The study authors were surprised to find that 90 percent of these differences involved upregulation (higher activity). See M. Caceres et al., “Elevated Gene Expression Levels Distinguish Human from Non-human Primate Brains,” Proceedings of the National Academy of Sciences 100.22 (October 28, 2003): 13030–35.

However, University of California-Irvine College of Medicine researchers have found that gray matter in specific regions in the brain is more related to IQ than is overall brain size and that only about 6 percent of all the gray matter in the brain appears related to IQ. The study also discovered that because these regions related to intelligence are located throughout the brain, a single “intelligence center,” such as the frontal lobe, is unlikely.
See “Human Intelligence Determined by Volume and Location of Gray Matter Tissue in Brain,” University of California-Irvine news release (July 19, 2004), today.uci.edu/news/release_detail.asp?key=1187.

A 2004 study found that human nervous system genes displayed accelerated evolution compared with nonhuman primates and that all primates had accelerated evolution compared with other mammals. Steve Dorus et al., “Accelerated Evolution of Nervous System Genes in the Origin of Homo sapiens,” Cell 119 (December 29, 2004): 1027–40. In describing this finding, the lead researcher, Bruce Lahn, states, “Humans evolved their cognitive abilities not due to a few accidental mutations, but rather from an enormous number of mutations acquired through exceptionally intense selection favoring more complex cognitive abilities.” Catherine Gianaro, University of Chicago Chronicle 24.7 (January 6, 2005).

A single mutation to the muscle fiber gene MYH16 has been proposed as one change allowing humans to have much larger brains. The mutation made ancestral humans' jaws weaker, so that humans did not require the brain-size-limiting muscle anchors found in other great apes. Stedman et al., “Myosin Gene Mutation Correlates with Anatomical Changes in the Human Lineage,” Nature 428 (March 25, 2004): 415–18.
33. Robert A. Freitas Jr., “Exploratory Design in Medical Nanotechnology: A Mechanical Artificial Red Cell,” Artificial Cells, Blood Substitutes, and Immobil. Biotech. 26 (1998): 411–30; /album/beyondhuman/respirocyte01.htm) of the respirocytes.
34. Foglets are the conception of the nanotechnology pioneer and Rutgers professor J. Storrs Hall. Here is a snippet of his description: “Nanotechnology is based on the concept of tiny, self-replicating robots. The Utility Fog is a very simple extension of the idea: Suppose, instead of building the object you want atom by atom, the tiny robots [foglets] linked their arms together to form a solid mass in the shape of the object you wanted? Then, when you got tired of that avant-garde coffee table, the robots could simply shift around a little and you'd have an elegant Queen Anne piece instead.” J. Storrs Hall, “What I Want to Be When I Grow Up, Is a Cloud,” Extropy, Quarters 3 and 4, 1994. Published on KurzweilAI.net July 6, 2001.

Sherry Turkle, ed., “Evocative Objects: Things We Think With,” forthcoming.
36. See the “Exponential Growth of Computing” figure in chapter 2 (p. 70). Projecting the double exponential growth of the price-performance of computation to the end of the twenty-first century, one thousand dollars' worth of computation will provide 10^60 calculations per second (cps). As we will discuss in chapter 2, three different analyses of the amount of computing required to functionally emulate the human brain result in an estimate of 10^15 cps. A more conservative estimate, which assumes that it will be necessary to simulate all of the nonlinearities in every synapse and dendrite, results in an estimate of 10^19 cps for neuromorphic emulation of the human brain. Even taking the more conservative figure, we get a figure of 10^29 cps for the approximately 10^10 humans. Thus, the 10^60 cps that can be purchased for one thousand dollars circa 2099 will represent 10^31 (ten million trillion trillion) human civilizations.
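The exponent arithmetic in this note can be checked directly; every figure below is one of the note's own estimates.

```python
# Back-of-envelope check of the note's exponent arithmetic.
cps_per_brain = 10**19        # conservative neuromorphic-emulation estimate
human_population = 10**10     # approximately ten billion humans

cps_civilization = cps_per_brain * human_population
print(cps_civilization == 10**29)  # True: one human civilization

cps_per_1000_dollars_2099 = 10**60  # projected price-performance circa 2099
civilizations = cps_per_1000_dollars_2099 // cps_civilization
print(civilizations == 10**31)      # True: "ten million trillion trillion"
```

The check confirms that 10^31 is simply 10^60 divided by 10^29, and that "ten million trillion trillion" (10^7 x 10^12 x 10^12) names the same number.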
37. The invention of the power loom and the other textile automation machines of the early eighteenth century destroyed the livelihoods of the cottage industry of English weavers, who had passed down stable family businesses for hundreds of years. Economic power passed from the weaving families to the owners of the machines. As legend has it, a young and feebleminded boy named Ned Ludd broke two textile factory machines out of sheer clumsiness. From that point on, whenever factory equipment was found to have mysteriously been damaged, anyone suspected of foul play would say, “But Ned Ludd did it.” In 1812 the desperate weavers formed a secret society, an urban guerrilla army. They made threats and demands of factory owners, many of whom complied. When asked who their leader was, they replied, “Why, General Ned Ludd, of course.” Although the Luddites, as they became known, initially directed most of their violence against the machines, a series of bloody engagements erupted later that year. The tolerance of the Tory government for the Luddites ended, and the movement dissolved with the imprisonment and hanging of prominent members. Although they failed to create a sustained and viable movement, the Luddites have remained a powerful symbol of opposition to automation and technology.
38. See note 34 above.
Chapter Two: A Theory of Technology Evolution: The Law of Accelerating Returns

1. John Smart, Abstract to “Understanding Evolutionary Development: A Challenge for Futurists,” presentation to World Futurist Society annual meeting, Washington, D.C., August 3, 2004.
2. That epochal events in evolution represent increases in complexity is Theodore Modis's view. See Theodore Modis, “Forecasting the Growth of Complexity and Change,” Technological Forecasting and Social Change 69.4 (2002), ourworld.compuserve.com/homepages/tmodis/TedWEB.htm.
3. Compressing files is a key aspect of both data transmission (such as a music or text file over the Internet) and data storage. The smaller the file is, the less time it will take to transmit and the less space it will require. The mathematician Claude Shannon, often called the father of information theory, defined the basic theory of data compression in his paper “A Mathematical Theory of Communication,” The Bell System Technical Journal 27 (July–October 1948): 379–423, 623–56. Data compression is possible because of factors such as redundancy (repetition) and probability of appearance of character combinations in data. For example, silence in an audio file could be replaced by a value that indicates the duration of the silence, and letter combinations in a text file could be replaced with coded identifiers in the compressed file.
Redundancy can be removed by lossless compression, as Shannon explained, which means there is no loss of information. There is a limit to lossless compression, defined by what Shannon called the entropy rate (compression increases the “entropy” of the data, which is the amount of actual information in it as opposed to predetermined and thus predictable data structures). Lossless compression removes redundancy without losing data, meaning that the exact original data can be restored. Alternatively, lossy compression, which is used for graphics files or streaming video and audio files, does result in information loss, though that loss is often imperceptible to our senses.

Most data-compression techniques use a code, which is a mapping of the basic units (or symbols) in the source to a code alphabet. For example, all the spaces in a text file could be replaced by a single code word and the number of spaces. A compression algorithm is used to set up the mapping and then create a new file using the code alphabet; the compressed file will be smaller than the original and thus easier to transmit or store. Here are some of the categories into which common lossless-compression techniques fall:

Run-length compression, which replaces repeating characters with a code and a value representing the number of repetitions of that character (examples: Pack-Bits and PCX).

Minimum redundancy coding or simple entropy coding, which assigns codes on the basis of probability, with the most frequent symbols receiving the shortest codes (examples: Huffman coding and arithmetic coding).

Dictionary coders, which use a dynamically updated symbol dictionary to represent patterns (examples: Lempel-Ziv, Lempel-Ziv-Welch, and DEFLATE).

Block-sorting compression, which reorganizes characters rather than using a code alphabet; run-length compression can then be used to compress the repeating strings (example: Burrows-Wheeler transform).

Prediction by partial mapping, which uses a set of symbols in the uncompressed file to predict how often the next symbol in the file appears.
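The first category above, run-length compression, can be sketched in a few lines. This is a minimal illustration of the idea, not the encoding used by any particular codec such as PackBits or PCX; the function names are my own.

```python
# Minimal run-length compression sketch: each run of a repeated
# character becomes a (character, count) pair.

def rle_encode(data: str) -> list[tuple[str, int]]:
    runs: list[tuple[str, int]] = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    # Lossless: expanding the runs restores the exact original data.
    return "".join(ch * count for ch, count in runs)

text = "aaaabbbcca"
encoded = rle_encode(text)
print(encoded)                      # [('a', 4), ('b', 3), ('c', 2), ('a', 1)]
assert rle_decode(encoded) == text  # exact original recovered
```

Note how the scheme only pays off on data with long runs, which is why real formats combine it with the other techniques listed, such as entropy coding.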
4. Murray Gell-Mann, “What Is Complexity?” in Complexity, vol. 1 (New York: John Wiley and Sons, 1995).
5. The human genetic code has approximately six billion (about 10^10) bits, not considering the possibility of compression. So the 10^27 bits that theoretically can be stored in a one-kilogram rock is greater than the genetic code by a factor of 10^17. See note 57 below for a discussion of genome compression.
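The factor of 10^17 follows from simple logarithm arithmetic, using the note's own figures.

```python
# Checking the note's ratio: theoretical storage of a 1 kg rock
# versus the (uncompressed) human genetic code.
import math

genome_bits = 6_000_000_000   # ~six billion bits
rock_bits = 10**27            # theoretical capacity of a 1 kg rock

print(round(math.log10(genome_bits), 2))         # 9.78, i.e. "about 10^10"
print(round(math.log10(rock_bits / genome_bits)))  # 17, the note's factor of 10^17
```

So the rock's theoretical capacity exceeds the genome by roughly seventeen orders of magnitude, as the note states.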
6. Of course, a human, who is also composed of an enormous number of particles, contains an amount of information comparable to a rock of similar weight when we consider the properties of all the particles. As with the rock, the bulk of this information is not needed to characterize the state of the person. On the other hand, much more information is needed to characterize a person than a rock.
7. See note 175 in chapter 5 for an algorithmic description of genetic algorithms.
8. Humans, chimpanzees, gorillas, and orangutans are all included in the scientific classification of hominids (family Hominidae). The human lineage is thought to have diverged from its great ape relatives five to seven million years ago. The human genus Homo within the Hominidae includes extinct species such as H. erectus as well as modern man (H. sapiens).
In chimpanzee hands, the fingers are much longer and less straight than in humans, and the thumb is shorter, weaker, and not as mobile. Chimps can flail with a stick but tend to lose their grip. They cannot pinch hard because their thumbs do not overlap their index fingers. In the modern human, the thumb is longer, and the fingers rotate toward a central axis, so you can touch all the tips of your fingers to the tip of your thumb, a quality that is called full opposability. These and other changes gave humans two new grips: the precision and power grips. Even prehominoid hominids such as the Australopithecine from Ethiopia called Lucy, who is thought to have lived around three million years ago, could throw rocks with speed and accuracy. Since then, scientists claim, continual improvements in the hand's capacity to throw and club, along with associated changes in other parts of the body, have resulted in distinct advantages over other animals of similar size and weight. See Richard Young, “Evolution of the Human Hand: The Role of Throwing and Clubbing,” Journal of Anatomy 202 (2003): 165–74; Frank Wilson, The Hand: How Its Use Shapes the Brain, Language, and Human Culture (New York: Pantheon, 1998).
9. The Santa Fe Institute has played a pioneering role in developing concepts and technology related to complexity and emergent systems. One of the principal developers of paradigms associated with chaos and complexity is Stuart Kauffman. Kauffman's At Home in the Universe: The Search for the Laws of Self-Organization and Complexity (Oxford: Oxford University Press, 1995) looks “at the forces for order that lie at the edge of chaos.”
In his book Evolution of Complexity by Means of Natural Selection (Princeton: Princeton University Press, 1988), John Tyler Bonner asks the questions “How is it that an egg turns into an elaborate adult? How is it that a bacterium, given many millions of years, could have evolved into an elephant?”

John Holland is another leading thinker from the Santa Fe Institute in the emerging field of complexity. His book Hidden Order: How Adaptation Builds Complexity (Reading, Mass.: Addison-Wesley, 1996) includes a series of lectures that he presented at the Santa Fe Institute in 1994. See also John H. Holland, Emergence: From Chaos to Order (Reading, Mass.: Addison-Wesley, 1998) and Mitchell Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos (New York: Simon & Schuster, 1992).
10. The second law of thermodynamics explains why there is no such thing as a perfect engine that uses all the heat (energy) produced by burning fuel to do work: some heat will inevitably be lost to the environment. This same principle of nature holds that heat will flow from a hot pan to cold air rather than in reverse. It also posits that closed (“isolated”) systems will spontaneously become more disordered over time; that is, they tend to move from order to disorder. Molecules in ice chips, for example, are limited in their possible arrangements, so a cup of ice chips has less entropy (disorder) than the cup of water the ice chips become when left at room temperature. There are many more possible molecular arrangements in the glass of water than in the ice; greater freedom of movement equals higher entropy. Another way to think of entropy is as multiplicity: the more ways a state can be achieved, the higher its multiplicity. Thus, for example, a jumbled pile of bricks has a higher multiplicity (and higher entropy) than a neat stack.
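The multiplicity idea can be made quantitative with a toy model of my own, not one from the note: count the ways of distributing a few indistinguishable energy units among distinguishable molecules. A state reachable in more ways has higher multiplicity, and hence higher entropy.

```python
# Toy multiplicity count: ways to distribute `units` indistinguishable
# energy units among `molecules` distinguishable molecules
# (the standard "stars and bars" combinatorial count).
import math

def multiplicity(units: int, molecules: int) -> int:
    return math.comb(units + molecules - 1, molecules - 1)

ordered = multiplicity(4, 1)    # all energy confined to one molecule
spread = multiplicity(4, 10)    # energy free to spread over ten molecules
print(ordered, spread)          # 1 715
```

The confined state can be realized in exactly one way, while the spread-out state has 715 realizations: the same asymmetry as the neat brick stack versus the jumbled pile.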
11. Max More articulates the view that “advancing technologies are combining and cross-fertilizing to accelerate progress e