Towards a Global Brain
To fully understand the significance of today's developments in the area of communications, we need to go back in time to consider the social changes that have occurred over the last two hundred years. In this short period, the thrust of human activity has altered significantly.
Prior to the eighteenth century, the majority of the population (about 90 percent) was employed in the production and distribution of food. This percentage had stayed constant for hundreds of years, the actual number increasing at about the same rate as the population itself. With the advent of the Industrial Revolution, however, the increasing application of technology to farming led to a slowing in the growth rate of agricultural employment, and the curve began to bend over in the characteristic S-shape.
From the beginning of the nineteenth century the more developed nations have shown a steady increase in the number of people employed in industry and manufacturing—a shift away from the processing of food towards the processing of minerals and energy. Employment statistics for the U.S.A. show that the growth of industrial employment was considerably faster than that of agricultural employment. Whereas agricultural employment at that time was doubling every 45 years, industrial employment was doubling about once every 16 years. By 1900 equal numbers of people (about 38 percent) were employed in each sector. In terms of employment this date could be taken to mark the beginning of the Industrial Age in the U.S.A.
For the next seventy years, industry was the dominant activity. Over the last few decades, however, the invention of computers and the consequent increase in information processing capacity have brought about another shift. The steady application of technology and automation to industry caused the rate of growth of industrial employment to slow, giving rise once again to an S-curve. At the same time the number of people employed in information processing in its various forms—printing, publishing, accounting, banking, journalism, TV, radio, telecommunications as well as computing and its many ancillary occupations—has been growing at an exponential rate. Its doubling time may now be as short as six years.
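The S-curve described above is the familiar logistic curve: growth that looks exponential at first, then bends over and levels off as it approaches a ceiling. A minimal sketch (the parameter values here are purely illustrative, not drawn from the employment data):

```python
import math

def logistic(t, carrying_capacity, growth_rate, midpoint):
    """Logistic (S-curve) growth: near-exponential at first, leveling off later."""
    return carrying_capacity / (1 + math.exp(-growth_rate * (t - midpoint)))

# Early in the curve growth looks exponential; by the midpoint half the
# ceiling is reached; late in the curve the value flattens toward the ceiling.
early = logistic(10, 100, 0.1, 50)   # small: roughly 1.8
mid   = logistic(50, 100, 0.1, 50)   # exactly half the carrying capacity: 50
late  = logistic(90, 100, 0.1, 50)   # near the ceiling: roughly 98.2
```

The bend in each employment curve corresponds to the midpoint of such a logistic, where growth stops accelerating and begins to slow.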
By the mid-1970s the number of people engaged in the processing of information had caught up with those engaged in industry—the processing of energy and matter. From that time on, information processing has been our dominant activity. We had entered the "Information Age."
Although these developments refer specifically to the U.S.A., parallel changes can be found in most of the more developed nations. The less developed nations show similar tendencies, but they lag behind the more developed ones to varying degrees. These lags, however, will almost certainly decrease as time goes on.
While a developing country may be fifty years behind the West in reaching the stage at which industrial activity becomes dominant, it may be only ten years behind when it makes the transition to an information-dominant society. Japan is an example of a country which, despite a late start, has rapidly caught up with the West. South Korea moved from Agricultural Age to Information Age in only fourteen years. Many of the oil-rich Middle Eastern nations, such as Kuwait and Saudi Arabia, are also making rapid strides. China, although still predominantly agricultural, may spend only a short time in the Industrial Age before shifting to an information society. And it may well be that other countries will skip the Industrial Age entirely—at least as far as majority employment is concerned.
As more and more nations of the world move into the Information Age, telecommunications and information processing will lead humanity to become increasingly integrated and interconnected. Although it may seem that this interlinking has come upon us very suddenly, it is the result of a trend towards greater interconnectivity that is as old as humanity itself.
The first major step in this interlinking came many thousands of years ago with the development of verbal language. This led to a profound and fundamental change in the way we gained knowledge about the world. All other creatures (with the possible exception of whales and dolphins) learn primarily from their own experience of life. A dog learns through what happens to it in its own life; it does not benefit significantly from the experiences of other dogs elsewhere in the world. But with the advent of symbolic language human beings could begin to share experiences and so learn not only from their own lives but also from others'.
This was a major evolutionary leap, as significant perhaps as the appearance of sexual reproduction 2 billion years ago. Two cells could come together and through the exchange of genetic information share their hereditary data-banks—a breakthrough which, as we have seen, allowed new species to emerge thousands of times faster. Similarly, through language, human beings can exchange their own experiences and learnings, and the result has been a similar leap in the rate of evolution.
Language allowed us to shift from biological evolution to the much faster evolution of mind. Not only did our ability to learn from each other enhance our individual lives, it also led us into the whole new arena of group evolution. We had become a collective learning system, building a collective body of knowledge that far exceeded the experience of any individual, but which any individual could, in principle, access. Through language we had made the step from isolated organisms to a collective organism—much as a billion years ago single cells came together to make the first multicellular creatures.
The rate of growth of this collective learning system was greatly enhanced by a series of breakthroughs in information technology. Today we tend to think of information technology in terms of computers and telecommunications, but these are themselves the consequence of a whole series of breakthroughs in information technologies dating back to the dawn of civilization.
The first great breakthrough was the invention of writing, some ten thousand years ago. Before writing, knowledge was handed down primarily by word of mouth, a process that was very open to distortions, omissions and misunderstandings. Writing made it possible to record our personal and cultural histories in a more reliable form and hand them down to future generations. The technological breakthrough of paper made records much easier to transport. We could share our knowledge with people in distant lands, linking human communities together.
The advent of the printing press in the fifteenth century further increased humanity's ability to disseminate written information. No longer did each copy of a manuscript have to be reproduced by hand—a process that was both slow and prone to error—thousands could be manufactured from the same original, and virtually simultaneously. In the first fifty years after the invention of printing, around 80 million books were produced. The philosophies of the Greeks and Romans were distributed, the Bible became widely accessible, and through various "how to" books the skills of many crafts were made more widely available, paving the way for the Renaissance.
The next major breakthrough occurred in the early-nineteenth century. This was the development of electrical communication in the form of telegraph ("writing at a distance"), and later the telephone ("hearing at a distance"). The time taken to transmit a message across the world suddenly dropped from weeks to seconds.
Fifty years later another breakthrough occurred through the use of radio waves as the transmission medium. This freed people from the need to be physically linked by cable. Information could be broadcast to large numbers of people simultaneously. Since then, radio and its offshoot, television ("seeing at a distance"), have expanded rapidly, enabling the individual to be an eyewitness to events happening around the world.
At the same time that radio and television were spreading across the planet, another equally important development in information technology was occurring: electronic data processing.
The Digital Revolution
The first computer was built during World War II, to help the British intelligence service crack military codes. After the war there came a growing need to perform other complex calculations much more rapidly than could be done by paper and pencil or adding machine. To fulfill these needs technicians designed electronic calculators. Although slow and cumbersome by today's standards, these devices represented a huge leap forward in information processing power.
The first generation of commercial computers used vacuum tubes as the switches. The second generation, which followed in the early 1960s, used the newly developed transistors. These were faster, smaller, and did not get so hot. Even so, they were pretty large. I worked for a while on one of these machines in the mid-sixties. It occupied the space of a small apartment, had 8K of memory, in which all the data and software had to reside (you wrote tight code in those days), and took 40 minutes to boot up each morning.
The third generation used integrated circuits; transistors and other parts of the circuitry were carved into silicon chips. And the fourth generation, employing large-scale integration, put the whole of the central processor of a computer onto a single chip. Today multiple processors are put on a chip, increasing computer power still further.
As successive generations of computers moved from the switching of relays to the switching of vacuum tubes to the switching of energy levels in crystals, so computers have very rapidly decreased in size, and equally rapidly increased in power. A laptop today has more memory, more flexibility, more functions, more versatility, and is far faster than any computer in existence in 1970.
The exponential increase in computer memory is reflected in Moore's Law which holds that memory capacity doubles every eighteen months. Computer speed follows a similar pattern. Twenty years ago a 10 megahertz chip was the norm; ten years ago it was several hundred megahertz. Today we have several gigahertz chips in our PCs, allowing real time speech recognition, simultaneous translation and a host of other capabilities that were science fiction only a few years back.
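The arithmetic behind Moore's Law is simple compounding: an eighteen-month doubling time means capacity doubles every 1.5 years, so over a decade it multiplies roughly a hundredfold. A small illustrative calculation (the function names are just for this sketch):

```python
def doublings(years, doubling_time_years=1.5):
    """Number of doublings over a period, given an 18-month doubling time."""
    return years / doubling_time_years

def growth_factor(years, doubling_time_years=1.5):
    """Overall multiplication in capacity over the period: 2 ** doublings."""
    return 2 ** doublings(years, doubling_time_years)

# Ten years at an 18-month doubling time gives 2 ** (10 / 1.5),
# roughly a hundredfold increase in capacity.
factor = growth_factor(10)
```

The same compounding explains the chip speeds quoted above: two successive decade-long jumps of roughly a hundredfold each take a 10 megahertz chip past the gigahertz mark.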
As the power-to-size ratio of computers has exploded so have their numbers. When the first computers were built in the mid-1940s, Thomas Watson, the founder and chairman of the company International Business Machines, said "I think there is a world market for maybe five computers," and decided the business was too small to be worth exploring. Twenty years later the company had shortened its name to IBM, and become the largest computer company in the world. Today hundreds of thousands of computers are being manufactured each day. And that is just the computers we see. Many times that number are embedded in cars, cameras, entertainment systems, and household appliances, all of them geared to increasing our efficiency—and hence to pushing the pace of life faster and faster.
An equally significant development has been the direct linking of computers. The first computers were independent units, interacting only with their human operators (there were no operating systems in those days; you had to load files by hand). In the 1960s computer scientists began connecting machines together in the laboratory so that they could exchange data directly. In 1969 the U.S. Advanced Research Projects Agency (ARPA) linked computers across much longer distances, from one research institution to another.
By the mid-1970s other networks had emerged, and began to link with ARPAnet. In 1983, a connection "backbone" was established connecting computer networks from many US universities and other institutions. This network of networks became known as the "internetwork"—and soon just "Internet". It grew rapidly as many other host computers from around the world connected into it. Today it has become a global web of networks that is changing our lives in a multitude of ways. We are now in the middle of an information-telecommunications revolution that is happening many times faster than the Industrial Revolution, and is changing the world in an equally, if not more, profound way. And the seed of this revolution was the World Wide Web.
The web began in 1990, when a group at CERN in Geneva created a way for a file in one computer to directly cross-reference, or link, a file in another computer. Soon people realized that this linking need not be limited to text. Images could also be included, making the interface much more user friendly. In 1994 the word got out to the general public. In 1995 feature stories about the web were appearing in the press and on television. By 1998, there was hardly an organization without a home page. Today it is being used for everything from research, commerce, and auctions to corporate imaging, news delivery, and music and video distribution. With a doubling time of just 60 days, the World Wide Web has become the fastest growing phenomenon in human history.
Just where this revolution will take us is anybody's guess. Just fifteen years ago, when the World Wide Web was in its infancy, no one realized the impact it would have on human society. So how can we expect today to foresee its future fifteen years hence? Indeed, given that the rate of development is continually accelerating, it is probably impossible for us to foresee where it will be even five years from now. The only certainty is that the web will continue to grow and evolve, opening up new arenas of exploration and development far beyond our current imagination.
Such prodigious growth is bringing its own problems, not the least of which is congestion. The growing focus on multimedia has dramatically increased file sizes (a 3-second video clip can be several thousand times the size of a page of text, and so needs a proportionately greater chunk of data to be transferred across the net). At the same time, the number of users has been growing from tens of millions to more than a billion. As a result, the volume of traffic passing through the net is exploding. Try as they may, the telephone companies cannot lay new cable fast enough to keep up. The result is info-traffic jams. Add to this the ever-present danger of computer viruses, and the pressures which increasing commercialization of the net will bring, and it soon becomes apparent that the Internet is now in crisis.
However, as we saw in Chapter 5, a system in crisis is not necessarily a dying system. Crises can be important evolutionary drivers, pushing the system into new levels of organization and triggering the emergence of new forms and processes. Already the Internet has proved capable of evolving into a much more complex and diverse structure than that contemplated by its original creators, and, since nobody can turn it off, it will continue to evolve. New technologies, new software and other developments will make the net of five years' time as hard to imagine today as laptop computers talking to each other across the globe was twenty years ago.
The Emerging Global Brain
The interlinking of humanity that began with the emergence of language has now progressed to the point where information can be transmitted to anyone, anywhere, at the speed of light. Billions of messages shuttle continually back and forth in an ever-growing web of communication, linking the billions of minds of humanity together into a single system. Is this Gaia growing herself a nervous system?
The parallels are certainly worthy of consideration. We have already noted that there are, very approximately, the same number of nerve cells in a human brain as there are human minds on the planet. And there are also some interesting similarities between the way the human brain grows and the way in which humanity is evolving.
The embryonic human brain passes through two major phases of development. The first is a massive explosion in the number of nerve cells. Starting eight weeks after conception, the number of neurons explodes, increasing by many millions each hour. After five weeks, however, the process slows down, almost as rapidly as it started. The first stage of brain development, the proliferation of cells, is now complete. At this stage the fetus has most of the nerve cells it will have for the rest of its life.
The brain then proceeds to the second phase of its development, as billions of isolated nerve cells begin making connections with each other, sometimes growing out fibers to connect with cells on the other side of the brain. By the time of birth, a typical nerve cell may communicate directly with several thousand other cells. The growth of the brain after birth consists of the further proliferation of connections. By the time of adulthood many nerve cells are making direct connections with as many as a quarter of a million other cells.
Similar trends can be observed in human society. For the last few centuries the number of "cells" in the embryonic global brain has been proliferating. But today population growth is slowing, and at the same time we are moving into the next phase—the linking of the billions of human minds into a single integrated network. The more complex our global telecommunication capabilities become the more human society is beginning to look like a planetary nervous system. The global brain is beginning to function.
This awakening is not only apparent to us, it can even be detected millions of miles out in space. Before 1900, any being curious enough to take a planetary EEG (i.e., to measure the electromagnetic activity of the planet) would have observed only random, naturally occurring activity, such as that produced by lightning. Today, however, the space around the planet is teeming with millions of different signals, some of them broadcasts to large numbers of people, some of them personal communications, and some of them the chatter of computers exchanging information. As the usable radio bands fill up, we find new ways of cramming information into them, and new spectra of energy, such as light, are being utilized, with the potential of further expanding our communication capacities.
With near-instant linkage of humanity through this communications technology, and the rapid and wholesale dissemination of information, Marshall McLuhan's vision of the world as a global village is fast becoming a reality. From an isolated cottage in a forest in England, I can dial a number in Fiji, and it takes the same amount of time for my voice to reach down the telephone line to Fiji as it does for my brain to tell my finger to touch the dial. As far as time to communicate is concerned, the planet has shrunk so much that the other cells of the global brain are no further away from our brains than are the extremities of our own bodies.
There are also parallels between the evolution of the global brain and the evolution of mental functions. The first nervous systems made simple connections between different parts of the organism—between sensors and muscles, for example—that allowed basic reflex reactions. In a similar way, the early Internet allowed data transfer from one machine to another, but little more.
In more complex organisms nerve cells gathered into ganglia and then into rudimentary brains. This integration of nervous pathways led, among other things, to the emergence of memory—which, as far as we can tell, seems to be distributed throughout the brain. Memory tends to be associative; if I see a dog it may trigger my memory of my own dog, and the need to call the vet, which in turn may trigger memories of a fictitious vet in a television series, which may trigger further associations. The World Wide Web, which has now become the repository for all human knowledge, would seem to provide a similar function on a global level. Data is not located in any single place, but is distributed amongst tens of millions of host computers across the planet. A link on any of the hundreds of billions of pages on the web will call up one or another associated page; moreover, just as human recall may take the form of a thought, a visual image, a sound, or some other modality, a link on the web may call up text, images, sounds, video, virtual reality, or some combination of them.
The web's associative memory has been augmented by search engines, which index and collate information across the net. These are rapidly becoming more sophisticated, prioritizing the links returned according to content, popularity, the user's profile, and other factors. Software agents (small programs that can travel to different nodes of the net, selecting information and sending it back to the user), expert systems, and other emerging technologies will likely lead to a web that does more than just remember. It will be able to form new associations, synthesize information to create new knowledge, and perhaps solve problems presented to it. It will then have become a system that can learn and think for itself.
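The associative structure described above can be pictured as a graph of pages and links, where recall means following a chain of links from one association to the next. A minimal sketch, with page names invented purely for illustration (echoing the dog-to-vet chain above):

```python
# A toy stand-in for the web's link structure: each page lists the
# pages it links to. All names here are hypothetical.
links = {
    "dog": ["my_dog", "veterinarian"],
    "my_dog": ["veterinarian"],
    "veterinarian": ["tv_vet_series"],
    "tv_vet_series": ["dog"],
}

def associate(start, steps):
    """Follow the first link from each page, like a chain of associations."""
    chain = [start]
    for _ in range(steps):
        next_pages = links.get(chain[-1], [])
        if not next_pages:
            break  # a page with no outgoing links ends the chain
        chain.append(next_pages[0])
    return chain

# Starting from "dog", three steps of association reach the fictitious
# TV vet, just as in the chain of memories described above.
chain = associate("dog", 3)
```

No single node holds the whole memory; the knowledge lives in the pattern of connections, which is what makes the analogy with distributed memory in the brain apt.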
The changes this will bring will be so great that their full impact may well be beyond our imagination. No longer will we perceive ourselves as isolated individuals; we will know ourselves to be a part of a rapidly integrating global network, the nerve cells of an awakening global brain.
Date created: July 26, 2007