Prior to the eighteenth century, the majority of the population (about 90 percent) was employed in the production of food -- agriculture and fishing, for instance -- and its distribution. This percentage had stayed constant for hundreds of years, the actual number increasing at about the same rate as the population itself. In the early 1800s the number employed in agriculture was doubling about once every 45 years. With the advent of the Industrial Revolution, however, the increasing application of technology to farming led to a slowing in the growth rate of agricultural employment, and the curve began to bend over in the characteristic S-shape.
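The S-shape described here is the logistic curve familiar from population dynamics: growth that looks exponential at first, then flattens as it approaches a ceiling. A minimal sketch in Python (the rate and timescale below are illustrative only, not the chapter's employment data):

```python
import math

def logistic(t, ceiling=1.0, rate=0.1, midpoint=0.0):
    """Logistic (S-curve) growth: near-exponential well below the
    midpoint, flattening out as it approaches the ceiling."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Far below the midpoint, each interval multiplies the value by a
# roughly constant factor -- growth looks exponential ...
early_growth = logistic(-40) / logistic(-50)
# ... but near the ceiling the same interval adds almost nothing.
late_growth = logistic(50) / logistic(40)
```

The same curve, with different rates and ceilings, fits each of the employment transitions the chapter describes: rapid early doubling, then saturation.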
From the beginning of the nineteenth century the more developed nations have shown a steady increase in the number of people employed in industry and manufacturing -- a shift away from the processing of food towards the processing of minerals and energy. Employment statistics for the U.S.A. show that the growth of industrial employment was considerably faster than that of agricultural employment, doubling about once every 16 years, and by 1900 equal numbers of people (about 38 percent) were employed in each sector. In terms of employment, therefore, this date could be taken to mark the beginning of the Industrial Age in the U.S.A.
For the next seventy years, industry was the dominant activity in the U.S.A. Over the last few decades, however, the invention of computers and the consequent increase in information processing capacities has brought about another shift. The steady application of technology and automation to industry caused the rate of growth of industrial employment to slow, giving rise once again to an S-curve. At the same time the number of people employed in information processing in its various forms -- printing, publishing, accounting, banking, journalism, TV, radio, telecommunications as well as computing and its many ancillary occupations -- has been growing at an exponential rate. Its doubling time may now be as short as six years.
By the mid-1970s the number of people in the U.S.A. engaged in the processing of information had caught up with those engaged in industry -- the processing of energy and matter. From that time on, information processing has been our dominant activity. We had entered the "Information Age."
Although these developments refer specifically to the U.S.A., parallel changes can be found in most of the more developed nations. The less developed nations show similar tendencies, but they lag behind the more developed ones to varying degrees. These lags, however, will almost certainly decrease as time goes on.
While a developing country may be fifty years behind the West in reaching the stage at which industrial activity becomes dominant, it may be only ten years behind when it makes the transition to an information-dominant society. Japan is an example of a country which, despite a late start, has quite definitely caught up with the West. South Korea moved from Agricultural Age to Information Age in only fourteen years. Many of the oil-rich Middle Eastern nations, such as Kuwait and Saudi Arabia, are also making rapid strides. China, although still predominantly agricultural, may spend only a short time in the Industrial Age before shifting to an information society. And it may well be that other countries will skip the Industrial Age entirely -- at least as far as majority employment is concerned.
If we look back over human history, we can see that this trend toward a progressive linking of humanity seems to have been going on for millennia. The sudden surge of information technology in the present day can be seen as the fruit of millions of years of human effort.
The first major step toward interconnection came with the development of verbal language. This led to a profound and fundamental change in the way we gained knowledge about the world. All other creatures (with the possible exception of whales and dolphins) learn primarily from their own experience of life. A dog learns through what happens to it in its own life; it does not benefit significantly from the experiences of other dogs elsewhere in the world. But with the advent of symbolic language human beings could begin to share experiences and so learn not only from their own lives but also from others'.
This was a major evolutionary leap, as significant perhaps as the appearance of sexual reproduction 2 billion years ago. Two cells could come together and through the exchange of genetic information share their hereditary data-banks -- a breakthrough which, as we have seen, allowed new species to emerge thousands of times faster. Similarly, through language, human beings can exchange their own experiences and learning, and the result has been a comparable leap in the rate of evolution.
Language allowed us to shift from biological evolution to the much faster evolution of mind. Not only did our ability to learn from each other enhance our individual lives, it also led us into the whole new arena of group evolution. We had become a collective learning system, building a collective body of knowledge that far exceeded the experience of any individual, but which any individual could, in principle, access.
Through language we had made the step from isolated organisms to a collective organism -- much as a billion years ago single cells came together to make the first multicellular creatures.
The rate of growth of this collective learning system was greatly enhanced by a series of breakthroughs in information technology. Today we tend to think of information technology in terms of computers and telecommunications, but these are themselves the consequence of a whole series of breakthroughs in information technologies dating back to the dawn of civilization.
The first great breakthrough was the invention of writing, some five thousand years ago. Before writing, knowledge was handed down primarily by word of mouth, a process that was very open to distortions, omissions and misunderstandings. Writing made it possible to record our personal and cultural histories in a more reliable form and hand them down to future generations. The technological breakthrough of paper made records much easier to transport. We could share our knowledge with people in distant lands, linking human communities together.
The advent of the printing press in the fifteenth century further increased humanity's ability to disseminate written information. No longer did each copy of a manuscript have to be reproduced by hand -- a process that was both slow and prone to error -- thousands could be manufactured from the same original, and virtually simultaneously. In the first fifty years after the invention of printing, an estimated twenty million books were produced. The philosophies of the Greeks and Romans were distributed, the Bible became widely accessible, and through various "how to" books the skills of many crafts were made more widely available, paving the way for the Renaissance.
The next major breakthrough occurred in the mid-nineteenth century. This was the development of electrical communication in the form of telegraph, and later the telephone. The time taken to transmit a message across the world suddenly dropped from days or weeks to minutes and then fractions of a second.
Fifty years later another breakthrough occurred through the use of radio waves as the transmission medium. This freed people from the need to be physically linked by cable and simultaneously made it possible to transmit a message to large numbers of people, that is, to broadcast information. Since then, radio and its offshoot, television, which literally gave us the ability to "see at a distance", have expanded rapidly, enabling the individual to be an eyewitness to events happening around the world.
At the same time that radio and television were spreading across the planet, another equally important development in information technology was occurring: electronic computers.
Although cumbersome and slow by today's standards, these devices nevertheless represented a huge leap forward in terms of information processing power. During the 1960s and 1970s, dramatic strides were made in the computer's processing capacity and speed. Simultaneously the physical size of computers shrank remarkably.
The microprocessor, or "chip" as it is commonly called, represented a major revolution in computing technology. Less than a quarter of an inch in size, the average chip of 1990 contained more computing power than all the computers of 1950 put together, and this capacity has been doubling every year. In addition to the many advantages of its minute size, the chip's energy consumption is astoundingly low. The average computer of 1970 consumed more energy than the 5000 pocket calculators that, a mere ten years later, could together match its computing capacity. The information/energy ratio has been steadily increasing and is now rocketing. We are able to do more and more with less and less.
At the same time the cost of information processing has been falling dramatically. Computing power is often measured in millions of instructions per second (MIPS). The first transistorized computers of the 1950s (IBM's 7090, for example) managed to reach about 1 MIP, and cost a million dollars. When the early integrated circuit computers of the late sixties, such as DEC's PDP 10, reached 10 MIPS, the price per MIP had fallen to $100,000. The Apple II, which heralded the personal computer revolution in the mid-1970s, brought the cost down to below ten thousand dollars per MIP. By 1990 the average PC cost around $1000 per MIP, while supercomputers like the Cray 3, operating at 100,000 MIPS, cost about $10 million, or $100 per MIP. In 1994 personal computing power was also approaching $100 per MIP. And it will continue falling in the future. By the year 2000 you will probably be able to buy the equivalent computing power of a million-dollar IBM 7090 for ten dollars or less.
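Taken together, these figures imply a remarkably steady exponential decline. The halving time of the cost per MIP can be estimated from the chapter's own round numbers; the dates assigned to the earlier machines below are my approximations, not figures from the text:

```python
import math

# Approximate cost per MIP, in dollars, from the figures above.
# Dates for the IBM 7090 and PDP 10 eras are rough assumptions.
cost_per_mip = {
    1959: 1_000_000,  # transistorized IBM 7090, ~1 MIP
    1968: 100_000,    # integrated-circuit DEC PDP 10 era
    1976: 10_000,     # Apple II era (just under $10,000/MIP)
    1990: 1_000,      # average PC
    1994: 100,        # personal computing approaching $100/MIP
}

years = sorted(cost_per_mip)
span = years[-1] - years[0]                                    # 35 years
total_drop = cost_per_mip[years[0]] / cost_per_mip[years[-1]]  # 10,000-fold
halving_time = span / math.log2(total_drop)                    # ~2.6 years
```

A ten-thousandfold drop over thirty-five years works out to the cost halving roughly every two and a half years, which is why the chapter's closing projection for the year 2000 is plausible on its own terms.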
Whereas in 1970 computers were used almost solely by large institutions such as governments and corporations, the microprocessor -- a microchip that is a computer in itself -- has made it possible for the technology of computers and data processing to be available, potentially, to anyone on the planet without draining the planet of its vital energy resources. If comparable changes had been made in various aspects of the automobile over the last twenty years, a Rolls Royce would now cost fifty cents. It would be less than a tenth of an inch long, have a gasoline consumption of tens of million miles per gallon, cruise at a hundred thousand miles per hour, and never need servicing! These tiny chariots would also be so commonplace as to be unremarkable. By the early 1990s there were more than 100 million personal computers in the world, and they were rolling off production lines at the rate of more than 100,000 per day.
By the mid-1970s other networks had emerged, and began to interlink with ARPAnet. This new network of networks became known as the "internetwork" -- and soon just "Internet". The net, as it is also sometimes called, continued to expand rapidly as many other host computers from around the world connected into it (see Figure 9). By 1994 the Internet had grown into a massive web of networks with more than two million host computers and an estimated forty million users -- and its size was doubling every year.
The number of bulletin boards -- "places" on the network where people can access data on specialized subjects, have discussions, and meet others of like interests -- has likewise exploded. In 1987 there were 6,000 bulletin boards (or BBSs). By 1994 there were close to 60,000, and the number was doubling every eighteen months.
Over the same period commercial services such as CompuServe, America Online, and Prodigy have also grown rapidly, bringing thousands of databases, computer shopping, newspapers, magazines, educational courses, airline schedules and e-mail directly into millions of homes.
Such prodigious rates of growth cannot continue far into the future. If the Internet were to continue doubling every year, it would reach more than a billion people by 1999 -- which is more than the probable number of people able to afford the luxury of a personal computer and an Internet account. Well before then the growth curve will begin turning into an S-curve. The further growth of the net will then be not in the number of connections, but in the versatility and richness of the connections.
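The billion-person figure follows from simple compounding. Starting from the forty million users of 1994 and doubling each year, a back-of-the-envelope check:

```python
# If the Internet's estimated 40 million users of 1994 kept
# doubling every year, when would they pass one billion?
users, year = 40_000_000, 1994
while users < 1_000_000_000:
    users *= 2
    year += 1
print(year, users)  # -> 1999 1280000000 (past a billion in five doublings)
```

Five doublings take 40 million to 1.28 billion, hence the 1999 date in the text -- and hence the inevitability of the S-curve well before then.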
As this global network continues to grow and evolve it will undoubtedly change in many ways. As I write there is much talk of the various crises and challenges facing the Internet. There is growing congestion; it is running out of address space; multimedia is taxing its resources; it is still far from user-friendly; there are serious issues concerning privacy and security of data; and it will inevitably become increasingly commercialized. But as we saw in Chapter 5, a system in crisis is not necessarily a dying system. Crises can be important evolutionary drivers, pushing the system into new levels of organization and triggering the emergence of new forms and processes.
Already the Internet has proved capable of evolving into a much more complex and diverse structure than that contemplated by its original creators, and, since nobody can turn it off, it will continue to evolve. New technologies, new communication protocols, new software and other developments will make the net of ten years' time as hard to imagine today as laptop computers talking to each other across the globe were twenty years ago.
In the years to come it is not only MIPS that will be important but bandwidth -- how fast data can be transmitted through the network. A single optical fiber has the potential to carry 25 gigahertz -- which is about the volume of information that flows over the telephone lines in the U.S.A. during the peak moment on Mother's Day, or about 1,000 times more information than all the radio frequencies combined. All that on one thread of glass the width of a human hair.
Just as the price of MIPS has fallen dramatically over the last few decades, so too will the price of bandwidth. George Gilder, author of Life After Television and Telecosm, has pointed out that every major revolution has seen the cost of some commodity fall markedly and eventually become virtually free. With the Industrial Revolution physical force became virtually free compared with its cost when derived from animal or human muscle. Suddenly a factory could work 24 hours a day churning out products in a way that was incomprehensible before. Physical force became so cheap that rather than having to economize on its use, we could afford to "waste" it in moving walkways, electric toothbrushes and leaf blowers. Over the last 30 years we have seen the price of a transistor drop from one dollar to one four-thousandth of a cent. We no longer have to economize on the use of transistors, but can "waste" them to correct our spelling, play solitaire, or create fancy backgrounds on our computer screen. As the telecommunications revolution begins to bite we will see a similar drop in the cost of bandwidth. When that is virtually free, we will be able to afford to "waste" that too. We will be able to broadcast information through the net much as we now broadcast radio and television through the air.
Developments such as these seem to be taking us ever more rapidly towards what William Gibson in his award-winning novel Neuromancer called cyberspace:
A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the non-space of the mind, clusters and constellations of data.
In Gibson's world, people enter cyberspace by feeding computer-generated virtual reality displays of information directly into their brains. Science fiction? Yes. But so was a trip to the moon fifty years ago.
The parallels are certainly worthy of consideration. We have already noted that there are, very approximately, the same number of nerve cells in a human brain as there are human minds on the planet. And there are also some interesting similarities between the way the human brain grows and the way in which humanity is evolving.
The embryonic human brain passes through two major phases of development. The first is a massive explosion in the number of nerve cells. Starting eight weeks after conception, the number of neurons explodes, increasing by many millions each hour. Five weeks later, however, the process slows down, almost as rapidly as it started. The first stage of brain development, the proliferation of cells, is now complete. At this stage the fetus has most of the nerve cells it will have for the rest of its life.
The brain then proceeds to the second phase of its development, as billions of isolated nerve cells begin making connections with each other, sometimes growing out fibers to connect with cells on the other side of the brain. By the time of birth, a typical nerve cell may communicate directly with several thousand other cells. The growth of the brain after birth consists of the further proliferation of connections. By the time of adulthood many nerve cells are making direct connections with as many as a quarter of a million other cells.
Similar trends can be observed in human society. For the last few centuries the number of "cells" in the embryonic global brain has been proliferating. But today population growth is slowing, and at the same time we are moving into the next phase -- the linking of the billions of human minds into a single integrated network. The more complex our global telecommunication capabilities become, the more human society begins to look like a planetary nervous system. The global brain is beginning to function.
With near-instant linkage of humanity through this communications technology, and the rapid and wholesale dissemination of information, Marshall McLuhan's vision of the world as a global village is fast becoming a reality. From an isolated cottage in a forest in England, I can dial a number in Fiji, and it takes the same amount of time for my voice to reach down the telephone line to Fiji as it does for my brain to tell my finger to touch the dial. As far as time to communicate is concerned, the planet has shrunk so much that the other cells of the global brain are no further away from our brains than are the extremities of our own bodies.
At the same time as the speed of global interaction is increasing, so is the complexity. In 1994 the worldwide telecommunications network had a billion telephones. Yet this network, intricate as it might seem, represents only a minute fraction of the communication terminals in the brain, the trillions of synapses through which nerve cells interact. According to John McNulty, a British computer consultant, the global telecommunications network of 1975 was no more complex than a region of the brain the size of a pea. But overall data-processing capacity is doubling every two and a half years, and if this rate of increase is sustained, the global telecommunications network could equal the brain in complexity by the year 2000. If this seems to be an incredibly rapid development, it is probably because few of us can fully grasp just how fast things are evolving.
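The arithmetic behind McNulty's projection is easy to check. Doubling every two and a half years from 1975 to 2000 gives ten doublings, roughly a thousandfold increase -- on the order of the pea-to-brain comparison the projection rests on:

```python
# McNulty's projection: data-processing complexity doubling every
# 2.5 years, from a pea-sized region of the brain in 1975 to the
# whole brain by 2000.
doublings = (2000 - 1975) / 2.5   # ten doublings in 25 years
growth = 2 ** doublings           # 1024 -- about a thousandfold
```

Whether a whole brain really is only a thousand "peas" of complexity is, of course, the soft spot in the comparison; the point is how quickly sustained doubling closes even gaps of that size.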
The changes that this will bring will be so great that their full impact may well be beyond our imagination. No longer will we perceive ourselves as isolated individuals; we will know ourselves to be a part of a rapidly integrating global network, the nerve cells of an awakened global brain.