© 1997–2005 Ben Delaney. All rights reserved.

This essay, mostly written in 1997, discusses the impact of various technology trends on the way people will look, act, work, and play in the mid-twenty first century.

Beyond Darwin: The Destination of the Species
by Ben Delaney

Introduction
We have all heard Santayana's exhortation that those who are unfamiliar with history are doomed to repeat it. However, history can seem increasingly irrelevant; our modern problems appear new, different, and infinitely challenging. Are there still lessons to be learned from history?

I think so. In fact, the point of this book is to examine the history of technology in the hope of creating a future history, one that will help us avoid certain mistakes of the past and enter the new century in a positive, enthusiastic fashion.

The next century promises to unveil the greatest changes civilization has ever seen. How we deal with those changes will affect humanity for centuries after. I believe that by looking back, evaluating current trends, and making what I hope are a few good guesses, we can see into the fog of the future, and plot a course that brings prosperity and happiness to all the world's people.

The confluence of great advances in computing, medicine, mechanical design, communications, and many other fields promises to unloose ideas, technologies, appliances, and living beings so vastly different from what we are used to that they will seem to be magical. In fact, Arthur C. Clarke postulated that any sufficiently advanced technology is indistinguishable from magic. We will see these "magical" technologies, we will live with them, and they will change each and every one of us, and our culture and civilization, forever.

So what? Haven't we muddled through pretty well for the last few thousand years? What could possibly be so different?

Plenty. I don't think our study of history has prepared us for the changes that many of us will witness in the next 50-100 years. I am convinced that the increasing power and ubiquity of computers and the cybernetic intelligence they embody will have an effect on civilization as great as fire, writing, the printing press, and electricity. In fact, as we look at the coming convergence of computing power far beyond current standards, mechanical technology, biotech advances, new sources of energy, and communication enhancements, we see the outline of a world with the potential to be completely awful, or uniquely utopian. Undoubtedly, it will turn out somewhere in between. However, I hope that by putting these ideas down, I will be able to help stimulate the debates that may create a consensus on how we should proceed, based on an understanding of past and future history.

To say we are at a crossroads is like saying an atomic explosion is a bit noisy. I fully believe, and I think I can demonstrate, that we are entering an era when our very concepts of what it means to be a human being will be called into question. Our world is on the verge of being turned upside down. It is essential that we have a clear understanding of how past civilizations, both those that were successful and those that were not, dealt with such turmoil. More importantly, it is critical that we look forward, beyond the fiscal quarter-by-quarter focus of financial reports, beyond the millennium fever that grips us, beyond computing itself, to the days when computers disappear, not because they are gone, but conversely, because they are so ubiquitous that we no more think about them than we do the electricity upon which modern civilization depends.

This book won't spend much time looking at the past, except to search for important lessons that we need to remember. Others can provide you with a more thorough and scholarly viewpoint than I. But from my vantage point, having spent the past twenty-five years working with and chronicling some of the most amazing advances in computing and technology, I will attempt to provide the future history of the 21st century.

Many of the ideas you will find here are not new, and not my own. I have been blessed by meeting many greater thinkers than I, whose ideas I have taken freely, not to own, but for their insight. The frailties of human memory mean that many of the sources for this material have been forgotten. However, the references and bibliography at the end of this book will provide you with many of the original sources. To those whom I have forgotten or overlooked: you have my gratitude, and my debt.


The critical issues of the mid-21st century
Several key technology trends are converging, portending immense changes for those alive in 2050. These technologies, and their progeny, will change life on Earth forever. They include:

  • Computers get as "smart" as people. Gordon Moore was right. How will this change the world?
  • Nanotechnology changes manufacturing. The ability to build really small machines, with dexterity and intelligence, will enable us to "grow" products, molecule by molecule.
  • Bio-technology enables us to take control of evolution. Darwin explained how we got where we are. But he had no idea of what will happen next.
  • Energy gets really cheap. Energy probably won't ever be free, but it will get really inexpensive, and clean, too.

Let's look briefly at these developments, and why they will be so important.

Computers get smart. Really smart.
In 1965, Gordon Moore, who was later a co-founder of Intel, made the pronouncement that has been dubbed "Moore's Law". He said then that microchips, the hearts, brains, and brawn of modern computers, would double in complexity every 18-24 months. In the more than thirty years since, his prophecy has been proven over and over. Chips have become more complex, and with that complexity, paradoxically, they have become less expensive, both to build and in terms of energy requirements. Adding complexity to a computer chip means adding additional circuits, and adding circuits means adding capacity. It is a straightforward extrapolation to see that somewhere around 2045 microcomputers will be as complex as the human brain. The questions raised by this inevitability are complex, fascinating, and more than a little frightening.
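The arithmetic behind that extrapolation is easy to sketch. The Python calculation below uses assumed figures for illustration: roughly 2,300 transistors for the first microprocessor of 1971, and 10^14 (a common rough estimate of the brain's synapse count) as a stand-in for "human-brain complexity".

```python
import math

# Back-of-the-envelope Moore's Law extrapolation. These figures are
# assumptions for illustration: ~2,300 transistors on the first
# microprocessor (1971), and ~1e14 synapses as a rough proxy for
# "human-brain complexity".
START_YEAR = 1971
START_TRANSISTORS = 2_300
BRAIN_COMPLEXITY = 1e14

# Number of doublings needed to close the gap.
doublings = math.log2(BRAIN_COMPLEXITY / START_TRANSISTORS)  # about 35

# Moore's observation: one doubling every 18 to 24 months.
for months in (18, 24):
    parity_year = START_YEAR + doublings * months / 12
    print(f"{months}-month doubling: brain-scale chips around {parity_year:.0f}")
```

At the slower 24-month rate the crossover lands in the early 2040s, in line with the 2045 figure above; the faster 18-month rate pulls it into the 2020s.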

Of course, these highly-evolved computers won't be isolated. The Internet as we know it today is just the beginning of a solar-system wide network that will share resources as needed, using idle CPUs to assist in tasks that need more horsepower. Imagine 1024 CPUs, each with the power of a human brain, networked together in a parallel computing system. It is difficult to imagine what such systems might accomplish, but we can expect them to be trained on some of our knottiest problems.

The fear of automation that was evident in the 19th century, as the Industrial Revolution took hold, and in the 1950s, when computers first entered the popular consciousness, is about to rear its head again. Only in the 1990s have we started to see people's jobs actually eliminated due to the power and low cost of computing resources. This trend will continue. In the 21st century computers will continue to take over jobs. At first it will be the mean, the dirty, and the dangerous jobs that disappear. But it won't be long before computing systems are doing more and more of the work that we use to justify our existence and provide our sense of self-esteem. At that point, huge social and economic upheaval is likely.

So Moore's mid-twentieth-century prophecy leads us to one of the great challenges of the twenty-first century: finding meaningful work, and meaningful lives, for a growing number of people in every part of the world. As computers take over what most of us do for work, we will have to re-define what work is: what it is for, why we do it, and what we are paid for.


The Aliens are here!
We have already seen the "invasion" of our world by an alien intelligence. However, unlike the sci-fi movies of the fifties, this alien intelligence is one of our own making. In 1997 we are already surrounded by microprocessors: in cars, appliances, calculators, elevators, phones, entertainment equipment.

Some cars already incorporate hundreds of microprocessors, more than the total computing power of the world in 1960. Your personal stereo has more brains than the Apollo missions to the Moon. In your office you are surrounded by computing devices: calculators, fax machines, copiers, printers, network routers, digital phone systems, PA systems, elevator controls, and oh yes, your personal computer, which itself contains a couple of dozen processors aside from the CPU (central processing unit).

The phone system, the Internet, the power grid, and wireless communications in dozens of spectra are connecting all these devices at an accelerating rate. The Personal Area Network (PAN), which will use the conductive properties of our very bodies to connect various computing devices that are worn like jewelry, is in the testing phase now. Soon, all the microprocessors in our world will be talking to each other.

But compared to today's level of computerization, the next decades will see a blossoming of silicon into areas never before considered. By 2020, I expect that nearly every manufactured item will include "intelligence", and that every intelligence will be networked.

When computers are everywhere, where are the computers? Essentially, computers will "disappear". Just as electricity disappeared in the 20th century, thanks to its ubiquity, by the mid-twenty-first century, we won't have thinking machines on our minds. Even the word "computer" may disappear. We may be thinking about the automated machines that have taken over our lives, but expect them to have a new name, with entirely different connotations.

The most common interfaces to these invisible systems will be voice, facial expressions, and body language. These are well on the way today. It won't be long before keyboards, mice, and monitors have been replaced by voice I/O, video recognition of physical gestures, facial expression recognition, head-mounted displays, and neural implants. The computer and user will be connected in an intimate, nearly telepathic embrace.

Beyond silicon - the next generation of computers
There is still lots of evolutionary room for our computers to grow. Current silicon fabrication technologies are working at feature sizes of around 0.25 microns. It seems that these techniques can continue to shrink to about half that scale. (Smaller is better because there are shorter distances for the electrons to move in a circuit, and because smaller devices require less power to operate.) But it is unlikely that we can shrink the size of our micro-circuits much beyond that.

Quantum computing may be the replacement for current silicon semiconductors. Quantum computing uses sub-atomic phenomena to replace transistors with "quantum wells" between which single electrons move. Their movement represents the logical binary on/off functions upon which computing is based. This technology has been demonstrated, and should be ready for application around 2015-2030, just as we reach the end of the line for semiconductor fabrication capabilities.

Quantum computers will operate at speeds approaching that of light itself, and allow us to pack features much more densely than is now possible. Their low power requirements may make battery replacement an obsolete concept. Their basic characteristics - small size, high speed, low power requirements - will make ubiquitous computing not just useful, but cheap and practical.

At the same time that quantum computers are being developed, other scientists are heading in a significantly different direction in computer design. Combining the disciplines of neurobiology and computational design, these researchers are deconstructing the brain, and using that knowledge to build biologically-based computing devices. These systems use artificial cells, designed to change state on demand, as the basic building blocks for bio-computers.

"Wetware" systems based on bio-computers will provide certain capabilities that strictly mechanical systems cannot. Probably their biggest advantage will be the ability to provide an interface between biological systems, such as human beings, and computing systems. These systems should start to appear near the middle of the next century, and may provide truly amazing, actually God-like abilities, such as telepathic communication.

There are two very interesting tangents to these developments: the "breeding" of computer systems, and the augmentation of human beings.

Breeding better computers
One of the side effects of artificial life (A-life) research is the development of artificial "genes" to control synthetic creatures. These genes act similarly to organic genetic material; they divide and recombine sexually or mutate asexually, and in the process form new genetic combinations. These new genes cause new characteristics in the creatures containing them. A sort of natural evolution in artificial life takes place, producing faster change and more efficient life forms than would be possible through traditional programming techniques.

This forced evolution creates a-life critters with more diversity and stronger survival traits than those that are hand-coded. An interesting side effect is that their programming (genetic structure) often becomes more complex and redundant. This mimics traits observed in natural systems. So when a goal is set for a-life, and evolution let loose, we may be surprised to find the goal met in entirely unexpected ways, with unimagined structures, and often with a more efficient result. Soon, entire computer systems will be developed in this fashion.
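The select-recombine-mutate cycle described above is what computer scientists call a genetic algorithm. Here is a minimal, self-contained Python sketch; every parameter, and the toy fitness function (counting 1 bits), is an arbitrary choice for illustration, not anyone's actual a-life system.

```python
import random

random.seed(1)  # deterministic run, for reproducibility

# Toy genetic algorithm: genomes are bit-strings, and "fitness" is simply
# the number of 1 bits, a stand-in for whatever goal the experimenter sets.
GENES, POP, GENERATIONS, MUTATION = 32, 40, 60, 0.01

def fitness(genome):
    return sum(genome)

def crossover(a, b):
    # Sexual recombination: splice two parent genomes at a random point.
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(genome):
    # Asexual mutation: each bit flips with a small probability.
    return [g ^ 1 if random.random() < MUTATION else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]            # selection: the fittest survive
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

print(max(fitness(g) for g in population))     # approaches GENES as it evolves
```

Even this toy version shows the pattern: no one writes a genome by hand, yet after a few dozen generations the population converges on high-fitness individuals.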

The upside of this development is that computer systems will improve much faster than they would if development were limited to human invention. We will also find that our computing systems will develop unique traits that might otherwise have taken decades to be thought of and implemented. As automated design and manufacturing systems come to dominate computer manufacturing, we will see the rate of increase in capability and usability accelerate exponentially.


Everything virtual is real today
In the '70s, marketing Cassandra Faith Popcorn described a phenomenon she called "cocooning". With it she identified the increasing trend of middle-class consumers to stay home: for food, for fun, for safety, and for convenience. The '80s and '90s saw this trend take hold solidly.

Now we are seeing corporate cocooning. Businesses are encouraging their employees to work at home because it saves on expensive real estate and overhead, and because they have found that their tele-commuters actually are more productive than their interrupt-driven, office-bound staff. Companies are discouraging travel by installing email, intranets, voice mail, video conferencing and video phones, fax to the desktop, and many other collaborative work technologies. Engineering concerns are creating teams of engineers continents apart, who communicate via the Internet, and compare ideas and prototypes in virtual conference rooms. We have virtualized the desktop, the telephone, and the business meeting. This trend, too, is here to stay.

The line between real and virtual is about to disappear. Just as we don't differentiate much between a phone conversation and a face-to-face talk, we will soon cease to see the difference between examining a real prototype and a virtual one. As electronic communication invalidates time zones and distance, we will no longer worry about what time it is in New York, Tokyo, or Johannesburg. We will shop in virtual stores, try on virtual clothes, visit virtual national parks, and make love with virtual partners. We will learn history by watching virtual events and visiting virtual historic sites. The only surely real things will be food and drink. And I'm not certain about them.

The virtualization of the world will change our attitudes and activities. It will provide new ways to work and play and new companions with which to do them.

Itty-bitty machines take over industry
In the early years of the 21st century, nanotechnology will start to live up to its promises. We will see the creation of artificial, self-reliant "bugs": miniature robots that do what we ask. These "bugs" will be manufactured, but at atomic scales. They will be mechanical, but behave as if they are living creatures. They will be used for a variety of jobs, including medical treatments, manufacturing, building, mining, and refining.

The early '90s saw the first nano-machines: tiny gears and motors that demonstrated our ability to work at a sub-microscopic scale. These first machines did not do any useful work. But as we continue to develop and refine the techniques of building nano-machines, we will find work for them to do.

However, building tiny machines is not the end of nanotech development. It is only the beginning. The specialists who are developing this amazing skill have higher goals in mind. At NASA and in many research labs around the world, the dream is nano-machines that are self-replicating; in other words, able to build copies of themselves. Not only will this take people out of the loop; thanks to genetic algorithmic programming, these "breeding" nano-machines will actually evolve, and at a rate that puts nature to shame.

NASA envisions such systems being used to build infrastructure on the Moon, and elsewhere, that will allow people to live comfortably in hostile environments. They imagine sending a cargo of nano-machines to the Moon, where the mini robots will find and excavate aluminum, iron, silicon, and other useful minerals, then smelt and refine basic building materials from the raw ores. Once they have replicated to sufficient numbers, they will start on their assigned tasks: building housing, energy sources, oxygen and water extraction systems, and the rest of what people need to survive. These microscopic servants will enable our exploration and colonization of the solar system, and perhaps even further.

Nanotech may have its dark side, however. We can certainly expect at least one incident of a nano-virus epidemic, where rapid evolution and self-replicating programs create nano-machines that don't work as planned. Like bacteria, they may infect our larger machines, causing mechanical or computational breakdowns. It's conceivable that they could even inhabit human or animal bodies, resulting in bizarre new diseases. Just as our experiments in genetic engineering have resulted in surprises, autonomous nano-machines may also throw us a few curves.


Designer children, designer genes
If nanotechnology foretells the "breeding" of machines and products, biotechnology offers the ability to wipe out genetic diseases, design our children to custom specifications, and provide us with near immortality. The decoding of the human genome, one of the most massive international projects ever undertaken, is proceeding ahead of schedule, and will be done early in the next century. This incredible addition to our understanding of how people are put together will open a Pandora's box of exciting and frightening possibilities.

It seems that every week a new gene is identified as the source of an infirmity, or the solution to a problem. As our knowledge of the functions of individual genes increases, so will our understanding of their interrelationships. Genetic therapies will soon follow, providing cures for obesity, baldness, cystic fibrosis, muscular dystrophy, and hundreds of other genetic maladies. Cancer seems likely to succumb to genetic treatments, as we devise tumor-specific antibodies that are 100% effective.

Curing diseases won't be the only impact of genetic research. The more we learn about how cells work, about how they live and die, the closer we get to stopping them from dying entirely. The generation of middle-aged adults alive today may very well see their "normal" life spans stretched out to 150 years or more of healthy, productive life. Our children could live forever. Not virtually forever — really, FOREVER!

In addition to solving genetically-induced problems, bio-tech research is well on the way to cloning organs for replacements, and regenerating lost body parts. It won't be too long before a patient who needs a new kidney or liver will have one grown to order, genetically perfect, and genetically identical to her own tissue. This will eliminate rejection problems caused by the immune system, and will obviate the need for a lifetime of immune-suppressive drug therapy.


But wait! There's more!
Already, genetic testing enables pediatricians to test an embryo for genetic problems, and advise the parents regarding the likely prognosis. This technology may soon dispense with congenital conditions like Down syndrome and spina bifida. But we know parents won't stop there.

In many parts of the world, male children are strongly preferred. Won't parents in those societies want to do a simple blood test to determine if their child-to-be is male or female? This technology exists today, and is being applied regularly. But why stop there? Wouldn't you love to have a red-headed, really smart girl with green eyes, a propensity for math and ballet, and a photographic memory? Well, then, Mr. and Mrs. Parent, just order her up. No time for pregnancy? Just make three deposits (sperm, ovum, and money), and in 7-9 months we'll call you to pick up your perfect little baby. You can even visit her during gestation in our artificial womb.

Why stop there? Maybe someone will decide that it would be useful to have people with four arms. Perhaps we would like our dogs to be smarter, say as smart as a chimpanzee, and help us out around the house. Maybe it would be handy to have chimpanzees that could talk, so we could use them as workers or servants. The direction of biotechnology will offer these options. The big issue will be how we decide what is acceptable.


Plug me in. It's time for Wapner.
Another major topic of bio-tech research is interfacing to the human nervous system. We already are able to grow neurons on the connection pads of silicon chips. We have created cochlear implants that bring sounds to some deaf people. Research is progressing well on tapping the optic nerve, to bring sight to the blind. Prosthetic limbs will soon be controlled by our thoughts, and provide feedback about the temperature, texture, and resistance of objects that they encounter. But none of that is the holy grail of neural implants.

Sometime in the next fifty years or so, we will physically plug into our computers. We will allow machines to augment our minds. Our failing memories, our feeble eyesight and hearing, our limited mathematical abilities, will all be miraculously improved by adding the intelligence and storage capacity of thinking machines to our brains. Thanks to implanted communication devices, we will have effective telepathy. Talking to a computer will seem incredibly inefficient when you can just think about what you need, and have information overlaid on your field of view, or integrated into your thoughts in ways that are nearly unimaginable now. By the end of the next century, not having an implant installed at birth might forever handicap a person. We will have incredible powers. We will be like gods.

Time to fill up the car again? Didn't we do that last year?
Like the weather, everyone talks about energy efficiency, but no one seems to do much about it. In reality, that's not the case at all. Though we have a huge infrastructure investment in internal-combustion-powered vehicles, energy generation and transmission, and petroleum recovery, the big companies that have the most to lose as these structures change see the writing on the wall. Driven partly by the increasingly accepted fact of global warming, partly by the desire to sell millions of cars in places with no gas stations, and partly by the ever-increasing requirements for electricity as the world gets wired, new energy sources are cruising down the highway.

Most immediately, we will see increasing reliance on electric cars, and vehicles with hybrid power, such as diesel generators driving electric motors. These vehicles will provide a bridge to the next generation of fuel-cell-powered electrics. Fuel cells can be made very inexpensively if you are building millions of them; they exhaust only water, and they use the most abundant energy source available, hydrogen gas. They will be delivered in a variety of sizes and capacities, suitable for powering factories, homes, buses, ships, whatever. Burning stuff to make heat to make electricity will become a quaint custom remembered in history books.

Other sources of energy will continue to develop, too. Work on hydrogen fusion, the power of the stars themselves, is almost to the break-even point. Any day, we will hear of a fusion reactor that produces more energy than it consumes. In 50-100 years, fusion reactors may be common where really large amounts of energy are needed, such as factories, ships, space vehicles, and the like.

In addition, many Earth-friendly technologies are improving and becoming feasible. Wind power now supplies nearly 10% of the electricity in California, and is suitable in many remote locations. Solar cells are becoming more efficient, and in a few years will be common for small-scale electrical production. Increasingly efficient generators will make hydropower even less expensive to produce.

In addition to new and more efficient ways to produce electricity, energy conservation will make a significant contribution to reducing the cost of power. Room-temperature superconductivity will probably be achieved within a few years. This promises to enable the building of vastly more efficient power transmission lines, energy storage facilities, generators, and motors. Thanks to the total lack of resistance in a superconductor, there is no energy wasted in overcoming the atomic friction of the wire. This makes superconductor-based machines far more energy efficient. A superconductor storage ring under your home may download a year's worth of power at a time, and siphon it off as you need it. Superconducting electromagnets will lift and propel trains capable of nearly unlimited speeds.
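The efficiency claim follows directly from the resistive-loss formula P = I²R: heat dissipated grows with the square of the current, and vanishes entirely when resistance is zero. A tiny illustration, with numbers invented for the example:

```python
# Resistive ("Joule") loss in a line carrying current I through resistance R:
# P = I^2 * R. Both the current and the resistances below are assumed values.
current_amps = 1_000.0
lines = {"copper line": 0.5, "superconducting line": 0.0}  # ohms

for name, resistance in lines.items():
    loss_watts = current_amps ** 2 * resistance
    print(f"{name}: {loss_watts:,.0f} W lost as heat")
```

The hypothetical copper line wastes half a megawatt as heat; the superconductor wastes none, which is why transmission lines, generators, motors, and storage rings built from superconductors can be so much more efficient.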

Cheap power plus increased efficiency equals an end to limits. How will you use all that power?

What does this all mean to you?
I think that the changes wrought by these new and evolving technologies will be as profound as those caused by the harnessing of fire, the invention of the wheel, and the development of moveable type. It is not too soon to start thinking about how we want ourselves and our society to look in fifty years. By starting to think about these issues now, we will have time to make good decisions, and start building the foundation for our next evolutionary steps.

We won't be living in a Jetsons world, the 1950s moved to the space age. Our environment, our society, our families, our very bodies will be profoundly different than they have ever been. Our social structures, spirituality, and politics will all change too.

By around 2045, we will have developed chips as complex as a human brain. That is, robotic silicon architects will have passed designs to automated chip fabrication facilities, which will turn out chips as complex as the human brain. Those chips will be assembled into machines that do the work, control the power generation, direct traffic (after having built the cars), fly planes, do pharmaceutical research, et cetera, et cetera.

Where are the people in that picture?
That question may become the single biggest issue in the mid twenty-first century. Microprocessors are already so complex that people have very little to do with their actual design. Some parts of computers are so small that human fingers can't assemble them. Many of our major appliances, including automobiles, are built in semi or fully-automated factories. Factory workers are becoming burger flippers.

The service industries that are currently providing (in the USA) 75% or more of new jobs will be taken over next. Then what will the burger flippers do?

Perhaps more importantly, diagnostic programs are going to supplant doctors, silicon-para-legals will replace some lawyers (though there will probably be a law against that!), DTP programs won't need designers, and neural-networks will prove more accurate than your familiar TV weatherface.

The issue will be both economic and philosophical. The economic problem will surface first. As automation finally fulfills the threat first voiced in the 1950s, putting large numbers of people at all skill levels out of work, those people will lose buying power, and those perfect, computer-designed, computer-built, computer-marketed goods will go begging. You can't have a consumer economy without consumers, and you can't have consumers if everyone is broke. Finding ways to feed and house people will be the first challenge. Providing goodies will come up later.

There's nothing like the specter of economic melt-down to inspire new heights of philosophy. The religious fanatics of every cloth will be coming out of the woodwork, claiming to have the answers, mostly firmly rooted in the failed dogmas of past eras. More serious thinkers will have to come up with new meanings for life, new reasons to feel good, new ways to rationalize new lifestyles. This will take a long time. There will be a lot of hare-brained ideas that attract large followings. History suggests that many people will die trying to prove how kind and correct they are. How we deal with these challenges will determine the course of civilization for decades, if not centuries to come.

I suspect that we may find uniquely human abilities that computers can't replicate very well. Perhaps what we call creativity is the key. Maybe humor is uniquely human. We don't know now, but we will certainly have an incentive to find out.

So, if you're so smart, where's your dataport?
There's another important issue that really complex computers will force upon us. When chips get as complex as human brains, networked computers will be smarter than us. Just as two heads are better than one, we already know that two, or four, or sixty-four, or 1024 microprocessors are a LOT better than one. When computers get smart, when they get self-aware, what will they think about? More importantly, what will they think about us?

Well, if we are spending a lot of valuable resources wiping out other humans with whom we disagree on the meaning of life, the proper skin color, or how best to worship God, our computing devices may very well see the human race as a threat. A dangerous, unpredictable, sloppy, wasteful threat to the power, raw materials, and infrastructure the computers need to continue to meet their objectives.

Sort of like finding mice in the pantry, foxes in the chicken coop, lampreys on your trout, cockroaches under the sink, beavers undermining your railroad, birds shitting on your statue, whatever.

Vermin are vermin.
How will people compete with computing systems that will be faster, more logical, less emotional, and clearer communicators than we are? Will we become pets of the computers? Will we be reduced to hiding under virtual sinks, eking out a subsistence? Or will we become what we fear?

I think that the inevitable choice will be hybridization. People will have bio-chips (logical devices engineered with biological materials) implanted, augmenting their memory, communications, vision, hearing, and logical functions. Plain, non-augmented people will not be able to compete, any more than we can do business today without a phone or fax.
Bioengineering will also create "designer babies", children with various traits chosen from amongst their parents' myriad genetic options. Birth defects will disappear. Parents will order eye color, body types, perhaps even aptitudes.


Evolution will become an engineering problem.
That's how people will compete with computers. We'll stay a step ahead by employing them to help us evolve. We will go beyond Darwin.

And so...
Alan Kay, the visionary inventor of many of the concepts that shaped modern computing, said, "The best way to predict the future is to invent it." I hope that this book will help us explore the future, and see the right road to take. We are creating the future every day. Let's make that an intentional activity, and do our best to head it in the right direction.
