REASON * December 1998

Yesterday's Tomorrows: 1968-1998
Books that got the future right--and wrong

1998 marks the 30th anniversary of REASON. In the tumultuous year of our
founding, few people would have envisioned the world as it is today. The future has a way of surprising us. Some guides are, however, more prescient than others, and some ways of analyzing trends more promising. For this year's Special Book Issue Symposium, we asked a distinguished group of contributors to identify three books on the future--one that accurately identified important factors that made a difference between 1968 and 1998, one that fell flat, and one that appears to best identify the trends shaping the next 30 years--and to explain their choices.

The books did not have to explicitly engage in prediction: Indeed, often the most prescient works are those that identify something about current, or even past, conditions that turns out to be important in shaping the future. Conversely, many a book has been written on a current movement or "crisis" that fizzled.

Walter Truett Anderson

I don't much like the title of Jean-Marie Guéhenno's The End of the Nation-State (University of Minnesota Press, 1995, translated by Victoria Elliott), partly because I don't think the nation-state's end is anywhere on the horizon. Nor, for that matter, do I agree with many of the specific propositions in this quirky, idiosyncratic work. Nevertheless, I think Guéhenno captures the essence of what is and will be going on in the world much better than Samuel Huntington's much-discussed The Clash of Civilizations. Guéhenno, who recently served as France's ambassador to the Western European Union, argues that the kind of world now emerging is in a way more united and at the same time without a center, that fixed and definite territories are becoming largely irrelevant, and that networks are becoming more important than 19th-century institutions of power: "We are entering into the age of open systems, whether at the level of states or enterprises, and the criteria of success are diametrically different from those of the institutional age and its closed systems. The value of an organization is no longer measured by the equilibrium that it attempts to establish between its different parts, or by the clarity of its frontiers, but in the number of openings, or points of articulation that it can organize with everything external to it."

Conversely, there is much in Fritjof Capra's The Turning Point (Simon & Schuster, 1982) that I do agree with--his insistence on the importance and value of ecological thinking, feminism, and the human potential movements of the 1960s and '70s--at the same time that I heartily reject his simple pronouncement that all those hold The Answer and are destined to build the civilization of the future. It is a simplistic model of change, and such thinking historically proves not only wrong but dangerous. Capra reveals those dangers in his cheerful willingness to impose New Age agendas on the bad guys with the full power of the state: Nationalize oil, restructure society, do whatever is necessary to get everybody thinking right. In a fairly typical passage of policy recommendations he writes: "An important part of the necessary restructuring of information will be the curtailing and reorganization of advertising....legal restrictions on advertising resource-intensive, wasteful and unhealthy products would be our most effective way of reducing inflation and moving toward ecologically harmonious ways of living."

Finally, I nominate Susantha Goonatilake's Merged Evolution (Gordon and Breach, 1998) as a useful peek into the future. Goonatilake (of Sri Lankan birth, now based in the United States) brilliantly explores the interactions among what he calls three different "lineages" of evolution--biology, culture, and artifacts--and focuses on the capacity of information and communications technologies to, as he puts it, "cut a swath through the biological and cultural landscape as no other technology has hitherto done." He does not explicitly address the political and organizational themes that Guéhenno raises, yet his analysis provides a good understanding of how we are entering into an age of open systems--not only political systems but also biological ones--and also why narcissistic visions of green goodness are an inadequate guide to the 21st century.

Walter Truett Anderson is a political scientist and author of The Future of the Self (Tarcher/Putnam).

Ronald Bailey

In the late 1960s and early '70s a plethora of terrible books about the future were published. In 1968, Paul Ehrlich published the neo-Malthusian classic The Population Bomb (Ballantine). "The battle to feed all of humanity is over," he notoriously declared. "In the 1970s the world will undergo famines--hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now." Ehrlich was far from alone. In 1967, the Paddock brothers, William and Paul, asserted in Famine 1975! (Little, Brown) that "the famines which are now approaching...are for a surety, inevitable....In fifteen years the famines will be catastrophic." In 1972, the Club of Rome's The Limits to Growth (Universe Books) suggested that at exponential growth rates, the world would run out of gold by 1981, mercury by 1985, tin by 1987, zinc by 1990, petroleum by 1992, and copper, lead, and natural gas by 1993. The end was nigh. The modern heirs to this strain of doomsaying include Lester Brown at the Worldwatch Institute and Vice President Al Gore.
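
The depletion dates in The Limits to Growth came from what the book called an "exponential index": treat known reserves as fixed while consumption compounds at a constant rate, then compute when cumulative use exhausts the reserve. A sketch of that arithmetic (in symbols only, not the book's own data): with reserves R, current annual consumption C, and consumption growing at rate r, the exhaustion time T satisfies

    \int_0^T C e^{rt}\,dt \;=\; \frac{C}{r}\left(e^{rT}-1\right) \;=\; R,
    \qquad\text{so}\qquad
    T \;=\; \frac{1}{r}\,\ln\!\left(1+\frac{rR}{C}\right).

At a 4 percent growth rate, a reserve that would last 100 years at constant consumption is gone in roughly 40. The formula itself is sound; what the following decades discredited were the assumptions feeding it--fixed reserves, no price-induced substitution, no innovation.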

But the silliness was not confined to the environmentalist front. Take a look at John Kenneth Galbraith's 1967 paean to planning, The New Industrial State (Houghton Mifflin), in which he asserted: "High technology and heavy capital use cannot be subordinate to the ebb and flow of market demand. They require planning and it is the essence of planning that public behavior be made predictable--that is be subject to control."

Galbraith, too, has heirs--most notably, Robert Reich and Lester Thurow. In his 1980 book The Zero-Sum Society (Basic Books) Thurow suggested that "solving our energy and growth problems demand [sic] that government gets more heavily involved in the economy's major investment decisions....Major investment decisions have become too important to be left to the private market alone." Thurow ended with this revealing claim: "As we head into the 1980s, it is well to remember that there is really only one important question in political economy. If elected, whose income do you and your party plan to cut in the process of solving the economic problems facing us?"

Ultimately, the neo-Malthusians and the zero-summers are pushing the same egalitarian agenda: Stop growth and then divvy up the static pie.

Fortunately, a far more prescient and optimistic intellectual tradition opposed the melancholy millenarians. In 1967, Herman Kahn, the founder of the Hudson Institute, published The Year 2000 (Macmillan), in which he laid out a variety of scenarios for the next 33 years. Today's U.S. GDP is at the low end of Kahn's most likely scenario. (Hudson Institute analysts claim that the breakdown of American public education is to blame for this lower-than-expected GDP.) But Kahn was spot on when he predicted the flooding of women into the work force, the growing equality between the races in the United States, and the boom in Japan. Later, in direct contrast to Thurow, Kahn published The Coming Boom (Simon & Schuster, 1982), which predicted the enormous economic growth that the United States has experienced since then. Kahn also pleaded for the reestablishment of "an ideology of progress":

"Two out of three Americans polled in recent years believe that their grandchildren will not live as well as they do, i.e., they tend to believe the vision of the future that is taught in our school system. Almost every child is told that we are running out of resources; that we are robbing future generations when we use these scarce, irreplaceable, or nonrenewable resources in silly, frivolous and wasteful ways; that we are callously polluting the environment beyond control; that we are recklessly destroying the ecology beyond repair; that we are knowingly distributing foods which give people cancer and other ailments but continue to do so in order to make a profit.

"It would be hard to describe a more unhealthy, immoral, and disastrous educational context, every element of which is either largely incorrect, misleading, overstated, or just plain wrong. What the school system describes, and what so many Americans believe, is a prescription for low morale, higher prices and greater (and unnecessary) regulations."

Kahn turned out to be right about the boom, but most of the intellectual class is still burdened with an anti-progress ideology which remains a significant drag on technological and policy innovation.

As for the future, Kahn's Hudson Institute colleague Max Singer is one of the surest guides. If you want to know what the next 50 years will look like, check out Singer's Passage to a Human World (Hudson Institute, 1987). He makes the often overlooked point that poor people in the developing world can see their futures by looking at our present. And because poor countries have a road map to development, they will be able to grow wealthier and healthier much faster than we did.

One of the important legacies of Kahn and Singer is the insight that a bright future depends on first believing that such a future is possible. Overcoming the pervasive pessimism of the intellectual class is a major piece of work left for us to do in the 21st century.

Contributing Editor Ronald Bailey is editing Earth Report 2000: The True State of the Planet Revisited, which will be published by McGraw-Hill next spring.

Gregory Benford

Surely the most infamous attempt to predict the economic future was the Club of Rome's The Limits to Growth (Universe Books). In 1972 the authors could foresee only dwindling resources, allowing for no substitutions or innovation. The oil shocks soon after lent their work credence, but markets have since erased their gloomy, narrow view of how dynamic economies respond to change. A famous bet over the prices of metals decisively refuted the central thesis of The Limits to Growth in 1990: The prices had fallen in real terms, contrary to the Club of Rome's predictions.

Rather than looking at the short run, and getting that wrong, consider peering over the heads of the mob to trace instead long-run ideas that do not necessarily parallel the present. An example of this is J.D. Bernal's The World, the Flesh and the Devil (long out of print but available online at physserv1.physics.wisc.edu/~shalizi/Bernal), which examined our prospects in terms that seemed bizarre in 1929 but resonate strongly today: engineered human reproduction, biotech, our extension into totally new environments such as the deep oceans and outer space. This slim book found its proper and continuing audience long after its first edition went out of print, and among hard-nosed futurologists it is still considered a neglected masterpiece.

To my mind, the best way to regard the future is to listen to scientists thinking aloud, making forays into territories seldom explored in our era of intense narrowness. Prediction is speculation, but it often arrives well-disguised. Sometimes it is a brief mention of a notion awaiting exploration, as when James Watson and Francis Crick alluded, in the last sentence of their paper reporting the discovery of DNA's double helix, to the implications for reproduction: "It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material."

In a similarly laconic vein, consider a slim tome of stature comparable to Bernal's, Freeman Dyson's Imagined Worlds (Harvard University Press, 1997). Dyson in his lofty view shares an advantage with science fiction writers. Both are good at lateral thinking--the sideways swerve into future scenarios justified not by detail but by their intuitive sweep. Refusing to tell us how we get to their visions, Dyson and others take in a wider range of possibility than the hampered futurologists. "Science is my territory," he writes, "but science fiction is the landscape of my dreams."

In five admirably concise essays, Dyson takes us from the near past to the far future. Science and technology must progress through trial and error, he observes, while politics seeks to mandate outcomes and thus brings disasters. Dirigibles lost out to airplanes partly because of political agendas imposed upon them. The British Comet jetliner failed because management forced fast engineering results. Technology, then, must evolve through its own Darwinnowing.

Necessarily Dyson favors small science ("Tolstoyan") over big projects ("Napoleonic"). In our era of dwindling budgets, this seems the winning view. Luckily, small science governs in the crucial fields of neurobiology and microbiology, which will shape the next century. Attempts to impose big agendas on biology should be resisted.

Cleanly written, elegant in insight, this reflection by one of the great scientist-writers of our time beckons us to the far horizons of the human experience. Such vistas are more interesting, more inspiring, and ever more useful than the short views of the Club of Rome.

Contributing Editor Gregory Benford is a professor of physics at the University of California, Irvine. His most recent novel is Cosm (Avon Eos).

K. Eric Drexler

It was a great surprise when I realized that Robert Ettinger's The Prospect of Immortality (Doubleday, 1964) had actually made a sound argument. In the early 1970s I had heard of its thesis--that low temperatures could stabilize otherwise hopeless patients for later repair--but this looked like a technical impossibility. Cells often revive after freezing, but never whole mammals.

This observation, it turns out, is beside the point. By the late '70s, it had become clear that the onrush of molecular technologies would one day lead to thorough control of the structure of matter, including molecular machine systems able to examine and repair biological structures molecule by molecule. Suddenly Ettinger's case made sense. When I finally read his book, I found that he had anticipated molecular-level repair of the sort we now see how to develop. "Can mammals revive from freezing spontaneously?" is the wrong question, once molecular biorepair enters the picture. For us, the key question is instead, "Does freezing somehow erase the information content of the brain?"--which it clearly doesn't.

This idea of long-term, low-temperature first aid, a.k.a. biostasis, is catching on, especially among those who think of their minds as information processes. Watch for those medical bracelets (freely translated, "In case of system crash, do not discard; call...."), especially up here in Silicon Valley. The San Jose Mercury News has called this a "Silicon Valley trend."

Soon after realizing that biostasis would work, I came across an impressively false work of prediction: Entropy, by Jeremy Rifkin (Viking, 1980). It explains that our world is doomed and that human action must be severely limited, due to the inevitable "dissipation of matter" described by the Fourth Law of Thermodynamics. Senators, academics, and futurists endorsed the book, but it turns out that this "Fourth Law" isn't in the textbooks and is simply false, making the work an edifice of the purest piffle. Rifkin later fuzzed his justifications, but his call for salvation through oppression stays clear.

In their discussions on the future of technology, authors Chris Peterson and Gayle Pergamit observe: "If a thirty-year projection 'sounds like science fiction,' it may be wrong. But if it doesn't sound like science fiction, then it's definitely wrong."

Keep that in mind when you read Marc Stiegler's forthcoming Earthweb (Baen Books, April 1999), a novel that plausibly portrays a key part of the future. Stiegler sketches what the Web can become when it grows up--a fast, flexible marketplace of ideas and reputations. He combines the "idea futures" work of Robin Hanson with the "information markets" work of Phil Salin to depict a new and productive spontaneous order. The infoworld Stiegler describes may arrive in the next 10 years, soon enough to shape much of the next 30.

Imagine what the world might be like if good ideas more consistently won and bad ideas more consistently lost. Better media and incentives can help.

K. Eric Drexler is chairman of the Foresight Institute (www.foresight.org) and author of Engines of Creation (Doubleday) and Nanosystems (Wiley).

Charles Paul Freund

More nonsense has been written about television than about anything else in the last 30 years, perhaps in the whole of human history. TV, while indisputably reordering life, has purportedly made its audience stupid, inattentive, illiterate, violent, and worse. Choosing the single most ridiculous book on the effects of TV is a challenge, but Jerry Mander's 1978 rant, Four Arguments for the Elimination of Television (Morrow), is in a class of its own.

Between the covers of this one book are assertions that TV: is a form of sensory deprivation; is addictive; "trains people to accept authority"; is physically unhealthy; suppresses the imagination; hypnotizes its public; is the equivalent of "sleep teaching"; "redesign[s] human minds into a channeled, artificial, commercial form"; and much, much more. Mander wanted TV banned--literally banned--because it was turning people into passive zombies who would do anything they were told. Was he right? Ask the broadcast networks.

James B. Twitchell doesn't like TV much more than Mander does, but he understands it a lot better. "The purpose of television," Twitchell writes in Carnival Culture (Columbia, 1992), "is to keep you watching television." How does it try to achieve that goal? By showing us whatever it thinks we want to see. "Television is where we go to be hooked," he writes. "It's our carnival."

Twitchell's book isn't about TV; it's about "taste," and what has happened to it in recent decades. According to him, what happened was the collapse of the old taste hierarchy, the empowerment of the cultural consumer, and the triumph of the "vulgar." There's great power in the vulgar, he reminds us, which is why it was once institutionalized at the margins--as in the annual medieval carnivals--by those who sought to control culture. Now, he writes, thanks to modern media and the freedom made possible by technology, the popular carnival has displaced the high church of good taste.

Where are we and such media taking each other? Sherry Turkle is investigating an interesting line of possibility. In Life on the Screen (Simon & Schuster, 1995), Turkle sees whole new forms of personal liberation becoming available online. Not only new communities of people, but new people. Life online, she suggests, makes possible a re-evaluation of identity itself.

Turkle's work concentrates on multi-user domains and the great variety of selves that users can experiment with in such environments. This is a new dimension to our relationship with media and with technology, she argues, one that is shifting our understanding of self, other, and machine.

Turkle may or may not be right about what will happen in the next 30 years, but she and Twitchell are both right about the power that will shape the cultural future: It belongs to media's users. Mander's idea--that such users are the dupes of a technological oligarchy--has been virtually sacralized by a dyspeptic anti-media, technophobic class. But if there's one cultural lesson that the past 30 years have to offer, it is that however much this class pouts, the future doesn't care.

Charles Paul Freund is a REASON senior editor.

Michael Fumento

Whoever said even a stopped clock is right twice a day never heard of Paul Ehrlich. This professional fearmonger switched from studying butterflies to doomsaying in 1968, when he published The Population Bomb (Ballantine Books). Among its spectacular claims: "The battle to feed all of humanity is over. In the 1970s the world will undergo famines--hundreds of millions of people [including Americans] are going to starve to death in spite of any crash programs embarked upon now."

Between the "green revolution" in plant technology, a flattening Third World population curve, and imminent population shrinkage in many industrial countries, this prediction, like every major one Ehrlich uttered, fell flat. Still, The Population Bomb remains the standard to which all gloom-and-doom writers aspire. Fortunately, Ehrlich got the ultimate confirmation of his foolishness in 1990, when he received the MacArthur Foundation "genius award."

Ehrlich is always wrong because he can't comprehend that human brains are larger than those of butterflies; that we can and do adapt. As the late Julian Simon showed in his classic 1981 book The Ultimate Resource (Princeton University Press, updated in 1996), when the going gets tough, the tough rise to the occasion and make things as good as--and usually better than--they were. At least, that's true in a free society with a relatively free market. Simon's book is the standard by which environmental myth-busting books should be measured.

Biochemist and medical journalist Alexandra Wyke's 21st Century Miracle Medicine: RoboSurgery, Wonder Cures, and the Quest for Immortality (Plenum, 1997) is very much in the Simon mold. No talk here about turning old folks into Soylent Green to feed the teeming masses. Instead, she says, advances in biotech, computers, and information technology will change medicine so drastically in our lifetimes that today's therapies may eventually be equated with witch doctoring.

Gene therapy and genetic screening, Wyke says, will tremendously reduce the incidence of cancer, cardiovascular disease, and some neurological diseases. So-called "newly emergent" infections will quickly be controlled by medicines not discovered by happenstance but designed from the ground up by supercomputers that provide information far more useful than that from rodent testing. Surgery will depend not on the steady hand and experience of the doctor but on devices such as the recently invented ROBODOC, combined with new imaging technology and computers that essentially make flesh and bone transparent in 3-D images, allowing machines to make cuts or dissolve tumors and blockages in exactly the right place.

Initially, such developments will drive up health care costs, Wyke says, but as they proliferate, cut patient recovery times, and save productive lives, they will more than pay their way. She predicts that by 2050, the average life span in developed countries will be a century. Doctors won't disappear, but their role will greatly diminish as computers and robots are increasingly used to diagnose illness, prescribe medicine, and perform surgery.

Predictions beyond 20 years are usually more speculation than science, but I suspect Wyke's forecast is on target. We are already undergoing a medical revolution, with treatments and cures coming at a pace that's furious compared to just a few years ago. Luddites like Jeremy Rifkin will have to be defeated, and death never will. But hold on to your chairs, because we are on the verge of some very exciting times.

Michael Fumento is the author of four books on science and health issues and is writing two others, including a handbook on science and health issues for journalists.

Steven Hayward

First the wrong: Jacques Ellul's The Technological Society (Alfred A. Knopf), which was first published in France in 1954 but did not really make a mark until its publication in the United States in 1964, after which it became a favorite of the highbrows within the 1960s counterculture. Ellul, one of those curious figures who made French sociology so much more interesting than the Anglo-American version of the discipline, wrote several worthy books (some of them on theology), but The Technological Society was not one of them. It promoted the commonplace view that our advance toward a high-technology future would be dehumanizing. He wondered how we would manage to control this portentous technology. The collective problem posed by technology has turned out to be illusory, of course, as technology empowers people and diminishes government control.

The prophetic: When Charles Murray published Losing Ground to great fanfare in 1984, a few graybeards noted that it vindicated Edward Banfield's much-despised 1970 book The Unheavenly City (Little, Brown). Banfield challenged the premises of the War on Poverty while it was still in its ideological heyday (even though ordinary citizens had grown tired of it by 1968). He argued that lack of money was the least of the problems of the underclass and predicted that government programs aimed at solving urban poverty were sure to make things even worse. By 1984, Murray was able to fill in the details of Banfield's prophecy. You can draw a straight line from Banfield to Murray to today's welfare reforms, which impose time limits on eligibility, emphasize work, and require individual responsibility.

The future: Francis Fukuyama's 1992 book The End of History and the Last Man (The Free Press) deserves a second look, in part because current troubles in the world have given Fukuyama some second thoughts about whether he was correct that liberal democracy represented the final stage in the political evolution of mankind. But the second aspect of his book--the Last Man--also should prompt some fresh chin pulling. The sluggish public reaction to President Clinton's scandals suggests that we are indeed in the condition of Last Men who care only for comfortable self-preservation; the anti-smoking crusade looks like a perfect expression of the snobbery that Fukuyama, following Nietzsche and Kojève, predicted would be the most prominent moral trait of the Last Man. And yet, might the collapse of the tobacco bill have been a turning point? Might the growing support for serious Social Security privatization prove a portent that more of us would indeed like to break out of our rut, as Fukuyama predicted at the end of his book? This might be grasping at straws, but if in fact the progress of the nanny state is not an irreversible process, there is a decent chance that the Last Man of the liberal democratic age need not be a despicable man.

Contributing Editor Steven Hayward is Bradley Fellow at the Heritage Foundation.

Charles Murray

They appeared in the same year, 1962: The Other America, by Michael Harrington (Macmillan) and Capitalism and Freedom, by Milton Friedman (University of Chicago Press). The elite's response to their publication was almost a caricature of the biases of the time. The Other America was greeted ecstatically. Dwight Macdonald's New Yorker review of it was read by John Kennedy and prompted the staff work for the War on Poverty. Capitalism and Freedom was not reviewed in any major American newspaper or newsmagazine.

How wrong can one book be? The Other America has to be a contender for the world record. Ignore the many evidentiary problems with Harrington's attempt to portray America as a society with 50 million people in poverty. The real damage was done by his underlying premise: Poverty is the fault not of the individual but of the system. Seized upon as the new intellectual received wisdom, this view drove the design of social policy for the next decade: expanded welfare programs that asked nothing from the recipients, a breakdown of accountability in the criminal justice system, erosion of equality under the law in favor of preferences to achieve equal outcomes. All were bad policies that caused enormous damage, underwritten by the assumption that people are not responsible for the consequences of their actions.

Meanwhile, Friedman got it right. In a free society, the vast majority of people can be in control of their lives. It is a free society that best provides for those who remain in need. A free society produces in reality the broad economic prosperity that Harrington sought through Marxist theory. Harrington's book is a road map for understanding just about everything that went wrong with social policy in the last 30 years; Friedman's is a road map for understanding just about everything that went right.

A book that is likely to be seen 30 years from now as anticipating intellectual trends is E.O. Wilson's Consilience (Alfred A. Knopf, 1998). I find its broad vision of a unity of knowledge, a coming together of our understandings in the hard sciences, soft sciences, and humanities, to be compelling. It is perhaps most obvious that sociologists and political scientists must reconcile their conclusions with the discoveries of the geneticists and neuroscientists, but the hard scientists have some bridges to build as well. Ever since the quantum revolution began a century ago, the image of the clockwork universe where everything could be predicted if everything were known has been breaking down. The hard sciences are increasingly one with the poet, recognizing that the universe is not just stranger than we know but stranger than we can imagine. To me, E.O. Wilson's vision of the scholarly future is not just discerning but exciting.

Charles Murray is Bradley Fellow at the American Enterprise Institute. His most recent book is What It Means to Be a Libertarian: A Personal Interpretation (Broadway Books).

John V.C. Nye

Most Mistaken about 1968-1998: Paul Samuelson's Economics (various editions, current edition from Irwin co-authored with William Nordhaus). Some colleagues are going to shoot me for this, but Samuelson's introductory textbook deserves a prominent place in a list of seminal works that completely missed the boat. This great mathematical theorist somehow managed to produce a best-selling work that enshrined activist Keynesianism as the mainstream policy instrument (excepting a few "extreme left-wing and right-wing writers," seventh edition, 1967); praised high levels of government taxation and regulatory intervention (opining that the state could play "an almost infinite variety of roles in response to the flaws of the market mechanism," 15th edition, 1995); claimed that there was little observable waste in government spending (third edition, 1955); and systematically overestimated the economic success of the Soviet Union, claiming as late as 1989 that "the Soviet economy is proof that...a socialist command economy can function and even thrive" (13th edition).

Most Far-Sighted: The Rise of the Western World, by Douglass North and Robert Paul Thomas (Norton, 1973). Selecting this book might seem like special pleading, as North is both my colleague and a good friend, but there are objective grounds for highlighting the contributions of The Rise of the Western World. This book and subsequent work by North (who shared the 1993 Nobel Memorial Prize in Economics) helped to change the academic debates about development by focusing attention on the institutions of market capitalism, particularly the rule of law, secure property rights, and the low transactions costs that are the hallmarks of well-functioning markets.

The book dismissed arguments that attributed modern economic growth to technical change as putting the cart before the horse. North and Thomas memorably argued that technology did not cause economic progress, but that technological progress was itself part of the process of modern economic growth ultimately promoted by good institutions. The failure to understand that new technology without sound markets does not produce long-lasting development led to failed policies both in the East bloc, which thought that economic growth was all about building more steel mills, and in the West, with programs designed to transfer capital and technical know-how to the Third World while paying scant attention to the price system and existing political institutions. North's work inspired renewed interest in microeconomic as opposed to macro policies of development throughout the world, and it served as the inspiration for other groundbreaking books, such as Hernando de Soto's The Other Path. North's perspective also gave him the insight to criticize conservatives who pushed for simple-minded deregulation in Russia in 1991 without taking into consideration the institutional problems facing the ex-Soviets.

Most Relevant to the Future: Mercantilism, by Eli Heckscher (Allen and Unwin, 1935). The rebirth of economic competition as political struggle, best seen in the misguided economic nationalisms of Lester Thurow and Pat Buchanan and compounded by Asia's woes, will give rise to more intense mercantilist struggles in the near future. At home, the limits to direct tax-and-spend policies do not mean that the government will shrink. Instead, efforts increasingly will be shifted to direct regulation and control through unfunded mandates of all forms. Can't have a national health program? No big deal, just require businesses to provide health care and regulate the HMOs. Heckscher forces us to confront the unpleasant fact that the struggle to form viable states inevitably will be accompanied by unproductive and obtrusive interventions. There is no better way to think about regulation, the rise of the nation-state, and the future of the West than by pondering Heckscher's magnum opus about the growth of mercantilism and the pernicious effects of bad laws on the common welfare. Sadly, it is out of print. Go to your library and read.

John V.C. Nye is an associate professor of economics and history at Washington University in St. Louis.

Walter Olson

Policy buffs have long treasured John Kenneth Galbraith as the short sale that keeps on earning: Exhume a copy of The New Industrial State (Houghton Mifflin, 1967, out of print except as audiotapes) and marvel at the resistless advent of central planning and "administered" pricing, or the inevitably declining importance of individual creativity amid the coming ascendancy of committees and bureaucracies, to name only two of the trends that have so failed to typify the past 30 years.

With the toppling of the idea of a society run from above by experts, American cities have had a chance to recover from many of the social-engineering schemes that had begun to render them unlivable. Honors for seeing the problem early might be divided between Edward Banfield (The Unheavenly City, 1970; The Unheavenly City Revisited, 1974, reissued by Waveland) and Jane Jacobs (The Death and Life of Great American Cities, Vintage, 1961). Banfield often gets pegged as a neocon and Jacobs as vaguely countercultural, which points up the dangers of facile categorization based on choosing up sides regarding the '60s: both were in fact defending natural patterns of living against authorities' efforts to superimpose artificial structures on the social (Banfield) and physical (Jacobs) forms of community. Fearless, forthright, and intensely factual to the point of reportage, both books are still terrific reads, while the involuted condescension of Galbraith's style, which so impressed the critics of his day, has worn poorly.

Of the battles ahead, few will involve higher stakes than the defense of the Enlightenment and its libertarian values from assaults of both Left and Right. It's today's most unsettling alliance-in-practice: campus-based identitarians and a prominent body of conservative intellectuals agree with each other that "tolerance" and "free speech" are meaningless or overrated concepts, that claims for the objective authority of science should be cut down to size, that official neutrality on such matters as religion and identity is a chimera, that liberal distinctions between public and private are suspect because they tend to insulate bad conduct from social correction, and so forth. (In the August 17 & 24 New Republic, Richard Wolin traces parallels between today's postmodernists and the influential reactionary theorist Joseph de Maistre, who wrote that "what we ought not to know is more important than what we ought to know.")

To see how the antirationalist right met its downfall the first time around, head back to the freethinkers' shelf for another look at Hume and Gibbon and Lecky, Darwin and Tom Huxley, Mencken and Ingersoll; at the moment I'm browsing Andrew D. White's A History of the Warfare of Science with Theology in Christendom (1896, Prometheus paperback reissue). The former Cornell president can sound dated or quaint, but more often his book is packed with astonishing, colorful, and, yes, inspiring accounts of the achievement we call the modern mind, formed as one discipline after another, from meteorology to sanitation, economics to philology, epistemology to medicine, pulled free from superstition.

Contributing Editor Walter Olson is a senior fellow at the Manhattan Institute and the author of The Excuse Factory (The Free Press).

Randal O'Toole

The accurate book: Jane Jacobs's The Death and Life of Great American Cities (Vintage, 1961). Nearly four decades on, Jacobs's book remains the best critique of urban planning--and a wonderful critique of American planning in general. The book almost single-handedly demolished the federally funded urban-renewal movement.

Jacobs was scathing in her attacks on planners. "The pseudoscience of city planning and its companion, the art of city design," she wrote, "have not yet broken with the specious comfort of wishes, familiar superstitions, oversimplifications, and symbols." When "contradictory reality intrudes" on planners' preconceived notions, they merely "shrug reality aside." Ironically, planners today use The Death and Life of Great American Cities as a textbook as they try to turn suburbs into the dense cities that Jacobs admired. They ignore her specific warnings against doing so even as they overlook her assaults on their profession.

The fall-flat book: A.Q. Mowbray's Road to Ruin (J.B. Lippincott, 1969). Road to Ruin was one of the first in an unrelenting series of attacks on the automobile, attacks that continue today in Jane Holtz Kay's Asphalt Nation (Crown, 1997) and James Kunstler's The Geography of Nowhere (Simon & Schuster, 1993). Americans would rather not drive, these books claim, but they are forced to do so by a conspiracy of auto manufacturers, highway builders, and housing developers. "The automobile population is growing faster than the human population," warned Mowbray. "Under the doctrine that the machine must be served, the highway advocates are already laying plans for an accelerated effort to blanket the nation with asphalt."

Americans today drive three times as many miles as they did when Mowbray wrote. Yet well under 2 percent of the United States has been "blanketed" with asphalt--and much of that consists of roads that predate the automobile. Though Road to Ruin and similar books convinced U.S. governments to spend billions of dollars subsidizing urban transit, Americans stubbornly continue to drive nearly everywhere they go. Those hostile to the automobile never see the enormous benefits that the auto has produced.

The next-30-years book: Joel Garreau's Edge City: Life on the New Frontier (Doubleday, 1991). Garreau, a Washington Post writer, coined the term "edge cities" for the most recent trend in urban development: concentrations of retailing, manufacturing, and entertainment in new towns on the fringes of major urban areas. Conditioned by writers such as Mowbray, Garreau at first reacted to an edge city with horror: "It seemed insane to me. It was a challenge to everything I had been taught: that what this world needed was More Planning; that cars were inherently Evil; that suburbia was morally wrong."

Garreau soon realized that planners had no foundation in reality, while the developers building edge cities had to be in touch with what people wanted, or they would lose their shirts. Edge City is as much a celebration of the marketplace as it is a prediction of what 21st-century American cities will look like--provided government doesn't try to keep them in the 19th century.

Randal O'Toole is an economist with the Thoreau Institute and the 1998 McCluskey Conservation Fellow at the Yale School of Forestry.

John J. Pitney Jr.

In 1969, at the age of 14, I read my first serious book about elections: The Emerging Republican Majority (Arlington House), by Kevin Phillips. Its 482 pages of maps, graphs, tables, and analysis grabbed me the way a baseball almanac would fascinate a more normal kid. I've kept my copy all these years, and I recently took a close look at it again. While Phillips's subsequent books (e.g., The Politics of Rich and Poor) have been more debatable, this one got it right.

Voting power, he said, was shifting from the Northeast to the Southern and Western states of the Sun Belt--a term that he coined in this book. Since Republicans were strong in the Sun Belt, they could look forward to an advantage in presidential elections. (He made no claims about congressional races.) At the time, many commentators belittled his analysis, noting Goldwater's massive 1964 defeat and Nixon's narrow 1968 victory. Phillips won the argument. Since the publication of the book, every presidential race has gone either to a Sun Belt Republican or a Republican-sounding Southern Democrat.

A year after Phillips foresaw the shape of presidential politics, Charles Reich took on all of society. "There is a revolution coming," he said in The Greening of America (Random House). "It promises a higher reason, a more human community, and a new and liberated individual." He never defined his terms precisely, but the new "Consciousness III" apparently spurned materialism, capitalism, and competition. It also meant wearing bell-bottoms. No kidding: "Bell bottoms have to be worn to be understood....They give the ankles a special freedom as if to invite dancing on the street."

Reich pictured the America of the future as "an extended family in the spirit of the Woodstock festival." It didn't happen, thank God. At Woodstock, people who took drugs and rolled around in mud were "hippies" or "flower children." Today we call them "the homeless." Fortunately, many of the Woodstock generation grew up, got haircuts, opened money-market accounts, and joined the emerging Republican majority. Some even subscribe to REASON.

Because of Greening, Reich was the most famous professor at Yale Law School at the time that Bill Clinton was attending. Some of Reich's goopier language about idealism seeped into Clinton's rhetoric, but here's one line he won't be quoting: "To be dishonest in love, to 'use' another person, is a major crime."

What comes next? That's the question James P. Pinkerton addressed in his aptly titled 1995 book, What Comes Next (Hyperion). The government's vast "Bureaucratic Operating System," he wrote, has degenerated past the point where incremental patches can work. We need a new system that includes privatization, decentralization, elimination of failed agencies, and a fairer way to raise revenue, such as the flat tax.

That's a positive vision of the future, but there's no guarantee that it will come true. Although Republicans have praised these goals, they have lately been timid about acting on the specifics. Too bad. If supporters of free minds and free markets don't act on their beliefs, supporters of bureaucratic government will stay in charge. And the future will bear a depressing resemblance to the past.

Contributing Editor John J. Pitney Jr. is associate professor of government at Claremont McKenna College.

Robert W. Poole Jr.

One of the most prescient books of the past 30 years appeared at the end of 1968: Peter Drucker's The Age of Discontinuity (Harper & Row). At a time when the world of policy and government was dominated by the ideas of people like John Kenneth Galbraith, Drucker challenged the conventional wisdom across the board. He foresaw a half-century of fundamental change, in both the U.S. and the global economy, and in the ideas by which we attempt to make sense of the respective roles of government and the private sector, both for-profit and nonprofit. He identified knowledge as the key factor in economic growth, and he challenged governments to rethink their policies so as not to inhibit the huge changes that would be necessary as societies adjusted to the emerging knowledge-based economy--especially the revolution to be unleashed by widespread access to inexpensive computer power.

For me, Drucker's book first identified the concept of "reprivatization," calling for a fundamental rethinking of government's role (seeing it primarily as policy maker and regulator, rather than provider of goods and services). This insight was one of the critical influences that led me to research and write about privatization, and to set up what became the Reason Foundation.

The booby prize for prescience surely belongs to another 1968 volume, Paul Ehrlich's The Population Bomb (Ballantine). Evincing complete economic ignorance, combined with blindness to demographic evidence already becoming available, Ehrlich presented a Malthusian scenario under which out-of-control population growth would lead to mass starvation. He predicted that even England "will not exist in the year 2000." Despite their obvious absurdity, Ehrlich's views helped ignite today's enormously influential environmental movement.

As for the best book identifying trends that will shape the next 30 years, I want to cheat just a bit by discussing two books. The best book that sets the stage, by documenting the move away from central planning over the past decade, is Daniel Yergin and Joseph Stanislaw's The Commanding Heights (Simon & Schuster, 1998). The book's central theme is the replacement of the idea of "government knowledge" with the idea of "market knowledge," an insight the authors correctly trace to Nobel laureate F.A. Hayek.

But while The Commanding Heights is reportorial, it is not very analytical or predictive. The most profound book that examines the underlying factors and trends that will shape this country over the next several decades is Virginia Postrel's The Future and Its Enemies (The Free Press, 1998). This delightful book is an exercise in applying Hayek's insights about the dynamics of a free market and a free society to turn-of-the-millennium America. If you want to see what "spontaneous order" means when applied to the complex, high-tech world in which we will spend the rest of our lives, you should read this book.

Robert W. Poole Jr. is president of the Reason Foundation.

Virginia Postrel

Thirty years ago, conventional wisdom held that to reap the benefits of science, technology, and markets, we must deny ourselves fun. This repression theory of progress, derived from turn-of-the-century sociologist Max Weber, was just as popular in the counterculture as it was in the establishment. The only question was which side of the tradeoff you preferred.

Among the most influential expositions of this view was Daniel Bell's The Cultural Contradictions of Capitalism (Basic Books). In this 1976 book, Bell embraced the repression theory but observed that rising living standards were eroding the Puritan ethic. Capitalism, he argued, would destroy itself by encouraging hedonistic play: "In America, the old Protestant heavenly virtues are largely gone, and the mundane rewards have begun to run riot....The world of hedonism is the world of fashion, photography, advertising, television, travel. It is a world of make-believe in which one lives for expectations, for what will come rather than what is....Nothing epitomized the hedonism of the United States better than the State of California."

Bell's book became a touchstone for intellectuals on both the left and the right, so much so that a 20th anniversary edition was proudly issued in 1996. But its thesis had been falsified during the intervening decades. Far from destroying capitalism, play proved a great spur to progress and prosperity. Obsession, not repression, was the key. Nothing epitomized the trend better than the state of California.

The book that correctly captured this trend was set, however, in the Puritans' old stomping ground of Massachusetts. The cover of my paperback edition describes The Soul of a New Machine (Little, Brown and Avon) as "the phenomenal bestseller!" and it was indeed a phenomenon, winning the Pulitzer Prize in 1982. Tracy Kidder's nonfiction tale of engineers racing the clock to build a minicomputer that would "go as fast as a raped ape" alerted the literati that something big was going on among the technology nerds. "In technical creativity they have found a fulfillment that occasionally verges on ecstacy," reported The New York Times. Nowadays, that insight is hardly news. Thirty years ago, it was unthinkable.

All we can say about work 30 years hence is that it will be different. Two short stories suggest possible evolutions. Bruce Sterling's "The Beautiful and the Sublime," anthologized in Crystal Express (Ace, 1990), imagines a world in which "the ability to reason...comes through the wires just like electricity," thanks to advances in artificial intelligence. With reason cheap and abundant, markets and social life reward the aesthetic, and artistic temperaments are prized. The story's particulars are fanciful, but it provides a useful antidote to assumptions that today's economic totem pole will be tomorrow's.

Neal Stephenson's "Jipi's Day at the Office," published in the 80th anniversary issue of Forbes (July 7, 1997), similarly imagines a world in which intangible personal qualities define one's economic value. The title character has an unusual knack for soothing the irritable, a quality she applies to making her employer's resort particularly pleasant. On this day, however, she must persuade a paranoid, artificially intelligent car bomb not to blow. Not exactly fun, but the sort of wits-matching game that fits nowhere in Bell's repression thesis.

Editor Virginia Postrel is the author of The Future and Its Enemies: The Growing Conflict over Creativity, Enterprise, and Progress, just published by The Free Press.

Adam Clayton Powell III

The future is always with us, but the past is ever more so. And it is only the gifted who can peer into the future without straying to focus on the rear-view mirror of what we have already put behind us.

George Orwell's 1984 (Harcourt Brace) became an instant science fiction classic, widely viewed as a glimpse into future decades. But it was far more focused on 1948 than 1984, and the world of Big Brother was in truth the world of Hitler and Stalin. The Berlin Wall and what it represented ultimately collapsed, brought down by the determined rise of empowered, free individuals. And a key tool was the spreading of information technologies, from the Xerox machine to the Internet.

The best commentary on this shift was not in the book 1984 but in a television commercial in the year 1984. The ad was produced for Apple, introducing the then-new Macintosh computer. Broadcast only once, it featured a meeting hall full of drone-like people watching a huge screen displaying an image evoking 1984's Big Brother. A young woman burst into the back of the hall, ran up the aisle toward the screen and smashed it with a huge hammer.

Enter the Mac. Exit Big Brother.

Just a year before that commercial, Technologies of Freedom (Harvard University Press) correctly anticipated today's front-page clashes between central authority and the Internet. Writing in a world of slow, clunky, text-based networks, MIT professor Ithiel de Sola Pool foresaw the rise of a high-speed Internet and its expansion into a popular medium--and resistance from central authorities.

"As speech increasingly flows over those electronic media, the five-century growth of an unabridged right of citizens to speak without controls may be endangered," he wrote at the end of the book's very first paragraph. And then he proceeded to outline cases, past and future, describing how the new technology has in its nature the seeds of freedom.

"Electronic technology is conducive to freedom," he wrote. "The degree of diversity and plenitude of access that mature electronic technology allows far exceeds what is enjoyed today. Computerized information networks of the twenty-first century need not be any less free for all to use without let or hindrance than was the printing press. Only political errors make them so."

A decade later, in Winchell: Gossip, Power and the Culture of Celebrity (Vintage Books, 1995), author Neal Gabler captured the rise of mass media and of celebrity-centered popular infotainment. Gabler was concerned not with freedom itself but with the implications of the rise of nonelite media. Gabler told this through the story of Walter Winchell. Decades before his role narrating The Untouchables television series, Winchell had been a top columnist and radio commentator, inventing the modern popular gossip column, celebrity-centered network newscasts, and even the very feel of network news, "its breathlessness, its ellipses, its abrupt shifts, its drama."

Winchell covered a broad range of people we would now call celebrities: "chorus girls, prize fighters, restaurateurs, journalists and even politicians like [New York City mayor] Jimmy Walker" --and his friend, Franklin D. Roosevelt. And Winchell wasn't even a journalist!

Gabler could have been writing about Matt Drudge or the latest Web site designer. In telling Winchell's story, he may also have given us a clear look at the future of popular media.

Adam Clayton Powell III is vice president of technology and programs at the Freedom Forum.

John Shelton Reed

A book that got it wrong, huh? But there are so many, and they're wrong in so many different ways....How about something by the Chicken Little of population studies, Paul Ehrlich? One of his more fevered imaginings is The End of Affluence: A Blueprint for Your Future (Ballantine Books, 1974), but any of a half-dozen others would do as well. Ehrlich started his doomsaying career in 1968 (a big year for doomsaying) with The Population Bomb (Ballantine Books), and he's been at it so long that he has become a sort of endearing figure, the Harold Stassen of environmentalism.

Speaking of Republicans, my candidate for a book that got it mostly right is The Emerging Republican Majority (Arlington House, 1969), by Kevin Phillips. How we laughed at Phillips's title when the Republicans got creamed in 1974! But he had the underlying demographic and cultural trends spot on, as Reagan demonstrated and the 1994 elections confirmed. The only way the Democrats can elect a president now is to nominate a Southerner who talks like a Republican (at least long enough to get elected), and even so it takes a real doofus like Ford, Bush, or Dole to screw it up for the Republicans. Of course, the Republicans seem to seek these guys out.

Finally, for a book that may tell us something about the next 30 years, I'm going to play a wild card: Walker Percy's Love in the Ruins: The Adventures of a Bad Catholic at a Time Near the End of the World (Farrar, Straus & Giroux). Published in 1971, this gets a few things wrong (Spiro Agnew is mentioned as a revered elder statesman), but what resident won't recognize Dr. Tom More's Louisiana hometown, which "has become a refuge for all manner of conservative folk, graduates of Bob Jones University, retired Air Force colonels, passed-over Navy commanders, ex-Washington, D.C., policemen, patriotic chiropractors, two officials of the National Rifle Association, and six conservative proctologists"? The center isn't holding: "Americans have turned against each other; race against race, right against left, believer against heathen, San Francisco against Los Angeles, Chicago against Cicero. Vines sprout in sections of New York where not even Negroes will live. Wolves have been seen in downtown Cleveland, like Rome during the Black Plague." The Republicans and Democrats have reorganized as the Knothead and Left parties, but it hardly matters: "Don't tell me the U.S.A. went down the drain because of Leftism, Knotheadism, apostasy, pornography, polarization, etcetera, etcetera," Dr. Tom says. "All these things may have happened, but what finally tore it was that things stopped working and nobody wanted to be a repairman." Bracing stuff.

John Shelton Reed is William Rand Kenan Jr. Professor of sociology at the University of North Carolina at Chapel Hill. His most recent book is Glorious Battle: The Cultural Politics of Victorian Anglo-Catholicism (Vanderbilt University Press).

Lynn Scarlett

Two ideas dominated political philosophy in the 20th century. The first was that mankind could collectively define the "good order." The second was that mankind could bring about this order through collective planning. These ideas were not new. But in the 20th century, they took root and sprouted as grand movements, like the Soviet communist experiment, and in more modest endeavors, like city renewal projects.

By the 1960s, almost no city planner challenged the idea of "the plan." Then came Jane Jacobs. In The Death and Life of Great American Cities (Vintage, 1961), she announced, "this book is an attack on current city planning and rebuilding." Jacobs described cities as "fundamentally dynamic." She celebrated diversity. Jacobs argued that cities were problems of "organized complexity," by which she meant that cities represented the "intricate minglings of different users." Cities were, she wrote, created through the "vastly differing ideas and purposes" of different people, all "planning and contriving outside the formal framework of public action." She saw attempts to impose order, by moving people around "like billiard balls" to new locations, as destructive of the organized complexity that kept cities vibrant.

Jacobs was right. She anticipated the public housing project debacles and sterile redevelopment programs of the 1970s and '80s. But her vision extended beyond city problems. She recognized and celebrated unplanned order, the importance of feedback, competition, and innovation--ideas re-entering late 20th-century political debates about environmentalism, global economies, communications systems, and technological evolution.

Rewind half a century to the writings of French philosopher Jacques Ellul, in whom we find a mental map of another sort. Where Jacobs celebrated uncertainty, complexity, and spontaneous order, Ellul, in The Technological Society (Alfred A. Knopf, 1954) and subsequent books, offered a near-incoherent mix of contempt for uncertainty and yearning for self-will and choice. Jacobs launched her views using the city as her subject; Ellul targeted technology. His thesis: Technology dominates all else. It destroys traditional social structures, imagination, and values--all that makes us human.

Ellul is, above all, what REASON Editor Virginia Postrel calls a stasist. He does not oppose technology but wants to control it. By the 1990s, Ellul was lamenting that "no one has taken charge of the system." In uncertainty, he sees a kind of disorder and amorality.

But Ellul has it wrong. The very uncertainty and change in human structures that he describes is not generating a society of mere consumers and game players. Technology is allowing new social forms, new relationships, new possibilities for human imagination, interaction, and cooperation.

Cooperation is not always--or even often--wrought by some great "we" taking charge and dictating outcomes, an insight explored by Yale law and economics scholar Robert Ellickson in his 1991 book Order Without Law (Harvard University Press). Ellickson remarks that in everyday speech we refer to "'law and order,' which implies that governments monopolize the control of misconduct. This notion is false." Ellickson, like Jacobs 30 years earlier, is an idea buster. He proposes that among human settlements, cooperation is the norm; conflict the exception. And cooperation is often achieved without recourse to law. Instead, it is achieved through ever-evolving informal rules that generate mutual gain.

Ellickson's work is empirical. He shows how real people in real communities achieve cooperation. But his thesis is playing out in a broader political and philosophical tableau in which states, regions, and cities increasingly struggle with the clumsiness of statutes and the inability of laws to deal with the complexities of human interaction and physical reality. Not all circumstances are alike, but statutes require uniformity. Informal social and economic interactions offer a more resilient, adaptive, and evolutionary response to circumstance. Widening the realms in which informal bargaining, negotiation, and trade can supplant order generated through statute is the challenge for the next century. The former builds on human cooperation; the latter reinforces propensities for conflict. Ellickson's Order Without Law anticipates the potentially deepening tensions between formal and informal orders.

Lynn Scarlett is executive director of the Reason Public Policy Institute.


