At Long Last... The Future Really Is Now!

by

David Pearce Snyder

Consulting Futurist

Bearing the IT standard, America is poised to play a dominant role in the 21st Century.

We're all going to the future! It's our common destination. But the future differs from a physical destination in one important respect: once you get to the future, if you don't like it... you can't come back. The future is for keeps.

We can talk about what the future is going to be like because the world we live in is a large system – and large systems are coherent, orderly and stable, and they have great inertia. At least over a short period of time – five to ten years – much of the future is forecastable with reasonable certainty. This means that we can describe what the future is going to be like; it doesn't mean that we know what's going to happen there.

Guessing Wrong About the Future

Abraham Lincoln had a very good grasp of this problem when he observed in 1858, at the outset of his famous campaign against Stephen Douglas, that "If we could first know where we are and whither we are tending, we could better judge what to do and how to do it." Well, that makes a lot of sense. In fact, 120 years later, researchers at Johns Hopkins University concluded pretty much the same thing.

They did some studies on bad planning, and it turns out there is one principal cause of it: most people make erroneous assumptions about what is going to happen outside of their institution – outside of their realm of personal experience and expertise. Most of us think professionally about a decision and make technically sound plans – plans that would have worked, if only the world had turned out the way we thought it was going to. Our principal planning mistakes arise from our failure to understand where we are and whither we are tending as a nation.

This suggests that, if I can provide you with a sound set of assumptions about what's happening in your institution's external operating environment, then you should be able to combine those assumptions with your experience and expertise and, in the process, avoid up to 90 percent of the bad plans you would otherwise make.

That's what the Johns Hopkins research says: 90 percent of bad plans are attributable to mistaken assumptions about the future. So the mission of the futurist is to provide the decision-maker and the planner with a sense of where we are and where we are headed.

Boom Times

Where we are as a nation just now seems pretty good. In 1996, the U.S. economy surpassed all other countries on overall performance – a combination of inflation, economic growth, unemployment and the like. We are the most competitive economy in the world again – for the first time in a decade. And 1997 was a banner year – the economy grew 3.9 percent. It was so good that the Chairman of the Federal Reserve Board, Alan Greenspan, said at the end of the year that he expected growth to moderate.

But growth didn't moderate – it got even better. In 1998, the U.S. economy grew 4.2 percent. Many people expected the Asian economic slump to have a detrimental effect on us. Instead, declining Asian demand moderated what might otherwise have been runaway U.S. growth with high inflation.

By March 1998, the U.S. economy had set an all-time record for peacetime expansion. In response to our renewed prosperity, consumer optimism soared. Yet just seven or eight years before, consumers had been more pessimistic than at any time in the history of public opinion polling. Now it's terrific. And, throughout 1999, the American national enterprise continued to outstrip the predictions of economists. (In fact, our inability to forecast economic performance is one of the embarrassing little secrets of the futures business; it is not possible to reliably forecast economic performance until eighteen months after it has occurred. That's why futurists call economics the "dismal science.")

There have been ten recessions since the end of the Second World War, and economists anticipated only two of them. There have been three booms since the end of the Second World War, and economists didn't forecast any of those. In fact, it was as surprising to the general public as it was to the economists that things had gotten so good by the end of the '90s, largely because, at the beginning of the '90s, things were so bad that public opinion surveys showed most of us believed the Golden Age of America was already behind us.

In 1993, a Harris poll found that only twenty-five percent of Americans believed their children would live better lives than they had. We had been laying people off throughout the '80s at a faster rate than in the '70s – an average of about two million a year, up from about one million a year. In the 1990s, layoffs shot up to over three million a year.

In spite of the mass layoffs, however, the unemployment rate didn't soar, because most of these people were finding other jobs. But, typically, only 30 percent of those who lose their jobs in mid-career find new jobs that pay as well and carry similar benefits. On average, a worker terminated in mid-career takes a 50 percent cut in lifetime earnings. Average wages in America fell by fifteen percent from 1973 right on through the middle of the 1990s. What's more, the layoffs at the beginning of the '90s were not limited to rank-and-file workers – they included millions of middle managers.

So, America's prospects in the early 1990s were genuinely gloomy. And it was just at this unpromising moment that a number of U.S. think tanks launched research efforts to help us better understand "where we were and whither we were tending." One of those groups was the Center for Economic Policy Research at Stanford University.

Some Lessons From History

In 1990, the Center issued a study of the last technological revolution through which the United States had passed – the shift from steam power to electric power. Other economic historians went back and looked at studies of other primary technological innovations. Together, they concluded that, if the computer represents a genuinely revolutionary technology, we probably ought to understand three things:

First, it takes seventy-five years – about two human generations – to fully assimilate a fundamental new technology throughout all of the institutions and all of the functions that make up an entire nation's economy. Indeed, during their first twenty-five years, new technologies are so unreliable and inefficient that they have no measurable impact on an economy's performance.

Second, during the next twenty-five years of a new technology's existence, it becomes productive enough to be purchased in large volume and installed in the workplace, but it is still expensive, unstandardized and unreliable. The workforce doesn't really know how to use it very well, and management certainly doesn't know how to apply it very well. At this stage of development, a new technology typically makes only individual operations more efficient, not entire processes.

In fact, the net effect of a new technology during the second twenty-five years of its existence is to produce a slowdown in productivity and a downturn in prosperity. Fortunately, the third revelation from Stanford was that, when a new technology gets to be about fifty years old, it becomes cheap enough, mature enough – and our understanding of how to use it becomes good enough – that it becomes hyperproductive. All of the promises of a new technology – most of which are never achieved during its first fifty years – arrive suddenly, all at once, during the final twenty-five years of the technology's assimilation.
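
(To make the timetable concrete – this is my own back-of-the-envelope application of the Stanford model, not a figure from the study itself – date the electronic computer from the late 1940s. The first twenty-five years then run to the early 1970s, with no measurable economic impact; the second twenty-five run to the late 1990s, precisely the period of our productivity slowdown; and the hyperproductive final twenty-five years run from roughly 2000 to 2025. Hence the title: the future really is now.)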

Turning Business Around

Because new tools endow human resources with new productive capabilities, we must change the ways we use human resources in order to realize the full productive potential of a new technology. And, in fact, in less than ten years we have witnessed a change in management philosophy: employers have begun to invest in rank-and-file workers. The Labor Department hired Ernst & Young to do human resource audits of the companies that had consistently been most productive over the previous five years, and the researchers found a common pattern in these firms.

Working with researchers from the Harvard and Wharton business schools, Ernst & Young discovered that economic benefits to employers are greatest when they successfully integrate innovations in management and technology with appropriate employee training and empowerment programs. If you don't pursue all four initiatives – new technology, new management practice, training and empowerment – you get nothing. Indeed, that is one of the big mistakes employers make. They invest in capital equipment. They invest in training. But they leave the same hierarchical, authoritarian bureaucracy in place. Adding computers to an authoritarian, compartmentalized bureaucracy is about as productive as adding spark plugs to a steam engine.

You have to transform the organization, or you simply won't get much yield out of a new technology. Ultimately, the greatest improvements in productivity occur when rank-and-file employees are provided with easy-to-use but sophisticated computer programs – expert systems, simulations, decision algorithms, etc. – so that they are able to make superior, value-adding decisions at their own discretion, based upon their real-world circumstances, rather than relying solely upon standard procedures and supervisory or managerial intervention.

Productivity, Prosperity Rise Together

At the beginning of the 20th Century, Frederick Taylor established the primary paradigm of scientific industrial management: "There is one best way to do each job of work; it is management's responsibility to determine the one best way, and it is the rank-and-file workers' job to follow management's procedures." As we enter the 21st Century, there is a new paradigm of management for the informated workplace: "There is always a better way to do each job of work; it is management's responsibility to determine what must be done, and it is the rank-and-file workers' job to determine how to do it best."

Now that this fundamentally new approach to management is becoming well understood throughout business, it is increasingly clear that our recent improvements in economic performance are just the beginning. We now know how to substantially enhance the productivity of all of our operations; it's just a matter of time. And, as productivity rises, wages will also rise. The Bureau of Labor Statistics (BLS) currently projects that, through 2020, at least 60 percent of all new U.S. jobs will offer above-average wages. Back at the beginning of the 1980s, the BLS forecast – accurately – that U.S. employers would eliminate millions of middle-income jobs and replace them with millions of low-skill, low-pay jobs. What the Labor Department was anticipating fifteen years ago was the effect of technological revolutions that the economist Joseph Schumpeter once called "creative destruction."

Schumpeter said that, in order to develop productive marketplace applications of a new technology, it is necessary to shift resources away from successful existing enterprises. In our current case, capital and human resources were diverted from productive uses in mature industrial operations and applied to immature information technologies, products and services, with the predictable result that U.S. productivity improvement rates dropped by fifty percent after 1973, and average wages fell.

Fortunately – and just as predictably – both the productive potential of information technology and our capacity to use it have improved steadily over time – a national "learning curve" – to the point that our productivity is now rising at twice the average rate of 1970-95, with the very real possibility that it will double again within five to seven years. We have passed through an "inflection point" – the moment at which change is so powerful that it fundamentally alters the direction of an entire system. We are now on the uphill side of this change and, with any luck at all, we should have at least twenty years of prosperity, based on rising productivity and improving performance.

A Place in the Sun

While we have passed the inflection point of the "Information Revolution," and are headed up, most of the other industrial economies of the world are six, seven or eight years behind us. That means that they still have six, seven or eight years of bad road ahead of them that we have already put behind us. But, once they have restructured their economies, they, too, will experience a rising tide of productivity that will "lift all their boats."

Emerging technologies push out the old. This process – the coming together of the technologies that we label "information technology" – has begun to alter, fundamentally, the manner in which we do business and create economic value. Indeed, extrapolating the current performance of the economy into the next twenty years, the U.S. Labor Department projects that average family income in America will rise from $43,000 p.a. to over $70,000 p.a. by 2020 – clearly good news for the American public, and a wonderful revenue base for the public sector.
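
(A rough implication – my own calculation, not the Labor Department's stated figure, and assuming the rise spans roughly the twenty years from 2000 to 2020: growing $43,000 into $70,000 over twenty years implies compound growth of (70,000/43,000)^(1/20) − 1, or about 2.5 percent a year in average family income.)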

It's going to be exciting. Even Y2K will be exciting – but not catastrophic. For some countries – notably Russia, Ukraine and India – the Year 2000 glitch is expected to cause serious disruptions, but for the U.S., Y2K is merely going to be a speed bump on the way to paradise.

In fact, the U.S. is already five years into an era of dramatic economic growth. As we get through Y2K, there will be boom growth in the global economy, in which America will be the most efficient competitor. The United States is entering the 21st Century more dominant in world economic affairs than it has been since 1945. What's more, seven-eighths of the benefits of our newly-matured information technology are about to land on our collective national doorstep all at once.

The Internet has become a mass medium. In fact, I heard on one of the Internet newscasts here in Silicon Valley that the number of Americans on-line passed one hundred million this week. Total Internet operations in 1998 – including all the servers, databases, portals, search engines, on-line transactions, etc. – came to over $300 billion.

Electric power was a $250 billion industry last year. The Internet is already bigger than electric power – an industry that's been around for one hundred ten years. The automotive industry was $350 billion last year, but Internet operations are going to shoot right past automobiles to $500 billion this year. Within five years, essentially all business-to-business and business-to-government transactions – $1.5 trillion p.a. – will be done on-line, sharply reducing the overhead cost of all procurement and contracting.

This is a revolutionary moment in history. We have now reached the hyperproductive phase of this technology. This will be a magic moment! We're all going to have a whole set of powerful information-handling tools with which to do better whatever we do: to make our cities more livable, to make our workers more productive, and to make life better than it has been before.

For these and the many other opportunities that lie before us, I commend to you the 21st Century and the cities you will lead into the Information Age.

©1999 David Pearce Snyder

The Snyder Family Enterprise

8628 Garfield Street

Bethesda, MD 20817

phone: 301-530-5807

e-mail: snyderfam1@AOL.com