Good afternoon. It's a great pleasure and a privilege for me to be here today, with the leaders of many of this nation's finest manufacturing companies. It's also an excellent opportunity to highlight a few relevant sections of my organization's resume. I hope thereby to spur a few ideas on how your companies—and your industries—can capitalize on the technology programs of the National Institute of Standards and Technology, or NIST. For starters, let me emphasize that NIST is a non-regulatory agency. I've been advised that this piece of information might help make a good first impression with an audience of manufacturing executives.
To put my remarks in context, I should provide some additional background. NIST is part of the Commerce Department. We work with industry to develop and apply technology, measurements, and standards. That's our mission, and we carry it out through four major programs: our measurement and standards laboratories, the Advanced Technology Program, the Manufacturing Extension Partnership, and the Malcolm Baldrige National Quality Award.
That's the vantage point from which I shall peer into the future and contemplate the prospects for manufacturing engineering. My job as director of the NIST Manufacturing Engineering Laboratory—a 330-person R&D and service operation—lends some credibility to my perspective, I suppose. But, as the product warning label on the Batman costume says, "Caution: Cape does not enable user to fly."
In other words, a podium and an audience do not a technology visionary make. In fact, by speculating about the future of manufacturing engineering, I risk making a blunder that could add me to this widely circulated collection of remarks by people who have misread technology's tea leaves. It's been posted on scores of Web pages, so you might already have seen it in one form or another. The Information Age can be brutal to technology prognosticators whose predictions miss their mark.
Somehow, the manufacturing doomsayers of the '70s and '80s—the ones who foretold the demise of U.S. manufacturing and the coming of the full-service economy—seem to have escaped this form of digital embarrassment.
Now, let's go back to the future, without erasing our memory of past competitive challenges. Today, we seem to be awash in new technology—in opportunities to build new products and better processes. It's an exciting, yet perilous, time. New technology can stake your company to a big market lead, or, in the hands of a competitor, it can shove you to the back of the pack.
To be sure, technology-based competition varies considerably among industries, but it's heating up everywhere. In the high-tech industries where it's most intense, one economist uses the analogy—not of a race, but of a casino—to convey the flavor of the global competition as well as the nature of the high-risk technology bets that companies confront.
I'm not even going to attempt to survey the full frontier of promising new technologies. You're already doing that to avoid the risk of a high-stakes crapshoot that puts your company's future on the line. Instead, I'd like to trace three paths of technology development. Unfolding events along all three, I believe, could have revolutionary consequences for manufacturing engineering and for your companies.
Okay, let's start with deterministic manufacturing.
My laboratory and all of NIST devote time, energy, and resources to anticipating the future needs of U.S. manufacturers. We have to, and this graph of trends in manufacturing tolerances illustrates why. Over the last half century, dimensional tolerances have been shrinking tenfold every decade or so. State-of-the-art, high-precision products of the early 1980s are equivalent to today's high-volume offerings. For the laboratory charged with, among other things, maintaining the national standard of length—that's my laboratory, the NIST Manufacturing Engineering Lab—this progression is, frankly, a real challenge. Our measurement capabilities must exceed industry's best—ideally by a factor of four or better.
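To make that arithmetic concrete, here is a minimal sketch. The tolerances are hypothetical round numbers, and the 4:1 ratio is simply the rule of thumb I just mentioned, not a formal requirement:

```python
# Illustrative only: hypothetical tolerances and the informal 4:1
# rule of thumb for how much better the reference measurement must be.

def required_uncertainty(tolerance_um: float, ratio: float = 4.0) -> float:
    """Measurement uncertainty needed to verify a tolerance at a given ratio."""
    return tolerance_um / ratio

# Tolerances shrinking roughly tenfold per decade (made-up examples):
for decade, tol in [("1980s", 10.0), ("1990s", 1.0), ("2000s", 0.1)]:
    need = required_uncertainty(tol)
    print(f"{decade}: tolerance {tol} um -> measure to {need} um or better")
```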
The microelectronics industry is at the forefront of this relentless push. Ever-smaller devices squeezed onto ever-faster, ever more powerful chips mean that we at NIST must forever be splitting hairs—and splitting and splitting them, over and over again. We're now at the point where we are developing measurement tools that are built molecule by molecule, and even atom by atom. We are also developing tools for making electrical measurements that count individual electrons. I could go on, but the trend is clear. In all areas of advanced manufacturing—in discrete parts and continuous processing—there's an unabating need for higher levels of accuracy, precision, selectivity, and specificity.
Why? Improvements in these areas translate into higher quality, lower costs, less waste, better product performance, and happier customers. Regardless of the industry that you're in, if your competitors can measure better than you, and if they can reliably manufacture and assemble parts and products to more exacting tolerances—if they can do that—then you could be in deep trouble.
The trend toward greater levels of precision and accuracy has several facets. One is the growing complexity of part geometry, which makes it doubly difficult to manufacture to tight tolerances. But complex shapes also confer advantages—special features that appeal to customers, higher levels of performance, or a single part that can do a job that was previously performed by combinations of two, three, or more different parts. The aerospace industry with its growing use of high-speed machine tools is a case in point. Companies are banging out large thin-walled parts that are replacing much heavier, riveted assemblies of many parts. Here's NIST's High-Speed Machining Testbed, where my lab, McDonnell Douglas, and Penn State are studying the dynamics of high-speed machining processes—with the aim of reducing them to mathematics, as opposed to trial-and-error experimentation.
So, machine tools must grow in sophistication and capability. Traditional designs and incremental improvements to those designs may not be good enough. Whether the U.S. machine tool industry—a collection of mostly small companies with limited R&D budgets—will continue to be a major supplier of this equipment is one question. Another may be, does it really matter whether your machine tools and process equipment are made here or abroad? I submit that, for the most part, it does.
Allow me to return to the very recent past and to the semiconductor industry. When Japan overtook the U.S. as the world's leading chipmaker during the 1980s, it built its advantage, in part, on the superior offerings of Japanese suppliers of semiconductor manufacturing equipment. The latest and greatest steppers and other equipment arrived in the United States six months to a year after they debuted at Japanese fabs, a considerable disadvantage. When the U.S. semiconductor industry regained the top position, the improved performance and quality of equipment made by domestic suppliers were cited as chief reasons. And part of the reason for this resurgence in the technical capabilities of U.S. equipment makers is SEMATECH—a government-industry partnership that has concentrated on improving supplier performance.
Decades earlier, another government-enabled innovation—computer numerical control—had the potential to stake U.S. machine-tool builders and their customers to a significant competitive advantage. But that opportunity largely slipped through our grasp and into the hands of German and Japanese machine-tool builders, who were quicker to exploit the technology. Today and down the road, partnerships may be key to ensuring that the U.S. machine-tool industry can continue to innovate and to compete at the cutting edge of technology application.
At NIST, a number of projects—past, present, and future—have been designed with the aim of improving machine tool capabilities and performance. Many fall squarely within the domain of "deterministic manufacturing."
Software error correction is one of the longest running lines of research in my laboratory, and this research has been especially useful to industry. When this work began, improving the performance of coordinate measuring machines and machine tools required making costly changes in the design, physical construction, and mechanical workings of the equipment.
Our researchers recognized, however, that computers presented the opportunity to correct proactively for predictable sources of errors, such as those caused by slight variations in the geometry of machines and their components. At the time, industry considered the idea too risky, and did not pursue it. We went ahead and demonstrated the feasibility of the approach. As the cost of computing power decreased, makers of CMMs chose to refine the technology and incorporate it into their products.
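To give the flavor of the idea, here is a minimal sketch; it is not our implementation or any vendor's. It assumes the machine's repeatable error has been characterized at a few calibrated points, and the controller simply subtracts the predicted error from each commanded position:

```python
# A minimal sketch of software error correction (illustrative only).
# Hypothetical error map: measured positional error (mm) at calibrated
# points along one axis, e.g., from a laser-interferometer survey.
ERROR_MAP = {0.0: 0.000, 100.0: 0.004, 200.0: 0.009, 300.0: 0.011}

def predicted_error(x: float) -> float:
    """Linearly interpolate the repeatable error at position x."""
    pts = sorted(ERROR_MAP)
    if x <= pts[0]:
        return ERROR_MAP[pts[0]]
    if x >= pts[-1]:
        return ERROR_MAP[pts[-1]]
    for lo, hi in zip(pts, pts[1:]):
        if lo <= x <= hi:
            frac = (x - lo) / (hi - lo)
            return ERROR_MAP[lo] + frac * (ERROR_MAP[hi] - ERROR_MAP[lo])

def corrected_command(target: float) -> float:
    """Command the axis so the actual position lands on the target."""
    return target - predicted_error(target)

print(corrected_command(150.0))  # commands slightly short of 150 mm
```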
Brown & Sharpe and Sheffield, now part of Giddings & Lewis, were among the first. The lineage of software error correction has since multiplied, and software-based methods of enhancing machine-tool accuracy have grown in power and sophistication. Our partners in implementing the technology have included Saginaw Machine Systems, General Motors, Ford, NCMS, Automated Precision, Inc., and Giddings & Lewis—this time on a piston-turning machine.
An Advanced Technology Program project has focused on the assembly side of things. The prospect of an ATP award was the impetus for forming a not-so-likely cast of partners: eight suppliers of assembly equipment and engineering services, two universities, and Chrysler and General Motors. The effort was called the "2-millimeter project," referring to the goal of pushing beyond the then world-class standard for dimensional control in automobile bodies. (Another goal was to be able to roll a ball bearing down the seams of a U.S.-built car the way it was done in the Lexus commercial a few years ago.) The research strategy was to take a coordinated systems approach—to identify and model the factors that contribute to dimensional variation and, then, to devise a metrology-based assembly process.
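To illustrate the modeling side with hypothetical numbers rather than the project's actual data: if the variation sources are independent, their standard deviations combine by root-sum-square, which shows immediately where improvement effort pays off most.

```python
# Toy variation stack-up (hypothetical numbers, not project data).
import math

# Hypothetical 1-sigma contributions (mm) to a body-panel dimension:
sources = {"stamping": 0.35, "fixturing": 0.40, "welding distortion": 0.25}

total_sigma = math.sqrt(sum(s ** 2 for s in sources.values()))
print(f"combined 1-sigma variation: {total_sigma:.2f} mm")
print(f"approximate 6-sigma spread: {6 * total_sigma:.2f} mm")
# The largest single term (fixturing here) is the first place to look.
```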
The approach worked, initially yielding a methodology that's already been applied in at least five U.S. assembly plants. All now achieve the world standard for dimensional control, which, as you might suspect, has dipped below 2 millimeters. The first, I am told, was a Chrysler plant with the oldest workforce among U.S. auto assembly plants. But the biggest benefits are yet to come, when the full set of results is incorporated into new tooling and applied at more plants and in more industrial sectors, such as aerospace, metal furniture, and appliance manufacturing. Efforts to foster this inter-industry technology transfer are under way.
The potential of software-based methods for improving machine-tool performance may reach full flower in this monster or in any of the growing variations of the still experimental hexapod machine tools. Hexapod technology—or, more broadly, parallel-design machines—represents a radical departure from traditional machine tool design. The jury is still out on whether it will live up to theoretical expectations, but one of the technology's principal virtues is that it is well-suited for the Information Age. For hexapods, accuracy depends mostly on the control and coordination of the strut movements. This makes them very computer intensive, which could be an advantage. It is much easier to change software instructions to improve performance and correct errors than it is to change the mechanical components of the machine. Right now we're working with a handful of companies on characterizing the performance of the technology and on developing evaluation methods. We participate in a Hexapod Users Group, which includes most of the makers of parallel machine tools, several prospective manufacturing users, DoE laboratories, and a number of universities. Stay tuned. In the not too distant future, NIST intends to provide you with the option of checking out the technology from your desktop computer or workstation. More on this later.
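To suggest the computational flavor, here is a minimal sketch with made-up geometry, not our testbed code: for a parallel machine, the required length of each strut is simply the distance between its base joint and its platform joint once a platform pose has been chosen.

```python
# Inverse kinematics, hexapod style (made-up geometry; three of the
# six struts shown for brevity). Each strut length is the distance
# from its base joint to its platform joint for the commanded pose.
import math

def strut_length(base_joint, platform_joint):
    return math.dist(base_joint, platform_joint)

base = [(1.0, 0.0, 0.0), (-0.5, 0.87, 0.0), (-0.5, -0.87, 0.0)]
platform = [(0.5, 0.0, 1.2), (-0.25, 0.43, 1.2), (-0.25, -0.43, 1.2)]

for i, (b, p) in enumerate(zip(base, platform)):
    print(f"strut {i}: {strut_length(b, p):.4f} m")
```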
Now, let's move from the brawn and sinew of machine tools on to the brains. Today, machine-tool control—in fact, industrial process control—is an area of intense interest. It's poised, I think, for remarkable advances. Continuing progress in computers, sensors, software, and mathematical modeling presents incredible opportunities for predictive, closed-loop process control. You can already find examples of factories that approach this ideal. Not many, but a few. They are likely to be proprietary or highly customized systems, built with controllers and other components that have closed interfaces. So, if you want to upgrade or revamp your system, add a new capability, or otherwise change it, you'd better have deep pockets and time to spare. Either you're going to be locked into buying a product, often at a premium, from the maker of your controller—if it has the application you're after—or you may have to spring for customized system integration: the software equivalent of gum and baling wire, except that it's tremendously more costly.
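For the flavor of closed-loop control, here is a minimal sketch with a made-up process variable and a simple proportional correction; a real predictive controller would add process models, feedforward terms, and much more:

```python
# A bare-bones feedback loop (illustrative only): read, compare,
# correct, repeat. Gains and values are made up.

def run_loop(setpoint: float, reading: float, gain: float = 0.5, steps: int = 8):
    for step in range(steps):
        error = setpoint - reading       # compare sensor reading to goal
        reading += gain * error          # apply a proportional correction
        print(f"step {step}: reading={reading:.3f}")

run_loop(setpoint=25.0, reading=20.0)    # converges toward 25.0
```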
Then there's the problem of lugging your growing accumulation of one-of-a-kind legacy software into the future. All of us in this room recognize the value, the promise, the potential of integration—not only at the level of process control, but in all facets of your operations. But, for the moment, let's stick to control—specifically machine-tool control. Ideas for new applications abound. Prospects for new capabilities are tantalizing. There's been an explosion of innovation in the area of sensors and actuators, presenting amazing new possibilities for process control and quality assurance.
Yet, I'd wager that your companies are devoting more resources to application maintenance than to pushing the envelope, or to pursuing new, more robust approaches to control. And when your company does venture into new application realms, an engineer or programmer becomes a modern-day Sisyphus, pushing this growing boulder of legacy software up the hill to the next highest level of automation.
Like the cartoon character Popeye, some companies are saying, "That's all I can stands, 'cause I can't stands no more!" Their answer is "open architecture"—not a characteristic Popeye response, I grant you. Bluto probably wouldn't have bought it. A few years ago, such a response would have been considered just as fanciful as a cartoon. But we are, I believe, in the early stages of an evolution toward open architectures and standard interfaces for controllers. To add momentum, we at NIST are working with industrial partners to clear technical obstacles to "plug-and-play" interoperability. I'll say a bit about this work in a moment.
In business terms, what a shift from "closed" to "open" architecture means is a shift from a market with relatively few players to a diversified market—a situation that usually benefits the customer. That's how the "Big 3" and major aerospace companies see the situation. We've worked with both types—at Boeing and at GM. And we've worked with this guy: Matt Shaver, proprietor and sole employee of his garage-based machining operation outside of Baltimore. Think of him as representing a "tier four or five" supplier.
Researchers in my laboratory have developed a prototype open-architecture controller. It serves as a testbed for evaluating standardized interfaces designed to accommodate interchangeable hardware and software components. Now, with a PC-based controller and standard interfaces, Matt Shaver, for example, can comparison shop and search for the best value. That's a significant advantage. He told our researchers that a 40-megabyte hard disk from a controller vendor sells for about $1,700. In comparison, an 850-megabyte, floor-hardened disk drive for a personal computer averages about $300. That's more than 20 times the capacity for less than one-fifth of the price.
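In software terms, here is a sketch of what plug-and-play buys. All names are hypothetical, and the real interface specifications are far more detailed, but the principle is that components from different vendors become interchangeable behind one standard interface:

```python
# Hypothetical standard interface (not the actual OMAC or NIST APIs).
from abc import ABC, abstractmethod

class AxisController(ABC):
    """An interface any vendor's axis module could implement."""
    @abstractmethod
    def move_to(self, position_mm: float) -> None: ...
    @abstractmethod
    def position(self) -> float: ...

class VendorAAxis(AxisController):
    def __init__(self) -> None:
        self._pos = 0.0
    def move_to(self, position_mm: float) -> None:
        self._pos = position_mm
    def position(self) -> float:
        return self._pos

class VendorBAxis(VendorAAxis):
    """A second vendor's module; same interface, different internals."""

def run_part_program(axis: AxisController) -> None:
    for target in (10.0, 25.0, 40.0):
        axis.move_to(target)             # identical code for either vendor
        print(axis.position())

run_part_program(VendorAAxis())
run_part_program(VendorBAxis())
```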
Multiply these prospective savings, and it's no surprise that some manufacturers have begun to push for open architectures. The "Big 3" and their aerospace counterparts have formed the Open Modular Architecture Controller Users Group to develop specifications for software interfaces, or application programming interfaces. The job has been assigned to a working group of industry and government engineers and scientists, including several from NIST. Our collaborators also have included Advanced Technology and Research Corp. and Hewlett-Packard, two companies that recently introduced controllers with open architectures. Siemens also has an open controller on the market.
To further this evolution, we will continue to focus on measurements and tests for evaluating and validating prototype standards. We're concentrating on priority applications identified with the guidance of industry groups, including a consortium we formed last year. And as standards are formalized, we will develop tests and tools that will help controller manufacturers and software vendors ensure that their products conform with standards, a role NIST already performs for the international product data exchange standard known as STEP.
By the way, STEP—the Standard for the Exchange of Product Model data—has begun to make—well—big strides. Software vendors have begun to implement formally approved elements of the standard and major manufacturers are beginning to claim it as their own. I anticipate that suppliers will be especially thrilled by STEP's growing acceptance, since they will no longer have to play musical CAD seats to satisfy the often conflicting requirements of their major customers.
In the controller area, the interfaces that ultimately do achieve broad industry acceptance will likely be a combination of market-dictated choices and standards crafted by consortia and formal standardization bodies. Because of the diversity of industries and needs in the manufacturing sector, several standard controller architectures may result. But we're heading in the right direction—from what was once dismissed as a manufacturing pipe dream toward controllers with software 'hooks' that enable competent programmers to affect real-time control processes. This is a matter of great strategic importance to U.S. industry—indeed to manufacturers all over the world, as indicated by standard architecture initiatives mounted in Europe and in Japan. To ensure that the resultant standards work to their advantage—or, at least, not to their disadvantage—U.S. companies and industries must be active in the standards arena.
Let's move up the integration ladder and survey the vast panorama of information technology, or IT. It's a sure bet that, with each rung of your ascent, the dizzier you will become. Call it Information Age vertigo—the result of a swirling array of hardware and software options, streaming torrents of data, a multitude of ideas on how to reorganize companies and revamp processes, and an almost endless procession of consultants, of course.
I promised earlier to issue an alert when I was about to start using a lot of buzzwords, and I'll do that now.
If spending patterns are any indication, American industry is enamored with IT. According to one 1996 estimate, U.S. companies spent 43 percent of their capital budgets on hardware alone—more than $200 billion—or "more than they invested in factories, vehicles, or any other type of equipment." When software, networks, support and maintenance, training and other related expenses are taken into account, U.S. industry's total IT bill was about $500 billion in 1996—about half of the more than $1 trillion spent worldwide.
What's motivating industry's spending binge on all this information technology—on all this cool stuff? Here's a partial explanation, attributed to a VP from a very, very large software company. "Cool," he is reported to have told a Silicon Valley audience, "is a powerful reason to spend money."
I suspect there's a not-so-small grain of truth here. But obviously, the primary motivators are gains in productivity, greater customer responsiveness, better performance, new organizational and manufacturing capabilities, and competitive advantages. Although many companies use their IT tools quite masterfully, on the whole, industry's investment in IT is not yielding full value. In large part, this is because we lack the means to flexibly integrate processes, functions, systems, and companies on small and large scales.
Our research shows, for example, that, today, there are more than 400 software products billed as manufacturing and production engineering tools. Some of these simulation, modeling, and other engineering support tools are very powerful applications for a particular function or a small set of functions. But these tools are largely incompatible with one another. Engineers re-enter data as they move back and forth between applications, which can lead to errors or to decisions made on the basis of information that's out-of-date or just plain wrong. How much more useful these applications would be to your engineers—and to your business—if they were part of an integrated manufacturing tool kit. Regularly updated data would flow seamlessly among various software applications within a common computing environment. Elements of a shared database would range from production-system requirements to product, process and equipment specifications and from cost estimates and budget spreadsheets to plant layouts and set-up illustrations.
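A toy sketch of that integration idea follows, with hypothetical fields; real product-data standards such as STEP are vastly richer. The point is that every tool reads the same up-to-date record instead of an engineer re-keying data into each application:

```python
# One shared, neutral product record (hypothetical fields) that
# several engineering "tools" read instead of re-entered copies.
part_record = {
    "part_id": "BRKT-0042",
    "material": "6061-T6 aluminum",
    "nominal_length_mm": 120.0,
    "tolerance_mm": 0.05,
}

def estimate_cost(rec: dict) -> float:
    return rec["nominal_length_mm"] * 0.02      # made-up cost model

def plan_process(rec: dict) -> str:
    return "grind" if rec["tolerance_mm"] < 0.1 else "mill"

print(estimate_cost(part_record), plan_process(part_record))
```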
In fact, we're working with the users and makers of production-engineering software to develop and demonstrate a prototype integrated environment. Participating companies include: Black and Decker, McDonnell Douglas, Raytheon, Deneb Robotics, Cim Technologies, and Adra Systems. Several government programs and universities also are involved.
So, with regard to manufacturing applications of information technology, we do have a vision that we're pursuing in our laboratories and with our collaborators. It's a shared vision. And this vision was captured best, perhaps, in a study by the National Research Council:
"The vision for 21st-century manufacturing presumes that interconnecting manufacturing applications will be as simple as connecting household appliances—one need only know how to run the application . . . and manage the interface . . . The ease of interconnection and interoperation extends from devices found on the factory floor to applications connecting the factory to the product design facility to applications connecting an enterprise to its suppliers and customers . . ."
We're pursuing this vision through our National Advanced Manufacturing Testbed. The NAMT really is about the future, about building the technical means to make the most of advances in the performance of computing, communication, and networking technologies. This is a job that must be tackled collaboratively, and the NAMT is designed to facilitate that kind of effort, in the style of the next phase of the Information Age.
With industry's guidance, we've designed the NAMT to serve as a vehicle for building information-based-manufacturing's equivalents of roads, bridges, interchanges, and even mass transit rails. It's a multi-node, multi-project testbed, built on a state-of-the-art, high-speed computing and communications infrastructure. NIST serves as the "virtual host" to remotely located collaborators from companies, universities, and government laboratories located around the country. Through the NAMT, for example, you will be able to evaluate a new control algorithm on the NIST hexapod, while sitting at the computer in your laboratory or home office.
We're addressing real manufacturing problems here, but the solutions must meet a requirement that goes beyond the immediate "fix." By that I mean all projects must yield solutions that are modular, integratable elements of larger systems. In so doing, NAMT research and demonstrations will contribute to an open set of standards, interfaces, architecture specifications, and other infrastructural elements that enable varied sets and subsets of manufacturing systems to work together.
Here's an example, the focus of a newly begun consortium. The objective is to develop the basis for virtual machine tools and inspection machines—computer models that behave just like the real McCoys on the factory floor. That capability would go a long, long way toward eliminating the communication gap between design and manufacturing. You could cut digital bits before you cut metal to make certain that the first real part you make will be within specifications. Think of the advantages: You'd be able to optimize use of your own machinery; you'd be able to avoid costly, eleventh-hour surprises, like the unexpected need to build new tooling or to change a design; and you'd be able to assess accurately whether prospective suppliers have the resources and capabilities to deliver parts that are within as-designed specs.
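Here is a toy sketch of that digital-first check, with made-up numbers; a real virtual machine model would capture the machine's full kinematics, dynamics, and thermal behavior:

```python
# Simulate the as-cut dimension from a (made-up) error budget, then
# check it against the design spec before committing real stock.

def simulated_dimension(nominal_mm: float, tool_deflection_mm: float,
                        thermal_growth_mm: float) -> float:
    return nominal_mm + tool_deflection_mm + thermal_growth_mm

nominal, tol = 50.000, 0.010                 # spec: 50 mm +/- 10 um
predicted = simulated_dimension(nominal, 0.004, 0.003)

in_spec = abs(predicted - nominal) <= tol
print(f"predicted {predicted:.3f} mm -> {'in spec' if in_spec else 'OUT OF SPEC'}")
```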
What must we do to reduce virtual machining and inspection to industrial practice? We need to develop tools and standardized building blocks. We need, for example, computer models that represent actual machine behavior, mathematical representations of part geometry, powerful machining and inspection algorithms, common data formats, and remotely accessible performance-data repositories. These will be the products of NAMT research. Many will be offered as the starting points for industry standards. That's an essential feature of the NAMT.
The value of information technology lies largely in connections, in links between applications, resources, and facilities. This is why NAMT projects emphasize developing the means to quickly assemble—and, as competitive circumstances dictate, reassemble—these linkages. This also is why collaboration is so essential. Standards—the means to achieving desired levels of interoperability, modularity, and reconfigurability—cannot be developed in isolation.
This is the unifying theme of the projects already under way at the NAMT and of those yet to come. For the record, here's the current slate of projects. I'd be glad to comment on them later if you'd like. Companies are participating in all of them.
The promise of information technology for manufacturing is, in fact, bright. But there are clouds. I've mentioned a few—lack of interoperability, bulky legacy applications and data, costly maintenance. Looming even larger on the horizon is the question of whether smaller manufacturers—suppliers—will have the wherewithal to embrace and deftly apply advanced Information Age technologies. If your supplier base doesn't join you in making the transition to information-based manufacturing, then you quickly come to a dead end. As a result, the benefits you realize from your investments in IT and other advanced technology may be marginal.
This is a challenge that NIST is helping to address through its Manufacturing Extension Partnership. As of this year, MEP consists of about 400 locally operated centers and field offices staffed by engineers and other specialists who provide technical assistance to smaller manufacturers. At this size, MEP estimates that, within the next few years, the network will be able to work with about 55,000 establishments annually.
Evaluation after evaluation—and there have been many since the program began in 1989—has credited extension services with helping to improve the performance of the vast majority of manufacturers that used these services.
MEP has targeted information technology and supply-chain optimization as two especially critical areas warranting increased emphasis. It recognizes that ever more demanding customers—that includes many of you—and increasingly formidable competitors are making the progression to more advanced, more capable technologies essential to the long-term survival of smaller manufacturers. Several major OEMs are taking a keen interest in MEP's planning and activities in these areas. If you'd like details, I'd be glad to put you in touch with the program's management.
I'll conclude with a rather pedestrian observation: What I find most noteworthy about modern manufacturing is the sheer and growing diversity of its parts, even as the larger companies cull their tiers of suppliers. The business of manufacturing has evolved from the equivalent of one-man bands and simple combos to incredible orchestras that play on a world stage. To be sure, the capabilities of individuals and their instruments remain important. But the tuning, the timing, and the arranging of a vast number of contributors are now absolutely critical to the quality and success of the performance. Today, it's not enough to be a virtuoso in one domain.
This means that, while building their proficiencies and investing in their own instruments, manufacturers must also think of themselves as members of an orchestra. While paying attention to the overall score, they must attend to the details of how best to perform with others—for example, by developing the interface standards that will enable each participant in a distributed manufacturing enterprise to enter on time, and in key.
I may have already strained the analogy. But information technology, I believe, is fundamentally changing the ecology of business, redefining the nature of competition, and placing a high premium on cooperation. Often in the arena of information technology, we'll discover that what is good for all performers can be even better for one. It will be best, however, for the firms most skilled and most savvy in understanding all aspects of the performance.
Thank you.