Today, the discovery and optimization of new materials for innovative products is a time-consuming and laborious process, as much a craft practiced by skilled artisans as a science. Expensive trial-and-error experimentation is a highly inefficient way to screen potential candidates for a desired new application, largely because designing a new material is extremely complex and involves many factors that must be balanced. For example, the performance of metal alloys, whether high-strength steel for automobiles, lightweight aluminum for airplanes, or specialty alloys used in nuclear reactors, is highly dependent on relatively small differences in the arrangement of their atoms.
A few percent change in composition or a slight modification of the manufacturing process can alter critical qualities, such as strength, by 50 percent or more. Investigating all possible variations of the composition and manufacturing processes of such alloys would be prohibitively expensive and time-consuming, requiring numerous tests of strength and other properties as well as microstructural analyses to determine the arrangements of the atoms. Alloys with major changes in composition are frequently not examined at all because of the time and cost associated with inserting new materials into engineering systems.
In another example, composite materials (multicomponent materials that include colloidal fillers in polymer matrices, nanoparticle dispersions, and polymers), whether they are advanced functional "inks" for printable electronics, new concrete formulations, or nanocomposites, may include dozens of molecular and microscale components, each of which can profoundly affect highly tuned properties. The same argument applies to such diverse applications as photovoltaic materials, advanced batteries, catalytic materials, and next-generation electronics. The result is much lost opportunity for the discovery and optimization of new materials on which new, higher-performance products can be based.
A powerful new tool for materials discovery and optimization has begun to emerge: computational materials by design. In contrast to an empirical, trial-and-error approach that may take a decade to implement, computational approaches based on physics-based material models can dramatically reduce development time while yielding higher-performance materials and far more effective, cheaper products.
DOE's Oak Ridge National Laboratory is currently using these methods to direct alloy development for Gen IV nuclear reactors. GE has cut its jet engine alloy development cycle from fifteen years to nine by using computational approaches, and it hopes to cut that time in half again using improved models and data. Procter & Gamble has made a significant commitment to virtual computing in product design and development, which saved P&G about 17 years of design time in 2009 alone. Products ranging from automobile engines to computer chips to next-generation nuclear power plants stand to benefit from such modern methods of materials engineering, clearly a major enabler for the future of manufacturing and American industry. But the effective use of modeling and simulation requires reliable data on the fundamental physical properties of materials at all relevant scales, from the atomic scale up to, say, steel I-beams.
For example, atomic and molecular interactions at the nanoscale, crystal grain defects at the microscale, and shape variations at the macroscale all conspire to affect the strength of a material. The most fundamental data are based on interactions at the atomic scale. Today such data remain spotty and of highly variable quality. Some data are measured at various labs around the world, but in an uncoordinated manner and in incompatible formats. This problem is particularly severe when researchers must tie one scale of modeling to another, say the atomic scale to the scale of cell phone components or turbine blades, in order to study the full range of material performance characteristics.
Modeling and simulation can also be used to generate the materials parameters and data needed by higher-level engineering models when measured data are not available. Quantum-scale models can supply such data when measurements are too expensive, too uncertain, or simply impossible. Examples of materials data that are difficult and expensive to measure, yet amenable to modeling, include interface energies between solid phases and the energy barriers to diffusion or mixing of solids at the atomic scale.
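As a rough illustration of how an atomistically computed quantity can feed a higher-level model, the short Python sketch below plugs a hypothetical, quantum-scale migration barrier into the standard Arrhenius expression for a diffusion coefficient; the pre-exponential factor, barrier height, and temperatures are assumed values chosen only for illustration.

import math

K_B = 8.617333262e-5  # Boltzmann constant in eV/K

def diffusion_coefficient(d0_cm2_per_s, barrier_eV, temperature_K):
    # Arrhenius estimate: D = D0 * exp(-Ea / (kB * T)), with the barrier Ea
    # taken from an atomic-scale (e.g., quantum mechanical) calculation.
    return d0_cm2_per_s * math.exp(-barrier_eV / (K_B * temperature_K))

# Assumed, illustrative inputs: a pre-exponential factor and a migration barrier.
d0 = 1.0e-3   # cm^2/s (assumed)
ea = 0.9      # eV (assumed)
for temp in (600.0, 900.0, 1200.0):
    print(f"T = {temp:6.0f} K -> D ~ {diffusion_coefficient(d0, ea, temp):.3e} cm^2/s")

In a real workflow, a coefficient estimated this way would in turn parameterize a continuum-scale diffusion or microstructure simulation, which is exactly the kind of scale-to-scale hand-off that requires trusted data.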
Such atomistic methods have matured in the past decade and are being investigated for product design by cutting-edge companies such as Intel. But major efforts in both theory and experiment are needed to provide the data that underlie successful modeling at all length scales. These efforts are central to providing the interoperability, validity, and confidence levels necessary to ensure industry adoption of modeling and simulation for computational materials by design, and they are directly aligned with NIST's role in establishing data quality.
The missions of the Materials Genome Initiative (MGI) and the National Institute of Standards and Technology (NIST) are tightly aligned. NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life. The MGI addresses precisely these mission elements by providing the means to reduce the cost and development time of materials discovery, optimization, and deployment. Both missions are driven by industrial competitiveness, with the creation of an innovation infrastructure as the means to this end. Given NIST's expertise in the integration, curation, and provisioning of critically evaluated data and models, NIST has assumed a leadership role within the MGI.
To foster widespread adoption of the MGI paradigm both across and within materials development ecosystems, NIST is establishing essential data exchange protocols and the means to ensure the quality of materials data and models. These efforts will yield the new methods, metrologies, and capabilities necessary for accelerated materials development. NIST is working with stakeholders in industry, academia, and government to develop the standards, tools, and techniques that enable the acquisition, representation, and discovery of materials data; the interoperability of computer simulations of materials phenomena across multiple length and time scales; and the quality assessment of materials data, models, and simulations.
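To suggest what such a data exchange record might need to capture, the minimal sketch below encodes a single materials property together with its units, uncertainty, and provenance. The field names and values are hypothetical and do not represent an actual NIST schema.

import json

# Hypothetical record illustrating the metadata a materials data exchange
# format must carry so that downstream users can judge quality: the value
# itself, its units and uncertainty, and the provenance of the measurement
# or calculation. Field names are illustrative, not an actual NIST schema.
record = {
    "material": {"formula": "Ni3Al", "phase": "gamma-prime"},
    "property": {
        "name": "interface_energy",
        "value": 0.025,        # assumed value, for illustration only
        "uncertainty": 0.005,
        "units": "J/m^2",
    },
    "provenance": {
        "method": "density functional theory",
        "contributor": "example-lab",
        "date": "2013-01-01",
    },
}

print(json.dumps(record, indent=2))

Whatever the final form, capturing provenance and uncertainty alongside the value is what allows data from uncoordinated sources to be compared, validated, and reused with confidence.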
Internally, NIST is conducting several pathfinder projects to enable the integration of key aspects of the materials innovation infrastructure, to expose challenges in the construction of this infrastructure, and to serve as exemplars for the broader MGI effort. These include pilot projects to develop superalloys and advanced composites, both new, energy-efficient materials for transportation applications. These activities are coordinated by the NIST Material Measurement Laboratory, in partnership with the NIST Information Technology Laboratory, and with broad participation across the Institute.
In summary, NIST is establishing (1) the essential materials data and model exchange protocols, (2) the means to ensure the quality of materials data and models, and (3) the new methods, metrologies, and capabilities necessary for accelerated materials development. Additionally, through its efforts to (4) integrate these activities, NIST is working to test and disseminate its developed infrastructure and best practices to its stakeholders.
Check out all the NIST projects supporting the MGI.