Tech Beat - October 22, 2014

Editor: Michael Baum
Contact: inquiries@nist.gov

NIST Offers Electronics Industry Two Ways to Snoop on Self-Organizing Molecules

A few short years ago, the idea of a practical manufacturing process based on getting molecules to organize themselves in useful nanoscale shapes seemed … well, cool, sure, but also a little fantastic. Now the day isn’t far off when your cell phone may depend on it. Two recent papers emphasize the point by demonstrating complementary approaches to fine-tuning the key step: depositing thin films of a uniquely designed polymer on a template so that it self-assembles into neat, precise, even rows of alternating composition just 10 or so nanometers wide.

Computer simulations of two possible morphologies of a block copolymer film demonstrate the need for an accurate 3D imaging tool. Red and blue areas represent the two different phases of the polymer film, seen from the side. Each phase is about 12 nm wide. Viewed from the top, both samples would appear to have evenly separated rows of the "red" phase, but the bottom sample in fact has an unwanted horizontal band that will disrupt the pattern transfer. Soft X-ray scattering data can distinguish the two.
Credit: Pitera/IBM Almaden Research Center

The work by researchers at the National Institute of Standards and Technology (NIST), the Massachusetts Institute of Technology, and the IBM Almaden Research Center focuses on block copolymers, a special class of polymers that, under the proper conditions, will segregate on a microscopic scale into regularly spaced “domains” of different chemical composition. The two groups demonstrated ways to observe and measure the shape and dimensions of the polymer rows in three dimensions. The experimental techniques could prove essential in verifying and tuning the computational models used to guide fabrication process development.

It’s old news that the semiconductor industry is starting to run up against physical limits to the decades-long trend of ever-denser integrated chips with smaller and smaller feature sizes, but it hasn’t reached bottom yet. Just recently, Intel Corp. announced that it had in production a new generation of chips with a 14-nanometer minimum feature size. That’s a little over five times the width of a DNA molecule.

At those dimensions, the problem is creating the multiple masking layers, in effect tiny stencils, needed to define the microscopic patterns on the production wafer. The optical lithography techniques used to create the masks, in a process akin to old-school wet photography, are simply not capable of reliably reproducing the extremely small, extremely dense patterns. There are tricks, such as creating multiple overlapping masks, but they are very expensive.

Transmission electron microscope (TEM) tomography provides a nanoscale, 3D visualization of the structure of a templated block copolymer. The purple features are silica posts fabricated by electron-beam lithography that direct the self-assembly of the copolymer. The material self-assembles to form two orthogonal layers of cylinders (green).
Credit: Winterstein/NIST

Hence the polymers. “The issue in semiconductor lithography is not really making small features—you can do that—but you can't pack them close together,” explains NIST materials scientist Alexander Liddle. “Block copolymers take advantage of the fact that if I make small features relatively far apart, I can put the block copolymer on those guiding patterns and sort of fill in the small details.” The strategy is called “density multiplication” and the technique, “directed self-assembly.”
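To make the arithmetic of density multiplication concrete, here is a back-of-envelope sketch in Python. The pitch values are purely illustrative; neither paper reports these particular numbers:

```python
# Back-of-envelope sketch of "density multiplication" with invented
# values: a coarse lithographed guiding pattern is filled in by a
# block copolymer whose natural period is several times finer.

template_pitch_nm = 80   # hypothetical spacing of the e-beam guiding features
bcp_period_nm = 20       # hypothetical natural period of the block copolymer

multiplication = template_pitch_nm // bcp_period_nm
print(f"{multiplication}x density multiplication: {bcp_period_nm} nm lines "
      f"from a template patterned at {template_pitch_nm} nm")
```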

Block copolymers (BCPs) are a class of materials made by connecting two or more different polymers that, as they anneal, will form predictable, repeating shapes and patterns. With the proper lithographed template, the BCPs in question will form a thin film in a pattern of narrow, alternating stripes of the two polymer compositions. Alternatively, they can be designed so one polymer forms a pattern of posts embedded in the other. Remove one polymer, and in theory, you have a near-perfect pattern for lines spaced 10 to 20 nanometers apart to become, perhaps, part of a transistor array.

If it works. “The biggest problem for the industry is the patterning has to be perfect. There can't be any defects,” says NIST materials scientist Joseph Kline. “In both of our projects we're trying to measure the full structure of the pattern. Normally, it's only easy to see the top surface, and what the industry is worried about is that they make a pattern, and it looks okay on the top, but down inside the film, it isn’t.”

Kline’s group, working with IBM, demonstrated a new measurement technique* that uses low-energy or “soft” X-rays produced by the Advanced Light Source at Lawrence Berkeley National Laboratory to probe the structure of the BCP film from multiple angles. Because the film has a regular, repeating structure, the scattering pattern can be interpreted, much as crystallographers do, to reveal the average shapes of the stripes in the film. If a poor match between the materials causes one set of stripes to broaden out at the base, for example, that will show up in the scattering pattern. Their major innovation was to recognize that although the basic technique was developed using short-wavelength “hard” X-rays, which have difficulty distinguishing two closely related polymers, much better results can be obtained with longer-wavelength X-rays that are more sensitive to differences in molecular structure.**
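To illustrate how a scattering pattern encodes the average line shape, here is a minimal Python sketch of diffraction from a periodic line grating. It is not the res-CDSAXS analysis code: the pitch and widths are invented, and the model ignores X-ray energy and phase variation through the film depth.

```python
import numpy as np

pitch = 28e-9        # hypothetical lamellar period (~14 nm half-pitch)
width_top = 12e-9    # line width at the top of the film
width_base = 16e-9   # broadened width at the buried base (the hidden defect)

def form_factor(q, w_top, w_base, n_slices=50):
    """Approximate a line's cross-section as a stack of thin rectangular
    slices from base to top; each slice of width w scatters with amplitude
    proportional to w * sinc(q*w / 2*pi)."""
    amp = np.zeros_like(q)
    for w in np.linspace(w_base, w_top, n_slices):
        amp += w * np.sinc(q * w / (2 * np.pi))  # np.sinc(x) = sin(pi x)/(pi x)
    return amp / n_slices

orders = np.arange(1, 6)
q = 2 * np.pi * orders / pitch   # diffraction peaks of a periodic grating

straight = form_factor(q, width_top, width_top) ** 2   # ideal vertical walls
flared = form_factor(q, width_top, width_base) ** 2    # broadened at the base

# The two profiles look identical from the top, but their higher-order
# peak intensities differ, which is what the scattering measurement sees.
for n, a, b in zip(orders, straight, flared):
    print(f"order {n}: straight sidewalls {a:.3e}, flared base {b:.3e}")
```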

While X-ray scattering measures average properties of the films, Liddle’s group, working with MIT, developed a method to look in detail at individual sections of a film by doing three-dimensional tomography with a transmission electron microscope (TEM).*** Unlike the scattering technique, TEM tomography can actually image defects in the polymer structure, but only over a small area, about 500 nanometers across.
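The principle behind the reconstruction can be sketched in a few lines with a standard filtered back-projection routine. This is only an illustration of tomography on a made-up 2D phantom, not the authors' pipeline; the plus-or-minus 70-degree limit mimics the restricted tilt range of a typical TEM stage, which is our assumption, not a detail from the paper.

```python
import numpy as np
from skimage.transform import radon, iradon

# Hypothetical phantom: two "polymer domains" in a 128x128 slice.
phantom = np.zeros((128, 128))
phantom[40:60, 30:100] = 1.0   # one cylinder cross-section
phantom[80:100, 30:100] = 1.0  # a second, parallel domain

# Simulate a tilt series over a limited angular range (degrees),
# then recover the slice by filtered back-projection.
angles = np.linspace(-70, 70, 71)
sinogram = radon(phantom, theta=angles)
reconstruction = iradon(sinogram, theta=angles)

error = np.abs(reconstruction - phantom).mean()
print(f"mean reconstruction error: {error:.3f}")
```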

Between them, the two techniques can yield detailed data on the performance of a given BCP patterning system. The data, the researchers say, are most valuable for testing and refining computer models. “Our measurements are both fairly time-consuming, so they're not something industry can use on the fab floor,” says Kline. “But as they're developing the process, they can use our measurements to get the models right, then they can do a lot of simulations and let the computers figure it out.”

“It’s just so expensive and time-consuming to test out a new process,” agrees Liddle. “But if my model is well validated and I know the model is going to give me accurate results, then I can crank through the simulations quickly. That’s a huge factor in the electronics industry.”

*With the daunting name “resonant critical dimension small angle X-ray scattering” (res-CDSAXS).
**D.F. Sunday, M.R. Hammond, C. Wang, W. Wu, D. Delongchamp, M. Tjio, J. Cheng, J.W. Pitera, R.J. Kline. Determination of the internal morphology of nanostructures patterned by directed self assembly. ACS Nano, 2014, 8 (8), pp 8426–8437 DOI: 10.1021/nn5029289.
***K.W. Gotrik, T. Lam, A.F. Hannon, W. Bai, Y. Ding, J. Winterstein, A. Alexander-Katz, J.A. Liddle, C.A. Ross. 3D TEM Tomography of templated bilayer films of block copolymers. Advanced Functional Materials. Article first published online Oct. 2, 2014 DOI: 10.1002/adfm.201402457.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


NIST Taps Nine Experts to Help Craft Disaster Resilience Framework for Communities

The National Institute of Standards and Technology (NIST) has engaged nine experts in fields ranging from transportation and water infrastructure to societal dimensions of disasters to further its ongoing effort to draft a disaster resilience framework for U.S. communities.

Recognized leaders in their fields, NIST’s new disaster resilience fellows were chosen to complement the knowledge and skill sets of agency researchers developing the framework—a guidance document to help communities prepare for hazardous events and to restore vital functions quickly if disruptive incidents occur.

The fellows also will assist NIST staff in establishing a Disaster Resilience Standards Panel. With initial funding from NIST, this independent body will be responsible for updating the framework and identifying new priorities for standards development and other actions that will help communities to better prevent natural and human-caused hazards from becoming disasters.

Listed under their area of expertise, NIST’s new disaster resilience fellows are:

Community Resilience Planning
Chris Poland, Chris D. Poland Consulting Engineer, Canyon Lake, Calif.

Electrical Power Infrastructure
Erich Gunther, EnerNex, Knoxville, Tenn.
Stuart McCafferty, GridIntellect, Huntsville, Ala.

Emergency Planning and Response
Jay Wilson, Hazard Mitigation Program Coordinator, Clackamas County, Ore.

Societal Dimensions of Disasters
Liesel A. Ritchie, University of Colorado Boulder, Natural Hazards Center, Boulder, Colo.

Transportation Infrastructure
Joseph Englot, HNTB, New York, N.Y.
Theodore Zoli, HNTB, New York, N.Y.

Water Infrastructure
Kevin M. Morley, American Water Works Association, Washington, D.C.
Donald Ballantyne, Ballantyne Consulting LLC, Seattle, Wash.

For more information on the fellows and the NIST-led community disaster resilience effort, go to: www.nist.gov/el/building_materials/resilience/.

To support development of the disaster resilience framework, NIST is convening regional workshops to solicit input from diverse stakeholder groups. The next workshop will be held Oct. 27-28, 2014, in Norman, Okla. For the registration site, go to: www.nist.gov/el/building_materials/resilience/oklahoma_workshop.cfm.

Media Contact: Mark Bello, mark.bello@nist.gov, 301-975-3776


Strengthening Thin-Film Bonds with Ultrafast Data Collection

When studying extremely fast reactions in ultrathin materials, two measurements are better than one. A new research tool invented by researchers at Lawrence Livermore National Laboratory (LLNL), Johns Hopkins University and the National Institute of Standards and Technology (NIST) captures information about both temperature and crystal structure during extremely fast reactions in thin-film materials.*

Temperature and structure: The graph shows heat absorbed by a thin film of aluminum as its temperature increased. Inset boxes show electron diffraction patterns captured by DTEM as the temperature changes. The patterns reveal the crystal structure and orientation of the aluminum. At low temperatures, the pattern is characteristic of a face-centered-cubic crystal structure. When the sample is heated past the large melting peak, the spots disappear, indicating that the aluminum has lost its crystal structure due to melting.
Credit: NIST

The combined device will help scientists study new materials and processes used to make advanced technologies, including state-of-the-art semiconductors and flat-screen display devices, says David LaVan, a NIST materials scientist who co-led the study.

Modern electronics manufacturing often pushes the limits of current measurement technology. Making a flat-screen display requires bonding a large sheet of a pure, rare material to an underlying metal substrate with as few defects as possible. To do so, manufacturers typically sandwich a thin film between the two materials and heat it rapidly to high temperatures, causing it to react and bond the metals.

This method usually works, but industry researchers would like to optimize the process, and existing tools to describe what’s happening in the reactive thin film provide only incomplete information. One such technique, nanocalorimetry, can very precisely track large temperature changes, at rates up to 1,000 degrees Celsius per millisecond, that occur at a very small scale. Such a measurement can alert researchers to a material’s phase transitions, for example, when a metal melts. But nanocalorimetry tells researchers little about the actual chemical processes or microstructural changes they are measuring as a material heats up or cools down.
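As a toy illustration of how a phase transition shows up in nanocalorimetry data, the sketch below finds the melting peak in a simulated heat-flow trace. The numbers are invented, and this is not the instrument's actual analysis code; scipy's generic peak finder stands in for whatever the researchers use.

```python
import numpy as np
from scipy.signal import find_peaks

# Simulated nanocalorimetry trace: heat absorbed per unit temperature
# rise, with a baseline drift and an endothermic melting peak.
temp = np.linspace(300, 1100, 2000)                  # sample temperature, K
baseline = 0.9e-6 + 2e-10 * temp                     # heat-capacity drift, J/K
melt_peak = 4e-6 * np.exp(-((temp - 933) / 8) ** 2)  # Al melts near 933 K
signal = baseline + melt_peak + np.random.normal(0, 5e-8, temp.size)

# A phase transition appears as a prominent peak above the baseline.
peaks, _ = find_peaks(signal, prominence=1e-6)
for p in peaks:
    print(f"phase transition detected near {temp[p]:.0f} K")
```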

To study these changes, LaVan’s LLNL collaborators Geoffrey Campbell, Thomas LaGrange and Bryan Reed developed a different device, the dynamic transmission electron microscope (DTEM). In traditional transmission electron microscopy, diffraction and transmission patterns made by electrons passing through a thin sample provide information about how the sample’s atoms are arranged. But TEM typically requires that the sample maintain one crystal structure for an extended period, as the microscope’s detector captures enough electrons to generate an image.

DTEM, by contrast, captures structural information very rapidly. It relies on a pulsed laser to send short, bright blasts of electrons through a sample. LaVan and his colleagues at NIST and Johns Hopkins realized that if the LLNL group’s DTEM laser pulses were synched with a rapid temperature rise, the researchers could simultaneously track phase transitions and structural changes in materials they were studying. “It’s like peanut butter and chocolate,” LaVan says. “If we can somehow get these two instruments working simultaneously, we’ll have the whole story.”

But first the researchers needed to shrink the circuitry for their nanocalorimeter to a tenth of its original size, so that it could fit inside the microscope. The researchers also needed to write new software to synchronize the microscope’s electron pulses with the nanocalorimeter’s rapid heating pulses. “To get [the devices] to work together was really a substantial effort from three different research groups,” LaVan says.
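At its heart, that synchronization is a timing calculation: given the calorimeter's heating rate, when should each electron pulse fire to catch the sample at a temperature of interest? Below is a minimal sketch of that logic, with invented numbers and no claim to match LLNL's actual control software.

```python
# Sketch of the scheduling problem the synchronization software solves:
# convert target sample temperatures into trigger delays for the DTEM's
# pulsed electron source. All values are illustrative.

heating_rate = 1.0e6      # K/s (1,000 K per millisecond, as in the text)
start_temp = 300.0        # K when the heating pulse begins
target_temps = [600.0, 800.0, 933.0, 1000.0]  # K, e.g. bracketing Al melting

for t_target in target_temps:
    delay_us = (t_target - start_temp) / heating_rate * 1e6
    print(f"fire electron pulse {delay_us:7.1f} us after the heating trigger "
          f"to probe the sample at ~{t_target:.0f} K")
```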

Finally, LaVan and team member Michael Grapes, a research associate at NIST and a materials science graduate student in Timothy Weihs’ group at Johns Hopkins, flew the redesigned nanocalorimeter to Livermore, synchronized it with the DTEM, and ran tests on thin films of materials such as aluminum, whose microstructural and thermal properties are well understood. The scientists found that, as expected, the nanocalorimeter recorded phase transitions at the same time the DTEM recorded structural changes, and both sets of measurements were consistent with their study materials’ known properties.

The research team is already moving on to study other, less well-understood materials. Recently, the scientists have used their combined nanocalorimeter-DTEM to measure what happens when aluminum and nickel combine to form thin-film alloys. The team’s study provides, for the first time, simultaneous structural and thermal data on this reaction at high heating rates, LaVan says.

*M.D. Grapes, T. LaGrange, L.H. Friedman, B.W. Reed, G.H. Campbell, T.P. Weihs and D.A. LaVan. Combining nanocalorimetry and dynamic transmission electron microscopy for in situ characterization of materials processes under rapid heating and cooling. Review of Scientific Instruments 85, 084902. Published online Aug. 18, 2014.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


NIST's Cloud Computing Roadmap Details Research Requirements and Action Plans

The National Institute of Standards and Technology (NIST) has published the final version of the US Government Cloud Computing Technology Roadmap, Volumes I and II. The roadmap focuses on strategic and tactical objectives to support the federal government’s accelerated adoption of cloud computing. This final document reflects input from more than 200 comments on the initial draft, received from around the world.

Credit: Irvine/NIST and ©magann/Fotolia

The roadmap leverages the strengths and resources of government, industry, academia and standards development organizations to support technology innovation in cloud computing.

The first volume, High-Priority Requirements to Further USG Agency Cloud Computing Adoption, describes the roadmap’s purpose and scope. The draft focused on three priorities: security, interoperability (the ability for systems to work together) and portability (enabling data to be moved from one cloud system to another). The final version adds two priorities: performance and accessibility. The document lays out 10 requirements necessary for federal government cloud adoption, including developing international standards, security solutions, and clear and consistent categories of cloud services.

Each requirement is described and features a list of “Priority Action Plans” with target completion dates. Research teams from government, industry and academia are working on these plans and report their findings via publications and presentations at meetings such as the Cloud Computing Forum and Workshop series. The document also provides information about future plans, collaborations, and how cloud work fits with other developing information technologies and national initiatives.

The second volume, Useful Information for Cloud Adopters, introduces a conceptual model, the NIST Cloud Computing Reference Architecture and Taxonomy, and presents U.S. government cloud target business and technical use cases.

Volume II also identifies existing interoperability, portability and security standards that apply to the cloud model and specifies high-priority gaps that need new or revised standards, guidance and technology. It also covers security challenges in cloud computing adoption. The document provides insight into the choice of the 10 requirements and the Priority Action Plans listed in Volume I.

Previous NIST work in cloud computing includes an internationally accepted definition of cloud computing, a cloud computing reference architecture (a model) and a security reference architecture draft. NIST scientists are involved in cloud-related international standards committees and lead a number of public working groups to solve cloud-related challenges.

NIST has recently established three new public working groups on Cloud Service, Federated Community Cloud, and Cloud Interoperability and Portability. Current work in the Cloud Computing Metrics group addresses the gaps in metrics and metrology in cloud computing under Requirement 10 from Volume I.

The cloud community’s work with NIST is critical to the U.S. government’s adoption of cloud computing, but it can be used by anyone interested in the field.

The two-volume set:

  • L. Badger, D. Bernstein, R. Bohn, F. de Vaulx, M. Hogan, M. Iorga, J. Mao, J. Messina, K. Mills, E. Simmon, A. Sokol, J. Tong, F. Whiteside and D. Leaf. US Government Cloud Computing Technology Roadmap Volume I: High-Priority Requirements to Further USG Agency Cloud Computing Adoption (NIST Special Publication 500-293), October, 2014.
  • L. Badger, R. Bohn, S. Chu, F. de Vaulx, M. Hogan, M. Iorga, V. Kauffman, F. Liu, J. Mao, J. Messina, K. Mills, E. Simmon, A. Sokol, J. Tong, F. Whiteside and D. Leaf. US Government Cloud Computing Technology Roadmap Volume II: Useful Information for Cloud Adopters (NIST Special Publication 500-293), October, 2014.

is available as a single PDF document at www.nist.gov/manuscript-publication-search.cfm?pub_id=915112.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


Evidence Says: December Symposium Showcases Forensics at NIST

Forensic science research goes back a long way at the National Institute of Standards and Technology (NIST)—more than a century. In fact, the agency served as the nation’s federal crime laboratory from 1913 to 1932, when the FBI established its own operation. Today, NIST research programs continue to make significant contributions to forensics, strengthening its scientific underpinnings and ensuring the credibility necessary for effective criminal justice.

The Forensics@NIST 2014 symposium will showcase the many ways that the agency supports forensic science, such as providing methods for the analysis and matching of toolmarks collected as evidence.
Credit: ©Robert Rathe

To spotlight how NIST currently serves the forensics community, the agency is hosting “Forensics@NIST 2014” on Dec. 3-4, 2014, at NIST headquarters in Gaithersburg, Md.

Attendees at the two-day symposium will learn how NIST’s world-class laboratories and staff support many branches of forensic science, including DNA analysis, fingerprint impression analysis, biometrics, ballistics, and computer and cell phone forensics. The event will feature numerous lectures and poster presentations by NIST scientists, engineers and collaborators.

Each day will highlight a specific set of disciplines:

  • Dec. 3: computer forensics, latent fingerprints and other biometrics, and DNA
  • Dec. 4: firearms/toolmark analysis and statistical measurements (Only poster presentations will be offered on this day.)

Opening the symposium will be the keynote address “Are Judges Losing Confidence in Forensic Science?” by Jed S. Rakoff, U.S. District Court Judge for the Southern District of New York. Judge Rakoff also teaches courses at Columbia Law School in white collar crime, science and the law, class actions, and the interplay of civil and criminal law. He is the co-author of five books and has published more than 125 articles.

NIST is hosting the symposium at no cost to attendees. However, to allow as many people as possible to benefit from the event, participants are asked to sign up for only the specific days they plan to be present. Registration ends Nov. 26, 2014.

To register online, visit the symposium homepage at www.nist.gov/forensics/forensics-at-nist-2014.cfm.

Attendees may participate in one of several special tours/demonstrations highlighting NIST programs in ballistics testing, usability and fingerprints, robotic intelligence systems, neutron research, rapidly simulated environmental exposure of evidence, and trace contraband detection, as well as guided visits of the NIST Museum and the interactive exhibit exploring the agency’s mission and historic achievements. For tour details, including sign-up on a first-come, first-served basis, contact Corinne Lloyd at corrine.lloyd@nist.gov.

For those unable to attend “Forensics@NIST 2014” in person, the presentations will be webcast. Details on how to access the program will be posted before Dec. 3 on the symposium homepage.

To learn more about NIST forensic science research, activities and resources, see www.nist.gov/forensics.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


James Olthoff Named Director of NIST’s Physical Measurement Laboratory

James Olthoff, a 27-year veteran of the National Institute of Standards and Technology (NIST), has been named director of the agency's Physical Measurement Laboratory (PML).

James Olthoff
Credit: NIST

"I have full confidence that under Dr. Olthoff's leadership, PML will continue in its tradition of world-class measurement science and technology along with the delivery of innovative and critically needed measurement services to industry," said NIST Acting Director Willie May.

Olthoff joined NIST's Applied Electrical Measurements Group in 1987. In 2001, he became chief of the Quantum Electrical Metrology Division, which maintained the low-frequency electrical standards for the United States. In 2007, Olthoff became deputy director of NIST's Electronics and Electrical Engineering Laboratory, which was responsible for all fundamental U.S. electrical and laser standards and research, and provided essential metrology support to the U.S. semiconductor and optoelectronics industries. In 2011, he became PML's Deputy Director for Measurement Services, responsible for oversight of all calibration services at NIST.

Olthoff received a Ph.D. in physics from the University of Maryland in 1985 in the area of atomic, molecular and optical physics. He then held a two-year appointment at the Johns Hopkins School of Medicine before arriving at NIST. He has authored or co-authored more than 120 publications and has co-authored or edited four books.

PML, with some 500 federal employees and more than 700 associates working at NIST's Gaithersburg, Md., and Boulder, Colo., campuses, develops and disseminates the national standards of length, mass, force and shock, acceleration, time and frequency, electricity, temperature, humidity, pressure and vacuum, liquid and gas flow, and optical, acoustic, ultrasonic and ionizing radiation.

PML's activities range from fundamental measurement research, such as quantum computing, to the provision of measurement services, including calibration services, standards and data. It is also responsible for coordinating the NIST-wide Calibrations and Weights and Measures programs, and supporting two joint institutes: JILA, with the University of Colorado Boulder, and the Joint Quantum Institute, with the University of Maryland. PML's talent pool includes four Nobel laureates and scores of award-winning researchers, technicians and staff.

"I am deeply honored to have been chosen to lead one of the finest organizations in the history of metrology at a time when measurement science faces unprecedented challenges and unparalleled opportunities," Olthoff said. "I look forward to helping ensure that NIST always meets the unique and ever-changing measurement needs of the United States."

The appointment is pending Department of Commerce approval.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343
