
Tech Beat - May 5, 2015



Editor: Michael Baum
Date created: June 23, 2010
Date Modified: May 6, 2015 

Senate Confirms May as 15th NIST Director

Washington, D.C. – On May 4, 2015, the U.S. Senate confirmed Willie E. May as the second Under Secretary of Commerce for Standards and Technology and the 15th director of the National Institute of Standards and Technology (NIST). May has been serving as acting director since June 2014. He has worked at NIST since 1971, leading research in chemical and biological measurement science prior to serving as associate director for laboratory programs and principal deputy to the NIST director.

NIST Director Willie E. May
Credit: NIST

“Willie has been a partner and champion in our efforts to strengthen America’s manufacturing sector and promote innovation, key drivers to spurring economic growth, and core pillars of the Department’s ‘Open for Business Agenda.’ In addition to serving as a world-class research institute, NIST has taken the lead on several major Department of Commerce and Obama Administration priorities, including implementing a national network of manufacturing institutes and working with industry and other stakeholders to develop the NIST Cybersecurity Framework,” said U.S. Secretary of Commerce Penny Pritzker.

"This honor is something I could never have imagined when I began working as a bench chemist at the National Bureau of Standards more than 40 years ago," said May. "I am fully committed to maintaining NIST as a world-leading scientific research institution providing measurements, standards and technology solutions to our stakeholders. I will work to strengthen our Manufacturing Extension Partnership and Baldrige Performance Excellence programs, which also can significantly contribute to our nation's advanced manufacturing and innovation goals. I look forward to working with Secretary Pritzker to address the department's new responsibilities called out in the Revitalize American Manufacturing and Innovation Act."

In addition to his responsibilities at NIST, May serves as the vice president of the International Committee on Weights and Measures (CIPM) and president of the CIPM's Consultative Committee on Metrology in Chemistry and Biology.

May received a B.S. in chemistry from Knoxville College in Tennessee and a Ph.D. in analytical chemistry from the University of Maryland. Before joining NIST (then the National Bureau of Standards), he worked as a senior analyst at the Oak Ridge Gaseous Diffusion Plant. At NIST, his research has focused on trace organic analytical measurement science, the physical and chemical properties of organic compounds, and liquid chromatography, which is used to identify the components in a mixture.

Among many other awards and honors, May was elected a Fellow of the American Chemical Society in 2011. He has been recognized with the Department of Commerce's Bronze (1981), Silver (1985) and Gold (1992) medals. The National Organization for the Professional Advancement of Black Chemists and Chemical Engineers (NOBCChE) has recognized him with both the Percy Julian Award for outstanding research in organic analytical chemistry and the Henry Hill Award for exemplary work and leadership in the field of chemistry. May received the 2007 Alumnus of the Year Award from the College of Chemical and Life Sciences at the University of Maryland, and in 2010 he was among the first class of inductees into the Knoxville College Alumni Hall of Fame. He was the keynote speaker for the 2002 winter commencement ceremonies for the University of Maryland's College of Life Sciences, and for Wake Forest University's Graduate School of Arts and Sciences commencement exercises in 2012.

NIST was established in 1901, and since then, has carried out its mission to promote U.S. innovation and industrial competitiveness by making essential contributions to industry, science, public safety and national security. NIST's research and standards development activities cover a broad array of disciplines, from quantum physics to cybersecurity, advanced manufacturing to forensic science.

As a non-regulatory agency of the Commerce Department, NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards and technology in ways that enhance economic security and improve our quality of life. To learn more about NIST, visit the NIST website.

Media Contact: Jennifer Huergo, 301-975-6343


Go Figure: What Is the Most Efficient Design for a New Single-Family Home?

What is the optimal energy efficiency design for a new single-family house? Two economists at the National Institute of Standards and Technology (NIST) took a crack at answering this important question, and their multifaceted answer should help to inform home-buying decisions to get the biggest combined economic, energy and environmental “bang for the buck.”

Using a composite scoring system that rated building performance over 10 years, the economists found that houses built to exceed the Maryland energy efficiency building code (based on the 2012 International Energy Conservation Code)—but stopping short of trying to achieve “net-zero” energy performance—deliver the best value for home buyers. A net-zero house is one designed with both energy-conservation and energy-generation features—such as solar panels—so that over the course of a year it generates at least as much energy as it uses. 

According to the NIST analysis, a house that surpasses the current Maryland building code with four energy-efficiency upgrades saves about $6,300 in life-cycle costs over a decade, compared with the code-compliant building. And compared with NIST’s Net-Zero Energy Residential Test Facility (NZERTF), which incorporates nine energy-efficiency technologies exceeding code requirements, the optimized dwelling would realize total estimated savings of about $32,000 in total construction and operating costs over the same span. 

The analysis by economists Josh Kneifel and Eric O’Rear is detailed in a new NIST publication.* Using models, the economists compared the performance of alternative house designs under identical weather conditions.

On the basis of energy usage alone, according to the simulation, the NZERTF, located in Gaithersburg, Md., ranked at the top. Over a “typical” year, the NIST test house, a laboratory doubling as a prototypical two-story, four-bedroom suburban home, generated an energy surplus of more than 4,200 kilowatt hours and had an electric bill refund of almost $570. 

In contrast, the code-compliant house used more than twice as much energy as the NZERTF, for a total annual cost of $3,153. But a cost “optimal” house that adds high-efficiency windows, a high-efficiency heating and cooling system, reduced air leakage, and 100 percent energy-efficient lighting would trim total energy usage by nearly half. Additionally, there are other alternative energy efficiency measure combinations that lead to similar cost and energy savings.
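The figures above can be sanity-checked with simple arithmetic. In this sketch, the per-kilowatt-hour rate is not stated in the article; it is inferred from the reported ~$570 refund on a ~4,200 kWh surplus, and the consumption figure in the usage example is hypothetical:

```python
# Sanity check of the net-zero balance described above. The electricity
# rate is inferred from the article's own figures (refund / surplus);
# it is not quoted anywhere in the source.

surplus_kwh = 4200   # NZERTF annual generation minus annual use
refund_usd = 570     # reported electric-bill refund over a "typical" year

implied_rate = refund_usd / surplus_kwh   # dollars per kWh, roughly 0.14

# A design is net zero over a year if it generates at least as much
# energy as it consumes.
def is_net_zero(generated_kwh, consumed_kwh):
    return generated_kwh >= consumed_kwh

print(f"implied rate: ${implied_rate:.3f}/kWh")
# The 9,000 kWh consumption below is a hypothetical illustration.
print(is_net_zero(generated_kwh=9000 + surplus_kwh, consumed_kwh=9000))
```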

“‘Optimal’ is to a large degree in the eye of the beholder,” Kneifel explains. “Some people may only be interested in decreasing costs, while others may be willing to pay a premium for a house that minimizes energy consumption or obtains a green certification. Most likely, a homeowner will be interested in some combination of the three.”

Kneifel and O’Rear’s study took advantage of the design specifications and operation of the NIST NZERTF and the framework developed for NIST’s Building Industry Reporting and Design for Sustainability (BIRDS) database. BIRDS includes 864 building designs with extensive information for each on energy, environmental and cost performance.

With this rich supply of data, the economists used computer simulations to assess a variety of designs according to each performance yardstick. They also used three different methods to evaluate overall sustainability performance—a composite of economic, energy, and environmental metrics. 
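One simple way to build such a composite is to normalize each metric and take a weighted sum. The sketch below is only illustrative: the equal weights, the min-max normalization, and the numbers for the three designs are assumptions for demonstration, not the methods or data used in the NIST study.

```python
# Illustrative composite sustainability score, lower is better on each
# axis (cost, energy use, environmental impact). Equal weights and
# min-max normalization are assumptions, not the study's actual methods.

def normalize(values):
    """Scale a list of values to the range [0, 1] (min-max)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite_scores(cost, energy, environment, weights=(1/3, 1/3, 1/3)):
    """Weighted sum of normalized metrics; one score per design."""
    axes = [normalize(cost), normalize(energy), normalize(environment)]
    return [sum(w * axis[i] for w, axis in zip(weights, axes))
            for i in range(len(cost))]

# Hypothetical numbers for three designs:
# code-compliant, cost-optimized, and net-zero.
cost = [300_000, 295_000, 330_000]          # 10-year life-cycle cost, $
energy = [18_000, 10_000, 0]                # net annual energy use, kWh
environment = [1.0, 0.6, 0.1]               # relative impact index
print([round(s, 3) for s in composite_scores(cost, energy, environment)])
```

Because every axis is normalized to [0, 1] and the weights sum to one, each composite score also lands in [0, 1], which makes designs directly comparable.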

The NZERTF was the top performer in both the environmental impact and energy consumption metrics, but the significant additional investment for extra energy technologies, especially the full-house solar energy system, placed it at the bottom in terms of cost performance over a 10-year study period.

Compared with the Maryland code-compliant house, the house with the four energy upgrades yielded an annualized rate of return 10 percent higher, while the NZERTF's return on investment was 3.1 percent lower than that of the code-compliant house.

“These comparisons indicate that the NZERTF design is not yet fully cost-competitive, but also that building homes to meet commonly adopted energy efficiency codes will leave energy and financial savings on the table,” Kneifel says. In addition, financial incentives for energy efficiency or renewable energy, which were not considered in the analysis, can impact the relative rankings of alternative designs.

Kneifel advises that as technology advances, energy prices rise, and other factors change, the relative merits of more energy-efficient house designs also will change. “The energy efficiency of the cost optimal design is a moving target,” he says.

* J.D. Kneifel and E.G. O'Rear. Sustainability Performance of the NIST Net-Zero Energy Residential Test Facility Relative to a Maryland Code-Compliant Design (NIST SP 1187), March 2015.

Media Contact: Mark Bello, 301-975-3776


Two NIST Experts Named Service to America Medal Finalists

Two National Institute of Standards and Technology (NIST) researchers have been named finalists for 2015 Samuel J. Heyman Service to America Medals, which recognize excellence and innovation in the federal workforce.

NIST Physicist Gretchen K. Campbell and NIST Computer Scientist Ron Ross.
Credit: NIST

Physicist Gretchen K. Campbell and computer security researcher Ron Ross were among the 30 finalists announced May 6 as part of Public Service Recognition Week. They were chosen from more than 500 nominations across the federal government.

Campbell was named a finalist in the Call to Service Medal category. The medal recognizes a federal employee whose professional achievements reflect the important contributions that a new generation brings to public service. Three others were named in the category.

She was selected for her work in advancing atomtronics, the emerging science of creating circuits, devices, and materials using ultra-cold atoms instead of electrons. The field could pave the way for a new generation of technologies that surpass today’s state-of-the-art electronics.

Ross, a NIST Fellow, is one of four finalists for the Homeland Security and Law Enforcement Medal, which recognizes a federal employee for a significant contribution to the nation in activities related to homeland security and law enforcement. Ross instituted a state-of-the-art risk assessment system that has protected federal computer networks from cyberattacks and helped to secure information critical to national and economic security.

The winners will be announced October 7, 2015, in Washington, D.C. The medals are bestowed by the Partnership for Public Service, a D.C.-based nonprofit organization.

To learn more about the Samuel J. Heyman Service to America Medals and this year’s finalists, read “Finalists announced for ‘Sammies,’ which honor federal employees’ work” in the Washington Post.

Media Contact: Chad Boutin, 301-975-4261


NIST Contributes to IEC White Paper on Wireless Sensor Networks, IoT

A new white paper by an international team surveys the role of wireless sensor networks in the evolution of the Internet of Things (IoT). The report catalogs current needs for underlying standards and infrastructure that must be met before wireless devices can become, as some envision, nearly as plentiful as dust.

Cover of publication

Credit: IEC

Published by the International Electrotechnical Commission (IEC), Internet of Things: Wireless Sensor Networks* was prepared by a team led by Shu Yinbiao, of the State Grid Corporation of China, and Kang Lee, an engineer at the National Institute of Standards and Technology (NIST).

Wireless sensor networks have been called a digital skin. They are self-organizing networks of distributed devices—or nodes—that work cooperatively to gather and transmit information from the surrounding environment, be it a factory, electric grid, field-monitoring site, intelligent transportation system, or virtually any other setting. In addition to sensing the environment, these networks also may handle some control functions, such as adjusting building thermostats, redirecting traffic flows in a city, or optimizing manufacturing processes in a factory. 

The IoT is a broader technological concept that embodies wireless sensor networks. As envisioned, the IoT would embed miniature computers in all manner of objects, from the most sophisticated, such as aircraft, to the mundane, such as clothing and appliances. Each object would be uniquely identifiable and linked through an Internet-like structure.

Cisco Systems estimates that the number of devices connected to the Internet will double to 50 billion by 2020.

The new IEC white paper discusses evolution of wireless sensor networks within the wider context of the IoT, describes the characteristics of wireless sensor networks and current applications and trends, and surveys future applications and the obstacles that stand in the way. It also assesses needs for standards to achieve interoperability among wireless sensor networks from different vendors and across varied applications.

Given the growing number of uses of wireless sensor networks, it’s not surprising that many different standards organizations address various aspects of the technology, often in isolation. The white paper calls on standards organizations to improve communication and coordination, make unified plans, optimize resource allocation, and reduce duplicative efforts.

It also recommends increasing research devoted to improving network performance and quality of service, developing a systems architecture and integration technologies to accommodate diverse networks, developing a common model for ensuring security, and improving access technologies to conserve wireless spectrum and to support larger networks.

“We hope this paper will be a useful resource for a large and diverse community of stakeholders,” says Lee. “It provides a much-needed, high-level perspective on the technology’s vast potential and on the standards-related tasks that must be accomplished so that we can realize it.”

* International Electrotechnical Commission, Internet of Things: Wireless Sensor Networks, 2015.

Media Contact: Mark Bello, 301-975-3776


Getting Better All the Time: JILA Strontium Atomic Clock Sets New Records

In another advance at the far frontiers of timekeeping by National Institute of Standards and Technology (NIST) researchers, the latest modification of a record-setting strontium atomic clock has achieved precision and stability levels that now mean the clock would neither gain nor lose one second in some 15 billion years*—roughly the age of the universe.

JILA's strontium lattice atomic clock now performs better than ever because scientists literally "take the temperature" of the atoms' environment. Two specialized thermometers, calibrated by NIST researchers and visible in the center of the photo, are inserted into the vacuum chamber containing a cloud of ultracold strontium atoms confined by lasers.
Credit: Marti/JILA

Precision timekeeping has broad potential impacts on advanced communications, positioning technologies (such as GPS) and many other technologies. Besides keeping future technologies on schedule, the clock has potential applications that go well beyond simply marking time. Examples include a sensitive altimeter based on changes in gravity and experiments that explore quantum correlations between atoms.

As described in Nature Communications,** the experimental strontium lattice clock at JILA, a joint institute of NIST and the University of Colorado Boulder, is now more than three times as precise as it was last year, when it set the previous world record.*** Precision refers to how closely the clock approaches the true resonant frequency at which the strontium atoms oscillate between two electronic energy levels. The clock's stability—how closely each tick matches every other tick—also has been improved by almost 50 percent, another world record.

The JILA clock is now good enough to measure tiny changes in the passage of time and the force of gravity at slightly different heights. Einstein predicted these effects in his theories of relativity, which mean, among other things, that clocks tick faster at higher elevations. Many scientists have demonstrated this, but with less sensitive techniques.****

"Our performance means that we can measure the gravitational shift when you raise the clock just 2 centimeters on the Earth's surface," JILA/NIST Fellow Jun Ye says. "I think we are getting really close to being useful for relativistic geodesy."

Relativistic geodesy is the idea of using a network of clocks as gravity sensors to make precise 3D measurements of the shape of the Earth. Ye agrees with other experts that, when clocks can detect a gravitational shift at 1 centimeter differences in height—just a tad better than current performance—they could be used to achieve more frequent geodetic updates than are possible with conventional technologies such as tidal gauges and gravimeters.

In the JILA/NIST clock, a few thousand atoms of strontium are held in an "optical lattice," a 30-by-30 micrometer column of about 400 pancake-shaped regions formed by intense laser light. JILA and NIST scientists detect strontium's "ticks" (430 trillion per second) by bathing the atoms in very stable red laser light at the exact frequency that prompts the switch between energy levels.

The JILA group made the latest improvements with the help of researchers at NIST's Maryland headquarters and the Joint Quantum Institute (JQI). Those researchers contributed improved measurements and calculations to reduce clock errors related to heat from the surrounding environment, called blackbody radiation. The electric field associated with the blackbody radiation alters the atoms' response to laser light, adding uncertainty to the measurement if not controlled.

To help measure and maintain the atoms' thermal environment, NIST's Wes Tew and Greg Strouse calibrated two platinum resistance thermometers, which were installed in the clock's vacuum chamber in Colorado. Researchers also built a radiation shield to surround the atom chamber, which allowed clock operation at room temperature rather than much colder, cryogenic temperatures.

"The clock operates at normal room temperature," Ye notes. "This is actually one of the strongest points of our approach, in that we can operate the clock in a simple and normal configuration while keeping the blackbody radiation shift uncertainty at a minimum."

In addition, JQI theorist Marianna Safronova used the quantum theory of atomic structure to calculate the frequency shift due to blackbody radiation, enabling the JILA team to better correct for the error.

Overall, the clock's improved performance tracks NIST scientists' expectations for this area of research, as described in "A New Era in Atomic Clocks." The JILA research is supported by NIST, the Defense Advanced Research Projects Agency and the National Science Foundation.

* For this figure, NIST converts an atomic clock's systematic or fractional total uncertainty to an error expressed as 1 second accumulated over a certain minimum length of time. That is calculated by dividing 1 by the clock's systematic uncertainty, and then dividing that result by the number of seconds in a year (31.5 million) to find the approximate minimum number of years it would take to accumulate 1 full second of error. The JILA clock has reached a higher level of precision (smaller uncertainty) than any other clock.

** T.L. Nicholson, S.L. Campbell, R.B. Hutson, G.E. Marti, B.J. Bloom, R.L. McNally, W. Zhang, M.D. Barrett, M.S. Safronova, G.F. Strouse, W.L. Tew and J. Ye. Systematic evaluation of an atomic clock at 2 × 10⁻¹⁸ total uncertainty. Nature Communications. April 21, 2015.

*** See the 2014 NIST Tech Beat article, "JILA Strontium Atomic Clock Sets New Records in Both Precision and Stability."

**** Another NIST group demonstrated this effect by raising the quantum logic clock, based on a single aluminum ion, about 1 foot. See the 2010 NIST news release, "NIST Pair of Aluminum Atomic Clocks Reveal Einstein's Relativity at a Personal Scale."

Media Contact: Laura Ost, 303-497-4880


NIST Releases Draft Community Resilience Planning Guide for Public Review

Guide aims to help communities prepare for—and effectively recover from—disasters

HOUSTON—The U.S. Commerce Department's National Institute of Standards and Technology (NIST) today issued a draft guide to help communities plan for and act to keep windstorms, floods, earthquakes, sea-level rise, industrial mishaps and other hazards from inflicting disastrous consequences.

NIST is requesting public feedback on the draft Community Resilience Planning Guide for Buildings and Infrastructure, which Acting Under Secretary of Commerce for Standards and Technology and Acting NIST Director Willie May unveiled during a workshop at Texas Southern University in Houston today.

The official first version of the guide will be released this fall and updated periodically as new building standards and research results become available and as communities gain experience using the guide and recommend improvements.

"The guide helps to translate the concept of community resilience into practice," May said at the workshop. "We need stakeholder input to ensure that the guide will be an effective tool for helping communities to not only 'weather the storm' but also to bounce back quickly and efficiently."

Resilience planning can help communities to better withstand and bounce back from storms and other hazards so that they can quickly restore roads, power and other essential services. Aerial views during a U.S. Army search and rescue mission show damage from Hurricane Sandy to the New Jersey coast, Oct. 30, 2012.
Credit: U.S. Air Force photo by Master Sgt. Mark C. Olsen

According to data collected by the Commerce Department's National Oceanic and Atmospheric Administration (NOAA), over the last four years, the nation experienced 42 extreme weather events that caused at least $1 billion in damage, for a total cost of about $227 billion and 1,286 lives lost. In all, there were 334 major disaster declarations in the United States between 2010 and 2014. According to a separate tally by the Center for Research on the Epidemiology of Disasters in Belgium, the United States experienced about 500 natural disasters between 1994 and 2013, ranking second globally, behind China. The 10 deadliest of these U.S. disasters killed more than 4,000 people.

"Resilience planning is not a stand-alone activity," said NIST structural engineer Therese McAllister, who led development of the planning guide. "The guide recommends that communities integrate their resilience plans into economic development as well as zoning and other local planning activities that impact buildings, public utilities and other infrastructure systems that residents rely on for important services."

The guide lays out a six-step process that starts with the formation of a resilience team drawn from the community and culminates with the development and implementation of resilience strategies that are updated regularly. The resilience team's role is to engage community representatives in a series of efforts that include defining how vital social functions like healthcare, education and public safety are supported by local buildings and infrastructure systems, such as power, water and transportation.

This information helps to address a critical question: When do buildings and infrastructure systems that support social functions need to be restored so that recovery is not deferred and the community's longer-term ability to serve local residents does not deteriorate?

The guide is an important addition to the National Preparedness System, which provides a way to organize preparedness activities and programs. Nearly 24,000 U.S. communities have developed mitigation plans that aim to reduce the risk of damage from a hazard, according to the Federal Emergency Management Agency (FEMA). Effective mitigation measures will, for example, protect a building from flooding, but factoring in resilience will help to ensure that the structure also has power and water during recovery.

The draft guide consists of two volumes. The first provides an overview of community resilience and summarizes the six steps involved in developing and implementing a resilience plan. It also provides an example of how a fictional community uses the framework to plan and guide resilience efforts.

The second volume serves as a detailed resource to support the six steps. It includes comprehensive sections on characterization of social and economic functions, buildings, transportation, energy, communication, water and wastewater and community resilience metrics.

NIST led the development of the draft guide, convening four regional meetings to gather stakeholder input. It engaged nine outside experts in disciplines ranging from buildings to public utilities and from earthquake engineering to sociology to assist in drafting the guide. NIST also drew on its own expertise, developed through its detailed studies of more than 50 disasters and building failures, including the collapse of the World Trade Center buildings and the 2011 Joplin, Mo., and Moore, Okla., tornadoes. NIST expertise also comes from ongoing research to improve the structural performance of buildings and technical contributions to the development of building standards and codes by other organizations. NIST has no regulatory authority.

The 60-day public review of the draft Community Resilience Planning Guide for Buildings and Infrastructure has been announced in the Federal Register. For more information on NIST's Community Disaster Resilience Program, visit the NIST website.



Media Contact: Mark Bello, 301-975-3776


New Report Recommends Policies for Improved Preservation of Biological Evidence

All states should have laws ensuring that criminal justice systems properly handle, store and retain forensic biological evidence, according to a new report* from the National Institute of Standards and Technology (NIST). NIST's guide, Biological Evidence Preservation: Considerations for Policy Makers, encourages legislators, judges, law enforcement officials, crime laboratory managers and other policy makers to implement or update laws that support best practices in this critical area.

A law enforcement officer collects evidence from a simulated crime scene during training for handling of hazardous biological materials.
Credit: Center for Domestic Preparedness, Federal Emergency Management Agency, U.S. Department of Homeland Security

"While 43 states and the District of Columbia have enacted statutes related to the preservation of biological evidence, policies and procedures can be enacted in states that currently have no laws," as well as states looking to improve existing legislation, according to the NIST report.

Biological evidence refers to two types of evidence commonly recovered from crime scenes or collected during criminal investigations: biological samples such as blood, semen and other bodily fluids; hair; tissue; bones and teeth; or items containing biological material such as a bloody T-shirt. An earlier NIST report, The Biological Evidence Preservation Handbook: Best Practices for Evidence Handlers,** detailed a set of best practices to help ensure biological evidence is properly stored to avoid contamination, protected against premature destruction or degradation, and accurately tracked to prevent loss.

The new guide for policy makers discusses key issues that influence and drive policies in this area. Based on a thorough examination of existing state statutes, current trends, law, scientific literature and expert opinions, the authors make nine recommendations for actions that support best practices for preserving biological evidence.

"Biological evidence can carry a lot of weight in solving crimes, but if you can't find it or find it in an unusable state, it won't help you conduct the necessary forensic analyses to administer justice fairly," said Shannan Williams, project manager in the NIST Forensic Science Research Program.

Among the report's policy recommendations is that each state require:

  • The establishment of an authoritative body to define and enforce standards related to biological evidence preservation;
  • That biological evidence be stored in appropriate environmental conditions, based on known scientific practices;
  • That evidence be retained according to timetables based on the type of crime and the status of the case; and
  • A means for defendants or petitioners to seek recourse in cases where it has been judicially determined that a denial of access to biological evidence has occurred.

Both reports were authored by the Technical Working Group on Biological Evidence Preservation, a group of 20 experts from various forensic, law enforcement and scientific disciplines, as well as legal scholars, medical personnel and representatives of relevant professional organizations.

The National Commission on Forensic Science, coordinated by the Department of Justice and NIST, has chosen to address this topic through the creation of an Evidence Retention and Preservation Working Group. The working group is developing a document that will summarize the status of scientific and legal issues surrounding the retention and preservation of biological as well as non-biological evidence.

The commission plans to discuss policies to support best practices for biological evidence preservation at its meeting in Washington, D.C., April 30-May 1, 2015. Williams will present the new NIST report to the commission during the meeting. For meeting details and a link to a webcast of the event, visit the commission's website.

To learn more about biological evidence preservation, see the NIST Forensic Evidence Management Web page.

* Technical Working Group on Biological Evidence Preservation. Biological Evidence Preservation: Considerations for Policy Makers (NISTIR 8048), 2015.

** Technical Working Group on Biological Evidence Preservation. The Biological Evidence Preservation Handbook: Best Practices for Evidence Handlers (NISTIR 7928), 2013.

Media Contact: Michael E. Newman, 301-975-3025


It's a Trap! But That's Okay for Novel Light-Detecting Material

Traps. Whether you’re squaring off against the Empire or trying to wring electricity out of sunlight, they’re almost never a good thing. But sometimes you can turn that trap to your advantage. A team from the University of Nebraska-Lincoln, working with researchers at the National Institute of Standards and Technology (NIST), has shown that charge-trapping defects that are typically problematic in solar cells can be an asset when engineering sensitive light detectors.

photothermal induced resonance images
Height (c) and photothermal induced resonance (d) images of a sample area before annealing, and the same sample area after 60 minutes of annealing (g,h). Scale bars are 1 micrometer. Data that the group gathered suggest that during curing, metal ion clusters are left on the OTP film surface. These ion clusters could act as charge traps, enabling the material's increased sensitivity. Credit: NIST

Their work, which appeared on the cover of Advanced Materials, suggests that these defects could be harnessed to make light sensors that consume very low power and could be used for imaging, spectroscopy, and other industrial and scientific applications. 

For the past few years, researchers have been studying films of organometal trihalide perovskites—OTPs for short—for use in solar cells, because they have several attractive qualities. 

For one, they are relatively easy to process. A solution of the material can be poured over a fast-spinning disk. The spinning disk throws off the excess, and the remaining solution—now much thinner than a typical coat of paint—is dried. Even though the films are only a few hundred nanometers thick, they can still absorb most of the light that hits them.
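To see why a film only a few hundred nanometers thick can absorb most incident light, a simple Beer-Lambert estimate is instructive. The sketch below assumes an absorption coefficient of about 1e5 per centimeter—a typical literature value for OTPs at visible wavelengths, not a figure from this article:

```python
import math

def fraction_absorbed(alpha_per_cm, thickness_nm):
    """Beer-Lambert estimate of the fraction of light absorbed by a film."""
    thickness_cm = thickness_nm * 1e-7  # 1 nm = 1e-7 cm
    return 1.0 - math.exp(-alpha_per_cm * thickness_cm)

# Illustrative values: alpha ~ 1e5 /cm, film thickness 300 nm.
print(f"{fraction_absorbed(1e5, 300):.0%}")  # prints 95%
```

Under these assumed values, a 300 nm film already captures roughly 95 percent of the light that reaches it, consistent with the article's point that such thin coatings suffice.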

So far, other “solution-processable” materials have been hindered by their relative inability to conduct light-generated electrical charges, a property known as carrier mobility. But OTPs’ carrier mobility is comparable to that of crystalline silicon. Potentially, it could be a game-changer that would allow the engineering of high-performance, light-harnessing devices at low cost.

When a photon is absorbed by a light-sensitive material, it transfers energy to a negatively charged electron, which goes to an excited state and leaves behind an empty spot, called a hole, which is positively charged. To make use of the absorbed energy in a solar cell or light detector, these oppositely charged carriers must drift in opposite directions towards different electrodes. Material defects that “trap” either the electrons or the holes reduce carrier mobility and degrade the device’s performance.

The research team led by the University of Nebraska-Lincoln found that although trap states in the bulk material are bad for solar cells, surface defects and traps close to the electrodes in OTPs can be engineered to boost their light-detecting performance. 

“To detect light, one measures changes that occur in a material when a photon hits it,” says NIST’s Andrea Centrone. “One way to increase a detector’s sensitivity to light is to apply a voltage to it. Traps located near the electrodes lower the energy barrier for injecting electrons into the material. In our devices, lowering the barrier effectively multiplies the material’s light sensitivity up to 500 times when we apply the right voltage.” 

Centrone explains that trap states only have this multiplying effect if they are located in proximity to the electrodes, and not throughout the material. Because that large amplification requires a very low voltage (about 1 volt) in OTPs, these highly sensitive detectors could be powered with button batteries like the ones found in watches, or integrated into low power circuits. 
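The amplification Centrone describes can be pictured with the standard textbook model of photoconductive gain, in which gain is the ratio of carrier lifetime to transit time, and transit time shrinks as the applied voltage grows. This is a generic sketch with illustrative numbers, not the paper's own analysis or its device parameters:

```python
def photoconductive_gain(lifetime_s, length_cm, mobility_cm2_Vs, voltage_V):
    """Textbook photoconductive gain: carrier lifetime divided by transit time,
    where transit time t = L^2 / (mu * V)."""
    transit_s = length_cm**2 / (mobility_cm2_Vs * voltage_V)
    return lifetime_s / transit_s

# Illustrative values (not from the paper): 10-micrometer electrode gap,
# mobility 10 cm^2/(V*s), 1-microsecond carrier lifetime, 1 V bias.
g = photoconductive_gain(1e-6, 10e-4, 10.0, 1.0)
print(f"gain ~ {g:.0f}")  # prints gain ~ 10
```

Because gain scales linearly with voltage in this model, even a small bias of about 1 volt can multiply the photocurrent, which is consistent with the low-power operation the researchers describe.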

Using a technique called photothermal induced resonance (PTIR), the NIST researchers studied the surface decomposition of OTP films at the nanoscale during curing. PTIR is a novel experimental technique being advanced at NIST that combines the spatial resolution of atomic force microscopy with the chemical specificity of infrared spectroscopy.

The data they gathered suggest that during curing, metal ion clusters are left on the OTP film surface. These ion clusters could act as charge traps, enabling the increased sensitivity. The researchers believe that PTIR characterization will provide important information to link the nanoscale properties of OTP films to the macroscale properties of OTP devices, which may allow the engineering of more efficient OTP-based light detectors and solar cells. 

“Discovering that a small applied voltage increases the light sensitivity of hybrid perovskites is extremely exciting,” says Jinsong Huang at the University of Nebraska-Lincoln. “Not only did we add hybrid perovskites to the catalog of light-sensing materials, but the seemingly magical electronic transport capabilities of hybrid perovskites have resulted in better performance with respect to other materials, such as organic and nanocomposite materials, whose sensitivity can be similarly boosted by our strategy.”

*R. Dong, Y. Fang, J. Chae, J. Dai, Z. Xiao, Q. Dong, Y. Yuan, A. Centrone, X. C. Zeng and J. Huang. High-gain and low-driving-voltage photodetectors based on organolead triiodide perovskites. Advanced Materials, Volume 27, Issue 11, pages 1912–1918, March 18, 2015.

Media Contact: Mark Esser, 301-975-8735

back to top

NIST Technique Can Measure Volumes of Key 'Lab on a Chip' Components

Imagine shrinking tubes and beakers—in fact, most of a clinical chemistry lab—down to the size of a credit card. When engineers figured out how to do that two decades ago, they enabled complex tests to be performed with tiny "lab on a chip" technology. But until now, there has been no way to accurately measure the size of the tiny vessels they created. Now, scientists at the National Institute of Standards and Technology (NIST) have found* a potential solution to this longstanding manufacturing issue.

NIST found a combination of techniques to effectively measure microfluidic channels, achieving an accuracy of within 5 percent for both a channel's depth and its bottom's width. Scale bar in this cross-section of a channel represents 50 micrometers.
Credit: Reyes/NIST

The NIST approach could meet an important need in the microfluidics industry, which creates devices useful in fields from medical testing to toxin detection. It could be particularly important for miniaturized devices that rely on volume measurements to report, for example, the concentration of a particular molecule in a mixture.

Because microfluidic devices are good at straining and purifying liquids, they have found use across the biotech field, where they are invaluable for DNA testing, distilling specific proteins from a mixture, and other applications where a great number of samples need to be analyzed quickly. These labs on a chip can take a variety of shapes depending on their purpose, but they often feature tiny pipes and chambers where fluid samples flow and collect.

These channels and reservoirs have presented a longstanding problem to manufacturers, whose devices are often made of plastic: there has been no way to measure their dimensions effectively for quality control. As a result, it is hard for a company that makes plastic devices to guarantee that one of its miniature labs will perform the same as another, to design devices whose function depends on knowing exactly how much fluid they hold, or to know how much the finished device deviates from its intended design.

"The inability to determine these dimensions limits a multi-billion dollar industry," says NIST's Darwin Reyes. "Without them, your chip might not give you the right value for the concentration of insulin in a blood test, for example, or tell you how much DNA there was at one crucial step in the process."

The NIST team's solution was to survey a number of potential measurement methods in the hope that some combination of them would deliver reliable results. According to Reyes, this was the first time anyone had compared multiple techniques quantitatively to determine how well each measures a channel's actual dimensions.

Their survey led them to a pair of techniques: optical coherence tomography (OCT) and confocal microscopy (CM). They found that while neither could reliably measure all dimensions by itself, OCT could measure a channel's depth to an accuracy of 4 percent while CM gave readings of its width at the bottom to within 5 percent. The results matched those obtained by filling the channels with epoxy and measuring its cross-section later—"an approach that gives great answers but unfortunately ruins the device in the process," Reyes says.
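Those per-dimension accuracies translate directly into how well a channel's volume can be known. The sketch below uses simple quadrature propagation, assuming independent errors and a channel whose volume is depth times width times length; this is an illustrative back-of-the-envelope calculation, not the uncertainty analysis from the paper:

```python
import math

def volume_rel_uncertainty(rel_depth, rel_width, rel_length=0.0):
    """Relative uncertainty of V = depth * width * length, combining
    independent relative errors in quadrature."""
    return math.sqrt(rel_depth**2 + rel_width**2 + rel_length**2)

# From the article: OCT gives depth to ~4%, CM gives bottom width to ~5%.
# Channel length is assumed to be known exactly for this illustration.
u = volume_rel_uncertainty(0.04, 0.05)
print(f"volume known to ~{u:.1%}")  # prints volume known to ~6.4%
```

Under these assumptions, combining the two techniques would pin down a channel's volume to roughly 6 percent, which gives a feel for the level of quality control the method could support.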

The combination of OCT and CM only takes a few minutes and would therefore be fast enough to meet manufacturers' needs, Reyes says, though at this point it would be expensive. But, he says, the work "opens the door for what could be seen as a standard method for measuring microfluidic devices in very little time and with traceable accuracy and precision."

* D.R. Reyes, M. Halter and J. Hwang. Dimensional metrology of lab-on-a-chip internal structures: A comparison of optical coherence tomography with confocal fluorescence microscopy. Journal of Microscopy, published online April 8, 2015. DOI: 10.1111/jmi.12245.

Media Contact: Chad Boutin, 301-975-4261

back to top

Commerce Acting Under Secretary Encourages Business to Take Active Role in Corporate Cybersecurity

The acting Under Secretary of Commerce for Standards and Technology today called on corporate CEOs and board members to take active roles in managing how their institutions deal with cybersecurity risks.

NIST Acting Director Willie May at the Board Agenda: CYBER Conference, April 17, 2015.
Credit: Heyman/NIST

Speaking at the "Board Agenda: CYBER" conference in Washington, D.C., Dr. Willie May said, "As CEOs, board members, or other senior leaders of your organizations, managing cyber risks is one of the most important things you can do to protect your assets, your customers, and your companies." May also is the acting director of the National Institute of Standards and Technology (NIST).

May said that top corporate managers should review and consider using the Framework for Improving Critical Infrastructure Cybersecurity, a voluntary guidance document issued by NIST in February 2014. The product of a year-long collaboration of cybersecurity and management experts from the federal government, industry and academia, the framework was designed as a risk management approach that builds on recognized best practices and standards for cybersecurity.

May said that preventing all cyber attacks is likely not possible; however, a company can use the framework to make successful attacks substantially more difficult and to facilitate rapid detection and recovery. "The goal is a balanced approach that both protects and quickly detects when something is amiss. And it's one that emphasizes being prepared with a strong response and recovery plan," he said.

May said that the framework already is being used effectively by firms ranging from major multinationals to small businesses. The full prepared text of May's remarks is available from the NIST website.

Media Contact: Jennifer Huergo, 301-975-6343

back to top