Tech Beat - June 17, 2014
Editor: Michael Baum
Date created: June 17, 2014
Date modified: June 17, 2014
Contact: inquiries@nist.gov

NIST Technique Could Make Sub-wavelength Images at Radio Frequencies

Imaging and mapping of electric fields at radio frequencies (RF)* currently requires the use of metallic structures such as dipoles, probes and reference antennas. To make such measurements efficiently, the size of these structures needs to be on the order of the wavelength of the RF fields to be mapped. This poses practical limitations on the smallest features that can be measured.

Laboratory apparatus for mapping and imaging of radio frequency (RF) electric fields at resolutions below the usual RF wavelength limit. Rubidium atoms are placed in the glass cylinder (on the right), which is illuminated at opposite ends by red and blue laser beams. The cylinder (2.5 by 7.5 centimeters in size) moves left on a track to enable the narrow laser beams to scan its entire width. The antenna (on the left) generates an RF field, which, depending on its frequency, has a certain effect on the spectrum of light absorbed by the atoms. By measuring this effect researchers can calculate and map the RF field strength as a function of position in the cylinder.
Credit: Holloway/NIST

New theoretical and experimental work by researchers at the National Institute of Standards and Technology (NIST) and the University of Michigan suggests an innovative method to overcome this limit by using laser light at optical wavelengths to measure and image RF fields. The new technique uses a pair of highly stable lasers and rubidium atoms as tunable resonators to map and potentially image electric fields at resolutions far below their RF wavelengths (though not below the much shorter wavelengths of the lasers).

This advance could be useful in measuring and explaining the behavior of metamaterials and metasurfaces—structures engineered to have electromagnetic properties not found in nature, such as the illusion of invisibility. Imaging with sub-RF wavelength resolution also could help measure and optimize properties of densely packaged electronics and lead to new microscopy systems and imaging sensors.

Typically, RF field measurements are averaged over antenna dimensions of tens of millimeters (thousandths of a meter) or more. NIST's prototype technique has resolution limited by the beam widths of the two lasers used—in the range of 50 to 100 micrometers (millionths of a meter). The technique was used to map RF fields with much longer wavelengths of 2,863 and 17,605 micrometers (frequencies of 104.77 gigahertz and 17.04 gigahertz, respectively).**
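
A quick sanity check on those figures: wavelength is simply the speed of light divided by frequency. The sketch below (Python, illustrative only, not code from the study) reproduces the quoted wavelengths to within rounding:

    # Wavelength from frequency: lambda = c / f
    C = 299_792_458.0  # speed of light, m/s

    for f_ghz in (104.77, 17.04):
        wavelength_um = C / (f_ghz * 1e9) * 1e6  # meters -> micrometers
        print(f"{f_ghz} GHz -> {wavelength_um:,.0f} micrometers")

    # Prints 2,861 and 17,593 micrometers; the quoted 2,863 and 17,605
    # differ only because the frequencies are rounded.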

The NIST and Michigan researchers mapped field strength as a function of position at resolutions as low as one-hundredth of an RF wavelength, far below normal antenna limits. Such data might be used to make colorized 2D images. In theory, the technique should work for wavelengths ranging from 600 to 300,000 micrometers.
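
To illustrate how such a field map becomes a colorized 2D image, the sketch below renders a synthetic standing-wave pattern with standard Python plotting tools; the field data and scan dimensions are invented for illustration, not measurements:

    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic standing-wave field magnitude over a 25 x 25 mm scan area.
    x = np.linspace(0, 25e-3, 200)   # scan positions, m
    y = np.linspace(0, 25e-3, 200)
    X, Y = np.meshgrid(x, y)
    wavelength = 2.863e-3            # ~104.77 GHz, m
    field = np.abs(np.cos(2 * np.pi * X / wavelength))

    # Color-map the field strength as a function of position.
    plt.imshow(field, extent=[0, 25, 0, 25], cmap="viridis")
    plt.colorbar(label="relative field strength")
    plt.xlabel("x (mm)")
    plt.ylabel("y (mm)")
    plt.show()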

The rubidium atoms are in a hollow glass cylinder (see photo), which is traversed down its length by two overlapping laser beams that act as stimulants and filters. First, a red laser excites the atoms, which initially absorb all the light. Then, a tunable blue laser excites the atoms to one of many possible higher energy ("Rydberg") states, which have novel properties such as extreme sensitivity and reactivity to electromagnetic fields.

Next, an RF field—at the frequency to be mapped or imaged—is applied. This field shifts the frequencies at which the atoms vibrate, or resonate, which in turn alters the frequencies at which the atoms absorb the red light. This change in absorption is easily measured and is directly related to the electric field strength at that part of the cylinder. By moving the cylinder sideways on a track across the narrow laser beams, researchers can map the changing field strength across its diameter. The blue laser can be tuned to excite the atoms to different states to measure the strength of different RF frequencies.
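
In the paper cited below, that change in absorption shows up as an Autler-Townes splitting of the atoms' electromagnetically induced transparency (EIT) line, and the field strength follows as E = h × (splitting in hertz) / (transition dipole moment). A minimal sketch of the conversion; the splitting and dipole moment here are assumed placeholders, not values from the experiment:

    # Autler-Townes splitting -> RF field strength: E = h * delta_f / p
    H = 6.62607015e-34          # Planck constant, J*s
    E_CHARGE = 1.602176634e-19  # elementary charge, C
    A0 = 5.29177210903e-11      # Bohr radius, m

    delta_f = 10e6                 # assumed splitting of the EIT line, Hz
    dipole = 1000 * E_CHARGE * A0  # assumed Rydberg dipole moment, ~1000 e*a0

    e_field = H * delta_f / dipole  # volts per meter
    print(f"RF field strength: {e_field:.2f} V/m")  # ~0.78 V/m here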

In the demonstration, researchers measured the strength of standing waves at specific locations inside the glass cylinder. For the two frequencies studied, measurements of the field agreed with results from numerical simulations.

The imaging technique is a spinoff of an ongoing NIST effort to develop a method that will, for the first time, directly link electric field measurements to the International System of Units (SI).

NIST developed the new measurement and imaging technique. University of Michigan co-authors provided the tunable blue laser and assisted in the measurements. The project is funded in part by the Defense Advanced Research Projects Agency.

*The term RF is used here to span the conventional radio, microwave, millimeter wave and terahertz frequency bands.
**C.L. Holloway, J.A. Gordon, A. Schwarzkopf, D. Anderson, S. Miller, N. Thaicharoen and G. Raithel. Sub-wavelength imaging and field mapping via EIT and Autler-Townes splitting in Rydberg atoms. Applied Physics Letters 104, 244102 (2014). Published online June 16, 2014. doi:10.1063/1.4883635.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


NIST Analytic Technique Offers Arson Investigators Faster, More Accurate Results

A research group at the National Institute of Standards and Technology (NIST) has demonstrated a new method for detecting ignitable liquids that could change the way arson fires are investigated. The new process for analyzing debris for traces of fire accelerants is faster and more accurate than conventional methods and produces less waste.*

Shown here are the tools for testing arson samples using NIST's new PLOT capillary method. Clockwise from top: a test chimney with a burned sample of Douglas fir (used in test burns), a paint can equipped with a septum cap, a test sample of Douglas fir, a scraper, a collection tray, fire debris samples, and, at the bottom, a PLOT capillary and two containers used with the capillary.
Credit: Burrus/NIST

An arson investigation typically requires collecting one or two liters of ashes and debris from various locations within a fire scene in metal cans similar to those used for paint, and sending the material to a lab. The testing methods typically include gas and liquid chromatography or various versions of spectroscopy, with gas chromatography being the most widely used in fire debris analysis, according to the lead NIST researcher, Tom Bruno.

When the fire debris is received at the testing facility, samples are taken for testing. Sometimes this involves suspending a strip of activated charcoal in the air, or "headspace," directly above the sample in the paint can for a period that varies with the judgment of the analyst, from 2 or 3 hours up to 16 hours.

Other testing methods include "dynamic purge and trap" of the headspace. Still another approach, a newer solid-phase microextraction (SPME) method, does not destroy the sample. This latter method, however, tends to displace lighter ignitable liquid components with heavier ones, is difficult to automate, makes preserving and archiving samples difficult, and has not shown a consistent ability to obtain repeatable, quantitative results. The SPME sampling method also requires expensive equipment, and the SPME fibers are easily damaged. Still other methods are less sensitive and produce large amounts of chemical waste.

The vapor collection method developed by Bruno's group involves the dynamic adsorption of headspace vapors on short porous layer open tubular (PLOT) columns maintained at low temperature (as low as -40 °C). The benefits of this method are many. The collection sensitivity is high: below 1 part per billion (ppb). The low temperature is achieved with a vortex tube connected to compressed air, which has no moving parts and is attractive for use in environments with explosive or flammable materials.

After vapor collection, the PLOT capillaries can be heated (up to 160 °C, again with the vortex tube), releasing the vapor. The capillaries used are robust and cheap, and this process is especially effective with relatively nonvolatile substances because of its wide operating temperature range. It also is not limited to water-borne samples, as most commercial sampling instruments are. And best of all, this PLOT-cryo method can be used to simultaneously test for up to eight different ignitable liquids from a single sample. This allows investigators to take multiple samples from each of several locations in a fire scene (such as a grid approach) in a short amount of time. This method also enables high repeatability and quality assurance of the testing process and is available in a portable unit that can perform the sampling in remote locations.

“This sampling method is faster, more efficient, recovers more analytes and produces much less waste than traditional methods,” Bruno said. “And the sampling device and its components are much cheaper than traditional equipment.” While the present study involved samples measured in the laboratory, Bruno has further developed the method to be field portable. A patent is pending for a device that will perform these same vapor collections, even at fire scenes. The self-contained portable unit is carried in a standard briefcase and may be available to arson investigators in as little as two years.

*J.E. Nichols, M.E. Harries, T.M. Lovestead and T.J. Bruno. Analysis of arson fire debris by low temperature dynamic headspace adsorption porous layer open tubular columns. Journal of Chromatography A, Volume 1334, 21 March 2014.

Media Contact: James Burrus, james.burrus@nist.gov, 303-497-4789


Snowballs to Soot: The Clumping Density of Many Things Seems to Be a Standard

Particles of soot floating through the air and comets hurtling through space have at least one thing in common: 0.36. That, reports a research group at the National Institute of Standards and Technology (NIST), is the measure of how dense they will get under normal conditions, and it’s a value that seems to be constant for similar aggregates across an impressively wide size range from nanometers to tens of meters.*

High school student Jessica Young checking the packing density of random aggregates of plastic spheres in a cylinder. Young's work as a summer intern at NIST contributed to a paper arguing that rigid aggregates like those she's testing tend to clump together at roughly the same density regardless of scale, from microscopic soot to large comets.
Credit: Baum/NIST

NIST hopes the results will help in the development of future measurement standards to aid climate researchers and others who need to measure and understand the behavior of aerosols like carbon soot in the atmosphere.

Soot comes mostly from combustion and is considered the second biggest driver of global warming, according to NIST chemist Christopher Zangmeister. It is made up of small round particles of carbon about 10 to 20 nanometers across. The particles stick together randomly in short chains and clumps of a half dozen or more spheres. These, in turn, clump loosely together to form larger aggregates of 10 or more, which over a few hours compact into the somewhat tighter ball that is atmospheric soot.

The interesting question for chemists studying carbon aerosols is: how tight? How dense? Among other things, the answer relates to the balance of climate effects from soot: heating from light absorption versus cooling from light reflection.

The maximum packing density of objects is a classic problem in mathematics, which has been fully solved for only the simplest cases. The assumed density in models of atmospheric soot is 0.74, which is the maximum packing density of perfect spheres, such as billiard balls, in a given space. But when Zangmeister’s team made measurements of the packing density of actual soot particles, the figure they got was 0.36. “We figured, man, we’ve got to be wrong, we’re off by a factor of two,” Zangmeister recalls, but “a bunch more measurements” convinced them that 0.36 was correct. Why?

Enter the summer help. Two students, one in college and one in high school, who were working with Zangmeister's group last summer were set to the task of modeling the packing question with little 6 mm plastic spheres sold for pellet guns. They glued thousands of random combinations of spheres together in clumps of 1 to 12 spheres, and then filled graduated cylinders and hollow spheres of every available size with their assemblies, over and over, and over.
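
The arithmetic behind each of those fills is simple volume bookkeeping: packing density is the total volume of the glued beads divided by the volume of the container. A minimal sketch; the 6 mm bead size comes from the article, while the bead count and container volume are invented for illustration:

    import math

    # Packing density = total bead volume / container volume
    BEAD_DIAMETER_MM = 6.0
    bead_volume = math.pi / 6 * BEAD_DIAMETER_MM**3   # mm^3 per bead, ~113.1

    cylinder_volume_ml = 100.0   # assumed 100 mL graduated cylinder
    beads_counted = 318          # assumed number of beads that fit

    packing = beads_counted * bead_volume / (cylinder_volume_ml * 1000.0)
    print(f"packing density: {packing:.2f}")  # 0.36 for these numbers

    # For comparison, the 0.74 ceiling for perfectly stacked spheres:
    print(f"close-packing limit: {math.pi / math.sqrt(18):.4f}")  # 0.7405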

Their charted results, as a function of clump size, form a curve that levels off at … 0.36.

It gets better. Inspired by a book on the solar system he was reading with his son, Zangmeister checked NASA's literature. Comets are formed in much the same way as soot particles, except out of dust and ice, and they're a lot bigger. NASA's measurements on a collection of 20 comets estimate the packing density at between 0.2 and 0.4. So 0.36 may be an all-purpose value.**

NIST’s interest in the nature of soot particles is driven by a desire to imitate them, according to Zangmeister. “It's amazing how much uncertainty there is in optical measurements of particles in the atmosphere. The reason for this uncertainty is rooted in something really important to NIST: there are no real methods for calibrations. You can calibrate any CO2 measurement using one of our Standard Reference Materials for CO2 in air, but there's no such thing as a bottle of standard aerosol or a standard aerosol generator. That’s really at the heart of what we’re trying to do: make a black material that simulates carbon that you can put into an aerosol and know it will come out the same way every time. It's a real materials chemistry project.”

The agency is working with the National Research Council of Canada and Environment Canada on the project.

*C.D. Zangmeister, J.G. Radney, L.T. Dockery, J.T. Young, X. Ma, R. You and M.R. Zachariah. The packing density of rigid aggregates is independent of scale. PNAS Early Edition. Published online June 9, 2014. doi:10.1073/pnas.1403768111.
**0.36 is also very close to the reported values for compacted silicon dioxide monomers (ceramics industry) and pharmaceutical powders made from “microscale random aggregates.”

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


NIST to Establish New Centers of Excellence for Work in Forensics, Disaster Resilience

Officials at the National Institute of Standards and Technology (NIST) have announced plans to establish two new research Centers of Excellence to work with academia and industry on issues in forensic science and disaster resilience.

Need for disaster resilience planning: The Northridge, Calif., earthquake of January 17, 1994, resulted in significant damage to the community's infrastructure, including streets, gas pipelines and power transmission systems.
Credit: Rymer/U.S. Geological Survey

NIST plans to hold merit competitions to establish the centers, tentatively planned to be funded at up to $4 million a year for five years.

NIST Centers of Excellence are meant to provide multidisciplinary research centers where experts from academia, industry and NIST can work together on specific high-priority research topics. The agency established its first such center, dedicated to advanced materials research, in December 2013.*

One of the planned new centers would focus on tools to support community disaster resilience. The center would work on developing integrated, systems-based computational models to assess community infrastructure resilience and guide community-level resilience investment decisions. The proposed center also would develop a data management infrastructure that allows for public access of disaster data, as well as tools and best practices to improve the collection of disaster and resilience data.

The second proposed center would support NIST's efforts to strengthen forensic science through the development and delivery of improved measurement and analysis technologies and the development of best practices and standardized methodologies to improve evidence interpretation and reporting. Because forensic science covers a broad array of technical disciplines, NIST is considering one or more cross-cutting areas where research could benefit work across the field. Potential technical areas of focus include probabilistic methods (analysis techniques that produce a scientific estimate of the likelihood that a known and unknown sample match), pattern recognition and digital evidence.

Each of these centers will provide additional technical resources and expertise to support NIST's ongoing efforts in these important areas.

Details of the application process will be posted on Grants.gov this summer. Plans for both centers are subject to the availability of funding. Interested parties may subscribe to receive email notification of NIST Center of Excellence program announcements.

For more on NIST's ongoing programs in disaster resilience, see http://www.nist.gov/el/building_materials/resilience/. For more on NIST's work in forensic science, see www.nist.gov/forensics/.

*See "NIST Announces New Center for Materials Research to Advance Manufacturing and Innovation" at www.nist.gov/public_affairs/tech-beat/tb20131203.cfm#coe.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


Two NIST Researchers Honored with Flemming Awards

Two researchers at the National Institute of Standards and Technology (NIST) are among 12 federal employees being honored today* as recipients of the 2013 Arthur S. Flemming Awards.

The Flemming Awards were established in 1948 by the Washington, D.C., Downtown Jaycees in honor of Arthur Flemming's commitment to public service throughout the seven decades of his distinguished career. The awards recognize exceptional young federal employees in five categories: leadership and management; legal achievement; social science, clinical trials and translational research; applied science and engineering; and basic science.

The two NIST researchers honored this year, with their award citations, are:

Physicist Thomas Perkins is one of two NIST winners of the 2013 Arthur S. Flemming Awards.
Credit: NIST

Dr. Thomas T. Perkins, of NIST's Physical Measurement Laboratory:
For creating unprecedented new ways, as a physicist, to precisely measure and manipulate the key molecules of life (DNA, RNA, proteins) under real-world biological conditions for the first time, through innovative, multidisciplinary programs combining atomic force microscopy (AFM), laser physics, molecular biology, and advanced electronics. Dr. Perkins' leadership has led to the invention of new AFM systems 100 times more stable and sensitive than the previous world's best. He achieved this remarkable improvement in the wet, warm environment needed to measure the molecules of life under natural conditions, rather than in the vacuum near absolute zero required by previous AFMs. Dr. Perkins' work has revealed new details about the structure and function of DNA, RNA, and proteins in natural environments for the first time, providing knowledge to engineer more effective medical diagnostics and treatments. He leads partnerships with industry to transfer advanced AFM and related technologies to develop new research and measurement tools for molecular biology. Mindful of the future, he mentors and trains the next generation of young scientists to pioneer new research and measurement technologies working in industry, universities, and national laboratories.

Dr. Emanuel H. Knill, of NIST's Information Technology Laboratory:
For his remarkable accomplishments as a NIST Fellow in the Applied and Computational Mathematics Division of the Information Technology Laboratory. Dr. Emanuel Knill is one of the world's leading theorists in the field of quantum information science and engineering. An emerging discipline at the intersection of physics and computer science, quantum information is likely to revolutionize science and technology in the same way that lasers, electronics, and computers did in the 20th century. Dr. Knill has developed some of the essential mathematical foundations for exploring the unique rules of quantum mechanics which govern atomic-scale systems to enable the development of novel computing devices with phenomenal increases in information storage and processing capability. His groundbreaking research in the theory of quantum optics, quantum error correction, quantum state tomography, quantum computer benchmarking, and quantum algorithms is providing essential guidance to the experimental physics community as it works to create a new age of quantum engineering.

The Arthur S. Flemming Awards are administered by The Trachtenberg School of Public Policy and Public Administration at The George Washington University. The full list of the 2013 Flemming award winners is at http://tspppa.gwu.edu/award-recipients.

*Originally published on June 9, 2014.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763
