In This Issue...
Shelling Out Evidence: NIST Ballistic Standard Helps Tie Guns to Criminals
Thanks to a new reference standard developed by the National Institute of Standards and Technology (NIST), law enforcement agencies will have an easier time linking the nearly 200,000 cartridge cases recovered annually at U.S. crime scenes to specific firearms.
Cartridge cases—the empty shells left behind after a gun is fired—are routinely sent to forensic laboratories for analysis when they're found at a shooting scene. Using a specialized microscope called an Integrated Ballistic Identification System (IBIS), lab technicians acquire digital images of three markings, or "signatures," impressed on the cartridge case by the gun that fired it. These signatures—the firing pin impression, the breech face impression and the ejector mark—are unique to the firearm that made them and can serve as "fingerprints" for that gun once the digital images are entered into a national database known as the National Integrated Ballistic Information Network (NIBIN).
For forensic examiners to reliably match recovered cartridge cases with ones whose signatures have been recorded in the NIBIN, they need to have confidence in the accuracy of the equipment and procedures used to make the link. That's where NIST's new "standard casing" comes in. The standard contains two items: an exact replica of a master cartridge case with distinct signature marks (obtained from the Department of Justice's Bureau of Alcohol, Tobacco, Firearms and Explosives, or ATF), and a "golden" digital image of those same signatures that resides in the NIBIN.
Forensic labs can image the signatures on the standard reference material (SRM) cartridge cases—a test of their optical microscope and measurement procedures—and compare those images to the golden image provided by ATF. In this way, lab technicians can verify their equipment is calibrated and functioning properly, validate their methodology, and demonstrate that their work is traceable to an authoritative national standard.
To make the standard cartridge cases, NIST engineers used a technique called electroforming, which is similar to the electroplating method used by jewelers to coat objects with silver, gold and other metals. First, a master fired cartridge case was selected. Then the electroforming process created a near-perfect inverse copy, like a photographic negative, of the master. The negative was then used as a mold from which multiple replicas were made.
"The electroforming process is so accurate that the replica cartridge cases made using it have signature marks that are less than a few micrometers—millionths of a meter—different from those on the master," says NIST mechanical engineer Alan Zheng, one of the team members who developed the reference casing.
NIST Standard Reference Material (SRM) 2461, Standard Casing, is the second NIST ballistic reference material to be made available to law enforcement agencies. In 2006, NIST created a Standard Bullet, SRM 2460, which is machined to precisely mimic the firing markings seen on sample bullets obtained from ATF and the Federal Bureau of Investigation (FBI). Like the standard cartridge case, SRM 2460 includes a bullet and golden images of the striations on the bullet.*
"Together, the two SRMs make a powerful tool for forensic labs and the law enforcers they support," says NIST physical science technician Brian Renegar, another member of the ballistics SRM development team. "For instance, if a crime is committed in California, and another in New York using the same firearm, the link between the two crimes might be missed. The NIBIN network enables forensic examiners to identify these potential matches, or 'hits', where they might otherwise go unnoticed. And the NIST ballistic SRMs ensure that the imaging systems used in the labs are calibrated and operating properly, and that proper measurement procedures are being followed."
For more information on both SRM 2460 and SRM 2461, including ordering instructions, go to www.nist.gov/srm/index.cfm.
* See the Jan. 19, 2007, NIST Tech Beat article, "NIST 'Standard Bullet' Fights Gang Violence" at www.nist.gov/public_affairs/techbeat/tb2007_0119.htm#bullet.
Media Contact: Michael E. Newman, email@example.com, 301-975-3025
NIST's Speedy Ions Could Add Zip to Quantum Computers
Take that, sports cars! Physicists at the National Institute of Standards and Technology (NIST) can accelerate their beryllium ions from zero to 100 miles per hour and stop them in just a few microseconds. What's more, the ions come to a complete stop and hardly feel the effects of the ride. And they're not just good for submicroscopic racing—NIST physicists think their zippy ions may be useful in future quantum computers.
The ions (electrically charged atoms) travel 100 times faster than was possible before across a few hundred micrometers in an ion trap—a single ion can go 370 micrometers in 8 microseconds, to be exact (about 100 miles per hour).
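The quoted speed checks out with a quick unit conversion from the reported distance and time (a back-of-the-envelope calculation, not from the paper):

```python
# Convert the reported ion transport (370 micrometers in 8 microseconds)
# into more familiar units.
distance_m = 370e-6   # meters
time_s = 8e-6         # seconds

speed_m_per_s = distance_m / time_s      # 46.25 m/s
speed_mph = speed_m_per_s / 0.44704      # 1 mph = 0.44704 m/s exactly

print(round(speed_m_per_s, 2), round(speed_mph, 1))   # 46.25 103.5
```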
Although ions can go much faster in accelerators, the NIST ions demonstrate precision control of fast acceleration and sudden stops in an ion trap. A close analogy is a marble resting at the bottom of a bowl, and the bowl suddenly accelerating (see animation). During the transport, the marble will oscillate back and forth relative to the center of the bowl. If the bowl is suddenly stopped at the right time, the marble will come to rest together with the bowl. Furthermore, the NIST researchers verified that their atomic marble's electron energy levels are not affected, which is important for a quantum computer, where information stored in these energy levels would need to be moved around without compromising the information content.
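The marble-and-bowl picture can be captured in a toy harmonic-oscillator model (an illustrative sketch, not the actual trap dynamics from the paper): the "bowl" (trap center) moves at constant velocity and then stops suddenly. If the transport lasts a whole number of oscillation periods, the marble ends up at rest; otherwise it keeps sloshing.

```python
import math

def residual_energy(omega, v, T, dt=1e-4):
    """Integrate x'' = -omega^2 * (x - xc(t)) with velocity Verlet.

    The trap center xc moves at velocity v until time T, then stops.
    Returns the leftover oscillation energy about the final trap
    center (zero means a perfectly "quiet" stop).
    """
    x, vx, t = 0.0, 0.0, 0.0
    a = -omega**2 * (x - 0.0)
    while t < T + 2 * math.pi / omega:       # integrate a bit past the stop
        x += vx * dt + 0.5 * a * dt * dt
        t += dt
        xc = v * min(t, T)                   # trap center: moving, then parked
        a_new = -omega**2 * (x - xc)
        vx += 0.5 * (a + a_new) * dt
        a = a_new
    u = x - v * T                            # displacement from final center
    return 0.5 * vx**2 + 0.5 * omega**2 * u**2

omega = 2 * math.pi                          # one oscillation per unit time
good = residual_energy(omega, v=1.0, T=1.0)  # whole number of periods
bad = residual_energy(omega, v=1.0, T=0.5)   # half a period: badly timed stop
print(good < 1e-3, bad > 1.0)                # True True
```

Stopping after exactly one period leaves essentially no residual motion, while stopping after half a period leaves the "marble" oscillating with substantial energy, which is the timing control the NIST experiment achieves for real ions.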
For a quantum computer to solve important problems that are intractable today, the information carried by many quantum bits, or qubits, needs to be moved around in the processor. With ion qubits, this can be accomplished by physically moving the ions. In the past, moving ions took much longer than the duration of logic operations on the ions. Now these timescales are nearly equivalent. This reduces processing overhead, making it possible to move ions and prepare them for reuse much faster than before.
As described in Physical Review Letters,* NIST researchers cooled trapped ions to their lowest quantum energy state of motion and, in separate experiments, transported one and two ions across hundreds of micrometers in a multi-zone trap. Rapid acceleration excites the ions' oscillatory motion, which is undesirable, but researchers controlled the deceleration well enough to return the ions to their original quantum state when they came to a stop. A research group from Mainz, Germany, reports similar results.
The secret to the speed and control is custom electronics. NIST researcher Ryan Bowler used fast FPGA (field programmable gate array) technology to program the voltage levels and durations applied to various electrodes in the ion trap. The smooth voltage supply can move the ions very fast while also keeping them from getting too excited.
With advances in precision control, researchers think ions could be transported even more quickly and yet still return to their original quantum states when they stop. Researchers must also continue to work on the many practical challenges, such as suppressing unwanted heating of the ion motion from noisy electric fields in the environment. The research is supported by the Intelligence Advanced Research Projects Activity, National Security Agency, Office of Naval Research, and Defense Advanced Research Projects Agency.
* R. Bowler, J. Gaebler, Y. Lin, T.R. Tan, D. Hanneke, J.D. Jost, J.P. Home, D. Leibfried and D.J. Wineland. Coherent diabatic ion transport and separation in a multi-zone trap array. Physical Review Letters. Aug. 20, 2012. DOI: 10.1103/PhysRevLett.109.080502.
Media Contact: Laura Ost, firstname.lastname@example.org, 303-497-4880
Seeing the Light with NIST's New Noiseless Optical Amplifier
Most devices that amplify light suffer from the same problem: making the image brighter also adds muddying distortion. Scientists working at the National Institute of Standards and Technology (NIST) have demonstrated that they can amplify weak light signals without adding noise while also carrying more information—more pixels—than other low-noise amplifiers. The new development could improve optical communications, quantum computing and information processing, and enhance biological and astronomical imaging.*
Researchers have developed other noiseless light amplifiers using "nonlinear" crystals and optical fibers, but both are limited when it comes to amplifying images. Crystals need high laser intensities, which can distort the image. Amplifying light with fibers works well, but the fibers have to be long and the beam is confined to a small area, which effectively limits the image to a single pixel.
NIST's four-wave mixing technique amplifies images by intersecting the light from three differently colored lasers—two "pumps" and a probe laser carrying the image—at precise angles inside a gas of hot rubidium atoms. After passing through a stencil in the shape of the image they want to amplify, the probe laser, whose color, or frequency, is halfway between those of the pump lasers, bisects the angle made by the pump lasers. The combination of the lasers' color, their angle of intersection, and their interaction with the rubidium gas creates the conditions for noiseless amplification of complex images with potentially thousands of pixels.
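The "halfway" condition reflects energy conservation in four-wave mixing: two pump photons convert into two probe photons, so the probe frequency must be the average of the two pump frequencies. The sketch below uses hypothetical pump wavelengths chosen near the rubidium resonance purely for illustration; the article does not give the experimental values.

```python
c = 299_792_458.0                       # speed of light, m/s

# Hypothetical pump wavelengths, for illustration only.
lam_pump1 = 794.8e-9
lam_pump2 = 795.2e-9

nu_pump1, nu_pump2 = c / lam_pump1, c / lam_pump2
nu_probe = (nu_pump1 + nu_pump2) / 2    # probe sits midway in frequency
lam_probe = c / nu_probe                # between the pumps in wavelength,
                                        # via the harmonic (not arithmetic) mean
print(round(lam_probe * 1e9, 3))        # ~795.0 nm
```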
There is a limitation to this kind of amplifier—it's "phase sensitive." This means that for the amplification to be noiseless, the pump and signal beams going into the amplifier have to remain stable with respect to each other to within a small fraction of a wavelength so that the beams interfere and add up properly. Such a condition on the beams makes it harder to keep them aligned and stable than for the more common "phase insensitive" amplifiers.
According to NIST physicist Paul Lett, this technique can amplify images by a factor of up to 4.6.
"The light we use is infrared, which is good for biological and astronomical imaging," says Lett. "Now we just need to show that our technique amplifies the image faithfully, pixel by pixel, so that we can be assured that it is fully practicable."
* N.V. Corzo, A.M. Marino, K.M. Jones and P.D. Lett. Noiseless optical amplifier operating on hundreds of spatial modes. Physical Review Letters. Published online July 26, 2012.
Media Contact: Mark Esser, email@example.com, 301-975-8735
NIST Shows New Device Could Improve Fiber-Optic Quantum Data Transmission
Tests performed at the National Institute of Standards and Technology (NIST) show that a new method for splitting photon beams could overcome a fundamental physical hurdle in transmitting electronic data. These results* could lead to commercial systems that can help safeguard the transfer of sensitive information.
The findings confirm that a prototype device developed with collaborators at Stanford University can double the amount of quantum information that can be sent readily through fiber-optic cables, and in theory could lead to an even greater increase in the rate of this type of transmission.
Conventional fiber-optic systems, in use for decades, transmit data as a series of light pulses—just a step up from Morse code. Such pulse streams can be intercepted by third parties undetectably. But the photons themselves can carry data, encoded in their quantum states. Because any attempt to intercept that data alters the quantum state, eavesdroppers can always be detected.
While information scientists have found a way to encode photon quantum states successfully, practical systems need the photons to be at wavelengths compatible with both existing optical-fiber networks and single-photon sensitive (and economically viable) silicon detectors. Unfortunately, these wavelengths are different.
One potential solution is to change the photons' wavelength from the infrared—desirable for fiber networks—to the visible spectrum so a silicon detector can "see" them. A method of doing so is to mix the information-carrying photons with a second photon beam. The information-carrying photons absorb this second beam's additional energy and get kicked up from the infrared region of the spectrum to the wavelength of visible red light, which silicon sensors can detect. However, silicon single-photon detectors cannot operate very fast, which puts limitations on data rates and ultimately, their usefulness for quantum information transmission systems.
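The energy bookkeeping of that conversion is simple: the output photon carries the combined energy of the signal and pump photons, so the output wavelength follows from 1/λ_out = 1/λ_signal + 1/λ_pump. A sketch with the paper's 1.3-micrometer signal band and an assumed pump wavelength (the article does not state the pump value):

```python
# Sum-frequency upconversion: photon energies (proportional to 1/wavelength) add.
lam_signal = 1.30e-6   # telecom-band, information-carrying photons
lam_pump = 1.55e-6     # assumed pump wavelength, for illustration only

lam_out = 1.0 / (1.0 / lam_signal + 1.0 / lam_pump)
print(round(lam_out * 1e9))   # ~707 nm: visible red, detectable by silicon
```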
"The limiting factor up until this point has been the detector speed," says Paulina Kuo, a scientist with NIST's Applied and Computational Mathematics Division. "Researchers would like a way around this issue, as it stands in the way of quantum information-based security innovations."
The heart of the newly developed device is a new crystal that goes beyond converting the wavelength of the photons. Designed and fabricated by Stanford's Jason Pelc, the crystal is capable of splitting the beam of infrared, information-carrying photons into two distinct beams of slightly different color, and directing the different-colored photons to different outputs. Controlling the flow to either output allows the team to use two "slow" detectors in place of one, thereby doubling the overall system speed.
NIST tests showed that this innovation allows twice as much data to be sent in a single beam, and Kuo says that the photons conceivably can be split not just into two, but several different beams.
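One way to see why splitting the beam helps: a single-photon detector is "blind" for a short dead time after each click. Routing alternate photons to two detectors halves the load on each. The toy model below is an illustration of that counting argument, not the experiment's actual wavelength-based channel assignment:

```python
def detected(arrival_times, dead_time, n_detectors=1):
    """Count clicks when photons are routed round-robin to detectors,
    each blind for `dead_time` after a click (toy dead-time model)."""
    last_click = [float('-inf')] * n_detectors
    count = 0
    for i, t in enumerate(arrival_times):
        d = i % n_detectors                 # which detector sees this photon
        if t - last_click[d] >= dead_time:  # detector has recovered: click
            last_click[d] = t
            count += 1
    return count

photons = [i * 1.0 for i in range(100)]     # evenly spaced arrivals
print(detected(photons, dead_time=1.5, n_detectors=1))  # 50: half missed
print(detected(photons, dead_time=1.5, n_detectors=2))  # 100: none missed
```

With arrivals faster than one detector can recover, a single detector misses every other photon, while two detectors sharing the stream catch them all, doubling the usable count rate.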
"We first demonstrated this concept last year,** but with this new device, the technique can be scaled up, meaning that in theory, we can significantly increase the amount of information that can be sent," she says. "We hope this is a potential solution to the detector problem."
* J.S. Pelc, P.S. Kuo, O. Slattery, L. Ma, X. Tang and M.M. Fejer. Dual-channel, single photon upconversion detector at 1.3 micrometers. Optics Express. V. 20 No. 17. Published Aug. 3, 2012.
** L. Ma, J.C. Bienfang, O. Slattery and X. Tang. Up-conversion single-photon detector using multi-wavelength sampling techniques. Optics Express Vol. 19, No. 6. Published Mar. 14, 2011.
Media Contact: Chad Boutin, firstname.lastname@example.org, 301-975-4261
Long-Predicted Fluctuations in Cell Membranes Observed for First Time
A long-standing mystery in cell biology may be closer to a solution thanks to measurements taken at the National Institute of Standards and Technology (NIST) and France's Institut Laue-Langevin (ILL), where scientists have observed changes in the thickness of a model cell membrane for the first time. The findings, which confirm that long-predicted fluctuations occur in the membranes, may help biologists understand many basic cellular functions, including how membranes form pores.
Every cell in your body is surrounded by a cell membrane, a thin, flexible wall made of fatty molecules that maintains the integrity of the nucleus and the rest of the cell's interior. Cells need a way to take in nutrients and expel waste across the membrane, and generally this involves lodging special proteins in the membrane. These proteins form holes that can open and close, acting as gateways to the interior.
Before these proteins take their place in the membrane, they float freely about the cell's protoplasm. But just how the membrane—whose job, after all, is to form an otherwise impermeable barrier—allows these proteins to penetrate it in the first place is largely a mystery, though one clue might lie in its dynamic nature.
"The cell membrane is not a static barrier. It's always moving, its thickness fluctuating and waves rippling through it," says Michihiro Nagao of the NIST Center for Neutron Research (NCNR). "Some theories indicate that if a protein is near the interior of the membrane when it is moving in just the right way, this movement might allow the protein to work its way in somehow."
The research team constructed a set of artificial membranes and analyzed their movement with a neutron spin echo spectrometer, a highly specialized instrument of which only a few exist worldwide. After a lengthy measurement effort, the team found that when warmed to around body temperature, the membrane thickness fluctuated by up to 8 percent roughly every 100 nanoseconds, or about 30 times slower than for comparable nonbiological sheets.
"Some theories indicate that some form of motion like this must be happening for pores to form, so it's exciting to actually see them," says Paul Butler, also of the NCNR.
It will take time to understand completely the cause of the fluctuations, why they are so slow, and how they enable protein insertion, but Butler points out that knowledge of the speed and size of the fluctuations will be helpful in designing therapies to control dysfunction in membrane permeability, including the creation of undesirable pores that lead to cell death.
"This research gives us a tool with which we can measure the effect of potential therapeutic agents on the thickness fluctuations," Butler adds.
The operation of the instrument at NIST is funded in part by the National Science Foundation.
* A.C. Woodka, P.D. Butler, L. Porcar, B. Farago and M. Nagao. Lipid bilayers and membrane dynamics: Insight into thickness fluctuations. Physical Review Letters. Vol. 109, Issue 5. Aug. 3, 2012. DOI: 10.1103/PhysRevLett.109.058102.
Media Contact: Chad Boutin, email@example.com, 301-975-4261
NIST Focuses on Testing Standards to Support Lab on a Chip Commercialization
Lab on a chip (LOC) devices—microchip-size systems that can prepare and analyze tiny fluid samples with volumes ranging from a few microliters (millionths of a liter) to sub-nanoliters (less than a billionth of a liter)—are envisioned to one day revolutionize how laboratory tasks such as diagnosing diseases and investigating forensic evidence are performed. However, a recent paper* from the National Institute of Standards and Technology (NIST) argues that before LOC technology can be fully commercialized, testing standards need to be developed and implemented.
"A testing standard," explains NIST physical scientist and paper author Samuel Stavis, "defines the procedures used to determine if a lab on a chip device, and the materials from which it is made, conform to specifications." Standardized testing and measurement methods, Stavis writes, will enable MEMS (microelectromechanical systems) LOC manufacturers at all stages of production—from processing of raw materials to final rollout of products—to accurately determine important physical characteristics of LOC devices such as dimensions, electrical surface properties, and fluid flow rates and temperatures.
To make his case for testing standards, Stavis focuses on autofluorescence, the background fluorescent glow of an LOC device that can interfere with the analysis of a sample. Stavis states that multiple factors must be considered in the development of a testing standard for autofluorescence, including: the materials used in the device, the measurement methods used to test the device and how the measurements are interpreted. "All of these factors must be rigorously controlled for, or appropriately excluded from, a meaningful measurement of autofluorescence," Stavis writes.
Quality control during LOC device manufacturing, Stavis says, may require different tests of autofluorescence throughout the process. "There may be one measure of autofluorescence from the block of plastic that is the base material for a chip, another once the block has been fashioned into the substrate in which the functional components are embedded, and yet another as the final device is completed," Stavis says. "To manufacture lab on a chip devices with reliably low autofluorescence, accurate measurements may be needed at each stage."
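A stage-by-stage quality-control check along those lines might record an autofluorescence reading at each manufacturing step and compare it, after blank subtraction, against a stage-specific limit. All names and numbers below are invented for illustration; the paper proposes no specific procedure:

```python
def autofluorescence_qc(readings, instrument_blank, limits):
    """Return the manufacturing stages whose blank-corrected
    autofluorescence exceeds the allowed limit (hypothetical QC sketch)."""
    failures = []
    for stage, signal in readings.items():
        corrected = signal - instrument_blank   # remove the instrument's own background
        if corrected > limits[stage]:
            failures.append(stage)
    return failures

# Invented readings and limits at three stages of fabrication.
readings = {"raw_block": 11.0, "substrate": 11.0, "final_device": 4.0}
limits = {"raw_block": 10.0, "substrate": 8.0, "final_device": 5.0}
print(autofluorescence_qc(readings, instrument_blank=2.0, limits=limits))
# ['substrate']: only that stage exceeds its limit
```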
Stavis also emphasizes that it is important not to confuse testing standards with product standards, and to understand how the former facilitates the latter. "A product standard specifies the technical requirements for a lab on a chip device to be rated as top quality," he says. "A testing standard is needed to measure those specifications, as well as to make fair comparisons between competing products."
* Stavis, S.M. A glowing future for lab on a chip testing standards. Lab on a Chip (2012), DOI: 10.1039/c2lc40511c
Media Contact: Michael E. Newman, firstname.lastname@example.org, 301-975-3025
Updated NIST Guide is a How-To for Dealing With Computer Security Incidents
The National Institute of Standards and Technology (NIST) has published the final version of its guide for managing computer security incidents. Based on best practices from government, academic and business organizations, this updated guide includes a new section expanding on the important practice of coordination and information sharing among agencies.
Government agencies face daily threats to their computer networks. The Federal Information Security Management Act requires government agencies to establish incident response competencies, and NIST researchers revised the guidance in Computer Security Incident Handling Guide to cover challenges related to today's evolving threats.
During the chaotic first minutes when a computer system is under attack, having a well-prepared incident response plan to follow ensures that steps such as alerting other agencies or law enforcement occur in the correct order.
The revised NIST guide provides step-by-step instructions for new, or well-established, incident response teams to create a proper policy and plan. NIST recommends that each plan should have a mission statement, strategies and goals, an organizational approach to incident response, metrics for measuring the response capability, and a built-in process for updating the plan as needed. The guide recommends reviewing each incident afterward to prepare for future attacks and to provide stronger protections of systems and data.
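The recommended plan elements amount to a checklist, which could be expressed in code as a simple completeness test. The structure and field names below are illustrative, paraphrased from the article rather than taken from the guide itself:

```python
# Elements NIST recommends every incident response plan include
# (paraphrased from the article; names are illustrative).
REQUIRED_ELEMENTS = {
    "mission_statement",
    "strategies_and_goals",
    "organizational_approach",
    "response_metrics",
    "update_process",
}

def missing_elements(plan):
    """Return the required plan elements not yet present in a draft plan."""
    return REQUIRED_ELEMENTS - plan.keys()

draft_plan = {
    "mission_statement": "Protect agency systems and data.",
    "strategies_and_goals": "Contain incidents within one hour.",
}
print(sorted(missing_elements(draft_plan)))
# ['organizational_approach', 'response_metrics', 'update_process']
```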
"This revised version encourages incident teams to think of the attack in three ways," explains co-author Tim Grance. "One is by method—what's happening and what needs to be fixed. Another is to consider an attack's impact by measuring how long the system was down, what type of information was stolen and what resources are required to recover from the incident. Finally, share information and coordination methods to help your team and others handle major incidents."
A draft version of the guide covered agencies sharing and coordinating information, but public comments called for more detailed information in this area, and the authors added a section on this topic to meet the requests. The guidance suggests that information about threats, attacks and vulnerabilities can be shared by trusted organizations before attacks so each organization can learn from others. By reaching out to the trusted group during an attack, one of the partners may recognize the unusual activity and make recommendations to quash the incident quickly. Also, some larger agencies with greater resources may be able to help a smaller agency respond to attacks.
The guide provides recommendations for agencies to consider before adding coordination and information sharing to the incident response plan, including how to determine what information is shared with other organizations and consulting with legal departments.
The final edition of Computer Security Incident Handling Guide (NIST Special Publication 800-61, Rev. 2) is available at http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-61r2.pdf.
Media Contact: Evelyn Brown, email@example.com, 301-975-5661
Steering Group for Identity Ecosystem Hosts First Meeting
The Identity Ecosystem Steering Group Kickoff Meeting to support the National Strategy for Trusted Identities in Cyberspace (NSTIC) will be held Aug. 15 and 16, 2012, in Chicago, Ill.
In April 2011, President Obama signed the strategy, calling for the public and private sectors to collaborate on the creation of an “Identity Ecosystem” where individuals can choose from multiple identity providers and digital credentials for more convenient, secure, and privacy-enhancing online transactions.
The private-sector-led Identity Ecosystem Steering Group will provide an open process for organizations to participate in development of the ecosystem. The group’s goal will be to craft a framework for identity solutions that can replace passwords, allow individuals to prove online that they are who they claim to be, and enhance privacy. The group will also devise an accreditation process for Identity Ecosystem participants and certify that accreditation authorities validate adherence to the requirements of the framework.
The NSTIC Program Office, hosted by the National Institute of Standards and Technology (NIST), has awarded a grant to Trusted Federal Systems to serve as the steering group secretariat for an initial 24-month period. During that time, the group is expected to develop plans to become self-sustaining.
Membership in the steering group is open to all interested parties, with categories for “participating” or “observing” membership. Participating members take an active role in the steering group and in the work of the plenary, its standing committees and working groups. Members can join as representatives of an organization or as individuals. The full steering group is expected to meet at least twice a year.
The workshop will be held at the Donald E. Stephens Convention Center in Chicago. Attendance is free and open to all; however, registration is required to participate in the formal proceedings. Visit http://ow.ly/cNkiZ to register in advance and to find information on participating via webcast for those who cannot attend in person. Attendees may also register onsite.
Media Contact: Jennifer Huergo, firstname.lastname@example.org, 301-975-6343
Four at NIST Honored with Flemming Awards
Four National Institute of Standards and Technology (NIST) employees were among the 12 recipients of the annual Arthur S. Flemming Award that recognizes outstanding service to the federal government by individuals with three to 15 years of federal service.
Craig Brown, a research chemist in the NIST Center for Neutron Research, is recognized "for pioneering contributions to the understanding of new materials suited for hydrogen energy storage in next-generation clean automobiles. His work and findings address one of the largest obstacles in the road to the hydrogen economy—the development of safe, practical storage systems that operate at room temperature."
Elizabeth Gentry, Metric Program coordinator in the NIST Physical Measurement Laboratory (PML) Office of Weights and Measures, is recognized "for exceptional leadership in serving as the national focal point for voluntary conversion to the metric system. She led an effort to persuade states to amend their laws and regulations to permit manufacturers and retailers to voluntarily use metric units on their packaging. At the same time, she worked to ensure that the laws of other countries continue allowing current U.S. labeling while the transition occurs."
Nathan Newbury, a physicist in PML's Quantum Electronics and Photonics Division, is recognized for having "invented and applied fiber-laser frequency combs to address challenging research problems involving subhertz optical spectroscopy, high-frequency metrology, nanometer-precision distance ranging, and ultrahigh-bandwidth communications. His research with fiber-laser frequency combs is being replicated worldwide and has potential contributions to advances in precision timekeeping, climate-change science, and precision manufacturing."
Till Rosenband, a physicist in the PML Time and Frequency Division, is recognized for having developed "the world's most accurate atomic clock, with an uncertainty equivalent to one second in 4 billion years. The clock can be used for exquisitely sensitive measurements of gravity, motion, and other quantities, exploiting the ticking rate to make a new class of sensors, which can be used in mineral exploration, inertial navigation, and new ultraprecise measurements of fundamental physics constants."
For the past 63 years, the Flemming Awards have recognized outstanding men and women in all branches of the federal government. Presented at George Washington University's Marvin Center on June 4, 2012, the awards were first established in 1948 in honor of the commitment of Arthur S. Flemming (1905-1996) to public service throughout his distinguished career, which spanned seven decades and 11 presidencies. The awards are given in three categories: Applied Science, Engineering and Mathematics; Research; and Managerial or Legal Achievement. Awardees are selected by the Arthur S. Flemming Awards Commission and a panel of judges.
Ed.: Verb changed in fifth paragraph to more accurately reflect nature of advance. Aug. 13, 2012.
Media Contact: Richard Wilkinson, email@example.com, 301-975-5040