
Tech Beat - February 7, 2012


Editor: Michael Baum
Date created: February 7, 2012
Date Modified: February 7, 2012 
Contact: inquiries@nist.gov

NIST Announces Program to ID Human Cell Lines for Research

The National Institute of Standards and Technology (NIST) has announced that it is launching a project to collect and catalog DNA identification data for up to 1,500 human cell lines used in biological and medical research. In a notice posted in the Feb. 3, 2012, Federal Register, NIST called for voluntary contributions of cell lines to be cataloged in the project.

Cellular fingerprint: This electropherogram demonstrates the STR (short tandem repeat) technique NIST will use to identify human cell lines. The chart shows (in grey) specific locations, or loci, on the DNA chain of a human chromosome that are known to harbor short repeating sequences of DNA bases of varying lengths. The peaks count the actual number of such repeats at each locus. If the STRs at a sufficient number of loci are counted this way—NIST uses eight loci—the chances of a random match are approximately 100 million to one.
Credit: Kline/NIST

The data will be collected in a publicly accessible database hosted by the National Center for Biotechnology Information (NCBI), a division of the National Library of Medicine of the National Institutes of Health.

“Immortalized” human cell lines are laboratory cultures of cells that have been induced to continue growing and replicating. They are widely used in pharmaceutical, biomedical and biotechnology research, multiplied and divided, passing from lab to lab and country to country. The oldest such cell line is the so-called HeLa line, originally derived from cervical cancer cells. That line dates to 1951.

The biomedical research community has become increasingly concerned about mix-ups, cross contamination and misidentification in widely used cell lines—problems that potentially could invalidate research results. The problem was highlighted by the work of University of California researcher Walter Nelson-Rees, who in a series of papers in the 1970s documented extensive misidentification of cell cultures contaminated with cells from the HeLa line. Studies since then have demonstrated that the problem is, if anything, getting worse. In one survey, the German cell line repository Deutsche Sammlung von Mikroorganismen und Zellkulturen (DSMZ) found that 18 percent of human cancer cell lines sampled were misidentified.

A key problem to date has been the lack of a convenient, reliable method by which research groups can validate the identity of their lines. The NIST project seeks to remedy that by building a database of cell lines that are reliably identified by profiling DNA markers called short tandem repeats (STRs)—the same technique used in criminal forensics to match DNA samples. The profile analyzes nuclear DNA from the cells for STRs—short sequences of DNA bases that are repeated from two to six times in a row—at eight specific sites on the molecule. It also checks a gene to determine the sex of the cell line’s donor. The probability that two unrelated cells will have matching profiles is approximately 1 in 100 million.
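The arithmetic behind that 1-in-100-million figure can be sketched in a few lines: under an independence assumption, the random-match probability of a multi-locus profile is the product of the per-locus genotype frequencies. This is purely illustrative; the eight frequencies below are hypothetical placeholders, not NIST's data, and real forensic calculations also correct for population substructure.

```python
# Illustrative sketch only -- the per-locus genotype frequencies here
# are hypothetical placeholders, not measured values.
from math import prod

# Hypothetical genotype frequency at each of eight STR loci
genotype_freqs = [0.12, 0.10, 0.09, 0.11, 0.08, 0.13, 0.10, 0.09]

# Assuming the loci are independent, multiply the per-locus frequencies
match_probability = prod(genotype_freqs)

print(f"Random-match probability: 1 in {1 / match_probability:,.0f}")
```

For these placeholder values the product works out to roughly 1 in 90 million, which illustrates why eight moderately informative loci are enough to reach the order of magnitude quoted in the article.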

STR profiling offers several advantages for identifying cell lines, in addition to being highly discriminating, according to NIST experts. It’s a relatively simple procedure for a cell biology lab to run; the costs are low, particularly because STR profiling kits developed for the forensic community are readily available; and the results can be summarized as numeric values and made widely available through a public-access database such as the one hosted by NCBI.

Information on cell lines in the database will include various descriptors such as the cell line name; the tissue of origin; morphology; pathologic or disease state; details of the growth culture; the STR markers and procedures used in identification; and the STR profile of the cell line.

NIST will accept up to 15 candidate cell lines from submitters on a first-come, first-served basis. No cell lines grown on nonhuman feeder cells will be accepted due to the possibility of cross-species contamination. Submitters must bear the cost of shipping the cell samples or DNA extracts to NIST. NIST will pay for the STR profiling, subject to the availability of funds.

Full details of the program are available in the NIST Federal Register notice, “Identification of Human Cell Lines Project” [Docket No. 120104006–2006–01], available at http://www.gpo.gov/fdsys/pkg/FR-2012-02-03/pdf/2012-2459.pdf. Information on the project as it progresses will be available on the website of the NIST Applied Genetics Group at http://www.nist.gov/mml/biochemical/genetics/index.cfm.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


Unusual 'Collapsing' Iron Superconductor Sets Record for Its Class

A team from the National Institute of Standards and Technology (NIST) and the University of Maryland has found an iron-based superconductor that operates at the highest known temperature for a material in its class.* The discovery inches iron-based superconductors—valued for their ease of manufacturability and other properties—closer to being useful in many practical applications.

When calcium atoms (yellow spheres) in these iron-based crystals (left) are replaced on some occasions with praseodymium (blue sphere in right image), the crystals are able to superconduct at up to 47 K, but the crystals can also collapse, shrinking by about 10 percent in size. Adding a sufficient amount of praseodymium is necessary to avoid the collapse, which compromises the material's usability in electronics applications.
Credit: NIST

Iron-based superconductors, which were discovered only about four years ago, are a hot research topic, in part because they are more amenable to commercial applications than copper-based superconductors, which are more difficult to make and are frequently brittle. Of the four broad classes of iron-based superconductors, the 1:2:2 class—so named because their crystals are built around a hub of one atom of calcium, two of iron and two of arsenic—is particularly promising because these superconductors’ properties can be custom-tailored by substituting other atoms for these basic elements.

Magnets made with low-temperature superconductors have already found use in hospital MRI machines, but less expensive MRI machines and other applications, such as superconducting cables for resistance-free power transmission over long distances, become closer to reality the more choices manufacturers have among superconductors.

Working at the NIST Center for Neutron Research (NCNR) and the University of Maryland, the team found that a particular type of 1:2:2 superconductor possesses some unexpected properties. Of perhaps greatest value to manufacturers is that its threshold temperature of superconductivity is 47 kelvins (K), the highest yet for the 1:2:2 class, whose previous record was 38 K.

But the crystal also has a highly curious property: It can superconduct at this record temperature when a smaller atom is substituted for the crystal’s original calcium in some of its hubs, and when this substitution is performed, the overall crystal actually shrinks by about 10 percent, a dramatic size change. “It’s almost like what would happen if you cut off a few inches from the bottom of your chair’s legs,” says the NCNR’s Jeff Lynn. “The crystal just collapses. The change is quite visible in neutron scans.”

This effect is likely one that manufacturers will want to avoid. But Lynn says the group’s research has determined how to make the substitution while eluding the collapsed state altogether, so that as it is cooled, the potential mechanical instabilities associated with the collapse are sidestepped. “This understanding should enable manufacturers to use the superconductor in electronic devices,” he says.

* S.R. Saha, N.P. Butch, T. Drye, J. Magill, S. Ziemak, K. Kirshenbaum, P.Y. Zavalij, J.W. Lynn and J. Paglione. Structural collapse and superconductivity in rare-earth-doped CaFe2As2. Physical Review B. Published Jan. 13, 2012. DOI: 10.1103/PhysRevB.85.024525.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261


Good Timing: NIST/CU Collaboration Adds Timing Capability to Living Cell Sensors

Individual cells modified to act as sensors using fluorescence are already useful tools in biochemistry, but now they can add good timing to their resumé, thanks in part to expertise from the National Institute of Standards and Technology (NIST).

Micrograph of human cells modified to act as metal sensors. Researchers at JILA built hardware to help time the dynamic states of these sensors, which change fluorescence color when they bind metal ions. The cells are false-colored to indicate the extent of their reactions; the darker cells have bound the most metal ions.
Credit: Hairong Ma/JILA

With the added capability to track the timing of dynamic biochemical reactions, cell sensors become more useful for many studies, such as measurements of protein folding or neural activity.

As described in the Journal of the American Chemical Society,* a NIST biophysicist working at JILA and a collaborator at the University of Colorado Boulder (CU) developed a microfluidic system that records biochemical reactions over a time span of milliseconds to seconds in living human cells modified to act as FRET (fluorescence resonance energy transfer) sensors.

The fast, flexible system uses lasers to measure sensor signals at two points in time at a rate of up to 15 cells per second. Statistical data, such as the average value of the FRET response for thousands of cells, can be collected in minutes.

"Our system is the first one that measures FRET response times at the single-cell level, while at the same time measuring over many cells," says JILA Fellow Ralph Jimenez, whose research group built the optics, microfluidics, electronics and other hardware.

JILA is a joint institute of NIST and CU. Jimenez is collaborating with Amy Palmer, an assistant professor in CU's Department of Chemistry and Biochemistry, who handled the molecular design and cell-biology aspects of the project.

The FRET technique relies on reactions that occur between large biological molecules in close proximity to each other. One molecule absorbs light energy from a laser and transfers this energy to the nearby acceptor molecule. The acceptor molecule then releases this energy as light (fluorescence) at a characteristic wavelength that is different from the original laser light. Measurements of this fluorescence indicate the extent of the energy transfer. FRET can be used to study many types of cellular processes. In these experiments, the researchers were interested in the type and concentration of metal ions within cells, which can affect important cell processes. The JILA/CU experiments used cells genetically modified to take up particular metal ions and signal changes in their concentrations by altering the FRET signals.
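The reason FRET works as a molecular-scale ruler is the well-known Förster relation, in which transfer efficiency falls off with the sixth power of the donor-acceptor distance: E = 1/(1 + (r/R0)^6). A minimal sketch, assuming a typical Förster radius R0 of 5 nm (an order-of-magnitude placeholder, not the value for the JILA/CU sensors):

```python
# Förster transfer efficiency vs. donor-acceptor distance.
# The 5 nm Förster radius is a typical order of magnitude for
# fluorescent-protein pairs, not a measured value for these sensors.

def fret_efficiency(r_nm: float, r0_nm: float = 5.0) -> float:
    """Transfer efficiency E = 1 / (1 + (r/R0)^6)."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

for r in (2.5, 5.0, 10.0):
    print(f"r = {r:4.1f} nm -> efficiency = {fret_efficiency(r):.3f}")
```

The steep sixth-power falloff is what makes the signal so sensitive to conformational changes: halving the distance pushes the efficiency near 1, while doubling it pushes the efficiency near 0.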

The researchers made a microfluidic device with a flow-control valve system that mixes cells and metal-containing chemicals in just a few milliseconds. The cells then pass single file through two blue laser beams that excite the FRET fluorescence signal at different locations in the device. With precise flow control and flexible device design, cell travel time between the two locations can be varied from 1 millisecond to 10 seconds. Scientists measure the FRET signal changes within individual cells between the two locations.

"FRET is an important measurement technique used in bio-imaging, so it's great that NIST could begin to contribute to measurements of the fidelity of FRET-based sensors," Jimenez says. "We have a lot more work planned for the future with this instrument."

The project is part of the research team's effort to develop cell sensors with improved optical, physical and chemical properties and to enable detection of very faint signals in living cells. The work was supported in part by a CU-NIST seed grant, the National Institutes of Health and the National Science Foundation.

* H. Ma, E.A. Gibson, P.J. Dittmer, R. Jimenez and A.E. Palmer. High-throughput examination of FRET-detected metal-ion response in mammalian cells. Journal of the American Chemical Society (JACS). Published online Jan. 19, 2012. (Communication) DOI: 10.1021/ja2101592.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Charter Service: Encasing the Magna Carta

Credit: NARA video

You often hear about the Framers of the Constitution, but not so much the framers of the Magna Carta. They work for the National Institute of Standards and Technology (NIST).

Not the authors, of course; they’ve been dead 700 years. But a NIST engineering team, at the behest of the National Archives, designed and built a state-of-the-art encasement and transport cart to protect the Archives’ prized copy of the 1297 Magna Carta. Their work—and the freshly conserved Magna Carta—was on display Feb. 2, 2012, at a special “behind-the-scenes” showing at the National Archives Conservation Lab. The enclosure is designed to visually enhance the parchment document while maintaining an interior environment that does not degrade the document, a foundational text of Western civilization and U.S. law.

The encasement is basically a controlled environment, explained NIST Project Engineer Jay Brandenburg, who regularly does similar tasks to isolate sensitive lab equipment.

The first Magna Carta was signed in 1215 by King John of England. He was forced by an assembly of barons to put in writing, for the first time, the traditional rights and liberties of the country’s free persons. After another confrontation with barons, Edward I not only reissued the Magna Carta in 1297, but for the first time, it was entered into the official Statute Rolls of England and became the foundation of English law.

The Magna Carta rests in its argon-filled NIST encasement at the Archives Conservation lab.
Credit: Hill/NARA

The owner of this copy of the Magna Carta, David M. Rubenstein, loaned the document to the National Archives and paid for its restoration and encasement.

While the Archives refurbished the parchment, NIST engineers and craftspeople built a platform to hold the Magna Carta, the large encasement it sits in and a heavy-duty cart.

NIST worked from a three-dimensional laser scan of the document to support it on the platform and to create a nest to hold the original wax seal with Edward I’s likeness, which is attached to the Magna Carta by a frail parchment ribbon.

The platform was created from a single 6-inch-thick block of aluminum to minimize the number of joints or spots that could cause leaks in the encasement, explained Brandenburg. About 90 percent of the block was cut away with a computer-controlled milling machine based on the three-dimensional image to create a perfect fit.

The end result is an enclosure about 41 inches wide by 28 inches long and 6 inches deep. It weighs 225 pounds. The encasement cover is made of a special laminated glass with antireflective coatings to ensure maximum visibility of the document while protecting it. The encasement is sealed with close-fitting bolts that hold the frame against double O-rings that create the encasement seal. The case was filled with argon gas and will be monitored to avoid as much oxidation damage as possible.

The enclosure will likely never be seen in its entirety again. By mid-February it will be placed inside the new interactive display in the West Rotunda Gallery of the U.S. National Archives Building in Washington, D.C. There, alongside three other documents for which NIST built similar enclosures—the Declaration of Independence, the Constitution and the Bill of Rights—it will be on display for the 1 million visitors that pass through the Archives each year.

To learn more about previous NIST work on document enclosures, visit “For Posterity: NIST Helps to Preserve the ‘Charters of Freedom’” at www.100.nist.gov/Charter/charters_of_freedom_project.htm.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Provides Octagonal Window of Opportunity for Carbon Capture

Filtering carbon dioxide, a greenhouse gas, from factory smokestacks is a necessary but expensive part of many manufacturing processes. However, a collaborative research team from the National Institute of Standards and Technology (NIST) and the University of Delaware has gathered new insight into the performance of a material called a zeolite that may stop carbon dioxide in its tracks far more efficiently than current scrubbers do.*

The roughly octagonal pores in zeolite SSZ-13 are like stop signs for carbon dioxide, capturing molecules of the greenhouse gas while apparently letting other substances through. The material could prove to be an economical smokestack filter.

Zeolites are highly porous rocks—think of a sponge made of stone—and while they occur in nature, they can be manufactured as well. Their toughness, high surface area (a gram of zeolite can have hundreds of square meters of surface in its myriad internal chambers) and ability to be reused hundreds of times make them ideal candidates for filtering gas mixtures. If an unwanted molecule in the gas mixture is found to stick to a zeolite, passing the mixture through it can scrub the gas of many impurities, so zeolites are widely used in industrial chemistry as catalysts and filters.

The team explored a zeolite created decades ago in an industrial lab and known by its technical name, SSZ-13. This zeolite, which has octagonal “windows” between its interior pore spaces, is special because it seems highly capable of filtering out carbon dioxide (CO2) from a gas mixture. “That makes SSZ-13 a promising candidate for scrubbing this greenhouse gas out of such things as factory smokestacks,” says Craig Brown, a researcher at the NIST Center for Neutron Research (NCNR). “So we explored, on an atomic level, how it does this so well.”

Using neutron diffraction, the team determined that SSZ-13’s eight-sided pore windows are particularly good at attracting the long, skinny carbon dioxide molecules and holding onto their “positively charged” central carbon atoms, all the while allowing other molecules with different shapes and electronic properties to pass by unaffected. Like a stop sign, each pore halts one CO2 molecule—and each cubic centimeter of the zeolite has enough pores to stop 0.31 grams of CO2, a quantity that makes SSZ-13 highly competitive when compared to other adsorbent materials.
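As a rough sanity check on that 0.31-gram figure, the capacity can be converted to moles and to an equivalent volume of CO2 gas at standard conditions. The sketch below assumes ideal-gas behavior at 0 °C and 1 atm:

```python
# Back-of-the-envelope conversion of the quoted capacity:
# 0.31 g of CO2 adsorbed per cubic centimeter of SSZ-13.

CO2_MOLAR_MASS = 44.01      # g/mol
MOLAR_VOLUME_STP = 22414.0  # cm^3/mol, ideal gas at 0 C and 1 atm

capacity_g = 0.31                             # g CO2 per cm^3 of zeolite
capacity_mol = capacity_g / CO2_MOLAR_MASS    # moles per cm^3
gas_volume = capacity_mol * MOLAR_VOLUME_STP  # cm^3 of CO2 gas

print(f"{capacity_mol * 1000:.1f} mmol CO2 per cm^3 of zeolite")
print(f"~{gas_volume:.0f} cm^3 of CO2 gas captured per cm^3 of zeolite")
```

In other words, each cubic centimeter of the material soaks up on the order of 150 times its own volume in CO2 gas, which is why the adsorption density is competitive.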

Brown says a zeolite like SSZ-13 probably will become a prime candidate for carbon scrubbing because it also could prove more economical than other scrubbers currently used in industry. SSZ-13’s ability to attract only CO2 could mean its use would reduce the energy demands of scrubbing, which can require up to 25 percent of the power generated in a coal or natural gas power plant.

“Many industrial zeolites attract water and carbon dioxide, which are both present in flue exhaust—meaning both molecules are, in a sense, competing for space inside the zeolite,” Brown explains. “We suspect that this novel CO2 adsorption mechanism means that water is no longer competing for the same site. A zeolite that adsorbs CO2 and little else could create significant cost savings, and that’s what this one appears to do.”

Brown says his team is still collecting data to confirm this theory, and that their future efforts will concentrate on exploring whether SSZ-13 is equally good at separating CO2 from methane—the primary component of natural gas. CO2 is also released in significant quantities during gas extraction, and the team is hopeful SSZ-13 can address this problem as well.

* M.R. Hudson, W.L. Queen, J.A. Mason, D.W. Fickel, R.F. Lobo and C.M. Brown. Unconventional, highly selective CO2 adsorption in zeolite SSZ-13. Journal of the American Chemical Society. Published on the Web Jan. 10, 2012. DOI: 10.1021/ja210580b.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261


New NIST 'Cell Assay on a Chip': Solid Results from Simple Means

The great artist and inventor Leonardo da Vinci once said that “simplicity is the ultimate sophistication.” National Institute of Standards and Technology (NIST) research engineer Javier Atencia certainly believes in the wisdom of what da Vinci preached; he has a reputation for creating novel microfluidic devices out of ordinary, inexpensive components. This time, he has combined a glass slide, plastic sheets and double-sided tape into a “diffusion-based gradient generator”—a tool to rapidly assess how changing concentrations of specific chemicals affect living cells.*

Atencia’s latest innovation offers a simple and inexpensive way to expose an array of cultured cells to a chemical gradient—a solution in which the chemical concentration changes gradually and predictably across the array. Such gradients are a rapid, high-throughput way to evaluate a chemical’s effect on cell growth or its toxicity.

There are two distinct advantages to the new NIST system over traditional microfluidic cell assay devices. The first is that the gradient is created by diffusion—the gentle movement of matter from one point to another by random molecular motion. Conventional microfluidic systems usually mix fluids by pumping them in a circular motion or by twisting and folding them together. Diffusion greatly reduces the risk of cells being swept away or damaged by shearing forces in the test fluid.

The second big advantage is simplicity. The gradient generator is built in layers, with each section precisely positioned with an alignment tab. The base is a glass slide, upon which is attached a strip of double-sided tape cut to have a row of four micrometer-sized channels. Atop this goes a polystyrene strip cut to have two rows of four tiny circular “wells,” with each pair of wells aligned with the ends of the channel below it. The next layer is another strip of double-sided tape, this time with a Y-shaped canal cut into it to serve as the flow path for the chemical gradient. Finally, a Mylar strip cut to have an identical Y-canal serves as the cover.

NIST researchers have combined a glass slide, plastic sheets and double-sided tape to create an inexpensive and simple-to-build microfluidic device for exposing an array of cells to different concentrations of a chemical.
Credit: Cooksey/NIST

The hinged cover allows access to the wells for adding test cells. Once done, the cover is lowered and affixed, sealing the gradient generator. Fluid flow in and out of the system is accomplished using another Atencia innovation, magnetic connectors. Kept at constant pressure, this flow assures a steady-state stream through the device and creates a diffusion gradient in each buried channel. Cells in the channels are simultaneously exposed to a range of chemical concentrations from high to low.
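The idealization behind a diffusion-based gradient generator is that, at steady state, diffusion between a constant-concentration source and a sink produces a linear concentration profile along the connecting channel. A minimal sketch with arbitrary illustrative dimensions, not the device's actual geometry:

```python
# Steady-state 1-D diffusion between a source stream (c_hi) and a sink
# stream (c_lo) gives a linear profile -- the idealization behind a
# diffusion-based gradient generator.  Dimensions are illustrative only.

def steady_state_profile(x_um: float, length_um: float,
                         c_hi: float, c_lo: float) -> float:
    """Concentration at position x along the channel (x = 0 is the source end)."""
    return c_hi + (c_lo - c_hi) * (x_um / length_um)

length = 400.0  # hypothetical channel length, micrometers
for x in (0.0, 100.0, 200.0, 300.0, 400.0):
    c = steady_state_profile(x, length, c_hi=100.0, c_lo=0.0)
    print(f"x = {x:5.0f} um -> {c:5.1f}% of source concentration")
```

Because the gradient is set by diffusion rather than by bulk flow through the cell channel, cells sitting along the channel each see a fixed, position-dependent concentration without being subjected to shear.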

To test the device, Atencia and his colleagues loaded it with cells genetically engineered to produce large amounts of green fluorescent protein (GFP) and then introduced cycloheximide (CHX), a chemical that shuts down ribosomes, the cell’s protein factories. Cells exposed to the toxin quickly stop synthesizing GFP, decreasing fluorescence by an amount directly related to the concentration of CHX.

This is exactly what the researchers observed in the gradient generator assays. The cells were exposed three times to CHX, and each time, the level of GFP fluorescence increased as the concentration of CHX in the gradient decreased, and vice versa.
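The inverse relationship between CHX dose and GFP signal can be sketched with a standard Hill-type dose-response model. The model form and its parameters below are purely illustrative and are not fitted to the NIST data:

```python
# Hypothetical Hill-type dose-response sketch: GFP fluorescence falls
# as CHX concentration rises.  IC50 and Hill coefficient are
# placeholder values for illustration, not fitted parameters.

def relative_fluorescence(chx_conc: float, ic50: float = 1.0,
                          hill: float = 1.0) -> float:
    """Fraction of baseline GFP signal remaining at a given CHX dose."""
    return 1.0 / (1.0 + (chx_conc / ic50) ** hill)

for dose in (0.1, 1.0, 10.0):
    print(f"CHX = {dose:5.1f} (arb. units) -> "
          f"{relative_fluorescence(dose) * 100:.0f}% of baseline GFP")
```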

In his previous microfluidics creations, Atencia turned a simple plastic dish into a “microfluidic palette” for exposing cells to multiple chemical concentrations in a single chamber** and then merged donut-shaped magnets and plastic tubes to make a leak-free connector for getting fluids into and out of a microchannel.***

* J. Atencia, G.A. Cooksey and L.E. Locascio. A robust diffusion-based gradient generator for dynamic cell assays. Lab on a Chip, Vol. 12, Pages 309-315 (2012). DOI: 10.1039/C1LC20829B

** “‘Microfluidic Palette’ May Paint Clearer Picture of Biological Processes,” NIST Tech Beat, July 28, 2009. www.nist.gov/public_affairs/tech-beat/tb20090728.cfm#palette.

*** “Novel NIST Connector Uses Magnets for Leak-Free Microfluidic Devices,” NIST Tech Beat, Nov. 17, 2009. www.nist.gov/public_affairs/tech-beat/tb20091117.cfm#magnets.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


New Colors: JILA Scientists Confirm First 'Frequency Comb' to Probe Ultraviolet Wavelengths

Physicists at JILA have created the first “frequency comb” in the extreme ultraviolet band of the spectrum, high-energy light less than 100 nanometers (nm) in wavelength. Laser-generated frequency combs are the most accurate method available for precisely measuring frequencies, or colors, of light. In reaching the new band of the spectrum, the JILA experiments demonstrated for the first time a very fine mini-comb-like structure within each subunit, or harmonic, of the larger comb, drastically sharpening the measurement tool.

Artist’s conception of JILA’s extreme ultraviolet (EUV) frequency comb. The original light source is a pulsed infrared laser, which is used to create a train of attosecond-long pulse bursts at EUV wavelengths (the bright white spot in the distance). Each of the resulting “harmonics”—strong signals at regular fractions of the original infrared wavelength—has its own set of “teeth” marking individual frequencies (series of adjacent white lines in the foreground), creating a frequency comb within each harmonic. To prove the new structure exists, JILA scientists observed a tooth interacting with argon atoms, indicated by the glowing atom symbol in the center foreground.

Credit: Baxley/JILA

The new comb, described in the Feb. 2 issue of the journal Nature,* confirms and expands on the JILA group’s 2005 claim of the ability to generate extreme ultraviolet (EUV) frequencies for making precise measurements in that part of the electromagnetic spectrum. The new tool can aid in the development of “nuclear clocks” based on ticks in the nuclei of atoms, and measurements of previously unexplored behavior in atoms and molecules.

JILA is a joint venture of the National Institute of Standards and Technology (NIST) and the University of Colorado Boulder.

“Nobody doubted that the EUV frequency comb was there, it's just that nobody had seen it with real experimental proof,” says NIST/JILA Fellow Jun Ye, the group leader. “The new work provides the first experimental proof, and also really shows that one can now do science with it.”

Frequency combs are created with ultrafast pulsed lasers and produce a span of very fine, evenly spaced “teeth,” each a specific frequency, which can be used like a ruler to measure light. Frequency combs are best known for measuring visible and near-infrared light at wavelengths of about 400 to 1500 nm (frequencies of about 750 to 200 terahertz, or trillions of cycles per second), enabling development of next-generation atomic clocks.** In the past few years researchers at JILA, NIST and many other laboratories have pushed comb boundaries toward other regions of the electromagnetic spectrum.

To create the world’s first extreme ultraviolet (EUV) frequency comb, JILA scientists used a high-power laser to generate infrared light pulses that bounce back and forth and overlap in an optical cavity 154 million times per second (a frequency of 154 megahertz, or MHz). When xenon gas is injected into the cavity, the laser field drives an electron temporarily out of each atom of gas. When the electron snaps back into the atom, it generates a train of light pulses with a duration of several hundred attoseconds each (1 attosecond is 0.000 000 000 000 000 001 seconds). The process generates “harmonics”—strong signals at regular fractions of the original infrared wavelength. As a result of the high repetition frequency of the laser (154 MHz), for the first time ever, each harmonic has its own set of “teeth” marking individual frequencies, a mini frequency comb within the big comb.
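The defining property of any frequency comb is that tooth n sits at f_n = f_ceo + n × f_rep, so knowing the repetition rate and offset pins down every tooth. A sketch using the article's 154 MHz repetition rate; the carrier-envelope offset value below is hypothetical, chosen only to make the arithmetic concrete:

```python
# Comb-tooth arithmetic: f_n = f_ceo + n * f_rep.
# F_REP comes from the article; F_CEO is a hypothetical placeholder.

F_REP = 154e6  # repetition rate, Hz (from the article)
F_CEO = 20e6   # carrier-envelope offset, Hz (hypothetical)

def tooth_frequency(n: int) -> float:
    """Frequency of comb tooth n, in Hz."""
    return F_CEO + n * F_REP

# Teeth nearest 1 PHz, the petahertz regime the article mentions:
n0 = round((1e15 - F_CEO) / F_REP)
for n in (n0 - 1, n0, n0 + 1):
    print(f"tooth n = {n}: {tooth_frequency(n) / 1e12:.6f} THz")
```

Millions of such teeth, each only as wide as the laser's stability allows and spaced exactly 154 MHz apart, are what turn the comb into a ruler for optical frequencies.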

The EUV comb is the first system for high-accuracy laser spectroscopy—the use of light to probe matter and make measurements traceable to international standards—at wavelengths below 200 nm, a frequency of more than 1 petahertz (quadrillion cycles per second).

The EUV comb is the culmination of several technical advances, including improved high-power ytterbium fiber lasers, an optical cavity formed by five mirrors in which light pulses overlap perfectly and build on each other in a stable way, and better understanding of the plasma (a mix of electrons and electrically charged atoms, or ions) required to generate EUV light inside the cavity. Researchers finally achieved an ideal balance of high power and stability in the cavity.

Applications for the new comb include the development of nuclear clocks, based on changes in energy levels of an atom’s nucleus instead of the electronic structure as in today’s atomic clocks. The nucleus is well isolated from external interference and thus might make an extremely stable clock. Other applications include studies of plasmas such as those in outer space; and searches for any changes in the fundamental “constants” of nature, values crucial to many scientific calculations. Ye hopes to continue extending combs toward shorter wavelengths to create an X-ray frequency comb.

This research is a result of a five-year collaboration between JILA and IMRA America Inc., of Ann Arbor, Mich., which designed and built the high-power precision ytterbium fiber laser specifically for this project. The research was funded in part by the Defense Advanced Research Projects Agency, the Air Force Office of Scientific Research, NIST and the National Science Foundation.

* Arman Cingöz, Dylan C. Yost, Thomas K. Allison, Axel Ruehl, Martin E. Fermann, Ingmar Hartl and Jun Ye. Direct frequency comb spectroscopy in the extreme ultraviolet. Nature, Feb. 2, 2012.

** See the NIST background paper “Optical Frequency Combs” at www.nist.gov/public_affairs/releases/frequency_combs.cfm.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


NIST Report on Texas Fire Urges Firefighters to Consider Wind Effects

Portion of a NIST computer simulation of an April 2009 residential fire in Houston, Texas, showing the deadly wind-driven flow path created by the failure of a large span of windows in the rear of the structure.

Credit: NIST Engineering Laboratory

Wind conditions at a fire scene can make a critical difference in the behavior of the blaze and the safety of firefighters, even indoors, according to a new report by the National Institute of Standards and Technology (NIST). The findings confirm earlier NIST research, but they take on a particular immediacy because they are based on detailed computer models of a tragic 2009 residential fire in Houston, Texas, that claimed the lives of two firefighters.

The NIST modeling was done at the request of the Houston Fire Department (HFD) and the Centers for Disease Control and Prevention’s National Institute for Occupational Safety and Health (NIOSH), both of which wanted expert insight into the fire dynamics (behavior) that killed a 29-year veteran captain and a probationary firefighter.

Two NIST fire experts traveled to Houston shortly after the April 12, 2009, fire in a one-story ranch-style home located on the east side of the city. They examined the site and collected data about the behavior of the fire and the factors impacting that behavior—in particular, the wind at the time—in order to unravel the events that led to the deaths of the two men.

This was accomplished by creating sophisticated computer models of the fire and then visualizing them using two popular NIST software tools: the Fire Dynamics Simulator (FDS), which numerically characterizes the movement of smoke and hot gases caused by fire, wind and ventilation systems; and Smokeview, which displays the FDS calculation results as animations. The simulations portrayed two different scenarios of the Houston fire. The first demonstrated the actual conditions that firefighters experienced that day, including the contributing role of wind, while the second was intended to show how the fire may have behaved in the absence of wind. The wind-included scenario indicated that the fire followed a wind-driven flow path between the den and the front door after the failure of a large span of windows in the den. Floor-to-ceiling temperatures rapidly increased—in some areas, in excess of 260 degrees Celsius (500 degrees Fahrenheit)—in this flow path where multiple crews of firefighters were working. In the NIST simulation that excluded wind, the flow path was not created, and the temperatures and conditions where the firefighters were working were significantly less hazardous.
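FDS cases like the ones described above are driven by plain-text input files written as Fortran namelist groups. The fragment below is a rough, purely illustrative sketch of how a wind-included scenario might be set up; all geometry, heat-release and pressure values are invented for illustration and are not taken from the NIST study.

```fortran
&HEAD CHID='ranch_wind', TITLE='Illustrative wind-driven house fire' /

! Single computational mesh enclosing the structure (dimensions invented)
&MESH IJK=96,64,24, XB=0.0,19.2, 0.0,12.8, 0.0,4.8 /

&TIME T_END=600.0 /

! Fire source: a burner surface with a prescribed heat release rate per unit area
&SURF ID='FIRE', HRRPUA=1000.0, COLOR='RED' /
&VENT XB=4.0,6.0, 4.0,6.0, 0.0,0.0, SURF_ID='FIRE' /

! Failed rear window span, with a dynamic pressure imposed to represent wind
&VENT XB=0.0,0.0, 2.0,10.0, 1.0,2.0, SURF_ID='OPEN', DYNAMIC_PRESSURE=10.0 /

! Open front door completing the flow path through the structure
&VENT XB=19.2,19.2, 5.0,6.0, 0.0,2.0, SURF_ID='OPEN' /

&TAIL /
```

Running such a deck through FDS produces the smoke and gas-flow fields that Smokeview then renders as animations; removing the `DYNAMIC_PRESSURE` entry would approximate the no-wind comparison case.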

The authors of the NIST report, Adam Barowy and Daniel Madrzykowski, stated that “the ‘wind’ and ‘no wind’ simulations clearly demonstrate how wind conditions can rapidly change the thermal environment from tenable (survivable) to untenable for firefighters working in a single-story residential structure fire.” They add that the results from the Houston fire simulations agree with those NIST obtained in collaboration with the Fire Department of New York City and the Chicago Fire Department for wind-driven fires in high-rise structures. The authors said this underscores the importance of assessing wind conditions before and during all structural fire scene operations, and of adjusting tactics to changing winds, especially for interior operations, to enhance the safety and effectiveness of firefighters.

The NIST report that describes the details of the computer models and what was learned from them, Simulation of the Dynamics of a Wind-Driven Fire in a Ranch-Style House—Texas (NIST Technical Note 1729), is available online at www.nist.gov/customcf/get_pdf.cfm?pub_id=909779.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


NIST Seeks Proposals for Projects to Improve Trust in Cyberspace

On Feb. 1, 2012, the National Institute of Standards and Technology (NIST) launched a competition for pilot projects to accelerate progress toward improved systems for interoperable, trusted online credentials that go beyond simple user IDs and passwords. The competition, managed by the NIST-hosted National Program Office for the National Strategy for Trusted Identities in Cyberspace (NSTIC), plans to award a total of approximately $10 million to fund five to eight projects running up to two years.

NSTIC, a White House initiative, works with private-sector organizations, advocacy groups, public agencies and others to improve the privacy, security and convenience of online transactions by creating an “Identity Ecosystem”—secure, efficient, easy-to-use and interoperable mechanisms that allow Internet users to establish their identity for online services in a manner that promotes confidence, privacy, choice and innovation.

Details of the solicitation and submission process are in the NIST Federal Funding Opportunity (FFO) notice posted at Grants.gov (www.grants.gov) under Funding Opportunity Number 2012-NIST-NSTIC-01. Initial proposals must be received no later than 5 p.m. Eastern time on March 7, 2012. Selected finalists will be invited to submit a full proposal. NIST anticipates funding projects in the range of approximately $1.25 million to $2 million per year, though proposals requesting smaller amounts may be considered.

NIST has cited several factors that have prevented identity solutions from being widely deployed in the marketplace, including the need for technical standards that ensure interoperability among different authentication systems, a lack of clarity about liabilities when something goes wrong, a need for common standards for privacy protections and data re-use, and problems with the ease of use of some strong authentication technologies. Any “Identity Ecosystem” that addresses these problems, according to NIST, must satisfy four core principles: identity solutions should be privacy-enhancing and voluntary; secure and resilient; interoperable; and cost-effective and easy to use.

On Feb. 15, 2012, NIST plans to host a proposer’s conference from 9 a.m. to noon at the Department of Commerce in Washington, D.C., to offer guidance on preparing proposals, explain criteria to be used in making awards, and answer questions from the public. The event will include a live webcast, and participants may submit questions via Twitter using the event hashtag #NSTIC.

Details on the webcast address and registration information for the conference are available at: http://www.nist.gov/itl/nstic-pilots-grant-proposers-conference.cfm. Further information about NSTIC and upcoming related events is available at: http://www.nist.gov/nstic.

For more information, see the Feb. 1 NIST news announcement, “NIST to Fund Pilot Projects that Advance Trusted Identities in Cyberspace” at www.nist.gov/public_affairs/nsticpilotgrants.cfm.

Media Contact: Gail Porter, gail.porter@nist.gov, 301-975-3392


NIST Report Recommends New Privately Led Steering Group to Drive Trusted Identities in Cyberspace

The National Institute of Standards and Technology (NIST) released its recommendations for a new, privately led steering group to tackle the complex policy and technical issues necessary to create an online environment where individuals and organizations will be able to better trust one another. In a report released Feb. 7, 2012, NIST also announced its intent to issue a Federal Funding Opportunity for an organization to convene the steering group and provide it with initial secretarial, administrative and logistical support.

The report lays out a path for implementing the National Strategy for Trusted Identities in Cyberspace (NSTIC), a White House initiative to bring together the private sector, advocacy groups, public-sector agencies and others to improve the privacy, security and convenience of online transactions.

“While NSTIC is a government initiative, the ‘Identity Ecosystem’ it envisions must be led by the private sector,” said Jeremy Grant, NIST’s senior executive advisor for identity management. “The recommendations we published today lay out a specific path to bring together all NSTIC stakeholders—including the private sector, advocacy groups, public-sector agencies and other organizations—to jointly create an online environment, the ecosystem, where individuals and organizations will be able to better trust one another, with minimized disclosure of personal information.”

The new report contains several key recommendations, including:

  • An Identity Ecosystem Steering Group should be established as a new organization that is led by the private sector in conjunction with, but independent of, the federal government.
  • The group should be structured to safeguard protections for individual privacy and the underrepresented, through mechanisms such as a special privacy coordination committee and an appointed ombudsman.
  • The group should be initially funded by the government through a competitive two-year grant to catalyze its formation and ensure there are no barriers to participation. After a period of initial government support, the steering group will need to establish a self-sustaining structure capable of allowing continued growth and operational independence.

The report also includes a recommended charter to help jumpstart the steering group’s initial activities.

The NIST report was developed with significant input from the public. In June 2011, NIST published a Notice of Inquiry* to solicit feedback and examples from the public. More than 270 people participated in a workshop, and NIST received 57 responses to the Notice of Inquiry from a wide variety of stakeholders, including private industry and consumer advocacy groups.

The report, “Recommendations for Establishing an Identity Ecosystem Governance Structure,” can be found at http://www.nist.gov/nstic/2012-nstic-governance-recs.pdf.

NIST is planning a follow-on workshop on March 15, 2012, at the Department of Commerce in Washington, D.C., to convene stakeholders, review the findings of the report and kick off NSTIC implementation activities in advance of the formal establishment of the steering group later this spring.

Further information about this event and other upcoming NSTIC events will be available at: http://www.nist.gov/nstic.

A copy of the full text of the National Strategy for Trusted Identities in Cyberspace signed by President Obama in April 2011 is available at: http://www.whitehouse.gov/sites/default/files/rss_viewer/NSTICstrategy_041511.pdf.

* The Notice of Inquiry and responses to it can be found at http://www.nist.gov/nstic/notices.html.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


Federal Information Systems Security Educators' Association Conference Meets March 27-29

The National Institute of Standards and Technology (NIST) will host the 25th annual conference of the Federal Information Systems Security Educators’ Association (FISSEA) March 27-29, 2012, at its Gaithersburg, Md., headquarters.

FISSEA is an organization dedicated to promoting cybersecurity awareness, training and education. The annual meeting is geared toward both new and seasoned security officers, IT managers, information security educators and researchers, cybersecurity trainers and teachers, and those involved in instructional design and curriculum development. It is open to individuals in government, industry and academia.

This year’s meeting will explore “A New Era in Cybersecurity Awareness, Training and Education.” Attendees will have an opportunity to learn about current cybersecurity projects, emerging trends and new initiatives. The program includes an update on the National Initiative for Cybersecurity Education (NICE), a national campaign coordinated by NIST that is designed to improve the cyber behavior, skills and knowledge of every segment of the population, to enable a safer cyberspace.

Scheduled keynote speakers are Vice Admiral A. Patricia Tracey, USN (retired), former Chief of Naval Education and Training/Director of Training, who will offer a perspective on awareness, training and education; and Admiral Betsy Hight, USN (retired), who will speak from a technical standpoint.

FISSEA runs a contest in conjunction with the annual meeting. The FISSEA Security Awareness, Training & Education Contest asks for entries in the categories of awareness posters, motivational items, awareness website, awareness newsletter and role-based training and education. The deadline for participation in the contest is Feb. 17.

FISSEA also honors an “Educator of the Year” who has made significant contributions in education and training plans for federal information systems security. Nominees need not be FISSEA members, but do need to be nominated by a member. The deadline for nominations is March 5.

For more information on NICE, see http://csrc.nist.gov/nice/. For more information on the meeting, the Security Awareness, Training and Education Contest, and the Educator of the Year Award, see http://www.nist.gov/itl/csd/fissea-2012-conference.cfm.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Advisory Committee Recommends Non-Profit Guide for Next-Gen Public Safety Communications Net

Creation of a next-generation public safety communications network requires leadership from a single non-profit organization devoted to this purpose, according to a Jan. 31, 2012, report released by a federal advisory committee. Such a network would support voice, video and data transmissions, and ideally be at the disposal of all first responders—the medical, emergency, law enforcement or military personnel who are first on the scene of events that threaten public safety.

The report was released by the Visiting Committee on Advanced Technology (VCAT), which reviews and makes policy recommendations to the National Institute of Standards and Technology (NIST). NIST conducts research supporting public safety communications and operates a testbed at its Boulder, Colo., campus. The committee held meetings and collected input from the communications and public safety communities, as well as the public.

Public safety communications reach across many geographical, jurisdictional and technological lines, involving federal, state and local agencies, as well as private organizations and even volunteers. All have different procedures, budgets and existing technologies that would need to be coordinated to create a communications solution for the entire country.

To meet this challenge, the committee recommends that a non-governmental, non-profit organization be charged with development of standards that would support creation of the network.

As a model, the report describes the Smart Grid Interoperability Panel, which includes representatives from a large number of sectors with an interest in the next-generation power grid. According to the report, "that panel has been an effective mechanism for serious work on the elaboration of standards and requirements and identification of useful specifications for Smart Grid devices."

The committee envisions an organization that can establish "frameworks for cooperation that can build on common planning, standards, technology, budgeting and practices."

For more details, see the NIST Jan. 31 news announcement, “New Report Outlines Key Features of Next-Generation Public Safety Communications” at www.nist.gov/director/vcat/pubsafety_commreport_013112.cfm.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


NIST Cloud Computing Videos Available Online

Video recordings of the Nov. 2-4, 2011, Cloud Computing Forum & Workshop IV hosted by the National Institute of Standards and Technology (NIST) are now available for online viewing.

The three-day November meeting featured, among other highlights, the unveiling of the public draft of the U.S. Government Cloud Computing Technology Roadmap.*

The videos from the meeting include:

  • keynote addresses by NIST Director Patrick Gallagher and U.S. Chief Information Officer Steve VanRoekel;
  • a presentation on USG Cloud Computing Technology Roadmap highlights; and
  • panel discussions on:
    • Cloud without Borders: International Perspectives
    • The Case for USG Cloud Computing Priorities
    • USG Security Challenges and Mitigations

To view the workshop videos, go to www.nist.gov/itl/cloud/playlist.cfm.

* See the Oct. 13, 2011, NIST Tech Beat item, “Federal Cloud Technology Roadmap to be Introduced at Forum & Workshop, Nov. 2-4“ at www.nist.gov/public_affairs/tech-beat/tb20111013.cfm#roadmap.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661
