
Tech Beat - October 25, 2011

Editor: Michael Baum
Date created: October 25, 2011
Date Modified: October 25, 2011 
Contact: inquiries@nist.gov

Future 'Comb on a Chip': NIST's Compact Frequency Comb Could Go Places

Laser frequency combs—extraordinarily precise tools for measuring frequencies (or colors) of light—have helped propel advances in timekeeping, trace gas detection and related physics research to new heights in the past decade. While typical lasers operate at only one frequency or a handful of them, laser frequency combs operate simultaneously at many frequencies, approaching a million for some combs. These combs have very fine, evenly spaced "teeth," each a specific frequency, which can be used like hash marks on a ruler to measure the light emitted by lasers, atoms, stars or other objects. But frequency combs are usually bulky, delicate lab instruments—about the size of a typical suitcase—and challenging to operate, which limits their use.
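
The ruler analogy can be made concrete with a toy calculation. Every comb tooth sits at f_n = f_offset + n × f_rep, the standard comb relation, so an unknown laser frequency can be pinned down by identifying the nearest tooth and measuring the small, easily counted "beat" left over. The sketch below uses illustrative numbers, not NIST's actual comb parameters:

```python
# Toy model of a frequency comb used as a ruler for light.
# All numbers are illustrative, not NIST's actual parameters.

def comb_tooth(n, f_rep, f_offset):
    """Frequency of tooth n: the standard comb relation f_n = f_offset + n * f_rep."""
    return f_offset + n * f_rep

def measure_with_comb(f_unknown, f_rep, f_offset):
    """Locate an unknown optical frequency by finding the nearest comb tooth
    and the small radio-frequency 'beat' between tooth and laser."""
    n = round((f_unknown - f_offset) / f_rep)
    f_tooth = comb_tooth(n, f_rep, f_offset)
    beat = f_unknown - f_tooth
    return n, f_tooth, beat

# Example: a comb with 10 GHz tooth spacing measuring a ~193 THz infrared laser line.
f_rep = 10e9             # tooth spacing, Hz
f_offset = 2.5e9         # comb offset frequency, Hz
f_laser = 193.414213e12  # hypothetical unknown laser frequency, Hz

n, f_tooth, beat = measure_with_comb(f_laser, f_rep, f_offset)
# The ~193 THz optical frequency is reduced to counting tooth number n
# plus a beat of a few GHz, which ordinary electronics can measure.
print(n, beat)
```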

Stack of quartz optical 'cavities' (precisely machined disks of solid quartz crystal) for use in NIST's compact laser frequency comb. (Only one is actually used.) A low-power infrared laser produces light that travels in a loop inside one of the cavities. Each cavity is 2 millimeters wide and shaped like a flat ellipse.
Credit: S. Papp/NIST

Now, researchers at the National Institute of Standards and Technology (NIST) have developed a compact laser frequency comb,* a step toward user-friendly and ultimately chip-based combs that could enable new applications in astronomical searches for Earth-like planets, high-capacity telecommunications, and—if other components are miniaturized as well—portable versions of the most advanced atomic clocks. Large frequency combs are best known as the "gears" in today's room-sized versions of these clocks.

NIST's prototype micro-comb consists of a low-power semiconductor laser about the size of a shoebox and a high-quality optical cavity just 2 millimeters wide. A miniature laser like those in DVD players might be substituted in the future to squeeze the whole comb apparatus onto a microchip.

Compact frequency combs have been developed recently by a number of other research groups, but NIST's is the first to use a cavity made of fused silica, or quartz, the most common optical material. This means it could be integrated easily with other optical and photonic components, lead author Scott Papp says.

A full-size frequency comb uses a high-power, ultrafast laser.** By contrast, the new compact version relies on a low-power laser and the cavity's unusual properties. The cavity is designed to limit light dispersion and confine the light in a small space to enhance intensity and optical interactions. The infrared laser light travels in a loop inside the cavity, generating a train of very short pulses and a spectrum of additional shades of infrared light. The small cavity, with no moving parts, offers insight into basic processes of frequency combs, which are difficult to observe in large versions.

Among its desirable features, NIST's compact comb has wide spacing between the teeth—10 to 100 times wider than that found in typical larger combs. This spacing allows scientists to more easily measure and manipulate the teeth. Of particular interest to project leader Scott Diddams, the widely spaced teeth can be individually read by astronomical instruments. Portable frequency combs can thus be used as ultrastable frequency references in the search for Earth-like planets orbiting distant stars.*** Portable frequency combs can also have many other important applications. For example, because a frequency comb can simultaneously generate hundreds of telecommunication channels from a single low-power source, a micro-comb might eventually replace individual lasers now used for each channel in fiber-optic telecommunications.

"We hope this is just the beginning and look forward to bigger and better developments," Diddams says. "In the short term we want to learn if this new type of comb can one day replace ultrafast laser-based combs used with NIST's best atomic clocks. And if not, its small size will likely lead to other opportunities."

The research was supported in part by the Defense Advanced Research Projects Agency.

* S.B. Papp and S.A. Diddams. Spectral and temporal characterization of a fused-quartz microresonator optical frequency comb. Physical Review A. Forthcoming.
** See background on optical frequency combs at http://www.nist.gov/public_affairs/releases/frequency_combs.cfm.
*** See 2009 Tech Beat article, "NIST, CU to Build Instrument to Help Search for Earth-like Planets," at http://www.nist.gov/public_affairs/techbeat/tb2009_1103.htm#cu.

Media Contact: Laura Ost, laura.ost@nist.gov, 303-497-4880


Improved Characterization of Nanoparticle Clusters for EHS and Biosensors Research

The tendency of nanoparticles to clump together in solution—"agglomeration"—is of great interest because the size of the clusters plays an important role in the behavior of the materials. Toxicity, the persistence of the nanomaterials in the environment, their efficacy as biosensors and, for that matter, the accuracy of experiments to measure these factors, are all known to be affected by agglomeration and cluster size. Recent work* at the National Institute of Standards and Technology (NIST) offers a way to measure accurately both the distribution of cluster sizes in a sample and the characteristic light absorption for each size. The latter is important for the application of nanoparticles in biosensors.

Clusters of roughly 30-nanometer gold nanoparticles imaged by transmission electron microscopy. (Color added for clarity.)
Credit: Keene, FDA

A good example of the potential application of the work, says NIST biomedical engineer Justin Zook, is in the development of nanoparticle biosensors for ultrasensitive pregnancy tests. Gold nanoparticles can be coated with antibodies to a hormone** produced by an embryo shortly after conception. Multiple gold nanoparticles can bind to each hormone, forming clusters that have a different color from unclustered gold nanoparticles. But only certain size clusters are optimal for this measurement, so knowing how light absorbance changes with cluster size makes it easier to design the biosensors to result in just the right sized clusters.

The NIST team first prepared samples of gold nanoparticles—a nanomaterial widely used in biology—in a standard cell culture solution, using their previously developed technique for creating samples with a controlled distribution of sizes***. The particles are allowed to agglomerate in gradually growing clusters and the clumping process is "turned off" after varying lengths of time by adding a stabilizing agent that prevents further agglomeration.

They then used a technique called analytical ultracentrifugation (AUC) to simultaneously sort the clusters by size and measure their light absorption. The centrifuge causes the nanoparticle clusters to separate by size, the smaller, lighter clusters moving more slowly than the larger ones. While this is happening, the sample containers are repeatedly scanned with light and the amount of light passing through the sample for each color or frequency is recorded. The larger the cluster, the more strongly it absorbs light at lower frequencies. Measuring the absorption by frequency across the sample containers allows the researchers both to watch the gradual separation of cluster sizes and to correlate absorbed frequencies with specific cluster sizes.
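
The sorting principle can be sketched with a toy model (this is illustrative, not NIST's analysis code): in the spinning cell, larger and heavier clusters sediment faster, so the order in which absorbance signals sweep past the detector encodes cluster size. All diameters, wavelengths and scale factors below are made up for illustration:

```python
# Illustrative sketch of the AUC sorting principle. Not NIST's analysis code;
# sizes, wavelengths and the scale factor are made-up values.

def arrival_time(diameter_nm, k=1.0e5):
    """Toy model: sedimentation speed grows roughly with diameter squared
    (as for solid spheres of fixed density), so arrival time at the detector
    falls as 1/diameter**2. k is an arbitrary scale, not a physical constant."""
    return k / diameter_nm**2

# Clusters of one, two and three ~30 nm gold particles, paired with made-up
# peak-absorption wavelengths (larger clusters absorb at longer wavelengths,
# i.e. lower frequencies).
peak_nm = {30: 530, 60: 560, 90: 600}  # effective diameter (nm) -> peak (nm)

# Repeated scans of the cell see clusters in arrival order: largest first,
# which is what lets each spectrum be assigned to a specific cluster size.
arrival_order = sorted(peak_nm, key=arrival_time)
spectra_by_size = [(d, peak_nm[d]) for d in arrival_order]
print(spectra_by_size)  # largest, reddest-absorbing clusters separate out first
```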

Most previous measurements of absorption spectra for solutions of nanoparticles were able only to measure the bulk spectra—the absorption of all the different cluster sizes mixed together. AUC makes it possible to measure the quantity and distribution of each nanoparticle cluster without being confounded by other components in complex biological mixtures, such as proteins. The technique previously had been used only to make these measurements for single nanoparticles in solution. The NIST researchers are the first to show that the procedure also works for nanoparticle clusters.

* J.M. Zook, V. Rastogi, R.I. MacCuspie, A.M. Keene and J. Fagan. Measuring agglomerate size distribution and dependence of localized surface plasmon resonance absorbance on gold nanoparticle agglomerate size using analytical ultracentrifugation. ACS Nano, Articles ASAP (As Soon As Publishable). Publication Date (Web): Sept. 3, 2011 DOI: 10.1021/nn202645b.
** HCG: Human chorionic gonadotropin.
*** See J.M. Zook, R.I. MacCuspie, L.E. Locascio, M.D. Halter and J.T. Elliott. Stable nanoparticle aggregates/agglomerates of different sizes and the effect of their size on hemolytic cytotoxicity. Nanotoxicology, published online Dec. 13, 2010 (DOI: 10.3109/17435390.2010.536615) and the Feb. 2, 2011, NIST Tech Beat article, "NIST Technique Controls Sizes of Nanoparticle Clusters for EHS Studies," at www.nist.gov/public_affairs/tech-beat/tb20110202.cfm#nanoparticles.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


Quantum Computer Components 'Coalesce' to 'Converse'

If quantum computers are ever to be realized, they likely will be made of different types of parts that will need to share information with one another, just like the memory and logic circuits in today's computers do. However, prospects for achieving this kind of communication seemed distant—until now. A team of physicists working at the National Institute of Standards and Technology (NIST) has shown* for the first time how these parts might communicate effectively.

[1] A single photon is produced by a quantum dot (QD). Simultaneously, a pair of photons is produced by a parametric down-conversion crystal (PDC). [2] One of the PDC photons—which has different characteristics than the QD photon—is routed into a cavity and filter, [3] rendering this PDC photon and the QD photon nearly identical.
Credit: Suplee, NIST

The goal to develop quantum computers—a long-awaited type of computer that could solve otherwise intractable problems, such as breaking complex encryption codes—has inspired scientists the world over to invent new devices that could become the brains and memory of these machines. Many of these tiny devices use particles of light, or photons, to carry the bits of information that a quantum computer will use.

But while each of these pieces of hardware can do some jobs well, none are likely to accomplish all of the functions necessary to build a quantum computer. This implies that several different types of quantum devices will need to work together for the computer or network to function. The trouble is that these tiny devices frequently create photons of such different character that they cannot transfer the quantum bits of information between one another. Transmuting two vastly different photons into two similar ones would be a first step toward permitting quantum information components to communicate with one another over large distances, but until now this goal has remained elusive.

However, the team has demonstrated that it is possible to take photons from two disparate sources and render these particles partially indistinguishable. That photons can be made to "coalesce" and become indistinguishable without losing their essential quantum properties suggests in principle that they can connect various types of hardware devices into a single quantum information network. The team's achievement also demonstrates for the first time that a "hybrid" quantum computer might be assembled from different hardware types.

The team connected single photons from a "quantum dot," which could be useful in logic circuits, with a second single-photon source that uses "parametric down conversion," which might be used to connect different parts of the computer. These two sources typically produce photons that differ so dramatically in spectrum that they would be unusable in a quantum network. But with a deft choice of filters and other devices that alter the photons' spectral shapes and other properties, the team was able to make the photons virtually identical.

"We manipulate the photons to be as indistinguishable as possible in terms of spectra, location and polarization—the details you need to describe a photon. We attribute the remaining distinguishability to properties of the quantum dot," says Glenn Solomon, of NIST's Quantum Measurement Division. "No conceivable measurement can tell indistinguishable photons apart. The results prove in principle that a hybrid quantum network is possible and can be scaled up for use in a quantum network."

The research team includes scientists from the NIST/University of Maryland Joint Quantum Institute (JQI) and Georgetown University. The NSF Physics Frontier Center at JQI provided partial funding.

*S.V. Polyakov, A. Muller, E.B. Flagg, A. Ling, N. Borjemscaia, E. Van Keuren, A. Migdall and G.S. Solomon. Coalescence of single photons from dissimilar single-photon sources. Physical Review Letters, 107, 157402 (2011), DOI: 10.1103/PhysRevLett.107.157402.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261


NIST Measures Key Property of Potential 'Spintronic' Material

An advanced material that could help bring about next-generation "spintronic" computers has revealed one of its fundamental secrets to a team of scientists from Argonne National Laboratory (ANL) and the National Institute of Standards and Technology (NIST).

Manganite oxide lattices (purple) doped with lanthanum (magenta) and strontium (green) have potential for use in spintronic memory devices, but their usual disorderly arrangement (left) makes it difficult to explore their properties. The ANL/NIST team's use of a novel orderly lattice (right) allowed them to measure some of the material's fundamental characteristics.
Credit: Argonne National Laboratory

The material, constructed of two different compounds, might one day allow computers to use the magnetic spin of electrons, in addition to their charge, for computation. A host of innovations could result, including fast memory devices that use considerably less power than conventional systems and still retain data when the power is off. The team's effort not only demonstrates that the custom-made material's properties can be engineered precisely, but in creating a virtually perfect sample of the material, the team also has revealed a fundamental characteristic of devices that can be made from it.

Team members from ANL began by doing something that had never been done before—engineering a highly ordered version of a magnetic oxide compound that naturally has two randomly distributed elements: lanthanum and strontium. Stronger magnetic properties are found in those places in the lattice where extra lanthanum atoms are added. Precise placement of the strontium and lanthanum within the lattice can enable understanding of what is needed to harness the interaction of the magnetic forces among the layers for memory storage applications, but such control has been elusive up to this point.

"These oxides are physically messy to work with, and until very recently, it was not possible to control the local atomic structure so precisely," says Brian Kirby, a physicist at the NIST Center for Neutron Research (NCNR). "Doing so gives us access to important fundamental properties, which are critical to understand if you really want to make optimal use of a material."

The team members from ANL have mastered a technique for laying down the oxides one atomic layer at a time, allowing them to construct an exceptionally organized lattice in which each layer contains only strontium or lanthanum, so that the interface between the two components could be studied. The NIST team members then used the NCNR's polarized neutron reflectometer to analyze how the magnetic properties within this oxide lattice changed as a consequence of the near-perfect placement of atoms.

They found that the influence of electrons near the additional lanthanum layers was spread out across three magnetic layers in either direction, but fell off sharply further away than that. Tiffany Santos, lead scientist on the study from ANL, says that the measurement will be important for the emerging field of oxide spintronics, as it reveals a fundamental size unit for electronic and magnetic effects in memory devices made from the material.

"For electrons to share spin information—something required in a memory system—they will need to be physically close enough to influence each other," Kirby says. "By ordering this material in such a precise way, we were able to see just how big that range of influence is."

* T. S. Santos, B. J. Kirby, S. Kumar, S. J. May, J. A. Borchers, B. B. Maranville, J. Zarestky, S. G. E. te Velthuis, J. van den Brink and A. Bhattacharya. Delta doping of ferromagnetism in antiferromagnetic manganite superlattices. Physical Review Letters, Week ending Oct. 14, 2011, 107, 167202 (2011), DOI: 10.1103/PhysRevLett.107.167202.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261


NIST Releases Two Wireless Security Guides, Requests Comment

The National Institute of Standards and Technology (NIST) has issued for public review and comment two draft guides to securing wireless communication networks. NIST is requesting comments on the two publications—one on Bluetooth networks and one on wireless local area networks—by Nov. 10, 2011.

The draft Guide to Bluetooth Security (NIST Special Publication 800-121 Rev. 1) is a revision of the original guide, which was released in September 2008. The document describes the security capabilities of technologies based on Bluetooth, which is an open standard for short-range radio frequency communication, and gives recommendations to organizations on securing their devices effectively. Significant changes from the original SP 800-121 include an update to the vulnerability mitigation information for "Secure Simple Pairing," which helps protect against eavesdropping, and the introduction of Bluetooth version 3.0 High Speed and Bluetooth version 4.0 Low Energy security mechanisms and recommendations. Version 3.0 provides higher data rates than previous versions of Bluetooth, while 4.0 addresses smaller, resource-constrained devices such as heart rate monitors and other wearable medical sensors.

The draft Guidelines for Securing Wireless Local Area Networks (SP 800-153) is intended to provide organizations with recommendations for improving the security configuration and monitoring of their wireless local area networks (WLANs) and the devices connecting to those networks. SP 800-153's recommendations cover topics such as standardized WLAN security configurations, security assessments and continuous monitoring. SP 800-153 is an entirely new document that supplements, and does not replace, older NIST publications on WLAN security, such as SPs 800-97 and 800-48.

The draft version of SP 800-153 is available at http://csrc.nist.gov/publications/drafts/800-153/Draft-SP800-153.pdf, and of SP 800-121 rev. 1 at http://csrc.nist.gov/publications/drafts/800-121r1/Draft-SP800-121_Rev1.pdf.

Comments on these publications should be submitted via email by the Nov. 10 deadline. For SP 800-153, please submit comments to 800-153comments@nist.gov, with "Comments on SP 800-153" in the subject line. Likewise for SP 800-121, please send them to 800-121comments@nist.gov, with "Comments on SP 800-121" in the subject line.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Releases Update to Smart Grid Framework

An expanded list of standards, new cybersecurity guidance and product testing proposals are among the new elements in an updated roadmap for Smart Grid interoperability released today* for public comment by the National Institute of Standards and Technology (NIST).

The NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0, builds upon and updates a January 2010 report. NIST's first outline, Release 1.0**, laid out an initial plan for transforming the nation's aging electric power system into an interoperable Smart Grid—a network that will integrate information and communication technologies with a power-delivery infrastructure, enabling two-way flows of energy and communications.

"Making such dramatic changes to the power grid requires an overarching vision of how to accomplish the task, and this updated Framework advances that vision," said NIST's George Arnold, the National Coordinator for Smart Grid Interoperability. Because the Smart Grid will be a highly complex system of interacting systems, it is essential that everyone with a stake in the new grid have a common understanding of its major building blocks and how they interrelate, Arnold said. "Utilities, manufacturers, equipment testers and regulators will find essential information in the Framework that was not previously available."

Release 2.0 adds 22 standards, specifications and guidelines to the 75 that NIST recommended as immediately applicable to the Smart Grid in the first roadmap. New to the 2.0 version is a chapter on the roles of the Smart Grid Interoperability Panel (SGIP), an organization created by NIST in November 2009 to provide an open forum for members to collaborate on standards development. More than 700 organizations are now members of the SGIP, which recently made the first six entries into its Catalog of Standards, a technical document for those involved with developing grid-connected devices. Eventually, hundreds of such standards will be entered into the Catalog, which is also described in the SGIP chapter.

Further improvements and additions to the original document include:

  • an expanded view of the architecture of the Smart Grid;
  • a number of developments related to ensuring cybersecurity for the Smart Grid, including a Risk Management Framework to provide guidance on security practices;
  • a new framework for testing the conformity of devices and systems to be connected to the Smart Grid, the Interoperability Process Reference Manual;
  • information on efforts to coordinate the Smart Grid standards effort for the United States with similar efforts in other parts of the world; and
  • an overview of future areas of work, including electromagnetic disturbance and interference, and improvements to the SGIP processes.


The request for public comment on NIST Framework and Roadmap for Smart Grid Interoperability Standards, Release 2.0, is available from the Federal Register at www.federalregister.gov/articles/2011/10/25/2011-27556/nist-framework-and-roadmap-for-smart-grid-interoperability-standards-release-20-draft-request-for#p-3 and will be open for public comment until 5:00 PM Eastern time on Nov. 25, 2011. The document itself may be found on the NIST Smart Grid Collaboration Wiki at http://collaborate.nist.gov/twiki-sggrid/bin/view/SmartGrid/IKBFramework.

* Posted on Oct. 25, 2011.
** See the NIST Jan. 19, 2010, news announcement "NIST Issues First Release of Framework for Smart Grid Interoperability" at www.nist.gov/smartgrid/smartgrid_011910.cfm.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261


Final Version of NIST Cloud Computing Definition Published

After years in the works and 15 drafts, the 16th and final version of the National Institute of Standards and Technology (NIST) working definition of cloud computing has been published as The NIST Definition of Cloud Computing (NIST Special Publication 800-145).

Cloud computing is a relatively new business model in the computing world. According to the official NIST definition, "cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."

The NIST definition lists five essential characteristics of cloud computing: on-demand self-service, broad network access, resource pooling, rapid elasticity or expansion, and measured service. It also lists three "service models" (software, platform and infrastructure) and four "deployment models" (private, community, public and hybrid) that together categorize ways to deliver cloud services. The definition is intended to serve as a means for broad comparisons of cloud services and deployment strategies, and to provide a baseline for discussion ranging from what cloud computing is to how best to use it.
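
The definition's structure lends itself to a simple checklist. The sketch below is an illustrative tool, not an official NIST artifact: it encodes the five essential characteristics and the service and deployment model vocabularies, and reports which characteristics a hypothetical offering lacks:

```python
# Illustrative checklist based on the SP 800-145 vocabulary. Not an official
# NIST tool; the candidate service below is hypothetical.

ESSENTIAL = {"on-demand self-service", "broad network access", "resource pooling",
             "rapid elasticity", "measured service"}
SERVICE_MODELS = {"software", "platform", "infrastructure"}
DEPLOYMENT_MODELS = {"private", "community", "public", "hybrid"}

def missing_characteristics(characteristics, service_model, deployment_model):
    """Validate the models against the SP 800-145 vocabulary and return which
    essential characteristics the candidate service lacks."""
    assert service_model in SERVICE_MODELS
    assert deployment_model in DEPLOYMENT_MODELS
    return ESSENTIAL - set(characteristics)

# A hypothetical infrastructure offering that meters usage and pools resources
# but cannot self-provision or scale elastically.
missing = missing_characteristics(
    {"broad network access", "resource pooling", "measured service"},
    "infrastructure", "public")
print(sorted(missing))  # the characteristics this offering would need to add
```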

"When agencies or companies use this definition," says NIST computer scientist Peter Mell, "they have a tool to determine the extent to which the information technology implementations they are considering meet the cloud characteristics and models. This is important because by adopting an authentic cloud, they are more likely to reap the promised benefits of cloud—cost savings, energy savings, rapid deployment and customer empowerment. And matching an implementation to the cloud definition can assist in evaluating the security properties of the cloud."

While just finalized, NIST's working definition of cloud computing has long been the de facto definition. In fact, before it was officially published, the draft was the U.S. contribution to the InterNational Committee for Information Technology Standards (INCITS) as that group worked to develop a standard international cloud computing definition.

The first draft of the cloud computing definition was created in November 2008. "We went through many versions while vetting it with government and industry before we had a stable one," Mell says. That one, version 15, was posted to the NIST cloud computing website in July 2009. In January 2011 that version was published for public comment as public draft SP 800-145.

Researchers received a large amount of feedback, most of it dealing with matters of interpretation. From draft to final, the definition remained substantively the same; only a modest number of changes were made to ensure consistent interpretation. The NIST Definition of Cloud Computing (SP 800-145) is available at http://csrc.nist.gov/publications/PubsSPs.html#800-145.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


New Study Finds that Baldrige Award Recipient Hospitals Significantly Outperform Their Peers

A new report has found that health care organizations that have won Baldrige National Quality Awards for performance excellence or been considered for a Baldrige Award site visit outperform other hospitals in nearly every metric used to determine the 100 Top Hospitals, a national recognition given by Thomson Reuters.

Commissioned by the Foundation for the Malcolm Baldrige National Quality Award, a private organization, and conducted by Thomson Reuters, the report found that Baldrige hospitals were six times more likely to be counted among the 100 Top Hospitals, which represent the top 3 percent of hospitals in the United States, and that they statistically outperform the 100 Top Hospitals on core measures established by the U.S. Centers for Medicare & Medicaid Services.

To evaluate the benefits of health care organizations using the Baldrige Criteria for Performance Excellence, the Baldrige Foundation chose to conduct a comparison with the Thomson Reuters 100 Top Hospitals national study. The 18-year-old Thomson Reuters program is based on a rigorous, time-tested statistical methodology using publicly available, unbiased data.

"[H]ospitals using the Baldrige process are significantly more likely than peers to become 100 Top Hospitals award winners, thereby achieving performance equal to or better than the top 3 percent," the report states. "Although the Baldrige process and the 100 Top Hospitals statistical measurement are quite different, the results of this study suggest that the methods are complementary and identify similarly high-achieving organizations."

Named after Malcolm Baldrige, the 26th Secretary of Commerce, the Baldrige Award was established by Congress in 1987 to enhance the competitiveness and performance of U.S. businesses. Since 1988, 86 organizations have received Baldrige Awards. The Baldrige Performance Excellence Program is managed by NIST in conjunction with the private sector.

The Baldrige Award originally went only to manufacturers, small businesses and service companies; Congress and the President broadened the program in 1998 to include education and health care organizations. Nonprofit organizations, including government agencies, became eligible for the award in 2007. The first health care organization to receive the award was SSM Health Care (www.nist.gov/baldrige/ssmhealth.cfm) of St. Louis, Mo., in 2002.

Health care organizations have accounted for more than 50 percent of Baldrige award applicants since 2005.

Baldrige hospitals also were far more likely than their peers to be cited for marked improvement over a span of five years. According to the report, "[m]ore than 27 percent of Baldrige winner hospitals also won a 100 Top Hospitals: Performance Improvement Leaders award, while only 3 percent of their non-Baldrige peers won the award."

"The results of the Thomson Reuters study confirm what we've known for years: using the Baldrige Criteria and the earnest pursuit of the Baldrige evaluation will improve your organization by nearly every measure of success, be it in outcomes, safety, customer and employee satisfaction, or profitability," says Baldrige Performance Excellence Program Director Harry Hertz.

The study, Comparison of Baldrige Award Applicants and Recipients with Peer Hospitals on a National Balanced Scorecard, is available at www.nist.gov/baldrige/upload/baldrige-hospital-research-paper.pdf.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


NIST Names 10 to National Construction Safety Team Advisory Committee

Ten prominent building and fire experts have been appointed by Patrick Gallagher, director of the National Institute of Standards and Technology (NIST), to serve on the National Construction Safety Team (NCST) Advisory Committee. The committee, first established in 2003, advises the NIST director and relevant staff on studies of building failures and associated evacuation and emergency response procedures conducted under the authorities of the NCST Act (Public Law 107-231). This includes guidance on the composition and function of investigation teams and other responsibilities under the act.

The committee is being reestablished with new members serving staggered terms as required by its charter. The original group that served from 2003 to 2008 was focused heavily on NIST's investigation of the collapses of three buildings at New York's World Trade Center complex on 9/11.

The new NCST Advisory Committee members serving a one-year term are:

  • Carlos Fernandez-Pello, professor, Department of Mechanical Engineering, University of California Berkeley (Berkeley, Calif.)
  • Susan Cutter, distinguished professor and director, Hazards and Vulnerability Research Institute, University of South Carolina (Columbia, S.C.)
  • Jeffrey Garrett, president and CEO, CTL Group (Skokie, Ill.)


Members serving a two-year term are:

  • Ron Coleman, chairman, Board of Trustees, Commission on Fire Accreditation International (Elk Grove, Calif.)
  • Anne Kiremidjian, professor, Department of Civil and Environmental Engineering, Stanford University (Stanford, Calif.)
  • Sarah A. Rice, project manager, Preview Group Inc. (Cincinnati, Ohio)


Members serving a three-year term are:

  • Paul A. Croce, retired VP and manager of research, FM Global (Middletown, R.I.)
  • Jeremy Isenberg, senior principal, Specialty Practices Group, AECOM (Oakland, Calif.)
  • R. Shankar Nair, principal and senior VP, Teng & Associates Inc. (Chicago, Ill.)
  • James R. Quiter, principal, Arup (Walnut Creek, Calif.)


Members were selected on the basis of their technical expertise and experience, records of distinguished professional service, and knowledge of issues affecting teams established under the NCST Act.

Under the NCST Act, NIST is responsible for investigating the events leading to building failures, and the associated evacuation and emergency response procedures, that result in or pose the potential for substantial loss of life. The NIST investigations establish the likely technical causes of the building failures and evaluate the technical aspects of emergency response and evacuation procedures in the wake of disasters such as blasts, earthquakes, fires, impacts and windstorms, whether the building is in service or under construction. The goal is to recommend improvements to the way in which buildings are designed, constructed, maintained and used.

The new NCST Advisory Committee will hold its first meeting on Nov. 7, 2011, at NIST headquarters in Gaithersburg, Md. The meeting is open to the public. Details will be posted in an upcoming notice in the Federal Register (www.gpoaccess.gov/fr).

More information about the NCST Act and the NCST Advisory Committee may be found at http://www.nist.gov/el/disasterstudies/ncst/index.cfm. For background on NIST's more than 40 years of experience studying structural failures and fires, go to the web pages of the NIST Disaster and Failures Studies Program at www.nist.gov/el/disasterstudies.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


NIST Digital Library of Mathematical Functions Receives IT Award

Government Computer News magazine has honored the Digital Library of Mathematical Functions (DLMF), which the National Institute of Standards and Technology (NIST) released last year, with one of its 10 annual awards for information technology (IT) achievement in government.

GCN award winners
From left to right are: Abdou Youssef (math search, George Washington University and NIST); Marjorie McClain (bibliography); Bruce Miller (information architect); Ronald Boisvert (Information Technology Editor); Daniel Lozier (General Editor); Charles Clark (Physical Sciences Editor); Frank Olver (Editor-in-Chief and Mathematics Editor, University of Maryland and NIST); and Bonita Saunders, Qiming Wang and Brian Antonishek (graphics).
Credit: Zaid Hamid
View hi-resolution image

The awardees, chosen by a group of judges from across the public-sector IT community, were selected from more than 200 nominations. According to Government Computer News, all of the recipients "share a commitment to drive down costs and displayed the leadership and engineering skills needed to put the power of some of the world's biggest computer facilities into the hands of individual citizens and professional end users."

Along with its hardbound companion, the NIST Handbook of Mathematical Functions, the DLMF appeared in March 2010 as an update and expansion of the 1964 Handbook of Mathematical Functions, which has sold more than 1 million copies and is the agency's most widely cited publication of all time. Creating the DLMF, an interactive electronic reference work that is freely available online, required the NIST team to develop not only its own website, but also new tools for mathematical authoring, for displaying colorful three-dimensional function visualizations, and for searching mathematics-intensive databases, all created specifically for Web-based content.

"Part of the technical hurdle was translating all the complicated math formulas into a format that could be used by the Web," says Dan Lozier, one of the DLMF's authors. "Most users never realize that the tools for displaying math on the Web are still in their infancy—with the DLMF they just see what they need."

Lozier says that because the information in the DLMF is in highly technical form, no off-the-shelf commercial products were capable of delivering it to the public effectively; so the team created their own tools for bringing the functions and interactive graphics to the Web.
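The DLMF's actual pipeline (built on tools such as LaTeXML) is far more sophisticated, but a toy sketch conveys the core task Lozier describes: translating a structured mathematical formula into presentation MathML that a browser can render. Everything below, including the miniature expression classes and the `to_mathml` function, is hypothetical illustration, not DLMF code.

```python
# Toy illustration of math-to-Web translation: render a tiny expression
# tree as presentation MathML. The node types and function names here
# are invented for this sketch; the real DLMF tooling handles the full
# range of LaTeX-authored mathematics.
from dataclasses import dataclass
from typing import Union

Expr = Union["Sym", "Num", "Frac", "Pow"]

@dataclass
class Sym:          # a variable, e.g. x
    name: str

@dataclass
class Num:          # a numeric literal
    value: int

@dataclass
class Frac:         # a fraction: num over den
    num: "Expr"
    den: "Expr"

@dataclass
class Pow:          # base raised to exp
    base: "Expr"
    exp: "Expr"

def to_mathml(e: Expr) -> str:
    """Recursively render an expression tree as presentation MathML."""
    if isinstance(e, Sym):
        return f"<mi>{e.name}</mi>"
    if isinstance(e, Num):
        return f"<mn>{e.value}</mn>"
    if isinstance(e, Frac):
        return f"<mfrac>{to_mathml(e.num)}{to_mathml(e.den)}</mfrac>"
    if isinstance(e, Pow):
        return f"<msup>{to_mathml(e.base)}{to_mathml(e.exp)}</msup>"
    raise TypeError(f"unknown node: {e!r}")

# Render x^2 / 2 as MathML.
formula = Frac(Pow(Sym("x"), Num(2)), Num(2))
print(to_mathml(formula))
# prints "<mfrac><msup><mi>x</mi><mn>2</mn></msup><mn>2</mn></mfrac>"
```

Even this trivial converter hints at why no off-the-shelf product sufficed in 2010: real mathematical content mixes thousands of symbols, layouts and cross-references, and search over it requires understanding structure, not just text.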

"It took a decade of effort," Lozier says, "but the painstakingly verified 10,000 equations, 600 visualizations and 2,500 references in the DLMF should provide scientists and engineers with ready access to information they need to do advanced mathematical modeling and computational simulation well into the 21st century."

The winning projects and their teams are featured in the October 15, 2011, issue of Government Computer News, and they were honored at the magazine's annual gala awards dinner and reception on Oct. 19.

Browse the NIST Digital Library of Mathematical Functions at http://dlmf.nist.gov/.

Media Contact: Chad Boutin, chad.boutin@nist.gov, 301-975-4261
