
Tech Beat - January 24, 2012

Editor: Michael Baum
Date created: January 24, 2012
Date Modified: January 24, 2012 
Contact: inquiries@nist.gov

Cool Nano Loudspeakers Could Make for Better MRIs, Quantum Computers

A team of physicists from the Joint Quantum Institute (JQI), the Niels Bohr Institute in Copenhagen, Denmark, and Harvard University has developed a theory describing how to both detect weak electrical signals and cool electrical circuits using light and something very like a nanosized loudspeaker.* If demonstrated through experiment, the work could have a tremendous impact on detection of low-power radio signals, magnetic resonance imaging (MRI), and the developing field of quantum information science.

JQI researchers think they have discovered a way to amplify faint electrical signals using the motion of a nanomechanical membrane, or loudspeaker. If shown in experiments, the scheme could prove a boon to magnetic resonance imaging and quantum information science. This schematic of the proposed device shows its use in detecting--in this example--a signal produced by the quantum-mechanical "spin" of a group of atoms. The atoms generate a faint radiofrequency signal in a coil (L), which is connected to microscale wires that form an electrical capacitor. This vibrates the 'nanomembrane,' which in turn affects the resonant frequency of a laser optical cavity. The output is light at a frequency that is the sum of the original laser frequency and the signal from the atoms.
Credit: Taylor/NIST

The JQI is a collaborative venture of the National Institute of Standards and Technology (NIST) and the University of Maryland, College Park.

"We envision coupling a nanomechanical membrane to an electrical circuit so that an electrical signal, even if exceedingly faint, will cause the membrane to quiver slightly as a function of the strength of that signal," says JQI physicist Jake Taylor. "We can then bounce photons from a laser off that membrane and read the signal by measuring the modulation of the reflected light as it is shifted by the motion of the membrane. This leads to a change in the wavelength of the light."

Present technology for measuring the wavelength of light is highly sensitive, which makes it ideal for detecting the nanoscopic motions of the loudspeaker caused by extremely faint electrical signals.
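The readout described above amounts to frequency mixing: the membrane's motion imprints the electrical signal's frequency onto the reflected laser light as sidebands offset from the optical carrier. A toy calculation makes this concrete (the laser frequency and signal frequency below are invented illustrative numbers, not values from the paper):

```python
# Toy numbers, purely illustrative; not taken from the paper.
laser_freq_hz = 2.82e14     # optical carrier, roughly a 1064 nm laser (assumption)
signal_freq_hz = 5.0e6      # faint radiofrequency signal in the circuit (assumption)

# The membrane's vibration modulates the optical cavity, producing light at
# the sum (and difference) of the two frequencies -- the sidebands that a
# sensitive optical measurement can pick out.
upper_sideband = laser_freq_hz + signal_freq_hz
lower_sideband = laser_freq_hz - signal_freq_hz

# Measuring the sideband offset from the carrier recovers the 5 MHz signal.
print(upper_sideband - laser_freq_hz)
```

Because optical frequency shifts this small are routinely resolved, the scheme converts a hard electrical measurement into an easier optical one.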

And the ability to detect extremely faint electrical signals may someday make MRI medical procedures much easier.

"MRI machines are so big because they are stuffed with really powerful superconducting magnets, but if we can reduce the strength of the signals we need for a reading, we can reduce the strength, and the size, of the magnets," Taylor says. "This may mean that one could get an MRI while sitting quietly in a room and forgo the tube."

The same setup could be used to send information-carrying photons from one qubit to another, according to Taylor.

One popular quantum information system design uses light to transfer information among qubits, the quantum bits that exploit the inherent weirdness of quantum phenomena to perform certain calculations impossible for current computers. The 'nanospeaker' could be used to translate low-energy signals from a quantum processor into optical photons, which can then be detected and transmitted from one qubit to another.

All this, and the team will throw in cooling the system for free. According to their calculations, translating the mechanical motion of the little loudspeaker into photons will siphon a considerable amount of heat out of the system (from room temperature to 3 kelvin, or about -270 °C), which in turn will reduce noise in the system and provide for better signal detection.
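The payoff of that cooling follows from a standard result: thermal (Johnson-Nyquist) voltage noise in a circuit scales as the square root of temperature, so dropping from roughly 300 K to 3 K cuts the noise amplitude tenfold. A quick check (the resistance and bandwidth values are arbitrary assumptions; only the ratio matters):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_vrms(temp_k, resistance_ohm, bandwidth_hz):
    """RMS Johnson-Nyquist voltage noise of a resistor: sqrt(4 k_B T R B)."""
    return math.sqrt(4 * k_B * temp_k * resistance_ohm * bandwidth_hz)

# Assumed example values: a 50-ohm circuit measured over a 1 kHz bandwidth.
warm = johnson_noise_vrms(300, 50, 1e3)   # room temperature
cold = johnson_noise_vrms(3, 50, 1e3)     # after optical cooling to 3 K

# Noise amplitude falls by sqrt(300/3) = 10.
print(warm / cold)
```

A tenfold drop in noise amplitude is what turns otherwise undetectable signals into measurable ones.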

* J. M. Taylor, A. S. Sørensen, C. M. Marcus and E. S. Polzik. Laser cooling and optical detection of excitations in a LC electrical circuit. Phys. Rev. Lett. 107, 273601. Published online Dec. 27, 2011. http://link.aps.org/doi/10.1103/PhysRevLett.107.273601

Media Contact: Mark Esser, mark.esser@nist.gov, 301-975-8735


Hacking the SEM: Crystal Phase Detection for Nanoscale Samples

Top: Transmission electron diffraction pattern from a segment of an indium gallium nitride (InGaN) nanowire about 50 nanometers in diameter, taken with an SEM using the new NIST technique, clearly shows a unique pattern associated with crystal diffraction. Bottom: Same pattern but with an overlay showing the crystallographic indexing associated with the atomic structure of the material.
Credit: Geiss/NIST

Custom modifications of equipment are an honored tradition of the research lab. In a recent paper,* two materials scientists at the National Institute of Standards and Technology (NIST) describe how a relatively simple mod of a standard scanning electron microscope (SEM) enables a roughly 10-fold improvement in its ability to measure the crystal structure of nanoparticles and extremely thin films. By altering the sample position, they are able to determine crystal structure of particles as small as 10 nanometers. The technique, they say, should be applicable to a wide range of work, from crime scene forensics to environmental monitoring to process control in nanomanufacturing.

The technique is a new way of doing electron diffraction with an SEM. In standard SEM-based electron diffraction, the researcher analyzes patterns that are formed by electrons that bounce back after striking atoms in the sample. If the sample is a crystalline material, with a regular pattern to the arrangement of atoms, these diffracted electrons form a pattern of lines that reveals the particular crystal structure or "phase" and orientation of the material.

The information, say NIST's Robert Keller and Roy Geiss, can be critical. "A common example is titanium dioxide, which can exist in a couple of different crystal phases. That difference significantly affects how the material behaves chemically, how reactive it is. You need to add crystallographic identification to the chemical composition to completely characterize the material."

SEMs often are outfitted with an instrument for just this task, a device called an EBSD (electron back-scatter diffraction) detector. The problem, they say, is that below a certain size, the usual setup just doesn't work. "You can determine the crystal structure of an isolated particle down to a size of about 100 to 120 nanometers, but below that the crystals are so small that you're getting information about the sample holder instead." A somewhat more exotic instrument, the transmission electron microscope (TEM), does much better, they say, but samples below about 50 nanometers in size show very limited diffraction patterns because the higher-powered electron beam of the TEM just blasts through them.
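Electron diffraction works at these scales because electrons accelerated to typical microscope voltages have de Broglie wavelengths far smaller than the spacing between atomic planes. A back-of-the-envelope check (the 30 kV value is an assumed, typical SEM setting, not a figure from the paper):

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron rest mass, kg
e = 1.602176634e-19     # elementary charge, C
c = 2.99792458e8        # speed of light, m/s

def electron_wavelength(volts):
    """Relativistically corrected de Broglie wavelength of an electron
    accelerated through a potential of `volts`."""
    ev = e * volts  # kinetic energy in joules
    return h / math.sqrt(2 * m_e * ev * (1 + ev / (2 * m_e * c**2)))

# Assumed 30 kV beam, a common SEM accelerating voltage:
lam = electron_wavelength(30e3)
print(lam)  # roughly 7e-12 m, i.e. about 7 picometers
```

At about 7 picometers, the beam wavelength sits well below typical interatomic spacings of a few hundred picometers, which is why crystal planes diffract it into measurable patterns.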

The novel tweak developed by Keller and Geiss combines a little of each. They moved the SEM sample holder closer to the beam source and adjusted the angles so that instead of imaging electrons bouncing back from the sample, the "EBSD" detector is actually seeing electrons that scatter forward through the sample in a manner similar to a TEM. (They also came up with a unique method of holding samples to make this work.)

They have shown that their technique produces reliable crystal phase information for nanoparticles as small as 10 nanometers across, as well as for single crystalline grains as small as 15 nanometers in an ultrathin film.

Electron diffraction in an SEM, says Keller, "in general represents the only approach capable of measuring the atomic structure, defect content, or crystallographic phase of single nanoparticles. This is a critical need in cases of extremely limited sampling of unknown particles. This work pushes electron diffraction to a new frontier by providing spatial resolution that rivals that possible in a TEM, and makes it available to anyone with an SEM. And that's a ubiquitous tool in virtually all fields that require characterization of solids."

Typical applications, the researchers say, include pinpointing ammunition sources from gunshot residue at crime scenes; determining the processing history of confiscated drugs; accurately characterizing nanoparticles for health, safety and environmental impact studies; optimizing grain structure in high-performance thin-film electronics; and process and quality control in nanomanufacturing.

* R.R. Keller and R.H. Geiss. Transmission EBSD from 10 nm domains in a scanning electron microscope. Journal of Microscopy, 2011. doi: 10.1111/j.1365-2818.2011.03566.x. Scheduled to appear in the March 2012 issue.

Media Contact: Michael Baum, michael.baum@nist.gov, 301-975-2763


NIST to Support Federal Agencies as They Implement Best Practices for Standards Activities

Last week, the White House issued a memorandum recognizing the role of the National Institute of Standards and Technology (NIST) in helping agencies implement best practices for standards development activities to address national priorities. The memo clarifies how federal agencies should work with the private sector in standards development, and it stresses the importance of public-private partnerships to the U.S. standards system and to promoting innovation.

“NIST has more than 100 years of experience collaborating with industry and standards development organizations, and we’re looking forward to working with the White House and our colleagues in other agencies to share best practices as we implement the principles outlined in this memo,” said Under Secretary of Commerce for Standards and Technology and NIST Director Patrick Gallagher. “Recently, we’ve applied those lessons learned to helping coordinate standards efforts for such economically important areas as the Smart Grid, Health IT and the National Strategy for Trusted Identities in Cyberspace.”

The memo specifically refers to NIST’s expertise in—and authority to coordinate—conformity assessment activities of federal, state and local governments, and the private sector. Conformity assessment is a mechanism for assuring that a product, service or management system meets the specified requirements of a particular standard. Conformity assessment is a key component of the U.S. standards system and can cover product or service testing, manufacturing process inspection or management system implementation.*

From the screw thread dimensions for bolts to the technical specifications that ensure computer software compatibility, standards play a largely unseen but critical role in our daily life, affecting everything from banking to communications to transportation to health care. The great majority of global merchandise trade is affected by standards and by regulations that embody standards.

Standards in the United States are predominantly voluntary and consensus-based, and developed in a process led by the private sector that brings together industry, government agencies, consumers and other interested stakeholders, all participating equally as subject matter experts to develop timely and effective standards solutions. When matters of national importance require standards, the federal government can serve an essential convening role to “accelerate standards development and implementation to help spur technological advances and broaden technology adoption,” according to the Jan. 17 memo.

The memo clarifies principles guiding federal government engagement in standards activities and formalizes several of the policy recommendations that were proposed in an October 2011 report of the Subcommittee on Standards of the National Science and Technology Council (NSTC). The subcommittee, which is chaired by Gallagher, sought input from stakeholders about the effectiveness of federal government engagement in standards.

The memo, which extends guidance provided by the Office of Management and Budget in 1998, was jointly released by three offices in the Executive Office of the President—the Office of Science and Technology Policy, the Office of the U.S. Trade Representative, and the Office of Information and Regulatory Affairs within the Office of Management and Budget.

The White House standards memo is available at: http://www.whitehouse.gov/sites/default/files/omb/memoranda/2012/m-12-08.pdf. The report of the NSTC Subcommittee on Standards, “Federal Engagement in Standards Activities to Address National Priorities” is available at http://standards.gov/upload/Federal_Engagement_in_Standards_Activities_October12_final.pdf. Information on the subcommittee can be found at: http://standards.gov/nstcsubcommitteeonstandards.cfm.

*For a discussion of conformity assessment, see this explanation by the International Organization for Standardization (ISO) at www.iso.org/iso/resources/conformity_assessment/what_is_conformity_assessment.htm.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


NIST Publishes Draft Implementation Guidance for Continuously Monitoring an Organization's IT System Security

Three new draft reports published by the National Institute of Standards and Technology (NIST) are designed to help both public and private organizations improve the security of their information management systems by developing capabilities for continuous monitoring of security. Comments are requested on the drafts.

For many organizations, information is one of their most valuable assets. Over the past decade, the IT security world has been moving ever closer to implementing diverse sets of security tools that enable tracking the security of enterprise-wide computer systems. "Organizations need to have 'situational awareness' over their information systems and to understand their security posture in a constantly evolving IT environment," explains NIST computer scientist David Waltermire. This requires an organization to have a dynamic process to identify and respond to new vulnerabilities and developing threats.

"Some organizations are already adopting continuous monitoring programs and acquiring tools to help," Waltermire said, "but there is little technical guidance on implementing a standardized approach. That is the goal of these three new publications."

The first of the three drafts, CAESARS Framework Extension: An Enterprise Continuous Monitoring Technical Reference Model (NIST Interagency Report 7756 Second Public Draft) (available at http://csrc.nist.gov/publications/drafts/nistir-7756/Draft-NISTIR-7756_second-public-draft.pdf), provides a reference model for organizations to collect data from across a diverse set of security tools, analyze the data, score the data, enable user queries and provide overall situational awareness. The model is designed so organizations can meet these goals by leveraging their existing security tool investments and avoiding designing and paying for custom solutions. It was developed using the Department of Homeland Security (DHS) continuous monitoring framework named Continuous Asset Evaluation, Situational Awareness, and Risk Scoring architecture (CAESARS) as a starting point.

"Organizations are already using CAESARS, but the architecture lacked specific requirements enabling product interoperability and interorganizational information sharing between different systems within the enterprise environment," Waltermire said.

The second document, Continuous Monitoring Reference Model Workflow, Subsystem, and Interface Specifications (NISTIR 7799) (available at http://csrc.nist.gov/publications/drafts/nistir-7799/Draft-NISTIR-7799.pdf), provides the technical specifications for the continuous monitoring reference model presented in NISTIR 7756 with enough specificity to enable instrumentation of existing products and development of new capabilities by vendors. The specifications in NISTIR 7799 define an ecosystem in which a variety of interoperable products can be combined into a continuous monitoring solution.

The third document, Applying the Continuous Monitoring Technical Reference Model to the Asset, Configuration and Vulnerability Management Domains (NISTIR 7800) (available at http://csrc.nist.gov/publications/drafts/nistir-7800/Draft-NISTIR-7800.pdf), augments the reference model with guidance on addressing these specific areas. It does this by leveraging the Security Content Automation Protocol (SCAP) version 1.2 for configuration and vulnerability-scan content, and it recommends reporting results in an SCAP-compliant format.
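The collect-score-query loop the reference model describes can be pictured with a purely illustrative sketch. Everything below (the data layout, tool names and scoring rule) is invented for illustration and is not taken from NISTIRs 7756, 7799 or 7800:

```python
# Invented example data: findings gathered from several security tools
# across an enterprise (hypothetical hosts and tools).
findings = [
    {"host": "web01", "tool": "vuln-scanner",   "severity": 7.5},
    {"host": "web01", "tool": "config-checker", "severity": 4.0},
    {"host": "db01",  "tool": "vuln-scanner",   "severity": 9.1},
]

def risk_score(host):
    """Toy scoring rule: worst severity reported for a host by any tool."""
    return max(f["severity"] for f in findings if f["host"] == host)

# "Query" step: a per-host summary supporting situational awareness.
scores = {h: risk_score(h) for h in {f["host"] for f in findings}}
print(scores)
```

The point of the reference model's interface specifications is that data from heterogeneous tools can feed one such analysis pipeline, rather than each product requiring its own custom integration.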

NIST is asking for public comment on the three draft publications. Please send comments to fe-comments@nist.gov by February 17. For clarity, please be sure to note which publication is the subject of your comments.

Two earlier publications provide roots for continuous monitoring. NIST's Information Security Continuous Monitoring (ISCM) for Federal Information Systems and Organizations (Special Publication 800-137), published in September 2011, was written to help organizations apply NIST's Risk Management Framework* to understand their security posture against threats and vulnerabilities and to determine how effectively their security controls are working. An Office of Management and Budget (OMB) memorandum (M-11-33)** emphasizes monitoring the security state of information systems on an ongoing basis to enable ongoing, risk-based decisions.

* Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach (NIST SP 800-37 Rev. 1) can be found at http://csrc.nist.gov/publications/nistpubs/800-37-rev1/sp800-37-rev1-final.pdf.
**OMB memorandum M-11-33, FY 2011 Reporting Instructions for the Federal Information Security Management Act and Agency Privacy Management, is available at http://www.whitehouse.gov/sites/default/files/omb/memoranda/2011/m11-33.pdf.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Issues Cloud Computing Guidelines for Managing Security and Privacy

The National Institute of Standards and Technology (NIST) has finalized its first set of guidelines for managing security and privacy issues in cloud computing.*

Guidelines on Security and Privacy in Public Cloud Computing (NIST Special Publication 800-144) provides an overview of the security and privacy challenges facing public cloud computing and presents recommendations that organizations should consider when outsourcing data, applications and infrastructure to a public cloud environment. The document provides insights on threats, technology risks and safeguards related to public cloud environments to help organizations make informed decisions about the use of this technology.

"Public cloud computing and the other deployment models are a viable choice for many applications and services. However, accountability for security and privacy in public cloud deployments cannot be delegated to a cloud provider and remains an obligation for the organization to fulfill," said publication co-author Tim Grance.

The key guidelines include:

  • Carefully plan the security and privacy aspects of cloud computing solutions before implementing them.
  • Understand the public cloud computing environment offered by the cloud provider.
  • Ensure that a cloud computing solution—both cloud resources and cloud-based applications—satisfies organizational security and privacy requirements.
  • Maintain accountability over the privacy and security of data and applications implemented and deployed in public cloud computing environments.


SP 800-144 is geared toward system managers, executives and information officers making decisions about cloud computing initiatives; security professionals responsible for IT security; IT program managers concerned with security and privacy measures for cloud computing; system and network administrators; and users of public cloud computing services.

The publication also provides a detailed list of Federal Information Processing Standards and NIST special publications that provide materials particularly relevant to cloud computing and are recommended to be used in conjunction with SP 800-144.

The document can be downloaded from http://www.nist.gov/manuscript-publication-search.cfm?pub_id=909494.

* Guidelines on Security and Privacy in Public Cloud Computing was first issued as a draft for public comment in February 2011. See the Feb. 2 Tech Beat article, "Cloud Computing at NIST: Two New Draft Documents and a Wiki," at www.nist.gov/public_affairs/tech-beat/tb20110202.cfm#cloud.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


February Forums Help Manufacturers Get on Track to Build Next Generation Rail

The National Institute of Standards and Technology (NIST) and the U.S. Department of Transportation (DOT) will host two forums in February 2012 to help U.S. manufacturers prepare for upcoming opportunities to become suppliers for the next generation of railcars and locomotives. The first forum will be held Feb. 8 in Sacramento, Calif., and the second will be Feb. 14 in Chicago.

The Next Generation Rail Supply Chain Connectivity Forums will bring together large railcar builders and original equipment manufacturers (OEMs) with smaller, capable and interested U.S. manufacturers. Smaller manufacturers will have the chance to learn what products are needed and what investments they should consider when entering the rail industry. The idea is to identify a broader domestic supply base that includes both traditional and non-traditional rail suppliers, with the goal of 100 percent domestic content in railcars that will be funded by state and federal dollars.

NIST will help coordinate the meetings through its Hollings Manufacturing Extension Partnership (MEP), a nationwide network of centers that work with small and mid-sized U.S. manufacturers to help them create and retain jobs, increase profits, and save time and money. “Through these forums, U.S. manufacturers will learn more about concrete opportunities that can grow and diversify their business into this exciting emerging market,” says Roger Kilmer, director of NIST MEP. The forums are the outgrowth of a partnership formed in October 2011 between the U.S. Department of Commerce and DOT to join efforts in promoting the development of a domestic supply base to support transportation in the United States. (See the Commerce Department’s Oct. 18, 2011, news announcement, “Commerce and Transportation Departments Forge Partnership to Boost Domestic Manufacturing Across America” at http://www.commerce.gov/blog/2011/10/18/commerce-and-transportation-departments-forge-partnership-boost-domestic-manufacturi.)

Each one-day forum will include representatives from federal and state agencies, local economic development agencies, rail service operators, rail car builders and associated OEMs, and U.S. manufacturers who can potentially be suppliers.

For the Sacramento event, NIST MEP and DOT are partnering with California MEP affiliates Manex and California Manufacturing Technology Center (CMTC). The Illinois MEP affiliate Illinois Manufacturing Extension Center (IMEC) is a partner for the Chicago event.

An initial national webinar on Dec. 15, 2011, attracted more than 150 attendees and provided an opportunity for the large car builders and OEMs to introduce themselves to manufacturers who currently serve as rail suppliers, as well as those who do not, but are interested in the opportunity. 

Visit www.nist.gov/mep/rail.cfm to learn more about the Next Generation Rail Supply Chain Connectivity Initiative, to register for upcoming forums, and to access the recorded audio and slides from the December webinar.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


International Community Gathers at NIST in March to Discuss Biometric Performance and Testing

The International Biometric Performance Conference 2012, to be held March 5-9 at the National Institute of Standards and Technology (NIST), will bring together evaluators, users and technology providers to discuss recent advances in the fields of biometric testing, performance definition and specification, and measurement quality assurance. Biometrics is the practice of measuring and recording unique physical characteristics of a person, and later using those measurements to definitively identify the individual. Common examples are fingerprints, iris patterns and DNA.

The conference is sponsored by NIST, the U.K.’s National Physical Laboratory and Germany’s Fraunhofer IGD.

Conference speakers will detail recent developments in how systems are being tested, certified, upgraded and improved. The meeting will focus on identifying fundamental, relevant, effective and new performance metrics for biometric systems and determining and sharing best practices for performance evaluation, calibration and design as they relate to procurement specifications and day-to-day operational testing and evaluation.

The conference tracks include test methods, operational aspects and security and privacy topics. Two half-day workshops will be held in conjunction with the conference. NIST is hosting the conference at its Gaithersburg, Md., facility in the metropolitan Washington, D.C., area.

For more information on the conference, see www.nist.gov/itl/iad/ig/ibpc2012.cfm. To register, see https://www.fbcinc.com/e/NIST/IPBC/atreg1.aspx.

Media Contact: Evelyn Brown, evelyn.brown@nist.gov, 301-975-5661


NIST Accepting Applications for Two Summer Programs for Middle School Science Teachers

The National Institute of Standards and Technology (NIST) is accepting applications for the 2012 Summer Institute for Middle School Science Teachers and the Research Experience for Teachers (RET) program. These two opportunities give teachers the chance to spend time on the NIST campus in Gaithersburg, Md., learning from and working with scientists and engineers.

The NIST Summer Institute is a two-week workshop for middle school science teachers featuring hands-on activities, lectures, tours and visits with NIST scientists and engineers in their laboratories. Teachers learn about core topics such as forensics and materials science, and receive educational materials to help them integrate these topics into their classroom while meeting curriculum standards.

Middle school science teachers who have already completed the NIST Summer Institute are eligible to apply for the RET program and spend six continuous weeks at the NIST Gaithersburg, Md., campus working side-by-side with NIST scientists. The teachers work on projects that combine research with direct applications tailored to developing, maintaining, advancing and enabling the Nation’s measurement system. The research projects are chosen for their relevance to the teachers’ interests and the NIST mission.

Applications to both programs are submitted by school districts, not individual teachers. For the NIST Summer Institute, public school districts and accredited private educational institutions in the United States and its territories may nominate no more than one teacher per middle school in a school district or institution. For example, if there are 20 middle schools within a district, the district may nominate up to 20 teachers. The RET program has slots for up to three eligible teachers.

Both programs give participants the opportunity to increase their understanding of the subjects they teach and how scientific research is performed. They also gain a network of scientists and engineers at NIST with whom to consult once they are back in the classroom.

Applications for both the NIST Summer Institute and RET program are due by 3 p.m. Eastern time on March 21, 2012.

More information and links to the applications can be found on the Summer Institute Web page at http://www.nist.gov/iaao/teachlearn/index.cfm.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


Economic Study Shows Value of Baldrige-Based Performance Excellence

The news in 2001 was impressive, but it’s even more emphatically evident a decade later: the Baldrige Performance Excellence Program (BPEP) significantly benefits the U.S. economy. That’s the finding from a new economic study* to determine the practical value to organizations using the Baldrige Criteria for Performance Excellence—the benefits of the program outweigh the overall cost by 820 to 1.

The new study by professors Albert N. Link of the University of North Carolina at Greensboro and John T. Scott of Dartmouth College follows up on a 2001 analysis by the same team examining the potential benefits versus costs of the Baldrige Performance Excellence Program (BPEP). The program is managed by the National Institute of Standards and Technology (NIST) in conjunction with the private sector.

In 2001, the duo estimated the total potential economic benefits of the Baldrige Program to the U.S. economy at nearly $25 billion and its total operational cost at $119 million, a benefit-to-cost ratio of 207 to 1.** The finding was derived using data from a Baldrige Criteria benefits survey of corporate members of the American Society for Quality (ASQ)—which showed an 18-to-1 benefit-to-cost ratio—and then extrapolating the results to the entire country based on the assumption that other companies in the economy used the Baldrige Criteria and benefited to the same extent as the firms responding to the survey.
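The 2001 ratio follows directly from the two totals. A quick check with the rounded figures quoted above (note that "nearly $25 billion" is itself rounded, which is why the result differs slightly from the report's 207):

```python
benefits = 25e9   # "nearly $25 billion" in estimated benefits (rounded)
cost = 119e6      # $119 million total operational cost
ratio = benefits / cost

# With these rounded inputs the ratio comes out near 210; the report's
# published 207-to-1 figure reflects the unrounded totals.
print(round(ratio))
```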

In their latest study, Link and Scott took a more direct approach, surveying the 273 Malcolm Baldrige National Quality Award applicants since 2006. They also expanded their assessment of the practical value of the Baldrige Criteria to these organizations on three levels—cost savings, customer satisfaction and financial gain (gains from increased value of sales). Link and Scott estimate that the benefits outweigh the overall cost of the BPEP by a ratio of 820 to 1.

In their report documenting the new study, Link and Scott explain that even this figure may be on the conservative side. “If the social costs were compared to the benefits for the economy as a whole, the benefit-to-cost ratio would be considerably higher,” they wrote.

The authors summarized their results by stating, “The Baldrige Performance Excellence Program, with the imprimatur of national leadership and a prominent national award … creates great value that could not be replicated by private-sector actions alone.”

The BPEP raises awareness about the importance of performance excellence in driving the U.S. and global economy; provides organizational assessment tools and criteria; educates leaders in businesses, schools, health care organizations, and government and nonprofit organizations about the practices of national role models; and recognizes them by honoring them with the only Presidential Award for performance excellence.

For more information on BPEP, go to www.nist.gov/baldrige.

* Albert N. Link and John T. Scott. NIST Planning Report 11-2: Economic Evaluation of the Baldrige Performance Excellence Program. December 2011. Available online at www.nist.gov/director/planning/upload/report11-2.pdf.

** Albert N. Link and John T. Scott. NIST Planning Report 01-3: Economic Evaluation of the Baldrige Performance Excellence Program. October 2001. Available online at www.nist.gov/director/planning/upload/report01-3.pdf.

Media Contact: Michael E. Newman, michael.newman@nist.gov, 301-975-3025


March Workshop to Support Trusted IDs in Cyberspace

The National Strategy for Trusted Identities in Cyberspace (NSTIC) National Program Office will host the 2012 NIST/NSTIC IDtrust Workshop “Technologies and Standards Enabling the Identity Ecosystem” on March 13 and 14, 2012, in Gaithersburg, Md.

Managed by the National Institute of Standards and Technology (NIST), NSTIC is a White House initiative to work collaboratively with the private sector, advocacy groups, public sector agencies and other organizations to improve the privacy, security and convenience of sensitive online transactions. The strategy envisions a set of interoperable technology standards and policies—an "Identity Ecosystem"—where individuals, organizations and underlying infrastructure—such as routers and servers—can be authoritatively authenticated.

The workshop will focus on how technologies and standards can help the framework of the Identity Ecosystem coalesce. As envisioned by the NSTIC, the Identity Ecosystem is a user-centric online environment—a set of technologies, policies and agreed upon standards—that securely supports transactions ranging from anonymous to fully authenticated and from low to high value.

The two-day workshop will feature plenary presentations and panel discussions by leading identity management and standards experts, addressing a broad swath of technology and standards issues important to identifying and implementing the four NSTIC Guiding Principles in the Identity Ecosystem, which hold that chosen standards and policies should be:

  • privacy-enhancing and voluntary,
  • secure and resilient,
  • interoperable, and
  • cost-effective and easy-to-use.


The workshop topics will include privacy management, trust models, usability, viable business models for an identity ecosystem, attributes and the results of the Internet Society’s mapping exercise, “The Global Identity Ecosystem.” The workshop also will feature a report on a December 2011 meeting at NIST on “Privacy-Enhancing Cryptography: Working with encrypted data without decrypting."

For more information on the workshop, go to www.nist.gov/itl/csd/ct/nstic_idtrust-2012.cfm.

Media Contact: Jennifer Huergo, jennifer.huergo@nist.gov, 301-975-6343


Call for Nominations for the National Medal of Technology and Innovation

The Department of Commerce’s United States Patent and Trademark Office (USPTO) is seeking nominations for the 2012 National Medal of Technology and Innovation. The medal is presented each year by the President of the United States and is this country’s highest award for technological achievement.

The medal is awarded annually to individuals, teams (up to four individuals), companies or divisions of companies for their outstanding contributions to America’s economic, environmental and social well-being. By highlighting the national importance of technological innovation, the medal also seeks to inspire future generations of Americans to prepare for, and pursue, technical careers to keep America at the forefront of global technology and economic leadership.

For more information about the National Medal of Technology and Innovation and detailed information about the requirements for submission of a nomination, see www.uspto.gov/about/nmti/index.jsp. All completed nominations must be submitted to the USPTO by 5 p.m. Eastern time, March 31, 2012. Please note, self-nominations will not be accepted.

Follow updates on the National Medal of Technology and Innovation on Twitter @NMTIMedal.

Media Contact: Richard Maulsby, richard.maulsby@uspto.gov, 571-272-8333
