On July 26, 2024, NIST released an Initial Public Draft of Managing Misuse Risk for Dual-Use Foundation Models. Public comment on this document will be accepted through September 9, 2024, at 11:59 PM Eastern Time. To provide feedback, please email NISTAI800-1 [at] nist.gov. Comments may also be submitted via regulations.gov under docket number 240802-0209.
Earlier Draft Reports for Comment
On April 29, 2024, NIST released four draft publications intended to help improve the safety, security, and trustworthiness of AI systems in support of President Biden's Executive Order. Comments were due by June 2, 2024.
NIST issued a formal Request for Information (RFI) to assist in carrying out several of its responsibilities under Executive Order 14110 on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (EO 14110). Comments were welcomed through February 2, 2024.
NIST held a virtual workshop on Secure Development Practices for AI Models on January 17, 2024. This workshop supported the EO 14110 task for NIST to develop a companion resource to the NIST Secure Software Development Framework (SSDF). A livestream of the workshop can be viewed on NIST's website.
NIST requested feedback on the Initial Public Draft of the Guidelines for Evaluating Differential Privacy Guarantees. This publication, which fulfills one of NIST’s assignments under EO 14110, is intended to help agencies and practitioners of all backgrounds understand how to evaluate promises made (and not made) when deploying differential privacy, including for privacy-preserving machine learning. Comments were due by January 25, 2024.
NIST relies heavily on its engagement with the private and public sectors in developing its AI-related guidance and will continue to follow that practice in carrying out these EO assignments. That engagement will include:
NIST will use its general AI email list to distribute information about specific opportunities to engage. Sign up for AI email alerts on the NIST website.
For EO-related questions, email ai-inquiries [at] nist.gov.