Supercomputer Simulations Yield Method for Predicting Behavior of New Concrete Formulas
Just because concrete is the most widely used building material in human history doesn’t mean it can’t be improved. A recent study conducted by researchers from the National Institute of Standards and Technology (NIST), the University of Strasbourg and Sika Corporation, using Department of Energy (DOE) Office of Science supercomputers, has led to a new way to predict concrete’s flow properties from simple measurements.
Concrete begins as a thick pasty fluid containing innumerable particles in suspension that can, ideally, flow into a space of nearly any shape, where it hardens into a durable, rock-like state. Its initial flexibility combined with its eventual strength has made it the material of choice for building everything from the ancient Roman Colosseum to the foundations of countless modern bridges and skyscrapers.
But concrete is not without its problems. For example, when concrete is pumped, it can jam in pipes, leading to time and cost overruns during construction. The particles can settle out, leading to structural problems after the concrete hardens. And a significant amount of energy is needed to create the cement that reacts with water to produce hardened concrete. This critical binding agent is manufactured at high temperatures in a kiln, a process that generates a great deal of carbon dioxide, a greenhouse gas. According to the World Business Council for Sustainable Development, worldwide cement manufacture is estimated to account for at least 5 percent of humanity’s carbon dioxide emissions.
The industry can develop less energy-intensive concrete mixtures by replacing some of the cement with alternative materials like fly ash. However, these alternatives can require expensive chemical additives, and they also can have a range of effects on concrete flow. Ideally, the industry would like to tailor the use of these chemical additives, thus helping to ensure the greatest possible use of alternative materials.
“We’d like to be able to design concrete that performs better on the job and doesn’t demand so much energy to manufacture,” says NIST computer scientist William George. “But what should we make it from? And what can we replace cement with? The answers will affect its properties. So we realized we needed to learn more about how suspensions work.”
While it’s a simple goal to describe, accomplishing it demanded complex math and physics, as well as an enormous amount of computer power to study how all the particles and fluid interact as they are mixed. The NIST team was granted an INCITE Award that provided more than 110 million core hours at the Argonne Leadership Computing Facility. The ALCF supercomputers allowed them to simulate how a suspension would change if one or more parameters varied—the number of suspended particles, for example, or their size.
Suspensions have a remarkable property: Plotting two parameters—viscosity vs. shear rate (the latter refers to how neighboring layers of the fluid change velocity as the fluid flows through a pipe)—always generates the same-shaped curve as plotting them for the suspending fluid alone, without added particles. This is true no matter what fluid is used. The curve just sits at a different location on the plot, as though someone had pushed it upward or off to the side without otherwise altering its shape.
What the team unexpectedly found was that the amount the curves had to be shifted could be predicted from the microscopic shear rates that existed between neighboring particles. Experiments at the University of Strasbourg confirmed the simulated results, which allowed the team to come up with a general theory of suspensions’ properties.
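The shifted-curve idea can be sketched numerically. In this toy model, the fluid viscosity function and the shift factors A and B are illustrative values, not numbers from the study: the suspension’s viscosity curve is a copy of the suspending fluid’s curve shifted in log-log space, and rescaling both axes collapses it back onto the fluid’s curve.

```python
import numpy as np

# Carreau-style model for the suspending fluid's viscosity (Pa·s)
# as a function of shear rate (1/s). Parameters are illustrative.
def fluid_viscosity(shear_rate, eta0=10.0, lam=0.5, n=0.6):
    return eta0 * (1.0 + (lam * shear_rate) ** 2) ** ((n - 1.0) / 2.0)

# Hypothetical suspension curve: same shape, shifted in log-log space.
# A scales the viscosity up (more particles -> thicker mix); B rescales
# the shear-rate axis (particles amplify the local shear rates).
A, B = 8.0, 3.0
def suspension_viscosity(shear_rate):
    return A * fluid_viscosity(B * shear_rate)

shear = np.logspace(-2, 3, 50)   # shear rates from 0.01 to 1000 1/s

# Rescaling both axes collapses the suspension curve back onto the
# pure-fluid curve -- the "shifted, not reshaped" property.
collapsed = suspension_viscosity(shear / B) / A
assert np.allclose(collapsed, fluid_viscosity(shear))
```

In this picture, measuring the suspending fluid’s curve plus the two shift factors is enough to predict the suspension’s flow at any shear rate, which is the practical payoff George describes.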
“So now if you have a suspension that is made with a fluid that behaves a bit differently, you can still predict what its properties will be,” George says. “You just have to measure the properties of the fluid that the particles are placed in, and you predict how the fresh concrete will behave.”
The results should help accelerate the design of a new generation of high-performance and eco-friendly cement-based materials by reducing time and costs associated with R&D, George adds.
NIST is also using this new knowledge to create Standard Reference Materials for industrial researchers to calibrate concrete rheometers—instruments used to measure the flow of complex fluids—for material development. Ultimately, this could help expand the use of alternative materials. While it is not yet known whether these alternatives will fit the bill, the team’s research could eventually help industry researchers zero in on the best new recipes.
*M. Liard, N.S. Martys, W.L. George, D. Lootens and P. Hebraud. Scaling laws for the flow of generalized Newtonian suspensions. Journal of Rheology, 58, 1993 (Nov/Dec 2014 issue), doi:10.1122/1.4896896.
Media Contact: Chad Boutin, email@example.com, 301-975-4261
Liquids and Glasses Relax, Too. But Not Like You Thought.
A new insight into the fundamental mechanics of the movement of molecules recently published* by researchers at the National Institute of Standards and Technology (NIST) offers a surprising view of what happens when you pour a liquid out of a cup. More important, it provides a theoretical foundation for a molecular-level process that must be controlled to ensure the stability of important protein-based drugs at room temperature.
Proteins depend critically on their three-dimensional structure, the shape the long and complex molecules tend to fold into. Modern protein-based drugs—for example, vaccines or antibodies created to fight cancers—generally are not stable at room temperature or in the liquid formulations most convenient for clinical use. To preserve them for use in parts of the world without reliable refrigeration, manufacturers freeze-dry the proteins and coat the complex molecules with glassy sugars to keep their structure intact. "It's like a lollipop," observes NIST biochemist Marcus Cicerone, "but these lollipops are only 10 microns or smaller."
The challenge is to design the sugar coating to get the maximum shelf life for a given pharmaceutical protein, which ideally would be measured in years. The issue revolves around what chemists refer to as "relaxation"—broadly, any molecular motion that leads to transport of the molecule. About 10 years ago, NIST researchers discovered a testing shortcut.** Using neutron radiation, they found that measuring tiny molecular movements in the proteins at very short timescales—picoseconds***—could reliably predict the long-term stability of a formulation. The sugars that worked the best were the ones that suppressed the tiny, rapid motions. Exactly why this was so was not particularly clear, but it worked.
This new paper finally explains the underlying principles. The neutron experiments, says Cicerone, measure mean square displacement. "Imagine a jarful of molecules. It's how far the average molecule jiggles around for a given timescale," he says. "In condensed matter like a liquid or glass, we usually think that all the molecules are identical, and on the average they all have the same environment with a little bit of space for them to jiggle, but not very much."
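The mean square displacement Cicerone describes can be illustrated with a toy ensemble of freely diffusing molecules (a hypothetical random walk, not the neutron data): average the squared distance traveled over the whole "jarful" at each timescale.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ensemble: 1,000 "molecules" taking small random steps in 3-D.
n_steps, n_molecules = 200, 1000
steps = rng.normal(scale=0.01, size=(n_steps, n_molecules, 3))
positions = np.cumsum(steps, axis=0)   # each molecule's trajectory from the origin

# Mean square displacement at each timescale: the average over the
# jarful of molecules of the squared distance from the starting point.
msd = np.mean(np.sum(positions ** 2, axis=2), axis=1)

# For free diffusion the MSD grows linearly with time, so doubling
# the timescale should roughly double the MSD.
print(msd[199] / msd[99])   # roughly 2 for this diffusive toy model
```

In a real glass the averaging hides the structure the paper uncovers: most molecules contribute almost nothing to the MSD while a small hopping population contributes large jumps.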
"What we found is that picture is not really right."
In reality, Cicerone says, there are two different environments the molecules can be in. "There is one environment like that—molecules are very well packed and on a picosecond timescale they move maybe one percent of their radius. They're hardly moving at all. But there's another environment where some molecules can move maybe 30 percent of their radius in the same time. They're really making big jumps, and in glasses, those big jumps are essentially the only way that molecules can move around. Everybody else is completely stuck.
"It's kind of like a 15 puzzle. You can only move one at a time."
What happens is a molecule next to a region that's more loosely packed can move there, and does. Then one that was next to it suddenly has room to move, and does, and so on. On a picosecond and nanometer scale of time and space, when you pour a liquid out of a cup, it doesn't really all come out at once. It's more follow-the-leader.
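The follow-the-leader picture can be sketched as a one-dimensional cousin of the 15 puzzle (an illustrative toy model, not the paper's simulation): molecules sit on lattice sites and can move only by hopping into the single empty site, so all rearrangement is carried by the wandering vacancy.

```python
import random

random.seed(1)

# 1-D lattice analogue of the 15 puzzle: every site holds a molecule
# except one vacancy (None). A molecule moves only by stepping into
# the vacancy next to it, so motion propagates follow-the-leader style.
lattice = list(range(1, 10)) + [None]   # 9 molecules plus one vacancy
random.shuffle(lattice)

def hop(lattice):
    """Move a random neighbor of the vacancy into the vacancy."""
    v = lattice.index(None)
    neighbors = [i for i in (v - 1, v + 1) if 0 <= i < len(lattice)]
    m = random.choice(neighbors)
    lattice[v], lattice[m] = lattice[m], lattice[v]   # the jump

for _ in range(1000):
    hop(lattice)

print(lattice)   # molecules rearranged, still exactly one vacancy
```

Between visits from the vacancy, every molecule is "completely stuck," just as in the tightly packed environment Cicerone describes; the vacancy's wandering is the only channel for transport.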
On a practical level, says Cicerone, the results explain why the short timescale mean displacement measurements can predict the results of molecular degradation measurements that would normally take months. "It gives a really good solid understanding of why these picosecond and nanosecond timescale measurements correlate with degradation processes in glass for the proteins," he says, "so it gives us confidence that the techniques we build that are based on this idea will be robust and people will be able to use them."
As a bonus, he says, the model also explains a somewhat arcane degradation process in glasses called Johari-Goldstein relaxation. "It's the timescale for the switching between the tightly packed and loosely packed regions. It's the vacancy in the game of 15 moving around," says Cicerone.
* M.T. Cicerone, Q. Zhong and M. Tyagi. Picosecond dynamic heterogeneity, hopping and Johari-Goldstein relaxation in glass-forming liquids. Physical Review Letters 113, 117801 (2014).
** See the 2004 article, "Keeping Drugs Stable Without Refrigeration," at www.nist.gov/mml/msed/drugs_061604.cfm, and the 2008 article, "Candy-Coating Keeps Proteins Sweet," at www.nist.gov/mml/msed/sugar_081908.cfm.
***0.000 000 000 001 second
Media Contact: Michael Baum, firstname.lastname@example.org, 301-975-2763
Symposium to Focus on Future of Voting Systems
The Election Assistance Commission (EAC) and the National Institute of Standards and Technology (NIST) are sponsoring a two-day symposium to explore emerging trends in voting. The symposium will take place Feb. 9 and 10, 2015, at the Department of Commerce’s Herbert C. Hoover Building in Washington, D.C.
The symposium will bring together election officials, academics and representatives of voting system manufacturers, voting system test laboratories, standards development organizations, and federal, state and local government.
“Our goal is to foster an inclusive and informative conversation about trends in voting affected by technology, as well as how people interact with that technology,” said Mary Brady, who manages NIST’s role in supporting the EAC. The 2002 Help America Vote Act established the Technical Guidelines Development Committee, which is chaired by NIST, and directed the institute to assist the commission with the development of voluntary voting system guidelines.
Acting Under Secretary of Commerce for Standards and Technology and Acting Director of NIST Willie May is scheduled to provide opening remarks at the symposium, along with EAC Commissioners Tom Hicks, Matthew Masterson and Christy McCormick. Tammy Patrick, senior advisor to the Bipartisan Policy Center’s Democracy Project, will deliver the keynote address during Monday’s sessions.
The first day of the symposium will explore trends in voting systems, including the people, processes and technology. The second day will cover ongoing activities in interoperable systems and include a series of breakout sessions that will engage participants in identifying forward-looking technologies across a wide variety of voting topics, such as usability, accessibility, auditing and testing.
The symposium is free, but all attendees must preregister. Full details on the event can be found on the NIST website.
Media Contact: Jennifer Huergo, email@example.com, 301-975-6343
NIST Meeting: Cybersecurity Is a Key Ingredient In the Manufacturing Mix
The National Institute of Standards and Technology (NIST) is bringing experts together to discuss the cybersecurity challenges faced by the rapidly developing field of direct digital manufacturing (DDM) and to discuss methods for improvement. The Feb. 3, 2015, meeting at the NIST Gaithersburg campus will inform NIST’s future efforts in the area.
DDM uses computer-controlled processes to streamline manufacturing by cutting out time-consuming and costly steps such as developing precise molds or cutting dies. Additive manufacturing and 3-D printing are among the best-known examples of DDM and are used to create physical objects directly from digital files. Today the technology is used to create a variety of products including bone replacements, airplane parts and even action figures using one’s very own image. In the future, it may be possible to “print” virtually anything you could want, from food to complex electronic components.
NIST conducts research in both additive manufacturing and cybersecurity. This meeting is aimed at researchers in both areas, as well as stakeholders who use or build DDM machines and software. The goals are to develop a greater understanding of cybersecurity risks in DDM; to discover how cybersecurity needs for DDM technologies differ from those of traditional industrial control systems or cyber-physical machines; to understand the approaches and solutions that stakeholders such as software developers, product designers and manufacturers are using; and to identify areas where NIST can assist in building security into this developing technology rather than adding it on after the fact.
The symposium will explore cybersecurity needs for DDM, including ensuring the protection of intellectual property and the integrity of printers, elements being printed and design data. Speakers from industry, academia and government will discuss the industry’s current state, cybersecurity risks and solutions, and implications for information and communications technology supply chain risk management. Experts will share lessons learned, and participants will assist in identifying specific vulnerabilities and presenting possible solutions or ways forward.
To register and find full details for “Cybersecurity for Direct Digital Manufacturing,” please see the event page.
Media Contact: Evelyn Brown, firstname.lastname@example.org, 301-975-5661