Just a Standard Blog
What happens when atoms and electrons collide with each other or get irradiated by light, and how does knowing those answers help us? These are questions that have occupied most of my scientific career. The answers may not seem important to many people outside the field of atomic, molecular and optical physics (AMO), but they have an impact on many things, from nuclear fusion reactors down to the common fluorescent light bulb.
You may remember from high school chemistry that atoms have electrons whirling about the nucleus. These electrons can get excited by light or by other particles, raising them up to higher energies or stripping them away from the atom entirely, creating ions. The glow from a fluorescent bulb is caused by this kind of excitation. It turns out that if we can measure and understand the amount of energy it takes to cause these excitations in different atoms and molecules, we can use this information in all kinds of applications. Some are in fundamental physics, like determining how the first atoms started to form after the Big Bang. Others are practical, like overcoming problems in fusion reactor design, where highly energetic ions can significantly impair the performance of the reactor. It’s quite interesting, but as I discovered, gaining a detailed understanding of these processes can require some of the most powerful computers humanity has invented.
I received a Ph.D. in theoretical chemistry at the University of Chicago in 1968. That’s a long time ago, and there has been enormous scientific progress in the intervening 50-plus years. After a postdoc and a short stint in industry, I wound up at LASL — Los Alamos Scientific Laboratory, as it was known then, today called Los Alamos National Laboratory. I learned a lot over several years, and given that LASL had the most advanced computational infrastructure in the world, I took the opportunity to make computational AMO the main focus of my research efforts.
Researchers in computational AMO develop mathematical models of how atoms and molecules behave when they collide or are irradiated by light, and feed that model (or “code”) into a computer, which then calculates the outcomes — all without doing the actual physical experiment.
Making this my focus really seemed to me a no-brainer, as I had a table of riches in front of me that few others possessed. For one, I had access to some of the finest computers available at the time. I programmed on the early Control Data Corporation 6600 and 7600 supercomputers and then on the Cray-1, which arrived at LASL in 1976. The Cray-1 did not have an operating system, and it took almost two years before it was a fully functioning and generally productive machine, but what a machine it was. It delivered about 10⁸ floating-point operations per second (flops), had a million words of 64-bit memory (a lot more than the 7600), and ran about 10 times faster than the 7600. Computers have advanced since then, of course. A decent high-performance computing cluster today might run 100,000 times faster than the Cray did.
But we had problems sharing our work with the rest of the field. Our codes remained poorly documented (who had the time?) and certainly not widely distributed. To a significant extent, this is still the norm in AMO even today, though it is not the case for many other areas of theoretical physics such as particle and gravitational physics. The situation in AMO always troubled me, as I felt that if these quite capable codes were more user friendly and readily available, a lot more science would get done.
Fast-forward about 20 years, and I became a program director for theoretical AMO at the National Science Foundation (NSF), where I was heavily involved with the NSF supercomputer program and eventually the XSEDE (Extreme Science and Engineering Discovery Environment) project. During that time, I learned more about high-performance computing and also about science gateways. A gateway is a website or portal that enables a user to access these state-of-the-art codes and use them on remote computational platforms to study scientific problems. I realized that these gateways, which were being used in other areas, could make a sea change in the world of AMO by allowing many more scientists to have access to our state-of-the-art AMO codes.
I got a chance to put my preaching into practice when I left the NSF in 2014 to take a position at NIST. By that time, it had become obvious that our ability to gather incredible amounts of data from experimental sources, computation and sensors presented tremendous opportunities as well as challenges to all areas of science.
However, not much had changed in the world of computational AMO. Most of the codes were still homebrewed, lacked documentation and saw limited distribution. This inhibited scientific progress and, as an awful side effect, meant that most of the codes wound up in the software trash heap when students or postdocs who developed them left. It was a terrible waste of human resources that often led to reinventing the wheel. I hoped that I could help catalyze a real cultural change, and a workshop that I helped organize at Harvard in May of 2018 started the ball rolling. The workshop led to a few critical groups deciding to band together and develop an AMO sciences (AMOS) gateway where users could access a host of codes and run them from a web interface on NSF supercomputer systems.
We partnered with experts at Indiana University and were able to get six advanced codes up and running on what is now known as the AMOS Gateway. Now even less-sophisticated users can go to the gateway and do real calculations without knowing how to download and build the code, which in itself can be a deal-breaker for inexperienced users. The result is a democratization of the field of AMO, where the line between the haves and the have-nots is blurred.
The gateway now has about 300 registered users, runs codes for knowledgeable users worldwide, and also serves as an educational tool for students eager to explore the many practical ways computational AMO can help them understand how atomic and molecular quantum systems behave.
The AMOS Gateway has also had an impact on problems central to the NIST mission. NIST has been providing the scientific community with data on the energy levels, spectral properties and electron scattering cross sections of atoms and some molecules for many years. For many atomic targets, experimental results are unavailable, not only due to the limited number of scientists performing the experiments but also because many of these systems cannot be easily prepared in the laboratory. With the advent of new and powerful computers, especially those with graphics processing units, this has changed.
In addition, basic theory has progressed to the point where a number of the atomic computer codes can claim, with some authority, to produce more reliable data than many experiments. One example is a set of AMOS Gateway calculations on atomic indium, a candidate to replace mercury in streetlamps that is notoriously difficult to study experimentally. The gateway calculations are needed to model the gas discharge. It was quite pleasing to see that the experimental and gateway approaches gave quantitatively the same results, bolstering confidence that the data generated are accurate and consistent with the limited data available in the NIST database.
Finally, artificial intelligence, machine learning and deep learning are now being used to effectively extract information from existing data and to make predictions about the outcomes of measurements on unknown systems. Coupled with simulation, these become powerful new tools that provide the foundations for a new form of digital metrology. The combination of advances in computer science and computational AMO now allows both advanced and novice users to perform realistic simulations that enhance our understanding of how electrons and light interact with atomic and molecular systems.
There is still a lot of space to explore, but the good news is that we are learning every day how to improve these digital approaches to metrology. A future in which no measurement is out of reach, whether performed experimentally or computationally, because it is impossible or too costly could be on the horizon.