In chemistry, computational models may be getting worse

John Timmer
The electron density of a buckminsterfullerene molecule, calculated using density functional theory.
Scientific research teams often use a computational model to understand their system better. In many fields, building these computational models is a full-time job, one that provides people with long careers. These models can require a sophisticated understanding of physics, chemistry, and biology, and they often require careful and informed tradeoffs between accuracy and computational speed.
This is especially true in chemistry, where the electrons that make everything work are governed by the quantum mechanical wave function. Solving that wave function exactly for anything but the simplest atoms is impossible, yet we often want to understand what electrons are up to in complex bulk materials.
Over the last few decades, researchers have built a variety of algorithms meant to produce an approximate result, typically based on concepts collectively termed "density functional theory." But researchers have now shown that the most recent generations of these algorithms have become increasingly biased: they continue to get better at estimating the energy of the electrons, but they may be getting worse at getting the electrons' distribution right. Ironically, the problem may be an increased reliance on empirical data for developing the software.
Density functional theory shows up a lot in chemistry and materials science work because it scales from single atoms up to complex materials. It's used to estimate the behavior of electrons in these materials, which can be critical to understanding everything from their basic chemistry to their suitability for various electronics applications. The model provides a way to get at this data that doesn't face the impossibility of calculating the wave functions of all the electrons involved.
At the heart of density functional theory—the theory portion of things—is the idea that there's a mathematical function that can relate the distribution of the electron density in a material to its energy. If you can minimize this function, you can produce the ground state of the system, getting both the energy and electron density, which tells you a lot about its chemical, physical, and electronic properties.
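In the standard Hohenberg–Kohn formulation that underlies the field, this idea can be written compactly (a textbook statement, not anything specific to the new study):

```latex
E[n] = F[n] + \int v(\mathbf{r})\, n(\mathbf{r})\, d^3r,
\qquad
E_0 = \min_{n} E[n]
```

Here $n(\mathbf{r})$ is the electron density, $v(\mathbf{r})$ is the external potential set by the nuclei, and $F[n]$ is a universal functional containing the kinetic and electron–electron contributions. The density $n_0$ that achieves the minimum is the ground-state density, and the minimum value $E_0$ is the ground-state energy.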
There's just one small catch: we have no idea what this function is.
The applied science of density functional theory has been finding ways to approximate the function by balancing the computational intensity of the task against adding increasingly realistic physics to the electrons' behavior. The assumption is that the more things you get right in the algorithm, the better it will reflect the actual behavior of a system.
A Russian-US team of researchers decided to test whether this assumption was correct. So they gathered 128 different algorithms and set them loose on a series of simple atoms and ions (things like a fluorine ion that has lost five electrons). These are simple enough that a different computational approach can provide a nearly exact solution before the Sun expands to swallow the Earth. The idea was to compare how well the different density functional theory algorithms approximated the relatively exact solutions.
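A benchmark like this ultimately comes down to computing error metrics against the near-exact reference. The sketch below is purely illustrative — the grid, the toy densities, and the error metric are all invented stand-ins, not the authors' actual procedure:

```python
import numpy as np

# Radial grid (in bohr) and spherical integration weights 4*pi*r^2*dr
r = np.linspace(0.01, 10, 1000)
w = 4 * np.pi * r**2 * np.gradient(r)

def density_error(n_approx, n_exact):
    """Normalized integrated absolute error between two radial densities."""
    return np.sum(w * np.abs(n_approx - n_exact)) / np.sum(w * n_exact)

# Toy "exact" density (hydrogen-like 1s orbital) and a slightly
# perturbed "approximate" density standing in for a DFT result.
n_exact = np.exp(-2.0 * r) / np.pi
n_approx = 1.05 * np.exp(-2.1 * r) / np.pi

err = density_error(n_approx, n_exact)
```

Repeating this for each of the 128 functionals, on each atom or ion, gives the kind of ranking the study reports — and lets energy errors and density errors be tracked separately.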
For a while, there was a clear trend: as the algorithms got more sophisticated over time, they did a better job of describing the system properly. Both the energy and electron density kept getting closer to the values produced by the more exact approach. But things changed shortly after 2000. After that point, the energies estimated by these algorithms kept getting better. In contrast, the estimated electron densities actually started getting worse.
Why might this be taking place? Ironically, it's more of a problem in algorithms that are based on empirical data. Rather than calculating everything based on physical principles, algorithms can replace some of the calculations with values or simple functions based on measurements of real systems (an approach called parameterization). The reliance on this approach, however, seems to do bad things to the electron density values it produces. "Functionals constructed with little or no empiricism," the authors write, "tend to produce more accurate electron densities than highly empirical ones."
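In spirit, parameterization means tuning a constant in a model energy expression so it reproduces measured values, rather than deriving it from first principles. A toy illustration (every number and name here is invented for the example):

```python
import numpy as np

# A toy model energy E = a * descriptor, where "descriptor" stands in
# for some computed quantity and "a" is the empirical parameter.
descriptor = np.array([1.0, 2.0, 3.0, 4.0])        # computed per system
reference_E = np.array([-0.7, -1.5, -2.1, -2.9])   # "measured" energies

# Least-squares fit of the single empirical parameter a
a = np.dot(descriptor, reference_E) / np.dot(descriptor, descriptor)
print(round(a, 3))  # -0.72
```

Fitting this way can make the predicted energies track experiment closely even when the underlying physics in the model is wrong — which is one plausible reading of how energies kept improving while densities drifted.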
On some level, this is clearly moving away from the ideas at the foundation of the entire field: that you need to get both the energy and the density distribution closer to the ground state.
But does it matter? In an accompanying perspective, Sharon Hammes-Schiffer of the University of Illinois at Urbana-Champaign agrees that we should try to do better with both values. But she also notes that these issues should be verified with more complex systems, such as entire molecules. She also suggests that the current algorithms may still be fine for a number of uses. "For applications in chemistry, biology, and physics, relative energies and geometries are often of primary interest," she writes. "If the electron density does not affect these properties and is not of direct interest itself, then perhaps the inaccurate electron density is irrelevant."
Of course, researchers who have relied on these algorithms will want to step back and evaluate whether their system might be one where electron density does matter.