Faculty Sponsor: Francis W. Starr
Phase-change materials such as Ge2Sb2Te5 (GST) are ideal candidates for next-generation, non-volatile, solid-state memory because they can retain binary data in their amorphous and crystalline phases and transition rapidly between these phases to write or erase information. Consequently, there is wide interest in using molecular modeling to study GST. Unfortunately, simulations on the length and time scales needed to capture the phase-transition behavior are challenging because of the computational expense of existing models, namely Density Functional Theory (DFT) and a Gaussian Approximation Potential (GAP). Here we present a machine-learned (ML) potential, implemented in the Atomic Cluster Expansion (ACE) framework, that achieves accuracy comparable to these existing models while running three orders of magnitude faster. We train the ACE-ML potential using a recently introduced indirect learning approach: instead of being trained directly on DFT energies and forces, the potential is trained on data generated with an intermediate ML potential, which allows us to use a significantly larger training set.
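To make the indirect-learning idea concrete, the minimal Python/ASE sketch below shows one way such a training set could be assembled: an intermediate potential drives molecular dynamics, and each sampled configuration is stored with that potential's energy and force labels for a later ACE fit. This is a hypothetical illustration, not the workflow used in this work; the Lennard-Jones calculator stands in for the actual GAP potential, and the structure, temperature, and file name are assumed for the example.

```python
# Hypothetical sketch of the indirect-learning data step: sample configurations
# with an intermediate potential and label them with its energies and forces,
# building a large training set for a subsequent ACE fit.
from ase.build import bulk
from ase.calculators.lj import LennardJones          # placeholder for the intermediate GAP potential
from ase.md.langevin import Langevin
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution
from ase.io import write
from ase import units

# Illustrative structure; a real GST run would start from a Ge/Sb/Te supercell.
atoms = bulk("Ge", "diamond", a=5.66).repeat((3, 3, 3))
atoms.calc = LennardJones(sigma=2.5, epsilon=0.5)     # stand-in intermediate potential

MaxwellBoltzmannDistribution(atoms, temperature_K=600)
dyn = Langevin(atoms, timestep=2.0 * units.fs, temperature_K=600, friction=0.02)

def label_and_store():
    """Store the current configuration with intermediate-potential labels."""
    atoms.get_potential_energy()   # cache the energy label
    atoms.get_forces()             # cache the force labels
    write("ace_training_set.xyz", atoms, append=True)  # extended-XYZ training file

dyn.attach(label_and_store, interval=50)  # sample every 50 MD steps
dyn.run(5000)  # the labeled configurations are then passed to an ACE fitting code
```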
The substantial speedup of the ACE model over these alternatives allows us to study the transitions between the crystalline and amorphous phases thoroughly with only modest computational resources. Specifically, we apply the potential to three problems relevant to GST’s applications: the melting temperature of the crystalline phase, the nucleation and growth of crystal grains in heated amorphous systems, and the dependence of the dielectric function on temperature and phase. In this poster, we present complete methods for each of these projects, as well as preliminary results.
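As one illustration of how the first of these problems might be approached (not necessarily the method used in this work), a crystal can be heated in temperature steps while the potential energy per atom is monitored; the discontinuous jump that accompanies melting brackets the melting temperature. The sketch below reuses the placeholder Lennard-Jones potential and illustrative parameters from above.

```python
# Hypothetical sketch: bracket a melting temperature by heating the crystal in
# steps and watching for the jump in potential energy per atom that accompanies
# melting. All temperatures, step counts, and the potential are placeholders.
import numpy as np
from ase.build import bulk
from ase.calculators.lj import LennardJones
from ase.md.langevin import Langevin
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution
from ase import units

atoms = bulk("Ge", "diamond", a=5.66).repeat((3, 3, 3))
atoms.calc = LennardJones(sigma=2.5, epsilon=0.5)
MaxwellBoltzmannDistribution(atoms, temperature_K=300)

for temp in range(300, 1100, 100):  # illustrative heating schedule
    dyn = Langevin(atoms, timestep=2.0 * units.fs, temperature_K=temp, friction=0.02)
    dyn.run(2000)                   # equilibrate at this temperature

    energies = []
    dyn.attach(lambda: energies.append(atoms.get_potential_energy() / len(atoms)),
               interval=20)
    dyn.run(2000)                   # production sampling
    print(f"T = {temp} K, <U>/N = {np.mean(energies):.4f} eV")
    # A discontinuous jump in <U>/N between consecutive steps brackets melting.
```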