Accelerated material discovery using machine-learning potentials
Seungwu Han
Seoul National University

Recently, machine-learning (ML) approaches to developing interatomic potentials have been attracting considerable attention. In particular, the high-dimensional neural network potential (NNP) suggested by Behler and Parrinello has attracted wide interest, with applications demonstrated over various materials. In this presentation, we first introduce SIMPLE-NN, our in-house code for training and executing NNPs, and discuss its unique features such as GDF weighting, which significantly improves the stability of ML potentials during MD simulations. We further discuss the fundamental aspect of ML potentials that enables their transferability: we show that the ML potential is nothing but a manifestation of the O(N) method of DFT, realized in terms of atomic energies. As application examples, we present our recent results on the phase-change behavior of chalcogenides and the silicidation process in semiconductor fabrication. Furthermore, we show that ML potentials can serve as highly accurate surrogate models for exploring the large space of crystal structures, which enables finding the stable crystal structure of complicated multicomponent systems.
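The O(N) atomic-energy picture above can be made concrete with a minimal sketch of the Behler-Parrinello decomposition: the total energy is a sum of per-atom contributions, each predicted by the same small neural network from a fixed-length descriptor of that atom's local environment. The descriptor choice (simple radial symmetry functions), network size, and random parameters below are illustrative assumptions, not the SIMPLE-NN implementation.

```python
import numpy as np

def radial_symmetry_functions(positions, i, etas, cutoff=6.0):
    """Behler-style radial descriptors: G = sum_j exp(-eta * r_ij^2) * fc(r_ij)."""
    rij = np.linalg.norm(positions - positions[i], axis=1)
    rij = rij[(rij > 1e-8) & (rij < cutoff)]          # neighbors within cutoff, excluding self
    fc = 0.5 * (np.cos(np.pi * rij / cutoff) + 1.0)   # smooth cutoff function
    return np.array([np.sum(np.exp(-eta * rij**2) * fc) for eta in etas])

def atomic_energy(g, W1, b1, W2, b2):
    """One hidden tanh layer mapping a descriptor vector to an atomic energy."""
    return (W2 @ np.tanh(W1 @ g + b1) + b2)[0]

def total_energy(positions, etas, params):
    """O(N) total energy: sum of atomic energies from local environments."""
    return sum(atomic_energy(radial_symmetry_functions(positions, i, etas), *params)
               for i in range(len(positions)))

# Illustrative random network weights and a small random configuration.
rng = np.random.default_rng(0)
etas = np.array([0.5, 1.0, 2.0])
params = (rng.normal(size=(4, 3)), rng.normal(size=4),
          rng.normal(size=(1, 4)), rng.normal(size=1))
pos = rng.normal(size=(5, 3)) * 2.0
E = total_energy(pos, etas, params)
```

Because every atom is evaluated by the same network on a translation- and permutation-invariant descriptor, the predicted energy inherits those invariances by construction, which is the structural property that makes the atomic-energy ansatz transferable across configurations.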