Research

STREAM Lab

Machine learning potential

Machine learning potentials (MLPs) are computational models that predict the energies, forces, and other properties of atomic systems with accuracy comparable to quantum mechanical calculations, but at a fraction of the computational cost. Unlike traditional empirical potentials, which rely on predefined functional forms, MLPs are trained on extensive datasets generated by high-level quantum mechanical methods such as density functional theory (DFT).
Building an MLP begins with generating high-quality reference data from quantum mechanical calculations, typically atomic positions together with the corresponding energies and forces. This dataset is then used to train a machine learning model, such as a neural network or a Gaussian process regressor, so that it learns the underlying patterns in atomic interactions.
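As a toy illustration of this fitting step, here is a minimal sketch using ridge regression on synthetic descriptor/energy pairs; the data and all variable names are hypothetical stand-ins for real DFT reference data and the far more expressive models used in practice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a DFT dataset: each row is a descriptor vector
# for an atomic environment; the targets are reference energies.
X = rng.normal(size=(200, 8))                  # 200 environments, 8 features
true_w = rng.normal(size=8)                    # hidden "true" relationship
y = X @ true_w + 0.01 * rng.normal(size=200)   # "DFT" energies + noise

# Ridge regression: a linear stand-in for the neural network / GP fit.
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(8), X.T @ y)

# The fitted model now predicts energies for unseen environments.
X_new = rng.normal(size=(5, 8))
E_pred = X_new @ w
print(E_pred.shape)  # (5,)
```

Real MLPs replace the linear map with a flexible model and use symmetry-respecting descriptors of the atomic environment, but the train-then-predict structure is the same.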
Once the training phase is completed, these machine learning potentials can quickly and accurately predict atomic interactions, enabling simulations of larger systems and longer timescales that were previously computationally impractical using purely quantum mechanical methods. Due to their speed and precision, MLPs are widely applied in materials science for studying new materials and alloys, in chemistry to explore reaction mechanisms and catalysis, in drug discovery to investigate molecular interactions, and in physics to model phase transitions and material defects.
Lately, I've been particularly interested in utilizing machine learning potentials for multiscale simulations, specifically by extending the accessible timescale through kinetic Monte Carlo (kMC) methods. Additionally, I'm exploring combining these potentials with generative models to facilitate efficient materials discovery.
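To make the kMC idea concrete, here is a minimal sketch of one rejection-free kMC (Gillespie) step; the event rates are hypothetical numbers that would, in practice, come from barriers computed with the machine learning potential (e.g. via harmonic transition state theory):

```python
import math
import random

random.seed(0)

# Hypothetical rates (1/s) for three diffusion hops; in a real workflow
# each rate would be nu * exp(-barrier / kT) with the barrier from the MLP.
rates = [2.0e3, 5.0e2, 1.0e2]

def kmc_step(rates):
    """One rejection-free kinetic Monte Carlo step.

    Returns the index of the chosen event and the elapsed time.
    """
    total = sum(rates)
    # Pick an event with probability proportional to its rate.
    r = random.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            event = i
            break
    # Advance the clock by an exponentially distributed waiting time.
    dt = -math.log(random.random()) / total
    return event, dt

t = 0.0
for _ in range(1000):
    event, dt = kmc_step(rates)
    t += dt
print(f"simulated time after 1000 steps: {t:.3e} s")
```

Because the clock advances by the (stochastic) waiting time between rare events rather than by femtosecond MD steps, kMC reaches timescales far beyond what direct simulation allows.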

Material property prediction

AI-driven material property prediction uses flexible machine learning models that capture complex, nonlinear relationships between material structures and their properties. Unlike traditional empirical or linear models, these methods can represent intricate patterns and subtle interactions within the data, leading to more reliable predictions of material performance.
These approaches also extend naturally to optimization tasks, efficiently exploring vast compositional and structural spaces to identify promising candidates. This scalability, combined with strong predictive capability, substantially accelerates the material design and discovery process.
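A minimal sketch of this predict-then-screen loop, assuming synthetic composition features and a random-feature ridge model as a cheap stand-in for the neural networks used in practice (all data and names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: composition features -> measured property,
# with hidden nonlinear structure the model must capture.
def property_fn(X):
    return np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2] - X[:, 3] ** 2

X_train = rng.uniform(size=(300, 5))
y_train = property_fn(X_train) + 0.05 * rng.normal(size=300)

# A tiny nonlinear model: random cosine features + ridge regression.
W = rng.normal(size=(5, 200))
b = rng.uniform(0, 2 * np.pi, size=200)
phi = lambda X: np.cos(X @ W + b)

A = phi(X_train)
w = np.linalg.solve(A.T @ A + 1e-2 * np.eye(200), A.T @ y_train)

# "Optimization" as screening: score a large candidate pool and
# keep the composition the model predicts to perform best.
candidates = rng.uniform(size=(10000, 5))
scores = phi(candidates) @ w
best = candidates[np.argmax(scores)]
print("predicted best property:", scores.max())
```

Scoring ten thousand candidates here takes milliseconds, which is the point: once trained, the surrogate model makes exhaustive exploration of large design spaces affordable.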

First-principles calculation

First-principles calculation, also known as ab initio calculation, refers to computational methods that directly solve quantum mechanical equations without relying on empirical parameters or fitting to experimental data. Typically based on density functional theory (DFT), these calculations predict material properties, such as electronic structures, stability, mechanical properties, and chemical reactivity, entirely from fundamental physical principles.
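To illustrate the "no empirical parameters" idea on a toy scale, the sketch below solves a quantum mechanical eigenvalue problem numerically: the 1D harmonic oscillator (in units hbar = m = omega = 1), whose exact energies are E_n = n + 1/2. Real DFT codes solve vastly larger self-consistent problems, but the spirit of computing properties directly from the governing equations is the same:

```python
import numpy as np

# Discretize the 1D Schroedinger equation on a grid and diagonalize.
n = 1000
x = np.linspace(-8, 8, n)
dx = x[1] - x[0]

# Finite-difference Hamiltonian: kinetic term -1/2 d^2/dx^2 plus
# the harmonic potential 1/2 x^2 on the grid diagonal.
main = 1.0 / dx**2 + 0.5 * x**2
off = -0.5 / dx**2 * np.ones(n - 1)
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)[:3]
print(E)  # ~ [0.5, 1.5, 2.5], matching the analytic spectrum
```

No quantity in this calculation is fitted to experiment; the spectrum falls out of the equations and the discretization alone, which is exactly what makes first-principles methods predictive for systems where no experimental data exists yet.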
The key benefit of first-principles methods for experiments and material design lies in their predictive accuracy and reliability. By providing detailed insights into electronic interactions and atomic configurations, they help researchers identify and understand underlying mechanisms governing material behavior. Consequently, first-principles calculations can guide experimental efforts by predicting promising materials, clarifying experimental observations, and reducing unnecessary trials. In material design, these methods enable researchers to explore novel structures and compositions systematically, ultimately accelerating innovation and facilitating targeted development of advanced materials.