Research

STREAM Lab

First-principles calculation

The foundation of our research is Density Functional Theory (DFT) and other first-principles methods, which solve quantum mechanical equations without relying on empirical parameters. These calculations deliver reliable, parameter-free predictions of electronic structure, thermodynamic stability, mechanical properties, and chemical reactivity directly from fundamental physical laws. Beyond providing ground-truth data, first-principles methods play an indispensable role in guiding and interpreting experiments, clarifying underlying mechanisms, validating hypotheses, and identifying promising material candidates before costly synthesis. In our lab, DFT serves both as a standalone research tool and as the primary source of high-quality training data for machine learning models.
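At the heart of every DFT calculation is the self-consistent field (SCF) cycle: the electron density determines an effective potential, which yields a new density, and the two are iterated to a fixed point. The sketch below shows only that fixed-point skeleton with simple linear mixing, using a toy one-variable map in place of a real Kohn-Sham solver; all names and numbers are illustrative.

```python
def scf_cycle(update, rho0, mixing=0.3, tol=1e-8, max_iter=200):
    """Damped fixed-point iteration, the skeleton of a DFT SCF loop.

    `update` maps a density to the density implied by the resulting
    effective potential; convergence means the density is self-consistent.
    """
    rho = rho0
    for step in range(max_iter):
        rho_new = update(rho)
        if abs(rho_new - rho) < tol:
            return rho_new, step
        rho = rho + mixing * (rho_new - rho)  # linear mixing for stability
    raise RuntimeError("SCF did not converge")

# Toy stand-in for the Kohn-Sham map: a contraction whose fixed point
# plays the role of the self-consistent density (purely illustrative).
rho_star, n_steps = scf_cycle(lambda r: 0.5 * r + 1.0, rho0=0.0)
print(rho_star)  # converges to ~2.0, the fixed point of r -> 0.5*r + 1
```

Linear mixing is the simplest stabilizer; production DFT codes use more sophisticated schemes (e.g. Pulay/DIIS mixing), but the loop structure is the same.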

Machine learning potential

While first-principles methods offer unmatched accuracy, their computational cost limits the system sizes and timescales they can access. Machine learning potentials (MLPs) overcome this bottleneck by learning the quantum-mechanical energy landscape from DFT data, reproducing first-principles accuracy at a fraction of the cost. Unlike traditional empirical potentials with fixed functional forms, MLPs are highly flexible and transferable across diverse chemical environments. In our lab, we develop, train, and benchmark MLPs as a core research activity, not merely as infrastructure, pushing the frontier of their applicability to complex reactive systems, alloys, and novel material classes.
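The fitting idea can be illustrated in miniature: regress a flexible energy expression against reference energies. Here an exact Lennard-Jones dimer curve stands in for the DFT training data, and a two-term linear model is fit by least squares via the normal equations. Real MLPs use learned descriptors with neural networks or kernels; everything below is a toy.

```python
# Reference energies standing in for DFT data: an exact Lennard-Jones
# dimer curve (epsilon = sigma = 1), i.e. E(r) = 4*(r^-12 - r^-6).
def reference_energy(r):
    return 4.0 * (r**-12 - r**-6)

distances = [0.95 + 0.05 * i for i in range(12)]
energies = [reference_energy(r) for r in distances]

# A "potential" linear in two basis functions, fit by least squares
# (normal equations for the 2x2 system, solved with Cramer's rule).
f1 = [r**-12 for r in distances]
f2 = [r**-6 for r in distances]
s11 = sum(a * a for a in f1)
s12 = sum(a * b for a, b in zip(f1, f2))
s22 = sum(b * b for b in f2)
t1 = sum(a * e for a, e in zip(f1, energies))
t2 = sum(b * e for b, e in zip(f2, energies))
det = s11 * s22 - s12 * s12
c1 = (t1 * s22 - t2 * s12) / det  # coefficient of r^-12
c2 = (s11 * t2 - s12 * t1) / det  # coefficient of r^-6
print(round(c1, 6), round(c2, 6))  # recovers 4.0 and -4.0
```

Because the reference data lie exactly in the model's span, the fit recovers the true coefficients; with noisy or out-of-span DFT data, the same machinery gives the best approximation in the chosen basis, which is why descriptor flexibility matters for MLPs.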

Multi-scale simulation

Materials phenomena span an enormous range of length and time scales, from atomic vibrations lasting femtoseconds to microstructural evolution unfolding over seconds or longer, a range no single simulation method can capture alone. Multi-scale simulation addresses this challenge through methodological advances that systematically bridge these gaps. At the atomistic level, quantum-accurate methods such as DFT and MLPs provide the energetic foundation, while advanced sampling techniques, kinetic Monte Carlo (kMC), and coarse-graining strategies extend the accessible time and length scales far beyond what conventional molecular dynamics can reach. By carefully connecting these levels of description, we investigate slow but critical processes, such as catalyst degradation, phase transitions, ion diffusion, and defect kinetics, under realistic conditions, building a truly predictive understanding of material behavior across the full hierarchy of scales.
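The timescale extension that kMC provides can be sketched concretely. The standard rejection-free algorithm picks an event with probability proportional to its rate and advances the clock by an exponentially distributed waiting time, so rare events are reached without resolving every fast vibration. The two-event toy system below is illustrative, not a real materials model.

```python
import math
import random

def kmc_step(rates, rng):
    """One rejection-free kinetic Monte Carlo step: choose an event with
    probability proportional to its rate, then draw the waiting time
    dt = -ln(u) / R_total from the exponential distribution."""
    total = sum(rates)
    pick = rng.random() * total
    acc = 0.0
    for event, rate in enumerate(rates):
        acc += rate
        if pick < acc:
            break
    dt = -math.log(rng.random()) / total
    return event, dt

# Toy system: a fast local hop (rate 1.0) competes with a rare escape
# (rate 1e-4). kMC jumps from event to event, so the ~10^4-fold
# timescale gap costs iterations, not femtosecond timesteps.
rng = random.Random(0)
t, steps = 0.0, 0
while True:
    event, dt = kmc_step([1.0, 1e-4], rng)
    t += dt
    steps += 1
    if event == 1:  # the rare escape finally fired
        break
print(steps, t)
```

In a real lattice kMC simulation the rate list is rebuilt after each step from the current configuration, typically with barriers computed from DFT or an MLP, but the event-selection and time-advance logic is exactly this.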

Generative AI for materials design

The ultimate goal of computational materials science is not just to understand existing materials, but to design entirely new ones. Generative AI approaches, including diffusion models, variational autoencoders, and large language models, learn the underlying distributions of known materials to propose novel compositions and structures with targeted properties. By combining these generators with AI-driven property prediction models that map structure directly to performance, we build an efficient, closed-loop materials design workflow: generate candidates, screen by predicted properties, and validate with simulation. This transforms the traditional trial-and-error paradigm into a directed, data-driven search, dramatically accelerating the discovery of materials for energy storage, catalysis, and functional applications.
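The generate-screen-validate loop can be sketched end to end with stand-ins for each stage: a generator proposing candidate "compositions", a cheap ML surrogate for screening, and a slower "simulation" reserved for the shortlist. Every function here is a hypothetical placeholder, not a real model.

```python
import random

rng = random.Random(42)

def generate_candidates(n):
    # Stand-in for a generative model sampling new compositions,
    # encoded here as 3-component fraction vectors.
    return [tuple(round(rng.random(), 3) for _ in range(3)) for _ in range(n)]

def predicted_property(x):
    # Stand-in for a fast ML surrogate mapping structure to property.
    return -(x[0] - 0.5) ** 2 - (x[1] - 0.5) ** 2 + x[2]

def simulate(x):
    # Stand-in for an expensive first-principles validation; here the
    # surrogate is assumed to be slightly optimistic.
    return predicted_property(x) - 0.01

candidates = generate_candidates(1000)                                    # generate
shortlist = sorted(candidates, key=predicted_property, reverse=True)[:5]  # screen
validated = [(x, simulate(x)) for x in shortlist]                         # validate
best, best_value = max(validated, key=lambda p: p[1])
print(best, best_value)
```

The economics of the loop come from the asymmetry: thousands of cheap surrogate evaluations buy a handful of expensive validations, and in practice the validated results are fed back to retrain both the generator and the surrogate.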