Profile

Tongqi Wen

Research Assistant Professor
Ph.D. in Materials Science and Engineering
Department of Mechanical Engineering
The University of Hong Kong

Research Interests

AI for Science · Machine Learning Potentials · Atomistic Simulations · High-entropy Materials · Liquid and Glass

Introduction

Hello! I'm Tongqi Wen, a lively and driven researcher, passionate about making meaningful contributions to the exciting fields of Artificial Intelligence and Materials Science. Currently, I'm a Research Assistant Professor in the Department of Mechanical Engineering at the University of Hong Kong (HKU). My journey in science has been full of exciting opportunities and collaborations, and I'm always looking forward to the next challenge.

My research combines machine learning with atomistic simulations to explore fascinating materials problems, including high-entropy alloys, glasses, and defect properties. I'm driven by the belief that technology and innovation can create breakthroughs that shape a brighter future, and I'm always excited to push the boundaries of what we know. Let's explore, discover, and innovate together!

Before joining HKU, I had the honor of conducting research at renowned institutions like Ames National Laboratory, Iowa State University (USA), and City University of Hong Kong. Along the way, I was humbled to receive the Ross Coffin Purdy Award from the American Ceramic Society in 2021.

Recent News

View all news

Selected Papers

View all papers
Inverse Materials Design by Large Language Model-Assisted Generative Framework

Yun Hao, Che Fan, Beilin Ye, Wenhao Lu, Zhen Lu, Peilin Zhao, Zhifeng Gao*, Qingyao Wu*, Yanhui Liu*, Tongqi Wen*

February 25, 2025

arXiv
AI4S · Materials Inverse Design

🏗 How can we efficiently mine high-quality knowledge from the literature and apply it to the discovery of new materials?

Deep generative models hold great promise for inverse materials design, yet their efficiency and accuracy remain constrained by data scarcity and model architecture. Here, we introduce AlloyGAN, a closed-loop framework that integrates Large Language Model (LLM)-assisted text mining with Conditional Generative Adversarial Networks (CGANs) to enhance data diversity and improve inverse design. Taking alloy discovery as a case study, AlloyGAN systematically refines material candidates through iterative screening and experimental validation. For metallic glasses, the framework predicts thermodynamic properties with discrepancies of less than 8% from experiments, demonstrating its robustness. By bridging generative AI with domain knowledge and validation workflows, AlloyGAN offers a scalable approach to accelerate the discovery of materials with tailored properties, paving the way for broader applications in materials science.
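The closed-loop idea in the abstract (generate candidates, screen them, keep the best, and feed the survivors back as training data) can be caricatured in a few lines of Python. Everything below is an illustrative stand-in, not the AlloyGAN code: the "generator" is a random perturbation rather than a CGAN, and the screening score is a toy.

```python
import random

def generate_candidates(seed_data, n=8):
    """Stand-in for the CGAN generator: randomly perturb known compositions (toy)."""
    return [{k: max(0.0, v + random.uniform(-2, 2)) for k, v in random.choice(seed_data).items()}
            for _ in range(n)]

def screen(candidate):
    """Stand-in for property screening; here a toy score favouring Zr-rich compositions."""
    return candidate.get("Zr", 0.0)

def closed_loop(seed_data, rounds=3, keep=3):
    """Iteratively generate, screen, and fold the best candidates back into the data."""
    data = list(seed_data)
    for _ in range(rounds):
        candidates = generate_candidates(data)
        best = sorted(candidates, key=screen, reverse=True)[:keep]
        data.extend(best)  # retained candidates augment the "training" set
    return data

seed = [{"Zr": 60.0, "Cu": 25.0, "Al": 15.0}]
print(len(closed_loop(seed)))  # 10: seed plus 3 rounds x 3 retained candidates
```

In the real framework the retained candidates would additionally pass through experimental validation before re-entering the training set.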

Active Learning for Conditional Inverse Design with Crystal Generation and Foundation Atomic Models

Zhuoyuan Li, Siyu Liu, Beilin Ye, David J. Srolovitz*, Tongqi Wen*

February 24, 2025

arXiv
AI4S · Materials Inverse Design · Atomic Models

⚛️ Can AI automatically help us discover new materials with given properties?

Artificial intelligence (AI) is transforming materials science, enabling both theoretical advancements and accelerated materials discovery. Recent progress in crystal generation models, which design crystal structures for targeted properties, and foundation atomic models (FAMs), which capture interatomic interactions across the periodic table, has significantly improved inverse materials design. However, an efficient integration of these two approaches remains an open challenge. Here, we present an active learning framework that combines crystal generation models and foundation atomic models to enhance the accuracy and efficiency of inverse design. As a case study, we employ Con-CDVAE to generate candidate crystal structures and MACE-MP-0 FAM as one of the high-throughput screeners for bulk modulus evaluation. Through iterative active learning, we demonstrate that Con-CDVAE progressively improves its accuracy in generating crystals with target properties, highlighting the effectiveness of a property-driven fine-tuning process. Our framework is general to accommodate different crystal generation and foundation atomic models, and establishes a scalable approach for AI-driven materials discovery. By bridging generative modeling with atomic-scale simulations, this work paves the way for more accurate and efficient inverse materials design.
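A minimal sketch of the iterative active-learning loop described above, with toy stand-ins for both components (the real framework uses Con-CDVAE as the generator and MACE-MP-0 as a screener; every name and number below is illustrative only):

```python
import random

class ToyGenerator:
    """Stand-in for a conditional crystal generator; 'spread' caricatures model accuracy."""
    def __init__(self, spread=30.0):
        self.spread = spread
    def sample(self, target, n=16):
        # Each "structure" is reduced to the property value it would exhibit
        return [target + random.gauss(0.0, self.spread) for _ in range(n)]
    def fine_tune(self):
        # Property-driven fine-tuning, caricatured as tightening around the target
        self.spread *= 0.5

def screen(structure, target, tol=5.0):
    """Stand-in for a foundation-atomic-model screener (e.g. a bulk-modulus estimate)."""
    return abs(structure - target) < tol

def active_learning(target, rounds=4):
    gen = ToyGenerator()
    for _ in range(rounds):
        hits = [s for s in gen.sample(target) if screen(s, target)]
        if hits:  # only fine-tune when screening confirms on-target candidates
            gen.fine_tune()
    return gen.spread

print(active_learning(100.0))  # spread shrinks as the generator is fine-tuned
```

The key design point mirrored here is the feedback direction: the screener never generates, and the generator never evaluates; accuracy improves only through the loop between them.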

A Multi-agent Framework for Materials Laws Discovery

Bo Hu, Siyu Liu, Beilin Ye, Yun Hao, Tongqi Wen*

November 25, 2024

arXiv
AI4S · Materials Laws

🚀 Does AI possess the intelligence to autonomously discover materials laws?

This paper introduces a multi-agent framework based on large language models (LLMs) specifically designed for symbolic regression in materials science. The framework was applied to derive an interpretable formula for the glass-forming ability of metallic glasses, achieving a correlation coefficient of up to 0.948 with low formula complexity.
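To make the symbolic-regression setting concrete, here is a toy version of the scoring step: candidate formulas (two classic glass-forming-ability criteria, the reduced glass transition temperature Trg and the gamma parameter) are ranked by the absolute Pearson correlation of their predictions with measured data. The alloy numbers are invented for illustration; only the correlation machinery mirrors what such a framework must do.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: (Tg, Tx, Tl) thermal characteristics and a measured GFA indicator
alloys = [(600, 680, 900), (620, 700, 950), (580, 640, 880), (640, 730, 980)]
gfa = [0.42, 0.40, 0.38, 0.43]

# Candidate formulas an LLM agent might propose and evaluate
candidates = {
    "Trg = Tg/Tl": lambda tg, tx, tl: tg / tl,
    "gamma = Tx/(Tg+Tl)": lambda tg, tx, tl: tx / (tg + tl),
}
scores = {name: abs(pearson([f(*a) for a in alloys], gfa)) for name, f in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

In the multi-agent framework, proposing, simplifying, and critiquing candidate formulas are delegated to separate LLM agents; the scoring loop above is the quantitative anchor they iterate against.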

Large Language Models for Material Property Predictions: elastic constant tensor prediction and materials design

Siyu Liu, Tongqi Wen*, Beilin Ye, Zhuoyuan Li, David J. Srolovitz*

November 19, 2024

arXiv
AI4S · Materials Properties

🧀 Can the materials knowledge stored in LLMs help us predict material properties?

Efficient and accurate prediction of material properties is critical for advancing materials design and applications. The rapid evolution of large language models (LLMs) presents a new opportunity for material property predictions, complementing experimental measurements and multi-scale computational methods. We focus on the elastic constant tensor as a case study and develop domain-specific LLMs for predicting elastic constants and for materials discovery. The proposed ElaTBot LLM enables simultaneous prediction of elastic constant tensors, bulk modulus at finite temperatures, and the generation of new materials with targeted properties. Moreover, the capabilities of ElaTBot are further enhanced by integrating with general LLMs (GPT-4o) and Retrieval-Augmented Generation (RAG) for prediction. A specialized variant, ElaTBot-DFT, designed for 0 K elastic constant tensor prediction, reduces the prediction errors by 33.1% compared with domain-specific materials science LLMs (Darwin) trained on the same dataset. This natural language-based approach lowers the barriers to computational materials science and highlights the broader potential of LLMs for material property predictions and inverse design.
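For context on what a predicted elastic constant tensor provides: the Voigt-average bulk modulus follows directly from the 6x6 matrix as K_V = [(C11 + C22 + C33) + 2(C12 + C13 + C23)] / 9. A quick sketch using approximate literature elastic constants for copper (illustrative inputs, not ElaTBot output):

```python
def voigt_bulk_modulus(C):
    """Voigt-average bulk modulus (GPa) from a 6x6 elastic constant matrix C (GPa)."""
    return ((C[0][0] + C[1][1] + C[2][2]) + 2.0 * (C[0][1] + C[0][2] + C[1][2])) / 9.0

# Cubic crystal: only C11, C12, C44 are independent; approximate values for Cu (GPa)
C11, C12, C44 = 168.0, 121.0, 75.0
C = [[0.0] * 6 for _ in range(6)]
for i in range(3):
    C[i][i] = C11
    C[3 + i][3 + i] = C44
for i, j in [(0, 1), (0, 2), (1, 2)]:
    C[i][j] = C[j][i] = C12

print(round(voigt_bulk_modulus(C), 1))  # 136.7 GPa for these inputs
```

A single predicted tensor thus yields several derived engineering quantities, which is why the tensor, rather than any one modulus, is the natural prediction target.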


Join Us

We are looking for passionate and motivated individuals to join our team! If you are interested in contributing to cutting-edge research in AI and Materials Science, feel free to reach out.