Hi there! I'm Zhe Wang, a first-year M.S. student at the University of Illinois at Urbana-Champaign, majoring in Computer Science. I am advised by Prof. Lingming Zhang.
Before coming to UIUC, I completed my undergraduate studies at Tsinghua University, majoring in Mathematics and Physics.
Research Interests
My research interests lie at the intersection of Artificial Intelligence and Software Engineering.
") does not match the recommended repository name for your site ("
").
", so that your site can be accessed directly at "http://
".
However, if the current repository name is intended, you can ignore this message by removing "{% include widgets/debug_repo_name.html %}
" in index.html
.
",
which does not match the baseurl
("
") configured in _config.yml
.
baseurl
in _config.yml
to "
".
Kunlun Zhu†, Hongyi Du†, Zhaochen Hong†, Xiaocheng Yang†, Shuyi Guo†, Zhe Wang†, Zhenhailong Wang, Cheng Qian, Xiangru Tang, Heng Ji, Jiaxuan You († core contributors)
ACL 2025
In this paper, we introduce MultiAgentBench, a comprehensive benchmark designed to evaluate LLM-based multi-agent systems across diverse, interactive scenarios. Our framework measures not only task completion but also the quality of collaboration and competition using novel, milestone-based key performance indicators.
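To give a flavor of the milestone-based evaluation idea, here is a minimal illustrative sketch: it scores a multi-agent run by the fraction of predefined milestones the agents reach, rather than by final task completion alone. The function and milestone names below are invented for illustration and are not the benchmark's actual API.

```python
# Hypothetical milestone-based KPI in the spirit of MultiAgentBench:
# credit partial progress by counting which predefined milestones a
# multi-agent run reached. Names here are illustrative, not the real API.

def milestone_kpi(reached: set[str], milestones: list[str]) -> float:
    """Return the fraction of required milestones the agents reached."""
    return sum(m in reached for m in milestones) / len(milestones)

# Example: a collaborative writing scenario with four milestones.
milestones = ["form_team", "agree_on_plan", "draft_sections", "merge_final"]
reached = {"form_team", "agree_on_plan", "draft_sections"}
print(milestone_kpi(reached, milestones))  # 0.75
```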
Yuxiang Wei, Zhe Wang, Jiawei Liu, Yifeng Ding, Lingming Zhang
ICML 2024
In this paper, we introduce Magicoder, a series of fully open-source (code, weights, and data) Large Language Models (LLMs) for code that significantly closes the gap with top code models while having no more than 7B parameters.
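Since the Magicoder weights are open, they can be run locally with Hugging Face transformers. Below is a minimal sketch; the checkpoint name and the plain-text prompt format are assumptions for illustration, so check the official release for the exact model IDs and prompt template.

```python
# Minimal sketch of running a Magicoder model via Hugging Face transformers.
# The checkpoint name below is an assumption; consult the official release
# for the exact model ID and recommended prompt template.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="ise-uiuc/Magicoder-S-DS-6.7B",  # assumed checkpoint name
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```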