Zhe Wang
Graduate Student, University of Illinois at Urbana-Champaign
Undergraduate Student, Tsinghua University

Hi there! I'm Zhe Wang, a first-year M.S. student at the University of Illinois at Urbana-Champaign, majoring in Computer Science. I am advised by Prof. Lingming Zhang.

Before coming to UIUC, I completed my undergraduate studies at Tsinghua University, majoring in Mathematics and Physics.

Research Interests

My research interests lie at the intersection of Artificial Intelligence and Software Engineering. More specifically:

  • LLMs for Code: developing LLMs for software engineering tasks through post-training on synthetic data
  • Trustworthy LLMs: enhancing the trustworthiness, resilience, and reliability of helpful-only LLMs against vulnerable code and malicious cyberactivity
  • LLM Applications: empowering LLMs with reasoning, planning, and collaboration capabilities through alignment training and agent-based systems
Curriculum Vitae

Education
  • University of Illinois at Urbana-Champaign
    M.S. in Computer Science
    Aug. 2024 - Present
  • Tsinghua University
    B.S. in Mathematics and Physics
    Sep. 2020 - Jul. 2024
  • Shanghai High School
    High School Diploma
    Sep. 2017 - Jun. 2020
Selected Publications
Magicoder: Empowering Code Generation with OSS-Instruct

Yuxiang Wei, Zhe Wang, Jiawei Liu, Yifeng Ding, Lingming Zhang

ICML 2024

We introduce Magicoder, a series of fully open-source (code, weights, and data) Large Language Models (LLMs) for code that significantly closes the gap with top code models while having no more than 7B parameters.

All publications