Chang Ma

Ph.D. student, The University of Hong Kong

cma [AT] cs.hku.hk

Bio

I am a second-year CS Ph.D. student in the HKUNLP Lab at The University of Hong Kong, fortunate to be co-advised by Lingpeng Kong and Tao Yu. I also work closely with Junxian He. Before that, I received my B.S. in Computer Science from Peking University, advised by Prof. Zhihong Deng. I have also been deeply inspired by my collaborators.

My research goal is to develop Autonomous, Efficient, and Theory-Grounded machine learning algorithms, with a particular focus on advancing scientific discovery.

  • For Autonomous Decision Making, I design agents powered by generative models, with an emphasis on improving the reasoning and planning abilities of foundation models toward research-level capabilities.
  • For Theory-Grounded Research, I focus on developing AI systems that are certifiably robust and provably optimal, ensuring reliability and efficiency in their decision-making processes.
  • For Efficient Scientific Discovery, I leverage natural language processing and generative agents to accelerate key areas of scientific progress, with a focus on drug discovery.
  • By bridging foundational machine learning research with real-world scientific applications, my work aims to create robust, reasoning-intensive AI systems capable of addressing critical challenges, from treating human diseases to uncovering the fundamental laws of nature.

    Research Interest
    #Agents   #AI4Science     #DrugDiscovery     #MachineLearningTheory    

    Publications

    Most recent publications on Google Scholar.
    * indicates equal contribution.

    Retrieved Sequence Augmentation for Protein Representation Learning

    Chang Ma, Haiteng Zhao, Lin Zheng, Jiayi Xin, Qintong Li, Lijun Wu, Zhihong Deng, Yang Lu, Qi Liu, Sheng Wang, Lingpeng Kong

    EMNLP 2024

    Benchmarking and Enhancing Large Language Models for Biological Pathway Reasoning

    Haiteng Zhao, Chang Ma, Lingpeng Kong, Zhi-Hong Deng

    Preprint

    GIMLET: A Unified Graph-Text Model for Instruction-Based Molecule Zero-Shot Learning

    Haiteng Zhao, Shengchao Liu, Chang Ma, Hannan Xu, Jie Fu, Zhi-Hong Deng, Lingpeng Kong, Qi Liu

    NeurIPS 2023

    TorchDrug: A Powerful and Flexible Machine Learning Platform for Drug Discovery

    Zhaocheng Zhu, Chence Shi, Zuobai Zhang, Shengchao Liu, Minghao Xu, Xinyu Yuan, Yangtian Zhang, Junkun Chen, Huiyu Cai, Jiarui Lu, Chang Ma, Runcheng Liu, Louis-Pascal Xhonneux, Meng Qu, Jian Tang

    Preprint

    PEER: A Comprehensive and Multi-Task Benchmark for Protein Sequence Understanding

    Minghao Xu, Zuobai Zhang, Jiarui Lu, Zhaocheng Zhu, Yangtian Zhang, Chang Ma, Runcheng Liu, Jian Tang

    NeurIPS 2022 Dataset and Benchmark Track

    Non-myopic Generation of Language Models for Reasoning and Planning

    Chang Ma, Haiteng Zhao, Junlei Zhang, Junxian He, Lingpeng Kong

    Preprint

    AgentBoard: An Analytical Evaluation Board of Multi-Turn LLM Agents

    Chang Ma*, Junlei Zhang*, Zhihao Zhu*, Cheng Yang*, Yujiu Yang, Yaohui Jin, Zhenzhong Lan, Lingpeng Kong, Junxian He

    NeurIPS 2024 Dataset and Benchmark Track (Oral)

    Empowering Large Language Model Agents through Action Learning

    Haiteng Zhao, Chang Ma, Guoyin Wang, Jing Su, Lingpeng Kong, Jingjing Xu, Zhi-Hong Deng, Hongxia Yang

    COLM 2024

    A Challenging Benchmark for Low-Resource Learning

    Yudong Wang*, Chang Ma*, Qingxiu Dong, Lingpeng Kong, Jingjing Xu

    ACL 2024 Findings

    KS-Lottery: Finding Certified Lottery Tickets for Multilingual Language Models

    Fei Yuan, Chang Ma, Shuai Yuan, Qiushi Sun, Lei Li

    Preprint

    Certified Robustness Against Natural Language Attacks by Causal Intervention

    Haiteng Zhao*, Chang Ma*, Xinshuai Dong*, Anh Tuan Luu, Zhi-Hong Deng, Hanwang Zhang

    ICML 2022

    Domain Adaptation via Mutual Information Maximization

    Haiteng Zhao, Chang Ma, Qinyu Chen, Zhihong Deng

    IJCAI 2022 Long Presentation

    Switch-GPT: an effective method for constrained text generation under few-shot settings

    Chang Ma*, Song Zhang*, Gehui Shen, Zhihong Deng

    AAAI 2021 Student Abstract

    Blogs

    Avoiding Mistakes and Fixing Mistakes -- The Two Tales of Planning

    A control-theoretic view of Large Language Model reasoning and planning. To plan successfully, an agent can adopt two strategies for handling mistakes: speculative planning vs. self-reflection...

    Vitæ

    Full Resume in PDF.

    • Microsoft Research AI4Science April-December 2023
      Research Intern
      Foundation Model Group
    • The University of Hong Kong Sep 2022 - Present
      Ph.D. Student
      CS, NLP
    • Microsoft Nov 2021 - June 2022
      Software Engineer Research Intern
      Bing Advertisement Group, NLP
    • MILA Jan-Oct 2021
      Research Intern
      DeepGraph Group, Computational Biology
    • Nanyang Technological University June-Nov 2021
      Research Intern
      NAIL Group, NLP
    • Peking University July 2019 - July 2022
      Research Assistant
      Sigma Lab, Department of Machine Intelligence
    • Peking University Sep 2018 - July 2022
      B.Sc. Student
      Machine Intelligence

    Miscellaneous

    I am a hiking enthusiast, and my favorite trail is Section 2 of the MacLehose Trail in Hong Kong. I played piano for 8 years as a child and have just started getting back into it. I am a keen fan of Bach and am currently learning the Goldberg Variations. I love two kinds of elegance -- beautiful music and elegant proofs.

    Thanks to Martin Saveski for the website template.