Ziming Liu

School of Computing, National University of Singapore.

liuziming[AT]comp[DOT]nus[DOT]edu[DOT]sg

AI Research Lab 1, COM 3, School of Computing

11 Research Link

Singapore, 119391

Hi, I am a second-year CS Ph.D. student at NUS, supervised by Prof. Yang You, and a member of the HPC-AI Lab. I received my bachelor’s degree in computer science and engineering from Peking University in 2020.

My research interests are machine learning systems and high-performance computing. I have been working on pipeline parallelism for deep learning training, and I am currently digging into long-sequence training for LLMs. I am open to collaborations and research internship opportunities, so please feel free to reach out if you are interested in my research.

In my spare time, I enjoy watching anime, playing rhythm games (especially maimai), and playing board games. I also play the piano and once played keyboard in an amateur band.

Check out more about me on Google Scholar, GitHub, and Twitter!

news

Mar 17, 2024 I am attending Nvidia GTC 2024 in San Jose! Looking forward to meeting new friends!
Feb 26, 2024 Our open-source project OpenDiT has been released! Check it out at https://github.com/NUS-HPC-AI-Lab/OpenDiT !
Nov 15, 2023 I presented Hanayo at SC ‘23! Glad to meet so many friends at the conference!
Jun 16, 2023 Our paper Hanayo was accepted to SC ‘23!
Mar 4, 2023 I earned the 暁将 plate in maimai DX ;)

selected publications

  1. Hanayo: Harnessing Wave-like Pipeline Parallelism for Enhanced Large Model Training Efficiency
    Ziming Liu, Shenggan Cheng, Haotian Zhou, and Yang You
    In SC ’23, Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis, 2023
  2. HeteGen: Heterogeneous Parallel Inference for Large Language Models on Resource-Constrained Devices
    Xuanlei Zhao, Bin Jia, Haotian Zhou, Ziming Liu, Shenggan Cheng, and Yang You
    2024
  3. EnergonAI: An Inference System for 10-100 Billion Parameter Transformer Models
    Jiangsu Du, Ziming Liu, Jiarui Fang, Shenggui Li, Yongbin Li, Yutong Lu, and 1 more author
    2022
  4. ATP: Adaptive Tensor Parallelism for Foundation Models
    Shenggan Cheng, Ziming Liu, Jiangsu Du, and Yang You
    2023
  5. DSP: Dynamic Sequence Parallelism for Multi-Dimensional Transformers
    Xuanlei Zhao, Shenggan Cheng, Zangwei Zheng, Zheming Yang, Ziming Liu, and Yang You
    2024