My name is Peijie Dong (董佩杰). I am a Ph.D. student in Data Science and Analytics at the Hong Kong University of Science and Technology (Guangzhou), supervised by Prof. Xiaowen Chu. My research focuses on automated machine learning (AutoML), deep learning, and model compression.

Before that, I obtained my B.S. in Software Engineering from Northwest A&F University, where I was advised by Prof. Hongming Zhang and Assoc. Prof. Yaojun Geng and ranked first in my class with a GPA of 3.78/4.0. I then completed my Master of Engineering at the National University of Defense Technology under the supervision of Prof. Xin Niu and Prof. Zhiliang Tian.

Research Interests

My research focuses on enhancing the efficiency and accessibility of deep learning models, particularly in the following areas:

  • Model Compression: Exploring pruning, quantization, and knowledge distillation techniques to reduce model size and computational demands.
  • Efficient Large Language Models: Optimizing LLM training and inference through innovative architectures and deployment strategies.
  • Automated Machine Learning (AutoML): Developing methods to streamline the ML pipeline, from architecture search to hyperparameter optimization.

I aim to contribute to the development of more efficient and accessible ML systems. If you are interested in my research, please feel free to contact me.

🔥 News

  • [2024.05] 🎉🎉 Pruner-Zero is accepted by ICML 2024. This work evolves symbolic pruning metrics from scratch for large language models. (paper, code)

  • [2024.03] 🎉🎉 VMRNN is available. This work proposes the VMRNN cell, a new recurrent unit that integrates the strengths of Vision Mamba blocks with LSTM. We construct a network centered on VMRNN cells to tackle spatiotemporal prediction tasks effectively. (paper, code)

  • [2023.12] 🎉🎉 KD-Zero is accepted by NeurIPS 2023. This work evolves knowledge distillers for arbitrary teacher-student pairs. (paper)

  • [2023.10] 🎉🎉 EMQ is accepted by ICCV 2023. This work evolves training-free proxies for automated mixed-precision quantization. (paper, code)

  • [2023.10] 🎉🎉 AutoKD is accepted by ICCV 2023. This work proposes automated knowledge distillation via Monte Carlo Tree Search. (paper)

  • [2023.03] 🎉🎉 DisWOT is accepted by CVPR 2023. This work proposes student architecture search for distillation without training. (paper, code)

  • [2023.02] 🎉🎉 Progressive Meta-Pooling Learning is accepted by ICASSP 2023. This work proposes progressive meta-pooling learning for lightweight image classification models. (paper)

  • [2023.02] 🎉🎉 RD-NAS is accepted by ICASSP 2023. This work enhances the ranking ability of one-shot supernets via ranking distillation. (paper)

  • [2023.01] 🎉🎉 AutoRF is accepted by MMM 2023. This work automatically learns receptive fields with spatial pooling. (paper)

  • [2022.06] 🎉🎉 Prior-Guided One-shot NAS is accepted by the CVPR 2022 Workshop. This work proposes prior-guided one-shot neural architecture search. (paper)

📖 Education

  • 2023.09 - now, The Hong Kong University of Science and Technology (Guangzhou), Ph.D. Candidate in Data Science and Analytics

    • Supervisor: Prof. Xiaowen Chu
    • Research Interests: Large Language Models, Model Compression
  • 2020.09 - 2023.06, National University of Defense Technology, Master of Engineering

    • Supervisor: Prof. Xin Niu
    • Research Interests: AutoML, Neural Architecture Search
    • Achievement: Outstanding Graduate
  • 2016.09 - 2020.06, Northwest A&F University, B.S. in Software Engineering

    • GPA: 3.78/4.0 (Ranked 1st out of 93)
    • Advisor: Prof. Hongming Zhang
    • Achievements: National Scholarship, President’s Scholarship, Outstanding Graduate
    • Research Interests: Object Detection, Multi-Object Tracking

👔 Professional Activities

  • Invited Program Committee Member (Reviewer):

    • Machine Learning: NeurIPS’23,24, ICLR’24, CVPR’24, ECCV’24
    • Signal Processing: ICASSP’23,24
  • Invited Reviewer for Journals

    • Machine Learning:
      • IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)
      • Neural Networks
      • Information Fusion
    • Signal Processing:
      • IEEE Computational Intelligence Magazine (CIM)

🎖 Honors and Awards

  • 2023, Outstanding Graduate at School Level, National University of Defense Technology.
  • 2022, 1st Place, BDCI Retail Product Recognition based on MindSpore (CCF Big Data & Computing Intelligence Contest).
  • 2022, 1st Place, DCIC Intelligent Ship Detection Competition (Digital China Innovation Contest).
  • 2022, 2nd Place, DCIC Intelligent Cattle Segmentation Competition (Digital China Innovation Contest).
  • 2022, 1st Place, Baidu AI Competition - Blurred Document Image Recovery.
  • 2022, 3rd Place, Third CVPR Workshop on Neural Architecture Search (NAS).
  • 2021, Outstanding MindSpore Developer.
  • 2020, Outstanding Dissertation, Northwest A&F University.
  • 2020, Outstanding Graduate, Northwest A&F University.
  • 2017, President’s Scholarship, Northwest A&F University.
  • 2016, National Scholarship, Northwest A&F University.

πŸ“ Publications

  • P. Dong, L. Li, Z. Tang, X. Liu, X. Pan, Q. Wang, X. Chu. Pruner-Zero: Evolving Symbolic Pruning Metric From Scratch for Large Language Models. In ICML 2024.

  • P. Dong, L. Li, Z. Wei. DisWOT: Student Architecture Search for Distillation without Training. In CVPR 2023.

  • P. Dong, L. Li, Z. Wei, X. Niu*, Z. Tian, H. Pan. EMQ: Evolving Training-free Proxies for Automated Mixed Precision Quantization. In ICCV 2023.

  • L. Li, P. Dong, A. Li, Z. Wei, Y. Yang. KD-Zero: Evolving Knowledge Distiller for Any Teacher-Student Pairs. In NeurIPS 2023.

  • P. Dong, X. Niu, Z. Tian, et al. Progressive Meta-Pooling Learning for Lightweight Image Classification Model. In ICASSP 2023.

  • P. Dong, X. Niu, L. Li, et al. RD-NAS: Enhancing One-shot Supernet Ranking Ability via Ranking Distillation. In ICASSP 2023.

  • P. Dong, X. Niu, H. Pan, et al. AutoRF: Auto Learning Receptive Fields with Spatial Pooling. In MMM 2023.

  • P. Dong, X. Niu, L. Li, et al. Prior-Guided One-shot Neural Architecture Search. In CVPR Workshop 2022.

  • L. Li, P. Dong, Z. Wei, Y. Yang. Automated Knowledge Distillation via Monte Carlo Tree Search. In ICCV 2023.