Kaixin Zhang

kaixinzhang@pku.edu.cn · (+86) 198-0120-3004 · kaixinzhang.org · Leadlegend

Education

AI Turing Class, EECS College, Peking University - Beijing, China

Sep 2018 - Present

Bachelor of Engineering in Artificial Intelligence

Overall GPA - 3.67/4

Major GPA - 3.72/4

  • Honors:
    • Newbee Scholarship of Peking University - Dec 2018
    • School-level Scholarship of Peking University - Oct 2019

Research Interests

  • Natural Language Understanding (Classification, NER, EL, Knowledge Graph, etc.)
  • Natural Language Generation (QA, QG, Summarization, etc.)
  • Pretraining, self-supervised and semi-supervised learning

Research Experience

Development of Commonsense-based Question Generation Models - Beijing, China

Apr 2020 - Nov 2020

Research Assistant | Supervisor: Prof. Yunfang Wu, Institute of Computational Linguistics of PKU

  • Developed a sequence-to-sequence question generation model and designed a static graph attention mechanism that extracts external knowledge from a knowledge graph and encodes context-independent knowledge embeddings as linguistic features (see the sketch after this list).
  • Contributed to a multi-task learning QG project, assisting with the baseline implementation and dataset transfer (from SQuAD to RACE) in the ablation experiments; the resulting paper was accepted at ACL 2020.
  • Surveyed the development of pre-trained NLG methods (BERTSum, BART, ProphetNet, and PEGASUS), focusing on text summarization, and discussed feasible ways to introduce pretraining into the question generation task.
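A minimal sketch of the static graph attention idea above, assuming frozen TransE-style entity embeddings retrieved per token; the module and tensor names (StaticGraphAttention, neigh_emb, neigh_mask) are illustrative, not the project's actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StaticGraphAttention(nn.Module):
    """Context-independent attention over each token's retrieved KG neighbours.

    Scores depend only on the (frozen, TransE-style) entity embeddings, not on
    the sentence context, so the resulting knowledge vector can be precomputed
    and concatenated onto the word embedding as an extra linguistic feature.
    """
    def __init__(self, ent_dim: int):
        super().__init__()
        self.score = nn.Linear(ent_dim, 1)  # static scorer: entity only, no context

    def forward(self, neigh_emb: torch.Tensor, neigh_mask: torch.Tensor) -> torch.Tensor:
        # neigh_emb:  (batch, seq, k, ent_dim) -- k retrieved entities per token
        # neigh_mask: (batch, seq, k)          -- 1 for real neighbours, 0 for padding
        logits = self.score(neigh_emb).squeeze(-1)            # (batch, seq, k)
        logits = logits.masked_fill(neigh_mask == 0, -1e9)
        alpha = F.softmax(logits, dim=-1)                     # static attention weights
        return (alpha.unsqueeze(-1) * neigh_emb).sum(dim=2)   # (batch, seq, ent_dim)

# Usage: enc_input = torch.cat([word_emb, StaticGraphAttention(100)(ne, nm)], dim=-1)
```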

Review of Semi-supervised and Self-supervised Text Classification Methods - Beijing, China

Jan 2020 - Mar 2020

Member | Supervisor: Assistant Prof. Rui Yan, Wangxuan Institute of Computer Technology of PKU

  • Surveyed semi-supervised and self-supervised text classification methods published between 2018 and 2020, and discussed the mechanisms of self-supervised networks and directions for further modification.
  • Verified the validity of a noisy-label adversarial text classification method by reproducing the model in PyTorch.

Professional Experience

Tencent Co., Ltd - Beijing, China

Apr 2021 - Dec 2021

Research Intern | Department: TEG AI Lab & AI Platform Department

  • Participated in the maintenance of Topbase, a universal-domain knowledge graph; optimized the sampling strategy for constructing a parallel entity linking dataset to mitigate the influence of severe label bias.
  • Implemented a multi-GPU (DDP) gradient-gathering mechanism for a bi-encoder entity linking model, enabling larger effective batch sizes and more in-batch negative samples and thus improving model performance (first sketch below).
  • Transferred Self-Tuning, a contrastive learning paradigm, to domain-specific semi-supervised NER models, introducing Training Signal Annealing (second sketch below) and self-distillation to use unlabeled data more efficiently.
  • Implemented a keyword-to-text advertisement generation model based on Chinese GPT-2 and UER-py; improved keyword coverage and generation diversity via in-domain pretraining and Mention Flags, which encode each keyword's mention status into the model's multi-head attention (third sketch below).
  • The advertisement generation project above received the "Tencent Monthly Innovation Award" and was deployed in Tencent's online-reading advertising business.
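First, a minimal sketch of the cross-GPU gathering idea from the entity linking bullet, assuming standard PyTorch DistributedDataParallel training. Since torch.distributed.all_gather does not propagate gradients, the local shard is re-inserted so its gradient path survives; all function and variable names are illustrative:

```python
import torch
import torch.distributed as dist
import torch.nn.functional as F

def gather_with_grad(t: torch.Tensor) -> torch.Tensor:
    """all_gather a tensor from every rank, keeping autograd for the local shard."""
    gathered = [torch.zeros_like(t) for _ in range(dist.get_world_size())]
    dist.all_gather(gathered, t.contiguous())
    gathered[dist.get_rank()] = t  # re-insert the gradient-tracked local copy
    return torch.cat(gathered, dim=0)

def in_batch_el_loss(mention_emb: torch.Tensor, entity_emb: torch.Tensor) -> torch.Tensor:
    """Bi-encoder loss: each mention's gold entity is the positive, and every
    other gathered entity (world_size x batch of them) is an in-batch negative."""
    all_entities = gather_with_grad(entity_emb)            # (world*batch, dim)
    logits = mention_emb @ all_entities.t()                # (batch, world*batch)
    offset = dist.get_rank() * mention_emb.size(0)
    labels = torch.arange(mention_emb.size(0), device=logits.device) + offset
    return F.cross_entropy(logits, labels)
```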
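Second, Training Signal Annealing (as introduced in UDA) suppresses the supervised loss on examples the model already predicts above a confidence threshold that anneals toward 1, so easy labeled examples do not dominate early training. A sketch with a linear schedule and illustrative names:

```python
import torch
import torch.nn.functional as F

def tsa_loss(logits, labels, step, total_steps, num_classes):
    """Cross-entropy with Training Signal Annealing (linear schedule).

    Examples whose correct-class probability already exceeds a threshold
    annealing from 1/K up to 1 are masked out of the loss."""
    threshold = 1 / num_classes + (1 - 1 / num_classes) * (step / total_steps)
    probs = F.softmax(logits, dim=-1)
    correct_prob = probs.gather(1, labels.unsqueeze(1)).squeeze(1)
    mask = (correct_prob < threshold).float()   # keep only still-"hard" examples
    losses = F.cross_entropy(logits, labels, reduction="none")
    return (losses * mask).sum() / mask.sum().clamp(min=1.0)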
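Third, the Mention Flags paper injects flag embeddings into the attention's keys and values; the sketch below is a simplified variant that instead adds a learned per-status bias to the attention logits, which conveys the same steering idea. The names and the three-way status coding are assumptions, not the production code:

```python
import torch
import torch.nn as nn

class MentionFlagBias(nn.Module):
    """Simplified Mention Flags: bias attention by keyword mention status.

    flags[b, t, s] is 0 (not a keyword token), 1 (keyword not yet mentioned
    before decoding step t), or 2 (keyword already mentioned). Each status
    gets a learned per-head bias added to the attention logits, steering the
    decoder toward still-unmentioned keywords."""
    def __init__(self, num_heads: int, num_status: int = 3):
        super().__init__()
        self.bias = nn.Embedding(num_status, num_heads)

    def forward(self, attn_logits: torch.Tensor, flags: torch.Tensor) -> torch.Tensor:
        # attn_logits: (batch, heads, tgt_len, src_len)
        # flags:       (batch, tgt_len, src_len) integer status codes
        b = self.bias(flags).permute(0, 3, 1, 2)  # -> (batch, heads, tgt, src)
        return attn_logits + b
```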

Project Development

Commonsense-based Question Generation Model

Jul 2020 - Oct 2020

Leader | Supervisor: Prof. Yunfang Wu, Institute of Computational Linguistics of PKU

  • Implemented a standard sequence-to-sequence architecture with Bi-LSTMs and adopted fundamental QG mechanisms including the copy pointer and gated self-attention. (Repository Link; see the sketch after this list)
  • Designed a static graph attention mechanism to generate knowledge embeddings from a knowledge graph (embedded with TransE); these serve as an additional linguistic feature concatenated onto the word embeddings.
  • The method reached 17.47 BLEU-4 on SQuAD, a strong result among non-pretrained methods, and captures more implicit relations among knowledge entities according to human evaluation.
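A minimal sketch of the copy pointer mentioned in the first bullet, following the standard pointer-generator formulation (See et al., 2017): a learned gate p_gen mixes the decoder's vocabulary distribution with the attention distribution over source tokens, letting the model copy rare entities directly from the passage. All names are illustrative rather than the repository's actual code:

```python
import torch
import torch.nn as nn

class CopyPointer(nn.Module):
    """Pointer-generator style copy mechanism for seq-to-seq QG."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.gen_gate = nn.Linear(hidden_dim, 1)

    def forward(self, dec_state, vocab_probs, attn_weights, src_ids):
        # dec_state:    (batch, hidden)   decoder state at this step
        # vocab_probs:  (batch, vocab)    softmax over the generation vocabulary
        # attn_weights: (batch, src_len)  attention over source tokens
        # src_ids:      (batch, src_len)  source token ids (LongTensor)
        p_gen = torch.sigmoid(self.gen_gate(dec_state))       # (batch, 1)
        out = p_gen * vocab_probs
        # scatter-add copy probabilities onto the source tokens' vocab slots
        out = out.scatter_add(1, src_ids, (1 - p_gen) * attn_weights)
        return out  # final distribution over the vocabulary
```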

Skills

  • PyTorch, PyTorch Lightning, Hugging Face Transformers, Python, C++, TensorFlow, Assembly