Yuxuan Wang (王宇轩)

Ph.D., Zhejiang Lab | Email: hitalexwang@gmail.com

RESEARCH INTERESTS
Computational linguistics, natural language processing, dependency parsing, cross-lingual learning.

EDUCATION
Ph.D., Harbin Institute of Technology, 2016.9 - 2021.11
Supervisor: Wanxiang Che
Major: Computer Science

Visiting Student, University of Edinburgh, 2019.9 - 2020.9
Supervisors: Ivan Titov and Shay Cohen
Major: Computer Science

B.E., Harbin Institute of Technology, 2012.9 - 2016.6
Major: Computer Science

PUBLICATIONS
Yuxuan Wang, Zhilin Lei and Wanxiang Che. 2021. Character-Level Syntax Infusion in Pre-Trained Models for Chinese Semantic Role Labeling. In International Journal of Machine Learning and Cybernetics. (https://doi.org/10.1007/s13042-021-01397-3)

Yuxuan Wang, Wanxiang Che, Ivan Titov, Shay B. Cohen, Zhilin Lei and Ting Liu. 2021. A Closer Look into the Robustness of Neural Dependency Parsers Using Better Adversarial Examples. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (Findings of ACL 2021).

Xiaoming Shi, Sendong Zhao, Yuxuan Wang, Xi Chen, Ziheng Zhang, Yefeng Zheng, Yutai Hou and Wanxiang Che. 2021. Understanding Patient Query with Weak Supervision from Doctor Response. In IEEE Journal of Biomedical and Health Informatics. (https://doi.org/10.1109/jbhi.2021.3133667)

Yuxuan Wang, Yutai Hou, Wanxiang Che and Ting Liu. 2020. From Static to Dynamic Word Representations: A Survey. In International Journal of Machine Learning and Cybernetics. (https://doi.org/10.1007/s13042-020-01069-8)

Yuxuan Wang, Wanxiang Che, Jiang Guo, Yijia Liu and Ting Liu. 2019. Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2019).

Wanxiang Che, Longxu Dou, Yang Xu, Yuxuan Wang, Yijia Liu and Ting Liu. 2019. HIT-SCIR at MRP 2019: A Unified Pipeline for Meaning Representation Parsing via Efficient Training and Effective Encoding. In Proceedings of the Shared Task on Cross-Framework Meaning Representation Parsing at CoNLL 2019.

Yijia Liu, Wanxiang Che, Yuxuan Wang, Bo Zheng, Bing Qin and Ting Liu. 2019. Deep Contextualized Word Embeddings for Universal Dependency Parsing. In ACM Transactions on Asian and Low-Resource Language Information Processing (TALLIP).

Wanxiang Che, Yijia Liu, Yuxuan Wang, Bo Zheng and Ting Liu. 2018. Towards Better UD Parsing: Deep Contextualized Word Embeddings, Ensemble, and Treebank Concatenation. In Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies.

Yuxuan Wang, Wanxiang Che, Jiang Guo and Ting Liu. 2018. A Neural Transition-Based Approach for Semantic Dependency Graph Parsing. In Proceedings of the 32nd AAAI Conference on Artificial Intelligence (AAAI 2018).

Wanxiang Che, Jiang Guo, Yuxuan Wang, Bo Zheng, Huaipeng Zhao, Yang Liu, Dechuan Teng and Ting Liu. 2017. The HIT-SCIR System for End-to-End Parsing of Universal Dependencies. In Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies.

Yuxuan Wang, Jiang Guo, Wanxiang Che and Ting Liu. 2016. Transition-based Chinese Semantic Dependency Graph Parsing. In Proceedings of the 15th China National Conference on Computational Linguistics and the 4th International Symposium on Natural Language Processing based on Naturally Annotated Big Data (CCL 2016, Best Paper Award).

PROJECTS
LTP (Language Technology Platform)
LTP is a software package that provides a Chinese natural language processing pipeline along with a web service API.
  • one of the developers of LTP.
  • developed the semantic dependency graph parsing module of LTP.

CoNLL 2017 Shared Task, 2017.3 - 2017.7
The goal of this task was to parse multilingual corpora from raw text to Universal Dependencies (45 languages, 64 treebanks).
  • our system achieved 4th place among 113 registered teams worldwide.
  • developed the main parsing module of our system.

CoNLL 2018 Shared Task, 2018.3 - 2018.7
The goal of this task was the same as in CoNLL 2017, but with more languages and treebanks (57 languages, 82 treebanks).
  • our system achieved 1st place among the 27 teams worldwide that submitted systems, significantly outperforming the second-best system by more than 2% in the LAS ranking.
  • in charge of the main parsing module of our system.

TECHNICAL SUMMARY
Programming Languages: C, C++, Python, Shell
Operating Systems: Windows, Linux
Tools and Frameworks: Git, DyNet, TensorFlow
Languages: English (PETS-5), Chinese (native)

AWARDS
Best Paper Award, NLP-NABD 2016, 2016.10
One of the 100 Best Graduation Theses of 2016, Harbin Institute of Technology, 2016.6