Fang Kong


2023

Improving Dialogue Discourse Parsing via Reply-to Structures of Addressee Recognition
Yaxin Fan | Feng Jiang | Peifeng Li | Fang Kong | Qiaoming Zhu
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing

Dialogue discourse parsing aims to reflect the relation-based structure of dialogue by establishing discourse links according to discourse relations. To alleviate data sparsity, previous studies have adopted multitask approaches that jointly learn dialogue discourse parsing with related tasks (e.g., reading comprehension) requiring additional human annotation, which limits their generality. In this paper, we propose a multitask framework that integrates dialogue discourse parsing with its neighboring task, addressee recognition. Addressee recognition reveals the reply-to structure, which partially overlaps with the relation-based structure and can therefore be exploited to facilitate relation-based structure learning. To this end, we first propose a reinforcement learning agent to identify the training examples from addressee recognition that are most helpful for dialogue discourse parsing. Then, a task-aware structure transformer is designed to capture the shared and private dialogue structures of the different tasks, thereby further promoting dialogue discourse parsing. Experimental results on both the Molweni and STAC datasets show that our proposed method outperforms the SOTA baselines. The code will be available at https://github.com/yxfanSuda/RLTST.
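
A minimal sketch of the example-selection idea described in this abstract, assuming a plain REINFORCE objective: a policy network scores each addressee-recognition example, a subset is sampled, and the parser's dev-set gain is used as the reward. All names here (SelectionPolicy, reward_fn) are illustrative assumptions, not the released RLTST code.

```python
import torch
import torch.nn as nn

class SelectionPolicy(nn.Module):
    """Scores how useful each auxiliary (addressee-recognition) example is."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.scorer = nn.Linear(feat_dim, 1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.scorer(feats)).squeeze(-1)  # keep-probability per example

def reinforce_step(policy, optimizer, aux_feats, reward_fn):
    """One policy-gradient update: sample a keep/drop mask, measure the parsing
    gain from training on the kept examples (inside reward_fn), and reinforce
    the sampled actions with that gain."""
    probs = policy(aux_feats)                      # (num_examples,)
    mask = torch.bernoulli(probs)                  # sampled selection
    reward = reward_fn(mask)                       # e.g. change in parsing dev F1
    log_prob = (mask * probs.clamp_min(1e-8).log()
                + (1 - mask) * (1 - probs).clamp_min(1e-8).log()).sum()
    loss = -reward * log_prob
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return mask, reward

policy = SelectionPolicy(feat_dim=16)
opt = torch.optim.SGD(policy.parameters(), lr=0.01)
mask, r = reinforce_step(policy, opt, torch.randn(10, 16), lambda m: float(m.sum()))  # dummy reward
```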

2022

Discourse Parsing Enhanced by Discourse Dependence Perception
Yuqing Xing | Longyin Zhang | Fang Kong | Guodong Zhou
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)

In recent years, top-down neural models have achieved significant success in text-level discourse parsing. Nevertheless, they still suffer from top-down error propagation, especially when performance on the upper-level tree nodes is poor. In this research, we aim to learn directly from the correlations between EDUs to shorten the hierarchical distance of the RST structure and alleviate this problem. Specifically, we contribute a joint top-down framework that learns from both discourse dependency and constituency parsing through one shared encoder and two independent decoders. Moreover, we explore a constituency-to-dependency conversion scheme tailored to the Chinese discourse corpus to ensure the high quality of the joint learning process. Our experimental results on CDTB show that the dependency information we use substantially improves the understanding of the rhetorical structure, especially for the upper-level tree layers.
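
A minimal sketch of the shared-encoder, two-decoder layout this abstract describes, under my own assumptions (a BiGRU encoder, a bilinear dependency-arc scorer, and a linear split-point scorer); it is not the released system.

```python
import torch
import torch.nn as nn

class JointTopDownParser(nn.Module):
    def __init__(self, edu_dim: int, hidden: int):
        super().__init__()
        self.encoder = nn.GRU(edu_dim, hidden, batch_first=True, bidirectional=True)  # shared encoder
        self.dep_scorer = nn.Bilinear(2 * hidden, 2 * hidden, 1)   # dependency decoder: head-dependent score
        self.split_scorer = nn.Linear(2 * hidden, 1)               # constituency decoder: split-point score

    def forward(self, edus: torch.Tensor):
        h, _ = self.encoder(edus)                                  # (batch, n_edu, 2*hidden)
        n = h.size(1)
        heads = h.unsqueeze(2).expand(-1, -1, n, -1).contiguous()  # candidate head for every pair
        deps = h.unsqueeze(1).expand(-1, n, -1, -1).contiguous()   # candidate dependent for every pair
        dep_scores = self.dep_scorer(heads, deps).squeeze(-1)      # (batch, n_edu, n_edu)
        split_scores = self.split_scorer(h).squeeze(-1)            # (batch, n_edu)
        return dep_scores, split_scores

parser = JointTopDownParser(edu_dim=128, hidden=64)
dep, split = parser(torch.randn(2, 6, 128))  # two documents, six EDUs each
```

Because both decoders read the same encoder states, gradients from the dependency loss also shape the representations used for constituency splitting, which is the point of the joint setup.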

A Distance-Aware Multi-Task Framework for Conversational Discourse Parsing
Yaxin Fan | Peifeng Li | Fang Kong | Qiaoming Zhu
Proceedings of the 29th International Conference on Computational Linguistics

Conversational discourse parsing aims to construct an implicit utterance dependency tree to reflect the turn-taking in a multi-party conversation. Existing works are generally divided into two lines: graph-based and transition-based paradigms, which perform well for short-distance and long-distance dependency links, respectively. However, there is no study to consider the advantages of both paradigms to facilitate conversational discourse parsing. As a result, we propose a distance-aware multi-task framework DAMT that incorporates the strengths of transition-based paradigm to facilitate the graph-based paradigm from the encoding and decoding process. To promote multi-task learning on two paradigms, we first introduce an Encoding Interactive Module (EIM) to enhance the flow of semantic information between both two paradigms during the encoding step. And then we apply a Distance-Aware Graph Convolutional Network (DAGCN) in the decoding process, which can incorporate the different-distance dependency links predicted by the transition-based paradigm to facilitate the decoding of the graph-based paradigm. The experimental results on the datasets STAC and Molweni show that our method can significantly improve the performance of the SOTA graph-based paradigm on long-distance dependency links.
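
An illustrative sketch of a distance-aware graph convolution in the spirit of the DAGCN described above, under my own assumptions (two distance buckets with separate transformation matrices); the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn

class DistanceAwareGCN(nn.Module):
    def __init__(self, dim: int, short_window: int = 2):
        super().__init__()
        self.short_window = short_window
        self.w_short = nn.Linear(dim, dim)   # links spanning few utterances
        self.w_long = nn.Linear(dim, dim)    # links spanning many utterances
        self.w_self = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (n_utt, dim); adj[i, j] = 1 if a dependency link i -> j was predicted
        n = x.size(0)
        dist = (torch.arange(n).unsqueeze(0) - torch.arange(n).unsqueeze(1)).abs()
        short = adj * (dist <= self.short_window).float()   # short-distance edges
        long = adj * (dist > self.short_window).float()     # long-distance edges
        out = self.w_self(x) + short @ self.w_short(x) + long @ self.w_long(x)
        return torch.relu(out)

layer = DistanceAwareGCN(dim=64)
h = layer(torch.randn(5, 64), torch.eye(5).roll(1, dims=1))  # toy chain of predicted links
```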

2021

面向对话文本的实体关系抽取(Entity Relation Extraction for Dialogue Text)
Liang Liu (陆亮) | Fang Kong (孔芳)
Proceedings of the 20th Chinese National Conference on Computational Linguistics

Entity relation extraction, which aims to extract semantic relations between entities from text, is a fundamental task in natural language processing. Research on this task over well-formed text such as news reports and Wikipedia is relatively rich and has achieved solid results, but work on dialogue text is still at an early stage. Compared with well-formed text, the dialogue corpora available for entity relation extraction are small, and effective features of dialogue text are hard to capture, which makes entity relation extraction for dialogue text more challenging. For this task, this paper proposes an entity relation extraction model based on the Star-Transformer, which incorporates a highway network for information bridging, further integrates interaction information and knowledge, and finally uses a multi-task learning mechanism to improve performance. Experiments on the public DialogRE dataset yield an F1 of 55.7% and an F1c of 52.3%, demonstrating the effectiveness of the proposed method.
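
A simplified sketch of the general Star-Transformer idea with highway-style bridging mentioned above, written from my own assumptions rather than the paper's implementation: token nodes attend to a shared relay node together with the token sequence, and a gate bridges old and new token states.

```python
import torch
import torch.nn as nn

class StarLayer(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.token_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.relay_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.gate = nn.Linear(2 * dim, dim)   # highway-style information bridge

    def forward(self, tokens: torch.Tensor, relay: torch.Tensor):
        # tokens: (batch, seq, dim); relay: (batch, 1, dim)
        context = torch.cat([relay, tokens], dim=1)                # relay + sequence as context
        new_tokens, _ = self.token_attn(tokens, context, context)
        gate = torch.sigmoid(self.gate(torch.cat([tokens, new_tokens], dim=-1)))
        tokens = gate * new_tokens + (1 - gate) * tokens           # highway bridging of old/new states
        new_relay, _ = self.relay_attn(relay, tokens, tokens)      # relay node summarises all tokens
        return tokens, new_relay

layer = StarLayer(dim=64)
toks, rel = layer(torch.randn(2, 10, 64), torch.randn(2, 1, 64))
```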

基于词汇链强化表征的篇章修辞结构分析研究(Lexical Chain Based Strengthened Representation for Discourse Rhetorical Structure Parsing)
Jinfeng Wang (王金锋) | Fang Kong (孔芳)
Proceedings of the 20th Chinese National Conference on Computational Linguistics

As a fundamental problem in natural language processing, discourse analysis has long attracted wide attention. Due to the limited scale of available corpora, most existing studies still rely on additional external features. To address this problem, this paper proposes a general representation-enhancement method that uses a graph convolutional network to incorporate lexical chain information into the representations of elementary discourse units. Experiments on RST-DT and CDTB show that the proposed representation enhancement improves the performance of multiple discourse parsers.
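
A rough sketch of how lexical-chain information could be folded into EDU representations with one graph-convolution step, as the abstract describes; the graph construction and layer details here are my assumptions, not the paper's code.

```python
import torch
import torch.nn as nn

def chain_adjacency(edu_words: list) -> torch.Tensor:
    """EDUs i and j are linked if they share at least one lexical-chain word."""
    n = len(edu_words)
    adj = torch.zeros(n, n)
    for i in range(n):
        for j in range(n):
            if i != j and edu_words[i] & edu_words[j]:
                adj[i, j] = 1.0
    return adj

class ChainGCN(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, edu_reps: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        deg = adj.sum(-1, keepdim=True).clamp_min(1.0)
        mixed = (adj @ self.proj(edu_reps)) / deg      # average over chain neighbours
        return edu_reps + torch.relu(mixed)            # enhanced EDU representation

adj = chain_adjacency([{"market", "rise"}, {"market"}, {"weather"}])
enhanced = ChainGCN(dim=32)(torch.randn(3, 32), adj)
```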

基于信息交互增强的时序关系联合识别(Joint Recognition of Temporal Relation Based on Information Interaction Enhancement)
Qianying Dai (戴倩颖) | Fang Kong (孔芳)
Proceedings of the 20th Chinese National Conference on Computational Linguistics

Temporal relation recognition is an important branch of information extraction and plays a key role in text understanding. According to the objects being related, temporal relations fall into three categories: relations between event pairs (E-E), relations between events and time expressions (E-T), and relations between events and the document creation time (E-D). Methods that recognize each relation type in isolation ignore the implicit correlations among them. To address this problem, we build a joint temporal relation recognition model based on information interaction enhancement. By sharing parameters across different neural network layers, the model enables semantic interaction between E-E and E-T temporal relations and exploits their latent connections to improve recognition accuracy. A series of experiments on the TimeBank-Dense corpus shows that the method outperforms most existing neural network approaches.
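
A minimal sketch of the joint setup described above, under assumed shapes: the E-E and E-T classifiers share a lower interaction layer, so gradients from either task update the common parameters. The module names are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

class JointTemporalModel(nn.Module):
    def __init__(self, dim: int, n_labels: int):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU())  # shared interaction layer
        self.ee_head = nn.Linear(dim, n_labels)   # event-event relations
        self.et_head = nn.Linear(dim, n_labels)   # event-time relations

    def forward(self, pair_reps: torch.Tensor, task: str) -> torch.Tensor:
        h = self.shared(pair_reps)                 # information flows through shared parameters
        return self.ee_head(h) if task == "ee" else self.et_head(h)

model = JointTemporalModel(dim=64, n_labels=4)
ee_logits = model(torch.randn(8, 128), task="ee")
et_logits = model(torch.randn(8, 128), task="et")
```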

EDTC: A Corpus for Discourse-Level Topic Chain Parsing
Longyin Zhang | Xin Tan | Fang Kong | Guodong Zhou
Findings of the Association for Computational Linguistics: EMNLP 2021

Discourse analysis has long been known to be fundamental in natural language processing. In this research, we present our insight on discourse-level topic chain (DTC) parsing, which aims at discovering new topics and investigating how these topics evolve over time within an article. To address the lack of data, we contribute a new discourse corpus with DTC-style dependency graphs annotated on news articles. In particular, we ensure the high reliability of the corpus by utilizing a two-step annotation strategy and filtering out annotations with low confidence scores. Based on the annotated corpus, we introduce a simple yet robust system for automatic discourse-level topic chain parsing.

Adversarial Learning for Discourse Rhetorical Structure Parsing
Longyin Zhang | Fang Kong | Guodong Zhou
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)

Text-level discourse rhetorical structure (DRS) parsing is known to be challenging due to the notorious lack of training data. Although recent top-down DRS parsers can better leverage global document context and have achieved some success, their performance is still far from perfect. To our knowledge, all previous DRS parsers make local decisions, for either bottom-up node composition or top-down split point ranking, at each time step, and largely ignore DRS parsing from a global viewpoint. Obviously, building an entire DRS tree only through these local decisions is not sufficient. In this work, we present our insight on evaluating the pros and cons of the entire DRS tree for global optimization. Specifically, based on recent well-performing top-down frameworks, we introduce a novel method to transform both gold-standard and predicted constituency trees into tree diagrams with two color channels. After that, we learn an adversarial bot between gold and fake tree diagrams to evaluate the generated DRS trees from a global perspective. We perform experiments on both the RST-DT and CDTB corpora and use the original Parseval for performance evaluation. The experimental results show that our parser substantially improves the performance compared with previous state-of-the-art parsers.
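
An illustrative sketch of the two-channel tree-diagram idea above, with assumed rendering details (one channel for span structure, one for relation labels) and a toy convolutional discriminator; the paper's actual encoding scheme and adversarial bot may differ.

```python
import torch
import torch.nn as nn

def tree_to_diagram(spans, n_edus: int, max_depth: int) -> torch.Tensor:
    """spans: list of (depth, start, end, relation_id) node descriptions."""
    img = torch.zeros(2, max_depth, n_edus)
    for depth, start, end, rel in spans:
        img[0, depth, start:end] = 1.0              # channel 0: span structure
        img[1, depth, start:end] = float(rel)       # channel 1: relation label
    return img

# A small CNN scores whole diagrams as gold-like vs. fake.
discriminator = nn.Sequential(
    nn.Conv2d(2, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1),
)

gold = tree_to_diagram([(0, 0, 4, 3), (1, 0, 2, 1), (1, 2, 4, 2)], n_edus=4, max_depth=3)
score = discriminator(gold.unsqueeze(0))  # higher = judged more gold-like
```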

Enhancing Entity Boundary Detection for Better Chinese Named Entity Recognition
Chun Chen | Fang Kong
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)

Compared with English, Chinese Named Entity Recognition (NER) is much more challenging due to the lack of explicit word boundaries and tense information. In this paper, we propose a boundary-enhanced approach for better Chinese NER. In particular, our approach enhances boundary information from two perspectives. On the one hand, we enhance the representation of the internal dependencies of phrases with an additional Graph Attention Network (GAT) layer. On the other hand, taking entity head-tail prediction (i.e., boundaries) as an auxiliary task, we propose a unified framework to learn the boundary information and recognize named entities jointly. Experiments on both the OntoNotes and Weibo corpora show the effectiveness of our approach.
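
A minimal sketch of the joint layout above under assumed details (a shared BiGRU character encoder, a main tag head, and an auxiliary head/tail boundary head whose losses are summed); it is not the released model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BoundaryEnhancedNER(nn.Module):
    def __init__(self, char_dim: int, hidden: int, n_tags: int):
        super().__init__()
        self.encoder = nn.GRU(char_dim, hidden, batch_first=True, bidirectional=True)
        self.tag_head = nn.Linear(2 * hidden, n_tags)       # main NER label head
        self.boundary_head = nn.Linear(2 * hidden, 3)        # O / entity-head / entity-tail

    def forward(self, chars: torch.Tensor):
        h, _ = self.encoder(chars)
        return self.tag_head(h), self.boundary_head(h)

model = BoundaryEnhancedNER(char_dim=100, hidden=64, n_tags=9)
tag_logits, bnd_logits = model(torch.randn(2, 20, 100))
tags, bounds = torch.randint(0, 9, (2, 20)), torch.randint(0, 3, (2, 20))
loss = (F.cross_entropy(tag_logits.reshape(-1, 9), tags.reshape(-1))
        + F.cross_entropy(bnd_logits.reshape(-1, 3), bounds.reshape(-1)))  # joint objective
```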

2020

A Top-down Neural Architecture towards Text-level Parsing of Discourse Rhetorical Structure
Longyin Zhang | Yuqing Xing | Fang Kong | Peifeng Li | Guodong Zhou
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics

Due to its great importance in deep natural language understanding and various downstream applications, text-level parsing of discourse rhetorical structure (DRS) has been drawing more and more attention in recent years. However, all previous studies on text-level discourse parsing adopt bottom-up approaches, which largely limit DRS determination to local information and fail to benefit from global information of the overall discourse. In this paper, we justify from both computational and perceptive points of view that the top-down architecture is more suitable for text-level DRS parsing. On this basis, we propose a top-down neural architecture for text-level DRS parsing. In particular, we cast discourse parsing as a recursive split point ranking task, where a split point is classified to different levels according to its rank and the elementary discourse units (EDUs) associated with it are arranged accordingly. In this way, we can determine the complete DRS as a hierarchical tree structure via an encoder-decoder with an internal stack. Experimentation on both the English RST-DT corpus and the Chinese CDTB corpus shows the effectiveness of our proposed top-down approach to text-level DRS parsing.
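
A small sketch of the recursive split-point control flow described above, with the neural scorer stubbed out by a random number; the names and the stub are illustrative assumptions, while the paper uses an encoder-decoder with an internal stack for scoring.

```python
import random

def split_score(start: int, end: int, split: int) -> float:
    """Placeholder for the neural split-point scorer."""
    return random.random()

def top_down_parse(n_edus: int):
    """Pop a span, choose its best internal split, push the two sub-spans,
    and repeat until every span is a single EDU."""
    tree, stack = [], [(0, n_edus)]
    while stack:
        start, end = stack.pop()
        if end - start <= 1:
            continue
        best = max(range(start + 1, end), key=lambda k: split_score(start, end, k))
        tree.append((start, best, end))        # record the split chosen for this span
        stack.append((best, end))
        stack.append((start, best))            # left sub-span is processed first
    return tree

print(top_down_parse(5))
```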

融入对话上文整体信息的层次匹配回应选择(Learning Overall Dialogue Information for Dialogue Response Selection)
Bowen Si (司博文) | Fang Kong (孔芳)
Proceedings of the 19th Chinese National Conference on Computational Linguistics

Dialogue is a sequential, interactive process, and response selection, which aims to choose an appropriate response given the dialogue context, is a research hotspot in natural language processing. Existing studies have achieved some success, but two notable problems remain. First, existing encoders are still insufficient at mining the semantic information of dialogue text; second, they consider only the relation between each dialogue turn and the candidate response, ignoring the overall semantic information of the dialogue context. For the first problem, this paper uses a multi-head self-attention mechanism to effectively capture the semantic information of dialogue text; for the second, it integrates the overall semantic information of the dialogue context and matches it against the candidate response at three levels, i.e., word, sentence, and overall context, to ensure complete matching information. Comparative experiments on the Ubuntu Corpus V1 and Douban Conversation Corpus datasets demonstrate the effectiveness of the proposed method.
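
A compact sketch of matching a candidate response against the dialogue at the three granularities mentioned above (word, sentence, overall context); the shapes, pooling choices, and names are my assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class MultiLevelMatcher(nn.Module):
    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        self.word_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.score = nn.Linear(3 * dim, 1)

    def forward(self, context_words, sentence_vecs, response_words):
        # context_words: (b, n_ctx_words, d); sentence_vecs: (b, n_turns, d);
        # response_words: (b, n_resp_words, d)
        word_match, _ = self.word_attn(response_words, context_words, context_words)
        word_feat = word_match.mean(dim=1)                                  # word-level matching
        resp_vec = response_words.mean(dim=1)
        sent_feat = (sentence_vecs * resp_vec.unsqueeze(1)).mean(dim=1)     # sentence-level matching
        overall_feat = sentence_vecs.mean(dim=1) * resp_vec                 # whole-context matching
        return self.score(torch.cat([word_feat, sent_feat, overall_feat], dim=-1))

matcher = MultiLevelMatcher(dim=64)
score = matcher(torch.randn(2, 50, 64), torch.randn(2, 5, 64), torch.randn(2, 12, 64))
```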

面向微博文本的融合字词信息的轻量级命名实体识别(Lightweight Named Entity Recognition for Weibo Based on Word and Character)
Chun Chen (陈淳) | Mingyang Li (李明扬) | Fang Kong (孔芳)
Proceedings of the 19th Chinese National Conference on Computational Linguistics

Chinese social media named entity recognition has long attracted attention due to the particular nature of the domain. Informal and unstructured Weibo text poses two problems: first, word boundaries are ambiguous; second, corpus size is limited. For the first problem, this paper fuses character and word representations of the same dimension to obtain rich text sequence representations; for the second, it proposes a named entity recognition model based on the Star-Transformer framework, whose star topology better captures dynamic features, and uses a highway network to optimize the information bridging in the Star-Transformer and improve the robustness of the model. The proposed lightweight named entity recognition model achieves the best results to date on the Weibo corpus.

Chinese Paragraph-level Discourse Parsing with Global Backward and Local Reverse Reading
Feng Jiang | Xiaomin Chu | Peifeng Li | Fang Kong | Qiaoming Zhu
Proceedings of the 28th International Conference on Computational Linguistics

Discourse structure tree construction is the fundamental task of discourse parsing, and most previous work has focused on English. Due to cultural and linguistic differences, existing methods that succeed on English discourse parsing cannot be transferred to Chinese directly, especially at the paragraph level, which suffers from longer discourse units and fewer explicit connectives. To alleviate these issues, we propose two reading modes, i.e., global backward reading and local reverse reading, to construct Chinese paragraph-level discourse trees. The former processes discourse units from the end to the beginning of a document to exploit the left-branching bias of discourse structure in Chinese, while the latter reverses the position of paragraphs within a discourse unit to enhance the differentiation of coherence between adjacent discourse units. The experimental results on the Chinese MCDTB demonstrate that our model outperforms all strong baselines.
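
A tiny sketch of the two reading modes as order transformations on the paragraph sequence; this is only meant to make the directionality concrete, since in the paper these modes operate inside a neural tree-construction model.

```python
def global_backward_reading(paragraphs):
    """Process discourse units from the end of the document to the beginning,
    matching the left-branching bias of Chinese discourse structure."""
    return list(reversed(paragraphs))

def local_reverse_reading(discourse_unit):
    """Reverse the paragraph order inside one discourse unit to sharpen the
    coherence contrast between adjacent units."""
    return discourse_unit[::-1]

doc = ["P1", "P2", "P3", "P4"]
print(global_backward_reading(doc))       # ['P4', 'P3', 'P2', 'P1']
print(local_reverse_reading(doc[1:3]))    # ['P3', 'P2']
```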

2019

Topic Tensor Network for Implicit Discourse Relation Recognition in Chinese
Sheng Xu | Peifeng Li | Fang Kong | Qiaoming Zhu | Guodong Zhou
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics

Most previous studies on implicit discourse relation recognition, mainly on English, use only sentence-level representations, which cannot provide enough semantic information for Chinese due to its unique paratactic characteristics. In this paper, we propose a topic tensor network to recognize Chinese implicit discourse relations with both sentence-level and topic-level representations. In particular, besides encoding arguments (discourse units) with a gated convolutional network to obtain sentence-level representations, we train a simplified topic model to infer latent topic-level representations. Moreover, we feed the two pairs of representations to two factored tensor networks, respectively, to capture both sentence-level interactions and topic-level relevance using multi-slice tensors. Experimentation on CDTB, a Chinese discourse corpus, shows that our proposed model significantly outperforms several state-of-the-art baselines in both micro and macro F1-scores.
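
A rough sketch of the multi-slice tensor interaction described above, using a standard bilinear layer with assumed dimensions; the paper's factored parameterisation and encoders are not reproduced here.

```python
import torch
import torch.nn as nn

class TensorMatch(nn.Module):
    """Multi-slice bilinear interaction between the two arguments."""
    def __init__(self, dim: int, slices: int):
        super().__init__()
        self.bilinear = nn.Bilinear(dim, dim, slices)

    def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.bilinear(a, b))

class TopicTensorClassifier(nn.Module):
    def __init__(self, sent_dim: int, topic_dim: int, slices: int, n_relations: int):
        super().__init__()
        self.sent_match = TensorMatch(sent_dim, slices)     # sentence-level interaction
        self.topic_match = TensorMatch(topic_dim, slices)   # topic-level relevance
        self.out = nn.Linear(2 * slices, n_relations)

    def forward(self, s1, s2, t1, t2):
        feats = torch.cat([self.sent_match(s1, s2), self.topic_match(t1, t2)], dim=-1)
        return self.out(feats)

clf = TopicTensorClassifier(sent_dim=128, topic_dim=50, slices=8, n_relations=4)
logits = clf(torch.randn(4, 128), torch.randn(4, 128), torch.randn(4, 50), torch.randn(4, 50))
```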

2016

SoNLP-DP System for CoNLL-2016 English Shallow Discourse Parsing
Fang Kong | Sheng Li | Junhui Li | Muhua Zhu | Guodong Zhou
Proceedings of the CoNLL-16 shared task

SoNLP-DP System for CoNLL-2016 Chinese Shallow Discourse Parsing
Junhui Li | Fang Kong | Sheng Li | Muhua Zhu | Guodong Zhou
Proceedings of the CoNLL-16 shared task

2015

The SoNLP-DP System in the CoNLL-2015 Shared Task
Fang Kong | Sheng Li | Guodong Zhou
Proceedings of the Nineteenth Conference on Computational Natural Language Learning - Shared Task

2014

A Constituent-Based Approach to Argument Labeling with Joint Inference in Discourse Parsing
Fang Kong | Hwee Tou Ng | Guodong Zhou
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)

Building Chinese Discourse Corpus with Connective-driven Dependency Tree Structure
Yancui Li | Wenhe Feng | Jing Sun | Fang Kong | Guodong Zhou
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP)

2013

Exploiting Zero Pronouns to Improve Chinese Coreference Resolution
Fang Kong | Hwee Tou Ng
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing

Collective Personal Profile Summarization with Social Networks
Zhongqing Wang | Shoushan Li | Fang Kong | Guodong Zhou
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing

2012

Exploring Local and Global Semantic Information for Event Pronoun Resolution
Fang Kong | Guodong Zhou
Proceedings of COLING 2012

2011

Combining Dependency and Constituent-based Syntactic Information for Anaphoricity Determination in Coreference Resolution
Fang Kong | Guodong Zhou
Proceedings of the 25th Pacific Asia Conference on Language, Information and Computation

2010

A Tree Kernel-Based Unified Framework for Chinese Zero Anaphora Resolution
Fang Kong | Guodong Zhou
Proceedings of the 2010 Conference on Empirical Methods in Natural Language Processing

Dependency-driven Anaphoricity Determination for Coreference Resolution
Fang Kong | Guodong Zhou | Longhua Qian | Qiaoming Zhu
Proceedings of the 23rd International Conference on Computational Linguistics (Coling 2010)

2009

Global Learning of Noun Phrase Anaphoricity in Coreference Resolution via Label Propagation
GuoDong Zhou | Fang Kong
Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing

Employing the Centering Theory in Pronoun Resolution from the Semantic Perspective
Fang Kong | GuoDong Zhou | Qiaoming Zhu
Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing

Semi-Supervised Learning for Semantic Relation Classification using Stratified Sampling Strategy
Longhua Qian | Guodong Zhou | Fang Kong | Qiaoming Zhu
Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing

2008

Context-Sensitive Convolution Tree Kernel for Pronoun Resolution
GuoDong Zhou | Fang Kong | QiaoMing Zhu
Proceedings of the Third International Joint Conference on Natural Language Processing: Volume-I

Exploiting Constituent Dependencies for Tree Kernel-Based Semantic Relation Extraction
Longhua Qian | Guodong Zhou | Fang Kong | Qiaoming Zhu | Peide Qian
Proceedings of the 22nd International Conference on Computational Linguistics (Coling 2008)