
Chinese Semantic Role Labeling Using Recurrent Neural Networks

Yigeng Zhang
Gothenburg: Chalmers University of Technology, 2017. 65 pp.
[Master's thesis, advanced (second-cycle) level]

In the research field of natural language processing (NLP), semantic role labeling (SRL) is one of the essential problems. The task of SRL is to automatically find the semantic role (such as AGENT or PATIENT) of each argument of each predicate in a sentence. Semantic roles are useful shallow semantic representations, and SRL is an important intermediate step for many NLP applications, such as information extraction, question answering, and machine translation. Traditional methods for SRL are based on parser output and require extensive feature engineering. In this work, we implement an end-to-end system using a deep bidirectional long short-term memory (LSTM) model to solve the Chinese SRL problem. The input is raw text segmented into characters, which serve directly as input features; no intermediate syntactic analysis is required. Our method achieves performance close to that of a top-scoring system, but with a simpler pipeline and higher efficiency.
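The architecture sketched in the abstract can be illustrated with a minimal forward pass: characters are embedded, processed by a forward and a backward LSTM, and each position is classified into a BIO-style role tag. This is a hedged sketch, not the thesis implementation; all dimensions, the toy tag set, and the random (untrained) weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = 100      # number of distinct characters (assumed)
EMB = 8          # character embedding size (assumed)
HID = 16         # LSTM hidden size per direction (assumed)
TAGS = ["B-A0", "I-A0", "B-A1", "I-A1", "O"]  # toy BIO role tags

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_params():
    # One stacked weight matrix for the four gates (input, forget, cell, output).
    W = rng.normal(0, 0.1, (4 * HID, EMB + HID))
    b = np.zeros(4 * HID)
    return W, b

def run_lstm(emb, W, b, reverse=False):
    # Run a single LSTM over the sequence, optionally right-to-left.
    h, c = np.zeros(HID), np.zeros(HID)
    order = reversed(range(len(emb))) if reverse else range(len(emb))
    out = [None] * len(emb)
    for t in order:
        z = W @ np.concatenate([emb[t], h]) + b
        i, f, g, o = np.split(z, 4)
        c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
        h = sigmoid(o) * np.tanh(c)
        out[t] = h
    return np.stack(out)

def tag_characters(char_ids):
    # End-to-end: embed characters, run both directions, classify each position.
    E = rng.normal(0, 0.1, (VOCAB, EMB))           # embedding table
    Wf, bf = lstm_params()                         # forward direction
    Wb, bb = lstm_params()                         # backward direction
    Wo = rng.normal(0, 0.1, (len(TAGS), 2 * HID))  # output projection
    emb = E[np.asarray(char_ids)]
    hf = run_lstm(emb, Wf, bf)
    hb = run_lstm(emb, Wb, bb, reverse=True)
    logits = np.concatenate([hf, hb], axis=1) @ Wo.T
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)   # softmax per character
    return [TAGS[k] for k in probs.argmax(axis=1)], probs

tags, probs = tag_characters([5, 17, 42, 3])
```

With trained weights, the per-character tag sequence would be decoded into predicate-argument spans; here the untrained model only demonstrates the data flow from raw characters to role-label probabilities.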

Keywords: Natural language processing, semantic role labeling, recurrent neural networks, deep learning.

The publication was registered on 2018-02-16 and last modified on 2018-02-16.

CPL ID: 254899
