Title

A context-enhanced sentence representation learning method for close domains with topic modeling

Author
Corresponding Author: Li, Shuangyin
Publication Years
2022-08-01
DOI
Source Title
INFORMATION SCIENCES
ISSN
0020-0255
EISSN
1872-6291
Volume: 607, Pages: 186-210
Abstract
Sentence representation approaches have been widely used and proven to be effective in many text modeling tasks and downstream applications. Many recent proposals are available for learning sentence representations based on deep neural frameworks. However, these methods are pre-trained in open domains and depend on the availability of large-scale data for model fitting. As a result, they may fail in some special scenarios where data are sparse and embedding interpretations are required, such as legal, medical, or technical fields. In this paper, we present an unsupervised learning method to exploit representations of sentences for some closed domains via topic modeling. We reformulate the inference process of the sentences with the corresponding contextual sentences and the associated words, and propose an effective context-enhanced process called bi-Directional Context-enhanced Sentence Representation Learning (bi-DCSR). This method takes advantage of the semantic distributions of the nearby contextual sentences and the associated words to form a context-enhanced sentence representation. To support the bi-DCSR, we develop a novel Bayesian topic model, the Hybrid Priors Topic Model (HPTM), which embeds sentences and words into the same latent, interpretable topic space. Based on the topic space defined by the HPTM, the bi-DCSR method learns the embedding of a sentence from the two-directional contextual sentences and the words in it, which allows us to efficiently learn high-quality sentence representations in such closed domains. In addition to an open-domain dataset from Wikipedia, our method is validated using three closed-domain datasets from legal cases, electronic medical records, and technical reports. Our experiments indicate that the HPTM significantly outperforms existing topic models on language modeling and topic coherence. Meanwhile, the bi-DCSR method not only outperforms state-of-the-art unsupervised learning methods on closed-domain sentence classification tasks, but also yields competitive performance compared to these established approaches on the open domain. Additionally, the visualizations of the semantics of sentences and words demonstrate the interpretable capacity of our model. (c) 2022 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
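As an illustrative aside, and not the authors' bi-DCSR/HPTM implementation, the short sketch below shows one way a sentence's topic distribution could be blended with the distributions of its preceding and following sentences and of its own words, assuming all of them live in the same K-dimensional topic space as the abstract describes. The function name context_enhanced_embedding, the mixing weights alpha and beta, and the simple averaging scheme are hypothetical choices made only for the sketch.

    # Illustrative sketch (not the published method): mix a sentence's topic
    # vector with its two-directional context and its words' topic vectors.
    import numpy as np

    def context_enhanced_embedding(sent_topics, prev_topics, next_topics,
                                   word_topics, alpha=0.5, beta=0.3):
        """Return a context-enhanced topic vector for one sentence.

        sent_topics              : (K,) topic distribution of the target sentence
        prev_topics, next_topics : (K,) distributions of the adjacent sentences
        word_topics              : (N, K) topic distributions of its N words
        alpha, beta              : hypothetical mixing weights
        """
        context = 0.5 * (prev_topics + next_topics)      # two-directional context
        word_signal = word_topics.mean(axis=0)           # aggregate word semantics
        mixed = ((1 - alpha - beta) * sent_topics
                 + alpha * context + beta * word_signal)
        return mixed / mixed.sum()                       # renormalize to a distribution

    if __name__ == "__main__":
        K = 8
        rng = np.random.default_rng(0)
        s, p, n = (rng.dirichlet(np.ones(K)) for _ in range(3))
        words = rng.dirichlet(np.ones(K), size=12)
        print(context_enhanced_embedding(s, p, n, words))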
Keywords
URL [Source Record]
Indexed By
SCI ; EI
Language
English
SUSTech Authorship
Others
Funding Project
National Natural Science Foundation of China[62006083] ; GuangZhou Basic and Applied Basic Research Foundation[202102020654] ; Applied Basic Research Fund of Guangdong Province[2019B1515120085]
WOS Research Area
Computer Science
WOS Subject
Computer Science, Information Systems
WOS Accession No
WOS:000817892200011
Publisher
Elsevier Inc.
EI Accession Number
20222412213143
EI Keywords
Embeddings ; Medical computing ; Modeling languages ; Unsupervised learning
ESI Classification Code
Biomedical Engineering:461.1 ; Artificial Intelligence:723.4 ; Computer Applications:723.5
ESI Research Field
COMPUTER SCIENCE
Data Source
Web of Science
Citation statistics
Cited Times [WOS]: 2
Document Type
Journal Article
Identifier
http://kc.sustech.edu.cn/handle/2SGJ60CL/355862
Department
Department of Computer Science and Engineering
Affiliation
1. South China Normal Univ, Sch Comp Sci, Guangzhou, Guangdong, Peoples R China
2. Sun Yat-sen Univ, Sch Data & Comp Sci, Guangzhou, Guangdong, Peoples R China
3. Southern Univ Sci & Technol, Dept Comp Sci & Engn, Shenzhen, Guangdong, Peoples R China
Recommended Citation
GB/T 7714
Li, Shuangyin, Chen, Weiwei, Zhang, Yu, et al. A context-enhanced sentence representation learning method for close domains with topic modeling[J]. INFORMATION SCIENCES, 2022, 607: 186-210.
APA
Li, Shuangyin, Chen, Weiwei, Zhang, Yu, Zhao, Gansen, Pan, Rong, ... & Tang, Yong. (2022). A context-enhanced sentence representation learning method for close domains with topic modeling. INFORMATION SCIENCES, 607, 186-210.
MLA
Li, Shuangyin, et al. "A context-enhanced sentence representation learning method for close domains with topic modeling". INFORMATION SCIENCES 607 (2022): 186-210.
Files in This Item:
There are no files associated with this item.

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.