A Rough-Fuzzy Ontology Generation Framework and Its Application to Bio-medical Text Processing

Author(s):  
Lipika Dey ◽  
Muhammad Abulaish ◽  
Rohit Goyal ◽  
Kumar Shubham

Author(s):  
Z.M. Ma ◽  
Yanhui Lv ◽  
Li Yan

Ontologies are a core part of the W3C standards for the Semantic Web, used to specify shared conceptual vocabularies for exchanging data among systems, to provide reusable knowledge bases, and to facilitate interoperability across heterogeneous systems and databases. However, classical ontologies cannot adequately handle the vague information that is common in many application domains. A feasible solution is to extend the classical ontology with fuzzy capabilities. In this article, we propose a framework for generating fuzzy ontologies from fuzzy relational databases, in which a fuzzy ontology consists of a fuzzy ontology structure and its instances. We consider the schema and the instances of the fuzzy relational database simultaneously, transforming the schema into the fuzzy ontology structure and the instances into a fuzzy RDF data model. This preserves the integrity of the original structure as well as the completeness and consistency of the original instances in the fuzzy relational database.
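As a rough illustration of the schema-and-instance mapping described in the abstract (this is not the authors' implementation; the table name, attributes, and the membership-degree field below are hypothetical), the following sketch maps one fuzzy relational tuple to fuzzy-annotated RDF-style triples:

```python
# Minimal sketch: mapping a fuzzy relational tuple to fuzzy RDF-style quads.
# All names (the "Patient" relation, its attributes, the "mu" membership
# degree) are hypothetical; the paper's actual transformation rules cover
# the full schema as well.

from dataclasses import dataclass

@dataclass
class FuzzyTuple:
    table: str        # relation name, mapped to an ontology class
    key: str          # primary key, used to mint the RDF resource identifier
    attributes: dict  # attribute name -> value (datatype properties)
    mu: float         # membership degree of the tuple in the relation

def tuple_to_fuzzy_rdf(t: FuzzyTuple, base="http://example.org/"):
    """Transform one fuzzy tuple into (subject, predicate, object, degree) quads."""
    subject = f"{base}{t.table}/{t.key}"
    quads = [(subject, "rdf:type", f"{base}{t.table}", t.mu)]
    for attr, value in t.attributes.items():
        # Each attribute becomes a datatype property annotated with the tuple's degree.
        quads.append((subject, f"{base}{attr}", value, t.mu))
    return quads

if __name__ == "__main__":
    row = FuzzyTuple("Patient", "p42", {"hasAge": "young", "hasFever": "high"}, mu=0.8)
    for quad in tuple_to_fuzzy_rdf(row):
        print(quad)
```

Annotating every generated triple with the source tuple's membership degree is one simple way to keep the fuzzy semantics of the original instances intact in the RDF data model.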


2006 ◽  
Vol 2 (3) ◽  
pp. 155-164 ◽  
Author(s):  
T.T. Quan ◽  
S.C. Hui ◽  
A.C.M. Fong

2006 ◽  
Vol 18 (6) ◽  
pp. 842-856 ◽  
Author(s):  
Q.T. Tho ◽  
S.C. Hui ◽  
A.C.M. Fong ◽  
Tru Hoang Cao

2020 ◽  
Vol 34 (05) ◽  
pp. 9579-9586 ◽  
Author(s):  
Xiao Zhang ◽  
Dejing Dou ◽  
Ji Wu

External knowledge is often useful for natural language understanding tasks. We introduce a contextual text representation model called Conceptual-Contextual (CC) embeddings, which incorporates structured knowledge into text representations. Unlike entity-embedding methods, our approach encodes a knowledge graph into a context model. CC embeddings can be reused easily across a wide range of tasks, much like pre-trained language models. Our model effectively encodes the very large UMLS database by leveraging semantic generalizability. Experiments on electronic health records (EHRs) and medical text processing benchmarks show that our model substantially improves the performance of supervised medical NLP tasks.
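To make the general idea of incorporating structured knowledge into text representations concrete, here is a deliberately simplified sketch. It is not the CC-embedding architecture: the stand-in encoder, the string-match concept lookup, and the additive fusion are all assumptions made for the example.

```python
# Illustrative sketch only: folding knowledge-graph concept vectors into
# contextual token representations. NOT the CC-embedding model from the
# paper; every name and design choice here is a placeholder.

import numpy as np

EMB_DIM = 8

# Hypothetical concept embeddings, e.g. derived from a UMLS-style knowledge graph.
concept_vectors = {
    "fever": np.random.randn(EMB_DIM),
    "aspirin": np.random.randn(EMB_DIM),
}

def contextual_encoder(tokens):
    """Stand-in for a pre-trained contextual encoder (one vector per token)."""
    rng = np.random.default_rng(abs(hash(" ".join(tokens))) % (2**32))
    return rng.standard_normal((len(tokens), EMB_DIM))

def knowledge_augmented_embeddings(tokens):
    """Add a concept vector to every token that string-matches a KG concept."""
    contextual = contextual_encoder(tokens)
    for i, tok in enumerate(tokens):
        concept = concept_vectors.get(tok.lower())
        if concept is not None:
            contextual[i] = contextual[i] + concept  # simple additive fusion
    return contextual

if __name__ == "__main__":
    sentence = "Patient reports fever after taking aspirin".split()
    embeddings = knowledge_augmented_embeddings(sentence)
    print(embeddings.shape)  # (6, 8)
```

The appeal of this family of approaches is that the knowledge-augmented representations can be dropped into downstream models in the same way as ordinary pre-trained embeddings.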

