Computing with words using intuitionistic fuzzy logic programming

2018 ◽  
Vol 7 (1.9) ◽  
pp. 178 ◽  
Author(s):  
Ashit Kumar Dutta

Computing with words is the term for computation over a mix of numbers and words. It is the basis for natural language processing and for the computational theory of perceptions. It is the art of combining human and machine perception to solve real-world problems left unsolved for lack of a proper mechanism. Animal voice interpretation, lie detection, driving a vehicle in heavy traffic, and natural language interpretation are applications that need to be automated for the next generation. The computational theory is a group of perceptions used to express propositions in a natural language. The aim of this research is to utilize intuitionistic fuzzy logic to interpret perceptions and thereby solve vague problems. The results show that the performance of the proposed method is better than that of existing methods.
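The intuitionistic fuzzy values the abstract relies on can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: each truth value pairs a membership degree with a non-membership degree (Atanassov style), and the example propositions are invented.

```python
# A minimal sketch of intuitionistic fuzzy truth values (Atanassov style).
# Each value carries a membership degree mu and a non-membership degree nu,
# with mu + nu <= 1; the remainder pi = 1 - mu - nu is the hesitation margin.

def ifv(mu, nu):
    assert 0.0 <= mu and 0.0 <= nu and mu + nu <= 1.0
    return (mu, nu)

def if_and(a, b):
    # Conjunction: take the weaker membership, the stronger non-membership.
    return (min(a[0], b[0]), max(a[1], b[1]))

def if_or(a, b):
    # Disjunction: take the stronger membership, the weaker non-membership.
    return (max(a[0], b[0]), min(a[1], b[1]))

def hesitation(a):
    return 1.0 - a[0] - a[1]

# Illustrative propositions: "traffic is heavy" and "road is slippery",
# each judged with some hesitation.
heavy = ifv(0.7, 0.2)
slippery = ifv(0.5, 0.3)
print(if_and(heavy, slippery))   # (0.5, 0.3)
```

The hesitation margin is what distinguishes this from ordinary fuzzy logic: a judge can decline to commit part of the truth mass either way, which is what makes the formalism attractive for vague perception-based problems.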

Author(s):  
L.A. Zadeh

I feel honored by the dedication of the Special Issue of IJCCC to me. I should like to express my deep appreciation to the distinguished Co-Editors and my good friends, Professors Balas, Dzitac and Teodorescu, and to the distinguished contributors, for honoring me. The subjects addressed in the Special Issue are on the frontiers of fuzzy logic.

The Foreword gives me an opportunity to share with the readers of the Journal my recent thoughts on a subject which I have been pondering for many years - fuzzy logic and natural languages. The first step toward linking fuzzy logic and natural languages was my 1973 paper, "Outline of a New Approach to the Analysis of Complex Systems and Decision Processes." Two key concepts were introduced in that paper: first, the concept of a linguistic variable - a variable which takes words as values; and second, the concept of a fuzzy if-then rule - a rule in which the antecedent and consequent involve linguistic variables. Today, close to forty years later, these concepts are widely used in most applications of fuzzy logic.

The second step was my 1978 paper, "PRUF - a Meaning Representation Language for Natural Languages." This paper laid the foundation for a series of papers in the eighties in which a fairly complete theory of fuzzy-logic-based semantics of natural languages was developed. My theory did not attract many followers, either within the fuzzy logic community or within the linguistics and philosophy of languages communities. There is a reason. The fuzzy logic community is largely a community of engineers, computer scientists and mathematicians - a community which has always shied away from semantics of natural languages. Symmetrically, the linguistics and philosophy of languages communities have shied away from fuzzy logic.

In the early nineties, a thought that began to crystallize in my mind was that in most applications of fuzzy logic, linguistic concepts play an important, if not very visible, role. It is this thought that motivated the concept of Computing with Words (CW or CWW), introduced in my 1996 paper "Fuzzy Logic = Computing with Words." In essence, Computing with Words is a system of computation in which the objects of computation are words, phrases and propositions drawn from a natural language. The same can be said about Natural Language Processing (NLP). In fact, CW and NLP have little in common and have altogether different agendas.

In large measure, CW is concerned with the solution of computational problems which are stated in a natural language. A simple example. Given: Probably John is tall. What is the probability that John is short? What is the probability that John is very short? What is the probability that John is not very tall? A less simple example. Given: Usually Robert leaves the office at about 5 pm. Typically it takes Robert about an hour to get home from work. What is the probability that Robert is home at 6:15 pm? What should be noted is that CW is the only system of computation which has the capability to deal with problems of this kind. The problem-solving capability of CW rests on two key ideas: first, the employment of so-called restriction-based semantics (RS) for translation of a natural language into a mathematical language in which the concept of a restriction plays a pivotal role; and second, the employment of a calculus of restrictions - a calculus which is centered on the Extension Principle of fuzzy logic.

What is thought-provoking is that neither traditional mathematics nor standard probability theory has the capability to deal with computational problems which are stated in a natural language. Not having this capability, it is traditional to dismiss such problems as ill-posed. In this perspective, perhaps the most remarkable contribution of CW is that it opens the door to empowering mathematics with a fascinating capability - the capability to construct mathematical solutions of computational problems which are stated in a natural language. The basic importance of this capability derives from the fact that much of human knowledge, and especially world knowledge, is described in natural language.

In conclusion, only recently did I begin to realize that the formalism of CW suggests a new and challenging direction in mathematics - the mathematical solution of computational problems which are stated in a natural language. For mathematics, this is unexplored territory.
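The Extension Principle mentioned above can be sketched concretely. The fragment below is an illustrative discretization of the "Robert" example, not Zadeh's restriction calculus: fuzzy numbers are finite dictionaries of value-to-membership pairs, the time points and grades are invented, and the sup-min rule propagates memberships through addition.

```python
# A hedged sketch of the Extension Principle on discretized fuzzy numbers:
# the membership of an output value is the sup (here: max) of min(mu_a, mu_b)
# over all input pairs that map to it. Times are in hours and illustrative.

def extend(f, a, b):
    out = {}
    for x, mx in a.items():
        for y, my in b.items():
            z = f(x, y)
            out[z] = max(out.get(z, 0.0), min(mx, my))
    return out

# "about 5 pm" (17:00) and "about an hour", as coarse fuzzy numbers:
leaves = {16.5: 0.3, 17.0: 1.0, 17.5: 0.3}
commute = {0.5: 0.4, 1.0: 1.0, 1.5: 0.4}

arrives = extend(lambda x, y: x + y, leaves, commute)
# Membership that Robert arrives at exactly 18:00:
print(arrives[18.0])  # 1.0 (from the pair 17.0 + 1.0)
```

A full restriction-based treatment would also carry the probabilistic qualifiers ("usually", "typically") as restrictions on probability distributions; this sketch shows only the possibilistic core.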


Author(s):  
TRU H. CAO

Conceptual graphs and fuzzy logic are two logical formalisms that target natural language: conceptual graphs provide a structure of formulas close to that of natural language sentences, while fuzzy logic provides a methodology for computing with words. This paper proposes fuzzy conceptual graphs as a knowledge representation language that combines the advantages of both formalisms, bringing artificial intelligence closer to human expression and reasoning. Firstly, the conceptual graph language is extended with functional relation types for representing functional dependency, and conjunctive types for joining concepts and relations. Then fuzzy conceptual graphs are formulated as a generalization of conceptual graphs in which fuzzy types and fuzzy attribute-values are used in place of crisp types and crisp attribute-values. Projection and join, the basic operations for reasoning on fuzzy conceptual graphs, are defined taking into account the semantics of fuzzy set-based values.
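The join operation the abstract names can be illustrated on a toy scale. This is a simplified stand-in, not the paper's definition: fuzzy types are represented here as a crisp type label plus a degree, the type hierarchy is invented, and degrees combine by min, echoing fuzzy-set intersection.

```python
# A toy sketch of joining two fuzzy concepts. Joining compatible concepts
# keeps the more specific type and combines degrees by min; incompatible
# types yield no join. The SUBTYPE hierarchy below is purely illustrative.

SUBTYPE = {"SportsCar": "Car", "Car": "Vehicle", "Vehicle": None}

def is_subtype(t, s):
    # Walk up the hierarchy from t looking for s.
    while t is not None:
        if t == s:
            return True
        t = SUBTYPE.get(t)
    return False

def join(c1, c2):
    (t1, d1), (t2, d2) = c1, c2
    if is_subtype(t1, t2):
        return (t1, min(d1, d2))   # keep the more specific type
    if is_subtype(t2, t1):
        return (t2, min(d1, d2))
    return None                    # incompatible types: no join

print(join(("SportsCar", 0.8), ("Vehicle", 0.6)))  # ('SportsCar', 0.6)
```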


2020 ◽  
Author(s):  
Masashi Sugiyama

Recently, word embeddings have been used successfully in many natural language processing problems, and how to train a robust and accurate word embedding system efficiently is a popular research area. Since many, if not all, words have more than one sense, it is necessary to learn vectors for all senses of a word separately. Therefore, in this project, we have explored two multi-sense word embedding models: the Multi-Sense Skip-gram (MSSG) model and the Non-Parametric Multi-Sense Skip-gram (NP-MSSG) model. Furthermore, we propose an extension of the Multi-Sense Skip-gram model, the Incremental Multi-Sense Skip-gram (IMSSG) model, which can learn the vectors of all senses of a word incrementally. We evaluate all the systems on a word similarity task and show that IMSSG performs better than the other models.
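The step all MSSG-style models share, choosing which sense vector a context activates, can be sketched briefly. The vectors below are tiny hand-made stand-ins, not trained embeddings, and the function is an illustration of the idea rather than any model's exact training rule.

```python
# A minimal sketch of multi-sense selection: each word keeps several sense
# vectors, and the sense whose vector is most similar (by cosine) to the
# averaged context embedding is chosen.

import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def pick_sense(sense_vectors, context_vectors):
    # Average the context, then choose the closest sense.
    dim = len(context_vectors[0])
    ctx = [sum(v[i] for v in context_vectors) / len(context_vectors)
           for i in range(dim)]
    scores = [cosine(s, ctx) for s in sense_vectors]
    return scores.index(max(scores))

# Two senses of "bank" (financial vs. river) in a toy 2-d space:
bank_senses = [[1.0, 0.0], [0.0, 1.0]]
context = [[0.9, 0.1], [0.8, 0.3]]     # finance-like neighbors
print(pick_sense(bank_senses, context))  # 0 (the financial sense)
```

The non-parametric variant (NP-MSSG) additionally creates a new sense when no existing sense is similar enough to the context; the incremental extension proposed here grows senses as data streams in.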


2011 ◽  
Vol 07 (01) ◽  
pp. 89-103
Author(s):  
CHONGFU HUANG

The information coded in natural language is called natural language information. It can be employed to analyze risks by computing with words. The disjunction and Cartesian product of fuzzy sets are the basic operations used to compute a probability distribution representing the random uncertainty of the risk source. An approach to inferring a risk with words is to represent the risk system using probabilistic and possibilistic constraints. In this paper, using fuzzy logic, we give an example to verify that the suggested approach is more flexible and effective.
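The two fuzzy-set operations the abstract names admit a compact sketch. The linguistic terms and membership grades below are illustrative only; the standard definitions are pointwise max for disjunction and pairwise min for the Cartesian product.

```python
# Fuzzy sets as dicts from linguistic values to membership grades.
# Disjunction: pointwise max. Cartesian product: pairwise min.

def disjunction(a, b):
    return {x: max(a.get(x, 0.0), b.get(x, 0.0)) for x in set(a) | set(b)}

def cartesian(a, b):
    return {(x, y): min(ma, mb) for x, ma in a.items() for y, mb in b.items()}

# "heavy rainfall" and "strong wind" as fuzzy sets over coarse magnitudes:
rain = {"low": 0.1, "medium": 0.6, "high": 0.9}
wind = {"low": 0.2, "high": 0.7}

print(cartesian(rain, wind)[("high", "high")])  # 0.7
```

In a risk analysis, the Cartesian product would relate the magnitudes of two risk factors, and disjunctions over observed cases would accumulate the possibility distribution of the risk source.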


Author(s):  
PASCUAL JULIÁN-IRANZO ◽  
FERNANDO SÁENZ-PÉREZ

This paper introduces techniques to integrate WordNet into a Fuzzy Logic Programming system. Since WordNet relates words but does not give graded information on the relation between them, we have implemented standard similarity measures and new directives allowing the proximity equations linking two words to be generated with an approximation degree. Proximity equations are the key syntactic structures which, in addition to a weak unification algorithm, make a flexible query-answering process possible in this kind of programming language. This addition widens the scope of Fuzzy Logic Programming, allowing certain forms of lexical reasoning, and reinforcing Natural Language Processing (NLP) applications.
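One of the standard measures the paper can draw from WordNet is Wu-Palmer similarity. The sketch below hand-rolls it over a tiny invented taxonomy so the example is self-contained; the resulting degree is the kind of number that would annotate a proximity equation in the fuzzy logic program.

```python
# Wu-Palmer similarity over a toy taxonomy (invented for illustration):
# wup(a, b) = 2 * depth(lcs) / (depth(a) + depth(b)), where lcs is the
# lowest common subsumer and the root has depth 1.

PARENT = {"entity": None, "animal": "entity", "dog": "animal",
          "cat": "animal", "plant": "entity"}

def path_to_root(w):
    path = []
    while w is not None:
        path.append(w)
        w = PARENT[w]
    return path                              # e.g. ['dog', 'animal', 'entity']

def wup(a, b):
    pa, pb = path_to_root(a), path_to_root(b)
    lcs = next(x for x in pa if x in pb)     # lowest common subsumer
    depth = lambda w: len(path_to_root(w))
    return 2.0 * depth(lcs) / (depth(a) + depth(b))

print(round(wup("dog", "cat"), 3))    # 0.667 -> e.g. proximity dog ~ cat = 0.667
print(round(wup("dog", "plant"), 3))  # 0.4
```

In the actual system the taxonomy is WordNet's hypernym hierarchy, and the computed degree parameterizes the weak unification of the two words during query answering.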


2019 ◽  
Author(s):  
Dimmy Magalhães ◽  
Aurora Pozo ◽  
Roberto Santana

Text classification is one of the tasks of Natural Language Processing (NLP). In this area, Graph Convolutional Networks (GCNs) have achieved higher values than CNNs and other related models. For a GCN, the metric that defines the correlation between words in a vector space plays a crucial role in the classification, because it determines the weight of the edge between two words (represented by nodes in the graph). In this study, we empirically investigated the impact of thirteen distance/similarity measures. A representation was built for each document using word embeddings from a word2vec model. Also, a graph-based representation of five datasets was created for each measure analyzed, where each word is a node in the graph and each edge is weighted by the distance/similarity between words. Finally, each model was run in a simple graph neural network. The results show that, concerning text classification, there is no statistical difference among the analyzed metrics for the Graph Convolutional Network. Even with the incorporation of external words or external knowledge, the results were similar to those of the methods without such incorporation. However, the results indicate that some distance metrics behave better than others with respect to context capture, with Euclidean distance reaching the best values or having statistical similarity with the best.
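The edge-weighting step the study varies can be sketched as follows. The toy 2-d vectors stand in for word2vec embeddings, and the distance-to-similarity transform is one common choice, not necessarily the one used in the paper.

```python
# Build a weighted word graph from embeddings: each word pair gets an edge
# weighted by a chosen similarity. Euclidean distance is converted to a
# similarity so that larger weights always mean closer words.

import math

def euclidean_sim(u, v):
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return 1.0 / (1.0 + d)       # a common distance-to-similarity transform

def adjacency(vectors, sim):
    words = list(vectors)
    return {(w1, w2): sim(vectors[w1], vectors[w2])
            for i, w1 in enumerate(words) for w2 in words[i + 1:]}

# Toy 2-d stand-ins for word2vec vectors:
emb = {"movie": [1.0, 0.2], "film": [0.9, 0.3], "tax": [0.1, 1.0]}
adj = adjacency(emb, euclidean_sim)
print(adj[("movie", "film")] > adj[("movie", "tax")])  # True
```

Swapping `euclidean_sim` for cosine or any of the other twelve measures changes only the edge weights; the graph structure and the downstream GCN are unchanged, which is what makes the comparison in the study clean.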

