An Analysis of Decision Theoretic Kernalized Rough C-Means
Published in Springer Singapore
Volume: 517
Pages: 513 - 524
There are several algorithms used for data clustering, and as imprecision has become an inherent part of datasets nowadays, many such algorithms have been developed using fuzzy sets, rough sets, intuitionistic fuzzy sets, and their hybrid models. In order to increase the flexibility of conventional rough approximations, a probability-based rough set model, decision theoretic rough sets (DTRS), was introduced in the 1990s. Using this model, Li et al. extended the conventional rough c-means. The Euclidean distance has been used to measure the similarity among data. However, as has been observed, the Euclidean distance is suitable only for linearly separable data. As a solution, several kernel distances have been used in the literature. We have selected three of the most popular kernels and developed an improved Kernelized rough c-means algorithm. We compare the results with the basic decision theoretic rough c-means. For the comparison we have used three datasets, namely Iris, Wine, and Glass. The three kernel functions used are the Radial Basis, the Gaussian, and the hyperbolic tangent. The experimental analysis using the measuring indices DB (Davies–Bouldin) and D (Dunn) shows improved results for the Kernelized means. We also present various graphs to showcase the clustered data. © Springer Nature Singapore Pte Ltd. 2017.
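The kernel trick replaces the Euclidean distance in c-means with a distance computed in the feature space induced by a kernel. A minimal sketch of that computation is below; the kernel parameter values (`gamma`, `alpha`, `c`) are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    # Radial basis / Gaussian kernel; gamma is an assumed illustrative value
    return np.exp(-gamma * np.sum((x - y) ** 2))

def tanh_kernel(x, y, alpha=0.01, c=0.0):
    # Hyperbolic tangent (sigmoid) kernel; alpha and c are assumed values
    return np.tanh(alpha * np.dot(x, y) + c)

def kernel_distance_sq(x, y, kernel):
    # Squared distance in the kernel-induced feature space:
    # ||phi(x) - phi(y)||^2 = K(x,x) - 2 K(x,y) + K(y,y)
    return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)

# Example: distance between two points under the RBF kernel
a = np.array([1.0, 2.0])
b = np.array([2.0, 0.0])
d2 = kernel_distance_sq(a, b, rbf_kernel)
```

In a kernelized rough c-means, this distance would replace the Euclidean distance when assigning objects to the lower and boundary approximations of each cluster; for the RBF kernel, `K(x, x) = 1`, so the expression simplifies to `2 * (1 - K(x, y))`.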
About the journal
Journal: Advances in Intelligent Systems and Computing (Artificial Intelligence and Evolutionary Computations in Engineering Systems)
Publisher: Springer Singapore
Open Access: No