Dimensionality Reduction With Multi-Fold Deep Denoising Autoencoder
Published by IGI Global
2020
Pages: 154 - 165
Abstract
Natural data arising directly from sources such as text, images, video, audio, and sensors inherently has a very large number of dimensions, or features. While these features add richness and perspective to the data, their associated sparsity increases the computational complexity of learning, makes the data difficult to visualize and interpret, and demands large-scale computational power to extract insights. This problem is famously known as the "curse of dimensionality." This chapter discusses conventional methods for curing the curse of dimensionality and analyzes their performance on complex datasets. It also discusses the advantages of nonlinear methods over linear ones, and of neural networks, which can be a better approach than other nonlinear methods. Finally, it discusses future research directions, such as the application of deep learning techniques as a cure for this curse.
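As an illustration of the neural-network-based, nonlinear approach the chapter points to, the sketch below shows a minimal denoising autoencoder that compresses high-dimensional inputs into a low-dimensional code. The layer sizes, noise level, and training setup are illustrative assumptions, not the chapter's exact multi-fold architecture.

```python
# Minimal denoising autoencoder sketch (PyTorch). Layer sizes, noise level,
# and optimizer settings are illustrative assumptions, not the chapter's
# exact multi-fold design.
import torch
import torch.nn as nn

class DenoisingAutoencoder(nn.Module):
    def __init__(self, input_dim=784, code_dim=32):
        super().__init__()
        # Encoder maps the high-dimensional input to a low-dimensional code.
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, code_dim), nn.ReLU(),
        )
        # Decoder reconstructs the clean input from that code.
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 256), nn.ReLU(),
            nn.Linear(256, input_dim), nn.Sigmoid(),
        )

    def forward(self, x, noise_std=0.2):
        # Corrupt the input with Gaussian noise; learning to reconstruct the
        # clean input forces a robust low-dimensional representation.
        noisy = x + noise_std * torch.randn_like(x)
        code = self.encoder(noisy)
        return self.decoder(code), code

model = DenoisingAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()

# One training step on a random batch (stand-in for real data).
x = torch.rand(64, 784)
reconstruction, code = model(x)
loss = criterion(reconstruction, x)  # reconstruction error against the clean input
optimizer.zero_grad()
loss.backward()
optimizer.step()

# After training, model.encoder(x) yields the reduced-dimension representation.
```

After training, the encoder alone serves as the dimensionality-reduction mapping, analogous to how PCA's projection matrix is used, but learned nonlinearly.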
About the journal
Journal: Advances in Systems Analysis, Software Engineering, and High Performance Computing (Deep Learning Techniques and Optimization Strategies in Big Data Analytics)
Publisher: IGI Global
ISSN: 2327-3453
Open Access: No