CORE - An optimal data placement strategy in Hadoop for data intensive applications based on cohesion relation
Published in CRL Publishing
2019
Volume: 34
Issue: 1
Pages: 47 - 60
Abstract
The tremendous growth of data being generated today makes storage and computing a mammoth task. With its distributed processing capability, Hadoop provides an efficient solution for such large data. Hadoop's default data placement strategy places data blocks randomly across the nodes without considering execution parameters, resulting in several shortcomings such as increased execution time and query latency. Moreover, much of the data required for a task's execution may not be locally available, which creates a data-locality problem. Hence, we propose an innovative data placement strategy based on the dependency of data blocks across the nodes. Our strategy dynamically analyses the history log and establishes the relationship between the various tasks and the blocks required by each task through a Block Dependency Graph (BDG). Our CORE algorithm then re-organizes the HDFS layout by redistributing the data blocks to give an optimal data placement, resulting in improved performance for Big Data sets in a distributed environment. The strategy was tested on a 20-node cluster with different real-world MapReduce (MR) applications. The results show that the proposed strategy reduces query execution time by 23% and improves data locality by 50.7% compared to the default placement strategy. © 2019 CRL Publishing Ltd.
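The abstract outlines the general idea: mine the task-execution history log to learn which HDFS blocks are repeatedly accessed together, encode that as a weighted Block Dependency Graph, and then redistribute blocks so that strongly dependent (cohesive) blocks land on the same node. The sketch below illustrates that idea only; it is not the authors' CORE implementation, and the function names, the log format (task_id, list of block ids), and the greedy placement heuristic are all assumptions for illustration.

```python
"""Illustrative sketch of a cohesion-based placement, assuming a simple
history-log format of (task_id, [block_ids accessed by that task])."""
from collections import defaultdict
from itertools import combinations


def build_bdg(history_log):
    """Build a Block Dependency Graph: edge weight = number of tasks that
    accessed both blocks (a proxy for block cohesion)."""
    bdg = defaultdict(int)
    for _task, blocks in history_log:
        for a, b in combinations(sorted(set(blocks)), 2):
            bdg[(a, b)] += 1
    return bdg


def cohesion_placement(bdg, blocks, nodes, capacity):
    """Greedy heuristic: walk dependency edges from heaviest to lightest and
    try to co-locate each block with its partner, subject to node capacity."""
    placement = {}
    load = {n: 0 for n in nodes}
    for (a, b), _w in sorted(bdg.items(), key=lambda kv: -kv[1]):
        for blk in (a, b):
            if blk in placement:
                continue
            partner = b if blk == a else a
            target = placement.get(partner)
            if target is None or load[target] >= capacity:
                target = min(nodes, key=lambda n: load[n])  # least-loaded node
            placement[blk] = target
            load[target] += 1
    # Blocks never co-accessed with another block go to the least-loaded node.
    for blk in blocks:
        if blk not in placement:
            target = min(nodes, key=lambda n: load[n])
            placement[blk] = target
            load[target] += 1
    return placement


if __name__ == "__main__":
    log = [("t1", ["b1", "b2"]), ("t2", ["b1", "b2", "b3"]), ("t3", ["b4"])]
    bdg = build_bdg(log)
    print(cohesion_placement(bdg, ["b1", "b2", "b3", "b4"], ["n1", "n2"], capacity=3))
```

In this toy run, blocks b1, b2, and b3 (which co-occur in tasks) end up on the same node while b4 goes to the other, which is the kind of locality improvement the abstract attributes to the cohesion-based layout.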
About the journal
Journal: Computer Systems Science and Engineering
Publisher: CRL Publishing
ISSN: 0267-6192
Open Access: No