The storage and processing of rapidly growing data have become easier with the advent of cloud computing. Computing paradigms have evolved from centralized to distributed, from distributed to grid and from grid to cloud. Parallel programming models for processing large data sets across many CPUs have taken on new significance with the introduction of the MapReduce programming model. Hadoop, a framework for handling millions of records across clusters of computers, uses the MapReduce programming model. Users often prefer Java for their MapReduce jobs because Hadoop itself is written in Java. Here, we discuss the MapReduce programming approach and other approaches/languages that can be used to write MapReduce jobs. © Medwell Journals, 2016.
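As a minimal sketch of the model the abstract refers to, the two phases of the classic word-count example can be expressed in plain Java. This is not Hadoop API code (a real Hadoop job would extend `Mapper` and `Reducer`); the class and method names here are illustrative, and Java streams stand in for the framework's shuffle-and-group step:

```java
import java.util.*;
import java.util.stream.*;

public class WordCountSketch {
    // "Map" phase: emit a (word, 1) pair for each word in an input line.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // "Reduce" phase: group pairs by word and sum the counts.
    // In Hadoop, the framework performs the grouping (shuffle) between phases.
    static Map<String, Integer> reduce(Stream<Map.Entry<String, Integer>> pairs) {
        return pairs.collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("hello world", "hello hadoop");
        Map<String, Integer> counts =
                reduce(lines.stream().flatMap(WordCountSketch::map));
        System.out.println(counts.get("hello")); // 2
        System.out.println(counts.get("world")); // 1
    }
}
```

The appeal of the model is visible even in this toy version: `map` is applied to each input record independently, so the framework can run it in parallel across a cluster, while `reduce` only ever sees all values for one key at a time.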