Today, data volumes have grown manifold. The domain of Big Data exploits an attribute referred to as ‘predictability’: predictions are drawn and, based on the user’s needs, the relevant data is presented to the user. Present-day database systems are quite complex, and testing is an essential phase of software verification; the quality of a database system therefore depends directly on the quality of its testing. Because of this high level of database complexity, testing itself has become complicated, which in turn has led to an intensive workload. The evolution of Big Data has led many enterprises to shift towards big data frameworks such as Hadoop. The proposed work recommends big data techniques for processing heterogeneous data items. One such technique is MapReduce, a well-known programming model built to address the issues outlined above. The research applies machine learning techniques and recommends an approach to big data validation and testing via MMSV (Multi Model Structure Validation) in the context of data handling. The MMSV model analyses data and interprets it to make future forecasts. Researchers have built methodologies centred on automated testing components to enhance the overall quality and efficiency of testing methods. Testing involves checking the massive volume of data gathered from various sources into a primary, central data warehouse. The results reveal that the combination of MMSV and big data techniques yields improved output compared to the prevailing system; moreover, the recommended system consumes less execution time. These techniques promise to address all the identified issues, providing possible remedies and tactics. © BEIESP.
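The abstract names MapReduce as the programming model underpinning the proposed processing of heterogeneous data. The paper itself gives no code, but the model's two phases can be illustrated with a minimal, single-machine word-count sketch (all function names here are illustrative, not taken from the paper):

```python
from collections import defaultdict

# Map phase: emit (word, 1) pairs for each input record.
def map_phase(record):
    for word in record.split():
        yield (word.lower(), 1)

# Shuffle: group intermediate pairs by key, as the framework would.
def shuffle(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

# Reduce phase: aggregate the values collected for each key.
def reduce_phase(key, values):
    return (key, sum(values))

def mapreduce(records):
    intermediate = [pair for r in records for pair in map_phase(r)]
    return dict(reduce_phase(k, v) for k, v in shuffle(intermediate))

counts = mapreduce(["big data testing", "big data validation"])
print(counts)  # {'big': 2, 'data': 2, 'testing': 1, 'validation': 1}
```

In a real Hadoop deployment the map and reduce functions run in parallel across cluster nodes and the shuffle is performed by the framework; the sketch only shows the data flow of the model.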