A Brain-Computer Interface (BCI) is a technology that enables a human to communicate with an external device to achieve a desired result. This paper presents a Motor Imagery (MI) Electroencephalography (EEG) signal-based system for controlling the lifting and dropping movements of an external robotic arm. The MI-EEG signals were acquired using a 3-channel electrode system with the AD8232 amplifier, with electrodes placed at three locations: C3, C4, and the right mastoid. Signal processing methods, namely a Butterworth filter and Sym-9 Wavelet Packet Decomposition (WPD), were applied to de-noise the raw EEG signals. Statistical features, including entropy, variance, standard deviation, covariance, and spectral centroid, were extracted from the de-noised signals. These features were then used to train a Multi-Layer Perceptron (MLP) Deep Neural Network (DNN) to classify the signal into two classes: 'No Hand Movement' and 'Hand Movement'. The resulting k-fold cross-validated accuracy was 85.41%, and other classification metrics, such as precision, recall (sensitivity), specificity, and F1 score, were also calculated. The trained model was interfaced with an Arduino to move the robotic arm according to the class predicted by the DNN model in a real-time environment. The proposed end-to-end low-cost deep learning framework provides a substantial improvement in real-time BCI. © 2020 North Atlantic University Union NAUN. All rights reserved.
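
To illustrate the feature-extraction stage described above, the following is a minimal sketch of how a subset of the named statistical features (entropy, variance, standard deviation, and spectral centroid) might be computed from a single de-noised EEG channel. The exact feature definitions, bin counts, and sampling rate are assumptions, not taken from the paper; covariance is omitted because it requires a second channel.

```python
import numpy as np

def extract_features(signal, fs=250):
    """Compute single-channel statistical features from a de-noised EEG
    segment (assumed definitions; the paper does not give exact formulas).

    Returns [entropy, variance, standard deviation, spectral centroid].
    """
    variance = np.var(signal)
    std_dev = np.std(signal)

    # Shannon entropy over a normalized amplitude histogram
    # (32 bins is an illustrative choice, not from the paper)
    hist, _ = np.histogram(signal, bins=32)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    # Spectral centroid: power-weighted mean frequency of the segment
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    centroid = np.sum(freqs * power) / np.sum(power)

    return np.array([entropy, variance, std_dev, centroid])
```

A feature vector like this, concatenated across the recording channels, would then serve as the input to the MLP classifier.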