Communication between caretakers and patients is limited and difficult when the patients are too weary and weak to speak. Clinic-based patient monitoring therefore involves carefully reading the implicit communication of such patients, inferring their needs from their gestures. Such a system must be highly sensitive and precise, since any misinterpretation may lead to adverse effects. The proposed wearable system accurately conveys this implied communication to caretakers or to an automated support device, using the simplest form of palpable hand movements. The system offers not only a methodology simpler than existing sign-language interpretation for such implicit communication, but also a set of gestures appropriate for use in such systems. Experimental results from two different modeling methods show that diverse hand-movement activities are well distinguished using a wearable sensor medium, and the interpretation consistently yields significant, sharp thresholds. © 2013 IEEE.
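The abstract does not disclose the paper's actual classification procedure, so the following is only a minimal illustrative sketch of the general idea it describes: mapping windows of wearable-sensor readings to coarse hand-movement labels via fixed thresholds. All function names, threshold values, and sample data below are invented assumptions, not the authors' method.

```python
def window_energy(samples):
    """Mean squared amplitude of one window of accelerometer readings."""
    return sum(s * s for s in samples) / len(samples)

def classify_gesture(samples, rest_thresh=0.05, strong_thresh=0.5):
    """Map a window of sensor samples to a coarse gesture label using two
    fixed energy thresholds (assumed values, for illustration only)."""
    e = window_energy(samples)
    if e < rest_thresh:
        return "rest"
    elif e < strong_thresh:
        return "weak_movement"
    return "strong_movement"

# Synthetic example windows:
print(classify_gesture([0.01, -0.02, 0.015, 0.0]))   # low energy -> rest
print(classify_gesture([0.3, -0.25, 0.28, -0.31]))   # moderate -> weak_movement
print(classify_gesture([0.9, -1.1, 1.0, -0.95]))     # high -> strong_movement
```

A real system of the kind the abstract describes would likely replace these fixed thresholds with the two learned models the authors evaluate; the sketch only shows why sharp, well-separated thresholds make the interpretation reliable.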