Recurrent Neural Networks (RNNs) are popular deep learning architectures used in Natural Language Processing for analyzing the sentiment of sentences. The recurrent nature of these networks enables them to use information from previous time steps. In this paper, we analyze the performance of three RNN variants, namely vanilla RNNs, Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRU). Both unidirectional and bidirectional forms of these networks are considered. Pretrained word vectors from the Google News dataset are used. We evaluate the performance of these networks on the Amazon health product reviews dataset and on the sentiment analysis benchmark datasets SST-1 and SST-2. © 2017 IEEE.
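As a minimal sketch of the recurrence the abstract refers to, the following NumPy code implements a vanilla RNN cell and a GRU cell and folds them over a toy sequence. All dimensions, initializations, and names here are illustrative assumptions for exposition, not details taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class VanillaRNNCell:
    """Simplest recurrence: new state mixes current input with previous state."""
    def __init__(self, input_dim, hidden_dim, rng):
        self.Wx = rng.standard_normal((hidden_dim, input_dim)) * 0.1
        self.Wh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
        self.b = np.zeros(hidden_dim)

    def step(self, x, h):
        return np.tanh(self.Wx @ x + self.Wh @ h + self.b)

class GRUCell:
    """Gated recurrence: gates control how much past information is kept."""
    def __init__(self, input_dim, hidden_dim, rng):
        shape = (hidden_dim, input_dim + hidden_dim)
        self.Wz = rng.standard_normal(shape) * 0.1  # update gate weights
        self.Wr = rng.standard_normal(shape) * 0.1  # reset gate weights
        self.Wh = rng.standard_normal(shape) * 0.1  # candidate state weights

    def step(self, x, h):
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh)  # how much of the state to update
        r = sigmoid(self.Wr @ xh)  # how much of the past to expose
        h_cand = np.tanh(self.Wh @ np.concatenate([x, r * h]))
        return (1 - z) * h + z * h_cand  # interpolate old state and candidate

def run_sequence(cell, xs, hidden_dim):
    # Unidirectional pass: the hidden state carries information
    # from previous time steps left to right through the sequence.
    h = np.zeros(hidden_dim)
    for x in xs:
        h = cell.step(x, h)
    return h

rng = np.random.default_rng(0)
xs = [rng.standard_normal(4) for _ in range(5)]  # toy 5-step input sequence
h_rnn = run_sequence(VanillaRNNCell(4, 8, rng), xs, 8)
h_gru = run_sequence(GRUCell(4, 8, rng), xs, 8)
print(h_rnn.shape, h_gru.shape)
```

A bidirectional variant, as evaluated in the paper, would run the same recurrence a second time over the reversed sequence and concatenate the two final states; in a sentiment classifier, the resulting vector would feed a softmax layer, with the inputs taken from pretrained word vectors rather than random values.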