Statistical machine translation has allowed us to construct models for translating a sequence of text from one language into another, but such models are limited when translating material dissimilar to the training corpora. This motivates neural machine translation, which applies an encoder–decoder network to a sequence of text to output a sequence of text in a different language. In this paper, we propose a neural network model called the LSTM Encoder–Decoder, which consists of two long short-term memory (LSTM) networks. One network encodes a sequence of characters into a fixed-length vector representation, and the other decodes that representation into another sequence of characters. © 2020 SERSC.
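
The character-level encoder–decoder flow described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `LSTMCell`, the one-hot character inputs, the argmax readout, and all dimensions are assumptions chosen for brevity, and the weights are random and untrained, so the decoded output is arbitrary. The point is only the data flow: the encoder LSTM consumes characters and its final state is the fixed-length vector, which seeds the decoder LSTM.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class LSTMCell:
    """A bare LSTM cell over plain Python lists (illustrative, untrained)."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = random.Random(seed)
        self.hidden_size = hidden_size
        # Four gates (input, forget, cell candidate, output), each mapping
        # the concatenation [x; h] to a hidden_size vector.
        self.W = [[[rng.uniform(-0.1, 0.1) for _ in range(input_size + hidden_size)]
                   for _ in range(hidden_size)] for _ in range(4)]
        self.b = [[0.0] * hidden_size for _ in range(4)]

    def step(self, x, h, c):
        z = x + h  # concatenate input and previous hidden state
        pre = [[sum(w * v for w, v in zip(self.W[k][j], z)) + self.b[k][j]
                for j in range(self.hidden_size)] for k in range(4)]
        i = [sigmoid(v) for v in pre[0]]
        f = [sigmoid(v) for v in pre[1]]
        g = [math.tanh(v) for v in pre[2]]
        o = [sigmoid(v) for v in pre[3]]
        c_new = [f[j] * c[j] + i[j] * g[j] for j in range(self.hidden_size)]
        h_new = [o[j] * math.tanh(c_new[j]) for j in range(self.hidden_size)]
        return h_new, c_new

def one_hot(idx, size):
    v = [0.0] * size
    v[idx] = 1.0
    return v

def encode(cell, text, vocab):
    """Run the encoder LSTM over the characters; the final (h, c) is the
    fixed-length vector representation of the whole sequence."""
    h = [0.0] * cell.hidden_size
    c = [0.0] * cell.hidden_size
    for ch in text:
        h, c = cell.step(one_hot(vocab[ch], len(vocab)), h, c)
    return h, c

def decode(cell, h, c, vocab_size, max_len):
    """Greedily emit characters from the encoder's final state. The argmax
    readout over the hidden state stands in for a learned output layer."""
    out = []
    x = [0.0] * vocab_size  # start symbol (all zeros)
    for _ in range(max_len):
        h, c = cell.step(x, h, c)
        idx = max(range(vocab_size), key=lambda j: h[j])
        out.append(idx)
        x = one_hot(idx, vocab_size)  # feed prediction back in
    return out

vocab = {"h": 0, "e": 1, "l": 2, "o": 3}
encoder = LSTMCell(input_size=len(vocab), hidden_size=8, seed=1)
decoder = LSTMCell(input_size=len(vocab), hidden_size=8, seed=2)
h, c = encode(encoder, "hello", vocab)   # fixed-length representation
indices = decode(decoder, h, c, len(vocab), max_len=5)
```

In a trained model the two networks would be optimized jointly so that the decoder reproduces the target-language character sequence; here the example only demonstrates that the encoder compresses a variable-length input into a fixed-size state that fully determines the decoder's behavior.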