Neural Machine Translation

The workflow uses a character-level encoder-decoder network of LSTMs.
The encoder network reads the input sentence character by character and summarizes the sentence in its state.
This state is then used as the initial state of the decoder network, which produces the translated sentence one character at a time.
During prediction, the decoder also receives its previous output as the input for the next time step.
For training we use a technique called "teacher forcing", i.e. we feed the actual previous character instead of the previous prediction, which greatly benefits training. A sketch of this setup follows below.
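The KNIME workflow builds this network with its Keras integration nodes; as a rough, hand-written counterpart, the sketch below shows the same idea in Keras Python code. All names and sizes (num_encoder_chars, num_decoder_chars, latent_dim, start_idx, stop_idx) are illustrative assumptions, not values taken from the workflow.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

num_encoder_chars = 70   # assumed size of the source character set
num_decoder_chars = 90   # assumed size of the target character set
latent_dim = 256         # assumed LSTM state size

# Encoder: reads the one-hot encoded source sentence character by
# character and summarizes it in its final hidden and cell states.
encoder_inputs = keras.Input(shape=(None, num_encoder_chars))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)
encoder_states = [state_h, state_c]

# Decoder: starts from the encoder's states. For training (teacher
# forcing) it is fed the actual previous target character, not its own
# previous prediction.
decoder_inputs = keras.Input(shape=(None, num_decoder_chars))
decoder_lstm = layers.LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_dense = layers.Dense(num_decoder_chars, activation="softmax")
decoder_outputs = decoder_dense(decoder_outputs)

training_model = keras.Model([encoder_inputs, decoder_inputs], decoder_outputs)
training_model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
# training_model.fit([encoder_input_data, decoder_input_data],
#                    decoder_target_data, ...)

# Inference: the decoder now receives its own previous output as the
# input for the next time step, one character at a time.
encoder_model = keras.Model(encoder_inputs, encoder_states)
state_h_in = keras.Input(shape=(latent_dim,))
state_c_in = keras.Input(shape=(latent_dim,))
dec_out, h, c = decoder_lstm(decoder_inputs, initial_state=[state_h_in, state_c_in])
decoder_model = keras.Model([decoder_inputs, state_h_in, state_c_in],
                            [decoder_dense(dec_out), h, c])

def translate(input_seq, start_idx, stop_idx, max_len=100):
    # Encode the source sentence into the initial decoder state.
    h, c = encoder_model.predict(input_seq, verbose=0)
    target = np.zeros((1, 1, num_decoder_chars))
    target[0, 0, start_idx] = 1.0  # start-of-sequence character
    decoded = []
    for _ in range(max_len):
        out, h, c = decoder_model.predict([target, h, c], verbose=0)
        idx = int(np.argmax(out[0, -1, :]))
        if idx == stop_idx:
            break
        decoded.append(idx)
        target = np.zeros((1, 1, num_decoder_chars))
        target[0, 0, idx] = 1.0  # feed the prediction back in
    return decoded

The split into a training model and separate encoder/decoder inference models is the standard way to combine teacher forcing during training with step-by-step decoding at prediction time.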

Resources

EXAMPLES Server: 04_Analytics/14_Deep_Learning/02_Keras/05_Neural_Machine_Translation*
Download a zip-archive

Blog:

* Find more about the Examples Server here.
The link will open the workflow directly in KNIME Analytics Platform (requirements: Windows; KNIME Analytics Platform must be installed with the Installer version 3.2.0 or higher). In other cases, please use the link to the zip-archive or open the provided path manually.