This workflow trains a simple convolutional neural network (CNN) on the MNIST dataset via TensorFlow.
This workflow uses the TensorFlow Python bindings to create and train a multilayer perceptron. The trained network is then used to predict the class of unseen data. For more information on the dataset, see https://archive.ics.uci.edu/ml/datasets/Statlog+(Landsat+Satellite)
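At its core, a multilayer perceptron is a stack of fully connected layers, each applying an affine transform followed by a nonlinearity; the class prediction is the output unit with the highest activation. A minimal pure-Python sketch of a forward pass (the layer sizes and weights below are illustrative, not taken from this workflow):

```python
import math

def dense(inputs, weights, biases):
    # One fully connected layer with a sigmoid activation:
    # output_j = sigmoid(sum_i inputs_i * w_ji + b_j)
    return [
        1.0 / (1.0 + math.exp(-(sum(x * w for x, w in zip(inputs, row)) + b)))
        for row, b in zip(weights, biases)
    ]

# Illustrative 4-input, 3-hidden, 2-output perceptron with made-up weights
hidden = dense([0.5, 0.1, 0.9, 0.3],
               [[0.2, -0.4, 0.1, 0.6],
                [0.7, 0.3, -0.2, 0.1],
                [-0.5, 0.9, 0.4, 0.0]],
               [0.1, -0.1, 0.0])
output = dense(hidden,
               [[0.3, -0.6, 0.8],
                [0.5, 0.2, -0.7]],
               [0.0, 0.1])

# Predicted class = index of the largest output activation
predicted = max(range(len(output)), key=output.__getitem__)
```

In practice TensorFlow learns the weights via backpropagation; this sketch only shows how a trained network maps an input row to a class score.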
This workflow shows how to train a simple neural network for text classification, in this case sentiment analysis. The network learns a 128-dimensional word embedding followed by an LSTM layer.
In this workflow we pre-process the image data, which we will use throughout the following example workflows.
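Image pre-processing for neural networks typically means scaling pixel intensities into a fixed numeric range and reshaping the image into the tensor layout the network expects. A minimal sketch of two such steps, using stdlib Python (the exact operations in the metanode may differ):

```python
def normalize_image(pixels, max_value=255.0):
    # Scale integer pixel intensities into the [0, 1] range
    # commonly expected by neural network inputs
    return [[p / max_value for p in row] for row in pixels]

def flatten(pixels):
    # Flatten a 2-D image into a 1-D feature vector
    return [p for row in pixels for p in row]

image = [[0, 128], [255, 64]]   # tiny illustrative 2x2 grayscale image
scaled = normalize_image(image)
vector = flatten(scaled)
```

Normalization keeps all inputs on a comparable scale, which stabilizes training; the flattening step is only needed for dense input layers, while convolutional networks keep the 2-D layout.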
Instead of creating our own network architecture as in the previous workflow "Train simple CNN", in this workflow we use the pre-trained network architecture VGG16.
In this workflow we create a simple Convolutional Neural Network using the DL Python Network Creator. We train this network on our image data using the DL Python Network Learner and finally score it using the DL Python Network Executor. The DL Python Network Learner and Executor can be used to write custom training and execution code using Python.
In this workflow we are fine-tuning a VGG16 network, similar to "Fine-tune VGG16 (Python)". However, we won't make use of the DL Python Learner/Executor nodes; instead, we use the DL Keras Network Learner and DL Network Executor nodes to train and execute our networks in this workflow.
The workflow generates text in fairy tale style. It reads the previously trained TensorFlow network, predicts a sequence of index-encoded characters within a loop, and translates the sequence of indexes into characters.
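The final translation step is a simple lookup: each predicted index is mapped back to its character and the characters are joined into text. A sketch with an illustrative vocabulary (the real workflow derives its mapping from the training text):

```python
# Illustrative index-to-character vocabulary; in the workflow this mapping
# comes from the character set of the fairy tale training corpus
index_to_char = {0: " ", 1: "e", 2: "h", 3: "o", 4: "l", 5: "r", 6: "w", 7: "d"}

def decode(indexes, mapping):
    # Map each predicted index back to its character and join into text
    return "".join(mapping[i] for i in indexes)

generated = decode([2, 1, 4, 4, 3, 0, 6, 3, 5, 4, 7], index_to_char)
```

The same mapping, inverted, is what the pre-processing side uses to turn characters into the indexes the network is trained on.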
The workflow builds, trains, and saves an RNN with an LSTM layer to generate new, fictional fairy tales. The brown nodes define the network structure. The "Pre-Processing" metanode reads fairy tales, index-encodes them, and creates semi-overlapping sequences. The Keras Network Learner node trains the network on the index-encoded fairy tales. Finally, the trained network is converted into a TensorFlow model and saved to a file.
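The pre-processing described above can be sketched in a few lines: build a character vocabulary, encode the text as indexes, then slide a window over the encoded text so that consecutive training sequences partially overlap. The window length and step below are illustrative, not the workflow's actual settings:

```python
def index_encode(text):
    # Build a character vocabulary and encode the text as integer indexes
    vocab = sorted(set(text))
    char_to_index = {c: i for i, c in enumerate(vocab)}
    return [char_to_index[c] for c in text], vocab

def overlapping_sequences(indexes, length, step):
    # Slide a window of `length` over the text, advancing by `step`
    # characters, so consecutive training sequences partially overlap
    return [indexes[i:i + length] for i in range(0, len(indexes) - length + 1, step)]

encoded, vocab = index_encode("once upon a time")
sequences = overlapping_sequences(encoded, length=8, step=4)
```

Choosing a step smaller than the window length multiplies the number of training sequences extracted from the same text, which is why the workflow uses semi-overlapping rather than disjoint sequences.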
The workflow generates 200 new, fictional mountain names. It reads the previously trained TensorFlow network and predicts 200 sequences of index-encoded characters within a loop. The last node, "Extract Mountain Names", translates the sequences of indexes into characters and visualizes the new mountain names.