Day 3 - Other Applications for LSTM Networks
Last night I was thinking about how, in the very first ML exercise, I used a Recurrent Neural Network to generate the next possible characters given a text sequence as input. Even though that was a failed experiment, since all I could generate was gibberish, I thought the approach could play nicely with an ML task I'm currently working on with a client.
I had to create a model to identify patterns in different time series. My thought is that time series and text sequences look similar from a machine's point of view: in both, every single point in the sequence is connected to what comes before and what comes after, and ultimately both are encoded as numbers before they can be fed into an ML model.
So what I did was split a long time series into fixed-length sequences of shape (10, 5): 10 time steps per sequence and 5 features each. This way, no matter how long the original series is, I can classify each 10-step pattern on its own.
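The windowing step above can be sketched with NumPy. This is a minimal sketch, assuming non-overlapping windows and random stand-in data (the series itself and its length are made up for illustration):

```python
import numpy as np

# Hypothetical stand-in for the real data: one long multivariate
# time series with 5 features per time step.
series = np.random.rand(103, 5)

window = 10  # time steps per sequence

# Drop the trailing remainder so the series splits evenly,
# then reshape into non-overlapping (window, features) chunks.
n_windows = len(series) // window
x = series[:n_windows * window].reshape(n_windows, window, series.shape[1])

print(x.shape)  # (10, 10, 5)
```

With overlapping windows (a sliding step smaller than 10) you would get more training examples from the same series, at the cost of correlated samples.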
Here’s the RNN I used in TensorFlow:
```python
model = tf.keras.Sequential([
    # Bidirectional LSTM reads each window forwards and backwards.
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(units=128),
        # (time steps, features) — the batch dimension is implicit.
        input_shape=(x_train.shape[1], x_train.shape[2])),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(128, activation='relu'),
    # One softmax probability per class.
    tf.keras.layers.Dense(y_train.shape[1], activation='softmax'),
])
```
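To make this runnable end to end, here's a minimal sketch of compiling and training it. The data here is random stand-in data with made-up dimensions (200 windows, 3 classes), just to show the shapes flowing through:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-ins for the real dataset:
# 200 sequences of 10 time steps with 5 features, 3 classes.
x_train = np.random.rand(200, 10, 5).astype('float32')
y_train = tf.keras.utils.to_categorical(np.random.randint(0, 3, 200), 3)

model = tf.keras.Sequential([
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(128),
        input_shape=(x_train.shape[1], x_train.shape[2])),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(y_train.shape[1], activation='softmax'),
])

# Softmax output + one-hot labels -> categorical cross-entropy.
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit(x_train, y_train, epochs=1, batch_size=32, verbose=0)

# Each prediction is a probability distribution over the 3 classes.
print(model.predict(x_train[:4], verbose=0).shape)  # (4, 3)
```

One epoch on random labels obviously learns nothing useful; the point is only that the shapes line up from windowed input to per-window class probabilities.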
- Classification models are a very good fit for pattern recognition tasks!
- Data preprocessing is a big part of any Machine Learning project