How to reproduce Keras SimpleRNN behaviour


I’m trying to learn how keras.layers.SimpleRNN works by following a relatively straightforward tutorial (https://machinelearningmastery.com/understanding-simple-recurrent-neural-networks-in-keras/). However, this tutorial assumes the input is scalar, and I’ve been unable to scale this to higher-dimensional inputs. Here is my attempt to reproduce the behaviour of a simple RNN with 1×2 input for 3 time steps:

import numpy as np
import keras
from keras import layers

inputs = np.array([[[1, 2]]])
inputs = np.repeat(inputs, repeats = 3, axis=1)
inputs.shape

model = keras.Sequential()
model.add(keras.Input(shape=(3, 2), name="input"))
model.add(layers.SimpleRNN(4, name="rnn", activation='linear'))
model.add(layers.Dense(5, name="output", activation='linear'))

outputs = model(inputs)

w_inputs = model.get_weights()[0]
w_hidden = model.get_weights()[1]
b_hidden = model.get_weights()[2]
w_dense = model.get_weights()[3]
b_dense = model.get_weights()[4]

h0 = np.zeros(4)
h1 = np.matmul(np.array([[1, 2]]), w_inputs) + h0 + b_hidden
h2 = np.matmul(np.array([[1, 2]]), w_inputs) + b_hidden
h3 = np.matmul(np.array([[1, 2]]), w_inputs) + b_hidden
o3 = np.matmul(h3, w_dense) + b_dense

print(f'output: {outputs}')
print(f'expected: {o3}')

The output generated from keras and the values I calculated myself don’t match:

output: [[-6.937807    4.0890574  -0.15574443 -1.2262737   1.9948364 ]]
expected: [[-2.38697288  1.03174649 -1.44180991 -1.28951854  0.39630487]]

Why don't they match?
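For reference, the recurrence a SimpleRNN step applies is h_t = activation(x_t · W_x + h_{t-1} · W_h + b): the previous hidden state is multiplied by the recurrent kernel (Keras's second weight matrix), not merely carried over or dropped. A minimal NumPy sketch of that loop, using stand-in random weights (the names `W_x`, `W_h`, `b` are mine, not Keras's):

```python
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 3, 2, 4                          # 3 time steps, 1x2 input, 4 hidden units
x = np.tile(np.array([[1.0, 2.0]]), (T, 1))     # same input repeated at each step

W_x = rng.normal(size=(d_in, d_h))              # input kernel  (get_weights()[0])
W_h = rng.normal(size=(d_h, d_h))               # recurrent kernel (get_weights()[1])
b = rng.normal(size=d_h)                        # bias (get_weights()[2])

# Linear activation, matching activation='linear' above:
# h_t = x_t @ W_x + h_{t-1} @ W_h + b
h = np.zeros(d_h)
for t in range(T):
    h = x[t] @ W_x + h @ W_h + b
print(h)  # final hidden state after 3 steps
```

With the actual weights from `model.get_weights()` substituted in (and the Dense layer applied to the final `h`), this loop should reproduce the Keras output.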



Source: https://stackoverflow.com/questions/70629215/how-to-reproduce-keras-simplernn-behaviour
