
RNNs and LSTMs are known to carry information from previous time steps as "memory", so that short- or long-range dependencies can be captured.
But in the following simple Keras model, where is that delay or memory mechanism? It looks like the model just takes each input and produces an output at the same time step!

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(10))
model.add(Dense(2))

1 Answer


The LSTM is applied to each time step of the input in turn. Information from distant time steps is carried forward from step to step in the LSTM's hidden state (and cell state). That is how the current time step can use information from past ones.
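To make the "memory" concrete, here is a minimal NumPy sketch of a single LSTM step looped over a sequence. This is not Keras's actual implementation; the weight shapes, gate ordering, and random inputs are illustrative assumptions. The point is that `h` and `c` are threaded from one time step to the next, which is exactly where the memory lives.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step: the gates see both the current input x
    and the previous hidden state h_prev."""
    z = W @ x + U @ h_prev + b           # all four gate pre-activations at once
    i, f, o, g = np.split(z, 4)          # input, forget, output gates + candidate
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c_prev + i * np.tanh(g)      # cell state carries long-range information
    h = o * np.tanh(c)                   # hidden state passed on to the next step
    return h, c

units, features, T = 10, 3, 5            # 10 units, like LSTM(10) in the question
W = rng.standard_normal((4 * units, features))
U = rng.standard_normal((4 * units, units))
b = np.zeros(4 * units)

h = np.zeros(units)                      # initial "memory" is empty
c = np.zeros(units)
for t in range(T):
    x_t = rng.standard_normal(features)  # one input per time step
    h, c = lstm_step(x_t, h, c, W, U, b) # h and c are threaded through time

print(h.shape)  # (10,) -- one output vector per unit at the final step
```

Nothing here depends on Keras: the recurrence `h, c = lstm_step(x_t, h, c, ...)` is the whole trick, and Keras runs this loop for you internally when you call the layer on a sequence.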

Because of the way you constructed the LSTM layer, it returns only the output of the last time step. However, if you construct it with the parameter return_sequences=True, it will return the whole sequence of per-time-step outputs instead of just the final one.
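The difference between the two modes can be sketched with a plain recurrent loop in NumPy (a simple tanh RNN cell stands in for the LSTM here; the weights and sizes are made-up for illustration). The default behaviour corresponds to keeping only the last hidden state, while return_sequences=True corresponds to stacking every intermediate hidden state:

```python
import numpy as np

rng = np.random.default_rng(1)
units, features, T = 10, 3, 5            # 10 units, 5 time steps

Wx = rng.standard_normal((units, features)) * 0.1
Wh = rng.standard_normal((units, units)) * 0.1

h = np.zeros(units)
outputs = []
for t in range(T):
    x_t = rng.standard_normal(features)
    h = np.tanh(Wx @ x_t + Wh @ h)       # simple recurrent update
    outputs.append(h)

last_only = h                            # default: shape (units,)
full_sequence = np.stack(outputs)        # return_sequences=True: shape (T, units)

print(last_only.shape)       # (10,)
print(full_sequence.shape)   # (5, 10)
```

In Keras terms (with a batch dimension in front), the layer's output shape changes from `(batch, 10)` to `(batch, timesteps, 10)` when you set return_sequences=True, which is what you need when stacking LSTM layers or applying a Dense layer per time step.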

