The image below, from Stack Overflow, explains it clearly.
An intuitive example
Another answer adds a good supplement: the LSTM output does not change the input's dimensionality, implying that the input and output dimensionality are both num_units.
Good answer. You usually have embeddings for your input data, so for simplicity assume one embedding per word. Say each word has a distributed representation of 150 dimensions; these are the features in the diagram above. Then num_units acts as the dimensionality of the RNN/LSTM cell (say 128), so the cell maps 150 -> 128, and hence the output dimensionality is 128. Batch size and time_steps remain as they are.
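The shape arithmetic in that comment can be sketched with a minimal vanilla-RNN step in NumPy (a simplified stand-in for an LSTM cell; the sizes and weight names here are assumptions for illustration, not from the original answer):

```python
import numpy as np

batch_size, time_steps = 4, 10
input_dim, num_units = 150, 128   # 150-d word features -> 128-unit RNN/LSTM cell

rng = np.random.default_rng(0)
x = rng.standard_normal((batch_size, time_steps, input_dim))

# Cell parameters: input-to-hidden and hidden-to-hidden weights.
W_xh = rng.standard_normal((input_dim, num_units)) * 0.01
W_hh = rng.standard_normal((num_units, num_units)) * 0.01
b_h = np.zeros(num_units)

h = np.zeros((batch_size, num_units))   # initial hidden state
outputs = []
for t in range(time_steps):
    # One recurrent step: the 150-d input is projected to num_units = 128.
    h = np.tanh(x[:, t, :] @ W_xh + h @ W_hh + b_h)
    outputs.append(h)

out = np.stack(outputs, axis=1)
print(out.shape)   # (4, 10, 128): batch and time_steps unchanged, features -> num_units
```

Batch size and time_steps pass through untouched; only the last (feature) axis changes, from 150 to num_units.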
The term num_units or num_hidden_units, sometimes noted with the variable name nhid in implementations, means that the input to the LSTM cell is a vector of dimension nhid (or, for a batched implementation, a matrix of shape batch_size x nhid). As a result, the output (from the LSTM cell) is also of the same dimensionality, since an RNN/LSTM/GRU cell doesn't alter the dimensionality of its input vector or matrix.
As pointed out earlier, this term was borrowed from the feed-forward neural network (FFN) literature and has caused confusion when used in the context of RNNs. But the idea is that even an RNN can be viewed as an FFN at each time step. In this view, the hidden layer would indeed contain num_hidden units, as depicted in this figure:
Source:
More concretely, in the example below, num_hidden_units or nhid would be 3, since the hidden state (the middle layer) is a 3-dimensional vector.
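As a rough sketch of that figure (the weight values here are made-up numbers for illustration, not taken from the original diagram):

```python
import numpy as np

# A single hidden layer with num_hidden_units = 3: whatever the input size,
# the hidden state it produces is a 3-dimensional vector.
x = np.array([0.5, -1.0])          # 2-d input at one time step
W_xh = np.array([[0.1, 0.2, 0.3],  # shape (input_dim=2, nhid=3)
                 [0.4, 0.5, 0.6]])
h = np.tanh(x @ W_xh)
print(h.shape)   # (3,)
```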
answered Sep 28 '18 at 20:01
You say "the input to the LSTM cell is a vector of dimension nhid". But the input is generally of shape [batch, T, input], where input can be of any size. So when the input is dynamically unrolled we have an input of [b, t, input], and the RNN transforms it to [b, t, nhid]. So it is the output, not the input, that has size nhid.
Most LSTM/RNN diagrams show only the hidden cells, never the units inside those cells; hence the confusion. Each hidden layer has as many hidden cells as there are time steps, and each hidden cell is made up of multiple hidden units, as in the diagram below. Therefore, the dimensionality of a hidden-layer matrix in an RNN is (number of time steps, number of hidden units).
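That layout can be made concrete by stacking the per-step hidden states into a matrix (a sketch; the sizes here are arbitrary assumptions):

```python
import numpy as np

time_steps, nhid, input_dim = 5, 3, 2
rng = np.random.default_rng(1)
x = rng.standard_normal((time_steps, input_dim))
W_xh = rng.standard_normal((input_dim, nhid))
W_hh = rng.standard_normal((nhid, nhid))

h = np.zeros(nhid)
hidden_cells = []
for t in range(time_steps):
    # Each hidden cell (one per time step) holds nhid hidden units.
    h = np.tanh(x[t] @ W_xh + h @ W_hh)
    hidden_cells.append(h)

H = np.stack(hidden_cells)
print(H.shape)   # (5, 3) = (number of time steps, number of hidden units)
```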
answered Jan 30 '19 at 10:05
Reprinted from: http://xifti.baihongyu.com/