Static implementation of a gated recurrent unit (GRU) layer with tanh activation and sigmoid recurrent activation.
Public Member Functions

  GRULayerT ()

  std::string getName () const noexcept
      Returns the name of this layer.

  constexpr bool isActivation () const noexcept
      Returns false, since GRU is not an activation layer.

  template<SampleRateCorrectionMode srCorr = sampleRateCorr>
  std::enable_if_t<srCorr == SampleRateCorrectionMode::NoInterp, void> prepare (int delaySamples)
      Prepares the GRU to process with a given delay length.

  template<SampleRateCorrectionMode srCorr = sampleRateCorr>
  std::enable_if_t<srCorr == SampleRateCorrectionMode::LinInterp, void> prepare (T delaySamples)
      Prepares the GRU to process with a given delay length.

  RTNEURAL_REALTIME void reset ()
      Resets the state of the GRU.

  template<int N = in_size>
  RTNEURAL_REALTIME std::enable_if<(N > 1), void>::type forward (const T (&ins)[in_size]) noexcept
      Performs forward propagation for this layer.

  template<int N = in_size>
  RTNEURAL_REALTIME std::enable_if<N == 1, void>::type forward (const T (&ins)[in_size]) noexcept
      Performs forward propagation for this layer.

  RTNEURAL_REALTIME void setWVals (const std::vector<std::vector<T>>& wVals)
      Sets the layer kernel weights.

  RTNEURAL_REALTIME void setUVals (const std::vector<std::vector<T>>& uVals)
      Sets the layer recurrent weights.

  RTNEURAL_REALTIME void setBVals (const std::vector<std::vector<T>>& bVals)
      Sets the layer bias.

Public Attributes

  T outs [out_size]

Static Public Attributes

  static constexpr auto in_size = in_sizet

  static constexpr auto out_size = out_sizet
Static implementation of a gated recurrent unit (GRU) layer with tanh activation and sigmoid recurrent activation.

To ensure that the recurrent state is initialized to zero, make sure to call reset() before the first call to forward().

Compared to TensorFlow's GRU implementation, this layer behaves by default as if the parameter stateful=True. A "stateless" GRU can be achieved by calling reset() between calls to forward().
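The recurrence such a layer computes can be sketched in plain C++. The snippet below is a simplified, self-contained illustration of one GRU step with tanh activation and sigmoid recurrent activation; it is not RTNeural's actual implementation. The gate ordering (update, reset, candidate), the "reset-after" placement of the reset gate, and the use of a single bias vector are assumptions borrowed from TensorFlow's conventions for brevity.

```cpp
#include <cassert>
#include <cmath>

// Illustrative single-step GRU cell. NOT RTNeural's optimized code; a sketch
// of the math only. Gates are stored concatenated as [z | r | c] along the
// last weight dimension, following the TensorFlow convention (assumption).
template <int InSize, int OutSize>
struct ToyGRU
{
    float W[InSize][3 * OutSize] {};  // kernel weights
    float U[OutSize][3 * OutSize] {}; // recurrent weights
    float b[3 * OutSize] {};          // single bias vector (simplification)
    float h[OutSize] {};              // recurrent state ("outs" in the layer)

    // Zero the recurrent state, as the layer's reset() does.
    void reset()
    {
        for (float& v : h)
            v = 0.0f;
    }

    static float sigmoid (float x) { return 1.0f / (1.0f + std::exp (-x)); }

    void forward (const float (&x)[InSize])
    {
        float hNew[OutSize];
        for (int j = 0; j < OutSize; ++j)
        {
            float zAcc = b[j];
            float rAcc = b[OutSize + j];
            float cAcc = b[2 * OutSize + j];
            for (int i = 0; i < InSize; ++i)
            {
                zAcc += x[i] * W[i][j];
                rAcc += x[i] * W[i][OutSize + j];
                cAcc += x[i] * W[i][2 * OutSize + j];
            }
            for (int i = 0; i < OutSize; ++i)
            {
                zAcc += h[i] * U[i][j];
                rAcc += h[i] * U[i][OutSize + j];
            }
            const float z = sigmoid (zAcc); // update gate
            const float r = sigmoid (rAcc); // reset gate
            for (int i = 0; i < OutSize; ++i)
                cAcc += r * h[i] * U[i][2 * OutSize + j]; // reset applied to recurrent term
            const float c = std::tanh (cAcc); // candidate state
            hNew[j] = z * h[j] + (1.0f - z) * c;
        }
        for (int j = 0; j < OutSize; ++j)
            h[j] = hNew[j];
    }
};
```

Calling reset() between input buffers reproduces the "stateless" behavior described above; omitting it carries the state across calls, matching the default stateful behavior.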
GRULayerT ()

std::string getName () const noexcept

Returns the name of this layer.

constexpr bool isActivation () const noexcept

Returns false, since GRU is not an activation layer.
std::enable_if_t<srCorr == SampleRateCorrectionMode::NoInterp, void> prepare (int delaySamples)

Prepares the GRU to process with a given delay length. This overload is available when the layer's sample-rate correction mode is NoInterp, so the delay is given as an integer number of samples.

std::enable_if_t<srCorr == SampleRateCorrectionMode::LinInterp, void> prepare (T delaySamples)

Prepares the GRU to process with a given delay length. This overload is available when the layer's sample-rate correction mode is LinInterp, so the delay may be a fractional number of samples.
RTNEURAL_REALTIME void reset ()
Resets the state of the GRU.
template<int N = in_size>
RTNEURAL_REALTIME std::enable_if<(N > 1), void>::type forward (const T (&ins)[in_size]) noexcept

Performs forward propagation for this layer.

References GRULayerT< T, in_sizet, out_sizet, sampleRateCorr, MathsProvider >::out_size, and GRULayerT< T, in_sizet, out_sizet, sampleRateCorr, MathsProvider >::outs.

template<int N = in_size>
RTNEURAL_REALTIME std::enable_if< N==1, void >::type forward (const T (&ins)[in_size]) noexcept

Performs forward propagation for this layer.

References GRULayerT< T, in_sizet, out_sizet, sampleRateCorr, MathsProvider >::out_size, and GRULayerT< T, in_sizet, out_sizet, sampleRateCorr, MathsProvider >::outs.
RTNEURAL_REALTIME void setWVals (const std::vector<std::vector<T>>& wVals)

Sets the layer kernel weights.

The weights vector must have size [in_size][3 * out_size].
RTNEURAL_REALTIME void setUVals (const std::vector<std::vector<T>>& uVals)

Sets the layer recurrent weights.

The weights vector must have size [out_size][3 * out_size].
RTNEURAL_REALTIME void setBVals (const std::vector<std::vector<T>>& bVals)

Sets the layer bias.

The bias vector must have size [2][3 * out_size].
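The expected shapes for the three setters can be sketched as follows. This is a standalone illustration: the in_size/out_size values are arbitrary, the layer object itself is omitted so the snippet needs only the standard library, and the interpretation of the two bias rows as kernel and recurrent bias sets is an assumption based on TensorFlow's GRU layout.

```cpp
#include <cassert>
#include <vector>

// Hypothetical sizes, chosen only for illustration.
constexpr int in_size = 4;
constexpr int out_size = 8;

// Kernel weights: [in_size][3 * out_size]. The factor of 3 covers the three
// GRU gates (update, reset, candidate).
std::vector<std::vector<float>> wVals (in_size, std::vector<float> (3 * out_size, 0.0f));

// Recurrent weights: [out_size][3 * out_size].
std::vector<std::vector<float>> uVals (out_size, std::vector<float> (3 * out_size, 0.0f));

// Bias: [2][3 * out_size]. The two rows are presumably the kernel and
// recurrent bias sets used by TensorFlow-style GRUs (assumption).
std::vector<std::vector<float>> bVals (2, std::vector<float> (3 * out_size, 0.0f));
```

With an instantiated layer, these containers would then be passed to setWVals (wVals), setUVals (uVals), and setBVals (bVals) respectively.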
static constexpr auto in_size = in_sizet

static constexpr auto out_size = out_sizet

T outs [out_size]