Gated Recurrent Unit
The Gated Recurrent Unit (GRU) is a variant of the recurrent neural network (RNN) proposed by Cho et al. in 2014 and further analyzed in "Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling". The GRU is designed to mitigate the vanishing gradient problem that traditional RNNs encounter when processing long sequences. It controls the flow of information by introducing an update gate and a reset gate, allowing the network to better capture long-term dependencies in sequential data.
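A minimal sketch of a single GRU time step, assuming hypothetical weight matrices W_*, U_* and biases b_* (the names and the parameter layout are illustrative, not taken from the source). The update gate z decides how much of the previous hidden state to keep, and the reset gate r decides how much of the past state feeds the candidate activation; gate conventions vary slightly between papers and libraries.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU step. `params` maps gate names to (W, U, b) triples:
    W multiplies the input x_t, U multiplies the previous state h_prev."""
    W_z, U_z, b_z = params["z"]  # update gate parameters
    W_r, U_r, b_r = params["r"]  # reset gate parameters
    W_h, U_h, b_h = params["h"]  # candidate-state parameters

    z = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)               # update gate
    r = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)               # reset gate
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r * h_prev) + b_h)   # candidate state
    h_t = z * h_prev + (1.0 - z) * h_tilde                    # blend old and new state
    return h_t

# Toy usage with random parameters (hidden size 4, input size 3).
rng = np.random.default_rng(0)
params = {g: (rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4))
          for g in ("z", "r", "h")}
h = np.zeros(4)
for t in range(5):
    h = gru_step(rng.normal(size=3), h, params)
print(h)
```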