Abstract
After describing the RNN abstraction, we are now in a position to discuss specific instantiations of it. Recall that we are interested in a recursive function s_i = R(x_i, s_{i-1}) such that s_i encodes the sequence x_1:n. We will present several concrete instantiations of the abstract RNN architecture, providing concrete definitions of the functions R and O. These include the Simple RNN (SRNN), the Long Short-Term Memory (LSTM), and the Gated Recurrent Unit (GRU).
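To make the abstraction concrete, the recursion s_i = R(x_i, s_{i-1}) and the output function O can be sketched for the Simple RNN (Elman-style) case, where R is an affine transformation followed by a nonlinearity and O is the identity. This is a minimal illustration, not the chapter's full treatment; the dimensions and initialization below are hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_s = 4, 3  # hypothetical input and state dimensions

# Simple RNN parameters: s_i = tanh(x_i @ W_x + s_{i-1} @ W_s + b)
W_x = rng.normal(scale=0.1, size=(d_x, d_s))
W_s = rng.normal(scale=0.1, size=(d_s, d_s))
b = np.zeros(d_s)

def R(x_i, s_prev):
    """One recurrence step: combine the input x_i with the previous state."""
    return np.tanh(x_i @ W_x + s_prev @ W_s + b)

def O(s_i):
    """Output function; for the Simple RNN it is simply the identity."""
    return s_i

# Encode a sequence x_1:n into the final state s_n by folding R over it.
xs = rng.normal(size=(5, d_x))  # a toy sequence of n=5 input vectors
s = np.zeros(d_s)               # initial state s_0
for x in xs:
    s = R(x, s)
print(O(s).shape)  # (3,)
```

The LSTM and GRU instantiations discussed in the chapter replace this single tanh update with gated updates, but they share the same R/O interface.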
Copyright information
© 2017 Springer Nature Switzerland AG
Cite this chapter
Goldberg, Y. (2017). Concrete Recurrent Neural Network Architectures. In: Neural Network Methods for Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02165-7_15
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-01037-8
Online ISBN: 978-3-031-02165-7