Abstract
After enumerating common usage patterns in Chapter 14 and learning the details of concrete RNN architectures in Chapter 15, we now explore the use of RNNs in NLP applications through a series of concrete examples. While we use the generic term RNN, we usually mean gated architectures such as the LSTM or the GRU; the Simple RNN consistently yields lower accuracies.
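To make the distinction between a Simple (Elman) RNN and a gated architecture concrete, here is a minimal NumPy sketch of a single state-update step of each. The dimensions, weight initialization, and the particular GRU gating convention used here are illustrative assumptions, not taken from the text; gated variants differ slightly across formulations.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
d_in, d_h = 4, 3  # illustrative input/state dimensions (assumed, not from the text)

# Simple (Elman) RNN step: s_t = tanh(W x_t + U s_{t-1} + b)
W = rng.normal(size=(d_h, d_in))
U = rng.normal(size=(d_h, d_h))
b = np.zeros(d_h)

def rnn_step(x, s_prev):
    # The entire previous state is overwritten through a single nonlinearity.
    return np.tanh(W @ x + U @ s_prev + b)

# GRU step: an update gate z and a reset gate r control how much of the
# previous state is kept, which eases gradient flow over long sequences.
Wz, Uz = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h))
Wr, Ur = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h))
Wh, Uh = rng.normal(size=(d_h, d_in)), rng.normal(size=(d_h, d_h))

def gru_step(x, s_prev):
    z = sigmoid(Wz @ x + Uz @ s_prev)            # update gate
    r = sigmoid(Wr @ x + Ur @ s_prev)            # reset gate
    h = np.tanh(Wh @ x + Uh @ (r * s_prev))      # candidate state
    return (1 - z) * s_prev + z * h              # interpolate old and new state

x = rng.normal(size=d_in)
s0 = np.zeros(d_h)
print(rnn_step(x, s0).shape, gru_step(x, s0).shape)
```

The gating means the GRU can copy its previous state almost unchanged across many time steps (when z is near 0), whereas the Simple RNN must pass the state through a saturating nonlinearity at every step; this is the usual explanation for the accuracy gap the abstract notes.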
Copyright information
© 2017 Springer Nature Switzerland AG
Cite this chapter
Goldberg, Y. (2017). Modeling with Recurrent Networks. In: Neural Network Methods for Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02165-7_16
DOI: https://doi.org/10.1007/978-3-031-02165-7_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-01037-8
Online ISBN: 978-3-031-02165-7