
Part of the book series: Synthesis Lectures on Human Language Technologies ((SLHLT))


Abstract

After enumerating common usage patterns in Chapter 14 and learning the details of concrete RNN architectures in Chapter 15, we now explore the use of RNNs in NLP applications through some concrete examples. While we use the generic term RNN, we usually mean gated architectures such as the LSTM or the GRU; the Simple RNN consistently results in lower accuracies than the gated variants.
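To make the acceptor usage pattern concrete, the following is a minimal, illustrative sketch in PyTorch (the chapter itself does not prescribe a framework; all names, dimensions, and the choice of library here are assumptions for illustration). It reads a sequence of word indices, runs a gated RNN over the embedded tokens, and predicts a class from the final state; swapping the cell type lets one compare the gated architectures against the Simple RNN.

```python
# Hypothetical RNN-acceptor sketch for text classification.
# Framework (PyTorch), dimensions, and vocabulary size are illustrative assumptions,
# not the book's own code.
import torch
import torch.nn as nn

class RNNAcceptor(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, num_classes, cell="lstm"):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Gated architectures (LSTM/GRU) are the usual choice; the Simple RNN
        # tends to give lower accuracies, as the abstract notes.
        rnn_cls = {"lstm": nn.LSTM, "gru": nn.GRU, "simple": nn.RNN}[cell]
        self.rnn = rnn_cls(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embed(token_ids)     # (batch, seq_len, emb_dim)
        outputs, _ = self.rnn(embedded)      # (batch, seq_len, hidden_dim)
        final_state = outputs[:, -1, :]      # last state summarizes the sequence
        return self.out(final_state)         # unnormalized class scores

# Usage: score one 6-token sentence against 2 classes.
model = RNNAcceptor(vocab_size=10_000, emb_dim=100, hidden_dim=128, num_classes=2)
scores = model(torch.randint(0, 10_000, (1, 6)))
print(scores.shape)  # torch.Size([1, 2])
```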



Copyright information

© 2017 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Goldberg, Y. (2017). Modeling with Recurrent Networks. In: Neural Network Methods for Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02165-7_16

