Part of the book series: Synthesis Lectures on Human Language Technologies ((SLHLT))

Abstract

After describing the RNN abstraction, we are now in a position to discuss specific instantiations of it. Recall that we are interested in a recursive function $s_i = R(x_i, s_{i-1})$ such that $s_n$ encodes the sequence $x_{1:n}$. We will present several concrete instantiations of the abstract RNN architecture, providing concrete definitions of the functions R and O. These include the Simple RNN (SRNN), the Long Short-Term Memory (LSTM), and the Gated Recurrent Unit (GRU).
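As a rough illustration of the interface the abstract names, here is a minimal NumPy sketch of the Simple RNN, the simplest instantiation of R and O. The dimensionalities, random initialization, and tanh nonlinearity are illustrative assumptions rather than the chapter's exact formulation; the chapter itself gives the formal definitions.

```python
import numpy as np

# Sketch of the Simple RNN (Elman RNN) as an instantiation of the
# abstraction s_i = R(x_i, s_{i-1}), y_i = O(s_i).
# All sizes and the initialization below are illustrative assumptions.

d_in, d_state = 4, 8          # input and state dimensionalities (arbitrary)
rng = np.random.default_rng(0)

W_x = rng.normal(scale=0.1, size=(d_in, d_state))     # input-to-state weights
W_s = rng.normal(scale=0.1, size=(d_state, d_state))  # state-to-state weights
b = np.zeros(d_state)

def R(x_i, s_prev):
    """Recurrence: combine the new input with the previous state."""
    return np.tanh(x_i @ W_x + s_prev @ W_s + b)

def O(s_i):
    """Output function; in the SRNN it is simply the identity."""
    return s_i

# Encode a sequence x_1..x_n: the final state s_n summarizes x_{1:n}.
xs = rng.normal(size=(5, d_in))   # a toy sequence of n = 5 input vectors
s = np.zeros(d_state)             # initial state s_0
for x in xs:
    s = R(x, s)
y = O(s)                          # y encodes the whole sequence
```

The LSTM and GRU keep this same interface but replace R with gated variants that control how much of the previous state is kept, which mitigates the vanishing-gradient problem of the simple recurrence.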





About this chapter

Cite this chapter

Goldberg, Y. (2017). Concrete Recurrent Neural Network Architectures. In: Neural Network Methods for Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02165-7_15
