Abstract
In Section 11.6 we introduced the sum of pairwise word similarities as a strong baseline for the short document similarity task. Given two sentences, the first with words \(w_1^1,\dots,w_{\ell_1}^1\) and the second with words \(w_1^2,\dots,w_{\ell_2}^2\), each word is associated with a corresponding pre-trained word vector \(\mathbf{w}_{1:\ell_1}^1,\mathbf{w}_{1:\ell_2}^2\), and the similarity between the documents is given by the sum of similarities over all word pairs:
\[ \mathrm{sim}(s_1, s_2) = \sum_{i=1}^{\ell_1} \sum_{j=1}^{\ell_2} \mathrm{sim}(\mathbf{w}_i^1, \mathbf{w}_j^2) \]
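A minimal sketch of this baseline, assuming cosine similarity as the word-level similarity function and NumPy arrays of pre-trained embeddings (the function name and toy data below are illustrative, not from the book):

```python
import numpy as np

def pairwise_similarity(doc1_vecs, doc2_vecs):
    """Sum of pairwise cosine similarities between two documents.

    doc1_vecs: array of shape (l1, d), one row per word vector.
    doc2_vecs: array of shape (l2, d).
    """
    # Normalize each word vector to unit length so dot products are cosines.
    a = doc1_vecs / np.linalg.norm(doc1_vecs, axis=1, keepdims=True)
    b = doc2_vecs / np.linalg.norm(doc2_vecs, axis=1, keepdims=True)
    # (l1, l2) matrix of cosine similarities; sum over all word pairs.
    return float((a @ b.T).sum())

# Toy example with random stand-ins for pre-trained vectors.
rng = np.random.default_rng(0)
v1 = rng.normal(size=(3, 50))   # sentence 1: 3 words, 50-dim embeddings
v2 = rng.normal(size=(4, 50))   # sentence 2: 4 words
score = pairwise_similarity(v1, v2)
```

Computing the full \(\ell_1 \times \ell_2\) similarity matrix as one matrix product, rather than looping over word pairs, keeps the baseline efficient even for longer documents.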
Copyright information
© 2017 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Goldberg, Y. (2017). Case Study: A Feed-forward Architecture for Sentence Meaning Inference. In: Neural Network Methods for Natural Language Processing. Synthesis Lectures on Human Language Technologies. Springer, Cham. https://doi.org/10.1007/978-3-031-02165-7_12
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-01037-8
Online ISBN: 978-3-031-02165-7