Sentence Embedding Demo

How does it work?

  1. Encode an arbitrary piece of text.
  2. Extract the sentences that are closest in meaning to a query sentence of your choice.

Note: this demo does not answer questions. It extracts the sentences from the text that are semantically most similar to your query sentence (its k-nearest-neighbours).

Each extracted sentence has an associated distance. The closer this distance is to zero, the more semantically similar the sentence is to your query sentence.

To see how it works, start by querying an exact copy of a sentence that appears in the text. Then alter one word at a time and observe how the reported distances change.
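The retrieval step above can be sketched in a few lines. This is an illustrative toy, not the demo's actual model: it stands in a bag-of-words count vector for the learned sentence embedding, and uses cosine distance, so an exact copy of a sentence gets distance 0.

```python
# Toy k-nearest-neighbour sentence retrieval.
# embed() is a hypothetical stand-in for the real sentence-embedding encoder.
from collections import Counter
import math

def embed(sentence):
    # Hypothetical encoder: a word-count (bag-of-words) vector.
    return Counter(sentence.lower().split())

def distance(a, b):
    # Cosine distance: 0 means semantically identical under this embedding,
    # larger values mean less similar.
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return 1.0 - (dot / norm if norm else 0.0)

def k_nearest(query, sentences, k=3):
    # Rank every sentence in the text by its distance to the query.
    q = embed(query)
    ranked = sorted(sentences, key=lambda s: distance(q, embed(s)))
    return [(s, round(distance(q, embed(s)), 3)) for s in ranked[:k]]

text = [
    "The cat sat on the mat.",
    "Dogs chase cats.",
    "The stock market fell today.",
    "A cat was sitting on a mat.",
]
for sentence, d in k_nearest("The cat sat on the mat.", text):
    print(d, sentence)
```

Querying an exact copy of a sentence returns it first with distance 0.0, which is the experiment suggested above.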


Enter a query sentence for which to find the k-nearest-neighbours:


The k-nearest-neighbours for your query sentence (k = 3):


Enter some text to encode:


The sentence embedding model used in this demo was created at the University of Toronto and is described in the following paper: