Go Forth and Prosper: Language Modeling with Ancient Textual History

Published as a preprint, 2021

Abstract: We introduce a technique for improving document-level language models (LMs) by leveraging “ancient history”: text that lies outside the LM’s current context window. We learn an auxiliary function that selects spans from the ancient history which help the LM predict future text. The selected spans are then copied directly into the LM’s context window, replacing less predictive spans.
This method can reduce the perplexity of pretrained LMs with no updates to the LM’s own parameters. We further observe that an auxiliary function trained on one textual domain, such as Wikipedia, also works in a substantially different domain, such as scientific publications. With this technique we see a 7% perplexity reduction on Wikipedia articles and a 12% perplexity reduction on scientific texts.
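To make the mechanism concrete, below is a minimal Python sketch of the span-splicing idea described in the abstract. The function names, the fixed span length, and the score_span callable are illustrative assumptions standing in for the paper's learned auxiliary function, not its actual implementation.

from typing import Callable, List


def splice_ancient_history(
    ancient_tokens: List[str],
    recent_tokens: List[str],
    score_span: Callable[[List[str]], float],
    window_size: int = 1024,
    span_len: int = 32,
) -> List[str]:
    """Build an LM context by copying high-scoring spans from the ancient
    history over the lowest-scoring spans in the recent context.

    score_span stands in for the learned auxiliary function; any callable
    mapping a token span to a usefulness score will do here.
    """
    # Chunk both token streams into fixed-length spans (an assumption;
    # the paper's spans need not be fixed-length).
    def chunk(tokens: List[str]) -> List[List[str]]:
        return [tokens[i:i + span_len] for i in range(0, len(tokens), span_len)]

    recent_spans = chunk(recent_tokens)[-(window_size // span_len):]
    ancient_spans = chunk(ancient_tokens)

    # Rank ancient spans from most to least useful, and order recent
    # spans from least to most useful (cheapest to evict first).
    best_ancient = sorted(ancient_spans, key=score_span, reverse=True)
    evict_order = sorted(range(len(recent_spans)),
                         key=lambda i: score_span(recent_spans[i]))

    # Replace the k least predictive recent spans with the k most
    # predictive ancient spans; k = 4 is an arbitrary illustrative choice.
    k = min(4, len(best_ancient), len(evict_order))
    for idx, span in zip(evict_order[:k], best_ancient[:k]):
        recent_spans[idx] = span

    # The spliced context keeps the same total length as before,
    # so the pretrained LM can consume it unchanged.
    return [tok for span in recent_spans for tok in span]

In the paper, the auxiliary function is learned so that its selections help the LM predict future text; the sketch abstracts that training entirely behind score_span.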

PDF

Bibtex:

@misc{koncel2021goforth,
  title={Go Forth and Prosper: Language Modeling with Ancient Textual History},
  author={Koncel-Kedziorski, Rik and Smith, Noah A.},
  year={2021},
  note={Preprint}
}