T5 was introduced in the paper Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, where we showed how a standard text-to-text Transformer model, pre-trained on a large text corpus, achieves state-of-the-art results on many NLP tasks after fine-tuning.
In our Colab demo and follow-up paper, we trained T5 to answer trivia questions in a "closed-book" setting, without access to any external knowledge. In other words, to answer a question, T5 can use only the knowledge stored in its parameters, picked up during unsupervised pre-training!
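If you'd like to poke at closed-book QA yourself, here is a minimal sketch using the Hugging Face transformers library. The google/t5-large-ssm-tqa identifier refers to one of the closed-book TriviaQA checkpoints released alongside the paper; treat the exact name as an assumption about its current hosting.

```python
# Minimal closed-book QA sketch with T5.
# Assumption: "google/t5-large-ssm-tqa" is the closed-book TriviaQA
# checkpoint name on the Hugging Face Hub.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/t5-large-ssm-tqa"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

question = "Who wrote the novel Moby-Dick?"
inputs = tokenizer(question, return_tensors="pt")

# No context or retrieved documents are passed in: the model must
# answer from knowledge stored in its parameters alone.
output_ids = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```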
In this app, you can play against T5 in a trivia challenge to see what it has learned!
All questions and accepted answers come from the TriviaQA dataset. T5's answers were pre-computed offline, and its "thinking" time in the app is artificial; in reality, T5 produces an answer in less than a second.
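For reference, here is a sketch of how to pull the same question/answer pairs, assuming the trivia_qa dataset and its rc.nocontext (closed-book) configuration on the Hugging Face Hub; if you use a different mirror, the names may differ.

```python
# Sketch of loading TriviaQA question/answer pairs.
# Assumption: the dataset is available as "trivia_qa" with the
# "rc.nocontext" config via the Hugging Face datasets library.
from datasets import load_dataset

trivia = load_dataset("trivia_qa", "rc.nocontext", split="validation")
example = trivia[0]

print(example["question"])
# Each question comes with several accepted answer aliases, any of
# which counts as correct.
print(example["answer"]["aliases"][:5])
```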
Learn more in the accompanying blog post.