A deep language model, GPT-2, is trained on scientific manuscripts from NASA's Astrophysics Data System and the arXiv. The resulting model is used to generate text and to explore relationships within the scientific literature. Explore the source code on GitHub!
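
As a quick illustration of the kind of generation this enables, here is a minimal sketch of sampling from a fine-tuned GPT-2 checkpoint using the Hugging Face `transformers` library. The checkpoint path `path/to/checkpoint` and the example prompt are placeholders, not this project's actual artifacts:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# The base GPT-2 tokenizer; a fine-tuned checkpoint would typically reuse it.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# Hypothetical path to a GPT-2 checkpoint fine-tuned on astrophysics abstracts.
model = GPT2LMHeadModel.from_pretrained("path/to/checkpoint")

prompt = "We present observations of the galaxy cluster"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; top-k/top-p sampling yields more varied text than greedy decoding.
output_ids = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```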