Language modelling is a hard problem in Natural Language Processing: every language is full of nuances, and we've barely scratched the surface of understanding how we came to learn them. In this talk, I'll briefly discuss the importance of language in our society and how it lets us act as a sort of hive mind, learning from each other. I'll then show the progress we've made in text generation and the techniques that have proven successful so far (bag-of-words, word embeddings, language models). Next, I'll present some recent advances in unsupervised transfer learning in NLP (ULMFiT, ELMo), along with a practical demonstration of a pretrained model fine-tuned on Shakespeare's texts to write in his style. I'll end the talk with promising areas to explore in the field. The code will be provided as a Jupyter notebook.
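As a small taste of the language-modelling idea behind the talk, here is a toy word-level n-gram model in plain Python. This is only a hedged sketch: the actual demo uses a pretrained neural model (ULMFiT-style fine-tuning), and the function names here (`train_ngram_lm`, `generate`) are illustrative, not from the notebook.

```python
import random
from collections import defaultdict

def train_ngram_lm(text, n=2):
    """Toy n-gram language model: map each (n-1)-word context
    to the list of words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - n + 1):
        context = tuple(words[i:i + n - 1])
        model[context].append(words[i + n - 1])
    return model

def generate(model, seed, length=20, rng=None):
    """Generate text by repeatedly sampling a next word
    given the most recent (n-1)-word context."""
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        context = tuple(out[-len(seed):])
        choices = model.get(context)
        if not choices:
            break  # unseen context: stop generating
        out.append(rng.choice(choices))
    return " ".join(out)

# Trained on Shakespeare's plays instead of this one-liner, even
# a model this simple produces amusingly Bard-flavoured output.
model = train_ngram_lm("to be or not to be that is the question", n=2)
print(generate(model, ("to",), length=8))
```

Neural language models replace these raw co-occurrence counts with learned representations, which is what lets pretrained models transfer across tasks.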