Recent advances in Artificial Intelligence have shown how computers can compete with humans in a variety of mundane tasks, but what happens when creativity is required?
This talk introduces Natural Language Generation, the task of automatically producing text: for example, articles on a particular topic, poems that follow a particular style, or speech transcripts that express a given attitude. Specifically, we'll make the case for Recurrent Neural Networks, a family of algorithms that can be trained on sequential data, and show how they improve on traditional language models.
The talk is aimed at beginners: we'll focus on the intuitions behind the algorithms and their practical implications rather than on the mathematical details. Practical examples in Python will showcase Keras, a library for quickly prototyping deep learning architectures.
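To ground the idea of a "traditional" language model that RNNs improve on, here is a minimal sketch (illustrative only, not taken from the talk) of a word-level bigram model in plain Python, which estimates the probability of the next word from simple counts:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each word, how often each other word follows it."""
    counts = defaultdict(Counter)
    words = text.split()
    for w1, w2 in zip(words, words[1:]):
        counts[w1][w2] += 1
    return counts

def prob(counts, w1, w2):
    """Estimate P(w2 | w1) from the bigram counts."""
    total = sum(counts[w1].values())
    return counts[w1][w2] / total if total else 0.0

# Toy corpus: "the" is followed by "cat" twice and by "mat" once
corpus = "the cat sat on the mat the cat ran"
counts = train_bigram(corpus)
p = prob(counts, "the", "cat")  # 2/3
```

A model like this only ever looks one word back; RNNs relax exactly this limitation by carrying a hidden state across the whole sequence.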
- Introduction to Natural Language Generation
- Language Modelling
- Recurrent Neural Networks and Long Short Term Memory for NLG
- RNN examples with Keras
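As a taste of the last two points, here is a hedged sketch of the kind of model the Keras examples build: an Embedding layer, an LSTM, and a softmax over the vocabulary that predicts the next token. The layer sizes and inputs below are arbitrary placeholders, not the talk's actual code (assumes TensorFlow/Keras is installed):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size = 30  # placeholder vocabulary size
seq_len = 10     # placeholder input sequence length

# Next-token language model: tokens in, probability distribution out
model = keras.Sequential([
    layers.Embedding(vocab_size, 16),            # token ids -> dense vectors
    layers.LSTM(32),                             # hidden state summarises the sequence
    layers.Dense(vocab_size, activation="softmax"),  # P(next token | sequence)
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")

# Random token ids stand in for a real corpus
x = np.random.randint(0, vocab_size, size=(4, seq_len))
probs = model.predict(x)  # shape: (4, vocab_size)
```

Generation then amounts to sampling a token from `probs`, appending it to the input, and repeating.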
Feedback form: https://python.it/feedback-1574
On Sunday 5 May at 10:15. **See schedule**