Understanding and Implementing Recurrent Neural Networks using Python

Description

Recurrent Neural Networks (RNNs) have gained prominence due to their ability to retain internal memory. They are widely used to recognize patterns in sequential data, such as numerical time series, images, handwritten text, spoken words, genome sequences, and much more. Because these networks possess memory, we can draw a loose analogy to the human brain to understand how RNNs work. An RNN can be thought of as a network of neurons with feedback connections, unlike the purely feedforward connections found in other types of Artificial Neural Networks.
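
To make the feedback connection concrete, here is a minimal NumPy sketch of a single recurrent step (not from the talk itself; the names rnn_step, W_x, W_h, and b are illustrative). The new hidden state depends on both the current input and the previous hidden state:

    import numpy as np

    def rnn_step(x_t, h_prev, W_x, W_h, b):
        # One recurrent step: the new hidden state depends on the current
        # input x_t AND, via the feedback connection, on the previous
        # hidden state h_prev -- this is the network's "memory".
        return np.tanh(x_t @ W_x + h_prev @ W_h + b)

Unrolling this step over every element of a sequence gives the full RNN forward pass.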

The flow of the talk will be as follows:

- Self Introduction
- Introduction to Deep Learning
- Artificial Neural Networks (ANNs)
- Diving DEEP into Recurrent Neural Networks (RNNs)
- Comparing Feedforward Networks with Feedback Networks
- Quick walkthrough: Implementing RNNs using Python (Keras); a minimal sketch follows this list
- Understanding Backpropagation Through Time (BPTT) and the Vanishing Gradient Problem
- Towards more sophisticated RNNs: Gated Recurrent Units (GRUs)/Long Short-Term Memory (LSTMs)
- End of talk
- Questions and Answers Session
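
As a taste of the Keras walkthrough mentioned above, here is a minimal sketch of a simple recurrent model. The layer sizes and the timesteps/features values are illustrative assumptions, not the exact model from the talk:

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    timesteps, features = 10, 8  # illustrative sequence shape

    model = keras.Sequential([
        # SimpleRNN carries a hidden state across the timesteps.
        layers.SimpleRNN(32, input_shape=(timesteps, features)),
        layers.Dense(1),  # e.g. predict the next value of a series
    ])
    model.compile(optimizer="adam", loss="mse")

    # Dummy data just to show the expected input shape:
    # (batch, timesteps, features).
    x = np.random.rand(16, timesteps, features)
    y = np.random.rand(16, 1)
    model.fit(x, y, epochs=1, verbose=0)

For long sequences where gradients vanish over many BPTT steps, layers.GRU or layers.LSTM can be swapped in for layers.SimpleRNN with the same call signature.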
