Deep Learning School

Here you can watch lectures from the 2016 Deep Learning Summer School in Montreal.

Course excerpt:

Deep neural networks that learn to represent data in multiple layers of increasing abstraction have dramatically improved the state-of-the-art for speech recognition, object recognition, object detection, predicting the activity of drug molecules, and many other tasks. Deep learning discovers intricate structure in large datasets by building distributed representations, either via supervised, unsupervised or reinforcement learning.

The Deep Learning Summer School 2016 is aimed at graduate students and industrial engineers and researchers who already have some basic knowledge of machine learning (and possibly but not necessarily of deep learning) and wish to learn more about this rapidly growing field of research.


Here is the schedule showing when each presentation was given:


Lectures ran in four blocks each day (9:00–10:30, 11:00–12:30, 14:30–16:00, and 16:30–18:00), with coffee breaks at 10:30–11:00 and 16:00–16:30 and lunch at 12:30–14:30. The WiDL event took place over lunch on 03/08.

01/08/2016
9:00–10:30: Doina Precup
11:00–12:30: Hugo Larochelle
14:30–16:00: Hugo Larochelle, Neural Networks II (click on part II)
16:30–18:00: Pascal Lamblin, Practical Session
Evening: Opening Reception (18:00–20:30), hosted by Imagia

02/08/2016
9:00–10:30: Rob Fergus
11:00–12:30: Antonio Torralba
14:30–16:00: Alex Wiltschko, Torch I
16:30–18:00: Practical Session with Alex Wiltschko (Torch) and Frédéric Bastien (Theano)

03/08/2016
9:00–10:30: Yoshua Bengio
11:00–12:30: Sumit Chopra
14:30–16:00: Jeff Dean, TensorFlow
16:30–18:00: Jeff Dean, TensorFlow (click on part II)

04/08/2016
9:00–10:30: Kyunghyun Cho
11:00–12:30: Edward Grefenstette
14:30–16:00: Julie Bernauer (NVIDIA), GPU programming with CUDA
16:30–18:00: Contributed talks, Session 1

05/08/2016
9:00–10:30: Joelle Pineau
11:00–12:30: Pieter Abbeel
14:30–16:00: Joelle, Pieter & Doina, Advanced Topics in RL
16:30–18:00: Contributed talks, Session 2

06/08/2016
9:00–10:30: Ruslan Salakhutdinov
11:00–12:30: Shakir Mohamed
14:30–16:00: Contributed talks, Session 3
16:30–18:00: Contributed posters, Session 1

07/08/2016
9:00–10:30: Bruno Olshausen, Neuro I
11:00–12:30: Surya Ganguli, Deep Learning Theory
14:30–16:00: Contributed talks, Session 4
16:30–18:00: Contributed posters, Session 2

Evening events also included a Happy Hour (18:45–22:30, buses at 18:30) hosted by Maluuba and a Happy Hour (18:30–20:30) hosted by Creative Destruction Lab.

(Alternatively, you can simply watch them in consecutive order at http://videolectures.net/deeplearning2016_montreal/, since they appear there in the order they were presented.)

Contributed talks:

12:55 Chains of Reasoning over Entities, Relations, and Text using Recurrent Neural Networks
Rajarshi Das

14:29 Professor Forcing: A New Algorithm for Training Recurrent Networks
Anirudh Goyal

10:59 Zoneout: Regularizing RNNs by Randomly Preserving Hidden Activations
Tegan Maharaj

18:58 Deep multi-view representation learning of brain responses to natural stimuli
Leila Wehbe

14:49 Learning to Communicate with Deep Multi-Agent Reinforcement Learning
Jakob Foerster

13:57 Model-Based Relative Entropy Stochastic Search
Abbas Abdolmaleki

16:33 Learning Nash Equilibrium for General-Sum Markov Games from Batch Data
Julien Pérolat

20:30 A Network-based End-to-End Trainable Task-oriented Dialogue System
Tsung-Hsien Wen

15:28 Inference Learning
Patrick Putzky

16:45 Variational Autoencoders with PixelCNN Decoders
Ishaan Gulrajani

13:33 An Infinite Restricted Boltzmann Machine
Marc-Alexandre Côté

15:15 Deep siamese neural network for prediction of long-range interactions in chromatin
Davide Chicco

14:09 Beam Search Message Passing in Bidirectional RNNs: Applications to Fill-in-the-Blank Image Captioning
Qing Sun

18:40 Analyzing the Behavior of Deep Visual Question Answering Models
Aishwarya Agrawal

13:55 Recombinator Networks: Learning Coarse-to-Fine Feature Aggregation
Sina Honari
