Deep learning
Team: Blake Richards, Jessica Thompson, Joseph Viviano, Yu Zhang
Date: November 15th, 2019, 9h-17h. Breakfast/registration at 8h30.
Summary: Deep learning is increasingly used in neuroscience research. This course is an introduction to deep learning geared towards neuroscientists, with the aim of providing a basic understanding of:
the principles of deep neural network architectures.
convolutional neural networks.
variational auto-encoders.
graph convolutional neural networks.
In addition, a key objective of the course is to understand how deep learning can be used for neuroscience research. Current approaches to linking artificial and biological neural networks will be reviewed, and hands-on tutorials using neuroscience data will illustrate most of the concepts presented.
Morning (9h-12h30): Foundations of deep learning
9:00 am – 10:30 am: Deep Learning and convolutional networks (Joseph Viviano) | google slides | pdf slides | github repo
10:30 am – 11:00 am: Break
11:00 am – 12:30 pm: Variational auto-encoders (Blake Richards) | pdf slides | VAE tutorial | LFADS tutorial
12:30 pm – 1:30 pm: Lunch
Afternoon (13h30-17h): Intersection of deep learning and neuroscience
1:30 pm – 3:00 pm: Comparing the activity of artificial and biological networks (Jessica Thompson) | google slides | pdf slides
3:00 pm – 3:30 pm: Break
3:30 pm – 5:00 pm: (Brain) graph convolutional networks (Yu Zhang) | google slides | pdf slides | github repo
Prerequisites
Basic familiarity with Python is preferable.
You will also need enough disk space for the Anaconda distribution and all the course data.
Installation instructions
Please join the Brainhack Mattermost and the channel main-training-dl. To install the training material locally, please download and install Python via the full-suite 64-bit Anaconda distribution. Then download or clone the GitHub repositories specific to each session:
variational auto-encoders and a simple tutorial.
graph convolutional networks.
The dependencies for each session are described in each repo. Test everything by opening each of the .ipynb notebooks and running the first few cells.
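If you want a quick sanity check of your environment before opening the notebooks, a minimal sketch like the one below can confirm that a few packages import correctly. The package names listed here (numpy, matplotlib, torch, networkx) are only an assumption for illustration; the authoritative dependency lists are the ones in each session's repo.

```python
# Minimal environment sanity check.
# NOTE: the package list below is an assumption; replace it with the
# dependencies listed in each session's repository.
import importlib
import sys

packages = ["numpy", "matplotlib", "torch", "networkx"]

print(f"Python {sys.version.split()[0]}")
for name in packages:
    try:
        module = importlib.import_module(name)
        version = getattr(module, "__version__", "unknown version")
        print(f"OK       {name} ({version})")
    except ImportError:
        print(f"MISSING  {name} -- install it with conda or pip")
```

You can paste this into a notebook cell or save it as a script and run it with Python; any package reported as MISSING can then be installed with conda or pip before the session.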