EE-559 – Deep Learning

You can find here the materials for the EPFL course EE-559 “Deep Learning”, taught by François Fleuret. The pdf files and videos on this page are licensed under the Creative Commons BY-NC-SA 4.0 International License.

Info sheet: dlc-info-sheet.pdf

We will use the PyTorch framework for implementations. You can find below a Linux virtual machine for the practical sessions.

Thanks to Adam Paszke, Alexandre Nanchen, Xavier Glorot, Matus Telgarsky, and Diederik Kingma, for their help, comments, or remarks.

Course material

You will find here the slides, which are full of “animations” and not convenient to use as notes; handouts with two slides per page; and, for some of the lectures, videos with voice-over.

Practical session prologue

Helper Python prologue for the practical sessions:

Lecture 1 (Feb 21, 2018) – Introduction and tensors

What is deep learning, some history, what are the current applications. torch.Tensor, linear regression.
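As a flavor of what this first lecture covers, here is a minimal linear-regression sketch with tensors. It uses the current PyTorch API (the course was based on PyTorch 0.3.x, whose tensor API differs in places), and is an illustration rather than course material:

```python
import torch

torch.manual_seed(0)

# Toy data: y = 3 x + 1 plus a little noise
x = torch.linspace(0, 1, 100).unsqueeze(1)      # shape (100, 1)
y = 3 * x + 1 + 0.01 * torch.randn(100, 1)

# Linear regression in closed form: append a constant column for the
# bias term and solve the least-squares problem
X = torch.cat([x, torch.ones_like(x)], dim=1)   # shape (100, 2)
w = torch.linalg.lstsq(X, y).solution           # shape (2, 1)

print(w.squeeze())    # close to tensor([3., 1.])
```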

Lecture 2 (Feb 28, 2018) – Machine learning fundamentals

Empirical risk minimization, capacity, bias-variance dilemma, polynomial regression, k-means and PCA.
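Polynomial regression makes the capacity trade-off concrete: with nested models, the empirical risk on the training set can only decrease as the degree grows, while generalization eventually suffers. A small sketch (current PyTorch API, not from the course material):

```python
import math
import torch

torch.manual_seed(0)

# Noisy samples of a sine wave on [0, 1]
x = torch.rand(50, dtype=torch.float64)
y = torch.sin(2 * math.pi * x) + 0.1 * torch.randn(50, dtype=torch.float64)

def fit_polynomial(x, y, degree):
    # Design matrix with columns 1, x, x^2, ..., x^degree
    X = torch.stack([x ** k for k in range(degree + 1)], dim=1)
    return torch.linalg.lstsq(X, y.unsqueeze(1)).solution.squeeze(1)

def empirical_risk(x, y, coeffs):
    X = torch.stack([x ** k for k in range(len(coeffs))], dim=1)
    return ((X @ coeffs - y) ** 2).mean().item()

# The training risk is non-increasing in the capacity (degree), but
# high-degree fits generalize poorly: the bias-variance dilemma
risks = [empirical_risk(x, y, fit_polynomial(x, y, d)) for d in (1, 3, 9)]
print(risks)
```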

Lecture 3 (Mar 07, 2018) – Multi-layer perceptrons

Linear classifiers, perceptron, linear separability and feature extraction, Multi-Layer Perceptron, gradient descent, back-propagation.
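The lecture's core algorithm, gradient descent driven by back-propagation, can be sketched on a toy problem that is not linearly separable. For brevity this sketch leans on PyTorch's autograd for the backward pass rather than the hand-derived gradients of the lecture:

```python
import torch

torch.manual_seed(0)

# The class is the sign of the product of the two coordinates, so no
# linear classifier can separate the two populations
x = torch.randn(200, 2)
t = (x[:, 0] * x[:, 1] > 0).float().unsqueeze(1)

# One hidden layer; parameters are leaf tensors updated by plain
# gradient descent, with backward() performing back-propagation
w1 = (0.5 * torch.randn(2, 16)).requires_grad_()
b1 = torch.zeros(16, requires_grad=True)
w2 = (0.5 * torch.randn(16, 1)).requires_grad_()
b2 = torch.zeros(1, requires_grad=True)

eta, losses = 0.1, []
for step in range(1000):
    h = torch.tanh(x @ w1 + b1)
    pred = torch.sigmoid(h @ w2 + b2)
    loss = ((pred - t) ** 2).mean()
    losses.append(loss.item())
    loss.backward()
    with torch.no_grad():
        for p in (w1, b1, w2, b2):
            p -= eta * p.grad
            p.grad.zero_()

print(losses[0], losses[-1])   # the training loss decreases
```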

Lecture 4 (Mar 14, 2018) – Convolutional networks and autograd

Generalized acyclic graph networks, torch.autograd, batch processing, convolutional layers and pooling, torch.nn.Module.
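A possible sketch combining the ingredients of this lecture: a small convolutional torch.nn.Module processing a batch, with autograd recording the forward pass and back-propagating through it (current PyTorch API, layer sizes chosen arbitrarily for 28x28 inputs):

```python
import torch
from torch import nn
from torch.nn import functional as F

# A small convolutional network as an nn.Module; autograd builds the
# computational graph dynamically during the forward pass
class ConvNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(8, 16, kernel_size=3, padding=1)
        self.fc = nn.Linear(16 * 7 * 7, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))   # 28x28 -> 14x14
        x = F.relu(F.max_pool2d(self.conv2(x), 2))   # 14x14 -> 7x7
        return self.fc(x.flatten(1))

model = ConvNet()
batch = torch.randn(4, 1, 28, 28)   # batch processing: dim 0 is the batch
out = model(batch)
out.sum().backward()                # fills .grad for every parameter
print(out.shape, model.conv1.weight.grad.shape)
```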

Lecture 5 (Mar 21, 2018) – Optimization

Cross-entropy, L1 and L2 penalty. Weight initialization, Xavier's rule, loss monitoring. torch.autograd.Function.
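For illustration, cross-entropy, an explicit L2 penalty, and Xavier's initialization rule combine as follows in the current PyTorch API (the 1e-3 penalty coefficient is an arbitrary choice for the example):

```python
import torch
from torch import nn
from torch.nn import functional as F

torch.manual_seed(0)

layer = nn.Linear(100, 10)
# Xavier's rule keeps the variance of activations roughly constant
# from one layer to the next
nn.init.xavier_uniform_(layer.weight)

x = torch.randn(32, 100)
target = torch.randint(0, 10, (32,))

logits = layer(x)
# Cross-entropy between the logits and the class indices, plus an
# explicit L2 penalty on the weights
loss = F.cross_entropy(logits, target) + 1e-3 * layer.weight.pow(2).sum()
print(loss.item())
```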

Lecture 6 (Mar 28, 2018) – Going deeper

Theoretical advantages of depth, rectifiers, drop-out, batch normalization, residual networks, advanced weight initialization. GPUs and torch.cuda.
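A residual block with batch normalization and rectifiers, moved to a GPU when torch.cuda reports one, might look as follows (a sketch in the current PyTorch API, not taken from the course):

```python
import torch
from torch import nn

# A residual block: the identity shortcut keeps gradients flowing
# through deep stacks of layers
class ResBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        y = torch.relu(self.bn1(self.conv1(x)))
        y = self.bn2(self.conv2(y))
        return torch.relu(x + y)      # identity shortcut

# torch.cuda: run on a GPU when one is available
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
block = ResBlock(8).to(device)
x = torch.randn(4, 8, 16, 16, device=device)
print(block(x).shape)    # same shape as the input
```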

No lecture (Apr 4, 2018) – Easter holidays

Lecture 7 (Apr 11, 2018) – Computer vision

Deep networks for image classification (AlexNet, VGGNet), object detection (YOLO), and semantic segmentation (FCN). Data-loaders, neuro-surgery, and fine-tuning.
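Fine-tuning and “neuro-surgery” amount to editing a trained model's modules. A sketch with a hypothetical stand-in backbone (in practice it would be a pre-trained network, e.g. from torchvision.models):

```python
import torch
from torch import nn

# Take a trained backbone, freeze it, and graft a new head for the
# target task. The backbone here is a random stand-in for the example.
backbone = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
for p in backbone.parameters():
    p.requires_grad = False          # fine-tuning trains only the head

head = nn.Linear(16, 5)              # new task with 5 classes
model = nn.Sequential(backbone, head)

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)    # 16 * 5 + 5 = 85
```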

Lecture 8 (Apr 18, 2018) – Under the hood

Visualizing filters and activations, smoothgrad, deconvolution, guided back-propagation. Optimizing samples from scratch, adversarial examples.
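The gradient of a class score with respect to the input pixels underlies saliency maps, guided back-propagation, and adversarial examples alike. A minimal sketch with a linear model; the “fast gradient sign” step shown is one standard way to build an adversarial perturbation:

```python
import torch
from torch import nn

torch.manual_seed(0)

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))

x = torch.randn(1, 1, 28, 28, requires_grad=True)
score = model(x)[0, 3]     # score of an arbitrary class
score.backward()

# x.grad is a saliency map: the sensitivity of the score to each pixel.
# Moving the input along the sign of this gradient is enough to change
# the score noticeably.
epsilon = 0.1
x_adv = x.detach() + epsilon * x.grad.sign()
print(model(x_adv)[0, 3].item() > score.item())   # True
```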

Lecture 9 (Apr 25, 2018) – Autoencoders and generative models

Transposed convolution layers, autoencoders, variational autoencoders, non volume-preserving networks.
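Transposed convolutions perform learned upsampling, which makes a convolutional autoencoder symmetric: strided convolutions compress, transposed convolutions reconstruct. A sketch of such a pair (layer sizes chosen arbitrarily for 28x28 inputs, current PyTorch API):

```python
import torch
from torch import nn

encoder = nn.Sequential(
    nn.Conv2d(1, 8, 3, stride=2, padding=1),   # 28x28 -> 14x14
    nn.ReLU(),
    nn.Conv2d(8, 16, 3, stride=2, padding=1),  # 14x14 -> 7x7
)
decoder = nn.Sequential(
    nn.ConvTranspose2d(16, 8, 4, stride=2, padding=1),  # 7x7 -> 14x14
    nn.ReLU(),
    nn.ConvTranspose2d(8, 1, 4, stride=2, padding=1),   # 14x14 -> 28x28
)

x = torch.randn(4, 1, 28, 28)
reconstruction = decoder(encoder(x))
loss = nn.functional.mse_loss(reconstruction, x)
print(reconstruction.shape)   # torch.Size([4, 1, 28, 28])
```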

Lecture 10 (May 2, 2018) – Generative Adversarial Networks

GAN, Wasserstein GAN, Deep Convolutional GAN, Image-to-Image translations, model persistence.

Lecture 11 (May 9, 2018) – Recurrent networks and NLP


Lecture 12 (May 16, 2018) – TBD (guest speaker: Soumith Chintala, Facebook)


Lecture 13 (May 23, 2018) – TBD (guest speaker: Andreas Steiner, Google)


Lecture 14 (May 30, 2018) – TBD (guest speaker: Andreas Steiner, Google)


Virtual machine for the practicals

A Virtual Machine (VM) is a piece of software that simulates a complete computer. The one we provide here includes a Linux operating system and all the tools needed to use PyTorch from a web browser (Firefox or Chrome).

It is already installed on EPFL's machines in room CM1103 for the exercise sessions.

If you want to use your own laptop, first download and install Oracle's VirtualBox on your machine, then download one of the image files (pick the QWERTZ version if you have a Swiss layout keyboard):

and open it in VirtualBox with

File -> Import Appliance

You should now see an entry in the list of VMs. If you run it, it automatically starts a JupyterLab on port 8888 and exports that port to the host. This means that you can access this JupyterLab at that port, with a web browser on the machine running VirtualBox, and use Python notebooks, view files, start terminals, and edit source files. Typing !bye in a notebook or bye in a terminal will shut down the VM.

You can run a terminal and a text editor from inside the Jupyter notebook for exercises that require more than the notebook itself. Source files can be executed by running the python command in a terminal with the source file name as an argument.

Files saved in the VM are not kept if the VM is re-installed, which happens before each session on the EPFL machines. So you should download the files you want to keep from the Jupyter notebook to your account, and re-upload them later when you need them.

This VM also exports an ssh port to port 2022 on the host, which allows you to log in with standard ssh clients on Linux and OSX, and with applications such as PuTTY on Windows. The default login is 'dave' with password 'dummy', and the root password is the same.

Note that computational performance will not be as good as with a native PyTorch installation on your machine (which is possible only on Linux and OSX at the moment). In particular, the VM does not take advantage of a GPU if you have one.

Finally, please also note that this VM is configured in a convenient but highly insecure manner, with easy-to-guess passwords (including for root) and unprotected, network-accessible Jupyter notebooks.

This VM is built on a Linux Debian 9.3 “stretch”, with miniconda, PyTorch 0.3.1, TensorFlow 1.4.1, and many Python utility packages installed.