
Applied Brain Science - Computational Neuroscience (CNS)

Master of Science in Bionics Engineering

Instructors: Alessio Micheli (email) - Davide Bacciu (email)

Additional web page: http://www.di.unipi.it/~micheli/DID/CNS.htm


News

(07/04/2017) The lecture missed due to the Easter holidays will be made up on Thursday 20/04/2017 in Room B24, from 10.30 to 13.30.

(21/02/2017) Course Didawiki updated with course information and first lesson for academic year 2016/17.

Course Information

Note for Computer Science Students

In academic year 2016/2017, “Machine Learning: neural networks and advanced models” (AA2) (Master programme in Computer Science - Corso di Laurea Magistrale in Informatica) is borrowed from CNS.

Weekly Schedule

The course is held in the second term. The preliminary schedule for academic year 2016/17 is provided in the table below.

Day Time Room
Monday 11.30-13.30 SI3 (Polo B Ingegneria)
Wednesday 15.30-18.30 SI3 (Polo B Ingegneria)

First lecture: Wednesday 01/03/2017

Objectives

The content of the Computational Neuroscience course includes:

  • bio-inspired neural modelling, spiking and reservoir computing neural networks;
  • advanced computational neural models for learning;
  • architectures and learning methods for dynamical/recurrent neural networks for temporal data and the analysis of their properties;
  • the role of computational neuroscience in real-world applications (by case studies).

Textbook and Teaching Materials

The official textbooks of the course are the following:

[IZHI] E.M. Izhikevich
Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting
The MIT press, 2007
[DAYAN] P. Dayan and L.F. Abbott
Theoretical Neuroscience
The MIT press, 2001
[NN] Simon O. Haykin
Neural Networks and Learning Machines
(3rd Edition), Prentice Hall, 2009

Additionally:

For the part of the course on bio-inspired neural modelling, the following book, freely available online, is also useful:

[GERSTNER] W. Gerstner and W.M. Kistler, Spiking Neuron Models: Single
Neurons, Populations, Plasticity. Cambridge University Press, 2002

For the second module of the course (Unsupervised and Representation Learning), material will be referenced from a book freely available online:

[PANINSKI] W. Gerstner, W.M. Kistler, R. Naud and L. Paninski 
Neuronal Dynamics: From single neurons to networks and models of cognition
Cambridge University Press, 2014

Lectures

Nr. Date (Time) Topic References & Additional Material
1 01/03/17 (15.30-18.30) Introduction to the course Lecture 1
2 06/03/17 (11.30 - 13.30) Introduction to Neural Modeling Lecture 2
3 08/03/17 (15.30 - 18.30) Conductance-based and Spiking Neuron Models Lecture 3
4 13/03/17 (11.30 - 13.30) Neural and Neuron-Astrocyte Modeling Lecture 4
5 15/03/17 (15.30 - 18.30) Implementing Spiking Neurons using Izhikevich's Model Lab1-1-assignment (a spiking-neuron sketch is given after this table)
6 20/03/17 (11.30 - 13.30) Statistics for In-vitro neuro-astrocyte culture Lecture 6 - seminar
7 22/03/17 (15.30 - 18.30) Introduction to Liquid State Machines Lecture 7
8 27/03/17 (11.30 - 13.30) Spiking Lab 2 - Liquid State Machines Lab1-2-assignment
9 29/03/17 (15.30-18.30) Representation Learning - Synaptic Plasticity and Hebbian Learning Lecture 8
References:
[DAYAN] Sect. 8.1-8.3
[PANINSKI] Sect 19.1, 19.2.1, 19.3.1, 19.3.2
10 03/04/17 (11.30-13.30) Associative Memories I - Hopfield Networks Lecture 9
References:
[DAYAN] Sect. 7.4 (Associative Memory part)
[PANINSKI] Sect. 17.1, 17.2
11 05/04/17 (15.30-18.30) Lab 2.1 - Hebbian learning and Hopfield networks Assignment 2.1 (a Hopfield sketch is given after this table)
12 10/04/17 (11.30-13.30) Associative Memories II - Stochastic networks and Boltzmann machines Lecture 10
References:
[DAYAN] Sect. 7.6

Further readings:
[1] A clean and clear introduction to RBM
13 12/04/17 (15.30-18.30) Lab 2.1b - Hebbian learning and Hopfield networks (continued)
17/04/17 (11.30-13.30) No class due to Italian national holiday
14 20/04/17 (10.30-13.30) Lecture 11
Part 1: Adaptive Resonance Theory
Part 2: Representation learning and deep NN
Recovery Lesson: will be held in room B24
Lecture 11 - Part 1
Lecture 11 - Part 2
References:
[DAYAN] Sect. 10.1

Further Readings:
A gentle introduction to ART networks (with coding examples) can be found here
[2] A classic popular-science paper from one of the pioneers of Deep Learning
[3] Recent review paper
[4] A freely available book on deep learning from Microsoft Research
15 03/05/17 (15.30-18.30) Module conclusions and Lab 2.2 Assignment 2.2
List of presentation and project topics for the second module
16 08/05/17 (11.30-13.30) Introduction to RNN: tasks and basic models Lecture and info multifiles
17 10/05/17 (15.30-18.30) Introduction to RNN: properties and taxonomy; intro to learning by BPTT Lecture and info multifiles (also RNN learning)
18 15/05/17 (11.30-13.30) Introduction to RNN: learning by RTRL Lecture and info multifiles (RNN learning) plus blackboard notes
19 17/05/17 (15.30-18.30) Introduction to RNN: LAB3-1 - learning with IDNN and RNN Info and assignment multifiles (see "RNN - LAB3-1" section). New version 1.1
20 22/05/17 (11.30-13.30) Introduction to RNN: Reservoir Computing Lecture and info multifiles (ESN)
21 24/05/17 (15.30-18.30) Introduction to RNN: LAB 3-2 - learning with ESN Info and assignment multifiles (see "RNN - Lab2" section). An ESN sketch is given after this table.
22 29/05/17 (11.30-13.30) Introduction to RNN: LABs 3-1 and 3-2 continue Info and assignment multifiles
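
Illustrative Code Sketches

The following minimal Python sketch illustrates the kind of simulation addressed in Lab 1-1 (Implementing Spiking Neurons using Izhikevich's Model). It is not the lab assignment itself: the regular-spiking parameters (a, b, c, d), the constant input current and the Euler integration step are illustrative assumptions.

import numpy as np

def izhikevich(T=1000.0, dt=0.5, I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate one Izhikevich neuron for T ms; return the membrane potential trace."""
    v, u = c, b * c                      # membrane potential (mV) and recovery variable
    vs = np.empty(int(T / dt))
    for t in range(vs.size):
        # Izhikevich (2003): dv/dt = 0.04 v^2 + 5 v + 140 - u + I,  du/dt = a (b v - u)
        v += dt * (0.04 * v ** 2 + 5 * v + 140 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                    # spike: record it, then reset v and bump u
            vs[t] = 30.0
            v, u = c, u + d
        else:
            vs[t] = v
    return vs

trace = izhikevich()
print("spike count:", int((trace == 30.0).sum()))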
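
The next sketch illustrates the ingredients of Lab 2.1 (Hebbian learning and Hopfield networks): Hebbian (outer-product) storage of binary patterns and asynchronous recall from a corrupted probe. The network size, the number of patterns and the random +/-1 patterns are illustrative assumptions, not the data of the assignment.

import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                    # number of units, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: W = (1/N) * sum_p x_p x_p^T, with no self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=20):
    """Asynchronous recall: update units one at a time in random order."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern by flipping 10% of its bits, then try to recover it.
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
print("overlap with the stored pattern after recall:", recall(probe) @ patterns[0] / N)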
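
The last sketch illustrates the Echo State Network setting of Lab 3-2: a fixed random reservoir rescaled to a target spectral radius and a linear read-out trained by ridge regression. The reservoir size, the 0.9 spectral radius, the ridge coefficient and the toy next-step prediction task are illustrative assumptions, not the setting of the assignment.

import numpy as np

rng = np.random.default_rng(0)
N_u, N_r = 1, 100                                # input and reservoir sizes
W_in = rng.uniform(-0.5, 0.5, size=(N_r, N_u))
W = rng.uniform(-0.5, 0.5, size=(N_r, N_r))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))        # rescale to spectral radius 0.9

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, N_u)."""
    x = np.zeros(N_r)
    states = np.empty((len(u), N_r))
    for t, u_t in enumerate(u):
        x = np.tanh(W_in @ u_t + W @ x)          # leak-free ESN state update
        states[t] = x
    return states

# Toy task: predict s(t+1) from s(t) for a sine wave; the first 100 states are washout.
s = np.sin(0.1 * np.arange(1000))[:, None]
X, y = run_reservoir(s[:-1])[100:], s[101:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N_r), X.T @ y)   # ridge regression read-out
print("training MSE:", float(np.mean((X @ W_out - y) ** 2)))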

Past Editions

Further Readings

[1] Geoffrey Hinton, A Practical Guide to Training Restricted Boltzmann Machines, Technical Report 2010-003, Department of Computer Science, University of Toronto, 2010

[2] G.E. Hinton, R.R. Salakhutdinov. Reducing the dimensionality of data with neural networks. Science 313.5786 (2006): 504-507.

[3] Y. Bengio, A. Courville, and P. Vincent. Representation learning: A review and new perspectives. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 35(8) (2013): 1798-1828.

[4] L. Deng and D. Yu. Deep Learning: Methods and Applications, 2014

[5] W. Maass, Liquid state machines: motivation, theory, and applications. Computability in context: computation and logic in the real world (2010): 275-296.

See other references in the slide notes.

bionics-engineering/computational-neuroscience/start.txt · Last modified: 30/05/2017 at 08:37 by Alessio Micheli