Annotated Course Schedule and Outline#
Note: papers that report on models we will be implementing in the assignments are denoted by asterisks.
Week 1: Introduction, Hopfield Networks#
Discussions:
What is memory?
What does it mean to build a “model” of memory?
Are neural networks like biological brains?
Hebbian learning and Hopfield networks
Readings:
Discussion: Hopfield network simulations (storage capacity, cued recall, contextual drift)
Assignment 1: Explore Hopfield Networks (a starter sketch follows below)
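As a jumping-off point for the assignment, here is a minimal Hopfield network sketch in NumPy: bipolar (+1/−1) patterns are stored with the Hebbian outer-product rule and recalled from a corrupted cue. The pattern count, network size, and noise level are illustrative choices, not assignment specifications.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Store P random bipolar (+1/-1) patterns of length N with the Hebbian
# outer-product rule: W = (1/N) * sum_p x_p x_p^T, zeroing the diagonal.
N, P = 100, 10
patterns = rng.choice([-1, 1], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(cue, max_steps=20):
    """Iterate s <- sign(W s) (synchronous updates) until a fixed point."""
    s = cue.copy()
    for _ in range(max_steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1        # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Cued recall: flip 10% of one stored pattern's bits, then let the
# network clean up the cue; an overlap of 1.0 means perfect recovery.
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
cue[flip] *= -1
print("overlap with stored pattern:", recall(cue) @ patterns[0] / N)
```

With P = 10 patterns over N = 100 units, the load stays well below the classic ~0.14N capacity estimate, so the corrupted cue should settle back onto the stored pattern; pushing P higher is one way to probe storage capacity in the assignment.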
Weeks 2–3: Free recall, Short Term and Long Term Memory#
Discussions:
Free recall and memory search
Naturalistic memory tasks
Readings:
Data science primer (a combined sketch of these tools follows this list):
Where to find behavioral datasets: Penn Behavioral Data Archive, OpenCogData, OpenNeuro, UCLA Psychological Dataset Archive, Context Lab
Web scraping with requests and Beautiful Soup
Data manipulation with Pandas
Text analyses with Scikit-learn, NLTK, and HuggingFace Transformers
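A minimal end-to-end sketch combining the tools above; the URL is a placeholder and the page structure is illustrative. Only Scikit-learn is shown for the text-analysis step, but NLTK or HuggingFace Transformers would slot into the same place in the pipeline.

```python
import requests
from bs4 import BeautifulSoup
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer

# Fetch a page and extract its paragraph text (the URL is a placeholder).
resp = requests.get("https://example.com/dataset-description", timeout=10)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")
texts = [p.get_text(strip=True) for p in soup.find_all("p")]

# Tidy the scraped text into a pandas DataFrame for filtering/inspection.
df = pd.DataFrame({"paragraph": texts})
df = df[df["paragraph"].str.len() > 0].reset_index(drop=True)

# A simple first-pass text analysis: TF-IDF features with Scikit-learn.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(df["paragraph"])
print(tfidf.shape)  # (n_paragraphs, n_vocabulary_terms)
```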
Assignment 2: Build the Search of Associative Memory Model
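For orientation before the assignment, a minimal sketch of SAM's sampling-and-recovery retrieval cycle, assuming a single cue and a made-up strength vector. The full model adds a short-term buffer, strength increments after successful recall, compound context/item cues, and a failure-count stopping rule.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical cue-to-item associative strengths for five studied items.
strengths = np.array([2.0, 1.5, 1.0, 0.5, 0.25])

def sam_attempt(strengths, recalled):
    """One retrieval cycle: sample an item with probability proportional to
    its cue strength, then recover it with probability 1 - exp(-strength)."""
    item = rng.choice(len(strengths), p=strengths / strengths.sum())
    recovered = item not in recalled and rng.random() < 1 - np.exp(-strengths[item])
    return item, recovered

recalled = []
for _ in range(50):  # fixed attempt budget; SAM proper stops after K_max failures
    item, ok = sam_attempt(strengths, recalled)
    if ok:
        recalled.append(item)
print("recall order:", recalled)
```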
Weeks 4–5: Temporal Context and Multi-Timescale Models#
Discussion: the temporal scales of memory, event boundaries, and situation models
Readings:
Assignment 3: Build the Context Maintenance and Retrieval Model
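The heart of TCM/CMR is a slowly drifting context vector, updated as c_t = ρ_t c_{t−1} + β c^IN, with ρ_t chosen to keep c_t at unit length. A minimal sketch of that drift equation, assuming orthonormal item representations and an illustrative β:

```python
import numpy as np

# Temporal context drift, the core of TCM/CMR:
#   c_t = rho_t * c_{t-1} + beta * c_in,   with rho_t keeping ||c_t|| = 1.
beta = 0.5                            # illustrative drift-rate parameter
n_items = 10
dim = n_items + 1
items = np.eye(dim)[:n_items]         # one orthonormal feature per studied item
c = np.eye(dim)[-1]                   # pre-list context unit

contexts = []
for t in range(n_items):
    c_in = items[t]                   # context input retrieved by the studied item
    dot = c @ c_in
    rho = np.sqrt(1 + beta**2 * (dot**2 - 1)) - beta * dot
    c = rho * c + beta * c_in         # drift toward the current item's context
    contexts.append(c.copy())

# Contexts of nearby study positions overlap most -- the basis of the
# lag-contiguity effect in free recall.
sims = np.array([ct @ contexts[0] for ct in contexts])
print(np.round(sims, 2))              # similarity to item 1's context, by position
```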
Weeks 6–7: Laplace Transforms#
Discussion: is TCM really multi-timescale?
Discussion: Introduction to the Laplace Transform (and its inverse) and its relevance to memory
Readings:
Assignment 4: Implement the Laplace Temporal Context Model
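In the Laplace framework, a bank of leaky integrators with log-spaced decay rates s computes a real-time Laplace transform of the input history. Below is a minimal sketch of the forward transform for a one-dimensional input stream, with illustrative rates and step sizes; the inverse transform, which recovers a fuzzy timeline of the past via a derivative operator across s (a Post-style approximation), is omitted here.

```python
import numpy as np

# A bank of leaky integrators computes a real-time Laplace transform of the
# input history f(t):  dF(s)/dt = -s * F(s) + f(t),  one unit per rate s.
s_vals = np.geomspace(0.1, 10.0, num=20)   # log-spaced decay rates
dt = 0.01
n_steps = int(5.0 / dt)                    # simulate 5 seconds

f = np.zeros(n_steps)
f[100] = 1.0 / dt                          # unit impulse at t = 1 s

F = np.zeros_like(s_vals)
for t in range(n_steps):
    F += dt * (-s_vals * F + f[t])         # forward Euler step

# After the impulse, each unit holds exp(-s * elapsed): fast-decay units
# represent the recent past, slow-decay units the distant past, giving a
# multi-timescale record that the inverse transform can turn into a timeline.
print(np.round(F, 3))
```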
Week 8: Biologically Inspired Network Models#
Discussion: what does “biologically inspired” mean in practice?
Readings:
Week 9: Recurrent networks, LSTM networks, Transformers#
Readings:
Assignment 5: Final Project
Week 10: Final project presentations#
Discussion: ad-hoc discussions and demos of final projects
Final projects are due on the last day of class at 11:59 PM Eastern Time