
SSP-VWM

Official repository for the paper *A Cognitively Plausible Visual Working Memory Model*, published at CogSci 2025.

Colour Reproduction Task

Participants memorise up to eight coloured squares. After a 1 s delay, a probe screen indicates the target location and participants select the remembered colour from a colour wheel.

![Colour reproduction task](colour_reproduction_task.png)
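
For concreteness, here is a hypothetical sketch of how a single trial's stimuli could be generated. The set size, colour sampling, and display extent are placeholder assumptions, not the actual experiment code (the repository's rgb_colors.json presumably holds the real colour set).

```python
# Hypothetical trial generator: up to eight squares, each with a hue drawn
# from the colour wheel and a random location. All parameters are
# illustrative assumptions, not values taken from the paper or repository.
import colorsys
import random

def make_trial(set_size=8, extent=3.0):
    items = []
    for _ in range(set_size):
        hue = random.random()                       # position on the colour wheel, scaled to [0, 1)
        rgb = colorsys.hsv_to_rgb(hue, 1.0, 1.0)    # fully saturated wheel colour
        pos = (random.uniform(-extent, extent),     # square position on the display
               random.uniform(-extent, extent))
        items.append({"hue": hue, "rgb": rgb, "pos": pos})
    return items

trial = make_trial(set_size=4)
probe = random.choice(trial)   # the probe screen cues this item's location
```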

Method

We encode locations and colours with Spatial Semantic Pointers (SSPs). The memory of one trial is encoded as the bundle of all colours bound with their locations:

$$
M = \sum_{i} \delta(i) \cdot \left[ \phi_{\lambda_c}(c_i) \circledast \phi_{\lambda_g}(x_i, y_i) \right]
$$
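
The following is a minimal NumPy sketch of this encoding, using fractional power encoding for φ and circular convolution for ⊛. The dimensionality, the lengthscales, the hue-only colour representation, and treating δ(i) as a unit weight are illustrative assumptions rather than the settings used in the paper, which builds on the sspspace/nengo packages listed under Installation.

```python
# Minimal NumPy sketch of the memory M above (illustrative only, not the
# authors' implementation). Assumptions: D, the lengthscales, a hue-only
# colour representation, and delta(i) = 1 for every item.
import numpy as np

rng = np.random.default_rng(0)
D = 512                     # SSP dimensionality (assumed)
lam_c, lam_g = 1.0, 2.0     # colour / spatial lengthscales (assumed)

def random_unitary(d, rng):
    """Random base vector whose Fourier coefficients all have magnitude 1."""
    phases = rng.uniform(-np.pi, np.pi, size=d // 2 - 1)
    fv = np.concatenate(([1], np.exp(1j * phases), [1], np.exp(-1j * phases[::-1])))
    return np.fft.ifft(fv).real

def power(base, x):
    """Fractional power encoding: phi(x) = IDFT(DFT(base) ** x)."""
    return np.fft.ifft(np.fft.fft(base) ** x).real

def bind(a, b):
    """Circular convolution, the binding operator (⊛)."""
    return np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real

# Independent base vectors for hue and the two spatial axes.
C, X, Y = (random_unitary(D, rng) for _ in range(3))

def encode_item(hue, x, y):
    colour = power(C, hue / lam_c)
    location = bind(power(X, x / lam_g), power(Y, y / lam_g))
    return bind(colour, location)

# Bundle one trial (hue in radians, x/y positions) into a single memory vector.
items = [(0.3, -1.0, 2.0), (2.1, 0.5, -1.5)]
M = sum(encode_item(*item) for item in items)

# Probe phase: unbind the cued location and sweep hues for the best match.
cue = bind(power(X, -1.0 / lam_g), power(Y, 2.0 / lam_g))
cue_inv = np.fft.ifft(1.0 / np.fft.fft(cue)).real      # exact inverse (unitary SSP)
colour_hat = bind(M, cue_inv)                           # noisy colour estimate
hues = np.linspace(0, 2 * np.pi, 360)
report = hues[np.argmax([colour_hat @ power(C, h / lam_c) for h in hues])]
print(f"reported hue ≈ {report:.2f} rad (true hue 0.3 rad)")
```

Sweeping candidate hues against the unbound estimate mirrors the probe phase of the task: the best-matching hue plays the role of the participant's colour report.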

Installation

```bash
pip install numpy
pip install pandas
pip install matplotlib

# install nengo dependencies
pip install nengo
pip install nengo-spa

# install sspspace library
git clone https://github.com/ctn-waterloo/sspspace.git
cd sspspace/
python setup.py install
```
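
A quick, optional sanity check that the dependencies import (assuming the usual import names, e.g. nengo_spa for nengo-spa and sspspace for the cloned library):

```python
# Post-install check: all imports below should succeed if the steps above
# completed without errors. Import names are the conventional ones and are
# assumed here, not documented by this repository.
import numpy, pandas, matplotlib, nengo, nengo_spa, sspspace
print("numpy", numpy.__version__, "| nengo", nengo.__version__)
```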

Citation

Please cite this paper if you use the SSP-VWM model or parts of this work in your research:

```bibtex
@inproceedings{penzkofer25_cogsci,
  title = {A {Cognitively} {Plausible} {Visual} {Working} {Memory} {Model}},
  booktitle = {Proc. Annual Meeting of the Cognitive Science Society (CogSci)},
  author = {Penzkofer, Anna and Furlong, Michael and Eliasmith, Chris and Bulling, Andreas},
  year = {2025},
  pages = {1--6},
  url = {https://escholarship.org/uc/item/3928d5s4}
}
```