SSP-VWM
Official repository for the paper "A Cognitively Plausible Visual Working Memory Model", published at CogSci 2025.
Colour Reproduction Task
Participants memorise up to eight coloured squares. After a 1s delay, a probe screen indicates the target location and participants select the remembered colour from a colour wheel.
Method
We encode locations and colours with Spatial Semantic Pointers (SSPs). The memory of one trial is the bundle (sum) of all colour SSPs bound, via circular convolution $\circledast$, with their location SSPs:

$$M = \sum_{i} \delta(i) \cdot \left[ \phi_{\lambda_c}(c_i) \circledast \phi_{\lambda_g}(x_i, y_i) \right]$$
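The bundle equation above can be sketched in plain NumPy (a minimal illustration under common SSP conventions, not the repository's sspspace API): each SSP is a unitary base vector raised to a fractional power in the Fourier domain, binding is circular convolution, and the item weight $\delta(i)$ is taken as 1 for all items.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 512  # SSP dimensionality (illustrative choice)

def make_unitary(d, rng):
    # Random unitary base vector: unit-magnitude Fourier coefficients
    # with conjugate symmetry, so the time-domain vector is real.
    phases = rng.uniform(-np.pi, np.pi, d // 2 - 1)
    fc = np.ones(d // 2 + 1, dtype=complex)
    fc[1:-1] = np.exp(1j * phases)
    return np.fft.irfft(fc, n=d)

def power(base, exponent):
    # Fractional binding: raise a unitary vector to a real-valued power
    # in the Fourier domain -- this is how phi(x) encodes a continuous x.
    return np.fft.irfft(np.fft.rfft(base) ** exponent, n=len(base))

def bind(a, b):
    # Circular convolution (the "circled asterisk" in the bundle equation).
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

def inverse(a):
    # Involution a[0], a[-1], ..., a[1]: the exact convolutive inverse
    # for unitary vectors, used to unbind.
    return np.concatenate(([a[0]], a[:0:-1]))

X, Y, C = (make_unitary(D, rng) for _ in range(3))

def loc_ssp(x, y):
    return bind(power(X, x), power(Y, y))

# Encode a two-item trial: colours c_i bound to locations (x_i, y_i),
# with delta(i) = 1 for all items.
items = [(0.3, (1.0, 2.0)), (0.8, (-1.5, 0.5))]
M = sum(bind(power(C, c), loc_ssp(x, y)) for c, (x, y) in items)

# Query the probed location: unbinding it from M leaves a noisy
# estimate of the colour SSP stored there.
est = bind(M, inverse(loc_ssp(1.0, 2.0)))
sims = [np.dot(est, power(C, c)) for c, _ in items]
```

The estimate is most similar to the colour that was bound to the probed location; in the model, decoding the reported colour amounts to comparing `est` against candidate colour SSPs.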
Installation
```shell
pip install numpy pandas matplotlib

# install nengo dependencies
pip install nengo nengo-spa

# install the sspspace library
git clone https://github.com/ctn-waterloo/sspspace.git
cd sspspace/
pip install .
```
Citation
Please cite the following paper if you use the SSP-VWM model or build on this work in your research:
@inproceedings{penzkofer25_cogsci,
title = {A {Cognitively} {Plausible} {Visual} {Working} {Memory} {Model}},
booktitle = {Proc. Annual Meeting of the Cognitive Science Society (CogSci)},
author = {Penzkofer, Anna and Furlong, Michael and Eliasmith, Chris and Bulling, Andreas},
year = {2025},
pages = {1--6},
url = {https://escholarship.org/uc/item/3928d5s4}
}