# SummAct: Uncovering User Intentions Through Interactive Behaviour Summarisation

Guanhua Zhang, Mohamed Ahmed, Zhiming Hu, Andreas Bulling

ACM CHI 2025, Yokohama, Japan

[Project] [Paper]
## Directory Structure

```
SummAct
│   README.md
│   environment.yml
│
└───preprocess
│       convert_dataset.py
│       create_steps.py
│
└───hf_bmt
│       hf_2_bmtrain.py
│       hf_2_bmtrain.sh
│       bmt_hf.py
│
└───train
│       train.py
│       train.sh
│
└───inference
        inference.py
        inference.sh
```
## Setup

We recommend setting up a virtual environment using Anaconda.

- Create a conda environment and install the dependencies:

  ```bash
  conda env create --name summact --file=environment.yml
  conda activate summact
  ```

- Since `model_center==1.0.3` is needed but is not yet available on PyPI, build it from source:

  ```bash
  git clone https://github.com/OpenBMB/ModelCenter.git
  cd ModelCenter
  pip install -r requirements.txt
  python3 setup.py install
  ```

- Clone our repository to download our code and a pretrained model:

  ```bash
  git clone this_repo.git
  ```
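Once the environment is active, you can optionally verify that the core dependencies are importable before moving on. This is only a sanity-check sketch; it assumes the default `environment.yml` plus the ModelCenter/BMTrain installation from source provide `torch`, `bmtrain`, and `model_center`.

```bash
# Optional sanity check: confirm the key packages installed above can be imported.
# Package names are assumptions based on the default environment.yml and ModelCenter setup.
conda activate summact
python -c "import torch, bmtrain, model_center; print('SummAct environment looks OK')"
```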
## Preprocessing

- Convert actions from symbolic formats to natural language by running `preprocess/convert_dataset.py`. Adapt it to your local dataset paths.
- Prompt the pretrained LLM with examples to generate sub-intentions using `preprocess/create_steps.py`. Adapt it to your local prompt txt path. See the example invocation below.
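For example, after adapting the paths in both files, the preprocessing stage could look like the sketch below. Whether the scripts take command-line arguments depends on how you adapt them; the invocation here assumes all paths are configured inside the scripts themselves.

```bash
# Sketch of the preprocessing stage; paths are assumed to be configured inside the scripts.
python preprocess/convert_dataset.py   # step 1: symbolic actions -> natural-language descriptions
python preprocess/create_steps.py      # step 2: prompt the pretrained LLM to generate sub-intentions
```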
## Fine-tuning

- After downloading the model from Hugging Face, convert it into `model_center` weights using the script `hf_bmt/hf_2_bmtrain.sh`. Adapt it to the local paths of the downloaded model and the desired output location.
- Run `train/train.sh`, which calls `train/train.py` to fine-tune the model for interactive behaviour summarisation. Make sure your machine has GPUs available. See the example invocation below.
## Inference

Run `inference/inference.sh`, which calls `inference/inference.py` to convert the fine-tuned model back to the Hugging Face format and then compute metrics that evaluate the summarisation quality.
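A typical run is sketched below, assuming the fine-tuned checkpoint and evaluation data paths are configured inside the script, as in the training stage.

```bash
# Sketch of the inference/evaluation stage; checkpoint and data paths are assumed to be set inside the script.
bash inference/inference.sh   # converts the fine-tuned model back to HF format and computes the summarisation metrics
```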
## Citation

If you find our code useful or use it in your own projects, please cite our paper:

```bibtex
@inproceedings{zhang25_chi,
  title     = {SummAct: Uncovering User Intentions Through Interactive Behaviour Summarisation},
  author    = {Zhang, Guanhua and Ahmed, Mohamed and Hu, Zhiming and Bulling, Andreas},
  year      = {2025},
  pages     = {1--17},
  booktitle = {Proc. ACM SIGCHI Conference on Human Factors in Computing Systems (CHI)},
  doi       = {10.1145/3706598.3713190}
}
```
## Acknowledgements

Our work relied on the codebases of Mind2Web, ScreenAgent, and Tell Me More!. Thanks to the authors for sharing their code.