This repository is the official implementation of the paper (https://dl.acm.org/doi/pdf/10.1145/3701551.3703558):

Nhu-Thuat Tran and Hady W. Lauw. 2025. VARIUM: Variational Autoencoder for Multi-Interest Representation with Inter-User Memory. In Proceedings of the 18th ACM International Conference on Web Search and Data Mining (WSDM'25), Hannover, Germany, March 10-14, 2025.
- Anaconda: 4.12.0
- Python: 3.7.5
- OS: macOS
Please follow the instructions in the README.md file under the `data` folder.
- Create a virtual environment

```shell
conda create --prefix ./varium python=3.7.5 -y
```

- Activate the environment

```shell
conda activate ./varium
```

- Install requirements

```shell
pip install -r requirements.txt
```
- Create a YAML config file under the `configs` folder, following the provided samples.
- Prepare the `run.sh` file as follows:

```shell
python run_varium.py --dataset <dataset_name> --config_file <your_config_file> --device_id <ID of GPU machine>
```
- To run training and evaluation:

```shell
bash run.sh
```
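A minimal config sketch is shown below. The key names mirror the hyper-parameters discussed in this README (`use_memory`, `num_steps`, `n_memory_blocks`, `tau_memory`, `rho_carry`), but the exact schema is an assumption — follow the sample files in `configs` for the authoritative format:

```yaml
# Hypothetical config sketch: key names follow the hyper-parameters
# mentioned in this README; consult the samples under configs/ for the
# real schema expected by run_varium.py.
use_memory: true        # set to false to reproduce the VALID-style baseline
num_steps: 2            # number of refinement layers
n_memory_blocks: 32     # number of memory slots
tau_memory: 0.3         # temperature in the memory
rho_carry: 1.0          # hyper-parameter from Equation 7 in the paper
```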
The base model follows the implementation of VALID (https://github.com/PreferredAI/VALID). Therefore, we first follow the hyper-parameter tuning from VALID, without using the memory network, by setting `use_memory` to `False`.
Then we tune the key hyper-parameters in VARIUM's architecture:

- `num_steps`: the number of refinement layers, in {1, 2, 3, 4}
- `n_memory_blocks`: the number of memory slots, in {16, 32, 48, 64} (extending this list for your custom datasets might lead to better performance)
- `tau_memory`: the temperature in the memory, in {0.2, 0.3, 0.4, 0.5}
- `rho_carry`: hyper-parameter (Equation 7 in the paper), in the range [0.5, 3] with step size 0.5
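The search space above can be enumerated programmatically, e.g. to generate one config per combination. A minimal sketch follows; the dictionary keys simply reuse the hyper-parameter names above, and writing the actual YAML files is left out:

```python
from itertools import product

# Candidate values taken from the tuning ranges listed above.
num_steps_vals = [1, 2, 3, 4]
n_memory_blocks_vals = [16, 32, 48, 64]
tau_memory_vals = [0.2, 0.3, 0.4, 0.5]
# rho_carry in [0.5, 3] with step size 0.5 -> 0.5, 1.0, ..., 3.0
rho_carry_vals = [0.5 * i for i in range(1, 7)]

# One dict per hyper-parameter combination; each dict could be dumped
# to its own YAML config file under configs/.
grid = [
    dict(num_steps=s, n_memory_blocks=m, tau_memory=t, rho_carry=r)
    for s, m, t, r in product(
        num_steps_vals, n_memory_blocks_vals, tau_memory_vals, rho_carry_vals
    )
]
print(len(grid))  # 4 * 4 * 4 * 6 = 384 configurations
```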
If you find our work useful for your research, please cite our paper as:

```bibtex
@inproceedings{VARIUM,
  author    = {Nhu{-}Thuat Tran and
               Hady W. Lauw},
  title     = {{VARIUM:} Variational Autoencoder for Multi-Interest Representation
               with Inter-User Memory},
  booktitle = {Proceedings of the Eighteenth {ACM} International Conference on Web
               Search and Data Mining, {WSDM} 2025, Hannover, Germany, March 10-14,
               2025},
  pages     = {156--164},
  year      = {2025}
}
```