Experimenting with egocentric videos
Download the dataset from the [official webpage][ego4d_page] and place it in the `data/` folder, then specify the paths in `config.yaml`.
Run the preprocessing script:

```shell
python utils/get_clip_features.py -c config.yaml
```
This creates JSON files in `data/dataset/nlq_official_v1` that can be used for training and evaluating the VSLNet baseline model.
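To sanity-check the preprocessing output, the generated JSON files can be loaded like this. This is a minimal sketch: the directory path comes from this README, but the helper name and the printed summary are illustrative, and no assumption is made about the JSON schema itself.

```python
import json
from pathlib import Path

def load_nlq_annotations(dirpath):
    """Load every JSON file in dirpath into a dict keyed by filename."""
    annotations = {}
    for path in sorted(Path(dirpath).glob("*.json")):
        with path.open() as f:
            annotations[path.name] = json.load(f)
    return annotations

# Path taken from this README; the files exist only after preprocessing.
anns = load_nlq_annotations("data/dataset/nlq_official_v1")
print(f"loaded {len(anns)} annotation file(s)")
```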
Download the official video features released on the [official webpage][ego4d_page] and place them in the `data/features/nlq_official_v1/video_features/` folder.
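Taken together, the steps above assume the directory layout below. A minimal sketch that creates the empty folders up front (paths taken from this README; the actual files must still be downloaded from the official webpage):

```shell
# Create the layout this README expects before downloading anything.
mkdir -p data/dataset/nlq_official_v1
mkdir -p data/features/nlq_official_v1/video_features
```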