
2s-AGCN

Two-Stream Adaptive Graph Convolutional Networks for Skeleton-Based Action Recognition in CVPR19

Note

The PyTorch version should be 0.3. For PyTorch 0.4 or higher, the code needs to be modified.

Data Preparation

  • Download the raw data from NTU-RGB+D and Skeleton-Kinetics. Then put them under the data directory:

     -data\  
       -kinetics_raw\  
         -kinetics_train\
           ...
         -kinetics_val\
           ...
         -kinetics_train_label.json
          -kinetics_val_label.json
       -nturgbd_raw\  
         -nturgb+d_skeletons\
           ...
         -samples_with_missing_skeletons.txt
    
  • Preprocess the data with:

    python data_gen/ntu_gendata.py

    python data_gen/kinetics_gendata.py

  • Generate the bone data (see the sketch after this list) with:

    python data_gen/gen_bone_data.py
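For reference, the bone stream represents each bone as the vector from a joint to its adjacent joint in the skeleton graph. The snippet below is a minimal sketch of that idea, not the repository's gen_bone_data.py; the array shape (N, C, T, V, M) and the `pairs` edge list are illustrative assumptions.

    import numpy as np

    # Hypothetical (child, parent) joint pairs of the skeleton graph
    # (1-based indices); the real edge list used by the repository may differ.
    pairs = [(2, 1), (3, 21), (4, 3), (5, 21)]

    def joints_to_bones(joint_data, pairs):
        """Turn joint coordinates of shape (N, C, T, V, M) into bone vectors.

        N samples, C coordinate channels, T frames, V joints, M persons.
        Each child joint's channels become the vector from its parent to it.
        """
        bone_data = np.zeros_like(joint_data)
        for child, parent in pairs:
            bone_data[:, :, :, child - 1, :] = (
                joint_data[:, :, :, child - 1, :]
                - joint_data[:, :, :, parent - 1, :]
            )
        return bone_data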

Training & Testing

Change the config file depending on the dataset and stream you want to train or test.

`python main.py --config ./config/nturgbd-cross-view/train_joint.yaml`

`python main.py --config ./config/nturgbd-cross-view/train_bone.yaml`

To ensemble the results of the joint and bone streams, first run the tests to generate the scores of the softmax layer.

`python main.py --config ./config/nturgbd-cross-view/test_joint.yaml`

`python main.py --config ./config/nturgbd-cross-view/test_bone.yaml`

Then combine the generated scores with:

`python ensemble.py --datasets ntu/xview`
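The ensemble fuses the two streams at the score level: for each test sample, the per-class softmax scores of the joint and bone models are added (optionally weighted) and the class with the highest combined score is predicted. The sketch below illustrates this under the assumption that each test run saved a pickled dict mapping sample names to score vectors; the actual file names and paths depend on the work_dir set in the config files.

    import pickle
    import numpy as np

    # Assumed score files; the real paths depend on the work_dir in the configs.
    with open('joint_score.pkl', 'rb') as f:
        joint_scores = pickle.load(f)  # dict: sample name -> per-class scores
    with open('bone_score.pkl', 'rb') as f:
        bone_scores = pickle.load(f)

    predictions = {}
    for name, joint_score in joint_scores.items():
        # Score-level fusion: add the softmax scores of the two streams
        # (a weight on either stream is possible) and take the argmax.
        fused = np.asarray(joint_score) + np.asarray(bone_scores[name])
        predictions[name] = int(fused.argmax())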

Citation

Please cite the following paper if you use this repository in your research.

@inproceedings{2sagcn2019cvpr,  
  title     = {Two-Stream Adaptive Graph Convolutional Networks for Skeleton-Based Action Recognition},  
  author    = {Lei Shi and Yifan Zhang and Jian Cheng and Hanqing Lu},  
  booktitle = {CVPR},  
  year      = {2019},  
}

Contact

For any questions, feel free to contact: lei.shi@nlpr.ia.ac.cn
