ACR: Attention Collaboration-based Regressor for Arbitrary Two-Hand Reconstruction [CVPR 2023]

This is the official repository of ACR.

Zhengdi Yu, Shaoli Huang, Chen Fang, Toby P. Breckon, Jue Wang

Conference on Computer Vision and Pattern Recognition (CVPR), 2023

[Paper][Project Page][Video]

News 🚩

  • [2023/03/24] Code release!
  • [2023/03/10] ACR is on arXiv now.
  • [2023/02/27] ACR was accepted to CVPR 2023! 🎉

Requirements

Conda environments

conda create -n ACR python==3.8.8  
conda activate ACR 
conda install -n ACR pytorch==1.10.0 torchvision==0.11.1 cudatoolkit=10.2 -c pytorch
pip install -r requirements.txt

For rendering and visualization on a headless server, please consider installing pytorch3d following the official instructions and setting the renderer to pytorch3d in configs/demo.yml. Note that pyrender can only be used on a desktop machine.
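For reference, the renderer switch in configs/demo.yml would look something like the fragment below. This is a hypothetical sketch: the exact key name and surrounding options are defined by the repository's config file and may differ.

```yaml
# configs/demo.yml (fragment; key name is an assumption)
renderer: pytorch3d   # use "pyrender" instead when running on a desktop session
```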

Pre-trained model and data

  • Register and download the MANO model. Put MANO_LEFT.pkl and MANO_RIGHT.pkl in mano/.
  • Download the pre-trained weights from here (updated on 2023/03/28) and put them in checkpoints/.

Demo

Note: use the -t flag to smooth your results. We provide example inputs in demos/.

# Run a real-time demo:
python -m acr.main --demo_mode webcam -t

# Run on a single image:
python -m acr.main --demo_mode image --inputs <PATH_TO_IMAGE>

# Run on a folder of images:
python -m acr.main --demo_mode folder -t --inputs <PATH_TO_FOLDER> 

# Run on a video:
python -m acr.main --demo_mode video -t --inputs <PATH_TO_VIDEO> 

Finally, the visualizations will be saved in demos_outputs/. In video or folder mode, the results will also be saved as <FILENAME>_output.mp4.

More qualitative results

(Qualitative results on in-the-wild images; see the project page for the full gallery.)

Applications

Citation

@inproceedings{yu2023acr,
  title     = {ACR: Attention Collaboration-based Regressor for Arbitrary Two-Hand Reconstruction},
  author    = {Yu, Zhengdi and Huang, Shaoli and Fang, Chen and Breckon, Toby P. and Wang, Jue},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2023}
}

Acknowledgement

The PyTorch implementation of MANO is based on manopth. We use parts of the great code from ROMP. For MANO segmentation and rendering, we follow zc-alexfan. We thank all the authors for their impressive work!

Contact

For technical questions, please contact z.yu23@imperial.ac.uk or ZhengdiYu@hotmail.com.

For commercial licensing, please contact shaolihuang@tencent.com.
