
VMZ. Pytorch models #117

Open
jaypatravali opened this issue Jul 15, 2020 · 8 comments

Comments

@jaypatravali

jaypatravali commented Jul 15, 2020

Hi, I followed the install instructions to get vmz running locally.

I wanted to try out some of the pretrained models, e.g.:

>>> from vmz.models import r2plus1d_34
>>> r2plus1d_34(pretraining='sports1m_32frms')

Posted below is the tail of the printed model:

(avgpool): AdaptiveAvgPool3d(output_size=(1, 1, 1))
(fc): Linear(in_features=512, out_features=400, bias=True)

This looks like the weights fine-tuned on Kinetics-400 starting from the pretrained weights, since the fc output should have been 487 (the Sports-1M class count) instead of 400. I figure this is a bug in matching args to urls in utils.py.
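One quick way to catch this kind of mismatch programmatically is to compare the loaded classifier width against the class count of the claimed pretraining dataset. The `check_head` helper below is hypothetical (not part of vmz); Sports-1M has 487 classes and Kinetics-400 has 400, while the 359 for IG-65M is the head size the converted 32-frame checkpoints use in other ports, so treat that entry as an assumption:

```python
# Expected classifier widths for the common video pretraining datasets.
EXPECTED_CLASSES = {
    "sports1m": 487,  # Sports-1M
    "kinetics": 400,  # Kinetics-400
    "ig65m": 359,     # IG-65M 32-frame checkpoints (assumed)
}

def check_head(pretraining_tag, out_features):
    """Return True iff the fc width matches the dataset named in the tag."""
    for dataset, n_classes in EXPECTED_CLASSES.items():
        if pretraining_tag.startswith(dataset):
            return out_features == n_classes
    raise ValueError(f"unknown pretraining tag: {pretraining_tag}")

# The observation from this issue: 'sports1m_32frms' weights load with a
# 400-way head, which matches Kinetics-400, not Sports-1M.
print(check_head("sports1m_32frms", 400))  # False
print(check_head("sports1m_32frms", 487))  # True
```

In real code you would call it as `check_head('sports1m_32frms', model.fc.out_features)` right after constructing the model.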

Similarly, there is a key mismatch when running

>>> r2plus1d_34(pretraining='ig65m_32frms')

Looking up model_urls at line 7 of utils.py, it works when I run it with

>>> r2plus1d_34(pretraining='65m_32frms')

(avgpool): AdaptiveAvgPool3d(output_size=(1, 1, 1))
(fc): Linear(in_features=512, out_features=400, bias=True)
I might be off since I am new to video CNNs; I just wanted to point this out. I am interested in using pretrained weights from Sports-1M and IG-65M on my own video datasets with newer CNNs like r2plus1d_34.
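For reference, using one of these checkpoints on your own dataset comes down to the standard PyTorch head-swap pattern: replace the pretrained `fc` with a new `nn.Linear` sized for your classes, then fine-tune. `Backbone` below is a hypothetical stub with the same 512-wide penultimate features as the printed tail above, just so the sketch runs without vmz installed:

```python
import torch
import torch.nn as nn

class Backbone(nn.Module):
    """Stand-in for the tail of an R(2+1)D model (same avgpool/fc shape as
    the printout in this issue); in practice you'd load the real network,
    e.g. vmz.models.r2plus1d_34(pretraining=...)."""
    def __init__(self, num_classes=400):
        super().__init__()
        self.avgpool = nn.AdaptiveAvgPool3d((1, 1, 1))
        self.fc = nn.Linear(512, num_classes)

    def forward(self, x):                # x: (N, 512, T', H', W') feature map
        x = self.avgpool(x).flatten(1)   # -> (N, 512)
        return self.fc(x)

model = Backbone(num_classes=400)        # pretrained head: 400 Kinetics classes

# Swap the classifier for your own dataset's class count before fine-tuning.
NUM_MY_CLASSES = 10
model.fc = nn.Linear(model.fc.in_features, NUM_MY_CLASSES)

out = model(torch.randn(2, 512, 4, 7, 7))
print(out.shape)  # torch.Size([2, 10])
```

The rest (optimizer over `model.parameters()`, optionally freezing the backbone) is ordinary fine-tuning and independent of which pretraining the weights came from.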

@bjuncek

bjuncek commented Jul 24, 2020

hi @jaypatravali,
so for r2plus1d_34 I don't think we provide Sports-1M pretraining (see here).

For the IG models, that might be the case, as I was actually using models from a different repo. I'll send a PR to fix that in about 2 weeks (I'm on vacation ATM). If you need a fix beforehand, you can download the model from the caffe2 webpage and use the convert_model tool from the repo to get the pretrained model. Sorry again for the issue - it was my sloppiness!

@jaypatravali
Author

Hi @bjuncek, thanks for your replies. I went through the caffe2 models and it seems Sports-1M weights are only available for the 152-layer R(2+1)D model. If I refer to the paper you recently co-authored, https://arxiv.org/pdf/2007.04755.pdf, it uses the 34-layer model with Sports-1M. Is that model going to be made available (r21d_34_sports1m)?

@bjuncek

bjuncek commented Jul 27, 2020

Ah, yes - I believe that was one of the internal ones.

@dutran do you perhaps have the R(2+1)D-34 pretrained on sports1m only lying around somewhere?
The one you later fine-tuned for the 2017/8 CVPR paper in models.md.

@jaypatravali
Author

@dutran @bjuncek any updates on the Sports-1M model for r2plus1d_34? :-)

@jaypatravali
Author

@dutran @bjuncek any updates?

@bjuncek

bjuncek commented Nov 13, 2020

@dutran if you can upload the model, I'll gladly convert it and test it for PT :)

@dutran
Contributor

dutran commented Nov 13, 2020

We have R(2+1)D-34 pretrained on Sports-1M, but at 32x112x112 rather than 224x224, which may give worse performance. You can find it here: https://github.com/facebookresearch/VMZ/blob/master/c2/tutorials/models.md

@lovelyczli

@jaypatravali @bjuncek @dutran I got the R(2+1)D-34 models pretrained on IG-65M ("r2plus1d_34_32_ig65m"), but could you please provide an R(2+1)D-34 model pretrained on Sports-1M?

As you know, comparisons against other works are only fair when the pretraining datasets match. However, I do not have enough GPUs to train on Sports-1M, let alone tune hyperparameters.

In particular, the model in https://github.com/facebookresearch/VMZ/blob/master/c2/tutorials/models.md @dutran has been fine-tuned on Kinetics.

So could you please provide an R(2+1)D-34 model pretrained only on Sports-1M? Many thanks.
