Hi Luowei,
thanks for releasing the code to the public.
I find that the link 'UniLM checkpoints' is now invalid; could you release an accessible link again?
As stated in your paper, you use BERT-base as the Transformer backbone, with the BERT weights initialized from UniLM. However, the UniLM paper and their GitHub repo only explore BERT-large, so I am not sure whether the UniLM checkpoint you use is BERT-base or BERT-large.
@lostnighter The BERT-base UniLM checkpoint has not been officially released yet, so we do not have the permission to distribute the model. Please contact the UniLM authors for more details.
As an alternative, you can initialize the VLP pre-training model with the original BERT checkpoint, which gives similar results (at least on CC pre-training), as shown in Tab. 5.
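Initializing from the original BERT checkpoint usually comes down to copying over only those parameters whose names and shapes match, and leaving any new (e.g. vision-specific) parameters at their fresh initialization. A minimal sketch of that filtering step, with plain dicts standing in for state dicts and with all parameter names purely illustrative (not the actual VLP layout):

```python
# Sketch: initialize a model's parameters from a BERT checkpoint by
# copying only entries whose names and shapes match. Plain dicts stand
# in for torch state_dicts; all names/shapes here are hypothetical.

def init_from_checkpoint(model_state, bert_state):
    """Return (merged, loaded, skipped): BERT weights where they fit,
    otherwise the model's own (e.g. randomly initialized) values."""
    merged = dict(model_state)  # start from the model's own init
    loaded, skipped = [], []
    for name, weight in bert_state.items():
        if name in model_state and len(model_state[name]) == len(weight):
            merged[name] = weight
            loaded.append(name)
        else:
            # parameter absent from the model, or shape mismatch
            skipped.append(name)
    return merged, loaded, skipped

# Toy example: the visual embedding is new, so it keeps its own init,
# while the shared word embedding is taken from the BERT checkpoint.
model = {"embed.word": [0.0] * 4, "embed.visual": [0.0] * 6}
bert = {"embed.word": [0.1] * 4, "pooler.dense": [0.2] * 8}
merged, loaded, skipped = init_from_checkpoint(model, bert)
```

In practice the same pattern applies with `torch.load` on the BERT checkpoint and `model.load_state_dict(..., strict=False)`, which likewise ignores missing or unexpected keys.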