This repository has been archived by the owner on Nov 1, 2024. It is now read-only.
Hi, I appreciate your great work on code translation!
I wonder if you have done an ablation study on data size, since the unsupervised model needs far more training data (over 500M functions) than existing code PLMs such as CodeT5 (8.35M functions).
How does TransCoder perform if less data is provided?