
Update ParallelWaveGAN config + Tacotron2 masked loss #1545

Closed

iamanigeeit wants to merge 2 commits

Conversation

@iamanigeeit commented Apr 29, 2022

Referencing #1187 and #1192

I did not realize that a PR includes all commits in a branch; I should have used a separate branch each for the ParallelWaveGAN config change and the Tacotron2 masked loss.
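
For context on the second commit: a "masked" loss here means that loss contributions from padded spectrogram frames are zeroed out, so batches with mixed utterance lengths are not penalized on padding. A minimal sketch of the idea (illustrative only, not the code in this PR; the function name and shapes are assumptions):

import torch

def masked_l1_loss(pred, target, lengths):
    # pred/target: (batch, time, n_mels); lengths: (batch,) true frame counts.
    max_len = target.size(1)
    # Mask is 1 for real frames, 0 for padding.
    mask = torch.arange(max_len, device=target.device)[None, :] < lengths[:, None]
    mask = mask.unsqueeze(-1).expand_as(target).float()
    # Average the error over valid frames only, so padding cannot dilute it.
    return (torch.abs(pred - target) * mask).sum() / mask.sum()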

@CLAassistant commented Apr 29, 2022

CLA assistant check
All committers have signed the CLA.

@iamanigeeit changed the title from "Update ParallelWaveGAN config to match original paper" to "Update ParallelWaveGAN config + Tacotron2 masked loss" on Apr 29, 2022
@@ -135,6 +134,8 @@ class TacotronConfig(BaseTTSConfig):
    ga_alpha (float):
        Weight for the guided attention loss. If set less than or equal to zero, it disables the corresponding loss
        function. Defaults to 5.
    stopnet_alpha (float):
        Weight for the guided attention loss. Defaults to 100.0
A Member commented on this diff:
This comment incorrectly explains the parameter.
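
For reference, these alpha values are weights on individual loss terms; per the ga_alpha docstring, a value less than or equal to zero disables that term. A rough sketch of how such weighting is typically combined (hypothetical names, not the repo's actual TacotronLoss implementation; stopnet_alpha is assumed to follow the same disable convention):

import torch.nn.functional as F

def combined_loss(decoder_loss, postnet_loss, stop_logits, stop_targets,
                  ga_loss, ga_alpha=5.0, stopnet_alpha=100.0):
    # Stop-token prediction is a binary classification per decoder step.
    stop_loss = F.binary_cross_entropy_with_logits(stop_logits, stop_targets)
    total = decoder_loss + postnet_loss
    if stopnet_alpha > 0:
        total = total + stopnet_alpha * stop_loss
    if ga_alpha > 0:
        total = total + ga_alpha * ga_loss
    return total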

    optimizer (torch.optim.Optimizer):
        Optimizer used for the training. Defaults to `AdamW`.
    optimizer_params (dict):
        Optimizer kwargs. Defaults to `{"betas": [0.8, 0.99], "weight_decay": 0.0}`
    lr_scheduler_gen (torch.optim.Scheduler):
        Learning rate scheduler for the generator. Defaults to `ExponentialLR`.
A Member commented on this diff:
What is the reason for these config changes?
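
For readers following along, these config entries map onto torch.optim in the usual way; a hypothetical sketch of how they would be consumed (the learning rate and gamma below are placeholders, not values from this diff):

import torch

model = torch.nn.Linear(10, 10)  # stand-in for the generator
optimizer_params = {"betas": [0.8, 0.99], "weight_decay": 0.0}  # docstring default
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4, **optimizer_params)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.999)

for epoch in range(10):
    # ... run the training steps for this epoch ...
    scheduler.step()  # multiplies the lr by gamma once per epoch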

@erogol (Member) commented May 7, 2022

Thanks for the PR. As you said, it'd be better to have separate PRs.

stale bot commented Jun 19, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. You might also look at our discussion channels.

stale bot added the wontfix label (This will not be worked on but feel free to help.) on Jun 19, 2022
stale bot closed this on Jun 27, 2022