
[TTS] Fix adapter duration issue #6697

Merged · 9 commits · May 22, 2023
Conversation

@hsiehjackson (Collaborator) commented on May 22, 2023

What does this PR do?

Fix TTS adapter duration issue.

Collection: [TTS]

Changelog

  • examples/tts/conf/fastpitch_align_44100_adapter.yaml
  • examples/tts/fastpitch_finetune_adapters.py
  • nemo/collections/tts/losses/aligner_loss.py
  • nemo/collections/tts/models/fastpitch.py
  • tutorials/tts/FastPitch_Adapter_Finetuning.ipynb

Usage

  • You can potentially add a usage example below

Before your PR is "Ready for review"

Pre checks:

  • Make sure you read and followed Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (Ex: Numba, Pynini, Apex, etc.)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>
hsiehjackson and others added 2 commits May 22, 2023 09:34
Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>
@subhankar-ghosh (Collaborator) commented:

@hsiehjackson Can you also add the code to increase the loss scale for the aligner loss? We can add these two methods in the tutorial later. I feel that increasing the loss scale for the aligner loss should also be a good approach, especially because it doesn't add parameters to finetune.

hsiehjackson and others added 3 commits May 22, 2023 11:08
Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>
Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>
@hsiehjackson (Collaborator, Author) commented:

> @hsiehjackson Can you also add the code to increase the loss scale for the aligner loss? We can add these two methods in the tutorial later. I feel that increasing the loss scale for the aligner loss should also be a good approach, especially because it doesn't add parameters to finetune.

@subhankar-ghosh Added.
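The idea discussed above, weighting the aligner loss more heavily instead of adding adapter parameters, could be sketched roughly as follows. This is a minimal illustration, not the actual NeMo implementation; the function and parameter names (`combine_losses`, `aligner_loss_scale`, and the individual loss terms) are hypothetical.

```python
def combine_losses(mel_loss: float,
                   forward_sum_loss: float,
                   bin_loss: float,
                   aligner_loss_scale: float = 1.0) -> float:
    """Combine FastPitch-style loss terms, scaling only the
    alignment-related terms (hypothetical sketch).

    A scale > 1.0 emphasizes the aligner objective during finetuning
    without introducing any new trainable parameters.
    """
    return mel_loss + aligner_loss_scale * (forward_sum_loss + bin_loss)
```

For example, with `aligner_loss_scale=2.0`, the alignment terms contribute twice their unscaled value to the total, while the mel reconstruction term is unchanged.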

@subhankar-ghosh (Collaborator) commented:

One minor issue; the rest LGTM.

hsiehjackson and others added 3 commits May 22, 2023 14:15
@subhankar-ghosh (Collaborator) left a review:

LGTM

@hsiehjackson hsiehjackson merged commit efec347 into main May 22, 2023
@hsiehjackson hsiehjackson deleted the fix_adapter_duration branch May 22, 2023 23:18
hsiehjackson added a commit to hsiehjackson/NeMo that referenced this pull request Jun 2, 2023
* Fix duration issue

Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>

* Fix duration issue

Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Add scale aligner loss

Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Fix bug

Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>

---------

Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Signed-off-by: hsiehjackson <c2hsieh@ucsd.edu>