Adding Hybrid RNNT-CTC model #5364

Merged: 76 commits merged into main from add_joint_rnnt_ctc on Dec 6, 2022
Commits (file changes shown below reflect 42 of the 76 commits)
eae0620
added initial code.
VahidooX Nov 9, 2022
b4c3f2a
added the confs.
VahidooX Nov 9, 2022
a8b7886
added the confs.
VahidooX Nov 9, 2022
416af52
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 9, 2022
384729f
changed name from joint to hybrid.
VahidooX Nov 9, 2022
c84f981
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 9, 2022
3164ac6
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 9, 2022
4623bd6
fixed format.
VahidooX Nov 9, 2022
50f70bf
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 9, 2022
fa92433
fixed format.
VahidooX Nov 9, 2022
512eef8
fixed bug.
VahidooX Nov 15, 2022
edd79e3
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 15, 2022
2fa7c57
fixed bug.
VahidooX Nov 15, 2022
2b3902a
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 15, 2022
3820433
addressed comments.
VahidooX Nov 15, 2022
4b29ec7
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 15, 2022
e7b1171
addressed comments.
VahidooX Nov 15, 2022
ef5cca3
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 15, 2022
bab1fc2
Merge branch 'main' of https://github.com/NVIDIA/NeMo into add_joint_…
VahidooX Nov 15, 2022
a0d584e
added docs.
VahidooX Nov 15, 2022
7ab6384
added docs.
VahidooX Nov 16, 2022
31d9f07
added docs.
VahidooX Nov 16, 2022
a84d29f
added docs.
VahidooX Nov 16, 2022
defe4fd
fixed bug.
VahidooX Nov 16, 2022
38889c3
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 16, 2022
6a77431
fixed bug.
VahidooX Nov 17, 2022
d6c7262
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 17, 2022
48f5a6d
fixed bug.
VahidooX Nov 17, 2022
4ae005e
fixed bug.
VahidooX Nov 17, 2022
3821c43
fixed bug.
VahidooX Nov 17, 2022
d0e0788
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 17, 2022
8c05435
fixed bug.
VahidooX Nov 18, 2022
925e364
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 18, 2022
cd403be
fixed bug.
VahidooX Nov 18, 2022
d1d1e6a
fixed bug.
VahidooX Nov 18, 2022
24c8a4d
addec CI test.
VahidooX Nov 18, 2022
a05fd0d
addec CI test.
VahidooX Nov 18, 2022
21ad6f3
fixed bugs in change_vocabs.
VahidooX Nov 19, 2022
cd28395
fixed bugs in change_vocabs.
VahidooX Nov 19, 2022
1aac4aa
fixed style.
VahidooX Nov 19, 2022
3a40e1e
fixed style.
VahidooX Nov 19, 2022
93ae749
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Nov 19, 2022
2a2f995
fixed style.
VahidooX Nov 19, 2022
1b86c72
raise error for aux_ctc.
VahidooX Nov 23, 2022
d26304a
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 23, 2022
e36d7c5
raise error for aux_ctc.
VahidooX Nov 23, 2022
1ab4578
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 23, 2022
d576312
raise error for aux_ctc.
VahidooX Nov 23, 2022
c32ee7b
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 23, 2022
93c6133
raise error for aux_ctc.
VahidooX Nov 23, 2022
3979335
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Nov 23, 2022
394c5fb
updated the streaming names.
VahidooX Nov 23, 2022
3d70f69
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 23, 2022
2237d19
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Nov 29, 2022
6dc2394
added unittests.
VahidooX Nov 29, 2022
39e6197
added unittests.
VahidooX Nov 30, 2022
3c32e41
added unittests.
VahidooX Nov 30, 2022
a4499b1
fixed tests.
VahidooX Nov 30, 2022
c239b3d
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 30, 2022
c991d18
fixed tests.
VahidooX Nov 30, 2022
8c678ed
fixed tests.
VahidooX Nov 30, 2022
b5a56b4
fixed tests.
VahidooX Nov 30, 2022
fb8e923
fixed tests.
VahidooX Nov 30, 2022
b33a6aa
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Nov 30, 2022
ecf9f1d
fixed tests.
VahidooX Nov 30, 2022
add76b3
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Nov 30, 2022
0ca5d2d
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Nov 30, 2022
f6c5558
added methods.
VahidooX Dec 2, 2022
aab6e4e
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Dec 2, 2022
ab072da
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Dec 2, 2022
853a490
Merge remote-tracking branch 'origin/add_joint_rnnt_ctc' into add_joi…
VahidooX Dec 2, 2022
f7361db
added decoding.
VahidooX Dec 2, 2022
49c8c14
fxied the tests.
VahidooX Dec 2, 2022
d27012a
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Dec 5, 2022
52b84ba
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Dec 5, 2022
78490e3
Merge branch 'main' into add_joint_rnnt_ctc
VahidooX Dec 5, 2022
18 changes: 18 additions & 0 deletions Jenkinsfile
@@ -562,6 +562,24 @@ pipeline {
// sh 'rm -rf examples/asr/speech_to_text_rnnt_wpe_results'
// }
// }
// stage('L3: Speech to Text Hybrid Transducer-CTC WPE') {
// steps {
// sh 'STRICT_NUMBA_COMPAT_CHECK=false python examples/asr/asr_hybrid_transducer_ctc/speech_to_text_hybrid_rnnt_ctc_bpe.py \
// --config-path="../conf/conformer/hybrid_transducer_ctc/" --config-name="conformer_hybrid_transducer_ctc_bpe.yaml" \
// model.train_ds.manifest_filepath=/home/TestData/an4_dataset/an4_train.json \
// model.validation_ds.manifest_filepath=/home/TestData/an4_dataset/an4_val.json \
// model.encoder.n_layers=2 \
// model.train_ds.batch_size=2 \
// model.validation_ds.batch_size=2 \
// model.tokenizer.dir="/home/TestData/asr_tokenizers/an4_wpe_128/" \
// model.tokenizer.type="wpe" \
// trainer.devices=[0] \
// trainer.accelerator="gpu" \
// +trainer.fast_dev_run=True \
// exp_manager.exp_dir=examples/asr/speech_to_text_hybrid_transducer_ctc_wpe_results'
// sh 'rm -rf examples/asr/speech_to_text_hybrid_transducer_ctc_wpe_results'
// }
// }
// }
// }

21 changes: 21 additions & 0 deletions docs/source/asr/models.rst
@@ -236,6 +236,27 @@ You may find the example config files of Squeezeformer-CTC model with character-
``<NeMo_git_root>/examples/asr/conf/squeezeformer/squeezeformer_ctc_char.yaml`` and
with sub-word encoding at ``<NeMo_git_root>/examples/asr/conf/squeezeformer/squeezeformer_ctc_bpe.yaml``.

.. _Hybrid-Transducer_CTC_model:

Hybrid-Transducer-CTC
---------------------

Hybrid RNNT-CTC models are a group of models with both RNNT and CTC decoders. Training such a unified model speeds up convergence for the CTC decoder and gives
the user a single model that works as either a CTC or an RNNT model. This category can be used with any of the ASR models.
Hybrid models use two decoders, CTC and RNNT, on top of a shared encoder. The default decoding strategy after training is RNNT.
Users may call ``asr_model.change_decoding_strategy(decoder_type='ctc')`` or ``asr_model.change_decoding_strategy(decoder_type='rnnt')`` to change the default decoding.
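
A minimal sketch of switching decoders on a restored hybrid model (the checkpoint name ``hybrid_model.nemo`` is a hypothetical placeholder, not a released model):

```python
# Hedged sketch: switching between the two decoders of a trained hybrid model.
# "hybrid_model.nemo" is a hypothetical checkpoint path used for illustration.
from nemo.collections.asr.models import EncDecHybridRNNTCTCBPEModel

asr_model = EncDecHybridRNNTCTCBPEModel.restore_from("hybrid_model.nemo")

# Decode with the auxiliary CTC decoder (non-autoregressive)
asr_model.change_decoding_strategy(decoder_type='ctc')

# Switch back to the default RNNT decoder
asr_model.change_decoding_strategy(decoder_type='rnnt')
```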

The variant with sub-word encoding is a BPE-based model
which can be instantiated using the :class:`~nemo.collections.asr.models.EncDecHybridRNNTCTCBPEModel` class, while the
character-based variant is based on :class:`~nemo.collections.asr.models.EncDecHybridRNNTCTCModel`.

You may use the example scripts under ``<NeMo_git_root>/examples/asr/asr_hybrid_transducer_ctc`` for both character-based and sub-word encodings.
These examples can be used to train any hybrid ASR model, such as Conformer, Citrinet, or QuartzNet.

You may find the example config files of the Conformer variant of such hybrid models with character-based encoding at
``<NeMo_git_root>/examples/asr/conf/conformer/hybrid_transducer_ctc/conformer_hybrid_transducer_ctc_char.yaml`` and
with sub-word encoding at ``<NeMo_git_root>/examples/asr/conf/conformer/hybrid_transducer_ctc/conformer_hybrid_transducer_ctc_bpe.yaml``.


References
----------
32 changes: 32 additions & 0 deletions examples/asr/asr_hybrid_transducer_ctc/README.md
@@ -0,0 +1,32 @@
# ASR with Hybrid Transducer/CTC Models

This directory contains example scripts to train ASR models with two decoders, one trained with a Transducer loss and the other with a CTC loss.

Currently supported models are -

* Character based Hybrid RNNT/CTC model
* Subword based Hybrid RNNT/CTC model

# Model execution overview

The training scripts in this directory execute in the following order. When preparing your own training-from-scratch / fine-tuning scripts, please follow this order for correct training/inference.

```mermaid

graph TD
A[Hydra Overrides + Yaml Config] --> B{Config}
B --> |Init| C[Trainer]
C --> D[ExpManager]
B --> D[ExpManager]
C --> E[Model]
B --> |Init| E[Model]
E --> |Constructor| F1(Change Vocabulary)
F1 --> F2(Setup Adapters if available)
F2 --> G(Setup Train + Validation + Test Data loaders)
G --> H1(Setup Optimization)
H1 --> H2(Change Transducer Decoding Strategy)
H2 --> I[Maybe init from pretrained]
I --> J["trainer.fit(model)"]
```

During restoration of the model, you may pass the Trainer to the restore_from / from_pretrained call, or set it after the model has been initialized by using `model.set_trainer(Trainer)`.
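
A minimal sketch of both options (`hybrid_model.nemo` is a hypothetical checkpoint path used for illustration):

```python
# Hedged sketch of attaching a Trainer during or after restoration.
import pytorch_lightning as pl

from nemo.collections.asr.models import EncDecHybridRNNTCTCBPEModel

trainer = pl.Trainer(devices=1, accelerator='cpu')

# Option 1: pass the Trainer to the restore_from call
asr_model = EncDecHybridRNNTCTCBPEModel.restore_from("hybrid_model.nemo", trainer=trainer)

# Option 2: set it after the model has been initialized
asr_model = EncDecHybridRNNTCTCBPEModel.restore_from("hybrid_model.nemo")
asr_model.set_trainer(trainer)
```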
91 changes: 91 additions & 0 deletions examples/asr/asr_hybrid_transducer_ctc/speech_to_text_hybrid_rnnt_ctc_bpe.py
@@ -0,0 +1,91 @@
# Copyright (c) 2022, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
# Preparing the Tokenizer for the dataset
Use the `process_asr_text_tokenizer.py` script under <NEMO_ROOT>/scripts/tokenizers/ in order to prepare the tokenizer.

```sh
python <NEMO_ROOT>/scripts/tokenizers/process_asr_text_tokenizer.py \
--manifest=<path to train manifest files, separated by commas>
OR
--data_file=<path to text data, separated by commas> \
--data_root="<output directory>" \
--vocab_size=<number of tokens in vocabulary> \
--tokenizer=<"spe" or "wpe"> \
--no_lower_case \
--spe_type=<"unigram", "bpe", "char" or "word"> \
--spe_character_coverage=1.0 \
--log
```

# Training the model
```sh
python speech_to_text_hybrid_rnnt_ctc_bpe.py \
# (Optional: --config-path=<path to dir of configs> --config-name=<name of config without .yaml>) \
model.train_ds.manifest_filepath=<path to train manifest> \
model.validation_ds.manifest_filepath=<path to val/test manifest> \
model.tokenizer.dir=<path to directory of tokenizer (not full path to the vocab file!)> \
model.tokenizer.type=<either bpe or wpe> \
model.aux_ctc.ctc_loss_weight=0.3 \
trainer.devices=-1 \
trainer.max_epochs=100 \
model.optim.name="adamw" \
model.optim.lr=0.001 \
model.optim.betas=[0.9,0.999] \
model.optim.weight_decay=0.0001 \
model.optim.sched.warmup_steps=2000 \
exp_manager.create_wandb_logger=True \
exp_manager.wandb_logger_kwargs.name="<Name of experiment>" \
exp_manager.wandb_logger_kwargs.project="<Name of project>"
```
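
The `model.aux_ctc.ctc_loss_weight` override above blends the two training objectives. A plausible formulation, inferred
from the config naming rather than shown in this diff, is a convex combination of the RNNT and CTC losses:

```python
# Hedged sketch of how the hybrid objective is plausibly combined; the exact
# NeMo implementation is not shown in this diff. w corresponds to
# model.aux_ctc.ctc_loss_weight.
def hybrid_loss(rnnt_loss: float, ctc_loss: float, w: float = 0.3) -> float:
    # w = 0.0 trains on the RNNT loss only; w = 1.0 trains on the CTC loss only.
    return (1.0 - w) * rnnt_loss + w * ctc_loss
```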

# Fine-tune a model

For documentation on fine-tuning this model, please visit -
https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/main/asr/configs.html#fine-tuning-configurations

"""

import pytorch_lightning as pl
from omegaconf import OmegaConf

from nemo.collections.asr.models import EncDecHybridRNNTCTCBPEModel
from nemo.core.config import hydra_runner
from nemo.utils import logging
from nemo.utils.exp_manager import exp_manager


@hydra_runner(
config_path="../conf/conformer/hybrid_transducer_ctc/", config_name="conformer_hybrid_transducer_ctc_bpe"
)
def main(cfg):
logging.info(f'Hydra config: {OmegaConf.to_yaml(cfg)}')

trainer = pl.Trainer(**cfg.trainer)
exp_manager(trainer, cfg.get("exp_manager", None))
asr_model = EncDecHybridRNNTCTCBPEModel(cfg=cfg.model, trainer=trainer)

# Initialize the weights of the model from another model, if provided via config
asr_model.maybe_init_from_pretrained_checkpoint(cfg)

trainer.fit(asr_model)

if hasattr(cfg.model, 'test_ds') and cfg.model.test_ds.manifest_filepath is not None:
if asr_model.prepare_test(trainer):
trainer.test(asr_model)


if __name__ == '__main__':
main() # noqa pylint: disable=no-value-for-parameter
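
If you also want to evaluate the auxiliary CTC branch after training, one plausible pattern (an assumption, not part of the script above) is to switch the decoding strategy and run a second test pass:

```python
# Hedged sketch, not part of the script above: after the default RNNT test
# pass, switch to the CTC decoder and evaluate the auxiliary branch as well.
asr_model.change_decoding_strategy(decoder_type='ctc')
if asr_model.prepare_test(trainer):
    trainer.test(asr_model)
```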
100 changes: 100 additions & 0 deletions examples/asr/asr_hybrid_transducer_ctc/speech_to_text_hybrid_rnnt_ctc.py
@@ -0,0 +1,100 @@
# Copyright (c) 2022, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

"""
# Training the model

Basic run (on CPU for 50 epochs):
python examples/asr/asr_hybrid_transducer_ctc/speech_to_text_hybrid_rnnt_ctc.py \
# (Optional: --config-path=<path to dir of configs> --config-name=<name of config without .yaml>) \
model.train_ds.manifest_filepath="<path to manifest file>" \
model.validation_ds.manifest_filepath="<path to manifest file>" \
trainer.devices=1 \
trainer.accelerator='cpu' \
trainer.max_epochs=50


Add PyTorch Lightning Trainer arguments from CLI:
python speech_to_text_hybrid_rnnt_ctc.py \
... \
+trainer.fast_dev_run=true

Hydra logs will be found in "$(./outputs/$(date +"%y-%m-%d")/$(date +"%H-%M-%S")/.hydra)"
PTL logs will be found in "$(./outputs/$(date +"%y-%m-%d")/$(date +"%H-%M-%S")/lightning_logs)"

Override some args of optimizer:
python speech_to_text_hybrid_rnnt_ctc.py \
--config-path="../conf/conformer/hybrid_transducer_ctc" \
--config-name="conformer_hybrid_transducer_ctc_char" \
model.train_ds.manifest_filepath="./an4/train_manifest.json" \
model.validation_ds.manifest_filepath="./an4/test_manifest.json" \
trainer.devices=2 \
model.aux_ctc.ctc_loss_weight=0.3 \
trainer.precision=16 \
trainer.max_epochs=2 \
model.optim.betas=[0.8,0.5] \
model.optim.weight_decay=0.0001

Override the optimizer entirely:
python speech_to_text_hybrid_rnnt_ctc.py \
--config-path="../conf/conformer/hybrid_transducer_ctc" \
--config-name="conformer_hybrid_transducer_ctc_char" \
model.train_ds.manifest_filepath="./an4/train_manifest.json" \
model.validation_ds.manifest_filepath="./an4/test_manifest.json" \
model.aux_ctc.ctc_loss_weight=0.3 \
trainer.devices=2 \
trainer.precision=16 \
trainer.max_epochs=2 \
model.optim.name=adamw \
model.optim.lr=0.001 \
~model.optim.args \
+model.optim.args.betas=[0.8,0.5] \
+model.optim.args.weight_decay=0.0005

# Fine-tune a model

For documentation on fine-tuning this model, please visit -
https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/main/asr/configs.html#fine-tuning-configurations

"""

import pytorch_lightning as pl
from omegaconf import OmegaConf

from nemo.collections.asr.models import EncDecHybridRNNTCTCModel
from nemo.core.config import hydra_runner
from nemo.utils import logging
from nemo.utils.exp_manager import exp_manager


@hydra_runner(config_path="../conf/conformer/hybrid_transducer_ctc/", config_name="conformer_hybrid_transducer_ctc_char")
def main(cfg):
logging.info(f'Hydra config: {OmegaConf.to_yaml(cfg)}')

trainer = pl.Trainer(**cfg.trainer)
exp_manager(trainer, cfg.get("exp_manager", None))
asr_model = EncDecHybridRNNTCTCModel(cfg=cfg.model, trainer=trainer)

# Initialize the weights of the model from another model, if provided via config
asr_model.maybe_init_from_pretrained_checkpoint(cfg)

trainer.fit(asr_model)

if hasattr(cfg.model, 'test_ds') and cfg.model.test_ds.manifest_filepath is not None:
if asr_model.prepare_test(trainer):
trainer.test(asr_model)


if __name__ == '__main__':
main() # noqa pylint: disable=no-value-for-parameter