
Mixtral to NeMo conversion script. #8155

Merged
ericharper merged 17 commits into NVIDIA:main from mistral_moe_support/akoumparouli on Jan 27, 2024

Conversation

@akoumpa (Member) commented Jan 11, 2024

What does this PR do?

Adds Mixtral checkpoint conversion scripts between Hugging Face and NeMo, and passes the MoE options from the NeMo config through to Megatron core's TransformerConfig.

Collection: NLP

Changelog

  • Add an HF-Mixtral to NeMo conversion script.
  • Add a Mixtral-NeMo to Mixtral-HF converter.
  • Pass MoE options from the NeMo config to TransformerConfig.
  • Determine MoE support in Megatron core by attempting to import MoETokenDispatcher.

Usage

  • A usage sketch is given below.
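The following is an illustrative invocation of the HF-Mixtral to NeMo converter. The script path and argument names are assumptions made for this sketch; the script added in this PR may use different names.

  # Hypothetical invocation; the script path and flag names are assumptions, not taken from this PR.
  python scripts/checkpoint_converters/convert_mixtral_hf_to_nemo.py \
      --input_name_or_path mistralai/Mixtral-8x7B-v0.1 \
      --output_path /results/mixtral-8x7b.nemo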

Jenkins CI

To run Jenkins, a NeMo user with write access must comment `jenkins` on the PR.

Before your PR is "Ready for review"

Pre checks:

  • Make sure you have read and followed the Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g. Numba, Pynini, Apex, etc.)
  • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list the specific people who can review PRs to various areas.

Additional Information

  • Related to # (issue)

@github-actions bot added the NLP label on Jan 11, 2024
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch 2 times, most recently from 85a4249 to eccdece on January 11, 2024 at 21:11
@cdj0311 commented Jan 12, 2024

Could you provide the NeMo to Hugging Face conversion script?

@akoumpa (Member, Author) commented Jan 12, 2024

@cdj0311 I will try to have the NeMo-to-HF converter ready by next week.

@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from eccdece to e32c3f2 on January 17, 2024 at 00:22
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch 3 times, most recently from 10804f8 to e22b72e on January 17, 2024 at 21:40
@akoumpa marked this pull request as ready for review on January 18, 2024 at 17:58
@akoumpa requested a review from ericharper on January 18, 2024 at 17:58
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch 5 times, most recently from 65c3935 to 3f985b6 on January 19, 2024 at 09:54
@ericharper (Collaborator) commented: jenkins

@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from 4c45018 to a5ba328 on January 20, 2024 at 00:27
@github-actions bot added the core (Changes to NeMo Core), CI, and common labels on Jan 20, 2024
@ericharper (Collaborator) commented: jenkins

@github-actions bot removed the core (Changes to NeMo Core), CI, and common labels on Jan 22, 2024
Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from d70c027 to d926b1a on January 22, 2024 at 19:29
@ericharper (Collaborator) commented: jenkins

1 similar comment
@ericharper (Collaborator) commented: jenkins

@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from aa09f9e to 171f37f on January 25, 2024 at 19:55
akoumpa and others added 2 commits January 26, 2024 10:42
Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>
for more information, see https://pre-commit.ci

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from b3af500 to 02a2a2f on January 26, 2024 at 18:44
@akoumpa (Member, Author) commented Jan 26, 2024: jenkins

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>
@akoumpa (Member, Author) commented Jan 26, 2024: jenkins

@github-actions bot added the CI label on Jan 26, 2024
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from 33a71fd to 2ee101b on January 26, 2024 at 21:58
Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>
@akoumpa force-pushed the mistral_moe_support/akoumparouli branch from f3c8be2 to efdd42e on January 26, 2024 at 22:01
@akoumpa (Member, Author) commented Jan 26, 2024: jenkins

@akoumpa (Member, Author) commented Jan 26, 2024: jenkins

@ericharper (Collaborator) left a comment

LGTM. Thanks!

@ericharper merged commit 13c1db4 into NVIDIA:main on Jan 27, 2024
11 checks passed
yaoyu-33 pushed a commit that referenced this pull request Jan 31, 2024
* HF-Mixtral to NeMo conversion script.

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* Pass MoE options from NeMo config to TransformerConfig.

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* Add version check for get_gpt_layer_with_transformer_engine_spec

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* Determine MoE support by attempting to import MoETokenDispatcher.

Using importlib.metadata.version would be an alternative; however, (a) it requires having mcore installed via pip (which is not always the case), and (b) one might override megatron's location (e.g. via PYTHONPATH) and, as a result, get an inaccurate version from importlib.metadata. (A sketch of this import-based check is given after this commit list.)

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Mixtral-NeMo to Mixtral-HF converter.

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* fixup: Update mcore_supports_moe due to file rename in upcoming MoE

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Mixtral-converters: use `set_expert_model_parallel_world_size` to specify MoE world size.

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* Fix import

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* Jenkins: install lightning.

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* Match latest MoE parameter names.

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Eric Harper <complex451@gmail.com>
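The "Determine MoE support by attempting to import MoETokenDispatcher" commit above describes an import-probe capability check. A minimal sketch of that idea follows; it is not the literal code from this PR, and the exact module path inside megatron.core is an assumption (the MoE files have been renamed across mcore releases).

  # Sketch of the import-based MoE capability check described in the commit message above.
  def mcore_supports_moe() -> bool:
      try:
          # Module path is an assumption; it has moved across Megatron core releases.
          from megatron.core.transformer.moe.token_dispatcher import MoETokenDispatcher  # noqa: F401
          return True
      except ImportError:
          # Covers both "mcore not installed" and "installed version predates MoE support".
          return False

This avoids relying on importlib.metadata, which, as the commit message notes, can report a misleading version when megatron is picked up from PYTHONPATH rather than from a pip install.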
stevehuang52 pushed a commit that referenced this pull request Jan 31, 2024
@janekl (Collaborator) commented Feb 12, 2024

Hi @ericharper. I browsed the CodeQL suggestions in this pull request. This is a good example of a case where Flake8 does better than CodeQL: CodeQL fails to detect undefined variables (a simple issue; see head_num here), while Flake8 catches them. I strongly suggest starting to use Flake8 as a complement to CodeQL in CI.

@janekl (Collaborator) commented Feb 12, 2024

Another example is the unused imports `from megatron.core.utils import init_method_normal, scaled_init_method_normal` here. This file is frequently edited, by the way. Possibly CodeQL misses this because of the import guards, but I'm not sure. cc @ericharper
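For reference, a sketch of the kind of Flake8 CI step being suggested here; the target paths are placeholders, and the selected codes are the ones that would flag the two issues mentioned above (F821 for undefined names, F401 for unused imports).

  # Illustrative CI step, not an existing NeMo workflow; target paths are placeholders.
  pip install flake8
  flake8 --select=F821,F401 nemo/ scripts/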

ssh-meister pushed a commit to ssh-meister/NeMo that referenced this pull request Feb 15, 2024
pablo-garay pushed a commit that referenced this pull request Mar 19, 2024
rohitrango pushed a commit to rohitrango/NeMo that referenced this pull request Jun 25, 2024