
[shardformer] adapted T5 and LLaMa test to use kit #4049

Merged

Conversation

@FrankLeeeee (Contributor) commented on Jun 20, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number


Fixed #4044
Fixed #4037

This PR is a continuation of #4045 and #4036

📝 What does this PR do?


This PR adapts the T5 and LLaMA tests to use tests/kit/model_zoo. The changes can be summarized as follows (a sketch of the idea appears after this list):

  1. added a loss_fn in the model_zoo to compute a dummy loss from the model output
  2. extracted the common functions shared by the test_model tests
  3. integrated the model_zoo into test_model
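
To illustrate how items 1-3 fit together, below is a minimal sketch of a model-zoo entry that carries a dummy loss_fn, plus a shared helper that a test_model-style test could call for both T5 and LLaMA. The registry class, function names, and the tiny T5 configuration are illustrative assumptions for this sketch, not the exact Colossal-AI tests/kit/model_zoo API.

```python
# Hypothetical sketch: a model-zoo entry with a dummy loss_fn and a shared
# forward/backward helper. Names and signatures are assumptions for
# illustration and may differ from the actual tests/kit/model_zoo code.
import torch
import transformers


class ModelZooRegistry(dict):
    def register(self, name, model_fn, data_gen_fn, output_transform_fn, loss_fn):
        # Store everything a test needs: how to build the model, generate a
        # batch, normalize the output, and compute a dummy loss for backward().
        self[name] = (model_fn, data_gen_fn, output_transform_fn, loss_fn)


model_zoo = ModelZooRegistry()


def data_gen_fn():
    # Tiny fake batch so the test stays fast and deterministic.
    input_ids = torch.randint(0, 100, (2, 16))
    return dict(input_ids=input_ids, decoder_input_ids=input_ids)


def output_transform_fn(outputs):
    # Flatten the HuggingFace output object into a plain dict of tensors.
    return dict(last_hidden_state=outputs.last_hidden_state)


def loss_fn(transformed_outputs):
    # Dummy loss: reduce the hidden states to a scalar so both the sharded
    # and the original model can run a backward pass in the test.
    return transformed_outputs['last_hidden_state'].mean()


model_zoo.register(
    name='transformers_t5',
    model_fn=lambda: transformers.T5Model(transformers.T5Config(d_model=32, num_layers=2)),
    data_gen_fn=data_gen_fn,
    output_transform_fn=output_transform_fn,
    loss_fn=loss_fn,
)


def run_forward_backward(name):
    # Common helper that test_model-style tests could share across models:
    # build the model from the registry, run a forward pass, and backpropagate
    # the dummy loss.
    model_fn, data_gen_fn, output_transform_fn, loss_fn = model_zoo[name]
    model = model_fn()
    outputs = model(**data_gen_fn())
    loss = loss_fn(output_transform_fn(outputs))
    loss.backward()
    return loss


if __name__ == '__main__':
    print(run_forward_backward('transformers_t5'))
```

With a layout like this, supporting LLaMA only requires registering another entry (model_fn, data_gen_fn, output_transform_fn, loss_fn); the shared helper itself does not change.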

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@FrankLeeeee added the testing and shardformer labels on Jun 20, 2023
@FoolPlayer merged commit c3ba3e1 into hpcaitech:feature/shardformer on Jun 21, 2023
FoolPlayer pushed a commit to FoolPlayer/ColossalAI that referenced this pull request Jun 21, 2023
* [shardformer] adapted T5 and LLaMa test to use kit

* polish code
FrankLeeeee added a commit that referenced this pull request Jun 26, 2023
* [shardformer] adapted T5 and LLaMa test to use kit

* polish code
flybird11111 pushed a commit to flybird11111/ColossalAI that referenced this pull request Jul 3, 2023
* [shardformer] adapted T5 and LLaMa test to use kit

* polish code
FrankLeeeee added a commit that referenced this pull request Jul 4, 2023
* [shardformer] adapted T5 and LLaMa test to use kit

* polish code
ver217 pushed a commit to ver217/ColossalAI that referenced this pull request Jul 13, 2023
* [shardformer] adapted T5 and LLaMa test to use kit

* polish code
Successfully merging this pull request may close these issues:

  • [shardformer] support T5 in shardformer
  • [FEATURE]: support llama in shardformer