feat(moe): support isp for moe #57
Draft · blankde wants to merge 18 commits into InternLM:develop from blankde:feat/support_wp_for_moe
Conversation
configs/7B_MoE4_sft.py
Outdated
2. overlap: bool, enable/disable all_gather/reduce_scatter communication overlap, defaults to False.
3. memory_pool: bool, enable/disable memory pool, defaults to False.
expert parallel (dict):
    1. size: int, the size of expert parallel, each device would save {num_expert/ep_size} local experts.
expert parallel (dict):
Should this be changed to `expert weight parallel (dict):`?
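If renamed as suggested, the second block of the docstring might read as follows. This is a sketch only; the `size` field is carried over from the existing `expert parallel` block, and its exact meaning under weight parallel is an assumption, not taken from this PR:

```python
# Hypothetical docstring fragment after the suggested rename:
"""
expert parallel (dict):
    1. size: int, the size of expert parallel, each device would save {num_expert/ep_size} local experts.
expert weight parallel (dict):
    1. size: int, the size of expert weight parallel (assumed field, mirroring the block above).
"""
```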
internlm/utils/parallel.py
Outdated
@@ -71,6 +72,14 @@ def is_tensor_expert_data_parallel_parameter(p):
    )


def is_weight_expert_data_parallel_parameter(p):
    return (
        gpc.is_initialized(ParallelMode.TENSOR)
Shouldn't this be `gpc.is_initialized(ParallelMode.WEIGHT)` here?
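A minimal sketch of the helper with the suggested fix applied. The diff above is truncated after the first condition, so everything past it is an assumption mirroring `is_tensor_expert_data_parallel_parameter`, not the actual PR code:

```python
def is_weight_expert_data_parallel_parameter(p):
    # Suggested fix: check the WEIGHT parallel mode rather than TENSOR.
    return (
        gpc.is_initialized(ParallelMode.WEIGHT)
        # Remaining conditions are assumed; the diff cuts off here.
        and is_moe_param(p)  # hypothetical helper tagging MoE expert params
    )
```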
Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and easier to get feedback on. If you do not understand some items, don't worry; just open the pull request and ask the maintainers for help.
Motivation
This PR supports weight parallel for MoE. When ISP is used, the world size (ws) decomposes as ws = wps * eps * edps (weight-parallel, expert-parallel, and expert-data-parallel sizes); otherwise, ws = tp * eps * edps. Related to #44
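A minimal sanity check illustrating the world-size decomposition above. All names here are hypothetical, chosen to match the abbreviations in the sentence, and are not taken from this PR:

```python
def check_world_size(ws, use_isp, wps, tp, eps, edps):
    """Verify the world-size factorization described in the Motivation.

    ws: total world size; wps/tp: weight- or tensor-parallel size;
    eps: expert-parallel size; edps: expert-data-parallel size.
    """
    if use_isp:
        assert ws == wps * eps * edps, "under ISP, ws must equal wps * eps * edps"
    else:
        assert ws == tp * eps * edps, "without ISP, ws must equal tp * eps * edps"

# Example: 32 GPUs with ISP enabled, split as 4 x 4 x 2.
check_world_size(32, True, wps=4, tp=1, eps=4, edps=2)
```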
Modification
BC-breaking (Optional)
Does the modification introduce changes that break the backward compatibility of the downstream repositories?
If so, please describe how it breaks the compatibility and how the downstream projects should modify their code to keep compatibility with this PR.
Use cases (Optional)
If this PR introduces a new feature, it is better to list some use cases here and update the documentation.
Checklist
Before PR:
After PR: