Make fine-tuning code run properly in non-distributed mode. #1478

Closed
JieShenAI wants to merge 1 commit into FlagOpen:master from JieShenAI:master

Conversation

@JieShenAI

To avoid the error `ValueError: Default process group has not been initialized`, ensure the code can run correctly in non-distributed mode (e.g., during debugging or single-machine training).

… single-machine training) to avoid the error: `ValueError: Default process group has not been initialized`.
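The fix the PR describes is the standard guard around `torch.distributed` calls. A minimal sketch of that pattern (illustrative only, not the actual patch; the helper names here are hypothetical):

```python
import torch.distributed as dist

def get_rank() -> int:
    """Return the distributed rank, or 0 when no process group exists."""
    # Guard against "Default process group has not been initialized":
    # only query torch.distributed after confirming it is set up.
    if dist.is_available() and dist.is_initialized():
        return dist.get_rank()
    return 0

def get_world_size() -> int:
    """Return the world size, or 1 in single-process (debug) runs."""
    if dist.is_available() and dist.is_initialized():
        return dist.get_world_size()
    return 1
```

With guards like these, the same training script works both under `torchrun` and as a plain `python train.py` debugging session.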
@hanhainebula
Collaborator

Hello, @JieShenAI. Thanks for your suggestion and PR! I have solved this issue in this PR: #1532
