
fix unused parameters error for distributed training with multihead#1311

Open
bernstei wants to merge 1 commit into main from unused_parameters_readout_bias

Conversation

@bernstei
Collaborator

When copying readouts from foundation models, skip tensors of size 0 to avoid an unused-parameters error. Checking only for `is not None` led to unused-parameter errors related to bias tensors.

closes #1304
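The fix described above can be sketched as follows. This is a minimal illustration, not the actual MACE code: `should_copy_readout` and `copy_readouts` are hypothetical names, and the tensor-like objects only need a `numel()` method.

```python
def should_copy_readout(param) -> bool:
    """Return True only for parameters worth copying from a foundation
    model's readout head.

    Checking ``param is not None`` alone is not enough: a size-0 bias
    tensor passes that test, gets copied anyway, and is then flagged by
    DistributedDataParallel as an unused parameter during training.
    """
    return param is not None and param.numel() > 0


def copy_readouts(src_params, dst_params):
    # src_params / dst_params: name -> tensor-like mappings (hypothetical
    # interface). Copy only non-empty tensors whose names match.
    for name, src in src_params.items():
        if name in dst_params and should_copy_readout(src):
            dst_params[name] = src
```

With this predicate, an empty bias placeholder in the foundation model is left out of the fine-tuning head entirely, so DDP never sees a parameter that receives no gradient.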



Development

Successfully merging this pull request may close these issues.

multihead fine-tuning + distributed training failing with an error about unused parameters
