Fix: Linear8bitLt crash with tied weights (skip_modules) #1865
Open
TimDettmers wants to merge 3 commits into main from
Conversation
When `lm_head` is converted to `Linear8bitLt` but its weight is tied to an embedding layer, weight tying replaces the `Int8Params` with a regular `nn.Parameter` that lacks the `SCB`/`CB` attributes, causing an `AttributeError`.

Add an `isinstance` check in `Linear8bitLt.forward()` to fall back to `F.linear` when the weight is not `Int8Params` (e.g. due to weight tying). Also make `_save_to_state_dict` and `_load_from_state_dict` robust to non-`Int8Params` weights via `getattr` defaults.

Fixes #1634

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
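The guard described above can be illustrated with a minimal, self-contained sketch. The classes below are hypothetical stand-ins for `torch.nn.Parameter` and bitsandbytes' `Int8Params`, and the string returns stand in for the real matmul paths; this shows the dispatch pattern only, not the actual implementation:

```python
# Simplified stand-ins for nn.Parameter and bitsandbytes' Int8Params
# (hypothetical classes, for illustration only).
class Parameter:
    def __init__(self, data):
        self.data = data

class Int8Params(Parameter):
    def __init__(self, data, SCB=None, CB=None):
        super().__init__(data)
        self.SCB = SCB  # per-row quantization statistics
        self.CB = CB    # quantized int8 weight buffer

class Linear8bitLt:
    def __init__(self, weight):
        self.weight = weight

    def forward(self, x):
        # Weight tying can silently replace Int8Params with a plain
        # Parameter; guard before touching SCB/CB to avoid AttributeError.
        if not isinstance(self.weight, Int8Params):
            return f"F.linear({x}, {self.weight.data})"  # fp fallback path
        return f"int8_matmul({x}, CB={self.weight.CB}, SCB={self.weight.SCB})"

tied = Linear8bitLt(Parameter("W_emb"))  # weight tied to the embedding
print(tied.forward("x"))                 # falls back to F.linear, no crash
quant = Linear8bitLt(Int8Params("W", SCB="s", CB="q"))
print(quant.forward("x"))                # takes the int8 path
```

Without the `isinstance` guard, the tied-weight case would hit `self.weight.CB` on a plain `Parameter` and raise the `AttributeError` described in #1634.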
Summary
- Fixes `AttributeError: 'Parameter' object has no attribute 'SCB'` when using `llm_int8_skip_modules` without `lm_head` on models with tied weights (e.g. OPT)
- Adds an `isinstance(self.weight, Int8Params)` guard in `Linear8bitLt.forward()` to fall back to `F.linear` when the weight is a plain `Parameter` (e.g. due to weight tying with the embedding layer)
- Makes `_save_to_state_dict` and `_load_from_state_dict` robust to non-`Int8Params` weights via `getattr` defaults

Test plan

- `test_linear8bitlt_tied_weights_no_crash` passes on CPU and CUDA
- Existing `test_linear8bitlt.py` tests remain unaffected

Fixes #1634
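The state-dict robustness mentioned in the summary relies on `getattr` defaults so serialization tolerates a weight without quantization attributes. A hypothetical simplified sketch (not the actual bitsandbytes methods):

```python
# Sketch of serialization that tolerates non-Int8Params weights via
# getattr defaults (hypothetical simplified example).
class Weight:
    """Bare stand-in for a module weight; may or may not carry SCB."""
    pass

def save_scb(weight, destination, prefix=""):
    # A plain Parameter has no SCB attribute; getattr with a default
    # avoids AttributeError and simply skips the quantization stats.
    scb = getattr(weight, "SCB", None)
    if scb is not None:
        destination[prefix + "SCB"] = scb
    return destination

plain = Weight()                      # tied weight: no SCB attribute
quant = Weight()
quant.SCB = [1.0]                     # quantized weight with stats

print(save_scb(plain, {}))            # stats skipped, no crash
print(save_scb(quant, {}))            # stats serialized
```

The same `getattr(..., default)` pattern applies symmetrically on load: missing keys are tolerated instead of assumed present.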
🤖 Generated with Claude Code