transformers/src/transformers/models/llama4/modeling_llama4.py, line 1096 (commit d1b9236):
I'm pretty sure this should be:
self.fc2 = nn.Linear(config.projector_input_dim, config.projector_output_dim, bias=False)
Nothing to reproduce; I just noticed a shape mistake, but I haven't had a chance to run it personally yet!
The output of fc1 should be the input of fc2, unless there is some hidden logic that I am missing.
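To illustrate the dimension-chaining argument, here is a minimal sketch with made-up sizes (the dims below, and the assumption that fc1 maps some intermediate size to projector_input_dim, are placeholders, not the real config values). Since the released config sets the two projector dims equal, the mismatch only surfaces once they differ:

```python
import torch
from torch import nn

# Made-up sizes for illustration only; deliberately chosen so that
# projector_input_dim != projector_output_dim to expose the mismatch.
intermediate_size = 4
projector_input_dim = 8
projector_output_dim = 16

# Assumed shape for fc1: it produces projector_input_dim features.
fc1 = nn.Linear(intermediate_size, projector_input_dim, bias=False)

# Suggested fix: fc2's in_features matches fc1's out_features.
fc2 = nn.Linear(projector_input_dim, projector_output_dim, bias=False)

x = torch.randn(2, intermediate_size)
h = fc1(x)    # shape (2, projector_input_dim)
out = fc2(h)  # shape (2, projector_output_dim)

# Any fc2 whose in_features differs from fc1's out_features fails as
# soon as the two projector dims are not equal in the config.
fc2_mismatched = nn.Linear(projector_output_dim, projector_output_dim, bias=False)
try:
    fc2_mismatched(h)
    chained_ok = True
except RuntimeError:
    chained_ok = False  # shape mismatch: (2, 8) @ (16, 16)
```

With equal dims the mismatched layer happens to run, which is why the current code does not crash with the shipped config.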
I now see that the input and output dims are the same in the config, but this could still be a bug if someone changes that!
cc @ArthurZucker but this looks like a bug, yes!