Issues: huggingface/transformers
Improve auxiliary_in_channels default behavior in UperNet [Feature request] #37345, opened Apr 7, 2025 by simonreise
Llama4TextExperts module implementation [bug, Usage] #37325, opened Apr 6, 2025 by Godofnothing
Llama 4: eager attention results in wrong causal mask shape [bug] #37322, opened Apr 6, 2025 by Qubitium (4 tasks)
Llama 4 Scout: any chance it could ever run in the browser? [Feature request] #37316, opened Apr 6, 2025 by hpssjellis
OSError: meta-llama/Llama-4-Scout-17B-16E-Instruct does not appear to have a file named X [bug] #37314, opened Apr 6, 2025 by sam-h-bean (4 tasks)
AutoModel.from_pretrained without accelerate raises a NameError because init_empty_weights is not available #37311, opened Apr 5, 2025 by LoicGrobol
push_to_hub() for Llama 3.1 8B doesn't save the lm_head.weight tensor [bug] #37303, opened Apr 5, 2025 by wizeng23 (2 of 4 tasks)
ValueError: Unrecognized model in lmsys/vicuna-7b-v1.5. Should have a model_type key [bug] #37302, opened Apr 5, 2025 by ZhanliangAaronWang (4 tasks)
[torch-xla 2.7] Change xm.xrt_world_size() to xr.world_size() and xm.get_ordinal() to xr.global_ordinal() [bug] #37301, opened Apr 5, 2025 by jeffhataws (1 of 4 tasks)
flex_attention support for Qwen2.5/Gemma is broken [bug] #37299, opened Apr 5, 2025 by flukeskywalker (2 of 4 tasks)
https://huggingface.co/hf-internal-testing tiny random models need to be converted to safetensors #37296, opened Apr 4, 2025 by sfc-gh-sbekman
[i18n-<Transformers>] Translating docs to <Tibetan> [WIP] #37293, opened Apr 4, 2025 by OSUer600 (10 tasks)
Add support for newer JAX and Flax versions [Feature request, Flax] #37262, opened Apr 3, 2025 by rxng8
Incorrect word timestamps and word repetitions with the Whisper-Large-v3-turbo model [bug] #37248, opened Apr 3, 2025 by Asma-droid (1 of 4 tasks)
Inconsistent results between torch and jax versions of DINOv2 [bug, Flax] #37246, opened Apr 3, 2025 by MasterXiong (2 of 4 tasks)