Issues: UKPLab/sentence-transformers
Fix JSON Serialization Error in TrainerState due to np.float32
#3250 opened Feb 27, 2025 by belo-involead
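The serialization failure in the issue above is easy to reproduce: the stdlib JSON encoder rejects NumPy scalar types such as `np.float32`. The sketch below is a minimal, hypothetical illustration of that failure mode and a common workaround (casting to native Python floats), not the project's actual fix.

```python
import json

import numpy as np

# A state dict holding a NumPy scalar, as a trainer state might.
state = {"best_metric": np.float32(0.5)}

# np.float32 is not a Python float, so json.dumps raises TypeError.
try:
    json.dumps(state)
except TypeError as e:
    print(f"Serialization failed: {e}")

# A common fix: cast NumPy floating scalars to native Python floats first.
safe_state = {
    k: float(v) if isinstance(v, np.floating) else v for k, v in state.items()
}
print(json.dumps(safe_state))  # → {"best_metric": 0.5}
```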
Loading a PEFT model uses peft.peft_model.PeftModelForFeatureExtraction; incompatible with ST PEFT Methods
bug
#3247 opened Feb 26, 2025 by tomaarsen
torchrun leads to 0.0 loss/grad for various losses
bug
#3242 opened Feb 21, 2025 by ccdv-ai
Using "optuna" for HPO, but grad_norm becomes 0 after the first trial
bug
#3240 opened Feb 19, 2025 by chz816
Increase in model training time after upgrading sentence-transformers to 3.4.1
#3237 opened Feb 17, 2025 by gnatesan
[New Feature] Extending SoftmaxLoss.py to accept class probabilities in the target
#3228 opened Feb 11, 2025 by i-plusplus
Update the transformers library to the latest version due to a security vulnerability
#3215 opened Feb 4, 2025 by bannarisoftwares
HPO torch.OutOfMemoryError despite large GPU VRAM (model seems to be loaded to the GPU 3 times)
#3214 opened Feb 4, 2025 by maayansharon10
Issues loading model to SentenceTransformer without passing HF token
#3212 opened Feb 4, 2025 by nathan-az
CLS token mode returns PAD tokens instead of CLS tokens when padding_side is left
#3208 opened Feb 1, 2025 by FremyCompany
OSError from bad request between sentence-transformers and huggingface_hub
#3206 opened Jan 31, 2025 by Gwenn-LR
A large number of all-zero tensors appear in the encoded tensors
#3195 opened Jan 25, 2025 by rangehow