Issues: meta-llama/llama-models
#229  Inconsistent Tool Calling Behavior with LLaMA 3.1 70B Model on AWS Bedrock (opened Nov 29, 2024 by nileshmalode11)
#212  NotADirectoryError: [WinError 267] The directory name is invalid: (opened Nov 7, 2024 by taras-kamo)
#206  Proper Loading and Usage of meta-llama/Llama-3.2-1B-Instruct-SpinQuant_INT4_EO8 Model (opened Nov 4, 2024 by l-bat)
#200  Unable to determine the device handle for GPU0000:17:00.0: Unknown Error (opened Oct 27, 2024 by Fujiaoji)
#194  OSError with Llama3.2-3B-Instruct-QLORA_INT4_EO8 - missing files? (opened Oct 25, 2024 by StephenQuirolgico)
#190  Llama verification link 403 forbidden right after receiving the email providing me access (opened Oct 24, 2024 by RobKy1969)
#179  Model used as RAG generates questions with answer instead of just answer to user's query (opened Oct 17, 2024 by myke11j)
#176  Non-deterministic response generated for same set of config (opened Oct 15, 2024 by aabbhishekksr)
#169  Question about Llama3.2 tool call prompt/training (pythonic output) (opened Oct 4, 2024 by gregnr)