
Get ollama.ps to respond similarly to ollama ps #166

Open
remon-nashid opened this issue Nov 21, 2024 · 1 comment
Labels
enhancement New feature or request

Comments


remon-nashid commented Nov 21, 2024

Besides returning the list response, could it also report the GPU/CPU percentages? Figuring out how much of the model is loaded onto the GPU is not as clear-cut as dividing size_vram by the VRAM size.

@BruceMacD BruceMacD added the enhancement New feature or request label Nov 21, 2024
@BruceMacD (Collaborator)

Hi @remon-nashid, thanks for the feature request. For reference, these values aren't actually returned by the Ollama API; they are computed by the CLI/client from the response fields. You can see how the Ollama CLI does it in Go here:
https://github.com/ollama/ollama/blob/723f285813f504375f0e6be6c76edfbaaabd961f/cmd/cmd.go#L670
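For anyone who wants the same output from the client in the meantime, here is a minimal Python sketch. It assumes the split is derived from the `size` and `size_vram` fields of the `ps` response, following the logic in the linked Go code; the function name is hypothetical:

```python
def cpu_gpu_split(size: int, size_vram: int) -> str:
    """Approximate the processor-split string printed by `ollama ps`.

    A sketch based on the linked cmd.go logic (not an official API):
    the CPU share is the portion of the model *not* resident in VRAM,
    i.e. (size - size_vram) / size, rounded to a whole percent.
    """
    if size_vram == 0:
        return "100% CPU"          # nothing offloaded to the GPU
    if size_vram == size:
        return "100% GPU"          # fully offloaded
    if size_vram > size:
        return "Unsupported"       # inconsistent report from the server
    cpu_percent = round((size - size_vram) / size * 100)
    return f"{cpu_percent}%/{100 - cpu_percent}% CPU/GPU"
```

You could apply this to each model entry returned by `ollama.ps()` to get output comparable to the CLI's PROCESSOR column.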
