Safe and pragmatic way to use API key #6

Open
flavour-of-qualia opened this issue Nov 7, 2023 · 3 comments
Labels
enhancement New feature or request

Comments

@flavour-of-qualia

I am concerned about setting my OpenAI API key as a global environment variable. Will other packages be able to use it without my confirmation?

I like how llm manages it with "llm keys set openai". It would be great to have the same here.
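
For reference, the llm flow I mean is roughly this (the key ends up in a plain-text keys.json; the exact location varies by platform):

llm keys set openai
# Enter key: <paste your key here>
llm keys path   # prints where the key file lives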

@kquinsland

Will other packages be able to use it without my confirmation?

They could do this with llm as well, since the key is just stored in a plain-text JSON file on disk. You could wrap llm and ospeak with an alias that also pulls the API key from a password manager if you're that concerned. Here's an example of how this can work with 1Password.
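
For instance, a sketch along these lines (the vault, item, and field names are only placeholders; point them at wherever you actually store the key):

# fetch the key from 1Password at invocation time and expose it only to ospeak
alias ospeak='OPENAI_API_KEY="$(op read "op://Private/OpenAI API Key/credential")" command ospeak'

The same pattern works for llm itself.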

Having said all that...

It would be great to have the same here.

Agreed. Here's where llm implements the keys command: https://github.com/simonw/llm/blob/main/llm/cli.py#L470. I haven't got any spare time to fork and add it, but it shouldn't be too difficult for somebody with a spare hour or two.
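
In the meantime, since llm already stores the key on disk, a quick shell-level workaround could look like this (assuming jq is installed, and that the keys file is a flat JSON object keyed by provider name):

# hand llm's stored openai key to a single ospeak invocation, without exporting it globally
OPENAI_API_KEY="$(jq -r '.openai' "$(llm keys path)")" ospeak "Hello world"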

@simonw
Owner

simonw commented Nov 7, 2023

Yeah I'm not a huge fan of the environment variable thing either.

I'd like to make this double as an LLM plugin - if you install both this and LLM in the same environment you could run llm ospeak - at which point it would make sense for it to optionally use the LLM API key as well.
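
Hypothetically, usage could then look something like this (the plugin packaging doesn't exist yet; only the llm ospeak command shape is what's being proposed here):

llm install ospeak          # hypothetical: ospeak isn't packaged as an llm plugin yet
llm ospeak "Hello there"    # would reuse the key from `llm keys set openai`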

This tool uses a different, incompatible version of the OpenAI Python library though, so I'd need to fix that first.

@simonw simonw added the enhancement New feature or request label Nov 7, 2023
@mislav

mislav commented Nov 7, 2023

With something like 1password-cli, or basically anything that allows fetching a credential from the command line, there is a simple workaround for interactive shells: making a function wrapper.

ospeak() {
  local token="${OPENAI_API_KEY:-}"
  if [[ -z $token && $1 != "--help" && $1 != "--version" ]]; then 
    token="$(op read 'op://Personal/OpenAI API Key/api key')" || return 1
  fi
  OPENAI_API_KEY="$token" command ospeak "$@"
}

Explanation: the op command (1password-cli) reads the 'api key' field from the item named "OpenAI API Key" in my Personal vault.

With this approach, OPENAI_API_KEY is only ever 1) fetched on-demand, and 2) exported to the ospeak command and nothing else. This is essentially the same mechanism that enables 1Password shell plugins, with the small difference that their plugins work by defining shell aliases instead of functions. 1Password shell plugins, however, are only available for a predefined set of tools that specific plugins exist for, so they are useless for a small, newer utility like ospeak.

The downside, of course, is that every ospeak invocation will now have a small overhead. On my machine, the op command takes about 0.75s to fetch an API key from a vault.
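
One way to soften that (a sketch only, with the same vault/item names as above) is to cache the key in a non-exported shell variable, so only the first invocation in a session pays the lookup cost:

ospeak() {
  if [[ -z ${_OSPEAK_KEY:-} ]]; then
    # first call in this shell session: fetch the key from 1Password and remember it
    _OSPEAK_KEY="$(op read 'op://Personal/OpenAI API Key/api key')" || return 1
  fi
  OPENAI_API_KEY="$_OSPEAK_KEY" command ospeak "$@"
}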

With macOS Keychain instead of 1Password, for example, the command to fetch the credential could be as follows:

security find-generic-password -s api.openai.com -w
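
That assumes the key has already been stored in the Keychain, e.g. once with something like:

# run once; putting -w last makes `security` prompt for the secret
# instead of taking it on the command line (so it stays out of shell history)
security add-generic-password -a "$USER" -s api.openai.com -w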
