ertius.org

let someone else worry about yer secrets

I've been playing around with llm the last few days, and it's quite neat. It can use both local models and remote models, and for remote models like OpenAI, it takes an auth key. I was also using some other things that needed the key from the environment, but I hate having to put secrets in shell config, so I wanted to just get the key from llm's config.

Had a look, and it's just JSON: keys.json is a file that looks like this:

{
  "openai": "sk-1234567890"
}

jq is a very handy tool for extracting things from JSON structures via a path (yq is an equivalent for YAML). In this case, the path to extract is just .openai. So, we can do this:

export OPENAI_API_KEY="$(jq -r .openai ~/"Library/Application Support/io.datasette.llm/keys.json")"

(-r means "raw" - i.e. don't quote the output. Without it, jq prints the value as a JSON string, quotes included. It took me a while to realise that the errors from other tools about invalid key "sk-1234567890" were complaining about those surrounding "", which came from jq.)
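To see the difference -r makes, here's a quick sketch using an inline sample of the same structure (the key is made up):

```shell
sample='{"openai": "sk-1234567890"}'

# Without -r, jq prints the value as a JSON string, quotes and all
printf '%s' "$sample" | jq .openai     # prints "sk-1234567890"

# With -r, you get the raw string, which is what API clients expect
printf '%s' "$sample" | jq -r .openai  # prints sk-1234567890
```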