# Store an LLM credential
Agents need a language model to answer with. Hiveloom stores provider secrets in an encrypted vault. Secrets never go on the command line as arguments.
## Quick rule
Hiveloom picks the provider from the agent’s model ID:

- Models starting with `claude-` use the credential named `anthropic`.
- Everything else uses the credential named `openai`.
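The prefix rule can be sketched in shell. This is an illustration only, not Hiveloom's actual implementation; the model ID is one of the examples from the table below.

```shell
# Sketch of the prefix rule (illustration, not Hiveloom's code).
MODEL="claude-sonnet-4-20250514"

case "$MODEL" in
  claude-*) CRED="anthropic" ;;  # claude-* models -> anthropic credential
  *)        CRED="openai"    ;;  # everything else -> openai credential
esac

echo "$CRED"   # prints: anthropic
```

Swap in `gpt-4o` for `MODEL` and the same logic yields `openai`.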
For the guided path, store one of these:
| Provider | Credential name | Model ID examples |
|---|---|---|
| Anthropic | anthropic | claude-sonnet-4-20250514 |
| OpenAI | openai | gpt-4o, gpt-4o-mini |
## Set the credential
Three ways to feed the secret, never as a CLI argument:

```shell
# 1. From a file
hiveloom credential set anthropic --from-file ~/anthropic.key

# 2. From an environment variable
hiveloom credential set anthropic --from-env ANTHROPIC_API_KEY

# 3. From stdin (convenient for one-off pipes)
echo "sk-ant-..." | hiveloom credential set anthropic
```

For OpenAI, replace `anthropic` with `openai`.
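One caveat with the stdin route: `echo` appends a trailing newline to the secret, and whether Hiveloom strips it is not specified here, so `printf '%s'` is the safer habit. A quick demonstration with a placeholder key:

```shell
# echo appends a trailing newline to whatever it prints; printf '%s' does not.
N_ECHO=$(echo "sk-ant-example" | wc -c)           # 15: 14 characters + '\n'
N_PRINTF=$(printf '%s' "sk-ant-example" | wc -c)  # 14: the key bytes only
echo "$N_ECHO $N_PRINTF"
```

So the safer stdin form is `printf '%s' "sk-ant-..." | hiveloom credential set anthropic`.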
Want OpenRouter, Groq, Ollama, or another OpenAI-compatible provider?
Hiveloom ships an OpenAPI-spec client that can be pointed at any provider that
implements OpenAI’s Chat Completions schema — same openai credential,
different base URL.
See *Set `HIVELOOM_OPENAI_BASE_URL`*, which includes a 9-row provider matrix and the caveats.
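As a hypothetical example, pointing the `openai` credential at OpenRouter might look like this. The URL is OpenRouter's documented OpenAI-compatible base; whether Hiveloom expects the `/v1` suffix is an assumption, so check the provider matrix.

```shell
# Point the openai credential at an alternative Chat Completions endpoint.
# URL shown is OpenRouter's OpenAI-compatible base (assumed format).
export HIVELOOM_OPENAI_BASE_URL="https://openrouter.ai/api/v1"

# The credential name stays "openai"; only the base URL changes.
echo "$HIVELOOM_OPENAI_BASE_URL"
```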
## Verify
```shell
hiveloom credential list
```

The table shows credential names and metadata but never the values.
## Rotate or remove

```shell
cat ~/new-anthropic.key | hiveloom credential rotate anthropic
hiveloom credential remove anthropic
```

## Troubleshooting
- `credential 'anthropic' not found`: the agent model starts with `claude-`, so Hiveloom is looking for the credential named `anthropic`.
- `credential 'openai' not found`: the agent model does not start with `claude-`, so Hiveloom is looking for `openai`.
- Auth failure at chat time: the key is stored but invalid or revoked. Rotate it.
Next: Create the agent.