lovelace setup
Configure AI providers with an interactive setup wizard
The setup command provides a guided, interactive wizard for configuring LLM (Large Language Model) providers such as Anthropic Claude, OpenAI GPT, and Google Gemini, as well as local providers like Ollama. It automatically detects installed providers, tests connections, and stores credentials securely.
Overview
The setup wizard helps you:
- Detect providers automatically - Finds installed LLM providers on your system
- Configure API keys - Securely stores credentials for cloud providers
- Test connections - Validates provider configurations before saving
- Set defaults - Choose your preferred provider and model
- Get started on first run - Launches automatically the first time you use the CLI
The wizard runs in an interactive terminal UI with step-by-step guidance, making provider configuration straightforward even for first-time users.
Usage
lovelace setup [options]
Options
| Option | Description | Default |
|---|---|---|
| --skip-detection | Skip automatic provider detection | false |
| --force | Force re-setup even if providers are configured | false |
| --status | Show provider status without making changes | false |
Interactive Examples
First-Time Setup
When you run Lovelace CLI for the first time, the setup wizard launches automatically:
$ lovelace chat
┌─ Welcome to Lovelace CLI ──────────────────────────────┐
│ │
│ 👋 Welcome! Let's get you set up with AI providers │
│ │
│ This wizard will: │
│ • Detect installed LLM providers │
│ • Configure your API keys │
│ • Test connections │
│ • Set your default provider │
│ │
└─────────────────────────────────────────────────────────┘
🔍 Detecting providers...
Found 3 providers:
✓ Anthropic Claude (API key detected)
✓ OpenAI GPT (API key detected)
✓ Ollama (local, no credentials needed)
Press Enter to continue...
Manual Setup
Run setup manually to add or update providers:
$ lovelace setup
┌─ Provider Configuration ───────────────────────────────┐
│ │
│ Current providers: │
│ • Anthropic Claude (configured) │
│ • OpenAI GPT (not configured) │
│ • Google Gemini (not configured) │
│ • Ollama (local, available) │
│ │
│ What would you like to do? │
│ │
│ › Configure new provider │
│ Update existing provider │
│ Set default provider │
│ Test connections │
│ Exit │
│ │
└─────────────────────────────────────────────────────────┘
Use ↑↓ arrows to navigate, Enter to select
Configuring a Provider
Step-by-step provider configuration:
┌─ Configure Provider ───────────────────────────────────┐
│ │
│ Select provider to configure: │
│ │
│ › Anthropic Claude │
│ OpenAI GPT │
│ Google Gemini │
│ Local Ollama │
│ │
└─────────────────────────────────────────────────────────┘
[Selected: Anthropic Claude]
┌─ Anthropic Configuration ──────────────────────────────┐
│ │
│ API Key: sk-ant-api03-**************************** │
│ │
│ Where to find your API key: │
│ → Visit https://console.anthropic.com/account/keys │
│ │
│ Enter your Anthropic API key: │
│ _ │
│ │
└─────────────────────────────────────────────────────────┘
[After entering API key]
✓ API key saved
🧪 Testing connection...
┌─ Connection Test ──────────────────────────────────────┐
│ │
│ Testing Anthropic Claude... │
│ │
│ ✓ Authentication successful │
│ ✓ Available models: │
│ • claude-sonnet-4-5 │
│ • claude-3-opus-20240229 │
│ • claude-3-sonnet-20240229 │
│ • claude-3-haiku-20240307 │
│ │
│ Connection test passed! ✓ │
│ │
└─────────────────────────────────────────────────────────┘
Set as default provider? (Y/n): y
✓ Anthropic Claude configured and set as default
Configuring Local Provider (Ollama)
Local providers don't require API keys:
┌─ Configure Provider ───────────────────────────────────┐
│ │
│ Selected: Ollama (Local) │
│ │
│ Ollama runs locally and doesn't require an API key. │
│ │
│ Checking Ollama installation... │
│ │
└─────────────────────────────────────────────────────────┘
✓ Ollama detected at http://localhost:11434
🧪 Testing connection...
┌─ Connection Test ──────────────────────────────────────┐
│ │
│ Testing Ollama... │
│ │
│ ✓ Server responding │
│ ✓ Available models: │
│ • llama2 │
│ • codellama │
│ • mistral │
│ │
│ Connection test passed! ✓ │
│ │
└─────────────────────────────────────────────────────────┘
Set as default provider? (Y/n): y
✓ Ollama configured and set as default
Checking Status
View provider status without making changes:
$ lovelace setup --status
Provider Configuration Status
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Configured Providers:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✓ Anthropic Claude (default)
Status: Connected
Models: 4 available
API Key: sk-ant-api03-****....**** (configured)
✓ Ollama (local)
Status: Connected
Models: 3 available
Endpoint: http://localhost:11434
Not Configured:
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✗ OpenAI GPT
Status: No API key configured
Setup: lovelace setup
✗ Google Gemini
Status: No API key configured
Setup: lovelace setup
Default Provider: Anthropic Claude (claude-sonnet-4-5)
Force Re-setup
Re-run setup even when providers are already configured:
$ lovelace setup --force
⚠️ Warning: Existing Configuration Detected
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
You already have providers configured:
• Anthropic Claude (default)
• Ollama (local)
Re-running setup will allow you to:
• Update API keys
• Add new providers
• Change default provider
Continue with setup? (Y/n): y
[Setup wizard proceeds...]
Skip Provider Detection
Skip automatic detection and configure manually:
$ lovelace setup --skip-detection
┌─ Manual Provider Configuration ────────────────────────┐
│ │
│ Provider detection skipped. │
│ │
│ Select provider to configure: │
│ │
│ › Anthropic Claude │
│ OpenAI GPT │
│ Google Gemini │
│ Ollama │
│ │
└─────────────────────────────────────────────────────────┘
Common Use Cases
Initial CLI Setup
First time using Lovelace CLI:
$ lovelace chat
[Setup wizard launches automatically]
✓ Provider detection complete
✓ Anthropic Claude configured
✓ Default provider set
Ready to use Lovelace CLI!
Adding a New Provider
Add provider to existing configuration:
$ lovelace setup
[Select "Configure new provider"]
[Choose OpenAI GPT]
[Enter API key]
✓ OpenAI GPT configured
✓ 2 providers now available
Switching Default Provider
Change your preferred provider:
$ lovelace setup
[Select "Set default provider"]
Current default: Anthropic Claude
Available providers:
› Anthropic Claude (current default)
OpenAI GPT
Ollama
[Select OpenAI GPT]
✓ Default provider changed to OpenAI GPT
Testing Provider Connections
Verify all providers are working:
$ lovelace setup
[Select "Test connections"]
Testing providers...
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✓ Anthropic Claude
Response time: 245ms
Models: 4 available
✓ OpenAI GPT
Response time: 312ms
Models: 5 available
✗ Google Gemini
Error: API key invalid
Fix: Re-enter API key
✓ Ollama
Response time: 89ms
Models: 3 available
Results: 3 passed, 1 failed
Updating API Keys
Update credentials for existing provider:
$ lovelace setup
[Select "Update existing provider"]
[Choose Anthropic Claude]
Current API key: sk-ant-api03-****....**** (ends Dec 2025)
Enter new API key: _
[Enter new key]
✓ API key updated
🧪 Testing connection...
✓ Connection successful
Provider Support
Cloud Providers
| Provider | API Key Required | Models Available | Setup Guide |
|---|---|---|---|
| Anthropic Claude | Yes | claude-sonnet-4-5, opus, sonnet, haiku | Get API key |
| OpenAI GPT | Yes | gpt-4, gpt-4-turbo, gpt-3.5-turbo | Get API key |
| Google Gemini | Yes | gemini-pro, gemini-ultra | Get API key |
Local Providers
| Provider | Installation | Models Available | Setup Guide |
|---|---|---|---|
| Ollama | Local server | llama2, codellama, mistral, etc. | Install Ollama |
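If you plan to use Ollama, you can check which models are already installed before running setup. A quick sketch using the standard Ollama CLI (model names will vary by machine):
# List models already pulled locally
ollama list
# Pull one of the models referenced above if none are installed yet
ollama pull llama2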
Configuration Storage
Provider configurations are stored securely:
~/.lovelace/
├── config.json # Provider settings and defaults
└── credentials/ # Encrypted API keys
├── anthropic.key
├── openai.key
└── google.key
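For illustration, config.json holds the default provider and per-provider settings, while API keys live under credentials/. The sketch below is an assumption about the layout, not a documented schema; the values mirror the examples on this page:
# ~/.lovelace/config.json (illustrative sketch - field names are assumptions)
{
  "default_provider": "anthropic",
  "default_model": "claude-sonnet-4-5",
  "providers": {
    "anthropic": { "configured": true },
    "ollama": { "endpoint": "http://localhost:11434" }
  }
}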
Security:
- API keys are encrypted at rest
- Credentials never logged or transmitted
- File permissions restricted to user only
- Keys stored separately from config
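If you want to verify the permissions yourself, a quick check with standard Unix commands (the exact mode bits are an assumption based on the "user only" guarantee above):
# Expect drwx------ (700) on both directories
ls -ld ~/.lovelace ~/.lovelace/credentials
# Tighten if anything looks more permissive
chmod 700 ~/.lovelace ~/.lovelace/credentials
chmod 600 ~/.lovelace/credentials/*.key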
Environment Variables
| Variable | Description | Default |
|---|---|---|
| ANTHROPIC_API_KEY | Anthropic API key | - |
| OPENAI_API_KEY | OpenAI API key | - |
| GOOGLE_API_KEY | Google AI API key | - |
| OLLAMA_HOST | Ollama server endpoint | http://localhost:11434 |
Note: Environment variables take precedence over stored credentials.
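For example, an exported key overrides the stored credential for the current shell session. A sketch using commands shown elsewhere on this page (remote-host is a placeholder):
# This key wins over the stored Anthropic credential for this session
export ANTHROPIC_API_KEY="sk-ant-api03-..."
lovelace chat --provider anthropic
# Point a single invocation at a non-default Ollama server (placeholder host)
OLLAMA_HOST=http://remote-host:11434 lovelace chat --provider ollama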
Exit Codes
| Code | Meaning |
|---|---|
| 0 | Setup completed successfully |
| 1 | Setup failed or was cancelled |
| 2 | Interactive mode not supported (CI/non-TTY environment) |
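These codes make the command scriptable. A minimal shell sketch that reacts to each documented code, falling back to the status-only mode recommended for non-TTY environments:
# React to the documented exit codes (e.g., in a CI step)
lovelace setup
case $? in
  0) echo "Setup completed" ;;
  1) echo "Setup failed or was cancelled" ;;
  2) echo "No TTY - falling back to a status check"; lovelace setup --status ;;
esac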
Tips & Best Practices
Keeping API Keys Secure
Store in environment variables:
# Add to ~/.bashrc or ~/.zshrc
export ANTHROPIC_API_KEY="sk-ant-api03-..."
export OPENAI_API_KEY="sk-..."
# Then run setup without manual entry
lovelace setup
Rotate keys regularly:
# Update keys periodically
lovelace setup
# Select "Update existing provider"
Using Multiple Providers
Configure all providers you have access to:
# Configure all three
lovelace setup
# Set up Anthropic, OpenAI, Gemini
# Switch between them easily
lovelace chat --provider openai
lovelace chat --provider anthropic
Local Development with Ollama
Run entirely offline using Ollama:
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Start Ollama
ollama serve
# Configure in Lovelace
lovelace setup
# Select Ollama, set as default
# No API costs, full privacy
lovelace chat --provider ollama
Verifying Setup
Check that your configuration is working:
# 1. Check status
lovelace setup --status
# 2. Test a simple chat
lovelace chat
# 3. Verify provider switching
lovelace chat --provider openai
lovelace chat --provider anthropic
Troubleshooting
Setup Wizard Won't Launch
$ lovelace setup
Error: Interactive mode not supported in this terminal.
Solution: You're in a non-interactive environment (CI/CD, piped output)
Use status-only mode: lovelace setup --status
API Key Invalid
✗ Connection test failed
Error: Invalid API key
Solution:
1. Verify API key is correct (check for extra spaces)
2. Ensure key has proper permissions
3. Check if key has expired
4. Re-run setup: lovelace setup --force
Provider Not Detected
🔍 Detecting providers...
⚠️ Warning: No providers detected
Solution:
1. Install provider (e.g., ollama)
2. Set environment variables (e.g., ANTHROPIC_API_KEY)
3. Or configure manually: lovelace setup --skip-detection
Ollama Connection Failed
✗ Ollama connection failed
Error: Could not connect to http://localhost:11434
Solution:
1. Check if Ollama is running: ollama serve
2. Verify the endpoint: echo $OLLAMA_HOST
3. Check firewall settings
4. Try explicit host: OLLAMA_HOST=http://localhost:11434 lovelace setup
Cannot Update Credentials
Error: Failed to save credentials
Permission denied
Solution:
1. Check ~/.lovelace/ permissions: ls -la ~/.lovelace/
2. Fix permissions: chmod 700 ~/.lovelace/
3. Ensure disk space is available: df -h
Related Commands
- lovelace config - View and manage configuration
- lovelace doctor - Diagnose setup issues
- lovelace chat - Start using configured providers
Related Guides
- Getting Started Guide - First-time setup walkthrough
- Provider Configuration - Detailed provider setup
- Troubleshooting Guide - Common setup issues
See the Command Reference for all available commands.