

| Provider | Supported Models |
| --- | --- |
| OpenAI | View models |
| OpenRouter | View models |

Configuration

Set the required environment variable for your chosen provider:
```bash
# OpenAI
export OPENAI_API_KEY=your_key

# OpenRouter
export OPENROUTER_API_KEY=your_key
```
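A quick way to confirm your setup is to check which of the two keys is exported before running anything. This is a minimal sketch using a generic shell pattern; the variable names come from this page, but the `detect_provider` helper itself is illustrative, not part of the tool.

```shell
# Report which provider key is currently set in the environment.
# OPENAI_API_KEY / OPENROUTER_API_KEY are the names documented above;
# this function is a hypothetical convenience, not a built-in command.
detect_provider() {
  if [ -n "${OPENAI_API_KEY:-}" ]; then
    echo "openai"
  elif [ -n "${OPENROUTER_API_KEY:-}" ]; then
    echo "openrouter"
  else
    echo "none"
  fi
}
```

Call `detect_provider` in your shell to see `openai`, `openrouter`, or `none`; the `${VAR:-}` expansion keeps the check safe even under `set -u`.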

Next Steps

- **Quickstart**: Learn how to evaluate different LLMs across test cases.
- **Configure your agent**: Configure the LLM for your agent.
- **CLI**: Run tests for any LLM from the command line.