# Configuration
PromptOps is configured via a `promptops.config.ts` file in your project root. This file controls models, datasets, optimization settings, and deployment targets.
## Config File
After running `promptops init`, you'll have this file:
```ts
import { defineConfig } from "@promptops/sdk";

export default defineConfig({
  // Project settings
  project: "my-ai-app",

  // Default model for new prompts
  defaultModel: "claude-sonnet",

  // Available models for benchmarking
  models: [
    "claude-sonnet",
    "gpt-4o",
    "gemini-pro",
    "llama-3.1-70b",
  ],

  // Optimization defaults
  optimization: {
    iterations: 50,
    objective: "accuracy",
    earlyStop: true,
  },

  // Deployment settings
  deploy: {
    strategy: "canary",
    monitoring: {
      latency: { max: "2s" },
      accuracy: { min: 0.85 },
      cost: { max: "$0.01/req" },
    },
    rollback: "automatic",
  },
});
```

## Models
The `models` array defines which LLMs are available for benchmarking. You can use any model supported by your API keys.
| Model | Provider |
|---|---|
| `claude-sonnet` | Anthropic |
| `gpt-4o` | OpenAI |
| `gemini-pro` | Google |
| `llama-3.1-70b` | Meta (via API) |
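Because `promptops.config.ts` is ordinary TypeScript, the `models` array can be computed at load time rather than hard-coded. The sketch below — an assumption about usage, not a documented PromptOps convention — trims the benchmark matrix to a single model when running under CI, using the `CI` environment variable most CI providers set:

```typescript
import { defineConfig } from "@promptops/sdk";

// Full matrix for local runs; a single cheap model in CI. The CI
// environment variable is set by most CI providers; adjust for yours.
const fullMatrix = ["claude-sonnet", "gpt-4o", "gemini-pro", "llama-3.1-70b"];
const ciMatrix = ["claude-sonnet"];

export default defineConfig({
  project: "my-ai-app",
  defaultModel: "claude-sonnet",
  models: process.env.CI ? ciMatrix : fullMatrix,
});
```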
## Environment Variables
API keys are stored as environment variables, never in config files:
```bash
# .env.local
PROMPTOPS_API_KEY=po_sk_...
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...
GOOGLE_AI_API_KEY=AI...
```
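A missing key typically only surfaces when a benchmark first calls that provider. A small startup check can fail fast instead. The helper below is a hypothetical sketch, not part of the PromptOps SDK — you would run it against `process.env` before kicking off a benchmark (only the keys for models you actually benchmark need to be present):

```typescript
// Hypothetical startup check (not a PromptOps API): report which of the
// required provider keys are absent or empty in the given environment.
function missingKeys(
  env: Record<string, string | undefined>,
  required: string[]
): string[] {
  return required.filter((key) => !env[key]);
}

// Example with an environment where one key was never set.
const exampleEnv = {
  PROMPTOPS_API_KEY: "po_sk_test",
  ANTHROPIC_API_KEY: "sk-ant-test",
  OPENAI_API_KEY: "sk-test",
};

const missing = missingKeys(exampleEnv, [
  "PROMPTOPS_API_KEY",
  "ANTHROPIC_API_KEY",
  "OPENAI_API_KEY",
  "GOOGLE_AI_API_KEY",
]);
// Only GOOGLE_AI_API_KEY is unset above, so missing === ["GOOGLE_AI_API_KEY"].
```

In a real project you would pass `process.env` instead of `exampleEnv` and throw (or log and exit) when the returned array is non-empty.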