## Complete schema

`sundew.yaml`
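The field tables below can be assembled into a complete `sundew.yaml`. A sketch using each documented default; the exact nesting is an assumption based on the section names:

```yaml
# Sketch of a full sundew.yaml built from the documented defaults.
traps:
  mcp_server: true       # enable the MCP server trap
  rest_api: true         # enable the REST API trap
  ai_discovery: true     # enable AI discovery endpoints

persona: auto            # "auto" or a path to a custom persona YAML file

llm:
  provider: ollama       # ollama, anthropic, openai, bedrock, none
  model: llama3
  base_url: null         # custom API endpoint
  api_key: null          # prefer environment variables
  region: null           # AWS region for the bedrock provider
  temperature: 0.7
  max_tokens: 2048

server:
  host: 0.0.0.0          # listen address
  port: 8080             # listen port

storage:
  database: ./data/sundew.db    # SQLite database file path
  log_file: ./data/events.jsonl # JSONL event log file path

logging:
  level: info            # debug, info, warning, error
  output: stdout         # stdout or a file path
```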
## Field details

### traps

| Field | Type | Default | Description |
|---|---|---|---|
| mcp_server | bool | true | Enable the MCP server trap |
| rest_api | bool | true | Enable the REST API trap |
| ai_discovery | bool | true | Enable AI discovery endpoints |
### persona

| Value | Behavior |
|---|---|
| auto | Generate a random persona using the configured LLM or fallback packs |
| File path | Load a custom persona from the specified YAML file |
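For the file-path form, the value is simply a path to the persona file. A minimal sketch; the path is a hypothetical example:

```yaml
# Load a custom persona instead of generating one.
# The path below is a hypothetical example location.
persona: ./personas/custom-support-bot.yaml
```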
### llm

| Field | Type | Default | Description |
|---|---|---|---|
| provider | string | ollama | LLM provider: ollama, anthropic, openai, bedrock, none |
| model | string | llama3 | Model name |
| base_url | string | null | Custom API endpoint |
| api_key | string | null | API key (prefer environment variables) |
| region | string | null | AWS region for the Bedrock provider |
| temperature | float | 0.7 | Generation randomness |
| max_tokens | int | 2048 | Maximum output tokens |
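As an example of a non-default provider, a sketch switching to anthropic; the model name is an assumption (check your provider's current model list), and the key is left to the environment as the table recommends:

```yaml
llm:
  provider: anthropic
  model: claude-3-5-sonnet-latest   # assumed model name; verify with your provider
  temperature: 0.7
  max_tokens: 2048
  # api_key omitted: set ANTHROPIC_API_KEY in the environment instead
```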
### server

| Field | Type | Default | Description |
|---|---|---|---|
| host | string | 0.0.0.0 | Listen address |
| port | int | 8080 | Listen port |
### storage

| Field | Type | Default | Description |
|---|---|---|---|
| database | string | ./data/sundew.db | SQLite database file path |
| log_file | string | ./data/events.jsonl | JSONL event log file path |
### logging

| Field | Type | Default | Description |
|---|---|---|---|
| level | string | info | Log level: debug, info, warning, error |
| output | string | stdout | Log output: stdout or a file path |
## Environment variables

| Variable | Overrides |
|---|---|
| ANTHROPIC_API_KEY | llm.api_key when provider is anthropic |
| OPENAI_API_KEY | llm.api_key when provider is openai |
| AWS_DEFAULT_REGION | llm.region when provider is bedrock |
| SUNDEW_LOG_LEVEL | logging.level |