## Overview
Coevolved treats prompts as first-class inputs to LLM steps. You can build prompts as:
- A simple `str`
- A `Prompt` template (with metadata and versioning)
- A full `PromptPayload` (text or chat messages + IDs/hashes)
## Prompt templates
Use `Prompt` when you want:
- Structured prompt versioning (`id`, `version`)
- Templating via Python `str.format`
- Stable hashes for caching or trace grouping
```python
from coevolved.core.prompt import Prompt

greeting = Prompt(
    id="support.greeting",
    version="1.0",
    template="You are a helpful assistant. User: {question}",
)
```
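Because templating uses plain `str.format`, you can render a prompt with ordinary string formatting. A minimal sketch, assuming the template string is exposed as a `template` attribute (as suggested by the constructor above):

```python
# Sketch only: assumes the Prompt instance exposes the raw template
# string as `.template`; rendering is plain str.format substitution.
rendered = greeting.template.format(question="How do I reset my password?")
print(rendered)
# You are a helpful assistant. User: How do I reset my password?
```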
## Prompt payloads
When you need full control, build a `PromptPayload` with either:
- `text` (a single user prompt)
- `messages` (chat-style prompts compatible with tool calling)
`llm_step` will coerce your prompt builder's output into a `PromptPayload` and will compute a prompt hash if one is missing.
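As a sketch of the chat-style form, you might construct a payload like the one below. The import path, field names, and message shape are assumptions based on the description above; check the API of your installed version:

```python
# Sketch only: import path and message structure are assumptions.
from coevolved.core.prompt import PromptPayload

payload = PromptPayload(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "How do I reset my password?"},
    ],
)
```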
## Prompt versioning and hashing
Hashes are useful for:
- Caching and memoization
- Trace grouping across runs
- Regression testing (“did the prompt change?”)
Keep your prompt IDs stable and bump versions intentionally. This makes it easier to trace behavioral shifts over time.
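If you want to detect prompt drift yourself, one option is to fingerprint the template text directly. A minimal, illustrative sketch (Coevolved computes its own prompt hashes, which may use a different scheme; this standalone version is independent of the library):

```python
import hashlib

# Illustrative only: hash the raw template text so a changed prompt
# produces a different fingerprint, e.g. for regression tests.
def prompt_fingerprint(template: str) -> str:
    return hashlib.sha256(template.encode("utf-8")).hexdigest()

assert prompt_fingerprint("User: {question}") == prompt_fingerprint("User: {question}")
assert prompt_fingerprint("User: {question}") != prompt_fingerprint("User: {query}")
```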
## Next steps