Overview
Coevolved ships with an OpenAI provider that maps Coevolved’s core types to OpenAI Chat Completions:

- `PromptPayload` → OpenAI `messages`
- `ToolSpec` → OpenAI `tools` (function-calling schema)
- OpenAI response → `LLMResponse` (text + tool calls + usage)
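The mapping can be pictured with simplified stand-ins for the core types. The dataclasses below are illustrative sketches, not Coevolved’s real definitions; the OpenAI-side shapes follow the Chat Completions `messages` and `tools` formats:

```python
from dataclasses import dataclass

@dataclass
class PromptPayload:  # simplified stand-in
    system: str
    user: str

@dataclass
class ToolSpec:  # simplified stand-in
    name: str
    description: str
    parameters: dict  # JSON Schema for the function arguments

def payload_to_messages(payload: PromptPayload) -> list:
    # PromptPayload -> OpenAI chat `messages`
    return [
        {"role": "system", "content": payload.system},
        {"role": "user", "content": payload.user},
    ]

def spec_to_tool(spec: ToolSpec) -> dict:
    # ToolSpec -> one entry of OpenAI's `tools` (function-calling schema)
    return {
        "type": "function",
        "function": {
            "name": spec.name,
            "description": spec.description,
            "parameters": spec.parameters,
        },
    }

messages = payload_to_messages(PromptPayload("You are terse.", "What is 2+2?"))
tool = spec_to_tool(ToolSpec(
    "add", "Add two integers",
    {"type": "object",
     "properties": {"a": {"type": "integer"}, "b": {"type": "integer"}},
     "required": ["a", "b"]},
))
```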
Setup
Set your OpenAI credentials and initialize the OpenAI client.

Models and config
Use `LLMConfig` to configure the model and generation parameters. To pass extra OpenAI request parameters, either:

- supply `request_options` when creating `OpenAIProvider(...)`, and/or
- put `request_options` under `LLMConfig.metadata["request_options"]`
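A sketch of how the two sources of `request_options` might combine. The `LLMConfig` stand-in, the constructor signature, the model name, and the merge precedence (per-config options overriding provider-level defaults) are all assumptions for illustration, not Coevolved’s documented behavior:

```python
from dataclasses import dataclass, field

@dataclass
class LLMConfig:  # simplified stand-in, not the real class
    model: str
    temperature: float = 0.0
    metadata: dict = field(default_factory=dict)

def merged_request_options(provider_options: dict, config: LLMConfig) -> dict:
    # Assumed: options passed to OpenAIProvider(...) act as defaults, and
    # LLMConfig.metadata["request_options"] overrides them per request.
    merged = dict(provider_options)
    merged.update(config.metadata.get("request_options", {}))
    return merged

config = LLMConfig(
    model="gpt-4o-mini",  # hypothetical model choice
    temperature=0.2,
    metadata={"request_options": {"seed": 7}},
)
opts = merged_request_options({"timeout": 30, "seed": 1}, config)
```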
Tool calling
Tool calling requires:

- tool steps created with `tool_step(...)`
- tool specs generated from those steps
- an `LLMConfig` that includes those specs

The provider then:

- sends the tool schemas to OpenAI as `tools`
- parses tool calls and exposes them as `LLMResponse.tool_calls`
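The parsing step can be sketched against a dict shaped like an OpenAI Chat Completions response. `LLMResponse` below is a simplified stand-in, and the exact parsing logic is an assumption; the response shape (`choices[0].message.tool_calls`, JSON-encoded `arguments`, `usage`) matches the Chat Completions API:

```python
import json
from dataclasses import dataclass

@dataclass
class LLMResponse:  # simplified stand-in, not the real class
    text: str
    tool_calls: list
    usage: dict

def parse_openai_response(raw: dict) -> LLMResponse:
    msg = raw["choices"][0]["message"]
    calls = [
        {"id": c["id"],
         "name": c["function"]["name"],
         # OpenAI returns arguments as a JSON string; decode to a dict
         "arguments": json.loads(c["function"]["arguments"])}
        for c in (msg.get("tool_calls") or [])
    ]
    return LLMResponse(text=msg.get("content") or "",
                       tool_calls=calls,
                       usage=raw.get("usage", {}))

raw = {
    "choices": [{"message": {
        "content": None,  # content is null when the model calls a tool
        "tool_calls": [{"id": "call_1", "type": "function",
                        "function": {"name": "add",
                                     "arguments": '{"a": 2, "b": 2}'}}],
    }}],
    "usage": {"prompt_tokens": 12, "completion_tokens": 5, "total_tokens": 17},
}
resp = parse_openai_response(raw)
```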