Index

LLM backend abstraction layer.

MODULE DESCRIPTION
anthropic

Anthropic LLM backend via LiteLLM.

base

Abstract LLMBackend base class and factory.

ollama

Ollama LLM backend via LiteLLM.

CLASS DESCRIPTION
AnthropicBackend

LLM backend that calls Anthropic models through LiteLLM.

LLMBackend

Abstract base class for LLM provider backends.

OllamaBackend

LLM backend that calls Ollama models through LiteLLM.

FUNCTION DESCRIPTION
from_config

Instantiate the correct LLMBackend from an LLMConfig.

Classes

AnthropicBackend

AnthropicBackend(config: LLMConfig)

Bases: LLMBackend

LLM backend that calls Anthropic models through LiteLLM.

PARAMETER DESCRIPTION
config

The LLM configuration section from the Foreman runtime config.

TYPE: LLMConfig

METHOD DESCRIPTION
complete

Send a prompt to an Anthropic model and return the text response.

Functions

complete
complete(prompt: str, system: str | None = None) -> str

Send a prompt to an Anthropic model and return the text response.

PARAMETER DESCRIPTION
prompt

The user prompt to send to the model.

TYPE: str

system

Optional system prompt.

TYPE: str | None DEFAULT: None

RETURNS DESCRIPTION
str

The model's text response.

LLMBackend

Bases: ABC

Abstract base class for LLM provider backends.

All concrete backends must implement complete.

METHOD DESCRIPTION
complete

Send a prompt to the LLM and return the text response.

Functions

complete abstractmethod
complete(prompt: str, system: str | None = None) -> str

Send a prompt to the LLM and return the text response.

PARAMETER DESCRIPTION
prompt

The user prompt to send to the model.

TYPE: str

system

Optional system prompt to configure model behaviour.

TYPE: str | None DEFAULT: None

RETURNS DESCRIPTION
str

The model's text response.

OllamaBackend

OllamaBackend(config: LLMConfig)

Bases: LLMBackend

LLM backend that calls Ollama models through LiteLLM.

PARAMETER DESCRIPTION
config

The LLM configuration section from the Foreman runtime config.

TYPE: LLMConfig

METHOD DESCRIPTION
complete

Send a prompt to an Ollama model and return the text response.

Functions

complete
complete(prompt: str, system: str | None = None) -> str

Send a prompt to an Ollama model and return the text response.

PARAMETER DESCRIPTION
prompt

The user prompt to send to the model.

TYPE: str

system

Optional system prompt.

TYPE: str | None DEFAULT: None

RETURNS DESCRIPTION
str

The model's text response.

Functions

from_config

from_config(config: LLMConfig) -> LLMBackend

Instantiate the correct LLMBackend from an LLMConfig.

PARAMETER DESCRIPTION
config

The LLM section of the Foreman runtime config.

TYPE: LLMConfig

RETURNS DESCRIPTION
LLMBackend

A concrete LLMBackend instance.

RAISES DESCRIPTION
ValueError

If config.provider is not a supported backend.