illm

Classes

class arshai.core.interfaces.illm.ILLM(config)[source]

Bases: Protocol

Protocol defining the interface for LLM providers; matches the BaseLLMClient implementation.

__init__(config)[source]
async chat(input)[source]

Primary chat interface; accepts a unified input covering all supported functionality.

Return type:

Dict[str, Any]

async stream(input)[source]

Primary streaming interface; accepts a unified input covering all supported functionality and yields response chunks.

Return type:

AsyncGenerator[Dict[str, Any], None]
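Because ILLM is a Protocol, a provider satisfies it structurally, without inheriting from it. The sketch below is a minimal, self-contained illustration of that idea: the `ILLM` stand-in, the `EchoLLM` class, and the dict-shaped inputs and outputs are all assumptions for the example, not the library's real classes.

```python
import asyncio
from typing import Any, AsyncGenerator, Dict, Protocol


class ILLM(Protocol):
    """Structural stand-in mirroring arshai.core.interfaces.illm.ILLM."""

    async def chat(self, input: Dict[str, Any]) -> Dict[str, Any]: ...

    def stream(self, input: Dict[str, Any]) -> AsyncGenerator[Dict[str, Any], None]: ...


class EchoLLM:
    """Toy provider: satisfies ILLM structurally, with no inheritance."""

    def __init__(self, config: Dict[str, Any]) -> None:
        self.config = config

    async def chat(self, input: Dict[str, Any]) -> Dict[str, Any]:
        # A real client would call the provider's API here.
        return {"llm_response": f"echo: {input['user_message']}"}

    async def stream(self, input: Dict[str, Any]):
        # Yield the reply one whitespace-delimited chunk at a time.
        for token in input["user_message"].split():
            yield {"llm_response": token}


async def main() -> Dict[str, Any]:
    client: ILLM = EchoLLM({"model": "demo"})
    return await client.chat({"user_message": "hello world"})


result = asyncio.run(main())
print(result["llm_response"])
```

Type checkers verify `client: ILLM = EchoLLM(...)` against the Protocol's method signatures, so any object with matching `chat` and `stream` methods can be used as a provider.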

class arshai.core.interfaces.illm.ILLMConfig(**data)[source]

Bases: IDTO

Configuration for LLM providers

model: str
temperature: float
max_tokens: Optional[int]
top_p: Optional[float]
frequency_penalty: Optional[float]
presence_penalty: Optional[float]
model_config: ClassVar[ConfigDict] = {'allow_mutation': False, 'arbitrary_types_allowed': True, 'smart_union': True, 'validate_assignment': True}

Configuration for the model; a dictionary conforming to pydantic's ConfigDict.
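To show how the fields above fit together, here is a plain-dataclass stand-in for ILLMConfig (the real class is a pydantic IDTO; the class name, defaults, and values here are illustrative assumptions). `frozen=True` mirrors the `allow_mutation: False` setting in `model_config`.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)  # frozen mirrors allow_mutation=False
class LLMConfigSketch:
    """Dataclass stand-in for ILLMConfig; field names follow the reference above."""

    model: str
    temperature: float = 0.7        # default chosen for the example
    max_tokens: Optional[int] = None
    top_p: Optional[float] = None
    frequency_penalty: Optional[float] = None
    presence_penalty: Optional[float] = None


cfg = LLMConfigSketch(model="gpt-4o", temperature=0.2, max_tokens=512)
print(cfg.model, cfg.temperature, cfg.max_tokens)
```

Only `model` is required; the sampling parameters are optional and default to the provider's behavior when left unset.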

class arshai.core.interfaces.illm.ILLMInput(**data)[source]

Bases: IDTO

Represents the input for the LLM: a unified interface supporting all functionality.

system_prompt: str
user_message: str
regular_functions: Dict[str, Callable]
background_tasks: Dict[str, Callable]
structure_type: Type[T]
max_turns: int
classmethod validate_input(data)[source]

Simplified validation of the input data, checking only the fields that are actually required.

model_config: ClassVar[ConfigDict] = {'allow_mutation': False, 'arbitrary_types_allowed': True, 'smart_union': True, 'validate_assignment': True}

Configuration for the model; a dictionary conforming to pydantic's ConfigDict.
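The sketch below illustrates how an ILLMInput-shaped object might be assembled and validated: system prompt, user message, and a tool exposed via `regular_functions`. The dataclass, the `get_time` tool, and the validation logic are assumptions for the example; the real ILLMInput is a pydantic IDTO with its own `validate_input` classmethod.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional, Type


def get_time() -> str:
    """Hypothetical tool the LLM may call by name."""
    return "12:00"


@dataclass
class LLMInputSketch:
    """Dataclass stand-in for ILLMInput; field names follow the reference above."""

    system_prompt: str
    user_message: str
    regular_functions: Dict[str, Callable] = field(default_factory=dict)
    background_tasks: Dict[str, Callable] = field(default_factory=dict)
    structure_type: Optional[Type] = None
    max_turns: int = 10  # illustrative default

    def validate_input(self) -> "LLMInputSketch":
        # Mirrors the spirit of ILLMInput.validate_input:
        # both prompts are required; tools are optional.
        if not self.system_prompt or not self.user_message:
            raise ValueError("system_prompt and user_message are required")
        return self


inp = LLMInputSketch(
    system_prompt="You are a helpful assistant.",
    user_message="What time is it?",
    regular_functions={"get_time": get_time},
).validate_input()
print(sorted(inp.regular_functions))
```

In this shape, `regular_functions` maps tool names to callables invoked synchronously within a turn, while `background_tasks` would hold callables dispatched without blocking the conversation, bounded by `max_turns` iterations of the tool-calling loop.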