Agents LLM Client

Functions

getProviderConfig

Signature:

getProviderConfig(provider: LLMProvider): ProviderConfig

Parameters:

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| provider | LLMProvider | Yes | |

Returns:

ProviderConfig
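
A minimal usage sketch, assuming the function and the LLMProvider type are exported from a package entry point such as `@agents/llm-client`; the import path and the `"openai"` value are illustrative, not confirmed by this reference:

```typescript
// Import path is an assumption; adjust to the package's actual entry point.
import { getProviderConfig } from "@agents/llm-client";
import type { LLMProvider } from "@agents/llm-client";

// LLMProvider's members aren't listed in this reference, so the example value
// is cast loosely; replace it with a real LLMProvider value in your code.
const provider = "openai" as unknown as LLMProvider;

// Look up the ProviderConfig for that provider.
const providerConfig = getProviderConfig(provider);
console.log(providerConfig);
```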

createLLMProxy

Server-side LLM proxy entry point

This module provides server-side integration for LLM clients, enabling proxy functionality for API calls and request handling.

Signature:

createLLMProxy(_config?: any): any

Parameters:

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| _config | any | No | |

Returns:

any
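
A minimal server-side sketch under the same assumed import path; because `_config` is optional and typed as `any`, a bare call is valid, and any options you pass are specific to your deployment:

```typescript
// Import path is an assumption; the reference describes this as the
// server-side proxy entry point.
import { createLLMProxy } from "@agents/llm-client";

// _config is optional and untyped (any) in the current signature,
// so calling with no arguments is valid.
const proxy = createLLMProxy();

// Wire `proxy` into your server's request handling as appropriate
// for your framework.
```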

calculateCost

Signature:

calculateCost(usage: TokenUsage, modelConfig: ModelConfig, modelName: string): CostBreakdown

Parameters:

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| usage | TokenUsage | Yes | |
| modelConfig | ModelConfig | Yes | |
| modelName | string | Yes | |

Returns:

CostBreakdown
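
A hedged sketch of producing a cost breakdown for one request; the import path, the field names on TokenUsage and ModelConfig, and the model name are all assumptions:

```typescript
// Import path is an assumption; adjust to the package's actual entry point.
import { calculateCost } from "@agents/llm-client";
import type { TokenUsage, ModelConfig } from "@agents/llm-client";

// The field names below are assumptions based on common conventions;
// check the real TokenUsage and ModelConfig definitions before relying on them.
const usage = {
  promptTokens: 1_200,
  completionTokens: 300,
} as unknown as TokenUsage;

const modelConfig = {
  inputCostPer1kTokens: 0.0005,
  outputCostPer1kTokens: 0.0015,
} as unknown as ModelConfig;

// "gpt-4o-mini" is only an example model name.
const breakdown = calculateCost(usage, modelConfig, "gpt-4o-mini");
console.log(breakdown);
```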

detectEnvironment

Signature:

detectEnvironment(): Environment

Returns:

Environment
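
A short sketch under the same assumed import path; the concrete members of Environment are defined by the package and not listed here:

```typescript
// Import path is an assumption; adjust to the package's actual entry point.
import { detectEnvironment } from "@agents/llm-client";

// Detect the current runtime (for example, browser vs. server); the exact
// Environment values are defined by the package, not by this sketch.
const env = detectEnvironment();
console.log(env);
```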

createLogger

Signature:

createLogger(config?: LoggingConfig | undefined): Logger

Parameters:

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| config | LoggingConfig \| undefined | No | |

Returns:

Logger
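
A sketch of creating loggers with and without a config; the import path and the `level` option are assumptions, since LoggingConfig's fields aren't listed in this reference:

```typescript
// Import path is an assumption; adjust to the package's actual entry point.
import { createLogger } from "@agents/llm-client";
import type { LoggingConfig } from "@agents/llm-client";

// With no config, the logger uses the package's defaults.
const logger = createLogger();

// `level` is an assumed LoggingConfig field; the cast is only needed because
// the real field names aren't shown in this reference.
const verboseLogger = createLogger({ level: "debug" } as unknown as LoggingConfig);
```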

withRetry

Wraps a function with retry logic using exponential backoff with jitter

Signature:

withRetry(fn: () => Promise<T>, config?: RetryConfig | undefined): Promise<{ result: T; retryCount: number; }>

Parameters:

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| fn | () => Promise<T> | Yes | The function to retry |
| config | RetryConfig \| undefined | No | Retry configuration |

Returns:

Promise<{ result: T; retryCount: number; }> - Object with result and retry count
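
A sketch of wrapping a flaky HTTP call; withRetry handles the exponential backoff with jitter between attempts, while the import path, the endpoint URL, and the `maxRetries` field are illustrative assumptions:

```typescript
// Import path is an assumption; adjust to the package's actual entry point.
import { withRetry } from "@agents/llm-client";
import type { RetryConfig } from "@agents/llm-client";

// The wrapped function should throw (or reject) on failure so withRetry
// knows to back off and try again.
const { result, retryCount } = await withRetry(
  async () => {
    const response = await fetch("https://api.example.com/v1/completions");
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    return response.json();
  },
  // `maxRetries` is an assumed RetryConfig field.
  { maxRetries: 3 } as unknown as RetryConfig
);

console.log(`Succeeded after ${retryCount} retries`, result);
```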
