This class provides a structure for creating llm_provider()
objects with different implementations of the complete_chat()
function. Using this class, you can create an llm_provider()
object that interacts with different LLM providers, such as
Ollama, OpenAI, or other custom providers.
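For example, a provider for a locally running Ollama server might be created with one of the pre-built constructors listed below (a sketch: the model name and the presence of a local Ollama instance are assumptions):

```r
library(tidyprompt)

# Create an llm_provider() for a local Ollama server;
# the model name here is illustrative
ollama <- llm_provider_ollama(
  parameters = list(model = "llama3.1:8b")
)
```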
See also

Other llm_provider: llm_provider_google_gemini(), llm_provider_groq(), llm_provider_mistral(), llm_provider_ollama(), llm_provider_openai(), llm_provider_openrouter(), llm_provider_xai()
Public fields
parameters
A named list of parameters to configure the llm_provider()
. Parameters may be appended to the request body when interacting with the LLM provider API
verbose
A logical indicating whether interaction with the LLM provider should be printed to the console
url
The URL to the LLM provider API endpoint for chat completion
api_key
The API key to use for authentication with the LLM provider API
Methods
Method new()
Create a new llm_provider()
object
Usage
llm_provider$new(
complete_chat_function,
parameters = list(),
verbose = TRUE,
url = NULL,
api_key = NULL
)
Arguments
complete_chat_function
Function that will be called by the llm_provider()
to complete a chat. This function should take a chat_history
data frame as input and return a response object (a list with role
and content
, detailing the chat completion)
parameters
A named list of parameters to configure the llm_provider()
. These parameters may be appended to the request body when interacting with the LLM provider. For example, the model
parameter may often be required. The 'stream' parameter may be used to indicate that the API should stream, which will be handled down the line by the make_llm_provider_request() function. Parameters should not include the chat_history, as this is passed as a separate argument to the complete_chat_function
. Parameters should also not include 'api_key' or 'url'; these are treated separately
verbose
A logical indicating whether interaction with the LLM provider should be printed to the console
url
The URL to the LLM provider API endpoint for chat completion (typically required, but may be left NULL in some cases, for instance when creating a fake LLM provider)
api_key
The API key to use for authentication with the LLM provider API (optional, not required for, for instance, Ollama)
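A minimal sketch of constructing a custom llm_provider(). The completion function below is hypothetical: it ignores the chat history and returns a fixed assistant message, so no url or api_key is needed (as with the "fake LLM provider" case mentioned above):

```r
library(tidyprompt)

# Hypothetical completion function for illustration only;
# a real one would call a provider API with the chat_history
fake_complete_chat <- function(chat_history) {
  list(
    role = "assistant",
    content = "Hello, how can I help you?"
  )
}

fake_provider <- llm_provider$new(
  complete_chat_function = fake_complete_chat,
  parameters = list(),
  verbose = FALSE,
  url = NULL,
  api_key = NULL
)
```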
Method set_parameters()
Helper function to set the parameters of the llm_provider()
object. This function appends new parameters to the existing parameters
list.
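For instance (a sketch, assuming a provider object such as one created by llm_provider_ollama(); whether a given parameter like temperature is honored depends on the provider API):

```r
# Append new parameters to an existing provider's parameter list;
# previously set parameters that are not named here are kept
provider$set_parameters(list(
  temperature = 0.2,
  stream = FALSE
))
```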
Method complete_chat()
Sends a chat_history to the LLM
provider using the configured complete_chat_function
. This method is
typically called by the send_prompt()
function to interact with the LLM
provider, but it can also be called directly.
Returns
The response from the LLM provider, in a named list with 'role', 'content', and 'http'. The 'role' and 'content' fields (required) contain the extracted role and content from the response (e.g., 'assistant' and 'Hello, how can I help you?'). The 'http' field (optional) may contain any additional information, e.g., data from the HTTP response about the number of tokens used.
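Called directly, this might look as follows (a sketch: the chat_history data frame is constructed by hand here following the role/content convention described above, and the provider object is assumed to exist):

```r
# A one-message chat history with the expected role/content columns
chat_history <- data.frame(
  role = "user",
  content = "Hello!"
)

response <- provider$complete_chat(chat_history)
response$role     # e.g., "assistant"
response$content  # the extracted reply text
```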