This class provides a structure for creating llm_provider() objects with different implementations of the complete_chat function. Using this class, you can create an llm_provider() object that interacts with different LLM providers, such as Ollama, OpenAI, or other custom providers.

Public fields

parameters

A named list of parameters to configure the llm_provider(). Parameters may be appended to the request body when interacting with the LLM provider API

verbose

A logical indicating whether interaction with the LLM provider should be printed to the console

url

The URL to the LLM provider API endpoint for chat completion

api_key

The API key to use for authentication with the LLM provider API

Methods


Method new()

Create a new llm_provider() object

Usage

llm_provider$new(
  complete_chat_function,
  parameters = list(),
  verbose = TRUE,
  url = NULL,
  api_key = NULL
)

Arguments

complete_chat_function

Function that will be called by the llm_provider() to complete a chat. This function should take a chat_history data frame as input and return a response object (a list with role and content, detailing the chat completion)

parameters

A named list of parameters to configure the llm_provider(). These parameters may be appended to the request body when interacting with the LLM provider. For example, the 'model' parameter is often required. The 'stream' parameter may be used to indicate that the API should stream its response, which is then handled down the line by the make_llm_provider_request function. Parameters should not include the chat_history, as this is passed as a separate argument to the complete_chat_function. Parameters should also not include 'api_key' or 'url'; these are treated separately

verbose

A logical indicating whether interaction with the LLM provider should be printed to the console

url

The URL to the LLM provider API endpoint for chat completion (typically required, but may be left NULL in some cases, for instance when creating a fake LLM provider)

api_key

The API key to use for authentication with the LLM provider API (optional; not required for some providers, such as Ollama)

Returns

A new llm_provider() R6 object
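
As a minimal sketch (assuming the package providing llm_provider() is loaded), a fake provider can be created by supplying a complete_chat_function that, per the contract above, takes a chat_history data frame and returns a list with 'role' and 'content'. The echo behavior and the 'fake-model' parameter here are illustrative choices, not part of the class itself:

```r
# Illustrative complete_chat_function: echoes the last message
# instead of calling a real API
fake_complete_chat <- function(chat_history) {
  last_message <- chat_history$content[nrow(chat_history)]
  list(
    role = "assistant",
    content = paste("You said:", last_message)
  )
}

fake_provider <- llm_provider$new(
  complete_chat_function = fake_complete_chat,
  parameters = list(model = "fake-model"),
  verbose = FALSE
)
```

Note that 'url' and 'api_key' are left NULL here, which is permissible for a fake provider as described above.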


Method set_parameters()

Helper function to set the parameters of the llm_provider() object. This function appends new parameters to the existing parameters list.

Usage

llm_provider$set_parameters(new_parameters)

Arguments

new_parameters

A named list of new parameters to append to the existing parameters list

Returns

The modified llm_provider() object
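
For example (assuming 'provider' is an existing llm_provider() object), new parameters can be appended like so; since R6 objects have reference semantics, the object is modified in place, and the return value also allows assignment or chaining:

```r
# Append (or overwrite) parameters on an existing provider object
provider$set_parameters(list(temperature = 0.2, stream = FALSE))

# The new values are now part of the parameters list
provider$parameters$temperature
```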


Method complete_chat()

Sends a chat_history to the LLM provider using the configured complete_chat_function. This method is typically called by the send_prompt function to interact with the LLM provider, but it can also be called directly.

Usage

llm_provider$complete_chat(chat_history)

Arguments

chat_history

A data frame with 'role' and 'content' columns

Returns

The response from the LLM provider, in a named list with 'role', 'content', and 'http'. The 'role' and 'content' fields (required) contain the extracted role and content from the response (e.g., 'assistant' and 'Hello, how can I help you?'). The 'http' field (optional) may contain any additional information, e.g., data from the HTTP response about the number of tokens used.
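
A direct call might look as follows (assuming 'provider' is a configured llm_provider() object); the chat_history is a data frame with 'role' and 'content' columns, as described above:

```r
# Build a one-message chat history
chat_history <- data.frame(
  role = "user",
  content = "Hello, who are you?",
  stringsAsFactors = FALSE
)

# Send it to the provider and inspect the response list
response <- provider$complete_chat(chat_history)
response$role     # e.g., "assistant"
response$content  # the chat completion text
response$http     # optional extra information, if the provider supplies it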


Method clone()

The objects of this class are cloneable with this method.

Usage

llm_provider$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.