Helper function that handles making requests to LLM providers; intended to be used within a complete_chat() function for an LLM provider.

Usage

make_llm_provider_request(
  url,
  headers = NULL,
  body,
  stream = NULL,
  verbose = getOption("tidyprompt.verbose", TRUE),
  stream_api_type = c("openai", "ollama")
)

Arguments

url

The URL of the LLM provider API endpoint

headers

A named list of headers to be passed to the API (can be NULL)

body

The body of the POST request

stream

A logical indicating whether the API should stream responses (may be NULL)

verbose

A logical indicating whether the interaction with the LLM provider should be printed to the console. Default is TRUE.

stream_api_type

The type of API to use; specifically required to handle streaming. Currently, "openai" and "ollama" have been implemented. "openai" should also work with other, similar chat-completion APIs.

Value

A list with the role and content of the response from the LLM provider
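For illustration, a minimal sketch of calling this helper against an OpenAI-compatible chat-completion endpoint. The endpoint URL, model name, and the OPENAI_API_KEY environment variable are assumptions for the example, not part of this function's definition:

```r
# Hypothetical example; assumes an OpenAI-compatible endpoint and an
# API key stored in the OPENAI_API_KEY environment variable
response <- make_llm_provider_request(
  url = "https://api.openai.com/v1/chat/completions",
  headers = list(
    Authorization = paste("Bearer", Sys.getenv("OPENAI_API_KEY")),
    `Content-Type` = "application/json"
  ),
  body = list(
    model = "gpt-4o-mini",  # assumed model name
    messages = list(
      list(role = "user", content = "Hello!")
    )
  ),
  stream = FALSE,
  stream_api_type = "openai"
)

# Per the Value section, the result is a list with the role and
# content of the response
response$role     # e.g., "assistant"
response$content  # the message text
```

When stream = TRUE, stream_api_type determines how the streamed chunks are parsed, so it should match the provider being called.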