Prompt wrapping and evaluation

tidyprompt

Functions for creating and handling tidyprompt objects. These functions are typically called internally; in most scenarios, the prompt wrap functions can be used instead.

tidyprompt()
Methods to create, construct, and empower prompt objects
is_tidyprompt()
Validate a tidyprompt: returns TRUE if the object is a valid tidyprompt, otherwise FALSE
get_base_prompt()
Get base prompt from tidyprompt
construct_prompt_text()
Construct prompt text from a tidyprompt object
get_prompt_wraps()
Get prompt wraps from tidyprompt
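
For example, the constructor and accessors might be combined as below. This is a minimal sketch; tidyprompt() is assumed here to accept a plain character string, and the comments describe the expected return values based on the descriptions above.

  library(tidyprompt)

  prompt <- tidyprompt("What is the capital of France?")
  is_tidyprompt(prompt)           # TRUE for a valid tidyprompt object
  get_base_prompt(prompt)         # the unwrapped base prompt text
  construct_prompt_text(prompt)   # full prompt text, including any wraps
  get_prompt_wraps(prompt)        # list of applied prompt wraps (empty here)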

prompt_wrap

Functions for creating prompt wraps.

prompt_wrap()
Wrap a prompt with empowering functions
llm_feedback()
Create an llm_feedback object
llm_break()
Create an llm_break object
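
A sketch of a custom prompt wrap is shown below. The argument names modify_fn and validation_fn are assumptions based on the package documentation; llm_feedback() is used to send a correction back to the LLM when validation fails.

  library(tidyprompt)

  prompt <- "What is the capital of France?" |>
    prompt_wrap(
      # Append an instruction to the base prompt (argument name assumed)
      modify_fn = function(text) paste(text, "Answer in uppercase only."),
      # Validate the LLM response; return llm_feedback() to retry
      validation_fn = function(response) {
        if (response != toupper(response))
          return(llm_feedback("You must answer in uppercase only."))
        TRUE
      }
    )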

send_prompt

Function to evaluate prompts with an LLM provider.

send_prompt()
Send a prompt or tidyprompt() to an LLM provider
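
A minimal usage sketch, assuming a local Ollama server is available; any llm_provider_*() constructor could be passed instead.

  library(tidyprompt)

  # Wrap the prompt, then evaluate it with an LLM provider
  answer <- "What is 2 + 2?" |>
    answer_as_integer() |>
    send_prompt(llm_provider_ollama())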

Prompt wrap library

A library of pre-built prompt wraps for common use cases.

Answer as (structured output)

Functions to have LLMs answer in a specific format.

answer_as_boolean()
Make LLM answer as a boolean (TRUE or FALSE)
answer_as_code()
Instruct LLM to answer a prompt with R code
answer_as_integer()
Make LLM answer as an integer (between min and max)
answer_as_list()
Make LLM answer as a list of items
answer_as_named_list()
Extract a named list from an LLM response, with optional per-item instructions and validations
answer_as_regex()
Make LLM answer match a specific regex
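
For example, the integer wrap might be used as follows. The min and max argument names follow the description above; the Ollama provider is assumed to be available locally.

  library(tidyprompt)

  # Constrain the reply to an integer between 1 and 10
  score <- "Rate the friendliness of 'Have a nice day!' on a scale of 1 to 10." |>
    answer_as_integer(min = 1, max = 10) |>
    send_prompt(llm_provider_ollama())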

Answer by (mode)

Functions to have LLMs answer in a specific way.

answer_by_chain_of_thought()
Set chain of thought mode for a prompt
answer_by_react()
Set ReAct mode for a prompt
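
A sketch combining a reasoning mode with a structured-output wrap; the piped order shown here is illustrative, and a local Ollama server is assumed.

  library(tidyprompt)

  # Ask for chain-of-thought reasoning before extracting the final integer
  answer <- "A train travels 60 km in 30 minutes. What is its speed in km/h?" |>
    answer_by_chain_of_thought() |>
    answer_as_integer() |>
    send_prompt(llm_provider_ollama())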

Tools (function-calling)

Functions that empower an LLM with tools.

add_tools()
Enable R function calling for prompt evaluation by an LLM
add_tools_extract_documentation()
Extract docstring-documentation from a function
answer_as_code()
Instruct LLM to answer a prompt with R code
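
A sketch of tool calling with a hypothetical R function. The roxygen-style comments inside the function body are assumed to be the docstring format read by add_tools_extract_documentation(); the function name and its return values are purely illustrative.

  library(tidyprompt)

  # Hypothetical tool: look up the current temperature for a city
  temperature_in_location <- function(location = c("Amsterdam", "Utrecht")) {
    #' Get the current temperature in a city
    #' @param location Name of the city
    #' @return Temperature in degrees Celsius
    location <- match.arg(location)
    switch(location, "Amsterdam" = 18.5, "Utrecht" = 19.1)
  }

  # Expose the function to the LLM during prompt evaluation
  answer <- "What is the current temperature in Amsterdam?" |>
    add_tools(temperature_in_location) |>
    send_prompt(llm_provider_ollama())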

Miscellaneous

add_text()
Add text to a tidyprompt
set_system_prompt()
Set system prompt
quit_if()
Make evaluation of a prompt stop if the LLM gives a specific response
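
These helpers can be chained like any other wrap, as in the sketch below. quit_if() is shown with its defaults, which are assumed to stop evaluation when the LLM indicates it cannot answer; the piped order is illustrative.

  library(tidyprompt)

  answer <- "What is my favourite colour?" |>
    add_text("Answer in one word, or say that you do not know.") |>
    set_system_prompt("You are a terse assistant.") |>
    quit_if() |>
    send_prompt(llm_provider_ollama())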

LLM providers

Functions to interact with LLM providers.

LLM provider class

llm_provider
llm_provider() R6 Class
make_llm_provider_request()
Make a request to an LLM provider
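
The R6 class is normally not constructed directly; the pre-built constructors listed below return configured llm_provider objects, and make_llm_provider_request() is assumed to handle the underlying request when send_prompt() is called. A minimal sketch:

  library(tidyprompt)

  provider <- llm_provider_ollama()
  class(provider)   # expected to include "llm_provider" and "R6"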

Pre-built LLM providers

llm_provider_google_gemini()
Create a new Google Gemini llm_provider() instance
llm_provider_groq()
Create a new Groq llm_provider() instance
llm_provider_mistral()
Create a new Mistral llm_provider() instance
llm_provider_ollama()
Create a new Ollama llm_provider() instance
llm_provider_openai()
Create a new OpenAI llm_provider() instance
llm_provider_openrouter()
Create a new OpenRouter llm_provider() instance
llm_provider_xai()
Create a new xAI (Grok) llm_provider() instance
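
A sketch of constructing providers. Ollama runs locally and needs no API key; the hosted providers are assumed to read their keys from the usual environment variables (e.g., OPENAI_API_KEY), which is an assumption rather than a documented guarantee here.

  library(tidyprompt)

  ollama <- llm_provider_ollama()   # local server, no key required
  openai <- llm_provider_openai()   # assumed to read OPENAI_API_KEY

  "Hello there!" |> send_prompt(ollama)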

chat_history

chat_history()
Create or validate chat_history object
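
A sketch of building a chat history, assuming it can be created from a data frame with 'role' and 'content' columns; the exact column names and accepted inputs are assumptions.

  library(tidyprompt)

  history <- chat_history(data.frame(
    role = c("user", "assistant"),
    content = c("Hi!", "Hello! How can I help you?")
  ))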

Helper functions

Text helpers

Functions to create text to be used in prompts.

df_to_string()
Convert a dataframe to a string representation
vector_list_to_string()
Convert a named or unnamed list/vector to a string representation
skim_with_labels_and_levels()
Skim a dataframe and include labels and levels
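
For example, these helpers can turn R objects into prompt-ready text; the exact output formats are not reproduced here.

  library(tidyprompt)

  df <- data.frame(name = c("Ana", "Bo"), age = c(31, 27))

  df_to_string(df)                        # tabular text suitable for a prompt
  vector_list_to_string(c(a = 1, b = 2))  # single-string representation
  skim_with_labels_and_levels(df)         # summary including labels and levels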

Other helper functions

extract_from_return_list()
Function to extract a specific element from a list
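
A sketch of extracting one element from a return list. The list shape and the assumption that the second argument names the element are illustrative; in practice such a list would typically come from send_prompt() when a full return mode is requested.

  library(tidyprompt)

  result <- list(response = "4", chat_history = data.frame())
  extract_from_return_list(result, "response")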