
Prompt wrapping and evaluation

Tidyprompt

Functions to create and handle tidyprompt objects.

tidyprompt()
Create a tidyprompt object
tidyprompt-class
Tidyprompt R6 Class
is_tidyprompt()
Check if object is a tidyprompt object
construct_prompt_text()
Construct prompt text from a tidyprompt object
get_prompt_wraps()
Get prompt wraps from a tidyprompt object
get_chat_history()
Get the chat history of a tidyprompt object
set_chat_history()
Set the chat history of a tidyprompt object
set_system_prompt()
Set system prompt of a tidyprompt object
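
For example, a minimal sketch of creating and inspecting a tidyprompt object (the argument order of the setter below is an assumption; see the individual help pages):

  library(tidyprompt)

  prompt <- tidyprompt("What is the capital of France?")
  is_tidyprompt(prompt)          # TRUE
  construct_prompt_text(prompt)  # full prompt text with all wraps applied
  get_chat_history(prompt)       # current chat history of the prompt
  prompt <- set_system_prompt(prompt, "You are a concise assistant.")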

Prompt wrap

Functions for creating prompt wraps.

prompt_wrap()
Wrap a prompt with functions that modify the prompt text and handle the LLM response
llm_feedback()
Create an llm_feedback object
llm_break()
Create an llm_break object
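
As an illustration, a hedged sketch of a custom wrap that requests a one-word answer and returns feedback to the LLM otherwise (the modify_fn/validation_fn argument names should be checked against ?prompt_wrap):

  one_word <- function(prompt) {
    prompt |>
      prompt_wrap(
        modify_fn = function(text) paste(text, "Answer with a single word."),
        validation_fn = function(response) {
          # More than one word: send feedback and let the LLM retry
          if (length(strsplit(trimws(response), "\\s+")[[1]]) > 1)
            return(llm_feedback("Please answer with exactly one word."))
          TRUE
        }
      )
  }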

Send prompt

Function to evaluate prompts with an LLM provider.

send_prompt()
Send a prompt to an LLM provider
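
A minimal call in the package's pipe-based style (Ollama is used here only as an example provider):

  "Hi there!" |>
    send_prompt(llm_provider_ollama())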

Prompt wrap library

A library of pre-built prompt wraps for common use cases.

Answer as

Functions to have LLMs answer in a specific format (structured output).

answer_as_boolean()
Make LLM answer as a boolean (TRUE or FALSE)
answer_as_integer()
Make LLM answer as an integer (between min and max)
answer_as_json()
Make LLM answer as JSON (with optional schema)
answer_as_key_value()
Make LLM answer as a list of key-value pairs
answer_as_list()
Make LLM answer as a list of items
answer_as_named_list()
Make LLM answer as a named list
answer_as_regex_match()
Make LLM answer match a specific regex
answer_as_text()
Make LLM answer as a constrained text response
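
For example (min and max follow the description of answer_as_integer() above):

  "How many planets does our solar system have?" |>
    answer_as_integer(min = 0, max = 100) |>
    send_prompt(llm_provider_ollama())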

Answer by

Functions to have LLMs answer in a specific way.

answer_by_chain_of_thought()
Set chain of thought mode for a prompt
answer_by_react()
Set ReAct mode for a prompt
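
These combine with other wraps in the same pipeline, for instance:

  "What is 123 multiplied by 456?" |>
    answer_as_integer() |>
    answer_by_chain_of_thought() |>
    send_prompt(llm_provider_ollama())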

Answer using

Functions that give LLMs access to tools (function calling) and code execution.

answer_using_r()
Enable LLM to draft and execute R code
answer_using_sql()
Enable LLM to draft and execute SQL queries on a database
answer_using_tools()
Enable LLM to call R functions
tools_add_docs()
Add tidyprompt function documentation to a function
tools_get_docs()
Extract documentation from a function
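
A hedged sketch of tool use with a hypothetical helper function (a real call may also need documentation attached via tools_add_docs(); see its help page):

  # Hypothetical function exposed to the LLM as a tool
  temperature_in_location <- function(location) {
    c(Amsterdam = 12, Paris = 14)[[location]]
  }

  "What is the temperature in Amsterdam?" |>
    answer_using_tools(temperature_in_location) |>
    send_prompt(llm_provider_ollama())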

Miscellaneous

add_text()
Add text to a tidyprompt
quit_if()
Make evaluation of a prompt stop if LLM gives a specific response
user_verify()
Have user check the result of a prompt (human-in-the-loop)
llm_verify()
Have LLM check the result of a prompt (LLM-in-the-loop)
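
For instance, combining these in one pipeline (user_verify() asks for approval, so this sketch assumes an interactive session):

  "Tell me a joke" |>
    add_text("Please keep it to one sentence.") |>
    user_verify() |>
    send_prompt(llm_provider_ollama())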

LLM providers & chat history

Functions to interact with LLM providers and manage chat history.

LLM provider class

llm_provider-class
LlmProvider R6 Class

Pre-built LLM providers

llm_provider_google_gemini()
Create a new Google Gemini LLM provider
llm_provider_groq()
Create a new Groq LLM provider
llm_provider_mistral()
Create a new Mistral LLM provider
llm_provider_ollama()
Create a new Ollama LLM provider
llm_provider_openai()
Create a new OpenAI LLM provider
llm_provider_openrouter()
Create a new OpenRouter LLM provider
llm_provider_xai()
Create a new xAI (Grok) LLM provider
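
Providers typically read their API key from an environment variable; the 'parameters' list below is an assumption about how a model is selected (see each provider's help page):

  ollama <- llm_provider_ollama()
  openai <- llm_provider_openai(parameters = list(model = "gpt-4o-mini"))

  "Hi there!" |>
    send_prompt(openai)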

Chat history

chat_history()
Create or validate chat_history object
add_msg_to_chat_history()
Add a message to a chat history
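
A minimal sketch (whether chat_history() accepts a single string as a user message, and the default role used by add_msg_to_chat_history(), are assumptions):

  chat <- chat_history("Hi, how are you?")
  chat <- add_msg_to_chat_history(chat, "And what is your name?")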

Persistent chat

A helper class to have a persistent, manual conversation with an LLM provider.

persistent_chat-class
PersistentChat R6 class
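
A hedged sketch, assuming the class generator is exported as persistent_chat and exposes a $chat() method:

  chat <- persistent_chat$new(llm_provider_ollama())
  chat$chat("Hi! My name is Luka.")
  chat$chat("What is my name?")   # the previous turn is remembered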

Helper functions

Text helpers

Functions to create text to be used in prompts.

df_to_string()
Convert a dataframe to a string representation
vector_list_to_string()
Convert a named or unnamed list/vector to a string representation
skim_with_labels_and_levels()
Skim a dataframe and include labels and levels
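
For example, feeding a data frame rendered as text into a prompt (add_text() is from the miscellaneous wraps above):

  "Describe this dataset in one paragraph:" |>
    add_text(df_to_string(head(mtcars))) |>
    send_prompt(llm_provider_ollama())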

Other helper functions

extract_from_return_list()
Extract a specific element from a list
r_json_schema_to_example()
Generate an example object from a JSON schema
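
A minimal sketch of generating an example object from a JSON schema expressed as an R list (the exact shape of the returned example is not guaranteed here):

  schema <- list(
    type = "object",
    properties = list(
      name = list(type = "string"),
      age  = list(type = "integer")
    )
  )
  r_json_schema_to_example(schema)  # example R object matching the schema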