Package index
tidyprompt
Functions for creating and handling tidyprompt objects. These functions are typically called internally; in most scenarios, the prompt wrap functions can be used instead.
tidyprompt() - Methods to create, construct, and empower prompt objects
is_tidyprompt() - Validate a tidyprompt: returns TRUE for a valid tidyprompt object, FALSE otherwise
get_base_prompt() - Get the base prompt from a tidyprompt
construct_prompt_text() - Construct the prompt text from a tidyprompt object
get_prompt_wraps() - Get the prompt wraps from a tidyprompt
prompt_wrap() - Wrap a prompt with empowering functions
llm_feedback() - Create an llm_feedback object
llm_break() - Create an llm_break object
send_prompt() - Send a prompt or tidyprompt() to an LLM provider
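A minimal sketch of how these core functions fit together. The function names come from this index; the prompt_wrap() argument names (modify_fn, validation_fn) and the use of a local Ollama model are assumptions:

library(tidyprompt)

# Build a tidyprompt and wrap it with a modification plus a validation
prompt <- "What is 5 + 5?" |>
  tidyprompt() |>
  prompt_wrap(
    modify_fn = function(text) paste0(text, "\n\nReply with only a number."),
    validation_fn = function(response) {
      if (!grepl("^[0-9]+$", trimws(response)))
        return(llm_feedback("Please reply with only a number."))
      TRUE
    }
  )

is_tidyprompt(prompt)          # TRUE for a valid tidyprompt object
get_base_prompt(prompt)        # the original, unwrapped prompt text
construct_prompt_text(prompt)  # the full text after applying the wraps

# Evaluation requires a configured LLM provider:
# result <- send_prompt(prompt, llm_provider_ollama())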
answer_as_boolean() - Make LLM answer as a boolean (TRUE or FALSE)
answer_as_code() - Instruct LLM to answer a prompt with R code
answer_as_integer() - Make LLM answer as an integer (between min and max)
answer_as_list() - Make LLM answer as a list of items
answer_as_named_list() - Extract named list from LLM response with optional item instructions and validations
answer_as_regex() - Make LLM answer match a specific regex
answer_by_chain_of_thought() - Set chain of thought mode for a prompt
answer_by_react() - Set ReAct mode for a prompt
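The answer_as_*() and answer_by_*() wraps can be piped directly onto a prompt before sending it. A sketch, assuming a locally running Ollama model (the model choice and the return shape of send_prompt() are assumptions):

# Constrain the answer to an integer and ask the model to reason step by step
answer <- "How many planets does our solar system have?" |>
  answer_as_integer(min = 0, max = 100) |>
  answer_by_chain_of_thought() |>
  send_prompt(llm_provider_ollama())

# 'answer' should be a single integer extracted from the LLM response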
add_tools() - Enable R function calling for prompt evaluation by an LLM
add_tools_extract_documentation() - Extract docstring documentation from a function
add_text() - Add text to a tidyprompt
set_system_prompt() - Set the system prompt for a tidyprompt
quit_if() - Make evaluation of a prompt stop if the LLM gives a specific response
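A sketch combining several of these wraps. How a tool function is passed to and documented for add_tools(), and the default quit-detection behaviour of quit_if(), are assumptions here:

# Hypothetical helper the LLM may call during evaluation
temperature_in_city <- function(city) {
  # Returns a made-up temperature; a real tool would query an API
  sample(10:25, 1)
}

prompt <- "What is the temperature in Utrecht right now?" |>
  set_system_prompt("You are a concise assistant.") |>
  add_text("Reply with only the temperature in degrees Celsius.") |>
  add_tools(temperature_in_city) |>
  quit_if()  # stop evaluation if the LLM indicates it cannot answer

# result <- send_prompt(prompt, llm_provider_ollama())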
llm_provider() - llm_provider R6 class
make_llm_provider_request() - Make a request to an LLM provider
llm_provider_google_gemini() - Create a new Google Gemini llm_provider() instance
llm_provider_groq() - Create a new Groq llm_provider() instance
llm_provider_mistral() - Create a new Mistral llm_provider() instance
llm_provider_ollama() - Create a new Ollama llm_provider() instance
llm_provider_openai() - Create a new OpenAI llm_provider() instance
llm_provider_openrouter() - Create a new OpenRouter llm_provider() instance
llm_provider_xai() - Create a new xAI (Grok) llm_provider() instance
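Each constructor above returns an llm_provider() object that can be passed to send_prompt(). A sketch in which the parameters argument, the model identifiers, and the reliance on an API key environment variable are assumptions:

# Local Ollama model
ollama <- llm_provider_ollama(
  parameters = list(model = "llama3.1:8b")
)

# Hosted OpenAI model; the API key is typically read from an environment variable
openai <- llm_provider_openai(
  parameters = list(model = "gpt-4o-mini")
)

# Providers are interchangeable at the point of evaluation:
# "Hello there!" |> send_prompt(ollama)
# "Hello there!" |> send_prompt(openai)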
chat_history() - Create or validate a chat_history object
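A sketch of chat_history(); the expected column names (role, content) are an assumption:

history <- chat_history(
  data.frame(
    role    = c("user", "assistant"),
    content = c("Hi there!", "Hello! How can I help you?")
  )
)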
df_to_string() - Convert a data frame to a string representation
vector_list_to_string() - Convert a named or unnamed list/vector to a string representation
skim_with_labels_and_levels() - Skim a data frame and include labels and levels
extract_from_return_list() - Extract a specific element from a list
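A sketch of the string helpers; the exact formatting of the returned strings and the second argument of extract_from_return_list() are assumptions:

df <- data.frame(name = c("A", "B"), score = c(10, 20))

df_to_string(df)                        # the data frame flattened into one string
vector_list_to_string(c(x = 1, y = 2))  # named vector rendered as readable text

# skim_with_labels_and_levels(df)             # summary keeping labels and factor levels
# extract_from_return_list(result, "answer")  # pull one element from a returned list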