This function is used to wrap a tidyprompt() object and ensure that evaluation will stop if the LLM says it cannot answer the prompt. This is useful in scenarios where the LLM may be unable to provide a valid response to the prompt.

Usage

quit_if(
  prompt,
  quit_detect_regex = "NO ANSWER",
  instruction =
    paste0("If you think that you cannot provide a valid answer, you must type:\n",
    "'NO ANSWER' (use no other characters)"),
  success = TRUE,
  response_result = c("null", "llm_response", "regex_match")
)

Arguments

prompt

A single string or a tidyprompt() object

quit_detect_regex

A regular expression which, when detected in the LLM's response, will cause the evaluation to stop. The default detects the string "NO ANSWER" in the response

instruction

A string to be added to the prompt to instruct the LLM how to respond if it cannot answer the prompt. The default is "If you think that you cannot provide a valid answer, you must type: 'NO ANSWER' (use no other characters)". This parameter can be set to NULL if no instruction is needed in the prompt

success

A logical indicating whether the resulting send_prompt() loop break should nonetheless be considered a successful completion of the extraction and validation process. If FALSE, the object_to_return will always be set to NULL, so parameter 'response_result' must also be set to 'null', and send_prompt() will print a warning about the unsuccessful evaluation. If TRUE, the object_to_return will be returned as the response result of send_prompt() (without a warning about unsuccessful evaluation), and parameter 'response_result' will then determine what is returned as the response result.

response_result

A character string indicating what should be returned when the quit_detect_regex is detected in the LLM's response (see the sketch after this argument list). The default is 'null', which will return NULL as the response result of send_prompt(). Under 'llm_response', the full LLM response will be returned as the response result of send_prompt(). Under 'regex_match', the part of the LLM response that matches the quit_detect_regex will be returned as the response result of send_prompt()
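
To show how these arguments combine, below is a minimal sketch that sets a custom quit phrase, a matching instruction, and response_result = "regex_match". The question text, the custom phrase, and the use of a local Ollama provider are illustrative assumptions, not package defaults.

# A minimal sketch, assuming a local Ollama provider is available
library(tidyprompt)

prompt <- "What will the weather be in Utrecht next month?" |>
  quit_if(
    quit_detect_regex = "CANNOT PREDICT",  # custom quit phrase (illustrative)
    instruction = paste0(
      "If you cannot know the answer, you must type:\n",
      "'CANNOT PREDICT' (use no other characters)"
    ),
    success = TRUE,                 # treat the loop break as a normal completion
    response_result = "regex_match" # return the matched phrase instead of NULL
  )

# When the LLM replies with 'CANNOT PREDICT', send_prompt() stops the
# evaluation loop and returns the matched text
result <- prompt |>
  send_prompt(llm_provider_ollama())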

Value

A tidyprompt() with an added prompt_wrap() which will ensure that the evaluation will stop upon detection of the quit_detect_regex in the LLM's response

Examples

if (FALSE) { # \dontrun{
  "What the favourite food of my cat on Thursday mornings?" |>
    quit_if() |>
    send_prompt(llm_provider_ollama())
  # --- Sending request to LLM provider (llama3.1:8b): ---
  #   What is the favourite food of my cat on Thursday mornings?
  #
  #   If you think that you cannot provide a valid answer, you must type:
  #   'NO ANSWER' (use no other characters)
  # --- Receiving response from LLM provider: ---
  #   NO ANSWER
  # NULL
} # }
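
A further sketch in the same style, assuming the same local Ollama setup: with response_result = "llm_response", send_prompt() returns the LLM's full refusal text rather than NULL when "NO ANSWER" is detected.

if (FALSE) { # \dontrun{
  # Sketch only: assumes a running Ollama instance, as in the example above
  "What is the favourite food of my cat on Thursday mornings?" |>
    quit_if(response_result = "llm_response") |>
    send_prompt(llm_provider_ollama())
  # Returns the full LLM response (e.g. the 'NO ANSWER' reply) instead of NULL
} # }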