
Make LLM answer as a boolean (TRUE or FALSE)

Usage

answer_as_boolean(
  prompt,
  true_definition = NULL,
  false_definition = NULL,
  add_instruction_to_prompt = TRUE
)

Arguments

prompt

A single string or a tidyprompt() object

true_definition

(optional) Definition of what would constitute TRUE. This will be included in the instruction to the LLM. Should be a single string

false_definition

(optional) Definition of what would constitute FALSE. This will be included in the instruction to the LLM. Should be a single string (see the sketch after this argument list for example definitions)

add_instruction_to_prompt

(optional) Whether to add an instruction to the prompt text asking for a reply as a boolean. Set to FALSE for debugging, to check that extraction and validation work as expected (without the instruction, the answer should fail the validation function, initiating a retry)
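
For example, the two definitions can be supplied as short strings to steer the LLM's interpretation. The sketch below is illustrative only: the prompt text and both definition strings are invented, and the Ollama provider is simply the one used in the example further down; any supported provider could be substituted.

"Did the customer express dissatisfaction in this review?" |>
  answer_as_boolean(
    true_definition = "the review contains an explicit complaint",
    false_definition = "the review contains no explicit complaint"
  ) |>
  send_prompt(llm_provider_ollama())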

Value

A tidyprompt() object with an added prompt_wrap() that will ensure the LLM response is a boolean (returned as a logical TRUE or FALSE)

Examples

if (FALSE) { # \dontrun{
  "Are you a large language model?" |>
    answer_as_boolean() |>
    send_prompt(llm_provider_ollama())
  # --- Sending request to LLM provider (llama3.1:8b): ---
  #   Are you a large language model?
  #
  #   You must answer with only TRUE or FALSE (use no other characters).
  # --- Receiving response from LLM provider: ---
  #   TRUE
  # [1] TRUE
} # }
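
The add_instruction_to_prompt argument can be illustrated with a similar sketch. This is not verified output: without the instruction, the model will typically answer in free text, which should fail the TRUE/FALSE validation and initiate a retry, as described under the argument above.

if (FALSE) { # \dontrun{
  "Are you a large language model?" |>
    answer_as_boolean(add_instruction_to_prompt = FALSE) |>
    send_prompt(llm_provider_ollama())
  # The free-text answer is expected to fail validation, triggering a retry
} # }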