Make LLM answer as a constrained text response

Usage

answer_as_text(
  prompt,
  max_words = NULL,
  max_characters = NULL,
  add_instruction_to_prompt = TRUE
)

Arguments

prompt

A single string or a tidyprompt() object

max_words

(optional) Maximum number of words allowed in the response. If specified, a response exceeding this limit will fail validation

max_characters

(optional) Maximum number of characters allowed in the response. If specified, a response exceeding this limit will fail validation

add_instruction_to_prompt

(optional) Whether to add an instruction to the prompt text asking the LLM to reply within the constraints. Set to FALSE to debug whether the extraction/validation functions are working as expected (without the instruction, the response should fail validation, triggering a retry)

Value

A tidyprompt() with an added prompt_wrap() which will ensure that the LLM response conforms to the specified constraints

Examples

if (FALSE) { # \dontrun{
  "What is a large language model?" |>
    answer_as_text(max_words = 10) |>
    send_prompt()
  # --- Sending request to LLM provider (llama3.1:8b): ---
  # What is a large language model?
  #
  # You must provide a text response. The response must be at most 10 words.
  # --- Receiving response from LLM provider: ---
  # A type of AI that processes and generates human-like text.
  # [1] "A type of AI that processes and generates human-like text."
} # }
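
The word and character constraints can also be combined in a single call. The sketch below assumes the same tidyprompt setup as the example above (a configured default LLM provider); the exact response text will vary by model:

```r
if (FALSE) { # \dontrun{
  # Constrain both word count and character count; whichever
  # limit the response exceeds will cause validation to fail,
  # prompting tidyprompt to retry the request
  "Define 'tokenization' in the context of language models." |>
    answer_as_text(max_words = 15, max_characters = 100) |>
    send_prompt()
} # }
```

Because validation happens after extraction, an over-long response is not truncated; instead the LLM is re-prompted until it produces an answer within the limits (or the retry budget is exhausted).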