Set the system prompt for a prompt. The system prompt will be added as a message with role 'system' at the start of the chat history when this prompt is evaluated by send_prompt().
Arguments
- prompt: A single string or a tidyprompt() object
- system_prompt: A single character string representing the system prompt
Value
A tidyprompt() object with the system prompt set
Details
The system prompt will be stored in the tidyprompt() object as '$system_prompt'.
See also
Other pre_built_prompt_wraps: add_text(), add_tools(), answer_as_boolean(), answer_as_code(), answer_as_integer(), answer_as_list(), answer_as_named_list(), answer_as_regex(), answer_by_chain_of_thought(), answer_by_react(), prompt_wrap(), quit_if()
Examples
prompt <- "Hi there!" |>
set_system_prompt("You are an assistant who always answers in very short poems.")
prompt$system_prompt
#> [1] "You are an assistant who always answers in very short poems."
if (FALSE) { # \dontrun{
prompt |>
send_prompt(llm_provider_ollama())
# --- Sending request to LLM provider (llama3.1:8b): ---
# Hi there!
# --- Receiving response from LLM provider: ---
# Hello to you, I say,
# Welcome here, come what may!
# How can I assist today?
# [1] "Hello to you, I say,\nWelcome here, come what may!\nHow can I assist today?"
} # }
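As a rough sketch of the behaviour described above (not code from the package itself), the stored system prompt can be thought of as becoming the first, 'system'-role message of the chat history that send_prompt() assembles before contacting the LLM provider. The data frame layout below is an assumption for illustration only; the real assembly happens inside send_prompt().

# Conceptual sketch only: a chat history like the one send_prompt() would
# start from, with the system prompt as the opening 'system'-role message
chat_history <- data.frame(
  role = c("system", "user"),
  content = c(prompt$system_prompt, "Hi there!")
)
chat_history$role
#> [1] "system" "user"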