Package index
- tidyprompt() - Create a tidyprompt object
- tidyprompt-class - Tidyprompt R6 Class
- is_tidyprompt() - Check if an object is a tidyprompt object
- construct_prompt_text() - Construct prompt text from a tidyprompt object
- get_prompt_wraps() - Get prompt wraps from a tidyprompt object
- get_chat_history() - Get the chat history of a tidyprompt object
- set_chat_history() - Set the chat history of a tidyprompt object
- set_system_prompt() - Set the system prompt of a tidyprompt object
- prompt_wrap() - Wrap a prompt with functions for modification and handling of the LLM response
- provider_prompt_wrap() (experimental) - Create a provider-level prompt wrap
- llm_feedback() - Create an llm_feedback object
- llm_break() - Create an llm_break object
- llm_break_soft() - Create an llm_break_soft object
- send_prompt() - Send a prompt to an LLM provider
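Taken together, these functions form the core evaluation loop: prompt_wrap() attaches modification and validation logic to a prompt, and send_prompt() runs it against a provider, feeding the text of an llm_feedback() object back to the model when validation fails. A minimal sketch (the wrap's argument names follow my reading of the API, and a locally running Ollama instance is assumed; treat both as assumptions):

```r
library(tidyprompt)

prompt <- "Name a famous Dutch painter." |>
  prompt_wrap(
    # Appended instruction before sending
    modify_fn = function(text) paste(text, "Reply with only the name."),
    # Returning llm_feedback() triggers a retry with this message
    validation_fn = function(response) {
      if (nchar(response) > 50)
        return(llm_feedback("Reply with only the name, nothing else."))
      TRUE  # response accepted
    }
  )

# Requires a running Ollama server with a default model configured
result <- prompt |> send_prompt(llm_provider_ollama())
```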
- answer_as_boolean() - Make LLM answer as a boolean (TRUE or FALSE)
- answer_as_category() - Make LLM answer as a category
- answer_as_integer() - Make LLM answer as an integer (between min and max)
- answer_as_json() - Make LLM answer as JSON (with optional schema; structured output)
- answer_as_key_value() - Make LLM answer as a list of key-value pairs
- answer_as_list() - Make LLM answer as a list of items
- answer_as_multi_category() - Build a prompt for categorizing a text into multiple categories
- answer_as_named_list() - Make LLM answer as a named list
- answer_as_regex_match() - Make LLM answer match a specific regex
- answer_as_text() - Make LLM answer as a constrained text response
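Each answer_as_*() function is itself a prompt wrap, so output constraints can be piped onto a prompt before it is sent. A sketch assuming an OpenAI API key is available in the environment (the min/max arguments match the description above; the exact return type of send_prompt() is not shown here):

```r
library(tidyprompt)

# Constrain the reply to an integer in [0, 100]; send_prompt()
# extracts and validates the number, retrying with feedback if the
# model answers outside the allowed range or with non-numeric text
"What percentage of the Earth's surface is covered by water?" |>
  answer_as_integer(min = 0, max = 100) |>
  send_prompt(llm_provider_openai())
```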
- answer_by_chain_of_thought() - Set chain-of-thought mode for a prompt
- answer_by_react() - Set ReAct mode for a prompt
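These reasoning modes are also prompt wraps, so they compose with the answer_as_*() constraints. A hedged sketch (a local Ollama provider is an assumption; any provider should work):

```r
library(tidyprompt)

# Chain-of-thought wraps the prompt with instructions to reason
# step by step before giving a final answer; the integer constraint
# then extracts only the final number
"A farmer has 15 sheep and buys 8 more. How many does he have?" |>
  answer_by_chain_of_thought() |>
  answer_as_integer() |>
  send_prompt(llm_provider_ollama())
```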
- answer_using_r() - Enable LLM to draft and execute R code
- answer_using_sql() - Enable LLM to draft and execute SQL queries on a database
- answer_using_tools() - Enable LLM to call R functions (and/or MCP server tools)
- tools_add_docs() - Add tidyprompt function documentation to a function
- tools_get_docs() - Extract documentation from a function
- add_text() - Add text to a tidyprompt
- quit_if() - Make evaluation of a prompt stop if the LLM gives a specific response
- user_verify() - Have the user check the result of a prompt (human-in-the-loop)
- llm_verify() - Have an LLM check the result of a prompt (LLM-in-the-loop)
- llm_provider-class - LlmProvider R6 Class
- llm_provider_ellmer() (experimental) - Create a new LLM provider from an ellmer::chat() object
- llm_provider_google_gemini() (superseded) - Create a new Google Gemini LLM provider
- llm_provider_groq() - Create a new Groq LLM provider
- llm_provider_mistral() - Create a new Mistral LLM provider
- llm_provider_ollama() - Create a new Ollama LLM provider
- llm_provider_openai() - Create a new OpenAI LLM provider
- llm_provider_openrouter() - Create a new OpenRouter LLM provider
- llm_provider_xai() - Create a new xAI (Grok) LLM provider
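The providers share one interface, so a configured prompt can be sent to any of them interchangeably; construction differs only in endpoint and credentials. A sketch (the `parameters` list and the environment-variable convention for API keys are assumptions based on the constructors listed above; the model name is illustrative):

```r
library(tidyprompt)

# Hosted provider: assumes an API key in the environment
# (e.g. OPENAI_API_KEY) and accepts model settings via `parameters`
openai <- llm_provider_openai(
  parameters = list(model = "gpt-4o-mini")  # illustrative model name
)

# Local provider: no API key needed, talks to a running Ollama server
ollama <- llm_provider_ollama()

# The same prompt works against either provider
"Hi there!" |> send_prompt(ollama)
```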
- chat_history() - Create or validate a chat_history object
- add_msg_to_chat_history() - Add a message to a chat history
- persistent_chat-class - PersistentChat R6 class
- df_to_string() - Convert a dataframe to a string representation
- vector_list_to_string() - Convert a named or unnamed list/vector to a string representation
- skim_with_labels_and_levels() - Skim a dataframe and include labels and levels
- extract_from_return_list() - Extract a specific element from a list
- r_json_schema_to_example() - Generate an example object from a JSON schema