knitr::opts_chunk$set(
collapse = TRUE, comment = "#>",
eval = identical(tolower(Sys.getenv("LLMR_RUN_VIGNETTES", "false")), "true")
)
This vignette shows basic chat usage with four providers and model names:

- OpenAI: gpt-5-nano
- Anthropic: claude-sonnet-4-20250514
- Gemini: gemini-2.5-flash
- Groq: openai/gpt-oss-20b
You will need API keys in these environment variables: OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, GROQ_API_KEY.
To run these examples locally, set a local flag:

- Sys.setenv(LLMR_RUN_VIGNETTES = "true")
- or add LLMR_RUN_VIGNETTES=true to ~/.Renviron
Chat sessions remember context automatically:
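As a minimal sketch of that behavior: the snippet below builds a config and opens a session, then asks a follow-up that only makes sense if earlier turns are retained. The `llm_config()` and `chat_session()` helper names, their parameter names, and the `$send()` method are assumptions about the LLMR API; `chat_oai` is the session object reused in the structured-output example below.

```r
# Sketch (assumed LLMR API): create a provider config and a chat session.
library(LLMR)

cfg_oai <- llm_config(
  provider = "openai",
  model    = "gpt-5-nano",
  api_key  = Sys.getenv("OPENAI_API_KEY")   # key read from the environment
)

chat_oai <- chat_session(cfg_oai)

# The session keeps prior turns, so the second question can rely on the first.
chat_oai$send("My favorite color is teal. Remember that.")
chat_oai$send("What is my favorite color?")
```

The same pattern applies to the other providers: swap the `provider`, `model`, and API-key variable for Anthropic, Gemini, or Groq.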
schema <- list(
type = "object",
properties = list(
answer = list(type = "string"),
confidence = list(type = "number")
),
required = list("answer", "confidence"),
additionalProperties = FALSE
)
chat_oai$send_structured(
"Return an answer and a confidence score (0-1) about: Why is the sky blue?",
schema
)