Unified interface for creating LLM and Agent objects, generating responses, and performing batch inference. Built on a type-checked and validated S7 backend. Features reasoning, structured output, memory management, and tool use. Supports Ollama, OpenAI-compatible, and Anthropic-compatible endpoints.
| Feature | LLM | Agent |
|---|---|---|
| Reasoning | ✓ | ✓ |
| Structured output | ✓ | ✓ |
| Tool use | ✗ | ✓ |
| Memory management | ✗ | ✓ |
| Batch generation | ✓ | ✓ |
Install from r-universe:

```r
pak::repo_add(myuniverse = "https://rtemis-org.r-universe.dev")
pak::pak("rtemis.llm")
```

Alternatively, install from GitHub:

```r
pak::pak("rtemis-org/llm")
```

For detailed documentation, see the rtemis.llm documentation.
```r
library(rtemis.llm)

# List available Ollama models
ollama_list_models()

# Create an LLM object
llm <- create_Ollama(
  model_name = "gemma4:26b",
  system_prompt = "You are a meticulous research assistant.",
  temperature = 0.3
)

# Generate a response
generate(llm, "What is the role of the telomere?")
```
```r
# Create an Agent object
agent <- create_agent(
  llmconfig = config_Ollama(
    model_name = "gemma4:26b",
    temperature = 0.3
  ),
  system_prompt = "You are a meticulous research assistant.",
  name = "Kaimana"
)

# Generate a response
generate(agent, "Explain quantum superposition in seven bullet points.")
```