Using Ellmer Chat Models

There are two ways to use ellmer chat models for batch processing: 1) pass a chat model object, or 2) pass a chat model function. Passing an object is the more flexible approach because the configured model can be reused.

Method 1: Passing an Object

The first method is to pass a chat model object. This is useful when you want to reuse an existing model configuration:

library(hellmer)

openai <- chat_openai(
  model = "o3-mini",
  system_prompt = "Reply concisely, one sentence"
)

chat <- chat_sequential(openai)

Method 2: Passing a Function

The second method is to pass an ellmer chat model function directly. This method is retained mainly for backward compatibility, though you may prefer it if you only need a model once or would rather not nest function calls.

chat <- chat_sequential(
  chat_openai,
  model = "o3-mini",
  system_prompt = "Reply concisely, one sentence"
)
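With either method, the resulting `chat` object can then be used to process a list of prompts. The sketch below assumes the batch interface described in the hellmer documentation (`$batch()`, `$texts()`, and `$chats()`); check your installed version's help pages for the exact method names:

```r
# A hypothetical list of prompts to process in sequence
prompts <- list(
  "What is R?",
  "Explain vectorization in R",
  "Explain recursion in R"
)

# Send each prompt to the model, one at a time
result <- chat$batch(prompts)

# Extract only the text of each response
result$texts()

# Or retrieve the full chat object for each prompt
result$chats()
```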
