OpenAI provides a number of chat-based models, mostly under the ChatGPT brand. Note that a ChatGPT Plus membership does not grant access to the API; you will need to sign up for a developer account (and pay for it) at the developer platform, https://platform.openai.com.

chat_openai(
  system_prompt = NULL,
  base_url = "https://api.openai.com/v1",
  api_key = openai_key(),
  model = NULL,
  params = NULL,
  seed = lifecycle::deprecated(),
  api_args = list(),
  echo = c("none", "output", "all")
)

models_openai(base_url = "https://api.openai.com/v1", api_key = openai_key())

Arguments

system_prompt

A system prompt to set the behavior of the assistant.

base_url

The base URL to the endpoint; the default uses OpenAI.

api_key

API key to use for authentication.

You generally should not supply this directly, but instead set the OPENAI_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ().
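For example, a minimal sketch of setting the key via .Renviron (the key shown is a placeholder, not a real value):

# Opens your user-level .Renviron for editing
usethis::edit_r_environ()
# Add a line like the following, save, and restart R:
# OPENAI_API_KEY=sk-your-key-here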

model

The model to use for the chat (defaults to "gpt-4.1"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use. Use models_openai() to see all options.
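For example, you might list the available models and then pin one explicitly (the model name below simply repeats the documented default, for illustration):

# See which models your account can access
models_openai()
# Pin a specific model rather than relying on the changing default
chat <- chat_openai(model = "gpt-4.1")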

params

Common model parameters, usually created by params().
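A minimal sketch, assuming params() accepts temperature and seed (see ?params for the full set of supported parameters):

chat <- chat_openai(
  model = "gpt-4.1",
  params = params(temperature = 0.2, seed = 123)
)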

seed

Deprecated. Optional integer seed that ChatGPT uses to try and make output more reproducible; set it through params() instead (e.g. params(seed = 123)).

api_args

Named list of arbitrary extra arguments appended to the body of every chat API call. These are combined with the body generated by ellmer using modifyList().
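For example, to pass an extra body field that ellmer does not expose directly (the "user" field is part of the OpenAI chat API; treat it as illustrative):

chat <- chat_openai(
  api_args = list(user = "my-app-id")
)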

echo

One of the following options:

  • none: don't emit any output (default when running in a function).

  • output: echo text output as it streams in (default when running at the console).

  • all: echo all input and output.

Note this only affects the chat() method.
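For example:

# Echo both the prompt and the streamed response
chat <- chat_openai(echo = "all")
chat$chat("Write a haiku about data frames.")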

Value

A Chat object.

Examples

if (FALSE) { # has_credentials("openai")
chat <- chat_openai()
chat$chat("
  What is the difference between a tibble and a data frame?
  Answer with a bulleted list
")

chat$chat("Tell me three funny jokes about statisticians")
}