Cloudflare Workers AI hosts a variety of open-source AI models. To use the Cloudflare API, you must have an Account ID and an API token, which you can obtain by following these instructions.
chat_cloudflare(
account = cloudflare_account(),
system_prompt = NULL,
params = NULL,
api_key = cloudflare_key(),
model = NULL,
api_args = list(),
echo = NULL
)
The Cloudflare account ID. Taken from the CLOUDFLARE_ACCOUNT_ID environment variable, if defined.
A system prompt to set the behavior of the assistant.
Common model parameters, usually created by params().
The API key to use for authentication. You generally should not supply this directly; instead, set the CLOUDFLARE_API_KEY environment variable.
The model to use for the chat (defaults to "meta-llama/Llama-3.3-70b-instruct-fp8-fast"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
One of the following options:
none: don't emit any output (default when running in a function).
text: echo text output as it streams in (default when running at the console).
all: echo all input and output.
Note this only affects the chat() method.
A Chat object.
Other chatbots: chat_anthropic(), chat_aws_bedrock(), chat_azure_openai(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity(), chat_portkey()
# Not run: requires a Cloudflare account and API token
chat <- chat_cloudflare()
chat$chat("Tell me three jokes about statisticians")
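Since the default model changes over time, a pinned configuration is safer for anything beyond casual use. A minimal sketch, assuming CLOUDFLARE_ACCOUNT_ID and CLOUDFLARE_API_KEY are already set in the environment (for example via an .Renviron file); the model name and parameter values below are illustrative:

```r
library(ellmer)

# Pin the model explicitly rather than relying on the changing default,
# and set common sampling parameters via params().
chat <- chat_cloudflare(
  system_prompt = "You are a terse assistant.",
  model = "meta-llama/Llama-3.3-70b-instruct-fp8-fast",
  params = params(temperature = 0.2)
)
chat$chat("Summarise Workers AI in one sentence.")
```

Pinning the model and temperature makes the chat's behaviour reproducible across ellmer releases, which otherwise may silently switch the default model.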