PortkeyAI provides an AI Gateway that connects to a variety of LLM providers through a single Universal API endpoint.
chat_portkey(
  system_prompt = NULL,
  base_url = "https://api.portkey.ai/v1",
  api_key = portkeyai_key(),
  virtual_key = NULL,
  model = NULL,
  params = NULL,
  api_args = list(),
  echo = NULL
)
models_portkey(
  base_url = "https://api.portkey.ai/v1",
  api_key = portkeyai_key(),
  virtual_key = NULL
)
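As a minimal sketch (assuming PORTKEY_API_KEY is set and a virtual key is already configured in Portkey), you can list the models available through your gateway before starting a chat:

library(ellmer)
# Models exposed through the gateway; the virtual key identifies the provider account.
models <- models_portkey(virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"))
head(models)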
system_prompt: A system prompt to set the behavior of the assistant.
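For example (the virtual key and prompt text are placeholders), a sketch of setting a system prompt at connection time:

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  system_prompt = "You are a terse assistant who answers in one sentence."
)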
base_url: The base URL to the endpoint; the default uses Portkey's AI Gateway.
api_key: API key to use for authentication. You generally should not supply this directly, but instead set the PORTKEY_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ().
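A minimal sketch of the recommended setup (the key value is a placeholder):

# Opens ~/.Renviron; add a line such as PORTKEY_API_KEY=<your Portkey API key>
usethis::edit_r_environ()
# For a quick interactive session only, you can instead set it in-process:
Sys.setenv(PORTKEY_API_KEY = "<your Portkey API key>")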
virtual_key: A virtual key that stores the LLM provider's API key. See the Portkey documentation for details.
model: The model to use for the chat (defaults to "gpt-4o"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use. Use models_portkey() to see all options.
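For example, a sketch of pinning a model explicitly (the model name is illustrative):

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  model = "gpt-4o"  # pin a model rather than relying on the changing default
)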
params: Common model parameters, usually created by params().
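For instance, a hedged sketch of lowering the sampling temperature via params():

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  params = params(temperature = 0.2)
)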
api_args: Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
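As an illustration (the extra body field here is a provider-specific assumption, not an ellmer argument):

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  # Merged into every request body via modifyList()
  api_args = list(user = "my-team-id")
)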
echo: One of the following options:
- none: don't emit any output (default when running in a function).
- text: echo text output as it streams in (default when running at the console).
- all: echo all input and output.
Note this only affects the chat() method.
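For example, to stream responses to the console even when the chat runs inside a function:

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  echo = "text"
)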
chat_portkey() returns a Chat object.
Other chatbots: chat_anthropic(), chat_aws_bedrock(), chat_azure_openai(), chat_cloudflare(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity()
chat <- chat_portkey(virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"))
chat$chat("Tell me three jokes about statisticians")