The Snowflake provider allows you to interact with the LLMs available through the Cortex LLM REST API.
chat_snowflake() picks up the following ambient Snowflake credentials:

- A static OAuth token defined via the SNOWFLAKE_TOKEN environment variable.
- Key-pair authentication credentials defined via the SNOWFLAKE_USER and SNOWFLAKE_PRIVATE_KEY (which can be a PEM-encoded private key or a path to one) environment variables.
- Posit Workbench-managed Snowflake credentials for the corresponding account.
- Viewer-based credentials on Posit Connect. Requires the connectcreds package.
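For example, a minimal sketch of the first option, relying on a static token (the values below are placeholders, not working credentials):

library(ellmer)

# Placeholder values; substitute a real OAuth token and your account identifier.
Sys.setenv(SNOWFLAKE_TOKEN = "<your-oauth-token>")
Sys.setenv(SNOWFLAKE_ACCOUNT = "testorg-test_account")

chat <- chat_snowflake()
chat$chat("What can you tell me about Snowflake Cortex?")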
Note that Snowflake-hosted models do not support images or tool calling.
See chat_cortex_analyst() to chat with the Snowflake Cortex Analyst rather than a general-purpose model.
chat_snowflake() takes the following arguments:

system_prompt: A system prompt to set the behavior of the assistant.
account: A Snowflake account identifier, e.g. "testorg-test_account". Defaults to the value of the SNOWFLAKE_ACCOUNT environment variable.
credentials: A list of authentication headers to pass into httr2::req_headers(), a function that returns them when called, or NULL, the default, to use ambient credentials.
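As an illustrative sketch of the function form, assuming a token is available in SNOWFLAKE_TOKEN (the header names follow Snowflake's REST API conventions but are an assumption here; check the Cortex LLM REST API documentation for the exact requirements):

# Return headers from a function so the token is read fresh on each request.
# Accepting ... keeps the sketch robust to however ellmer calls it.
token_headers <- function(...) {
  list(
    Authorization = paste("Bearer", Sys.getenv("SNOWFLAKE_TOKEN")),
    `X-Snowflake-Authorization-Token-Type` = "OAUTH"
  )
}

chat <- chat_snowflake(credentials = token_headers)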
model: The model to use for the chat (defaults to "claude-3-7-sonnet"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
params: Common model parameters, usually created by params().
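For example, pinning the model and setting a couple of common parameters (the model name is the default documented above; the parameter values are arbitrary):

chat <- chat_snowflake(
  model = "claude-3-7-sonnet",                          # pin the model explicitly
  params = params(temperature = 0.2, max_tokens = 500)
)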
api_args: Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
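As a sketch, assuming top_p is a field accepted by the Cortex LLM REST API request body (an assumption, not something documented here):

# Names supplied via api_args are merged into the request body with modifyList().
chat <- chat_snowflake(
  api_args = list(top_p = 0.9)  # assumed Cortex request-body field
)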
echo: One of the following options:

- none: don't emit any output (default when running in a function).
- output: echo text and tool-calling output as it streams in (default when running at the console).
- all: echo all input and output.

Note this only affects the chat() method.
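For example, to see everything that is sent and received while experimenting at the console:

chat <- chat_snowflake(echo = "all")
chat$chat("Write a haiku about data warehouses.")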
chat_snowflake() returns a Chat object.
if (FALSE) { # has_credentials("cortex")
chat <- chat_snowflake()
chat$chat("Tell me a joke in the form of a SQL query.")
}