AWS Bedrock provides access to a number of language models, including Anthropic's Claude models, via the Bedrock Converse API.
Authentication is handled through {paws.common}, so if authentication
does not work for you automatically, you'll need to follow the advice
at https://www.paws-r-sdk.com/#credentials. In particular, if your
org uses AWS SSO, you'll need to run aws sso login
at the terminal.
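For example, the SSO login step mentioned above looks like this at the terminal (the profile name "my-profile" is a placeholder; use the profile configured for your org):

```shell
# Authenticate once per session when your org uses AWS SSO
aws sso login --profile my-profile
```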
chat_aws_bedrock(
system_prompt = NULL,
model = NULL,
profile = NULL,
api_args = list(),
echo = NULL
)
models_aws_bedrock(profile = NULL)
A system prompt to set the behavior of the assistant.
The model to use for the chat (defaults to "anthropic.claude-3-5-sonnet-20240620-v1:0").
We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
Use models_aws_bedrock()
to see all options.
While ellmer provides a default model, there's no guarantee that you'll
have access to it, so you'll need to specify a model that you can access.
If you're using cross-region inference,
you'll need to use the inference profile ID, e.g.
model = "us.anthropic.claude-3-5-sonnet-20240620-v1:0".
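A minimal sketch of specifying the model explicitly (requires valid AWS credentials, so the call will only succeed with access configured):

```r
library(ellmer)

# Recommended: pin an explicit model rather than relying on the default.
# With cross-region inference, pass the inference profile ID:
chat <- chat_aws_bedrock(
  model = "us.anthropic.claude-3-5-sonnet-20240620-v1:0"
)

# List the models available to your account:
models_aws_bedrock()
```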
AWS profile to use.
Named list of arbitrary extra arguments appended to the body of every chat API call.
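As a sketch of how api_args can be used (the inferenceConfig field comes from the Bedrock Converse API; check its documentation for the fields your model supports):

```r
library(ellmer)

# Append a temperature setting to the body of every Converse API call
chat <- chat_aws_bedrock(
  api_args = list(inferenceConfig = list(temperature = 0))
)
```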
One of the following options:

- "none": don't emit any output (default when running in a function).
- "text": echo text output as it streams in (default when running at the console).
- "all": echo all input and output.

Note this only affects the chat()
method.
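A sketch of the echo options in use (requires valid AWS credentials; output depends on the model):

```r
library(ellmer)

# Stream text as it arrives, even when called from a function
chat <- chat_aws_bedrock(echo = "text")
chat$chat("Say hello")
```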
A Chat object.
Other chatbots:
chat_anthropic(), chat_azure_openai(), chat_cloudflare(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity(), chat_portkey()
# Basic usage
chat <- chat_aws_bedrock()
chat$chat("Tell me three jokes about statisticians")