LLM Message Handling
Functions for creating, managing, and retrieving messages and metadata.
Create or Update Large Language Model Message Object |
|
Convert a Data Frame to an LLMMessage Object |
|
Retrieve Assistant Reply as Text |
|
Retrieve Assistant Reply as Structured Data |
|
Retrieve a User Message by Index |
|
Retrieve Metadata from Assistant Replies |
|
Retrieve Log Probabilities from Assistant Replies |
|
Get the current rate limit information for all or a specific API |
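
The message-handling helpers above compose with R's pipe. A minimal sketch, assuming the verbs behind these entries are `llm_message()`, `get_reply()`, and `get_metadata()` as documented in the package, and that a provider API key is configured in the environment:

```r
library(tidyllm)

# Build a conversation object, send it to a model, and pull the reply as text
conversation <- llm_message("What is the capital of France?") |>
  chat(openai(.model = "gpt-4o-mini"))

get_reply(conversation)       # assistant reply as a character string
get_metadata(conversation)    # token usage and model info per reply
```

The exact argument names (e.g. `.model`) are illustrative; consult the individual help pages listed above for the authoritative signatures.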
|
Tidyllm Main Verbs
Core verbs facilitating interactions with LLMs, including sending messages, generating embeddings, and managing batch requests.
|
Chat with a Language Model |
|
Generate text embeddings |
|
Send a batch of messages to a batch API |
|
Check Batch Processing Status |
|
Fetch Results from a Batch API |
|
List all Batch Requests on a Batch API |
|
List Available Models for a Provider |
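
The batch and embedding verbs follow the same pipeline style as `chat()`. A sketch of how they might chain, assuming `send_batch()`, `check_batch()`, `fetch_batch()`, and `embed()` are the verbs behind the entries above (treat names and arguments as illustrative):

```r
library(tidyllm)

# Submit several prompts at once to a provider's batch API
msgs  <- lapply(c("Summarise text A", "Summarise text B"), llm_message)
batch <- send_batch(msgs, claude())

check_batch(batch)            # poll processing status
results <- fetch_batch(batch) # retrieve completed replies once done

# Embeddings use the same verb-plus-provider style
embed(c("first text", "second text"), openai())
```

Batch APIs are asynchronous, so `fetch_batch()` only returns results after `check_batch()` reports completion.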
|
Schemata, Tools, and Media Handling
Functions for JSON schema construction, tool definitions, and media handling.
|
Create a JSON Schema for Structured Outputs |
|
Define Field Descriptors for JSON Schema |
|
Define a nested object field |
|
Create a Tool Definition for tidyllm |
|
Create an Image Object |
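
Schemas constrain a model's reply to a fixed structure that can then be parsed back into R. A sketch assuming the schema constructor and field helpers listed above are named `tidyllm_schema()` and `field_chr()`, with the schema passed to `chat()` via a `.json_schema` argument (all names illustrative):

```r
library(tidyllm)

# Define the expected shape of the structured output
address_schema <- tidyllm_schema(
  name   = "address",
  street = field_chr("Street name and number"),
  city   = field_chr("City"),
  zip    = field_chr("Postal code")
)

reply <- llm_message("Extract the address: 10 Downing St, London SW1A 2AA") |>
  chat(openai(), .json_schema = address_schema)

get_reply_data(reply)   # reply parsed into an R data structure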
|
API Provider Functions
Provider functions that the main verbs dispatch to when interfacing with the various APIs.
|
OpenAI Provider Function |
|
Provider Function for Claude models on the Anthropic API |
|
Google Gemini Provider Function |
|
Groq API Provider Function |
|
Mistral Provider Function |
|
Ollama API Provider Function |
|
Perplexity Provider Function |
|
Deepseek Provider Function |
|
Voyage Provider Function |
|
Azure OpenAI Endpoint Provider Function |
|
Alias for the OpenAI Provider Function |
|
OpenAI-Specific Functions
Functions for OpenAI services, including chat interactions, batch processing, and embedding generation.
|
Send LLM Messages to the OpenAI Chat Completions API |
|
Send a Batch of Messages to OpenAI Batch API |
|
Check Batch Processing Status for OpenAI Batch API |
|
Fetch Results for an OpenAI Batch |
|
List OpenAI Batch Requests |
|
Cancel an In-Progress OpenAI Batch |
|
Generate Embeddings Using OpenAI API |
|
List Available Models from the OpenAI API |
|
Claude-Specific Functions
Functions for Claude services, covering chat interactions and batch processing.
|
Interact with Claude AI models via the Anthropic API |
|
Send a Batch of Messages to Claude API |
|
Check Batch Processing Status for Claude API |
|
Fetch Results for a Claude Batch |
|
List Claude Batch Requests |
|
List Available Models from the Anthropic Claude API |
|
Upload a File to Claude API |
|
Delete a File from Claude API |
|
Retrieve Metadata for a File from Claude API |
|
List Files in Claude API |
|
Gemini-Specific Functions
Functions specific to Google Gemini services, including chat, embedding, and file management operations.
|
Send LLMMessage to Gemini API |
|
Generate Embeddings Using the Google Gemini API |
|
Upload a File to Gemini API |
|
List Files in Gemini API |
|
Retrieve Metadata for a File from Gemini API |
|
Delete a File from Gemini API |
|
Submit a list of LLMMessage objects to Gemini's batch API |
|
Check the Status of a Gemini Batch Operation |
|
List Recent Gemini Batch Operations |
|
Fetch Results for a Gemini Batch |
|
Ollama-Specific Functions
Functions for engaging with Ollama services, including chat, embedding, and model management.
|
Interact with local AI models via the Ollama API |
|
Generate Embeddings Using Ollama API |
|
Send a Batch of Messages to Ollama API |
|
Download a model from the Ollama API |
|
Delete a model from the Ollama API |
|
Retrieve and return model information from the Ollama API |
|
Mistral-Specific Functions
Functions for interacting with the Mistral API.
|
Send LLMMessage to Mistral API |
|
Generate Embeddings Using Mistral API |
|
Send a Batch of Requests to the Mistral API |
|
Check Batch Processing Status for Mistral Batch API |
|
Fetch Results for a Mistral Batch
|
List Mistral Batch Requests |
|
List Available Models from the Mistral API |
|
Perplexity-Specific Functions
Functions for the Perplexity API.
|
Send LLM Messages to the Perplexity Chat API (All Features, No .json Option) |
|
Groq-Specific Functions
Functions for interacting with Groq services, such as chat and transcription.
|
Send LLM Messages to the Groq Chat API |
|
Transcribe an Audio File Using Groq transcription API |
|
List Available Models from the Groq API |
|
Send a Batch of Messages to the Groq API |
|
Check Batch Processing Status for Groq API |
|
Fetch Results for a Groq Batch |
|
List Groq Batch Requests |
|
Azure-Specific Functions
Functions for interacting with the Azure OpenAI API.
|
Send LLM Messages to an Azure OpenAI Chat Completions endpoint |
|
Generate Embeddings Using OpenAI API on Azure |
|
Send a Batch of Messages to Azure OpenAI Batch API |
|
Check Batch Processing Status for Azure OpenAI Batch API |
|
List Azure OpenAI Batch Requests |
|
Fetch Results for an Azure OpenAI Batch |
|
DeepSeek-Specific Functions
Functions for DeepSeek.
|
Send LLM Messages to the DeepSeek Chat API |
|
VoyageAI-Specific Functions
Functions for VoyageAI embeddings.
|
Generate Embeddings Using Voyage AI API |
|
PDF Processing
Functions for processing PDF documents, enabling page-by-page analysis.
|
Batch Process PDF into LLM Messages |
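
A sketch of page-by-page PDF analysis, assuming the function behind this entry is `pdf_page_batch()` and accepts a prompt applied to every page (names and arguments illustrative; see the help page for the actual interface):

```r
library(tidyllm)

# Turn each page of a PDF into its own LLMMessage, then chat over each page
page_msgs <- pdf_page_batch("report.pdf",
                            .general_prompt = "Summarise this page.")
summaries <- lapply(page_msgs, \(m) chat(m, gemini()))
```

Because each page becomes an independent message, the resulting list also works with the batch verbs for large documents.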
|
Internals
The definition of the LLMMessage object at the core of most tidyllm workflows.
|
Large Language Model Message Class |
|