Coverage of Model Providers

PromptingTools.jl routes AI calls through subtypes of `AbstractPromptSchema`, which determine how data is formatted and where it is sent. For example, OpenAI models use subtypes of `AbstractOpenAISchema` (e.g., `OpenAISchema`, `CustomOpenAISchema`). This ensures that requests are correctly formatted for each specific AI model provider.
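As a minimal sketch of this dispatch mechanism (assuming an `OPENAI_API_KEY` is configured and using a placeholder model name), the schema is simply the optional first positional argument of functions like `aigenerate`:

```julia
using PromptingTools
const PT = PromptingTools

# Default behavior: the schema is resolved from the model registry,
# so a plain call uses OpenAISchema under the hood for OpenAI models.
msg = aigenerate("What is the capital of France?"; model = "gpt-4o-mini")

# Equivalent explicit form: pass the schema as the first argument to
# force a particular provider's wire format.
msg = aigenerate(PT.OpenAISchema(), "What is the capital of France?";
    model = "gpt-4o-mini")
```

Both calls produce the same request; the explicit form matters when you target a non-default provider with the same prompt.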

Below is an overview of the model providers supported by PromptingTools.jl, along with the corresponding schema information. Support for the individual functions (`aigenerate`, `aiembed`, `aiextract`, `aiscan`, `aiimage`, `aiclassify`) varies by provider.

| Abstract Schema | Schema | Model Provider |
|---|---|---|
| AbstractOpenAISchema | OpenAISchema | OpenAI |
| AbstractOpenAISchema | CustomOpenAISchema* | Any OpenAI-compatible API (e.g., vLLM)* |
| AbstractOpenAISchema | LocalServerOpenAISchema** | Any OpenAI-compatible local server** |
| AbstractOpenAISchema | MistralOpenAISchema | Mistral AI |
| AbstractOpenAISchema | DatabricksOpenAISchema | Databricks |
| AbstractOpenAISchema | FireworksOpenAISchema | Fireworks AI |
| AbstractOpenAISchema | TogetherOpenAISchema | Together AI |
| AbstractOpenAISchema | GroqOpenAISchema | Groq |
| AbstractOllamaSchema | OllamaSchema | Ollama (endpoint `api/chat`) |
| AbstractManagedSchema | AbstractOllamaManagedSchema | Ollama (endpoint `api/generate`) |
| AbstractAnthropicSchema | AnthropicSchema | Anthropic |
| AbstractGoogleSchema | GoogleSchema | Google Gemini |
\* Catch-all implementation: requires providing a `url` via `api_kwargs` and a corresponding API key.
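A short sketch of the catch-all usage, assuming a hypothetical local vLLM deployment (the model name, URL, and key below are placeholders for your own server):

```julia
using PromptingTools
const PT = PromptingTools

# Point the catch-all schema at any OpenAI-compatible endpoint.
msg = aigenerate(PT.CustomOpenAISchema(), "Say hi!";
    model = "my-hosted-model",                      # model name as registered on the server
    api_key = "sk-placeholder",                     # whatever credential your provider issued
    api_kwargs = (; url = "http://localhost:8081")) # endpoint passed via api_kwargs
```

The `url` in `api_kwargs` is what distinguishes this schema: it lets one code path talk to any server that speaks the OpenAI chat API.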

** This schema is a flavor of `CustomOpenAISchema` with the `url` preset by the global preference key `LOCAL_SERVER`. It is designed for seamless integration with Llama.jl and reads the URL from a preference/ENV variable rather than a keyword argument, which simplifies workflows where passing `api_kwargs` through nested calls is cumbersome.
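A minimal sketch of the local-server workflow, assuming a hypothetical OpenAI-compatible server on a placeholder port:

```julia
using PromptingTools
const PT = PromptingTools

# Persist the server URL once via the LOCAL_SERVER preference
# (stored in LocalPreferences.toml); the port/path are placeholders.
PT.set_preferences!("LOCAL_SERVER" => "http://localhost:10897/v1")

# Subsequent calls need no api_kwargs -- the url comes from the preference,
# which is what makes nested calls easier with this schema.
msg = aigenerate(PT.LocalServerOpenAISchema(), "Count to three.")
```

Because the URL lives in a preference rather than a call site, any library code that calls `aigenerate` internally will reach the same local server without plumbing `api_kwargs` through.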

Note: The `aiscan` and `aiimage` functions rely on specific endpoints being implemented by the provider. Ensure that the provider you choose supports these capabilities.

For more detailed explanations of the functions and schema information, refer to How It Works.