# Supported Models

## LangChain Chat Models
You can set `settings.llm` with any LangChain `ChatModel`:

```python
from funcchain import settings
from langchain_community.chat_models import AzureChatOpenAI

settings.llm = AzureChatOpenAI(...)
```
## String Model Identifiers
You can also set `settings.llm` with a string identifier of a ChatModel, including local models.

### Schema

```
<provider>/<model_name>:<optional_label>
```
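To make the schema concrete, here is a minimal sketch composing an identifier from its three parts; the values are taken from the examples below, and the commented-out `settings.llm` assignment shows where the string would go:

```python
# Compose a string identifier following <provider>/<model_name>:<optional_label>.
provider = "llamacpp"
model_name = "openchat-3.5-1210"
label = "Q3_K_L"  # optional quantization label

identifier = f"{provider}/{model_name}:{label}"
print(identifier)  # llamacpp/openchat-3.5-1210:Q3_K_L

# The resulting string can then be assigned directly:
# from funcchain import settings
# settings.llm = identifier
```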
### Providers

`openai`
: OpenAI Chat Models

`llamacpp`
: Run local models directly using llamacpp (aliases: `thebloke`, `gguf`)

`ollama`
: Run local models through Ollama (wrapper for llamacpp)

`azure`
: Azure Chat Models

`anthropic`
: Anthropic Chat Models

`google`
: Google Chat Models
### Examples

`openai/gpt-3.5-turbo`
: ChatGPT Classic

`openai/gpt-4-1106-preview`
: GPT-4-Turbo

`ollama/openchat`
: OpenChat3.5-1210

`ollama/openhermes2.5-mistral`
: OpenHermes 2.5

`llamacpp/openchat-3.5-1210`
: OpenChat3.5-1210

`TheBloke/Nous-Hermes-2-SOLAR-10.7B-GGUF`
: alias for `llamacpp/...`

`TheBloke/openchat-3.5-0106-GGUF:Q3_K_L`
: with Q label
### Additional Notes

Check out the file `src/funcchain/model/defaults.py` for the code that parses the string identifier.

Feel free to create a PR to add more models to the defaults. Or tell me how wrong I am and create a better system.