The ai module contains functions for working with local and remote AI models.
Models can be managed in Settings ▸ AI. See AI Models for details.
The following model names are supported:
To use local models, you need to install the model files in Settings ▸ AI.
llama2-7b-chat — Llama 2 7B-Chat, chat-tuned 7B-parameter transformer model for text generation (4-bit quantized).
openchat — OpenChat, chat-tuned 7B-parameter transformer model for text generation (4-bit quantized).
neural-chat-7b — Neural Chat 7B, chat-tuned 7B-parameter transformer model for text generation (4-bit quantized).
tinyllama-chat — TinyLlama-Chat, 1.1B-parameter transformer model for text generation (5-bit quantized).

To use OpenAI models, you need to set the OpenAI API key in Settings ▸ Keys.
gpt-5.2
gpt-5.2-codex
gpt-5-chat-latest
gpt-5-mini
gpt-5-nano
gpt-4.1
gpt-4o

To use Google AI models, you need to set the Google AI API key in Settings ▸ Keys.
gemini-3.1-pro-preview
gemini-3-flash-preview
gemini-2.5-flash
gemini-2.5-pro

To use OpenRouter models, you need to set the OpenRouter API key in Settings ▸ Keys.
openrouter:claude-opus-4.6
openrouter:claude-sonnet-4.6
openrouter:claude-opus-4.5
openrouter:claude-sonnet-4.5
openrouter:claude-haiku-4.5
openrouter:gemini-3.1-pro-preview
openrouter:kimi-k2.5
openrouter:free

To use DeepSeek models, you need to set the DeepSeek API key in Settings ▸ Keys.
deepseek-chat

To use TextSynth models, you need to set the TextSynth API key in Settings ▸ Keys.
textsynth-gptj — GPT-J 6B model via the TextSynth API.
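The naming conventions above determine which provider, and therefore which key in Settings ▸ Keys, a given model name uses. The following is a minimal sketch of that mapping; `provider_for` is a hypothetical helper written for illustration, not part of the ai module's API:

```python
def provider_for(model: str) -> str:
    """Infer which provider (and API key, if any) a model name implies,
    based on the naming conventions in the model list above.
    Hypothetical helper for illustration only."""
    if model.startswith("openrouter:"):
        return "OpenRouter"   # e.g. openrouter:free
    if model.startswith("gpt-"):
        return "OpenAI"       # e.g. gpt-4o
    if model.startswith("gemini-"):
        return "Google AI"    # e.g. gemini-2.5-pro
    if model.startswith("deepseek-"):
        return "DeepSeek"     # e.g. deepseek-chat
    if model.startswith("textsynth-"):
        return "TextSynth"    # e.g. textsynth-gptj
    # Anything else is a local model installed in Settings ▸ AI.
    return "local"
```

For example, `provider_for("openrouter:claude-haiku-4.5")` returns `"OpenRouter"`, while `provider_for("llama2-7b-chat")` returns `"local"`, meaning no API key is needed.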