Models
Tabnine provides various AI models for Tabnine Chat. Tabnine Enterprise admins can connect Tabnine to their internal endpoints to enrich the Tabnine Chat experience for users. This allows administrators to integrate their company's private LLM instances, making them accessible to the engineering team directly within Tabnine.
Private Model Endpoints Supported by Tabnine
Tabnine currently supports the following models for private endpoint connections:
- Claude 3.5 Sonnet
- Claude 3.7 Sonnet
- Claude 4 Sonnet
- Claude 4.5 Haiku
- Claude 4.5 Sonnet
- Gemini 2.0 Flash
- Gemini 2.5 Flash
- Gemini 2.5 Pro
- Gemini 3 Pro
- Gemma 3 27B
- GPT-4.1
- GPT-4o
- GPT-5
- GPT-OSS
- Llama 3.1 405B
- Llama 3.1 70B
- Llama 3.3 70B
- Mistral 7B
- Qwen
Integration Requirements for Models
Each provider requires the following connection details:
- Amazon Bedrock: Region, Access Key ID, Secret Access Key
- Azure: Azure endpoint, Key, Deployment ID
- OpenAI: Key, OpenAI endpoint
- OpenAI-compatible: Endpoint, Model name
- Google Vertex AI: Region, Project ID, Service Account
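For the OpenAI-compatible option, the private endpoint is expected to accept standard OpenAI-style chat completion requests. As a rough sketch of what such a request looks like (the endpoint URL and model name below are hypothetical placeholders, not values from this page):

```python
import json

# Hypothetical placeholders — substitute your own private endpoint URL
# and the "Model name" value you enter in the Tabnine console.
ENDPOINT = "https://llm.internal.example.com/v1/chat/completions"
MODEL_NAME = "my-private-model"

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": prompt}],
    }

# Print the JSON body that would be POSTed to ENDPOINT.
payload = build_chat_request("Explain this function.")
print(json.dumps(payload, indent=2))
```

If your endpoint responds correctly to a request like this, it should satisfy the OpenAI-compatible integration requirements above.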
Viewing and setting up the available chat models
Admins manage the chat models available to their account and set up private endpoints for them:
1. Sign in to the Tabnine console as an admin.
2. Go to the Models page under Settings.
3. Toggle a model on and enter the relevant provider settings for your private endpoint.
Setting the default chat model
Admins can set a model as the account's default chat model. The default model is preselected for all account users, but users can switch to any other available model at any time.
Change the account default model by expanding a specific model and toggling Set as default.