Models settings

Tabnine provides various AI models for Tabnine Chat. In private installations, Tabnine Enterprise admins can connect Tabnine to their organization's internal endpoints to enrich the Tabnine Chat experience for users. This lets administrators integrate their company's private LLM instances and make them available to the engineering team directly within Tabnine.

Note

Tabnine's code completions only use the Tabnine Universal code completions model, which is both private and protected.

Private model endpoints supported by Tabnine

Note

The list of supported models is updated frequently as new models become available.

Currently, Tabnine supports the following model providers for private endpoint connections:

Amazon Bedrock

Models: Claude 3.5 Sonnet

Integration requirements: Region, Access Key ID, Secret Access Key

Learn more
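To make the three Bedrock fields concrete, here is a hypothetical sketch of the request shape an integration like this typically issues, following the public Bedrock InvokeModel format for Anthropic models. The function and default model ID are illustrative assumptions, not Tabnine's internal implementation.

```python
import json

def build_bedrock_request(region: str, prompt: str,
                          model_id: str = "anthropic.claude-3-5-sonnet-20240620-v1:0"):
    """Return the endpoint URL and JSON body for a Claude 3.5 Sonnet call."""
    # The Region field from the console selects the Bedrock runtime endpoint.
    url = f"https://bedrock-runtime.{region}.amazonaws.com/model/{model_id}/invoke"
    body = {
        "anthropic_version": "bedrock-2023-06-01",  # required for Anthropic models on Bedrock
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }
    # The Access Key ID and Secret Access Key are used to SigV4-sign this
    # request (signing omitted here; an AWS SDK normally handles it).
    return url, json.dumps(body)

url, body = build_bedrock_request("us-east-1", "Explain recursion briefly.")
```

In practice the Region, Access Key ID, and Secret Access Key you enter in the console supply exactly the pieces this sketch leaves out: which regional endpoint to call and how to sign the call.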

Azure

Models: GPT Series

Integration requirements: Azure endpoint, Key, Deployment ID

Learn more
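The three Azure fields map directly onto an Azure OpenAI chat-completions call: the Azure endpoint forms the base URL, the Deployment ID selects the GPT deployment, and the Key is sent as the `api-key` header. The sketch below is a hypothetical illustration; the `api-version` value is an assumption and should match your Azure resource.

```python
import json

def build_azure_request(endpoint: str, key: str, deployment_id: str, prompt: str):
    # Deployment-scoped chat-completions path of the Azure OpenAI REST API.
    url = (f"{endpoint.rstrip('/')}/openai/deployments/{deployment_id}"
           f"/chat/completions?api-version=2024-02-01")
    headers = {"api-key": key, "Content-Type": "application/json"}
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]})
    return url, headers, body

url, headers, body = build_azure_request(
    "https://my-resource.openai.azure.com", "AZURE_KEY", "gpt-4o-deploy", "Hi")
```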

OpenAI

Models: GPT Series

Integration requirements: Key, OpenAI endpoint

Learn more
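For OpenAI, the Key becomes a `Bearer` token and the OpenAI endpoint is the base URL for the standard chat-completions path. This is a hypothetical sketch; the model name `gpt-4o` stands in for whichever GPT-series model you use.

```python
import json

def build_openai_request(endpoint: str, key: str, prompt: str, model: str = "gpt-4o"):
    url = f"{endpoint.rstrip('/')}/v1/chat/completions"
    # The Key field from the console is sent as a Bearer token.
    headers = {"Authorization": f"Bearer {key}", "Content-Type": "application/json"}
    body = json.dumps({"model": model,
                       "messages": [{"role": "user", "content": prompt}]})
    return url, headers, body

url, headers, body = build_openai_request("https://api.openai.com", "OPENAI_KEY", "Hi")
```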

OpenAI Compatible

Models: Llama 3.1 70B, Llama 3.1 405B

Integration requirements: Llama endpoint, OpenAI-compatible model name
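An OpenAI-compatible endpoint is any server that speaks the OpenAI chat-completions protocol (for example, a self-hosted vLLM instance serving Llama 3.1). The sketch below is hypothetical: the endpoint URL and model name are illustrative assumptions, and the model name you enter must match what your server actually serves.

```python
import json

def build_compatible_request(endpoint: str, model_name: str, prompt: str):
    # Same chat-completions path as OpenAI, but pointed at your own server.
    url = f"{endpoint.rstrip('/')}/v1/chat/completions"
    body = json.dumps({"model": model_name,
                       "messages": [{"role": "user", "content": prompt}]})
    return url, body

url, body = build_compatible_request(
    "https://llama.internal.example.com", "meta-llama/Llama-3.1-70B-Instruct", "Hi")
```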

Viewing and setting up the available chat models

Note

If this functionality isn't visible, we recommend contacting your dedicated account manager at Tabnine. They'll assist you in setting the available chat AI models for your team.

Admins manage the available chat models for their accounts and set up private endpoints for chat models:

  1. Sign in to the Tabnine console as an admin.

  2. Go to the Models page under Settings.

  3. Enable a model with the toggle and fill in the relevant provider settings for your private endpoint.

Setting the default chat model

Admins can set a model as the account's default chat model, which is the model account users start with. Users can still switch to any other available model.

To change the account default model, expand a specific model and click Set as default.
