Tabnine's Prompting Guide
You must be very specific and clear when prompting an LLM.
Defining context is critical when prompting Tabnine Chat.
Start a fresh conversation as appropriate
Starting a fresh conversation helps keep Tabnine Chat focused on the correct context when having a long or detailed conversation.
It is often helpful to provide the LLM with the necessary details to get quality output.
Tabnine Chat can be used to ask for examples of the desired output.
LLMs perform best when prompts are concise but complete.
Use examples of expected output
Providing Tabnine Chat with examples of expected output will improve the generated response.
Working with LLM-based products requires expertise. Models are becoming increasingly better at understanding and following instructions, even when the instructions are ambiguous. However, models are often sensitive to exact phrasing, and minor changes to the prompt and context can yield significantly different results.
The goal of this guide is to provide high-level guidance for prompting Tabnine Chat. Different backend models have slightly different sensitivities and may benefit from specific phrases or forms, but the guidance in this document should be practical and beneficial across all models.
Be Specific and Clear
Ask for a detailed explanation with examples
Here is the referenced function in the text editor:

Less effective prompt: Explain this function:

More effective prompt: Explain this function step by step, including an example:

Ask for alternative implementations as part of an explanation
Here, we ask Tabnine to Explain this function in a detailed and precise manner, including alternative implementations. Including the phrase "detailed and precise manner" in the prompt often helps Tabnine produce a more focused response.
Here is the referenced function in the text editor:

Part one of the response, using the prompt Explain this function in a detailed and precise manner, including alternative implementations:

Part two of the response:
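The response screenshots are not reproduced here. As an illustration only, here is a hypothetical temperature-conversion function (standing in for the code in the screenshot), together with one alternative implementation a detailed explanation might propose:

```python
def celsius_to_fahrenheit(celsius):
    """Convert a temperature from Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

# An alternative implementation a detailed explanation might suggest:
# convert a whole list of readings in one call.
def celsius_to_fahrenheit_many(readings):
    return [c * 9 / 5 + 32 for c in readings]
```

For example, celsius_to_fahrenheit(100) returns 212.0.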

Define the Context
General rule: Provide context about the task, the project, or the specific issue you are dealing with.
1. Refer to the relevant entity or select it in the editor
Here is the referenced function in the text editor:

As an alternative to mentioning the function name in the prompt, you can also select the code in the editor or use the CodeLens ask option just above the code. Selecting the code in the current open file provides Tabnine Chat with explicit context. If you'd like to perform operations on specific code, be sure to select the code in the IDE before you ask Chat to operate on it.
Less effective prompt: Improve this code:

More effective prompt (with the relevant code context made explicit as described above): Refactor the function printCelsius to have the formula just once:
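The referenced code appears only in the screenshot, so the sketch below is an assumption about its shape: a printCelsius function that repeats the conversion formula, followed by the kind of refactor the more effective prompt asks for, with the formula extracted into a single helper.

```python
# Hypothetical "before": the Fahrenheit-to-Celsius formula appears twice.
def printCelsius(low_f, high_f):
    print("low:", (low_f - 32) * 5 / 9)
    print("high:", (high_f - 32) * 5 / 9)

# "After" the requested refactor: the formula appears just once.
def to_celsius(fahrenheit):
    return (fahrenheit - 32) * 5 / 9

def printCelsiusRefactored(low_f, high_f):
    print("low:", to_celsius(low_f))
    print("high:", to_celsius(high_f))
```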

2. Using (or not using) workspace context
Using workspace context is a form of personalization that Tabnine offers to improve the quality of responses from an LLM that was not trained on your existing code base. If workspace context is enabled by your team admin, it is on by default. The toggle to control whether it is on or off is in the lower right corner of the Tabnine Chat window.
Here is the workspace context selector with the toggle switched off:

Using the same reference code in example 1 above, here is the response to the prompt Modify this code to convert from Celsius to Fahrenheit with workspace context on:


Here is the response to the same prompt with the workspace context turned off:


3. Using Mentions to refer to a specific code element outside of the currently opened files
It is not always best to add more context (such as by opening a large number of files in the workspace). Only the two most recently touched files in the text editor are loaded into the context window. In cases where you do not want to open additional files, you can use Mentions to reference specific files, methods, or functions elsewhere in your local workspace as part of your prompt.
Start a Fresh Conversation When Appropriate
Tabnine uses previous chat messages as context. When you move between tasks, use the "New Conversation" button just above the chat text entry to clear the existing conversation's context:
The "New Conversation" option is not available when there is no current chat context.
The "New Conversation" option is available when there is current chat context.
You can also start a new chat from the chat menu at the top of the chat window:
Include Necessary Details
General Rule: Include any specific requirements, constraints, or desired outcomes.
1. Specifying input and output types
Here is the referenced code:
Less effective prompt: Write a function to convert from Celsius to Fahrenheit
More effective prompt: Write a function to convert from Celsius to Fahrenheit; the function should take an integer and return an integer
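A minimal sketch of what the more effective prompt might yield, assuming rounding to the nearest integer is acceptable:

```python
def celsius_to_fahrenheit(celsius: int) -> int:
    """Convert Celsius to Fahrenheit; takes an integer, returns an integer."""
    return round(celsius * 9 / 5 + 32)
```

Because the prompt pins down the input and output types, the result commits to integer arithmetic rather than leaving the return type ambiguous.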
2. Specifying frameworks or libraries
If you want a specific testing framework to be used, include that detail in the prompt.
The output below uses the prompt Write tests for this function using Mockito:
Ask for Examples
General Rule: Request examples to understand concepts better.
1. Asking for examples to improve results
Less effective prompt: Explain SQL joins
More effective prompt: Explain SQL joins with examples for inner join, left join, right join, and full join
Response image truncated for ease of presentation.
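Since the response image is truncated, here is a runnable sketch, using Python's built-in sqlite3 module, of two of the joins such a response would cover. The tables and rows are made up for illustration; SQLite added RIGHT and FULL joins only in version 3.39, so those are omitted here.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, item TEXT);
    INSERT INTO users  VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 'book');
""")

# Inner join: only users that have a matching order.
inner = conn.execute(
    "SELECT u.name, o.item FROM users u "
    "JOIN orders o ON o.user_id = u.id ORDER BY u.name"
).fetchall()
print(inner)  # [('Ada', 'book')]

# Left join: every user, with NULL (None) where no order matches.
left = conn.execute(
    "SELECT u.name, o.item FROM users u "
    "LEFT JOIN orders o ON o.user_id = u.id ORDER BY u.name"
).fetchall()
print(left)  # [('Ada', 'book'), ('Grace', None)]
```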
Be Concise, But Complete
General Rule: Be concise to avoid overwhelming the model, but include all necessary information.
1. Non-specific prompts vs. specific prompts
Less effective prompt: Can you help me with my project?
More effective prompt: I need help writing a Python function that reads a CSV file and prints the contents. The file has three columns: 'name', 'age', and 'email'.
Response image truncated for ease of presentation.
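The truncated response would contain something along these lines, sketched here under the column assumptions stated in the prompt (the function name is illustrative):

```python
import csv

def print_csv_contents(path):
    """Read a CSV with 'name', 'age', and 'email' columns and print each row."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            print(row["name"], row["age"], row["email"])
```

Because the prompt names the columns, the sketch can use csv.DictReader and refer to fields by name instead of by position.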