Supported models

twinny is a configurable extension, which means many models are technically supported. However, not all models work well with twinny in every scenario. The following models have been tested and found to work well with twinny. If you find a model that works but is not listed here, please let us know so we can add it, or open a pull request to add it yourself.

Chat

In theory, any chat model trained for instruction following will work with twinny. The following are some examples of models recommended for chat.

Fill-in-middle

Only certain models support fill-in-the-middle, because it depends on their training data. The following are some examples of models recommended for fill-in-the-middle. If you find a model that works but is not listed here, please let us know so we can add it, or open a pull request to add it yourself.
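Conceptually, a fill-in-the-middle request splits the document at the cursor into a prefix and a suffix, and asks the model to generate the missing middle. A minimal sketch of that split (the function name is illustrative, not part of twinny's actual API):

```python
def split_at_cursor(document: str, cursor: int) -> tuple[str, str]:
    # Everything before the cursor is the prefix, everything after is
    # the suffix; a FIM-capable model predicts the text between them.
    return document[:cursor], document[cursor:]

doc = "def greet(name):\n    \n    return msg\n"
# Place the cursor on the empty line inside the function body.
prefix, suffix = split_at_cursor(doc, doc.index("\n    \n") + 5)
```

Models that were not trained on this prefix/suffix layout only ever see text left-to-right, which is why base or code-tuned variants are required below.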

Codellama models

Use the code versions of the codellama models.

Note: The 34b version of codellama does not work well with fill-in-the-middle.
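As an illustration, the codellama code models use the infilling prompt format documented in the CodeLlama release, with the special tokens <PRE>, <SUF>, and <MID>. A sketch of assembling such a prompt (helper name is illustrative; twinny builds this for you):

```python
def codellama_infill_prompt(prefix: str, suffix: str) -> str:
    # CodeLlama infilling format: the model generates the text that
    # belongs between the prefix and suffix, starting after <MID>.
    return f"<PRE> {prefix} <SUF>{suffix} <MID>"

prompt = codellama_infill_prompt(
    "def add(a, b):\n    return ",
    "\n\nprint(add(1, 2))",
)
```

Non-code variants of codellama were not trained on these tokens, which is why only the code versions work for fill-in-the-middle.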

Deepseek Coder models

Use the base versions of the deepseek-coder models.

Note: Versions other than the base models do not work well with fill-in-the-middle.

Starcoder models

Use the base versions of the starcoder models (the default and base models are the same).

Note: Starcoder2 doesn’t always stop when it is finished. Lowering the temperature and increasing the repeat penalty helps with this issue.
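If you serve the model through Ollama, for example, those two knobs correspond to the temperature and repeat_penalty options on a generate request. A sketch of such a payload (the values are illustrative starting points, not tested defaults):

```python
import json

# Illustrative Ollama /api/generate request body: a lower temperature
# and a higher repeat penalty make runaway, repetitive output less likely.
payload = {
    "model": "starcoder2",
    "prompt": "def add(a, b):",
    "options": {
        "temperature": 0.1,    # lower = less random sampling
        "repeat_penalty": 1.3, # >1.0 = penalize repeated tokens
    },
}
body = json.dumps(payload)
```

twinny exposes equivalent settings in its provider configuration, so you normally adjust them there rather than calling the API directly.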

Stablecode models

Use the code versions of the stablecode models.

Codegemma models

Use the code versions of the codegemma models.

Note: CodeGemma doesn’t always stop when it is finished. Lowering the temperature and increasing the repeat penalty helps with this issue.