Custom models
StartKit.AI can be configured to use your own self-hosted LLM, or any other custom model that’s not directly supported.
Custom models file
Simply add your model to the config/models/custom.yml file like this:
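A minimal sketch of what an entry could look like — the field names (id, provider, baseUrl, contextSize) and values here are illustrative assumptions, not the exact StartKit.AI schema, so check the project's existing model configs for the real keys:

```yaml
# config/models/custom.yml
# Hypothetical entry — field names are illustrative
models:
  - id: my-llama-3-70b                 # the name you will reference in other config files
    provider: ollama                   # must be a provider the Portkey.ai Gateway supports
    baseUrl: http://localhost:11434    # where your self-hosted model is served
    contextSize: 8192
```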
As long as the model is on the Portkey.ai Gateway supported list it should work, as this is the gateway StartKit.AI uses behind the scenes.
Then simply reference your model in whichever config file you want to use it in. For example:
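For instance, a feature config could point at the custom model by its id. The file name and keys below are illustrative (a chat config is assumed here); the important part is that the value matches the id defined in config/models/custom.yml:

```yaml
# config/chat.yml (hypothetical example)
models:
  text: my-llama-3-70b   # matches the id from config/models/custom.yml
```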
API Key
If your custom model requires an API key, then StartKit.AI will attempt to use one from the .env file that matches the provider name. For example, if your provider is named anthropic, we will try to use the ANTHROPIC_KEY env value to authenticate.
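Following that convention, the .env entry would be the upper-cased provider name plus _KEY. The value below is a placeholder:

```
# .env
ANTHROPIC_KEY=sk-ant-xxxxxxxx   # placeholder — use your real key
```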