OpenAI
To get started with OpenAI, you'll need to get an access token and configure unCtl to use it.
unCtl will look for the token in the OPENAI_API_KEY environment variable.
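For example, on Linux or macOS you can export the variable in your shell before running unCtl (the key value below is just a placeholder):

```bash
# Make your OpenAI API key available to unCtl (replace the placeholder with your real key)
export OPENAI_API_KEY="sk-your-key-here"
```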
Now you'll need to configure unCtl to use OpenAI. This can be done in the unCtl config file.
Put the following in the config file (this is configured for you by default):
```yaml
llm_config:
  - provider: OpenAI
    models:
      - name: gpt-4
        config:
          # select model tokenization type. 3 types are available:
          #   gpt     - to use with GPT-based models.
          #   llama   - to use with llama-based models, e.g. llama2 or codellama.
          #   mixtral - a tokenizer specific to mistral and mixtral models.
          tokenizer_type: gpt
          # LLM provider name (OpenAI, Ollama or LocalAI)
          llm_provider: OpenAI
          # specify the model name to use with the given LLM provider
          llm_model: gpt-4
```
Once that is done, you can simply run unCtl as usual.
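For example, a run might look like the following (the `check k8s` subcommand is only an illustration; the commands available depend on your unCtl version):

```bash
# Illustrative invocation only -- consult unCtl's help output for the actual
# subcommands in your version. unCtl picks up the OpenAI settings from the
# config file and the OPENAI_API_KEY environment variable set earlier.
unctl check k8s
```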