Usage
Root usage
```
usage: unctl [-h] [--llm-provider {OpenAI,Ollama,LocalAI}] [--llm-model LLM_MODEL]
             [--llm-debug] [--llm-summarizing-model LLM_SUMMARIZING_MODEL]
             [--llm-summarization-enabled] [--resolve] [-v] [--config CONFIG]
             {k8s,mysql,redis} ...

Welcome to unSkript CLI Interface

options:
  -h, --help            show this help message and exit
  --llm-provider {OpenAI,Ollama,LocalAI}
                        Select LLM provider.
  --llm-model LLM_MODEL
                        Select LLM model.
  --llm-debug           Enable local LLM debugging. Will print all the data that's being sent to LLM.
  --llm-summarizing-model LLM_SUMMARIZING_MODEL
                        Select LLM model to be used for summarization.
  --llm-summarization-enabled
                        Enable local LLM context summarization. Will summarize user messages in case token limit is exceeded.
  --resolve             Run interactive app to resolve problem.
  -v, --version         show program's version number and exit
  --config CONFIG       Specify path to the unctl config file.

unctl available providers:
  {k8s,mysql,redis}
```
To see the available options for a specific provider, run:

```
unctl {provider} -h|--help
```

Provider level usage
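To make the flag layout concrete, here are a few illustrative invocations. These are sketches, not output captured from the tool: the model names, config path, and cluster context are hypothetical, and they assume `unctl` is installed and any required provider credentials (e.g. an OpenAI API key) are configured. Note that the root options shown in the usage line are parsed before the provider subcommand, so they must appear before `k8s`, `mysql`, or `redis`.

```shell
# Run the Kubernetes provider with OpenAI as the LLM backend
# (model name "gpt-4o" is a hypothetical example).
unctl --llm-provider OpenAI --llm-model gpt-4o k8s

# Use a locally hosted Ollama model instead, with LLM debugging
# enabled to print everything sent to the model.
unctl --llm-provider Ollama --llm-model llama3 --llm-debug k8s

# Point unctl at a non-default config file and launch the
# interactive resolver (path is a hypothetical example).
unctl --config ~/.unctl/config.yaml --resolve k8s
```

Flags such as `--llm-summarizing-model` and `--llm-summarization-enabled` combine the same way, placed before the provider name.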