If you don't already have `torch` installed, we suggest installing it from the official torch docs.

You can also configure your `username` and `password` so you don't have to provide them in every request.
`BACKEND` is the LLM backend satori will use for tasks like `QUERY` and `ASK`. This parameter must have the format `openai:model-name` or `ollama:model-name`. It is the default backend for all LLM operations, but you can change it dynamically by adding the `backend` field to `QUERY` or `ASK` operations. If this parameter isn't specified, it defaults to `openai:gpt-4o-mini`. If you're using OpenAI as your backend, you need to have the `OPENAI_API_KEY` environment variable set.
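As a rough sketch of the expected format (this is not satori's internal code; the helper and the `ollama:llama3` model name below are purely illustrative), a `provider:model-name` string can be parsed and checked like this:

```python
# Illustrative only, not part of satori. Shows how a BACKEND value such as
# "openai:gpt-4o-mini" or "ollama:llama3" breaks down, and when OPENAI_API_KEY
# is required.
import os


def parse_backend(backend: str = "openai:gpt-4o-mini") -> tuple[str, str]:
    provider, _, model = backend.partition(":")
    if provider not in {"openai", "ollama"} or not model:
        raise ValueError(f"expected 'openai:model-name' or 'ollama:model-name', got {backend!r}")
    if provider == "openai" and not os.environ.get("OPENAI_API_KEY"):
        raise RuntimeError("set the OPENAI_API_KEY env variable to use an OpenAI backend")
    return provider, model


print(parse_backend("ollama:llama3"))   # ('ollama', 'llama3'), no API key needed
# parse_backend()                       # default openai:gpt-4o-mini, requires OPENAI_API_KEY
```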
This will download a ~100MB model.
satori supports the following operations (a conceptual sketch of what each one does follows this list):

- `train`
- `ann`
- `query`
- `ask`
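The docs above only name these operations, so here is a conceptual sketch, under the assumption that `train` embeds your documents (the step that downloads the ~100MB model), `ann` retrieves nearest neighbours, and `query`/`ask` pass the retrieved context to the configured LLM backend. None of this is satori's actual code:

```python
# Conceptual sketch only: illustrates what the four operations roughly do,
# not how satori implements them.
import numpy as np

rng = np.random.default_rng(0)

# train: embed the documents once and store the vectors
# (random stand-in vectors here instead of the real ~100MB embedding model).
docs = ["how to reset a password", "billing FAQ", "API rate limits"]
doc_vecs = rng.normal(size=(len(docs), 384))
doc_vecs /= np.linalg.norm(doc_vecs, axis=1, keepdims=True)

# ann: nearest-neighbour search over the stored vectors
# (brute-force cosine similarity here; a real ANN index is approximate).
query_vec = rng.normal(size=384)
query_vec /= np.linalg.norm(query_vec)
top = np.argsort(doc_vecs @ query_vec)[::-1][:2]
print([docs[i] for i in top])

# query / ask: the retrieved documents plus the user's question go to the
# LLM backend configured via BACKEND (or overridden per request, see below).
```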
The `backend` field accepts `ollama:model-name` or `openai:model-name`. If not specified, it defaults to `openai:gpt-4o-mini`. If you're using OpenAI as the backend, you must set `OPENAI_API_KEY` as an environment variable.
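Purely as an illustration of the per-request override (only the `backend` field and its accepted values come from the docs above; the surrounding request shape and field names are assumptions):

```python
# Hypothetical request payloads: the exact schema is an assumption; only the
# `backend` field and its "provider:model-name" values follow the docs above.
query_with_default = {"op": "QUERY", "text": "summarise the billing docs"}   # uses BACKEND
query_with_ollama = {"op": "QUERY", "text": "summarise the billing docs",
                     "backend": "ollama:llama3"}                             # per-request override
ask_with_openai = {"op": "ASK", "text": "what is our refund policy?",
                   "backend": "openai:gpt-4o-mini"}                          # needs OPENAI_API_KEY
```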