{"result" : "", "status": ""}

Pricing

To use Satori AI features you must purchase and hold a valid Satori AICore License.

Setup

  • You must have Python installed.
  • You must install the dependencies listed in requirements.txt:
  pip install -r https://www.satoridb.com/requirements.txt
  • You must install torch; we suggest following the official PyTorch docs (a basic install command is sketched below).
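If you only need a baseline CPU build, the plain PyPI package is usually enough; check the official PyTorch docs for the exact command for your platform and CUDA version:

  pip install torch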

If authentication is enabled, you must specify a username and password in every request.
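For example, with authentication enabled a request body might carry the credentials alongside the operation's own fields (the field names and JSON envelope here are an assumption, not confirmed by these docs):

{"username": "<USERNAME>", "password": "<PASSWORD>", "query": "your natural-language query"}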
Satori’s mission is to give developers more power over their data. That’s why we plan to offer a complete suite of AI capabilities built into Satori. To start using our AI capabilities, run Satori with this command:
satori -ai -b <BACKEND> -l <LICENSE_NUMBER>   # the license number only needs to be provided on the first launch
BACKEND is the LLM backend Satori will use for tasks like QUERY and ASK. This parameter must follow the format openai:model-name or ollama:model-name. It sets the default backend for all LLM operations, but you can change it dynamically by adding the backend field to QUERY or ASK operations. If this parameter isn’t specified, it defaults to openai:gpt-4o-mini. If you’re using OpenAI as your backend, you need to have the OPENAI_API_KEY environment variable set. This will also download a model of roughly 100 MB.
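For example, to override the default backend for a single operation you can include the backend field in that operation's request (a sketch; the model name and request shape are illustrative):

{"query": "List every class with more than 30 students", "backend": "ollama:llama3"}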

AI Operations

Satori implements some interesting AI features, such as training your own embedding model on your data, performing Approximate Nearest Neighbors searches, running natural language queries, and chatting with your data. This is accomplished via four functions: TRAIN, ANN, QUERY, and ASK.

QUERY

Allows you to make any query against the DB using natural language.
query (string): Your query in natural language. Example: Insert the value 5 into the grades array of everyone in class B.
backend (string): The LLM backend to use, either ollama:model-name or openai:model-name. If not specified, it defaults to openai:gpt-4o-mini. If you’re using OpenAI as the backend, you must set the OPENAI_API_KEY environment variable.
Returns: {"result" : "", "status": ""}

ASK

Allows you to chat with the DB.
question (string): Your question in natural language. Example: How many users over 25 do we have?
backend (string): The LLM backend to use, either ollama:model-name or openai:model-name. If not specified, it defaults to openai:gpt-4o-mini. If you’re using OpenAI as the backend, you must set the OPENAI_API_KEY environment variable.
Returns: {"response" : ""}

ANN

Performs an Approximate Nearest Neighbors search.
key (string): Key of the source object.
top_k (int): Number of results to return.
Returns: {"results" : []}