🔥 OpenAI functions and tools
LocalAI supports running the OpenAI functions and tools API with llama.cpp-compatible models.
To learn more about OpenAI functions, see also the OpenAI API blog post.
LocalAI also supports JSON mode out of the box with llama.cpp-compatible models.
💡 Check out also LocalAGI for an example of how to use LocalAI functions.
Setup
OpenAI functions are available only with ggml or gguf models compatible with llama.cpp.
You don't need to do anything specific, just use ggml or gguf models.
Usage example
You can configure a model manually with a YAML config file in the models directory, for example:
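A minimal sketch of such a config (the model file name and the sampling values are placeholders):

```yaml
name: gpt-3.5-turbo
parameters:
  # The local model file to load; the file name is a placeholder
  model: ggml-openllama.bin
  temperature: 0.1
  top_p: 0.9
```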
To use the functions with the OpenAI client in Python:
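A minimal sketch, assuming LocalAI listens on localhost:8080 and using a hypothetical get_current_weather tool (the API key can be any non-empty string):

```python
from openai import OpenAI

# Point the client at the local LocalAI instance instead of api.openai.com
client = OpenAI(base_url="http://localhost:8080/v1", api_key="sk-xxx")

# Hypothetical tool definition, following the OpenAI tools schema
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What is the weather like in Boston?"}],
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message.tool_calls)
```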
For example, with curl:
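The same request over plain HTTP, reusing the get_current_weather tool sketched above:

```bash
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "What is the weather like in Boston?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {"type": "string", "description": "The city and state, e.g. San Francisco, CA"}
          },
          "required": ["location"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
```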
Return data:
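The assistant reply carries a tool call instead of plain text. An illustrative sketch of the OpenAI-style payload (field values are made up, and the exact fields may vary by LocalAI version):

```json
{
  "model": "gpt-3.5-turbo",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "tool_calls": [
          {
            "id": "call_123",
            "type": "function",
            "function": {
              "name": "get_current_weather",
              "arguments": "{\"location\": \"Boston, MA\"}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ]
}
```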
Advanced
Use functions without grammars
Function calls map automatically to grammars, which are currently supported only by llama.cpp. However, it is possible to turn off the use of grammars and instead extract the tool arguments from the LLM responses, by specifying no_grammar and a regex to map the response from the LLM in the YAML file:
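A sketch of such a config; the response_regex key is shown here as the counterpart of no_grammar, so treat the exact key names as an assumption to check against your LocalAI version:

```yaml
name: model_name
parameters:
  model: model/file

function:
  # Disable grammar-constrained decoding
  no_grammar: true
  # Named-group regex used to pull the function name and its arguments
  # out of the raw LLM response
  response_regex: '(?P<function>\w+)\s*\((?P<arguments>.*)\)'
```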
The response regex has to be a regex with named parameters, so that the function name and the arguments can be scanned out of the response.
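For instance, a sketch using named groups (the group names function and arguments are assumed to be what LocalAI's extractor expects):

```
(?P<function>\w+)\s*\((?P<arguments>.*)\)
```

will catch a response such as:

```
function_name({ "foo": "bar"})
```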
Parallel tool calls
This feature is experimental and has to be configured in the YAML of the model by enabling function.parallel_calls:
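A sketch of the config (placeholder model name as above):

```yaml
name: gpt-3.5-turbo
parameters:
  model: ggml-openllama.bin

function:
  # Allow the model to emit several tool calls in one response (experimental)
  parallel_calls: true
```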
Use functions with grammar
It is also possible to specify the full function signature (for debugging, or to use with other clients).
The chat endpoint accepts the additional grammar_json_functions parameter, which takes a JSON schema object.
For example, with curl:
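Here, two hypothetical functions, create_event and search, are described as a JSON-schema oneOf (function names and fields are illustrative):

```bash
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "How are you?"}],
    "temperature": 0.1,
    "grammar_json_functions": {
      "oneOf": [
        {
          "type": "object",
          "properties": {
            "function": {"const": "create_event"},
            "arguments": {
              "type": "object",
              "properties": {
                "title": {"type": "string"},
                "date": {"type": "string"},
                "time": {"type": "string"}
              }
            }
          }
        },
        {
          "type": "object",
          "properties": {
            "function": {"const": "search"},
            "arguments": {
              "type": "object",
              "properties": {
                "query": {"type": "string"}
              }
            }
          }
        }
      ]
    }
  }'
```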
Grammars and function tools can also be used in conjunction with vision APIs:
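A sketch assuming a llava vision model is installed; the grammar constrains the reply to a yes/no answer, and the image URL is a placeholder:

```bash
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llava",
    "grammar": "root ::= (\"yes\" | \"no\")",
    "messages": [
      {
        "role": "user",
        "content": [
          {"type": "text", "text": "Is there some grass in the image?"},
          {"type": "image_url", "image_url": {"url": "https://example.com/image.jpg"}}
        ]
      }
    ]
  }'
```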
💡 Examples
A full e2e example with docker-compose is available here.