System Info and Version

LocalAI provides endpoints to inspect the running instance, including available backends, loaded models, and version information.

System Information

  • Method: GET
  • Endpoint: /system

Returns available backends and currently loaded models.

Response

Field                Type     Description
backends             array    List of available backend names (strings)
loaded_models        array    List of currently loaded models
loaded_models[].id   string   Model identifier

Usage

curl http://localhost:8080/system

Example response

{
  "backends": [
    "llama-cpp",
    "huggingface",
    "diffusers",
    "whisper"
  ],
  "loaded_models": [
    {
      "id": "my-llama-model"
    },
    {
      "id": "whisper-1"
    }
  ]
}
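A small Python sketch for working with this response. The helper names are illustrative, and the code assumes the response shape shown above and a LocalAI instance at http://localhost:8080:

```python
import json
from urllib.request import urlopen


def parse_system_info(payload: str) -> tuple[list[str], list[str]]:
    """Split a /system response body into (backends, loaded model IDs)."""
    info = json.loads(payload)
    backends = info.get("backends", [])
    model_ids = [m["id"] for m in info.get("loaded_models", [])]
    return backends, model_ids


def fetch_system_info(base_url: str = "http://localhost:8080"):
    """Fetch and parse /system from a running LocalAI instance."""
    with urlopen(f"{base_url}/system") as resp:
        return parse_system_info(resp.read().decode())
```

Splitting the parsing out of the HTTP call keeps the response handling easy to test without a running server.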

Version

  • Method: GET
  • Endpoint: /version

Returns the LocalAI version and build commit.

Response

Field     Type     Description
version   string   Version string in the format version (commit)

Usage

curl http://localhost:8080/version

Example response

{
  "version": "2.26.0 (a1b2c3d4)"
}

Error Responses

Status Code   Description
500           Internal server error
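A client can translate the error status into an exception before attempting to parse the body. A minimal sketch (the helper name is illustrative and not part of the LocalAI API):

```python
import json


def decode_response(status: int, body: str) -> dict:
    """Parse a JSON response body, raising on an internal server error."""
    if status == 500:
        raise RuntimeError("LocalAI internal server error")
    return json.loads(body)
```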