Hi there, I have a problem with the local AI integration in ONLYOFFICE. If I use it locally on my Mac Mini with a locally installed LLMStudio, it works well (via localhost), but when I point it at my own local AI server running Ollama, it doesn't work.
When I run `curl http://192.168.1.100:11434/v1` on my Mac Mini, the Ollama logs show "200 GET /v1/models" and I get an overview of my models (the same works from my Nextcloud). But when I use the same settings in ONLYOFFICE on the same device, the logs show "403 OPTIONS /v1/models". In tcpdump I can see something attempting a TLS handshake, but the Ollama host itself has no Internet access, so I cannot obtain a certificate such as Let's Encrypt and want to use plain HTTP internally.
Modern systems allow insecure connections to localhost, but connections to "external" addresses only over TLS.
Is it possible to disable TLS/SSL for the AI integration in ONLYOFFICE?
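For reference, the difference between the two requests above can be reproduced by hand with curl (the IP is taken from my setup; `https://nextcloud.example` is a placeholder for whatever origin the editor page is served from). The second command imitates the CORS preflight a browser sends before a cross-origin request:

```shell
# Plain GET, as in the working curl test:
curl -i http://192.168.1.100:11434/v1/models

# CORS preflight, roughly as the browser sends it before the editor's request:
curl -i -X OPTIONS http://192.168.1.100:11434/v1/models \
  -H "Origin: https://nextcloud.example" \
  -H "Access-Control-Request-Method: GET" \
  -H "Access-Control-Request-Headers: authorization"
```

The GET succeeds with 200, while the OPTIONS preflight is what Ollama answers with 403.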
Hello @pleibling
Sorry for the late reply.
Please record a video file demonstrating the situation and show us your AI model settings in the editor. Do I understand correctly that you're using the desktop app? What is its exact version?
Good news!
An official guide for connecting Ollama to ONLYOFFICE is now available.
Please do not forget to set the origins to avoid CORS issues.
(`http://*, https://*, onlyoffice://*`)
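If Ollama runs as a systemd service, the allowed origins are set via the `OLLAMA_ORIGINS` environment variable. A minimal sketch for a Linux server (assuming the default service name `ollama`):

```shell
# Open an override file for the Ollama service:
sudo systemctl edit ollama

# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_ORIGINS=http://*,https://*,onlyoffice://*"

# Then apply the change:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

With these origins set, Ollama answers the browser's OPTIONS preflight instead of rejecting it.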
Hi, I solved the problem. In the dev tools I saw a CORS origin error, and the more important error was "mixed content" (HTTPS for Nextcloud/ONLYOFFICE, HTTP for the Ollama communication), so the browser refused to send the request at all.
I put a reverse proxy with a Let's Encrypt certificate in front of Ollama and created a minimal nginx configuration with a Bearer token check to secure the connection.
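A minimal sketch of such an nginx configuration, under stated assumptions: Ollama listens on `127.0.0.1:11434`, the certificate lives at the usual certbot paths, and the hostname and token are placeholders. CORS preflight requests are let through without a token, because browsers do not send the `Authorization` header on `OPTIONS`:

```nginx
server {
    listen 443 ssl;
    server_name ollama.example.internal;  # placeholder hostname

    ssl_certificate     /etc/letsencrypt/live/ollama.example.internal/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/ollama.example.internal/privkey.pem;

    location / {
        # Accept requests carrying the expected Bearer token,
        # and CORS preflights (which never carry Authorization):
        set $auth_ok 0;
        if ($http_authorization = "Bearer CHANGE_ME_SECRET_TOKEN") { set $auth_ok 1; }
        if ($request_method = OPTIONS) { set $auth_ok 1; }
        if ($auth_ok = 0) { return 401; }

        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
        proxy_buffering off;        # stream tokens as they are generated
        proxy_read_timeout 300s;    # long-running completions
    }
}
```

The editor is then pointed at `https://ollama.example.internal/v1` with the token as the API key, which also resolves the mixed-content error.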
Now it works with both the ONLYOFFICE desktop app and the ONLYOFFICE Document Server.