Cannot Add Ollama Local Provider: Invalid URL Error

I have the same problem. Please let me know when it is resolved.

I see, the case is understandable. We will run some tests on our end to confirm. Please await my feedback.

I can confirm that I have the same issue.

  • ollama is running locally
  • ollama works for everything else I connected to it
  • http://localhost:11434 gets “invalid URL”, which is funny, because it is the default value when you select Ollama from the “Add Provider” menu.

@here, to connect to Ollama from other sources, e.g. a separate dedicated server, you should set the following variables to these values:

  • OLLAMA_HOST="0.0.0.0:11434"
  • OLLAMA_ORIGINS="*"

Make sure to specify the Ollama URL with the /v1 suffix, e.g. http://<IP>:11434/v1.
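
Before configuring the plugin, you can quickly check that the endpoint responds by requesting the model list from Ollama’s OpenAI-compatible API (replace <IP> with your server address):

curl http://<IP>:11434/v1/models

It should return a JSON list of the models installed on the server.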

I have tested this configuration and confirm that it works.

For usage via localhost, please consult the guide I mentioned here:

I have the same problem, but I use Ollama on a server (Postman/curl work fine with the same URL).

@Constantine
For me, the suggestions do not work. I am running the latest OnlyOffice Desktop on Linux via Flatpak with all AI plugin updates, and it does not work with anything other than localhost.

I added the above Linux environment variables to ~/.bashrc so they persist, and I both rebooted and ran source ~/.bashrc.

Bash recognizes the environment variables, e.g.:

echo $OLLAMA_HOST
0.0.0.0:11434

I will not post the results of echo $OLLAMA_ORIGINS because this gives strange output when, as suggested, this variable is set to “*”. (On Linux, it seems to expand to the local directories and recent documents.)
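
For what it’s worth, that strange output is just the shell glob-expanding the unquoted *, not a wrong value. Quoting the variable shows the actual contents:

echo "$OLLAMA_ORIGINS"
*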

Nevertheless, setting and trying to use something like http://192.168.222.171/v1 when setting up Ollama using the plugin results in “invalid URL” and/or “no models found”.

Note: I tried several permutations to get it to work after the suggested items did not. Nothing works when using a non-same-machine Ollama instance.

I then tried setting the environment variables above manually in Flatpak using Flatseal (a nice tool at https://flathub.org/en/apps/com.github.tchx84.Flatseal).

I thought perhaps Flatpak was ‘sandboxing’ the env variables. Even with the same environment variables defined in Flatseal (Flatpak), the result is the same: it does not work.

For me at least, this is still not working on Linux with the latest OnlyOffice Desktop via Flatpak.

Note: I forgot to show above that I AM using 192.168.222.171:11434/v1 (including the port), so that is not the issue.

Curl and other tools all connect to this remote instance without issues.
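
One check I still want to try, assuming the Flatpak app id is org.onlyoffice.desktopeditors (flatpak list shows the real id), is whether the URL is reachable from inside the sandbox, since curl working on the host does not prove the sandboxed app has the same network access:

flatpak run --command=sh org.onlyoffice.desktopeditors
# inside the sandbox shell, if curl is available in the runtime:
curl http://192.168.222.171:11434/v1/models

In hindsight, since OLLAMA_HOST and OLLAMA_ORIGINS configure the Ollama server process itself, they presumably need to be set on 192.168.222.171, not in my local shell or the Flatpak sandbox.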

Can you install Desktop Editors from the DEB package to confirm the behavior?

Leaving this here for future people who have this problem: I was not able to add Ollama in the flow from the initial open screen because there is seemingly no box to indicate which Ollama model to use. If you open a document, go to the AI tab, click Settings, and add a model that way, you’re able to pick the specific model you’re using via Ollama, like llama3:latest.

Hello @fuzzyllama
Thank you for sharing; I hope it will be useful for other users in the same scenario.
However, the case itself seems to be unusual. If you can provide a video demonstration of the described behavior, we will take a closer look at it.

I had the same error when using the AI agent in the OnlyOffice Desktop settings. I fixed it with the following:

root@/etc/systemd/system/ollama.service.d# cat override.conf
[Service]
Environment="OLLAMA_ORIGINS=*"
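
After editing the override, make systemd pick it up and confirm the variable took effect (standard systemd steps):

systemctl daemon-reload
systemctl restart ollama
systemctl show ollama | grep ORIGINS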

It seems to have been a CORS issue in my case, and it is now working.


I have the solution for you all. This drove me absolutely crazy for an entire weekend as well. It works on my iPad and my laptop with Fedora Cosmic.

Add this flag to your Ollama container:

-e OLLAMA_ORIGINS="http://*,https://*,onlyoffice://*"
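
For reference, a complete docker run command with that flag might look like this (the container name and volume are just examples):

docker run -d --name ollama \
  -p 11434:11434 \
  -v ollama:/root/.ollama \
  -e OLLAMA_ORIGINS="http://*,https://*,onlyoffice://*" \
  ollama/ollama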
