
continuous loading after running docker compose #437

Open
donvito opened this issue Oct 25, 2024 · 7 comments
Labels
bug Something isn't working

Comments

@donvito

donvito commented Oct 25, 2024

Environment
macOS Sequoia
Docker Desktop 4.21.1 (114176)

Describe the bug
I used docker-compose up -d on a fresh install and I see a loading animation.

To Reproduce
Steps to reproduce the behavior:

  1. Clone the repo
  2. docker-compose up -d

Expected behavior
Expecting the UI to load

Screenshots

Screenshot 2024-10-26 at 1 49 13 AM

error in the logs
Screenshot 2024-10-26 at 1 50 19 AM

@donvito donvito added the bug Something isn't working label Oct 25, 2024
@ItzCrazyKns
Owner

It seems you forgot to configure the config.toml file. Remove the config.toml directory that Docker created, then rename sample.config.toml to config.toml. This will solve your issue.
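The fix described above can be sketched as shell commands. This demo runs in a scratch directory to simulate the symptom (Docker creates an empty directory when a bind-mount source file is missing); in practice, run the last three commands from the Perplexica repo root:

```shell
cd "$(mktemp -d)"
mkdir config.toml                    # simulate the directory Docker created for the bind mount
touch sample.config.toml             # stand-in for the repo's sample config

rm -rf config.toml                   # remove the wrongly created directory
cp sample.config.toml config.toml    # give the bind mount a real file to mount
test -f config.toml && echo "config.toml is now a regular file"
```

After that, `docker compose up -d` should mount a real file instead of recreating an empty directory.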

@Omaha2002

Omaha2002 commented Nov 9, 2024

Same problem here. I tried installing it alongside Ollama and the Open WebUI docker compose, and with harbor: harbor up perplexica searxng

Both result in a spinning wheel:

(screenshot)

clicking settings:

(screenshot)

@ItzCrazyKns
Owner

ItzCrazyKns commented Nov 16, 2024

> Same problem here, tried installing it alongside Ollama and openwebui docker compose and with harbor: harbor up perplexica searxng
>
> Both result in a spinning wheel: (screenshots)

Please follow https://github.com/ItzCrazyKns/Perplexica?tab=readme-ov-file#ollama-connection-errors. If you're getting the same error as the author, follow the steps I mentioned above.
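For reference, the Ollama endpoint is set in config.toml. A hypothetical excerpt follows — the section and key names are assumptions based on sample.config.toml and may differ in your checkout:

```toml
[API_ENDPOINTS]
# From inside a container, "localhost" is the container itself, so point
# Ollama at the Docker host instead (on Linux, use the host's LAN IP).
OLLAMA = "http://host.docker.internal:11434"
```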

@Ganju0

Ganju0 commented Nov 20, 2024

I just installed about five minutes ago with the docker compose and am getting this error. What's weird is that if I use the desktop browser on the PC running Ollama and Perplexica (and Open WebUI), Perplexica works fine. If I try to open the service from any other device on my home network, I get infinite loading.

So I know the URL is set right: I set the Ollama host to 0.0.0.0, and I know that works because I can connect to it from other devices on my home network. And I don't think it's an issue with my AI server setup, because I can access Open WebUI from other devices just fine.

@mr-biz

mr-biz commented Nov 20, 2024 via email

You need to specify the IP address, not 0.0.0.0 — see: https://github.com/ItzCrazyKns/Perplexica/blob/master/docs/installation/NETWORKING.md

@Ganju0

Ganju0 commented Nov 20, 2024

You are correct and this fixed it.
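A hypothetical sketch of the change NETWORKING.md describes: the frontend's API URL is baked in at build time, so the compose file's build args must use the server's LAN address rather than 127.0.0.1. The service, arg names, and values below are assumptions — verify them against the docker-compose.yaml in your checkout:

```yaml
app:
  build:
    args:
      # Replace 192.168.1.50 with the machine's LAN IP so other devices
      # on the network can reach the backend.
      - NEXT_PUBLIC_API_URL=http://192.168.1.50:3001/api
      - NEXT_PUBLIC_WS_URL=ws://192.168.1.50:3001
```

After editing, rebuild with `docker compose up -d --build` so the new values take effect.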

@Froggy232

Froggy232 commented Nov 22, 2024

Hi,
I'm trying to make Perplexica work with podman and the latest published images.
So far, I have created a pod with two containers inside, one for the backend and one for the frontend. I used quadlet files, and I don't think I need the network because creating a pod implies the creation of a network; I haven't used the searxng container since I already have one on this server.
Everything spins up, but I have the exact same problem: an infinite spinning circle. I think there is a problem with the connection to either the backend or searxng/ollama. Is there any environment variable I can use to specify their addresses? I have modified config.toml for searxng and ollama, but there seems to be nothing for the backend address unless you build it yourself.
I actually use Morphic in a very, very similar configuration and everything works well. I'm of course able to provide any logs or config files that could help.
Thank you a lot!


6 participants