
can't load any models in textgen with "no module named" error #100

WildPowerHammer opened this issue Aug 5, 2024 · 0 comments

I have a few GPTQ and GGUF models. When I try to load a GPTQ model, I get this:

2024-08-06 00:47:26 ERROR:Failed to load the model.
Traceback (most recent call last):
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/ui_model_menu.py", line 201, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/models.py", line 79, in load_model
    output = load_func_map[loader](model_name)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/models.py", line 318, in AutoGPTQ_loader
    import modules.AutoGPTQ_loader
  File "/nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/modules/AutoGPTQ_loader.py", line 3, in <module>
    from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig
ModuleNotFoundError: No module named 'auto_gptq'

It's similar with the other model loaders; only the name of the missing module changes.
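One way to narrow this down is to check, from inside the same Python environment that textgen runs with, which loader backends are actually importable. This is just a diagnostic sketch; the module names below are guessed from the error messages (`auto_gptq` for the GPTQ loader, `llama_cpp` for GGUF) and may differ for other loaders:

```python
import importlib.util

def find_missing(modules):
    """Return the subset of module names that cannot be found on sys.path."""
    return [m for m in modules if importlib.util.find_spec(m) is None]

if __name__ == "__main__":
    # Module names assumed from the tracebacks above; adjust as needed.
    missing = find_missing(["auto_gptq", "llama_cpp"])
    print("missing backends:", missing or "none")
```

Running this with the interpreter from the Nix store path in the traceback (rather than a system Python) would show whether the packaged environment simply doesn't include these backends.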

If I try to load a GGUF model, I get:

2024-08-06 00:52:34 INFO:Loading dolphin-2.6-mistral-7b.Q6_K.gguf...
2024-08-06 00:52:34 INFO:llama.cpp weights detected: /home/digital_dominus/.textgen/state/models/dolphin-2.6-mistral-7b.Q6_K.gguf
2024-08-06 00:52:34 INFO:Cache capacity is 0 bytes
/nix/store/rfpyasrilnfv9xmv5z50jnyv5w12178y-textgen/bin/textgen: line 26: 126248 Illegal instruction     (core dumped) /nix/store/7hpffz24mjm12y5ymd2is43lxl7nf27b-python3-3.11.6-env/bin/python /nix/store/1zzqmn5cl5c8dcbv37xp8xvvii892015-textgen-patchedSrc/server.py $@ --model-dir $HOME/.textgen/state/models/ --lora-dir $HOME/.textgen/state/loras/
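A SIGILL ("Illegal instruction") from llama.cpp weights loading is often a CPU instruction-set mismatch, i.e. a binary built with AVX/AVX2/AVX-512 running on a CPU that lacks those extensions. That's only an assumption about the crash above, but it's cheap to check on Linux:

```shell
# List which AVX-family SIMD flags this CPU advertises; a llama.cpp build
# compiled for an extension not listed here will crash with SIGILL.
grep -m1 -o -w -e avx -e avx2 -e avx512f /proc/cpuinfo || echo "no AVX support detected"
```

If the flags are missing, a build of the llama.cpp backend without those optimizations would be needed.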

I'm not sure what I can try to fix this. It looks like I somehow broke a very reproducible installation process, and now I'm stuck.

P.S. If anyone can at least recommend some other ways to run local models, I'd be very grateful.
