I'm trying to deploy Mistral-7B-v0.1 locally. The documentation lists the model as supported, but the conversion step fails to produce a GGUF file.
The error is: `NotImplementedError: Architecture "MistralForCausalLM" not supported!`
Since the switch to GGUF files was a breaking change and Mistral-7B is supposed to be supported, I think convert-hf-to-gguf.py needs to handle the MistralForCausalLM architecture; a rough sketch of what I have in mind is below.
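For context, here is a minimal, self-contained sketch of the kind of change I mean. It is not the actual convert-hf-to-gguf.py code: the class and function names are made up, and it assumes Mistral-7B can simply reuse the existing Llama conversion path, since as far as I can tell it shares the same tensor layout.

```python
# Hypothetical sketch -- not the real convert-hf-to-gguf.py internals.
# It only illustrates the idea: map the Hugging Face architecture string
# "MistralForCausalLM" onto the existing Llama-style conversion path.

import json
from pathlib import Path


class LlamaStyleModel:
    """Placeholder for whatever class already converts Llama-family checkpoints."""

    def __init__(self, model_dir: Path):
        self.model_dir = model_dir

    def convert(self, outfile: Path) -> None:
        print(f"would write GGUF for {self.model_dir} to {outfile}")


# Dispatch table from config.json "architectures" entries to converter classes.
ARCHITECTURE_REGISTRY: dict[str, type] = {
    "LlamaForCausalLM": LlamaStyleModel,
    # The proposed addition: reuse the Llama path for Mistral.
    "MistralForCausalLM": LlamaStyleModel,
}


def converter_for(model_dir: Path) -> LlamaStyleModel:
    """Pick a converter based on the architecture declared in config.json."""
    config = json.loads((model_dir / "config.json").read_text())
    arch = config["architectures"][0]
    try:
        return ARCHITECTURE_REGISTRY[arch](model_dir)
    except KeyError:
        raise NotImplementedError(f'Architecture "{arch}" not supported!') from None


if __name__ == "__main__":
    converter_for(Path("./Mistral-7B-v0.1")).convert(Path("mistral-7b-v0.1.gguf"))
```

In the real script this would presumably hook into the existing per-architecture model-class dispatch rather than a standalone table, but the shape of the fix should be the same.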
I'm running the latest release as of Dec 14, 2023 (b1637).