facebook/MobileLLM-Pro
#1494 opened by Edge-Quant
I'm almost certain the architecture MobileLLMP1ForCausalLM is not supported by llama.cpp, so this model can't be converted to GGUF.
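A quick way to verify this claim is to search llama.cpp's HF-to-GGUF converter for the architecture string; the converter only handles architectures it explicitly registers. This is a sketch assuming a local checkout at `./llama.cpp` (adjust the path to yours):

```shell
# Look for the architecture name in llama.cpp's converter script.
# If the string is absent, this checkout cannot convert the model.
if grep -q "MobileLLMP1ForCausalLM" llama.cpp/convert_hf_to_gguf.py 2>/dev/null; then
  echo "architecture registered"
else
  echo "architecture not registered in this checkout"
fi
```

If it's missing, conversion attempts will fail with an unsupported-architecture error until someone contributes support upstream.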