facebook/MobileLLM-Pro

#1494
by Edge-Quant - opened

I'm almost certain that the architecture MobileLLMP1ForCausalLM is not supported by llama.cpp, so this model can't be converted to GGUF.
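One way to sanity-check this locally is to compare the `architectures` field in the model's config.json against the architecture names registered in your llama.cpp checkout's convert_hf_to_gguf.py (each supported model class is registered there by its Hugging Face architecture string). A rough sketch — the `SUPPORTED` set below is a small illustrative sample, not the real, exhaustive list:

```python
import json

# Illustrative subset of architectures llama.cpp can convert; in practice,
# build this set by grepping convert_hf_to_gguf.py in your llama.cpp checkout.
SUPPORTED = {"LlamaForCausalLM", "MistralForCausalLM", "Qwen2ForCausalLM"}

def gguf_convertible(config: dict) -> bool:
    # config.json declares one or more architecture class names;
    # conversion requires at least one of them to be registered.
    return any(arch in SUPPORTED for arch in config.get("architectures", []))

mobilellm_cfg = {"architectures": ["MobileLLMP1ForCausalLM"]}
llama_cfg = {"architectures": ["LlamaForCausalLM"]}

print(gguf_convertible(mobilellm_cfg))  # False
print(gguf_convertible(llama_cfg))      # True
```

In practice, simply searching convert_hf_to_gguf.py for the string `MobileLLMP1ForCausalLM` tells you the answer: if it isn't registered there, the converter rejects the model as an unsupported architecture.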
