Liquid AI Quantised Models
This repository contains Liquid AI quantised models, packaged for direct deployment with Ollama.
This is a quantized GGUF model (Q8_0) compatible with Ollama.
You can pull and run this model directly with Ollama:
ollama pull hf.co/Sadiah/ollama-q8_0-LFM2-2.6B:Q8_0
Then run it:
ollama run hf.co/Sadiah/ollama-q8_0-LFM2-2.6B:Q8_0 "Write your prompt here"
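If you prefer to call the model programmatically rather than through the CLI, the sketch below queries it via Ollama's local HTTP API. It assumes the model has already been pulled and that the Ollama server is running on its default port (11434).

import json
import urllib.request

# Model tag as pulled above
MODEL = "hf.co/Sadiah/ollama-q8_0-LFM2-2.6B:Q8_0"

# Build a non-streaming generate request for the local Ollama server
payload = json.dumps({
    "model": MODEL,
    "prompt": "Write your prompt here",
    "stream": False,  # return one JSON object instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

# Send the request and print the generated text
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])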
Please refer to the original model card for licensing information.
Quantisation: 8-bit (Q8_0)
Base model: LiquidAI/LFM2-2.6B