How to use jawk5/GJMiii with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="jawk5/GJMiii")

# Load the model and tokenizer directly
from transformers import AutoModelForCausalLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("jawk5/GJMiii")
model = AutoModelForCausalLM.from_pretrained("jawk5/GJMiii", dtype="auto")
How to use jawk5/GJMiii with vLLM:
# Install vLLM from pip:
pip install vllm
# Start the vLLM server:
vllm serve "jawk5/GJMiii"
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "jawk5/GJMiii",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
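The same OpenAI-compatible completions endpoint can be called from Python using only the standard library. A minimal sketch: it builds the same payload as the curl call above, and the actual request is left commented out because it needs the vLLM server running.

```python
import json
import urllib.request

def build_completion_request(model, prompt, max_tokens=512, temperature=0.5):
    """Build the JSON payload for the OpenAI-compatible /v1/completions endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

payload = build_completion_request("jawk5/GJMiii", "Once upon a time,")
request = urllib.request.Request(
    "http://localhost:8000/v1/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Requires the vLLM server started above; uncomment to send the request:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["choices"][0]["text"])
```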
How to use jawk5/GJMiii with SGLang:
# Install SGLang from pip:
pip install sglang
# Start the SGLang server:
python3 -m sglang.launch_server \
--model-path "jawk5/GJMiii" \
--host 0.0.0.0 \
--port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "jawk5/GJMiii",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
# Or launch the SGLang server via Docker:
docker run --gpus all \
--shm-size 32g \
-p 30000:30000 \
-v ~/.cache/huggingface:/root/.cache/huggingface \
--env "HF_TOKEN=<secret>" \
--ipc=host \
lmsysorg/sglang:latest \
python3 -m sglang.launch_server \
--model-path "jawk5/GJMiii" \
--host 0.0.0.0 \
--port 30000
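Besides the OpenAI-compatible route shown above, SGLang also serves a native /generate endpoint. A minimal stdlib sketch, assuming the default server settings from the launch commands above; the request itself is commented out because it needs the server running.

```python
import json
import urllib.request

# On SGLang's native /generate endpoint, sampling parameters go in a
# nested "sampling_params" object rather than at the top level.
payload = {
    "text": "Once upon a time,",
    "sampling_params": {"max_new_tokens": 512, "temperature": 0.5},
}
request = urllib.request.Request(
    "http://localhost:30000/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Requires the SGLang server started above; uncomment to send the request:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response)["text"])
```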
How to use jawk5/GJMiii with Docker Model Runner:
docker model run hf.co/jawk5/GJMiii
Lilith is a versatile and powerful AI assistant designed to perform a wide range of tasks, including text generation, answering scientific queries, solving mathematical problems, and more. Built using the OLMo model from AllenAI, Lilith is ideal for researchers, developers, and AI enthusiasts.
Clone the Repository:
git clone https://huggingface.co/jawk5/GJMiii
cd GJMiii
Set Up the Virtual Environment:
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
Install Dependencies:
pip install transformers
Run Lilith:
python lilith.py
Lilith> add_task Complete project documentation
Task 'Complete project documentation' added.
Lilith> list_tasks
- Complete project documentation
Lilith> science Explain quantum mechanics.
Lilith: Quantum mechanics is a fundamental theory in physics that describes...
Lilith> chat Hi, who are you?
Lilith: I am Lilith, your AI assistant. How can I help you today?
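The session above suggests a simple command dispatch loop. A hypothetical minimal sketch of how such a REPL could be structured; the actual lilith.py in the repository may differ.

```python
# Hypothetical sketch of a task-tracking command loop like the session above;
# the real lilith.py may be structured differently.
tasks = []

def handle(line):
    """Dispatch one 'Lilith>' command and return the reply text."""
    command, _, arg = line.partition(" ")
    if command == "add_task":
        tasks.append(arg)
        return f"Task '{arg}' added."
    if command == "list_tasks":
        return "\n".join(f"- {t}" for t in tasks)
    # 'science' and 'chat' would forward `arg` to the loaded OLMo model.
    return f"Unknown command: {command}"

print(handle("add_task Complete project documentation"))
print(handle("list_tasks"))
```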
This project is licensed under the MIT License. See the LICENSE file for details.
Base model
allenai/OLMo-2-1124-7B