
nateraw / wizard-mega-13b-awq
wizard-mega-13b quantized with AWQ and served with vLLM
- Public
- 5.5K runs
- GitHub
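
The description says the model is AWQ-quantized and served with vLLM. As a rough sketch, vLLM can load AWQ checkpoints via its OpenAI-compatible server with the `--quantization awq` flag; the exact flags your vLLM version accepts may differ, so treat this as an illustrative launch command rather than the author's deployment setup:

```shell
# Launch an OpenAI-compatible vLLM server for the AWQ checkpoint.
# Requires a CUDA GPU and downloads the weights from the Hugging Face Hub.
python -m vllm.entrypoints.openai.api_server \
  --model nateraw/wizard-mega-13b-awq \
  --quantization awq \
  --port 8000
```

Once running, the model answers standard OpenAI-style completion requests on `localhost:8000`.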