TheBloke’s quantized version of DeepSeek’s Coder 33B Instruct model in GGUF format. The full model card can be found here.
Specifically, this is the deepseek-coder-33b-instruct.Q5_K_M.gguf model, with a 16k context window.
A quantized 33B-parameter language model from DeepSeek for SOTA repository-level code completion.
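A minimal sketch of running this GGUF file locally, assuming the llama-cpp-python bindings are installed and the file has been downloaded to the working directory; the file name, GPU layer count, and prompt text are illustrative, and the `### Instruction:` / `### Response:` markers follow the DeepSeek Coder instruct prompt format described in the full model card.

```python
# Sketch only: any llama.cpp-compatible runtime can load the Q5_K_M GGUF.
from llama_cpp import Llama

llm = Llama(
    model_path="deepseek-coder-33b-instruct.Q5_K_M.gguf",  # the file named above
    n_ctx=16384,      # take advantage of the 16k context window
    n_gpu_layers=-1,  # offload all layers to GPU if available; use 0 for CPU-only
)

# Instruct-style prompt (assumed DeepSeek Coder template).
prompt = (
    "### Instruction:\n"
    "Write a Python function that checks whether a string is a palindrome.\n"
    "### Response:\n"
)

output = llm(prompt, max_tokens=256, stop=["### Instruction:"])
print(output["choices"][0]["text"])
```

A Q5_K_M quantization of a 33B model is still roughly 23 GB on disk, so partial GPU offload (a smaller `n_gpu_layers` value) may be needed on cards with limited VRAM.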