nateraw / mixtral-8x7b-32kseqlen

The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture-of-Experts model.
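Mixtral's sparse Mixture-of-Experts layers route each token to only 2 of 8 expert feed-forward networks, weighting their outputs by softmax-normalized router scores. A minimal sketch of that top-2 routing step (the logit values below are hypothetical, for illustration only):

```python
import math

def top2_route(logits):
    """Select the two highest-scoring experts for one token and
    softmax-normalize their scores into mixing weights."""
    top2 = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:2]
    m = max(logits[i] for i in top2)                      # for numerical stability
    exps = [math.exp(logits[i] - m) for i in top2]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(top2, exps)]

# Hypothetical router logits for one token over 8 experts.
logits = [0.1, 2.0, -1.0, 0.5, 1.5, -0.3, 0.0, 0.2]
routes = top2_route(logits)
# Experts 1 and 4 win; their two weights sum to 1, and the token's
# output is the weighted sum of just those two experts' outputs.
```

Because only 2 of 8 experts run per token, the model has the parameter count of all eight experts but roughly the inference cost of two.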

  • Public
  • 15K runs
  • Author: @nateraw
  • Version: 22.04 — defd13a4 (latest)