cuuupid / e5-mistral-7b-instruct

E5 text embeddings fine-tuned for instruction-following retrieval, based on Mistral-7B.


E5-mistral-7b-instruct

Improving Text Embeddings with Large Language Models. Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024

This model has 32 layers, produces 4096-dimensional embeddings, and has a context window of 32k tokens.
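As a rough sketch of how an instruct embedding model like this is queried: per the paper, each query is prefixed with a one-sentence task instruction (documents are embedded without a prefix), and the sentence embedding is taken from the hidden state of the final token. The helper names below are illustrative, and the pooling demo uses plain Python lists in place of real model hidden states:

```python
def get_detailed_instruct(task: str, query: str) -> str:
    # Queries are wrapped with a task instruction; documents are embedded as-is.
    return f"Instruct: {task}\nQuery: {query}"

def last_token_pool(hidden_states, attention_mask):
    # E5-mistral pools the hidden state of the last real (non-padded) token
    # in each sequence as that sequence's embedding.
    pooled = []
    for states, mask in zip(hidden_states, attention_mask):
        last = sum(mask) - 1  # index of the last non-padded position
        pooled.append(states[last])
    return pooled

task = "Given a web search query, retrieve relevant passages that answer the query"
print(get_detailed_instruct(task, "how much protein should a female eat"))

# Toy pooling demo: batch of 2 sequences, length 3, hidden size 4.
hidden = [
    [[0.0] * 4, [1.0] * 4, [9.0] * 4],  # last position is padding
    [[0.0] * 4, [1.0] * 4, [2.0] * 4],  # no padding
]
mask = [[1, 1, 0], [1, 1, 1]]
print(last_token_pool(hidden, mask))  # → [[1.0, 1.0, 1.0, 1.0], [2.0, 2.0, 2.0, 2.0]]
```

In a real pipeline, `hidden_states` and `attention_mask` would come from a forward pass of the 7B model and the pooled vectors would then be L2-normalized before computing cosine similarities.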

Model Card