| benchmark | type | submission_name |
| --------- | ---------- | --------------- |
| mteb | evaluation | MTEB |

> [!NOTE]
> Previously it was possible to submit model results to MTEB by adding them to the model metadata. This is no longer an option, as we want to ensure high-quality metadata.

This repository contains the results of the embedding benchmark, evaluated using the `mteb` package.
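For context, results like those stored here are typically produced by running an evaluation with `mteb`. Below is a minimal sketch based on the package's documented usage (`get_model`, `get_tasks`, `MTEB.run`); the model name and task are placeholder choices for illustration, not requirements of this repository.

```python
import mteb

# Any embedding model on the Hugging Face Hub (placeholder choice).
model_name = "sentence-transformers/all-MiniLM-L6-v2"
model = mteb.get_model(model_name)

# Select one or more benchmark tasks (placeholder task).
tasks = mteb.get_tasks(tasks=["Banking77Classification"])

# Run the evaluation; per-task JSON result files are written to output_folder.
evaluation = mteb.MTEB(tasks=tasks)
results = evaluation.run(model, output_folder=f"results/{model_name}")
```

The JSON files written to the output folder are, roughly, the kind of per-task results collected in this repository.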

| Reference | Description |
| --------- | ----------- |
| 🦾 Leaderboard | An up-to-date leaderboard of embedding models |
| 📚 mteb | Guides and instructions on how to use mteb, including running evaluations, submitting scores, etc. |
| 🙋 Questions | Questions about the results |
| 🙋 Issues | Issues or bugs you have found |