Plans for RRF? #82

Open
saswat0 opened this issue Jan 13, 2024 · 3 comments
Labels
enhancement New feature or request

Comments


saswat0 commented Jan 13, 2024

Hey @snexus
Are there any plans to include an option for RRF (Reciprocal Rank Fusion) alongside Marco and BGE for reranking?

snexus (Owner) commented Jan 14, 2024

Hi @saswat0

It crossed my mind as well. Are you aware of any benefits of RRF compared to a cross-encoder (besides speed)? RRF comes with manual hyper-parameters that the user would need to tune, such as the weights given to sparse vs. dense results.

saswat0 (Author) commented Jan 14, 2024

@snexus I've found hardly any upside except computational efficiency. In staging setups it gives us an additional degree of controllability (weighting) across the various retrievers. But when using a non-embedding-based sparse retriever (BM25, tf-idf), I found RRF to be the better bet. A rough sketch of what I mean is below.
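
For concreteness, here is a minimal sketch of weighted RRF. The retriever names, the weights, and the `k = 60` constant are illustrative assumptions, not anything from this repo:

```python
# Minimal sketch of weighted Reciprocal Rank Fusion (RRF).
# Retriever names, weights and k=60 are illustrative assumptions.

def rrf_fuse(rankings, weights=None, k=60):
    """Fuse several ranked lists of document ids.

    rankings: dict mapping retriever name -> list of doc ids, best first.
    weights:  optional dict of per-retriever weights (default 1.0 each).
    k:        RRF smoothing constant (60 in the original RRF paper).
    """
    scores = {}
    for name, ranked_ids in rankings.items():
        w = (weights or {}).get(name, 1.0)
        for rank, doc_id in enumerate(ranked_ids, start=1):
            # Each retriever contributes w / (k + rank) for every doc it returns.
            scores[doc_id] = scores.get(doc_id, 0.0) + w / (k + rank)
    # Highest fused score first.
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)


# Example: fuse a sparse (BM25) ranking with a dense ranking,
# weighting the dense results slightly higher.
fused = rrf_fuse(
    {"bm25": ["d3", "d1", "d7"], "dense": ["d1", "d2", "d3"]},
    weights={"bm25": 1.0, "dense": 1.5},
)
print(fused)
```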

snexus (Owner) commented Jan 14, 2024

Makes sense. I think it is worth implementing for the sake of feature completeness, though not as a priority.

snexus added the enhancement (New feature or request) label on Jan 14, 2024