Fact-checking with Iterative Retrieval and Verification
Updated Nov 3, 2024 - Python
Chrome extension for the ATLAS project.
Antibodies for LLM hallucinations (combining LLM-as-a-judge, NLI, and reward models)
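The entry above combines several detector families (LLM-as-a-judge, NLI entailment, reward models). A minimal sketch of how such signals might be combined by majority vote — the thresholds and score conventions here are assumptions for illustration, not that repo's actual API:

```python
def ensemble_hallucination_vote(
    judge_flags_hallucination: bool,
    nli_entailment_prob: float,
    reward_score: float,
    nli_threshold: float = 0.5,   # hypothetical: below this, claim is not entailed
    reward_threshold: float = 0.0  # hypothetical: below this, reward model disapproves
) -> bool:
    """Majority vote over three independent hallucination detectors.

    Each detector casts one vote for "hallucinated"; two or more votes
    flag the generation. All thresholds are illustrative assumptions.
    """
    votes = [
        judge_flags_hallucination,            # LLM-as-a-judge verdict
        nli_entailment_prob < nli_threshold,  # NLI: evidence does not entail the claim
        reward_score < reward_threshold,      # reward model scores the answer poorly
    ]
    return sum(votes) >= 2

# A confident hallucination: judge flags it, NLI finds no entailment, low reward.
flagged = ensemble_hallucination_vote(True, 0.1, -2.0)
# A clean answer: judge passes it, strong entailment, high reward.
clean = ensemble_hallucination_vote(False, 0.95, 1.5)
```

Majority voting is the simplest aggregation; weighted voting or a learned combiner over the raw scores are common refinements.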
Binary hallucination detection classifier using logistic regression
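A binary logistic-regression classifier like the one described above can be sketched in a few lines of plain Python. The two input features used here (mean token log-probability and overlap with retrieved evidence) are hypothetical stand-ins, not the features that repo actually uses:

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.5, epochs=500):
    """Fit weights and bias by stochastic gradient descent on log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log loss w.r.t. the logit
            for j in range(len(w)):
                w[j] -= lr * err * xi[j]
            b -= lr * err
    return w, b

def predict(w, b, x) -> bool:
    """True = classified as hallucinated."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5

# Toy training data; features are hypothetical:
# [mean token log-prob, overlap with retrieved evidence], label 1 = hallucinated.
X = [[-0.2, 0.90], [-0.3, 0.80], [-2.5, 0.10], [-3.0, 0.05]]
y = [0, 0, 1, 1]
w, b = train_logistic(X, y)
```

In practice one would use `sklearn.linear_model.LogisticRegression` with regularization and a held-out test set; this hand-rolled version just makes the mechanics explicit.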
Fully automated LLM evaluator
🔢 Hallucination detector for Large Language Models.
Different approaches to evaluating RAG.
API for the ATLAS project.
Hallucination in Chat-bots: Faithful Benchmark for Information-Seeking Dialogue
Detecting Hallucinations in Large Language Model Generations using Graph Structures
VideoHallucer: the first comprehensive benchmark for hallucination detection in large video-language models (LVLMs).
[ACL 2024] ANAH & [NeurIPS 2024] ANAH-v2
Competition: SemEval-2024 Task-6 - SHROOM, a Shared-task on Hallucinations and Related Observable Overgeneration Mistakes
[ACL 2024] An Easy-to-use Hallucination Detection Framework for LLMs.
An up-to-date curated list of state-of-the-art research on hallucinations in large vision-language models: papers and resources.
This repo hosts the Python SDK and related examples for AIMon, a proprietary, state-of-the-art system for detecting LLM quality issues such as hallucinations. It can be used during offline evals, continuous monitoring, or inline detection. We offer various model quality metrics that are fast, reliable, and cost-effective.
Code for the EMNLP 2024 paper "Detecting and Mitigating Contextual Hallucinations in Large Language Models Using Only Attention Maps"
Official repo for the paper PHUDGE: Phi-3 as Scalable Judge. Evaluate your LLMs with or without a custom rubric, a reference answer, absolute or relative grading, and much more. It also contains a list of available tools, methods, repos, and code for hallucination detection, LLM evaluation, grading, and more.