fact-checker

Fact-checking LLM outputs with self-ask

306 Stars · +26 Stars/month · 0 Releases (6m)

Overview

A proof-of-concept tool that fact-checks LLM outputs using a self-interrogation approach built on prompt chaining. An LLM generates an initial answer to a question, identifies the assumptions underlying that answer, systematically verifies each assumption, and finally produces a corrected response that incorporates what it learned. This creates a four-step verification loop: initial response → assumption identification → assumption verification → corrected answer.

The tool demonstrates how large language models can be prompted to catch and correct their own factual errors through structured self-questioning. While simple in implementation, it showcases an important technique for improving AI reliability: making a model explicitly examine its own reasoning. The approach is particularly effective at catching obvious factual errors where the model holds conflicting knowledge, as in the repository's example, where it correctly notes that mammals don't lay eggs after initially claiming that elephants lay the biggest eggs. As a research demonstration with 306 GitHub stars, it provides a clear foundation for understanding self-verification techniques in AI systems.
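The four-step loop can be sketched as a chain of prompts. This is an illustrative sketch, not the repository's actual code: the `ask_llm` callable stands in for whatever completion call `fact_checker.py` uses, and the prompt wording here is assumed, not taken from the source.

```python
def fact_check(question: str, ask_llm) -> dict:
    """Run the four-step loop: answer -> assumptions -> verification -> correction.

    `ask_llm` is any callable that takes a prompt string and returns the
    model's completion as a string (e.g. a wrapper around an LLM API).
    """
    # Step 1: initial answer, which may contain factual errors.
    initial = ask_llm(f"Question: {question}\nAnswer:")

    # Step 2: have the model list the assumptions behind its own answer.
    assumptions = ask_llm(
        f"Here is a statement:\n{initial}\n"
        "List the factual assumptions behind it, one per line."
    )

    # Step 3: verify each assumption independently with its own prompt.
    checks = [
        ask_llm(f"Is this assumption true? Answer briefly:\n{line}")
        for line in assumptions.splitlines()
        if line.strip()
    ]

    # Step 4: regenerate the answer in light of the verification results.
    corrected = ask_llm(
        f"Question: {question}\n"
        f"Initial answer: {initial}\n"
        "Assumption checks:\n" + "\n".join(checks) + "\n"
        "Using the checks above, write a corrected answer:"
    )
    return {
        "initial": initial,
        "assumptions": assumptions,
        "checks": checks,
        "corrected": corrected,
    }
```

Keeping the model call behind a plain callable makes the chain easy to test with a stub and to swap between providers; each step sees only the text produced by the previous one, which is the essence of the prompt-chaining design.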

Pros

  • + Simple and elegant demonstration of LLM self-verification through structured prompt chaining
  • + Effectively catches factual errors by forcing explicit examination of underlying assumptions
  • + Lightweight implementation that can be easily understood and modified for research purposes

Cons

  • - Limited to proof-of-concept status rather than production-ready fact-checking solution
  • - Relies on the same LLM for both initial answers and verification, creating potential circular reasoning
  • - May not catch subtle factual errors or complex reasoning flaws that require external knowledge sources

Use Cases

Getting Started

1. Clone the repository and ensure Python 3 is installed on your system.
2. Run the fact-checker with your question: `python3 fact_checker.py 'your question here'` (remember to wrap the question in quotes).
3. Alternatively, open and run the provided `fact_checker.ipynb` Jupyter notebook for an interactive experience.