Detect hallucinations in LLM responses. Verify every claim against source documents using hybrid STS + NLI. Works with LangChain, LlamaIndex, or any RAG pipeline. pip install longtracer
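The hybrid STS + NLI idea above can be sketched roughly as follows. This is an illustrative sketch only, not longtracer's actual API: the function names, weights, and stub scorers are assumptions, and a real pipeline would use e.g. a sentence-transformers model for STS and an NLI cross-encoder in their place.

```python
# Hypothetical sketch of hybrid STS + NLI claim verification.
# The two scorers are cheap stand-ins for real models.

def sts_score(claim: str, evidence: str) -> float:
    """Hypothetical semantic-similarity score in [0, 1] (stub: token Jaccard)."""
    a, b = set(claim.lower().split()), set(evidence.lower().split())
    return len(a & b) / max(len(a | b), 1)

def nli_entailment(claim: str, evidence: str) -> float:
    """Hypothetical P(evidence entails claim) (stub: containment, else STS)."""
    return 1.0 if claim.lower() in evidence.lower() else sts_score(claim, evidence)

def verify_claim(claim, sources, w_sts=0.4, w_nli=0.6, threshold=0.5):
    """Score the claim against every source sentence; keep the best support."""
    score, evidence = max(
        (w_sts * sts_score(claim, s) + w_nli * nli_entailment(claim, s), s)
        for s in sources
    )
    return {"claim": claim, "supported": score >= threshold,
            "score": round(score, 3), "evidence": evidence}

sources = ["The model was trained on 2 trillion tokens.",
           "Training used 1,024 GPUs."]
print(verify_claim("The model was trained on 2 trillion tokens.", sources))
print(verify_claim("The model wrote a poem.", sources))
```

Weighting NLI above STS reflects that entailment is the stronger signal for factual support; pure similarity can score a contradiction highly.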
python nlp monitoring verification tracing observability ai-safety nli rag llm langchain sentence- hallucination-detection claim-verification llamaindex guardrail
Updated Apr 6, 2026 - Python