SOTAVerified

Hallucination Papers

Showing 1671–1680 of 1816 papers

| Title | Status | Hype |
| --- | --- | --- |
| Evolutionary thoughts: integration of large language models and evolutionary algorithms | Code | 0 |
| Investigating and Mitigating Object Hallucinations in Pretrained Vision-Language (CLIP) Models | Code | 0 |
| Integrating Chemistry Knowledge in Large Language Models via Prompt Engineering | Code | 0 |
| Confidence Estimation for LLM-Based Dialogue State Tracking | Code | 0 |
| Instruction Makes a Difference | Code | 0 |
| SLPL SHROOM at SemEval2024 Task 06: A comprehensive study on models ability to detect hallucination | Code | 0 |
| Incorporating Task-specific Concept Knowledge into Script Learning | Code | 0 |
| Ever: Mitigating Hallucination in Large Language Models through Real-Time Verification and Rectification | Code | 0 |
| SmallPlan: Leverage Small Language Models for Sequential Path Planning with Simulation-Powered, LLM-Guided Distillation | Code | 0 |
| Improving Factuality in Large Language Models via Decoding-Time Hallucinatory and Truthful Comparators | Code | 0 |
Page 168 of 182

No leaderboard results yet.