SOTAVerified

Cross-modal retrieval with noisy correspondence

Noisy correspondence learning aims to mitigate the negative impact of mismatched pairs (e.g., false positives/negatives), as opposed to annotation errors in category labels, across a range of tasks.
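A strategy common to many of the methods listed below is to separate likely-clean from likely-mismatched pairs by their matching loss, since well-matched pairs tend to incur lower loss early in training (the small-loss criterion). The sketch below illustrates the idea with a simple mean-loss threshold; the function name and threshold choice are illustrative assumptions, and published methods typically fit a two-component mixture model to the loss distribution instead.

```python
import numpy as np

def partition_pairs(losses, threshold=None):
    """Split image-text pairs into 'clean' and 'noisy' index sets by the
    small-loss criterion: pairs with low matching loss are treated as true
    correspondences, high-loss pairs as mismatched (noisy) ones.
    """
    losses = np.asarray(losses, dtype=float)
    if threshold is None:
        # Illustrative split at the mean loss; real methods usually fit
        # a two-component Gaussian mixture over per-pair losses instead.
        threshold = losses.mean()
    clean = np.flatnonzero(losses <= threshold)
    noisy = np.flatnonzero(losses > threshold)
    return clean, noisy

# Toy per-pair matching losses: low values suggest true correspondence.
losses = [0.1, 0.2, 1.5, 0.15, 2.0]
clean_idx, noisy_idx = partition_pairs(losses)
print(clean_idx.tolist(), noisy_idx.tolist())  # → [0, 1, 3] [2, 4]
```

Once partitioned, the noisy subset is typically down-weighted, relabeled, or rematched rather than discarded outright.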

Papers

Showing 1–20 of 20 papers

| Title | Status | Hype |
| --- | --- | --- |
| ReCon: Enhancing True Correspondence Discrimination through Relation Consistency for Robust Noisy Correspondence Learning | Code | 1 |
| PC^2: Pseudo-Classification Based Pseudo-Captioning for Noisy Correspondence Learning in Cross-Modal Retrieval | Code | 1 |
| UGNCL: Uncertainty-Guided Noisy Correspondence Learning for Efficient Cross-Modal Matching | Code | 1 |
| Mitigating Noisy Correspondence by Geometrical Structure Consistency Learning | Code | 1 |
| Cross-modal Retrieval with Noisy Correspondence via Consistency Refining and Mining | Code | 1 |
| Learning to Rematch Mismatched Pairs for Robust Cross-Modal Retrieval | Code | 1 |
| Negative Pre-aware for Noisy Cross-modal Matching | Code | 1 |
| Cross-modal Active Complementary Learning with Self-refining Correspondence | Code | 1 |
| Noisy Correspondence Learning with Meta Similarity Correction | Code | 1 |
| BiCro: Noisy Correspondence Rectification for Multi-modality Data via Bi-directional Cross-modal Similarity Consistency | Code | 1 |
| Cross-Modal Retrieval with Partially Mismatched Pairs | Code | 1 |
| Deep Evidential Learning with Noisy Correspondence for Cross-Modal Retrieval | Code | 1 |
| Learning with Noisy Correspondence for Cross-modal Matching | Code | 1 |
| Breaking Through the Noisy Correspondence: A Robust Model for Image-Text Matching | | 0 |
| Learning with Noisy Correspondence | | 0 |
| NAC: Mitigating Noisy Correspondence in Cross-Modal Matching Via Neighbor Auxiliary Corrector | | 0 |
| REPAIR: Rank Correlation and Noisy Pair Half-replacing with Memory for Noisy Correspondence | | 0 |
| Noisy Correspondence Learning with Self-Reinforcing Errors Mitigation | | 0 |
| Learning From Noisy Correspondence With Tri-Partition for Cross-Modal Matching | | 0 |
| Integrating Language Guidance Into Image-Text Matching for Correcting False Negatives | Code | 0 |

No leaderboard results yet.