Metric Learning
A Metric Learning Reality Check, arXiv 2003 (ECCV 2020)
Important paper: shows that, under fair and consistent experimental settings, recent deep metric learning losses perform roughly on par with classic baselines such as the contrastive and triplet losses, and that much of the reported progress stems from inconsistent evaluation protocols.
Reducing Class Collapse in Metric Learning with Easy Positive Sampling
- Theoretically proves and empirically shows that, under reasonable noise assumptions, prevalent embedding losses in metric learning (e.g., the triplet loss) tend to project all samples of a class, across its various modes, onto a single point in the embedding space. The resulting class collapse usually renders the space ill-sorted for classification or retrieval.
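To make the loss family under discussion concrete, here is a minimal PyTorch-style sketch of a batch triplet loss with an optional easy-positive selection. The function names, margin value, and mining details are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def pairwise_dist(emb):
    # Euclidean distance matrix between all embeddings in the batch.
    return torch.cdist(emb, emb, p=2)

def triplet_loss(emb, labels, margin=0.2, easy_positive=False):
    # Assumes every class has at least two samples in the batch.
    d = pairwise_dist(emb)                                   # (B, B)
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=emb.device)
    pos_mask = same & ~eye
    neg_mask = ~same

    # Positive selection: the hardest (farthest) positive pulls every sample of
    # a class toward every other sample, which encourages collapse to a single
    # point; the easiest (nearest) positive only pulls together samples that
    # are already close, helping to preserve intra-class modes.
    if easy_positive:
        d_pos = d.masked_fill(~pos_mask, float('inf')).min(dim=1).values
    else:
        d_pos = d.masked_fill(~pos_mask, 0.0).max(dim=1).values

    # Hardest (closest) negative per anchor.
    d_neg = d.masked_fill(~neg_mask, float('inf')).min(dim=1).values

    return F.relu(d_pos - d_neg + margin).mean()
```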
Cross-Batch Memory for Embedding Learning, CVPR 2020 (oral)
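The title names the mechanism: a memory of embeddings that spans batches, so pairs can be mined against many more samples than a single batch provides. Below is a hedged sketch of that general idea, assuming a PyTorch training loop; the class name, queue size, and CPU storage are my assumptions, not the authors' released code.

```python
import torch

class CrossBatchMemory:
    """FIFO queue of (detached) embeddings and labels from past batches."""

    def __init__(self, dim, size=8192):
        self.feats = torch.zeros(size, dim)
        self.labels = torch.zeros(size, dtype=torch.long)
        self.size = size
        self.ptr = 0
        self.filled = 0

    @torch.no_grad()
    def enqueue(self, emb, labels):
        # Overwrite the oldest slots with the current batch; slightly stale
        # embeddings remain useful because features drift slowly during training.
        n = emb.size(0)
        idx = (self.ptr + torch.arange(n)) % self.size
        self.feats[idx] = emb.detach().cpu()
        self.labels[idx] = labels.cpu()
        self.ptr = (self.ptr + n) % self.size
        self.filled = min(self.filled + n, self.size)

    def get(self):
        # Everything stored so far, for mining pairs against the current batch.
        return self.feats[:self.filled], self.labels[:self.filled]
```

In use, each training step would compute a pair-based loss between the current batch and the memory contents, then enqueue the batch, which greatly increases the number of available negatives at little extra compute.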
NormFace: L2 Hypersphere Embedding for Face Verification, ACM MM 2017
SoftTriple Loss: Deep Metric Learning Without Triplet Sampling, ICCV 2019
Metric learning is useful for image classification, image retrieval, and few-shot learning.