Contrastive Learning

Core idea - the optimization objective:

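The original figures are not preserved here. A common way to state the objective (in the spirit of the CPC / InfoNCE formulation; the encoder $f$ and similarity function $\mathrm{score}$ are notational assumptions) is

$$
\mathrm{score}\big(f(x), f(x^{+})\big) \;\gg\; \mathrm{score}\big(f(x), f(x^{-})\big),
$$

which is typically optimized as a softmax cross-entropy over one positive and $N-1$ negatives:

$$
\mathcal{L} = -\,\mathbb{E}\!\left[\log \frac{\exp\big(\mathrm{score}(f(x), f(x^{+}))\big)}{\exp\big(\mathrm{score}(f(x), f(x^{+}))\big) + \sum_{j=1}^{N-1}\exp\big(\mathrm{score}(f(x), f(x_{j}^{-}))\big)}\right]
$$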

Typical comparison (score) functions:

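The original figure is missing; as a sketch, two standard choices are the dot product and the temperature-scaled cosine similarity:

$$
\mathrm{score}(u, v) = u^{\top} v
\qquad\text{or}\qquad
\mathrm{score}(u, v) = \frac{u^{\top} v}{\lVert u \rVert \, \lVert v \rVert \cdot \tau},
$$

where $\tau$ is a temperature hyper-parameter (the cosine form is the one used by SimCLR).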

Main research questions:

  • How to define the objective function?
  • How to construct positive and negative samples, and in what proportion

Related Work

arXiv 2018 - CPC

Representation Learning with Contrastive Predictive Coding

Proposes the InfoNCE contrastive loss.
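A minimal PyTorch sketch of the InfoNCE loss (not the authors' implementation; the tensor layout and the temperature value of 0.07 are assumptions):

```python
import torch
import torch.nn.functional as F

def info_nce_loss(query, positive_key, negative_keys, temperature=0.07):
    """InfoNCE: cross-entropy over one positive and K shared negatives.

    query:          (B, D) anchor representations
    positive_key:   (B, D) representations of the positives
    negative_keys:  (K, D) representations of the negatives
    """
    query = F.normalize(query, dim=-1)
    positive_key = F.normalize(positive_key, dim=-1)
    negative_keys = F.normalize(negative_keys, dim=-1)

    # Positive logits: (B, 1); negative logits: (B, K)
    pos_logits = torch.sum(query * positive_key, dim=-1, keepdim=True)
    neg_logits = query @ negative_keys.t()

    logits = torch.cat([pos_logits, neg_logits], dim=1) / temperature
    # The positive is always at index 0 of each row.
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```

In CPC itself the negatives come from other time steps and other sequences in the batch; here they are simply passed in as a tensor.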

CVPR 2020 - MoCo

Momentum Contrast for Unsupervised Visual Representation Learning


The earlier Memory Bank approach proposed storing the representations of all samples and randomly sampling negatives from the bank at each step.

MoCo decouples the negative-sample encoder from the mini-batch size: a queue maintains the current pool of negative candidates, and the parameters of the negative (key) encoder are not trained by back-propagation but are instead updated by a momentum update, i.e. they track a slowly moving average of the positive (query) encoder's parameters.

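Below is a simplified sketch of the two ingredients just described, a FIFO queue of negatives plus a momentum (EMA) update of the key encoder; the names, the momentum value 0.999, and the queue size are assumptions rather than the paper's exact code:

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def momentum_update(query_encoder, key_encoder, m=0.999):
    """The key (negative) encoder follows the query encoder as an
    exponential moving average instead of receiving gradients."""
    for q_param, k_param in zip(query_encoder.parameters(),
                                key_encoder.parameters()):
        k_param.data.mul_(m).add_(q_param.data, alpha=1.0 - m)

class NegativeQueue:
    """FIFO queue holding encoded keys from previous mini-batches,
    decoupling the size of the negative pool from the batch size."""
    def __init__(self, dim, size=65536):
        self.queue = F.normalize(torch.randn(size, dim), dim=1)
        self.ptr = 0

    @torch.no_grad()
    def enqueue_dequeue(self, keys):
        n = keys.size(0)
        assert self.queue.size(0) % n == 0  # for simplicity
        self.queue[self.ptr:self.ptr + n] = keys
        self.ptr = (self.ptr + n) % self.queue.size(0)
```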

ICML 2020 - SimCLR

A Simple Framework for Contrastive Learning of Visual Representations

SimCLR focuses on how the contrastive examples are constructed (positives via data augmentation, negatives taken from within the batch).

Key findings:

(1) The composition of data augmentations plays a critical role in defining effective prediction tasks (note: this is in the CV setting).
(2) Introducing a learnable nonlinear transformation between the representation and the contrastive loss substantially improves the quality of the learned representations (see the sketch below).
(3) Compared with supervised learning, contrastive learning benefits more from larger batch sizes and more training steps.

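Following up on finding (2), here is a minimal sketch of the projection head plus the NT-Xent loss used by SimCLR; the dimensions, the temperature of 0.5, and the class/function names are assumptions (see the official repos listed below for the real implementation):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectionHead(nn.Module):
    """The learnable nonlinear transformation from finding (2):
    representation h -> projection z used by the contrastive loss."""
    def __init__(self, in_dim=2048, hidden_dim=2048, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, h):
        return self.net(h)

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent: each of the 2N augmented views treats its counterpart as
    the positive and the other 2N - 2 views in the batch as negatives."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D)
    sim = z @ z.t() / temperature                        # (2N, 2N)
    sim.fill_diagonal_(float('-inf'))                    # mask self-similarity
    n = z1.size(0)
    # The positive of sample i is i + n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

# Usage sketch:
# h1, h2 = encoder(aug1(x)), encoder(aug2(x))   # two augmented views
# z1, z2 = head(h1), head(h2)
# loss = nt_xent_loss(z1, z2)
```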

Paper List

2020

• Contrastive Representation Learning: A Framework and Review, Phuc H. Le-Khac

• Supervised Contrastive Learning, Prannay Khosla, 2020, [pytorch*]

• A Simple Framework for Contrastive Learning of Visual Representations, Ting Chen, 2020, [pytorch, tensorflow*]

• Improved Baselines with Momentum Contrastive Learning, Xinlei Chen, 2020, [tensorflow]

• Contrastive Representation Distillation, Yonglong Tian, ICLR-2020, [pytorch*]

• COBRA: Contrastive Bi-Modal Representation Algorithm, Vishaal Udandarao, 2020

• What Makes for Good Views for Contrastive Learning?, Yonglong Tian, 2020

• Prototypical Contrastive Learning of Unsupervised Representations, Junnan Li, 2020

• Contrastive Multi-View Representation Learning on Graphs, Kaveh Hassani, 2020

• DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations, John M. Giorgi, 2020

• On Mutual Information in Contrastive Learning for Visual Representations, Mike Wu, 2020

• Semi-Supervised Contrastive Learning with Generalized Contrastive Loss and Its Application to Speaker Recognition, Nakamasa Inoue, 2020

2019

• Momentum Contrast for Unsupervised Visual Representation Learning, Kaiming He, 2019, [pytorch]

• Data-Efficient Image Recognition with Contrastive Predictive Coding, Olivier J. Hénaff, 2019

• Contrastive Multiview Coding, Yonglong Tian, 2019, [pytorch*]

• Learning Deep Representations by Mutual Information Estimation and Maximization, R Devon Hjelm, ICLR-2019, [pytorch]

• Contrastive Adaptation Network for Unsupervised Domain Adaptation, Guoliang Kang, CVPR-2019

2018

• Representation Learning with Contrastive Predictive Coding, Aaron van den Oord, 2018, [pytorch]

• Unsupervised Feature Learning via Non-Parametric Instance-level Discrimination, Zhirong Wu, CVPR-2018, [pytorch*]

• Adversarial Contrastive Estimation, Avishek Joey Bose, ACL-2018

Before 2017

• Time-Contrastive Networks: Self-Supervised Learning from Video, Pierre Sermanet, CVPR-2017

• Contrastive Learning for Image Captioning, Bo Dai, NeurIPS-2017, [lua*]

• Noise-Contrastive Estimation for Answer Selection with Deep Neural Networks, Jinfeng Rao, 2016, [torch]

• Improved Deep Metric Learning with Multi-class N-pair Loss Objective, Kihyuk Sohn, NeurIPS-2016, [pytorch]

• Learning Word Embeddings Efficiently with Noise-Contrastive Estimation, Andriy Mnih, NeurIPS-2013

• Noise-Contrastive Estimation: A New Estimation Principle for Unnormalized Statistical Models, Michael Gutmann, AISTATS-2010, [pytorch]

• Dimensionality Reduction by Learning an Invariant Mapping, Raia Hadsell, 2006

References

https://zhuanlan.zhihu.com/p/141141365

Representation Learning with Contrastive Predictive Coding

Momentum Contrast for Unsupervised Visual Representation Learning

A Simple Framework for Contrastive Learning of Visual Representations

A Survey on Contrastive Self-Supervised Learning

Unsupervised Reference-Free Summary Quality Evaluation via Contrastive Learning

Contrastive Learning with Adversarial Examples, NeurIPS 2020

