ML (6)
[Paper Review] A Simple Framework for Contrastive Learning of Visual Representations
Paper link: https://arxiv.org/abs/2002.05709
Like MoCo, this paper covers self..
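For reference, SimCLR's core objective is the NT-Xent contrastive loss computed over two augmented views of each image in a batch. Below is a minimal PyTorch-style sketch of that loss; the function name and the batch layout (view 1 stacked above view 2) are my own choices, not taken from the paper's code.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    # z1, z2: [N, D] projection-head outputs of two augmented views of the same N images
    N = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # [2N, D] unit vectors
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # exclude self-similarity
    # the positive for row i is the other view of the same image: i <-> i + N
    targets = torch.cat([torch.arange(N, 2 * N), torch.arange(0, N)])
    return F.cross_entropy(sim, targets)
```

Treating each row as a classification over the other 2N - 1 views and masking the diagonal reproduces the softmax form of the loss; the temperature is a tunable hyperparameter.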
[Paper Review] Momentum Contrast for Unsupervised Visual Representation Learning
Paper link: https://arxiv.org/abs/1911.05722
This paper comes from Facebook AI Resera..
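MoCo's two key ideas are a dictionary built as a queue of encoded keys and a key encoder updated as a moving (momentum) average of the query encoder. A rough sketch of those two mechanisms, with illustrative class and argument names (not the official implementation):

```python
import torch
import torch.nn.functional as F

class MoCoDictionary:
    """Queue-based dictionary plus momentum-updated key encoder (illustrative sketch)."""

    def __init__(self, encoder_q, encoder_k, dim=128, queue_size=65536, momentum=0.999):
        self.encoder_q, self.encoder_k, self.m = encoder_q, encoder_k, momentum
        self.queue = F.normalize(torch.randn(queue_size, dim), dim=1)  # negatives, FIFO
        self.ptr = 0
        # key encoder starts as a copy of the query encoder; it receives no gradients
        for pq, pk in zip(encoder_q.parameters(), encoder_k.parameters()):
            pk.data.copy_(pq.data)
            pk.requires_grad = False

    @torch.no_grad()
    def momentum_update(self):
        # key encoder follows the query encoder as an exponential moving average
        for pq, pk in zip(self.encoder_q.parameters(), self.encoder_k.parameters()):
            pk.data.mul_(self.m).add_(pq.data, alpha=1.0 - self.m)

    @torch.no_grad()
    def enqueue(self, keys):
        # overwrite the oldest entries (assumes queue_size is a multiple of the batch size)
        n = keys.size(0)
        self.queue[self.ptr:self.ptr + n] = keys
        self.ptr = (self.ptr + n) % self.queue.size(0)
```

The queue decouples the dictionary size from the batch size, and the slow momentum update keeps the keys in the queue consistent with each other.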
[Paper Review] FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence
Paper link: https://arxiv.org/abs/2001.07685
This paper is about Semi..
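FixMatch combines the two SSL methods named in the title, consistency regularization and pseudo-labeling: the prediction on a weakly augmented image becomes the pseudo-label for its strongly augmented counterpart, but only when the model is confident. A minimal sketch of the unlabeled-data loss; the function and argument names are mine, and the augmentation pipelines are omitted:

```python
import torch
import torch.nn.functional as F

def fixmatch_unlabeled_loss(model, weak_batch, strong_batch, threshold=0.95):
    # pseudo-label the weakly augmented view without tracking gradients
    with torch.no_grad():
        probs = F.softmax(model(weak_batch), dim=1)
        conf, pseudo_labels = probs.max(dim=1)
        mask = (conf >= threshold).float()  # keep only confident predictions
    # train the strongly augmented view to match the retained pseudo-labels
    per_sample = F.cross_entropy(model(strong_batch), pseudo_labels, reduction='none')
    return (per_sample * mask).mean()
```

This term is added to the ordinary supervised cross-entropy on the labeled batch; the confidence threshold is a hyperparameter (values around 0.95 are typical).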
[Paper Review] iCaRL: Incremental Classifier and Representation Learning
This review was put together based on the detailed paper-explanation video by the YouTuber '동빈나' and the arXiv paper below.
Paper link: https://arxiv.org/abs/1611.07725
[Paper Review] Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning
Paper link: https://proceedings.neurips.cc/paper/2016/hash/30ef30b64204a3088a26bc2e6ecf7602-Abstract.html
[Paper Review] Learning Without Forgetting (ECCV 2016)
This is an early continual-learning paper published at ECCV 2016. It proposes a way to learn new tasks in a continual setting. Several approaches are possible here, and the paper compares the performance and the pros and cons of the existing methods, Fine-tuning, Feature Extraction, and Joint Training, against the proposed Learning without Forgetting.
- Feature Extraction: the shared parameters and the old-task parameters are left untouched, and the output of one or more layers is used as the feature for the new task (see the sketch below). This fails to perform well on the new task, because the shared..
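As a concrete reading of the Feature Extraction baseline described above: everything that already exists is frozen, and only a new output layer is trained on the new task. A minimal PyTorch sketch, with a hypothetical helper name and arguments:

```python
import torch.nn as nn

def feature_extraction_head(backbone, feature_dim, num_new_classes):
    # shared and old-task parameters stay frozen ("untouched")
    for p in backbone.parameters():
        p.requires_grad = False
    # only the new task's output layer is trained on the new data
    new_head = nn.Linear(feature_dim, num_new_classes)
    return nn.Sequential(backbone, new_head)
```

Fine-tuning would instead leave the backbone trainable, which helps the new task but degrades old-task performance; Learning without Forgetting adds a distillation-style loss on the old task's outputs so the shared parameters can adapt without forgetting.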