unsupervised learning

A Simple Framework for Contrastive Learning of Visual Representations

SimCLR, a simple framework for contrastive self-supervised learning, generates positive pairs via data augmentation, applies a nonlinear projection head, and optimizes a normalized temperature-scaled cross-entropy (NT-Xent) loss over large batches, achieving state-of-the-art results in self-supervised, semi-supervised, and transfer learning.
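The NT-Xent loss mentioned above can be sketched compactly. This is a minimal NumPy illustration, not the paper's reference implementation: it assumes a batch layout where rows (2k, 2k+1) of the projection matrix `z` are the two augmented views of example k, and the temperature default of 0.5 is one of the values the paper sweeps over.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    """NT-Xent loss over 2N projections; rows (2k, 2k+1) are the
    two augmented views of example k (assumed layout)."""
    # L2-normalize so dot products are cosine similarities
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature          # pairwise scaled similarities
    np.fill_diagonal(sim, -np.inf)       # exclude self-similarity from the denominator
    n = z.shape[0]
    pos = np.arange(n) ^ 1               # each view's positive is its paired view
    # log-softmax of the positive against all other 2N-1 samples
    log_prob = sim[np.arange(n), pos] - np.log(np.exp(sim).sum(axis=1))
    return -log_prob.mean()
```

Intuitively, each view is classified against its positive among all other samples in the batch, which is why large batch sizes help: more negatives make the contrastive task harder and the representations stronger.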

A critique of pure learning and what artificial neural networks can learn from animal brains

Development of artificial neural networks should leverage the insight that much of animal behavior is innate, arising from wiring rules encoded in the genome and shaped by billions of years of evolution.

Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations

A large-scale, comprehensive study challenges common assumptions in the unsupervised learning of disentangled representations, motivating future work to demonstrate concrete benefits of disentanglement in robust, reproducible experimental setups.