A Simple Framework for Contrastive Learning of Visual Representations

SimCLR is a simple framework for contrastive self-supervised learning of visual representations. It combines strong data augmentation to form positive pairs, a nonlinear projection head, a normalized temperature-scaled cross-entropy (NT-Xent) loss, and large batch sizes, achieving state-of-the-art results in self-supervised, semi-supervised, and transfer learning settings.
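The NT-Xent loss mentioned above can be sketched with NumPy; this is a minimal illustration, not the paper's implementation (which also includes the augmentation pipeline and projection head), and the pairing convention of rows i and i+N being two views of the same image is an assumption of this sketch.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    """Normalized temperature-scaled cross-entropy (NT-Xent) loss.

    z: array of shape (2N, d); by this sketch's convention, rows i and
    i+N hold embeddings of two augmented views of the same image.
    """
    n = z.shape[0] // 2
    # L2-normalize so dot products are cosine similarities
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature           # (2N, 2N) scaled similarity matrix
    np.fill_diagonal(sim, -np.inf)        # exclude each sample's self-similarity
    # the positive for row i is row (i + N) mod 2N
    pos_idx = (np.arange(2 * n) + n) % (2 * n)
    # cross-entropy of the positive against all other pairs in the batch
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos_idx].mean()
```

Because all non-positive pairs in the batch serve as negatives, larger batches supply more negatives per example, which is one reason the paper benefits from large batch sizes.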

Challenging Common Assumptions in the Unsupervised Learning of Disentangled Representations

A large-scale, comprehensive study challenging common assumptions in learning disentangled representations; it finds little evidence that disentanglement can be achieved without inductive biases or supervision, and argues that future work should demonstrate concrete benefits of disentanglement under sound experimental setups.

Automated Curriculum Learning for Neural Networks

Investigates automatically generating curricula from a variety of learning-progress signals computed for each data sample.
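The idea of steering training toward whatever currently yields the most learning progress can be sketched as an adversarial bandit over tasks; this is an illustrative toy, assuming a simple Exp3-style update and using loss decrease as the progress reward (the class name and hyperparameters are invented for the sketch).

```python
import numpy as np

class CurriculumBandit:
    """Exp3-style bandit over tasks, rewarded by learning progress.

    Each arm is a task (or data bucket); after training on a sample from
    the chosen arm, the caller reports a reward such as the resulting
    decrease in loss, and sampling shifts toward high-progress arms.
    """

    def __init__(self, n_tasks, lr=0.1, eps=0.05):
        self.w = np.zeros(n_tasks)  # log-weights over tasks
        self.lr, self.eps = lr, eps

    def probs(self):
        # softmax over weights, mixed with uniform exploration
        p = np.exp(self.w - self.w.max())
        p /= p.sum()
        return (1 - self.eps) * p + self.eps / len(self.w)

    def choose(self, rng):
        return rng.choice(len(self.w), p=self.probs())

    def update(self, arm, reward):
        # importance-weighted update so rarely chosen arms are not penalized
        self.w[arm] += self.lr * reward / self.probs()[arm]
```

A training loop would call `choose`, draw a batch from that task, and feed the observed loss decrease back through `update`, so the curriculum adapts as tasks are mastered and stop yielding progress.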

Automatic Goal Generation for Reinforcement Learning Agents

Applies curriculum learning in an RL setting by automatically generating goals at an appropriate level of difficulty for the agent's current policy, allowing it to learn to reach a wide range of goals.