Challenges the popular belief in neuroscience that the field is primarily data-limited: treats a microprocessor as a model organism and applies modern neuroscience data-analysis methods to it, which yield generally poor insight into its information processing.
Demonstrates that humans use scene information to guide search toward likely target sizes, resulting in higher miss rates for mis-scaled targets; object-detection DNNs show no such effect.
Success of reasonably sized neural networks hinges on symmetry, locality, and polynomial log-probability in data from the natural world.
The Transformer, a sequence transduction model that replaces recurrent layers and relies entirely on attention mechanisms, achieves a new SotA on machine translation tasks while significantly reducing training time.
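The core operation the Transformer builds on is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy sketch (single head, no masking or projections, which the full model adds on top):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V                        # convex combination of values

# Each output row is a weighted average of the value rows,
# weighted by how strongly the query matches each key.
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0]])
V = np.array([[1.0, 2.0], [3.0, 4.0]])
out = scaled_dot_product_attention(Q, K, V)
```

Because query 0 matches key 0 most strongly, the first output row is pulled toward the first value row, and symmetrically for the second.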
Reviews recent research in intuitive physics, spanning knowledge-based and learning-based approaches, and argues for a probabilistic simulation framework that explains human intuitive-physics predictions better than earlier heuristic models.
Investigates automatically generating curricula from a variety of learning-progress signals computed for each data sample.
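One common instantiation of this idea is to pick the next training task with probability tied to its recent learning progress. A minimal sketch, assuming progress is simply the recent reduction in loss per task (the choice of progress signal and the softmax selection rule here are illustrative, not the paper's exact formulation):

```python
import math
import random

def pick_task(progress, temperature=1.0):
    """Sample a task index with probability proportional to
    exp(progress / temperature), favoring tasks where the
    learner is currently improving fastest."""
    weights = [math.exp(p / temperature) for p in progress]
    total = sum(weights)
    r = random.random() * total
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1  # guard against float rounding

# Task 0 has shown far more recent loss reduction, so it is
# selected almost every time.
progress = [10.0, 0.0, 0.0]
choices = [pick_task(progress) for _ in range(100)]
```

The temperature controls exploration: a high temperature flattens the distribution so stalled tasks still get revisited.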