My goal is to understand how neural systems – both biological and artificial – perform visual perception.
The final version of our ECCV 2018 paper on visualizing invariances in convolutional neural networks is available. We find that early and mid-level convolutional layers in VGG-19 exhibit various forms of response invariance: near-perfect phase invariance in some units and invariance to local diffeomorphic transformations in others. At the same time, we uncover representational differences with ResNet-50 in its corresponding layers.
Variability in neuronal responses to identical stimuli is frequently correlated across a population. Attention is thought to reduce these correlations by suppressing noisy inputs shared by the population. However, even with precise control of the visual stimulus, the subject’s attentional state varies across trials. In 2016, we put forward the hypothesis that such fluctuations in attentional state could account for some of the correlated variability observed in cortical areas. To address this question empirically, we designed a novel paradigm that allows us to manipulate the strength of attentional fluctuations.
In the new paper just published in Nature Communications, we recorded from monkeys’ primary visual cortex (V1) while they were performing this task. We found both a pronounced effect of attentional fluctuations on correlated variability at long timescales and attention-dependent reductions in correlations at short timescales. These effects predominate in layers 2/3, as expected from a feedback signal such as attention.
Our paper on one-shot segmentation in clutter has been accepted to ICML. In this paper, we tackle a one-shot visual search task: based on a single instruction example (the red Φ in the image below), the goal is to find the same letter in a cluttered image that consists of many letters (left) and segment it. This task is quite hard for computer vision systems, because the clutter consists of other letters (i.e. very similar statistics), and the target letters can have arbitrary colors, are drawn by different people, are transformed by affine transformations, and have not been seen during training.
Marissa Weis and Max Günthner have started their Master’s thesis projects on March 1st. Marissa will be working on image processing using foveated image representations. Max will be investigating nonlinearities in neural responses in primary visual cortex using techniques to visualize convolutional neural networks.
In the review, written by Leon Gatys, Matthias Bethge and myself, we discuss recent advances in texture synthesis using Convolutional Neural Networks (CNNs) that were motivated by visual neuroscience and have led to a substantial advance in image synthesis and manipulation in computer vision. We also discuss how these advances can in turn inspire new research in visual perception and computational neuroscience.
Our psychophysical evaluation of our CNN-based texture model is now available on bioRxiv. In the study led by Tom Wallis, we compared our recent parametric model of texture appearance (CNN model), which uses the features encoded by a deep convolutional neural network (VGG-19), with two other models: the venerable Portilla and Simoncelli model (PS) and an extension of the CNN model in which the power spectrum is additionally matched.
We just put a preprint on arXiv describing a number of improvements to the style transfer algorithm we developed a while ago. These new features include spatial control, color control and scale control.
Spatial control: applying different styles to different parts of the image (panel b).
Color control: transferring only the style of a painting, but keeping the colors of the original photograph (panel c). You can find additional examples in our blog post on blog.deepart.io.
Scale control: combining small-scale features of one style with large-scale features of another.
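The spatial control described above can be sketched in a few lines. This is a minimal NumPy illustration, not the code from our paper: it assumes CNN feature maps are available as a `(C, H, W)` array and shows how a spatial guidance mask restricts the Gram-matrix style statistics to one region of the image, so that different regions can be matched to different styles. The function name `masked_gram` and the toy data are hypothetical.

```python
import numpy as np

def masked_gram(features, mask):
    """Gram matrix of CNN features restricted to a spatial region.

    features: array of shape (C, H, W) -- feature maps from one layer
    mask:     array of shape (H, W), values in [0, 1] -- guidance region
    """
    C, H, W = features.shape
    f = (features * mask).reshape(C, H * W)  # zero out features outside the region
    return f @ f.T / mask.sum()              # (C, C) second-order statistics

# Toy example: left and right halves of the image get separate style statistics
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))
left = np.zeros((8, 8))
left[:, :4] = 1.0
right = 1.0 - left
G_left = masked_gram(feats, left)
G_right = masked_gram(feats, right)
```

In a style-transfer loss, `G_left` and `G_right` would each be matched to the Gram matrix of a different style image, which is what lets panel (b) apply two styles to two parts of one photograph.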
In a new paper that just came out in the Journal of Neuroscience we investigate how unobserved fluctuations in attentional state can induce correlated variability among neuronal populations. Interestingly, we found that an extremely simple model that treats attentional state as a shared gain can explain a wide variety of experimental findings on correlated variability.
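The intuition behind the shared-gain account can be reproduced in a toy simulation. The sketch below is not the model from the paper, just a minimal illustration under simple assumptions: each neuron emits Poisson spike counts whose rate is scaled by a single gain value drawn once per trial (standing in for the unobserved attentional state). Because every neuron shares the same gain, trial-to-trial variability becomes positively correlated across the population. All parameter values here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_neurons = 5000, 20
base_rate = 10.0  # mean spike count per trial without gain fluctuations

# One shared gain sample per trial, modeling the unobserved attentional state
gain = 1.0 + 0.2 * rng.standard_normal(n_trials)
gain = np.clip(gain, 0.1, None)

# Each neuron: Poisson counts with its rate scaled by the shared gain
counts = rng.poisson(base_rate * gain[:, None], size=(n_trials, n_neurons))

# Average pairwise correlation of trial-to-trial variability
C = np.corrcoef(counts.T)
off_diag = C[~np.eye(n_neurons, dtype=bool)]
print(off_diag.mean())  # positive: the shared gain induces correlated variability
```

Removing the gain fluctuations (setting the gain to a constant) drives the mean pairwise correlation to zero, which is the sense in which attentional fluctuations alone can generate the correlations observed experimentally.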
Our paper A neural algorithm of artistic style (posted on arXiv on Aug 26) is ranked #9 among the most widely discussed and shared academic papers in 2015. See here for the top 100 list and the Altmetric report for our paper. Remarkably, in the all-time stats it ranks #102 out of more than 4.6 million articles.