Miscellaneous 2018

Decoding the direction of implied motion in human early visual cortex

{Implied motion perception is a striking case of our capacity to infer motion features from static pictures that imply movement. At a higher, cognitive level, the mere configuration of an object (such as a snapshot of a walking human) can imply motion in a directional way. Previous studies have shown that implied motion processing recruits direction-selective neurons and activates cortical motion processing regions. However, it is unknown whether object-processing regions or early visual regions are involved in implied motion processing. In the present study we used fMRI and multivariate pattern classification to examine which human brain regions differentiate implicit direction information in static images of implied motion. We hence examined BOLD activity patterns within independently defined early visual (V1-V3), motion (V5+/MT+) and object-processing (LO1, LO2) regions when participants viewed still images with directional implied motion (rightward vs. leftward). The stimuli contained both animate (birds) and inanimate (airplanes, cars) objects as sources of implied motion. The objects were presented at the center of the visual field on a horizontally blurred background in the periphery. We found that response patterns in visual areas V2, V3, human motion complex V5+/MT+, and object-responsive region LO2 coded for the direction of the implied motion stimuli significantly better than chance. Decoding in visual areas V1 and LO1 was at chance level. We then examined decoding in retinotopically defined foveal and peripheral representations of V1-V3. Only the foveal representation was stimulated by the foreground objects; the periphery was stimulated by the blurred background. We found that peripheral V1-V3 allowed decoding of implied motion directions, while foveal representations did not. Hence, high-level information of implied motion directionality is represented in peripheral V1-V3, i.e., regions that were never given the information through bottom-up stimulation.
This suggests that higher-level cognitive processes (potentially based in LO2, V5+/MT+) detect implied motion direction based on object configuration and feed it back to cover the peripheral context in early visual cortex, potentially encoding expected background motion. The results provide direct evidence for information in early visual cortex originating from feedback, compatible with predictive coding theory.}
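The decoding analysis described above can be illustrated with a minimal, self-contained sketch: a linear classifier is trained on voxel response patterns to predict stimulus direction (leftward vs. rightward), evaluated with leave-one-run-out cross-validation, and its accuracy compared against the 50% chance level. All data below are synthetic, and the voxel count, effect size, and nearest-centroid classifier are illustrative assumptions, not the authors' actual pipeline.

```python
# Illustrative sketch of MVPA "decoding" on synthetic data (NOT the
# authors' pipeline): classify implied-motion direction from simulated
# voxel patterns and compare accuracy to the 50% chance level.
import numpy as np

rng = np.random.default_rng(0)

n_runs, n_trials_per_run, n_voxels = 8, 20, 100  # assumed, for illustration
# Synthetic "BOLD patterns": a weak direction-dependent signal plus noise.
signal = rng.normal(size=n_voxels)
X, y, runs = [], [], []
for run in range(n_runs):
    for _ in range(n_trials_per_run):
        label = int(rng.integers(2))      # 0 = leftward, 1 = rightward
        pattern = (1 if label else -1) * 0.3 * signal + rng.normal(size=n_voxels)
        X.append(pattern); y.append(label); runs.append(run)
X, y, runs = np.asarray(X), np.asarray(y), np.asarray(runs)

def fit_centroid_classifier(X_tr, y_tr):
    """Nearest-centroid linear classifier (simple stand-in for an SVM)."""
    m0, m1 = X_tr[y_tr == 0].mean(0), X_tr[y_tr == 1].mean(0)
    w = m1 - m0                  # discriminant direction
    b = -w @ (m0 + m1) / 2       # threshold halfway between centroids
    return w, b

# Leave-one-run-out cross-validation, a common fMRI decoding scheme.
accs = []
for held_out in range(n_runs):
    train, test = runs != held_out, runs == held_out
    w, b = fit_centroid_classifier(X[train], y[train])
    pred = (X[test] @ w + b > 0).astype(int)
    accs.append((pred == y[test]).mean())

mean_acc = float(np.mean(accs))
print(f"decoding accuracy: {mean_acc:.2f} (chance = 0.50)")
```

A "chance level" result in this framing corresponds to cross-validated accuracy near 0.50; above-chance decoding indicates that the region's response patterns carry directional information.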

Author(s): Altan, G and Bartels, A
Book Title: 48th Annual Meeting of the Society for Neuroscience (Neuroscience 2018)
Year: 2018
BibTeX Type: Miscellaneous (misc)
Electronic Archiving: grant_archive

BibTeX

@misc{item_3005951,
  title = {{Decoding the direction of implied motion in human early visual cortex}},
  booktitle = {{48th Annual Meeting of the Society for Neuroscience (Neuroscience 2018)}},
  abstract = {{Implied motion perception is a striking case of our capacity to infer motion features from static pictures that imply movement. At a higher, cognitive level, the mere configuration of an object (such as a snapshot of a walking human) can imply motion in a directional way. Previous studies have shown that implied motion processing recruits direction-selective neurons and activates cortical motion processing regions. However, it is unknown whether object-processing regions or early visual regions are involved in implied motion processing. In the present study we used fMRI and multivariate pattern classification to examine which human brain regions differentiate implicit direction information in static images of implied motion. We hence examined BOLD activity patterns within independently defined early visual (V1-V3), motion (V5+/MT+) and object-processing (LO1, LO2) regions when participants viewed still images with directional implied motion (rightward vs. leftward). The stimuli contained both animate (birds) and inanimate (airplanes, cars) objects as sources of implied motion. The objects were presented at the center of the visual field on a horizontally blurred background in the periphery. We found that response patterns in visual areas V2, V3, human motion complex V5+/MT+, and object-responsive region LO2 coded for the direction of the implied motion stimuli significantly better than chance. Decoding in visual areas V1 and LO1 was at chance level. We then examined decoding in retinotopically defined foveal and peripheral representations of V1-V3. Only the foveal representation was stimulated by the foreground objects; the periphery was stimulated by the blurred background. We found that peripheral V1-V3 allowed decoding of implied motion directions, while foveal representations did not. Hence, high-level information of implied motion directionality is represented in peripheral V1-V3, i.e., regions that were never given the information through bottom-up stimulation.
This suggests that higher-level cognitive processes (potentially based in LO2, V5+/MT+) detect implied motion direction based on object configuration and feed it back to cover the peripheral context in early visual cortex, potentially encoding expected background motion. The results provide direct evidence for information in early visual cortex originating from feedback, compatible with predictive coding theory.}},
  year = {2018},
  slug = {item_3005951},
  author = {Altan, G and Bartels, A}
}