Modes of effective connectivity within cortical pathways are distinguished for different categories of visual context: An fMRI study

Qiong Wu, Jinglong Wu, Shigeko Takahashi, Qiang Huang, Hongzan Sun, Qiyong Guo, Yoshio Ohtani, Yoshimichi Ejima, Xu Zhang, Chunlin Li*, Tianyi Yan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Context contributes to accurate and efficient information processing. To reveal the dynamics of the neural mechanisms underlying the processing of visual context during the recognition of the color, shape, and 3D structure of objects, we carried out functional magnetic resonance imaging (fMRI) while subjects judged the contextual validity of three types of visual context. Our results demonstrated that the modes of effective connectivity in the cortical pathways, as well as the patterns of activation in these pathways, varied dynamically with the nature of the visual context. While the fusiform gyrus, superior parietal lobe, and inferior prefrontal gyrus were activated by all three visual contexts, the temporal and parahippocampal gyrus/amygdala (PHG/Amg) cortices were activated only by the color context. We further carried out dynamic causal modeling (DCM) analysis to characterize the effective connectivity involved in processing the three types of contextual information. DCM revealed dynamic connections and collaborations among brain regions belonging to the previously identified ventral and dorsal visual pathways.
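
For readers unfamiliar with the method, DCM for fMRI conventionally describes neuronal dynamics with a bilinear state equation (this is the standard formulation of the technique, not the specific model space used in this study, which is described in the full text):

\[
\dot{z} = \Bigl(A + \sum_{j} u_j\, B^{(j)}\Bigr) z + C\, u ,
\]

where \(z\) is the neuronal state vector of the modeled regions, \(A\) encodes intrinsic (endogenous) connectivity, \(B^{(j)}\) encodes the modulation of connections by the \(j\)-th experimental input \(u_j\) (here, the contextual conditions), and \(C\) encodes the direct driving influence of inputs on regions; the neuronal states are then passed through a hemodynamic forward model to predict the measured BOLD signal.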

Original language: English
Article number: 64
Journal: Frontiers in Behavioral Neuroscience
Volume: 11
DOIs
Publication status: Published - 23 May 2017

Keywords

  • 3D-depth context
  • Color context
  • Contextual information
  • Dynamic causal modeling
  • fMRI
  • Shape context
