Compositions of visual scenes are related here to neural signals in visual cortex and to cortical circuit models in order to understand the neural mechanisms of perceptual feature grouping. Starting from the hypothesis that synchronization and decoupling of cortical γ-activities (35–90 Hz) define the relations among visual objects, we concentrate on synchronization related to (1) static retinal stimulation during ocular fixation and (2) transient stimulation by sudden shifts in object position. The synchronization hypothesis has been tested by analyzing signal correlations in the visual cortex of monkeys, with the following results: Static retinal stimuli induce loosely phase-coupled γ-activities among neurons of an object's cortical representation. Patches of γ-synchronization become decoupled across the representation of an object's contour and can thereby code figure-ground segregation. Transient stimuli evoke synchronized volleys of stimulus-locked activities that are typically non-rhythmic and include low-frequency components in addition to those in the γ-range. It is argued that stimulus-induced and stimulus-locked synchronizations may play different roles in perceptual feature grouping.
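As a minimal sketch of the kind of correlation analysis referred to above (not the authors' actual pipeline), the example below band-pass filters two surrogate recordings to the γ-range (35–90 Hz) and computes a normalized cross-correlogram; a peak near zero lag would indicate phase coupling. The signals, sampling rate, and filter settings are illustrative assumptions.

```python
# Illustrative sketch: quantify gamma-band phase coupling between two traces.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                      # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)  # 2 s of simulated data

# Two surrogate "recordings": a shared 50 Hz gamma component plus independent noise
rng = np.random.default_rng(0)
sig_a = np.sin(2 * np.pi * 50 * t) + 0.5 * rng.standard_normal(t.size)
sig_b = np.sin(2 * np.pi * 50 * t + 0.3) + 0.5 * rng.standard_normal(t.size)

# Band-pass both signals to the gamma range (35-90 Hz)
b, a = butter(4, [35 / (fs / 2), 90 / (fs / 2)], btype="band")
ga, gb = filtfilt(b, a, sig_a), filtfilt(b, a, sig_b)

# Normalized cross-correlogram over +/- 100 ms lags
lags = np.arange(-100, 101)
xcorr = np.array([
    np.corrcoef(ga[100:-100], gb[100 + k:gb.size - 100 + k])[0, 1]
    for k in lags
])
print(f"peak correlation {xcorr.max():.2f} at lag {lags[xcorr.argmax()]} ms")
```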