Basic components of cortical
processing are shared in visual
and auditory modality
Dissertation
submitted in fulfilment of the requirements for the academic degree of
doctor rerum naturalium
(Dr. rer. nat.)
approved by the Fakultät für Naturwissenschaften
of the Otto-von-Guericke-Universität Magdeburg
by Dipl.-Psych. Jeanette Schadow
born 21 April 1979 in Schönebeck
Reviewers: Prof. Dr. Christoph S. Herrmann
Prof. Dr. Stefan Debener
Submitted: 22 June 2009
Defended: 29 October 2009

Contents
1 Introduction
   1.1 Oscillatory activity in the human brain
   1.2 Bottom-up modulation in the auditory and visual system
   1.3 Top-down modulation in the auditory and visual system
2 Hypotheses and objectives
   2.1 Bottom-up modulation in the auditory and visual system
   2.2 Top-down modulation in the auditory system
3 General Method
   3.1 Electroencephalogram
   3.2 Event-related potentials
   3.3 Oscillatory brain activity
      3.3.1 Classification of brain oscillations
      3.3.2 Time-frequency analysis
4 Experiment I: Visual contrast modulates evoked gamma-band activity in human EEG
   4.1 Introduction
   4.2 Method
      4.2.1 Participants
      4.2.2 Stimuli and Task
      4.2.3 Data acquisition
      4.2.4 Data analysis
   4.3 Results
      4.3.1 Behavioral data
      4.3.2 Visual evoked potentials
      4.3.3 Early and late gamma-band responses
   4.4 Discussion
      4.4.1 Behavioral data
      4.4.2 Early gamma-band responses
      4.4.3 Visual evoked potentials
      4.4.4 Conclusion
5 Experiment II: Sound intensity modulates auditory evoked gamma-band activity in human EEG
   5.1 Introduction
   5.2 Method
      5.2.1 Participants
      5.2.2 Stimuli and Task
      5.2.3 Data acquisition
      5.2.4 Data analysis
   5.3 Results
      5.3.1 Behavioral data
      5.3.2 Auditory evoked potentials
      5.3.3 Early gamma-band responses
      5.3.4 Comparison of AEPs and evoked GBRs
   5.4 Discussion
      5.4.1 Auditory evoked potentials
      5.4.2 Evoked gamma-band responses
      5.4.3 Comparison of AEPs and evoked GBRs
      5.4.4 Conclusion
6 Experiment III: Early gamma-band responses reflect anticipatory top-down modulation in the auditory cortex
   6.1 Introduction
   6.2 Materials and methods
      6.2.1 Participants
      6.2.2 Stimuli and Task
      6.2.3 Data acquisition
      6.2.4 Data analysis
   6.3 Results
      6.3.1 Behavioral data
      6.3.2 Event-related potentials
      6.3.3 Evoked gamma-band responses
   6.4 Discussion
      6.4.1 Event-related potentials
      6.4.2 Evoked gamma-band responses
      6.4.3 Conclusion
7 General Discussion
   7.1 Summary and discussion of the main results
   7.2 Match-and-utilization model in the auditory modality
   7.3 The visual and auditory modality in comparison
      7.3.1 Low-level processing
      7.3.2 High-level processing
A Curriculum vitae
B Danksagung (Acknowledgements)
C Selbstständigkeitserklärung (Declaration of Authorship)
List of Figures
1.1 Where is the face?
1.2 Kanizsa and non-Kanizsa triangle stimulus.
1.3 Kanizsa stimuli vs. equivalent sound stimuli.
1.4 Bottom-up and top-down processing.
1.5 Natural scenes and visual contrast.
3.1 Analysis of evoked and induced GBR.
4.1 Grating stimuli at three contrast levels.
4.2 Visual evoked potentials.
4.3 Scalp topographies of the P100 and N200.
4.4 Evoked gamma-band responses.
4.5 Time courses for the evoked GBR, phase-locking, and total GBR.
4.6 Scalp topographies of the evoked GBR.
5.1 Auditory evoked potentials and scalp topographies.
5.2 Evoked gamma-band responses.
5.3 Time courses for the evoked GBR, phase-locking, and total GBR.
5.4 Auditory evoked potentials and evoked GBR in comparison.
5.5 First-spike latency of auditory neurons.
6.1 Schematic illustration of the paradigm.
6.2 Event-related potentials.
6.3 Time-frequency representations of the evoked GBR.
6.4 Scalp topographies of the evoked GBR.
6.5 Source modeling of the evoked GBR.
6.6 Time courses for the evoked GBR, phase-locking, and total GBR.
7.1 Match-and-utilization model.
7.2 Grouping mechanisms.
1 Introduction
It is late in the afternoon. Traffic in the city is chaotic, and at every
junction drivers wait anxiously for a green light. Probably everyone in
the city has left work at the same time. Tom and his wife are on their
way home. He drives the car while talking to his wife about his day and
tapping his fingers to the rhythm of the music on the radio.
It is fascinating how we are able to process all of these simultaneously occurring
sensations within just a few milliseconds. Most of the time, we give little thought
to how we perceive the world around us. Yet the situation described above requires
multiple perceptual and cognitive systems to operate in parallel: in this example,
the visual, auditory, motor, and speech systems. To understand the complex interplay
of several perceptual systems, the first step is to study how each system works on
its own at different processing stages. In this thesis, I focus on the visual and
auditory systems and investigate three issues. First, I consider low-level processing
in the auditory and visual modalities. Second, I study how cognitive functions
influence the processing of auditory information. Third, I compare which processing
mechanisms are shared between the visual and auditory systems and which differ.
For an introduction to perceptual processing, let us consider the visual modality
more closely. This system is required to build a representation of the world
surrounding us and accomplishes a variety of complex tasks, including the
identification and categorization of visual objects, assessing distances to and
between objects, and guiding body movements towards visual objects. Two examples
of object perception in different contexts are displayed in Figure 1.1; your task
is to identify the face in both pictures. To identify the 'object' (the face), a
multiplicity of object