
Evolutionary Psychology
www.epjournal.net – 2013. 11(5): 1011–1026

Original Article

Evolutionary Relevance Facilitates Visual Information Processing
Russell E. Jackson, Psychology and Communication Studies Department, University of Idaho, Moscow, ID, USA. Email: rjackson@uidaho.edu (corresponding author).
Dustin P. Calvillo, Psychology Department, California State University San Marcos, San Marcos, CA, USA.
Abstract: Visual search of the environment is a fundamental human behavior that perceptual load affects powerfully. Previously investigated means for overcoming the inhibitions of high perceptual load, however, generalize poorly to real-world human behavior. We hypothesized that humans would process evolutionarily relevant stimuli more efficiently than evolutionarily novel stimuli, and that evolutionary relevance would mitigate the repercussions of high perceptual load during visual search. Animacy is a significant component of the evolutionary relevance of visual stimuli because perceiving animate entities is time-sensitive in ways that pose significant evolutionary consequences. Participants completing a visual search task located evolutionarily relevant and animate objects fastest and with the least impact of high perceptual load. Evolutionarily novel and inanimate objects were located slowest and with the greatest impact of perceptual load. Evolutionary relevance may importantly affect everyday visual information processing.
Keywords: categorization, visual search, perceptual load, animacy, evolution
Introduction
Humans rely heavily on perceptual information in order to navigate and manipulate their environment. Constant engagement in search behavior manifests this heavy reliance on perceptual information (McSorley and Findlay, 2001). The efficient allocation of attention to the intended item while ignoring irrelevant stimuli is the ultimate goal of search behavior (Eriksen, 1955; Wolfe, 2000). Perceptual load importantly affects the efficiency with which humans allocate attention to the intended object. High perceptual load increases response time (Eriksen, 1955), narrows attention (Koivisto and Revonsuo, 2009; Lavie, 1995; Palmer, Ames, and Lindsey, 1993), and increases error rates (Goldstein and Allen, 1971). Previously hypothesized means for overcoming the detrimental effects of high perceptual load include using a distinctive color, shape, or orientation (Treisman and
Gelade, 1980; Treisman and Gormican, 1988; also see Wolfe, 2002). Though these findings highlight important components of basic cognitive processing, they may generalize poorly due to limited ecological validity in real-world search behavior.

We suggest that evolutionary relevance (i.e., the extent to which humans encountered a stimulus over evolutionary history) may provide a solution to the drastic inhibition that high perceptual load imposes on visual search. Prior research demonstrates that evolutionary relevance produces sizable effects in areas such as attention (New, Cosmides, and Tooby, 2007), memory (Nairne and Pandeirada, 2008), and navigation (Jackson and Cormack, 2007). We predict here that human observers should process evolutionarily relevant stimuli more proficiently than evolutionarily novel stimuli, and that evolutionary relevance should mitigate the repercussions of high perceptual load during visual search.

One important dimension of evolutionary relevance is animacy. Animate objects' self-determined motion makes their initial perception highly time-sensitive (New et al., 2007). Failure to notice animate objects over evolutionary time likely posed high consequences for genetic fitness and thus represents significant selection pressure shaping human cognition. Indeed, brain regions that process animate objects appear differently activated and largely independent from areas processing inanimate objects (Wiggett, Pritchard, and Downing, 2009). Infants likely differentiate between animate and inanimate objects (see Rakison and Poulin-Dubois, 2001), and observers look longer at animate objects than inanimate objects when asked to name, or simply to look at, objects (Ković, Plunkett, and Westermann, 2009). Consequently, we predicted that animate objects should be processed more proficiently than inanimate stimuli and should help to mitigate the detrimental effects of high perceptual load.

Observers visually searched for one picture fitting a given category among many (high load) or few (low load) other pictures across several trials. We predicted that observers would process evolutionarily relevant and animate objects fastest and with the least effect of high perceptual load, followed by evolutionarily relevant and inanimate objects, followed by evolutionarily novel and inanimate objects (see top of Figure 1).
Materials and Methods
Eightythree participants completed the 72 visual search trials on a desktop computer in our laboratory. We obtained informed consent from all participants prior to participation. As a general overview of the procedure, participants saw a written category, such as “tool,” and then had to search for the single image from that category out of a field of images while we timed their responses. Images appeared in black and white on a white background presented by EPrime 1.2 on a 19 inch (1280 x 1024 pixel) LCD screen at 75 Hz. Participants sat at a selfselected comfortable distance, which typically placed their eyes roughly 50 cm from the screen. We standardized all images to approximately 4.5 cm in diameter in the longest dimension, and the entire circular arrangement of the images was approximately 20 cm in diameter. See Appendix for all screens containing all images used in the study. For each trial, participants saw a written category, followed by a centered fixation cross, followed by images appearing around the cross. We allowed participants as much time as they needed in order to view each category before they pressed a key in order to
Evolutionary Psychology – ISSN 14747049 – Volume 11(5). 2013. 1012
Evolutionary relevance
continue to the fixation cross, and we timed this duration. Participants then searched a field of images for the one that belonged to the category given at the beginning of the trial. This search phase featured either high (eight images) or low (four images) load, distributed equally across categories. We timed the search duration until participants pressed a key in order to acknowledge that they had found the target image. Participants then identified the location of the target. Each of the six categories fell under one of the three conditions. “Human” and “animal” categories fell under the evolutionarily relevant and animate condition, “body part” and “fruit” under the evolutionarily relevant and inanimate condition, and “tool” and “transportation” under evolutionarily novel and inanimate condition. Participants received 12 trials per category (24 per condition), with half featuring low load and half featuring high load (distributed equally across categories), for a total of 72 trials. We used partially randomized orders across participants while insuring that consecutive trials did not feature the same condition. Images from five categories normed by Van Overschelde, Rawson, and Dunlosky, (2003) (animal, body part, fruit, tool and transportation) and their 12 respective exemplars insured category and exemplar congruency. Target and incongruent distracter stimuli from various sources ensured picture quality (Bates et al., 2003; Bonin, Peereman, Malardier, Méot, and Chalard, 2003; Cycowicz, Friedman, Rothstein, and Snodgrass, 1997; “Free High Quality Clip Art”, 2010; Florida Center for Instructional Technology, 2010; Nishimoto, Miyawaki, Ueda, Une, and Takahashi, 2005). Alternate procedures for this study design include discrepantitem methods, where participants identify the sole image that differs from a field of other images that are all of the same category (e.g., Blanchette, 2006). Such a procedure is problematic because participants can use overall differences in a picture that are unrelated to the item itself, such as the features of the background or how particular items reflect light. We explicitly wanted participants to search for items of specific categories, as would occur in a normal visual search, rather than identify differences between pictures, which could be ecologically novel and falls outside of the current research question. We also used randomized heterogeneous distracter stimuli because such a method increases search difficulty and decreases the likelihood of finding unlimitedcapacity parallel search, even if the target has a unique feature (Duncan and Humphreys, 1989; Öhman, Juth, and Lundqvist, 2010).
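The article does not include the scripts used to build these trial orders; the following Python sketch is only an illustration, under the assumptions stated in its comments, of how a partially randomized order satisfying the constraint above (72 trials, 12 per category, half at each load, no two consecutive trials from the same condition) could be constructed. The function and variable names are hypothetical, not the authors' code.

```python
import random
from collections import Counter

# Hypothetical reconstruction of the trial structure described above:
# 6 categories x 12 trials = 72 trials, half low load (4 images) and half
# high load (8 images) within each category, ordered so that no two
# consecutive trials share a condition.
CONDITION = {
    "human": "relevant/animate",       "animal": "relevant/animate",
    "body part": "relevant/inanimate", "fruit": "relevant/inanimate",
    "tool": "novel/inanimate",         "transportation": "novel/inanimate",
}

def build_trial_order():
    # Pool of (category, load) trials: 12 per category, 6 low and 6 high load.
    pool = [(cat, load) for cat in CONDITION
            for load in ["low"] * 6 + ["high"] * 6]
    random.shuffle(pool)
    order, prev = [], None
    while pool:
        # Among trials whose condition differs from the previous trial, take
        # one from the condition with the most trials remaining (the standard
        # greedy for building "no two adjacent alike" sequences).
        remaining = Counter(CONDITION[cat] for cat, _ in pool)
        allowed = [t for t in pool if CONDITION[t[0]] != prev]
        pick = max(allowed, key=lambda t: remaining[CONDITION[t[0]]])
        pool.remove(pick)
        order.append(pick)
        prev = CONDITION[pick[0]]
    return order

trials = build_trial_order()
assert len(trials) == 72
assert all(CONDITION[trials[i][0]] != CONDITION[trials[i + 1][0]]
           for i in range(len(trials) - 1))
```

Because the three conditions contribute equal numbers of trials, the greedy step always has a valid condition to draw from, so the constraint can be satisfied without rejection sampling.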
Results
Participants responded fastest to, and suffered the least perceptual load decrement with, evolutionarily relevant and animate stimuli, followed by evolutionarily relevant and inanimate stimuli, followed by evolutionarily novel and inanimate stimuli (see bottom of Figure 1). A repeated-measures ANOVA suggested a significant main effect of load (F(1, 82) = 109.10, p < .001, η² = .12), a significant main effect of condition (F(2, 164) = 142.57, p < .001, η² = .41), and a significant interaction between load and condition (F(2, 164) = 8.26, p < .001, η² = .01).

Mean reaction times significantly differed between evolutionarily relevant/animate stimuli (M ± 95% CI = 1277 ± 49 ms [213 ms/item in display]) and evolutionarily relevant/inanimate stimuli (1696 ± 47 ms [282 ms/item]), t_paired(82) = 10.58, p < .001; r(81) = .82, p < .001. Mean reaction times significantly differed between evolutionarily relevant/inanimate stimuli and evolutionarily novel/inanimate stimuli (2008 ± 54 ms [335 ms/item]), t_paired(82) = 7.03, p < .001; r(81) = .81, p < .001. Mean reaction times also significantly differed between low load (1498 ± 31 ms [375 ms/item]) and high load (1824 ± 31 ms [228 ms/item]), t_paired(82) = 10.45, p < .001; r(81) = .89, p < .001. Where possible, we report confidence intervals that reflect the within-subject nature of these data (Cousineau, 2005; see also Loftus and Masson, 1994; Masson and Loftus, 2003).

Figure 1. Mean predicted (top) and observed (bottom) visual search reaction times across condition and load
Note: Error bars represent 95% confidence intervals
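The confidence intervals reported above follow the within-subject normalization of Cousineau (2005). The analysis scripts are not included with the article; the sketch below is a minimal illustration of that normalization, assuming a participants × conditions matrix of per-participant mean reaction times (the function name, data layout, and the commented-out example numbers are assumptions, not the authors' data or code).

```python
import numpy as np
from scipy import stats

def cousineau_ci(rt, confidence=0.95):
    """Within-subject confidence intervals per Cousineau (2005).

    rt: participants x conditions array of each participant's mean RT (ms).
    Each score is re-centered by removing the participant's own mean and
    adding back the grand mean, stripping between-subject variability
    before the usual per-condition confidence interval is computed.
    """
    rt = np.asarray(rt, dtype=float)
    n_participants = rt.shape[0]
    normalized = rt - rt.mean(axis=1, keepdims=True) + rt.mean()
    sem = normalized.std(axis=0, ddof=1) / np.sqrt(n_participants)
    half_width = stats.t.ppf(1 - (1 - confidence) / 2, n_participants - 1) * sem
    return rt.mean(axis=0), half_width

# Illustrative call with simulated numbers (83 participants x 3 conditions),
# loosely modeled on the condition means quoted in the text:
# rng = np.random.default_rng(0)
# means, half_widths = cousineau_ci(rng.normal([1277, 1696, 2008], 300, (83, 3)))
```

The bracketed per-item rates in the text are consistent with the mean reaction time divided by the mean number of items in the display (6 items when the 4-item and 8-item displays are averaged, e.g., 1277 ms / 6 ≈ 213 ms/item; 4 or 8 items for the load-specific means).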
Faster object identification based on evolutionary relevance and animacy did not result from participants spending more time viewing the prompt for those categories (see Figure 2). On the contrary, participants viewed most briefly those categories to which they responded most quickly. There was a significant difference in time spent viewing each category prompt across conditions, with the shortest time for the evolutionarily relevant/animate condition (1266 ± 28 ms), followed by the evolutionarily relevant/inanimate condition (1302 ± 29 ms), followed by the evolutionarily novel/inanimate condition (1343 ± 25 ms), F(2, 164) = 5.275, p = .006, η² = .06.

Figure 2. Mean time spent viewing each prompt by condition
Note: Error bars represent 95% confidence intervals

Parsing apart reaction times for each category independent of condition also generally reflected the predicted structure: human (1264 ± 61 ms [211 ms/item]), animal (1290 ± 56 ms [215 ms/item]), body part (1737 ± 70 ms [290 ms/item]), fruit (1655 ± 58 ms [276 ms/item]), tool (2116 ± 95 ms [353 ms/item]), and transportation (1901 ± 62 ms [317 ms/item]), F(5, 492) = 23.584, p < .001. Figure 3 displays comparisons across categories.

Errors in identifying the correct target location demonstrated a potential floor effect, and so we suggest caution in drawing conclusions from the pattern of errors, even though they support the experimental predictions (see Figure 4). Errors totaled 119 across the 5,976 trials (83 participants completing 72 trials apiece), giving an average of 1.43 errors per participant. Participants made 19, 45, and 55 total errors respectively across the evolutionarily relevant/animate, evolutionarily relevant/inanimate, and evolutionarily novel/inanimate conditions, a significant difference, F(2, 164) = 8.35, p < .001, η² = .04. Participants made 47 and 72 errors respectively across low and high load, a significant difference, F(1, 82) = 5.26, p = .024, η² = .01. Error frequency failed to significantly interact across condition and load, F(2, 164) = 1.047, p = .353, η² = .004.
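Figure 4 reports binomial confidence intervals on these error frequencies. The article does not state which binomial interval was used; as one common choice, the sketch below computes Wilson score intervals for the per-condition error proportions, using the error counts quoted above and assuming 1,992 trials per condition (83 participants × 24 trials).

```python
import math

def wilson_ci(errors, trials, z=1.96):
    """Wilson score interval for a binomial proportion (one common choice of
    binomial confidence interval; the article does not specify its method)."""
    p = errors / trials
    denom = 1 + z ** 2 / trials
    center = (p + z ** 2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z ** 2 / (4 * trials ** 2))
    return center - half, center + half

# Error counts from the text, out of an assumed 1,992 trials per condition
# (83 participants x 24 trials per condition).
for label, errors in [("relevant/animate", 19),
                      ("relevant/inanimate", 45),
                      ("novel/inanimate", 55)]:
    low, high = wilson_ci(errors, 1992)
    print(f"{label}: {errors}/1992 errors, 95% CI [{low:.4f}, {high:.4f}]")
```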
Figure 3. Mean visual search reaction times by category, rather than condition
Notes: Error bars represent 95% confidence intervals. Numbers between bars indicate the t_paired (p) statistic for the pairwise comparison between the two adjoining categories.

Figure 4. Frequency of category identification errors by prompted condition (top) and load (bottom)
Note: Error bars represent 95% confidence intervals (binomial)
Discussion
Perceptual load affected the visual search for evolutionarily relevant and animate objects relatively little, but sizably impeded the search for evolutionarily novel and inanimate objects. As predicted, participants processed evolutionarily relevant, animate stimuli fastest and with the least decrement from high perceptual load. The next most proficiently processed were the evolutionarily relevant and inanimate stimuli, followed by evolutionarily novel and inanimate stimuli. Evolutionary relevance appeared to be a powerful determinant of efficient visual information processing and provided resilience against the negative effects of high perceptual load.

To put this into perspective, humans identified evolutionarily relevant and animate stimuli far faster than evolutionarily novel and inanimate stimuli, even when identifying the evolutionarily relevant and animate stimuli required searching through twice as many items. This effect occurred even though the evolutionarily novel stimuli included objects that kill humans frequently in modern contexts, such as cars, and the evolutionarily relevant stimuli did not contain lethal objects.

The observed effect was not the result of longer or delayed processing earlier in the information stream: time spent viewing the prompt for each condition mirrored the effect found in the main analyses. Participants spent the least time viewing the prompt for the evolutionarily relevant/animate stimuli and the most time viewing the prompt for the evolutionarily novel/inanimate stimuli, with the evolutionarily relevant/inanimate stimuli falling in between. Participants not only identified evolutionarily relevant information fastest, but understood the task for those categories fastest as well. Analysis of the scarce errors replicates these findings, as do analyses that exclude trials with errors. These high accuracy rates may have produced longer reaction times, which were still brief for what was a more complex task than a traditional visual search that does not require realistic categorization or contain distractors that match the target in color, size, complexity, etc.

Evidence of semantic processing was apparent in the current study in that participants exhibited similar response times across both the written category prompt and the visual images within the trial. We were explicitly interested in this processing because we wanted to determine whether evolutionarily relevant categorization would function more efficiently in visual behavior than the evolutionarily novel categories often used in research. Ultimately, the processing of basic, purely visual parameters such as color and motion is unlikely to be highly specialized for evolved contexts because such parameters are largely invariant over human evolutionary history. However, the types of objects present in our environments are variant over evolutionary history. These data suggest that humans both understand written categories and process visual images in ways that appear specialized to the environments in which we evolved.

It is interesting to note that no method yet can perfectly identify when participants actually visually categorize a target. Alternate procedures include the use of target-absent catch trials (which produce errors of indeterminate origin), the method of limits (which produces high error rates and participant fatigue), or electroencephalographic detection of P300 waves (which may reflect visual familiarity rather than categorical identification).
The current method measured the time of participant response, first when participants acknowledged that they had identified the target, and then when they identified its location. This method could present a problem in that participants might have more rapidly acknowledged that they had identified certain targets, regardless of the actual time of identification. However, we know of no motivation that any participant would have, and certainly no motivation for the vast majority of participants, to acknowledge identification differently across targets. Further, participant willingness to confirm target identification does not explain the particular pattern of observed data, wherein participants identified evolutionarily relevant and animate stimuli from multiple conditions faster than similar inanimate stimuli, which participants in turn identified faster than inanimate, evolutionarily novel stimuli.

The current data appear to reflect a pattern of responses that indicates a preference for animate objects that are relevant to the environments in which we evolved. These data coincide with evolutionary predictions, even when parsing the conditions apart by category. Participants processed the most evolutionarily relevant and animate category, humans, faster than the other categories, which were processed in descending order in a way that generally reflected evolutionary relevance and animacy. Further, the differences in category identification did not appear to be the result of the number or diagnosticity (i.e., the distinctiveness of the traits used to identify a particular category) of traits. To cite one example, "transportation" has a smaller number of more diagnostic traits (i.e., wheels or wings) than "fruit," and yet the results, as predicted by evolutionary relevance, support the opposite effect. This suggests that participants may have processed some objectively complex categories faster than relatively simpler categories, and in a way predicted by evolutionary relevance.

We selected two redundant categories for each condition in order to integrate replicability within the study. These categories indeed replicated the predicted effects within all conditions. It was challenging to identify categories intermediate between evolutionarily novel/inanimate and evolutionarily relevant/animate. In particular, body parts in this intermediate condition posed a potential issue in that visual search for a "body part" is likely a subprocess that falls under searching for "human" in ecologically valid contexts. Although "body part" may not be clearly inanimate, it is also not clearly animate, hence our placement of it within the intermediate condition. It was encouraging that the other intermediate category, "fruit," generally replicated the effects seen with "body part."

The transportation category provided an especially clear example of the importance of ancestral environments in visual processing. The present research suggests a perceptual system better attuned to localizing categories present in our ancestral ecology (e.g., animals) than in our modern one (e.g., transportation), even when modern objects pose severe costs and even when we have high familiarity with such objects. Like New et al. (2007), we utilized the transportation category because failure to perceive vehicle motion leads to detrimental outcomes such as injury and death in modern society. The drastic impact that perceptual load had on this category may suggest inherent ancestral biases that an object's importance in modern environments does little to change.

Visual search behavior is ubiquitous, and perceptual load usually poses a severe decrement to visual search performance.
These data suggest an important shift in the understanding of visual performance: that even content-laden search performance may reflect specialization to the environments in which our visual systems evolved.
Acknowledgements: The authors thank Sandra Alvarado for her contributions throughout this work. We also thank Cate Woehlk, Mariana Shotliff, Anna Hood, and Erica Viola for data collection, and Laurence Fiddick and an anonymous reviewer for excellent suggestions on this manuscript.
Received 12 November 2012; Revision submitted 15 April 2013; Accepted 15 April 2013
References
Bates, E., D'Amico, S., Jacobsen, T., Székely, A., Andonova, E., Devescovi, A., and Tzeng, O. (2003). Timed picture naming in seven languages. Psychonomic Bulletin and Review, 10, 344–380.

Blanchette, I. (2006). Snakes, spiders, guns, and syringes: How specific are evolutionary constraints on the detection of threatening stimuli? The Quarterly Journal of Experimental Psychology, 59, 1484–1504.

Bonin, P., Peereman, R., Malardier, N., Méot, A., and Chalard, M. (2003). A new set of 299 pictures for psycholinguistic studies: French norms for name agreement, image agreement, conceptual familiarity, visual complexity, image variability, age of acquisition, and naming latencies. Behavior Research Methods, Instruments and Computers, 35, 158–167.

Cousineau, D. (2005). Confidence intervals in within-subject designs: A simpler solution to Loftus and Masson's method. Tutorials in Quantitative Methods for Psychology, 1, 42–45.

Cycowicz, Y. M., Friedman, D., Rothstein, M., and Snodgrass, J. G. (1997). Picture naming by young children: Norms for name agreement, familiarity, and visual complexity. Journal of Experimental Child Psychology, 65, 171–237.

Duncan, J., and Humphreys, G. W. (1989). Visual search and stimulus similarity. Psychological Review, 96, 433–458.

Eriksen, C. W. (1955). Partitioning and saturation of visual displays and efficiency of visual search. Journal of Applied Psychology, 39, 73–77.

Florida Center for Instructional Technology. (2010). Clipart ETC. Retrieved from http://etc.usf.edu/clipart/index.htm

Free High Quality Clip Art. (n.d.). Retrieved June, 2010 from http://www.freeclipartnow.com/

Goldstein, I. L., and Allen, J. C. (1971). Effects of irrelevant stimuli on the processing of information in complex displays. Journal of Applied Psychology, 55, 110–113.

Jackson, R. E., and Cormack, L. K. (2007). Evolved navigation theory and the descent illusion. Perception and Psychophysics, 69, 353–362.

Koivisto, M., and Revonsuo, A. (2009). The effects of perceptual load on semantic processing under inattention. Psychonomic Bulletin and Review, 16, 864–868.

Ković, V., Plunkett, K., and Westermann, G. (2009). Eye-tracking study of inanimate objects. Psihologija, 42, 417–436.

Lavie, N. (1995). Perceptual load as a necessary condition for selective attention. Journal of Experimental Psychology: Human Perception and Performance, 21, 451–468.
Loftus, G. R., and Masson, M. J. (1994). Using confidence intervals in within-subject designs. Psychonomic Bulletin and Review, 1, 476–490.

Masson, M. J., and Loftus, G. R. (2003). Using confidence intervals for graphically based data interpretation. Canadian Journal of Experimental Psychology, 57, 203–220.

McSorley, E., and Findlay, J. M. (2001). Visual search in depth. Vision Research, 41, 3487–3496.

Nairne, J. S., and Pandeirada, J. N. S. (2008). Adaptive memory: Is survival processing special? Journal of Memory and Language, 59, 377–385.

New, J., Cosmides, L., and Tooby, J. (2007). Category-specific attention for animals reflects ancestral priorities, not expertise. Proceedings of the National Academy of Sciences of the United States of America, 104, 16598–16603.

Nishimoto, T., Miyawaki, K., Ueda, T., Une, Y., and Takahashi, M. (2005). Japanese normative set of 359 pictures. Behavior Research Methods, 37, 398–416.

Öhman, A., Juth, P., and Lundqvist, D. (2010). Finding the face in a crowd: Relationships between distractor redundancy, target emotion, and target gender. Cognition and Emotion, 24, 1216–1228.

Palmer, J., Ames, C. T., and Lindsey, D. T. (1993). Measuring the effect of attention on simple visual displays. Journal of Experimental Psychology: Human Perception and Performance, 19, 108–130.

Rakison, D. H., and Poulin-Dubois, D. (2001). Developmental origin of the animate–inanimate distinction. Psychological Bulletin, 127, 209–228.

Treisman, A., and Gelade, G. (1980). A feature-integration theory of attention. Cognitive Psychology, 12, 97–136.

Treisman, A., and Gormican, S. (1988). Feature analysis in early vision: Evidence from search asymmetries. Psychological Review, 95, 15–48.

Van Overschelde, J. P., Rawson, K. A., and Dunlosky, J. (2003). Category norms: An updated and expanded version of the Battig and Montague (1969) norms. Journal of Memory and Language, 50, 289–335.

Wiggett, A. J., Pritchard, I. C., and Downing, P. E. (2009). Animate and inanimate objects in human visual cortex: Evidence for task-independent category effects. Neuropsychologia, 47, 3111–3117.

Wolfe, J. M. (2000). Visual attention. In K. K. De Valois (Ed.), Seeing (pp. 335–386). San Diego, CA: Academic Press.

Wolfe, J. M. (2002). Visual search. In H. Pashler (Ed.), Attention (pp. 13–72). London, UK: University College London Press.
Appendix
Search phase screens from all seventy-two trials, partitioned by category, with low-load images appearing first within each category. Images do not appear in the partially randomized order experienced by participants.