Context Facilitates More Effective Search Strategies

Study shows that distractor objects can help the visual system develop more effective search strategies

People are continuously exposed to an overwhelming stream of events flooding the sensory organs. Although the brain has impressive processing capabilities, its capacity is strongly limited. An observer therefore cannot consciously experience all the events and information available at any one time, but must focus on a limited subset of the whole.

For many decades, researchers have investigated the neuro-cognitive mechanisms of this selective attention using visual search tasks and have shown that contextual cuing plays a role: if the searched-for target object is situated within a certain spatial arrangement of other objects, it is located more quickly. That is to say, people react faster to a target object when it is embedded in a stable arrangement of distractor objects. According to the prevailing theory, this is because people store the relative position of the target stimulus in relation to the distractor stimuli in their long-term memory.

Using a novel eye-tracking scanpath analysis, a team led by Professor Thomas Geyer from the Department of Psychology at LMU has now shown that contextual cuing is driven less by display-specific memories than previously assumed. Rather, repeated distractor arrangements aid the visual system in developing new capabilities and learning more effective scanning strategies. The authors thus see the contextual facilitation of visual search as a byproduct of this learning. “This has strong implications for how we can understand visual search and adaptive behavior in general,” says Geyer. “Our results suggest that visual scanning strategies are learnable and act as a ‘gatekeeper’ between a plethora of sensory information and attentional selection.” This also means that eye movements are not merely the consequence of a focused shift of attention; rather, they proactively support the orienting of attention toward objects that are most likely to contain relevant information for further processing, before the actual attentional selection takes place.

- This press release was originally published on the Ludwig-Maximilians-Universität München website
