Target search is important in a number of fields, from industrial engineering and radiology to product search and navigation. In the past, the processes involved could only be inferred from a person's final choice, but advances in eye-movement tracking have opened the way to following closely just how people look for targets.
Ralf van der Lans and his co-authors Rik Pieters and Michel Wedel seized on these developments to create a model that gives sharper insight into the ways that people search for targets, with the implication that it could be deployed to improve target-seeking abilities or the design of marketing stimuli, such as packaging, product shelves and advertising, so that attention is directed to particular targets.
Their model describes the two most important measures of eye movements: fixations and saccades. Fixations occur when the eye is stable and information is extracted from the scene, while saccades are rapid, ballistic eye movements that suppress vision as the line of sight is redirected to a new fixation point. The authors contend that these two forms of eye movement are driven by two separate pathways in the brain, related to two attention states and hence to target search.
One state is localization - the "where" of the target search - in which the brain builds a saliency map that guides the focus of attention to quickly select regions with possible targets, sometimes doing this systematically. What is interesting here is that the pathways for localization are found in the same region of the brain involved in the motor control of eye movement.
The second state is identification - the "what" of the search - in which the brain analyzes information to determine whether there is a target match.
The authors' model draws these strands together, proposing that eye movements can indicate when a person is fixating on an object and extracting information from it, and when they are scanning the scene for a possible target.
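To make the idea of inferring a hidden attention state from eye movements more concrete, the sketch below decodes a scanpath with a simple two-state hidden Markov model, one common way to formalize such state switching. The features (fixation duration and preceding saccade amplitude), the Gaussian emission distributions and every parameter value are illustrative assumptions, not the authors' actual specification.

```python
import numpy as np

# Illustrative two-state decoding of a fixation sequence into "localization"
# vs. "identification" attention states.  All features, distributions and
# parameters below are assumptions for illustration only.

STATES = ["localization", "identification"]

# Assumed emission model: each fixation is summarised by
#   (fixation duration in ms, amplitude of the preceding saccade in degrees).
# Localization fixations are assumed shorter with larger saccades;
# identification fixations longer with smaller saccades.
means = np.array([[180.0, 6.0],    # localization
                  [320.0, 2.0]])   # identification
stds = np.array([[60.0, 2.5],
                 [90.0, 1.5]])

# Assumed transition matrix: both states are fairly persistent.
trans = np.array([[0.8, 0.2],
                  [0.3, 0.7]])
start = np.array([0.9, 0.1])       # search is assumed to begin by localizing

def log_emission(obs):
    """Log-likelihood of one (duration, amplitude) pair under each state."""
    z = (obs - means) / stds
    return -0.5 * np.sum(z**2 + np.log(2 * np.pi * stds**2), axis=1)

def decode(fixations):
    """Viterbi decoding: most likely attention-state sequence for a scanpath."""
    n, k = len(fixations), len(STATES)
    logv = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    logv[0] = np.log(start) + log_emission(fixations[0])
    for t in range(1, n):
        scores = logv[t - 1][:, None] + np.log(trans)   # scores[i, j]: from i to j
        back[t] = np.argmax(scores, axis=0)
        logv[t] = scores[back[t], np.arange(k)] + log_emission(fixations[t])
    path = [int(np.argmax(logv[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return [STATES[s] for s in reversed(path)]

# Example scanpath: a burst of short, far-flung fixations followed by longer,
# closely spaced ones, as when a shopper homes in on a package.
scanpath = np.array([[150, 7.5], [170, 5.0], [200, 6.5],
                     [340, 1.5], [310, 2.0], [380, 1.0]])
print(decode(scanpath))
# Expected: three 'localization' fixations followed by three 'identification'
```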
The model was tested on data from 106 consumers who were asked to search for a specific brand of coffee among 12 existing brands on a computer-simulated supermarket shelf. Eye movements were tracked in relation to three features: color, luminance and edges. Unknown to the participants, blue was the color of the target brand and therefore highly diagnostic, while red featured in most of the brands and gold was not present in the target brand. Luminance referred to the brightness of a location, and edges to the borders around the shelf, brand groups and text.
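The saliency maps in the study were estimated from the eye-movement data themselves, but the general idea of combining color, luminance and edge channels into a single map can be illustrated with a short sketch. The weighting scheme, the target-color similarity measure and the toy "shelf" image below are assumptions for illustration, not the authors' estimated maps.

```python
import numpy as np

# Generic sketch of a pixel-level saliency map built from the three feature
# channels tracked in the study (color, luminance, edges).  Weights and the
# way the channels are combined are illustrative assumptions.

def saliency_map(rgb, target_rgb=(0.0, 0.0, 1.0), weights=(0.5, 0.25, 0.25)):
    """rgb: H x W x 3 array in [0, 1]; target_rgb: the diagnostic target
    color (blue for the target coffee brand in the experiment)."""
    rgb = rgb.astype(float)

    # Color channel: similarity of each pixel to the diagnostic target color.
    color_dist = np.linalg.norm(rgb - np.asarray(target_rgb), axis=2)
    color_sal = 1.0 - color_dist / color_dist.max()

    # Luminance channel: brightness of each location.
    lum = rgb @ np.array([0.299, 0.587, 0.114])
    lum_sal = lum / lum.max()

    # Edge channel: gradient magnitude of luminance picks out borders
    # around the shelf, brand groups and text.
    gy, gx = np.gradient(lum)
    edges = np.hypot(gx, gy)
    edge_sal = edges / (edges.max() + 1e-9)

    w_color, w_lum, w_edge = weights
    sal = w_color * color_sal + w_lum * lum_sal + w_edge * edge_sal
    return sal / sal.max()

# Toy example: a mostly red "shelf" with one blue patch (the target brand).
shelf = np.zeros((60, 80, 3))
shelf[..., 0] = 0.8                      # red background
shelf[20:30, 50:60] = (0.1, 0.1, 0.9)    # blue target package
sal = saliency_map(shelf)
print(np.unravel_index(sal.argmax(), sal.shape))
# Peak lies on or near the blue patch, where color and edge cues coincide.
```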
Eighty participants found the target within 10 seconds, 14 chose the wrong brand and 12 ran out of time. Those who focused on blue, shifted attention from brighter regions to darker ones, and refixated on text after a first glance identified the target faster and more accurately. This showed that eye tracking was a good indicator of attention states, the authors said.
"The high spatiotemporal resolution of eye-tracking data allows our model to identify the activity of the attention states over time, presumably reflecting activity of the 'where' and 'what' pathways in the visual brain," they said.
"Application of the proposed model allows assessment of the effectiveness of search strategies. It may be used to develop guidelines for training personnel, such as radiologists that have to search for faint nodules in chest radiographs, or airport security that need to search for concealed weapons in X-ray images of luggage. The model can also be used to support the design of robotic vision systems, and to optimize the design of search displays based on the estimated saliency maps. As a case in point, the model here was used to assess the saliency of brands, which influences consumers' in-store choices, and so the findings could be used for optimal package and shelf design."