To gain insight into how vision guides eye movements,
monkeys were trained to make a single saccade to a specified
target stimulus during feature and conjunction search with
stimuli discriminated by color and shape. Monkeys performed
both tasks at levels well above chance. The latencies of
saccades to the target in conjunction search exhibited
shallow positive slopes as a function of set size, comparable
to the slopes of human reaction times during target present/absent
judgments, but significantly different from the slopes
in feature search. Properties of the selection process
were revealed by the occasional saccades to distractors.
During feature search, errant saccades were directed more
often to a distractor near the target than to a distractor
at any other location. In contrast, during conjunction
search, saccades to distractors were guided more by similarity
than proximity to the target; monkeys were significantly
more likely to shift gaze to a distractor that had one
of the target features than to a distractor that had none.
Overall, color and shape information were used to similar
degrees in the search for the conjunction target. However,
in single sessions we observed an increased tendency for
saccades to be directed to the distractor that had been the
target in the previous experimental session. The establishment of this
tendency across sessions at least a day apart and its persistence
throughout a session distinguish this phenomenon from the
short-term (<10 trials) perceptual priming observed
in this and earlier studies using feature visual search.
Our findings support the hypothesis that the target in
at least some conjunction visual searches can be detected
efficiently based on visual similarity, most likely through
parallel processing of the individual features that define
the stimuli. These observations guide the interpretation
of neurophysiological data and constrain the development
of computational models.