Given the open-ended nature
of exploration and the specificity of case studies, it may be
beneficial to control parts of the analysis process and study
them in laboratory experiments. For example, the Scented
Widgets study used log data to measure how social navigation
cues affected information foraging, in terms of the number of
revisits, unique discoveries, and users' subjective
preferences [92]. In some cases, experimenters may use
a mixture of techniques to enrich the data collected in
laboratory experiments. One example is an early insight-based
evaluation [67]. The study used a think-aloud protocol, and
every 15 minutes participants were asked to estimate the
percentage of the potential insight about the dataset that
they would be able to obtain with the tool. In addition, Saraiya
et al. coded all individual occurrences of insights from
video recordings, with the characteristics of the insights
coded by domain experts. Findings were expressed in five
measures of insights: count, total domain value, average
final amount learned, average time to first insight, and
average total time spent before no more insight was felt to
be gained. Both the hand-coded and the participant-recorded
metrics helped to identify which of the five visualization
techniques was most effective in supporting insight discovery
and in influencing users' perception of the data.
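As a rough sketch of how such insight measures might be aggregated from hand-coded events, the snippet below assumes a hypothetical record for each coded insight, carrying a timestamp and an expert-assigned domain value. The record schema and function names are illustrative, not taken from Saraiya et al.'s study; the self-reported measure (average final amount learned) would come from participants' periodic estimates rather than from these coded events.

```python
from dataclasses import dataclass

@dataclass
class Insight:
    """One coded insight occurrence (hypothetical schema)."""
    time_min: float      # minutes into the session when the insight occurred
    domain_value: float  # expert-assigned domain value of the insight

def insight_measures(insights: list[Insight]) -> dict:
    """Aggregate one participant's coded insights into summary measures:
    count, total domain value, time to first insight, and total time
    spent until the last insight was gained."""
    if not insights:
        return {"count": 0, "total_domain_value": 0.0,
                "time_to_first_insight": None,
                "time_spent_until_last_insight": None}
    return {
        "count": len(insights),
        "total_domain_value": sum(i.domain_value for i in insights),
        "time_to_first_insight": min(i.time_min for i in insights),
        # Proxy for "time before no more insight was gained": the
        # timestamp of the last coded insight in the session.
        "time_spent_until_last_insight": max(i.time_min for i in insights),
    }

# Example: three coded insights from one participant's session.
session = [Insight(4.0, 2.0), Insight(11.5, 5.0), Insight(27.0, 1.0)]
print(insight_measures(session))
```

Averaging these per-participant measures across participants for each tool would then yield the per-technique comparisons the study reports.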