Strategy Use in Automation-Aided Decision Making
When human operators make signal detection judgments with assistance from an automated decision aid, they perform better than they would unaided but fail to reach optimal sensitivity. We investigated the decision strategies that produce this suboptimal performance. Participants (N = 130) performed a two-response classification task that required them to mentally estimate the mean of a set of randomly sampled values on each trial. The task was performed with and without assistance from a 93% reliable decision aid. Psychometric functions were fit to the classification data, and the data were fit with two cognitive models of automation use. The first model assumed that participants made automation-aided judgments using a contingent-criterion strategy, adjusting their cutoff for responding yes versus no after receiving a cue from the aid. The second model, a discrete-state model, assumed that participants made aided judgments by simply deferring to the aid on some proportion of trials. A measure of model fit favored the discrete-state model, with parameter estimates indicating large individual differences in deferral rate across participants (range = 2% to 95%).
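To make the two candidate strategies concrete, the sketch below simulates an aided yes/no classification task under each one: a contingent-criterion observer shifts the response cutoff toward the aid's cue, whereas a discrete-state observer copies the aid on a random subset of trials and otherwise responds unaided. This is a minimal illustration, not the paper's fitting procedure; the sensitivity, criterion shift, and deferral rate used here are assumed values chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(strategy, n_trials=10_000, d_prime=1.0, aid_reliability=0.93,
             criterion_shift=0.5, defer_rate=0.5):
    """Simulate aided yes/no decisions under two candidate strategies.

    'contingent_criterion' shifts the response cutoff in the direction of
    the aid's cue; 'discrete_state' defers to the aid on a random fraction
    of trials and otherwise ignores it. All parameter values here are
    illustrative assumptions, not estimates from the reported study.
    """
    signal = rng.integers(0, 2, n_trials)                # 1 = signal trial
    evidence = rng.normal(signal * d_prime, 1.0)         # observer's internal estimate
    aid_correct = rng.random(n_trials) < aid_reliability
    aid_cue = np.where(aid_correct, signal, 1 - signal)  # 93%-reliable cue

    base_criterion = d_prime / 2                         # unbiased unaided cutoff
    if strategy == "contingent_criterion":
        # Lower the cutoff after a "signal" cue, raise it after a "noise" cue.
        criterion = base_criterion - criterion_shift * (2 * aid_cue - 1)
        response = (evidence > criterion).astype(int)
    elif strategy == "discrete_state":
        # On a fraction of trials, copy the aid; otherwise respond unaided.
        defer = rng.random(n_trials) < defer_rate
        own_response = (evidence > base_criterion).astype(int)
        response = np.where(defer, aid_cue, own_response)
    else:
        raise ValueError(f"unknown strategy: {strategy}")

    return (response == signal).mean()  # proportion correct

for s in ("contingent_criterion", "discrete_state"):
    print(s, round(simulate(s), 3))
```

Under both strategies the simulated observer outperforms the unaided cutoff but falls short of what an ideal combination of the aid and the observer's own evidence would achieve, which is the pattern the models are meant to explain.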