Automation Bias in Intelligent Time Critical Decision Support Systems

M.L. Cummings, MIT


Various levels of automation can be introduced by intelligent decision support systems,
from fully automated, where the operator is completely left out of the decision process, to
minimal levels of automation, where the automation only makes recommendations and the
operator has the final say. For rigid tasks that require no flexibility in decision-making and with a low probability of system failure, higher levels of automation often provide the best
solution. However, in time critical environments with many external and changing constraints such as air traffic control and military command and control operations, higher

levels of automation are not advisable because of the risks and the complexity of both the
system and the inability of the automated decision aid to be perfectly reliable. Human-in- the-loop designs, which employ automation for redundant, manual, and monotonous tasks
and allow operators active participation, provide not only safety benefits, but also allow a
human operator and a system to respond more flexibly to uncertain and unexpected events.
However, there can be measurable costs to human performance when automation is used,
such as loss of situational awareness, complacency, skill degradation, and automation bias.
This paper will discuss the influence of automation bias in intelligent decision support systems, particularly those in aviation domains. Automation bias, which occurs when humans disregard or fail to search for contradictory information once a computer-generated solution is accepted as correct, can be exacerbated in time critical domains. Automated decision aids are designed to reduce human error, but they can actually introduce new errors in the operation of a system if they are not designed with human cognitive limitations in mind.
