Interactive Session: People
Are We Relying Too Much on Computers to Think for Us?
Does our ever-burgeoning dependence on computers foster complacency, suppressing our ability to marshal our mental faculties when required? Although computerization has undoubtedly reduced malfunctions, work stoppages, and breakdowns, are we concurrently losing our ability to assess alternatives independently and make optimal choices?
At least one technology writer is sure this is exactly what is happening. Nicholas Carr's book, The Glass Cage: Automation and Us, lays out the case that our overreliance on computers has dulled our reflexes and eroded expertise.
Two cognitive failures undermine performance. Complacency (overconfidence in the computer's ability) causes our attention to wander. Bias (overconfidence in the accuracy of the data we are receiving from the computer) causes us to disregard outside data sources, including conflicting sensory stimuli.
When pilots, soldiers, doctors, or even factory managers lose focus and lack situational awareness, they fail to question suspect data coming from the computer and overlook the external cues that would refute it. The results can be catastrophic.
In two instances in 2009, commercial airline pilots misinterpreted the signals when their autopilot controls disconnected after warnings that the aircraft was about to stall. Rather than pushing the yoke forward to gain airspeed, both pilots heeded faulty control panel data while ignoring environmental cues and pulled back on the yoke, lifting the plane's nose and decreasing airspeed, the exact opposite of what was required. Loss of automation triggered confusion and panic.
Sharply curtailed hands-on flight experience (on a typical passenger flight today, a human pilot mans the controls for just three minutes) resulted in stalled aircraft plunging to earth. Fifty died in Buffalo, New York; 228 perished in the Atlantic Ocean en route to Paris from Rio de Janeiro. The Federal Aviation Administration (FAA) is now pressing airlines to adopt stricter requirements for manual flying hours to offset the risks posed by complacency and bias.
Carr's critics point out that air travel is now safer than ever, with accidents and deaths steadily declining over decades and fatal airline crashes exceedingly rare. Carr concedes this point but still worries that pilots have come to rely so much on computers that they are forgetting how to fly. Andrew McAfee, a researcher at the MIT Sloan School of Management, points out that people have lamented the loss of skills due to technology for many centuries, but on balance, automation has made the world better off. There may be a high-profile crash, but he believes greater automation, not less, is the solution.
Although humans have historically believed that allocating tasks to machines liberates us from the mundane and enables us to pursue the extraordinary, computers have ushered in an altogether different era. Massive data compilation and complex analytical capabilities now mean that decision making, heretofore the sole province of the human brain, is increasingly being accomplished by computers. Offloading tasks to computers now liberates us from complex thinking while relegating us to mundane tasks such as inputting data, observing output, and absentmindedly awaiting equipment failure.
Complacency- and bias-induced errors are piling up. For example, computer programs now highlight suspect spots on mammograms. With the compulsion to examine images scrupulously relieved, radiologists are now missing some early-stage tumors not flagged by the program. Australian researchers found that accountants at two international firms using advanced auditing software had a significantly weaker understanding of the different types of risk than did those at a firm using simpler software that required them to make risk assessment decisions themselves.
Even the most rudimentary tasks, such as editing and spell checking, are now performed differently. Rather than actively participating, we are observers, waiting to be told to correct an error. Are such short-term efficiencies worth the long-term loss of knowledge and expertise?
What's more, software programs are shouldering ever more capabilities heretofore thought to be the exclusive domain of the human brain. Sensory assessment, environmental awareness, coordinated movement, and conceptual knowledge are included in programming that has enabled Google to begin testing its driverless cars on public roads. Some argue that this is precisely the direction in which we should be going: autonomous computers with no human oversight or intervention at all. The solution to pilot error during automation failures? A wholly autonomous autopilot. The solution to doctors' declining diagnostic skills due to complacency and bias? Cut doctors out of the equation altogether.
Carr sees two problems with this thinking. First, complex computer systems require complex interdependencies among databases, algorithms, sensors, software, and hardware. The more mutually dependent elements there are in a system, the greater the potential points of failure and the more difficult they are to find.
Second, we have known for more than three decades that humans are spectacularly bad at precisely the job that increased computerization has relegated to them: passive observation. When not actively engaged, our minds tend to drift off to any topic other than the one we are supposed to be monitoring. What's more, because we now know that "use it or lose it" applies to flying airplanes, diagnosing illnesses, spell-checking, and everything in between, restricting humans to observation reduces experts to rookies, escalating the risk of improper responses to malfunctions.
One solution is to design programs that promote engagement and learning, for example by returning control to the operator at frequent but irregular intervals or by ensuring that challenging tasks are included. If operators must perform and repeat complex manual and mental tasks, the generation effect (we retain skills and knowledge better when we actively produce them than when we passively receive them) will reinforce their expertise.
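To make the first idea concrete, here is a minimal sketch in Python of how an automated system might schedule manual-control periods at irregular intervals. The interval lengths, constant names, and the run_shift function are illustrative assumptions, not anything described in the case.

```python
import random

# Illustrative values only: how long automation may run uninterrupted,
# and how long each manual-control exercise lasts.
MIN_AUTO_MINUTES = 10
MAX_AUTO_MINUTES = 45
MANUAL_MINUTES = 3


def run_shift(total_minutes: int) -> None:
    """Alternate automated and manual control across one operator shift."""
    elapsed = 0
    while elapsed < total_minutes:
        # Automation runs for an unpredictable stretch so the operator
        # cannot simply tune out until a fixed handover time.
        auto_period = random.randint(MIN_AUTO_MINUTES, MAX_AUTO_MINUTES)
        print(f"Automation in control for {auto_period} minutes")
        elapsed += auto_period

        # Hand control back to the operator for a brief hands-on exercise.
        print(f"Manual control requested for {MANUAL_MINUTES} minutes")
        elapsed += MANUAL_MINUTES


if __name__ == "__main__":
    run_shift(total_minutes=8 * 60)  # simulate an eight-hour shift
```

A real system would tie handovers to context and safety constraints (for example, flight phase); the point of the sketch is only that unpredictable handovers keep the operator from lapsing into pure observation.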
Unfortunately, introducing these changes necessarily entails slower software and reduced productivity. Businesses are unlikely to value long-term expertise preservation and development over short-term profits. Who does this technology benefit in the long run?
Case Study Questions
Identify the problem described in this case study. In what sense is it an ethical dilemma?
Should more tasks be automated? Why or why not? Explain your answer.
Can the problem of automation reducing cognitive skills be solved? Explain your answer.