News

Do you trust the machine?

  As Air France Flight 447 plunged toward the Atlantic Ocean at nearly 300 kilometers per hour, pilot Pierre-Cédric Bonin struggled desperately with the plane’s controls. The cause of the emergency was clear: the autopilot had suddenly disengaged after the aircraft’s airspeed sensors iced over, leaving Bonin and his crew in command. The situation demanded that the pilots fly the plane by hand.
  The pilots had little experience with such a situation and struggled to hold the aircraft steady. The cockpit computer flooded them with confusing messages and alerts, yet never told them that the plane was in a stall. The black box recorded Bonin’s last words: “We’re going to crash – this can’t be happening. What the hell is going on?”
  The date was June 1, 2009, and all 228 passengers and crew on board were killed. When an accident like this occurs, involving both humans and machines, its root causes are usually multifaceted. But analysts attribute the crash of Air France Flight 447 in part to over-reliance on and trust in machines – the pilots’ expectation that the autopilot would always function properly and that the information systems would always provide accurate information. And this is far from the only case in which over-reliance on technology has caused death and injury.
  Scientists have studied the phenomenon of “automation bias” in depth and found that it can shade into “automation complacency,” in which people become less likely to notice faults when computers are in charge. Surprisingly, however, this human tendency to over-rely on machines may be rooted in millions of years of biological evolution.
  In a book on why humans over-trust machines, Patricia Adrey of the University of Oklahoma writes, “Over-trust in technology leads to a surprisingly high percentage of errors.” She argues that people generally lack the ability to judge the reliability of a given technology, and the failure cuts both ways: we may reject a computer’s help when accepting it would serve us, or we may trust a device blindly until it harms our interests or even our lives.
  Chimpanzees, among the species most closely related to humans, may offer a clue to why we are such poor judges of machines’ trustworthiness. The reason may be that we are far more attuned to evaluating members of our own kind than machines.
  In a recent experiment at an animal sanctuary in Kenya, researchers built an apparatus that let chimpanzees pull a string for a food reward. Pulling the first string yielded a basic reward – a slice of banana. Pulling the second string promised something more enticing – two slices of banana and a slice of apple – which might be handed out either by the machine or by a fellow chimpanzee.

A Japanese zoo tries to get chimpanzees to learn to use a vending machine.

  The other end of the second string was controlled either by the machine or by another chimp, never both at once. But sometimes the machine failed to dispense the reward, and sometimes the other chimp chose to keep the food for itself. Pulling the second string could therefore yield a bigger reward, but it was the more uncertain choice.
  In this way, the chimpanzees in the experiment faced conditions that were either social or nonsocial. To have a chance at the larger reward, they had to trust either the machine or another chimpanzee.
  The study showed that the chimpanzees were less inclined to choose the second string when a fellow chimpanzee was responsible for handing out the food. They refused to participate in the social trials 12 percent of the time, but in the machine-controlled nonsocial trials only 4 percent of the time. In other words, they trusted the machines more.
  Lou Haux of the Max Planck Institute for Human Development and her colleagues designed and conducted the experiment. “The chimpanzees were more hesitant when they found out they were facing not a machine but another chimpanzee,” she said. The experiment shows that social risk plays an important role in the lives of chimpanzees and humans alike – something only a handful of studies have examined.
  Psychologists call this tendency “betrayal aversion,” Haux explains: “The fear of being deceived by another person (or another chimp) triggers stronger emotions.” She offers an analogy: if you put money into a vending machine and it fails to deliver the drink you paid for, you will no doubt be annoyed. But imagine how you would feel if a bartender took your money and then drank the Coke you ordered right in front of you. You would probably be furious. The difference, of course, is that the vending machine did not set out to cheat you – it simply failed to dispense the goods – whereas the bartender knew full well, when he took your order, that what he was doing would enrage you.
  The research team didn’t stop there, though. They ran a second experiment with the same chimpanzees. Having taken part in the first, the animals already knew how likely the uncertain option was to yield the bigger reward. In effect, the uncertain option was no longer completely uncertain – the chimpanzees understood what they were risking.
  That is when the unexpected result emerged. The chimpanzees stopped distinguishing between the social and nonsocial options; they no longer showed any more trust in machines than in their own kind.

Blind trust in navigation systems can get people into trouble.

  “That’s why we find this result exciting: in the face of considerable uncertainty, the chimpanzees are surprisingly able to distinguish between the social and nonsocial worlds,” Haux said.
  According to Darby Proctor, a psychologist at the Florida Institute of Technology, this makes sense when you consider how important it is for primates to navigate their social environment.
  “When dealing with machines, there is no impact on the future,” she explains. “You don’t have to worry about additional potential social costs.” After all, the chimpanzees that take part in such experiments usually have to go back and live with their companions afterward – and any ill feeling created during the experiment could carry over into those relationships.
  Proctor and her colleagues had previously run similar tests and likewise found that chimpanzees were more willing to trust objects than companions when seeking food rewards. She noted that when a chimpanzee failed to receive a food reward it was due from a companion, the frustrated animal would spit at the companion to express its feelings. “It’s a common sign of unhappiness,” she said.

  Proctor has reservations about whether the chimpanzees in the experiment actually trusted the machine more. After all, the same behavior could be described differently: individuals simply do not react as strongly to bad treatment when no social peer is involved.
  “It’s not that we trust the machine to necessarily give us good feedback; it’s possible that we just think the machine doesn’t have emotions, so we’re more inclined to take a gamble on inanimate objects,” she speculated.
  In any case, evolution seems to have shaped primates’ willingness to face uncertainty – and that willingness depends on whether we feel we are taking a social risk.
  Francesca De Petrillo of the Institute for Advanced Study in Toulouse, who studies primate cognition, argues that evolution has not equipped us to recognize how painful betrayal by a machine can be. Over millions of years we never needed to evaluate machines the way we evaluate our own kind, and so we never evolved the ability. Yet now that technology can profoundly affect human lives, we must assess the capabilities of machines all the same.
  Other factors are involved as well. Beyond evolution, our willingness to trust machines is shaped by an individual’s relevant knowledge and by cultural expectations. A 2019 study showed that people were 29% more likely to hand over their credit card information in a typed chat if they believed they were communicating with a computer rather than another person. The researchers found the effect was even more pronounced among people who already held the notion that machines are more reliable than humans.
  That said, people sometimes show a strong distrust of technology. Surveys repeatedly find that people are uncomfortable with self-driving vehicles or with handing their jobs over to machines. There are multiple reasons for this suspicion of new technology: fear of losing part of one’s role to automation, or doubt that a computer has sufficient judgment and flexibility to perform the task. If you have watched hundreds of videos of robots falling over, or wrestled with a computer that stubbornly refuses to work properly, it is hardly surprising that people resist machines.
  Philipp Kulms, a social psychologist at Bielefeld University in Germany, has studied the factors that shape an individual’s trust in a particular technology. He and a colleague developed a Tetris-like computer game in which a computer controls some of the puzzle pieces as the human player’s teammate. When the computer played well, the human player gained access to more valuable pieces and the team earned bonus points; in that situation, players were more inclined to trust their computer teammate and more willing to trade pieces with it in cooperation. The trust grew out of the way the game itself worked.
  Kulms said, “We were surprised by the very limited set of variables we could manipulate, but they were clearly sufficient.”
  If we accept this view, it makes sense that people typically lack the ability to assess the trustworthiness of machines: from an evolutionary standpoint, we learned to judge trustworthiness only from social cues. Other findings echo this – gamblers bet more money on slot machines designed with anthropomorphic features.
  In other words, not only are we bad at gauging the trustworthiness of machines, we are also easily seduced by mechanical objects when they behave a little like a social partner who has our best interests at heart.
  Thus, people who have learned to put their trust in certain systems (such as an airplane’s autopilot) may find it harder to accept that the machine can be wrong – even when it is – than those who, exasperated by slow, unresponsive computers, distrust all machines.
  De Petrillo mentioned that she developed trust in computers through interacting with voice-activated assistants such as Apple’s Siri and Amazon’s Alexa. “I would assume they were acting in my best interest, so I didn’t need to question them,” she said.
  Kulms’ research shows that for many people, including De Petrillo, trust in a machine persists as long as it appears competent and shows some friendliness. That is why machine designers must ensure their systems behave ethically and operate transparently, Kulms noted.
  The great irony is that behind a seemingly trustworthy machine may stand a person with bad intentions. Placing blind trust in a flawed machine is dangerous enough; trusting a machine designed to deceive is worse still.
  “Given the same risks, if we were dealing with a person instead of a machine, we might be more accustomed to thinking about the potential negative outcomes,” Proctor said. Both she and Haux agree that much work remains to determine just how much the chimpanzees in the study trusted the machines, and how much that finding reveals about human behavior.
  But the excessive trust humans show in machines may come down to one simple fact: we evolved as social animals in a world where machines did not exist. Now the machines are here, and every day we place our money, our personal information, even our lives in their care. Trusting machines is not necessarily wrong – we are simply not very good judges of when to do so.
