Archives 18 April 2019

Stare Into The Lights My Pretties

Our phones make us feel like social-media activists, but they’re actually turning us into bystanders

On April 9, 2017, a video of a man being dragged off a United Airlines flight was posted on the internet and went viral. But I don’t need to tell you that. Each of your most outspoken Facebook friends probably posted about the event, highlighting the aspects of it that best reinforced their worldview. The incident was covered all over American media and even sparked outrage in China.

The collective focus may have now moved on to its next source of outrage, but there was something that only a few people noticed in the moment: a plane full of quiet passengers. Aside from one screaming woman, hardly anyone on the plane seemed bothered enough by what was happening to raise a ruckus. This calm scene is a rather unlikely precursor to the uproar that unfolded hours later on Facebook and Twitter.

Instead of intervening in the assault, the passengers stoically took out their cameraphones and pointed them toward David Dao, whose body was dragged along the aisle of the airplane, glasses askew, face bloody, and belly exposed. Their immediate response was not to speak out against the outrageousness of what was going on, but to create an instant digital record of the incident.

The act of recording a violent event but staying silent is a modern manifestation of the bystander effect. The bystander effect occurs when people refrain from intervening in an emergency situation because there are other people around. Psychologists Bibb Latané and John Darley, who first demonstrated the bystander effect, attributed this phenomenon to two factors: a perceived diffusion of responsibility (thinking that someone else in the group will help) and social influence (where observers see the inaction of the group as evidence that there is no reason to intervene).

Our cameraphones may make us feel like social-media activists, but when we’re recording an event instead of intervening, we’re actually just real-world bystanders. There is a gulf of dissonance between what we publicly declare as our values—online or otherwise—and how we act.

In the past few years, scores of videos depicting abuse have been recorded and then disseminated online. In New Jersey in 2014, people watched and recorded as a woman was punched and kicked by a co-worker. (The only one who said anything was her 2-year-old child, who instinctively tried to help.) In Philadelphia in 2016, a man beat and punched a woman in the street while an observer videotaped the event. Even without violence, the temptation to be a recording bystander prevails. Take the case of a 2013 fire in Pincourt, Canada, where observers recorded the house burning to the ground from all angles—but nobody called the fire department.

To prevent a culture of disembodied bystanders, we must learn to better assess the appropriate actions when we’re in a situation that demands immediate attention. In doing so, we can hopefully move past the idea that recording an event is a replacement for action.

Sam Gregory is a program director at WITNESS, a global organization that incorporates video technology into human-rights advocacy. The goal of Gregory’s primary project, Mobil-Eyes-Us, is to find ways to translate “co-presence” into action. “In these types of events, people do freeze,” Gregory says. “The goal is to get over the freeze reaction.”

Filming events doesn’t relinquish our moral responsibility to intervene, but Gregory believes it’s “a step up from the Kitty Genovese incident,” an infamous 1964 stabbing in Queens, New York, that 38 neighbors reportedly observed over the course of half an hour without calling the police or stepping in to intervene. If those 38 people had lived in an age of smartphones, you can safely bet what a large portion of them would have been doing.

Gregory says the idea of his project is to develop “witnessing literacy”: a repertoire of actions people can take in order to prevent unethical witnessing. To that end, the WITNESS website has abundant resources and guides, from teaching observers how to capture and preserve video as evidence to how to protect your identity on YouTube. The organization has also produced a mini-guide to capturing ethical footage and a video showing how to share the United Airlines video in a way that protects the victim, David Dao.

This said, documenting an event is only a viable contribution to an urgent situation if the footage is then used in an ethical manner; it’s not the recording that matters, it’s what you do with it. For example, a video of an assault on your phone helps no one if it’s not filed with the police or uploaded to the internet in an effective, ethical way. And with all that effort, wouldn’t it have been better to try to pipe up in the moment? (If all else fails, you might also try to sing, which is what one brave woman did to fend off a man harassing a woman on public transport.)

Viral videos that incite outrage and prod at our sense of justice demonstrate both the difficulty and necessity of acting in accordance with our values. We argue so much online about the actions of people who we do not know and will never meet, and this takes time away from looking at our own actions and preparing ourselves to act better in similar situations. As we thank the one woman on the plane who dared to speak up on the United flight, we should consider what else productive protest looks like so that each of us has a repertoire of counter-violent actions to take.

For now, those of us who wish to believe in a world where people look out for each other will have to take it upon ourselves to lead by example. We should learn how to translate our digital frustrations into analog action.

Microsoft Turned Down Facial-Recognition Sales over “Human Rights Concerns”

Microsoft recently rejected a California law enforcement agency’s request to install facial recognition technology in officers’ cars and body cameras due to human rights concerns, company President Brad Smith said on Tuesday. Microsoft concluded the technology would lead to innocent women and minorities being disproportionately held for questioning, because the underlying artificial intelligence had been trained mostly on images of white men. Multiple research projects have found that such AI systems misidentify women and minorities at higher rates.

Smith explained the decisions as part of a commitment to human rights that he said was increasingly critical as rapid technological advances empower governments to conduct blanket surveillance, deploy autonomous weapons and take other steps that might prove impossible to reverse. Smith also said at a Stanford University conference that Microsoft had declined a deal to install facial recognition on cameras blanketing the capital city of an unnamed country that the nonprofit Freedom House had deemed not free. Smith said it would have suppressed freedom of assembly there.

On the other hand, Microsoft did agree to provide the technology to an American prison, after the company concluded that the environment would be limited and that it would improve safety inside the unnamed institution.

Chinese companies using GPS tracking device smartwatches to monitor, alert street cleaners

Street cleaners in parts of China are reportedly being forced to wear GPS-tracking smartwatches so employers can monitor how hard they work, sparking public outrage and concern over increasing mass surveillance across the country.

If the smartwatch detects a worker standing still for over 20 minutes, it sounds an alarm. “Add oil, add oil [work harder, work harder!],” the wristbands’ alarm says, several cleaners from the eastern city of Nanjing told Jiangsu Television earlier this month.

The smartwatch not only tracks the cleaners’ locations but also reports their activity back to the company’s control room, where a big screen displays their locations as a cluster of red dots on a map.

“It knows everything,” an anonymous cleaner told a reporter in the Jiangsu Television report. “Supervisors will come if we don’t move after hearing the alarm.”

Following the backlash, the company said it removed the alarm function from the smartwatch, but reports indicate employees are still required to wear the device so their locations can be tracked.

The Chinese Government is already in the process of building a Social Credit System aimed at monitoring the behaviour of its 1.4 billion citizens with the help of an extensive network of CCTV cameras and facial recognition technology.

Maya Wang, a senior researcher on China at Human Rights Watch, said the Government’s use of surveillance technology was sending private companies the message that it was “okay to [monitor] people”.