Archives 2017

That Game on Your Phone May Be Tracking What You’re Watching on TV

At first glance, the gaming apps — with names like “Pool 3D,” “Beer Pong: Trickshot” and “Real Bowling Strike 10 Pin” — seem innocuous. One called “Honey Quest” features Jumbo, an animated bear.

Yet these apps, once downloaded onto a smartphone, have the ability to keep tabs on the viewing habits of their users — some of whom may be children — even when the games aren’t being played.

It is yet another example of how companies, using devices that many people feel they can’t do without, are documenting how audiences in a rapidly changing entertainment landscape are viewing television and commercials.

The apps use software from Alphonso, a start-up that collects TV-viewing data for advertisers. Using a smartphone’s microphone, Alphonso’s software can detail what people watch by identifying audio signals in TV ads and shows, sometimes even matching that information with the places people visit and the movies they see. The information can then be used to target ads more precisely and to try to analyze things like which ads prompted a person to go to a car dealership.
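Alphonso has not published its algorithm, but audio content recognition of this kind generally works by hashing short, coarsely quantized windows of audio and matching them against a database of reference fingerprints for known ads and shows. The sketch below is a deliberately simplified illustration of that idea; the function names and the toy quantize-and-hash scheme are invented for this example, and production systems use spectral features that are robust to noise and misalignment.

```python
import hashlib

def fingerprint(samples, window=4):
    """Toy fingerprint: hash coarsely quantized windows of the signal."""
    prints = []
    for i in range(0, len(samples) - window + 1, window):
        # Round each sample so small amounts of noise hash identically.
        coarse = tuple(round(s, 1) for s in samples[i:i + window])
        prints.append(hashlib.sha1(repr(coarse).encode()).hexdigest()[:8])
    return prints

def identify(mic_clip, ad_database):
    """Name any reference ad most of whose windows appear in the clip."""
    clip_prints = set(fingerprint(mic_clip))
    for name, ad_samples in ad_database.items():
        ad_prints = set(fingerprint(ad_samples))
        if len(ad_prints & clip_prints) >= 0.5 * len(ad_prints):
            return name
    return None

# Reference "ad" audio, and a longer microphone recording that contains it.
ads = {"car_ad": [0.1, 0.5, 0.9, 0.3, 0.2, 0.8, 0.4, 0.6]}
living_room = [0.0] * 4 + ads["car_ad"] + [0.0] * 4
print(identify(living_room, ads))  # car_ad
```

Because only compact hashes need to leave the device, a scheme like this can report what a television is playing without uploading raw audio, which is how such software can run continuously in the background.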

More than 250 games that use Alphonso software are available in the Google Play store; some are also available in Apple’s app store.

Some of the tracking is taking place through gaming apps that do not otherwise involve a smartphone’s microphone, including some apps that are geared toward children. The software can also detect sounds even when a phone is in a pocket if the apps are running in the background.

12 Days In Xinjiang — China’s Surveillance State

Urumqi, China – This city on China’s Central Asia frontier may be one of the most closely surveilled places on earth.

Security checkpoints with identification scanners guard the train station and roads in and out of town. Facial scanners track comings and goings at hotels, shopping malls and banks. Police use hand-held devices to search smartphones for encrypted chat apps, politically charged videos and other suspect content. To fill up with gas, drivers must first swipe their ID cards and stare into a camera.

China’s efforts to snuff out a violent separatist movement by some members of the predominantly Muslim Uighur ethnic group have turned the autonomous region of Xinjiang, of which Urumqi is the capital, into a laboratory for high-tech social controls that civil-liberties activists say the government wants to roll out across the country.

It is nearly impossible to move about the region without feeling the unrelenting gaze of the government. Citizens and visitors alike must run a daily gantlet of police checkpoints, surveillance cameras and machines scanning their ID cards, faces, eyeballs and sometimes entire bodies.

When fruit vendor Parhat Imin swiped his card at a telecommunications office this summer to pay an overdue phone bill, his photo popped up with an “X.” Since then, he says, every scan of his ID card sets off an alarm. He isn’t sure what it signifies, but figures he is on some kind of government watch list because he is a Uighur and has had intermittent run-ins with the police.

He says he is reluctant to travel for fear of being detained. “They blacklisted me,” he says. “I can’t go anywhere.”

All across China, authorities are rolling out new technology to keep watch over people and shape their behavior. Controls on expression have tightened under President Xi Jinping, and the state’s vast security web now includes high-tech equipment to monitor online activity and even snoop in smartphone messaging apps.

China’s government has been on high alert since a surge in deadly terrorist attacks around the country in 2014 that authorities blamed on Xinjiang-based militants inspired by extremist Islamic messages from abroad. Now officials are putting the world’s most state-of-the-art tools in the hands of a ramped-up security force to create a system of social control in Xinjiang—one that falls heaviest on Uighurs.

At a security exposition in October, an executive of Guangzhou-based CloudWalk Technology Co., which has sold facial-recognition algorithms to police and identity-verification systems to gas stations in Xinjiang, called the region the world’s most heavily guarded place. According to the executive, Jiang Jun, for every 100,000 people the police in Xinjiang want to monitor, they use the same amount of surveillance equipment that police in other parts of China would use to monitor millions.

Authorities in Xinjiang declined to respond to questions about surveillance. Top party officials from Xinjiang said at a Communist Party gathering in Beijing in October that “social stability and long-term security” were the local government’s bottom-line goals.

Chinese and foreign civil-liberty activists say the surveillance in this northwestern corner of China offers a preview of what is to come nationwide.

“They constantly take lessons from the high-pressure rule they apply in Xinjiang and implement them in the east,” says Zhu Shengwu, a Chinese human-rights lawyer who has worked on surveillance cases. “What happens in Xinjiang has bearing on the fate of all Chinese people.”

During an October road trip into Xinjiang along a modern highway, two Wall Street Journal reporters encountered a succession of checkpoints that turned the ride into a strange and tense journey.

At Xingxing Gorge, a windswept pass used centuries ago by merchants plying the Silk Road, police inspected incoming traffic and verified travelers’ identities. The Journal reporters were stopped, ordered out of their car and asked to explain the purpose of their visit. Drivers, mostly those who weren’t Han Chinese, were guided through electronic gateways that scanned their ID cards and faces.

Farther along, at the entrance to Hami, a city of a half-million, police had the Journal reporters wait in front of a bank of TV screens showing feeds from nearby surveillance cameras while recording their passport numbers.

Surveillance cameras loomed every few hundred feet along the road into town, blanketed street corners and kept watch on patrons of a small noodle shop near the main mosque. The proprietress, a member of the Muslim Hui minority, said the government ordered all restaurants in the area to install the devices earlier this year “to prevent terrorist attacks.”

Days later, as the Journal reporters were driving on a dirt road in Shanshan county after being ordered by officials to leave a nearby town, a police cruiser materialized seemingly from nowhere. It raced past, then skidded to a diagonal stop, kicking up a cloud of dust and blocking the reporters’ car. An SUV pulled up behind. A half-dozen police ordered the reporters out of the car and demanded their passports.

An officer explained that surveillance cameras had read the out-of-town license plates and sent out an alert. “We check every car that’s not from Xinjiang,” he said. The police then escorted the reporters to the highway.

At checkpoints farther west, iris and body scanners are added to the security arsenal.

Darren Byler, an anthropology researcher at the University of Washington who spent two years in Xinjiang studying migration, says the closest contemporary parallel can be found in the West Bank and Gaza Strip, where the Israeli government has created a system of checkpoints and biometric surveillance to keep tabs on Palestinians.

In Erdaoqiao, the neighborhood where the fruit vendor Mr. Imin lives, small booths known as “convenience police stations,” marked by flashing lights atop a pole, appear every couple of hundred yards. The police stationed there offer water, cellphone charging and other services, while also taking in feeds from nearby surveillance cameras.

Young Uighur men are routinely pulled into the stations for phone checks, leading some to keep two devices—one for home use and another, with no sensitive content or apps, for going out, according to Uighur exiles.

Erdaoqiao, the heart of Uighur culture and commerce in Urumqi, is where ethnic riots started in 2009 that resulted in numerous deaths. The front entrance to Erdaoqiao Mosque is now closed, as are most entries to the International Grand Bazaar. Visitors funnel through a heavily guarded main gate. The faces and ID cards of Xinjiang residents are scanned. An array of cameras keeps watch.

After the riots, authorities showed up to shut down the shop Mr. Imin was running at the time, which sold clothing and religious items. When he protested, he says, they clubbed him on the back of the head, which has left him walking with a limp. They jailed him for six months for obstructing official business, he says. Other jail stints followed, including eight months for buying hashish.

The police in Urumqi didn’t respond to requests for comment.

Mr. Imin now sells fruit and freshly squeezed pomegranate juice from a cart. He worries that his flagged ID card will bring the police again. Recently remarried, he hasn’t dared visit his new wife’s family in southern Xinjiang.

Chinese rulers have struggled for two millennia to control Xinjiang, whose 23 million people are scattered over an expanse twice the size of Texas. Beijing sees it as a vital piece of President Xi’s trillion-dollar “Belt and Road” initiative to build infrastructure along the old Silk Road trade routes to Europe.

Last year, Mr. Xi installed a new Xinjiang party chief, Chen Quanguo, who previously handled ethnic strife in Tibet, another hot spot. Mr. Chen pioneered the convenience police stations in that region, partly in response to a string of self-immolations by monks protesting Chinese rule.

Under Mr. Chen, the police presence in Xinjiang has skyrocketed, based on data showing exponential increases in police-recruitment advertising. Local police departments last year began ordering cameras capable of creating three-dimensional face images as well as DNA sequencers and voice-pattern analysis systems, according to government procurement documents uncovered by Human Rights Watch and reviewed by the Journal.

During the first quarter of 2017, the government announced the equivalent of more than $1 billion in security-related investment projects in Xinjiang, up from $27 million in all of 2015, according to research in April by Chinese brokerage firm Industrial Securities.

Government procurement orders show millions spent on “unified combat platforms”—computer systems to analyze surveillance data from police and other government agencies.

Tahir Hamut, a Uighur poet and filmmaker, says Uighurs who had passports were called in to local police stations in May. He worried he would draw extra scrutiny for having been accused of carrying sensitive documents, including newspaper articles about Uighur separatist attacks, while trying to travel to Turkey to study in the mid-1990s. The aborted trip landed him in a labor camp for three years, he says.

He and his wife lined up at a police station with other Uighurs to have their fingerprints and blood samples taken. He says he was asked to read a newspaper for two minutes while police recorded his voice, and to turn his head slowly in front of a camera.

Later, his family’s passports were confiscated. After a friend was detained by police, he says, he assumed he also would be taken away. He says he paid officials a bribe of more than $9,000 to get the passports back, making up a story that his daughter had epilepsy requiring treatment in the U.S. Xinjiang’s Public Security Bureau, which is in charge of the region’s police forces, didn’t respond to a request for comment about the bribery.

“The day we left, I was filled with anxiety,” he says. “I worried what would happen if we were stopped going through security at the Urumqi airport, or going through border control in Beijing.”

He and his family made it to Virginia, where they have applied for political asylum.

Chinese authorities use forms to collect personal information from Uighurs. One form reviewed by the Journal asks about respondents’ prayer habits and whether they have contacts abroad. There are sections for officials to rate “persons of interest” on a six-point scale and check boxes on whether they are “safe,” “average” or “unsafe.”

China Communications Services Co. Ltd., a subsidiary of state telecom giant China Telecom, has signed contracts this year worth more than $38 million to provide mosque surveillance and install surveillance-data platforms in Xinjiang, according to government procurement documents. The company declined to discuss the contracts, saying they constituted sensitive business information.

Xiamen Meiya Pico Information Co. Ltd. worked with police in Urumqi to adapt a hand-held device it sells for investigating economic crimes so it can scan smartphones for terrorism-related content.

A description of the device that recently was removed from the company’s website said it can read the files on 90% of smartphones and check findings against a police antiterror database. “Mostly, you’re looking for audio and video,” said Zhang Xuefeng, Meiya Pico’s chief marketing officer, in an interview.

Near the Xinjiang University campus in Urumqi, police sat at a wooden table recently, ordering some people walking by to hand over their phones.

“You just plug it in and it shows you what’s on the phone,” said one officer, brandishing a device similar to the one on Meiya Pico’s website. He declined to say what content they were checking for.

One recent afternoon in Korla, one of Xinjiang’s largest cities, only a trickle of people passed through the security checkpoint at the local bazaar, where vendors stared at darkened hallways empty of shoppers.

Li Qiang, the Han Chinese owner of a wine shop, said the security checks, while necessary for safety, were getting in the way of commerce. “As soon as you go out, they check your ID,” he said.

Authorities have built a network of detention facilities, officially referred to as education centers, across Xinjiang. In April, the official Xinjiang Daily newspaper said more than 2,000 people had been sent to a “study and training center” in the southern city of Hotan.

One new compound sits a half-hour drive south of Kashgar, a Uighur-dominated city near the border with Kyrgyzstan. It is surrounded by imposing walls topped with razor wire, with watchtowers at two corners. A slogan painted on the wall reads: “All ethnic groups should be like the pods of a pomegranate, tightly wrapped together.”

Villagers describe it as a detention center. A man standing near the entrance one recent night said it was a school and advised reporters to leave.

Mr. Hamut, the poet, says a relative in Kashgar was taken to a detention center after she participated in an Islamic ceremony, and another went missing soon after the family tried to call him from the U.S.

The local government in Kashgar didn’t respond to a request for comment.

Surveillance in and around Kashgar, where Han Chinese make up less than 7% of the population, is even tighter than in Urumqi. Drivers entering the city are screened intensively. A machine scans each driver’s face. Police officers inspect the engine and the trunk. Passengers must get out and run their bags through X-ray machines.

In Aksu, a dusty city a five-hour drive east of Kashgar, knife salesman Jiang Qiankun says his shop had to pay thousands of dollars for a machine that turns a customer’s ID card number, photo, ethnicity and address into a QR code that it lasers into the blade of any knife it sells. “If someone has a knife, it has to have their ID card information,” he says.

On the last day the Journal reporters were in Xinjiang, an unmarked car trailed them on a 5 a.m. drive to the Urumqi airport. During their China Southern Airlines flight to Beijing, a flight attendant appeared to train a police-style body camera attached to his belt on the reporters. Later, as passengers were disembarking, the attendant denied filming them, saying it was common for airline crew to wear the cameras as a security measure.

China Southern says the crew member was an air marshal, charged with safety on board.

How Facebook’s Political Unit Enables the Dark Art of Digital Propaganda

Under fire for Facebook Inc.’s role as a platform for political propaganda, co-founder Mark Zuckerberg has punched back, saying his mission is above partisanship. “We hope to give all people a voice and create a platform for all ideas,” Zuckerberg wrote in September after President Donald Trump accused Facebook of bias. Zuckerberg’s social network is a politically agnostic tool for its more than 2 billion users, he has said. But Facebook, it turns out, is no bystander in global politics. What he hasn’t said is that his company actively works with political parties and leaders, including those who use the platform to stifle opposition — sometimes with the aid of “troll armies” that spread misinformation and extremist ideologies.

The initiative is run by a little-known Facebook global government and politics team that’s neutral in that it works with nearly anyone seeking or securing power. The unit is led from Washington by Katie Harbath, a former Republican digital strategist who worked on former New York Mayor Rudy Giuliani’s 2008 presidential campaign. Since Facebook hired Harbath three years later, her team has traveled the globe helping political clients use the company’s powerful digital tools. In some of the world’s biggest democracies — from India and Brazil to Germany and the U.K. — the unit’s employees have become de facto campaign workers. And once a candidate is elected, the company in some instances goes on to train government employees or provide technical assistance for live streams at official state events.

Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked

Adam Alter (2017)

Welcome to the age of behavioral addiction—an age in which half of the American population is addicted to at least one behavior. We obsess over our emails, Instagram likes, and Facebook feeds; we binge on TV episodes and YouTube videos; we work longer hours each year; and we spend an average of three hours each day using our smartphones. Half of us would rather suffer a broken bone than a broken phone, and Millennial kids spend so much time in front of screens that they struggle to interact with real, live humans.

In this revolutionary book, Adam Alter, a professor of psychology and marketing at NYU, tracks the rise of behavioral addiction, and explains why so many of today’s products are irresistible. Though these miraculous products melt the miles that separate people across the globe, their extraordinary and sometimes damaging magnetism is no accident. The companies that design these products tweak them over time until they become almost impossible to resist.

By reverse engineering behavioral addiction, Alter explains how we can harness addictive products for the good—to improve how we communicate with each other, spend and save our money, and set boundaries between work and play—and how we can mitigate their most damaging effects on our well-being, and the health and happiness of our children.

Almost 45 million tons of e-waste discarded last year

A new study claims 44.7 million metric tons (49.3 million tons) of TV sets, refrigerators, cellphones and other electrical goods were discarded last year, with only a fifth recycled to recover the valuable raw materials inside.

The U.N.-backed study published Wednesday calculates that the amount of e-waste thrown away in 2016 included a million tons of chargers alone.

The U.S. accounted for 6.3 million metric tons, partly because the American market for heavy goods is saturated.

The study says all the gold, silver, copper and other valuable materials would have been worth $55 billion had they been recovered.

The authors of the Global E-waste Monitor predict that e-waste, defined as anything with a battery or a cord, will increase to 52.2 million metric tons by 2021.

Commercial Spyware is “Out of Control”

Throughout 2016 and 2017, individuals in Canada, the United States, Germany, Norway, the United Kingdom, and numerous other countries began to receive suspicious emails. It wasn’t just common spam. These people were chosen.

The emails were specifically designed to entice each individual to click a malicious link. Had the targets done so, their internet connections would have been hijacked and surreptitiously directed to servers laden with malware designed by a surveillance company in Israel. The spies who contracted the Israeli company’s services would have been able to monitor everything those targets did on their devices, including remotely activating the camera and microphone.

Who was behind this global cyber espionage campaign? Was it the National Security Agency? Or one of its “five eyes” partners, like Britain’s GCHQ or Canada’s CSE? Given that it was done using Israeli-made technology, perhaps it was Israel’s elite signals intelligence agency, Unit 8200?

In fact, it was none of them. Behind this sophisticated international spying operation was one of the poorest countries in the world; a country where less than 5 percent of the population has access to the internet; a country run by an autocratic government routinely flagged for human rights abuses and corruption. Behind this operation was… Ethiopia.

The details of this remarkable clandestine activity are outlined in a new Citizen Lab report published today entitled “Champing at the Cyberbit.” In our report my co-authors and I detail how we monitored the command and control servers used in the campaign and in doing so discovered a public log file that the operators mistakenly left open. That log file provided us with a window, for roughly a year, into the attackers’ activities, infrastructure, and operations. Strong circumstantial evidence points to one or more government agencies in Ethiopia as the responsible party.

We were also able to identify the IP addresses of those who were targeted and successfully infected: a group that includes journalists, a lawyer, activists, and academics. Our access also allowed us to enumerate the countries in which the targets were located. Many of the countries in which the targets live—the United States, Canada, and Germany, among others—have strict wiretapping laws that make it illegal to eavesdrop without a warrant. It seems individuals in Ethiopia broke those laws.

If a government wants to collect evidence on a person in another country, it is customary for it to make a formal legal request to other governments through a process like the Mutual Legal Assistance Treaties. Ethiopia appears to have sidestepped all of that. International norms would suggest a formal démarche to Ethiopia from the governments whose citizens it monitored without permission, but that may happen quietly if at all.

Our team reverse-engineered the malware used in this instance, and over time this allowed us to positively identify the company whose spyware was being employed by Ethiopia: Cyberbit Solutions, a subsidiary of the Israel-based homeland security company Elbit Systems. Notably, Cyberbit is the fourth company we have identified, alongside Hacking Team, Finfisher, and NSO Group, whose products and services have been abused by autocratic regimes to target dissidents, journalists, and others. Along with NSO Group, it’s the second Israel-based company whose technology has been used in this way.

Israel does regulate the export of commercial spyware abroad, although apparently not very well from a human-rights perspective. Cyberbit was able to sell its services to Ethiopia—a country with not only a well-documented history of governance and human rights problems, but also a track record of abusing spyware. When considered alongside the extensive reporting we have done about UAE and Mexican government misuse of NSO Group’s services, it’s safe to conclude Israel has a commercial spyware control problem.

How big of a problem? Remarkably, by analyzing the command and control servers of the cyber espionage campaign, we were also able to monitor Cyberbit employees as they traveled the world with infected laptops that checked in to those servers, apparently demonstrating Cyberbit’s products to prospective clients. Those clients include the Royal Thai Army, Uzbekistan’s National Security Service, Zambia’s Financial Intelligence Centre, and the Philippine president’s Malacañang Palace. Outlining the human rights abuses associated with those government entities would fill volumes.

Cyberbit, for its part, has responded to Citizen Lab’s findings: “Cyberbit Solutions offers its products only to sovereign governmental authorities and law enforcement agencies,” the company wrote me on November 29. “Such governmental authorities and law enforcement agencies are responsible to ensure that they are legally authorized to use the products in their jurisdictions.” The company declined to confirm or deny that the government of Ethiopia is a client, but did note that “Cyberbit Solutions can confirm that any transaction made by it was approved by the competent authorities.”

Governments like Ethiopia no longer depend on their own in-country advanced computer science, engineering, and mathematical capacity in order to build a globe-spanning cyber espionage operation. They can simply buy it off the shelf from a company like Cyberbit. Thanks to companies like these, an autocrat whose country has poor national infrastructure but whose regime has billions of dollars can order up their own NSA. To wit: Elbit Systems, the parent company of Cyberbit, says it has a backlog of orders valuing $7 billion. An investment firm recently sought to acquire a partial stake in NSO Group for a reported $400 million before eventually withdrawing its offer.

Of course, these companies insist that the spyware they sell to governments is used exclusively to fight terrorists and investigate crime. Sounds reasonable, and no doubt many do just that. But the problem is that when journalists, academics, or NGOs seek to expose corrupt dictators or hold them accountable, those truth tellers may be labelled criminals or terrorists. And our research has shown that this makes those individuals and groups vulnerable to this type of state surveillance, even if they live abroad.

Indeed, we discovered that the second-largest concentration of successful infections in this Ethiopian operation is located in Canada. Among the targets whose identities we were able to verify and name in the report, what unites them all is their peaceful political opposition to the Ethiopian government. Except one. Astoundingly, Citizen Lab researcher Bill Marczak, who led our technical investigation, was himself targeted at one point by the espionage operators.

Countries sliding into authoritarianism and corruption. A booming and largely unregulated market for sophisticated surveillance. Civilians not equipped to defend themselves. Add these ingredients together, and you have a serious crisis of democracy brewing. Companies like Cyberbit market themselves as part of a solution to cyber security. But it is evident that commercial spyware is actually contributing to a very deep insecurity instead.

Remedying this problem will not be easy. It will require legal and policy efforts across multiple jurisdictions and involving governments, civil society, and the private sector. A companion piece to the report outlines some measures that could hopefully begin that process, including application of relevant criminal laws. If the international community does not act swiftly, journalists, activists, lawyers, and human rights defenders will be increasingly infiltrated and neutralized. It’s time to address the commercial spyware industry for what it has become: one of the most dangerous cyber security problems of our day.

With teen mental health deteriorating over five years, screens a likely culprit

Jean Twenge, Professor of Psychology at San Diego State University, writes:

In just the five years between 2010 and 2015, the number of U.S. teens who felt useless and joyless–classic symptoms of depression–surged 33 percent in large national surveys. Teen suicide attempts increased 23 percent. Even more troubling, the number of 13-to-18-year-olds who committed suicide jumped 31 percent.

In a new paper published in Clinical Psychological Science, my colleagues and I found that the increases in depression, suicide attempts and suicide appeared among teens from every background–more privileged and less privileged, across all races and ethnicities and in every region of the country. All told, our analysis found that the generation of teens I call “iGen” (those born after 1995) is much more likely to experience mental health issues than their millennial predecessors.

Teens now spend much less time interacting with their friends in person. Feeling socially isolated is also one of the major risk factors for suicide. We found that teens who spent more time than average online and less time than average with friends in person were the most likely to be depressed. Since 2012, that’s what has occurred en masse: Teens have spent less time on activities known to benefit mental health (in-person social interaction) and more time on activities that may harm it (time online).

Teens are also sleeping less, and teens who spend more time on their phones are more likely to not be getting enough sleep. Not sleeping enough is a major risk factor for depression, so if smartphones are causing less sleep, that alone could explain why depression and suicide increased so suddenly.

But some vulnerable teens who would otherwise not have had mental health issues may have slipped into depression due to too much screen time, not enough face-to-face social interaction, inadequate sleep or a combination of all three.

It might be argued that it’s too soon to recommend less screen time, given that the research isn’t completely definitive. However, the downside to limiting screen time – say, to two hours a day or less – is minimal. In contrast, the downside to doing nothing – given the possible consequences of depression and suicide – seems, to me, quite high.

It’s not too early to think about limiting screen time; let’s hope it’s not too late.

Over 400 of the World’s Most Popular Websites Record Your Every Keystroke

The idea of websites tracking users isn’t new, but research from Princeton University released last week indicates that online tracking is far more invasive than most users understand.

In the first installment of a series titled “No Boundaries,” three researchers from Princeton’s Center for Information Technology Policy (CITP) explain how third-party scripts that run on many of the world’s most popular websites track your every keystroke and then send that information to a third-party server.

Some highly trafficked sites run software that records every time you click and every word you type. If you go to a website, begin to fill out a form, and then abandon it, every letter you entered is still recorded, according to the researchers’ findings. If you accidentally paste something from your clipboard into a form, that is recorded too. These bits of code that websites run are called “session replay” scripts. Companies use them to gain insight into how customers use their sites and to identify confusing webpages. But the scripts don’t just aggregate general statistics; they record, and are capable of playing back, individual browsing sessions.

The scripts don’t run on every page, but are often placed on pages where users input sensitive information, like passwords and medical conditions. Most troubling is that the information session replay scripts collect can’t “reasonably be expected to be kept anonymous,” according to the researchers.
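The mechanics are easy to picture. The hypothetical recorder below is invented for illustration (real session-replay libraries also capture mouse movement, DOM state, and more): it logs each keystroke the moment it is typed, so the data already sits on the vendor’s side even if the visitor abandons the form before submitting.

```python
class SessionRecorder:
    """Toy session-replay logger: every keystroke is captured as it is
    typed, whether or not the form is ever submitted."""

    def __init__(self):
        self.seq = 0
        self.events = []  # (sequence number, field name, key)

    def on_keystroke(self, field, key):
        self.seq += 1
        self.events.append((self.seq, field, key))

def replay(events):
    """Reconstruct the contents of each form field from the raw event log."""
    fields = {}
    for _, field, key in events:
        if key == "BACKSPACE":
            fields[field] = fields.get(field, "")[:-1]
        else:
            fields[field] = fields.get(field, "") + key
    return fields

rec = SessionRecorder()
for ch in "alice@example.com":
    rec.on_keystroke("email", ch)
for ch in "hunter2":
    rec.on_keystroke("password", ch)
rec.on_keystroke("password", "BACKSPACE")
# The visitor closes the tab without ever submitting -- but the log
# already holds everything typed, including the deleted character.
print(replay(rec.events))  # {'email': 'alice@example.com', 'password': 'hunter'}
```

This is why the researchers note the collected data cannot reasonably be expected to stay anonymous: the replayed fields are the literal text the user typed, not aggregate statistics.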


You may be sick of worrying about online privacy, but ‘surveillance apathy’ is also a problem

Siobhan Lyons, Scholar in Media and Cultural Studies, Macquarie University, writes in The Conversation:

We all seem worried about privacy. Though it’s not only privacy itself we should be concerned about: it’s also our attitudes towards privacy that are important.

When we stop caring about our digital privacy, we witness surveillance apathy.

And it’s something that may be particularly significant for marginalised communities, who feel they hold no power to navigate or negotiate fair use of digital technologies.

In the wake of the NSA leaks in 2013 led by Edward Snowden, we are more aware of the machinations of online companies such as Facebook and Google. Yet research shows some of us are apathetic when it comes to online surveillance.

Privacy and surveillance

Attitudes to privacy and surveillance in Australia are complex.

According to a major 2017 privacy survey, around 70% of us are more concerned about privacy than we were five years ago.

And yet we still increasingly embrace online activities. A 2017 report on social media conducted by search marketing firm Sensis showed that almost 80% of internet users in Australia now have a social media profile, an increase of around ten points from 2016. The data also showed that Australians are on their accounts more frequently than ever before.

Also, most Australians appear unconcerned about the recently proposed implementation of facial recognition technology. Only around one in three (32% of 1,486) respondents to a Roy Morgan study expressed worries about having their faces available on a mass database.

A recent ANU poll revealed a similar sentiment, with recent data retention laws supported by two-thirds of Australians.

So while we’re aware of the issues with surveillance, we aren’t necessarily doing anything about it, or we’re prepared to make compromises when we perceive our safety is at stake.

Across the world, attitudes to surveillance vary. Around half of Americans polled in 2013 found mass surveillance acceptable. France, Britain and the Philippines appeared more tolerant of mass surveillance compared to Sweden, Spain, and Germany, according to 2015 Amnesty International data.

Apathy and marginalisation

In 2015, philosopher Slavoj Žižek proclaimed that he did not care about surveillance (though admittedly suggesting that “perhaps here I preach arrogance”).

This position cannot be assumed by all members of society. Australian academic Kate Crawford argues the impact of data mining and surveillance is more significant for marginalised communities, including people of different races, genders and socioeconomic backgrounds. American academics Shoshana Magnet and Kelley Gates agree, writing:

[…] new surveillance technologies are regularly tested on marginalised communities that are unable to resist their intrusion.

A 2015 White House report found that big data can be used to perpetuate price discrimination among people of different backgrounds. It showed how data surveillance “could be used to hide more explicit forms of discrimination”.

According to Ira Rubinstein, a senior fellow at New York University’s Information Law Institute, ignorance and cynicism are often behind surveillance apathy. Users are either ignorant of the complex infrastructure of surveillance, or they believe they are simply unable to avoid it.

As the White House report stated, consumers “have very little knowledge” about how data is used in conjunction with differential pricing.

So in contrast to the oppressive panopticon (a circular prison with a central watchtower) envisioned by philosopher Jeremy Bentham, we have what Siva Vaidhyanathan calls the “cryptopticon”. The cryptopticon is “not supposed to be intrusive or obvious. Its scale, its ubiquity, even its very existence, are supposed to go unnoticed”.

But Melanie Taylor, lead artist of the computer game Orwell (which puts players in the role of a surveillance agent), noted that many simply remain indifferent despite heightened awareness:

That’s the really scary part: that Snowden revealed all this, and maybe nobody really cared.

The Facebook trap

Surveillance apathy can be linked to people’s dependence on “the system”. As one of my media students pointed out, no matter how much awareness users have regarding their social media surveillance, invariably people will continue using these platforms. This is because they are convenient, practical, and “we are creatures of habit”.

As University of Melbourne scholar Suelette Dreyfus noted in a Four Corners report on Facebook:

Facebook has very cleverly figured out how to wrap itself around our lives. It’s the family photo album. It’s your messaging to your friends. It’s your daily diary. It’s your contact list.

This dependence, along with the complex algorithms Facebook and Google use to collect data and produce “filter bubbles” or “you loops”, is another issue.

Protecting privacy

While some people are attempting to delete themselves from the network, others have come up with ways to avoid being tracked online.

The search engine DuckDuckGo and the Tor Browser allow users to search and browse the web without being tracked. Lightbeam, meanwhile, lets users see how their information is being tracked by third-party companies. And MIT devised a system, called Immersion, to show people the metadata of their emails.

Surveillance apathy is more disconcerting than surveillance itself. Our very attitudes about privacy will inform the structure of surveillance itself, so caring about it is paramount.

How Facebook Figures Out Everyone You’ve Ever Met

From Slashdot:

“I deleted Facebook after it recommended as People You May Know a man who was defense counsel on one of my cases. We had only communicated through my work email, which is not connected to my Facebook, which convinced me Facebook was scanning my work email,” an attorney told Gizmodo. Kashmir Hill, a reporter at the news outlet, who recently documented how Facebook figured out a connection between her and a family member she did not know existed, shares several more instances others have reported and explains how Facebook gathers information. She reports:

Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections. Because shadow-profile connections happen inside Facebook’s algorithmic black box, people can’t see how deep the data-mining of their lives truly is, until an uncanny recommendation pops up. Facebook isn’t scanning the work email of the attorney above. But it likely has her work email address on file, even if she never gave it to Facebook herself. If anyone who has the lawyer’s address in their contacts has chosen to share it with Facebook, the company can link her to anyone else who has it, such as the defense counsel in one of her cases. Facebook will not confirm how it makes specific People You May Know connections, and a Facebook spokesperson suggested that there could be other plausible explanations for most of those examples — “mutual friendships,” or people being “in the same city/network.” The spokesperson did say that of the stories on the list, the lawyer was the likeliest case for a shadow-profile connection. Handing over address books is one of the first steps Facebook asks people to take when they initially sign up, so that they can “Find Friends.”

The problem with all this, Hill writes, is that Facebook doesn’t explicitly disclose the scale at which it uses the contact information it gleans from a user’s address book. Furthermore, most people are not aware that Facebook is using contact information taken from their phones for these purposes.
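The linking Hill describes can be illustrated with a toy contact graph. This is a hypothetical sketch of the general technique, not Facebook’s actual system (all names and addresses are invented): if two users each upload an address book containing the same email address, a network can connect them through that address even though its owner never supplied it.

```python
from collections import defaultdict

# Address books uploaded by two users who have never met on the network.
uploads = {
    "alice": {"lawyer@firm.example", "mom@home.example"},
    "bob":   {"lawyer@firm.example", "dentist@clinic.example"},
}

# Invert the data: map each contact detail to the set of users who
# uploaded an address book containing it.
seen_by = defaultdict(set)
for uploader, contacts in uploads.items():
    for contact in contacts:
        seen_by[contact].add(uploader)

def linked_via(contact):
    """Users connected to one another through a shared contact detail."""
    return sorted(seen_by[contact])

# The lawyer's work address appears in both uploads, so the network can
# link alice and bob to each other, and link the lawyer to both of them,
# even though the lawyer never gave the network that address.
```

Calling `linked_via("lawyer@firm.example")` returns `["alice", "bob"]`: the “shadow profile” connection exists entirely in other people’s uploads.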

The Seemingly Pervasive Sinister Side of Algorithmic Screen Time for Children

Writer and artist James Bridle writes in Medium:

“Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatize, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.

To begin: Kid’s YouTube is definitely and markedly weird. I’ve been aware of its weirdness for some time. Last year, there were a number of articles posted about the Surprise Egg craze. Surprise Eggs videos depict, often at excruciating length, the process of unwrapping Kinder and other egg toys. That’s it, but kids are captivated by them. There are thousands and thousands of these videos and thousands and thousands, if not millions, of children watching them. […] What I find somewhat disturbing about the proliferation of even (relatively) normal kids videos is the impossibility of determining the degree of automation which is at work here; how to parse out the gap between human and machine.”

Sapna Maheshwari also explores in The New York Times:

“Parents and children have flocked to Google-owned YouTube Kids since it was introduced in early 2015. The app’s more than 11 million weekly viewers are drawn in by its seemingly infinite supply of clips, including those from popular shows by Disney and Nickelodeon, and the knowledge that the app is supposed to contain only child-friendly content that has been automatically filtered from the main YouTube site. But the app contains dark corners, too, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms. In recent months, parents like Ms. Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes.”

Very horrible and creepy.

The Video Game That Could Shape the Future of War

“As far as video games go, Operation Overmatch is rather unremarkable. Players command military vehicles in eight-on-eight matches against the backdrop of rendered cityscapes — a common setup of games that sometimes have the added advantage of hundreds of millions of dollars in development budgets. Overmatch does have something unique, though: its mission. The game’s developers believe it will change how the U.S. Army fights wars. Overmatch’s players are nearly all soldiers in real life. As they develop tactics around futuristic weapons and use them in digital battle against peers, the game monitors their actions.

Each shot fired and decision made, in addition to messages the players write in private forums, is a bit of information soaked up with a frequency not found in actual combat, or even in high-powered simulations without a wide network of players. The data is logged, sorted, and then analyzed, using insights from sports and commercial video games. Overmatch’s team hopes this data will inform the Army’s decisions about which technologies to purchase and how to develop tactics using them, all with the aim of building a more forward-thinking, prepared force… While the game currently has about 1,000 players recruited by word of mouth and outreach from the Overmatch team, the developers eventually want to involve tens of thousands of soldiers. This milestone would allow for millions of hours of game play per year, according to project estimates, enough to generate rigorous data sets and test hypotheses.”

Brian Vogt, a lieutenant colonel in the Army Capabilities Integration Center who oversees Overmatch’s development, says:

“Right after World War I, we had technologies like aircraft carriers we knew were going to play an important role,” he said. “We just didn’t know how to use them. That’s where we are and what we’re trying to do for robots.”

https://www.youtube.com/watch?v=N9IkDuls3oU
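The “logged, sorted, and then analyzed” pipeline the article describes can be sketched in miniature. This is a hypothetical illustration of per-player telemetry aggregation, with invented field names, not the Overmatch team’s actual schema:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    """One logged player action from a match (fields are invented)."""
    player: str
    action: str      # e.g. "shot_fired", "vehicle_moved"
    match_id: int

# A tiny event log of the kind a game client might emit during play.
log = [
    Event("soldier_a", "shot_fired", 1),
    Event("soldier_a", "shot_fired", 1),
    Event("soldier_b", "vehicle_moved", 1),
]

# Aggregate actions per player: the raw material for the kind of
# tactics analysis the article describes.
per_player = Counter((e.player, e.action) for e in log)
# per_player[("soldier_a", "shot_fired")] == 2
```

At the scale the developers hope for (tens of thousands of players, millions of hours), aggregates like this are what would let analysts compare tactics across the player base.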

Woman is attacked on the street, bystanders stop to take selfies

“Shocking surveillance video shows the moment a Pittsburgh woman was knocked out cold by a man on a busy sidewalk — but that’s not the worst of it. The footage also shows the woman being beaten and robbed by bystanders — who proceed to take pictures of her, including selfies — as she lies unconscious on the ground.

A group of men can then be seen walking over to her — cellphones in hand, snapping pictures and video — as she lies unconscious on the sidewalk. Shortly after leaving, the men reportedly returned and began taking even more photos.”

The rise of big data policing

An excerpt from the book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (2017):

“Data-driven policing means aggressive police presence, surveillance, and perceived harassment in those communities. Each data point translates to real human experience, and many times those experiences remain fraught with all-too-human bias, fear, distrust, and racial tension. For those communities, especially poor communities of color, these data-collection efforts cast a dark shadow on the future.”

Saudi Arabia becomes first country to grant citizenship to a robot

“Saudi Arabia has officially recognised a humanoid robot as a citizen, marking the first time in history that an AI device has been awarded such status.

Specific details of Sophia’s citizenship were not discussed. It is unclear whether she will receive the same rights as human citizens, or if Saudi Arabia will develop a specific system devoted to robots.

The system could work in a similar way to the “personhood” status proposed by European Parliament earlier this year, which would see robots with AI given rights and responsibilities.”

Ex-Google engineer establishes new religion with ambition to develop an AI god

“One of the engineers behind Google’s self-driving car has established a nonprofit religious corporation with one main aim – to create a deity with artificial intelligence. According to newly uncovered documents filed to the state of California in September 2015, Anthony Levandowski serves as the CEO and president of religious organisation Way of the Future.”

Way of the Future’s startling mission: “To develop and promote the realization of a Godhead based on artificial intelligence and through understanding and worship of the Godhead contribute to the betterment of society.”

How Silicon Valley divided society and made everyone raging mad

“Silicon Valley’s utopians genuinely but mistakenly believe that more information and connection makes us more analytical and informed. But when faced with quinzigabytes of data, the human tendency is to simplify things. Information overload forces us to rely on simple algorithms to make sense of the overwhelming noise. This is why, just like the advertising industry that increasingly drives it, the internet is fundamentally an emotional medium that plays to our base instinct to reduce problems and take sides, whether like or don’t like, my guy/not my guy, or simply good versus evil. It is no longer enough to disagree with someone, they must also be evil or stupid…

Nothing holds a tribe together like a dangerous enemy. That is the essence of identity politics gone bad: a universe of unbridgeable opinion between opposing tribes, whose differences are always highlighted, exaggerated, retweeted and shared. In the end, this leads us to ever more distinct and fragmented identities, all of us armed with solid data, righteous anger, a gutful of anger and a digital network of likeminded people. This is not total connectivity; it is total division.”

The “Surprisingly” Large Energy Footprint of the Digital Economy

“Our computers and smartphones might seem “clean,” but the digital economy uses a tenth of the world’s electricity—and that share will only increase, with serious consequences for the economy and the environment.

The global Information-Communications-Technologies (ICT) system now uses approximately 1,500 terawatt-hours of power per year. That’s about 10% of the world’s total electricity generation or roughly the combined power production of Germany and Japan. It’s the same amount of electricity that was used to light the entire planet in 1985. We already use 50% more energy to move bytes than we do to move planes in global aviation.

Reduced to personal terms, although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year. And as the world continues to electrify, migrating towards one refrigerator per household, it also evolves towards several smartphones and equivalent per person.”
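The headline figures above are easy to sanity-check against each other. If ICT draws roughly 1,500 terawatt-hours per year and that is about 10% of world generation, total generation must be on the order of 15,000 TWh per year, which is consistent with mid-2010s estimates. A quick check, assuming the article’s own numbers:

```python
ict_twh = 1500          # ICT electricity use per year, from the article
share = 0.10            # claimed share of world electricity generation

world_twh = ict_twh / share
# Implied world generation: 15,000 TWh/year, so the two figures cohere.
```

The same arithmetic shows why the share can only grow: ICT demand rises with every connected device, while total generation grows far more slowly.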

“Does reading an e-book, or watching a streaming video, use more energy than reading it on paper, or buying a DVD? Does playing a video game use more energy than playing Monopoly? Does a doctor using an iPad for diagnostic advice from artificial intelligence in the Cloud use more energy than, what? Traveling for a second opinion?  The answer involves more than knowing how much electricity one iPad, PC or smartphone uses. It requires accounting for all the electricity used in the entire ICT ecosystem needed to make any of that possible, and the energy characteristics of the ICT ecosystem are quite unlike anything else built to date. Turning on a light does not require dozens of lights to turn on elsewhere. However, turn on an iPad to watch a video and iPad-like devices all over the country, even all over the world, simultaneously light up throughout a vast network. Nothing else in society operates that way. Starting a car doesn’t cause dozens of cars elsewhere to fire up.”

In a cashless world, you better pray the power never goes out…

When Hurricane Maria knocked out power in Puerto Rico, residents there realised they were going to “need physical cash—and a lot of it.” Bloomberg reports that the Fed was forced to fly a planeload of cash to the island to help avert disaster.

“William Dudley, the New York Fed president, put the word out within minutes, and ultimately a jet loaded with an undisclosed amount of cash landed on the stricken island. [Business executives in Puerto Rico] described corporate clients’ urgent requests for hundreds of thousands in cash to meet payrolls, and the challenge of finding enough armoured cars to satisfy endless demand at ATMs… As early as the day after the storm, the Fed began working to get money onto the island.”

For a time, unless one had a hoard of cash stored up in one’s home, it was impossible to get cash at all. Eighty-five percent of Puerto Rico is still without power…

“When some generator-powered ATMs finally opened, lines stretched hours long, with people camping out in beach chairs and holding umbrellas against the sun.”

In an earlier article from September 25, Bloomberg noted how, “without cash, necessities were simply unavailable.”

PrivacyTools

privacytools.io provides knowledge and tools to protect your privacy against global mass surveillance.