Resources

Car Manufacturers Are Tracking Millions of Cars

Millions of new cars sold in the US and Europe are “connected,” with some mechanism for exchanging data with their manufacturers after the cars are sold. These cars stream or batch-upload location data and other telemetry to their manufacturers, who argue that they are allowed to do virtually anything they want with this data, thanks to the “explicit consent” of the car owners, who signed a lengthy contract at purchase time that contained a vague and misleading clause buried deep in its fine print.

Slashdot reader Luthair adds that “OnStar infamously has done this for some time, even if the vehicle’s owner was not a subscriber of their services.” But now 78 million cars have an embedded cyber connection, according to one report, with analysts predicting 98% of new cars will be “connected” by 2021. The Washington Post calls it “Big Brother on Wheels.”

“Carmakers have turned on a powerful spigot of precious personal data, often without owners’ knowledge, transforming the automobile from a machine that helps us travel to a sophisticated computer on wheels that offers even more access to our personal habits and behaviors than smartphones do.”

Facebook should be ‘regulated like cigarette industry’, says tech CEO

Facebook should be regulated like a cigarette company, because of the addictive and harmful properties of social media, according to Salesforce chief executive Marc Benioff.

Last week, venture capitalist Roger McNamee – an early investor in Facebook – wrote a Guardian column warning that the company would have to “address the harm the platform has caused through addiction and exploitation by bad actors”.

“I was once Mark Zuckerberg’s mentor, but I have not been able to speak to him about this. Unfortunately, all the internet platforms are deflecting criticism and leaving their users in peril,” McNamee wrote.

Earlier, Sean Parker, Facebook’s first President, had described the business practice of social media firms as “a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology”. Parker now describes himself as “something of a conscientious objector” to social media.

As part of its attempt to win back control of the narrative, Facebook has announced it will begin taking into account how trusted a publisher is as part of its News Feed algorithm. The company’s metric for determining trust, however, is a simple two-question survey, leading some to question how effective it can be.

Study links decline in teenagers’ happiness to smartphones

A precipitous drop in the happiness, self-esteem and life satisfaction of American teenagers came as their ownership of smartphones rocketed from zero to 73 percent and they devoted an increasing share of their time online. Coincidence? New research suggests it is not. In a study published Monday in the journal Emotion, psychologists from San Diego State University and the University of Georgia used data on mood and media culled from roughly 1.1 million U.S. teens to figure out why a decades-long rise in happiness and satisfaction among U.S. teenagers suddenly shifted course in 2012 and declined sharply over the next four years.

In the new study, researchers tried to pinpoint the cause by plumbing a trove of eighth-, 10th- and 12th-graders’ responses to queries on how they felt about life and how they used their time. They found that between 1991 and 2016, adolescents who spent more time on electronic communication and screens — social media, texting, electronic games, the internet — were less happy, less satisfied with their lives and had lower self-esteem. TV watching, which declined over the nearly two decades they examined, was similarly linked to lower psychological well-being.

By contrast, adolescents who spent more time on non-screen activities had higher psychological well-being. They tended to profess greater happiness, higher self-esteem and more satisfaction with their lives. While these patterns emerged in the group as a whole, they were particularly clear among eighth- and 10th-graders, the authors found: “Every non-screen activity was correlated with greater happiness, and every screen activity was correlated with less happiness.”

An AI-Powered App Has Resulted in an Explosion of Convincing Face-Swap Porn

In December, Motherboard discovered a Redditor named ‘deepfakes’ quietly enjoying his hobby: Face-swapping celebrity faces onto porn performers’ bodies. He made several convincing porn videos of celebrities — including Gal Gadot, Maisie Williams, and Taylor Swift — using a machine learning algorithm, his home computer, publicly available videos, and some spare time. Since we first wrote about deepfakes, the practice of producing AI-assisted fake porn has exploded. More people are creating fake celebrity porn using machine learning, and the results have become increasingly convincing. A redditor even created an app specifically designed to allow users without a computer science background to create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available, and accompanied with instructions that walk novices through the process.

An incredibly easy-to-use application for DIY fake videos—of sex and revenge porn, but also political speeches and whatever else you want—that moves and improves at this pace could have society-changing impacts on the ways we consume media. The combination of powerful, open-source neural network research, our rapidly eroding ability to discern truth from fake news, and the way we spread news through social media has set us up for serious consequences.

You spend nearly a whole day each week on the internet

Since 2000, our time spent online each week has steadily increased, rising from 9.4 hours to 23.6 hours — nearly an entire day, according to a recent report by the USC Annenberg Center for the Digital Future. The internet has become an integral component of our home lives as well, with time spent rising more than 400 percent over that period from 3.3 hours to 17.6 hours each week, according to the report, which surveys more than 2,000 people across the U.S. each year. The center’s 15th annual Digital Future Report illustrates the internet’s dramatic evolution since 2000 from a secondary medium to an indispensable component of our daily lives — always on and always with us. It also comes as many fear for the future of the unlimited internet we have largely taken for granted over the past two decades. The report also found that the internet has had a dramatic impact on how we get our news. News consumption for all ages went from a print-to-online ratio of 85-15 in 2001 to a near even 51-49 in 2016.

Smartphone addiction could be changing your brain

You may be one of the growing number of Americans (or global citizens) who has a bit of nomophobia.

“Nomophobia?” you mutter as you read this on your ever-present smartphone. “Of course not.”

“NO MObile PHOne phoBIA” is a 21st-century term for the fear of not being able to use your cell phone or other smart device. Cell phone addiction is on the rise, surveys show, and a new study released Thursday adds to a growing body of evidence that smartphone and internet addiction is harming our minds – literally.

SecurEnvoy, a two-factor authentication company, conducted research using a polling panel (which is not as scientific as a randomized poll) and found that 66% of people in the United Kingdom have some form of nomophobia. Notably, 41% of the participants said they had two or more phones to make sure they stayed connected.

Surveys by the Pew Research Center this year showed that 77% of Americans own smartphones, up from 35% in 2011. Ninety-five percent own a cell phone of some kind.

Obviously, there are some serious ramifications to having a cell phone habit. According to the US Centers for Disease Control and Prevention, mobile phone use is partially to blame for the distracted driving that kills an estimated nine people each day and injures more than 1,000.

The prevalence of texting while driving has reached epidemic proportions. A 2010 study by the Pew Research Center said nearly half of US adults admit reading or sending a text message while driving. The news is worse for teens: Nearly one in three 16- or 17-year-olds said they have texted while driving.

Millennials are the worst offenders, according to Pew. Fifty-nine percent of people between the ages of 18 and 33 reported texting while driving, compared with 50% of Gen Xers (age 34 to 45) and only 29% of baby boomers.

It’s not just driving. A study of pedestrians in midtown Manhattan found that 42% of those who entered traffic during a “Don’t Walk” signal were talking on a cell phone, wearing headphones or looking down at an electronic device. A 2013 study found a tenfold increase in injuries related to pedestrians using cell phones from 2005 to 2010.

Other health ramifications include text neck – that cramping, stabbing pain that comes after looking down at your phone too long – and poor posture, which can affect your spine, respiratory functions and even emotions. Researchers have also found that the blue light emitted from our cell phones and other internet devices can disrupt melatonin production and therefore our sleep.
A connection to executive functioning

The latest evidence comes from a small study presented Thursday at the annual meeting of the Radiological Society of North America in Chicago. The study, which has not been peer-reviewed, indicates that cell phone addiction may affect brain functioning.

Researchers from Korea University in Seoul used brain imaging to study the brains of 19 teenage boys who were diagnosed with internet or smartphone addiction. Compared with 19 teenagers who were not addicted, the brains of the addicted boys had a significantly higher ratio of GABA, a neurotransmitter in the cortex that inhibits neurons, to glutamate-glutamine, a neurotransmitter that energizes brain signals.

“GABA slows down the neurons,” explained Yildirim, who was not involved in the Korean study. “That results in poorer attention and control, which you don’t want to have, because you want to stay focused. So that means you are more vulnerable to distractions.”

“It’s a very small study, so you have to take it with a grain of salt,” said Stanford neuroradiologist Dr. Max Wintermark, an expert in neuroimaging who was also not connected with the research. “It’s the first study that I read about internet addiction, but there are many studies that link alcohol, drug and other types of addiction to imbalances in various neurotransmitters in the brain.”

Yildirim agreed that the preliminary findings were consistent with prior research.

“We know that medium to heavy multitaskers, who engage in multiple forms of media simultaneously, tend to demonstrate smaller gray matter area in the anterior cingulate cortex, which is the area of the brain responsible for top-down attention control,” he said. “Altogether, this means that if you are too dependent on your smartphone, you are basically damaging your ability to be attentive.”

Addicted teenagers in the study also had significantly higher scores in anxiety, depression and levels of insomnia and impulsivity, said Dr. Hyung Suk Seo, professor of neuroradiology at Korea University, who led the study.

The good news is that when 12 of the addicted teens were given nine weeks of cognitive behavioral therapy, the ratio of GABA to glutamate-glutamine normalized.

“This is a common finding in the literature,” Yildirim said. “There are studies that have looked at how cognitive behavioral therapy can improve attention control and executive functioning.”

One study of mindfulness training showed increased cognitive performance, and another showed neuroplastic changes in the anterior cingulate cortex, the same area of the brain damaged by smartphone addiction.

“To me, the most interesting aspect of the study is that they were able to see a correction of the imbalance after cognitive behavior therapy intervention,” Wintermark said. “What I would like to see is more research on whether the symptoms of addiction are also corrected.”

Fighting back against smartphone addiction

If you, or a loved one, seem to have the symptoms of smart-device or internet addiction, experts have some suggestions in addition to mindfulness training. First, turn off your phone at certain times of the day, such as in meetings, at dinner, while playing with your kids and, of course, while driving. Remove social media apps such as Facebook and Twitter from your phone, and check in only from your laptop. Try to wean yourself down to 15-minute intervals at set times of the day when it won’t affect work or family life. Don’t bring your cell phone and its harmful blue light to bed; use an old-fashioned alarm clock to wake you. And last, try to replace your smart-device time with healthier activities such as meditating or actually interacting with real people.

Apple says it looks out for kids, as investors cite phone ‘addiction’

Apple Inc said it “has always looked out for kids”, defending its technology policy for children, after two major investors urged it to address what they said was a growing problem of young people getting addicted to Apple’s iPhones.

Shareholders Jana Partners, a leading activist shareholder, and California teacher pension investor CalSTRS, one of the nation’s largest public pension plans, delivered a letter to Apple on Saturday asking the company to consider developing software that would allow parents more options to limit children’s phone use.

The issue of phone addiction among young people has become a growing concern in the United States as parents report their children cannot give up their phones. CalSTRS and Jana worry that “even” Apple’s reputation could be hurt if it does not address those concerns. Their letter was originally reported by the Wall Street Journal.

Pentagon Seeks Laser-Powered Bat Drones

On Wednesday, the Defense Enterprise Science Initiative, or DESI, announced a competition for basic science grants to build “new paradigms for autonomous flight, with a focus on highly-maneuverable platforms and algorithms for flight control and decision making.”

Biomimetic, or nature-imitating, designs for crawling, slinking and even swimming robots go back decades.

But getting flying machines to mimic nature is a good deal more difficult and more complicated than teaching robots to swim and crawl, which is why even the military’s smallest drones have followed conventional aerodynamic designs.

That Game on Your Phone May Be Tracking What You’re Watching on TV

At first glance, the gaming apps — with names like “Pool 3D,” “Beer Pong: Trickshot” and “Real Bowling Strike 10 Pin” — seem innocuous. One called “Honey Quest” features Jumbo, an animated bear.

Yet these apps, once downloaded onto a smartphone, have the ability to keep tabs on the viewing habits of their users — some of whom may be children — even when the games aren’t being played.

It is yet another example of how companies, using devices that many people feel they can’t do without, are documenting how audiences in a rapidly changing entertainment landscape are viewing television and commercials.

The apps use software from Alphonso, a start-up that collects TV-viewing data for advertisers. Using a smartphone’s microphone, Alphonso’s software can detail what people watch by identifying audio signals in TV ads and shows, sometimes even matching that information with the places people visit and the movies they see. The information can then be used to target ads more precisely and to try to analyze things like which ads prompted a person to go to a car dealership.

More than 250 games that use Alphonso software are available in the Google Play store; some are also available in Apple’s app store.

Some of the tracking is taking place through gaming apps that do not otherwise involve a smartphone’s microphone, including some apps that are geared toward children. The software can also detect sounds even when a phone is in a pocket if the apps are running in the background.

12 Days In Xinjiang — China’s Surveillance State

Urumqi, China – This city on China’s Central Asia frontier may be one of the most closely surveilled places on earth.

Security checkpoints with identification scanners guard the train station and roads in and out of town. Facial scanners track comings and goings at hotels, shopping malls and banks. Police use hand-held devices to search smartphones for encrypted chat apps, politically charged videos and other suspect content. To fill up with gas, drivers must first swipe their ID cards and stare into a camera.

China’s efforts to snuff out a violent separatist movement by some members of the predominantly Muslim Uighur ethnic group have turned the autonomous region of Xinjiang, of which Urumqi is the capital, into a laboratory for high-tech social controls that civil-liberties activists say the government wants to roll out across the country.

It is nearly impossible to move about the region without feeling the unrelenting gaze of the government. Citizens and visitors alike must run a daily gantlet of police checkpoints, surveillance cameras and machines scanning their ID cards, faces, eyeballs and sometimes entire bodies.

When fruit vendor Parhat Imin swiped his card at a telecommunications office this summer to pay an overdue phone bill, his photo popped up with an “X.” Since then, he says, every scan of his ID card sets off an alarm. He isn’t sure what it signifies, but figures he is on some kind of government watch list because he is a Uighur and has had intermittent run-ins with the police.

He says he is reluctant to travel for fear of being detained. “They blacklisted me,” he says. “I can’t go anywhere.”

All across China, authorities are rolling out new technology to keep watch over people and shape their behavior. Controls on expression have tightened under President Xi Jinping, and the state’s vast security web now includes high-tech equipment to monitor online activity and even snoop in smartphone messaging apps.

China’s government has been on high alert since a surge in deadly terrorist attacks around the country in 2014 that authorities blamed on Xinjiang-based militants inspired by extremist Islamic messages from abroad. Now officials are putting the world’s most state-of-the-art tools in the hands of a ramped-up security force to create a system of social control in Xinjiang—one that falls heaviest on Uighurs.

At a security exposition in October, an executive of Guangzhou-based CloudWalk Technology Co., which has sold facial-recognition algorithms to police and identity-verification systems to gas stations in Xinjiang, called the region the world’s most heavily guarded place. According to the executive, Jiang Jun, for every 100,000 people the police in Xinjiang want to monitor, they use the same amount of surveillance equipment that police in other parts of China would use to monitor millions.

Authorities in Xinjiang declined to respond to questions about surveillance. Top party officials from Xinjiang said at a Communist Party gathering in Beijing in October that “social stability and long-term security” were the local government’s bottom-line goals.

Chinese and foreign civil-liberty activists say the surveillance in this northwestern corner of China offers a preview of what is to come nationwide.

“They constantly take lessons from the high-pressure rule they apply in Xinjiang and implement them in the east,” says Zhu Shengwu, a Chinese human-rights lawyer who has worked on surveillance cases. “What happens in Xinjiang has bearing on the fate of all Chinese people.”

During an October road trip into Xinjiang along a modern highway, two Wall Street Journal reporters encountered a succession of checkpoints that turned the ride into a strange and tense journey.

At Xingxing Gorge, a windswept pass used centuries ago by merchants plying the Silk Road, police inspected incoming traffic and verified travelers’ identities. The Journal reporters were stopped, ordered out of their car and asked to explain the purpose of their visit. Drivers, mostly those who weren’t Han Chinese, were guided through electronic gateways that scanned their ID cards and faces.

Farther along, at the entrance to Hami, a city of a half-million, police had the Journal reporters wait in front of a bank of TV screens showing feeds from nearby surveillance cameras while recording their passport numbers.

Surveillance cameras loomed every few hundred feet along the road into town, blanketed street corners and kept watch on patrons of a small noodle shop near the main mosque. The proprietress, a member of the Muslim Hui minority, said the government ordered all restaurants in the area to install the devices earlier this year “to prevent terrorist attacks.”

Days later, as the Journal reporters were driving on a dirt road in Shanshan county after being ordered by officials to leave a nearby town, a police cruiser materialized seemingly from nowhere. It raced past, then skidded to a diagonal stop, kicking up a cloud of dust and blocking the reporters’ car. An SUV pulled up behind. A half-dozen police ordered the reporters out of the car and demanded their passports.

An officer explained that surveillance cameras had read the out-of-town license plates and sent out an alert. “We check every car that’s not from Xinjiang,” he said. The police then escorted the reporters to the highway.

At checkpoints further west, iris and body scanners are added to the security arsenal.

Darren Byler, an anthropology researcher at the University of Washington who spent two years in Xinjiang studying migration, says the closest contemporary parallel can be found in the West Bank and Gaza Strip, where the Israeli government has created a system of checkpoints and biometric surveillance to keep tabs on Palestinians.

In Erdaoqiao, the neighborhood where the fruit vendor Mr. Imin lives, small booths known as “convenience police stations,” marked by flashing lights atop a pole, appear every couple of hundred yards. The police stationed there offer water, cellphone charging and other services, while also taking in feeds from nearby surveillance cameras.

Young Uighur men are routinely pulled into the stations for phone checks, leading some to keep two devices—one for home use and another, with no sensitive content or apps, for going out, according to Uighur exiles.

Erdaoqiao, the heart of Uighur culture and commerce in Urumqi, is where ethnic riots started in 2009 that resulted in numerous deaths. The front entrance to Erdaoqiao Mosque is now closed, as are most entries to the International Grand Bazaar. Visitors funnel through a heavily guarded main gate. The faces and ID cards of Xinjiang residents are scanned. An array of cameras keeps watch.

After the riots, authorities showed up to shut down the shop Mr. Imin was running at the time, which sold clothing and religious items. When he protested, he says, they clubbed him on the back of the head, which has left him walking with a limp. They jailed him for six months for obstructing official business, he says. Other jail stints followed, including eight months for buying hashish.

The police in Urumqi didn’t respond to requests for comment.

Mr. Imin now sells fruit and freshly squeezed pomegranate juice from a cart. He worries that his flagged ID card will bring the police again. Recently remarried, he hasn’t dared visit his new wife’s family in southern Xinjiang.

Chinese rulers have struggled for two millennia to control Xinjiang, whose 23 million people are scattered over an expanse twice the size of Texas. Beijing sees it as a vital piece of President Xi’s trillion-dollar “Belt and Road” initiative to build infrastructure along the old Silk Road trade routes to Europe.

Last year, Mr. Xi installed a new Xinjiang party chief, Chen Quanguo, who previously handled ethnic strife in Tibet, another hot spot. Mr. Chen pioneered the convenience police stations in that region, partly in response to a string of self-immolations by monks protesting Chinese rule.

Under Mr. Chen, the police presence in Xinjiang has skyrocketed, based on data showing exponential increases in police-recruitment advertising. Local police departments last year began ordering cameras capable of creating three-dimensional face images as well as DNA sequencers and voice-pattern analysis systems, according to government procurement documents uncovered by Human Rights Watch and reviewed by the Journal.

During the first quarter of 2017, the government announced the equivalent of more than $1 billion in security-related investment projects in Xinjiang, up from $27 million in all of 2015, according to research in April by Chinese brokerage firm Industrial Securities.

Government procurement orders show millions spent on “unified combat platforms”—computer systems to analyze surveillance data from police and other government agencies.

Tahir Hamut, a Uighur poet and filmmaker, says Uighurs who had passports were called in to local police stations in May. He worried he would draw extra scrutiny for having been accused of carrying sensitive documents, including newspaper articles about Uighur separatist attacks, while trying to travel to Turkey to study in the mid-1990s. The aborted trip landed him in a labor camp for three years, he says.

He and his wife lined up at a police station with other Uighurs to have their fingerprints and blood samples taken. He says he was asked to read a newspaper for two minutes while police recorded his voice, and to turn his head slowly in front of a camera.

Later, his family’s passports were confiscated. After a friend was detained by police, he says, he assumed he also would be taken away. He says he paid officials a bribe of more than $9,000 to get the passports back, making up a story that his daughter had epilepsy requiring treatment in the U.S. Xinjiang’s Public Security Bureau, which is in charge of the region’s police forces, didn’t respond to a request for comment about the bribery.

“The day we left, I was filled with anxiety,” he says. “I worried what would happen if we were stopped going through security at the Urumqi airport, or going through border control in Beijing.”

He and his family made it to Virginia, where they have applied for political asylum.

Chinese authorities use forms to collect personal information from Uighurs. One form reviewed by the Journal asks about respondents’ prayer habits and if they have contacts abroad. There are sections for officials to rate “persons of interest” on a six-point scale and check boxes on whether they are “safe,” “average” or “unsafe.”

China Communications Services Co. Ltd., a subsidiary of state telecom giant China Telecom, has signed contracts this year worth more than $38 million to provide mosque surveillance and install surveillance-data platforms in Xinjiang, according to government procurement documents. The company declined to discuss the contracts, saying they constituted sensitive business information.

Xiamen Meiya Pico Information Co. Ltd. worked with police in Urumqi to adapt a hand-held device it sells for investigating economic crimes so it can scan smartphones for terrorism-related content.

A description of the device that recently was removed from the company’s website said it can read the files on 90% of smartphones and check findings against a police antiterror database. “Mostly, you’re looking for audio and video,” said Zhang Xuefeng, Meiya Pico’s chief marketing officer, in an interview.

Near the Xinjiang University campus in Urumqi, police sat at a wooden table recently, ordering some people walking by to hand over their phones.

“You just plug it in and it shows you what’s on the phone,” said one officer, brandishing a device similar to the one on Meiya Pico’s website. He declined to say what content they were checking for.

One recent afternoon in Korla, one of Xinjiang’s largest cities, only a trickle of people passed through the security checkpoint at the local bazaar, where vendors stared at darkened hallways empty of shoppers.

Li Qiang, the Han Chinese owner of a wine shop, said the security checks, while necessary for safety, were getting in the way of commerce. “As soon as you go out, they check your ID,” he said.

Authorities have built a network of detention facilities, officially referred to as education centers, across Xinjiang. In April, the official Xinjiang Daily newspaper said more than 2,000 people had been sent to a “study and training center” in the southern city of Hotan.

One new compound sits a half-hour drive south of Kashgar, a Uighur-dominated city near the border with Kyrgyzstan. It is surrounded by imposing walls topped with razor wire, with watchtowers at two corners. A slogan painted on the wall reads: “All ethnic groups should be like the pods of a pomegranate, tightly wrapped together.”

Villagers describe it as a detention center. A man standing near the entrance one recent night said it was a school and advised reporters to leave.

Mr. Hamut, the poet, says a relative in Kashgar was taken to a detention center after she participated in an Islamic ceremony, and another went missing soon after the family tried to call him from the U.S.

The local government in Kashgar didn’t respond to a request for comment.

Surveillance in and around Kashgar, where Han Chinese make up less than 7% of the population, is even tighter than in Urumqi. Drivers entering the city are screened intensively. A machine scans each driver’s face. Police officers inspect the engine and the trunk. Passengers must get out and run their bags through X-ray machines.

In Aksu, a dusty city a five-hour drive east of Kashgar, knife salesman Jiang Qiankun says his shop had to pay thousands of dollars for a machine that turns a customer’s ID card number, photo, ethnicity and address into a QR code that it lasers into the blade of any knife it sells. “If someone has a knife, it has to have their ID card information,” he says.

On the last day the Journal reporters were in Xinjiang, an unmarked car trailed them on a 5 a.m. drive to the Urumqi airport. During their China Southern Airlines flight to Beijing, a flight attendant appeared to train a police-style body camera attached to his belt on the reporters. Later, as passengers were disembarking, the attendant denied filming them, saying it was common for airline crew to wear the cameras as a security measure.

China Southern says the crew member was an air marshal, charged with safety on board.

How Facebook’s Political Unit Enables the Dark Art of Digital Propaganda

Under fire for Facebook Inc.’s role as a platform for political propaganda, co-founder Mark Zuckerberg has punched back, saying his mission is above partisanship. “We hope to give all people a voice and create a platform for all ideas,” Zuckerberg wrote in September after President Donald Trump accused Facebook of bias. Zuckerberg’s social network is a politically agnostic tool for its more than 2 billion users, he has said. But Facebook, it turns out, is no bystander in global politics. What he hasn’t said is that his company actively works with political parties and leaders including those who use the platform to stifle opposition — sometimes with the aid of “troll armies” that spread misinformation and extremist ideologies.

The initiative is run by a little-known Facebook global government and politics team that’s neutral in that it works with nearly anyone seeking or securing power. The unit is led from Washington by Katie Harbath, a former Republican digital strategist who worked on former New York Mayor Rudy Giuliani’s 2008 presidential campaign. Since Facebook hired Harbath three years later, her team has traveled the globe helping political clients use the company’s powerful digital tools. In some of the world’s biggest democracies — from India and Brazil to Germany and the U.K. — the unit’s employees have become de facto campaign workers. And once a candidate is elected, the company in some instances goes on to train government employees or provide technical assistance for live streams at official state events.

954

Almost 45 million tons of e-waste discarded last year

A new study claims 44.7 million metric tons (49.3 million tons) of TV sets, refrigerators, cellphones and other electrical goods were discarded last year, with only a fifth recycled to recover the valuable raw materials inside.

The U.N.-backed study published Wednesday calculates that the amount of e-waste thrown away in 2016 included a million tons of chargers alone.

The U.S. accounted for 6.3 million metric tons, partly due to the fact that the American market for heavy goods is saturated.

The study says all the gold, silver, copper and other valuable materials would have been worth $55 billion had they been recovered.

The authors of the Global E-waste Monitor predict that e-waste, defined as anything with a battery or a cord, will increase to 52.2 million metric tons by 2021.
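The report's two figures imply a steady year-on-year increase. As a quick back-of-the-envelope check (assuming smooth compound growth between the two data points, which the study itself doesn't claim):

```python
# Implied compound annual growth rate between the study's 2016 figure
# (44.7 million metric tons) and its 2021 prediction (52.2 Mt).
start_mt, end_mt, years = 44.7, 52.2, 5

rate = (end_mt / start_mt) ** (1 / years) - 1
print(f"Implied e-waste growth: {rate:.1%} per year")
```

That works out to roughly 3% a year, or about 1.5 million extra metric tons of e-waste annually.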

802

Commercial Spyware is “Out of Control”

Throughout 2016 and 2017, individuals in Canada, the United States, Germany, Norway, the United Kingdom, and numerous other countries began to receive suspicious emails. It wasn’t just common spam. These people were chosen.

The emails were specifically designed to entice each individual to click a malicious link. Had the targets done so, their internet connections would have been hijacked and surreptitiously directed to servers laden with malware designed by a surveillance company in Israel. The spies who contracted the Israeli company’s services would have been able to monitor everything those targets did on their devices, including remotely activating the camera and microphone.

Who was behind this global cyber espionage campaign? Was it the National Security Agency? Or one of its “five eyes” partners, like the GCHQ or Canada’s CSE? Given that it was done using Israeli-made technology, perhaps it was Israel’s elite signals intelligence agency, Unit 8200?

In fact, it was none of them. Behind this sophisticated international spying operation was one of the poorest countries in the world; a country where less than 5 percent of the population has access to the internet; a country run by an autocratic government routinely flagged for human rights abuses and corruption. Behind this operation was… Ethiopia.

The details of this remarkable clandestine activity are outlined in a new Citizen Lab report published today entitled “Champing at the Cyberbit.” In our report my co-authors and I detail how we monitored the command and control servers used in the campaign and in doing so discovered a public log file that the operators mistakenly left open. That log file provided us with a window, for roughly a year, into the attackers’ activities, infrastructure, and operations. Strong circumstantial evidence points to one or more government agencies in Ethiopia as the responsible party.

We were also able to identify the IP addresses of those who were targeted and successfully infected: a group that includes journalists, a lawyer, activists, and academics. Our access also allowed us to enumerate the countries in which the targets were located. Many of the countries in which the targets live—the United States, Canada, and Germany, among others—have strict wiretapping laws that make it illegal to eavesdrop without a warrant. It seems individuals in Ethiopia broke those laws.

If a government wants to collect evidence on a person in another country, it is customary for it to make a formal legal request to other governments through a process like the Mutual Legal Assistance Treaties. Ethiopia appears to have sidestepped all of that. International norms would suggest a formal démarche to Ethiopia from the governments whose citizens it monitored without permission, but that may happen quietly if at all.

Our team reverse-engineered the malware used in this instance, and over time this allowed us to positively identify the company whose spyware was being employed by Ethiopia: Cyberbit Solutions, a subsidiary of the Israel-based homeland security company Elbit Systems. Notably, Cyberbit is the fourth company we have identified, alongside Hacking Team, Finfisher, and NSO Group, whose products and services have been abused by autocratic regimes to target dissidents, journalists, and others. Along with NSO Group, it’s the second Israel-based company whose technology has been used in this way.

Israel does regulate the export of commercial spyware abroad, although apparently not very well from a human-rights perspective. Cyberbit was able to sell its services to Ethiopia—a country with not only a well-documented history of governance and human rights problems, but also a track record of abusing spyware. When considered alongside the extensive reporting we have done about UAE and Mexican government misuse of NSO Group’s services, it’s safe to conclude Israel has a commercial spyware control problem.

How big of a problem? Remarkably, by analyzing the command and control servers of the cyber espionage campaign, we were also able to monitor Cyberbit employees as they traveled the world with infected laptops that checked in to those servers, apparently demonstrating Cyberbit’s products to prospective clients. Those clients include the Royal Thai Army, Uzbekistan’s National Security Service, Zambia’s Financial Intelligence Centre, and the Philippine president’s Malacañang Palace. Outlining the human rights abuses associated with those government entities would fill volumes.

Cyberbit, for its part, has responded to Citizen Lab’s findings: “Cyberbit Solutions offers its products only to sovereign governmental authorities and law enforcement agencies,” the company wrote me on November 29. “Such governmental authorities and law enforcement agencies are responsible to ensure that they are legally authorized to use the products in their jurisdictions.” The company declined to confirm or deny that the government of Ethiopia is a client, but did note that “Cyberbit Solutions can confirm that any transaction made by it was approved by the competent authorities.”

Governments like Ethiopia no longer depend on their own in-country advanced computer science, engineering, and mathematical capacity in order to build a globe-spanning cyber espionage operation. They can simply buy it off the shelf from a company like Cyberbit. Thanks to companies like these, an autocrat whose country has poor national infrastructure but whose regime has billions of dollars can order up their own NSA. To wit: Elbit Systems, the parent company of Cyberbit, says it has a backlog of orders valued at $7 billion. An investment firm recently sought to acquire a partial stake in NSO Group for a reported $400 million before eventually withdrawing its offer.

Of course, these companies insist that the spyware they sell to governments is used exclusively to fight terrorists and investigate crime. That sounds reasonable, and no doubt many clients do just that. But when journalists, academics, or NGOs seek to expose corrupt dictators or hold them accountable, those truth tellers may be labelled criminals or terrorists. And our research has shown that this makes those individuals and groups vulnerable to this type of state surveillance, even if they live abroad.

Indeed, we discovered that the second-largest concentration of successful infections of this Ethiopian operation is located in Canada. Among the targets whose identities we were able to verify and name in the report, what unites them all is their peaceful political opposition to the Ethiopian government. Except one. Astoundingly, Citizen Lab researcher Bill Marczak, who led our technical investigation, was himself targeted at one point by the espionage operators.

Countries sliding into authoritarianism and corruption. A booming and largely unregulated market for sophisticated surveillance. Civilians not equipped to defend themselves. Add these ingredients together, and you have a serious crisis of democracy brewing. Companies like Cyberbit market themselves as part of a solution to cyber security. But it is evident that commercial spyware is actually contributing to a very deep insecurity instead.

Remedying this problem will not be easy. It will require legal and policy efforts across multiple jurisdictions and involving governments, civil society, and the private sector. A companion piece to the report outlines some measures that could hopefully begin that process, including application of relevant criminal laws. If the international community does not act swiftly, journalists, activists, lawyers, and human rights defenders will be increasingly infiltrated and neutralized. It’s time to address the commercial spyware industry for what it has become: one of the most dangerous cyber security problems of our day.

814

With teen mental health deteriorating over five years, screens a likely culprit

Jean Twenge, Professor of Psychology at the San Diego State University, writes:

In just the five years between 2010 and 2015, the number of U.S. teens who felt useless and joyless – classic symptoms of depression – surged 33 percent in large national surveys. Teen suicide attempts increased 23 percent. Even more troubling, the number of 13-to-18-year-olds who committed suicide jumped 31 percent.

In a new paper published in Clinical Psychological Science, my colleagues and I found that the increases in depression, suicide attempts and suicide appeared among teens from every background – more privileged and less privileged, across all races and ethnicities and in every region of the country. All told, our analysis found that the generation of teens I call “iGen” (those born after 1995) is much more likely to experience mental health issues than their millennial predecessors.

Teens now spend much less time interacting with their friends in person. Feeling socially isolated is also one of the major risk factors for suicide. We found that teens who spent more time than average online and less time than average with friends in person were the most likely to be depressed. Since 2012, that’s what has occurred en masse: Teens have spent less time on activities known to benefit mental health (in-person social interaction) and more time on activities that may harm it (time online).

Teens are also sleeping less, and teens who spend more time on their phones are more likely to not be getting enough sleep. Not sleeping enough is a major risk factor for depression, so if smartphones are causing less sleep, that alone could explain why depression and suicide increased so suddenly.

But some vulnerable teens who would otherwise not have had mental health issues may have slipped into depression due to too much screen time, not enough face-to-face social interaction, inadequate sleep or a combination of all three.

It might be argued that it’s too soon to recommend less screen time, given that the research isn’t completely definitive. However, the downside to limiting screen time – say, to two hours a day or less – is minimal. In contrast, the downside to doing nothing – given the possible consequences of depression and suicide – seems, to me, quite high.

It’s not too early to think about limiting screen time; let’s hope it’s not too late.

771

Over 400 of the World’s Most Popular Websites Record Your Every Keystroke

The idea of websites tracking users isn’t new, but research from Princeton University released last week indicates that online tracking is far more invasive than most users understand.

In the first installment of a series titled “No Boundaries,” three researchers from Princeton’s Center for Information Technology Policy (CITP) explain how third-party scripts that run on many of the world’s most popular websites track your every keystroke and then send that information to a third-party server.

Some highly-trafficked sites run software that records every time you click and every word you type. If you go to a website, begin to fill out a form, and then abandon it, every letter you entered is still recorded, according to the researchers’ findings. If you accidentally paste something from your clipboard into a form, it’s also recorded. These scripts, or bits of code that websites run, are called “session replay” scripts. Session replay scripts are used by companies to gain insight into how their customers are using their sites and to identify confusing webpages. But the scripts don’t just aggregate general statistics; they record, and are capable of playing back, individual browsing sessions.

The scripts don’t run on every page, but are often placed on pages where users input sensitive information, like passwords and medical conditions. Most troubling is that the information session replay scripts collect can’t “reasonably be expected to be kept anonymous,” according to the researchers.
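The core mechanism of a session replay script is simple: hook every input event, buffer it with a timestamp, and periodically ship the batch to a third-party collection server. The sketch below is a minimal, illustrative model of that data flow (the class and field names are hypothetical, not taken from any real vendor's script):

```python
import time
from dataclasses import dataclass, field

@dataclass
class SessionRecorder:
    """Buffers every input event for later replay, submitted or not."""
    events: list = field(default_factory=list)

    def record(self, kind: str, target: str, value: str) -> None:
        # Each keystroke, click, or paste is captured the moment it happens,
        # so abandoning the form later doesn't un-record anything.
        self.events.append(
            {"t": time.time(), "kind": kind, "target": target, "value": value}
        )

    def flush(self) -> list:
        # In a real script this batch would be POSTed to the vendor's server.
        batch, self.events = self.events, []
        return batch

rec = SessionRecorder()
for ch in "jane@example.com":
    rec.record("keystroke", "#email", ch)  # typed, but never submitted
rec.record("paste", "#notes", "text copied from elsewhere")
batch = rec.flush()
print(len(batch), "events captured")  # 17 events captured
```

In a browser, the same pattern is implemented with DOM event listeners (`keydown`, `click`, `paste`) and periodic `fetch`/XHR calls to the vendor's endpoint, which is why abandoned and pasted input leaks just like submitted input.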

835

You may be sick of worrying about online privacy, but ‘surveillance apathy’ is also a problem

Siobhan Lyons, Scholar in Media and Cultural Studies, Macquarie University, writes in The Conversation:

We all seem worried about privacy. Though it’s not only privacy itself we should be concerned about: it’s also our attitudes towards privacy that are important.

When we stop caring about our digital privacy, we witness surveillance apathy.

And it’s something that may be particularly significant for marginalised communities, who feel they hold no power to navigate or negotiate fair use of digital technologies.

In the wake of the NSA leaks in 2013 led by Edward Snowden, we are more aware of the machinations of online companies such as Facebook and Google. Yet research shows some of us are apathetic when it comes to online surveillance.

Privacy and surveillance

Attitudes to privacy and surveillance in Australia are complex.

According to a major 2017 privacy survey, around 70% of us are more concerned about privacy than we were five years ago.

And yet we still increasingly embrace online activities. A 2017 report on social media conducted by search marketing firm Sensis showed that almost 80% of internet users in Australia now have a social media profile, an increase of around ten points from 2016. The data also showed that Australians are on their accounts more frequently than ever before.

Also, most Australians appear not to be concerned about recently proposed implementation of facial recognition technology. Only around one in three (32% of 1,486) respondents to a Roy Morgan study expressed worries about having their faces available on a mass database.

A recent ANU poll revealed a similar sentiment, with recent data retention laws supported by two thirds of Australians.

So while we’re aware of the issues with surveillance, we aren’t necessarily doing anything about it, or we’re prepared to make compromises when we perceive our safety is at stake.

Across the world, attitudes to surveillance vary. Around half of Americans polled in 2013 found mass surveillance acceptable. France, Britain and the Philippines appeared more tolerant of mass surveillance compared to Sweden, Spain, and Germany, according to 2015 Amnesty International data.

Apathy and marginalisation

In 2015, philosopher Slavoj Žižek proclaimed that he did not care about surveillance (admittedly though suggesting that “perhaps here I preach arrogance”).

This position cannot be assumed by all members of society. Australian academic Kate Crawford argues the impact of data mining and surveillance is more significant for marginalised communities, including people of different races, genders and socioeconomic backgrounds. American academics Shoshana Magnet and Kelley Gates agree, writing:

[…] new surveillance technologies are regularly tested on marginalised communities that are unable to resist their intrusion.

A 2015 White House report found that big data can be used to perpetuate price discrimination among people of different backgrounds. It showed how data surveillance “could be used to hide more explicit forms of discrimination”.

According to Ira Rubinstein, a senior fellow at New York University’s Information Law Institute, ignorance and cynicism are often behind surveillance apathy. Users are either ignorant of the complex infrastructure of surveillance, or they believe they are simply unable to avoid it.

As the White House report stated, consumers “have very little knowledge” about how data is used in conjunction with differential pricing.

So in contrast to the oppressive panopticon (a circular prison with a central watchtower) as envisioned by philosopher Jeremy Bentham, we have what Siva Vaidhyanathan calls the “cryptopticon”. The cryptopticon is “not supposed to be intrusive or obvious. Its scale, its ubiquity, even its very existence, are supposed to go unnoticed”.

But Melanie Taylor, lead artist of the computer game Orwell (which puts players in the role of a surveillance agent), noted that many simply remain indifferent despite heightened awareness:

That’s the really scary part: that Snowden revealed all this, and maybe nobody really cared.

The Facebook trap

Surveillance apathy can be linked to people’s dependence on “the system”. As one of my media students pointed out, no matter how much awareness users have regarding their social media surveillance, invariably people will continue using these platforms. This is because they are convenient, practical, and “we are creatures of habit”.

As University of Melbourne scholar Suelette Dreyfus noted in a Four Corners report on Facebook:

Facebook has very cleverly figured out how to wrap itself around our lives. It’s the family photo album. It’s your messaging to your friends. It’s your daily diary. It’s your contact list.

This, along with the complex algorithms Facebook and Google use to collect data and produce “filter bubbles” or “you loops”, is another issue.

Protecting privacy

While some people are attempting to delete themselves from the network, others have come up with ways to avoid being tracked online.

The search engine DuckDuckGo and the Tor Browser allow users to browse without being tracked. Lightbeam, meanwhile, allows users to see how their information is being tracked by third-party companies. And MIT devised a system called Immersion to show people the metadata of their emails.

Surveillance apathy is more disconcerting than surveillance itself. Our very attitudes about privacy will inform the structure of surveillance itself, so caring about it is paramount.

836

How Facebook Figures Out Everyone You’ve Ever Met

From Slashdot:

“I deleted Facebook after it recommended as People You May Know a man who was defense counsel on one of my cases. We had only communicated through my work email, which is not connected to my Facebook, which convinced me Facebook was scanning my work email,” an attorney told Gizmodo. Kashmir Hill, a reporter at the news outlet, who recently documented how Facebook figured out a connection between her and a family member she did not know existed, shares several more instances others have reported and explains how Facebook gathers information. She reports:

Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections. Because shadow-profile connections happen inside Facebook’s algorithmic black box, people can’t see how deep the data-mining of their lives truly is, until an uncanny recommendation pops up. Facebook isn’t scanning the work email of the attorney above. But it likely has her work email address on file, even if she never gave it to Facebook herself. If anyone who has the lawyer’s address in their contacts has chosen to share it with Facebook, the company can link her to anyone else who has it, such as the defense counsel in one of her cases. Facebook will not confirm how it makes specific People You May Know connections, and a Facebook spokesperson suggested that there could be other plausible explanations for most of those examples — “mutual friendships,” or people being “in the same city/network.” The spokesperson did say that of the stories on the list, the lawyer was the likeliest case for a shadow-profile connection. Handing over address books is one of the first steps Facebook asks people to take when they initially sign up, so that they can “Find Friends.”

The problem with all this, Hill writes, is that Facebook doesn’t explicitly disclose the scale at which it uses the contact information it gleans from a user’s address book. Furthermore, most people are not aware that Facebook is using contact information taken from their phones for these purposes.
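The linking mechanism Hill describes can be pictured as an inverted index over uploaded address books: each email address or phone number maps to everyone who uploaded it, and any two users who share an entry become candidates for a People You May Know suggestion. This is a speculative sketch of that idea, not Facebook's actual code; all names here are invented:

```python
from collections import defaultdict

def build_contact_index(address_books: dict) -> dict:
    """Invert uploaded address books: contact detail -> set of uploaders."""
    index = defaultdict(set)
    for uploader, contacts in address_books.items():
        for detail in contacts:
            index[detail].add(uploader)
    return index

def suggested_connections(index: dict, detail: str) -> set:
    # Everyone who uploaded the same email/phone can be linked to one
    # another, even though the contact's owner never shared it herself.
    return set(index.get(detail, set()))

# The lawyer's work email, uploaded independently by two unrelated people:
books = {
    "colleague": {"lawyer@firm.example", "alice@example.com"},
    "defense_counsel": {"lawyer@firm.example"},
}
index = build_contact_index(books)
print(suggested_connections(index, "lawyer@firm.example"))
```

The key point the sketch illustrates is that the lawyer herself appears nowhere in the input: the "shadow profile" is assembled entirely from other people's uploads.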

882

The Seemingly Pervasive Sinister Side of Algorithmic Screen Time for Children

Writer and artist James Bridle writes in Medium:

“Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatize, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level.

To begin: Kid’s YouTube is definitely and markedly weird. I’ve been aware of its weirdness for some time. Last year, there were a number of articles posted about the Surprise Egg craze. Surprise Eggs videos depict, often at excruciating length, the process of unwrapping Kinder and other egg toys. That’s it, but kids are captivated by them. There are thousands and thousands of these videos and thousands and thousands, if not millions, of children watching them. […] What I find somewhat disturbing about the proliferation of even (relatively) normal kids videos is the impossibility of determining the degree of automation which is at work here; how to parse out the gap between human and machine.”

Sapna Maheshwari also explores in The New York Times:

“Parents and children have flocked to Google-owned YouTube Kids since it was introduced in early 2015. The app’s more than 11 million weekly viewers are drawn in by its seemingly infinite supply of clips, including those from popular shows by Disney and Nickelodeon, and the knowledge that the app is supposed to contain only child-friendly content that has been automatically filtered from the main YouTube site. But the app contains dark corners, too, as videos that are disturbing for children slip past its filters, either by mistake or because bad actors have found ways to fool the YouTube Kids algorithms. In recent months, parents like Ms. Burns have complained that their children have been shown videos with well-known characters in violent or lewd situations and other clips with disturbing imagery, sometimes set to nursery rhymes.”

Very horrible and creepy.

917

The Video Game That Could Shape the Future of War

“As far as video games go, Operation Overmatch is rather unremarkable. Players command military vehicles in eight-on-eight matches against the backdrop of rendered cityscapes — a common setup of games that sometimes have the added advantage of hundreds of millions of dollars in development budgets. Overmatch does have something unique, though: its mission. The game’s developers believe it will change how the U.S. Army fights wars. Overmatch’s players are nearly all soldiers in real life. As they develop tactics around futuristic weapons and use them in digital battle against peers, the game monitors their actions.

Each shot fired and decision made, in addition to messages the players write in private forums, is a bit of information soaked up with a frequency not found in actual combat, or even in high-powered simulations without a wide network of players. The data is logged, sorted, and then analyzed, using insights from sports and commercial video games. Overmatch’s team hopes this data will inform the Army’s decisions about which technologies to purchase and how to develop tactics using them, all with the aim of building a more forward-thinking, prepared force… While the game currently has about 1,000 players recruited by word of mouth and outreach from the Overmatch team, the developers eventually want to involve tens of thousands of soldiers. This milestone would allow for millions of hours of game play per year, according to project estimates, enough to generate rigorous data sets and test hypotheses.”

Brian Vogt, a lieutenant colonel in the Army Capabilities Integration Center who oversees Overmatch’s development, says:

“Right after World War I, we had technologies like aircraft carriers we knew were going to play an important role,” he said. “We just didn’t know how to use them. That’s where we are and what we’re trying to do for robots.”

924

The rise of big data policing

An excerpt from the book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (2017):

“Data-driven policing means aggressive police presence, surveillance, and perceived harassment in those communities. Each data point translates to real human experience, and many times those experiences remain fraught with all-too-human bias, fear, distrust, and racial tension. For those communities, especially poor communities of color, these data-collection efforts cast a dark shadow on the future.”

881