Resources

Google hasn’t stopped reading your e-mails

If you’re a Gmail user, your messages likely aren’t as private as you’d think. Google reads each and every one (even if you definitely don’t), scanning your painfully long email chains and vacation responders to collect more data on you. Google uses the data gleaned from your messages to inform a whole host of other products and services, NBC News reported Thursday.

Though Google announced last July that it would stop using consumer Gmail content for ad personalization, the language permitting it to do so is still included in its current privacy policy, and it without a doubt still scans users’ emails for other purposes. Aaron Stein, a Google spokesperson, told NBC that Google also automatically extracts keyword data from users’ Gmail accounts, which is then fed into machine learning programs and other products within the Google family. Stein told NBC that Google also “may analyze [email] content to customize search results, better detect spam and malware,” a practice the company first announced back in 2012.

“We collect information about the services that you use and how you use them…” says Google’s privacy policy. “This includes information like your usage data and preferences, Gmail messages, G+ profile, photos, videos, browsing history, map searches, docs, or other Google-hosted content. Our automated systems analyze this information as it is sent and received and when it is stored.”

While Google doesn’t sell this information to third parties, it has used it to power its own advertising network and inform search results, among other things. And this is far from a closely guarded secret. The company has included disclosures relating to these practices in its privacy policy since at least 2012: “When you share information with us, for example by creating a Google Account, we can make those services even better – to show you more relevant search results and ads…,” says Google’s March 2012 privacy policy.

Stare Into The Lights My Pretties

You may be sick of worrying about online privacy, but ‘surveillance apathy’ is also a problem

Siobhan Lyons, Scholar in Media and Cultural Studies, Macquarie University, writes in The Conversation:

We all seem worried about privacy. Though it’s not only privacy itself we should be concerned about: it’s also our attitudes towards privacy that are important.

When we stop caring about our digital privacy, we witness surveillance apathy.

And it’s something that may be particularly significant for marginalised communities, who feel they hold no power to navigate or negotiate fair use of digital technologies.

In the wake of the NSA leaks in 2013 led by Edward Snowden, we are more aware of the machinations of online companies such as Facebook and Google. Yet research shows some of us are apathetic when it comes to online surveillance.

Privacy and surveillance

Attitudes to privacy and surveillance in Australia are complex.

According to a major 2017 privacy survey, around 70% of us are more concerned about privacy than we were five years ago.

And yet we still increasingly embrace online activities. A 2017 report on social media conducted by search marketing firm Sensis showed that almost 80% of internet users in Australia now have a social media profile, an increase of around ten points from 2016. The data also showed that Australians are on their accounts more frequently than ever before.

Also, most Australians appear unconcerned about the recently proposed implementation of facial recognition technology. Only around one in three (32% of 1,486) respondents to a Roy Morgan study expressed worries about having their faces available on a mass database.

A recent ANU poll revealed a similar sentiment, with recent data retention laws supported by two thirds of Australians.

So while we’re aware of the issues with surveillance, we aren’t necessarily doing anything about it, or we’re prepared to make compromises when we perceive our safety is at stake.

Across the world, attitudes to surveillance vary. Around half of Americans polled in 2013 found mass surveillance acceptable. France, Britain and the Philippines appeared more tolerant of mass surveillance compared to Sweden, Spain, and Germany, according to 2015 Amnesty International data.

Apathy and marginalisation

In 2015, philosopher Slavoj Žižek proclaimed that he did not care about surveillance (admittedly though suggesting that “perhaps here I preach arrogance”).

This position cannot be assumed by all members of society. Australian academic Kate Crawford argues the impact of data mining and surveillance is more significant for marginalised communities, including people of different races, genders and socioeconomic backgrounds. American academics Shoshana Magnet and Kelley Gates agree, writing:

[…] new surveillance technologies are regularly tested on marginalised communities that are unable to resist their intrusion.

A 2015 White House report found that big data can be used to perpetuate price discrimination among people of different backgrounds. It showed how data surveillance “could be used to hide more explicit forms of discrimination”.

According to Ira Rubinstein, a senior fellow at New York University’s Information Law Institute, ignorance and cynicism are often behind surveillance apathy. Users are either ignorant of the complex infrastructure of surveillance, or they believe they are simply unable to avoid it.

As the White House report stated, consumers “have very little knowledge” about how data is used in conjunction with differential pricing.

So in contrast to the oppressive panopticon (a circular prison with a central watchtower) envisioned by philosopher Jeremy Bentham, we have what Siva Vaidhyanathan calls the “Cryptopticon”. The Cryptopticon is “not supposed to be intrusive or obvious. Its scale, its ubiquity, even its very existence, are supposed to go unnoticed”.

But Melanie Taylor, lead artist of the computer game Orwell (which puts players in the role of a surveillance agent), noted that many simply remain indifferent despite heightened awareness:

That’s the really scary part: that Snowden revealed all this, and maybe nobody really cared.

The Facebook trap

Surveillance apathy can be linked to people’s dependence on “the system”. As one of my media students pointed out, no matter how much awareness users have regarding their social media surveillance, invariably people will continue using these platforms. This is because they are convenient, practical, and “we are creatures of habit”.

As University of Melbourne scholar Suelette Dreyfus noted in a Four Corners report on Facebook:

Facebook has very cleverly figured out how to wrap itself around our lives. It’s the family photo album. It’s your messaging to your friends. It’s your daily diary. It’s your contact list.

This, along with the complex algorithms Facebook and Google use to collect data and produce “filter bubbles” or “you loops”, is another issue.

Protecting privacy

While some people are attempting to delete themselves from the network, others have come up with ways to avoid being tracked online.

The search engine DuckDuckGo and the Tor Browser allow users to search and browse without being tracked. Lightbeam, meanwhile, allows users to see how their information is being tracked by third-party companies. And MIT devised a system called Immersion to show people the metadata of their emails.

Surveillance apathy is more disconcerting than surveillance itself. Our very attitudes about privacy will inform the structure of surveillance itself, so caring about it is paramount.


How Facebook Figures Out Everyone You’ve Ever Met

From Slashdot:

“I deleted Facebook after it recommended as People You May Know a man who was defense counsel on one of my cases. We had only communicated through my work email, which is not connected to my Facebook, which convinced me Facebook was scanning my work email,” an attorney told Gizmodo. Kashmir Hill, a reporter at the news outlet, who recently documented how Facebook figured out a connection between her and a family member she did not know existed, shares several more instances others have reported and explains how Facebook gathers information. She reports:

Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections. Because shadow-profile connections happen inside Facebook’s algorithmic black box, people can’t see how deep the data-mining of their lives truly is, until an uncanny recommendation pops up. Facebook isn’t scanning the work email of the attorney above. But it likely has her work email address on file, even if she never gave it to Facebook herself. If anyone who has the lawyer’s address in their contacts has chosen to share it with Facebook, the company can link her to anyone else who has it, such as the defense counsel in one of her cases. Facebook will not confirm how it makes specific People You May Know connections, and a Facebook spokesperson suggested that there could be other plausible explanations for most of those examples — “mutual friendships,” or people being “in the same city/network.” The spokesperson did say that of the stories on the list, the lawyer was the likeliest case for a shadow-profile connection. Handing over address books is one of the first steps Facebook asks people to take when they initially sign up, so that they can “Find Friends.”

The problem with all this, Hill writes, is that Facebook doesn’t explicitly disclose the scale at which it uses the contact information it gleans from users’ address books. Furthermore, most people are not aware that Facebook is using contact information taken from their phones for these purposes.
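The mechanism Hill describes amounts to an inverted index over uploaded address books: anyone who holds your email and shares their contacts links you to every other uploader who also holds it. The sketch below is a purely hypothetical illustration with invented names and addresses; Facebook’s actual system is not public.

```python
from collections import defaultdict

# Hypothetical illustration of contact-based linking ("shadow profiles").
# Address books uploaded by users: uploader -> set of contact emails.
uploads = {
    "alice": {"lawyer@firm.com", "bob@mail.com"},
    "carol": {"lawyer@firm.com", "dave@mail.com"},
}

# Invert the index: contact email -> set of users who hold it.
holders = defaultdict(set)
for uploader, contacts in uploads.items():
    for email in contacts:
        holders[email].add(uploader)

def may_know(email):
    """Everyone who holds `email` in an uploaded address book becomes a
    candidate "People You May Know" connection for that address's owner."""
    return sorted(holders[email])

print(may_know("lawyer@firm.com"))  # ['alice', 'carol']
```

Note that the lawyer never uploaded anything herself: the link comes entirely from other people’s address books.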


PrivacyTools

privacytools.io provides knowledge and tools to protect your privacy against global mass surveillance.


Panopticlick

Electronic Frontier Foundation’s Browser Privacy Tool checks if websites may be able to track you, even if you’ve limited or disabled cookies. Panopticlick tests your browser to see how unique it is based on the information it will share with sites it visits.
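Fingerprinting of the kind Panopticlick measures works by combining many individually unremarkable browser attributes into one highly identifying value. A minimal sketch, with invented attribute values (Panopticlick itself collects these via JavaScript in the visiting browser):

```python
import hashlib

# Invented example attributes a site can read without cookies.
attributes = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080x24",
    "timezone": "UTC+10",
    "fonts": "Arial,Helvetica,Times New Roman",
    "plugins": "PDF Viewer",
}

# Serialize the attributes in a fixed order and hash them: browsers that
# differ in any one value produce a different fingerprint.
canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
fingerprint = hashlib.sha256(canonical.encode()).hexdigest()
print(fingerprint[:16])
```

Each attribute leaks only a few bits of information, but together they can be enough to single out one browser among hundreds of thousands, even with cookies disabled.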


PRISM Break

Encrypt your communications and end your reliance on proprietary services.


EFF Surveillance Self-Defence

The Electronic Frontier Foundation’s guide to defending yourself and your friends from mass surveillance by using secure technology and developing awareness practices.


Across the United States, police officers abuse confidential databases

“Police officers across the country misuse confidential law enforcement databases to get information on romantic partners, business associates, neighbors, journalists and others for reasons that have nothing to do with daily police work, an Associated Press investigation has found.
[…]In the most egregious cases, officers have used information to stalk or harass, or have tampered with or sold records they obtained.
[…]Unspecified discipline was imposed in more than 90 instances reviewed by AP. In many other cases, it wasn’t clear from the records if punishment was given at all. The number of violations was surely far higher since records provided were spotty at best, and many cases go unnoticed.

Among those punished: an Ohio officer who pleaded guilty to stalking an ex-girlfriend and who looked up information on her; a Michigan officer who looked up home addresses of women he found attractive; and two Miami-Dade officers who ran checks on a journalist after he aired unflattering stories about the department.

”It’s personal. It’s your address. It’s all your information, it’s your Social Security number, it’s everything about you,” said Alexis Dekany, the Ohio woman whose ex-boyfriend, a former Akron officer, pleaded guilty last year to stalking her. “And when they use it for ill purposes to commit crimes against you — to stalk you, to follow you, to harass you … it just becomes so dangerous.”

The misuse represents only a tiny fraction of the millions of daily database queries run legitimately during traffic stops, criminal investigations and routine police encounters. But the worst violations profoundly abuse systems that supply vital information on criminal suspects and law-abiding citizens alike. The unauthorized searches demonstrate how even old-fashioned policing tools are ripe for abuse, at a time when privacy concerns about law enforcement have focused mostly on more modern electronic technologies.”


Steven Rambam at HOPE XI, 2016

“First came the assault on privacy. Name, address, telephone, DOB, SSN, physical description, friends, family, likes, dislikes, habits, hobbies, beliefs, religion, sexual orientation, finances, every granular detail of a person’s life, all logged, indexed, analyzed and cross-referenced.

Then came the gathering of location and communication data. Cell phones, apps, metro cards, license plate readers and toll tags, credit card use, IP addresses and authenticated logins, tower info, router proximity, networked “things” everywhere reporting on activity and location, astoundingly accurate facial recognition mated with analytics and “gigapixel” cameras and, worst of all, mindlessly self-contributed posts, tweets, and “check-ins,” all constantly reporting a subject’s location 24-7-365, to such a degree of accuracy that “predictive profiling” knows where you will likely be next Thursday afternoon.

Today we are experiencing constant efforts to shred anonymity. Forensic linguistics, browser fingerprinting, lifestyle and behavior analysis, metadata of all types, HTML5, IPv6, and daily emerging “advances” in surveillance technologies – some seemingly science fiction but real – are combining to make constant, mobile identification and absolute loss of anonymity inevitable.

And, now, predictably, the final efforts to homogenize: the “siloing” and Balkanization of the Internet. As Internet use becomes more and more self-restricted to a few large providers, as users increasingly never leave the single ecosystem of a Facebook or a Google, as the massive firehose of information on the Internet is “curated” and “managed” by persons who believe that they know best what news and opinions you should have available to read, see, and believe, the bias of a few will eventually determine what you believe. What is propaganda? What is truth? You simply won’t know.
In a tradition dating back to the first HOPE conference, for three full hours Steven Rambam will detail the latest trends in privacy invasion and will demonstrate cutting-edge anonymity-shredding surveillance technologies. Drones will fly, a “privacy victim” will undergo digital proctology, a Q&A period will be provided, and fun will be had by all.”


“Faceless” recognition can identify you even when you hide your face

“With widespread adoption among law enforcement, advertisers, and even churches, face recognition has undoubtedly become one of the biggest threats to privacy out there.

By itself, the ability to instantly identify anyone just by seeing their face already creates massive power imbalances, with serious implications for free speech and political protest.”

Microsoft pitches technology that can read facial expressions at political rallies.

“But more recently, researchers have demonstrated that even when faces are blurred or otherwise obscured, algorithms can be trained to identify people by matching previously-observed patterns around their head and body.

In a new paper uploaded to the arXiv pre-print server, researchers at the Max Planck Institute in Saarbrücken, Germany demonstrate a method of identifying individuals even when most of their photos are un-tagged or obscured. The researchers’ system, which they call the “Faceless Recognition System,” trains a neural network on a set of photos containing both obscured and visible faces, then uses that knowledge to predict the identity of obscured faces by looking for similarities in the area around a person’s head and body.”

[…]

“In the past, Facebook has shown its face recognition algorithms can predict the identity of users when they obscure their face with 83% accuracy, using cues such as their stance and body type. But the researchers say their system is the first to do so using a trainable system that uses a full range of body cues surrounding blurred and blacked-out faces.”



The Outrage Machine

This short video explores how the online world has become the popular outlet for public rage, briefly illustrating some of the many stories of everyday people who have suddenly become public enemy number one under the most misunderstood of circumstances and trivial narratives. With the web acting like a giant echo chamber, amplifying false stories and feeding on the pent-up aggression of the audience watching the spectacle, The Outrage Machine shows how these systems froth mob mentality into a hideous mess. It’s a good example of where the spectacle goes, and of how its intensity has to keep ratcheting up to maintain audience attention in a culture of dwindling attention spans, distraction and triviality.

Filmmaker and author Jon Ronson recently wrote a book about this topic, So You’ve Been Publicly Shamed, which is quite good. His TED talk is essentially a 17-minute overview:

And a longer presentation with interview and Q&A from earlier this year:


FBI says utility-pole surveillance camera locations must be kept secret

“The US Federal Bureau of Investigation has successfully convinced a federal judge to block the disclosure of where the bureau has attached surveillance cams on Seattle utility poles.

However, this privacy dispute highlights a powerful and clandestine tool the authorities are employing across the country to snoop on the public—sometimes with warrants, sometimes without.

The deployment of such video cameras appears to be widespread. What’s more, the Seattle authorities aren’t saying whether they have obtained court warrants to install the surveillance cams.”

“Peter Winn [assistant U.S. attorney in Seattle] wrote to Judge Jones that the location information about the disguised surveillance cams should be withheld because the public might think they are an ‘invasion of privacy.’ Winn also said that revealing the cameras’ locations could threaten the safety of FBI agents. And if the cameras become ‘publicly identifiable,’ Winn said, ‘subjects of the criminal investigation and national security adversaries of the United States will know what to look for to discern whether the FBI is conducting surveillance in a particular location.’”


FBI and NIST developing software to track and categorise people by their tattoos

“An Electronic Frontier Foundation (EFF) investigation just revealed an awfully Orwellian fact: the FBI is working with government researchers to develop advanced tattoo recognition technology. This would allow law enforcement to sort and identify people based on their tattoos to determine ‘affiliation to gangs, sub-cultures, religious or ritualistic beliefs, or political ideology.’”


Face recognition app taking Russia by storm may bring end to public anonymity

“If the founders of a new face recognition app get their way, anonymity in public could soon be a thing of the past. FindFace, launched two months ago and currently taking Russia by storm, allows users to photograph people in a crowd and work out their identities, with 70% reliability.

It works by comparing photographs to profile pictures on Vkontakte, a social network popular in Russia and the former Soviet Union, with more than 200 million accounts. In future, the designers imagine a world where people walking past you on the street could find your social network profile by sneaking a photograph of you, and shops, advertisers and the police could pick your face out of crowds and track you down via social networks.”

Founder Kabakov says the app could revolutionise dating: “If you see someone you like, you can photograph them, find their identity, and then send them a friend request.” The interaction doesn’t always have to involve the rather creepy opening gambit of clandestine street photography, he added: “It also looks for similar people. So you could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages.”
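Under the hood, systems like FindFace typically reduce each face photo to an embedding vector produced by a neural network, then look for the nearest profile picture. A toy sketch with invented three-dimensional vectors (real embeddings have hundreds of dimensions, and the app’s actual pipeline is not public):

```python
import math

# Toy "embeddings" standing in for profile pictures on a social network.
profiles = {
    "anna":  [0.9, 0.1, 0.3],
    "boris": [0.2, 0.8, 0.5],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction, 0.0 unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(query, threshold=0.7):
    """Return the closest profile, or None if no match is confident enough.
    The threshold is what trades recall against the app's ~70% reliability."""
    name, score = max(((n, cosine(query, v)) for n, v in profiles.items()),
                      key=lambda t: t[1])
    return name if score >= threshold else None

print(best_match([0.88, 0.15, 0.28]))  # 'anna'
```

Matching a crowd photo then reduces to embedding every detected face and running this nearest-neighbour search against millions of profile pictures.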


Google AI has access to 1.6M people’s health records (UK)

“A document obtained by New Scientist reveals that the tech giant’s collaboration with the UK’s National Health Service goes far beyond what has been publicly announced. The document — a data-sharing agreement between Google-owned artificial intelligence company DeepMind and the Royal Free NHS Trust — gives the clearest picture yet of what the company is doing and what sensitive data it now has access to. The agreement gives DeepMind access to a wide range of healthcare data on the 1.6 million patients who pass through three London hospitals.

It includes logs of day-to-day hospital activity, such as records of the location and status of patients – as well as who visits them and when. The hospitals will also share the results of certain pathology and radiology tests.

As well as receiving this continuous stream of new data, DeepMind has access to the historical data that the Royal Free trust submits to the Secondary User Service (SUS) database – the NHS’s centralised record of all hospital treatments in the UK. This includes data from critical care and accident and emergency departments.

Google says it has no commercial plans for DeepMind’s work with Royal Free and that the current pilots are being done for free. But the data to which Royal Free is giving DeepMind access is hugely valuable. It may have to destroy its copy of the data when the agreement expires next year, but that gives ample time to mine it for health insights.”


Inventor of World Wide Web warns of threat to internet

“Tim Berners-Lee, a computer scientist who invented the web 25 years ago, called for a bill of rights that would guarantee the independence of the internet and ensure users’ privacy.

“If a company can control your access to the internet, if they can control which websites you go to, then they have tremendous control over your life,” Berners-Lee said at the London “Web We Want” festival on the future of the internet.

“If a Government can block you going to, for example, the opposition’s political pages, then they can give you a blinkered view of reality to keep themselves in power.”

“Suddenly the power to abuse the open internet has become so tempting both for government and big companies.”


Google: “Essentially we’d like to make the technology disappear”

“Google has big hopes for its Glass head-mounted computer, chief among them a desire to make the unit smaller and more comfortable to wear.

Those were just a couple of the goals for a polished version of the device laid out Tuesday by Babak Parviz, the creator of Glass, who is also the director of Google’s “X” special projects division.

“Essentially we’d like to make the technology disappear,” he said during a conference on wearable technology in San Francisco.

“It should be non-intrusive” and as comfortable to wear as regular glasses or a wristwatch, he said.

Shrinking the unit would require advances in optics and photonics, he said. More computing power is also needed to make the device faster at answering people’s questions on the fly, Parviz said.”


Don’t expect privacy when sending to Gmail

People sending email to any of Google’s 425 million Gmail users have no “reasonable expectation” that their communications are confidential, the internet giant has said in a court filing.

Consumer Watchdog, the advocacy group that uncovered the filing, called the revelation a “stunning admission.” It comes as Google and its peers are under pressure to explain their role in the National Security Agency’s (NSA) mass surveillance of US citizens and foreign nationals.

“Google has finally admitted they don’t respect privacy,” said John Simpson, Consumer Watchdog’s privacy project director. “People should take them at their word; if you care about your email correspondents’ privacy, don’t use Gmail.”

Google set out its case last month in an attempt to dismiss a class action lawsuit that accuses the tech giant of breaking wire tap laws when it scans emails sent from non-Google accounts in order to target ads to Gmail users.

That suit, filed in May, claims Google “unlawfully opens up, reads, and acquires the content of people’s private email messages”. It quotes Eric Schmidt, Google’s executive chairman: “Google policy is to get right up to the creepy line and not cross it.”
