Resources

Facebook Allowed Advertisers to Target Users Interested in “White Genocide”—Even in Wake of Pittsburgh Massacre

Apparently fueled by anti-Semitism and the bogus narrative that outside forces are scheming to exterminate the white race, Robert Bowers murdered 11 Jewish congregants as they gathered inside their Pittsburgh synagogue, federal prosecutors allege. But despite long-running international efforts to debunk the idea of a “white genocide,” Facebook was still selling advertisers the ability to market to those with an interest in that myth just days after the bloodshed.

A simple search of Facebook pages also makes plain that there are tens of thousands of users with a very earnest interest in “white genocide,” shown through the long list of groups with names like “Stop White South African Genocide,” “White Genocide Watch,” and “The last days of the white man.” Images with captions like “Don’t Be A Race Traitor” and “STOP WHITE GENOCIDE IN SOUTH AFRICA” are freely shared in such groups, providing a natural target for anyone who might want to pay to promote deliberately divisive and incendiary hate-based content.

Pentagon Wants to Predict Anti-Trump Protests Using Social Media Surveillance

A series of research projects, patent filings, and policy changes indicate that the Pentagon wants to use social media surveillance to quell domestic insurrection and rebellion.

The social media posts of American citizens who don’t like President Donald Trump are the focus of the latest US military-funded research. The research, funded by the US Army and co-authored by a researcher based at the West Point Military Academy, is part of a wider effort by the Trump administration to consolidate the US military’s role in, and influence over, domestic intelligence.

The vast scale of this effort is reflected in a number of government social media surveillance patents granted this year, which relate to a spy program that the Trump administration outsourced to a private company last year. Experts interviewed by Motherboard say that the Pentagon’s new technology research may have played a role in amendments this April to the Joint Chiefs of Staff homeland defense doctrine, which widen the Pentagon’s role in providing intelligence for domestic “emergencies,” including an “insurrection.”

It’s no secret that the Pentagon has funded Big Data research into how social media surveillance can help predict large-scale population behaviours, specifically the outbreak of conflict, terrorism, and civil unrest.

Much of this research focuses on foreign theatres like the Middle East and North Africa — where the 2011 Arab Spring kicked off an arc of protest that swept across the region and toppled governments.

Since then, the Pentagon has spent millions of dollars finding patterns in posts across platforms like Facebook, Twitter, Instagram, Tumblr, and beyond to enable the prediction of major events.

But the Pentagon isn’t just interested in anticipating surprises abroad. The research also appears to be intended for use in the US homeland.

Datasets for the research were collected using the Apollo Social Sensing Tool, a real-time event tracking software that collects and analyses millions of social media posts.

The tool was originally developed under the Obama administration back in 2011 by the US Army Research Laboratory and US Defense Threat Reduction Agency, in partnership with Rensselaer Polytechnic Institute, the University of Illinois, IBM, and Caterva (a social marketing company that in 2013 was folded into a subsidiary of giant US government IT contractor, CSC). Past papers associated with the project show that the tool has been largely tested in foreign theatres like Haiti, Egypt, and Syria.

But the use of the Apollo tool to focus on protests in the US homeland has occurred under the Trump administration. The ‘election’ dataset compiled using Apollo for the 2018 US Army-funded study comprises 2.5 million tweets sent between October 26, 2016, and December 20, 2016, containing the words “Trump,” “Clinton,” and “election.”

Tweets were geolocated to focus on “locations where protests occurred following the election” based on user profiles. Locations were then triangulated against protest data from “online news outlets across the country.”
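
To make the data-collection step concrete, here is a minimal sketch, in Python, of how a keyword-, date- and location-filtered tweet dataset like the one described above might be assembled. The field names, the keyword handling, and the three-city list are illustrative assumptions, not details of the Apollo tool or the study.

```python
from datetime import datetime

KEYWORDS = {"trump", "clinton", "election"}
START, END = datetime(2016, 10, 26), datetime(2016, 12, 20)
PROTEST_CITIES = {"Portland", "Oakland", "New York"}  # illustrative stand-in for the 39 cities

def in_dataset(tweet: dict) -> bool:
    """Keep tweets in the study window that mention a keyword and whose
    user profile geolocates to a city where post-election protests occurred."""
    mentions_keyword = any(k in tweet["text"].lower() for k in KEYWORDS)
    in_window = START <= tweet["created_at"] <= END
    in_protest_city = tweet.get("user_city") in PROTEST_CITIES
    return mentions_keyword and in_window and in_protest_city

# Tiny synthetic sample standing in for the raw tweet stream.
raw_tweets = [
    {"text": "Election protest downtown tonight", "created_at": datetime(2016, 11, 10), "user_city": "Portland"},
    {"text": "Nice weather today", "created_at": datetime(2016, 11, 10), "user_city": "Portland"},
]
dataset = [t for t in raw_tweets if in_dataset(t)]
print(len(dataset))  # -> 1
```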

The millions of tweets were used to make sense of the “frequencies of the protests in 39 cities” using 18 different ways of measuring the “size, structure and geography” of a network, along with two ways of measuring how that network leads a social group to become “mobilized,” or take action.

In short, this means that “the social network can be a predictor of mobilization, which in turn is a predictor of the protest.” This pivotal finding means that extensive real-time monitoring of American citizens’ social media activity can be used to predict future protests.
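
The two-stage logic of that finding (network structure predicts mobilization, which in turn predicts protest) can be sketched roughly as follows. This is a toy model on synthetic data, assuming networkx and scikit-learn; the three features below stand in for the study's 18 network measures, and the simple linear fits stand in for its actual statistical models.

```python
import networkx as nx
import numpy as np
from sklearn.linear_model import LinearRegression

def network_features(G: nx.Graph) -> list:
    """A few structural measures of one city's interaction network
    (the study uses 18 measures of size, structure and geography)."""
    degrees = [d for _, d in G.degree()]
    return [nx.density(G), nx.transitivity(G), float(np.mean(degrees))]

# Synthetic stand-ins: one random interaction network and one observed
# protest count per city, for 39 cities.
rng = np.random.default_rng(0)
city_graphs = [nx.gnp_random_graph(50, p, seed=i) for i, p in enumerate(rng.uniform(0.02, 0.2, 39))]
protest_counts = rng.poisson(5, 39)

X = np.array([network_features(G) for G in city_graphs])

# Stage 1: network structure -> a "mobilization" score (here just a fitted proxy).
mobilization = LinearRegression().fit(X, protest_counts).predict(X)

# Stage 2: mobilization -> protest frequency.
protest_model = LinearRegression().fit(mobilization.reshape(-1, 1), protest_counts)
print(protest_model.score(mobilization.reshape(-1, 1), protest_counts))  # R^2 of the toy fit
```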

Instagram is testing the ability to share your precise location history with Facebook

Just weeks after Instagram’s co-founders left the company, it has been revealed that Instagram is testing a feature that would allow it to share your location data with Facebook, even when you’re not using the app.

Instagram is not the only service whose data Facebook has sought to pool with its own. Back in 2016 the company announced that it would be sharing user data between WhatsApp and Facebook in order to offer better friend suggestions. The practice was later halted in the European Union under EU data protection rules, and WhatsApp’s CEO and co-founder later left the company over data privacy concerns.

Facebook is also reportedly testing a map view to see friends’ locations, similar to what’s already offered by Snapchat. Instagram’s data sharing could provide additional data points to power this functionality, while providing Facebook with more data to better target its ads.

Facebook is not alone in making everyone’s data available for whatever purpose

Most companies that trade in the sale and manipulation of personal information are private and beholden to few rules other than the bare minimum of those they establish themselves, to avoid scrutiny and be able to say “we told you so” if an angry individual ever comes calling. Even if a consumer is aware their data is being passed around, their ability to control it once it’s out there is virtually nil: if they request it be deleted from one data broker, it can simply be bought back from one of several gigantic firms that have been storing it, too.

It is an open question what the actual effect of Cambridge Analytica’s work on the presidential election was, and what the outcome might have been without its influence (most references to its “psychographic” profiling in The New York Times’ story are appropriately skeptical). It would be hard to say without a lot more cooperation from the company and Facebook itself. But the leak by one of its researchers is an incredibly rare glimpse into a fairly routine process in an industry that is so staggeringly enormous and influential, not just in politics but in our personal, day-to-day existence, that it’s difficult to believe that it is anything but a mistake. But it isn’t, and wasn’t, a mistake. It is how things happened and are still happening every day.

Facebook silently enables facial recognition abilities for users outside EU and Canada

Facebook is now informing users around the world that it’s rolling out facial recognition features. In December, we reported that the features would be coming to the platform; that rollout finally appears to have begun. It should be noted that users in the European Union and Canada will not be notified, because laws restrict this type of activity in those areas.

With the new tools, you’ll be able to find photos that you’re in but haven’t been tagged in; they’ll help you protect yourself against strangers using your photo; and Facebook will be able to tell people with visual impairments who’s in their photos and videos. Facebook notes that the feature is enabled by default but can be switched off at any time; additionally, the firm says it may add new capabilities at any time.

While Facebook may want its users to “feel confident” uploading pictures online, the thought of the colossal database of faces Facebook holds, and what it could do with all that data, will likely give many people the heebie-jeebies. Even non-users should be cautious about which photos they appear in if they don’t want to be caught up in Facebook’s web of data.

Google and Facebook are watching our every move online

You may know that hidden trackers lurk on most websites you visit, soaking up your personal information. What you may not realize, though, is that 76 percent of websites now contain hidden Google trackers and 24 percent have hidden Facebook trackers, according to the Princeton Web Transparency & Accountability Project. The next highest is Twitter, with 12 percent. It is likely that Google or Facebook is watching you on many of the sites you visit, in addition to tracking you when you use their products. As a result, these two companies have amassed huge data profiles on each person, which can include your interests, purchases, search, browsing and location history, and much more. They then make your sensitive data profile available for invasive targeted advertising that can follow you around the Internet.
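
For a sense of how such hidden trackers can be spotted, here is a rough sketch (not the Princeton project's methodology) that scans a page's script tags for known third-party tracking domains; the domain list is a tiny illustrative sample.

```python
import re
from urllib.parse import urlparse

TRACKER_DOMAINS = {"google-analytics.com", "doubleclick.net", "connect.facebook.net"}

def third_party_trackers(html: str) -> set:
    """Return tracker domains referenced by <script src=...> tags in the page."""
    srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)', html, flags=re.IGNORECASE)
    hosts = {urlparse(src).netloc for src in srcs}
    return {h for h in hosts if any(h == d or h.endswith("." + d) for d in TRACKER_DOMAINS)}

page = '<script src="https://www.google-analytics.com/analytics.js"></script>'
print(third_party_trackers(page))  # {'www.google-analytics.com'}
```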

So how do we move forward from here? Don’t be fooled by claims of self-regulation, as any useful long-term reforms of Google and Facebook’s data privacy practices fundamentally oppose their core business models: hyper-targeted advertising based on more and more intrusive personal surveillance. Change must come from the outside. Unfortunately, we’ve seen relatively little from Washington. Congress and federal agencies need to take a fresh look at what can be done to curb these data monopolies. They first need to demand more algorithmic and privacy policy transparency, so people can truly understand the extent of how their personal information is being collected, processed and used by these companies. Only then can informed consent be possible. They also need to legislate that people own their own data, enabling real opt-outs. Finally, they need to restrict how data can be combined, including being more aggressive at blocking acquisitions that further consolidate data power, which will pave the way for more competition in digital advertising. Until we see such meaningful changes, consumers should vote with their feet.

The rise of big data policing

An excerpt from the book The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement (2017):

“Data-driven policing means aggressive police presence, surveillance, and perceived harassment in those communities. Each data point translates to real human experience, and many times those experiences remain fraught with all-too-human bias, fear, distrust, and racial tension. For those communities, especially poor communities of color, these data-collection efforts cast a dark shadow on the future.”

What Makes You Click (2016)

“The biggest psychological experiment ever is being conducted, and we’re all taking part in it: every day, a billion people are tested online. Which ingenious tricks and other digital laws ensure that we fill our online shopping carts to the brim, or stay on websites as long as possible? Or vote for a particular candidate?

The bankruptcies of department stores and shoe shops clearly show that our buying behaviour is rapidly shifting to the Internet. An entirely new field has arisen, of ‘user experience’ architects and ‘online persuasion officers’. How do these digital data dealers use, manipulate and abuse our user experience? Not just when it comes to buying things, but also with regards to our free time and political preferences.

Aren’t companies, which are running millions of tests at a time, miles ahead of science and government, in this respect? Now the creators of these digital seduction techniques, former Google employees among them, are themselves arguing for the introduction of an ethical code. What does it mean, when the conductors of experiments themselves are asking for their power and possibilities to be restricted?”

The data analytics company Cambridge Analytica

The Guardian is running an article about a ‘mysterious’ big-data analytics company called Cambridge Analytica, its ties to SCL Group (a 25-year-old military psyops company in the UK), and the backing of “secretive hedge fund billionaire” Robert Mercer, who bankrolled Cambridge Analytica. In the article, a former employee calls it “this dark, dystopian data company that gave the world Trump.”

Mercer, who has a background in computer science, is alleged to be at the centre of a multimillion-dollar propaganda network.

“Facebook was the source of the psychological insights that enabled Cambridge Analytica to target individuals. It was also the mechanism that enabled them to be delivered on a large scale. The company also (perfectly legally) bought consumer datasets — on everything from magazine subscriptions to airline travel — and uniquely it appended these with the psych data to voter files… Finding “persuadable” voters is key for any campaign and with its treasure trove of data, Cambridge Analytica could target people high in neuroticism, for example, with images of immigrants “swamping” the country.

The key is finding emotional triggers for each individual voter. Cambridge Analytica worked on campaigns in several key states for a Republican political action committee. Its key objective, according to a memo the Observer has seen, was “voter disengagement” and “to persuade Democrat voters to stay at home”… In the U.S., the government is bound by strict laws about what data it can collect on individuals. But, for private companies anything goes.”
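
As a schematic sketch of the “appending” step described in the quote, the snippet below joins inferred psychographic scores onto a voter file and selects “persuadable” targets. Every field, score, and threshold here is invented for illustration; none of it comes from Cambridge Analytica’s actual systems.

```python
import pandas as pd

# Hypothetical voter file and inferred trait scores (all values made up).
voter_file = pd.DataFrame({
    "voter_id": [1, 2, 3],
    "state": ["PA", "MI", "WI"],
    "turnout_likelihood": [0.4, 0.8, 0.5],
})
psych_scores = pd.DataFrame({
    "voter_id": [1, 2, 3],
    "neuroticism": [0.9, 0.2, 0.7],  # inferred trait score, 0-1
})

# Append the psych data to the voter file.
profiles = voter_file.merge(psych_scores, on="voter_id")

# Target swing-ish, high-neuroticism voters for fear-based creative.
targets = profiles[(profiles.neuroticism > 0.6) & (profiles.turnout_likelihood.between(0.3, 0.6))]
print(targets.voter_id.tolist())  # [1, 3]
```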

“Your browsing history alone can give away your identity”

“Researchers at Stanford and Princeton universities have found a way to connect the dots between people’s private online activity and their Twitter accounts—even for people who have never tweeted.

When the team tested the technique on 400 real people who submitted their browsing history, they were able to correctly pick out the volunteers’ Twitter profiles nearly three-quarters of the time.

Here’s how the de-anonymization system works: The researchers figured that a person is more likely to click a link that was shared on social media by a friend—or a friend of a friend—than any other random link on the internet. (Their model controls for the baseline popularity of each website.) With that in mind, and the details of an anonymous person’s browser history in hand, the researchers can compute the probability that any one Twitter user created that browsing history. People’s basic tendency to follow links they come across on Twitter unmasks them—and it usually takes less than a minute.

“You can even be de-anonymized if you just browse and follow people, without actually sharing anything.”
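
The scoring idea behind the de-anonymization technique can be sketched in a few lines: rank candidate Twitter users by how much better the links in their feeds explain an anonymous browsing history than baseline popularity alone. The probabilities, the “boost” factor, and the toy feeds below are made up; the researchers’ actual model is more careful.

```python
import math

browsing_history = ["nyti.ms/a", "bit.ly/b", "exmpl.com/c"]
baseline_popularity = {"nyti.ms/a": 0.05, "bit.ly/b": 0.01, "exmpl.com/c": 0.001}

# Links each candidate's feed exposed them to (friends' and friends-of-friends' shares).
candidate_feeds = {
    "@alice": {"nyti.ms/a", "bit.ly/b", "exmpl.com/c"},
    "@bob": {"nyti.ms/a"},
}

def feed_score(feed: set) -> float:
    """Log-likelihood ratio: visits to links present in the feed count as
    evidence, weighted more heavily when the link is rare in general."""
    boost = 50.0  # assumed lift in click probability for links seen in one's own feed
    score = 0.0
    for url in browsing_history:
        p_base = baseline_popularity[url]
        p_user = min(1.0, boost * p_base) if url in feed else p_base
        score += math.log(p_user / p_base)  # contributes 0 when the link is not in the feed
    return score

ranked = sorted(candidate_feeds, key=lambda u: feed_score(candidate_feeds[u]), reverse=True)
print(ranked)  # ['@alice', '@bob']
```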

U.S. government begins asking foreign travelers about their social media at border

“Foreign travelers arriving in the United States on the visa waiver program have been presented with an “optional” request to “enter information associated with your online presence,” a government official confirmed Thursday. The prompt includes a drop-down menu that lists platforms including Facebook, Google+, Instagram, LinkedIn and YouTube, as well as a space for users to input their account names on those sites. The new policy comes as Washington tries to improve its ability to spot and deny entry to individuals who have ties to terrorist groups like the Islamic State. But the government has faced a barrage of criticism since it first floated the idea last summer. The Internet Association, which represents companies including Facebook, Google and Twitter, at the time joined with consumer advocates to argue the draft policy threatened free expression and posed new privacy and security risks to foreigners. Now that it is final, those opponents are furious the Obama administration ignored their concerns. The question itself is included in what’s known as the Electronic System for Travel Authorization, a process that certain foreign travelers must complete to come to the United States. ESTA and a related paper form specifically apply to those arriving here through the visa-waiver program, which allows citizens of 38 countries to travel and stay in the United States for up to 90 days without a visa.”

“Yahoo has a creepy plan for advertising billboards to spy on you”

Yahoo has filed a patent for a type of smart billboard that would collect people’s information and use it to deliver targeted ad content in real-time.

To achieve that functionality, the billboards would use a variety of sensor systems, including cameras and proximity technology, to capture real-time audio, video and even biometric information about potential target audiences.

But the tech company doesn’t just want to know about a passing vehicle. It also wants to know who the occupants are inside of it.

That’s why Yahoo is prepared to cooperate with cell towers and telecommunications companies to learn as much as possible about each vehicle’s occupants.

“Various types of data (e.g., cell tower data, mobile app location data, image data, etc.) can be used to identify specific individuals in an audience in position to view advertising content. Similarly, vehicle navigation/tracking data from vehicles equipped with such systems could be used to identify specific vehicles and/or vehicle owners. Demographic data (e.g., as obtained from a marketing or user database) for the audience can thus be determined for the purpose of, for example, determining whether and/or the degree to which the demographic profile of the audience corresponds to a target demographic.”
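
In spirit, the audience-matching logic in the patent language reduces to: identify devices near the billboard, look up demographics, and score the overlap with an advertiser’s target profile. A simplified sketch follows; the names, fields, and data are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class Person:
    device_id: str
    age: int
    interests: set

# Hypothetical marketing/user database keyed by device identifier.
demographics_db = {
    "dev-1": Person("dev-1", 34, {"autos", "travel"}),
    "dev-2": Person("dev-2", 61, {"gardening"}),
    "dev-3": Person("dev-3", 29, {"autos"}),
}

def audience_match(nearby_devices: list, target_age: range, target_interest: str) -> float:
    """Fraction of the identified audience matching the advertiser's target demographic."""
    audience = [demographics_db[d] for d in nearby_devices if d in demographics_db]
    if not audience:
        return 0.0
    hits = [p for p in audience if p.age in target_age and target_interest in p.interests]
    return len(hits) / len(audience)

# Devices detected near the billboard via cell-tower / proximity data (assumed input).
score = audience_match(["dev-1", "dev-2", "dev-3"], range(25, 45), "autos")
print(score)  # ~0.67
```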

Steven Rambam at HOPE XI, 2016

“First came the assault on privacy. Name, address, telephone, DOB, SSN, physical description, friends, family, likes, dislikes, habits, hobbies, beliefs, religion, sexual orientation, finances, every granular detail of a person’s life, all logged, indexed, analyzed and cross-referenced.

Then came the gathering of location and communication data. Cell phones, apps, metro cards, license plate readers and toll tags, credit card use, IP addresses and authenticated logins, tower info, router proximity, networked “things” everywhere reporting on activity and location, astoundingly accurate facial recognition mated with analytics and “gigapixel” cameras and, worst of all, mindlessly self-contributed posts, tweets, and “check-ins,” all constantly reporting a subject’s location 24-7-365, to such a degree of accuracy that “predictive profiling” knows where you will likely be next Thursday afternoon.

Today we are experiencing constant efforts to shred anonymity. Forensic linguistics, browser fingerprinting, lifestyle and behavior analysis, metadata of all types, HTML5, IPv6, and daily emerging “advances” in surveillance technologies – some seemingly science fiction but real – are combining to make constant, mobile identification and absolute loss of anonymity inevitable.

And, now, predictably, the final efforts to homogenize: the “siloing” and Balkanization of the Internet. As Internet use becomes more and more self-restricted to a few large providers, as users increasingly never leave the single ecosystem of a Facebook or a Google, as the massive firehose of information on the Internet is “curated” and “managed” by persons who believe that they know best what news and opinions you should have available to read, see, and believe, the bias of a few will eventually determine what you believe. What is propaganda? What is truth? You simply won’t know.

In a tradition dating back to the first HOPE conference, for three full hours Steven Rambam will detail the latest trends in privacy invasion and will demonstrate cutting-edge anonymity-shredding surveillance technologies. Drones will fly, a “privacy victim” will undergo digital proctology, a Q&A period will be provided, and fun will be had by all.”

Is Facebook eavesdropping on your phone conversations?

Ruin My Search History

Still using Google? Sick of it harvesting all your personal data and profiling you? Ruin My Search History is a simple tool that can help throw Google off the scent: it runs randomly generated search queries in a small browser window, preventing Google from building an accurate picture of who you are. Decide just how random you want the searches to be using the slider and click ‘Ruin My Search History’.
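
The underlying idea is simple enough to sketch: fire off randomly generated searches so that any profile built from your history is diluted with noise. The word list and query count below are arbitrary stand-ins; the real tool runs its searches in a small browser window.

```python
import random
import webbrowser
from urllib.parse import quote_plus

WORDS = ["accordion", "tariffs", "orchids", "welding", "tide charts",
         "sourdough", "vintage modems", "beekeeping", "stoicism", "kayak repair"]

def ruin_search_history(num_queries: int = 5, randomness: int = 2) -> None:
    """Open `num_queries` searches, each combining `randomness` random words."""
    for _ in range(num_queries):
        query = " ".join(random.sample(WORDS, randomness))
        webbrowser.open(f"https://www.google.com/search?q={quote_plus(query)}")

ruin_search_history()
```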
