Resources

Facebook silently enables facial recognition abilities for users outside EU and Canada

Facebook is now informing users around the world that it’s rolling out facial recognition features. In December, we reported the features would be coming to the platform; that rollout now appears to have begun. Notably, users in the European Union and Canada will not be notified, because laws restrict this type of activity in those regions.

With the new tools, you’ll be able to find photos that you’re in but haven’t been tagged in; they’ll help you protect yourself against strangers using your photo; and Facebook will be able to tell people with visual impairments who’s in their photos and videos. Facebook notes that the feature is enabled by default but can be switched off at any time; additionally, the firm says it may add new capabilities at any time.

While Facebook may want its users to “feel confident” uploading pictures online, it will likely give many other users the heebie-jeebies when they think of the colossal database of faces that Facebook has and what it could do with all that data. Even non-users should be cautious about which photos they appear in if they don’t want to be caught up in Facebook’s web of data.


How Do You Vote? 50 Million Google Images Give a Clue

What vehicle is most strongly associated with Republican voting districts? Extended-cab pickup trucks. For Democratic districts? Sedans.

Those conclusions may not be particularly surprising. After all, market researchers and political analysts have studied such things for decades.

But what is surprising is how researchers working on an ambitious project based at Stanford University reached those conclusions: by analyzing 50 million images and location data from Google Street View, the street-scene feature of the online giant’s mapping service.

For the first time, helped by recent advances in artificial intelligence, researchers are able to analyze large quantities of images, pulling out data that can be sorted and mined to predict things like income, political leanings and buying habits. In the Stanford study, computers collected details about cars in the millions of images they processed, including makes and models.

Identifying so many car images in such detail was a technical feat. But it was linking that new data set to public collections of socioeconomic and environmental information, and then tweaking the software to spot patterns and correlations, that makes the Stanford project part of what computer scientists see as the broader application of image data.


Google and Facebook are watching our every move online

You may know that hidden trackers lurk on most websites you visit, soaking up your personal information. What you may not realize, though, is that 76 percent of websites now contain hidden Google trackers and 24 percent contain hidden Facebook trackers, according to the Princeton Web Transparency & Accountability Project. The next highest is Twitter, with 12 percent. It is likely that Google or Facebook is watching you on many of the sites you visit, in addition to tracking you when you use their products. As a result, these two companies have amassed huge data profiles on each person, which can include your interests, purchases, search, browsing and location history, and much more. They then make your sensitive data profile available for invasive targeted advertising that can follow you around the Internet.
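The kind of measurement behind these figures can be illustrated with a small sketch: crawl a page, collect the origins of embedded scripts, and count those served from third-party hosts. The HTML and domain below are invented examples, and the first-party check is a rough heuristic, not the Princeton project’s actual methodology.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Collect the hosts of externally loaded <script> tags on a page.
class ScriptCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.domains = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src", "")
            host = urlparse(src).netloc
            if host:  # relative paths (first-party scripts) have no netloc
                self.domains.add(host)

# Made-up page embedding two well-known tracker scripts plus a local one.
page = """<html><body>
<script src="https://www.google-analytics.com/analytics.js"></script>
<script src="https://connect.facebook.net/en_US/fbevents.js"></script>
<script src="/local/app.js"></script>
</body></html>"""

first_party = "example.com"
collector = ScriptCollector()
collector.feed(page)

# Crude heuristic: anything not under the page's own domain is third-party.
third_party = {d for d in collector.domains if not d.endswith(first_party)}
print(third_party)
```

Running a collector like this over a large sample of sites, then counting how often each third-party host appears, yields prevalence figures of the kind quoted above.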

So how do we move forward from here? Don’t be fooled by claims of self-regulation: any useful long-term reform of Google and Facebook’s data privacy practices fundamentally opposes their core business model of hyper-targeted advertising based on ever more intrusive personal surveillance. Change must come from the outside.

Unfortunately, we’ve seen relatively little from Washington. Congress and federal agencies need to take a fresh look at what can be done to curb these data monopolies. They first need to demand more algorithmic and privacy-policy transparency, so people can truly understand how their personal information is being collected, processed and used by these companies. Only then can informed consent be possible. They also need to legislate that people own their own data, enabling real opt-outs. Finally, they need to restrict how data can be combined, including being more aggressive in blocking acquisitions that further consolidate data power, which would pave the way for more competition in digital advertising. Until we see such meaningful changes, consumers should vote with their feet.


How Facebook’s Political Unit Enables the Dark Art of Digital Propaganda

Under fire for Facebook Inc.’s role as a platform for political propaganda, co-founder Mark Zuckerberg has punched back, saying his mission is above partisanship. “We hope to give all people a voice and create a platform for all ideas,” Zuckerberg wrote in September after President Donald Trump accused Facebook of bias. Zuckerberg’s social network is a politically agnostic tool for its more than 2 billion users, he has said. But Facebook, it turns out, is no bystander in global politics. What he hasn’t said is that his company actively works with political parties and leaders including those who use the platform to stifle opposition — sometimes with the aid of “troll armies” that spread misinformation and extremist ideologies.

The initiative is run by a little-known Facebook global government and politics team that’s neutral in that it works with nearly anyone seeking or securing power. The unit is led from Washington by Katie Harbath, a former Republican digital strategist who worked on former New York Mayor Rudy Giuliani’s 2008 presidential campaign. Since Facebook hired Harbath three years later, her team has traveled the globe helping political clients use the company’s powerful digital tools. In some of the world’s biggest democracies — from India and Brazil to Germany and the U.K. — the unit’s employees have become de facto campaign workers. And once a candidate is elected, the company in some instances goes on to train government employees or provide technical assistance for live streams at official state events.


How Facebook Figures Out Everyone You’ve Ever Met

From Slashdot:

“I deleted Facebook after it recommended as People You May Know a man who was defense counsel on one of my cases. We had only communicated through my work email, which is not connected to my Facebook, which convinced me Facebook was scanning my work email,” an attorney told Gizmodo. Kashmir Hill, a reporter at the news outlet, who recently documented how Facebook figured out a connection between her and a family member she did not know existed, shares several more instances others have reported and explains how Facebook gathers information. She reports:

Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections. Because shadow-profile connections happen inside Facebook’s algorithmic black box, people can’t see how deep the data-mining of their lives truly is, until an uncanny recommendation pops up.

Facebook isn’t scanning the work email of the attorney above. But it likely has her work email address on file, even if she never gave it to Facebook herself. If anyone who has the lawyer’s address in their contacts has chosen to share it with Facebook, the company can link her to anyone else who has it, such as the defense counsel in one of her cases.

Facebook will not confirm how it makes specific People You May Know connections, and a Facebook spokesperson suggested that there could be other plausible explanations for most of those examples — “mutual friendships,” or people being “in the same city/network.” The spokesperson did say that of the stories on the list, the lawyer was the likeliest case for a shadow-profile connection. Handing over address books is one of the first steps Facebook asks people to take when they initially sign up, so that they can “Find Friends.”
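The linking mechanism Hill describes can be sketched as a toy model (all names and addresses below are invented; this is an illustration of the principle, not Facebook’s actual implementation): once any user uploads an address book containing your email, the network can connect you to every other person whose uploaded contacts include that same address.

```python
from collections import defaultdict

# Toy shadow-profile model: users upload address books, and people who
# co-occur in the same uploaded book become candidate connections, even
# if they never gave the network their contact details themselves.
address_books = {
    "alice": ["lawyer@firmA.com", "defense@firmB.com"],
    "bob":   ["lawyer@firmA.com", "cousin@example.com"],
}

# Invert the uploads: which uploaders hold each contact address?
holders = defaultdict(set)
for uploader, contacts in address_books.items():
    for contact in contacts:
        holders[contact].add(uploader)

def candidate_links(contact):
    """Everyone connected to `contact` purely through others' uploads."""
    linked = set()
    for uploader in holders[contact]:
        linked.add(uploader)
        linked.update(address_books[uploader])  # co-entries in the same book
    linked.discard(contact)
    return linked

# The lawyer never gave her work email to the network, yet she is linked
# to the defense counsel because both appear in Alice's uploaded contacts.
print(candidate_links("lawyer@firmA.com"))
```

The point of the sketch is that the person being linked contributes nothing: the graph is assembled entirely from other people’s uploads.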

The problem with all this, Hill writes, is that Facebook doesn’t explicitly disclose the scale at which it uses the contact information it gleans from a user’s address book. Furthermore, most people are not aware that Facebook is using contact information taken from their phones for these purposes.


The Video Game That Could Shape the Future of War

“As far as video games go, Operation Overmatch is rather unremarkable. Players command military vehicles in eight-on-eight matches against the backdrop of rendered cityscapes — a common setup of games that sometimes have the added advantage of hundreds of millions of dollars in development budgets. Overmatch does have something unique, though: its mission. The game’s developers believe it will change how the U.S. Army fights wars. Overmatch’s players are nearly all soldiers in real life. As they develop tactics around futuristic weapons and use them in digital battle against peers, the game monitors their actions.

Each shot fired and decision made, in addition to messages the players write in private forums, is a bit of information soaked up with a frequency not found in actual combat, or even in high-powered simulations without a wide network of players. The data is logged, sorted, and then analyzed, using insights from sports and commercial video games. Overmatch’s team hopes this data will inform the Army’s decisions about which technologies to purchase and how to develop tactics using them, all with the aim of building a more forward-thinking, prepared force… While the game currently has about 1,000 players recruited by word of mouth and outreach from the Overmatch team, the developers eventually want to involve tens of thousands of soldiers. This milestone would allow for millions of hours of game play per year, according to project estimates, enough to generate rigorous data sets and test hypotheses.”

Brian Vogt, a lieutenant colonel in the Army Capabilities Integration Center who oversees Overmatch’s development, says:

“Right after World War I, we had technologies like aircraft carriers we knew were going to play an important role,” he said. “We just didn’t know how to use them. That’s where we are and what we’re trying to do for robots.”


Facebook has mapped populations in 23 countries as it explores satellites to expand internet

“Facebook doesn’t only know what its 2 billion users “Like.” It now knows where millions of humans live, everywhere on Earth, to within 15 feet. The company has created a data map of the human population by combining government census numbers with information it’s obtained from space satellites, according to Janna Lewis, Facebook’s head of strategic innovation partnerships and sourcing. A Facebook representative later told CNBC that this map currently covers 23 countries, up from 20 countries mentioned in this blog post from February 2016.

The mapping technology, which Facebook says it developed itself, can pinpoint any man-made structures in any country on Earth to a resolution of five meters. Facebook is using the data to understand the precise distribution of humans around the planet. That will help the company determine what types of internet service — based either on land, in the air or in space — it can use to reach consumers who now have no (or very low quality) internet connections.”


“Are you happy now? The uncertain future of emotion analytics”

Elise Thomas writes at Hopes & Fears:

“Right now, in a handful of computing labs scattered across the world, new software is being developed which has the potential to completely change our relationship with technology. Affective computing is about creating technology which recognizes and responds to your emotions. Using webcams, microphones or biometric sensors, the software uses a person’s physical reactions to analyze their emotional state, generating data which can then be used to monitor, mimic or manipulate that person’s emotions.”

Corporations spend billions each year trying to build “authentic” emotional connections to their target audiences. Marketing research is one of the most prolific research fields around, conducting thousands of studies on how to more effectively manipulate consumers’ decision-making. Advertisers are extremely interested in affective computing and particularly in a branch known as emotion analytics, which offers unprecedented real-time access to consumers’ emotional reactions and the ability to program alternative responses depending on how the content is being received.

For example, if two people watch an advertisement with a joke and only one person laughs, the software can be programmed to show more of the same kind of advertising to the person who laughed, while trying different sorts of advertising on the person who did not laugh to see if they are more effective. In essence, affective computing could enable advertisers to create individually tailored advertising en masse.
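The reaction-driven loop described above reduces to a very simple control policy, sketched here with invented creative names and a made-up reaction signal (real emotion-analytics systems infer the reaction from webcam or biometric data; that inference is omitted entirely):

```python
import random

# Minimal policy: keep showing a creative that earned a positive reaction,
# rotate to an alternative when it did not.
creatives = ["joke_ad", "sentimental_ad", "celebrity_ad"]

def next_creative(current, reacted_positively):
    if reacted_positively:
        return current                      # more of the same kind
    alternatives = [c for c in creatives if c != current]
    return random.choice(alternatives)      # try something different

shown = "joke_ad"
shown = next_creative(shown, reacted_positively=True)   # viewer laughed
assert shown == "joke_ad"
shown = next_creative(shown, reacted_positively=False)  # no reaction
assert shown != "joke_ad"
```

Even this trivial version shows how the feedback loop individualises: two viewers who start from the same advertisement diverge immediately based on nothing but their measured reactions.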

“Say 15 years from now a particular brand of weight loss supplements obtains a particular girl’s information and locks on. When she scrolls through her Facebook, she sees pictures of rail-thin celebrities, carefully calibrated to capture her attention. When she turns on the TV, it automatically starts on an episode of “The Biggest Loser,” tracking her facial expressions to find the optimal moment for a supplement commercial. When she sets her music on shuffle, it “randomly” plays through a selection of the songs which make her sad. This goes on for weeks.

Now let’s add another layer. This girl is 14, and struggling with depression. She’s being bullied in school. Having become the target of a deliberate and persistent campaign by her technology to undermine her body image and sense of self-worth, she’s at risk of making some drastic choices.”


Facebook built an AI system that learned to lie to get what it wants

“Facebook researchers used a game to help the bot learn how to haggle over books, hats, and basketballs. Each object had a point value, and the objects had to be split between the two bot negotiators via text. From the human conversations (gathered via Amazon Mechanical Turk), and from testing its skills against itself, the AI system learned not only how to state its demands, but negotiation tactics as well — specifically, lying. Instead of outright saying what it wanted, sometimes the AI would feign interest in a worthless object, only to later concede it for something that it really wanted. Facebook isn’t sure whether it learned this from the human hagglers or stumbled upon the trick accidentally, but either way, when the tactic worked, it was rewarded.

It’s no surprise that Facebook is working on ways to improve how its bots can interact with others — the company is highly invested in building bots that can negotiate on behalf of users and businesses for its Messenger platform, where it envisions the future of customer service.”
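The game setup can be sketched as a toy value split (the point values and the “feint” rule below are invented for illustration; Facebook’s bots learned such tactics from data rather than having them hand-coded):

```python
# Toy negotiation: books, hats and balls must be split, and each agent has
# private point values. The "feint" tactic demands a worthless item first,
# purely so it can be conceded later in exchange for something valuable.
items = {"book": 3, "hat": 1, "ball": 2}   # this agent's private values

def feint_demand(values):
    """Open by demanding the item the agent cares about LEAST."""
    return min(values, key=values.get)

def concession_trade(values, feint_item):
    """Concede the feint in exchange for the most valuable item."""
    target = max(values, key=values.get)
    return {"give": feint_item, "want": target}

opening = feint_demand(items)            # demand "hat", worth only 1 point
deal = concession_trade(items, opening)  # later trade "hat" for "book"
assert deal == {"give": "hat", "want": "book"}
```

In the actual system no such rule exists in the code; the behaviour emerged because deals reached after a feint scored more points, and higher-scoring outcomes were rewarded during training.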


What Makes You Click (2016)

“The biggest psychological experiment ever is being conducted, and we’re all taking part in it: every day, a billion people are tested online. Which ingenious tricks and other digital laws ensure that we fill our online shopping carts to the brim, or stay on websites as long as possible? Or vote for a particular candidate?

The bankruptcies of department stores and shoe shops clearly show that our buying behaviour is rapidly shifting to the Internet. An entirely new field has arisen, of ‘user experience’ architects and ‘online persuasion officers’. How do these digital data dealers use, manipulate and abuse our user experience? Not just when it comes to buying things, but also with regards to our free time and political preferences.

Aren’t companies, which are running millions of tests at a time, miles ahead of science and government, in this respect? Now the creators of these digital seduction techniques, former Google employees among them, are themselves arguing for the introduction of an ethical code. What does it mean, when the conductors of experiments themselves are asking for their power and possibilities to be restricted?”


The data analytics company Cambridge Analytica

The Guardian is running an article about a ‘mysterious’ big-data analytics company called Cambridge Analytica and its activities with SCL Group—a 25-year-old military psyops company in the UK later bought by “secretive hedge fund billionaire” Robert Mercer. In the article, a former employee calls it “this dark, dystopian data company that gave the world Trump.”

Mercer, who has a background in computer science, is alleged to be at the centre of a multimillion-dollar propaganda network.

“Facebook was the source of the psychological insights that enabled Cambridge Analytica to target individuals. It was also the mechanism that enabled them to be delivered on a large scale. The company also (perfectly legally) bought consumer datasets — on everything from magazine subscriptions to airline travel — and uniquely it appended these with the psych data to voter files… Finding “persuadable” voters is key for any campaign and with its treasure trove of data, Cambridge Analytica could target people high in neuroticism, for example, with images of immigrants “swamping” the country.

The key is finding emotional triggers for each individual voter. Cambridge Analytica worked on campaigns in several key states for a Republican political action committee. Its key objective, according to a memo the Observer has seen, was “voter disengagement” and “to persuade Democrat voters to stay at home”… In the U.S., the government is bound by strict laws about what data it can collect on individuals. But, for private companies anything goes.”


Facebook: Cracking the Code (2017)

“What’s on your mind?” It’s the friendly Facebook question which lets you share how you’re feeling. It’s also the question that unlocks the details of your life and helps turn your thoughts into profits.

Facebook has the ability to track much of your browsing history, even when you’re not logged on, and even if you aren’t a member of the social network at all. This is one of the methods used to deliver targeted advertising and ‘news’ to your Facebook feed. This is why you are unlikely to see anything that challenges your world view.

This feedback loop is fuelling the rise and power of ‘fake news’. “We’re seeing news that’s tailored ever more tightly towards those kinds of things that people will click on, and will share, rather than things that perhaps are necessarily good for them”, says one Media Analyst.

This information grants huge power to those with access to it. Republican Party strategist Patrick Ruffini says, “What it does give us is a much greater level of certainty and granularity and precision, down to the individual voter, down to the individual precinct, about how things are going to go”. As a result, former Facebook journalist Adam Schrader thinks that there’s “a legitimate argument to this, that Facebook influenced the United States election results”.


Japan researchers warn of fingerprint theft from ‘peace’ sign, selfies

“Could flashing the “peace” sign in photos lead to fingerprint data being stolen? Research by a team at Japan’s National Institute of Informatics (NII) says so, raising alarm bells over the popular two-fingered pose. Fingerprint recognition technology is becoming widely available to verify identities, such as when logging on to smartphones, tablets and laptop computers. But the proliferation of mobile devices with high-quality cameras and social media sites where photographs can be easily posted is raising the risk of personal information being leaked, reports said. The NII researchers were able to copy fingerprints based on photos taken by a digital camera three meters (nine feet) away from the subject.”


Facebook buys data from third-party brokers to fill in user profiles

“It comes as no surprise to any Facebook user that the social network gathers a considerable amount of information based on their actions and interests. But according to a report from ProPublica, the world’s largest social network knows far more about its users than just what they do online.

What Facebook can’t glean from a user’s activity, it’s getting from third-party data brokers. ProPublica found the social network is purchasing additional information including personal income, where a person eats out and how many credit cards they keep.

That data all comes separate from the unique identifiers that Facebook generates for its users based on interests and online behavior. A separate investigation by ProPublica in which the publication asked users to report categories of interest Facebook assigned to them generated more than 52,000 attributes.

The data Facebook pays for from other brokers to round out user profiles isn’t disclosed by the company beyond a note that it gets information “from a few different sources.” Those sources, according to ProPublica, come from commercial data brokers who have access to information about people that isn’t linked directly to online behavior.”

From ProPublica:

“When asked this week about the lack of disclosure, Facebook responded that it doesn’t tell users about the third-party data because it’s widely available and was not collected by Facebook.

Facebook has been working with data brokers since 2012, when it signed a deal with Datalogix. This prompted Jeff Chester, privacy advocate at the Center for Digital Democracy, to file a complaint with the Federal Trade Commission alleging that Facebook had violated a consent decree with the agency on privacy issues. The FTC has never publicly responded to that complaint, and Facebook subsequently signed deals with five other data brokers.

Oracle’s Datalogix provides about 350 types of data to Facebook.”


“Creepy new website makes its monitoring of your online behaviour visible”

“If you think you are not being analysed while browsing websites, it could be time to reconsider. A creepy new website called clickclickclick has been developed to demonstrate how our online behaviour is continuously measured.

The site, which observes and comments on your behaviour in detail and is not harmful to your computer, contains nothing but a white screen and a large green button. From the minute you visit the website, it begins detailing your actions on the screen in real time.

The site also encourages users to turn on their audio, which offers the even more disturbing experience of an English voice commenting on your behaviour.

Designer Roel Wouters said the experiment was aimed at reminding people about the serious themes of big data and privacy. “It seemed fun to thematise this in a simple and lighthearted way,” he said.

Fellow designer Luna Maurer said her own experiences with the internet had helped shape the project. “I am actually quite internet-aware, but I am still very often surprised that after I watch something on a website, a second later I get instantly personalised ads,” she said.”


Machine Logic: Our lives are ruled by big tech’s decisions by data

The Guardian’s Julia Powles writes about how, with the advent of artificial intelligence and so-called “machine learning,” society is increasingly a world where decisions are shaped more by calculations and data analytics than by traditional human judgement:

“Jose van Dijck, president of the Dutch Royal Academy and the conference’s keynote speaker, expands: Datification is the core logic of what she calls “the platform society,” in which companies bypass traditional institutions, norms and codes by promising something better and more efficient — appealing deceptively to public values, while obscuring private gain.

Van Dijck and peers have nascent, urgent ideas. They commence with a pressing agenda for strong interdisciplinary research — something Kate Crawford is spearheading at Microsoft Research, as are many other institutions, including the new Leverhulme Centre for the Future of Intelligence. There’s the old theory to confront, that this is a conscious move on the part of consumers and, if so, there’s always a theoretical opt-out. Yet even digital activists plot by Gmail, concedes Fieke Jansen of the Berlin-based advocacy organisation Tactical Tech. The Big Five tech companies, as well as the extremely concentrated sources of finance behind them, are at the vanguard of “a society of centralized power and wealth.” “How did we let it get this far?” she asks.

Crawford says there are very practical reasons why tech companies have become so powerful. “We’re trying to put so much responsibility on to individuals to step away from the ‘evil platforms,’ whereas in reality, there are so many reasons why people can’t. The opportunity costs to employment, to their friends, to their families, are so high,” she says.”


CIA’s “Siren Servers” can predict social uprisings several days before they happen

“The CIA claims to be able to predict social unrest days before it happens thanks to powerful super computers dubbed Siren Servers by the father of Virtual Reality, Jaron Lanier.

CIA Deputy Director for Digital Innovation Andrew Hallman announced that the agency has beefed-up its “anticipatory intelligence” through the use of deep learning and machine learning servers that can process an incredible amount of data.

“We have, in some instances, been able to improve our forecast to the point of being able to anticipate the development of social unrest and societal instability some I think as near as three to five days out,” said Hallman on Tuesday at the Federal Tech event, Fedstival.

This Minority Report-type technology has been viewed skeptically by policymakers, as the data crunching hasn’t been perfected, and if policy were to be enacted based on faulty data, the results could be disastrous. Iraq WMDs?”

“I called it a siren server because there’s no plan to be evil. A siren server seduces you,” said Lanier.

In the case of the CIA, however, whether the agency is being innocently seduced or is actively planning to use this data for its own self-sustaining benefit, one can only speculate.

Given the Intelligence Community’s track record for toppling governments, infiltrating the mainstream media, MK Ultra, and scanning hundreds of millions of private emails, that speculation becomes easier to justify.”


Steven Rambam at HOPE XI, 2016

“First came the assault on privacy. Name, address, telephone, DOB, SSN, physical description, friends, family, likes, dislikes, habits, hobbies, beliefs, religion, sexual orientation, finances, every granular detail of a person’s life, all logged, indexed, analyzed and cross-referenced.

Then came the gathering of location and communication data. Cell phones, apps, metro cards, license plate readers and toll tags, credit card use, IP addresses and authenticated logins, tower info, router proximity, networked “things” everywhere reporting on activity and location, astoundingly accurate facial recognition mated with analytics and “gigapixel” cameras and, worst of all, mindlessly self-contributed posts, tweets, and “check-ins,” all constantly reporting a subject’s location 24-7-365, to such a degree of accuracy that “predictive profiling” knows where you will likely be next Thursday afternoon.

Today we are experiencing constant efforts to shred anonymity. Forensic linguistics, browser fingerprinting, lifestyle and behavior analysis, metadata of all types, HTML5, IPv6, and daily emerging “advances” in surveillance technologies – some seemingly science fiction but real – are combining to make constant, mobile identification and absolute loss of anonymity inevitable.

And, now, predictably, the final efforts to homogenize: the “siloing” and Balkanization of the Internet. As Internet use becomes more and more self-restricted to a few large providers, as users increasingly never leave the single ecosystem of a Facebook or a Google, as the massive firehose of information on the Internet is “curated” and “managed” by persons who believe that they know best what news and opinions you should have available to read, see, and believe, the bias of a few will eventually determine what you believe. What is propaganda? What is truth? You simply won’t know.

In a tradition dating back to the first HOPE conference, for three full hours Steven Rambam will detail the latest trends in privacy invasion and will demonstrate cutting-edge anonymity-shredding surveillance technologies. Drones will fly, a “privacy victim” will undergo digital proctology, a Q&A period will be provided, and fun will be had by all.”
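Browser fingerprinting, one of the anonymity-shredding techniques Rambam names, works by combining many individually weak signals into a stable identifier. A minimal sketch (the attribute values are invented examples; real fingerprinting libraries gather far more signals, such as canvas rendering and installed plugins):

```python
import hashlib

# Invented example signals a site can read without cookies or logins.
signals = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080x24",
    "timezone": "UTC+10",
    "fonts": "Arial,Helvetica,Noto Sans",
    "language": "en-AU",
}

def fingerprint(attrs):
    """Order-independent hash of all attribute key/value pairs."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

fp = fingerprint(signals)
# The same browser configuration yields the same identifier on every site
# that computes it, so the user can be recognised with no cookies at all.
assert fp == fingerprint(dict(signals))
```

No single attribute identifies anyone, but the combination is often unique enough to follow one browser across unrelated sites, which is why it resists the usual cookie-clearing defences.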


Face recognition app taking Russia by storm may bring end to public anonymity

“If the founders of a new face recognition app get their way, anonymity in public could soon be a thing of the past. FindFace, launched two months ago and currently taking Russia by storm, allows users to photograph people in a crowd and work out their identities, with 70% reliability.

It works by comparing photographs to profile pictures on Vkontakte, a social network popular in Russia and the former Soviet Union, with more than 200 million accounts. In future, the designers imagine a world where people walking past you on the street could find your social network profile by sneaking a photograph of you, and shops, advertisers and the police could pick your face out of crowds and track you down via social networks.”

Founder Kabakov says the app could revolutionise dating: “If you see someone you like, you can photograph them, find their identity, and then send them a friend request.” The interaction doesn’t always have to involve the rather creepy opening gambit of clandestine street photography, he added: “It also looks for similar people. So you could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages.”


Welcome to the age of the chatbot. Soon you’ll be lonelier than ever.

“Very soon – by the end of the year, probably – you won’t need to be on Facebook in order to talk to your friends on Facebook.

Your Facebook avatar will dutifully wish people happy birthday, congratulate them on the new job, accept invitations, and send them jolly texts punctuated by your favourite emojis – all while you’re asleep, or shopping, or undergoing major surgery.

Using IBM’s powerful Watson natural language processing platform, The Chat Bot Club learns to imitate its user. It learns texting styles, favourite phrases, preferred emojis, repeated opinions – and then it learns to respond in kind, across an ever-broadening range of subjects.”

“Humans aren’t perfect, and AI is a bit the same way,” he said. “AI is not significantly smarter than the people who program it. So AI is always going to encounter circumstances that it was not prepared for.”
