Resources

Facebook Really Wants You to Come Back

The social network is getting aggressive with people who don’t log in often, working to keep up its engagement numbers.

It’s been about a year since Rishi Gorantala deleted the Facebook app from his phone, and the company has only gotten more aggressive in its emails to win him back. The social network started out by alerting him every few days about friends who had posted photos or made comments—each time inviting him to click a link and view the activity on Facebook. He rarely did.

Then, about once a week in September, he started to get prompts from a Facebook security customer-service address. “It looks like you’re having trouble logging into Facebook,” the emails would say. “Just click the button below and we’ll log you in. If you weren’t trying to log in, let us know.” He wasn’t trying. But he doesn’t think anybody else was, either.

“The content of mail they send is essentially trying to trick you,” said Gorantala, 35, who lives in Chile. “Like someone tried to access my account so I should go and log in.”

How Facebook’s Political Unit Enables the Dark Art of Digital Propaganda

Under fire for Facebook Inc.’s role as a platform for political propaganda, co-founder Mark Zuckerberg has punched back, saying his mission is above partisanship. “We hope to give all people a voice and create a platform for all ideas,” Zuckerberg wrote in September after President Donald Trump accused Facebook of bias. Zuckerberg has said his social network is a politically agnostic tool for its more than 2 billion users. But Facebook, it turns out, is no bystander in global politics. What he hasn’t said is that his company actively works with political parties and leaders, including those who use the platform to stifle opposition—sometimes with the aid of “troll armies” that spread misinformation and extremist ideologies.

The initiative is run by a little-known Facebook global government and politics team that’s neutral in that it works with nearly anyone seeking or securing power. The unit is led from Washington by Katie Harbath, a former Republican digital strategist who worked on former New York Mayor Rudy Giuliani’s 2008 presidential campaign. Since Facebook hired Harbath three years later, her team has traveled the globe helping political clients use the company’s powerful digital tools. In some of the world’s biggest democracies — from India and Brazil to Germany and the U.K. — the unit’s employees have become de facto campaign workers. And once a candidate is elected, the company in some instances goes on to train government employees or provide technical assistance for live streams at official state events.

“Are you happy now? The uncertain future of emotion analytics”

Elise Thomas writes at Hopes & Fears:

“Right now, in a handful of computing labs scattered across the world, new software is being developed which has the potential to completely change our relationship with technology. Affective computing is about creating technology which recognizes and responds to your emotions. Using webcams, microphones or biometric sensors, the software uses a person’s physical reactions to analyze their emotional state, generating data which can then be used to monitor, mimic or manipulate that person’s emotions.”

Corporations spend billions each year trying to build “authentic” emotional connections to their target audiences. Marketing research is one of the most prolific research fields around, conducting thousands of studies on how to more effectively manipulate consumers’ decision-making. Advertisers are extremely interested in affective computing and particularly in a branch known as emotion analytics, which offers unprecedented real-time access to consumers’ emotional reactions and the ability to program alternative responses depending on how the content is being received.

For example, if two people watch an advertisement with a joke and only one person laughs, the software can be programmed to show more of the same kind of advertising to the person who laughed, while trying different sorts of advertising on the person who did not laugh to see if they are more effective. In essence, affective computing could enable advertisers to create individually tailored advertising en masse.
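The loop described above can be sketched as a simple per-viewer reinforcement scheme. This is a hypothetical illustration only: the class and method names (`AdaptiveAdServer`, `record_reaction`) and the weighting factors are invented for this sketch, not any real ad platform’s API.

```python
# Hypothetical sketch of the adaptive loop described above: an ad server
# that shifts toward whichever creative style drew a positive reaction.
import random
from collections import defaultdict

class AdaptiveAdServer:
    def __init__(self, styles):
        # Every ad style starts with an equal weight for each viewer.
        self.styles = styles
        self.weights = defaultdict(lambda: {s: 1.0 for s in styles})

    def pick_style(self, viewer_id):
        # Sample a style in proportion to its learned weight.
        w = self.weights[viewer_id]
        r = random.uniform(0, sum(w.values()))
        for style, weight in w.items():
            r -= weight
            if r <= 0:
                return style
        return self.styles[-1]

    def record_reaction(self, viewer_id, style, reacted_positively):
        # Reinforce styles that got a smile; decay ones that didn't.
        factor = 1.5 if reacted_positively else 0.7
        self.weights[viewer_id][style] *= factor

server = AdaptiveAdServer(["humorous", "sentimental", "factual"])
style = server.pick_style("viewer-1")
server.record_reaction("viewer-1", "humorous", reacted_positively=True)
```

After a few reactions, the humorous viewer sees mostly jokes while the stone-faced viewer is cycled through alternatives, which is exactly the individually tailored advertising the excerpt warns about.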

“Say 15 years from now a particular brand of weight loss supplements obtains a particular girl’s information and locks on. When she scrolls through her Facebook, she sees pictures of rail-thin celebrities, carefully calibrated to capture her attention. When she turns on the TV, it automatically starts on an episode of “The Biggest Loser,” tracking her facial expressions to find the optimal moment for a supplement commercial. When she sets her music on shuffle, it “randomly” plays through a selection of the songs which make her sad. This goes on for weeks.

Now let’s add another layer. This girl is 14, and struggling with depression. She’s being bullied in school. Having become the target of a deliberate and persistent campaign by her technology to undermine her body image and sense of self-worth, she’s at risk of making some drastic choices.”

Google forming ‘smart cities’

“An ambitious project to blanket New York and London with ultrafast Wi-Fi via so-called ‘smart kiosks,’ which will replace obsolete public telephones, is the work of a Google-backed startup.

Each kiosk is around nine feet high and relatively flat. Each flat side houses a big-screen display that pays for the whole operation with advertising.

Each kiosk provides free, high-speed Wi-Fi for anyone in range. By selecting the Wi-Fi network at one kiosk, and authenticating with an email address, each user will be automatically connected to every other LinkNYC kiosk they get within range of. Eventually, anyone will be able to walk around most of the city without losing the connection to these hotspots.

Wide-angle cameras on each side of the kiosks point up and down the street and sidewalk, approximating a 360-degree view. If a city wants to use those cameras and sensors for surveillance, it can.

Over the next 15 years, the city will go through two further phases, in which sensor data will be processed by artificial intelligence to gain unprecedented insights about traffic, the environment and human behavior, and eventually used to intelligently redirect traffic and shape other city functions.”

Stare Into The Lights My Pretties

The data analytics company Cambridge Analytica

The Guardian is running an article about a ‘mysterious’ big-data analytics company called Cambridge Analytica and its activities with SCL Group—a 25-year-old military psyops company in the UK later bought by “secretive hedge fund billionaire” Robert Mercer. In the article, a former employee calls it “this dark, dystopian data company that gave the world Trump.”

Mercer, with a background in computer science, is alleged to be at the centre of a multimillion-dollar propaganda network.

“Facebook was the source of the psychological insights that enabled Cambridge Analytica to target individuals. It was also the mechanism that enabled them to be delivered on a large scale. The company also (perfectly legally) bought consumer datasets — on everything from magazine subscriptions to airline travel — and uniquely it appended these with the psych data to voter files… Finding “persuadable” voters is key for any campaign and with its treasure trove of data, Cambridge Analytica could target people high in neuroticism, for example, with images of immigrants “swamping” the country.

The key is finding emotional triggers for each individual voter. Cambridge Analytica worked on campaigns in several key states for a Republican political action committee. Its key objective, according to a memo the Observer has seen, was “voter disengagement” and “to persuade Democrat voters to stay at home”… In the U.S., the government is bound by strict laws about what data it can collect on individuals. But, for private companies anything goes.”
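The data-appending step described above amounts to a join between a voter file and purchased consumer or psychometric datasets, followed by a filter on a personality trait. The sketch below is purely illustrative: the field names, trait scores and the 0.7 threshold are invented assumptions, not Cambridge Analytica’s actual schema or model.

```python
# Hypothetical sketch: append psychographic scores to a voter file,
# then select "persuadable" voters high in a chosen trait.
voter_file = [
    {"voter_id": 1, "name": "A", "turnout_score": 0.4},
    {"voter_id": 2, "name": "B", "turnout_score": 0.9},
]
psych_scores = {
    1: {"neuroticism": 0.85, "openness": 0.3},
    2: {"neuroticism": 0.20, "openness": 0.7},
}

def append_psych_data(voters, scores):
    # Merge the two datasets on voter_id, as a data broker might.
    return [{**v, **scores.get(v["voter_id"], {})} for v in voters]

def find_persuadables(voters, trait, threshold=0.7):
    # Select voters scoring high on the chosen personality trait.
    return [v for v in voters if v.get(trait, 0) >= threshold]

enriched = append_psych_data(voter_file, psych_scores)
targets = find_persuadables(enriched, "neuroticism")
# The selected voters would then be shown emotionally charged creative,
# such as the "swamping" immigration imagery the excerpt describes.
```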

Facebook: Cracking the Code (2017)

“What’s on your mind?” It’s the friendly Facebook question which lets you share how you’re feeling. It’s also the question that unlocks the details of your life and helps turn your thoughts into profits.

Facebook has the ability to track much of your browsing history, even when you’re not logged on, and even if you aren’t a member of the social network at all. This is one of the methods used to deliver targeted advertising and ‘news’ to your Facebook feed. This is why you are unlikely to see anything that challenges your world view.
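The logged-out tracking described above generally works through third-party content, such as a “like” button or a 1x1 pixel, embedded in ordinary web pages: each browser request back to the tracker carries both the page URL (the Referer header) and the tracker’s own cookie, linking visits across unrelated sites. Below is a minimal, hypothetical sketch of the server side; it is not Facebook’s actual code.

```python
# Illustrative sketch of cross-site tracking via an embedded pixel.
from urllib.parse import urlparse

class TrackerLog:
    def __init__(self):
        self.visits = {}  # cookie_id -> list of sites visited

    def handle_pixel_request(self, cookie_id, referer):
        # Every page embedding the pixel triggers one of these requests.
        site = urlparse(referer).netloc
        self.visits.setdefault(cookie_id, []).append(site)

    def browsing_history(self, cookie_id):
        return self.visits.get(cookie_id, [])

log = TrackerLog()
log.handle_pixel_request("cookie-abc", "https://news.example.com/article")
log.handle_pixel_request("cookie-abc", "https://shop.example.org/cart")
# The tracker now holds a cross-site history for one browser,
# whether or not that person ever logged in or has an account.
```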

This feedback loop is fuelling the rise and power of ‘fake news’. “We’re seeing news that’s tailored ever more tightly towards those kinds of things that people will click on, and will share, rather than things that perhaps are necessarily good for them”, says one Media Analyst.

This information grants huge power to those with access to it. Republican Party strategist Patrick Ruffini says, “What it does give us is a much greater level of certainty and granularity and precision, down to the individual voter, down to the individual precinct, about how things are going to go.” As a result, former Facebook journalist Adam Schrader thinks that there’s “a legitimate argument to this that Facebook influenced the election, the United States election results.”


Children as young as 13 are attending ‘smartphone rehab’

Children refusing to put down their phones is a common flashpoint in many homes, with a third of British children aged 12 to 15 admitting they do not have a good balance between screen time and other activities.

But in the US, the problem has become so severe for some families that children as young as 13 are being treated for digital technology addiction.

One ‘smartphone rehab’ centre near Seattle has started offering residential “intensive recovery programs” for teenagers who have trouble controlling their use of electronic devices.

The Restart Life Centre says parents have been asking it to offer courses of treatment to their children for more than eight years.

Hilarie Cash, the Centre’s founder, told Sky News smartphones, tablets and other mobile devices can be so stimulating and entertaining that they “override all those natural instincts that children actually have for movement and exploration and social interaction”.

Child psychotherapist Julie Lynn Evans, who has worked with hospitals, schools and families for 25 years, said her workload has significantly increased since the use of smartphones became widespread among young people.

“It’s a simplistic view, but I think it is the ubiquity of broadband and smartphones that has changed the pace and the power and the drama of mental illness in young people,” she told The Telegraph.

A ComRes poll of more than 1,000 parents of children aged under 18, published in September 2015, found 47 per cent of parents said they thought their children spent too much time in front of screens, with 43 per cent saying this amounts to an emotional dependency.


The Internet of Things is a surveillance nightmare

… or a dream come true for those in power. And those in power are the same entities pushing IoT technologies.

A little background reading about JTRIG from the Snowden documents is helpful. JTRIG is the modern-day equivalent of the Zersetzung—the special unit of the Stasi that was used to attack, repress and sabotage political opponents. That power is greatly expanded in a society driven by the IoT.

Full article from Daily Dot:

“In 2014, security guru Bruce Schneier said, “Surveillance is the business model of the Internet. We build systems that spy on people in exchange for services. Corporations call it marketing.” The abstract and novel nature of these services tends to obscure our true relationship to companies like Facebook or Google. As the old saying goes, if you don’t pay for a product, you are the product.

But what happens when the Internet stops being just “that fiddly thing with a mouse” and becomes “the real world”? Surveillance becomes the business model of everything, as more and more companies look to turn the world into a collection of data points.

If we truly understood the bargain we were making when we give up our data for free or discounted services, would we still sign on the dotted line (or agree to the Terms and Conditions)? Would we still accept constant monitoring of our driving habits in exchange for potential insurance breaks, or allow our energy consumption to be uploaded into the cloud in exchange for “smart data” about it?

Nowhere is our ignorance of the trade-offs greater, or the consequences more worrisome, than our madcap rush to connect every toaster, fridge, car, and medical device to the Internet.

Welcome to the Internet of Things, what Schneier calls “the World Size Web,” already growing around you as we speak, which creates such a complete picture of our lives that Dr. Richard Tynan of Privacy International calls them “doppelgängers”—mirror images of ourselves built on constantly updated data. These doppelgängers live in the cloud, where they can easily be interrogated by intelligence agencies. Nicholas Weaver, a security researcher at University of California, Berkeley, points out that “Under the FISA Amendments Act 702 (aka PRISM), the NSA can directly ask Google for any data collected on a valid foreign intelligence target through Google’s Nest service, including a Nest Cam.” And that’s just one, legal way of questioning your digital doppelgänger; we’ve all heard enough stories about hacked cloud storage to be wary of trusting our entire lives to it.

 
But with the IoT, the potential goes beyond simple espionage, into outright sabotage. Imagine an enemy that can remotely disable the brakes in your car, or (even more subtly) give you food poisoning by hacking your fridge. That’s a new kind of power. “The surveillance, the interference, the manipulation … the full life cycle is the ultimate nightmare,” says Tynan.

The professional spies agree that the IoT changes the game. “‘Transformational’ is an overused word, but I do believe it properly applies to these technologies,” then CIA Director David Petraeus told a 2012 summit organized by the agency’s venture capital firm, In-Q-Tel, “particularly to their effect on clandestine tradecraft,” according to Wired.

Clandestine tradecraft is not about watching, but about interfering. Take, for example, the Joint Threat Research Intelligence Group (JTRIG), the dirty tricks division of GCHQ, the British intelligence agency. As the Snowden documents reveal, JTRIG wants to create “Cyber Magicians” who can “make something happen in the real…world,” including ruining business deals, intimidating activists, and sexual entrapment (“honeypots”). The documents show that JTRIG operatives will ignore international law to achieve their goals, which are not about fighting terrorism, but, in fact, targeting individuals who have not been charged with or convicted of any crime.

The Internet of Things “is a JTRIG wet dream,” says security researcher Rob Graham. But you don’t have to be a spy to take advantage of the IoT. Thanks to widespread security vulnerabilities in most IoT devices, almost anyone can take advantage of it. That means cops, spies, gangsters, anyone with the motivation and resources—but probably bored teenagers as well. “I can take any competent computer person and take them from zero to Junior Hacker 101 in a weekend,” says security researcher Dan Tentler. The security of most IoT devices—including home IoT, but also smart cities, power plants, gas pipelines, self-driving cars, and medical devices—is laughably bad. “The barrier to entry is not very tall,” he says, “especially when what’s being released to consumers is so trivial to get into.”

That makes the IoT vulnerable—our society vulnerable—to any criminal with a weekend to spend learning how to hack. “When we talk about vulnerabilities in computers…people are using a lot of rhetoric in the abstract,” says Privacy International’s Tynan. “What we really mean is, vulnerable to somebody. That somebody you’re vulnerable to is the real question.”

“They’re the ones with the power over you,” he added. That means intelligence agencies, sure, but really anyone with the time and motivation to learn how to hack. And, as Joshua Corman of I Am the Cavalry, a concerned group of security researchers, once put it, “There are as many motivations to hacking as there are motivations in the human condition. Hacking is a form of power.”

The authorities want that power; entities like JTRIG, the NSA, the FBI and the DOJ want to be able to not just surveil but also to disrupt, to sabotage, to interfere. Right now the Bureau wants to force Apple to create the ability to deliver backdoored software updates to iPhones, allowing law enforcement access to locally stored, encrypted data. Chris Soghoian, a technologist at the ACLU, tweeted, “If DOJ get what they want in this Apple case, imagine the surveillance assistance they’ll be able to force from Internet of Things companies.”

“The notion that there are legal checks and balances in place is a fiction,” Tynan says. “We need to rely more on technology to increase the hurdles required. For the likes of JTRIG to take the massive resources of the U.K. state and focus them on destroying certain individuals, potentially under flimsy pretenses—I just can’t understand the mentality of these people.”

Defending ourselves in this new, insecure world is difficult, perhaps impossible. “If you go on the Internet, it’s a free-for-all,” Tentler says. “Despite the fact that we have these three-letter agencies, they’re not here to help us; they’re not our friends. When the NSA and GCHQ learn from the bad guys and use those techniques on us, we should be worried.”

If the Internet is a free-for-all, and with the Internet of Things we’re putting the entire world on the Internet, what does that make us?

“Fish in a barrel?”

Data surveillance is all around us, and it’s going to change our behaviour

“Increasing aspects of our lives are now recorded as digital data that are systematically stored, aggregated, analysed, and sold. Despite the promise of big data to improve our lives, all encompassing data surveillance constitutes a new form of power that poses a risk not only to our privacy, but to our free will.

A more worrying trend is the use of big data to manipulate human behaviour at scale by incentivising “appropriate” activities, and penalising “inappropriate” activities. In recent years, governments in the UK, US, and Australia have been experimenting with attempts to “correct” the behaviour of their citizens through “nudge units”.”

Nudge units: “In ways you don’t detect [corporations and governments are] subtly influencing your decisions, pushing you towards what it believes are your (or its) best interests, exploiting the biases and tics of the human brain uncovered by research into behavioural psychology. And it is trying this in many different ways on many different people, running constant trials of different unconscious pokes and prods, to work out which is the most effective, which improves the most lives, or saves the most money. Preferably, both.”

“In his new book Inside the Nudge Unit, published this week in Britain, Halpern explains his fascination with behavioural psychology.

“Our brains weren’t made for the day-to-day financial judgments that are the foundation of modern economies: from mortgages, to pensions, to the best buy in a supermarket. Our thinking and decisions are fused with emotion.”

There’s a window of opportunity for governments, Halpern believes: to exploit the gaps between perception, reason, emotion and reality, and push us the “right” way.

He gives me a recent example of BI’s work – they were looking at police recruitment, and how to get a wider ethnic mix.

Just before applicants did an online recruitment test, in an email sending the link, BI added a line saying “before you do this, take a moment to think about why joining the police is important to you and your community”.

There was no effect on white applicants. But the pass rate for black and minority ethnic applicants moved from 40 to 60 per cent.

“It entirely closes the gap,” Halpern says. “Absolutely amazing. We thought we had good grounds in the [scientific research] literature that such a prompt might make a difference, but the scale of the difference was extraordinary.”

Halpern taught social psychology at Cambridge but spent six years in the Blair government’s strategy unit. An early think piece on behavioural policy-making was leaked to the media and caused a small storm – Blair publicly disowned it and that was that. Halpern returned to academia, but was lured back after similar ideas started propagating through the Obama administration, and Cameron was persuaded to give it a go.

Ministers tend not to like it – once, one snapped, “I didn’t spend a decade in opposition to come into government to run a pilot”, but the technique is rife in the digital commercial world, where companies like Amazon or Google try 20 different versions of a web page.

Governments and public services should do it too, Halpern says. His favourite example is Britain’s organ donor register. They tested eight alternative online messages prompting people to join, including a simple request, different pictures, statistics or conscience-tweaking statements like “if you needed an organ transplant would you have one? If so please help others”.

It’s not obvious which messages work best, even to an expert. The only way to find out is to test them. They were surprised to find that the picture (of a group of people) actually put people off, Halpern says.
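Trials like the organ-donor message test are essentially A/B tests: each visitor is consistently assigned one message variant, and sign-up rates per variant are compared afterwards. The sketch below is a minimal illustration with placeholder messages and made-up results, not the Nudge Unit’s actual tooling.

```python
# Minimal A/B test sketch: deterministic variant assignment plus
# per-variant conversion rates.
import hashlib

messages = [
    "Please join the organ donor register.",
    "If you needed an organ transplant, would you have one? "
    "If so, please help others.",
]

def assign_message(visitor_id, variants):
    # Hash-based assignment so a returning visitor always sees
    # the same variant (stable across runs, unlike built-in hash()).
    h = int(hashlib.sha256(visitor_id.encode()).hexdigest(), 16)
    return variants[h % len(variants)]

def conversion_rates(results):
    # results: list of (variant_index, signed_up) tuples from the trial.
    totals, signups = {}, {}
    for variant, signed_up in results:
        totals[variant] = totals.get(variant, 0) + 1
        signups[variant] = signups.get(variant, 0) + int(signed_up)
    return {v: signups[v] / totals[v] for v in totals}

results = [(0, False), (0, True), (1, True), (1, True)]
rates = conversion_rates(results)
# The variant with the higher conversion rate wins the trial.
```

The same structure scales to the eight organ-donor messages or the “20 different versions of a web page” that Halpern says companies like Amazon or Google routinely test.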

In future they want to use demographic data to personalise nudges, Halpern says. On tax reminder notices, they had great success putting the phrase “most people pay their tax on time” at the top. But a stubborn top 5 per cent, with the biggest tax debts, saw this reminder and thought, “Well, I’m not most people”.

This whole approach raises ethical issues. Often you can’t tell people they’re being experimented on – it’s impractical, or ruins the experiment, or both.

“If we’re trying to find the best way of saying ‘don’t drop your litter’ with a sign saying ‘most people don’t drop litter’, are you supposed to have a sign before it saying ‘caution: you are about to participate in a trial’?

“Where should we draw the line between effective communication and unacceptable ‘PsyOps’ or propaganda?”


CIA’s “Siren Servers” can predict social uprisings several days before they happen

“The CIA claims to be able to predict social unrest days before it happens thanks to powerful super computers dubbed Siren Servers by the father of Virtual Reality, Jaron Lanier.

CIA Deputy Director for Digital Innovation Andrew Hallman announced that the agency has beefed-up its “anticipatory intelligence” through the use of deep learning and machine learning servers that can process an incredible amount of data.

“We have, in some instances, been able to improve our forecast to the point of being able to anticipate the development of social unrest and societal instability some I think as near as three to five days out,” said Hallman on Tuesday at the Federal Tech event, Fedstival.

This Minority Report-type technology has been viewed skeptically by policymakers as the data crunching hasn’t been perfected, and if policy were to be enacted based on faulty data, the results could be disastrous. Iraq WMDs?”

“I called it a siren server because there’s no plan to be evil. A siren server seduces you,” said Lanier.

In the case of the CIA, however, whether the agency is being innocently seduced or is actively planning to use this data for its own self-sustaining benefit, one can only speculate.

Given the Intelligence Community’s track record for toppling governments, infiltrating the mainstream media, MK Ultra, and scanning hundreds of millions of private emails, that speculation becomes easier to justify.”

Machine Logic: Our lives are ruled by big tech’s ‘decisions by data’

The Guardian’s Julia Powles writes about how, with the advent of artificial intelligence and so-called “machine learning,” society is increasingly a world in which decisions are shaped more by calculations and data analytics than by traditional human judgement:

“Jose van Dijck, president of the Dutch Royal Academy and the conference’s keynote speaker, expands: datification is the core logic of what she calls “the platform society,” in which companies bypass traditional institutions, norms and codes by promising something better and more efficient — appealing deceptively to public values, while obscuring private gain. Van Dijck and peers have nascent, urgent ideas. They commence with a pressing agenda for strong interdisciplinary research — something Kate Crawford is spearheading at Microsoft Research, as are many other institutions, including the new Leverhulme Centre for the Future of Intelligence. There’s the old theory to confront: that this is a conscious move on the part of consumers and, if so, there’s always a theoretical opt-out. Yet even digital activists plot by Gmail, concedes Fieke Jansen of the Berlin-based advocacy organisation Tactical Tech. The Big Five tech companies, as well as the extremely concentrated sources of finance behind them, are at the vanguard of “a society of centralized power and wealth.” “How did we let it get this far?” she asks. Crawford says there are very practical reasons why tech companies have become so powerful. “We’re trying to put so much responsibility on to individuals to step away from the ‘evil platforms,’ whereas in reality, there are so many reasons why people can’t. The opportunity costs to employment, to their friends, to their families, are so high,” she says.”


Parents are worried the Amazon Echo is conditioning their kids to be rude

“I’ve found my kids pushing the virtual assistant further than they would push a human,” says Avi Greengart, a tech analyst and father of five who lives in Teaneck, New Jersey. “[Alexa] never says ‘That was rude’ or ‘I’m tired of you asking me the same question over and over again.'” Perhaps she should, he thinks. “One of the responsibilities of parents is to teach your kids social graces,” says Greengart, “and this is a box you speak to as if it were a person who does not require social graces.”

Alexa, tell me a knock-knock joke.
Alexa, how do you spell forest?
Alexa, what’s 17 times 42?

The syntax is generally simple and straightforward, but it doesn’t exactly reward niceties like “please.” Adding to this, extraneous words can often trip up the speaker’s artificial intelligence. When it comes to chatting with Alexa, it pays to be direct—curt even. “If it’s not natural language, one of the first things you cut away is the little courtesies,” says Dennis Mortensen, who founded a calendar-scheduling startup called x.ai.

For parents trying to drill good manners into their children, listening to their kids boss Alexa around can be disconcerting.

“One of the responsibilities of parents is to teach your kids social graces,” says Greengart, “and this is a box you speak to as if it were a person who does not require social graces.”

It’s this combination that worries Hunter Walk, a tech investor in San Francisco. In a blog post, he described the Amazon Echo as “magical” while expressing fears it’s “turning our daughter into a raging asshole.”

Wikipedia Is Basically a Corporate Bureaucracy

This study, which details the “Evolution of Wikipedia’s Norm Network,” could speak analogously to the supposed “democratisation” that technology pundits constantly invoke when idealising the web, not just in regard to Wikipedia but in more general terms about the Screen Culture. Also, mix in a reading of George Orwell’s ‘Animal Farm’ for good measure.

Emphasis added:

“Wikipedia is a voluntary organization dedicated to the noble goal of decentralized knowledge creation. But as the community has evolved over time, it has wandered further and further from its early egalitarian ideals, according to a new paper published in the journal Future Internet. In fact, such systems usually end up looking a lot like 20th-century bureaucracies. […] This may seem surprising, since there is no policing authority on Wikipedia — no established top-down means of control. The community is self-governing, relying primarily on social pressure to enforce the established core norms, according to co-author Simon DeDeo, a complexity scientist at Indiana University. […] “You start with a decentralized democratic system, but over time you get the emergence of a leadership class with privileged access to information and social networks,” DeDeo explained. “Their interests begin to diverge from the rest of the group. They no longer have the same needs and goals. So not only do they come to gain the most power within the system, but they may use it in ways that conflict with the needs of everybody else.”