Resources

How Facebook Figures Out Everyone You’ve Ever Met

From Slashdot:

“I deleted Facebook after it recommended as People You May Know a man who was defense counsel on one of my cases. We had only communicated through my work email, which is not connected to my Facebook, which convinced me Facebook was scanning my work email,” an attorney told Gizmodo. Kashmir Hill, a reporter at the news outlet, who recently documented how Facebook figured out a connection between her and a family member she did not know existed, shares several more instances others have reported and explains how Facebook gathers information. She reports:

Behind the Facebook profile you’ve built for yourself is another one, a shadow profile, built from the inboxes and smartphones of other Facebook users. Contact information you’ve never given the network gets associated with your account, making it easier for Facebook to more completely map your social connections. Because shadow-profile connections happen inside Facebook’s algorithmic black box, people can’t see how deep the data-mining of their lives truly is, until an uncanny recommendation pops up.

Facebook isn’t scanning the work email of the attorney above. But it likely has her work email address on file, even if she never gave it to Facebook herself. If anyone who has the lawyer’s address in their contacts has chosen to share it with Facebook, the company can link her to anyone else who has it, such as the defense counsel in one of her cases.

Facebook will not confirm how it makes specific People You May Know connections, and a Facebook spokesperson suggested that there could be other plausible explanations for most of those examples — “mutual friendships,” or people being “in the same city/network.” The spokesperson did say that of the stories on the list, the lawyer was the likeliest case for a shadow-profile connection. Handing over address books is one of the first steps Facebook asks people to take when they initially sign up, so that they can “Find Friends.”

The problem with all this, Hill writes, is that Facebook doesn’t explicitly disclose the scale at which it uses the contact information it gleans from users’ address books. Furthermore, most people are not aware that Facebook is using contact information taken from their phones for these purposes.
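The linking mechanism Hill describes amounts to an inverted index over uploaded address books: anyone who appears in a shared contact entry can be connected to everyone else who appears in it, without ever sharing the detail themselves. A minimal sketch of that idea (all names and addresses here are hypothetical, not Facebook’s actual implementation):

```python
# Sketch of contact-based shadow-profile linking: contact details
# uploaded by many users form an inverted index, so two people who
# both appear in the same address-book entry can be linked even if
# neither gave Facebook that detail. All data here is hypothetical.
from collections import defaultdict


def build_shadow_index(uploaded_address_books):
    """Map each contact detail (e.g. an email address) to the set of
    uploaders whose address books contain it."""
    index = defaultdict(set)
    for uploader, contacts in uploaded_address_books.items():
        for detail in contacts:
            index[detail].add(uploader)
    return index


def people_you_may_know(index, detail):
    """Everyone linked through a single shared contact detail."""
    return index.get(detail, set())


# The lawyer never gave Facebook her work email, but two other
# users uploaded address books containing it:
books = {
    "client":          {"lawyer@lawfirm.example"},
    "defense_counsel": {"lawyer@lawfirm.example", "judge@court.example"},
}
index = build_shadow_index(books)
# Both uploaders now become candidate "People You May Know"
# connections for the owner of that address:
print(people_you_may_know(index, "lawyer@lawfirm.example"))
```

Under this reading, no email scanning is required: a single contact entry uploaded by two different people is enough to put them in each other’s recommendation pool.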


How Silicon Valley divided society and made everyone raging mad

“Silicon Valley’s utopians genuinely but mistakenly believe that more information and connection makes us more analytical and informed. But when faced with quinzigabytes of data, the human tendency is to simplify things. Information overload forces us to rely on simple algorithms to make sense of the overwhelming noise. This is why, just like the advertising industry that increasingly drives it, the internet is fundamentally an emotional medium that plays to our base instinct to reduce problems and take sides, whether like or don’t like, my guy/not my guy, or simply good versus evil. It is no longer enough to disagree with someone, they must also be evil or stupid…

Nothing holds a tribe together like a dangerous enemy. That is the essence of identity politics gone bad: a universe of unbridgeable opinion between opposing tribes, whose differences are always highlighted, exaggerated, retweeted and shared. In the end, this leads us to ever more distinct and fragmented identities, all of us armed with solid data, righteous anger, a gutful of fury and a digital network of like-minded people. This is not total connectivity; it is total division.”


“Facebook decides which killings we’re allowed to see”

Minutes after a police officer shot Philando Castile in Minnesota, United States, live video of the aftermath was published on Facebook. The harrowing footage was streamed to Facebook by Castile’s girlfriend Diamond Reynolds, using the live video tool on her smartphone; she narrates it with a contrasting mix of eerie calm and anguish. The video was removed from Facebook due to what the company calls a “technical glitch.” It has since been restored, but with a “Warning — Graphic Video” disclaimer.

Now an article has come out commenting on how Facebook has become the “de-facto platform” for such “controversial” videos, and noting a pattern in these so-called glitches: they often happen after “questionable content” is streamed.

It has long been obvious to anyone paying attention that Facebook exercises various nefarious controls over how information is displayed and disseminated on its network, not just through advertising and the filter bubble:

“As Facebook continues to build out its Live video platform, the world’s most popular social network has become the de-facto choice for important, breaking, and controversial videos. Several times, Facebook has blocked political or newsworthy content only to later say that the removal was a “technical glitch” or an “error.” Nearly two-thirds of Americans get their news from social media, and two-thirds of Facebook users say they use the site to get news. If Facebook is going to become the middleman that delivers the world’s most popular news events to the masses, technical glitches and erroneous content removals could be devastating to information dissemination efforts.

More importantly, Facebook has become the self-appointed gatekeeper for what is acceptable content to show the public, which is an incredibly important and powerful position to be in. By censoring anything, Facebook has created the expectation that there are rules for using its platform (most would agree that some rules are necessary). But because the public relies on the website so much, Facebook’s rules and judgments have an outsized impact on public debate.”


Heavy social media users trapped in endless cycle of depression

“The more time young adults spend on social media, the more likely they are to become depressed, a study has found.

Of the 19- to 32-year-olds who took part in the research, those who checked social media most frequently throughout the week were 2.7 times more likely to develop depression than those who checked least often.

The 1,787 US participants used social media for an average of 61 minutes a day, visiting accounts 30 times per week. Of them, a quarter were found to have high indicators of depression.”
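The “2.7 times more likely” figure above is a relative risk: the rate of depression indicators in the most-frequent-checking group divided by the rate in the least-frequent group. As a hedged illustration with hypothetical counts (the article reports only the ratio, not the underlying numbers):

```python
# Illustration of how a relative-risk figure like 2.7x is derived.
# The counts below are hypothetical; the study summary above gives
# only the ratio, not the raw group sizes.
def relative_risk(exposed_cases, exposed_total, control_cases, control_total):
    """Risk in the heaviest-checking group divided by risk in the
    lightest-checking group."""
    return (exposed_cases / exposed_total) / (control_cases / control_total)


# e.g. roughly 27% with depression indicators among the heaviest
# checkers versus 10% among the lightest:
rr = relative_risk(120, 447, 45, 447)
print(round(rr, 2))  # prints 2.67, i.e. roughly "2.7 times more likely"
```

Note that a relative risk compares rates between groups; it says nothing on its own about how common depression indicators were overall (the quarter-of-participants figure is a separate, absolute statistic).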
