Resources


“RoboCop” deployed to Silicon Valley shopping centre

At the Stanford shopping center in Palo Alto, California, there is a new sheriff in town – and it’s an egg-shaped robot.

“Everyone likes to take robot selfies,” Stephens said. “People really like to interact with the robot.” He said there have even been two instances where the company found lipstick marks on the robot where people had kissed the graffiti-resistant dome.

“The slightly comical Dalek design was intentional…”


Scientists put the digitised brain of a roundworm into a Lego robot body

“Scientists believe they could be on the brink of creating artificial life after they digitized the brain of a worm and successfully placed it inside a robot.

Incredibly, they discovered that the bionic simulation behaved in exactly the same way as a real worm — despite the fact that they’d never coded its actual behavior.”
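
What this amounts to in practice is broadly this: the robot’s sensors stand in for the worm’s sensory neurons, activity spreads through the digitised wiring diagram (the connectome), and whichever motor neurons end up firing drive the wheels, so the behaviour falls out of the wiring rather than any hand-written rules. The Python sketch below is only a toy of that idea; the neuron names, weights, and threshold are invented and are not the real worm’s connectome.

```python
# Toy sketch of a connectome-driven controller (invented wiring, not the real worm's).
# Connectome: pre-synaptic neuron -> list of (post-synaptic neuron, weight).
CONNECTOME = {
    "NOSE_TOUCH": [("INTER_A", 2.0), ("INTER_B", 1.0)],
    "FOOD_SENSE": [("INTER_B", 2.0)],
    "INTER_A":    [("MOTOR_REVERSE", 3.0)],
    "INTER_B":    [("MOTOR_FORWARD", 2.5)],
}
THRESHOLD = 2.0  # accumulated input a neuron needs before it fires

def step(stimulated):
    """Propagate activity from stimulated sensory neurons out to motor neurons."""
    activity = {}
    frontier = list(stimulated)
    while frontier:
        neuron = frontier.pop()
        for post, weight in CONNECTOME.get(neuron, []):
            activity[post] = activity.get(post, 0.0) + weight
            # A neuron that crosses threshold fires and passes its activity on.
            if activity[post] >= THRESHOLD and post in CONNECTOME:
                frontier.append(post)
    return {n: v for n, v in activity.items()
            if v >= THRESHOLD and n.startswith("MOTOR")}

# Sensor readings stimulate sensory neurons; firing motor neurons drive the wheels.
print(step({"NOSE_TOUCH"}))  # obstacle ahead -> {'MOTOR_REVERSE': 3.0}, back away
print(step({"FOOD_SENSE"}))  # attractant    -> {'MOTOR_FORWARD': 2.5}, move toward it
```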


Face recognition app taking Russia by storm may bring end to public anonymity

“If the founders of a new face recognition app get their way, anonymity in public could soon be a thing of the past. FindFace, launched two months ago and currently taking Russia by storm, allows users to photograph people in a crowd and work out their identities, with 70% reliability.

It works by comparing photographs to profile pictures on Vkontakte, a social network popular in Russia and the former Soviet Union, with more than 200 million accounts. In future, the designers imagine a world where people walking past you on the street could find your social network profile by sneaking a photograph of you, and shops, advertisers and the police could pick your face out of crowds and track you down via social networks.”

Founder Kabakov says the app could revolutionise dating: “If you see someone you like, you can photograph them, find their identity, and then send them a friend request.” The interaction doesn’t always have to involve the rather creepy opening gambit of clandestine street photography, he added: “It also looks for similar people. So you could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages.”
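
Mechanically, this kind of matching usually works by reducing every face photo to a numeric embedding and ranking profile pictures by vector similarity; the 70% figure above is the claimed identification accuracy, not a similarity score. FindFace’s actual pipeline is not public, so the Python sketch below is only a minimal illustration of the matching step, with invented profile URLs, vectors, and threshold.

```python
from math import sqrt

# Hypothetical gallery: profile-photo embeddings keyed by profile URL.
# In a real system the vectors would come from a face-recognition model;
# these short made-up vectors just stand in for them.
GALLERY = {
    "vk.com/id111": [0.90, 0.10, 0.30],
    "vk.com/id222": [0.20, 0.80, 0.50],
    "vk.com/id333": [0.85, 0.15, 0.35],
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def identify(query_embedding, threshold=0.95, top_k=10):
    """Rank gallery profiles by similarity to the face in the query photo."""
    scored = sorted(
        ((cosine(query_embedding, emb), url) for url, emb in GALLERY.items()),
        reverse=True,
    )
    return [(url, round(score, 3)) for score, url in scored[:top_k] if score >= threshold]

# A photo snapped in the street, reduced to the same kind of embedding:
print(identify([0.88, 0.12, 0.32]))  # closest-looking profiles first
```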


It’s trivially easy to identify you based on records of your phone calls and texts

“Contrary to the claims of America’s top spies, the details of your phone calls and text messages—including when they took place and whom they involved—are no less revealing than the actual contents of those communications.

In a study published online Monday in the journal Proceedings of the National Academy of Sciences, Stanford University researchers demonstrated how they used publicly available sources—like Google searches and the paid background-check service Intelius—to identify “the overwhelming majority” of their 823 volunteers based only on their anonymized call and SMS metadata.

Using data collected through a special Android app, the Stanford researchers determined that they could easily identify people based on their call and message logs.

The results cast doubt on [show as lies] claims by senior intelligence officials that telephone and Internet “metadata”—information about communications, but not the content of those communications—should be subjected to a lower privacy threshold because it is less sensitive.”
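
The re-identification step itself needs nothing exotic: even with names stripped out, the numbers a person calls (clinics, schools, businesses) can be looked up in public directories and combined into a profile that narrows the caller down to one individual. The Python sketch below is a toy version of that idea; every number, listing, and participant in it is invented.

```python
# Toy sketch of metadata re-identification; all data below is invented.

# Anonymised study data: participant ID -> numbers they called or texted.
CALL_LOG = {
    "participant_042": ["555-0101", "555-0188", "555-0123"],
}

# Publicly searchable information (business listings, reverse-lookup services, etc.).
PUBLIC_DIRECTORY = {
    "555-0101": "cardiology clinic, Springfield",
    "555-0188": "Springfield High School front office",
    "555-0123": "hardware store, Elm Street, Springfield",
}

def profile(participant):
    """Turn 'anonymous' call metadata into a revealing profile via public lookups."""
    return [PUBLIC_DIRECTORY.get(number, "unlisted") for number in CALL_LOG[participant]]

# No call contents needed: the pattern of who was called already suggests a
# Springfield resident with a heart condition and a child at the local school.
print(profile("participant_042"))
```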


Welcome to the age of the chatbot. Soon you’ll be lonelier than ever.

“Very soon – by the end of the year, probably – you won’t need to be on Facebook in order to talk to your friends on Facebook.

Your Facebook avatar will dutifully wish people happy birthday, congratulate them on the new job, accept invitations, and send them jolly texts punctuated by your favourite emojis – all while you’re asleep, or shopping, or undergoing major surgery.

Using IBM’s powerful Watson natural language processing platform, The Chat Bot Club learns to imitate its user. It learns texting styles, favourite phrases, preferred emojis, repeated opinions – and then it learns to respond in kind, across an ever-broadening range of subjects.”

“Humans aren’t perfect, and AI is a bit the same way,” he said. “AI is not significantly smarter than the people who program it. So AI is always going to encounter circumstances that it was not prepared for.”
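
The style-imitation idea described above (“texting styles, favourite phrases, preferred emojis… respond in kind”) can be illustrated with a deliberately crude sketch: mine a message history for the user’s favourite emoji and pet phrase, then slot them into reply templates. Everything below is invented for the example; a real system like the Watson-based one described would rely on far richer language modelling.

```python
from collections import Counter

# Invented message history standing in for a user's chat archive.
HISTORY = [
    "happy birthday mate!! 🎉🎉",
    "no worries mate 👍",
    "congrats on the new job 🎉 so stoked for you",
    "haha no worries, catch you later 👍",
]

def learn_style(messages, candidate_phrases=("no worries", "mate", "stoked")):
    """Pick out the user's favourite emoji and pet phrase from past messages."""
    emojis = Counter(ch for m in messages for ch in m if ord(ch) > 0x1F000)
    phrases = Counter(p for m in messages for p in candidate_phrases if p in m)
    return {"emoji": emojis.most_common(1)[0][0],
            "phrase": phrases.most_common(1)[0][0]}

def reply(event, style):
    """Draft a reply to a calendar-style event in the learned voice."""
    templates = {
        "birthday": "happy birthday {phrase}!! {emoji}{emoji}",
        "new_job": "congrats on the new job {phrase} {emoji}",
    }
    return templates[event].format(**style)

style = learn_style(HISTORY)
print(reply("birthday", style))  # -> "happy birthday mate!! 🎉🎉"
```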


Facebook is monitoring your reactions to serve you ads, warn Belgian Police

Belgian police have asked citizens to shun Facebook’s “Reactions” buttons to protect their privacy. In February, five new “Reaction” buttons were added next to the “Like” button to allow people to display responses such as sad, wow, angry, love and haha. According to reports, police said Facebook is able to use the tool to tell when people are likely to be in a good mood — and then decide when is the best time to show them ads. “The icons not only help you express your feelings, they also help Facebook assess the effectiveness of the ads on your profile,” a post on the Belgian police’s official website read.

“By limiting the number of icons to six, Facebook is counting on you to express your thoughts more easily so that the algorithms that run in the background are more effective,” the post continues. “With a few mouse clicks you can let them know what makes you happy. That helps Facebook find the perfect spot on your profile to display content that will arouse your curiosity, and to choose the moment to present it. If it appears that you are in a good mood, it can deduce that you are more receptive and sell advertising space by telling advertisers they will have a better chance of getting you to react.”
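
The mechanism the police describe boils down to a simple loop: tally a user’s recent reaction clicks into a mood estimate, then release an ad only when the estimate crosses a “receptive” threshold. The Python sketch below is a toy illustration of that logic; the weights and threshold are invented, and Facebook’s actual models are of course not public.

```python
# Toy sketch: reaction clicks -> inferred mood -> decision on when to show an ad.
# Weights and threshold are invented for illustration.
MOOD_WEIGHTS = {"like": 1, "love": 2, "haha": 2, "wow": 1, "sad": -2, "angry": -3}

def mood_score(recent_reactions):
    """Crude 'receptiveness' score from a user's recent reaction clicks."""
    return sum(MOOD_WEIGHTS.get(r, 0) for r in recent_reactions)

def should_show_ad(recent_reactions, threshold=3):
    """Show the ad only when the user appears to be in a good mood."""
    return mood_score(recent_reactions) >= threshold

print(should_show_ad(["love", "haha", "like"]))  # True  -- receptive, sell the slot now
print(should_show_ad(["angry", "sad", "like"]))  # False -- hold the ad for later
```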


Marketers hungry for data from wearable devices

“In the future the data procured from smartwatches might be much more valuable than what is currently available from laptop and mobile users,” reports David Curry, raising the possibility that stores might someday use your past Google searches to alert you when they’re selling a cheaper product.


Silicon Valley tech firms exacerbating income inequality

“Google democratized information, Uber democratized car rides, and Twitter democratized publishing a single sentence. But to the World Bank, the powerful Washington-based organisation that lends money to developing countries, Silicon Valley’s technology firms appear to be exacerbating economic inequality rather than improving it.”


Snowden: ‘Governments Can Reduce Our Dignity To That Of Tagged Animals’

“NSA whistleblower Edward Snowden writes in The Guardian explaining why leaking information about wrongdoing is a vital act of resistance. “One of the challenges of being a whistleblower is living with the knowledge that people continue to sit, just as you did, at those desks, in that unit, throughout the agency; who see what you saw and comply in silence, without resistance or complaint,” Snowden writes. “They learn to live not just with untruths but with unnecessary untruths, dangerous untruths, corrosive untruths. It is a double tragedy: what begins as a survival strategy ends with the compromise of the human being it sought to preserve and the diminishing of the democracy meant to justify the sacrifice.” He goes on to explain the importance and significance of leaks, how not all leaks are alike, nor are their makers, and how our connected devices come into play in the post-9/11 period. Snowden writes, “By preying on the modern necessity to stay connected, governments can reduce our dignity to something like that of tagged animals, the primary difference being that we paid for the tags and they are in our pockets.”


Google AI has access to 1.6M people’s health records (UK)

“A document obtained by New Scientist reveals that the tech giant’s collaboration with the UK’s National Health Service goes far beyond what has been publicly announced. The document — a data-sharing agreement between Google-owned artificial intelligence company DeepMind and the Royal Free NHS Trust — gives the clearest picture yet of what the company is doing and what sensitive data it now has access to. The agreement gives DeepMind access to a wide range of healthcare data on the 1.6 million patients who pass through three London hospitals.

It includes logs of day-to-day hospital activity, such as records of the location and status of patients – as well as who visits them and when. The hospitals will also share the results of certain pathology and radiology tests.

As well as receiving this continuous stream of new data, DeepMind has access to the historical data that the Royal Free trust submits to the Secondary User Service (SUS) database – the NHS’s centralised record of all hospital treatments in the UK. This includes data from critical care and accident and emergency departments.

Google says it has no commercial plans for DeepMind’s work with Royal Free and that the current pilots are being done for free. But the data to which Royal Free is giving DeepMind access is hugely valuable. It may have to destroy its copy of the data when the agreement expires next year, but that gives ample time to mine it for health insights.”


China Debuts Anbot, the Police Robot


Google files patent for injecting a device directly into your eyeball

“Following on from the contact lenses that monitor for diabetes, Google’s parent company Alphabet has filed a patent which takes that development to another level. The patent specifically covers a method for “injecting a fluid into a lens capsule of an eye, wherein a natural lens of the eye has been removed from the lens capsule.” It’s powered by “radio frequency energy” received by a small antenna inside. The gadget even has its own data storage. Forbes reports that it is designed to “improve vision.”

Samsung is also one of the most recent companies to receive a patent for smart contact lenses. Their lenses are for “experimenting with new methods of delivering augmented reality interfaces and data.”


Wikipedia Is Basically a Corporate Bureaucracy

This study, which details the “Evolution of Wikipedia’s Norm Network,” could speak analogously to the supposed “democratisation” that technology pundits constantly invoke when idealising the web, not just in regard to Wikipedia but also in more general terms about the Screen Culture. Also, mix in a reading of George Orwell’s ‘Animal Farm’ for good measure.

Emphasis added:

“Wikipedia is a voluntary organization dedicated to the noble goal of decentralized knowledge creation. But as the community has evolved over time, it has wandered further and further from its early egalitarian ideals, according to a new paper published in the journal Future Internet. In fact, such systems usually end up looking a lot like 20th-century bureaucracies. […] This may seem surprising, since there is no policing authority on Wikipedia — no established top-down means of control. The community is self-governing, relying primarily on social pressure to enforce the established core norms, according to co-author Simon DeDeo, a complexity scientist at Indiana University. […] “You start with a decentralized democratic system, but over time you get the emergence of a leadership class with privileged access to information and social networks,” DeDeo explained. “Their interests begin to diverge from the rest of the group. They no longer have the same needs and goals. So not only do they come to gain the most power within the system, but they may use it in ways that conflict with the needs of everybody else.”


How Big Data Creates False Confidence

“The general idea is to find datasets so enormous that they can reveal patterns invisible to conventional inquiry… But there’s a problem: It’s tempting to think that with such an incredible volume of data behind them, studies relying on big data couldn’t be wrong. But the bigness of the data can imbue the results with a false sense of certainty. Many of them are probably bogus — and the reasons why should give us pause about any research that blindly trusts big data.”

For example, Google’s database of scanned books represents 4% of all books ever published, but in this data set, “The Lord of the Rings gets no more influence than, say, Witchcraft Persecutions in Bavaria.” And the name Lanny appears to be one of the most common in early-20th century fiction — solely because Upton Sinclair published 11 different novels about a character named Lanny Budd.

The problem seems to be skewed data and misinterpretation. (The article points to the failure of Google Flu Trends, which it turns out “was largely predicting winter”.) The article’s conclusion? “Rather than succumb to ‘big data hubris,’ the rest of us would do well to keep our sceptic hats on — even when someone points to billions of words.”
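
The Lanny Budd example is easy to reproduce in miniature: raw frequency counts let one prolific source dominate, which is why it helps to also ask how widely a term is spread across independent sources. The Python sketch below uses invented counts; only the eleven-Sinclair-novels setup echoes the article.

```python
from collections import Counter

# Invented toy corpus: (author, book title, counts of first names in that book).
CORPUS = [
    ("Upton Sinclair", f"Lanny Budd novel #{i}", {"Lanny": 400}) for i in range(1, 12)
] + [
    ("Author A", "Novel A", {"John": 150, "Mary": 90}),
    ("Author B", "Novel B", {"John": 120}),
    ("Author C", "Novel C", {"Mary": 110, "John": 80}),
]

# Naive big-data view: total mentions across the whole corpus.
raw = Counter()
for _, _, names in CORPUS:
    raw.update(names)

# Sanity check: how many *different* authors actually use each name?
spread = Counter({name: len({author for author, _, names in CORPUS if name in names})
                  for name in raw})

print(raw.most_common(3))     # 'Lanny' dominates the raw counts (4,400 mentions)...
print(spread.most_common(3))  # ...but every one of them comes from a single author
```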


“From Uber To Eric Schmidt, Tech Is Closer To the US Government Than You’d Think”

“Alphabet’s [Google] executive chairman, Eric Schmidt, recently joined a Department of Defense advisory panel. Facebook recently hired a former director at the U.S. military’s research lab, Darpa. Uber employs Barack Obama’s former campaign manager David Plouffe and Amazon.com tapped his former spokesman Jay Carney. Google, Facebook, Uber and Apple collectively employ a couple of dozen former analysts for America’s spy agencies, who openly list their resumes on LinkedIn.

These connections are neither new nor secret. But the fact they are so accepted illustrates how tech’s leaders — even amid current fights over encryption and surveillance — are still seen as mostly U.S. firms that back up American values. Christopher Soghoian, a technologist with the American Civil Liberties Union, said low-level employees’ government connections matter less than leading executives’ ties to government. For instance, at least a dozen Google engineers have worked at the NSA, according to publicly available records on LinkedIn. And, this being Silicon Valley, not everyone who worked for a spy agency advertises that on LinkedIn. Soghoian, a vocal critic of mass surveillance, said Google hiring an ex-hacker for the NSA to work on security doesn’t really bother him. “But Eric Schmidt having a close relationship with the White House does…”


UK Spook Agencies Have Been Spying on Millions of People ‘Of No Security Interest’ Since 1990s

The UK’s intelligence agencies, such as MI5, MI6, and GCHQ, have been collecting personal information from citizens who are “unlikely to be of intelligence or security interest” since the 1990s, previously confidential documents reveal. The documents were published as a result of a lawsuit filed by Privacy International, and according to the files, GCHQ and others have been collecting bulk personal data sets since 1998.

Emphasis added:

“These records can be “anything from your private medical records, your correspondence with your doctor or lawyer, even what petitions you have signed, your financial data, and commercial activities,” Privacy International legal officer Millie Graham Wood said in a statement. “The information revealed by this disclosure shows the staggering extent to which the intelligence agencies hoover up our data.”

Nor, it seems, are BPDs only being used to investigate terrorism and serious crime; they can and are used to protect Britain’s “economic well-being”—including preventing pirate copies of Harry Potter books from leaking before their release date.

BPDs are so powerful, in fact, that the normally toothless UK parliament watchdog that oversees intelligence gathering, the Intelligence and Security Committee (ISC), recommended in February that “Class Bulk Personal Dataset warrants are removed from the new legislation.”

These data sets are so large and collect so much information so indiscriminately that they even include information on dead people.”


Why movie trailers now begin with five-second ads for themselves

Emphasis added:

“Jason Bourne takes off his jacket, punches a man unconscious, looks forlornly off camera, and then a title card appears. The ad — five seconds of action — is a teaser for the full Jason Bourne trailer (video), which immediately follows the teaser. In fact, the micro-teaser and trailer are actually part of the same video, the former being an intro for the latter. The trend is the latest example of metahype, a marketing technique in which brands promote their advertisements as if they’re cultural events unto themselves.

[…]

“Last year, the studio advertised the teaser for Ant-Man with a ten-second cut of the footage reduced to an imperceptible scale. […] But where previous metahype promoted key dates in a marketing campaign—like official trailer releases and fan celebrations—the burgeoning trend of teasers within trailers exists purely to retain the viewer’s attention in that exact moment. The teaser within the trailer speaks to a moment in which we have so many distractions and choices that marketers must sell us on giving a trailer three minutes of our time. This practice isn’t limited to movie trailers, though. Next time you’re on Facebook, pay attention to how the popular videos in your newsfeed are edited. Is the most interesting image the first thing you see? And does that trick get you to stop scrolling and watch?”


Catalogue of US Government Surveillance Devices

The Intercept has obtained a secret, internal U.S. government catalogue of dozens of cellphone surveillance devices used by the military and by intelligence agencies. The document, thick with previously undisclosed information, also offers rare insight into the spying capabilities of federal law enforcement and local police inside the United States.

The catalogue includes details on the Stingray, a well-known brand of surveillance gear, as well as Boeing “dirt boxes” and dozens of more obscure devices that can be mounted on vehicles, drones, and piloted aircraft. Some are designed to be used at static locations, while others can be discreetly carried by an individual. They have names like Cyberhawk, Yellowstone, Blackfin, Maximus, Cyclone, and Spartacus. Within the catalogue, the NSA is listed as the vendor of one device, while another was developed for use by the CIA, and another was developed for a special forces requirement. Nearly a third of the entries focus on equipment that seems to have never been described in public before.

Slides of the catalogue are available here, while a stylised version is available here.


How the Internet changed the way we read

“UC Literature Professor Jackson Bliss puts into words something many of you have probably experienced: the evolution of the internet and mobile devices has changed how we read. “The truth is that most of us read continuously in a perpetual stream of incestuous words, but instead of reading novels, book reviews, or newspapers like we used to in the ancien régime, we now read text messages, social media, and bite-sized entries about our protean cultural history on Wikipedia.”

Bliss continues, “In the great epistemic galaxy of words, we have become both reading junkies and also professional text skimmers. … Reading has become a relentless exercise in self-validation, which is why we get impatient when writers don’t come out and simply tell us what they’re arguing. … Content—whether thought-provoking, regurgitated, or analytically superficial, impeccably-researched, politically doctrinaire, or grammatically atrocious—now occupies the same cultural space, the same screen space, and the same mental space in the public imagination. After awhile, we just stop keeping track of what’s legitimately good because it takes too much energy to separate the crème from the foam.”


The dangers of trusting robots

Emphasis added:

“There are many other examples of intelligent technology gone bad, but more often than not they involve deception rather than physical danger. Malevolent bots, designed by criminals, are now ubiquitous on social media sites and elsewhere online. The mobile dating app Tinder, for example, has been frequently infiltrated by bots posing as real people that attempt to manipulate users into using their webcams or disclosing credit card information. So it’s not a stretch to imagine that untrustworthy bots may soon come to the physical world.

Meanwhile, increasing evidence suggests that we are susceptible to telling our deepest, darkest secrets to anthropomorphic robots whose cute faces may hide exploitative code – children particularly so. So how do we protect ourselves from double-crossing decepticons?”
