Summary: A short article on how culture is transmitted, with the underlying observation that ubiquitous technology trumps real-life relationships even in small ways, such as laying claim to real people’s names.
“I’ve become slow to respond to my name in public spaces for fear I’ll turn and smile at a stranger scowling into their phone. In protest, I’ve never used the feature and forbade my parents from using it on their iPhones.
“OMG, Siri like the iPhone,” should be engraved on my tombstone.
At worst, people air their grievances against Apple to me.”
At the Stanford Shopping Center in Palo Alto, California, there is a new sheriff in town – and it’s an egg-shaped robot.
“Everyone likes to take robot selfies,” Stephens said. “People really like to interact with the robot.” He said there have even been two instances in which the company found lipstick marks where people had kissed the robot’s graffiti-resistant dome.
The slightly comical Dalek design was intentional…”
“Scientists believe they could be on the brink of creating artificial life after they digitized the brain of a worm and successfully placed it inside a robot.
Incredibly, they discovered that the bionic simulation behaved in exactly the same way as a real worm — despite the fact that they’d never coded its actual behavior.”
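The worm in question is almost certainly C. elegans, whose 302-neuron connectome is the only one fully mapped, and the story matches the OpenWorm project's demonstration of a connectome-driven robot. Purely as a hedged illustration of the general idea (the weight matrix, neuron groupings, and sensor below are invented stand-ins, not the researchers' data or code), the sketch propagates a sensor reading through a fixed "wiring diagram" and reads out motor-neuron activity as wheel speeds.

```python
# Minimal, hypothetical sketch of connectome-driven control: sensory
# activations are propagated through a fixed weight matrix standing in
# for the worm's wiring diagram, and motor-neuron activity is read out
# as wheel speeds. Illustration only, not the researchers' code.
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 302                      # C. elegans has 302 neurons
SENSORY = list(range(0, 20))         # indices we treat as touch/sonar sensors
MOTOR_LEFT = list(range(280, 291))   # indices we treat as left-side motor neurons
MOTOR_RIGHT = list(range(291, 302))  # indices we treat as right-side motor neurons

# Stand-in for the digitized connectome: a sparse random weight matrix.
weights = rng.normal(0, 1, (N_NEURONS, N_NEURONS)) * (rng.random((N_NEURONS, N_NEURONS)) < 0.05)

def step(state, sensor_value):
    """One update: inject sensor input, propagate through the wiring, squash."""
    state = state.copy()
    state[SENSORY] += sensor_value
    return np.tanh(weights @ state)

def motor_speeds(state):
    """Read out left/right wheel speeds from motor-neuron activity."""
    return state[MOTOR_LEFT].mean(), state[MOTOR_RIGHT].mean()

state = np.zeros(N_NEURONS)
for t in range(5):
    sonar = 1.0 if t == 2 else 0.0   # pretend the robot's sonar fires once
    state = step(state, sonar)
    left, right = motor_speeds(state)
    print(f"t={t} left={left:+.3f} right={right:+.3f}")
```

In the reported work the weights come from the mapped connectome rather than random numbers, which is presumably why the behaviour the article describes emerges without anyone coding it explicitly.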
“Very soon – by the end of the year, probably – you won’t need to be on Facebook in order to talk to your friends on Facebook.
Your Facebook avatar will dutifully wish people happy birthday, congratulate them on the new job, accept invitations, and send them jolly texts punctuated by your favourite emojis – all while you’re asleep, or shopping, or undergoing major surgery.
Using IBM’s powerful Watson natural language processing platform, The Chat Bot Club learns to imitate its user. It learns texting styles, favourite phrases, preferred emojis, repeated opinions – and then it learns to respond in kind, across an ever-broadening range of subjects.”
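The article names IBM's Watson platform but gives no implementation detail, so the following is only a toy sketch of the style-imitation idea it describes: tally a user's favourite filler words and emojis from their message history, then fold the most frequent ones into generic replies. The message history, filler list, and reply table are all invented for the example and do not reflect The Chat Bot Club's or Watson's actual method.

```python
# Toy illustration of "learning" a user's texting style: tally their most
# frequent filler words and emojis, then decorate generic replies with them.
# Not the Chat Bot Club's or Watson's actual method.
from collections import Counter
import re

history = [  # fabricated sample of a user's sent messages
    "happy birthday!! 🎉🎉 hope it's a great one",
    "omg congrats on the new job 🎉 so proud of you",
    "haha no worries, catch you later 😂",
    "omg that's amazing 😂😂",
]

EMOJI = re.compile("[\U0001F300-\U0001FAFF]")

def learn_style(messages, top_n=2):
    words = Counter(w for m in messages for w in re.findall(r"[a-z']+", m.lower()))
    emojis = Counter(e for m in messages for e in EMOJI.findall(m))
    fillers = [w for w, _ in words.most_common() if w in {"omg", "haha", "lol"}][:top_n]
    return fillers, [e for e, _ in emojis.most_common(top_n)]

def reply(event, style):
    fillers, emojis = style
    base = {
        "birthday": "happy birthday",
        "new_job": "congrats on the new job",
    }.get(event, "nice!")
    return f"{' '.join(fillers)} {base} {''.join(emojis)}".strip()

style = learn_style(history)
print(reply("birthday", style))   # e.g. "omg haha happy birthday 🎉😂"
```

A production system would model full sentences and context; the point here is only that "texting style" largely reduces to learnable surface statistics.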
“Humans aren’t perfect, and AI is a bit the same way,” he said. “AI is not significantly smarter than the people who program it. So AI is always going to encounter circumstances that it was not prepared for.”
“Contrary to the claims of America’s top spies, the details of your phone calls and text messages—including when they took place and whom they involved—are no less revealing than the actual contents of those communications.
In a study published online Monday in the journal Proceedings of the National Academy of Sciences, Stanford University researchers demonstrated how they used publicly available sources—like Google searches and the paid background-check service Intelius—to identify “the overwhelming majority” of their 823 volunteers based only on their anonymized call and SMS metadata.
Using data collected through a special Android app, the Stanford researchers determined that they could easily identify people based on their call and message logs.
The results cast doubt on claims by senior intelligence officials that telephone and Internet “metadata”—information about communications, but not the content of those communications—should be subjected to a lower privacy threshold because it is less sensitive.”
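The paper's central move is straightforward: each number in the "anonymized" metadata can often be resolved to a name, business, or relationship through ordinary public lookups, and the pattern of contacts then both identifies and characterizes the subscriber. The sketch below is a hedged illustration only; the call log and reverse-lookup table are fabricated, whereas the Stanford team used real public sources such as Google searches and Intelius.

```python
# Hedged sketch of metadata re-identification: resolve the numbers an
# "anonymous" subscriber contacted via a public reverse-lookup (mocked
# here as a dict), then reason about what the pattern of contacts implies.
# The data and lookup table are invented.

call_log = [  # anonymized metadata: (contacted number, duration in seconds)
    ("+1-650-555-0001", 120),
    ("+1-650-555-0002", 45),
    ("+1-650-555-0003", 600),
]

reverse_lookup = {  # stand-in for public directories / background-check sites
    "+1-650-555-0001": "cardiology clinic",
    "+1-650-555-0002": "pharmacy",
    "+1-650-555-0003": "home number listed to the Example family, Palo Alto",
}

resolved = {num: reverse_lookup.get(num, "unknown") for num, _ in call_log}
for num, label in resolved.items():
    print(f"{num} -> {label}")

# Even with no call content, the combination (clinic + pharmacy + a listed
# household) both suggests a sensitive medical situation and points at a
# specific family -- the kind of inference the paper reports.
```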
“If the founders of a new face recognition app get their way, anonymity in public could soon be a thing of the past. FindFace, launched two months ago and currently taking Russia by storm, allows users to photograph people in a crowd and work out their identities, with 70% reliability.
It works by comparing photographs to profile pictures on Vkontakte, a social network popular in Russia and the former Soviet Union, with more than 200 million accounts. In future, the designers imagine a world where people walking past you on the street could find your social network profile by sneaking a photograph of you, and shops, advertisers and the police could pick your face out of crowds and track you down via social networks.”
Founder Kabakov says the app could revolutionise dating: “If you see someone you like, you can photograph them, find their identity, and then send them a friend request.” The interaction doesn’t always have to involve the rather creepy opening gambit of clandestine street photography, he added: “It also looks for similar people. So you could just upload a photo of a movie star you like, or your ex, and then find 10 girls who look similar to her and send them messages.”
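Matching a street photo against 200 million profile pictures is typically done by converting each face to a fixed-length embedding vector and running a nearest-neighbour search. The sketch below illustrates that generic pattern and is not FindFace's code: embed() is a placeholder for a real face-recognition model, and the profile vectors are random stand-ins.

```python
# Generic sketch of photo-to-profile matching: compare a face embedding
# against an index of profile-picture embeddings by cosine similarity.
# embed() is a placeholder for a real face-recognition model; the vectors
# are random stand-ins, not FindFace's data or algorithm.
import numpy as np

rng = np.random.default_rng(42)
DIM = 128  # typical face-embedding size

# Pretend index: profile id -> embedding of that account's profile photo.
profile_index = {f"vk_user_{i}": rng.normal(size=DIM) for i in range(1000)}

def embed(photo) -> np.ndarray:
    """Placeholder for a model mapping a photo to an embedding vector."""
    return rng.normal(size=DIM)

def top_matches(query_vec, index, k=3):
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(cosine(query_vec, v), uid) for uid, v in index.items()]
    return sorted(scored, reverse=True)[:k]

query = embed("street_photo.jpg")
for score, uid in top_matches(query, profile_index):
    print(f"{uid}: similarity {score:.2f}")
```

At VK's scale the linear scan would be replaced by an approximate nearest-neighbour index, and a similarity threshold over scores like these is presumably where the reported 70% reliability figure comes from.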
“NSA whistleblower Edward Snowden writes in The Guardian, explaining why leaking information about wrongdoing is a vital act of resistance. “One of the challenges of being a whistleblower is living with the knowledge that people continue to sit, just as you did, at those desks, in that unit, throughout the agency; who see what you saw and comply in silence, without resistance or complaint,” Snowden writes. “They learn to live not just with untruths but with unnecessary untruths, dangerous untruths, corrosive untruths. It is a double tragedy: what begins as a survival strategy ends with the compromise of the human being it sought to preserve and the diminishing of the democracy meant to justify the sacrifice.” He goes on to explain the importance and significance of leaks, how not all leaks are alike, nor are their makers, and how our connected devices come into play in the post-9/11 period. Snowden writes, “By preying on the modern necessity to stay connected, governments can reduce our dignity to something like that of tagged animals, the primary difference being that we paid for the tags and they are in our pockets.”
“Google democratized information, Uber democratized car rides, and Twitter democratized publishing a single sentence. But to the World Bank, the powerful Washington-based organisation that lends money to developing countries, Silicon Valley’s technology firms appear to be exacerbating economic inequality rather than improving it.”
“In the future the data procured from smartwatches might be much more valuable than what is currently available from laptop and mobile users,” reports David Curry, raising the possibility that stores might someday use your past Google searches to alert you when they’re selling a cheaper product.
Belgian police have asked citizens to shun Facebook’s “Reactions” buttons to protect their privacy. In February, five new “Reaction” buttons were added next to the “Like” button to allow people to display responses such as sad, wow, angry, love and haha. According to reports, police said Facebook is able to use the tool to tell when people are likely to be in a good mood — and then decide when is the best time to show them ads. “The icons help not only express your feelings, they also help Facebook assess the effectiveness of the ads on your profile,” a post on Belgium’s official police website read.
“By limiting the number of icons to six, Facebook is counting on you to express your thoughts more easily, so that the algorithms running in the background become more effective,” the post continues. “With a few mouse clicks you can let them know what makes you happy. That helps Facebook find the perfect location on your profile to display content that will arouse your curiosity, and also to choose the moment to present it. If it appears that you are in a good mood, it can deduce that you are more receptive, and it can sell advertising space by telling advertisers that they will have a better chance of seeing you react.”
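The police statement amounts to a claim that six coarse reaction signals make mood easy to score and ads easy to time. Purely as an illustration of that claim (the weights, window, and threshold below are invented and say nothing about Facebook's actual systems), a crude version might look like this:

```python
# Illustrative only: collapse a user's recent Reactions into a crude mood
# score and gate ad delivery on it. Weights and threshold are invented.
from datetime import datetime, timedelta, timezone

MOOD_WEIGHTS = {"like": 1, "love": 2, "haha": 2, "wow": 1, "sad": -2, "angry": -3}

now = datetime.now(timezone.utc)
recent_reactions = [  # (reaction, timestamp) -- fabricated sample data
    ("love", now - timedelta(minutes=5)),
    ("haha", now - timedelta(minutes=20)),
    ("angry", now - timedelta(hours=3)),
]

def mood_score(reactions, window=timedelta(hours=1)):
    """Average weight of reactions within the last `window`."""
    recent = [MOOD_WEIGHTS[r] for r, t in reactions if now - t <= window]
    return sum(recent) / len(recent) if recent else 0.0

def good_time_for_ad(reactions, threshold=1.0):
    return mood_score(reactions) >= threshold

print(mood_score(recent_reactions))        # 2.0: only love/haha fall in the window
print(good_time_for_ad(recent_reactions))  # True
```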
“A document obtained by New Scientist reveals that the tech giant’s collaboration with the UK’s National Health Service goes far beyond what has been publicly announced. The document — a data-sharing agreement between Google-owned artificial intelligence company DeepMind and the Royal Free NHS Trust — gives the clearest picture yet of what the company is doing and what sensitive data it now has access to. The agreement gives DeepMind access to a wide range of healthcare data on the 1.6 million patients who pass through three London hospitals.
It includes logs of day-to-day hospital activity, such as records of the location and status of patients – as well as who visits them and when. The hospitals will also share the results of certain pathology and radiology tests.
As well as receiving this continuous stream of new data, DeepMind has access to the historical data that the Royal Free trust submits to the Secondary Uses Service (SUS) database – the NHS’s centralised record of all hospital treatments in the UK. This includes data from critical care and accident and emergency departments.
Google says it has no commercial plans for DeepMind’s work with Royal Free and that the current pilots are being done for free. But the data to which Royal Free is giving DeepMind access is hugely valuable. DeepMind may have to destroy its copy of the data when the agreement expires next year, but that still gives ample time to mine it for health insights.”