Archives August 2017

“Are you happy now? The uncertain future of emotion analytics”

Elise Thomas writes at Hopes & Fears:

“Right now, in a handful of computing labs scattered across the world, new software is being developed which has the potential to completely change our relationship with technology. Affective computing is about creating technology which recognizes and responds to your emotions. Using webcams, microphones or biometric sensors, the software uses a person’s physical reactions to analyze their emotional state, generating data which can then be used to monitor, mimic or manipulate that person’s emotions.”

“Corporations spend billions each year trying to build “authentic” emotional connections to their target audiences. Marketing research is one of the most prolific research fields around, conducting thousands of studies on how to more effectively manipulate consumers’ decision-making. Advertisers are extremely interested in affective computing and particularly in a branch known as emotion analytics, which offers unprecedented real-time access to consumers’ emotional reactions and the ability to program alternative responses depending on how the content is being received.

For example, if two people watch an advertisement with a joke and only one person laughs, the software can be programmed to show more of the same kind of advertising to the person who laughs while trying different sorts of advertising on the person who did not laugh to see if it’s more effective. In essence, affective computing could enable advertisers to create individually tailored advertising en masse.”
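The feedback loop described in that example — show an ad, observe the emotional reaction, adjust what is shown next — is essentially a per-viewer multi-armed bandit. A minimal sketch of how such a selection policy might work (all names here are hypothetical; no real emotion-analytics API is assumed, and the "reaction score" would in practice come from some webcam-based classifier):

```python
import random

# Hypothetical ad styles an advertiser might rotate between.
AD_STYLES = ["humor", "sentimental", "celebrity", "informational"]

class AdSelector:
    """Epsilon-greedy selection of ad styles for one viewer,
    driven by observed emotional reactions."""

    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        # Running reaction totals and impression counts per ad style.
        self.totals = {s: 0.0 for s in AD_STYLES}
        self.counts = {s: 0 for s in AD_STYLES}

    def choose(self):
        # Explore occasionally; otherwise exploit the style with the
        # best average reaction so far.
        if random.random() < self.epsilon or not any(self.counts.values()):
            return random.choice(AD_STYLES)
        return max(AD_STYLES,
                   key=lambda s: self.totals[s] / max(self.counts[s], 1))

    def record(self, style, reaction_score):
        # reaction_score: e.g. 1.0 if the viewer laughed, 0.0 if not,
        # as reported by an (assumed) emotion-recognition component.
        self.totals[style] += reaction_score
        self.counts[style] += 1

# A viewer who laughs at the joke is steered toward more humor ads,
# exactly the dynamic the article describes.
viewer = AdSelector(epsilon=0.0)  # epsilon=0 for a deterministic demo
viewer.record("humor", 1.0)
viewer.record("sentimental", 0.0)
print(viewer.choose())  # prints "humor"
```

The unsettling part of the article’s scenario is precisely that this logic is trivial: once a reaction signal exists, a few dozen lines suffice to personalize content toward whatever emotional response maximizes engagement.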

“Say 15 years from now a particular brand of weight loss supplements obtains a particular girl’s information and locks on. When she scrolls through her Facebook, she sees pictures of rail-thin celebrities, carefully calibrated to capture her attention. When she turns on the TV, it automatically starts on an episode of “The Biggest Loser,” tracking her facial expressions to find the optimal moment for a supplement commercial. When she sets her music on shuffle, it “randomly” plays through a selection of the songs which make her sad. This goes on for weeks.

Now let’s add another layer. This girl is 14, and struggling with depression. She’s being bullied in school. Having become the target of a deliberate and persistent campaign by her technology to undermine her body image and sense of self-worth, she’s at risk of making some drastic choices.”

“How Apple Is Putting Voices in Users’ Heads—Literally”

“… a collaboration between Apple and Cochlear, a company that has been involved with implant technology since the treatment’s early days … announced last week that the first product based on this approach, Cochlear’s Nucleus 7 sound processor, won FDA approval in June—the first time that the agency has approved such a link between cochlear implants and phones or tablets.

Those using the system can not only get phone calls directly routed inside their skulls, but also stream music, podcasts, audio books, movie soundtracks, and even Siri—all straight to the implant.

It connects with hearing aids whose manufacturers have adopted the free Apple protocols, earning them a “Made for iPhone” approval. Apple also has developed a feature called Live Listen that lets hearing aid users employ the iPhone as a microphone—which comes in handy at meetings and restaurants. An iPhone or iPod Touch pairs with hearing aids—cochlear and conventional—the same way that it finds AirPods or nearby Bluetooth speakers.

Merging medical technology like Apple’s is a clear benefit to those needing hearing help. But I’m intrigued by some observations that Dr. Biever, the audiologist who’s worked with hearing loss patients for two decades, shared with me. She says that with this system, patients have the ability to control their sound environment in a way that those with good hearing do not—so much so that she is sometimes envious. How cool would it be to listen to a song without anyone in the room hearing it? “When I’m in the noisiest of rooms and take a call on my iPhone, I can’t hold my phone to my ear and do a call,” she says. “But my recipient can do this.”

This paradox reminds me of the approach I’m seeing in the early commercial efforts to develop a brain-machine interface: an initial focus on those with cognitive challenges with a long-term goal of supercharging everyone’s brain. We’re already sort of cyborgs, working in a partnership of dependency with those palm-size slabs of glass and silicon that we carry in our pockets and purses. The next few decades may well see them integrated subcutaneously.

Cities target smartphone zombies

“A ban on pedestrians looking at mobile phones or texting while crossing the street will take effect in Hawaii’s largest city in late October, as Honolulu becomes the first major U.S. city to pass legislation aimed at reducing injuries and deaths from “distracted walking.”

The ban comes as cities around the world grapple with how to protect phone-obsessed “smartphone zombies” from injuring themselves by stepping into traffic or running into stationary objects.

Starting Oct. 25, Honolulu pedestrians can be fined between $15 and $99, depending on the number of times police catch them looking at a phone or tablet device as they cross the street, Mayor Kirk Caldwell told reporters gathered near one of the city’s busiest downtown intersections on Thursday… People making calls for emergency services are exempt from the ban… Opponents of the Honolulu law argued it infringes on personal freedom and amounts to government overreach.”

In a related article: “The city of London has tried putting pads on their lamp posts ‘to soften the blow for distracted walkers’…”