Archives 16 December 2019

The Price of Recycling Old Laptops: Toxic Fumes in Thailand’s Lungs

The e-waste industry is booming in Southeast Asia, frightening residents worried for their health. Despite a ban on imports, Thailand is a center of the business.

Crouched on the ground in a dimly lit factory, the women picked through the discarded innards of the modern world: batteries, circuit boards and bundles of wires. They broke down the scrap — known as e-waste — with hammers and raw hands. Men, some with faces wrapped in rags to repel the fumes, shoveled the refuse into a clanking machine that salvages usable metal. As they toiled, smoke spewed over nearby villages and farms. Residents have no idea what is in the smoke: plastic, metal, who knows? All they know is that it stinks and they feel sick.

The factory, New Sky Metal, is part of a thriving e-waste industry across Southeast Asia, born of China’s decision to stop accepting the world’s electronic refuse, which was poisoning its land and people. Thailand in particular has become a center of the industry even as activists push back and its government wrestles to balance competing interests of public safety with the profits to be made from the lucrative trade. Last year, Thailand banned the import of foreign e-waste. Yet new factories are opening across the country, and tons of e-waste are being processed, environmental monitors and industry experts say. “E-waste has to go somewhere,” said Jim Puckett, the executive director of the Basel Action Network, which campaigns against trash dumping in poor countries, “and the Chinese are simply moving their entire operations to Southeast Asia.”

NHS Gives Amazon Free Use of Health Data Under Alexa Advice Deal

Amazon has been given free access to healthcare information collected by the NHS as part of a contract with the government. The material, which excludes patient data, could allow the multinational technology company to make, advertise and sell its own products.

In July the health secretary, Matt Hancock, said a partnership with the NHS that allowed Amazon Alexa devices to offer expert health advice to users would reduce pressure on “our hard-working GPs and pharmacists.” But responses to freedom of information requests, published by the Sunday Times, showed the contract will also allow the company access to information on symptoms, causes and definitions of conditions, and “all related copyrightable content and data and other materials.” Amazon, which is worth $863bn and is run by the world’s richest person, Jeff Bezos, can then create “new products, applications, cloud-based services and/or distributed software,” which the NHS would not benefit from financially. It can also share the information with third parties. Labour’s shadow health secretary, Jonathan Ashworth, told the Sunday Times that the government was “highly irresponsible” and “in the pocket of big corporate interests.”

Turkey is Getting Military Drones Armed With Machine Guns

A drone with a machine gun attached can hit targets with high precision, according to its makers. Turkey is set to become the first country to have the drone when it takes delivery this month. The 25-kilogram drone has eight rotating blades to get it into the air. Its machine gun carries 200 rounds of ammunition and can fire single shots or 15-round bursts. Many countries and groups already use small military drones that can drop grenades or fly into a target to detonate an explosive. The new drone, called Songar and made by Ankara-based electronics firm Asisguard, is the first drone to be equipped with a firearm and be ready for service. Turkey expects the drones to be delivered before the end of the year.

It is hard for a drone to shoot accurately, partly because of the difficulty of judging range and angle, and partly because the recoil from each shot significantly moves the drone, affecting the aim for the next round. Songar has two systems to overcome these challenges. One uses sensors, including cameras and a laser rangefinder, to calculate distance, angle and wind speed, and work out where to aim. The second is a set of robot arms that move the machine gun to compensate for the effects of recoil.
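The article does not say how Songar's aiming computation actually works, but the general idea it describes, turning measured range, angle and wind into an aim correction, can be sketched in a few lines. The snippet below is a purely illustrative toy "hold-off" calculation; the function name, the assumed muzzle velocity and the flat-fire drop and wind terms are hypothetical simplifications, not details of Asisguard's fire-control system.

    import math

    # Toy hold-off calculation: illustrative only, not the Songar fire-control system.
    # The constants below are assumptions for the sake of the example.
    G = 9.81                 # gravitational acceleration, m/s^2
    MUZZLE_VELOCITY = 715.0  # assumed muzzle velocity, m/s

    def aim_offsets(range_m: float, elevation_deg: float, crosswind_ms: float):
        """Return (vertical_m, lateral_m) corrections to add to the aim point.

        Flat-fire approximation: time of flight ~ range / horizontal velocity,
        bullet drop ~ 0.5 * g * t^2. The wind term crudely treats the round as
        fully carried by the crosswind, which overstates real ballistic drift.
        """
        horizontal_v = MUZZLE_VELOCITY * math.cos(math.radians(elevation_deg))
        t = range_m / horizontal_v
        drop = 0.5 * G * t ** 2      # aim this much higher
        drift = crosswind_ms * t     # aim this much into the wind
        return drop, drift

    # Example: a 200 m shot, level fire, 3 m/s crosswind.
    up, across = aim_offsets(range_m=200.0, elevation_deg=0.0, crosswind_ms=3.0)
    print(f"Hold {up:.2f} m high and {across:.2f} m into the wind")

A real system would also have to account for air drag, sensor latency and the drone's own motion, which is part of why the second system, the robot arms that soak up recoil, matters as much as the aiming sensors.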

Emotion Recognition Tech Should Be Banned, Says an AI Research Institute

A leading research centre has called for new laws to restrict the use of emotion-detecting tech. The AI Now Institute says the field is “built on markedly shaky foundations.” Despite this, systems are on sale to help vet job seekers, test criminal suspects for signs of deception, and set insurance prices. It wants such software to be banned from use in important decisions that affect people’s lives and/or determine their access to opportunities. The US-based body has found support in the UK from the founder of a company developing its own emotional-response technologies — but it cautioned that any restrictions would need to be nuanced enough not to hamper all work being done in the area.

AI Now refers to the technology by its formal name, affect recognition, in its annual report. It says the sector is undergoing a period of significant growth and could already be worth as much as $20 billion. “It claims to read, if you will, our inner-emotional states by interpreting the micro-expressions on our face, the tone of our voice or even the way that we walk,” explained co-founder Prof Kate Crawford. “It’s being used everywhere, from how do you hire the perfect employee through to assessing patient pain, through to tracking which students seem to be paying attention in class. At the same time as these technologies are being rolled out, large numbers of studies are showing that there is… no substantial evidence that people have this consistent relationship between the emotion that you are feeling and the way that your face looks.”

YouTube’s Algorithm Made Fake CNN Reports Go Viral

“YouTube channels posing as American news outlets racked up millions of views on false and inflammatory videos over several months this year,” reports CNN.

“All with the help of YouTube’s recommendation engine.”

Many of the accounts, which mostly used footage from CNN but also employed some video from Fox News, exploited a YouTube feature that automatically creates channels on certain topics. Those topic channels are then automatically populated by videos related to the topic — including, in this case, blatant misinformation.

YouTube has now shut down many of the accounts.

YouTube’s own algorithms also recommended videos from the channels to American users who watched videos about U.S. politics. That the channels could achieve such virality — one channel was viewed more than two million times over one weekend in October — raises questions about YouTube’s preparedness for tackling misinformation on its platform just weeks before the Iowa caucuses and points to the continuing challenge platforms face as people try to game their systems….

Responding to the findings on Thursday, a CNN spokesperson said YouTube needs to take responsibility.

“When accounts were deleted or banned, they were able to spin up new accounts within hours,” added Plasticity, a natural language processing and AI startup that analyzed the data and identified at least 25 different accounts, which YouTube then shut down.

“The tactics they used to game the YouTube algorithm were executed perfectly. They knew what they were doing.”