Archives 1 July 2018


We’ve Reached ‘Peak Screen’; Voice Is Next

We’ve hit what I call Peak Screen. For much of the last decade, a technology industry ruled by smartphones has pursued a singular goal of completely conquering our eyes. It has given us phones with ever-bigger screens and unbelievable cameras, not to mention virtual reality goggles and several attempts at camera-glasses. Tech has now captured just about all of the visual attention we have to give: Americans spend three to four hours a day looking at their phones and about 11 hours a day looking at screens of any kind.

So tech giants are building the beginning of something new: a less insistently visual tech world, a digital landscape that relies on voice assistants, headphones, watches and other wearables to take some pressure off our eyes. This could be a nightmare; we may simply add these new devices to our screen-addled lives. Google, Apple and Amazon all offer voice assistants along these lines.

Private investigators call for people to contribute their DNA to public database

Last month, DNA-based investigations also led to the arrest of the suspected murderer of two vacationers in 1987 and helped identify the victim in a 2001 suicide cold case. Emboldened by those breakthroughs, a number of private investigators are spearheading a call for amateur genealogists to help solve other cold cases by contributing their own genetic information to the same public database. They say a larger pool of genetic information would widen the net of potential matches, making it easier to find criminals who have eluded capture. The idea is to get people to transfer profiles compiled by commercial genealogy sites such as Ancestry.com and 23andMe onto GEDmatch, the smaller, open-source public database created in 2010. The commercial sites require authorities to obtain search warrants for the information; the public site does not.

UK Police Plan To Deploy ‘Staggeringly Inaccurate’ Facial Recognition in London

Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns. The controversial software, which officers use to identify suspects, has been found to be “staggeringly inaccurate”, while campaigners have branded its use a violation of privacy. But Britain’s largest police force is set to expand a trial across six locations in London over the coming months.

Police leaders claimed that officers make the final decision to act on potential matches with police records, and that images which do not trigger an alert are immediately deleted. But last month The Independent revealed the Metropolitan Police’s software was returning “false positives” (matches to people who were not on a police database) in 98 per cent of alerts… Detective Superintendent Bernie Galopin, the lead on facial recognition for London’s Metropolitan Police, said the operation was targeting wanted suspects to help reduce violent crime and make the area safer. “It allows us to deal with persons that are wanted by police where traditional methods may have failed,” he told The Independent, after statistics showed police were failing to solve 63 per cent of knife crimes committed against under-25s…

Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing, which is currently being stretched by budget cuts, falling officer numbers, rising demand and the terror threat.

A policy officer from the National Council for Civil Liberties called the technology “lawless,” adding “the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society.”

But a Home Office minister said the technology was vital for protecting people from terrorism, though “we must ensure that privacy is respected. This strategy makes clear that we will grasp the opportunities that technology brings while remaining committed to strengthening safeguards.”