December 8, 2018

Internal Emails Show Facebook Weighing the Privacy Risks of Quietly Collecting Call and Text Records From Its Android Users—Then Going Ahead Anyway

Earlier this year, many Android users were shocked to discover that Facebook had been collecting a record of their call and SMS history, as revealed by the company’s data download tool. Now, internal emails released by the UK Parliament show how the decision was made internally.

According to the emails, developers knew the data was sensitive, but they still pushed to collect it as a way of expanding Facebook’s reach. The emails show Facebook’s growth team eyeing call-log data as a way to improve Facebook’s algorithms and to surface new contacts through the “People You May Know” feature. Notably, the project manager recognized it as “a pretty high-risk thing to do from a PR perspective,” but that risk seems to have been outweighed by the potential for user growth.

Initially, the feature was intended to require users to opt in, typically through an in-app pop-up dialog. But as developers looked for ways to boost sign-ups, it became clear that Android’s permission model could be exploited to enroll users automatically if the new feature was deployed in a certain way.

Thieves Are Boosting the Signal From Key Fobs Inside Homes To Steal Vehicles

According to Markham automotive security specialist Jeff Bates, owner of Lockdown Security, wireless key fobs play a role in many recent car thefts, with thieves intercepting and relaying their signals — even from inside homes — to open and steal cars. Many of these thieves, Bates said, use a method called “relay theft.” A key fob constantly broadcasts a signal that communicates with one specific vehicle, and when the fob comes within close enough range, the vehicle will unlock and start. One thief holds a device near the home’s door, close to where most people leave their keys, to capture and boost the fob’s signal; a second device left near the vehicle receives the relayed signal and opens the car. Many people don’t realize it, Bates said, but the thieves don’t need the fob inside the car to drive it away.
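The weakness Bates describes can be illustrated with a toy simulation (hypothetical code, not any real fob protocol; actual vehicles use proprietary challenge–response or rolling-code schemes, but the relay flaw is generic): the fob answers any challenge it hears, so a cryptographically valid response proves nothing about how far away the fob actually is.

```python
import hashlib
import hmac
import os

# Shared key provisioned in both the car and its fob (toy model).
SECRET = os.urandom(16)

class Fob:
    def __init__(self, key: bytes):
        self.key = key

    def respond(self, challenge: bytes) -> bytes:
        # The fob signs whatever challenge reaches it; it has no way
        # to tell whether the radio link is direct or relayed.
        return hmac.new(self.key, challenge, hashlib.sha256).digest()

class Car:
    def __init__(self, key: bytes):
        self.key = key

    def try_unlock(self, transport) -> bool:
        # The car sends a random challenge and unlocks if the answer
        # verifies. Nothing here measures distance or round-trip time.
        challenge = os.urandom(16)
        response = transport(challenge)
        expected = hmac.new(self.key, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(response, expected)

fob, car = Fob(SECRET), Car(SECRET)

# Normal use: the fob is in direct radio range of the car.
assert car.try_unlock(fob.respond)

# Relay theft: one device near the house forwards the car's challenge
# to the fob and pipes the answer back to a second device by the car.
# The crypto checks out, so the car opens while the fob stays indoors.
def relay(challenge: bytes) -> bytes:
    return fob.respond(challenge)  # stands in for the two radio bridges

assert car.try_unlock(relay)
```

Real countermeasures (distance bounding, motion-sensing fobs, signal-blocking pouches) all target the same gap the sketch exposes: the handshake alone never proves proximity.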

An Eye-Scanning Lie Detector Is Forging a Dystopian Future

Sitting in front of a Converus EyeDetect station, it’s impossible not to think of Blade Runner. In the 1982 sci-fi classic, Harrison Ford’s rumpled detective identifies artificial humans using a steam-punk Voight-Kampff device that watches their eyes while they answer surreal questions. EyeDetect’s questions are less philosophical, and the penalty for failure is less fatal (Ford’s character would whip out a gun and shoot). But the basic idea is the same: By capturing imperceptible changes in a participant’s eyes — measuring things like pupil dilation and reaction time — the device aims to sort deceptive humanoids from genuine ones.

It claims to be, in short, a next-generation lie detector. Polygraph tests are a $2 billion industry in the US and, despite their inaccuracy, are widely used to screen candidates for government jobs. Released in 2014 by Converus, a Mark Cuban-funded startup, EyeDetect is pitched by its makers as a faster, cheaper, and more accurate alternative to the notoriously unreliable polygraph. By many measures, EyeDetect appears to be the future of lie detection — and it’s already being used by local and federal agencies to screen job applicants.

In documents obtained through public records requests, Converus says that the Defense Intelligence Agency and the US Customs and Border Protection are also trialing the technology. Converus says that individual locations of Best Western, FedEx, Four Points by Sheraton, McDonald’s, and IHOP chains have used the tech in Guatemala and Panama within the last three years. (A 1988 federal law prohibits most private companies from using any kind of lie detector on staff or recruits in America.) WIRED reached out to all five companies, but none were able to confirm that they had used EyeDetect.

Google personalizes search results even when you’re logged out

According to a new study conducted by Google competitor DuckDuckGo, it does not seem possible to avoid personalization when using Google search, even by logging out of your Google account and using the private browsing “incognito” mode.

DuckDuckGo conducted the study in June of this year, at the height of the US midterm election season. Its ostensible goal was to determine whether Google’s search results exacerbate ideological bubbles by feeding you only information you’ve signaled you want to consume via past behavior and the data collected about you. It’s not clear whether that question can be reliably answered with these findings, and DuckDuckGo is obviously a biased source with something to gain by pointing out how flawed Google’s approach may be. But the study’s findings are nonetheless interesting because they highlight just how much variance there is in Google search results, even when controlling for factors like location.
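A study like this boils down to comparing the ranked result lists different logged-out users see for the same query. One simple way to quantify that, sketched below with made-up domains (this is an illustration of the general approach, not DuckDuckGo’s actual methodology or data), is pairwise set overlap: identical results score 1.0, fully disjoint results score 0.0.

```python
def jaccard(a, b):
    """Overlap between two result lists, ignoring rank: 1.0 = identical."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

# Top results for the same query as seen by three logged-out,
# incognito "users" (hypothetical domains for illustration).
results = {
    "user_a": ["site1.com", "site2.com", "site3.com", "site4.com"],
    "user_b": ["site1.com", "site3.com", "site5.com", "site6.com"],
    "user_c": ["site1.com", "site2.com", "site3.com", "site4.com"],
}

# If logged-out search were truly unpersonalized, every pair should
# score at or near 1.0 once location and timing are controlled for.
for u, v in [("user_a", "user_b"), ("user_a", "user_c"), ("user_b", "user_c")]:
    print(u, v, round(jaccard(results[u], results[v]), 2))
```

Rank-aware measures (such as comparing the position of each shared result) would catch a subtler form of variance that pure set overlap misses; the study also had to control for results changing over time, which is why simultaneous queries matter.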