Resources

An Eye-Scanning Lie Detector Is Forging a Dystopian Future

Sitting in front of a Converus EyeDetect station, it’s impossible not to think of Blade Runner. In the 1982 sci-fi classic, Harrison Ford’s rumpled detective identifies artificial humans using a steampunk Voight-Kampff device that watches their eyes while they answer surreal questions. EyeDetect’s questions are less philosophical, and the penalty for failure is less fatal (Ford’s detective would simply draw his gun and shoot). But the basic idea is the same: by capturing imperceptible changes in a participant’s eyes — measuring things like pupil dilation and reaction time — the device aims to sort liars from truth-tellers.

It claims to be, in short, a next-generation lie detector. Polygraph tests are a $2 billion industry in the US and, despite their inaccuracy, are widely used to screen candidates for government jobs. Released in 2014 by Converus, a Mark Cuban-funded startup, EyeDetect is pitched by its makers as a faster, cheaper, and more accurate alternative to the notoriously unreliable polygraph. By many measures, EyeDetect appears to be the future of lie detection — and it’s already being used by local and federal agencies to screen job applicants.

In documents obtained through public records requests, Converus says that the Defense Intelligence Agency and US Customs and Border Protection are also trialing the technology, and that individual Best Western, FedEx, Four Points by Sheraton, McDonald’s, and IHOP locations have used the tech in Guatemala and Panama within the last three years. (A 1988 federal law prohibits most private companies from using any kind of lie detector on staff or recruits in America.) WIRED reached out to all five companies, but none could confirm that they had used EyeDetect.

Companies ‘can sack workers for refusing to use fingerprint scanners’

Businesses using fingerprint scanners to monitor their workforce can legally sack employees who refuse to hand over biometric information on privacy grounds, the Fair Work Commission has ruled.

The ruling, which will be appealed, was made in the case of Jeremy Lee, a Queensland sawmill worker who refused to comply with a new fingerprint scanning policy introduced at his work in Imbil, north of the Sunshine Coast, late last year.

Fingerprint scanning was used to monitor the clock-on and clock-off times of about 150 sawmill workers at two sites and was preferred to swipe cards because it prevented workers from fraudulently signing in on behalf of their colleagues to mask absences.

The company, Superior Woods, had no privacy policy covering workers and failed to comply with a requirement to properly notify individuals about how and why their data was being collected and used. The biometric data was stored on servers located off-site, in space leased from a third party.

Lee argued the business had never sought its workers’ consent to use fingerprint scanning, and feared his biometric data would be accessed by unknown groups and individuals.

“I am unwilling to consent to have my fingerprints scanned because I regard my biometric data as personal and private,” Lee wrote to his employer last November.

“Information technology companies gather as much information/data on people as they can.

“Whether they admit to it or not. (See Edward Snowden) Such information is used as currency between corporations.”

Lee was neither antagonistic nor belligerent in his refusals, according to evidence before the commission. He simply declined to have his fingerprints scanned and continued using a physical sign-in booklet to record his attendance.

He had not missed a shift in more than three years.

The employer warned him about his stance repeatedly, and claimed the fingerprint scanner did not actually record a fingerprint, but rather “a set of data measurements which is processed via an algorithm”. The employer told Lee there was no way the data could be “converted or used as a finger print”, and would only be used to link his payroll number to his clock-on and clock-off times. It said the fingerprint scanners were also needed for workplace safety, to accurately identify which workers were on site in the event of an accident.

Lee was given a final warning in January, and responded that he valued his job a “great deal” and wanted to find an alternative way to record his attendance.

“I would love to continue to work for Superior Wood as it is a good, reliable place to work,” he wrote to his employer. “However, I do not consent to my biometric data being taken. The reason for writing this letter is to impress upon you that I am in earnest and hope there is a way we can negotiate a satisfactory outcome.”

Lee was sacked in February, and lodged an unfair dismissal claim in the Fair Work Commission.

He argued he was sacked for failing to comply with an unreasonable direction, because the fingerprint scanning was in breach of Australian privacy laws. His biometric information was sent to a separate corporate entity that was not his employer, Lee argued. His employer had no privacy policy in place at the time, and he argued it had failed to issue a privacy collection notice to its employees, as required by law. Lee argued the company had effectively breached the privacy of its 150 workers twice a day, every day since fingerprint scanning was introduced.

But the unfair dismissal claim failed. The Fair Work Commission found the site attendance policy that Lee had breached was lawful. It found that although the company may have breached privacy laws, the site attendance policy was not automatically rendered unlawful as it related to Lee.

“While there may have been a breach of the Privacy Act relevant to the notice given to employees, the private and sensitive information was not collected and would never be collected relevant to Mr Lee because of his steadfast refusal,” the commission found. “The policy itself is not unlawful, simply the manner in which the employer went about trying to obtain consent may have constituted a breach of the Privacy Act.”

Lee told Guardian Australia he planned to appeal. He said the ruling implied that Australians only owned their biometric data until an employer demanded it, at which point they could be sacked if they refused to consent.

“My biometric data is inherently mine and inseparable from me,” Lee said. “My employer can’t demand it or sack me for refusing to give it.”

“It’s not about this particular employer. Ownership to me means that I can refuse consent without being sacked.”

Fake fingerprints can imitate real ones in biometric systems

Researchers have used a neural network to generate artificial fingerprints that work as a “master key” for biometric identification systems, demonstrating that convincing fake fingerprints can be created.

According to a paper presented at a security conference in Los Angeles, the artificially generated fingerprints, dubbed “DeepMasterPrints” by the researchers from New York University, were able to imitate more than one in five fingerprints in a biometric system that should only have an error rate of one in a thousand.

The researchers, led by NYU’s Philip Bontrager, say that “the underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis.” As with much security research, demonstrating flaws in existing authentication systems is considered to be an important part of developing more secure replacements in the future.

In order to work, the DeepMasterPrints take advantage of two properties of fingerprint-based authentication systems. The first is that, for ergonomic reasons, most fingerprint readers do not read the entire finger at once, instead imaging whichever part of the finger touches the scanner.

Crucially, such systems do not blend all the partial images in order to compare the full finger against a full record; instead, they simply compare the partial scan against the partial records. That means an attacker has to match just one of tens or hundreds of saved partial fingerprints in order to be granted access.
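As a back-of-the-envelope illustration of why this matters (the 0.1% per-comparison rate and the template counts below are assumptions for illustration, not figures from the paper), the chance of a false match compounds quickly as a sensor stores more partial templates:

```python
# Sketch: how a per-comparison false-match rate compounds when a sensor
# stores many partial fingerprint templates. All numbers are illustrative
# assumptions, not figures from the DeepMasterPrints paper.

def chance_of_at_least_one_match(per_comparison_rate: float, stored_templates: int) -> float:
    """Probability that a fake print matches at least one stored partial template."""
    return 1 - (1 - per_comparison_rate) ** stored_templates

for n in (1, 10, 50, 100):
    p = chance_of_at_least_one_match(0.001, n)
    print(f"{n:>3} stored partials -> {p:.1%} chance of a false match")
```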

The second is that some features of fingerprints are more common than others. That means that a fake print that contains a lot of very common features is more likely to match with other fingerprints than pure chance would suggest.

Based on those insights, the researchers used a common machine learning technique, called a generative adversarial network, to artificially create new fingerprints that matched as many partial fingerprints as possible.

The neural network not only allowed them to create multiple fingerprint images; it also created fakes that look convincingly like a real fingerprint to the human eye – an improvement on a previous technique, which created jagged, right-angled fingerprints that would fool a scanner but not a visual inspection.
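For readers unfamiliar with the technique, here is a minimal, illustrative PyTorch sketch of the standard adversarial setup the researchers built on. The tiny architectures and the random tensors standing in for a real fingerprint dataset are assumptions for illustration; the published DeepMasterPrints work additionally searches the trained generator's latent space for outputs that maximize matches against a fingerprint matcher.

```python
# Minimal GAN sketch of the adversarial setup described above.
# Architectures, sizes, and the random stand-in data are illustrative.
import torch
import torch.nn as nn

LATENT, IMG = 64, 32 * 32  # latent vector size; flattened 32x32 image

generator = nn.Sequential(
    nn.Linear(LATENT, 256), nn.ReLU(),
    nn.Linear(256, IMG), nn.Tanh(),       # emits a fake "fingerprint" image
)
discriminator = nn.Sequential(
    nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),                    # real-vs-fake logit
)

loss = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

real_batch = torch.rand(16, IMG) * 2 - 1  # stand-in for real partial prints

for step in range(200):
    # Discriminator step: label real images 1, generated images 0.
    fake = generator(torch.randn(16, LATENT)).detach()
    d_loss = loss(discriminator(real_batch), torch.ones(16, 1)) + \
             loss(discriminator(fake), torch.zeros(16, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Generator step: try to make the discriminator call fakes real.
    fake = generator(torch.randn(16, LATENT))
    g_loss = loss(discriminator(fake), torch.ones(16, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```

The adversarial feedback loop is what produces the visual realism noted above: the generator only improves because the discriminator keeps learning to reject its output.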

They compare the method to a “dictionary attack” against passwords, where a hacker runs a pre-generated list of common passwords against a security system.

Such attacks may not be able to break into any specific account, but when used against accounts at scale, they generate enough successes to be worth the effort.
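A toy version of that arithmetic, with assumed numbers rather than anything from the paper: a master print that unlocks any given account only occasionally still pays off when tried in bulk.

```python
# Toy arithmetic for the dictionary-attack analogy. The 5% per-account
# success rate and the account count are illustrative assumptions.
per_account_rate = 0.05   # chance one master print unlocks a given account
accounts = 10_000         # accounts the attacker tries it against

expected_breaches = per_account_rate * accounts
p_at_least_one = 1 - (1 - per_account_rate) ** accounts
print(f"expected compromised accounts: {expected_breaches:.0f}")
print(f"chance of at least one success: {p_at_least_one:.6f}")
```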

Banks and Retailers Are Tracking How You Type, Swipe and Tap

When you’re browsing a website and the mouse cursor disappears, it might be a computer glitch — or it might be a deliberate test to find out who you are.

The way you press, scroll and type on a phone screen or keyboard can be as unique as your fingerprints or facial features. To fight fraud, a growing number of banks and merchants are tracking visitors’ physical movements as they use websites and apps.

The data collection is invisible to those being watched. Using sensors in your phone or code on websites, companies can gather thousands of data points, known as “behavioral biometrics.”

A phone’s touchscreen sensors can track where and how you swipe your device to help determine who you are.

The angle at which you hold your device is one of the many biometric markers that can be measured.
Behavioral monitoring software churns through thousands of elements to calculate a probability-based guess about whether a person is who they claim. Two major advances have fed its growing use: the availability of cheap computing power and the sophisticated array of sensors now built into most smartphones.
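As a rough illustration of how such scoring can work (this is a hypothetical sketch, not any vendor's algorithm; the features, profile values, and the exponential squashing are all assumptions), a stored profile of a user's habits can be compared against a live session:

```python
# Hypothetical behavioral-biometrics scorer: compare a session's measured
# features against a user's enrolled profile and squash the average
# deviation into a 0..1 "same user" score. Purely illustrative.
from dataclasses import dataclass
import math

@dataclass
class FeatureProfile:
    mean: float
    std: float

# Assumed enrolled profile: keystroke interval (ms), swipe speed (px/s), tap pressure.
profile = {
    "key_interval_ms": FeatureProfile(mean=185.0, std=25.0),
    "swipe_speed_px_s": FeatureProfile(mean=900.0, std=150.0),
    "tap_pressure": FeatureProfile(mean=0.42, std=0.08),
}

def session_score(session: dict) -> float:
    """Return a 0..1 score: near 1.0 is on-profile, near 0 is far off."""
    zs = [abs(session[f] - p.mean) / p.std for f, p in profile.items()]
    return math.exp(-sum(zs) / len(zs))

genuine = {"key_interval_ms": 190.0, "swipe_speed_px_s": 870.0, "tap_pressure": 0.45}
impostor = {"key_interval_ms": 120.0, "swipe_speed_px_s": 1400.0, "tap_pressure": 0.20}
print(f"genuine session:  {session_score(genuine):.2f}")
print(f"impostor session: {session_score(impostor):.2f}")
```

Production systems reportedly weigh thousands of such features rather than three, but the principle is the same: the output is a probability-style score, not a hard yes or no.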

The system’s unobtrusiveness is part of its appeal, Mr. Hanley said. Traditional physical biometrics, like fingerprints or irises, require special scanning hardware for authentication. But behavioral traits can be captured in the background, without customers doing anything to sign up.

BioCatch occasionally tries to elicit a reaction. It can speed up the selection wheel you use to enter data like dates and times on your phone, or make your mouse cursor disappear for a fraction of a second.

“Everyone reacts a little differently to that,” said Frances Zelazny, BioCatch’s chief strategy and marketing officer. “Some people move the mouse side to side; some people move it up and down. Some bang on the keyboard.”

Because your reaction is so individual, it’s hard for a fraudulent user to fake. And because customers never know the monitoring technology is there, it doesn’t impose the kind of visible, and irritating, roadblocks that typically accompany security tests. You don’t need to press your thumb on your phone’s fingerprint reader or type in an authentication code.

Biometric software can also determine the pressure you tend to apply to your phone when you tap and type.

“We don’t have to sit people down in a room and get them to type under perfect laboratory conditions,” said Neil Costigan, the chief executive of BehavioSec, a Palo Alto, Calif., company that makes software used by many Nordic banks. “You just watch them, silently, while they go about their normal account activities.”

UK Police Plan To Deploy ‘Staggeringly Inaccurate’ Facial Recognition in London

Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns. The controversial software, which officers use to identify suspects, has been found to be “staggeringly inaccurate”, while campaigners have branded its use a violation of privacy. But Britain’s largest police force is set to expand a trial across six locations in London over the coming months.

Police leaders claimed officers make the final decision to act on potential matches with police records, and that images which do not spark an alert are immediately deleted. But last month The Independent revealed the Metropolitan Police’s software was returning “false positives” — images of people who were not on a police database — in 98 per cent of alerts… Detective Superintendent Bernie Galopin, the lead on facial recognition for London’s Metropolitan Police, said the operation was targeting wanted suspects to help reduce violent crime and make the area safer. “It allows us to deal with persons that are wanted by police where traditional methods may have failed,” he told The Independent, after statistics showed police were failing to solve 63 per cent of knife crimes committed against under-25s….
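The 98 per cent figure is less mysterious than it sounds: when genuine targets are rare in a scanned crowd, even a fairly accurate matcher produces mostly false alerts. A quick sketch with assumed numbers (not Met figures) shows the base-rate effect:

```python
# Base-rate arithmetic with illustrative assumptions, not Met statistics:
# rare targets in a large crowd mean most alerts are false positives.
crowd = 100_000   # faces scanned at an event
wanted = 20       # genuine targets actually present
tpr = 0.90        # chance a target triggers an alert
fpr = 0.01        # chance an innocent passer-by triggers an alert

true_alerts = wanted * tpr
false_alerts = (crowd - wanted) * fpr
share_false = false_alerts / (true_alerts + false_alerts)
print(f"alerts: {true_alerts + false_alerts:.0f}, of which {share_false:.0%} are false positives")
```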

Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing, which is currently being stretched by budget cuts, falling officer numbers, rising demand and the terror threat.

A policy officer from the National Council for Civil Liberties called the technology “lawless,” adding “the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society.”

But a Home Office minister said the technology was vital for protecting people from terrorism, though “we must ensure that privacy is respected. This strategy makes clear that we will grasp the opportunities that technology brings while remaining committed to strengthening safeguards.”

New York high school will use CCTV and facial recognition to enforce discipline

Next year, high schools in Lockport, New York, will use the “Aegis” CCTV and facial recognition system to track and record the interactions of students suspected of code of conduct violations, keeping a ledger of who speaks to whom, where, and for how long.

The record will be used to assemble evidence against students and to identify possible accomplices, to whom guilt can also be ascribed.

Lockport Superintendent Michelle T. Bradley justified the decision by noting, “We always have to be on our guard. We can’t let our guard down.”

Lockport will be the first school district in the world to subject its students to this kind of surveillance. The program will cost $1.4m in state money. The technology supplier is SN Technologies of Gananoque, Ont., one of the companies in the vicinity of Kingston, Ontario, home to the majority of the province’s detention centers.

The Lockport district says that the system will make students safer by alerting officials if someone on a sex-offender registry or terrorist watchlist enters the property. None of America’s school shootings or high-profile serial sex abuse scandals were carried out by wanted terrorists or people on the sex-offender registry.

Deployed law-enforcement facial recognition systems have failure rates of 98%. The vendor responsible for Aegis would not disclose how they improved on the state of the art, but insisted that their product worked “99.97% of the time.” The spokesperson would not disclose any of the workings of the system, seemingly believing that doing so was antithetical to security.

Japan researchers warn of fingerprint theft from ‘peace’ sign, selfies

“Could flashing the ‘peace’ sign in photos lead to fingerprint data being stolen? Research by a team at Japan’s National Institute of Informatics (NII) says so, raising alarm bells over the popular two-fingered pose. Fingerprint recognition technology is becoming widely available to verify identities, such as when logging on to smartphones, tablets and laptop computers. But the proliferation of mobile devices with high-quality cameras and social media sites where photographs can be easily posted is raising the risk of personal information being leaked, reports said. The NII researchers were able to copy fingerprints based on photos taken by a digital camera three meters (nine feet) away from the subject.”

Fingerprints to be tested as ‘currency’

“Starting this summer, the [Japanese] government will test a system in which foreign tourists will be able to verify their identities and buy things at stores using only their fingerprints.

The government hopes to increase the number of foreign tourists by using the system to prevent crime and relieve users from the necessity of carrying cash or credit cards. It aims to realize the system by the 2020 Tokyo Olympic and Paralympic Games.

The experiment will have inbound tourists register their fingerprints and other data, such as credit card information, at airports and elsewhere.

Tourists would then be able to conduct tax exemption procedures and make purchases after verifying their identities by placing two fingers on special devices installed at stores.”
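Functionally, the pilot described above is an enroll-then-verify flow. The toy sketch below shows only the shape of that flow; the names are hypothetical, and real deployments use proper fingerprint templates and fuzzy matchers rather than the exact-match hashes used here for brevity.

```python
# Toy enroll-then-verify flow for the pilot described above. Real systems
# extract fingerprint templates and match them approximately; the hash
# lookup here is a deliberate simplification.
import hashlib

enrolled = {}   # template key -> traveller record

def template(finger_image: bytes) -> str:
    """Stand-in for template extraction: hash the raw two-finger scan."""
    return hashlib.sha256(finger_image).hexdigest()

def enroll(finger_image: bytes, name: str, card: str) -> None:
    """Airport step: register fingerprints alongside payment details."""
    enrolled[template(finger_image)] = {"name": name, "card": card}

def purchase(finger_image: bytes, amount: float) -> str:
    """Store step: verify identity by fingerprint, then charge the card."""
    record = enrolled.get(template(finger_image))
    if record is None:
        return "verification failed"
    return f"charged {amount:.2f} to {record['name']}'s card ending {record['card'][-4:]}"

enroll(b"fake-scan-of-two-fingers", "A. Tourist", "4242424242424242")
print(purchase(b"fake-scan-of-two-fingers", 38.50))
print(purchase(b"someone-else", 10.00))
```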