Resources

Beijing Bus Drivers Have Been Told To Wear Wristbands To Monitor Their Emotions

The move was initiated by the state-run Beijing Public Transport Holding Group, which says it is aimed at protecting public safety. But legal experts have raised privacy concerns and say the wristbands could cause bus drivers undue distress and potentially lead to discrimination. Some 1,800 wristbands were distributed to bus drivers on cross-province and highway routes on Wednesday, the official Beijing Daily reported. It is unclear how many drivers will be required to wear the devices. The report said they would be used to monitor the drivers’ vital signs and emotional state in real time to improve safety.

A Government Watchdog May Have Missed Clearview AI Use By Five Federal Agencies

A government inquiry into federal agencies’ deployment of facial recognition may have overlooked some organizations’ use of popular biometric identification software Clearview AI, calling into question whether authorities can understand the extent to which the emerging technology has been used by taxpayer-funded entities. In a 92-page report published by the Government Accountability Office on Tuesday, five agencies — the US Capitol Police, the US Probation Office, the Pentagon Force Protection Agency, the Transportation Security Administration, and the Criminal Investigation Division at the Internal Revenue Service — said they didn’t use Clearview AI between April 2018 and March 2020. This, however, contradicts internal Clearview data previously reviewed by BuzzFeed News.

In April, BuzzFeed News revealed that those five agencies were among more than 1,800 US taxpayer-funded entities that had employees who tried or used Clearview AI, based on internal company data. As part of that story, BuzzFeed News published a searchable table disclosing all the federal, state, and city government organizations whose employees are listed in the data as having used the facial recognition software as of February 2020. While the GAO was tasked with “review[ing] federal law enforcement use of facial recognition technology,” the discrepancies between the report, which was based on survey responses, and BuzzFeed News’ past reporting suggest that even the US government may not be equipped to track how its own agencies access surveillance tools like Clearview. The GAO report surveyed 42 federal agencies in total, 20 of which reported that they either owned their own facial recognition system or used one developed by a third party between April 2018 and March 2020. Ten federal agencies — including Immigration and Customs Enforcement and Customs and Border Protection — said they specifically used Clearview AI.

Microsoft Also Patented Tech to Score Meetings Using Filmed Body Language, Facial Expressions

Newly surfaced Microsoft patent filings describe a system for deriving and predicting “overall quality scores” for meetings using data such as body language, facial expressions, room temperature, time of day, and number of people in the meeting. The system uses cameras, sensors, and software tools to determine, for example, “how much a participant contributes to a meeting vs performing other tasks (e.g., texting, checking email, browsing the Internet).”

The “meeting insight computing system” would then predict the likelihood that a group will hold a high-quality meeting. It would flag potential challenges when an organizer is setting the meeting up, and recommend alternative venues, times, or people to include in the meeting, for example… A patent application made public Nov. 12 notes, “many organizations are plagued by overly long, poorly attended, and recurring meetings that could be modified and/or avoided if more information regarding meeting quality was available.” The approach would apply to in-person and virtual meetings, and hybrids of the two…

The filings do not detail any potential privacy safeguards. A Microsoft spokesperson declined to comment on the patent filings in response to GeekWire’s inquiry. To be sure, patents are not products, and there’s no sign yet that Microsoft plans to roll out this hypothetical system. Microsoft has established an internal artificial intelligence ethics office and a companywide committee to ensure that its AI products live by its principles of responsible AI, including transparency and privacy. However, the filings are a window into the ideas floating around inside Microsoft, and they’re consistent with the direction the company is already heading.

Amazon is looking into tech that can identify you using the veins in your hand

Amazon filed a patent for technology that could identify you by scanning the wrinkles in the palm of your hand and by using a light to see beneath your skin to your blood vessels. The resulting images could be used to identify you as a shopper at Amazon Go stores. It was previously reported that the Seattle-based tech giant might install these high-tech scanners in Whole Foods grocery stores. However, the U.S. Patent and Trademark Office published an application on Thursday that suggests the e-commerce behemoth has set its sights on Amazon Go stores…

While fingerprint scanners have been around for years, Amazon hopes to innovate by developing a personal identification system that you don’t have to touch. Imagine hovering your hand in front of an infrared light as a camera snaps two images — one from the surface, and one that looks for “deeper characteristics such as veins.” An internal computer system would then identify you based on that information.

NYPD Kept an Illegal Database of Juvenile Fingerprints For Years

For years, the New York Police Department illegally maintained a database containing the fingerprints of thousands of children charged as juvenile delinquents–in direct violation of state law mandating that police destroy these records after turning them over to the state’s Division of Criminal Justice Services. When lawyers representing some of those youths discovered the violation, the police department dragged its feet, at first denying but eventually admitting that it was retaining prints it was supposed to have destroyed. Since 2015, attorneys with the Legal Aid Society, which represents the majority of youths charged in New York City family courts, had been locked in a battle with the police department over retention of the fingerprint records of children under the age of 16. The NYPD did not answer questions from The Intercept about its handling of the records, but according to Legal Aid, the police department confirmed to the organization last week that the database had been destroyed. To date, the department has made no public admission of wrongdoing, nor has it notified the thousands of people it impacted, although it has changed its fingerprint retention practices following Legal Aid’s probing. “The NYPD can confirm that the department destroys juvenile delinquent fingerprints after the prints have been transmitted to DCJS,” a police spokesperson wrote in a statement to The Intercept.

Still, the way the department handled the process–resisting transparency and stalling even after being threatened with legal action–raises concerns about how police handle a growing number of databases of personal information, including DNA and data obtained through facial recognition technology. As The Intercept has reported extensively, the NYPD also maintains a secretive and controversial “gang database,” which labels thousands of unsuspecting New Yorkers–almost all black or Latino youth–as “gang members” based on a set of broad and arbitrary criteria. The fact that police were able to violate the law around juvenile fingerprints for years without consequence underscores the need for greater transparency and accountability, which critics say can only come from independent oversight of the department.

It’s unclear how long the NYPD was illegally retaining these fingerprints, but the report says the state has been using the Automated Fingerprint Identification System since 1989, “and laws protecting juvenile delinquent records have been in place since at least 1977.” Legal Aid lawyers estimate that tens of thousands of juveniles could have had their fingerprints illegally retained by police.

Vimeo Sued For Storing Faceprints of People Without Their Consent

Vimeo is collecting and storing thousands of people’s facial biometrics without their permission or knowledge, according to a complaint filed on September 20 on behalf of potentially thousands of plaintiffs under the Illinois Biometric Information Privacy Act (BIPA).

The suit takes aim at Vimeo’s Magisto application: a short-form video creation platform purchased by Vimeo in April 2019 that uses facial recognition to automatically index the faces of people in videos so they can be face-tagged. BIPA bans collecting and storing biometric data without explicit consent, including “faceprints.” The complaint against Vimeo claims that users of Magisto “upload millions of videos and/or photos per day, making videos and photographs a vital part of the Magisto experience.”

The complaint maintains that unbeknownst to the average consumer, Magisto scans “each and every video and photo uploaded to Magisto for faces” and analyzes “biometric identifiers,” including facial geometry, to “create and store a template for each face.” That template is later used to “organize and group together videos based upon the particular individuals appearing in the videos” by “comparing the face templates of individuals who appear in newly-edited videos or photos with the facial templates already saved in Magisto’s face database.”
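
What the complaint describes is, functionally, a face-embedding pipeline: detect each face, reduce it to a numeric template, then compare new templates against the stored database to group people across videos. A minimal sketch of that comparison step, assuming hypothetical unit-vector templates (Magisto’s actual model, template format, and thresholds are not public):

```python
# Hypothetical sketch of the face-template matching the complaint describes.
# Magisto's real model, template format, and thresholds are not public.
import numpy as np

face_db = {}  # person_id -> stored face template (unit-length vector)

def enroll(person_id: str, template: np.ndarray) -> None:
    """Save a face template, normalized so a dot product equals cosine similarity."""
    face_db[person_id] = template / np.linalg.norm(template)

def identify(template: np.ndarray, threshold: float = 0.6):
    """Compare a new face against every stored template; return the best
    match above the threshold, or None if the face is unknown."""
    probe = template / np.linalg.norm(template)
    best_id, best_score = None, threshold
    for person_id, stored in face_db.items():
        score = float(np.dot(probe, stored))
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id
```

Grouping videos “based upon the particular individuals appearing” in them then reduces to running this kind of lookup over every face found in a new upload.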

The complaint also asserts that Magisto analyzes and face-matches the biometrics of non-Magisto users who happen to appear in the photos and videos, which is a violation of BIPA.

A Researcher Attempted To Opt Out of Facial Recognition at the Airport — It Wasn’t Easy

The announcement came as we began to board. Last month, I was at Detroit’s Metro Airport for a connecting flight to Southeast Asia. I listened as a Delta Air Lines staff member informed passengers that the boarding process would use facial recognition instead of passport scanners. As a privacy-conscious person, I was uncomfortable boarding this way. I also knew I could opt out. Presumably, most of my fellow fliers did not: I didn’t hear a single announcement alerting passengers how to avoid the face scanners.

To figure out how to do so, I had to leave the boarding line, speak with a Delta representative at their information desk, get back in line, then request a passport scan when it was my turn to board. Federal agencies and airlines claim that facial recognition is an opt-out system, but my recent experience suggests they are incentivizing travelers to have their faces scanned — and disincentivizing them from sidestepping the tech — by not clearly communicating alternative options. Last year, a Delta customer service representative reported that only 2 percent of customers opt out of facial recognition. It’s easy to see why.

Phones Can Now Tell Who Is Carrying Them From Their Users’ Gaits

Most online fraud involves identity theft, which is why businesses that operate on the web have a keen interest in distinguishing impersonators from genuine customers. Passwords help. But many can be guessed or are jotted down imprudently. Newer phones, tablets, and laptop and desktop computers often have beefed-up security with fingerprint and facial recognition. But these can be spoofed. To overcome these shortcomings, the next level of security is likely to identify people using things which are harder to copy, such as the way they walk. Many online security services already use a system called device fingerprinting. This employs software to note things like the model type of a gadget employed by a particular user; its hardware configuration; its operating system; the apps which have been downloaded onto it; and other features, including sometimes the Wi-Fi networks it regularly connects through and devices like headsets it plugs into.
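
In code terms, a device fingerprint is little more than a stable identifier derived from whatever attributes a service can observe. A minimal sketch, with the attribute names invented purely for illustration:

```python
# Illustrative only: commercial device-fingerprinting products observe far
# more attributes and match them statistically rather than by exact hash.
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Serialize observable device attributes deterministically and hash
    them, so the same device reports the same identifier on each visit."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

print(device_fingerprint({
    "model": "Pixel 8",            # gadget model type
    "os": "Android 14",            # operating system
    "screen": "1080x2400",         # hardware configuration
    "apps": ["bank", "maps"],      # apps downloaded onto it
    "wifi": "home-network-hash",   # networks it regularly connects through
}))
```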

LexisNexis Risk Solutions, an American analytics firm, has catalogued more than 4 billion phones, tablets and other computers in this way for banks and other clients. Roughly 7% of them have been used for shenanigans of some sort. But device fingerprinting is becoming less useful. Apple, Google and other makers of equipment and operating systems have been steadily restricting the range of attributes that can be observed remotely. That is why a new approach, behavioral biometrics, is gaining ground. It relies on the wealth of measurements made by today’s devices. These include data from accelerometers and gyroscopic sensors that reveal how people hold their phones when using them, how they carry them and even the way they walk. Touchscreens, keyboards and mice can be monitored to show the distinctive ways in which someone’s fingers and hands move. Sensors can detect whether a phone has been set down on a hard surface such as a table or dropped lightly on a soft one such as a bed. If the hour is appropriate, this action could be used to infer that a user has retired for the night. These traits can then be used to determine whether someone attempting to make a transaction is likely to be the device’s habitual user.
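
As a rough sketch of the gait idea, an accelerometer trace can be reduced to a small feature vector and compared against an enrolled profile. Everything below (the feature choices, the distance measure, the tolerance) is an illustrative assumption; vendors’ actual features and classifiers are proprietary and far richer:

```python
# Illustrative gait-matching sketch; real systems use trained models over
# far more sensor channels than this.
import numpy as np

def gait_features(accel: np.ndarray) -> np.ndarray:
    """Reduce a raw accelerometer trace (N samples x 3 axes) to a feature
    vector: per-axis means, per-axis variances, and a crude cadence."""
    magnitude = np.linalg.norm(accel, axis=1)
    spectrum = np.abs(np.fft.rfft(magnitude - magnitude.mean()))
    cadence = float(np.argmax(spectrum[1:]) + 1)  # dominant frequency bin
    return np.concatenate([accel.mean(axis=0), accel.var(axis=0), [cadence]])

def is_habitual_user(accel: np.ndarray, profile: np.ndarray,
                     tolerance: float = 2.0) -> bool:
    """Accept the session if this walk's features sit close enough to the
    enrolled profile (Euclidean distance below a tuned tolerance)."""
    return float(np.linalg.norm(gait_features(accel) - profile)) < tolerance
```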

The report says that, used wisely, behavioral biometrics could authenticate account-holders without badgering them for additional passwords or security questions; the technology could even unlock the doors of a vehicle once the gait of the driver, as measured by his phone, is recognized.

“Used unwisely, however, the system could become yet another electronic spy, permitting complete strangers to monitor your actions, from the moment you reach for your phone in the morning, to when you fling it on the floor at night,” the report adds.

Facial Recognition to board a plane

A boarding technology for travelers using JetBlue is causing controversy due to a social media thread on the airline’s use of facial recognition. Last week, traveler MacKenzie Fegan described her experience with the biometric technology in a social media post that got the attention of JetBlue’s official account. She began: “I just boarded an international @JetBlue flight. Instead of scanning my boarding pass or handing over my passport, I looked into a camera before being allowed down the jet bridge. Did facial recognition replace boarding passes, unbeknownst to me? Did I consent to this?” JetBlue was ready to offer Twitterized sympathy: “You’re able to opt out of this procedure, MacKenzie. Sorry if this made you feel uncomfortable.”

But once you start thinking about these things, your thoughts become darker. Fegan wanted to know how JetBlue knew what she looked like. JetBlue explained: “The information is provided by the United States Department of Homeland Security from existing holdings.” Fegan wondered by what right a private company suddenly had her biometric data. JetBlue insisted it doesn’t have access to the data. It’s “securely transmitted to the Customs and Border Protection database.” Fegan wanted to know how this could have possibly happened so quickly. Could it be that in just a few seconds her biometric data was whipped “securely” around government departments so that she would be allowed on the plane? JetBlue referred her to an article on the subject, which was a touch on the happy-PR side. Fegan was moved, but not positively, by the phrase “there is no pre-registration required.”

Tenants Outraged Over New York Landlord’s Plan To Install Facial Recognition Technology

A Brooklyn landlord plans to install facial recognition technology at the entrance of a 700-unit building, according to Gothamist, “raising alarm among tenants and housing rights attorneys about what they say is a far-reaching and egregious form of digital surveillance.”

[Last] Sunday, several tenants told Gothamist that, unbeknownst to them, their landlord, Nelson Management, had sought state approval in July 2018 to install a facial recognition system known as StoneLock. Under state rules, landlords of rent-regulated apartments built before 1974 must seek permission from the state’s Homes and Community Renewal (HCR) for any “modification in service.” Tenants at the two buildings, located at 249 Thomas S. Boyland Street and 216 Rockaway Avenue, said they began receiving notices about the system in the fall. According to its website, Kansas-based company StoneLock offers a “frictionless” entry system that collects biometric data based on facial features. “We don’t want to be tracked,” said Icemae Downes, a longtime tenant. “We are not animals. This is like tagging us through our faces because they can’t implant us with a chip.”

It is not clear how many New York City apartments are using facial scanning software or how such technology is being regulated. But in a sign of the times, the city’s Department of Housing Preservation and Development last June began marketing 107 affordable units at a new apartment complex in the South Bronx. Among the amenities listed was “State of the Art Facial Recognition Building Access….” Across the real estate industry, New York City landlords have increasingly been moving to keyless entry systems, citing convenience as well as a desire to offer enhanced security. Over the years, in response to appeals filed by tenants, HCR has ruled in favor of key fob and card entry systems, saying that such substitutions did not violate rent-stabilization and rent-control laws. But the latest technology has triggered even more concerns about the ethics of data collection….

Last month, the management company reached out to a group of tenants to assuage their concerns about StoneLock. But tenants said the presentation, if anything, only deepened their fears that they were being asked to submit to a technology that had very little research behind it.

“This was not something we asked for at any given time,” one tenant complained, while one of the attorneys representing the tenants said that, among other things, their landlord had “made no assurances to protect the data from being accessed by NYPD, ICE, or any other city, state, or federal agency.”

“Citing concerns over the potential for privacy and civil liberties violations, tenants at Brownsville’s Atlantic Plaza Towers filed an objection to the plan in January…”

Prisons Across the United States Are Quietly Building Databases of Incarcerated People’s Voice Prints

In New York and other states across the country, authorities are acquiring technology to extract and digitize the voices of incarcerated people into unique biometric signatures, known as voice prints.

Prison authorities have quietly enrolled hundreds of thousands of incarcerated people’s voice prints into large-scale biometric databases. Computer algorithms then draw on these databases to identify the voices taking part in a call and to search for other calls in which the voices of interest are detected. Some programs, like New York’s, even analyze the voices of call recipients outside prisons to track which outsiders speak to multiple prisoners regularly.
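
Mechanically, a voice print is an embedding vector, and the searches described amount to nearest-neighbor queries over the enrolled database. A minimal sketch under that assumption (the vendors’ actual embedding models, similarity measures, and thresholds are not public):

```python
# Sketch of the 1:N voice-print search described above; the similarity
# measure and threshold here are assumptions for illustration.
import numpy as np

voice_db = {}  # speaker_id -> enrolled voice print (unit-length vector)

def find_speakers(call_print: np.ndarray, threshold: float = 0.7):
    """Return every enrolled speaker whose stored print is similar enough
    to a voice detected on a call, ranked by cosine similarity."""
    probe = call_print / np.linalg.norm(call_print)
    scores = [(sid, float(np.dot(probe, vec))) for sid, vec in voice_db.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)
```

Searching “for other calls in which the voices of interest are detected” is the same query run over stored call recordings instead of enrolled speakers.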

Corrections officials representing the states of Texas, Florida, and Arkansas, along with Arizona’s Yavapai and Pinal counties; Alachua County, Florida; and Travis County, Texas, also confirmed that they are actively using voice recognition technology today. And a review of contracting documents identified other jurisdictions that have acquired similar voice-print capture capabilities: Connecticut and Georgia state corrections officials have signed contracts for the technology.

Authorities and prison technology companies say this mass biometric surveillance supports prison security and fraud prevention efforts. But civil liberties advocates argue that the biometric buildup has been neither transparent nor consensual. Some jurisdictions, for example, limit incarcerated people’s phone access if they refuse to enroll in the voice recognition system, while others enroll incarcerated people without their knowledge. Once the data exists, they note, it could potentially be used by other agencies, without any say from the public.

An Eye-Scanning Lie Detector Is Forging a Dystopian Future

Sitting in front of a Converus EyeDetect station, it’s impossible not to think of Blade Runner. In the 1982 sci-fi classic, Harrison Ford’s rumpled detective identifies artificial humans using a steam-punk Voight-Kampff device that watches their eyes while they answer surreal questions. EyeDetect’s questions are less philosophical, and the penalty for failure is less fatal (Ford’s character would whip out a gun and shoot). But the basic idea is the same: By capturing imperceptible changes in a participant’s eyes — measuring things like pupil dilation and reaction time — the device aims to sort deceptive humanoids from genuine ones.

It claims to be, in short, a next-generation lie detector. Polygraph tests are a $2 billion industry in the US and, despite their inaccuracy, are widely used to screen candidates for government jobs. Released in 2014 by Converus, a Mark Cuban-funded startup, EyeDetect is pitched by its makers as a faster, cheaper, and more accurate alternative to the notoriously unreliable polygraph. By many measures, EyeDetect appears to be the future of lie detection — and it’s already being used by local and federal agencies to screen job applicants.

In documents obtained through public records requests, Converus says that the Defense Intelligence Agency and the US Customs and Border Protection are also trialing the technology. Converus says that individual locations of Best Western, FedEx, Four Points by Sheraton, McDonald’s, and IHOP chains have used the tech in Guatemala and Panama within the last three years. (A 1988 federal law prohibits most private companies from using any kind of lie detector on staff or recruits in America.) WIRED reached out to all five companies, but none were able to confirm that they had used EyeDetect.

Companies ‘can sack workers for refusing to use fingerprint scanners’

Businesses using fingerprint scanners to monitor their workforce can legally sack employees who refuse to hand over biometric information on privacy grounds, the Fair Work Commission has ruled.

The ruling, which will be appealed, was made in the case of Jeremy Lee, a Queensland sawmill worker who refused to comply with a new fingerprint scanning policy introduced at his work in Imbil, north of the Sunshine Coast, late last year.

Fingerprint scanning was used to monitor the clock-on and clock-off times of about 150 sawmill workers at two sites and was preferred to swipe cards because it prevented workers from fraudulently signing in on behalf of their colleagues to mask absences.

The company, Superior Woods, had no privacy policy covering workers and failed to comply with a requirement to properly notify individuals about how and why their data was being collected and used. The biometric data was stored on servers located off-site, in space leased from a third party.

Lee argued the business had never sought its workers’ consent to use fingerprint scanning, and feared his biometric data would be accessed by unknown groups and individuals.

“I am unwilling to consent to have my fingerprints scanned because I regard my biometric data as personal and private,” Lee wrote to his employer last November.

“Information technology companies gather as much information/data on people as they can.

“Whether they admit to it or not. (See Edward Snowden) Such information is used as currency between corporations.”

Lee was neither antagonistic nor belligerent in his refusals, according to evidence before the commission. He simply declined to have his fingerprints scanned and continued using a physical sign-in booklet to record his attendance.

He had not missed a shift in more than three years.

The employer warned him about his stance repeatedly, and claimed the fingerprint scanner did not actually record a fingerprint, but rather “a set of data measurements which is processed via an algorithm”. The employer told Lee there was no way the data could be “converted or used as a finger print”, and would only be used to link his payroll number to his clock-on and clock-off times. It said the fingerprint scanners were also needed for workplace safety, to accurately identify which workers were on site in the event of an accident.

Lee was given a final warning in January, and responded that he valued his job a “great deal” and wanted to find an alternative way to record his attendance.

“I would love to continue to work for Superior Wood as it is a good, reliable place to work,” he wrote to his employer. “However, I do not consent to my biometric data being taken. The reason for writing this letter is to impress upon you that I am in earnest and hope there is a way we can negotiate a satisfactory outcome.”

Lee was sacked in February, and lodged an unfair dismissal claim in the Fair Work Commission.

He argued he was sacked for failing to comply with an unreasonable direction, because the fingerprint scanning was in breach of Australian privacy laws. His biometric information was sent to a separate corporate entity that was not his employer, Lee argued. His employer had no privacy policy in place at the time, and he argued it had failed to issue a privacy collection notice to its employees, as required by law. Lee argued the company had effectively breached the privacy of its 150 workers twice a day, every day since fingerprint scanning was introduced.

But the unfair dismissal claim failed. The Fair Work Commission found the site attendance policy that Lee had breached was lawful. It found that although the company may have breached privacy laws, the site-attendance policy was not automatically rendered unlawful as it related to Lee.

“While there may have been a breach of the Privacy Act relevant to the notice given to employees, the private and sensitive information was not collected and would never be collected relevant to Mr Lee because of his steadfast refusal,” the commission found. “The policy itself is not unlawful, simply the manner in which the employer went about trying to obtain consent may have constituted a breach of the Privacy Act.”

Lee told Guardian Australia he planned to appeal. He said the ruling implied that Australians only owned their biometric data until an employer demanded it, at which point they could be sacked if they refused to consent.

“My biometric data is inherently mine and inseparable from me,” Lee said. “My employer can’t demand it or sack me for refusing to give it.”

“It’s not about this particular employer. Ownership to me means that I can refuse consent without being sacked.”

Fake fingerprints can imitate real ones in biometric systems

Researchers have used a neural network to generate artificial fingerprints that work as a “master key” for biometric identification systems, demonstrating that convincing fake fingerprints can be created.

According to a paper presented at a security conference in Los Angeles, the artificially generated fingerprints, dubbed “DeepMasterPrints” by the researchers from New York University, were able to imitate more than one in five fingerprints in a biometric system that should only have an error rate of one in a thousand.

The researchers, led by NYU’s Philip Bontrager, say that “the underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis.” As with much security research, demonstrating flaws in existing authentication systems is considered to be an important part of developing more secure replacements in the future.

In order to work, the DeepMasterPrints take advantage of two properties of fingerprint-based authentication systems. The first is that, for ergonomic reasons, most fingerprint readers do not read the entire finger at once, instead imaging whichever part of the finger touches the scanner.

Crucially, such systems do not blend all the partial images in order to compare the full finger against a full record; instead, they simply compare the partial scan against the partial records. That means an attacker has to match just one of tens or hundreds of saved partial fingerprints in order to be granted access.
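
That asymmetry is what makes the attack statistical: if each single comparison has false-match rate p, a probe tested against n stored partial templates succeeds with probability 1 − (1 − p)^n, assuming the comparisons are independent. A quick illustration using the nominal one-in-a-thousand error rate mentioned above:

```python
# Why matching "just one of tens or hundreds" of partial templates helps
# the attacker. Numbers are illustrative; comparisons assumed independent.
def hit_probability(p: float, n: int) -> float:
    """Chance a probe matches at least one of n partial templates when
    each comparison has false-match rate p."""
    return 1 - (1 - p) ** n

print(hit_probability(0.001, 1))    # 0.001: one full-print comparison
print(hit_probability(0.001, 100))  # ~0.095: one probe vs 100 partial records
```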

The second is that some features of fingerprints are more common than others. That means that a fake print that contains a lot of very common features is more likely to match with other fingerprints than pure chance would suggest.

Based on those insights, the researchers used a common machine learning technique, called a generative adversarial network, to artificially create new fingerprints that matched as many partial fingerprints as possible.

The neural network not only allowed them to create multiple fingerprint images, it also created fakes which look convincingly like a real fingerprint to a human eye – an improvement on a previous technique, which created jagged, right-angled fingerprints that would fool a scanner but not a visual inspection.

They compare the method to a “dictionary attack” against passwords, where a hacker runs a pre-generated list of common passwords against a security system.

Such attacks may not be able to break into any specific account, but when used against accounts at scale, they generate enough successes to be worth the effort.

Banks and Retailers Are Tracking How You Type, Swipe and Tap

When you’re browsing a website and the mouse cursor disappears, it might be a computer glitch — or it might be a deliberate test to find out who you are.

The way you press, scroll and type on a phone screen or keyboard can be as unique as your fingerprints or facial features. To fight fraud, a growing number of banks and merchants are tracking visitors’ physical movements as they use websites and apps.

The data collection is invisible to those being watched. Using sensors in your phone or code on websites, companies can gather thousands of data points, known as “behavioral biometrics.”

[Image: A phone’s touchscreen sensors can track where and how you swipe your device to help determine who you are.]

[Image: The angle at which you hold your device is one of the many biometric markers that can be measured.]
Behavioral monitoring software churns through thousands of elements to calculate a probability-based guess about whether a person is who they claim. Two major advances have fed its growing use: the availability of cheap computing power and the sophisticated array of sensors now built into most smartphones.
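
That “probability-based guess” can be pictured as a weighted comparison between a session’s measurements and the account’s historical profile, squashed into a score. A toy sketch, in which the feature names, weights, and squashing constant are all invented for illustration (production systems use trained models over thousands of signals):

```python
# Toy behavioral-biometrics scorer: compares a live session's measurements
# to the account holder's profile. All numbers are invented for illustration.
import math

def session_score(features: dict, profile: dict, weights: dict) -> float:
    """Turn the weighted deviation between this session and the enrolled
    profile into a 0..1 'probably the same person' score."""
    deviation = sum(w * abs(features[k] - profile[k]) for k, w in weights.items())
    return 1 / (1 + math.exp(deviation - 3.0))  # logistic squash

print(session_score(
    features={"key_hold_ms": 210, "swipe_angle": 34, "tap_pressure": 0.42},
    profile={"key_hold_ms": 190, "swipe_angle": 31, "tap_pressure": 0.45},
    weights={"key_hold_ms": 0.01, "swipe_angle": 0.1, "tap_pressure": 5.0},
))  # ~0.91: small deviations, so likely the habitual user
```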

The system’s unobtrusiveness is part of its appeal, Mr. Hanley said. Traditional physical biometrics, like fingerprints or irises, require special scanning hardware for authentication. But behavioral traits can be captured in the background, without customers doing anything to sign up.

BioCatch occasionally tries to elicit a reaction. It can speed up the selection wheel you use to enter data like dates and times on your phone, or make your mouse cursor disappear for a fraction of a second.

“Everyone reacts a little differently to that,” said Frances Zelazny, BioCatch’s chief strategy and marketing officer. “Some people move the mouse side to side; some people move it up and down. Some bang on the keyboard.”

Because your reaction is so individual, it’s hard for a fraudulent user to fake. And because customers never know the monitoring technology is there, it doesn’t impose the kind of visible, and irritating, roadblocks that typically accompany security tests. You don’t need to press your thumb on your phone’s fingerprint reader or type in an authentication code.

[Image: Biometric software can also determine the pressure you tend to apply to your phone when you tap and type.]

“We don’t have to sit people down in a room and get them to type under perfect laboratory conditions,” said Neil Costigan, the chief executive of BehavioSec, a Palo Alto, Calif., company that makes software used by many Nordic banks. “You just watch them, silently, while they go about their normal account activities.”

UK Police Plan To Deploy ‘Staggeringly Inaccurate’ Facial Recognition in London

Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns. The controversial software, which officers use to identify suspects, has been found to be “staggeringly inaccurate”, while campaigners have branded its use a violation of privacy. But Britain’s largest police force is set to expand a trial across six locations in London over the coming months.

Police leaders claimed that officers make the final decision on whether to act on potential matches with police records, and that images which do not spark an alert are immediately deleted. But last month The Independent revealed the Metropolitan Police’s software was returning “false positives” — images of people who were not on a police database — in 98 percent of alerts… Detective Superintendent Bernie Galopin, the lead on facial recognition for London’s Metropolitan Police, said the operation was targeting wanted suspects to help reduce violent crime and make the area safer. “It allows us to deal with persons that are wanted by police where traditional methods may have failed,” he told The Independent, after statistics showed police were failing to solve 63 percent of knife crimes committed against under-25s….
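
The 98 percent figure describes the alerts, not the matcher’s headline accuracy, and it follows from base rates: when nearly everyone scanned is innocent, even a quite specific matcher produces mostly false alarms. A back-of-envelope sketch with invented but plausible numbers:

```python
# Base-rate arithmetic with illustrative numbers: scan a large crowd that
# contains very few wanted people, then look at what the alerts consist of.
crowd, wanted = 100_000, 10
hit_rate, false_match_rate = 0.90, 0.005   # assumed matcher performance

true_alerts = wanted * hit_rate                       # ~9 genuine alerts
false_alerts = (crowd - wanted) * false_match_rate    # ~500 false alerts
print(f"{false_alerts / (true_alerts + false_alerts):.0%} of alerts are false")
# -> 98% of alerts are false, despite 99.5% per-face specificity
```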

Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing, which is currently being stretched by budget cuts, falling officer numbers, rising demand and the terror threat.

A policy officer from the National Council for Civil Liberties called the technology “lawless,” adding “the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society.”

But a Home Office minister said the technology was vital for protecting people from terrorism, though “we must ensure that privacy is respected. This strategy makes clear that we will grasp the opportunities that technology brings while remaining committed to strengthening safeguards.”

New York high school will use CCTV and facial recognition to enforce discipline

Next year, high schools in Lockport, New York, will use the “Aegis” CCTV and facial recognition system to track and record the interactions of students suspected of code of conduct violations, keeping a ledger of who speaks to whom, where, and for how long.

The record will be used to assemble evidence against students and to identify possible accomplices to whom guilt can be ascribed.

Lockport Superintendent Michelle T. Bradley justified the decision by noting, “We always have to be on our guard. We can’t let our guard down.”

Lockport will be the first school district in the world to subject its students to this kind of surveillance. The program will cost $1.4m in state money. The technology supplier is SN Technologies of Gananoque, Ont., one of the companies in the vicinity of Kingston, Ontario, home to the majority of the province’s detention centers.

The Lockport district says that the system will make students safer by alerting officials if someone on a sex-offender registry or terrorist watchlist enters the property. None of America’s school shootings or high-profile serial sex abuse scandals were carried out by wanted terrorists or people on the sex-offender registry.

Deployed law-enforcement facial recognition systems have failure rates of 98%. The vendor responsible for Aegis would not disclose how they improved on the state of the art, but insisted that their product worked “99.97% of the time.” The spokesperson would not disclose any of the workings of the system, seemingly believing that doing so was antithetical to security.

Japan researchers warn of fingerprint theft from ‘peace’ sign, selfies

“Could flashing the ‘peace’ sign in photos lead to fingerprint data being stolen? Research by a team at Japan’s National Institute of Informatics (NII) says so, raising alarm bells over the popular two-fingered pose. Fingerprint recognition technology is becoming widely available to verify identities, such as when logging on to smartphones, tablets and laptop computers. But the proliferation of mobile devices with high-quality cameras and social media sites where photographs can be easily posted is raising the risk of personal information being leaked, reports said. The NII researchers were able to copy fingerprints based on photos taken by a digital camera three meters (nine feet) away from the subject.”

Fingerprints to be tested as ‘currency’

“Starting this summer, the [Japanese] government will test a system in which foreign tourists will be able to verify their identities and buy things at stores using only their fingerprints.

The government hopes to increase the number of foreign tourists by using the system to prevent crime and relieve users from the necessity of carrying cash or credit cards. It aims to realize the system by the 2020 Tokyo Olympic and Paralympic Games.

The experiment will have inbound tourists register their fingerprints and other data, such as credit card information, at airports and elsewhere.

Tourists would then be able to conduct tax exemption procedures and make purchases after verifying their identities by placing two fingers on special devices installed at stores.”
