Archives 9 January 2020

Ring Fired Employees for Watching Customer Videos

Amazon-owned home security camera company Ring has fired employees for improperly accessing Ring users’ video data, Motherboard reported Wednesday, citing a letter the company wrote to Senators. The news highlights a risk common to many tech companies: employees may abuse access granted as part of their jobs to view customer data. In Ring’s case, though, this data can be particularly sensitive, as customers often place the cameras inside their homes. “We are aware of incidents discussed below where employees violated our policies,” the letter from Ring, dated January 6th, reads. “Over the last four years, Ring has received four complaints or inquiries regarding a team member’s access to Ring video data,” it continues. Ring explains that although each of these individuals was authorized to view video data, their access attempts exceeded what their jobs required.

Companies Are Using AI-Generated People To Appear More “Diverse”

AI startups are selling images of computer-generated faces that look like the real thing, offering companies a chance to create imaginary models and “increase diversity” in their ads without needing human beings. One firm is offering to sell diverse photos for marketing brochures and has already signed up clients, including a dating app that intends to use the images in a chatbot. Another company says it is moving past AI-generated headshots and into the generation of full, fake human bodies as early as this month. The AI software used to create such faces is freely available and improving rapidly, allowing small startups to easily create fakes convincing enough to fool the human eye. The systems train on massive databases of actual faces, then attempt to replicate their features in new designs. But AI experts worry that the fakes will empower a new generation of scammers, bots and spies, who could use the photos to build imaginary online personas, mask bias in hiring and undermine efforts to bring diversity to industries. The fact that such software now has a business model could also fuel a greater erosion of trust across an Internet already under assault from disinformation campaigns, “deepfake” videos and other deceptive techniques.