Resources

Taser Company Axon Is Selling AI That Turns Body Cam Audio Into Police Reports

Axon on Tuesday announced a new tool called Draft One that uses artificial intelligence built on OpenAI’s GPT-4 Turbo model to transcribe audio from body cameras and automatically turn it into a police report. Axon CEO Rick Smith told Forbes that police officers will then be able to review the document to ensure accuracy. From the report:
Axon claims one early tester of the tool, the Fort Collins, Colorado, Police Department, has seen an 82% decrease in time spent writing reports. “If an officer spends half their day reporting, and we can cut that in half, we have an opportunity to potentially free up 25% of an officer’s time to be back out policing,” Smith said. These reports, though, are often used as evidence in criminal trials, and critics are concerned that relying on AI could put people at risk by depending on language models that are known to “hallucinate,” or make things up, as well as display racial bias, either blatantly or unconsciously.

“It’s kind of a nightmare,” said Dave Maass, surveillance technologies investigations director at the Electronic Frontier Foundation. “Police, who aren’t specialists in AI, and aren’t going to be specialists in recognizing the problems with AI, are going to use these systems to generate language that could affect millions of people in their involvement with the criminal justice system. What could go wrong?” Smith acknowledged there are dangers. “When people talk about bias in AI, it really is: Is this going to exacerbate racism by taking training data that’s going to treat people differently?” he told Forbes. “That was the main risk.”

Smith said Axon is recommending police don’t use the AI to write reports for incidents as serious as a police shooting, where vital information could be missed. “An officer-involved shooting is likely a scenario where it would not be used, and I’d probably advise people against it, just because there’s so much complexity, the stakes are so high.” He said some early customers are only using Draft One for misdemeanors, though others are writing up “more significant incidents,” including use-of-force cases. Axon, however, won’t have control over how individual police departments use the tools.
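
Axon has not published implementation details, but the pattern the article describes (transcribe body-camera audio, have a GPT-4-class model draft a narrative, then require officer review) can be sketched with OpenAI’s public Python SDK. Everything below, including the model choice, prompt, and file name, is an illustrative assumption, not Axon’s actual pipeline.

```python
# Hypothetical sketch of a transcribe-then-draft pipeline (not Axon's implementation).
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Step 1: transcribe the body-camera audio (file name is a placeholder).
with open("bodycam_clip.mp3", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Step 2: ask a GPT-4-class model to draft a report from the transcript alone.
draft = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[
        {"role": "system",
         "content": ("Draft a factual incident report strictly from the transcript. "
                     "Do not infer or invent details that are not in the audio.")},
        {"role": "user", "content": transcript.text},
    ],
)

# Step 3: a human officer must review and edit before this becomes an official report.
print(draft.choices[0].message.content)
```

The review step is also where the concerns above land: a hallucinated or biased detail is only caught if the officer actually checks the draft against the audio.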

Facebook Accused of Watching Instagram Users Through Cameras

Facebook is again being sued for allegedly spying on Instagram users, this time through the unauthorized use of their mobile phone cameras. Bloomberg reports:
The lawsuit springs from media reports in July that the photo-sharing app appeared to be accessing iPhone cameras even when they weren’t actively being used. Facebook denied the reports and blamed a bug, which it said it was correcting, for triggering what it described as false notifications that Instagram was accessing iPhone cameras.

In the complaint filed Thursday in federal court in San Francisco, New Jersey Instagram user Brittany Conditi contends the app’s use of the camera is intentional and done for the purpose of collecting “lucrative and valuable data on its users that it would not otherwise have access to.” By “obtaining extremely private and intimate personal data on their users, including in the privacy of their own homes,” Instagram and Facebook are able to collect “valuable insights and market research,” according to the complaint.

Ring Fired Employees for Watching Customer Videos

Amazon-owned home security camera company Ring has fired employees for improperly accessing Ring users’ video data, Motherboard reported Wednesday, citing a letter the company wrote to Senators. The news highlights a risk across many different tech companies: employees may abuse access granted as part of their jobs to look at customer data or information. In Ring’s case, though, this data can be particularly sensitive, as customers often put the cameras inside their homes. “We are aware of incidents discussed below where employees violated our policies,” the letter from Ring, dated January 6th, reads. “Over the last four years, Ring has received four complaints or inquiries regarding a team member’s access to Ring video data,” it continues. Ring explains that although each of these people was authorized to view video data, their access went beyond what their jobs required.

Xiaomi Camera Feed Is Showing Random Homes on a Google Nest Hub, Including Still Images of Sleeping People

So-called “smart” security cameras have had some pretty dumb security problems recently, but a recent report regarding a Xiaomi Mijia camera linked to a Google Home is especially disturbing. One Xiaomi Mijia camera owner is getting still images from other random people’s homes when trying to stream content from his camera to a Google Nest Hub. The images include stills of people sleeping (even an infant in a cradle) inside their own homes. This issue was first reported by user /r/Dio-V on Reddit and affects his Xiaomi Mijia 1080p Smart IP Security Camera, which can be linked to a Google account for use with Google/Nest devices through Xiaomi’s Mi Home app/service. It isn’t clear when Dio-V’s feed first began showing these still images from random homes or how long the camera was connected to his account before this started happening. He does state that both the Nest Hub and the camera were purchased new. The camera was noted as running firmware version 3.5.1_00.66.

A Billion Surveillance Cameras Forecast To Be Watching Within Two Years

As governments and companies invest more in security networks, hundreds of millions more surveillance cameras will be watching the world in 2021, mostly in China, according to a new report. The report, from industry researcher IHS Markit, to be released Thursday, said the number of cameras used for surveillance would climb above 1 billion by the end of 2021. That would represent an almost 30% increase from the 770 million cameras today. China would continue to account for a little over half the total. Fast-growing, populous nations such as India, Brazil and Indonesia would also help drive growth in the sector, the report said. IHS analyst Oliver Philippou said government programs to implement widespread video surveillance to monitor the public would be the biggest catalyst for the growth in China. City surveillance also was driving demand elsewhere.

Police Can Keep Ring Camera Video Forever, and Share With Whomever They’d Like

Police officers who download videos captured by homeowners’ Ring doorbell cameras can keep them forever and share them with whomever they’d like without providing evidence of a crime, the Amazon-owned firm told a lawmaker this month… Police in communities that have partnered with Ring can use Ring software to request up to 12 hours of video from anyone within half a square mile of a suspected crime scene, covering a 45-day time span, wrote Brian Huseman, Amazon’s vice president of public policy. Police are required to include a case number for the crime they are investigating, but not any other details or evidence related to the crime or their request.

Sen. Edward Markey, D-Mass., said in a statement that Ring’s policies showed that the company had failed to enact basic safeguards to protect Americans’ privacy. “Connected doorbells are well on their way to becoming a mainstay of American households, and the lack of privacy and civil rights protections for innocent residents is nothing short of chilling,” he said. “If you’re an adult walking your dog or a child playing on the sidewalk, you shouldn’t have to worry that Ring’s products are amassing footage of you and that law enforcement may hold that footage indefinitely or share that footage with any third parties.”

While Ring tells users not to film public roads or sidewalks, Ring isn’t enforcing that, according to the article. Amazon argues that that’s ultimately the user’s responsibility.

And will their cameras start using facial recognition algorithms? Amazon answers that the feature is “contemplated but unreleased,” though it adds that “We do frequently innovate based on customer demand,” and points out that other competing security cameras already offer facial recognition.

Voice From ‘Nest’ Camera Threatens to Steal Baby

Jack Newcombe, the Chief Operating Officer of a syndication company with 44 million daily readers, describes the strange voice he heard talking to his 18-month-old son:
She says we have a nice house and encourages the nanny to respond. She does not. The voice even jokes that she hopes we don’t change our password. I am sick to my stomach. After about five minutes of verbal “joy riding,” the voice starts to get agitated at the nanny’s lack of response and then snaps, in a very threatening voice: “I’m coming for the baby if you don’t answer me….” We unplug the cameras and change all passwords…

Still helpless, I started doing the only thing I could do — Googling. I typed “Nest + camera + hacked” and found out that this happens frequently. Parent after parent relayed stories similar to mine — threatening to steal a baby is shockingly common — and some much worse, such as playing pornography over the microphone to a 3-year-old… What is worse is that anyone could have been watching us at any time for as long as we have had the cameras up. This person just happened to use the microphone. Countless voyeurs could have been silently watching (or worse) for months.

However, what makes this issue even more terrifying is a corporate giant’s complete and utter lack of response. Nest is owned by Google, and, based on my experience and their public response, Google does not seem to care about this issue. They acknowledge it as a problem, shrug their shoulders and point their fingers at the users. Their party line is to remind people that the hardware was not hacked; it was the user’s fault for using a compromised password and not implementing two-step authentication, in which users receive a special code via text to sign on. That night, on my way home from work, I called Nest support and was on hold for an hour and eight minutes. I followed all directions and have subsequently received form emails in broken English. Nobody from Google has acknowledged the incident or responded with any semblance of empathy. In every email, they remind me of two-step authentication.

They act as if I am going to continue to use Nest cameras.

Amazon Workers May Be Watching Your Cloud Cam Home Footage

In a promotional video, Amazon says its Cloud Cam home security camera provides “everything you need to monitor your home, day or night.” In fact, the artificially intelligent device requires help from a squad of invisible employees. Dozens of Amazon workers based in India and Romania review select clips captured by Cloud Cam, according to five people who have worked on the program or have direct knowledge of it. Those video snippets are then used to train the AI algorithms to do a better job distinguishing between a real threat (a home invader) and a false alarm (the cat jumping on the sofa). An Amazon team also transcribes and annotates commands recorded in customers’ homes by the company’s Alexa digital assistant, Bloomberg reported in April.

AI has made it possible to talk to your phone. It’s helping investors predict shifts in market sentiment. But the technology is far from infallible. Cloud Cam sends out alerts when it’s just paper rustling in a breeze. Apple’s Siri and Amazon’s Alexa still occasionally mishear commands. One day, engineers may overcome these shortfalls, but for now AI needs human assistance. Lots of it. At one point, on a typical day, some Amazon auditors were each annotating about 150 video recordings, which were typically 20 to 30 seconds long, according to the people, who requested anonymity to talk about an internal program.

Google Loans Cameras To Volunteers To Fill Gaps in ‘Street View’

Tawanda Kanhema, who works as a product manager in Silicon Valley and is a freelance photographer in his spare time, volunteered to carry Google’s Street View gear to map what amounted to 2,000 miles of his home country. The Berkeley, Calif., resident has filled in the map of other areas in Africa and Canada as well.

“We start in the large metropolitan areas where we know we have users, where it’s easy for us to drive and we can execute quickly,” says Stafford Marquardt, a product manager for Street View.

He says the team is working to expand the service’s reach. To do that, Google often relies on volunteers who can either borrow the company’s camera equipment or take photos using their own. Most images on Street View are collected by drivers, and most of these drivers are employed by third parties that work with Google. But when it comes to the places Google hasn’t prioritized, people like Kanhema can fill in the gaps.

“It’s so conspicuous to have a 4-foot contraption attached to the roof of your car,” Kanhema says. “People are walking up and asking questions about, ‘Is that a camera? What are you recording? What are you filming? It is for Google Maps? Will my house be on the map? Will my face be on the map?'”

Google doesn’t pay him or the other volunteers — whom the company calls “contributors” — for the content they upload. Kanhema, for example, spent around $5,000 of his own money to travel across Zimbabwe for the project.

Google says it currently has no plans to compensate its volunteers, adding that it pays contributors “in a lot of other ways” by offering “a platform to host gigabytes and terabytes of imagery and publish it to the entire world, absolutely for free.”

Pentagon testing mass surveillance balloons across the US

The US military is conducting wide-area surveillance tests across six midwest states using experimental high-altitude balloons, documents filed with the Federal Communications Commission (FCC) reveal.

Up to 25 unmanned solar-powered balloons are being launched from rural South Dakota and drifting 250 miles through an area spanning portions of Minnesota, Iowa, Wisconsin and Missouri, before concluding in central Illinois.

Travelling in the stratosphere at altitudes of up to 65,000ft, the balloons are intended to “provide a persistent surveillance system to locate and deter narcotic trafficking and homeland security threats”, according to a filing made on behalf of the Sierra Nevada Corporation, an aerospace and defence company.

The balloons are carrying hi-tech radars designed to simultaneously track many individual vehicles day or night, through any kind of weather.

A rival balloon operator, World View, recently announced that it had carried out multi-week test missions in which its own stratospheric balloons were able to hover over a five-mile-diameter area for six and a half hours, and larger areas for days at a time.

Ryan Hartman, CEO of World View, said that World View had also completed a dozen surveillance test missions for a customer it would not name, capturing data he would not specify.

“Obviously, there are laws to protect people’s privacy and we are respectful of all those laws,” Hartman said. “We also understand the importance of operating in an ethical way as it relates to further protecting people’s privacy.”

Amazon’s ‘Ring’ Doorbells Creating A Massive Police Surveillance Network

“Police departments are piggybacking on Ring’s network to build out their surveillance networks…” reports CNET, adding that Ring “helps police avoid roadblocks for surveillance technology, whether a lack of funding or the public’s concerns about privacy.”

While residential neighborhoods aren’t usually lined with security cameras, the smart doorbell’s popularity has essentially created private surveillance networks powered by Amazon and promoted by police departments. Police departments across the country, from major cities like Houston to towns with fewer than 30,000 people, have offered free or discounted Ring doorbells to citizens, sometimes using taxpayer funds to pay for Amazon’s products.

While Ring owners are supposed to have a choice on providing police footage, in some giveaways, police require recipients to turn over footage when requested. Ring said Tuesday that it would start cracking down on those strings attached…

While more surveillance footage in neighborhoods could help police investigate crimes, the sheer number of cameras run by Amazon’s Ring business raises questions about privacy involving both law enforcement and tech giants… More than 50 local police departments across the US have partnered with Ring over the last two years, lauding how the Amazon-owned product allows them to access security footage in areas that typically don’t have cameras — on suburban doorsteps. But privacy advocates argue this partnership gives law enforcement an unprecedented amount of surveillance. “What we have here is a perfect marriage between law enforcement and one of the world’s biggest companies creating conditions for a society that few people would want to be a part of,” said Mohammad Tajsar, staff attorney at the ACLU of Southern California…

Despite its benefits, the relationship between police departments and Ring raises concerns about surveillance and privacy, as Amazon is working with law enforcement to blanket communities with cameras…. “Essentially, we’re creating a culture where everybody is the nosy neighbor looking out the window with their binoculars,” said Dave Maass, a senior investigative researcher at the Electronic Frontier Foundation. “It is creating this giant pool of data that allows the government to analyze our every move, whether or not a crime is being committed.” On a heat map of Bloomfield, there are hardly any spots in the New Jersey township out of sight of a Ring camera.

Tajsar says in some scenarios “they’re basically commandeering people’s homes as surveillance outposts for law enforcement,” and the article notes that when police departments partner with Ring, “they have access to a law enforcement dashboard, where they can geofence areas and request footage filmed at specific times.”

While law enforcement “can only get footage from the app if residents choose to send it,” if the residents refuse, police can still try to obtain the footage with a subpoena to Amazon’s Ring.

Airbnb Has a Hidden-Camera Problem

Airbnb’s rules allow cameras outdoors and in living rooms and common areas, but never in bathrooms or anywhere guests plan to sleep, including rooms with foldout beds. Starting in early 2018, Airbnb added another layer of disclosure: If hosts indicate they have cameras anywhere on their property, guests receive a pop-up informing them where the cameras are located and where they are aimed. To book the property, the guests must click “agree,” indicating that they’re aware of the cameras and consent to being filmed.

Of course, hosts have plenty of reason to train cameras on the homes they rent out to strangers. They can catch guests who attempt to steal, or who trash the place, or who initially say they’re traveling alone, then show up to a property with five people. A representative for Airbnb’s Trust & Safety communications department told me the company tries to filter out hosts who may attempt to surveil guests by matching them against sex-offender and felony databases. The company also uses risk scores to flag suspicious behavior, in addition to reviewing and booting hosts with consistently poor scores.

If a guest contacts Airbnb’s Trust & Safety team with a complaint about a camera, employees offer new accommodations if necessary and open an investigation into the host. […] But four guests who found cameras in their rentals told The Atlantic the company has inconsistently applied its own rules when investigating their claims, providing them with incorrect information and making recommendations that they say risked putting them in harm’s way. “There have been super terrible examples of privacy violations by AirBnB hosts, e.g., people have found cameras hidden in alarm clocks in their bedrooms,” wrote Jeff Bigham, a computer-science professor at Carnegie Mellon whose claim was initially denied after he reported cameras in his rental. “I feel like our experience is in some ways more insidious. If you find a truly hidden camera in your bedroom or bathroom, Airbnb will support you. If you find an undisclosed camera in the private living room, Airbnb will not support you.”

Police Bodycams Can Be Hacked To Doctor Footage, Install Malware

Josh Mitchell’s Defcon presentation analyzes the security of five popular brands of police bodycams (Vievu, Patrol Eyes, Fire Cam, Digital Ally, and CeeSc) and reveals that they are universally terrible. All the devices use predictable network addresses that can be used to remotely sense and identify the cameras when they switch on. None of the devices use code-signing. Some of the devices can form ad-hoc Wi-Fi networks to bridge in other devices, but they don’t authenticate these sign-ons, so you can just connect with a laptop and start raiding the network for accessible filesystems and gank or alter videos, or just drop malware on them.

UK Police Plan To Deploy ‘Staggeringly Inaccurate’ Facial Recognition in London

Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns. The controversial software, which officers use to identify suspects, has been found to be “staggeringly inaccurate”, while campaigners have branded its use a violation of privacy. But Britain’s largest police force is set to expand a trial across six locations in London over the coming months.

Police leaders claimed that officers make the final decision to act on potential matches with police records, and that images which do not spark an alert are immediately deleted. But last month The Independent revealed the Metropolitan Police’s software was returning “false positives” — images of people who were not on a police database — in 98 percent of alerts… Detective Superintendent Bernie Galopin, the lead on facial recognition for London’s Metropolitan Police, said the operation was targeting wanted suspects to help reduce violent crime and make the area safer. “It allows us to deal with persons that are wanted by police where traditional methods may have failed,” he told The Independent, after statistics showed police were failing to solve 63 per cent of knife crimes committed against under-25s….
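
The 98 percent figure is less surprising once base rates are considered: when almost everyone scanned is not on a watchlist, even a reasonably accurate matcher produces mostly false alerts. A back-of-the-envelope sketch with illustrative numbers (assumptions, not the Met’s actual parameters):

```python
# Illustrative base-rate calculation; the numbers are assumptions, not Met statistics.
scanned = 100_000          # faces scanned during a deployment
on_watchlist = 20          # of those, how many are genuinely wanted
true_positive_rate = 0.70  # chance a wanted person triggers an alert
false_positive_rate = 0.01 # chance an ordinary passer-by triggers an alert

true_alerts = on_watchlist * true_positive_rate
false_alerts = (scanned - on_watchlist) * false_positive_rate

share_false = false_alerts / (true_alerts + false_alerts)
print(f"{share_false:.0%} of alerts are false positives")  # ~99% with these assumptions
```

With these assumptions, roughly 99 in 100 alerts point at someone who is not wanted, in line with the reported figure.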

Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing, which is currently being stretched by budget cuts, falling officer numbers, rising demand and the terror threat.

A policy officer from the National Council for Civil Liberties called the technology “lawless,” adding “the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society.”

But a Home Office minister said the technology was vital for protecting people from terrorism, though “we must ensure that privacy is respected. This strategy makes clear that we will grasp the opportunities that technology brings while remaining committed to strengthening safeguards.”

High School in China Installs Facial Recognition Cameras to Monitor Students’ Attentiveness

A high school in Hangzhou City, Zhejiang Province, on the eastern coast of China, has employed facial recognition technology to monitor students’ attentiveness in class.

At Hangzhou Number 11 High School, three cameras at the front of the classroom scan students’ faces every 30 seconds, analyzing their facial expressions to detect their mood, according to a May 16 report in the state-run newspaper The Paper.

The different moods—surprised, sad, antipathy, angry, happy, afraid, neutral—are recorded and averaged during each class.

A display screen, visible only to the teacher, shows the data in real time. A certain threshold value marks a student as not paying enough attention.

A video shot by Zhejiang Daily Press revealed that the system—coined the “smart classroom behavior management system” by the school—also analyzes students’ actions, categorized into: reading, listening, writing, standing up, raising hands, and leaning on the desk.

An electronic screen also displays a list of student names deemed “not paying attention.”

The school began using the technology at the end of March, vice principal Zhang Guanchao told The Paper. Zhang added that students felt like they were being monitored when the system was first put in place, but have since gotten used to it.
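
The Paper’s report does not describe the scoring in detail, but the mechanism outlined above (a mood label every 30 seconds, averaged over the class, with a threshold for flagging inattention) reduces to a simple aggregation. A minimal sketch under those assumptions, with invented mood weights and an invented threshold:

```python
# Hypothetical sketch of per-class attentiveness scoring; weights and threshold are invented.
from statistics import mean

# Assumed mapping from detected mood to an "attentiveness" weight.
MOOD_WEIGHT = {
    "happy": 1.0, "neutral": 1.0, "surprised": 0.8,
    "sad": 0.5, "afraid": 0.5, "antipathy": 0.3, "angry": 0.3,
}
ATTENTION_THRESHOLD = 0.6  # assumed: below this average, a student is flagged

def class_score(mood_labels):
    """Average the weights of the moods detected every 30 seconds."""
    return mean(MOOD_WEIGHT[m] for m in mood_labels)

scans = ["neutral", "happy", "sad", "antipathy", "neutral", "angry"]
score = class_score(scans)
if score < ATTENTION_THRESHOLD:
    print(f"flagged as not paying attention (score {score:.2f})")
else:
    print(f"score {score:.2f}, above threshold")
```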

Artificial intelligence can create a 3D model of a person—from just a few seconds of video

Artificial intelligence has been used to create 3D models of people’s bodies for virtual reality avatars, surveillance, visualizing fashion, or movies. But it typically requires special camera equipment to detect depth or to view someone from multiple angles. A new algorithm creates 3D models using standard video footage from one angle.

The system has three stages. First, it analyzes a video a few seconds long of someone moving—preferably turning 360° to show all sides—and for each frame creates a silhouette separating the person from the background. Based on machine learning techniques—in which computers learn a task from many examples—it roughly estimates the 3D body shape and location of joints. In the second stage, it “unposes” the virtual human created from each frame, making them all stand with arms out in a T shape, and combines information about the T-posed people into one, more accurate model. Finally, in the third stage, it applies color and texture to the model based on recorded hair, clothing, and skin.

The researchers tested the method with a variety of body shapes, clothing, and backgrounds and found that it had an average accuracy within 5 millimeters, they will report in June at the Computer Vision and Pattern Recognition conference in Salt Lake City. The system can also reproduce the folding and wrinkles of fabric, but it struggles with skirts and long hair. With a model of you, the researchers can change your weight, clothing, and pose—and even make you perform a perfect pirouette. No practice necessary.
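
The details belong to the paper itself, but the three-stage structure described above can be laid out as a pipeline. The functions below are placeholders standing in for the learned models (silhouette segmentation, shape and joint estimation, unposing and fusion, texturing); this is a structural outline under assumed array sizes, not a working reconstruction:

```python
# Structural outline only: each stage is a stub standing in for a learned model.
import numpy as np

def extract_silhouettes(frames):
    """Stage 1a: separate the person from the background in every frame (stub)."""
    return [np.zeros(frame.shape[:2], dtype=bool) for frame in frames]

def estimate_shape_and_joints(frames, silhouettes):
    """Stage 1b: rough per-frame 3D body shape and joint locations (stub; sizes assumed)."""
    return [{"vertices": np.zeros((6890, 3)), "joints": np.zeros((24, 3))}
            for _ in frames]

def unpose_and_fuse(per_frame_models):
    """Stage 2: 'unpose' every frame to a T-pose and fuse into one refined model (stub)."""
    vertices = np.mean([m["vertices"] for m in per_frame_models], axis=0)
    return {"vertices": vertices}

def apply_texture(model, frames):
    """Stage 3: project hair, clothing and skin colour from the frames onto the mesh (stub)."""
    model["texture"] = np.zeros((1024, 1024, 3), dtype=np.uint8)
    return model

frames = [np.zeros((480, 640, 3), dtype=np.uint8) for _ in range(120)]  # a few seconds of video
silhouettes = extract_silhouettes(frames)
per_frame = estimate_shape_and_joints(frames, silhouettes)
fused = unpose_and_fuse(per_frame)
avatar = apply_texture(fused, frames)
```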

‘Living laboratories’: the Dutch cities amassing data on oblivious residents

Stratumseind in Eindhoven is one of the busiest nightlife streets in the Netherlands. On a Saturday night, bars are packed, music blares through the street, laughter and drunken shouting bounces off the walls. As the night progresses, the ground becomes littered with empty shot bottles, energy drink cans, cigarette butts and broken glass.

It’s no surprise that the place is also known for its frequent fights. To change that image, Stratumseind has become one of the “smartest” streets in the Netherlands. Lamp-posts have been fitted with wifi-trackers, cameras and 64 microphones that can detect aggressive behaviour and alert police officers to altercations. There has been a failed experiment to change light intensity to alter the mood. The next plan, starting this spring, is to diffuse the smell of oranges to calm people down. The aim? To make Stratumseind a safer place.

All the while, data is being collected and stored. “Visitors do not realise they are entering a living laboratory,” says Maša Galic, a researcher on privacy in the public space for the Tilburg Institute of Law, Technology and Society. Since the data on Stratumseind is used to profile, nudge or actively target people, this “smart city” experiment is subject to privacy law. According to the Dutch Personal Data Protection Act, people should be notified in advance of data collection and the purpose should be specified – but in Stratumseind, as in many other “smart cities”, this is not the case.

Peter van de Crommert is involved at Stratumseind as project manager with the Dutch Institute for Technology, Safety and Security. He says visitors do not have to worry about their privacy: the data is about crowds, not individuals. “We often get that comment – ‘Big brother is watching you’ – but I prefer to say, ‘Big brother is helping you’. We want safe nightlife, but not a soldier on every street corner.”

When we think of smart cities, we usually think of big projects: Songdo in South Korea, the IBM control centre in Rio de Janeiro or the hundreds of new smart cities in India. More recent developments include Toronto, where Google will build an entirely new smart neighbourhood, and Arizona, where Bill Gates plans to build his own smart city. But the reality of the smart city is that it has stretched into the everyday fabric of urban life – particularly so in the Netherlands.

In the eastern city of Enschede, city traffic sensors pick up your phone’s wifi signal even if you are not connected to the wifi network. The trackers register your MAC address, the unique network card number in a smartphone. The city council wants to know how often people visit Enschede, and what their routes and preferred spots are. Dave Borghuis, an Enschede resident, was not impressed and filed an official complaint. “I don’t think it’s okay for the municipality to track its citizens in this way,” he said. “If you walk around the city, you have to be able to imagine yourself unwatched.”

Enschede is enthusiastic about the advantages of the smart city. The municipality says it is saving €36m in infrastructure investments by launching a smart traffic app that rewards people for good behaviour like cycling, walking and using public transport. (Ironically, one of the rewards is a free day of private parking.) Only those who mine the small print will discover that the app creates “personal mobility profiles”, and that the collected personal data belongs to the company Mobidot.
‘Targeted supervision’ in Utrecht

Companies are getting away with it in part because it involves new applications of data. In Silicon Valley, they call it “permissionless innovation”: the belief that technological progress should not be stifled by public regulation. For the same reason, they can be secretive about what data is collected in a public space and what it is used for. Often the cities themselves don’t know.

Utrecht has become a tangle of individual pilots and projects, with no central overview of how many cameras and sensors exist, nor what they do. In 2014, the city invested €80m in data-driven management, which launched 80 projects. Utrecht now has a burglary predictor, a social media monitoring room, and smart bins and smart streetlights with sensors (although the city couldn’t say where these are located). It has scanner cars that dispense parking tickets, with the added bonus of detecting residents with a municipal tax debt, according to the privacy regulation covering the scanner cars. But when I asked the city to respond to a series of questions on just 22 of the smart projects, it could only answer for five of them, referring me to private companies for the rest of the answers.

The city also keeps track of the number of young people hanging out in the streets, their age group, whether they know each other, the atmosphere and whether or not they cause a nuisance. Special enforcement officers keep track of this information through mobile devices. It calls this process “targeted and innovative supervision”. Other council documents mention the prediction of school drop-outs, the prediction of poverty and the monitoring of “the health of certain groups” with the aim of “intervening faster”.

Like many cities, Utrecht argues that it acts in accordance with privacy laws because it anonymises or pseudonymises data (assigning it a number instead of a name or address). But pseudonymised personal data is still personal data. “The process is not irreversible if the source file is stored,” says Mireille Hildebrandt, professor of ICT and Law at Radboud University. “Moreover, if you build personal profiles and act on them, it is still a violation of privacy and such profiling can – unintentionally – lead to discrimination.” She points to Utrecht’s plan to register the race and health data of prostitutes, which came in for heavy criticism from the Dutch Data Protection Authority.
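
Hildebrandt’s point is straightforward to demonstrate: if the table mapping identifiers to pseudonyms (the “source file”) is kept, pseudonymisation is reversible by anyone who holds it. A minimal sketch, using a made-up MAC address of the kind the wifi-trackers above record:

```python
# Minimal illustration: pseudonymisation with a stored mapping is reversible by design.
import secrets

mapping = {}  # the "source file": identifier -> pseudonym

def pseudonymise(mac_address):
    if mac_address not in mapping:
        mapping[mac_address] = secrets.token_hex(8)
    return mapping[mac_address]

def reidentify(pseudonym):
    # Anyone holding the mapping can invert it.
    return next(mac for mac, p in mapping.items() if p == pseudonym)

p = pseudonymise("a4:5e:60:12:34:56")   # made-up MAC address
print(p, "->", reidentify(p))
```

Even without the table, a consistent pseudonym still lets the same device be profiled over time, which is why pseudonymised data remains personal data.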

Another unanswered question regards who owns data that is collected in a public space. Arjen Hof is director of Civity, a company that builds data platforms for governments. “Public authorities are increasingly outsourcing tasks to private companies. Think of waste removal or street lighting,” he says. “But they do not realise that at the same time a lot of data is collected, and do not always make agreements about the ownership of data.”
‘A smart city is a privatised city’

Hof gives the example of CityTec, a company that manages 2,000 car parks, 30,000 traffic lights and 500,000 lamp-posts across the Netherlands. It refused to share with municipalities the data it was collecting through its lamp-post sensors. “Their argument was that, although the municipality is legally owner of the lamp-posts, CityTec is the economic owner and, for competitive reasons, did not want to make the data available,” Hof says. This was three years ago, but for a lot of companies it remains standard practice. Companies dictate the terms, and cities say they can’t share the contracts because they contain “competition-sensitive information”.

When I interviewed the technology writer Evgeny Morozov in October, he warned of cities becoming too dependent on private companies. “The culmination of the smart city is a privatised city,” he said. “A city in which you have to pay for previously free services.”

Morozov’s fear about public subsidies being used for private innovation is well illustrated in Assen, a city of 70,000 people in the north of the country. Assen built a fibre-optic network for super-fast internet in 2011, to which it connected 200 sensors that measure, among other things, the flow of cars. There was an experiment to steer people around traffic jams, even though traffic in the city is relatively light. The city also connected its traffic lights, parking garages and parking signs to this grid. The cost of €46m was split between Brussels, the national government, the province and the municipality. Companies such as the car navigation firm TomTom have used the sensor network to test new services.

The project, called Sensor City, filed for bankruptcy a year ago. Now the publicly funded fibre-optic network, sensors and all, will be sold to a still-unidentified private company. The municipality will have to strike a deal with the new owner about the use of its public traffic lights and parking signs.

“Are you happy now? The uncertain future of emotion analytics”

Elise Thomas writes at Hopes & Fears:

“Right now, in a handful of computing labs scattered across the world, new software is being developed which has the potential to completely change our relationship with technology. Affective computing is about creating technology which recognizes and responds to your emotions. Using webcams, microphones or biometric sensors, the software uses a person’s physical reactions to analyze their emotional state, generating data which can then be used to monitor, mimic or manipulate that person’s emotions.”

Corporations spend billions each year trying to build “authentic” emotional connections to their target audiences. Marketing research is one of the most prolific research fields around, conducting thousands of studies on how to more effectively manipulate consumers’ decision-making. Advertisers are extremely interested in affective computing and particularly in a branch known as emotion analytics, which offers unprecedented real-time access to consumers’ emotional reactions and the ability to program alternative responses depending on how the content is being received.

For example, if two people watch an advertisement with a joke and only one person laughs, the software can be programmed to show more of the same kind of advertising to the person who laughs while trying different sorts of advertising on the person who did not laugh to see if it’s more effective. In essence, affective computing could enable advertisers to create individually tailored advertising en masse.
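
The advertising scenario described above amounts to a feedback loop: show a variant, measure the emotional reaction, and adapt what is shown next. A minimal sketch of that loop, with invented labels and a stub in place of any real emotion-analytics API:

```python
# Hypothetical sketch of reaction-driven ad selection; detect_emotion() stands in
# for whatever affective-computing system supplies the label.
import random

VARIANTS = ["joke_ad", "sentimental_ad", "fear_of_missing_out_ad"]

def detect_emotion(viewer, ad):
    """Placeholder for webcam/biometric emotion analysis."""
    return random.choice(["amused", "neutral", "annoyed"])

def next_ad(viewer, history):
    # Keep showing the style that last amused the viewer; otherwise try another variant.
    for ad, reaction in reversed(history):
        if reaction == "amused":
            return ad
    untried = [v for v in VARIANTS if v not in {ad for ad, _ in history}]
    return untried[0] if untried else random.choice(VARIANTS)

history = []
for _ in range(5):
    ad = next_ad("viewer_1", history)
    reaction = detect_emotion("viewer_1", ad)
    history.append((ad, reaction))
    print(ad, "->", reaction)
```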

“Say 15 years from now a particular brand of weight loss supplements obtains a particular girl’s information and locks on. When she scrolls through her Facebook, she sees pictures of rail-thin celebrities, carefully calibrated to capture her attention. When she turns on the TV, it automatically starts on an episode of “The Biggest Loser,” tracking her facial expressions to find the optimal moment for a supplement commercial. When she sets her music on shuffle, it “randomly” plays through a selection of the songs which make her sad. This goes on for weeks.

Now let’s add another layer. This girl is 14, and struggling with depression. She’s being bullied in school. Having become the target of a deliberate and persistent campaign by her technology to undermine her body image and sense of self-worth, she’s at risk of making some drastic choices.”

Google forming ‘smart cities’

“An ambitious project to blanket New York and London with ultrafast Wi-Fi via so-called “smart kiosks,” which will replace obsolete public telephones, is the work of a Google-backed startup.

Each kiosk is around nine feet high and relatively flat. Each flat side houses a big-screen display that pays for the whole operation with advertising.

Each kiosk provides free, high-speed Wi-Fi for anyone in range. By selecting the Wi-Fi network at one kiosk, and authenticating with an email address, each user will be automatically connected to every other LinkNYC kiosk they get within range of. Eventually, anyone will be able to walk around most of the city without losing the connection to these hotspots.

Wide-angle cameras on each side of the kiosks point up and down the street and sidewalk, approximating a 360-degree view. If a city wants to use those cameras and sensors for surveillance, it can.

Over the next 15 years, the city will go through the other two phases, where sensor data will be processed by artificial intelligence to gain unprecedented insights about traffic, environment and human behavior and eventually use it to intelligently re-direct traffic and shape other city functions.”

Snapchat launches video-recording sunglasses

“Social media app Snapchat is introducing video-recording sunglasses called Spectacles and is changing its company name to incorporate the new product.

The glasses can record video 10 seconds at a time by tapping a button on the device. The video is then uploaded automatically to the popular image-messaging app via Bluetooth or Wi-Fi. The glasses are the first hardware from the Los Angeles-based company.”
