Here’s the Pitch Deck for ‘Active Listening’ Ad Targeting

404 Media previously reported Cox Media Group (CMG) was advertising a service that claimed to target ads based on what potential customers said near device microphones. Now, here is the pitch deck CMG sent to prospective companies.

Are Marketers Using Smartphones to Listen to Your Conversations to Target Ads? Yes, Cox Media Group Says in Materials Deleted From Its Website

It’s a suspicion many people have long held: that their smartphones and smart speakers are listening in on their private conversations.

Now one company — Atlanta-based Cox Media Group — has revealed that yes, your devices are listening to you. Indeed, CMG touted its ability to identify “relevant conversations via smartphones, smart TVs and other devices” using AI to let local businesses target ads to those people.

“It’s True. Your Devices Are Listening to You,” said a page on the CMG Local Solutions site, which has since been pulled down. “With Active Listening, CMG can now use voice data to target your advertising to the EXACT people you are looking for.”

In a Nov. 28 blog post (which also has been deleted), CMG Local Solutions said its “Active Listening” technology can pick up conversations to provide local advertisers a weekly list of consumers who are in the market for a given product or service. Examples it cited of what Active Listening can detect included “Do we need a bigger vehicle?”; “I feel like my lawyer is screwing me”; and “It’s time for us to get serious about buying a house.”

Marketing Company Claims That It Actually Is Listening to Your Phone and Smart Speakers to Target Ads

“What would it mean for your business if you could target potential clients who are actively discussing their need for your services in their day-to-day conversations? No, it’s not a Black Mirror episode—it’s Voice Data, and CMG has the capabilities to use it to your business advantage.”

How the Pentagon Learned To Use Targeted Ads To Find Its Targets

In 2019, a government contractor and technologist named Mike Yeagley began making the rounds in Washington, DC. He had a blunt warning for anyone in the country’s national security establishment who would listen: The US government had a Grindr problem. A popular dating and hookup app, Grindr relied on the GPS capabilities of modern smartphones to connect potential partners in the same city, neighborhood, or even building. The app can show how far away a potential partner is in real time, down to the foot. But to Yeagley, Grindr was something else: one of the tens of thousands of carelessly designed mobile phone apps that leaked massive amounts of data into the opaque world of online advertisers. That data, Yeagley knew, was easily accessible by anyone with a little technical know-how. So Yeagley — a technology consultant then in his late forties who had worked in and around government projects nearly his entire career — made a PowerPoint presentation and went out to demonstrate precisely how that data was a serious national security risk.

As he would explain in a succession of bland government conference rooms, Yeagley was able to access the geolocation data on Grindr users through a hidden but ubiquitous entry point: the digital advertising exchanges that serve up the little digital banner ads along the top of Grindr and nearly every other ad-supported mobile app and website. This was possible because of the way online ad space is sold, through near-instantaneous auctions in a process called real-time bidding. Those auctions were rife with surveillance potential. You know that ad that seems to follow you around the internet? It’s tracking you in more ways than one. In some cases, it’s making your precise location available in near-real time to both advertisers and people like Mike Yeagley, who specialized in obtaining unique data sets for government agencies.
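The surveillance hook described above is that a bid request broadcast to prospective ad buyers can carry the device’s advertising ID and GPS coordinates. A minimal sketch of what a bidder might see, loosely modeled on the IAB’s OpenRTB schema (the field names follow that spec informally, and every value here is invented for illustration):

```python
# A simplified, illustrative OpenRTB-style bid request. Real exchanges use the
# IAB OpenRTB schema; this payload only loosely follows it, and all values
# (app bundle, advertising ID, coordinates) are made up for illustration.
bid_request = {
    "id": "auction-123",
    "app": {"bundle": "com.example.datingapp"},  # hypothetical app identifier
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # advertising ID
        "geo": {"lat": 38.8719, "lon": -77.0563, "type": 1},  # GPS-derived fix
    },
}

def extract_location(request):
    """Pull out the advertising ID and coordinates any bidder would receive."""
    dev = request["device"]
    return dev["ifa"], dev["geo"]["lat"], dev["geo"]["lon"]

ifa, lat, lon = extract_location(bid_request)
```

The point is that every participant in the auction receives this payload whether or not it wins the ad slot, which is why losing bids can still function as a location feed.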

Working with Grindr data, Yeagley began drawing geofences — creating virtual boundaries in geographical data sets — around buildings belonging to government agencies that do national security work. That allowed Yeagley to see what phones were in certain buildings at certain times, and where they went afterwards. He was looking for phones belonging to Grindr users who spent their daytime hours at government office buildings. If the device spent most workdays at the Pentagon, the FBI headquarters, or the National Geospatial-Intelligence Agency building at Fort Belvoir, for example, there was a good chance its owner worked for one of those agencies. Then he started looking at the movement of those phones through the Grindr data. When they weren’t at their offices, where did they go? A small number of them had lingered at highway rest stops in the DC area at the same time and in proximity to other Grindr users — sometimes during the workday and sometimes while in transit between government facilities. For other Grindr users, he could infer where they lived, see where they traveled, even guess at whom they were dating.
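The geofencing step Yeagley describes reduces to a simple spatial filter: given a feed of device pings, keep the device IDs whose coordinates fall within some radius of a building. A minimal sketch under that assumption, with invented pings and an approximate coordinate for the Pentagon (the function and variable names are mine, not from any real tool):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_inside(pings, center, radius_m):
    """Device IDs with at least one ping inside the circular geofence."""
    clat, clon = center
    return {
        p["device_id"]
        for p in pings
        if haversine_m(p["lat"], p["lon"], clat, clon) <= radius_m
    }

# Invented example pings near the Pentagon's approximate coordinates.
PENTAGON = (38.8719, -77.0563)
pings = [
    {"device_id": "a", "lat": 38.8720, "lon": -77.0560},  # tens of meters away
    {"device_id": "b", "lat": 38.9000, "lon": -77.0400},  # kilometers away
]
inside = devices_inside(pings, PENTAGON, 200)  # only device "a" qualifies
```

Once a device is tied to a workplace this way, the rest of the analysis is just following that same device ID through the remainder of the data set.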

Intelligence agencies have a long and unfortunate history of trying to root out LGBTQ Americans from their workforce, but this wasn’t Yeagley’s intent. He didn’t want anyone to get in trouble. No disciplinary actions were taken against any employee of the federal government based on Yeagley’s presentation. His aim was to show that buried in the seemingly innocuous technical data that comes off every cell phone in the world is a rich story — one that people might prefer to keep quiet. Or at the very least, not broadcast to the whole world. And that each of these intelligence and national security agencies had employees who were recklessly, if obliviously, broadcasting intimate details of their lives to anyone who knew where to look. As Yeagley showed, all that information was available for sale, for cheap. And it wasn’t just Grindr, but rather any app that had access to a user’s precise location — other dating apps, weather apps, games. Yeagley chose Grindr because it happened to generate a particularly rich set of data and its user base might be uniquely vulnerable.

The report goes into great detail about how intelligence and data analysis techniques, notably through a program called Locomotive developed by PlanetRisk, enabled the tracking of mobile devices associated with Russian President Vladimir Putin’s entourage. By analyzing commercial adtech data, including precise geolocation information collected from mobile advertising bid requests, analysts were able to monitor the movements of phones that frequently accompanied Putin, indicating the locations and movements of his security personnel, aides, and support staff.
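The entourage analysis is, at its core, a co-travel query: which devices repeatedly turn up near a target's pings, across different places and times? A toy sketch of that idea, discretizing pings into coarse space-time cells and counting co-occurrences (the cell sizes, device names, and coordinates are all invented; nothing here reflects how Locomotive actually works):

```python
from collections import Counter

def cotravel_counts(pings, target_id, cell=0.01, bucket_s=3600):
    """Count how often each other device shares a coarse space-time cell
    with the target. A ~0.01-degree cell (~1 km) and one-hour bucket are
    arbitrary, illustrative resolutions."""
    def key(p):
        return (round(p["lat"] / cell), round(p["lon"] / cell), p["ts"] // bucket_s)

    target_cells = {key(p) for p in pings if p["device_id"] == target_id}
    return Counter(
        p["device_id"]
        for p in pings
        if p["device_id"] != target_id and key(p) in target_cells
    )

# Invented pings: "aide" shows up near "vip" in two different cities;
# "stranger" never does.
pings = [
    {"device_id": "vip", "lat": 55.750, "lon": 37.620, "ts": 0},
    {"device_id": "vip", "lat": 59.940, "lon": 30.310, "ts": 7200},
    {"device_id": "aide", "lat": 55.751, "lon": 37.621, "ts": 600},
    {"device_id": "aide", "lat": 59.941, "lon": 30.309, "ts": 7500},
    {"device_id": "stranger", "lat": 48.850, "lon": 2.350, "ts": 600},
]
counts = cotravel_counts(pings, "vip")
```

Devices that co-occur with the target across many distinct cells are unlikely to do so by chance, which is why repeated co-location is treated as evidence of an actual relationship.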

This capability underscored the surveillance potential of commercially available data, providing insights into the activities and security arrangements of high-profile individuals without directly compromising their personal devices.

TikTok sued for billions over use of children’s data

Lawyers will allege that TikTok takes children’s personal information, including phone numbers, videos, exact location and biometric data, without sufficient warning, transparency or the necessary consent required by law, and without children or parents knowing what is being done with that information. TikTok has more than 800 million users worldwide and parent firm ByteDance made billions in profits last year, with the vast majority of that coming via advertising revenue.

Why Don’t We Just Ban Targeted Advertising?

Google and Facebook, including their subsidiaries like Instagram and YouTube, make about 83 percent and 99 percent of their respective revenue from one thing: selling ads. It’s the same story with Twitter and other free sites and apps. More to the point, these companies are in the business of what’s called behavioral advertising, which allows companies to aim their marketing based on everything from users’ sexual orientations to their moods and menstrual cycles, as revealed by everything they do on their devices and every place they take them. It follows that most of the unsavory things the platforms do—boost inflammatory content, track our whereabouts, enable election manipulation, crush the news industry—stem from the goal of boosting ad revenues. Instead of trying to clean up all these messes one by one, the logic goes, why not just remove the underlying financial incentive? Targeting ads based on individual user data didn’t even really exist until the past decade. (Indeed, Google still makes many billions of dollars from ads tied to search terms, which aren’t user-specific.) What if companies simply weren’t allowed to do it anymore?

Let’s pretend it really happened. Imagine Congress passed a law tomorrow morning that banned companies from doing any ad microtargeting whatsoever. Close your eyes and picture what life would be like if the leading business model of the internet were banished from existence. How would things be different?

Many of the changes would be subtle. You could buy a pair of shoes on Amazon without Reebok ads following you for months. Perhaps you’d see some listings that you didn’t see before, for jobs or real estate. That’s especially likely if you’re African-American, or a woman, or a member of another disadvantaged group. You might come to understand that microtargeting had supercharged advertisers’ ability to discriminate, even when they weren’t trying to.

Facebook Allowed Advertisers to Target Users Interested in “White Genocide”—Even in Wake of Pittsburgh Massacre

Apparently fueled by anti-Semitism and the bogus narrative that outside forces are scheming to exterminate the white race, Robert Bowers murdered 11 Jewish congregants as they gathered inside their Pittsburgh synagogue, federal prosecutors allege. But despite long-running international efforts to debunk the idea of a “white genocide,” Facebook was still selling advertisers the ability to market to those with an interest in that myth just days after the bloodshed.

A simple search of Facebook pages also makes plain that there are tens of thousands of users with a very earnest interest in “white genocide,” shown through the long list of groups with names like “Stop White South African Genocide,” “White Genocide Watch,” and “The last days of the white man.” Images with captions like “Don’t Be A Race Traitor” and “STOP WHITE GENOCIDE IN SOUTH AFRICA” are freely shared in such groups, providing a natural target for anyone who might want to pay to promote deliberately divisive and incendiary hate-based content.

Instagram is testing the ability to share your precise location history with Facebook

Just weeks after Instagram’s co-founders left the company, it has emerged that Instagram is testing a feature that would allow it to share your location data with Facebook, even when you’re not using the app.

Instagram is not the only service whose data Facebook has sought to pool. Back in 2016 the company announced that it would share user data between WhatsApp and Facebook in order to offer better friend suggestions. The practice was later halted in the European Union under its GDPR legislation, although WhatsApp’s CEO and co-founder later left over data privacy concerns.

Facebook is also reportedly testing a map view to see friends’ locations, similar to what’s already offered by Snapchat. Instagram’s data sharing could provide additional data points to power this functionality, while providing Facebook with more data to better target its ads.

Targeted advertising hits emergency rooms

Patients sitting in emergency rooms, at chiropractors’ offices and at pain clinics in the Philadelphia area may start noticing on their phones the kind of messages typically seen along highway billboards and public transit: personal injury law firms looking for business by casting mobile online ads at patients.

The potentially creepy part? They’re only getting fed the ad because somebody knows they are in an emergency room.

The technology behind the ads, known as geofencing, or placing a digital perimeter around a specific location, has been deployed by retailers for years to offer coupons and special offers to customers as they shop. Bringing it into health care spaces, however, is raising alarm among privacy experts.