Conversation Guide

Introduction

This guide is designed to help screening organisers and teachers engage constructively with their audiences on some of the issues presented in the film. Below are some questions that could help open up helpful conversation, as well as a summary of key points for further exploration.

Inspiring thinking, conversation, and action on the film’s specific arguments is a key goal.

This guide aims to help foster constructive screenings that remain focused on the issues and arguments raised in the film.

Synopsis

We live in a world of screens. The average adult spends the majority of their waking hours in front of some sort of screen or device. We’re enthralled, we’re addicted to these machines. How did we get here? Who benefits from this arrangement? What are the cumulative impacts on people, society and the environment? What may come next if this culture is left unchecked, to its end trajectory, and is that what we want?

Stare Into The Lights My Pretties investigates these questions with an urge to return to the real physical world, and to form a critical view of technological escalation driven by rapacious and pervasive corporate interest. Covering themes of addiction, privacy, surveillance, information manipulation, behaviour modification and social control, the film lays out the foundations of why we may feel like we’re sleeprunning into some dystopian nightmare with the machines at the helm. Because we are, if we don’t seriously avert our eyes to stop this culture from destroying what is left of the real world.

Summary of Key Points

“Progress”

  • This culture’s technologies are developed and “progressed” (escalated) through one of three power blocs: (A) the Armed Forces, (B) Bureaucracy, and (C) Corporate Power.
  • The creation of the computer and the Internet has direct origins in war; both were invented by and for the military.
  • Technologies do not exist in a vacuum. They are a result of: mindset, social structures, culture.
  • Once technologies come into existence, they have an impact on mindset, social systems, culture.
  • Lewis Mumford was a philosopher of technology. He was very pro-technology until World War Two. That changed his mind.
  • Mumford used the word technics to describe the interaction between the social system that emerges from and gives rise to certain technologies, and then how those technologies, in turn, impact the social system, and so on.
  • From the Greek tekhne (which means not only technology, but also art, skill, and dexterity), technics refers to the interplay of social milieu and technological innovation—the “wishes, habits, ideas, and goals” of a society.
  • Mumford shows that technology is just one part of technics. He writes at the beginning of his book Technics and Civilization from 1934, “other civilizations reached a high degree of technical proficiency without, apparently, being profoundly influenced by the methods and aims of technics.”
  • War, conflict, and competition have fundamentally driven the development and escalation of computers and digital technology. We can see this through major historical events: the Cold War and the Space Race in the late 1960s and 1970s; the digitisation of the American economy during the 1980s; the rise and near-totalising completeness of globalisation throughout the 1990s and 2000s; the Dot-Com bubble and the concentration of corporate power in the decades that followed, and so on. These origins in war, conflict, and global commerce continue the trajectory into the current day, with the advent of mass surveillance regimes in democratic and authoritarian societies alike; cyber-warfare; drone strikes; financial systems extensively powered by the Internet; algorithms, cryptography, advertising, automation, predictive analytics, and so on.
  • Technology changes how you perceive the world and relate to the world.
  • “If all you have is a hammer, everything looks like a nail.”
  • The technics themselves can end up in charge, requiring certain social structures and a certain mindset, a lens through which the world is primarily perceived and acted upon. For example, car culture requires carparks, highways, and constant expansion primarily to facilitate the requirements of the machine itself, with human needs, and indeed the natural world, secondary. The technic itself has taken charge and control of the social, political and material conditions of the society.
  • Screen culture requires and perpetuates the digitisation of everything, as well as constant expansion and escalation. Digital technology never takes a step back. The technic itself has not only taken charge of the society, it is the society. This is how we can come to see and describe the trajectory of the subsumption of all things non-digital, including the notion of the human mind itself: as merely a “computer.” The technic has imposed a certain social structure and mindset, a lens through which the world is primarily perceived and acted upon.
  • The rapid escalation of technology, and its cumulative personal, social and political impacts, form a self-reinforcing, accelerating positive-feedback loop, like a “snowball effect.”
  • In less than one human lifetime, we’ve literally gone from no computers at all, to computers everywhere, connected to all the other computers everywhere, seemingly touching every facet of our lives.
  • What are the trends or common threads to this trajectory?
  • What is the end of this trajectory? Does it have any hard limits?
  • Susan Greenfield: “This era is unprecedented. Never before have digital technologies so changed our environment from three dimensions to two, and challenged the boundaries as not only information technology has done but nanotechnology and biotechnology are doing. They are radically shaking up our concepts of space and time in a way that the motor car or the telephone or the television did not.”
  • The culture is fundamentally challenging the notions of what it means to be a human being; to live in the physical world, a real world; to be materially, physically, psychologically and emotionally grounded.

Mindset

  • Screen culture: A whole way of life revolving around digital devices. More specifically, it concerns the amount of time people spend in front of some kind of screen—phone, laptop, Xbox, television, digital billboards, “Internet of Things,” etc.
  • The amount of time is substantial: the majority of our waking hours are spent using some kind of screen or device.
  • Conservative estimates put the average adult at more than 8 hours a day in front of a screen.
  • A 2014 report from Nielsen showed that the average adult in the United States spends more than 11 hours per day with digital media. That adds up to more hours looking at screens each week than an average working week.
  • Global human population is 7.6 billion. 3.8 billion people are online (~50% of global human population).
  • Facebook has 2.2 billion users every month (~58% of the 3.8 billion people online).
  • According to a 2013 report by IDC Research, sponsored by Facebook itself, 4 out of 5 ‘smartphone’ users check their phones within the first 15 minutes of waking up. 63% of ‘smartphone’ owners aged 18-44 keep their phone with them for all but an hour of their waking day, and 79% keep it with them for all but two hours (that is, roughly 22 hours a day). 1 out of 4 study respondents couldn’t recall a time in their day when their phone was not within reach or in the same room as them.
  • By the time the average person reaches 70, they will have spent the equivalent of 10 to 15 years of their life watching television. 4+ years just watching the ads (not including the now-ubiquitous forms of advertising that are embedded into the content, or are indiscernibly masquerading as content).
  • An average of 500 million “tweets” are sent every day. 300 hours of video are uploaded to YouTube every minute. Over 40,000 search queries are made using Google every second, which translates to over 3.5 billion searches per day, and over 1.2 trillion searches per year, worldwide.
  • Of the world’s 7 billion people, 6 billion have access to a mobile phone—a billion and a half more, the United Nations reports, than have access to a working toilet.
  • A 2016 poll of ~2,500 British youths aged between 18 and 25 found they preferred “a decent internet connection over ‘trivial matters’ … such as hot water, daylight, a good night’s sleep, a healthy diet,” and so on.
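
The search-volume figures above are easy to sanity-check with back-of-the-envelope arithmetic (using the per-second figure quoted in the text):

```python
per_second = 40_000              # Google searches per second (figure from the text)
per_day = per_second * 86_400    # 86,400 seconds in a day
per_year = per_day * 365

print(f"{per_day:,} searches per day")    # 3,456,000,000 -> "over 3.5 billion" (approx.)
print(f"{per_year:,} searches per year")  # 1,261,440,000,000 -> "over 1.2 trillion"
```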

Neuroplasticity

  • The human mind changes and adapts to its environment, shaped by its experience. Neuroscience calls this phenomenon “neuroplasticity,” from the Greek plastikos, meaning “to be moulded.”
  • Neuroplasticity is analogous to exercising a muscle. For example, if you’re practising a particular form of thinking or doing, the involved pathways in the brain are strengthened and stimulated such that the particular form of thinking or doing becomes ‘second nature.’ Likewise, if you don’t practise for a while, the pathways atrophy, analogous to neglecting to exercise. The ‘mind muscles’ break down. Practising a particular form of thinking or doing keeps that form of thinking or doing physically alive and strong in the mind.
  • If you’re spending 11 hours a day with screens, what are the impacts on the mind?

Screen Culture

  • The screen environment is highly sensory and stimulating for a certain mindset. You get near-instant feedback from computers, for instance. This can create and sustain an unequal emphasis on process rather than content, on outcomes rather than meaning or reflection. The premium is placed on the senses: a “sensational” time, in contrast to a contemplative or meaningful one.
  • As the medium of the screen is only vision and sound, the screen environment is biased towards “what you see is what you get,” as opposed to metaphor or analogy. How do you convey “Out, out, brief candle!” on a screen?
  • The screen encourages distraction and shallow thinking. Sometimes this is by design; other times it is ancillary and/or unintended. Examples include the way hyperlinks can take you off-track, and alerts or notifications that interrupt what you’re doing or thinking about, or change the purpose for which you came to the computer in the first place. Even the mere possibility the screen provides, that any fleeting curiosity can be satisfied on a whim, means the screen can quickly transform from a means to an end into an end in and of itself.
  • The premium on the senses requires an increase in stimulation to stay interested or focused. Once the brain adapts to this highly sensory virtual world, the real world can seem boring and slow in comparison, exacerbating a short attention span. This in turn gives the screen a further allure, and the cycle repeats.
  • The screen does not encourage long-form thinking. The distraction of the technology itself gets in the way.
  • Multitasking is a rapid form of distraction and scatter-brain thinking, encouraged and exacerbated by the screen environment. There is a ‘cognitive switching cost’ to switching from task to task. Multitasking can negatively affect memory, analytic reasoning, the ability to concentrate, think clearly, and so on.
  • If one does not practise long-form thinking, critical thinking, reading deeply or having contemplative time, what happens to those abilities?
  • What are the implications on a personal, social, political, and environmental level if contemplativeness and the ability to think well and clearly is lost?
  • Google affects our memory, because we’re practising externalising memory as opposed to recalling information and considering it from within ourselves. This has huge implications not only on topics such as discourse, history, intelligence, and community, but also the way in which a corporation has come to dominate and redefine the experience of “remembering” itself.
  • Susan Greenfield on the ‘cut-and-paste’ mentality: “[Externalising memory through Google] does concern me because let’s take that ad absurdum: If you feel you can look anything up and you don’t have to learn anything, this would mean conversation is going to be pretty clunky because if I normally meet someone of roughly my generation and culture, I will assume they know where Barcelona is and I will assume they know who Napoleon was or who Henry VIII was, I’ll assume they’ll know where New York is. So you can have a conversation—and we all know the delight of having conversations with people where you share a lot of background knowledge that can develop ideas—but imagine having a conversation with someone who knew nothing, who didn’t know who Hitler was, who hadn’t heard of fascism, who had to look it up each time? This would mean that you couldn’t really have a very fluid or interactive conversation, and I know that sounds extreme, but if we are always having recourse to an external source of memory then I think it’s going to have a severe impact on how we interact, how fast we have ideas, and what we do with them, and I think that the younger generation perhaps might be disadvantaged in not having such ready agile processes and have a much more cut-and-paste mentality.”
  • The speed and immediacy of screen culture contributes to a lack of contemplativeness. There is both a social and technological pressure to answer or respond or engage now. “It’s been 5 minutes, why haven’t you responded to my text?” The tools push us to quick responses. This means we start to ask each other questions that are easy to answer. But the questions before our planet right now are not the sort of questions that should be thought about or answered in the time-space of texting.

It’s all about me!

  • The Facebook algorithm shows stories that get lots of “social love” to more people, and stories that don’t to fewer people. People find it very easy to click Like on “I just ran a marathon” or “I just baked this awesome cake,” but stories like “the genocide in Darfur enters its 10th year” don’t get so many Likes. What does this mean? If these sorts of stories are filtered out, does that mean we don’t find out about important but unpleasant things that are happening?
  • Facebook CEO Mark Zuckerberg says, “A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” Is there something wrong with this view? If so, what? Why do you answer the way you do?
  • Facebook, Google, Amazon, Netflix, Yahoo News, etc—most online platforms—record the links you click on; track the websites you visit and how long you stay there and what elements of the page you interact with; capture your keystrokes; and cross-correlate data about your past behaviour, interests, location, and with whom you’ve been communicating, and so on. These extremely rich information sets, among others, are then used to feed algorithms that go about customising and manipulating the information content, to personalise the information experience to you. This technic is called The Filter Bubble.
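
The kind of engagement-driven filtering described above can be sketched as a toy feed ranker. This is a hypothetical illustration only, not any platform’s actual algorithm: stories are scored by the user’s past engagement with their topic, so important-but-unengaging stories sink to the bottom of the feed.

```python
from collections import Counter

def rank_feed(stories, click_history):
    """Toy engagement-based feed ranker (illustrative only, not any real
    platform's algorithm). Each story is a (headline, topic) pair;
    click_history is a list of topics the user has engaged with before."""
    affinity = Counter(click_history)  # how often the user clicked each topic
    # Highest past engagement first; unengaging topics sink to the bottom.
    return sorted(stories, key=lambda s: affinity[s[1]], reverse=True)

stories = [
    ("Genocide in Darfur enters its 10th year", "world-news"),
    ("I just ran a marathon!", "friends"),
    ("Celebrity bake-off winner announced", "entertainment"),
]
history = ["friends", "friends", "entertainment", "friends"]

for headline, topic in rank_feed(stories, history):
    print(headline)
# The marathon story ranks first; the Darfur story ranks last,
# even though it is arguably the most important.
```

A real system would then feed the resulting clicks back into the history, which is what turns personalisation into a self-reinforcing bubble.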

The Filter Bubble

  • The filtering is substantial enough that globally important social and political events such as the uprising in Egypt can be completely filtered out.
  • Eli Pariser: “It’s not just Google and Facebook either. This is something that’s sweeping the Web. There are a whole host of companies that are doing this kind of personalisation. Yahoo News, the biggest news site on the Internet, is now personalised. Different people get different things. Huffington Post, the Washington Post, the New York Times—all flirting with personalisation in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.”
  • Former CEO of Google Eric Schmidt says, “The power of individual targeting—the technology will be so good it will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
  • Katina Michael: “The Screen Culture makes people look within and not to look outside. So when I’m using my smartphone, and I’m being sent instant messages, and I’m being communicated to, it’s about me. And people can say, ‘That’s great for personalisation, that’s how I want it, I want to customise my whole life,’ but in fact, we’re internalising a lot of things: If I think about me, then most likely I will neglect my children, I will neglect my partner, I will neglect my workplace, because it’s about me and my interactions and the instantaneous communications that take place. There’s always a danger in that, in ignoring your neighbour, in a lack of collective awareness.”
  • Katina Michael: “When my senses are enveloped, and it’s about me and my communications, it’s not about my children, it’s not about my partner, it’s not about my workplace, it’s about me, I think there’s a great danger in trust within society, in building relationships with one another–or a lack of building–when we are concerned about the Me.”
  • The Me has been a target of corporate power for a long time. After the Second World War, advertising shifted to instilling desires and manipulating the masses to want things and to see the world in a certain way. In the world of screen culture, born of the computer and the Internet, this is in hyper-drive: the convergence of technologies has not only amplified the power and influence of the corporate voice, but provides a centralised mechanism of social control pretending to be freedom and democracy.
  • The Filter Bubble puts you at the centre of what seems like a vast world of connectivity and relevance. But really, you’re in a walled information garden, a holding cell of two-way mirrors, a giant echo-chamber. What happens to our communities, our relationships, the culture, if we’re each walking around in our own information echo-chamber, this lack of collective awareness?
  • The Filter Bubble reinforces polarisation within the society in terms of people not being exposed to and listening to alternative viewpoints, considering them or thinking about them.
  • The famous Daniel Patrick Moynihan quote: “Everybody’s entitled to their own opinions but not their own facts.” But it’s increasingly possible to live in an online world in which you do have your “own facts”: you Google climate change and you get the climate-change links for you. You don’t necessarily get exposed to the alternate arguments; you may not even know what they are.
  • As we willingly pour our lives into the screen, the screens don’t simply reflect more of the same back at us; they strengthen corporate power, which studies and analyses us inside this playpen, projecting into our individually targeted mirror-worlds. We become the product of the consumer culture in totality.

Dataveillance and Monetisation: The “Cost of Free”

  • There’s a myth online that what we’re doing is free. Instead, we are paying with data about ourselves: our habits, our preferences, our relationships, our personality traits.
  • Bruce Schneier: “Everything we do on a computer produces a transaction record. Whether it’s your laptop, whether it’s your phone, whether it’s an ATM machine, a toll booth, using your credit card, anything with a computer creates a transaction record. Data is a by-product of all of our information society’s socialisation. Increasingly, companies’ computers are mediating all of our social interactions. And all of this data is increasingly stored and increasingly searchable.”
  • And all of this data is not only the centre from which social control operates via a screen culture; it’s where our value is ultimately extracted and turned into huge profits.
  • Think about your digital trails. What did you do today that involved a computer, a screen? Your choice or not?
  • The screens are always watching: gathering and saving data. For example, Facebook Pixel, Google AdWords, Google Analytics, or YouTube video embeds. These services are loaded from the companies’ computers, which in turn allows for data collection, aggregation, mining for patterns, predictive analytics, selling the insights, and so on.
  • Many websites use Google Analytics or Facebook Pixel, etc. This means the dataveillance, and the sphere of influence over content, is immense, concentrated and centralised.
  • Douglas Rushkoff: “The product online is not the content. The product online is you! The product online are the eyeballs looking at that content and as much information about how to influence the hands connected to those eyeballs as possible.”
  • The myth of free extends to the obfuscation of the workings and purpose of the tools themselves.
  • For example, Douglas Rushkoff: “The average kid today, you look at Facebook and you think, “Oh look at this place, Facebook is here to help me make friends, isn’t that great? This is what this is for.” The distance between the user and the program is so great, we don’t even know what the programs we’re using are for. Talk to little Johnny and he thinks that Facebook is there to help him make friends? Go to Facebook, what do you think they’re talking about there? How are we going to help little Johnny make more friends? Deeper lasting human relationships? No! They’re thinking: how are we going to monetise Johnny’s social graph? How are we going to use big data to predict what Johnny’s going to do and then sell Johnny’s future to himself before he knows he’s there himself?”
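
A greatly simplified sketch of why third-party embeds centralise dataveillance (all names and data below are hypothetical): every site that includes the same tracker reports each page load back to one company, which can then assemble cross-site browsing profiles keyed to a persistent visitor ID.

```python
from collections import defaultdict

class ToyTracker:
    """Greatly simplified sketch of a third-party tracking service.
    Real systems (pixels, analytics scripts) also collect referrers,
    timings, device fingerprints, and much more."""
    def __init__(self):
        self.profiles = defaultdict(list)  # visitor_id -> pages visited

    def log_visit(self, visitor_id, site, page):
        # Called on every page load, from ANY site that includes the
        # embed -- this is what centralises the data in one place.
        self.profiles[visitor_id].append(f"{site}{page}")

tracker = ToyTracker()
# The same visitor ID shows up across unrelated sites:
tracker.log_visit("visitor-42", "news.example", "/politics/article-1")
tracker.log_visit("visitor-42", "shop.example", "/cart/checkout")
tracker.log_visit("visitor-42", "health.example", "/symptoms/insomnia")

print(tracker.profiles["visitor-42"])
# One company now holds a cross-site profile of visitor-42's
# politics, purchases, and health concerns.
```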

The Society of the Spectacle

  • “From Uber to Eric Schmidt, tech is closer to the US government than you’d think.” For example, the “Android Administration,” Google’s remarkably close relationship with the White House; NSA spying programmes such as Tempora or PRISM involving Google, Apple, Microsoft, Facebook, and so on.
  • Kevin Bankston: “What you Google for defines you. A log of your searches on Google, or any other search engine, is practically the closest thing to a printout of the contents of your brain that we’ve ever seen. It indicates your political leanings, your religious leanings, your medical concerns, your sexual concerns: a vast array of sensitive data that in the past no one ever had.”
  • Jeff Chester: “The basic fundamental paradigm of advertising is called one-to-one marketing. That’s what was made possible by the Internet. I can know everything you do and I can reach you at any point. First, in the 90s, it was when you were in front of a computer, but now because of the growth of the Internet and especially mobile devices, I can reach you 24/7. I can reach you and your friends and I can target you and I can engage in visible digital behaviour modification.”

The Megamachine

  • According to Pew: Today in the United States, more than 85% of adults get their news from social media, and 64% get news from only one source, usually Facebook.
  • Corporate forces make huge profits off the data about who we are and what we do, and by shaping us into subservient consumers, the cycle going round and round. This manipulation is now done so well that politics has turned to the very same methods: using data for persuasion and highly sophisticated propaganda techniques, manipulating elections, reinforcing the power of the powerful, mass surveillance, and so on.
  • The billionaires and technologists behind politicians serving corporate interest, who want us to vote a certain way or depend on a certain political outcome, not only donate huge amounts of money to individual political candidates, but tap into the rich troves of digital data trails about each of us to generate extremely targeted political campaigns.
  • This targeting is carried out with scientific precision on a scale never before possible.
  • Technology companies are among the largest political lobbyists in the United States, behind only the mining and defence industries.
  • The tech industry spent over $120 million for political lobbying in 2010.
  • Microsoft alone spent over $6.9 million in federal lobbying efforts in 2010.
  • Alexander Nix, CEO of Cambridge Analytica on the power of Big Data and psychographics in the electoral process: “Back in the days of Mad Men, communication was essentially top-down, that is, it’s ‘creative led.’ Brilliant minds get together and come up with slogans like, “Beans means Heinz” and “Coca-Cola is it” and they push these messages on to the audience in the hope that they resonate. Today, we don’t need to guess at what creative solution may or may not work. We can use hundreds or thousands of individual data points on our target audiences to understand exactly which messages are going to appeal to which audiences way before the creative process starts.”
  • Once an audience segment is identified, they can be sub-segmented by the issues that are most relevant to them, and then be targeted with specific messages.
  • The new age of the consumer culture that has also subsumed politics is run by data. ‘Technocracy is the new democracy.’
  • Jeff Chester: “The commercial surveillance system that the advertisers have created all across the world is stunning and a cause for serious alarm. It threatens our civil liberties, it’s about getting us to buy high interest-rate credit cards, junk food, prescription drugs for illnesses we may or may not have, politicians who may or may not be good for us—that’s what all this data is being used for. We have no access to this data, we have no ability to control this information, we can’t challenge it, we can’t correct it. So we’ve allowed the Googles and the Facebooks and the Yahoos to create invisible repositories of information about each and every one of us that they can use, but it’s closed to us. So ultimately it’s not about just selling, it’s about maybe the next evolution of capitalism: creating an undemocratic society.”

Creeping Normalcy

  • Debunking the myth: “I’ve got nothing to hide, so I’ve got nothing to fear.”
  • Roger Clarke: “We hear this nonsense about the only people who’re concerned about privacy are people with something to hide. Well, yes. How about your password? How about your pin? How about various aspects of your physical person? How about various aspects of your health? Various aspects of your finances? The fact that you’ve got a really really valuable painting in a house that is really easy to break into and that doesn’t have a security system? How about the way your kids go to and from school? What your daughter drinks, and which drink to spike? There’s any number of things that people have to hide.”
  • Surveillance Camera Man: A person who walks around the city filming people, analogous to the way pervasive CCTV watches our every move, whether we like it or not, unanswerable.
  • On the whole, people react negatively to Surveillance Camera Man but not to the wider surveillance culture, even though it’s analogous: the surveillance culture follows us everywhere, tracking, recording, analysing, scrutinising, unanswerable. So why aren’t we pissed off about this in the same way? Is it because the surveillance is diffuse, coming at us from all directions? It’s not a guy with a camera right in front of our eyes. It’s something that’s been normalised in slow incremental stages, a kind of creeping normalcy, hidden in plain sight?
  • Katina Michael: There is a novelty effect to the new technologies. At first we notice them. But then they are normalised, and we forget to question what is going on.

Historical Memory and The Panopticon

  • Roger Clarke: “The Jews who lived in Holland in 1939 had nothing to fear from a database that identified them as being Jews. Well they don’t have anything to fear now ‘cause they’re all dead, with very few exceptions. The people who had reasonable educational qualifications in Cambodia in the 70s had nothing to fear. What’s there to fear about knowing that you’ve got a certificate of an advanced diploma? Well most of them are dead too. In Rwanda/Burundi it was being thought to be of a particular ethnic background. Now it’s not terribly easy for people to tell ethnic backgrounds when they’re tribal and they’re adjacent and they’ve been adjacent for hundreds and even thousands of years. But the decision was made that you were of that ethnic persuasion therefore you were dead. Now these are just the sharp end, where the worst case of invasion of privacy occurs and you get killed. There’s lots and lots of circumstances where obscure bits of information do harm to people in ways somewhat less than killing them.”
  • The inventor of the World Wide Web, Tim Berners-Lee states that the “Internet has become the world’s largest surveillance network.”
  • William Binney, ex-NSA spy technician turned whistleblower, says: “A lot of people say, ‘I’m not doing anything so it doesn’t matter.’ Well you may not be doing anything now but you can’t say anything contrary to the administrative position because you may become a target. And if you do that and you become a target they’ve already assembled all this back-data on you, now they can start looking for some way to charge you with some kind of criminal act, or harass you in some way, like at the border, or internally here, they can harass you through businesses. Things like that. All that is a very, it’s a real danger when government assembles that kind of knowledge about its citizenry. I mean, from my background, the KGB, the Stasi, the SS, and the Gestapo, they could never have dreamt of having such an ability to monitor the population. That’s the real threat.”
  • Rebecca MacKinnon: “Surveillance is much more pervasive in our online lives than it is in our offline lives. If police officers come into your house or come into your office and they go through your files, they go through your desk, they go through your drawers and cabinets, it’s pretty obvious that happened. But if they do the equivalent, in your email, in your online storage spaces and your Facebook, you don’t know.”
  • This is the concept of a panopticon: A circular prison design with the watchtower in the centre and cells along the outside. “The scheme of the design is to allow all prisoners to be observed by a single watchman in the centre, without the inmates being able to tell whether or not they are being watched. Although it is physically impossible for the single watchman to observe all the inmates’ cells at once, the fact that the inmates cannot know when they are being watched means that they are motivated to act as though they are being watched at all times. Thus they are effectively compelled to regulate their own behaviour.”
  • William Binney: “The danger here is that we fall into something like a totalitarian state like East Germany.”
  • Petra Epperlein: “That’s actually the creepy thing about surveillance, like pre-emptive surveillance. You can find anything about someone. You can make stuff up. You just collect all this material and you can go back later and interpret it in so many ways, and so everybody can become the enemy instantaneously.”

Are we paying attention?

  • Eli Pariser: “One of the big questions of our age I think, is whether the Internet is helping make people more informed. And clearly we all have a lot more access to information, but is that information turning into a better informed citizenry that’s better able to make decisions about the important issues that we all face? And at this point we have a lot of data on this, and I think we can actually say, it’s not really.”
  • Eli Pariser: Comparing people’s informedness about foreign affairs before and after the Internet (from 1989 to 2007), people’s knowledge actually dropped on many metrics. Despite there being no barriers to access (it’s as easy to go to Die Zeit or Le Monde as it is to go to the New York Times), the percentage of Americans who know the name of the Russian president has dropped by ~10 percentage points.
  • Eli Pariser: This isn’t just a problem of informedness about foreign affairs; Americans are pretty poorly informed about domestic affairs as well. 44% cannot define the Bill of Rights. Americans want to cut foreign aid from the level they believe it is at (about 30% of the budget) to about 14%. The real figure is actually way less than 1%.
  • We have a real challenge with informedness, facts, truth, and a base-line shared reality on the larger social scale.
  • Robert McChesney: “We have a situation in which a significant percentage of the population doesn’t vote, doesn’t care about the issues, is tuned out entirely, is what we call de-politicised. In fact, we have a rate of de-politicisation in the United States that must make a tyrant like in, you know, Indonesia envious. They’d say, how can I get one of these vegged out populations?”
  • Juan González: Is the technoculture merely responding to the interests and needs of the people who use the system?
  • Eli Pariser: “[The technologists] say, ‘We’re just giving people what we want,’ and I say, ‘Well what do you mean by what we want?’ because I think actually all of us want a lot of different things. There’s a short-term sort of compulsive self that clicks on the celebrity gossip and the perhaps more trivial articles, and there’s a longer-term self that wants to be informed about the world and be a good citizen. Those things are in tension all the time, and the best media helps the long-term self get an ‘edge’ a little bit. It gives us some information vegetables and some information dessert, and you get a balanced information diet. This is like you’re just surrounded by empty calories, by information junk food.”

It’s full of sugar and it tastes so nice…

  • The main aspects of modern screen culture revolve around the deliberate use of psychological drivers such as ‘intermittent variable reward’ or ‘schedules of reinforcement.’ That is to say, the experience of screen culture is analogous to playing a poker machine: Sometimes you get an exciting and enticing reward, and other times you don’t. And this uncertainty is what drives people to come back to the screen and stay engaged. It creates, drives, and perpetuates an addiction.
  • The behaviourist B.F. Skinner envisaged such persuasion as the root of behaviour modification: society can be acculturated and programmed using powerful forms of persuasion based on addiction and reward. Such methods are now known in modern psychology as “applied behaviour analysis.” Skinner explores these ideas in his 1948 novel Walden Two, which, in its time, could have been considered science fiction, since science-based methods for mass-altering people’s behaviour did not yet exist.
  • The most efficient way to manipulate the masses is not physical force, but pleasure and reward. Past social systems have used punishment to keep populations in line, but pleasure and reward work much better and use far fewer resources. People not only police themselves and others for being out of line (or for impacting their perceived pleasure and reward within the system), but they also individually chase the pleasure, reward, and validation given by the social system, which creates and exacerbates powerful co-dependency and addiction.
  • Lewis Mumford: “Skinner has a very ingenious way of making a system of highly compulsive organisations seem as though it were a very humane one. This is his great and original contribution. The old-fashioned mechanical collectives, the Megamachines as I call them, were brutal: They used punishment as a way of enforcing conformity. Skinner and the psychological school that he represents have found a much better system than punishment. They first tried it out on animals and now they’re applying it to human beings: reward them. Make them do exactly what you want them to do without the whip, but with some form of sugar-coated drug or candy which will make them think that they’re actually enjoying every moment of it. This is the most dangerous of all systems of compulsion. That’s why I regard Skinner’s Walden Two as another name for Hell, and it would be a worse hell because we wouldn’t realise we were there. We would imagine we’re still in heaven.”
  • Natasha Schüll: “It’s all about: how do you make your website or your app stickier, how do you retain attention? Right now it’s measured through these little units of time and clicks, and how long do you sit there. And you think about: what is the logical sort of endpoint that they’re after here? They’re trying to turn everybody into what looks a lot like an addict.”
  • Ex-president of Facebook, Sean Parker says: “If the thought process that went into building these applications, Facebook being the first of them to really understand it, that thought process was all about: how do we consume as much of your time and conscious attention as possible? And that means that we need to sort of give you a little dopamine hit every once in a while, because someone Liked or commented on a photo or a post or whatever, and that’s going to get you to contribute more content, and that’s going to get you more Likes and comments. It’s a social validation feedback loop that… it’s like a… it’s exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. And I just… I think that we… the inventors, creators, you know, it’s me, it’s Mark, it’s Kevin Systrom at Instagram, it’s all of these people, understood this consciously and we did it anyway.”
  • Douglas Rushkoff: “The way we represent ourselves online has devolved from the quirky, personalised HTML webpage, homepage of the ‘90s, to the somewhat modular but still strange presence of a MySpace page, to the completely formatted and market-friendly presence of a Facebook page. What we’ve done is moved from personal, human, open-ended self-expression, to completely market and computer-friendly, regimented and conformist expression. And that’s because we’ve turned the Net from a venue for self-expression, to a way to render ourselves up onto the market.”
  • Katina Michael: “Screen Culture is not only addictive but obsessive-compulsive addictive. It’s a health problem and we’ve yet to even really begin to ask the questions. It’s taken us 20 years to realise that fast foods cause obesity. This is a well-known fact. Fast food advertising, even in the sports arena, causes obesity. How long is it going to take us to realise the addictive nature of smartphone usage? 5 years, 10 years? Is that going to be too late by then, because the mimicry will have been well entrenched in the next generation? What do you do about that? The thing is you’ve got to do something about it today.”
  • Susan Greenfield: “Absence of evidence isn’t evidence of absence. Witness smoking in the ’50s: that was the mantra of the tobacco companies, that there wasn’t any evidence. So that’s the first issue. Second, as a neuroscientist, I think there is evidence; certainly the brain will adapt, and therefore it’s not an unreasonable assumption that it will adapt in a way that is equipped to survive almost as a computer itself, in a two-dimensional world, where fast responses are mandated, in response to stimulation rather than in thought. Third, there is actually evidence accumulating. There’s Nicholas Carr’s The Shallows, there’s Richard Watson’s Future Minds, there’s Sherry Turkle’s Alone Together, there’s a brilliant review by Daphne Bavelier in the very high impact journal Neuron from 2010, where she suggested that there was a tendency now for an increase in distraction, violence, and addiction related to screen technologies. Now of course one swallow doesn’t make a summer, and one can go through all the literature and find fault with some of the studies and show their shortcomings, of course. But we must do this. We must have the debate.”
  • Katina Michael: “We are sleepwalking into a world that has become over-reliant on technique. Soon, we will not just be talking about the social implications of technology, but about how society has become technology. We, who created the computer, will invite it into our body to govern us, and the machine itself will rule over us. Ladies and gentlemen, I leave you with one final question: who will control this emerging new smart surveillance infrastructure? And what will be the rights of the controlled?”

The Real World

  • Derrick Jensen: “There’s these transhumanists who believe that someday humans will be incorporated into the machine, and machines and humans will, sort of, be one. And really what I have to say to them, apart from the fact that they’re completely crazy, is that they’re way too late and it’s already happened. We’re already embedded in these machines and we are enthralled to these machines. Think about it. Do you touch plastic or human flesh more often? Or think about it, how many machines do you have daily relationships with? And, on the other hand, how many wild animals do you have daily relationships with?”
  • If you’re in an echo chamber, if you’re under sensory deprivation conditions, you start to hallucinate. Most of our ideologies are hallucinations.
  • Susan Greenfield: “Increasingly the techno-haves are very, very distinct from the techno have-nots, where some people are on a dollar a day with no access to drinking water, and other people have Gameboy thumbs and Prozac and Botox and so on. And it struck me that this world, leaving aside humanitarian issues, was economically and ecologically not viable. You can’t have a divide like that.”
  • Lelia Green: “The mythology of technological change really, is that it’s beyond our control. That technology is like one of the great forces of the universe, that it will ‘progress inevitably’ and that all we can do is jump on or jump out of the way, you know, that there’s ‘no stopping technology.’ And that is a myth which is propagated to make us feel powerless, as though we have no say in the way that technology is used, because technology is an expression of the elites of the society that create it.”
  • Once we recognise that it is the three elites (the A-B-C power blocs) who are actually harnessing technological power and spreading the myth that there is nothing we can do about it, and once we see that myth for what it is, we are better able to say: Resistance is not futile.
  • Derrick Jensen: “This culture will consume the world in order to power these machines. And, you know, it doesn’t require some fiendishly clever conspiracy on the part of machines to do this. What it requires is: I love this line, unquestioned assumptions or unquestioned beliefs are the real authorities of any culture, and all it takes is an unwillingness to question the beliefs on which the system is based.”
  • Derrick Jensen: “And there’s a great line also by Upton Sinclair. It’s hard to make a man understand something when his job depends on not understanding it. And I would say, it’s hard to make a person understand something when their entitlement depends on them not understanding it, and when their addiction depends on them not understanding it. I think this is all tied to addiction too.”
  • Derrick Jensen: “The word addiction actually comes from the same root as ‘to enslave,’ because originally a judge would issue an edict causing someone to become a slave, and so they were edicted, addicted. It’s pretty clear you know when we talk about people who are heroin addicts or something it’s pretty clear that they are enslaved to the addiction. And it’s a little bit less easy to see in ourselves as we spend most of the day staring at a screen. And, it’s also a bit more difficult to see when we talk about some group of people being addicted to power over others, which is what this culture is really based on.”
  • Technology is not neutral. Technology reflects the elites, the passions, and the capacities of the people who create and then continue to use it.
  • Lelia Green: “Once you start looking at the technology plus the culture, those things together mean that the technology is constructed in a certain way, and it’s understood in a certain way, and it’s used in a certain way. And once you start putting all those things together, then it has a purpose for the people that are talking about it which is far from neutral.”
  • Lelia Green: “Technology is always harnessed to a particular end. Now sometimes that can be positive or negative, but it’s not as simplistic as saying, ‘Oh, it’s up to people how they use it,’ because people can only use it within the constraints of how it’s designed, the knowledge that they have, and the society that they’ve been socialised in. And those three things together mean that technology has an actual cultural value which is far from neutral.”
  • Lelia Green: “Until we start asking those questions, you know, what are the social costs and benefits, and who is excluded, and what is the environmental cost of all this? I mean, there is a huge amount of toxic landfill from discarded mobiles, for example; e-waste in Ghana. Until we stop looking at technology as just a sort of network of connection and instead see that it is not apart from the living world, that it has an impact upon that living world; until we see those bigger questions and technology in the broader scheme of things, rather than just as its own story, I think we’re only scratching the surface of the many ways in which we are using it, and we are using it to change ourselves and our futures.”
  • Derrick Jensen: “My solution by the way is not to say, ‘Oh, just everybody turn off their computers,’ and the reason that’s not the answer, yeah we need to turn off the computers, but that’s not the answer, and the reason that’s not the answer is because me turning off… we have to recognise at every moment that no matter how much fun I may have playing Left 4 Dead 2 on the computer, or how much fun I may have talking to you, I have to recognise that the computer’s primary purpose is, as you said earlier, where it emerged from, is making war and doing commerce. And all this other stuff is just gravy. And the global economy as it is couldn’t exist without computers and I can turn my computer off and it would not stop the destruction of the planet one little bit. What needs to happen is the entire infrastructure, the entire technics surrounding all this needs to be stopped. And it needs to be… it’s so highly addictive that I think it needs to be destroyed because I believe that people are so addicted to it that they won’t give it up.”

# # #

Some Example Questions

  1. We live in a culture where more people get their news from one social media website (usually Facebook) than any other source, including TV. Putting aside the effects of the Filter Bubble for the moment, where information streams are tailored and edited for each individual, what consequences does using only one source of news have on shaping and sustaining a certain worldview? What does this mean for society?
  2. Reconsider the above question factoring in the effects of the Filter Bubble, where information experiences are manipulated and customised for each individual. Does this widespread effect on the way we each perceive and understand the world change your answer? If so, how? Why do you answer the way you do?
  3. Computer games are consumed more than TV, radio, music and movies combined. How does this shape one’s perception and experience of the world?
  4. How many machines are within 20 metres of you? With how many machines do you have a daily relationship? How many machines do you see right now? How many machines do you see daily? How many wild animals are within 20 metres of you? With how many wild animals do you have a daily relationship? How many wild animals do you see right now? How many wild animals do you see daily?
  5. Do you touch plastic or human flesh more often?
  6. Has technology done more harm or good for human life? For life in general? And what measures do you use to determine your answer?
  7. Do you believe that high technology is neutral (do you believe any technology is neutral)? Why do you believe as you do?
  8. Do you believe that the primary function of technology in this culture (in any culture?) is to leverage power? Why or why not?
  9. Do governments better serve corporations or living human beings? What are the implications of your answer?
  10. If you had to make a choice, would you rather the world have ice caps and polar bears, or screen culture and more broadly the industrial oil-based economy?
  11. Look around the room. Do you know how to make any of the objects you see? Do you know who made them? Did you make any of them?
  12. If you’re in front of a computer, could you make your computer from scratch? What does this mean? How are we beholden to these objects?
  13. Do you get enough sleep? Why or why not?
  14. Do you get to dream enough? Why or why not?
  15. How often do you hear silence? For how long at one stretch? Is that not enough, enough, or too much?
  16. When was the last time you spent all day and all night without a watch or clock, not knowing (or caring) what time the clock said it was?
  17. Activist and author Claude Alvares wrote, “Science and technology constitute two major oppressions of our time. Yet, if one goes by the literature, not only are science and technology seen as liberators (either from superstition, fear or material deprivation and want), those who control and direct them (technocrats, industrialists, statists) are seen as liberators too.” What do you think about this? Is this a problem? Why or why not?
  18. The policy of the Catholic Church has been and continues to be “Nulla salus extra ecclesiam,” which means “Outside the Church there is no salvation.” What do you think of this? What are the implications? What do you think about the following statements? “Outside Science there is no knowledge,” or “Outside Technology there is no comfort,” or “Outside Capitalism there are no economic transactions,” or “Outside Industrial Civilisation there is no humanity,” or “Outside the Panopticon there is no security.” For each, do you agree or disagree? What are the implications of each?
  19. Is the technoculture like a “cult,” as in, if you’re not part of it, you “just don’t get it”? Why or why not?
  20. Do you have a mobile phone? If yes, how would your life be different without it? If no, how would your life be different with it? What are the forces that encouraged (or would encourage) you to get a phone and be part of the culture?
  21. Philosopher and activist Jerry Mander wrote, “As recently as two decades ago, it was possible to speak about different parts of the planet as distinct places, separate from one another, with distinct cultures, living habits, conceptual frameworks, behaviors and power arrangements, and it was possible to speak of distinctly different geographies as well.” What do you think about the homogenisation of culture brought about by technology and corporate culture? Is it good or bad?
  22. Does technology exacerbate emotional numbing? If so, how have you noticed this in your own life? Why do you answer the way you do?
  23. Has technology done more harm or good for human life? For non-human life? For life in general? What measures do you use to determine your answer?
  24. Consider the following: “A byproduct of aggression is paranoia, because you fear that others are as aggressive as you are, which leads to an obsession with control (power over others) and security (protection of self).” Do you agree or disagree? Why do you answer as you do? Do you consider yourself aggressive? Is this good or bad? Do you consider this culture to be aggressive and paranoid? Is this good or bad?
  25. Sociologist Max Weber wrote, “Rational calculation reduces every worker to a cog in this bureaucratic machine and, seeing himself in this light, he will merely ask how to transform himself into a somewhat bigger cog.” Does this sound familiar, or not? Do you wish to be a bigger cog? Is that a good thing?
  26. Max Weber: “From a purely technical point of view, a bureaucracy is capable of attaining the highest degree of efficiency, and is in this sense formally the most rational known means of exercising authority over human beings.” Is this a good thing? Why or why not?
  27. Max Weber: “It is horrible to think that the world could one day be filled with nothing but those little cogs, little men clinging to little jobs and striving toward bigger ones. This passion for bureaucracy is enough to drive one to despair. It is as if in politics we were to deliberately become men who need ‘order’ and nothing but order, become nervous and cowardly if for one moment this order wavers, and helpless if they are torn away from their total incorporation in it. That the world should know no men but these: it is in such an evolution that we are already caught up, and the great question is, therefore, not how we can promote and hasten it, but what can we oppose to this machinery in order to keep a portion of mankind free from this parceling-out of the soul, from this supreme mastery of the bureaucratic way of life.” How can we stop these bureaucracies?
  28. Would you agree that the technologies this culture creates mirror our collective psyche? If so, can you name some technologies and say what they teach us about our cultural psyche?
  29. A single nuclear weapon can kill thousands of people in an instant and irradiate the world for thousands of years. What kind of cultural psyche would create such a technology?
  30. Nuclear weapons lead to an international arms race. The same can be said of surveillance technologies or computers in general. What other examples can you find of technologies which emerge from and lead to escalation?
  31. Would you agree or disagree that we have for the most part surrendered control over our lives—and over our survival, and the survival of most of the planet—to the machines created by a mechanistic way of seeing the world? Why do you answer as you do?
  32. Is “technological progress” a good thing? Why or why not?
  33. Sociologist George Ritzer said, “No characteristic of rationalization is more inimical to enchantment than predictability. All of these enchanted experiences of magic, fantasy, or dream are almost by definition unpredictable. As for the other characteristics of rationalized systems, control and nonhuman technologies are absolutely inimical to any feeling of enchantment. Fantasy, dreams, and so on cannot be subjected to external controls; indeed, autonomy is much of what gives them their enchanted quality.” Do you agree or disagree? Why or why not? In either case, what does this mean for your own life, and how does it affect you?
  34. How do machines and a technoculture make the world more predictable and controlled rather than enchanting and free?
  35. What, if anything, is wrong with a utilitarian worldview (that is, perceiving others through the lens of their usefulness to you)?
  36. Consider the following passage: “The assembly line mass murder of the Holocaust is production stripped of the veneer of economics. It is the very essence of production. It took the living and converted them to the dead. That’s what this culture does. It was efficient, it was calculable, it was predictable, and it was controlled through nonhuman technologies.” Do you agree or disagree? Why or why not? What are the implications of this?
  37. The Nazis used a Hollerith machine, a punch-card tabulating machine supplied by IBM, to tabulate their datasets. Today’s pervasive technoculture of surveillance offers those in power a vastly larger resource by comparison. Transposing the history of the Nazis and their use of personal data to the present, how could today’s technoculture help a similar regime be more terrifying? How could today’s technoculture help a similar regime emerge?
  38. What level of repression do you think Hitler could have achieved with modern surveillance and propaganda technologies? What if Hitler had had a modern nuclear arsenal? Access to modern biological warfare stocks?
  39. Philosopher Stanley Aronowitz wrote, “The point of science [and digital technologies]—and this may or may not be true of individual scientists [or technologists]—is to make the world subject to human domination. If they can abstract, and then they can predict on the basis of that abstraction, then they can try, at both the human and natural levels, to use that prediction in order to exert control.” What do you think about this? Do you agree or disagree? If you agree, do you think this is acceptable? If not, what are you going to do about it?
  40. Consider the following description of the values and consequences of modern society: “The global economy trumps local subsistence ecology. Science trumps place- and myth-based culture and spirituality. Money trumps common sense as well as ethics. Science pushes us to try to know everything. The earth is losing its wild places and wild people. Technological research is driven by the military/security apparatus, and public universities are dominated by military-related funding. National security is trumpeted as a criterion for relating to the peoples of the world. Communities have no say in what factories will be built in their towns, or what chemicals will be dumped into their drinking water. This isn’t science fiction. This is your world.” Do you agree or disagree? Why or why not? What are the implications of all of this?
  41. American broadcast journalist Edward R. Murrow claimed in 1957 that “if television and radio are to be used to entertain all of the people all of the time, then we have come perilously close to discovering the real opiate of the people.” Do you agree? Does this claim apply to screen culture today? Why or why not?
  42. Why are machines more efficient than living beings? After you answer, consider the following, and answer again: “Why are machines more efficient than living beings? Because machines do not give back. All living beings understand that they must give back to their surroundings as much as they take. If they do not, they will destroy their surroundings. By definition, machines—and people and cultures that have turned themselves into machines—do not give back. They use. And they use up. This gives them short-term advantages in power over, the ability to determine outcomes. They outcompete. They overwhelm. They destroy.” Do you agree or disagree? Why do you answer as you do?
  43. Consider the following: “It is more difficult by far to control diverse beings than it is to control objects that are all alike. If control is important to you, diversity must be destroyed. All cultures serving gods besides production—death—must be destroyed. All languages that do not serve this end must be forgotten. All creatures we can’t use must be eliminated. All people must be standardised as well. (What do you think schooling is for?) One religion. One way of knowing the world. One economic system. One way of living on the land. If this language seems too strong to you, look around and ask what is happening to cultural diversity, to diversity of languages, to biodiversity, to all forms of diversity. They’re disappearing.” Do you agree or disagree? Why do you answer as you do?
  44. What is more important to you than money? What is less important to you than money? If your answer were determined not by your words but by your actions, would your answer seem different?
  45. What would happen if we rejected the myth of the machine, and the machine itself?
  46. What is the relationship, if any, between personal lifestyle choices and social change? In other words, do you believe that changing your lifestyle leads to significant changes in the larger culture?
  47. At what point does technology reach your ‘creepiness’ threshold? When your TV spies on you just like Orwell’s telescreens? When Amazon Echo or Google Home has microphones throughout your house, listening in to and recording your private conversations at every moment? When Google drives cars down the street photographing you and your house and uploads the pictures to the Internet without asking you first? When computers can read your own thoughts?
  48. What is your threshold at which you’ll take a stand?
  49. What is happiness?
  50. How do you want to live? What are you going to do to achieve this?

# # #

Some example questions above are based on, and/or inspired by content from Derrick Jensen’s Endgame Questions.