Scott Rettberg and Mathias Klang discuss the panopticon of surveillance and our problematic relationship with the devices that make our lives easier — while also eroding our privacy.
Scott Rettberg: I’m here with Mathias Klang, who's a Professor of Media Studies at Fordham University in New York and has been a Fulbright Scholar here at the University of Bergen during the fall 2023 semester. Welcome Mathias.
Mathias Klang: Thank you. It’s great to be here.
Scott Rettberg: On the last episode of Off Center, I spoke with Jill Walker Rettberg about her book, Machine Vision, and we had an interesting conversation about surveillance, which is also going to be the topic today.
Mathias Klang: Yeah, it’s a great topic to have.
Scott Rettberg: So, we’re talking about your project, Surveillance Microcosms, which is a book project you’ve been working on for a while.
Mathias Klang: That’s right. I’ve been kicking around the idea for a long time. I thought first, like you do, it’s a small article, and then I realized, no, it’s grown, and then coincided nicely with the Fulbright. So, I thought, this is amazing. I’ll have time to start writing it, planning it out and developing it. I’ve done some really good work.
Scott Rettberg: Excellent. When do you think the book’s going to be published?
Mathias Klang: I hope by the end of this year.
Scott Rettberg: Great. And your background, Mathias, you started out, not as an academic.
Mathias Klang: No. I started out as a lawyer. I trained in law and then sort of realized that it was interesting, but not challenging in the way I wanted it to be. So, I found through contacts that there was an opening at the informatics department. I started out in informatics, and my PhD is in informatics, but I’ve moved from department to department. After informatics I went to library and information science, because they were doing some really cool stuff with social media and activism, and then over to media studies, which is where I am now.
Scott Rettberg: Excellent. And I imagine that legal background gives you a different perspective on the topic of surveillance.
Mathias Klang: It does, and it’s two parts. One, of course, it gives me a great structure to how to plan, how to think about surveillance and rights; and the other part, of course, is trying to overcome the purely legalistic ideas that are in my way of thinking, I need to think more broadly about it.
Scott Rettberg: All right. So, I know you start out looking at surveillance from a broad historical perspective. Going back to —
Mathias Klang: Jeremy Bentham.
Scott Rettberg: Yeah, the panopticon.
Mathias Klang: Exactly. It’s a well-trodden path, and it needs to be talked about, but it’s always a challenge. How do you repeat the same story over and over again for the reader who maybe doesn’t know enough about it? What I talk about is the ways in which Jeremy Bentham designs the idea of the prison while visiting his brother in Russia, actually in Belarus, and how this sort of forms the early ideas about surveillance and control through technology that we still have today.
Scott Rettberg: And that’s something that Foucault developed, with the idea of the panopticon, the surveillance society. But you then move to a different kind of surveillance that we see being very prevalent today.
Mathias Klang: Yes. So, the historical part starts from Jeremy Bentham’s panopticon, but then it also brings in the legal ideas, when we start talking in law about privacy and surveillance. That starts with the article “The Right to Privacy” from 1890, written by Warren and Brandeis. It shows that we were starting to think about the fears of a new technology: basically, the small, portable, compact camera, the Kodak, which was seen as a great threat by a group of people. It doesn’t really develop further then, and we have an interesting recurrence of this idea of surveillance and cameras going back and forth. We had the discussion again when someone said, “let’s get a cell phone and put a camera on it.”
Scott Rettberg: Right.
Mathias Klang: We had the same discussion again. But it was a very interesting idea, where two lawyers said there should be, somewhere in the law, something that can be understood as a right to privacy, and we just need to reveal it somehow. It doesn’t actually happen then, but it’s one of the earliest discussions. The next big jumping-off point, if you like, is of course George Orwell, who comes along and gives us a metaphor for everything we’re worried about. You know —
Scott Rettberg: Big Brother.
Mathias Klang: Yeah, Big Brother watching you. And that one’s very strong. It’s interesting, in my lifetime it’s had a bit of a decline. It had a nice peak in 1989, but then it declined again; we’re not as worried about Big Brother.
Scott Rettberg: It’s funny, I just read an article about some conservative Republicans in the U.S. wanting to ban Orwell.
Mathias Klang: To ban Orwell!
Scott Rettberg: Yeah. Yeah.
Mathias Klang: Oh, that will solve everything.
Scott Rettberg: Kind of scary. They don’t like the idea of people thinking about double-speak.
Mathias Klang: Right.
Scott Rettberg: If they think too hard about it, they might, you know, recognize something. But yeah. So, we see the shift from the panopticon where someone in the center can see everything, to the personal.
Mathias Klang: Yes. And this is where Foucault comes in and brings back the idea of the panopticon, making sure we all understand it’s a metaphor and not a literal prison, which my students still struggle with; and explains to us that we live in a society where everyone is looking at everyone else. It’s not just the guards looking at the prisoners; the prisoners are looking at the guards, if you want. And we’re all under this level of control, and therefore we should not be worried about “the looking”, we should be more worried about the power involved. What power do you have when you look at me? This fundamentally explodes the idea of surveillance, and structuralism and a whole bunch of other interesting discussions come along. And that was a large part of the discussion of surveillance until, really, the technology of surveillance went from being owned by either the state or big corporations into everyone’s hands: when we meet social media, or your phone, or all these devices for tracking, for recording, for uploading.
Scott Rettberg: And that’s sort of the surveilled participating in their own surveillance.
Mathias Klang: Exactly. And we see a lot of interesting ideas come about that, where we talk about, it’s not just surveillance. If I put a webcam in my house and broadcast myself to the world, it is surveillance, but it’s not surveillance as we know it. I get pleasure out of my ideas of surveillance.
Scott Rettberg: Right.
Mathias Klang: So that is where we were in the middle of the early 2000s and social media. We had a very important work by Shoshana Zuboff which talks about —
Scott Rettberg: Surveillance Capitalism.
Mathias Klang: Right. That’s the one. Where she comes and talks about surveillance capitalism, where she is focusing on the ways in which all these devices are collecting our data, analyzing our data, creating data doubles, and making future predictions of where we are going to be, what we’re going to do, and the idea is, of course, to insert advertising just before we want something or need something and sell stuff to us.
Scott Rettberg: Or if we’re about to vote.
Mathias Klang: Or if we’re about to vote. Exactly. And there’s an interesting mix of questions here: whether this is actually as efficient as they say it is, how is it working? But it is there, and we know it’s there, and it’s interesting. That’s how we got to the point where we are now. What I’m interested in is what happens with all these devices from the perspective of ourselves. So how do I change my behavior? How does my partner change her behavior? How do my friends change their behavior when we have a wide array of surveillance devices at our disposal? I mean, very sophisticated surveillance infrastructures. That’s the starting point of the book, and this is why I call it a microcosm. The idea of the microcosm is, of course, that it’s not the outside looking in, but everyone looking at everyone. And I’m looking at my little network of people in different ways.
Scott Rettberg: It’s also these little devices too, right?
Mathias Klang: Oh yeah. And the devices are partially created for surveillance, and partially just created because we want to monitor something or do something, and we can use them for surveillance. So, if you have a device that is air monitoring your home, it has the side effect of being able to measure how many people are in your home. If you have a camera outside your house to increase security to see if someone’s stealing your Amazon deliveries, it has a side effect of also checking up on your partner or your kids when they leave the house.
Scott Rettberg: I wonder if anybody does Ring family albums. Gets around that Christmas card issue.
Mathias Klang: Right. Like Spotify Wrapped, but it’ll be a Ring Wrapped. “You left the house this many times.” I try to do each case study as a chapter. I try to look at each one almost as a phase of life, but also bring a higher, theoretical perspective to each. The first one I look at is the way in which we monitor and surveil babies, even in utero: how we use surveillance on the unborn and the just born. There’s a lot of lovely research about this.
Scott Rettberg: That goes back quite a bit, right? That the baby cam idea or the device to listen in on your baby at night to check on them sleeping.
Mathias Klang: Yes. The first baby monitor was developed in the 1930s. If I remember correctly it was Zenith who developed it, and it’s a beautiful design. Its advertising speaks a lot to how you can bring the baby anywhere you are in the house or in the space. So there’s this idea that you can extend the parental gaze, or ear in this case, without much effort. It comes about in a very interesting way: it’s a reaction to the kidnapping of the Lindbergh baby. It replaces trust in your servants with trust in technology. We’ve come a long way since that simple listening device. We’ve gone through nanny cams, and now we have devices that we strap onto our children. We measure their vital signs. We have cameras following them when they move, and all these kinds of questions.
Scott Rettberg: That way you don’t miss the first steps, right?
Mathias Klang: You don’t miss the first steps, but also the data that’s gathered will support the parent in making the correct decisions, which is also an interesting question.
Scott Rettberg: Well, and of course, there are — we always think of, slightly paranoid or malevolent uses of these things, but also, there’s sudden infant death syndrome, right? So, I know a lot of parents worry about that when they have infants, right?
Mathias Klang: Right. And that’s the interesting part about a lot of this technology: it’s sold to you with, “oh, the problem is, for example, SIDS, and therefore you need to monitor. We have the monitoring solution.” It can’t guarantee to prevent the problem, and yet, somehow, it sells the technology as if it were the solution. A lot of the devices we’re talking about here will very explicitly say they are not surveillance devices, they are not medical devices, they are just devices for comfort. And yet when you look at the way they are marketed, they are marketed 100% as surveillance devices, as medical devices.
Scott Rettberg: Just benign —
Mathias Klang: Yeah.
Scott Rettberg: Benign surveillance. And of course, there is this desire, I think, for people like parents or schools to have this caring extension of surveillance, or, as teenagers break away from you —
Mathias Klang: Exactly.
Scott Rettberg: Well, maybe it’d be nice if I had just a little bit of tracking on your cell phone to see who you’re with and where you’re at.
Mathias Klang: I find it really interesting because, if you talk to people who aren’t parents, there’s a very strong, almost visceral reaction: “oh, I wouldn’t track my children.” But much like when you’re presented with a newborn and someone says, “aren’t you worried about SIDS?”, you go, “well, I have to worry about it now.” You just said the words. When you don’t have a teenager, you think, “life can’t be that difficult. I’ll just talk to the teenager.” And then when you have a teenager and you’re stressed and have all the life realities around you, it’s very easy to slip into “well, there are a lot of good devices to surveil your teenager or to keep your teenager under control.” I’ve looked a lot into that. The case study is all about how we end up with managerial parenting: the technology supports not a dialogue but a managerial approach. The panopticon approach to parenting.
Scott Rettberg: And how do you see people resisting, for example, teenagers, resisting these technologies —
Mathias Klang: Yeah. Teenagerdom, I guess, is a fantastic area for looking at resistance, and resistance to technology. Often the household has embraced levels of surveillance technology that might not be there to study the teenager per se; it could be a camera to check the front door, you know, a Ring camera. But there are a lot of really good narratives about how teenagers have to deal with the technology: if there are air quality sensors in the house, how do they smoke? If there are cameras, how do they escape? And of course, if you give them cell phones with monitoring devices on them, how do they evade that? The teenage case study is a lot about resistance to technology and this almost arms race between the parents and the young adults.
Scott Rettberg: Yeah, that’s really interesting. Of course, it’s what teenagers do, resist what their parents try to inflict upon them. What about this idea of autoveillance?
Mathias Klang: Yeah. The next part is how we look at ourselves. Surveillance, sousveillance, autoveillance, looking at myself and it’s about how we create our self-image. We have step counters, we have calorie counters, we have all these counters around us. We have devices that tell us when we should stand up or sit down. If you have a smartwatch, it will say you’ve been sitting for too long and stuff like that.
Scott Rettberg: I always try to get my 10,000 steps.
Mathias Klang: There you go, 10,000 steps. Which on its own is fun: 10,000 is an arbitrarily chosen number that caught on because the Japanese character for 10,000 looks like a walking man. That’s where it comes from. You think, “oh, wait, this is not medical.” If you can do it, you should get your 10,000 steps, absolutely. But if you can’t, then you should get 3,000 steps, and if you can’t do that, maybe 200 steps. The interesting thing is that 10,000 steps becomes how I relate to my technology and how my technology tells me whether I’m doing well or not: I’m failing at being me because my technology says, “no, you didn’t get your 10,000 steps.”
Scott Rettberg: So, it’s almost like a moral check.
Mathias Klang: It is a moral check. It’s how I understand myself. So, New Year. A lot of people set up really interesting goals about how they’re going to live their lives, and come February, a lot of them are going to feel like they’ve failed, and the technology is going to tell them, “You failed.” We don’t design technology with a set of principles about how to help people succeed. We design it with a very “this is what you should be doing, and oh, look, you failed.” It’s not the technology that’s failing; it’s we who are failing. An easy example: 10,000 steps is a wonderful goal, but any personal trainer will tell you that you have to have a rest day.
Scott Rettberg: Yeah.
Mathias Klang: And there are no rest days in these devices. It’s like, what have you done for me lately?
Scott Rettberg: You talked a little bit about how we look at ourselves in comparison to others. So, this becomes a collective sousveillance, but where we’re sort of sharing our data not only because we have these entities that want to scrape it for advertisement, but because we like looking at how we are in relation to others.
Mathias Klang: Right. So, we move from how I look at myself to how I share that data with other people. There’s a lot of really good research here as well, on the idea of lateral surveillance: staring at our peers and trying to figure out who we are. But what I find interesting here is the motivational factor that comes from looking at others. The leaderboard is a classic idea. If you’re a runner, you’ll have a running app, you’ll put your data up there, you’ll see the leaderboard, and you’ll say, “I’m doing well, I’m not doing well.” And again, there’s nothing inherently wrong with technology that says “this is how you’re doing; this is how your friends are doing.” The question is what we’re trying to achieve, because the same technology is used, for example, by someone with an eating disorder, who is intentionally trying to reduce, and measure the reduction of, calories to a point that’s actually harmful.
Scott Rettberg: Yeah.
Mathias Klang: And they’re then looking at other people around them and being told, “I lost this much more, or I lost that much more.” So again, looking at ourselves creates our self-image, and this generates the idea of what I should be doing. It can increase your ability or desire to go on that run you’re supposed to go on, but it can also increase the desire and pressure to lose too much, to a point that’s actually harmful.
Scott Rettberg: So, it’s ratifying these societal biases as well, right?
Mathias Klang: Absolutely. And then, of course, there’s the whole element of the quantified self that comes in here and how we’re like, “oh, look, I’ve done this, I’ve set up this weird goal, and now I will achieve the goal by pushing myself even further than I’m supposed to.” It’s not the technology or the desire for goals themselves, it’s that it can be used for either thing, and we tend to focus on, “oh yes, this is just a community of friends doing good stuff.” And I’m like, it can be, but it can also be a community of friends doing less good stuff.
Scott Rettberg: Yeah. You go to the dark side.
Mathias Klang: Yes.
Scott Rettberg: Well, and speaking of the dark side, and this is something I talked with Jill a little bit about, when she was talking about the example of Ring cameras in Oak Park. You’re going to have a chapter on stranger danger.
Mathias Klang: Yeah. There are two chapters. One is about how we keep our homes safe, and one is how we keep ourselves safe. The home is an easy one. We talked about the Ring camera that looks at the people coming towards my house; it records them automatically, it can save automatically. The Ring camera has audio capabilities; I can record people talking across the road. It’s become the largest corporately owned, privately installed surveillance network anywhere. It also has a whole bunch of other factors: it collaborates with local law enforcement, which has access to this data, so it becomes problematic on that level. Again, my focus is what it does to us and what we see happening. These kinds of cameras and other devices come together with apps like Citizen, and even a bit of WhatsApp and Facebook, where we take snippets of these videos and say, “I saw this person walking down the road — suspiciously walking down the road.” We have a problem here. We’re creating gated communities that are not visibly gated, but definitely are unfriendly, and usually against people of color. The whole problem here is that these devices, together with your own paranoia and the platforms for sharing, create a very hostile environment.
Scott Rettberg: And so, it amplifies your biases and amplifies your fears.
Mathias Klang: Exactly. And that’s the interesting thing about a lot of these so-called home security devices: they create this atmosphere of “oh no, you should be a responsible homeowner, you should have some security for your home,” but they don’t actually do much of that. What they do is make you paranoid.
Scott Rettberg: Right.
Mathias Klang: If you check your device, it tells you the camera was alerted at 2 a.m. Who the hell is walking outside my house at 2 a.m.?
Scott Rettberg: Somebody walking outside.
Mathias Klang: Yeah, it’s a road, it’s a public space. They just walk past. They were probably going home.
Scott Rettberg: Of course, if they move to the window on the side of the house.
Mathias Klang: Oh yeah. But then the question really is: if nothing happened, is it better that you knew that someone looked in your window? Do you feel better or worse? I’m not talking about the security perspective; from the paranoia perspective it’s like, “wow, should I even be living here? Is someone creeping in?” So there’s an element —
Scott Rettberg: This is bringing back teenage memories for me when I locked myself out and tried to get in the side window of the house, and all of a sudden there were police behind me.
Mathias Klang: Oh my God. Yeah.
Scott Rettberg: What about, this is a side question, but I know you use Facebook a lot. I use Facebook a fair amount, and there is this willing participation or maybe almost an addiction, right, to sharing. “Hey, here’s where we are, here’s what we’re doing.”
Mathias Klang: Yeah.
Scott Rettberg: “Look at this beautiful space” or even, I’ve had friends who’ve been, say, going through a medical emergency or really depressed in some way, using those media to say, “hey, look at me”, or “I’m in trouble.”
Mathias Klang: Yeah. This is the problem, and the fascinating part: if used in the right way, of course, “look at me, I’m having problems” can gain you a lot of support. On the other hand, it could turn into just wallowing in someone else’s tragedy, where we take tragedy and turn it into some sort of clout or clicks or something. That is definitely part of the problem. One of the other chapters is about my own security versus my own sharing. Should we be sharing? How much should we be sharing? Do we want to keep sharing? What we’re seeing, even from the discussion about housing and homes, is the way in which it’s become more and more my own responsibility. My personal security, my mental health, my physical health are all becoming my responsibility, as opposed to asking how we can fix society’s ills. Now we have, “look, there’s technology that will fix your problem, and if you’re not using that technology, you failed.”
Scott Rettberg: Yeah.
Mathias Klang: You didn’t fail if nothing happens. But if, for example, you get into an accident and you didn’t use this technology that could have helped you, then obviously it’s your fault, as opposed to there being a whole level of protection around you.
Scott Rettberg: Yeah, I guess even auto insurance companies have asked you to install these devices, and you get a break on your insurance —
Mathias Klang: Exactly.
Scott Rettberg: As long as you agree that they surveil your speed continuously.
Mathias Klang: Exactly. And then this can go to the extremely bizarre. We have a lot of devices that are intended for personal safety. Usually they’re linked either to loud noises, like a personal alarm with a siren, or to GPS devices connected to organizations or people who will be able to save you or come and help you. This stems from a basic practice: if you talk to people, especially women, who are walking home at night or getting into an Uber, they’ll have their phone on speed dial, or even have a connection open with someone. It’s really unclear what these people can actually do, but of course it increases the feeling of security. This then evolves, as technology does, and we see the rise of ever stranger toys or tools or gadgets; there’s now even a GPS-enabled pepper spray.
Scott Rettberg: If you trigger it, it sends your location off?
Mathias Klang: Yeah.
Scott Rettberg: It’d be interesting if your location was in a particular space, it just triggered.
Mathias Klang: Right? But it’s also interesting because who do you send this to? If you are connected to a service and you pay for it, you can have a security firm. But do you give it to your friends? Do you give it to your parents? Can you imagine being a 22 year old girl and your dad gets this alarm to the pepper spray? Can you imagine being the dad if the pepper spray is being alerted.
Scott Rettberg: Yeah, that means something went wrong.
Mathias Klang: And of course, in an American context, is he grabbing his gun and getting into his truck to go find her?
Scott Rettberg: What about location sharing? A lot of people might do this, I don’t know. Do you do this with your partner?
Mathias Klang: Yeah, I’ve done some of it. Usually, we just forget to turn it on or forget to turn it off. It’s usually there running in the background.
Scott Rettberg: It’s useful if you’re in some foreign city.
Mathias Klang: Yeah, it is. The technology itself has a use, but it can easily slip into a more paranoid space. And I have a chapter on the intimate surveillance between partners, and that’s partially the sharing of location, or even having cameras in the house if you’re living apart, but also the depths of the dark side of one partner tracking another partner. You know —
Scott Rettberg: Stalking.
Mathias Klang: Stalking. Exactly. So, these things do have two different levels of, where are we, how do we know what’s good and not good?
Scott Rettberg: And it used to be, give someone your house keys when you’re dating.
Mathias Klang: Right.
Scott Rettberg: Right. Now it’s, “here’s my location continuously.”
Mathias Klang: Here’s my location. Here’s the code to the Wi-Fi. Here are the cameras. I haven’t looked enough into that part, but I’m really curious to talk to some people: how much do you reset when you and your partner break up? Suppose you have a house filled with cameras and sensors and the two partners break up. I don’t know. That’s a lot of resetting to do to make sure your ex doesn’t have access to the cameras.
Scott Rettberg: It’s like getting a new phone. It’s like, yeah, getting a new boyfriend or girlfriend is as much of a hassle as getting a new phone.
Mathias Klang: If you look at that, Netflix is doing this for us now. But if you look at your old Netflix account, “who’s that on my Netflix account?”, it turns out it’s a friend of a friend you shared it with way back. Or there’s a really interesting and creepy case in Australia where a man was stalking his ex-partner via her car, because he still had access to the car’s location, and stuff like that. And there’s a whole array of devices, like Tiles and tracking devices and GPS devices, that are all available.
Scott Rettberg: We’ve got to wrap up pretty soon. But one question I wanted to ask, coming back to your perspective as a lawyer: do you think there has been a legal erosion of privacy, in addition to a change in the way we feel about privacy?
Mathias Klang: Well, the law has been unevenly used in different spaces. The European Union has beefed up its legislation and is trying to beef it up further; it’s coming out now with more legislation on AI, which is going to be part of the solution. But, and this is why I find this so particularly interesting, with most of these uses of these devices we’re not worried about being protected from the tech organizations or from the state. It’s one to one, person to person. A lot of this is already done freely, and we’re sharing our stuff because we want to. With children, of course, it’s paternalistic and we don’t give them a choice. So the question becomes: wait a minute, this is permitted surveillance, allowed surveillance, but where is the power balance then?
Scott Rettberg: Excellent. Any parting thoughts on surveillance before we go? Are things getting better, or are they getting worse?
Mathias Klang: Oh, no. We’re going to see a lot worse before anything changes there. We’re so fascinated and enamored by our devices that we’re going to keep moving in that direction. I’m looking more into the question of where the limits of surveillance are: as I sometimes put it, is all looking surveillance, and is there a difference between looking and surveillance? But then the problem becomes, of course, if everything is surveillance, then what is surveillance?
Scott Rettberg: Yeah. Well, thanks so much, Mathias, and thank you for being here in Bergen, with us for the last six months.
Mathias Klang: Oh, well, thank you for having me. It’s been great being in Bergen. It’s been great being with the department and you and the CDN.
Scott Rettberg: And we’ll look forward to seeing your book eventually. The book’s title is going to be Surveillance Microcosms.
Mathias Klang: I hope so, yes. The book should come out soon, and I hope the title will be Surveillance Microcosms. There usually ends up being a negotiation at the end there.
Scott Rettberg: Yeah. With the publisher.
Mathias Klang: Yeah, right. They want —
Scott Rettberg: Whatever is going to sell.
Mathias Klang: Right. They want something sexy or whatever.
Scott Rettberg: All right. Thanks so much.
Mathias Klang: Thank you.
Scott Rettberg: Cheers. Bye-bye.
Listen to the full episode of Off Center.
References:
Bentham, Jeremy. The Panopticon Writings. London; New York: Verso, 1995.
Foucault, Michel. Discipline and Punish: The Birth of the Prison. New York: Pantheon Books, 1977.
Warren, Samuel D., and Louis D. Brandeis. “The Right to Privacy.” Harvard Law Review 4, no. 5 (1890): 193–220.
Orwell, George. Nineteen Eighty-Four. Penguin Clothbound Classics. London: Penguin Classics, 2021.
Zuboff, Shoshana, and Karin Schwandt. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. London: Profile Books, 2019.
This research is partially supported by the Research Council of Norway Centers of Excellence program, project number 332643, Center for Digital Narrative and project number 335129, Extending Digital Narrative.