Scott Rettberg, director of the Center for Digital Narrative, is joined by journalist and Digital Culture graduate Ashleigh Steele to talk about memes, post-truth, and the way narrativity shapes our understanding of ourselves and our world. We are increasingly affected by algorithms, AI, and conspiracy theories, but what kind of effect does this have on our discourse, and how do we fight back?
SR: Welcome to Off Center, the podcast about digital narrative and algorithmic narrativity. My name is Scott Rettberg, and I'm the director of the Center for Digital Narrative at the University of Bergen. In this podcast, I'll have conversations with the researchers at the center, as well as other experts in the field, to discuss topics revolving around digital storytelling and its impact on contemporary culture. Today I'm here with Ashleigh Steele, a recent graduate of our master's program in Digital Culture, and we're going to talk about Ashleigh's master's thesis on meme culture and its connection to the January 6 uprising in the United States.
SR: All right, let's dive right into it. Welcome to another episode of Off Center. I'm really glad today to be here with Ashleigh Steele.
AS: Hello.
SR: And congratulations, Ashleigh.
AS: Thank you very much.
SR: You just completed your master's thesis.
AS: I did.
SR: Although you're sort of a nontraditional master's student with a very interesting background. You're a journalist.
AS: That's right, yes. I'm a mature age master's student in Norway.
SR: Although you still seem quite young to me.
AS: Young enough, I guess, to get away with it.
SR: You've worked now as a journalist for how many years?
AS: Actually, if I really count back, it was since 2012, so eleven years now. That's when I finished my first degree in media and communications and went straight into TV, because at that time the traditional pathway was that you'd go first into print, then radio, then television. So I went straight into the TV world, and I've kind of been there since. It's a very interesting and dynamic career. But as you'll know from my thesis, it has raised a lot of questions for me, seeing how technology is changing how we live, how we consume the news, and what we trust—
SR: —in a very sort of troubled period of history.
AS: That's right, yeah.
SR: And you've worked for Al Jazeera?
AS: Yes. Al Jazeera, CNN, also Sky News, so a lot of the international broadcasters.
SR: So I was really surprised when I got your application to do your thesis in Digital Culture here in Norway. In the middle of the pandemic.
AS: I bet. Yeah, I was surprised too, to be quite honest.
SR: And you've been flying in from London and kind of splitting your time between Bergen and London the past couple of years while you've kept working as a journalist, and you were driven by a very personal and professional reason to write this thesis. And this is sort of the reason why you decided to do a graduate degree. Can you say a little bit about the problems that drove you here?
AS: Sure. When I first started in news, Facebook existed, I suppose, but it was around the time we first started encouraging viewers to give their views on the news, and we were broadcasting people's thoughts and people were tweeting. And I suppose in my career since then, I've really noticed how that shift to user-generated material, and also opinionated news, if I can call it that, has dominated and overtaken the more objective, quality journalism that I went into the industry to do. Because of the pandemic, I started asking myself some questions. I wanted to do some further study. I wanted to be more up to date with how technology was changing how news operates. It's quite coincidental how the timing worked out. I was in the quarantine hotel waiting to start this program. I specifically remember being there on January 6. I'd been out for a walk in the snow, and I was watching the events in Washington happen live on BBC News, and I was so shocked that this was happening in, I guess, an advanced democracy like the US. And I was so intrigued as well that a lot of this was happening on social media platforms, and some social media platforms that I'd never heard of before, these free speech sites like Parler and Gab.
AS: So that really sparked my interest, I guess, of what I could study in this program and learning a bit more about those technologies and what had driven this major event, because clearly there was some planning that had happened prior to this.
SR: For me, it was a major shock, as a US citizen, suddenly seeing people rush from a Donald Trump rally, essentially to "Stop the Steal," and then actually attacking the Capitol while calling for hanging the Vice President of the United States.
AS: It's amazing. I included an image in my thesis: gallows had been erected outside the building, and it's phenomenal. And I think since that time, we've seen it happening in Sri Lanka and Brazil. Other world leaders just feel that that's kind of okay, that you can double down on believing that you've won an election that you clearly haven't, and you can rally your supporters to take to social media and to organize these kinds of offline events, which really are very violent and dangerous. And personally, what I found from doing this research, especially in the US, and I'm not American, so perhaps I don't have quite as much of an inside view as I should, is that nothing's changing. We're seeing new platforms, new avenues for people to spread these views and galvanize more followers, so it's really concerning.
SR: Well, let's back up on that a little bit. And I probably should apologize as a US citizen for the fact that we elected Donald Trump and that this has now spread, this way of being a populist and disregarding the law. But let's back up a little bit, because when people think about how these ideas are spread, I think they think about things like Fox News or Donald Trump's new—
AS: Truth Social—
SR: Truth Social. But actually, yeah, social media is a big part of this, and this is something we're interested in at the Center for Digital Narrative. Not just digital narratives in the sense of things like electronic literature, but these large-scale digital narratives that really do have an effect on the way that we conduct ourselves and our political discourse, on things ranging from whether we should take vaccines during a pandemic, to who is elected, to the norms of society. And you kind of zeroed in, even though your background is television news, on digital media culture, social media, and in particular, memes. And this is a funny thing: when I talk about the importance of digital narrative in society, I say, look, think how absurd it would have seemed to us, even ten years ago, that these little pictures with funny captions on them could actually throw an election or drive an insurgency in a democracy.
SR: Let's talk a little bit about what memes are. What are memes?
AS: Good question. I mean, I could pull out my exact definition if you like, but I won't. Basically, the way I understand memes, they can come in a variety of different modes. They could be short GIFs or videos, images overlaid with text, manipulated pictures, or a real picture put in a new context. They're remixed items of content that spread between online communities. They're posted to Reddit or social media sites, shared on WhatsApp or via messages. And I think one of the main factors of memes, or the most obvious, is that they're mostly humorous or meant to be shared for fun, to make light of something. But what I found, particularly in the groups and the platforms I observed, is that there's a lot of dangerous and negative messaging underlying all of this humor. And whilst the way these memes are interpreted will change between communities, it's so reinforced on Gab in particular, which is the platform I studied, that you're just getting anti-transgender narratives, post-truth, all of these different tropes that keep being presented. And you might dismiss it as, oh, they're trying to say this about Joe Biden, but actually it's very dangerous messaging, and it's just continually presented.
SR: And Gab is sort of like—Twitter has its own problems—but it's sort of like a specialized Twitter for the alt-right?
AS: That's right. Yeah.
SR: All right, let's think a little bit about this concept that you mentioned, post truth. As a journalist, that's a really interesting concept for you.
AS: Yeah. I believe that we're in this moment now where it might be obvious to us what objective facts and truth are. But it seems that we're in this period where what you perceive to be true has more to do with your values, emotions, and worldview than with the actual objective facts. I would argue that the value of objectivity is diminishing in favor of emotion and worldview, and your identity as well. So you're seeing it, which I mentioned in my thesis: Kellyanne Conway, one of Trump's aides in the early days, coming out and saying there are such things as alternative facts. I mean, just the fact that that is repeated now and considered a general phenomenon is insane. I cannot believe that we had someone in a position like that who could claim that there are alternative facts and just refuse reality.
SR: Your reality is not my reality.
AS: That's right. Alternative realities. Absolutely.
SR: Wow. I still get depressed just kind of thinking about it—
AS: It is depressing. I know—
SR: —because it's still going on. At least there are some trials going on—
AS: That's right.
SR: —of some of the people. But yeah, let's talk a little bit about how memes operate and how they're bred online. You talked a little bit about humor. And if I think about early memes, a lot of these things were spread on Facebook, and you'd have The Most Interesting Man in the World, the guy from the Dos Equis commercials, with some sort of witty phrase next to him. It maybe really started out as this almost juvenile, indulgent humor. And we used to think about, say, conservative Republicans in the US as kind of a fuddy-duddy culture of older white men. The demographic is probably still skewed towards white men. But I think when we look at memes, we think, oh, this is sort of a young people's genre, and instead of the right, we begin to talk about the alt-right. Can you say a little bit about the alt-right and what that is, how it operates?
AS: Yeah, good question. I think the alt-right, I guess, probably existed prior to this, but it really became better known and more of an online movement from 2016, when Donald Trump was elected. And I think what sets it apart from other right-wing groups is the fact that it is very much born online, and it's quite a decentralized, dynamic movement, and there are so many different viewpoints that the alt-right has identified with. And trolling and meme culture are really the two techniques that were born with the alt-right movement. So it's very much what they use, not only for political messaging, but for building identity. It's a way of signaling that, hey, I believe in this, and this might mean that I would also believe in these other right-wing conservative views.
SR: What about Pepe the Frog? Who is Pepe the Frog?
AS: Who is Pepe the Frog? Gosh. So, it's very much a symbol that came up well before the insurrection, but it is something that I found has been used in memes since. I think it's really just come to be an icon for the alt-right, and you see the image of Pepe used in all sorts of —
SR: This is a cartoon character?
AS: Cartoon character, a little green frog. And actually, interestingly, it's very similar to the original Gab logo, which I can't show you right now, obviously, but there's just all these parallels there where you see this frog pop up. It's an icon for the alt-right group and I found it appearing in some memes referring to election fraud. Again, sort of gender issues, anti-liberal ideals. So I can't go into the background of where Pepe the Frog came from.
SR: We had another master's thesis at the University of Bergen that was mainly about Pepe the Frog.
SR: Memes spread online, and there's some debate as to how they spread. If not all of the meme spreading is done by individual people, how else do memes get around?
AS: Yeah, very good point, and something I really wish I could have gone further into in my thesis, but I had to keep the scope to a certain point. Now we're seeing chatbots and AI agents driving the spread of these memes: automated accounts that have been created on Twitter, Gab, all of the platforms.
AS: One thing I did go into in my thesis was the factors by which you can determine whether an account may be automated. Often it's an obscure, gobbledygook username that just makes no sense; the account will often not have any display picture; and often it will have been made very recently, but the number of posts it has put out is very high. Those are the indicators you might look for, and this is the problem: that metadata is not always very obvious to users, and some users might not even think to check for it. But if you have the media literacy skills, when an account has been posting a lot of material that might be dubious, you really need to be able to go and look at these things to determine who's running the account. Is it a human? How many posts have they spread in the last week or day? And other things coming out now that could make this even more complicated are, of course, ChatGPT and the other chatbots.
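For readers who want a concrete sense of how indicators like these can be combined, here is a minimal sketch in Python. It is not drawn from the thesis; the account fields, thresholds, and example username are illustrative assumptions, and real bot detection is considerably harder than this.

```python
# A minimal sketch (illustrative assumptions, not the thesis's method) of the
# bot-likelihood indicators discussed above: gibberish usernames, no avatar,
# and very new accounts with unusually high posting rates.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class Account:
    username: str
    has_avatar: bool
    created_at: datetime
    post_count: int


def looks_automated(account: Account, now: Optional[datetime] = None) -> bool:
    """Flag accounts where several weak bot-like indicators co-occur."""
    now = now or datetime.now(timezone.utc)
    age_days = max((now - account.created_at).days, 1)

    score = 0
    # 1. Gobbledygook username: a large share of digits is a rough proxy.
    digit_share = sum(ch.isdigit() for ch in account.username) / max(len(account.username), 1)
    if digit_share > 0.4:
        score += 1
    # 2. No display picture.
    if not account.has_avatar:
        score += 1
    # 3. Very new account with an unusually high posting rate.
    if age_days < 30 and account.post_count / age_days > 50:
        score += 1

    # Any single indicator alone is weak evidence; require at least two.
    return score >= 2


# Example: a week-old account with no avatar and ~1,400 posts.
suspect = Account("patriot1776888420", False,
                  datetime(2021, 1, 1, tzinfo=timezone.utc), 1400)
print(looks_automated(suspect, now=datetime(2021, 1, 8, tzinfo=timezone.utc)))  # True
```

Running this on the example account returns True because several weak signals co-occur, which mirrors the point made above: no single piece of metadata proves anything, but the combination is what a media-literate user would check for.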
SR: How do you think these AI text generation systems, image generation systems and video generation systems coming on the near horizon, are going to affect this phenomena?
AS: To be honest, I hate to be such a pessimist, but I really do fear that this is going to be game-changing for news and political campaigning. For example, we've seen Trump truthing images of himself being arrested in an earlier case, the Stormy Daniels case. Obviously that was fake, and I think most people knew that. Trump, I guess, will be regarded as someone who is very self-promoting and dramatic, and of course he's going to put out these kinds of images that may be false. But at the same time, I'm not sure if you're aware, there was actually a video of Putin that went viral recently that was a deepfake. It actually wasn't broadcast very much because, as you know, media in Russia is very controlled. But there are a lot of reports of these kinds of incidents happening.
SR: I've actually seen some art projects that feature Putin deepfakes.
AS: Yeah. Maybe one of them was involved in this recent case. But as a journalist, I am quite concerned. I know in my workplace, we've put some of our scripts through ChatGPT just to see how they're handled, whether they come out better or worse. And there are also prototypes of, I guess, fake news presenters, or AI-generated news delivery.
SR: Hopefully they're worse. Right? As long as you have decent writers.
AS: To me, it just seems a bit robotic. You just don't have that human touch. And I don't say this out of a fear of losing a job. I mean, it's great if this can improve our work, but I just fear for the sort of the fabric of society and how we determine what's true and what's not.
SR: And even claiming truth, truth media
AS: That's right.
SR: Truthing. It's not a tweet. It's a truth.
AS: And it's my truth. It's personal. Truth, it's a brand. It's all of this, this move away from fact. It's a worry.
SR: Yeah. For me, it's also kind of theoretically interesting just because so much of postmodernism was about questioning these metanarratives or these shared narratives and arguing that in a way, we're all living according to different narratives, and then we see this kind of literalized now in a frightening kind of way.
AS: It also makes you wonder, I mean, what comes next? Maybe it is a case of pulling back and being very hard-nosed about it: no, these are our news sources, and we don't use social media. Maybe there is going to be this pullback to a kind of old-school news ecosystem, but I certainly don't think that's going to happen for at least 10 to 20 years. We're going to get excited by things like ChatGPT and AI, and then we're really going to see maybe some of the negative factors come through.
SR: Maybe the trouble is the "we." Is there a "we" anymore in such a polarized political environment? And there's fake news. And you get to own fake news if you're Donald Trump, right? Where essentially any news that doesn't fit your ideology is fake. And how does that feel to you and your colleagues? Because you're probably part of the fake news media, of course.
AS: Yeah. I must say, even now, for example, I was working the other evening when Trump was arraigned in Miami, and we all just shake our heads every single time. We shake our heads at his antics, at what he says. And I also believe that one of the problems is that we are feeding into his narratives and his ideologies when we fight back. We broadcast his comments, we try to correct him, and it just feeds into the way he operates. He just continues to call us the fake news media and laugh at our—
SR: Maybe even gives him talking points, right?
AS: That's right. Keeping this in the cycle only perpetuates the problem. It's either we don't give him airtime, which I hoped would be the case once he left office, but of course, with all of the cases now and the fact that he's running again, he's creeping back up.
SR: And the scary thing is, I haven't looked at the numbers, but for a certain demographic, the fact that he is being prosecuted makes people living in his reality feel like he's being persecuted, and now he's playing that up as becoming a martyr.
AS: That's right. We, actually, sorry, I keep saying we. On Al Jazeera the other evening, we were speaking to some of the protesters and asking them what they thought, and a lot of them were saying, oh, it's political persecution, Biden wants to throw his opponent in jail. And we questioned them, saying, well, isn't that what Trump was calling for, for Hillary Clinton to be jailed over the email scandal? So it's just kind of—
SR: Lock her up.
AS: That's right. It's sort of one-sided: one side can say something and the other can't, and it's all just political game playing. And unfortunately, there is something about Donald Trump that appeals to so many people, whether it's the celebrity factor or the fact that he can say and do what he wants without consequences. It's obviously very frustrating for people who don't buy into that.
SR: One interesting thing I read, I read the indictment, at least the New York Times highlights of the indictment. And one of the things that is sort of damning for Trump from a legal perspective is what he was saying about Hillary Clinton, that if people hold on to these documents, they should be thrown in jail, that it's horribly wrong. So they're able to kind of hold that up to him and say, who is that wrong for now?
AS: That is true, but unfortunately, I think because some documents have also been found at Biden's home, that's not playing very well at the moment. I mean, what can you do? Unfortunately, luck seems to be on Trump's side with this particular issue.
SR: Maybe. Maybe not. I think the distinction there, the real thing that they're saying is a crime, is sort of knowingly having these documents, knowing that they're national security documents, and then lying to the FBI about your possession of them. He's being tried in Miami, and he's being tried by—the presiding judge is someone he appointed.
AS: Yeah.
SR: So, we don't know. But it's been almost surprising how many of the other people involved in the January 6 uprising have been convicted. Even though so much of the American population did vote for Trump, there's been an amazingly high level of conviction.
AS: Yes.
SR: Because they did it.
AS: That's right. Yeah. And I think I wrote about this in my thesis. It's one of the biggest criminal investigations in US history. I believe over 600 people are still being processed, and a lot have been jailed already. But unfortunately, I just don't think it's really making that much of an impact. Personally, I find that when there is news of another sentencing, it's far back in our memories now, and it's just not making as big news as perhaps it should. And we're not making an example of some of these far-right leaders and the fact that they've been jailed. If anything, I think it only really buoys the organization to keep going further, appoint new leaders, and continue on to free their political prisoners, you know?
SR: I really hope that's not the case. One thing that does seem positive about this: I know that you and a lot of news organizations, for example when he was arraigned in Miami, there were these big rallies and there was almost an anticipation that there was going to be violence like there was at the Capitol. And I think maybe the fact that these people know that people get put in jail when they do that might have some kind of an effect.
AS: Right, because definitely, at least for the last two appearances, there's been so much concern about that and about security, but there's really been no outcome of that sort. There are always the staunch supporters, and quite a number of people there did rally, but you're seeing both sides now as well. I know in Miami recently there were also anti-Trump protesters. You would hope that there isn't going to be violence breaking out between those two groups. But it's not just the mad sort of angry pro-Trump mob that think they can just batter down the doors and pull him out and—
SR: That there are some consequences now.
AS: That's right. Exactly.
SR: Let's zero in a little bit on what you did within your thesis. You did a multimodal semiotic analysis. Sounds very academic and fancy, but what does that actually mean when you were trying to process and understand these memes? Can you kind of give us some examples?
AS: Yeah, for sure. What I found really interesting, when I was figuring out what I wanted to do and gathering all of these memes, as I was alluding to earlier, was how struck I was at how many different themes and ideas were coming out of a single meme, and the common threads that I could draw between all of them using semiology and multimodality. Semiology is the study of signs. A sign could be, say, the color red, and a signified meaning is to stop, or it could be love. There are all these different societal values that we've given to different signs. So that's semiology, and multimodality is about the different modes that are employed in memes: text, color, video, gaze, body language, stance, positioning,
SR: Language, too.
AS: Language, absolutely. I was able to sort of combine all of that in looking at how these memes are structured, whether it's intentional or not, the different meanings that you can draw by looking at the signs and the modes that have been employed.
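As an illustration of what combining those modes and signs can look like in practice, here is a hypothetical sketch of how a single meme might be annotated. The field names, example labels, and themes are illustrative assumptions, not the thesis's actual coding scheme.

```python
# A hypothetical annotation structure for multimodal semiotic analysis of a
# meme: each sign pairs an observable signifier with a signified meaning and
# the mode it appears in. Labels here are invented for illustration.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Sign:
    signifier: str   # the observable element in the meme
    signified: str   # the culturally assigned meaning read from it
    mode: str        # e.g. "image", "text", "color", "gaze", "stance"


@dataclass
class MemeAnnotation:
    platform: str
    description: str
    signs: List[Sign] = field(default_factory=list)
    themes: List[str] = field(default_factory=list)


# Hypothetical annotation of a Gab meme of the kind discussed above.
example = MemeAnnotation(
    platform="Gab",
    description="Cartoon frog in front of the Capitol with an all-caps caption",
    signs=[
        Sign("Pepe the Frog", "alt-right in-group identity", "image"),
        Sign("all-caps bold caption", "urgency, rallying cry", "text"),
        Sign("red colour scheme", "danger, partisan alignment", "color"),
    ],
    themes=["election fraud", "anti-liberal sentiment"],
)

print(sorted({s.mode for s in example.signs}))  # modes employed in this meme
```

Collecting annotations in a structure like this makes it easy to see which modes a meme employs and to count how often a theme recurs across a corpus.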
SR: Great. Well done.
AS: It was fun.
SR: One of the interesting things I thought about your thesis, it's sort of a theory I hadn't really encountered before, is this classification of humor, and humor is an important part of these memes, but there are different kinds of humor. Can you say a little bit about that?
AS: Yeah, I found that to be an interesting point as well. So, I wanted to get across that I'm not against satire or humor, but I wanted to understand how humor can be used in different ways. It can be dangerous or it can be uplifting; it can build community; it can polarize people. I came across some theory of the different types of humor, and I believe there were four. You have affiliative humor, which can be team building, identity building, the type of humor you would send to a friend. You'd expect them to laugh as well, and it builds a bit of a rapport between you. Some other types of humor might be more detrimental, sort of antagonizing, I believe, was the—
SR: Trolling.
AS: That's right, trolling. That's where humor would be attacking one subset of people or an identity.
SR: Othering them.
AS: Othering. Absolutely. So that's kind of polarizing. There are another two. I think the other two sort of fall in between those two extremes. Having those sort of different types of humor allowed me to look at these memes and say, okay, well, is the intention here to be antagonistic and to pull people apart and to other certain identities, or is this a little bit innocent and harmless? It's really just wanting to make people laugh. But maybe there are some underlying—
SR: There's this whole thing of people claiming that one is the other, right? Like, oh, you can't take a joke, right? It's just a joke, right?
AS: And again, I do try to make this clear in my thesis: it is very impressionistic and open to interpretation. I think everyone's obviously got a different threshold for humor, whether they find it offensive or enjoy it. But when you see them in the concentrated way that I found on Gab, you just can't help but acknowledge that it is dangerous. Because if you're in this filter bubble of getting your news and your political views from Gab, it just keeps building on these themes through constant presentation, and it's going to trap you in this identity that's very anti-liberal. Unfortunately, while a lot of them may be harmless, I mean, personally, some might be funny at first glance, but when you do this semiotic multimodal analysis and you're pulling out these themes, there's a lot of harmful content.
SR: How is it a speech act? Right? And what does it actually do?
AS: That's right. Yeah.
SR: And you talk about "preparatory media" too. And when I think of preparatory, I think of sort of preppers and people putting cans of food in their basement and rifles, ammo and stuff like that, building bomb shelters. But what function are you talking about there when you talk about preparatory media?
AS: Yeah. This was an interesting theory that I found in a paper written after the insurrection. This scholar, Luke Munn, wrote about how these platforms, and they were free speech platforms, actually work to galvanize groups, and they work to incite and plan and recruit for something. It doesn't have to be an offline, real-world event, but there's this drive to prepare the group for something, whether it's belief in an ideology or a rally. It's the way they make cohesive groups through the sharing of viral media or through iconography, Pepe the Frog, things like that. This scholar found, particularly on Gab and Parler, which were two apps that were used a lot in the insurrection, and I focused on Gab, that these forces are very inherent. It's actually built into these platforms that users use iconography and share memes, and if you don't, it's like you don't belong. So these platforms are working to galvanize these groups and prepare them for something, whether that's just being on the platform and being a member of the community, or taking action offline.
SR: And it's sort of a recruitment tool.
AS: That's right.
SR: It's bringing people to different kinds of movements, and then maybe that's what gets replicated in other parts of the world, that people see that this is an effective propaganda tool.
AS: That's it exactly. I believe Parler and Telegram were used a little bit in Brazil during the insurrection there. So, people are understanding that these apps exist, and when one gets shut down, a new avenue opens up. There's this sense that I found particularly on Gab that users are anti-Big Tech, as they say. One thing I wrote about in my thesis was a period that alt-right followers would call the Great Purge in 2016, when a lot of users were banned from Twitter, YouTube, and Facebook because they were posting hateful messaging. So they migrated to these other platforms, such as Gab and Parler.
SR: And maybe that's good, but maybe it drives them even more into it.
AS: It's creating this kind of alternative society again because they feel that Big Tech is part of this big illuminati of the world.
SR: Yes, it really is this sort of getting people convinced of these narratives that participate in a completely different reality than one that many of us share.
SR: Let me ask you a little bit about the future. You have sounded a little pessimistic, but you're in the news industry and you're a concerned member of society and an informed Digital Culture critic. How do we work against this? How do we sort of encourage people to come back to an idea of a shared reality that's driven by reasonable debate and agreement about basic things like facts?
AS: Yeah. And I tried to come at this whole project with a view of techno-neutrality. I've never assumed that technology forces our hand with anything or changes our reality. But unfortunately, after this study, I do believe that a lot of the responsibility lies with platforms and corporations, and with governments and schools as well. I think we need some sort of universal content moderation rules that platforms are obliged to uphold, because the way it operates now, content moderation, particularly on sites like Facebook, a lot of it is outsourced, and the poor operatives are being exposed to extreme, obscene content that they're choosing to remove from the site or allow—
SR: the moderators—
AS: That's right, the moderators, a lot of that can be performed by AI agents as well. But I think the bottom line is we need sort of universal rules as far as that goes. So ensuring neutrality—
SR: Should we ban memes?
AS: No, absolutely not. I think memes are great. I share memes all the time. I probably enjoy some memes that are borderline offensive. I mean, I think that's human nature, and we should absolutely continue critical political satire. But the danger is that we're ending up on these platforms that are a complete filter bubble, an echo chamber, where that's the only thing you're being exposed to. Media literacy, government rules, platforms cracking down and ensuring content moderation. That's what we need.
SR: Great. Yeah. And I guess we need to agree a little bit more to think about humor in the old way, and maybe not as a weapon. And I guess we've talked a little bit about some of the changes that have happened. There's been a change in government in the US. There have been some elections that, while they didn't go precisely as the left would have preferred, definitely weren't dominated by the alt-right. Do you think that some of the information, the actual information about how these phenomena operate and maybe how they don't adhere to a shared truth, do you think that literacy is developing now?
AS: Yeah, I mean, to be quite honest with you, I actually think that the bigger problem now is probably the classic Republican supporter demographic. It's slightly older people who are finding out about these apps now and using them, and maybe not being as aware as younger people of what filter bubbles are and what algorithms do. So perhaps in that respect I might be a little bit hopeful: as younger voters are able to cast their ballots, they have this knowledge of the way the Internet can often manipulate what we're being exposed to, and that could be a good thing. I think the most dangerous element is the generation, I suppose, for whom technology has not been as natural, and who might not be as aware of algorithms and the forces of filter bubbles.
SR: Great. Well, thank you very much. We've been talking with Ashleigh Steele, a master of Digital Culture.
AS: Oh wow, I'll have to get used to saying that now. That's pretty cool.
SR: And also a journalist, about our shared reality and some of the negative impacts that digital narratives spread by memes can have on our political discourse and indeed on our reality. Thank you very much, Ashleigh. I know you've got to get back to London, so have a great flight.
AS: Thank you so much. Great to be here.
SR: And spread the word among your colleagues.
AS: I certainly will.
SR: In this episode, Ashleigh Steele and I talked about the January 6 uprising and meme culture, the topic of her recent master's thesis here at the University of Bergen.
SR: Make sure to follow us on social media by searching your favorite network for the Center for Digital Narrative to keep up to date with our next episodes. You can find us on LinkedIn, Facebook, Instagram, and Twitter. Thank you for listening. Have a great day, and see you next time.
Listen to the full episode of Off Center.
References
Barthes, Roland. 1968. Elements of Semiology. Translated by Annette Lavers and Colin Smith. New York: Hill and Wang.
Munn, Luke. 2021. “More Than a Mob: Parler as Preparatory Media for the U.S. Capitol Storming.” First Monday 26 (3). https://doi.org/10.5210/fm.v26i3.11574.
OpenAI. 2023. ChatGPT [Large language model]. https://chat.openai.com/chat.
Petras, George, Janet Loehrke, Ramon Padilla, Javier Zarrachina, and Jennifer Borresen. 2021. “Timeline: How the storming of the U.S. Capitol unfolded on Jan. 6.” USA Today, January 7, 2021. https://eu.usatoday.com/in-depth/news/2021/01/06/dc-protests-capitol-riot-trump-supporters-electoral-college-stolen-election/6568305002/.
Steele, Ashleigh. 2023. Free Speech Platforms and the Impact of the U.S. Insurrection: Misinformation in Memes. Master’s thesis, University of Bergen.
Torba, Andrew. “Gab.” Version 2019. https://gab.com/.
Trump Media & Technology Group. “Truth Social.” Version 2022. Truthsocial.com. https://truthsocial.com/.
This research is partially supported by the Research Council of Norway Centres of Excellence program, project number 332643, Center for Digital Narrative.