Do either of you have any comments you'd like to make after listening to this? I think it was interesting to see how, even though we were speaking on quite different subjects, there were still those resonances and similarities. Talking through the history, I was interested when eugenics came up, of course, when you were looking at images. When I think about the history of Silicon Valley and the history of the tech industry, eugenics is something that you can never extricate from that history, because it's so core to building the central ideas that come into shaping how a lot of these ideologies and these approaches to the world basically formed themselves in Silicon Valley, right? Stanford University, which is at the core of Silicon Valley and really helped to create this model, was one of the hotbeds of eugenics thought in the United States, because its first president was a prominent eugenicist who then brought in eugenicists from other parts of the country to ensure that they could promote these ideas broadly. And of course they were then involved in sterilization programs across the United States using racist metrics like IQ. But the guy who founded that university, Leland Stanford, was also using photography, right? You've probably seen this early film of a person riding a horse, where you can see the various frames. That was filmed on his horse stock farm, right? And he rationalized the production of horses for racing along this model of just allowing the horses to break their legs when they're young, to start training them early, because then they weren't sitting around for years, eating and wasting time, before you could start training them. It didn't matter, right?
Just start killing them, and then you'll find the good ones that way. But yeah, so it was interesting to see the similarities with eugenics and images and all that stuff. Yeah, I found a lot of resonance, too, with the historical perspectives on these companies. I always find that I don't have enough patience to look into the companies, so I'm very grateful for all this work that you've done, because indeed it is the other side of all these rationalized efforts to produce correctness, to produce truth, to produce knowledge. So we have these calls for more efficient, more accessible truths, for democratization. There are all these really positive connotations coming even from the manifestos of these companies, connotations that are very much connected to, I don't know, the European history of education, let's say. And then they get passed on to these American politicians in a kind of neoclassicist move, and then projected back onto all of us, as these companies tend to be global. And I think this is beautiful also in the way you show it, as this is something that we are often lacking words for. I think your talk provides that. It is the non-conspiracy-theory perspective on the domination. It's not, you know, oh, the Americans want us to do something, or oh, we love the Americans. It's not really about a national ideology. It's about money, and money is quite visibly traceable and quite a reasonable reason to do the things that they do.
Even when you say that they don't want to lose their money, it's not just that they don't want to lose any more money; think about the power they now have to lobby for whatever cause they want. You know, such people can get others to wear specific kinds of clothes if that is in their interest. So it's not really about some dark ideologies. It's quite visible and quite explainable. Definitely. Just a quick comment: I always think about how, in the past, we had these new media technologies, and immediately there was a recognition that, okay, the government had a role here to ensure that these technologies were used for the public good. And eventually those things opened up, right? Public radio and public broadcasting were always very important with the introduction of these new media technologies. But the internet comes along right in this period where we're embracing neoliberalism and free markets and this idea that the state needed to be out of all of these areas. And also, of course, you had the United States seeing the benefits that it could get from the rollout of the internet, with its companies following along with that. And the idea was that the government can be nowhere near the internet, right? The government cannot be thinking about public platforms, or what it means to have something on the internet that is in service of the public good. It needs to be left to the free market to do these things. And I think now we're trying to address the issues that have arisen after basically three decades of this approach, and I think it does force us to reflect on those initial ideas we received about what the internet was and was supposed to be, to question those, because obviously it hasn't worked out, and then to think about what some alternative looks like.
I think recently Cory Doctorow wrote that the difference between conservatives and progressives is that conservatives believe some people are born to rule and others to be ruled, which seemed like a really good definition. And that makes a lot of sense in this context too, with what you're describing: these people think it's not about individuals, it's not about all individuals, it's about these specific individuals. And so, looking at where the data comes from: what data are these individuals, or people like them, going to collect? So I think that makes a lot of sense in that respect as well. Yeah, I think what I like about the photography example is exactly that: what data they're going to collect is whatever the technology allows us to collect. So obviously with photography, we collected a lot of images. It informed a lot of questions. And I think it's very important to ask: what questions are we able, and going, to ask? What I think is our main resistance and liberty is actually asking questions. So that, for me, is the most important thing: what are we going to ask ourselves, others, and technology to do? It seems that with these eugenicist questions, the question was: who is better, and how can we measure that? So indeed, who are these individuals that are more individual than others? And therefore technology was adapted and developed in a way that it could provide answers to that. And now it can also do other things, of course. I think it's important to also resist the answer to the question, do artifacts have politics? You know, this famous and ongoing question. Because I think it's a wrongly posed question, because indeed it can never be answered. Yes, photography is not racist.
Like, you can say that, but it only makes sense in specific contexts, when you want to trace certain lineages of the way photography has supported and contributed to racism. But it's not useful to think that all photography is racist, or to think of the data as being the problem. The problem is that there is this power grab: there are certain individuals who have access to manipulations of data, and not manipulations in the sense of not telling the truth, but the ability to process, to collect, to combine, to infer things from, and to develop these technologies. And in that sense, I think indeed, those who have money have access to lobbying, and also to a lot of computation, as you just very well showed. Definitely. And when you were talking about the way that photography was used and can be used, and how it represents different people in different ways, one of the things that immediately came to mind for me was that when early photography was being introduced in New Zealand, the Maori people there, the indigenous people, had this long history of very intricate and detailed face tattoos, right? But when they would take photos of those people with the face tattoos, because of the chemical process of photography, it would represent them without the face tattoo. It would not pick up that there was ink under the skin. And so then you had all of these photos of Maori elders and chiefs with this cultural symbol taken off of them, and that was the way that they were displayed, right? And that of course has cultural power, especially when you have this colonial force that's trying to push its idea of how this society should work onto them. And I think, just to pick up on the point about individuals as well, right?
Like, we think about how we have these very powerful individuals today, the Elon Musks, the Jeff Bezoses of the world. And in the same way that this narrative about technological libertarianism and what the internet was going to be was very much crafted by people who wanted us to feel it was going to be a particular thing, as they built major businesses that did something else entirely, so these tech billionaires, as they gained their billions and their power, also created a narrative for us, right? That these people were geniuses, and that they got rich because they had these unique skills that allowed them to create these products that we all embraced and found wonderful, and that's why they're rich, right? Not just because they happened to be in the right place at the right time, and were kind of the luck of the draw, their company being one that made money in the dot-com boom that they could then use to keep building on. Or because they came from wealthy backgrounds that gave them a leg up, that gave them connections, that made it possible for them to fail a few times before their first company actually worked, in a way that many other people couldn't. And so now they've reached this level where they are so powerful and so wealthy, and they certainly do not want us to think that they got there through fluke and through luck and through privilege, right? They want us to think that it is inherent, that they are better, that they have higher IQs, that they are geniuses. And that is why they deserve to command these companies. That is why they deserve to have hundreds of billions of dollars, and shouldn't be taxed, and shouldn't be regulated, because they are doing this for humanity. They're so benevolent to do this for us. And that's also a lie, right?
Well, yeah, that reminds me of my other favorite podcast next to yours, which is called Seriously Wrong, and which does a really good dramatization of this effective altruism, as they like to call it. But I'd like to think back to something that I really don't understand how to think through, which is this entanglement between the right, the tech industry, and the state. As you also pointed out, there is this prior investment of the state into these companies, which has always been completely ignored in public. And there is this great book by Mariana Mazzucato, who taught... Mazzucati. Mazzucato. Okay. Very sorry to her. Anyway, this book is called, I think, The History of Value or... The Entrepreneurial State, maybe? There is something of value in the title of the book that I read. And this book is all about noting down and acknowledging the public investment into all the businesses that then turn out profitable. So this lie of the market, of the free market. Basically, it's about demystifying the free-market claims of a lot of companies. I think it's called The Value of Everything, or theory of value, and it's about the history of the definition of what value is. It also looks at how things are valued, how GDP is calculated, and how some things in certain countries get counted towards tax and GDP contributions, such as, for instance, when you have a house and your potential return of rent is counted as taxable, even if you're living in the house yourself. So that's a way to think about privilege, to tax the privilege. But then, you know, childcare and the kind of work that you do on childcare is not counted towards taxes. So the work mostly done by women, even in very progressive economies, is not counted as this value that can be exchanged. A house can be exchanged, you can live in somebody else's house, but you cannot take care of somebody else's children. That is definitely not counted towards GDP.
So in this wonderful book, there is quite a lot of historical review of how the free market has been articulated as free. And indeed, I think it matches well with the question of who is the individual that is worth noting. And then I think, well, it's interesting to see how all these right-wing movements are trying to control the state. Because if you think naively, why would they even be interested in the state? The state sucks. They just need money, and they have it. So why do they also want to be president? Why do they want to be in this classical, neoclassical politics, when the claims of libertarianism and freedom would normally, somehow ideologically, suffice to be articulated through money and market power? What are your thoughts on that, in this context? I was just going to say that I think I would like to open up the discussion, because people have been listening and I'm sure everybody's thinking along, and at some point we'll have to make a really long list of books. Oh, you have a bookshelf. You have book recommendations. I do, yeah. All right. Would someone like to ask a question or comment on the two lectures now? Was that too fast? Or is it still too dark? Yes. I think there's a microphone. Is the mic coming? Right here. Thank you. Maybe my question is more to you, Selena. I'm trying to speculate on the thoughts you shared just now, paraphrasing this privatizing of profits and communalizing of expenses. Moving with these optics to this landscape of accumulating data as a value: these artistic approaches that try to somehow heal these technologies and to find new ways to represent people who were somehow dropped out of these technologies, is that not communalizing the expenses? I think I recognize the provocation, so maybe I get you wrong, but let me try.
The thing is, if I follow your question and the logic: yes, these artworks are doing the same as what the technology is already doing, so they're kind of using the master's tools, right, and are therefore participating in the expropriation, or simply are not dismantling the house. And I think what I find really courageous and interesting in these approaches, and what I would hope to think about myself, is: how can we not actually burn the house down? So it comes from another feminist angle on the fact that we do have these technologies in the world now, and that they are there. One thing to do to make sure that abuse does not happen in the future is to burn all the data centers. Then there will be no private data collections, there will be no crawling, it will be fine. But I think it's important to think about ways to subvert what is there, and not only subvert it in this kind of privileged, playful way of: me, I have time, and now I'm going to think about doing fun things, generating stuff that is as outrageously negative as it can be. I think it's important to think about how we can really make this technology do something for us, in a way that is disobedient to the logic of binarism. Because we always have this logic of binarism, in which this technology is either very positive, it's going to save the world, it's going to make people rich, or it's going to destroy the world, it's going to take our privacy away and also destroy truth and we won't know anything. So it's very polarized, and what I want to think of as contemporary feminism is to not participate in this polarization.
So that's something that I think is a major thing to refuse. And there I find this act of actually articulating, like Nettrice Gaskins does, what in the mathematics of the reflection of skin color is the problem, and what the ways to address it are. There is specular and spectral reflection, if I get it right at the moment. So you don't say, oh, I don't want to have anything to do with this technology because it's abusive and it's extractive; you think, okay, but what in it is actually doing the work of discrimination? If you are able to think of an art practice, or other ways of researching what is doing the work of discrimination, then you can actually, well, not stop doing that work, but you can address it, you can do something about it. And indeed, yes, then it's again about using lots of water to cool down the servers that were generating the image of that man that she wanted to celebrate. But at least it's putting even more emphasis on the work that needs to be done, and on the fact that we are all complicit in society. We are able to say now, yes, I'm not using ChatGPT because that's bad for the environment, but in a way we are all using ChatGPT. Somehow there is a value in making your practice of participating in society visible. That's what I see in this. Another question? Nate, back there, I think. Got a question? Oh, yeah. More to Paris, asking or wondering: have you thought about how the history of the free software movement is threaded into this bigger arc of techno-regimes that you have been telling us about?
Especially because it's also, I think, a troubled story that puts an emphasis on questions of freedom and libertarianism, and is troubled in these ways, but also still feeds a lot into why we are here today, and maybe some utopian thoughts that we might actually put into practice; it feeds into that very last question that you opened in your talk. Absolutely. Thanks so much for that question. I would say, to a certain degree. I haven't worked it into this history as much, but I would say I'm more familiar with the open source side of things and what has happened there, rather than the free software movement, which is obviously distinct from it, right? And I think what we see very much with open source is a lot of very promising goals and motivations, but then we can see very clearly how these major tech companies have been able to co-opt a lot of those movements and a lot of those projects in order to benefit themselves. And so then the question isn't so much, okay, we shouldn't do that at all, but rather: how is it done in such a way that these major tech companies are not able to co-opt it, are not able to ensure that all of the work that is done on these particular projects is just used to forward their business models and their power, because they have so many resources with which to take advantage of these particular projects, right? And I think that is quite a difficult thing to contend with: to think about the different ways we approach technology, and how we think about creating technology in such a way that it benefits communities and people can embrace it, but the tech companies cannot seize it. One of the things that I've found rather inspiring in recent years is learning about this group in New Zealand called Te Hiku Media.
It's a Maori group; they're initially a media organization that broadcasts in Maori, for Maori people, to try to promote their language and to ensure that their culture is accessible and being promoted, things like that. But they also have this technological side, where they are developing technologies to promote the revitalization of the language, to promote learning the language, and to try to make the language more accessible to people. They've even developed large language models using the large amount of Maori text and audio files that they have available to them. But they do so very clearly with a license that does not allow major tech companies to use any of this sort of stuff, and they will not allow any of the work that they do to be used by Google in any of its models, or Microsoft, or anything like that, right? Because they see that as a continuation of colonialism, where these major tech companies are basically taking their data, taking their language, and using it for their own benefit. And of course, a lot of these companies will say that they offer Maori translation or Maori tools, but those work very poorly. What they often produce is something like New Zealand English speakers pronouncing Maori words incorrectly, so it's actually degrading the language and making people not hear the language as it's supposed to be heard, and they see that as incredibly damaging, right?
And so, I'm not even going to try to pronounce the name of the license properly, but the way that they have developed these technologies, using the limited tools that they have, and done it in such a way as to try to help their communities while keeping it separate from these major tech companies, I find quite inspiring. You know, it's one good news story in the whole range of bad news stories that we often have when it comes to tech. Another question? Marina? I also have a question for Paris, and maybe I will continue along the line of the question that was posed. Because actually, in such a context as Amaro, I always understood that we talk, in a very serious, not rhetorical or journalistic way, about capitalism, racialization, colonialism, and then of course the impact of technology; not being against technology, but really seeing it. So this will also maybe be my disappointment in this chit-chat at the end, after the lecture, that you put the ideology, the imperialism, apart, and so on. But still, I would like to ask you a question. It's interesting that you actually just went past Ukraine and Russia and the implications, which you actually connect with the U.S., with a small thing about Europe. So I would like you to comment on why you bypass this, or how you see what you said to us in relation to Ukraine, and to what is said to be the reappearance of the Cold War, especially because there are also ideological, strategic, structural elements inside what are said to be the politics of Europe in this context. Where do you stand, or how do you see these relations, and what is the place of Ukraine and Russia? In terms of these discussions around the tech industry? In terms of the histories that you pulled up until this moment, and then you finished your presentation.
I'm not sure I have a specific comment in terms of where it fits in, other than to say that when the Soviet Union collapsed, the United States obviously saw a great opportunity in the post-Soviet states to promote the neoliberal model that it created. I'm not going to pretend to be an expert there. But it is implicated, especially in the war in Ukraine, and Elon Musk and all these relations that were... Because you made this history, so I was interested to see what this imperial, let's say, ideological element is inside it, and also how Europe actually reacts to this, because there's a big provincialization of Europe. You don't have such an analysis of the history of how Europe actually behaves in these questions. You just point to some drops. I was just curious to know your research. Yeah, I would say I'm certainly more familiar with the United States, right, and what it has carried out. Obviously, if you look at the ongoing conflict right now, and what Elon Musk has been up to with regards to Ukraine, we can see that initially he seemed to be very positive toward the Ukrainian effort and its ability to defend its territory from Russian aggression, and of course made Starlink available to them. Those Starlink kits were brought in by the United States, because it saw them, of course, as important to allowing Ukraine to have the connectivity to plan its movements and its military activities. But what we have seen since then, in the months after that, is that broader shift in Elon Musk's politics, where he has moved much closer to the Russian position and what Russia aims to get out of the war and its control over Ukrainian territory. And in particular, Elon Musk seemed to be quite concerned that the Russian government would be angry about the fact that he allowed Starlink to be used in this way by the Ukrainian military.
So he started to restrict the areas where Starlink would actually operate, so that it wouldn't be anywhere close to the front, or anywhere military activities or actual fighting would be carried out, to try to disrupt these attempts by the Ukrainian military to take back territory or to attack Russian positions close to Crimea and other areas like that. The United States has found it very difficult to rein Elon Musk in when he does those sorts of things. They have signed a contract with him, and they have taken over a lot of the cost of supplying that Starlink service within Ukraine. But Elon Musk still commands a lot of power, because he controls this massive infrastructure, because the United States has moved to basically privatizing its space program and allowing SpaceX in particular to take over a lot of the power that comes with that. And so Elon Musk now has the power of, you know, a diplomat in some cases, where he can choose where the service is going to be available and where it's not. For example, when he made his trip to Israel, part of the reason that happened was because the Israeli government was angry that he had mentioned potentially giving humanitarian groups in Gaza access to Starlink, and they wanted to make sure that he would not do that, and that he would only do it if the Israeli government approved of it first. So there are a lot of conflicts around the world that he now puts himself in the middle of, because he has this control over this powerful infrastructure. When it comes to Europe, obviously I was talking about its approach to the regulation of these tech companies, in particular American tech companies, and the problem that I identified is not solely a European one.
Like, when I look at the Canadian government as well, it is more open to regulating American tech companies, to trying to restrict what they are going to do. However, ultimately the goal of their policy is to ensure that they are creating tech companies along the same model as what was pioneered in Silicon Valley, because that is the way that these various countries are incentivized to do it, because that is what looks good in the economic numbers and all those sorts of things, right? Yeah. Another question? Kim, over here. Actually, I have two questions. Both of you mentioned extractivism, and I just want to be clear: in what context are you using extractivism? Is it consistent with my definition of extractivism, in terms of the colonial past, the settler movement and so on, and structural racism as we know it today? That would be the first question. Should I go on to the next question? Okay, the next question would be for Selena. You mentioned the quote from Audre Lorde, that the master's tools can't dismantle the master's house. But I think what she meant by that was not just that we can't destroy it, it's there, but: how do we use these things critically? Discrimination and racism are structural, and it's not something that the United States invented, it's not something that just comes out of the blue from these big powers; there are individuals who are upholding structural racism and everyday racism. So how do we use these technologies, just like how we use education? We can't just say education is a colonial structure, let's forget it. So who benefits, and who stays marginalized? So I think it's a question of: what can we as individuals also do? Instead of just saying, okay, it's something much bigger than we are. And that's what I find is a little missing for me personally, that we bring it down to earth. Because we're all upholding racism. We're all upholding a biased technology. How can we rethink that?
And I think if we start there, if we just get back down to basics, then we have to rethink our own position in society and our own privileges. Okay, back to extractivism. Sure, I'll start and then Selena can pick it up. So when I refer to extractivism, I think that's used quite broadly in the tech industry; it's often used with reference to data extractivism. But when I think about it, I think about it as being much broader than that. When you think about how these technologies actually work, yes, okay, they're collecting data on people, but there is a much deeper material aspect when you dig into the way that they work and the supply chain that goes into them. There's a lot of extraction that is necessary in order to create the technologies that are not just in our computers and our phones, but that power all of these services and the internet itself that we depend on. And those minerals are often extracted from places in the Global South. These companies often talk a lot about their supply-chain management and how they're trying to do it more ethically and all that kind of stuff. But we still know that there's child labor in cobalt mining in the Democratic Republic of Congo. We still know the issues with lithium extraction for the batteries that go into all this stuff. All of this is very real and comes into how all of these technologies work. And in this particular moment, the tech industry, and in particular the major cloud companies, Google, Amazon, and Microsoft, are in the process of a massive expansion of their data center networks, which not only relies on the extraction of all these minerals in order to build the servers that go into these things, and then sends the waste from all that who knows where.
But there's also a growing backlash to the way that they are building these data centers in communities all over the world, where they require a lot of energy and a lot of water, and in certain parts of the world there are concerns about what that is going to mean for how people live. In a working-class neighborhood in Santiago, Chile, Google wanted to build a major data center, and the community fought back against it, because they had recently gotten water piped into their homes for the first time, and Google would not give them assurances that, if this data center was built, they wouldn't be pushed back onto water trucks and lose their access to running water. Like, this is how basic some of these things are, right? And then the other piece of it, of course, is that there have been protests in South Africa as well around these data centers, around the question of who is actually controlling these technologies. Do we want these major American corporations to set up these data centers and then control the computation and the services and everything that we use? Or is this a continuation of these unequal relations, where, again, it's these major companies in the United States or Europe or wherever who are deciding our fate, ultimately, right?
And on your piece about what individuals can do to push back, just to pick up on that part of the question you posed to Selena: when I say that the state is a key actor in trying to rein in this power, I don't mean that there's nothing people can do. But I think the real power comes from collective organizing and collective power, rather than just, you know, "I deleted an app from my phone" or something like that. Part of the hope I have seen in recent years is this growing mobilization around technology, where a number of movements are organizing against data centers or against other issues these technologies have created. And then, of course, there's the worker organizing that has been happening across a lot of the tech sector, to force employers not only to treat workers better, but also to take a more socially responsible approach, one that aligns with the narrative a lot of these companies put out for a long time. "Don't be evil" was Google's slogan for years, while it was very ready to be evil. So, yeah, those would be my responses to your questions.

I hope I can continue on this, because what I would like to say about your question goes in so many directions. First of all, thank you for this question, and thank you for posing it so precisely, and for stating that we are all upholding racism and other systemic oppressions; that's a fact. And this is something I was trying to get at in answering your question about what it means to do this and then generate another AI artwork. To me, that is connected. Indeed, it shows and practices this complicity. You say: I'm complicit, and I do this with it.
And now let's see what collectives will do with this, what society will do with this, and what maybe states will do with this, probably very little. I know far too little about colonialism, but what I do think I see, and what I now start saying that I know, is that with technology there is a tendency to mix up, and use interchangeably, a metaphorical and a historical interpretation of technology's effects on society today. So we say data colonialism, we say data extraction, because we mean that they use mechanisms of power similar to those that were practiced against people, and also against land: how you actually extract ore from the land, how you extract wood from the forest. And now there is a kind of parallel in the mechanisms by which what is used by algorithms is extracted from data. I guess this metaphor can be brought down to earth through these comparisons of mechanisms, and on that there is the book The Costs of Connection by Nick Couldry and Ulises Mejias. I think the book opens with the claim that this is not a metaphor but a reality, yet I read it as still inhabiting the space of metaphor, the space of comparison, the space of "this is like that." And I think the space of "this is like that" is important for us to think with, because history repeats, and because it's so valuable to look at how things have repeated through history: it helps us gather together, to get something going with others through these references to shared experiences of the past, so that we can try to think of ways, of modes, of things to do together to resist. But then there are other, material and historical, ways in which colonialism is still present in these operations.
So indeed, the image that I showed of the outsourcing of work speaks, I think, to the re-instantiation of colonial roots of exploitation that are still being upheld by corporate actors. That is something that speaks not metaphorically, but really and physically, on the ground: how AI is produced and operated, and how it plays out in the world. This is what we can call the globality of AI. It plays out globally in this way, replicating the old structures, which is no surprise. There is no reason to even theorize a new model when people are poor in countries that were impoverished by exploitation, by historical extraction and violence against people, and now those lands and those people are more easily subsumed again into these economies of exploitation, in which people have to label data sets for very little money, or watch terrible videos and filter out things that we don't want to see. There are all sorts of things people have to do for us to be able to scroll on the screen, and that again makes me complicit in contemporary colonialism. I don't know, does that go in the right direction? I think we do mean similar things by extraction, but I don't think it's possible to capture everything that extraction can mean today.

So, unless someone has a very urgent question they want to pose now with a microphone, I have the impression it's time to move from this setting to a more informal one and continue our conversations together. All right? Thank you very much for the questions, for your attention, and thank you both very much for your lovely speeches.