Mariana is a PhD researcher at the Creative Computing Institute, University of the Arts London — thank you for being here to close everything up, no pressure. So yeah, thank you for being here tonight; I hope this doesn't take too long. Today I would like to talk about multilingual programming languages as, perhaps, an anti-imperialist tool of resistance. And I'd like to start by contextualizing my ambivalent relationship with the English language: it is at once the reason I'm even here today and the object of my analysis. Growing up in South America, learning English — and later learning programming languages with English-like syntax and mostly English documentation — was presented to me as a promise of social mobility. And the language in which computing takes place raises critical questions: who benefits, who loses, who is included, who is excluded — in short, how this information age, or post-information age, impacts peoples and cultures around the world. What we're seeing here is the relationship of languages to power, wealth, privilege, and access to resources. Programming languages, all of them, are in a sense developed in the distinction between an abstract machine code of zeros and ones and the cognitive processes of the humans who interface with it. From the human coder's point of view, they are always more or less strings of numbers and letters; from the processing hardware's point of view, they are more or less sequences of voltage differences. In the end, for the hardware, it doesn't matter which language it is — it's always a construction, an abstraction on top of those voltage differences.
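To make that layering concrete, here is a minimal Python sketch — not from the talk, and the function name is my own — showing that even the "machine-facing" layer the coder can inspect is itself an abstraction, and one labelled in English:

```python
import dis
import io

# A trivial function: to the human reader it is English-like text; to the
# interpreter it compiles to bytecode, itself an abstraction over the
# hardware's voltage differences.
def add(a, b):
    return a + b

# Capture the bytecode CPython produces for `add`.
buffer = io.StringIO()
dis.dis(add, file=buffer)
print(buffer.getvalue())
# Even the bytecode operations are named in English: LOAD_FAST,
# RETURN_VALUE, and so on -- the stack is English-labelled all the way down.
```

The exact instruction names vary between Python versions, but they are always English mnemonics, which is precisely the point being made above.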
If you come from a background that's not in English, it's easy to think that English is somehow more suitable for programming languages, or to feel that your local home language is not suitable for programming. But that, too, was a constructed imaginary, a constructed concept. Once machines no longer needed to be programmed by rewiring, the zeros and ones stopped corresponding directly to the switches of the machine itself and started corresponding to languages constructed by engineers. The majority of practical programming languages in use today are higher-level languages such as JavaScript and Python, and those are either translated into machine code by a compiler or executed by an interpreter. And as we'll see, it's English all the way down, in a sense: even if you manage to design your own programming language in your local language, there is still the assembly layer underneath — the compiler's target — which in most cases has an English-like syntax as well. So between the binary machinic code and your own programming language, there is always some English embedded in the machine itself. And, as I said, this was by design. It more or less started — as Mark Marino discusses in detail in his 2020 book Critical Code Studies — with the creation of FLOW-MATIC by Grace Hopper and her team around 1958. This is the promotional booklet for FLOW-MATIC, which was a very innovative language at the time.
It was purposefully designed to approximate the English language, to be more beginner-friendly — especially for businessmen without a computing background, because this was when the financial potential of computation started to become apparent. And that was critical, because FLOW-MATIC ended up being the starting point for many languages and programming paradigms that came later, such as COBOL. In the booklet you can read that the FLOW-MATIC system is "a revolutionary new programming aid developed for the Univac data automation system," using "English language description of application requirements," and that its instruction code "is especially designed for use by those who know and can best define their data processing needs." You can really see this was a way of creating a beginner-friendly language for its time. In a sense, this could be seen as the opening of a Pandora's box — the opening of the wicked problem of choosing English as the lingua franca of computation. Because even though English is treated as the lingua franca of globalization, less than 5% of the world's population actually speaks English natively; English is, rather, the most popular second language in the world. There are some quotes I'd like to highlight here because they really showcase this. Kenneth Keniston, for example, writes that software written originally in English makes language a non-issue for approximately 7% of the world's population — those who speak, read, and write fluent English. So when you say it's not a big deal, or that's just how it is and there's no need to change it, you're really only speaking to a very small population.
Mark Marino, again, notes that all these movements toward natural language seem like progress — except when your native tongue is not the one being included. Through that perspective, the colonizing force of the natural languages embedded in programming languages becomes clear. And if you think about the world becoming ever more datafied and digitalized, this in a sense sets a timeline for a lot of languages to die: you are condemning many languages to never be brought into the digital world — into this version of the future. It's also interesting to see that programming languages themselves die, become dormant, or become obsolete. There's the question of COBOL, which Linda Hilfling Ritasdatter examines in her doctoral dissertation — I found it fascinating because it investigates the colonial and racist power relations in software-industry outsourcing, as India became the largest provider of the COBOL programmers who actively maintain critical systems of the Global North. Even in India it's not common to study COBOL at computer science departments, but companies exist to train these programmers because there is so much demand — and no one in the North wants to write COBOL, because it's terrible. Still, there are an estimated 220 billion lines of COBOL in existence today: bank databases, ATMs, and many of the systems we interface with every day are still written in COBOL. So the whole automated world we live in rests on a very brittle basis that fewer and fewer people know how to program. And then there was also the tyranny of ASCII, in a sense.
That was the starting point for how most languages were overlooked by this first design decision. Of course, difficult decisions had to be made at the time: they needed a compact, elegant solution to fit characters on screen with the very limited hardware of the era. ASCII was developed in part from telegraph code, and it was a small, fixed character set covering the Latin alphabet as used for English — it didn't have special characters, for example. It was the basic, first set of characters you could use on computers. It brought an elegant solution for its time, but at the same time it excluded everything else: the diacritics of other languages, logographic scripts, non-Latin characters — all the other languages were simply impossible to bring in. This gets even more complicated in the specific case of Mandarin Chinese, because it would require so many more bytes of memory: if you imagine a script containing thousands of characters, how would they fit in the same design? There wasn't enough hardware for that, and there was no incentive for the US government to study how to do it, so the Chinese were on their own — we'll see later what happened there. And of course, some protocol for encoding characters was necessary so that machines could interface with one another and we could have networks; they needed this standardization, otherwise you just get garbled characters — mojibake — from using the wrong encoding. Nowadays we have Unicode, which aims to include all human languages plus a lot of extra characters.
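The mojibake mentioned above is easy to reproduce. A minimal sketch (the example word is my own) showing what happens when text is written with one encoding protocol and read with another — and what ASCII does with a language it was never designed for:

```python
# Mojibake: UTF-8 bytes mistakenly decoded as Latin-1.
# The Portuguese word "ação" ("action") garbles predictably:
original = "ação"
garbled = original.encode("utf-8").decode("latin-1")
print(garbled)  # aÃ§Ã£o

# ASCII cannot represent the diacritics at all -- encoding simply fails:
try:
    original.encode("ascii")
except UnicodeEncodeError as err:
    print("ASCII refuses:", err)
```

The same mismatch, applied across networks and file formats, is exactly why encoding standardization was unavoidable — and why whoever sets the standard holds the power.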
But programming languages and libraries, network protocols, and even some file formats still rely on ASCII in their internal representation of characters and strings. So when you're programming, it's often the case that you cannot name something using diacritics, and using Unicode characters in code is difficult because it breaks everything — code editors and tooling included. And, as I said, Mandarin had to circumvent the alphabet, and had to circumvent the QWERTY keyboard design as well, because that was the hegemonic one. It became a mammoth engineering task to devise different ways of inputting Chinese characters on screen. One early and influential method was the Wubi system, which breaks characters apart into radicals and components mapped onto the keys — you can see the QWERTY keyboard with the logographic characters decomposed into parts that are combined as you press them. The Wubi method is not so popular nowadays, though; in mainland China most people use the pinyin system, which works not by inputting the characters themselves but by romanizing Mandarin phonemes — writing them in the Latin alphabet and then choosing which character you mean. It's a whole system, and kids start learning it really early in school so they can put Chinese text into computers — which is remarkable to think about. At the beginning this really threatened the language itself, because the whole conversation was that computers were the future.
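The memory-cost argument above — that Chinese characters simply needed more bytes than the early designs allowed — can be seen directly in how modern encodings store text. A small illustration:

```python
# Bytes needed per character under UTF-8: ASCII characters take 1 byte,
# Latin characters with diacritics take 2, Han characters take 3.
for ch in ["a", "é", "中"]:
    print(ch, "->", len(ch.encode("utf-8")), "byte(s)")
# a -> 1 byte(s)
# é -> 2 byte(s)
# 中 -> 3 byte(s)
```

On 1960s hardware, where every byte of memory was scarce, tripling the storage cost per character — across a script of many thousands of characters rather than a few dozen letters — was exactly the design barrier the talk describes.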
How could they translate their own language, their own culture, into computers if that was an impossibility by design — by the hardware of the machine? It took so many years, and it's really striking to think that a language almost disappeared, in a sense, because of this. And when we go back to the idea of English as an almost naturally-born language for enacting computation: algorithms existed before there was even a name for them, and they existed in many different cultures, times, and places. There are even fields called ethnocomputing and ethnomathematics that bring this to attention. Of course, the term is a double-edged sword, because it also others: what counts as "traditional" mathematics, and what sits outside this hegemonic notion of mathematics as absolute truth? From there we get to a proliferation of alternatives — artists, researchers, and activists using programming languages based on different natural languages as part of their work. One example is wenyan-lang by Lingdong Huang, based on Classical Chinese characters; even the layout of the characters follows the old writing style. There is also Qalb (قلب) by Ramsey Nasser, written in Arabic. It's really interesting to look at Nasser's process in creating this language, because it highlights how difficult it is to make it even functional — and how making it functional is a radical act in itself, since, as I said, using non-ASCII characters makes a lot of text editors break the code and crash in predictable ways.
Making it minimally functional was the very objective of the project — to show this defiance of the machine. More recently there is also a fascinating project by Kodinya Dulepala called Praça, a programming language based on Telugu poetic grammar. Telugu is one of the Indian languages based on syllables, and that gives the code its own rhythm, which is super fascinating. I won't go into the intricacies here, but I really recommend taking a look, because the whole process is really interesting — it shows a bridge between poetic coding and even concrete poetry, because there are other possibilities for exploring the two-dimensional space of the characters on the screen. And again, there is Jon Corbett, who is developing the Ancestral Code operating system and the Cree# programming language, based on the Cree language. He is also designing keyboards and peripherals to go along with this customized operating system, to really demonstrate that other paradigms are possible — that it's possible to create machines that don't follow Western rules of thought, hierarchy, or even mathematical structure. And then, as a practitioner and PhD researcher, I was recently asked by my supervisor what would be involved in the design of a Brazilian situated programming language, or how I would tackle this issue.
It wasn't something I was super interested in at first, but I thought it was a really interesting challenge, because I believe the Brazilian context is very unique in this linguistic sense: we have Brazilian Portuguese as the official language, but there are so many languages from Indigenous peoples, from Afro-descendant communities, and even from the European immigrants who arrived mostly in the 19th and 20th centuries. That created a linguistic diversity that isn't really taken into account — it is forgotten, hidden, or violently erased from the country, a process that Carneiro and Sousa Santos call epistemicide, or violent epistemic practices. So I feel it would really be unfair to imagine a Brazilian programming language that would be in Portuguese, or in any other single Indigenous language, because the situation is much more complex than that. It's not only about bringing a specific language into the digital realm but — as Jon Corbett does in his project — about thinking through the ontologies and the structures of thought embedded in that specific culture and cosmogony. That prompted me to think that an interesting way of dealing with this problem might be to draw from the Anthropophagic Manifesto and its offshoots — such as Suely Rolnik's anthropophagic subjectivity, Recife's Mangue Beat musical and cultural scene of the 90s, and Eduardo Viveiros de Castro's Cannibal Metaphysics — which construct a distinct Brazilian identity through the symbolic absorption and devouring of the colonizer's dominant culture. This is, in a sense, also a double-edged sword: this globalized compatibility, this act of losing your own subjectivity in the process of assimilating hegemonic Western cultures. But I feel it's something really present in Brazilian culture.
And I feel that using this as a methodology — trying to design a language through this kind of cannibalistic process — could be interesting going forward. So yeah, I'm not presenting any solution here; the provocation is to question the power relations in programming and computation, perhaps to allude to other worlds, and to question the way computation and programming languages are so often tied to ideas of productivity and efficiency. As Keniston says, computers are considered valuable because they aid us in that quest. And perhaps the most revolutionary realization is to accept that anti-practices are also a valid stance against the unsustainable digital practices taken for granted by the affluent North, which will invariably collapse in a not-so-distant future; that there won't be a single universal framework that can effectively address these complex challenges; and that the diversity of cultures and ecosystems is exactly what the current computational paradigms lack. And that's it. Thank you.