Research on e-books with Wikimedia content/Interview with Mondotheque

From Meta, a Wikimedia project coordination wiki

We had the pleasure of interviewing Mondotheque, a collective of artists, archivists and activists, on the 13th of October 2016. Mondotheque is a notebook for experiments with the legacy of the universalist and documentalist Paul Otlet: drawings, images, systems, ideas. The platform is hosted by Constant, the Brussels association for arts and media, and is named after La Mondothèque, Otlet's design for an imaginary device: a thinking machine that could be at the same time archive, instrument, workstation, catalog and broadcasting machine. The project is inspired by the obstinate spirit of Paul Otlet, and wants to look at the way knowledge is managed and distributed today in a way that allows us to invent other futures and different narrations of the past.

For technical reasons, the interview took place on an etherpad. The text reproduces the authorship colours of the pad:

FS: Femke Snelting
AV: Alexia de Visscher
NR: Natacha Roussel
MM: Martino Morandi
SF: Sandra Fauconnier
CC: Cristina Cochior
LD: Lucia Dossin

CC: In the introduction of the book you mention that the reason you use the wiki is the analogy between Otlet’s “intellectual machine” and the wiki as a thinking machine that allows the user to gather and categorise content. Did you encounter any missing or cumbersome features that made the process messier?

AV: Probably the context of the research is itself "messy". By this I mean that we didn't know beforehand which shape it would take, and working in the wiki was a way to think the project through while exploring a tool in relation to Otlet's thought. Even the 'structure' of the group research was undefined, moving, with different scales of collaborations, meetings, places.

FS: Also, some mess we made ourselves, sometimes on purpose ... One of the "features" that transformed the wiki into a thinking machine was our transclusionism extension [ ].

SF: Yes, that's common on Wikipedia too. (mostly on documentation pages)

FS: Our idea was to transclude both ways, to exchange relevant texts between articles. Meaning: paragraphs of interest are exchanged between two or more articles. This idea started when we were analysing Le Traité: how to write between texts, how to use the resource itself (the text had just entered the Public Domain) to produce a perspective on it? How to make the link between the printed and then digitised texts tangible, and vice versa?
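The mondotheque "transclusionism" extension is custom and not documented here, but its mechanics resemble MediaWiki's labeled-section transclusion, where a span of wikitext marked with paired `<section>` tags can be pulled into another page. A minimal Python sketch of that extraction step, as an illustration only (the function name and sample page are hypothetical):

```python
import re

def extract_section(wikitext, name):
    """Pull the span between <section begin=NAME /> and <section end=NAME />
    out of a page's wikitext, as labeled-section transclusion does."""
    pattern = (
        r'<section\s+begin="?' + re.escape(name) + r'"?\s*/>'
        r'(.*?)'
        r'<section\s+end="?' + re.escape(name) + r'"?\s*/>'
    )
    match = re.search(pattern, wikitext, flags=re.DOTALL)
    return match.group(1).strip() if match else None

# Hypothetical source page containing one marked paragraph.
page = """Intro text.
<section begin="quote" />Paragraphs of interest travel between articles.<section end="quote" />
Outro text."""

print(extract_section(page, "quote"))
# -> Paragraphs of interest travel between articles.
```

Two-way transclusion, as described above, would apply this extraction in both directions, so that the shared paragraph appears (and stays editable) in each article.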

AV: And it was a way to contribute to Wikisource: getting into the text, we corrected some last 'misprints' at the same time.

SF: So, in cases where you re-use and quote pieces of text?

FS: For example, on p. 31 of the book/pdf, two paragraphs are "transcluded" from an online interview (which is actually not printed; it exists only online) and from a text by Dennis Pohl. The text from page 31 shows up again on page 169.

LD: What were the weird things that started to happen / broke the system down?

FS: We installed the TimedMediaHandler and MwEmbedSupport plugins because we wanted to organise precise clips of video material with translated subtitles, while making the full media files and transcriptions available too. Jan Gerber, who developed these plugins for Wikipedia, helped install them; the ability to have multi-lingual subtitles as separate, editable pages is wonderful, but at the same time brittle and not that easy to discover as a user. It makes the hairs of sysadmins go grey, and it breaks at every update. [ + ] It was also interesting to see how MediaWiki is text-based and has trouble considering images on the same level. Technically, file pages are similar to text pages, but in the plugins, interfaces and default listings they behave radically differently. For example, in the SIMILE Timeline (which used to be one of the proud showcases of what can be done with Semantic MediaWiki), those file pages only show up as a link, with a repeated "undefined" for each image. [ ]

LD: Could you expand on that a bit?

FS: The project somehow started with the frustration of seeing digitised documents from the Mundaneum archives appear on the Google Cultural Institute: no downloads, no proper license and no structural links back to the catalogues of the actual archive, let alone a mention of the people who had done the work. One way to respond was to distribute a scrape of the Mundaneum digital images published on the Google Cultural Institute website: 24.000 digital tiles that make no sense as digital objects but might become something else one day [ + ]. The base material on the mondotheque wiki is not text but a collection of images: taken from floating WeTransfer zips that people sent us, images of the current political landscape that we downloaded, pictures we took ourselves in the archives, or portraits of all the different people that we started to identify in the way the Mundaneum project travelled. Understanding what these digital objects meant, together and in different constellations, was how the wiki started. It also seems that when you work with MediaWiki in this way, its many historical layers, the way the project has grown organically alongside Wikipedia, come to the foreground.

SF: So it was an attempt to make sense of almost a chaos of input? Or did you have some ideas where you wanted to go?

FS: Hmm. I think it was more a way to find patterns, to see similarities and dissimilarities in what seemed chaotic at first sight but in fact wasn't so much; it was more about figuring out how all the many layers interplayed.

AV: It was a way to get into Otlet's thoughts with a tool permitting us to pass through the core of his research. A kind of immersion, and at the same time a place to make diversions, to step aside from his radical and univocal ideas.

MM: And the wiki also became a provisional site for showing glimpses of the 'chaos' to potential collaborators and accomplices, to invite them to add some more. [ ]

SF: And by doing that, also approach / understand the way Otlet worked? And/or comment upon that?

MM: I think there was an interest in making a conceptual relation between Otlet's modern positivist stance and the NPOV approach of Wikipedia, which we can consider a post-modern way of attaining a substitute for 'true knowledge'.

NR: Well, I guess as soon as you decide on a fixed format (the book), you have to find some form of organisation that is different from the hyperlink that structures online information; these features, whether hyperlinks or transclusions, are hard to transcribe.

CC: Is it for this reason that you opted for a semantic wiki?

NR: I guess the semantic wiki seemed to offer more possibilities for internal linkage and horizontal organisation, but this is still to be defined.

FS: As a direct comparison between a search engine (Google) and the Mundaneum did not make sense, it seemed more interesting for us to see how the Semantic Web project aligned with the ideas that Otlet and La Fontaine had about the wonderful things that can happen when you pre-structure information into little 'bits', and how these could then come together in unsuspected and useful ways later.

CC: (I found it very interesting how the images arranged side by side built their own narrative and enriched the research in this way...)

FS: Of course, we soon realised that Semantic MediaWiki comes with a very specific idea of what can/should be linked (hotel locations, for example, or the names of famous people and their favorite color), so with our kind of content, which is multifaceted by nature, the system quickly broke down.

AV: As every element of the wiki is a page (image, subject and even category), it was very exciting to work in a horizontal way, without any classification as organisation. In parallel to the Universal Decimal Classification, for instance, we considered every level of the classification as a page, and developed a horizontal tree of the research.

MM: The semantic wiki also came with a promise of potential from eventual categorical shifts, semantic searches, transversal ways of linking materials that could be left 'open', so as to leave a certain degree of indetermination in the direction the whole wiki would go, while still being very precise in certain places. I think to some extent it happened, more in the online version than in the printed publication. And yes, at some point weird things started to happen and we were dragged into the indexalist's delirium...

AV: The semantic wiki allowed us to generate pages from fragments of content, from different levels, as Otlet would do when he wrote with his index cards.
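The "semantic searches" and generated pages mentioned here are typically expressed in Semantic MediaWiki as #ask queries, which the software also exposes through its web API (`action=ask`). A hedged sketch of building such a request in Python; the wiki address and the property names (`Depicts`, `Date`, `Location`) are invented for illustration:

```python
from urllib.parse import urlencode

def ask_url(api_base, conditions, printouts):
    """Build a Semantic MediaWiki 'ask' API request: select pages matching
    the [[...]] conditions and return the requested properties."""
    query = "".join(f"[[{c}]]" for c in conditions)
    query += "".join(f"|?{p}" for p in printouts)
    return api_base + "?" + urlencode({
        "action": "ask",
        "query": query,
        "format": "json",
    })

# Hypothetical query: all images annotated as depicting the Mundaneum,
# together with their date and location properties.
url = ask_url(
    "https://example.org/api.php",
    ["Category:Image", "Depicts::Mundaneum"],
    ["Date", "Location"],
)
print(url)
```

A query like this is what lets materials regroup in "transversal" ways: the same file page can surface in many generated listings without being filed under a single classification.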

SF: What also interests us: you then 'poured' this process into a print book, which is very linear in nature. Why, and what were your thoughts behind it?

FS: We called the book an 'incision' ;-)

AV: We have to say that the book is not a mirror of the wiki work; contributions from outside also expand the outlook. But indeed, in a way, it is a publishing object. Is the wiki also a publishing work?

FS: This idea of the "thinking machine" was at first more related to how we imagined the wiki to work. But when we decided to make a book, it made sense to think of the wiki as part of this constellation, and not to separate the process of making the book from the work we were doing on the wiki. With Andre Castro, weasyprint and some make recipes, we managed to make that happen.

AV: Also, the publishing work took place between the wiki and the book, in the interval between these two ‘media’: in the process of embedding a certain kind of content from one specific format into another. In this in-between space a lot happened.

FS: The transclusions are one way of breaking the linearity. The chapters are another way, they somehow tell the same story from different perspectives. Also the images repeat, and appear in different listings (I am now typing a report of a discussion that is happening here).

MM: The book allows different ways of circulation, so some of the materials of the wiki, and some contributions, were thought in the direction of a printed object.

FS: I think it was not so much a "pouring" from one place to another, but a reciprocal process: the two locations of the project have been massaging, or kneading, each other into shape.

SF: It's like a cut in an architectural structure?

FS: Yes, a cut through. It hurts!-)

AV: The wiki is not really architectural; more organic, I would say. Even in this HTML-to-print structure, apparently architected by the structure of the HTML and CSS, it was not so clear: the text was flowing, always escaping and taking different forms at each rendering.

SF: But a very conscious cut.

FS: Of course. It now circles back to some of the agents we spoke to, worked with, discussed with; it allowed us as a working group to take a step back and look at the cut to see what that means. A book is a making-public of quite a different kind than a wiki.

CC: Do you see the book as a complementary object to the wiki or the other way around?

MM: The wiki and the book (and the walks, and the workshops, and the public events too) overlap with each other a lot, so they are not strictly complementary; the mondotheque project consists of them and of the relations between them.

AV: The book is a 'moment', but it's not a fixed moment: we could generate it again during the process of work, and it would have a different content each time.

CC: Did you already plan on producing a book from the start, or did that idea emerge while the project evolved?

MM: No, it emerged somewhere in the middle of the project.

FS: Thinking about possible inter-relations between books, documents and digital content has been part of mondotheque's reflections and discussions [ ]. At some point it seemed interesting to try to work with printing ourselves, especially once we discovered that the printer that produced Otlet's Traité de documentation still existed in Molenbeek (we think the presses are still there, but we never managed to go inside)! Printing a book was a way to delineate the project in time, even if it was possibly just a temporary halt.

CC: We noticed that some of the software options of the wiki were later on embedded into the physical publication, such as the disambiguation concept. How did you make this selection?

FS: Yes! Some of the software options offer a very interesting site where you can playfully question their functioning and their meaning at the same time. Since one of the main issues we wanted to deal with was the fiction of the Google-Otlet sameness, disambiguation (and ambiguation) became a very useful resource.

NR: Disambiguation very much fitted our topic that was transversal to different epochs and places.

CC: Did you consider the possibility of using Wikimedia content while working on the project?

FS: There are several images from there included, but they are probably not easy to identify as such. We did make extensive use of the Wikisource effort to publish a corrected OCR of Le Traité de documentation [ ], and the analysis that we made would not have been possible without it [é_de_documentation ].

MM: We also invited a member of Wikimedia to a Mondotheque workshop, who brought up the option of actually uploading materials from the Mundaneum to Wikimedia... so maybe the reverse will also happen?

CC: Are there any plans for the project after the book finale?

NR: Yes, this still needs to be done: clarifying it and making it easier to grasp.

FS: The wiki had to be 'bent' into strange shapes in order to make a printed book from it; we sometimes had to trick it to end up with saner HTML to feed into weasyprint. Also, the MediaWiki software recently had to be updated, which broke half of the Semantic MediaWiki extensions. But it might take a little while for us to get back to it.

MM: We should gather a reworkathon?? Also, we don't look at the book as a finale, rather as a step. Now that there is a book with some of the wiki content in it, we want to think together about what can still happen with the rest. Some of the texts might appear in other places, by the way.

FS: For example, Geraldine Juarez will present another iteration of her pre-emptive history of the Google Cultural Institute as part of DiVersions, on 4 December in Brussels [ + ], and the curriculum in amateur librarianship is being implemented in Graz and other places [ + ].

LD: We're also interested in technical aspects like the need to 'bend' the wiki to produce/generate a specific format. Could you please tell us more about the issues you encountered and the solutions you came up with?

MM: MediaWiki is not really made to be printed... So the way it worked for us was finding the most basic, stripped-down rendering of the page (by way of the &action=render mode of viewing a wiki page) and later adding CSS styles to that basic HTML. Then Andre's hand-crafted Python script reconstructs the structure of the book from the Index, downloads all the linked content such as pictures, and finally uses weasyprint to convert the whole thing to the final PDF... [ ]
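The route MM describes can be sketched in a few lines of Python. This is not Andre's actual script, only a minimal reconstruction under stated assumptions: the helper names, the example page title and the `print.css` stylesheet are hypothetical; only the `&action=render` mode and the weasyprint step come from the interview.

```python
from urllib.parse import quote

def render_url(wiki_base, title):
    """URL of the stripped-down, chrome-free rendering of one wiki page,
    using MediaWiki's &action=render view."""
    return f"{wiki_base}/index.php?title={quote(title)}&action=render"

def assemble_html(fragments, css="print.css"):
    """Concatenate the rendered page fragments into one styled document
    that a CSS-to-PDF engine can paginate."""
    body = "\n".join(f"<article>{f}</article>" for f in fragments)
    return (f'<html><head><link rel="stylesheet" href="{css}"/></head>'
            f"<body>{body}</body></html>")

# The final step would hand the assembled string to weasyprint, e.g.:
#   from weasyprint import HTML
#   HTML(string=html, base_url=wiki_base).write_pdf("book.pdf")

html = assemble_html(["<p>Chapter one</p>", "<p>Chapter two</p>"])
print(render_url("https://www.mondotheque.be", "LesUtopistes"))
```

In the real pipeline, the list of titles would be read from the book's Index page and each fragment fetched over HTTP before assembly; linked images would be downloaded so weasyprint can resolve them locally.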

SF: What did you learn / take away from this process? Do you think you might do something similar for another project in the future?

AV: My experience with the wiki in this context of research is more of a starting point; it opens onto a continuation, though not necessarily with the wiki specifically.

MM: This process made me stop hating wiki and start loving transclusions.

NR: I also got very much into the wiki