from CRUMB

Date:    Sat, 17 Dec 2016 10:10:35 +0000
Subject: Re: CRUMB discussion: Methods for studying the (after)lives of Internet art

From:    Annet Dekker <xxxxxxx@AAAAN.NET>

As promised, I'd like to tell you a little more about the acquisition by two museums in the Netherlands of digital and internet-based artworks. Although some of you might have seen the recent announcement already, to briefly introduce the project: Ward Janssen from MOTI museum in Breda has been working on it for more than a year now, and after much negotiation, selection and discussion between MOTI and the Stedelijk Museum in Amsterdam, contracts have been signed between the artists and the museums. I've only been involved in the final part, and in particular in the discussions around the work by Martine Neddam – hopefully Ward will have time in the coming days to follow up on the larger 'project'.

As for the work itself, you can imagine this was, and still is, a difficult case to acquire: for the museums one of the core questions was what exactly they were acquiring (nothing new there). It is a particularly interesting case because, since its inception, the work has evolved into numerous digital pieces, spread out over domains, subdomains and non-browser-based renditions that all originate in, and operate under the overall guise of, the same concept. In addition there are offline projects/objects. Martine told me once that she hardly remembers all the different iterations of the project.

Another important aspect is the question of what to do with the participatory dimension of the work, one of the most important elements that keeps it going. Besides the various privacy issues attached to such an acquisition, preserving a participatory work (not unlike the case of Twitter handing all its tweets/traffic to the Library of Congress) foremost requires continuous labour, because Neddam keeps close track of all the communication or 'traffic'; as she mentioned earlier, she sees the work as a communication/social platform. In several discussions with Neddam it became clear that the core of the work is its database – the point where all the action takes place and, more importantly, where all the information about the work is stored. This alone was difficult to explain to the museums: it is not the aesthetics, form or media but the underlying database that Martine wants to preserve – and she believes it is crucial to preserve it as a living database, or, as she recently remarked, through what she calls 'generative preservation'.
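Neddam's actual database is not public, so purely as an illustration of what a 'living database' might mean in practice, here is a minimal sketch: an append-only store that keeps accepting participatory contributions while never discarding the historical record. All names and the schema here are hypothetical, not the work's real structure.

```python
import sqlite3
from datetime import datetime, timezone

def open_living_db(path=":memory:"):
    """Open (or create) an append-only store for participatory 'traffic'."""
    db = sqlite3.connect(path)
    db.execute("""
        CREATE TABLE IF NOT EXISTS contributions (
            id        INTEGER PRIMARY KEY,
            received  TEXT NOT NULL,   -- ISO timestamp, useful for date-stamping versions
            author    TEXT,            -- may be pseudonymous; privacy matters here
            body      TEXT NOT NULL
        )
    """)
    return db

def contribute(db, author, body):
    """New participation is only ever appended, never updated or deleted."""
    ts = datetime.now(timezone.utc).isoformat()
    db.execute(
        "INSERT INTO contributions (received, author, body) VALUES (?, ?, ?)",
        (ts, author, body))
    db.commit()

db = open_living_db()
contribute(db, "anonymous", "a first message")
contribute(db, "anonymous", "a later reply")
count = db.execute("SELECT COUNT(*) FROM contributions").fetchone()[0]
```

The point of the sketch is only that preservation here means keeping the write path alive, not freezing a copy of the read path.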

Unfortunately the museums couldn't acquire the whole project – a twenty-year-long work would require an amount that would probably far exceed their budget. Instead they proposed to buy a separate project which could be shown by itself, offline. After several discussions in which the core of the work was explained – making clear that while it is perhaps possible to split off a part of the project, doing so would make little sense, since all the different parts are somehow connected and interlinked through the database – the museums asked about other solutions. Neddam's response was to see this whole process as the first step in a longer-lasting acquisition process. For this first phase she proposed the following: to sell the museums a version of the website – Version 01, a 'date-stamped' mirror of the site – together with documentation of the work at various times and in different forms (for instance, historical imagery and a screen-capture video of the site in operation). The concept of 'version' rather than 'edition' was important (reflecting earlier comments by Jon Ippolito in his article 'Death by Wall Label'), because it signals both the moment in time and the variability of the work, as well as its continuing development.
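To make the 'date-stamped' Version idea concrete: one hypothetical way to fix a mirrored copy to a moment in time is to ship it with a small manifest recording the version label, capture date, and a checksum over the mirrored files. None of this reflects the museums' actual procedure; it is just a sketch of the principle.

```python
import hashlib, json, os, tempfile
from datetime import datetime, timezone

def stamp_version(mirror_dir, label):
    """Write a manifest that date-stamps a mirrored site as a named Version."""
    digest = hashlib.sha256()
    for root, _dirs, files in sorted(os.walk(mirror_dir)):
        for name in sorted(files):
            if name == "VERSION.json":   # don't hash the manifest itself
                continue
            with open(os.path.join(root, name), "rb") as f:
                digest.update(f.read())
    manifest = {
        "version": label,                                   # e.g. "Version 01"
        "captured": datetime.now(timezone.utc).isoformat(),  # the date stamp
        "sha256": digest.hexdigest(),                        # fixes the mirror's state
    }
    with open(os.path.join(mirror_dir, "VERSION.json"), "w") as f:
        json.dump(manifest, f, indent=2)
    return manifest

# Demo on a throwaway one-page "mirror":
d = tempfile.mkdtemp()
with open(os.path.join(d, "index.html"), "w") as f:
    f.write("<html>a snapshot of the site</html>")
m = stamp_version(d, "Version 01")
```

Because the checksum covers the files, any later divergence between the sold Version and the living site is detectable, which is precisely what distinguishes a version from an open-ended edition.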

Some interesting propositions follow from this approach, to mention a few:

– in what way can the participatory aspect of the work be transferred? (Besides teaching someone to handle the database, we're also looking at methods from performance art here, for example the ways Tino Sehgal transfers knowledge about a performance to a museum.)

– how to involve the larger community around the work, for instance the mouchette network and its previous 'caretakers'; for example, how to create a distributed network that helps take care of the work (one of Neddam's main wishes is that the work will survive through an evolving network)?

– what can be learned from newer methods such as blockchains and version-control systems for tracking and tracing changes over time?

– what are the limits of mirror sites and/or virtualisation for institutions?
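The blockchain and version-control question above can be sketched in a few lines: the essential mechanism the two share is a tamper-evident chain in which each recorded state commits to the hash of the previous one. This is an illustration of the principle only, not a proposal for any specific system.

```python
import hashlib, json

def add_version(chain, description):
    """Append a new state whose hash covers the previous entry (git-/blockchain-style)."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {"n": len(chain), "description": description, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return entry

def verify(chain):
    """Any later alteration of an earlier entry breaks every subsequent hash."""
    for i, entry in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in entry.items() if k != "hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev"] != expected_prev or recomputed != entry["hash"]:
            return False
    return True

chain = []
add_version(chain, "Version 01: date-stamped mirror")
add_version(chain, "Version 02: database snapshot")
```

For an evolving work, the appeal is that the full history of changes becomes part of the record itself: provenance is tracked by the same structure that stores the versions.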

In the coming year, supported by the museums (as part of the contract), Neddam will investigate these questions and hopes to present some outcomes within a year, which will lead to a second phase in the acquisition process. In the meantime, she is very open to others acquiring other Versions of the work.

Obviously there are loose ends to this 'story'; for the sake of readability and clarity I have tried to focus on the propositions rather than on all the discussions and their outcomes (hopefully this will end up in a book some time in the future). Of course there are pros and cons to this approach, but what was important to me is that it shows attitudes are changing and museums are opening themselves up to alternative approaches and methods. The discussions made clear that the museums are not trying to tame the work – although perhaps initially they were, they have since changed their tune and listened to the artist, trying to get as close as possible to the artist's wishes and to preserve the work by prolonging its ever-changing life. I'm very keen to hear your reactions and welcome any comments, advice, questions and feedback!


A response by Michael Connor from Rhizome


Hi Annet, this is a very interesting case study for an acquisition. In our research for the Net Art Anthology I spent some time this month tracking down missing pieces from Martine's contribution to the web zine Why Not Sneeze?, and was struck by how large the site already was when it was hosted on xs4all, prior to its move to its own domain in 1999.

With regard to the question about how participation can continue:

I wanted to draw the list's attention to the very fine work done by my colleague Dragan Espenschied on Muntadas' The File Room, which was built in ColdFusion. To extend the lifetime of the project, Dragan imaged the server; the image now continues to run, still in ColdFusion, on Rhizome's servers. In the process Dragan made some small bug fixes, but chose not to fix the charming broken <TR> tag on the front page.

In Dragan's work on that project, Muntadas felt that The File Room had to continue accepting submissions, so it was not a good idea to create a static snapshot in a stable format such as WARC. The project was therefore captured in a way that left its architecture largely unchanged while allowing it to keep functioning dynamically.
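For readers unfamiliar with the format: a WARC file is simply a sequence of self-describing records, each a block of headers plus a captured payload. The sketch below hand-writes one simplified response-style record to show why such a capture is static – the payload is frozen at crawl time. Real tooling (warcio, for instance) handles many details omitted here; this is illustrative, not spec-complete.

```python
from datetime import datetime, timezone

def warc_record(target_uri, payload: bytes) -> bytes:
    """Serialise one simplified WARC-style record for a captured page."""
    headers = [
        ("WARC-Type", "response"),
        ("WARC-Target-URI", target_uri),
        ("WARC-Date", datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")),
        ("Content-Length", str(len(payload))),
    ]
    head = "WARC/1.1\r\n" + "".join(f"{k}: {v}\r\n" for k, v in headers)
    # Record layout: headers, blank line, payload, then a blank-line separator.
    return head.encode() + b"\r\n" + payload + b"\r\n\r\n"

record = warc_record("http://example.org/", b"<html>frozen snapshot</html>")
```

A replay tool can serve this payload forever, but nothing written after the crawl can ever enter it – exactly the property that made WARC the wrong fit for a work that must keep accepting submissions.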

One difference between this project and Mouchette was that there was never an idea that Muntadas himself was vetting submissions to The File Room, whereas this conceit is crucial to Mouchette.

I think the versioning idea is a good strategy too, although it seems the versions should carry some sort of subtle overlay indicating the lack of participation. On the technical side, I'd urge you (and everyone on the list) to check out the remote-browser capability of our web-archiving tool, which allows you to create and replay stable archives of websites that use Flash and Java. In this case there is some use of Flash, which presents future playback problems.

I hope the museum will be collecting those offline projects!!! Really wish I had a flyer from Mouchette Live at Triple X in 1997.


