speculative digital tactics for historical inquiry.
Experiments in sonification, transmediation, remix, glitching, AI & other algorithmic techniques for new insights (& insounds) into the past.
The ductility of artifacts when they become digital data (or are born digital) offers untapped potential for generating compelling new historical knowledge. This is true not only for text or numbers handled at large scales as “big data,” but also for less well-studied digital materials such as visual artifacts. When images become digital data, they are pliable in new and surprising ways.
We can transfer image data into sound, for instance, so that we can examine images with our ears as well as our eyes. We can use computer algorithms to glitch images, letting the computer produce new versions whose randomized distortions allow for fresh eyes on what the original artifacts represent about the past. We can remix images more strategically using collage tactics, shuffling their elements into new forms that help us better consider the historical content they contain. Or we can use the trained algorithms of “artificial intelligence” to generate new versions of existing artifacts and data. The goal of speculative digital tactics for historical inquiry is not to falsify the empirical record, but to harness sonification, transmediation, glitching, remix, and trained algorithmic AI in order to notice details, aspects, information, and meanings that artifacts contain and that might, once perceived, transmit the past in fresh and unexpected ways.
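As one minimal sketch of the sonification tactic, consider sweeping across a grayscale image column by column and turning each column into a chord. This assumes the image is already available as a 2D list of 0–255 brightness values; the row-to-frequency mapping (220 Hz rising a semitone per row) is a hypothetical choice for illustration, not a fixed method.

```python
import math
import struct
import wave

def sonify_image(pixels, sample_rate=8000, col_duration=0.05):
    """Sweep a grayscale image left to right, turning each column into a
    chord: each row gets a fixed sine frequency, and a pixel's brightness
    (0-255) sets that tone's loudness."""
    n_rows = len(pixels)
    # Hypothetical mapping: row 0 sounds at 220 Hz, each row a semitone higher.
    freqs = [220.0 * 2 ** (r / 12) for r in range(n_rows)]
    samples_per_col = int(sample_rate * col_duration)
    samples = []
    for col in zip(*pixels):  # iterate over columns of the image
        for i in range(samples_per_col):
            t = i / sample_rate
            s = sum(b / 255.0 * math.sin(2 * math.pi * f * t)
                    for b, f in zip(col, freqs))
            samples.append(s / n_rows)  # keep the mix within [-1, 1]
    return samples

def write_wav(path, samples, sample_rate=8000):
    """Save the samples as a 16-bit mono WAV file for listening."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sample_rate)
        w.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples))
```

Bright regions of the image become louder partials as the sweep passes over them, so the ear can pick out spatial patterns that the eye may have passed over.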
With these approaches, we read against the grain of the archive by actually changing the grain itself. But we do so with direct links to what the artifacts represent. The goal is re-representation in service of greater truth-telling: an expansion of noticing through tweaking and distortion, a bending of the circuits of historical perception in order to glimpse the past more accurately.
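The grain-changing described above can be sketched at its crudest level as databending: randomly rewriting bytes in an image file's body while sparing its header, so the file still decodes but renders with distortions. The 64-byte header size below is an illustrative assumption, not a constant of any real image format.

```python
import random

def glitch_bytes(data, n_flips=10, header_size=64, seed=None):
    """Randomly corrupt bytes in an image file's body while leaving the
    header untouched, so the file still opens but displays glitched."""
    rng = random.Random(seed)  # a fixed seed makes a glitch reproducible
    buf = bytearray(data)
    for _ in range(n_flips):
        i = rng.randrange(header_size, len(buf))
        buf[i] ^= rng.randrange(1, 256)  # XOR with nonzero: byte must change
    return bytes(buf)
```

Running this repeatedly with different seeds yields a family of distorted variants of a single source image, each a slightly different re-seeing of the original.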
Speculative digital tactics shift digital humanities away from the positivist orientation of much recent computer science, data science, and computational humanities, which threatens in troubling ways to narrow knowledge to only what computers can know. Instead, we strive to seize the means of computation for epistemological liberation, coupling computational recalibrations with expanded human perceptiveness. This is work in the computational tradition of augmentation rather than automation. Returning, digital computers in hand, to the modernist defamiliarization or estrangement (ostranenie) approach of Russian Formalists such as Viktor Shklovsky, we strive to bring history forward. It becomes, as it has always been, only more so, a story of all that artifacts, evidence, and the empirical and representational record can show and tell us.
To pursue this work, I have experimented with what existing tools can do, from Adobe Photoshop to Michel Rouzic’s Photosounder to combinations of ChatGPT with the Suno.ai tool. Cross-disciplinary partnerships with computer scientists, artists, media studies scholars, visual studies specialists, and museum curators could eventually lead to a suite of open-source applications that would allow for more effective pursuit of speculative digital tactics for historical inquiry.