“Using the latest in VR technology, Thresholds will restage one of the earliest exhibitions of photography in 1839, when British scientist William Henry Fox Talbot first presented his photographic prints to the public at King Edward’s School, Birmingham.”
This differed from my previous encounters with VR, which have used Google Cardboard apps. Whereas Cardboard offers a 360-degree visual experience, the sense of presence, or immersion in the unreal world, is limited; whilst it is possible to look around at everything filmed by the camera or generated by the app, it is not possible to interact with, or impact upon, the scripted world.
There is an often-overlooked difference between 360-degree video and virtual reality: the latter offers the opportunity to participate in a simulated world, alongside a fuller sense of immersion or presence in the simulation. VR requires more computer processing power, hence its association with head-mounted displays (HMDs) and earphones connected to a computer, rather than a viewer holding a smartphone against the eyes.
I have, nonetheless, enjoyed Google Cardboard apps immensely, although after today these charming worlds will seem a bit tame.
Thresholds, then, employs sophisticated technology to simulate an environment in which the viewer can walk around freely, and in which a further sense of presence is afforded by the ability to see one’s hands as orange clouds, by holding them up in front of the headset (there was a short time-lag, of a second or two, until the ‘hands’ appeared). The virtual hands could interact with documents in cabinets within the environment; swiping at a document caused it to ‘leap’ out so that it could be examined more closely. This did not work well for me, however. Although I managed to summon up one leap, the document pretty much smacked me in the face and then scurried swiftly back to its place in the cabinet. (I felt a bit like Ron Weasley in Harry Potter, when the spell simply doesn’t work for him.) Further swipes, tried on all the other documents, were ineffective. On querying this with one of the technicians, I was told it was likely to be my bad swiping technique.
There are clear implications for libraries, archives, and museums here, however. The short briefing given before we entered the environment recounted that archivists had been consulted in the design of the program; this type of simulation could allow anyone, anywhere, to examine virtual renderings of rare, fragile documents at a time and place convenient to them, personally (assuming good swiping technique).
I found the allotted six minutes too short. I really wanted to stay in this unreal world, which, although somewhat cartoon-like, was delightful. I dutifully noted the features mentioned to us in the briefing – the mice scampering across the floor, the cobwebs in the crevices, the moths fluttering around the lamps and the swirling smog outside the virtual windows. The sounds of the 1839 rioters seemed a bit remote, but I remember hearing them. The fireplace emitted real heat, although to me the flames appeared bright green. The background ticking of the clock, shown above the entrance, was somehow comforting.
I didn’t like the heavy headset. We were warned to make sure the contraption was comfortable before we entered the simulation, but even though the headgear seemed comfortable to begin with, I soon felt the need to readjust the way the visor sat against my eyes. I had made the headset too tight in order to stop it slipping; it soon felt as though it was pulling at my lower eyelids, and consequently my vision seemed a little blurry.
Other participants appeared in the simulation as white ghosts, to avoid collisions – there was another time-lag effect here, as people appeared (to me) to be either stationary or to move at lightning speed to another position.
Another participant asked about glasses – the headsets don’t adjust for vision impairment, and in a short demonstration such as this, I would agree that this is a bit too much to hope for. However, vision correction is something VR designers should think about, as wearing glasses under a headset is annoying and uncomfortable.
I haven’t commented on the exhibition itself; the artist Mat Collishaw did not set out to recreate the original event, but to create something new, based upon original likenesses, documents, and archival materials. I don’t think it matters that we don’t have enough knowledge about the original exhibition to recreate it exactly. We are in a different time now, and the artist’s creative connection with the past was certainly enough to spark interest in the history of photography, and indeed the social context in which photographic developments occurred.
There are parallels between this virtual recreation of Fox Talbot’s first photography exhibition and attempts to recreate performances from archival documents. Notably: to what extent is it ever possible to recreate an event, or an occasion of any sort? Our DocPerform project considers this question, along with the more fundamental issues of how we define and record documents, and how we approach the processes of documentation. What can technologies such as VR offer in documenting performance?
Leaving the conceptual questions of documentation aside, the technology itself raises issues. How can we remove the interface? The face visor is clumsy. It reminds even those of us who are more than willing to jump into virtual worlds that we have something physical and uncomfortable stuck to our faces. How could we improve the design of VR systems? Contact lenses, perhaps? Some other small, unnoticeable brain-computer interface?
Further, a more immersive environment could be encouraged by enhanced use of sound, and by employing technologies to replicate smell and touch.
But no matter. Mat Collishaw (@matcollishaw) is to be congratulated on this fabulous installation. Look at what is there. And look at what we see through the headset. It’s not bad.