During a residency at Blast Theory, I discovered a new way of merging video with a live environment – a serendipitous discovery that came directly from my artist inquiry: how to animate a landscape for a person walking through it. I made a prototype device, The Quizzer, with technologist Neil Manuell, and a proof-of-concept sequence during the Reframed Project Development Residency at Lighthouse. On the back of that I became an XR Resident at Fusebox Brighton, a collaborative R&D space for tech start-ups and artists with ideas around virtual, augmented and mixed reality.
I had stumbled on an exciting solution to the fundamental contradiction of my practice: how to use video or image sequences to heighten a live environment for a walker without distracting them from the here and now. But I didn’t know how this new invention worked. I didn’t know whether differences in individual eyesight would affect the experience. I was massively excited about its potential and wanted to find its limitations, and so understand what I could do with it.
At Fusebox I called it an AR hack as I fought my impostor syndrome, realising I didn’t know how to talk about the discovery in this new context. People around me were developing products and use cases. Was it a product? I had never developed a product before. Did I want to focus on making it one, or was it more important to make work with it first?
I have worked on the idea with developer Neil Manuell from the beginning, but his availability for the project is limited. The a-n Bursary would fund the ten-month residency at Fusebox, which meant having access to the Immersive Lab and the expertise of fellow residents. I wanted to learn the basics of Unity and how to process 360 images and video so I could rough out and test ideas before getting Neil involved.
My aims were to find out what I could do with the invention and how it could intersect with XR, and to become more independent with the digital processes needed to develop work with it.