Text by Victoria Gray
Technicality (Part 1)
The technical dimension of the project has challenged me, exposing my lack of technological skills. As a performance artist, my techniques are techniques of the body, not of ‘hardware’ and ‘software’. To address this, a formative meeting was set up with Dr Andy Hunt, Senior Teaching Fellow in Music Technology and Chair of the Electronics Teaching Committee at the Department of Electronics, University of York. Andy was able to advise on some of the technical questions that were arising and, most importantly, to put biofeedback technology in a wider cultural context by sharing a lecture he was preparing on emergent technologies.
As an artist I work mainly with my body and occasionally with hand-made objects, so this was a jolt to my system and made me feel ‘other’ to the process. However, Dr Andy Hunt’s expertise was especially relevant to our project, since his own research is in Interactive Sonification (the science of displaying data as sound, with real-time user interaction), a field in which he is known internationally for his pioneering work.
At this point it was critical to let Oliver and Alex correspond over the technical aspects of the project, which from my perspective read like a programming language I did not always fully understand… Despite this, I opted to follow the threads and contribute to the discussions, making sure the eventual performative outcome was kept in the foreground. Acknowledging the skills of others means placing trust in one’s collaborators, and that trust is vital: it allows diverse skills to be brought together for the mutual benefit of all the artists involved.
In the following posts it seems appropriate to share a selection of correspondences between myself, Oliver and Alex which flesh out the technical dimensions of the project. Publishing them here as non-linear ‘raw data’ shows the networked dimension of this project and the extent to which the software language is (for me at least) highly nuanced.
Correspondence 1
Question: V and O to A
How are the sensors built? How are they attached to the body? What is their battery life? What is the API (Application Programming Interface)?
Answer: A to V and O
Each sensor consists of a central module and an accessory, such as adhesive wings, a clip or a band. The module has an EXG sensor (sEMG, EKG, EEG), an accelerometer, memory, and a microcontroller with an integrated BLE radio. sEMG is the muscle-activity sensing modality in the MyoLink sensor.
The sensor runs on a 2032 coin cell, either a rechargeable lithium-ion cell or a single-use cell of the same form factor. It streams raw EXG data at roughly 2,000 samples per second and yields roughly four hours of continuous operation at that rate, with battery life increasing as the transmission rate decreases.
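The relationship between transmission rate and battery life can be sketched with a simple back-of-the-envelope calculation. This is an illustrative assumption, not a figure from the MyoLink specification: it takes the roughly four hours at roughly 2,000 samples per second quoted above and assumes battery life scales inversely with the streaming rate.

```python
REFERENCE_RATE_SPS = 2000   # ~2k samples per second of raw EXG (from the correspondence)
REFERENCE_HOURS = 4.0       # ~4 hours of continuous operation at that rate

def estimated_runtime_hours(rate_sps: float) -> float:
    """Estimate battery life at a given streaming rate, assuming the radio
    dominates power draw and life scales inversely with the rate.
    This linear model is a simplification for illustration only."""
    return REFERENCE_HOURS * (REFERENCE_RATE_SPS / rate_sps)

print(estimated_runtime_hours(2000))  # 4.0 hours at the full rate
print(estimated_runtime_hours(500))   # 16.0 hours at a quarter of the rate
```

In practice the scaling would not be perfectly linear, since the sensing electronics draw power even when the radio is quiet, but the sketch captures the trade-off described in the answer.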
We currently use an API running on a PC (C#/.NET) with an ANT dongle attached. This grabs the raw data coming out of the sensor and saves it; the PC app also includes a live graph of the raw data.
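The capture pattern Alex describes, reading raw samples from the dongle, saving everything, and keeping a recent window for the live graph, can be sketched as follows. This is a hypothetical Python outline rather than the actual C#/.NET app; `read_packet` is a stand-in for whatever the real ANT driver exposes, and the fake samples exist only to keep the sketch self-contained.

```python
import collections

def read_packet(t: int) -> list[int]:
    """Stand-in for a dongle read: returns one packet of raw EXG samples.
    A real driver call would replace this placeholder."""
    return [(t * 8 + i) % 1024 for i in range(8)]  # 8 fake 10-bit samples

def capture(num_packets: int, window_size: int = 32):
    """Collect packets, logging all samples and keeping a rolling window."""
    log = []                                      # full record, for saving to disk
    live = collections.deque(maxlen=window_size)  # recent samples, for the live view
    for t in range(num_packets):
        samples = read_packet(t)
        log.extend(samples)   # everything is kept, as the app saves the raw data
        live.extend(samples)  # the deque silently drops samples older than the window
    return log, list(live)

log, live = capture(10)
print(len(log), len(live))  # 80 samples logged, last 32 kept for the live graph
```

The design choice mirrored here is the split between an append-only log (the saved raw data) and a bounded buffer (the live view), so plotting never has to scan the whole recording.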