A simple PIR (passive infrared) motion detector
Basic motion sensors are surprisingly small and fiddly. They need to be connected to GPIO pins on the Raspberry Pi which you can do using simple leads that push into place, or you can solder joints for a more permanent connection. Soldering is also fiddly.
Once you have your wiring in place (ensuring that you get the connections correct to avoid power surge and subsequent sensor meltdown) you can write some Python code to interrogate the status of the GPIO pins and find out if the sensor has sensed anything. If it has – bingo, you are on your way. What would you like to do with this information? Now you are back on terra firma and *just* have to do some creative coding that responds to the movement. Game on.
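That polling loop can be sketched like this. It is a minimal illustration, not my actual code: the `sensor` argument stands in for whatever exposes the GPIO reading (on a real Raspberry Pi you would pass in something like `gpiozero.MotionSensor(4)`, and the pin number is an assumption that must match your own wiring).

```python
import time

def watch_sensor(sensor, on_motion, polls=None, interval=0.1):
    """Poll a PIR sensor and call on_motion each time it newly triggers.

    `sensor` is anything with a boolean `motion_detected` attribute -
    on a Raspberry Pi that would be gpiozero.MotionSensor(4) (the pin
    number is an assumption; match it to your wiring). `polls` caps
    the loop for testing; leave it as None to run forever.
    """
    was_moving = False
    count = 0
    while polls is None or count < polls:
        moving = bool(sensor.motion_detected)
        if moving and not was_moving:
            on_motion()          # the sensor has sensed something - bingo
        was_moving = moving
        count += 1
        time.sleep(interval)
```

Passing the response in as a function (`on_motion`) keeps the creative coding separate from the plumbing, so the same loop can drive any artwork.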
The sensors that I tried were only moderately or intermittently sensitive. Sometimes it seemed as though you could dance a waltz in front of them and nothing would happen, other times the merest twitch would set them off. This made it difficult to generate consistent responses for even relatively simple scenarios, such as the presence or absence of a viewer in front of the sensor. For more complex scenarios, such as detecting when the viewer is moving away from the sensor, the success rate was lower still.
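One way to tame that inconsistency is to smooth the raw readings rather than react to every single trigger. This is a sketch of the idea, not the sensor's API: report presence only when a majority of recent polls agreed, with the window and threshold values being assumptions to tune against your own sensor.

```python
from collections import deque

def make_presence_detector(window=5, threshold=3):
    """Return a function that smooths noisy PIR readings.

    Feed it one raw True/False reading per poll; it reports presence
    only once at least `threshold` of the last `window` readings were
    True, so a single twitch (or a single missed waltz) does not flip
    the result. The defaults are illustrative, not measured values.
    """
    recent = deque(maxlen=window)

    def update(raw_reading):
        recent.append(bool(raw_reading))
        return sum(recent) >= threshold

    return update
```

For example, a detector with the default settings stays quiet on a lone spurious trigger and only reports a viewer after several confirming readings.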
In truth I hit loads of snags along the way – it is the nature of programming to have things fall over. Searching for the error online often provides a fix, but there were several times when I reached the outer limits of my own technical capabilities. This is where being part of a community of coders helps. I was amazed at how knowledgeable others are and how deeply they understand what is going on.
For example, I needed to refresh the screen with cat pictures, but the Raspberry Pi couldn’t cope – it ran out of memory and the refresh rate got slower and slower. I thought this must be an absolute limit of the Raspberry Pi and that I would need to use something more powerful to run the code, but it turned out that I just needed to empty the trash. Including a line to explicitly clear down the memory after every cat display solved my problems.
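The fix amounts to dropping the reference to each image once it has been shown and explicitly collecting garbage, so memory use stays flat instead of creeping up with every cat. A minimal sketch, assuming the loading and drawing functions are passed in (on the Pi itself they might be `pygame.image.load` and a blit-and-flip routine):

```python
import gc

def cat_loop(paths, load, show):
    """Display each image in turn, clearing down memory after each one.

    `load` and `show` are placeholders for whatever loads an image
    into memory and draws it on screen. The key line from the text is
    the explicit clean-up after every cat display.
    """
    for path in paths:
        image = load(path)   # e.g. pygame.image.load(path)
        show(image)
        del image            # drop the reference to the big object
        gc.collect()         # explicitly clear down the memory
```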
This may sound fairly straightforward (and in some ways it was) but things are only ever easy if you know the answer; otherwise, minutes can turn into hours and hours into days whilst trying to solve what seems to be a trivial problem.
The alternative would be to pay someone to code everything for me; and this is tempting, except that (a) I don’t have the money and (b) I think it is important to get to grips with some of the code myself. Some of the best outcomes I have created have come from unexpected glitches during my search for a solution.
There are so many options to choose from when starting out with interactive artworks that it can seem overwhelming. Arduino, Pico, Python, MicroPython, Raspberry Pi… where to start? So it is important to orientate yourself and take some time to do a little homework.
I went along to the Feeling Machines Weekender in Bristol in the spring to get a flavour of what was possible and to talk to others about what they were doing and how they got started. The key seems to be to find a community which you can join and get help and advice when you need it. There is a thriving creative coding community in Bristol, but that is just too far for me in Cambridge. So I joined my local MakeSpace Raspberry Pi club instead.
The good thing about Raspberry Pi as a starting point is that it is relatively cheap (around £40), its windowed desktop interface feels familiar, and it is designed to be an entry-level product to start you coding. The CamJam EduKit #2 for just £9 gives you step-by-step instructions for wiring up a simple motion sensor and writing the Python code that goes with it. I was on my way.
Inside a Raspberry Pi
My aim was to create an artwork that responded to the presence of the viewer, and I wanted to mimic the way in which social media keeps people hooked.
So my project was going to show some content on a screen, and every time the viewer started to move away to look at something else, I wanted my artwork to make some sort of sound and refresh the content in an attempt to lure the viewer back.
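The trigger logic boils down to spotting the moment presence flips from "there" to "gone". This is a sketch of that one step, not the actual Catatonic code; the sound and refresh functions are hypothetical stand-ins for whatever the artwork does.

```python
def lure_step(viewer_present, was_present, play_sound, refresh_content):
    """One poll of the lure-them-back logic.

    When presence flips from True to False - the viewer is starting
    to move away - make a noise and refresh the content to lure them
    back. Returns the new state to carry into the next poll.
    """
    if was_present and not viewer_present:
        play_sound()       # some sort of sound to catch their attention
        refresh_content()  # fresh content as the lure
    return viewer_present
```

Run inside the sensor loop, this fires exactly once per departure rather than continuously while the viewer is absent.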
The end result was Catatonic, which I showed as a prototype at Cambridge MakeSpace Open Studio in July.
I thought that this would be a relatively straightforward first project that uses a motion sensor to detect presence and then trigger a response accordingly, but it turned out to be surprisingly difficult to achieve. Read on to find out how I got on.
Catatonic at Cambridge Makespace