Following up my initial tests with Sara and the forest club, I shot some more footage out and about in Hollingdean. I chose the old skatepark, because I love it, and the huge meadow that leads to the dew pond. My kids provided the action.
It was still horrendously difficult to line up, especially in the meadow, and I struggled to define the exact nature of the problem. There were still too many variables: keeping the Quizzer still enough to stay matched up in the x, y and z orientation, finding the right viewing position, and scaling the image. I had to zoom in a lot to approximate the right scale, and this meant distorting the image, which affected the match up.
I decided to try some high-res footage Nick Driftwood had made using six Panasonic Lumix cameras by the West Pier and bandstand on the seafront. Neil made buttons which would fix the x, y and z axes. This made it easier to make the initial match, but it was still tricky to match up the position and scale of near and far objects in relation to each other.
Why was this?
I had thought that if you take a picture with a wide-angle lens the image becomes distorted in terms of perspective, but I found out from photographers’ forums that this is not the case. If you take a very wide photo and a close-up (zoomed-in) photo from the same position, objects will appear in exactly the same positions in relation to each other. You can prove this by blowing up the wide photo to the same field of view as the close-up one and comparing the two.
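If you’re sceptical (I was), here’s a rough Python sketch of the idea. It assumes an idealised pinhole (rectilinear) camera, and the scene points and focal lengths are made up purely for illustration: where each point lands in the image depends only on where the camera is, and changing the focal length just scales every image coordinate by the same factor.

```python
# Rough sketch, assuming an ideal pinhole (rectilinear) camera.
# The scene points and focal lengths below are invented for illustration.

def project(point, focal_length):
    """Project a 3D point (x, y, z in metres, camera at the origin,
    looking down +z) onto the image plane of a pinhole camera."""
    x, y, z = point
    return (focal_length * x / z, focal_length * y / z)

scene = [
    ("railing (near)",  (1.0, 1.5, 3.0)),
    ("bandstand (mid)", (5.0, 3.0, 20.0)),
    ("pier (far)",      (-30.0, 5.0, 200.0)),
]

wide_f, zoom_f = 18.0, 90.0      # focal lengths in mm (made up)
scale = zoom_f / wide_f          # enlargement needed to match fields of view

for name, p in scene:
    wx, wy = project(p, wide_f)
    zx, zy = project(p, zoom_f)
    # Blowing up the wide image by `scale` puts each point exactly where
    # the zoomed image puts it -- the relative positions don't change.
    print(f"{name:17s} wide x{scale:.0f} = ({wx * scale:.1f}, {wy * scale:.1f}),"
          f" zoomed = ({zx:.1f}, {zy:.1f})")
```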
However, the nearer an object is in the foreground, the more accurate you have to be with the viewing position. A slight shift of position, up, down or side to side, will change where the foreground object sits in relation to the background. So viewing position is key.
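A back-of-the-envelope way to see why: how far an object appears to move against the background when the viewpoint slides sideways falls off with distance. In the sketch below (distances made up for illustration), a 5 cm wobble of the viewing position swings a 1 m object nearly 3 degrees across the background, but a 100 m object only a few hundredths of a degree.

```python
import math

# Rough sketch of why near objects are so sensitive to viewing position.
# The distances below are made up for illustration.

def apparent_shift_degrees(distance_m, sideways_shift_m):
    """Angle by which an object at `distance_m` appears to move against the
    far background when the viewpoint slides sideways by `sideways_shift_m`."""
    return math.degrees(math.atan2(sideways_shift_m, distance_m))

shift = 0.05  # a 5 cm wobble of the viewing position
for name, d in [("object at 1 m", 1.0),
                ("object at 10 m", 10.0),
                ("object at 100 m", 100.0)]:
    print(f"{name:16s} appears to move {apparent_shift_degrees(d, shift):.2f} degrees")

# The 1 m object shifts ~2.9 degrees, the 100 m object only ~0.03 degrees,
# so a tiny change of position breaks the alignment of foreground and background.
```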
Very wide lenses used in 360 photography do distort the image, but not in the way I thought. More research led me to tilt-shift lenses, which correct perspective by physically tilting or shifting the lens relative to the sensor. If you tilt the camera up to photograph a tall building, the image will show it narrowing towards the top, something our brains correct for, which means that to the naked eye this narrowing effect is not in evidence. A tilt-shift lens lets you keep the camera level and shift the lens upwards instead, so the image looks more like how we see it.
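Here’s a small sketch of the geometry as I understand it (again an idealised pinhole camera, with made-up building dimensions): tilting the camera up to fit a tall building in makes the top project narrower than the base, while keeping the camera level, as a shift lens lets you, keeps the verticals parallel and simply slides the framing upwards.

```python
import math

# Rough sketch of converging verticals, assuming an idealised pinhole camera.
# The building dimensions and tilt angle are made up for illustration.

def project(point, tilt_deg, f=1.0):
    """Project a world point (x, y, z) with the camera at the origin,
    tilted up by `tilt_deg` about its horizontal axis."""
    x, y, z = point
    t = math.radians(tilt_deg)
    y_cam = y * math.cos(t) - z * math.sin(t)   # point in camera coordinates
    z_cam = y * math.sin(t) + z * math.cos(t)
    return (f * x / z_cam, f * y_cam / z_cam)

base = [(-5, 0, 20), (5, 0, 20)]    # a 10 m wide facade, 20 m away
top = [(-5, 30, 20), (5, 30, 20)]   # the same facade, 30 m up

def image_width(corners, tilt_deg):
    (u1, _), (u2, _) = (project(p, tilt_deg) for p in corners)
    return abs(u2 - u1)

# Camera tilted 37 degrees up to fit the building in: the top projects
# narrower than the base, so the verticals converge in the photo.
print("tilted:", round(image_width(base, 37), 3), "at the base,",
      round(image_width(top, 37), 3), "at the top")

# Camera kept level (what a shift lens allows, with the lens shifted up to
# re-frame): base and top project the same width, so verticals stay parallel.
print("level :", round(image_width(base, 0), 3), "at the base,",
      round(image_width(top, 0), 3), "at the top")
```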
Maybe this was the key?
I took Neil down to the bandstand to test Nick’s 360 footage.
He agreed with me about the variables and suggested a test we could do. We would place crosses at different distances, take photos from different positions, and view them through the mirrorless Quizzer to see where the distortion happened.
To my surprise, even a photo of a very near object lined up perfectly with the background in position and scale, provided we viewed it from exactly the position it was taken from; it was just much harder to find that precise viewing position. Of course, from a user experience point of view it’s probably better to make sequences without very near objects.
Michael Danks from 4ground Media looked in on the event room and suggested we try a 360 image taken with his “two-eyed” camera, which takes very high quality photos from the left-eye and right-eye viewpoints. The next task is to test them in the Quizzer.