Back at Longside today.
Human sensors: asking people to note down the temperature, humidity, atmospheric pressure and decibel level around an oak tree.
Making a stop motion animation based on the data collected from the visitors.
Live data projected from the oak tree. The projection worked really well and looks very interesting in changeable weather; it was one of those sunshine-and-shower days, with real horizontal rain and wind and then suddenly sunshine and blue skies. At one point the visualisation turned yellow in the sunshine, which we'd not seen before. As the rain came in, the temperature dropped by the minute and the humidity rose to 99 per cent (although we wondered if that was caused by a build-up of rain on the sensor).
The activities seemed to work quite well and the animation is building up slowly.
Our ideas are also solidifying. There is still a big gap between the visualisation and the concepts for a physical sculpture: we are being pulled between the ability to create a dynamic augmented reality on screen and being tied to a physical space when you create a sculptural experience. We'd like to make the augmented space physical… which we did with Chemical Garden, but I am still unsure what materials would create the ethereal experience that the projection is creating.
Have been looking into CO2 as a solid substance and would like to create a tree out of it, but it is lethal: it can cause frostbite if you touch it and asphyxiation in large quantities. It sounds worse than the ammonia we used in Chemical Garden to create the salt crystal trees… I like it.
But not for this; I think this is likely to be mechanics and light. We are thinking this work will evolve and have different outcomes: at the tree (the sensors and speakers that enable you to hear a tree in another location), in the forest (tracking your journey through the forest), and in the gallery (visualising and interpreting the data as a sculpture).
Our goal for the weekend is to redo the visualisation so that the parameters are more meaningful, enabling people to 'decode' the effect of the data on the image, and also to let people standing in front of the visualisation interact with it.
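One way to make the parameters 'decodable' is to map each sensor reading to exactly one visual property, so a viewer can trace what changed in the image back to what changed at the tree. The sketch below is only an illustration of that idea, not the project's actual code; the parameter names, value ranges and visual mappings are all assumptions.

```python
def normalise(value, low, high):
    """Clamp a sensor reading into 0..1 within an expected range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def reading_to_visual(temp_c, humidity_pct, pressure_hpa, decibels):
    """Map each reading to one visual parameter, one-to-one,
    so the effect of each data stream stays legible.
    Ranges below are guesses at plausible outdoor values."""
    return {
        "hue": normalise(temp_c, -5.0, 35.0),            # cold = blue end, warm = yellow end
        "opacity": normalise(humidity_pct, 0.0, 100.0),  # humid air = denser image
        "scale": normalise(pressure_hpa, 980.0, 1040.0), # high pressure = larger forms
        "jitter": normalise(decibels, 30.0, 90.0),       # louder = more movement
    }

# A reading like the rainy-day one described above:
params = reading_to_visual(12.0, 99.0, 1003.0, 55.0)
```

With a one-to-one mapping like this, the 99 per cent humidity spike during the shower would read directly as the image becoming almost fully opaque, which is the kind of legibility the weekend rework is after.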
We will also do some audio tests to see how we can capture sound from the tree and what kinds of sounds we can pick up.