Work by Zan & Tianjun


The idea is to create an interactive interface on which emotional connections and gentle conversations can happen between two living creatures. Through the act of combing the tree, which triggers projection and sound interaction, the audience can reveal the latent living process: subtle, invisible, but ceaseless. Combing is a humanistic, intimate, and non-aggressive form of physical interaction, and combing a horizontal tree is more like playing an old instrument, revealing and recreating from the existing texture of the tree. That texture is broken but rhythmic, static yet full of time, hard but still warm to the touch. The projection and sound that respond to the combing add a rich sensory experience to this physical activity, so people become more engaged with the material. Different modes of projection and sound make different sensations accessible and revisitable, and bring a certain delicacy, elegance, and sense of strangeness.
This project is both my ICM final project and my Pcomp final project. I assisted my teammate in developing her Pcomp midterm project. Apart from the physical connections for the distance sensor and force sensor and the Arduino code, most of our effects are built in p5.js. I am mainly responsible for the visual part of the code, and my teammate is responsible for the sound and communication parts.


Idea Development

For the final project, we want to add another projection layer and make the sound more dynamic. We are trying to find more dimensions to describe the interaction between the tree and the comb: the graphic pattern and roughness of the surface, the force used when combing, the combing gesture, the direction, and how long we have been combing.

To capture information across all these dimensions while combing the tree, there are several approaches we could take:

  1. Use photoresistors mounted on the tree to read where the comb is. If these sensors would stick out and interfere with the comb, they could be hidden in the shadow of the wrist.

  2. Use a microphone mounted on the comb to collect, or even record, the sound of combing. The real-time volume data can reflect the roughness of the surface and how gently or harshly we comb the tree.

  3. Use a trestle with force sensors to support each side of the tree trunk. According to the principle of leverage, the ratio of the upward supporting forces on the two sides indicates where the comb is along the trunk. The lateral supporting force can represent the friction between the comb and the tree.

  4. Use a brightness sensor mounted on the comb to read the texture of the tree. But this method requires a rigid, steady lighting environment.

  5. Use a camera mounted on the comb, or piezos mounted on the teeth of the comb, to collect data. But these are not as easy to work with.
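For option 3, the leverage calculation can be sketched in a few lines of plain JavaScript; the function and variable names here are ours for illustration, not taken from the project code:

```javascript
// Estimate where the comb is along the trunk from the two upward
// supporting forces measured at each end of the trestle.
// Torque balance about the left support (ignoring the trunk's own
// weight, i.e. after taring the sensors):
//   forceRight * length = W * x, with W = forceLeft + forceRight
//   => x = length * forceRight / (forceLeft + forceRight)
// forceLeft, forceRight: sensor readings in any consistent unit
// length: distance between the two supports
function combPosition(forceLeft, forceRight, length) {
  const total = forceLeft + forceRight;
  if (total === 0) return null; // nothing pressing on the trunk
  return (forceRight / total) * length;
}

// Pressing exactly in the middle loads both sensors equally:
console.log(combPosition(2, 2, 100)); // 50
// Pressing closer to the right support loads the right sensor more:
console.log(combPosition(1, 3, 100)); // 75
```

The nice property of the ratio is that it cancels out how hard the person presses, so the position estimate stays stable even when the combing force varies.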


For the projection part, we are thinking about projecting a virtual image of the cross-section underneath where the comb is. It would be like the tree rings changing as the comb travels from one end to the other, telling the history of the tree. It is supposed to be a dynamic image generated from noise in p5/Processing. The lengthwise surface of the tree also reminds me of the blood vessels under the skin, or the wrinkles on a face.


The Final Look



For the first mode of projection and sound, we hope the projection will evoke the latent living process inside the tree: the ceaseless liquid flowing through the vascular bundles, the rough knots and fissures on the trunk, the wrinkles on skin… And the melody creates a feeling of nostalgia, distance, and lifelong memory, like fire burning leaves.



code link

For part of the visual code in sketch 1, I borrowed the main body frame from a sketch by @Technomupet67 on OpenProcessing, because it matched how we imagined the sketch 1 pattern. The main idea is that each line is composed of a circular particle system, where each particle is given a life span, spawning and then disappearing. In the move() function, I used Perlin noise to generate a random angle, and used the iteration index to control the number of waves in each line, resulting in lines that are messy yet orderly.
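The particle life cycle described above can be reduced to a small, p5-free sketch. This is a stand-in, not the actual project code: the real sketch uses p5's noise(), which here is replaced by any function returning values in [0, 1), and all names are illustrative.

```javascript
// Minimal stand-in for the sketch 1 particle life-cycle logic.
class Particle {
  constructor(x, y, lifespan) {
    this.x = x;
    this.y = y;
    this.lifespan = lifespan; // frames left before the particle disappears
  }

  // Move along an angle derived from noise; `t` is a noise-space
  // coordinate and `waviness` plays the role of the iteration index
  // that controls the number of waves in the line.
  move(noiseFn, t, waviness) {
    const angle = noiseFn(t) * Math.PI * 2 * waviness;
    this.x += Math.cos(angle);
    this.y += Math.sin(angle);
    this.lifespan -= 1;
  }

  isDead() {
    return this.lifespan <= 0;
  }
}

// Drive one particle with a fake noise function until its life runs out:
const p = new Particle(0, 0, 3);
const fakeNoise = () => 0.25; // constant "noise", angle = PI/2, so it moves straight up
while (!p.isDead()) p.move(fakeNoise, 0, 1);
console.log(p.isDead()); // true
```

In the full sketch, a new particle would be spawned where a dead one disappears, which is what keeps the lines continuously redrawing themselves.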



For the second mode, we tried to use graphics derived from a sand-like waving pattern to add a liquid layer to the existing tree texture. The sound sample was recorded while gently agitating a large pool of water. When people comb, the force sensor mounted on the comb reports the strength of the combing and triggers the sound of agitating water, as if people were playing with the ‘liquid‘ on the surface.
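The trigger logic for the water sound can be sketched as a small edge-triggered threshold check, so the sample plays once per press rather than restarting on every sensor reading. The threshold value and all names here are our own illustration, not the project's actual code:

```javascript
// Play the water sample only when the combing force crosses a
// threshold, and don't retrigger while the comb stays pressed.
const FORCE_THRESHOLD = 300; // raw analog reading, 0-1023 on an Arduino

function makeWaterTrigger(playSound) {
  let pressing = false;
  return function onForceReading(force) {
    if (!pressing && force >= FORCE_THRESHOLD) {
      pressing = true;
      playSound(); // e.g. waterSample.play() with p5.sound
    } else if (pressing && force < FORCE_THRESHOLD) {
      pressing = false; // released; ready to trigger again
    }
  };
}

// Feed it a ramp of readings; the sound fires once per press:
let plays = 0;
const trigger = makeWaterTrigger(() => plays++);
[0, 100, 400, 500, 200, 350].forEach(trigger);
console.log(plays); // 2
```

This kind of debouncing matters with force sensors, whose readings jitter; without it the sample would stutter every frame the comb is held down.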



code link


Sketch 2 uses the p5.Vector class to control the moving direction and speed of the particles. The idea is similar to sketch 1: first, the Perlin noise function generates Y coordinates as X gradually increases, producing wavy lines. At the same time, the dist() function measures the distance between the mouse and each particle on the line; if it is less than 150, the particle changes direction based on the mouse position and spreads away, then recovers to its previous position afterwards.
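The two pieces of that logic, the noise-driven resting line and the distance-based mouse reaction, can be sketched without p5 as follows. dist() here mirrors p5's dist(), the noise function is a stand-in for p5's noise(), and all names and constants are illustrative rather than copied from the sketch:

```javascript
// P5-free sketch of the sketch 2 idea: particles rest on a noise-driven
// wavy line, and any particle within RADIUS of the mouse is pushed away.
const RADIUS = 150;

function dist(x1, y1, x2, y2) {
  return Math.hypot(x2 - x1, y2 - y1);
}

// Build the resting wavy line: y is noise-driven as x increases.
function wavyLine(noiseFn, count, spacing, amplitude) {
  const pts = [];
  for (let i = 0; i < count; i++) {
    pts.push({ x: i * spacing, y: noiseFn(i * 0.1) * amplitude });
  }
  return pts;
}

// Push particles within RADIUS of the mouse directly away from it;
// in the real sketch they then ease back to the line over time.
function repel(points, mx, my, strength) {
  return points.map((p) => {
    const d = dist(p.x, p.y, mx, my);
    if (d === 0 || d >= RADIUS) return { ...p }; // out of range: untouched
    const push = strength * (1 - d / RADIUS); // stronger when closer
    return {
      x: p.x + ((p.x - mx) / d) * push,
      y: p.y + ((p.y - my) / d) * push,
    };
  });
}

const line = wavyLine(() => 0.5, 5, 100, 40); // flat "noise" for the demo
const pushed = repel(line, 0, 0, 30); // mouse near the left end
console.log(pushed[0].y !== line[0].y); // nearest particle moved
console.log(pushed[4].x === line[4].x); // far particles are untouched
```

Scaling the push by how close the mouse is (rather than a fixed offset) is what makes the spread look liquid instead of mechanical.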



For me, the ICM course felt a bit short. I wish I had more class hours to use p5 for more visual animation. My experience of ICM was like my experience of visual language: the process is painful, but at the same time full of surprise and growth. A good visual effect requires more than logically complete functions; it also needs constant adjustment of the parameters that determine color and object movement. Trying it over and over is like working in Photoshop and Illustrator. But p5 turned the code into an intuitive visual presentation, and it gave me unexpected visual surprises whenever the project succeeded.