
This project went through many imaginary iterations before it landed where it ended up.

In initial conversations between my collaborator, Marcela, and me, we considered making a physical object – a box of grass that moved as if being blown by the wind when people walked past. I was very excited by the challenge the idea offered, but after a few conversations with peers and more experienced makers, Marcela was feeling less than convinced. So we both agreed to move the project to the screen, thinking this might somehow simplify things.

At some point in the conversations about a move to the screen we ended up changing our idea from grass blowing in the wind to a dandelion blown away by the wind. This happened mostly because we felt more attached to the idea – both aesthetically and personally.  We loved the nostalgia and the playfulness of it.

To get things started we decided Marcela would spend some time on the design, while I figured out more about the sensors.

We first made the mistake of purchasing a PIR motion sensor, thinking this was what we needed to figure out if there was movement in the room. But these sensors turned out to be way too sensitive and fairly difficult to do any nuanced detection with. Disappointed, we explored further options and discovered the Distance Measuring Sensor, which seemed perfect for our task: even though we did not need to measure distance, it let us adjust the level of sensitivity.

I got to work:


And quickly discovered that I needed to find a library for this sensor. I found the NewPing library on the Arduino site and started figuring out what I was capable of doing with this sensor. I approached it like an adventure.

I was having fun and soon discovered that this library let me detect motion with multiple sensors, send data only when a sensor was triggered within a certain range, and figure out which of multiple sensors had been triggered.

At first I thought, “great. I will be able to tell direction based on the order of how the sensors are triggered. AND I will send serial data to p5 in a string of two numbers [first sensor triggered, second sensor triggered] and will be able to adjust movement in the sketch accordingly.”

Because of how the code in Arduino worked, creating this type of string actually proved incredibly difficult, and after meeting with TK – for a long time – I decided I would try the simplest approach and just create movement based on whichever sensor was triggered first.
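The simplest approach boils down to a tiny mapping from "which sensor fired first" to a wind direction. This is a hedged sketch with an assumed layout (sensor 0 on the left, sensor 1 on the right), not the actual code:

```cpp
// Hypothetical mapping from the first-triggered sensor to a horizontal
// movement direction for the p5 sketch. Layout is an assumption.
int windDirection(int firstSensor) {
    if (firstSensor == 0) return +1; // left sensor fired: blow right
    if (firstSensor == 1) return -1; // right sensor fired: blow left
    return 0;                        // no trigger: no wind
}
```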

It was still very difficult to sort out, but I started with a simple sketch in p5 and eventually I made it work:

(this video was shot at 1am after I had been sorting out, and failing at, this code since about 2pm. but i did it! and felt very accomplished)

In the meantime Marcela came up with some designs for our Dandelion and created 2 p5 sketches. One built out the visuals of the flower and another showed the movement she imagined. They looked beautiful.

From there she spent many hours with Moon and eventually, with his help, ended up with a sketch that combined her visions and had movement that was triggered by a key press.

Now I took on trying to trigger the movement based on input from the sensors and figuring out how to change the directions of the movement in the sketch. Not fully understanding how the code worked (and discovering that my collaborator also didn’t), this was very frustrating to try and accomplish. But Shawn Van Every saved the day. I flagged him down after hours of poring over documentation about particle systems, and he took pity on me, showing me where the movement was triggered and why what I was trying to do had failed so far. Thank god!!

He also helped me figure out how to get the sketch to keep reloading itself based on a timer I had already set up.

I also set up some time with my professor (hey Jeff!) and managed to work out some of the more detailed aspects of getting the movement to happen with some control, i.e. getting the sensor data to stop affecting the movement after the first trigger occurred, which kept the dandelion petals from moving all over the screen as the sketch constantly received data that the sensors were being triggered.
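That "stop listening after the first trigger" behavior is essentially a latch. Here is a small C++ sketch of the idea (the struct and names are my own illustration, not the project's code): once the first trigger is consumed, further sensor data is ignored until the sketch resets.

```cpp
// Hypothetical latch: accept only the first trigger after each reset.
struct TriggerLatch {
    bool armed = true;  // true until the first trigger is consumed
    int direction = 0;  // direction captured from the first trigger

    // Returns true only for the very first trigger after a reset().
    bool onSensorData(int dir) {
        if (!armed || dir == 0) return false;
        direction = dir;
        armed = false;  // ignore everything until reset()
        return true;
    }

    void reset() { armed = true; direction = 0; }
};
```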

Meeting with Jeff also helped me figure out how to organize my code a little better by setting up “stages” in the way I wrote the sketch (wait, receive data, animate, stop data, reload, restart).
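The stage names above come straight from the post; as an illustration (the code itself is my sketch, not the actual p5 sketch), the stages can be modeled as a simple state machine that cycles back to waiting:

```cpp
// Hypothetical state machine for the stages: wait, receive data, animate,
// stop data, reload, restart, then back to waiting.
enum Stage { WAIT, RECEIVE_DATA, ANIMATE, STOP_DATA, RELOAD, RESTART };

Stage nextStage(Stage s) {
    switch (s) {
        case WAIT:         return RECEIVE_DATA; // a visitor approached
        case RECEIVE_DATA: return ANIMATE;      // first trigger captured
        case ANIMATE:      return STOP_DATA;    // petals blowing; ignore sensors
        case STOP_DATA:    return RELOAD;       // timer expired
        case RELOAD:       return RESTART;      // rebuild the dandelion
        case RESTART:      return WAIT;         // back to waiting
    }
    return WAIT;
}
```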

My final code looks like THIS.

(I just noticed a small mistake in the notes in my code. I will fix soon. Can you spot it?)

The final product is actually quite stunning and I have to say I am very proud.  We even took the time to build a bamboo box that holds the sensors. It looks good!

Here is Marcela demonstrating how it works:

Here it is failing and triggering wrong (after working correctly at first):

(If time allows, I would love to figure out how to send the serial string I first envisioned. If I could do that, these kinds of mistakes wouldn’t happen.)
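For what it's worth, the two-number message itself is easy to format; the hard part was the Arduino-side logic for capturing the trigger order. A hedged sketch of the message format (the comma-separated layout is my assumption):

```cpp
#include <string>

// Hypothetical serial message: first- and second-triggered sensor indices
// as one comma-separated line, e.g. "0,1\n".
std::string triggerMessage(int firstSensor, int secondSensor) {
    return std::to_string(firstSensor) + "," + std::to_string(secondSensor) + "\n";
}
```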

I was very grateful to see the kind of excitement this project produced in my fellow classmates. It is playful and fun to watch. People seemed to really want to know about it and engage with it.

Look how fun it is!
