We managed to make our plant turn after countless experiments. We used a lazy Susan and mounted the plant on a circular acrylic sheet, with a slip ring so the wires stay intact as it rotates.
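For reference, the rotation could be driven with something as simple as the minimal sketch below. It assumes a continuous-rotation servo under the lazy Susan with its signal wire on pin 9; the pin and the servo choice are placeholders for whatever motor is actually used.

```cpp
// Minimal sketch: spin the lazy Susan with a continuous-rotation servo.
// Assumes the servo signal wire is on pin 9 and that power and ground
// pass through the slip ring; adjust for your own wiring.
#include <Servo.h>

Servo turntable;

void setup() {
  turntable.attach(9);        // servo signal pin (assumption)
}

void loop() {
  turntable.write(100);       // slightly above 90 = slow clockwise spin
  delay(5000);                // turn for five seconds
  turntable.write(90);        // 90 = stop for a continuous-rotation servo
  delay(5000);                // rest for five seconds
}
```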


This is the final project brief, which we revised after getting valuable feedback during the second playtesting session.

We want to bring forth the way you are each affecting the surrounding environment and sensing each other's presence in different ways. We plan to donate the money (a quarter paid by each person to interact with the plant) to 350.org, a global grassroots climate movement that holds our leaders accountable to the realities of science and the principles of justice.

Trees and plants provide important carbon sequestration benefits, which we hope to highlight with the CO2 data we collect. The piece is also an exchange with a non-human, generally immobile, but living being: you communicate with it through your breath and physical presence, while it communicates through photosynthesis and the representation of its data.

The inputs we’re reading for the plant’s movement are proximity to the plant (measured using distance sensors) and a switch triggered by the coin drop. The sensor data we’re gathering will be part of the associated visualization, which right now only includes real-time data. We were going to have the coin drop toggle between real-time and historical data, but based on feedback in class we are now considering a slider instead, which would compress or expand the time scale.
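As a rough illustration of the input side, here is a minimal Arduino sketch that reads a distance sensor and the coin switch and prints both over serial. The HC-SR04 ultrasonic sensor and the pin numbers are assumptions, not necessarily our exact parts.

```cpp
// Illustrative sketch: read proximity (HC-SR04 ultrasonic) and the
// coin-drop switch. Pin numbers and the specific sensor are assumptions.
const int TRIG_PIN = 7;     // ultrasonic trigger
const int ECHO_PIN = 8;     // ultrasonic echo
const int COIN_PIN = 2;     // switch closed by a falling coin

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  pinMode(COIN_PIN, INPUT_PULLUP);   // switch pulls the pin LOW when closed
}

void loop() {
  // Fire a 10 microsecond pulse and time the echo to estimate distance in cm.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // time out after 30 ms
  float distanceCm = duration * 0.034 / 2.0;

  bool coinDropped = (digitalRead(COIN_PIN) == LOW);

  Serial.print("distance_cm:");
  Serial.print(distanceCm);
  Serial.print(" coin:");
  Serial.println(coinDropped ? 1 : 0);

  delay(200);
}
```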

When the user drops a coin into the coin box, we want the following page to appear so that the user knows where the money is going.
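One way to do that, sketched below, is to debounce the coin switch on the microcontroller and send a single serial event that the on-screen code listens for before showing the donation page; the pin and the message text are placeholders.

```cpp
// Sketch of the coin-drop trigger: debounce the switch and emit one
// "COIN" line over serial, which the on-screen sketch can use to show
// the donation page. Pin and message text are placeholders.
const int COIN_PIN = 2;
int lastState = HIGH;                 // with INPUT_PULLUP, HIGH = no coin
unsigned long lastChange = 0;

void setup() {
  Serial.begin(9600);
  pinMode(COIN_PIN, INPUT_PULLUP);
}

void loop() {
  int state = digitalRead(COIN_PIN);
  if (state != lastState && millis() - lastChange > 50) {  // 50 ms debounce
    lastChange = millis();
    if (state == LOW) {               // switch just closed: coin detected
      Serial.println("COIN");
    }
    lastState = state;
  }
}
```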


We have seen a significant difference between the day and night readings of the microphone and CO2 sensors. Based on our research, we also expect to see changes depending on the number of people around the plant, but we still need to experiment with that.

Here is a glimpse of how our responsive plant interaction looks.


The following sketch appears on the screen; it is a real-time visualization of the ambient data around the plant.

We have our API page, https://helloplant.herokuapp.com/plantapi, which collects data every second.
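For a sense of how readings could reach that endpoint once a second, here is a hedged sketch assuming a WiFi-capable board (an ESP32) and made-up JSON field names; the actual posting code and payload format may differ.

```cpp
// Sketch: post one reading per second to the plant API from an ESP32.
// WiFi credentials, analog pins, and JSON field names are placeholders.
#include <WiFi.h>
#include <WiFiClientSecure.h>
#include <HTTPClient.h>

const char* WIFI_SSID = "your-network";
const char* WIFI_PASS = "your-password";
const char* API_URL   = "https://helloplant.herokuapp.com/plantapi";

void setup() {
  Serial.begin(115200);
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);                        // wait for the connection
  }
}

void loop() {
  int co2  = analogRead(34);           // placeholder analog pins
  int mic  = analogRead(35);
  int soil = analogRead(32);

  WiFiClientSecure client;
  client.setInsecure();                // skip certificate check; fine for a sketch
  HTTPClient http;
  http.begin(client, API_URL);
  http.addHeader("Content-Type", "application/json");
  String body = String("{\"co2\":") + co2 +
                ",\"mic\":" + mic +
                ",\"soil\":" + soil + "}";
  int status = http.POST(body);        // HTTP status code, or negative on error
  Serial.println(status);
  http.end();

  delay(1000);                         // roughly one reading per second
}
```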

We plan to have a slider on the plant that the user can interact with to see historical ambient data. The ambient data includes CO2 sensor, microphone, and soil moisture readings.
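Because the slider sits on the plant itself, it could be read as a simple slide potentiometer and mapped to an "hours back" value for the visualization to use when scrolling through history. The sketch below assumes the wiper is on A1 and a 48-hour window, both of which are placeholders.

```cpp
// Sketch: map a slide potentiometer on the plant to an "hours back" value
// that the visualization can use to scroll through historical data.
// The pin (A1) and the 48-hour window are assumptions.
const int SLIDER_PIN = A1;
const int MAX_HOURS_BACK = 48;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SLIDER_PIN);                      // 0-1023
  int hoursBack = map(raw, 0, 1023, 0, MAX_HOURS_BACK);  // 0 = live data
  Serial.print("hours_back:");
  Serial.println(hoursBack);
  delay(250);
}
```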