For my Pcomp final project, Jasmine and I are planning to work together on a responsive smart plant concept.
We decided to emphasise mutualism – the interaction between the plant and the human, and humanising the plant. The entire interaction of a user approaching the plant should feel like a conversation.
Here are all the thoughts, ideas, and concepts that came to mind and helped us narrow down our concept.
A crisper project proposal for our concept would be as follows –
A responsive plant that gathers and shows data about the physical environment, sensing things that are usually invisible.
We envisioned the interaction as follows – initially, the screen shows the plant’s historical data; once a human approaches, the plant responds by turning towards them and then shows a real-time data visualisation on the screen.
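The interaction above can be sketched as a tiny state machine. This is only an illustrative sketch: the sensor, the threshold, and the function name are all assumptions, not part of our actual build.

```python
# Hypothetical sketch of the plant's interaction loop: while nobody is
# near, show historical data; when someone approaches, turn towards
# them and switch to a real-time visualisation.

APPROACH_THRESHOLD_CM = 100  # assumed distance at which the plant "notices" you

def plant_state(distance_cm):
    """Decide what the plant should do for one distance-sensor reading.

    Returns a (visualisation_mode, turn_towards_user) tuple.
    """
    if distance_cm < APPROACH_THRESHOLD_CM:
        return ("realtime", True)   # person nearby: turn and show live data
    return ("historical", False)    # idle: show the plant's historical data

print(plant_state(250))  # ('historical', False)
print(plant_state(40))   # ('realtime', True)
```

In a real build, the distance reading would come from something like an ultrasonic sensor, and `turn_towards_user` would drive a servo.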
This was the setup we had for our playtesting. We framed and wrote down a number of questions beforehand, which we asked our users.
- Is a plant turning as people approach it enough feedback? Do we need more/other feedback? Like lights that increase intensity depending on the strength of EMF? What about haptic feedback (if we can text people’s phone or something?)
- Scale (size of the plant, supporting structure?) Number of plants?
- Is the data we’re trying to visualize interesting?
- Where should the visualization exist?
- Is the interaction engaging over a sustained period of time?
- Does the idea of an invisible environment make sense?
- What kind of plant? And how might this affect the impression/sensation?
Questions about the visualization itself:
- Pull data from other places in comparison?
- Historical versus present data?
- Where should this be presented (projection versus screen, online?)
We gathered some interesting insights from our playtesting:
- People really did like the plant movement as a response. They liked it turning, and thought this acknowledgement was powerful.
- Many of them suggested we add prompts to our interactions – for example, turning on a mobile phone’s light, after which the system would show the visualisation on the screen (EMF sensors).
- Could you wave your cellphone at the plant, get waves on the screen, and base the interaction on human-generated EMF?
- One interesting suggestion from a user was to give the plant a character – it could greet you and say hello as it turns towards you. It could then invite you to have a conversation and guide you on how to talk to it.
- We also got a lot of feedback on our visualisations.
- One was to have a toggle switch or some way to go back to the previous historical visualisation.
- Could the visualisation not look like data, but like splashes of colour or smoke coming out of the screen?
- Can you not show me a graph in the data visualisation?
- Could we include augmented reality in our project in some way?
- Could we have the plant talk about its benefits? How to make people empathise with the plant?
We mapped out a system diagram to finalise our interactions and the components to be used in the project.
Some examples of what the visualisations could look like.