In a street performance, the street itself is the crew of the street musician.
This AR experience dynamically picks artifacts from the street musician's location and makes them react to the music in real time, as if the artifacts were performing with the musician.

The first task was to change the visual properties of the augmented virtual objects according to properties of the music, such as pitch, beat, and note.

After trying out various examples from these links, I managed to separate out the decibel range and understood that GetSpectrumData does the following:

  • The audio spectrum is divided into 1024 samples (1024 is set by the variable numberOfSamples).
  • Those samples are stored in an array named spectrum, ordered from low to high frequency, so the lower (bass) frequencies sit at the start of the array.
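A minimal sketch of how this looks in a Unity script. Only numberOfSamples and the spectrum array come from the description above; the class name and the eight-bin "bass" range are illustrative choices, not the project's actual code:

```csharp
using UnityEngine;

// Read the audio spectrum each frame and compute the energy of the
// lowest bins (the bass/beat range).
public class SpectrumReader : MonoBehaviour
{
    public AudioSource audioSource;           // the music being analysed
    private const int numberOfSamples = 1024; // must be a power of two
    private readonly float[] spectrum = new float[numberOfSamples];

    void Update()
    {
        // Fills "spectrum" with FFT magnitudes, lowest frequencies first.
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the first few bins as a rough measure of bass energy.
        float bassEnergy = 0f;
        for (int i = 0; i < 8; i++)
        {
            bassEnergy += spectrum[i];
        }
        Debug.Log("Bass energy: " + bassEnergy);
    }
}
```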

Unity example: the objects react to the beats and do not react to the human voice.

AR_unity test from Swapna Joshi on Vimeo.
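A rough sketch of the behaviour in the video, assuming the object is driven only by the lowest spectrum bins and gated by a threshold; the threshold and scaling values are guesses rather than values taken from the project:

```csharp
using UnityEngine;

// "React to beats, ignore voice": only the low-frequency bins drive the
// object, and a threshold gates out quiet or higher-pitched input.
public class BeatReactiveObject : MonoBehaviour
{
    public AudioSource audioSource;
    public float threshold = 0.02f;   // tune per scene/device
    public float scaleAmount = 2f;

    private readonly float[] spectrum = new float[1024];
    private Vector3 baseScale;

    void Start()
    {
        baseScale = transform.localScale;
    }

    void Update()
    {
        audioSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Low bins ~ kick drum / bass; voice energy sits mostly higher up.
        float bass = 0f;
        for (int i = 0; i < 8; i++) bass += spectrum[i];

        // Pulse the object's scale only when bass energy crosses the threshold.
        float pulse = bass > threshold ? scaleAmount * bass : 0f;
        transform.localScale = Vector3.Lerp(transform.localScale,
                                            baseScale * (1f + pulse),
                                            Time.deltaTime * 10f);
    }
}
```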

 

Experiment on the street, trying to detect the surface and make the virtual objects react to the beats:

AR_street music in urban space from Swapna Joshi on Vimeo.
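For the surface-detection step in the video above, the sketch below shows the general flow of placing a music-reactive object on a detected plane. It uses AR Foundation's ARRaycastManager as a stand-in for the ARKit hit test the project used; it is illustrative, not the project's actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Place a music-reactive prefab where a screen tap hits a detected plane.
public class PlaceOnPlane : MonoBehaviour
{
    public ARRaycastManager raycastManager;
    public GameObject reactivePrefab;   // e.g. an object with BeatReactiveObject attached

    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast against detected planes; place the object at the hit pose.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            Instantiate(reactivePrefab, pose.position, pose.rotation);
        }
    }
}
```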

Next steps

  • Detect the geolocation of a place and render contextually relevant objects into the space (a rough sketch follows this list).
  • Gamification of the objects: to encourage users to return and revisit similar locations and interactions, let them save virtual objects and add them to a particular space.
  • Tap on objects in the real environment to create a customized crew to accompany the street artist.
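For the geolocation idea in the first bullet, here is a hypothetical sketch of how contextually relevant objects could be chosen once the device's location service is running. The LocationSpot class, its fields, and the nearest-spot heuristic are invented placeholders, not part of the project:

```csharp
using System.Collections;
using UnityEngine;

// Start the device's location service and spawn the prefab set that
// belongs to the nearest known spot.
public class ContextualObjectLoader : MonoBehaviour
{
    [System.Serializable]
    public class LocationSpot
    {
        public string name;
        public float latitude;
        public float longitude;
        public GameObject[] prefabs;   // artifacts that fit this place
    }

    public LocationSpot[] spots;

    IEnumerator Start()
    {
        if (!Input.location.isEnabledByUser) yield break;
        Input.location.Start();

        // Wait until the location service is running (give up after ~20 s).
        int wait = 20;
        while (Input.location.status == LocationServiceStatus.Initializing && wait-- > 0)
            yield return new WaitForSeconds(1f);
        if (Input.location.status != LocationServiceStatus.Running) yield break;

        float lat = Input.location.lastData.latitude;
        float lon = Input.location.lastData.longitude;

        // Pick the nearest spot (crude distance comparison; fine for a sketch).
        LocationSpot nearest = null;
        float best = float.MaxValue;
        foreach (var spot in spots)
        {
            float d = Mathf.Abs(spot.latitude - lat) + Mathf.Abs(spot.longitude - lon);
            if (d < best) { best = d; nearest = spot; }
        }

        if (nearest != null)
            foreach (var prefab in nearest.prefabs)
                Instantiate(prefab);
    }
}
```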

Issues faced

  • Mapping the objects to the ground
  • The Hit Target function worked in Unity but did not work when built as an app on the iPad.
  • Unable to align and scale objects precisely in the ARKit app.
  • In the Unity example the objects reacted only to the low-frequency beats, but in the ARKit app on the iPad the objects reacted to both the human voice and the beats.
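One way to start untangling the last issue is to log what the selected spectrum bins actually cover on each device, since the bin-to-frequency mapping depends on the output sample rate, which can differ between the editor and the iPad. This sketch is my addition and assumes the beat band is chosen by bin index:

```csharp
using UnityEngine;

// Debugging aid: print the output sample rate and the frequency range
// covered by the low bins used for beat detection.
public class SpectrumDebugInfo : MonoBehaviour
{
    private const int numberOfSamples = 1024;
    public int bassBinCount = 8;   // assumed number of low bins used for beats

    void Start()
    {
        int sampleRate = AudioSettings.outputSampleRate;
        // Each bin covers (sampleRate / 2) / numberOfSamples Hz.
        float binWidth = (sampleRate / 2f) / numberOfSamples;
        Debug.Log("Output sample rate: " + sampleRate + " Hz");
        Debug.Log("Bins 0-" + (bassBinCount - 1) + " cover 0-" +
                  (binWidth * bassBinCount) + " Hz");
    }
}
```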