Augmented Reality Elective

Purpose and Brief

This elective builds onto my main year-long project: an Augmented Reality art book.

An exploration of how augmented reality can be used within an art book to bring static images to life.

  • To create my own Augmented Reality art book, themed around the original dark versions of fairy tales.

In reality I can’t create a whole art book within this elective, so I will specifically be exploring the Augmented Reality process and making it work for just one piece of artwork.

Ideas

  • Incorporating motion capture?
  • Augmented Reality in 2D?
  • Augmented Reality in 3D?

Research

2D AR:

3D AR:

 

Tests: Before I started anything to do with my project, I wanted to test whether the software would actually work and whether I would run into any issues along the way. Vuforia with Unity seemed the easiest to work with and had many more online resources than other Augmented Reality tools.

Unity Vuforia Process:


In Unity 3D, using the Vuforia plugin, I was able to attach both 2D and 3D objects to an image-tracked target. Technically, though, the “2D” object is not really 2D: it is a 3D plane with an image texture attached to it.

 

For my year-long capstone project, I decided I would not be incorporating 3D elements or motion capture. Rather, I would focus on the 2D side of Augmented Reality, much like the EyeJack app, where 2D Augmented Reality is overlaid on top of illustrations.

Possible scenes to test:


Chosen Scene: Little Red Riding Hood

I have chosen to explore the first page of my Augmented Reality illustration book: Little Red Riding Hood.

Mood Board:

[Mood board images]

Layout/Composition Exploration:

WIP:

I drew up the illustrations in Photoshop, along with the images required for the Augmented Reality overlay. I then imported all the images into Premiere Pro to make a video for Vuforia in Unity to read. I included audio; however, the audio would play on start-up of the app rather than when the image target had been detected, so I split the audio and video into two separate sequences. Following this tutorial (https://developer.vuforia.com/forum/faq/unity-how-can-i-play-audio-when-targets-get-detected), I created an audio source and attached it to the image target. I then attached the audio file (which must be in mp3 format) to the audio source and explored the 3D sound settings.
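In code terms, the audio source setup above amounts to something like this sketch (the component name and the specific rolloff values are my own illustrative choices; the AudioSource properties are standard Unity API):

using UnityEngine;

// Illustrative sketch: configuring the AudioSource that sits on the
// ImageTarget, mirroring the inspector settings described in this post.
[RequireComponent(typeof(AudioSource))]
public class TargetAudioSetup : MonoBehaviour
{
    void Awake()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.playOnAwake = true;                          // "Play on Awake" checked on
        source.spatialBlend = 1f;                           // fully 3D, so distance affects volume
        source.rolloffMode = AudioRolloffMode.Logarithmic;  // volume falls off with distance
        source.minDistance = 1f;                            // illustrative rolloff range
        source.maxDistance = 10f;
    }
}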

[Screenshot: the Audio Source settings on the image target]

The “Play on Awake” setting must be checked on, and the Volume Rolloff was interesting to explore, as the audio’s volume falls off the further you are from the physical image target in real life. The script suggested by the previous tutorial did not work, at least for my version of Unity, but I found another that did (https://stackoverflow.com/questions/36924828/play-audio-when-model-renders-in-vuforia-for-unity3d). This approach modifies the “DefaultTrackableEventHandler” Vuforia script so that when the camera can no longer detect/track the image target, the audio stops playing along with the image disappearing; it was extremely odd hearing the audio while the image was not being tracked.

[Screenshots: the modified DefaultTrackableEventHandler script and audio source setup]
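Reconstructed as a standalone component, the working approach from that answer looks roughly like this (a sketch rather than the exact script: the class and field names are mine, while ITrackableEventHandler and TrackableBehaviour come from the Vuforia SDK of this era). It sits on the ImageTarget alongside the audio source.

using UnityEngine;
using Vuforia;

// Sketch: play the target's AudioSource when tracking is found,
// stop it when tracking is lost.
[RequireComponent(typeof(AudioSource))]
public class AudioTrackableEventHandler : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;
    private AudioSource mAudioSource;

    void Start()
    {
        mAudioSource = GetComponent<AudioSource>();
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour != null)
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
    }

    // Called by Vuforia whenever the target's tracking status changes.
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        if (found)
            mAudioSource.Play();  // audio starts only once the target is seen
        else
            mAudioSource.Stop();  // silence when the target drops out of view
    }
}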

After this, everything I needed to do in Unity was complete; the scene functioned as I wanted through my webcam in the Unity game test, so all that remained was exporting the build to my phone.

[Screenshot: the AR overlay working through the webcam in the Unity game view]

Next, I wanted to export the build to my Android phone so that I could view the AR through more than just the webcam. This process was rather long and required downloading several other applications, such as the Android SDK and the Java Development Kit (JDK). I followed this tutorial step by step (https://unity3d.com/learn/tutorials/topics/mobile-touch/building-your-unity-game-android-device-testing) to build to Android. I had to install packages through the Android SDK manager, including “Android SDK Tools”, “Android SDK Platform-tools”, “Android SDK Build-tools” and “SDK Platform”. I then had to adjust the build/player settings within Unity; setting the names correctly was important for the build to export properly. The Company Name, Product Name and Identification all needed to be changed, and the Identification had to follow a specific reverse-domain format such as “com.Sitepoint.ARDemo”. When exporting the build, I ran into multiple issues. Firstly, the root folders of the Android SDK and JDK had to be pointed to through the Unity preferences. Next, there was a script error: “Found plugins with same names and architectures”. I had to delete a duplicate of one of the Vuforia scripts that I had accidentally imported twice under two different folders.
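For reference, the same identity fields can also be set from an editor script instead of through the Player Settings window; a minimal sketch (the class name and menu path are my own, and older Unity versions name the identifier PlayerSettings.bundleIdentifier rather than applicationIdentifier):

using UnityEditor;

// Sketch: apply the Android identity settings described above via the
// UnityEditor API. Values mirror the example reverse-domain format.
public static class AndroidIdentitySetup
{
    [MenuItem("Tools/Apply Android Identity")]
    static void Apply()
    {
        PlayerSettings.companyName = "Sitepoint";                       // Company
        PlayerSettings.productName = "ARDemo";                          // Product Name
        PlayerSettings.applicationIdentifier = "com.Sitepoint.ARDemo";  // Identification
    }
}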

Once these issues were fixed, I was able to transfer the build to my Android phone, but I then hit a major problem: the build did not detect my tracking marker image, and the camera looked out of focus. I looked to solve the focus issue first, as it could have been the reason detection was failing. The camera could not focus while the application was open, yet the phone camera itself was not at fault: every other application that used the camera functioned properly. I then tried adding a script to the ARCamera in Unity that was supposed to fix the autofocus issue.

[Screenshots: the autofocus script attached to the ARCamera]
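For reference, the standard fix that circulates on the Vuforia forums is a small script along these lines, attached to the ARCamera (a sketch: CameraDevice, its FocusMode enum and VuforiaARController are Vuforia SDK names, while the component name is mine; on older SDK versions the started-callback registration differs):

using UnityEngine;
using Vuforia;

// Sketch: ask the device camera to refocus continuously instead of
// staying locked at its initial focus distance.
public class ContinuousAutoFocus : MonoBehaviour
{
    void Start()
    {
        // the camera may not be initialised yet in Start(), so the focus
        // mode is also re-applied once Vuforia reports it has started
        VuforiaARController.Instance.RegisterVuforiaStartedCallback(SetAutofocus);
        SetAutofocus();
    }

    void SetAutofocus()
    {
        CameraDevice.Instance.SetFocusMode(
            CameraDevice.FocusMode.FOCUS_MODE_CONTINUOUSAUTO);
    }
}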

However, I found no visible difference when I rebuilt to my phone: the camera still refused to detect the image and overlay the AR on it, even though it worked perfectly through my webcam in Unity.

I searched Google and all of the Vuforia/Unity/Android forums but could not find anything relevant to my issue. I still wonder where the problem occurs, and whether it is specific to exporting Vuforia builds to Android phones (or to a particular software update).

In the end, I was unable to successfully export the build from my Mac to my Android phone.


FIGURATIVE REFLECTIVITY

Brief

An open brief to create a 30-second motion capture sequence which explores alternative, non-figurative and experimental visualisations of movement and the body in motion. This is an opportunity to explore further the possibilities of creating a visual sequence using motion capture. You are encouraged to take a more experimental approach by exploring alternative visualisations of the body in motion: dance, sport, mime etc. This might include the use of geometric shapes, particle effects, fire and smoke effects, and dynamic simulations. The use of the virtual camera as an element can also be explored.

Initial Ideas

  • My friend Kayla has a background in dance, ballet and choreography, and could potentially be our motion capture actor.

I decided I wanted the sequence to feature a dance of some sort: either abstract dance or a set piece of choreography.

Abstract Dance – https://www.youtube.com/watch?v=RVVI1uOZ5p8

Choreography – https://www.youtube.com/watch?v=AO5W1zBHPKg

Fire Dance Anime – https://www.youtube.com/watch?v=LPi-38oUtC8

Fire Particle Effect Following – https://www.youtube.com/watch?v=kaRGazSwt7c

Particle Effect Fire Constant – https://www.youtube.com/watch?v=3hqkEbVQwIg

  • I would like to use various particle effects, fire and smoke effects to go with my dance, whether these effects pop up after certain movements or follow behind/are attached somewhere to the rig.
  • The rig might just be built from simple geometric objects that resemble a human form, e.g. a sphere for the head, a cone for the body, cylinders for the arms; or from all one kind of object, e.g. all spheres; or even from a basic character rig.
  • The shapes could also bloat up or shrink for figurative effect, with certain parts enlarging or reducing in size during the dance.

Further Ideas

After Greg posted some videos of various motion-captured abstract visualisations on the AUT motion capture page, I decided to rethink my initial concepts. I became interested in these moving digital-sculpture-type particles and dynamic particles that follow the initial movement and remain for a period of time before fading away. I want to create something similar, but with circular particles and their movement rather than lined particles.

http://www.thisiscolossal.com/2013/08/human-movement-converted-into-digital-sculptures/

http://www.thisiscolossal.com/2016/05/kung-fu-visualizations/

Asphyxia: A Striking Fusion of Dance and Motion Capture Technology

http://www.thisiscolossal.com/2015/03/asphyxia-a-striking-fusion-of-dance-and-motion-capture-technology/

 

Production Diary

Date Action Taken
24/03/16 Assignment Brief given and introduced.
31/03/16 Researched possible dances and documented.
11/04/16 Possible ideas generated and documented.
25/04/16 Motion capture group formed; personal and possible ideas discussed with Greg.
02/04/16 Sorted out the actor and planned the shoot time with her.
06/04/16 Prepared the equipment needed for the motion capture shoot and double-checked the actor was available.
08/04/16 Motion Capture Shoot.
15/04/16 Cortex Clean Up.
18/04/16 Cortex Clean Up continued.
06/05/16 Finished all the Cortex Clean Up.
06/05/16 Began solving the data in MotionBuilder.
13/05/16 Began exploring various particle effects and various ways to display the movement, e.g. abstract shapes, a skeleton, or a polygon human figure.
20/05/16 Finalised idea to use spherical shapes attached to locators to form the movement and shape.
27/05/16 Attached particle effects.
01/06/16 Inserted lighting and cameras. Render.
02/06/16 Began and finished the final edit of the sequence, including research and editing of the background soundtrack.

Work in Progress


Imported the locators to experiment with attaching an actor to them.


I attached a character to the mocap data, but the movements seemed awkward and didn’t give the elegant effect I wanted to capture.


I also experimented with building the character from various polygon primitives and attaching the movement to those; however, the effect was still quite stiff and rigid.


I decided to go with a more abstract idea, so I used spheres and applied a texture to them to create a bubbly, reflective marble look while still crafting out the form of the dancer. Each sphere was individually assigned to a mocap marker/locator. The locators had to be imported into Maya as an FBX file for Maya to be able to read them.


I added particles by attaching an emitter to a few objects, in this case the spheres in the scene. I played around with the lifespan, size and gravity of the particles, as well as the different particle render types, shapes and sizes.


Once I was happy with the overall particle adjustments (lifespans, sizes etc.), I worked on their colour. I wanted the colour to change at various stages of the dance to show progression and change. This was done by applying a material, changing its colour, and keyframing each point where I wanted it to change.


To finish, I made a box/room by creating a few planes and standing them up in a box form, setting their surfaces to full (1.00) reflectivity to create a very reflective room. This produced a nice reflection behind the dancer where the form can be seen; the particles, however, are not reflected, which gives it an abstract metaphor: the camera sees both reality and what the “dancer” is imagining. The dancer imagines she is elegant and flowing, with particles flying about, but in reality she is simply dancing, as seen in the reflection.
