1968 Prototype

For the final I worked with Alex Zimmer, and we made a prototype for a fellowship we're applying for.  As much as I am fascinated by AR and learned so much in this class, I still think we are so limited by our phones.  The only thing that will make me look at the world through my phone is a story or the chance to learn something.  For the fellowship we have to make an AR app related to the year 1968.  We decided to focus on the protests that occurred that year as a way to keep a tight focus while still tackling the wide variety of issues of the time.  We'd also like to find a way to tie it to protests happening today, which are largely about the exact same issues.

While there were ample images and videos available, the videos are not of the highest quality, and we knew we would want to recreate some.  So we explored shooting people on a green screen to then place in the app.  We did a quick-and-dirty prototype.  There are obviously more things we want to do with this, such as adding audio that guides you, using GPS for more accurate locations, figuring out how to fade the footage out, and keeping the video turned toward you no matter where you stand. I also want to play with the idea of how we can get people to leave their own stories behind.  For example, if a protest is going on, someone using the app could document it and leave it for the next person who opens the app to find.
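The "keep the video turned toward you" part is a standard Unity trick, usually called billboarding. A minimal sketch, assuming the recreated footage plays on a quad with this script attached (the script name and the face-direction assumption are mine):

```csharp
using UnityEngine;

// Sketch: keep the recreated protest footage facing the viewer.
// Assumes the video plays on a quad with this script attached.
public class FaceViewer : MonoBehaviour
{
    void LateUpdate()
    {
        Camera cam = Camera.main;
        if (cam == null) return;

        // Rotate toward the camera, but only around the vertical axis
        // so the footage stays upright instead of tilting.
        Vector3 toCamera = cam.transform.position - transform.position;
        toCamera.y = 0f;
        if (toCamera.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(-toCamera);
        // If your quad renders on its other face, drop the minus sign.
    }
}
```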

Conscience – A Socially Responsible AR App

Idea 1:

How many things do we own that were built with child labor? How many with slavery? What is the carbon footprint of the objects we own?  People are trying to be more conscious with their purchases, but it's hard.  It's hard because people are lazy. Are you going to Google a company every time before you purchase a product from it?  A $20 sweater from Zara sounds great until I know that the fabric came from Asian factories known for dumping highly toxic waste into waterways and paying workers far below a living wage.

We want to create an app that makes it easy for you to make informed and responsible choices.  To be able to scan an object, tag, or advertisement and see a rating of that product or corporation based on its environmental and social performance. Simple and easy, and it can keep track of your purchases and let you know whether you're making progress.
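As a sketch of the core interaction, once a scan resolves to a barcode the rest is a lookup. The barcodes, scores, and local table below are all invented for illustration; a real app would query a maintained data source:

```csharp
using System.Collections.Generic;

// Sketch: after a scan resolves to a barcode, look up a rating.
// The barcodes, scores, and local table are invented for illustration;
// a real app would query a maintained data source instead.
public class EthicsRatings
{
    readonly Dictionary<string, int> ratings = new Dictionary<string, int>
    {
        { "8412345678905", 34 },  // hypothetical fast-fashion item
        { "0712345678911", 82 },  // hypothetical fair-trade item
    };

    // Rating from 0 (worst) to 100 (best), or -1 if we have no data.
    public int Lookup(string barcode)
    {
        int score;
        return ratings.TryGetValue(barcode, out score) ? score : -1;
    }
}
```

The same lookup result could also feed the progress tracking: log each scanned purchase with its score and show the running average over time.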





Object Recognition – Diorama AR Tours

This piece was inspired by an assignment in another class.  In Cabinets of Wonder we were tasked with going to the AMNH to see the dioramas and then make a diorama of our own.  I went for school but rushed through because I had already seen them twice before.  It was a pity.  We'd spoken in class about how outdated the dioramas are (some are borderline racist), but it's extremely expensive to change or update them, and they're part of the permanent collection. I thought it could be interesting to toy with the idea of AR tours of permanent collections.  You could make numerous tours that approach the collection from different angles. It would allow you to see the same thing over and over again but learn something new each time.  For this prototype I chose animals in mythology.  Below is an owl, which in Greek mythology represented Athena, the goddess of wisdom.


For the purpose of this project I wanted to try out object scanning and recognition with Vuforia.  I had a lot of difficulty getting it to work: I rescanned objects and tried different lighting, but no matter what I did I couldn't get Unity to recognize anything.  After about six hours of redoing everything, I realized that unlike image targets, the maximum number of object targets you can track simultaneously is 2.  I had mine set to 3, and even though I was only looking at one object at a time, that setting kept any of them from being recognized.  After I figured that out I tackled adding sound.  It was something I'd tried and failed to get working with the image markers before, but this time I succeeded; I needed to copy some code from Vuforia and use their library.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using Vuforia;

[RequireComponent(typeof(AudioSource))]
public class ImageTargetPlayAudio : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;
    public AudioClip mySound;
    public AudioSource mySource;

    void Start()
    {
        mySource = GetComponent<AudioSource>();
        mySource.clip = mySound;
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
        {
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
        }
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED ||
            newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED)
        {
            // Play audio when target is found
            mySource.Play();
        }
        else
        {
            // Stop audio when target is lost
            mySource.Stop();
        }
    }
}

Object to Augment

I love the warmth that plants add to an apartment, but I am a plant serial killer.  I've tried calendar invites, alarms, and reminders; I always forget to water them or move them.  I find all the info online about each plant's specific needs, but it goes in one ear and out the other.

I want to make a projection on the plant's pot that tells you what the plant needs:

Water Me,  Water Me!

More Sun Please!

Feed Me, Feed Me!

Depending on the type of plant, the complexity of the projection could change.  Some plants just need to be watered X times a week; that would be simple to set up with a projector.  But if you wanted to get really fancy and step your plant game up to the beautiful, difficult ones, you could add sensors that monitor the plant's health: whether it needs more water or more sun, soil quality, etc.  The sensors could transmit to the projector, which could voice the plant's needs.
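The sensor-to-message logic could be as simple as a few thresholds. A sketch, where the readings, units, and cutoff values are all invented (real ones would depend on the plant and the sensors):

```csharp
// Sketch: map assumed sensor readings to the plant's "voice".
// Thresholds and units here are invented for illustration.
public static class PlantVoice
{
    public static string Message(float soilMoisture, float dailySunHours, float nutrientLevel)
    {
        if (soilMoisture < 0.3f)  return "Water Me, Water Me!";
        if (dailySunHours < 4f)   return "More Sun Please!";
        if (nutrientLevel < 0.2f) return "Feed Me, Feed Me!";
        return "All good today.";
    }
}
```

The projector (or a speaker, if the plant literally talks) would call this whenever new readings arrive and display or say the result.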

I like the idea of the plant talking to you.  It's almost more like a pet, and you are more responsive to its needs.



Rorschach AR

AR allows you to show people the hidden, the unseen.  I can show you what I see.  I can put my filter of the world on yours.  I've always found Rorschach tests so fascinating.  Not that I believe in their scientific validity, but there's something so earnest about people's answers.  You know that what you see is supposed to represent you and your mental state, and supposedly clue a psychiatrist into your subconscious.  I think there's something romantic and vulnerable about allowing others to see what you see in inkblots.  The inkblots also make perfect image targets; they are so distinct.

I’ve managed to get Vuforia to work with videos and 3D assets on the computer and am now attempting to get it to work on my Android phone.

I thought transferring it onto the phone wouldn't be so difficult, since I'd heard Android is easier than iOS.  Not so.  After downloading the JDK and SDK multiple times, in multiple versions, from multiple sources, and obsessively checking the SDK pathway, I found that there was an issue with the SDK download itself that Unity tells you to get: the tools section doesn't work, which I had to dive deep into a forum to discover.

This is how you do it:

  1. Go to your Android SDK Folder. (If you did not change the path during Android Studio installation you will find the SDK folder here: C:\Users\YourUsername\AppData\Local\Android\sdk )
  2. Rename the old Android sdk “Tool” folder : [Your Android SDK root]/tools -> toolsXXXX
  3. Download Android SDK Tools, Revision 25.2.5 (January 2017): http://dl-ssl.google.com/android/repository/tools_r25.2.5-windows.zip
  4. Extract that to Android SDK root folder.
  5. In Unity go to Edit>Preferences>External Tools and check again, that the path for Android SDK points to the correct folder. (The correct path is the SDK root folder, not the tools folder which is inside the root folder!)
  6. Build your project.
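Steps 2 through 4 could also be scripted. A hedged sketch for a POSIX shell (the paths, the dry-run wrapper, and the default SDK location are my assumptions; swap in the matching platform zip for your OS):

```shell
# Sketch of automating steps 2-4 above from a POSIX shell.
# Paths and the dry-run wrapper are assumptions, not official tooling.
# Runs in dry-run mode by default; set DRY_RUN=0 to actually execute.
SDK_ROOT="${ANDROID_SDK_ROOT:-$HOME/Android/sdk}"
TOOLS_ZIP_URL="http://dl-ssl.google.com/android/repository/tools_r25.2.5-windows.zip"

run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then
    echo "$*"   # dry run: print the command instead of running it
  else
    "$@"
  fi
}

run mv "$SDK_ROOT/tools" "$SDK_ROOT/tools_backup"        # step 2: rename old tools
run curl -L -o /tmp/tools_r25.2.5.zip "$TOOLS_ZIP_URL"   # step 3: download
run unzip -o /tmp/tools_r25.2.5.zip -d "$SDK_ROOT"       # step 4: extract to SDK root
```

Step 5 (pointing Unity at the SDK root, not the tools folder) still has to happen in the editor.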

Probably like 4 hours spent figuring this out.