Experiential Design - Project MVP Prototype (Task 3)

19/11/2024 - 15/12/2024 (Week 9 - Week 12)
Katelyn Tan Kye Ling (0354148)
Bachelor of Design in Creative Media (Experiential Design)
Task 3: Project MVP Prototype


LECTURES

Week 9 - 19/11/2024
Today we learnt how to activate an animation and change the colour of a component when it is scanned. We also learnt how to add a pointer in the middle of the screen so we don't have to guess where to aim when scanning a component. At the end, Sir asked us to switch the animations of the components, meaning that when the cube is scanned, the sphere plays an animation and vice versa. After that, he gave us a brief on Task 3 and ended the class.
Fig 1.1 Week 9 - In class Exercise Video
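Roughly, the colour change and animation trigger from this exercise can be wired up with a small script whose method is hooked to the image target's "On Target Found" event in the Inspector. This is only a simplified sketch; the fields, colour and trigger name are placeholders, not the exact ones used in class.

```csharp
using UnityEngine;

// Attach to the 3D shape under the image target and hook the image target's
// "On Target Found" event to OnTargetFound() in the Inspector.
// The colour and trigger name below are placeholders.
public class ScanReaction : MonoBehaviour
{
    public Animator animator;          // Animator on the cube/sphere
    public Renderer shapeRenderer;     // Renderer whose material colour changes
    public Color scannedColor = Color.green;

    public void OnTargetFound()
    {
        shapeRenderer.material.color = scannedColor;  // change colour on scan
        animator.SetTrigger("Play");                  // start the animation clip
    }
}
```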

Week 10 - 26/11/2024
This week we had class online. Mr. Razif shared the playlist of tutorials from the previous semester, then introduced us to a website called Ready Player Me that helps you create your own 3D avatar. After creating the avatar, we learnt how to add animation to it with a website called Mixamo, which has many premade animations that can easily be applied to your character. Converting between file formats was slightly tedious, though, as Unity and Mixamo accept different file types.

Later on, we added the animations to our Week 2 exercise, where there is an image target and the 3D model pops up. We also made animation sequences in the Animator so one animation can play after another. Mr. Razif then asked us to make buttons to play the animation sequence, or to trigger an animation by clicking on the model.

I made a button called "Dance" where the character would dance > wave > idle, and another trigger where clicking on the model plays a block animation.
Fig 1.2 Week 10 - In class Exercise Video
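A rough sketch of how the "Dance" button and the click-to-block trigger could look as a Unity script, assuming the dance > wave > idle transitions are chained inside the Animator Controller; the trigger names ("Dance", "Block") are placeholders.

```csharp
using UnityEngine;

// Attach to the avatar. The dance > wave > idle sequence is assumed to be
// chained inside the Animator Controller, so the button only fires the first trigger.
public class AvatarAnimation : MonoBehaviour
{
    public Animator animator;

    // Hooked to the UI Button's OnClick in the Inspector.
    public void PlayDanceSequence()
    {
        animator.SetTrigger("Dance");   // Dance -> Wave -> Idle chained in the controller
    }

    // Requires a Collider on the model so OnMouseDown registers clicks/taps.
    private void OnMouseDown()
    {
        animator.SetTrigger("Block");   // play the block animation on tap
    }
}
```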

Week 11 - 3/12/2024
This week we had class online as well. We first learnt how to use SetActive so that each button switches to one of the three 3D shapes triggered by the image target. After that, we learnt how to switch between the three shapes using just one button.
Next, Mr. Razif taught us how to switch between colours with a new button, regardless of which colour was assigned to each shape.
Lastly, we were asked to switch between the 3D models by tapping on them directly.
Fig 1.3 Week 11 - In class Exercise Video
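This is roughly the logic behind switching the three shapes with one button and recolouring the active one with another. It is only a simplified sketch with placeholder names; the shape and colour arrays would be filled in the Inspector.

```csharp
using UnityEngine;

// Attach to a manager object under the image target. One button cycles through
// the three shapes, another assigns the next colour to whichever shape is active.
public class ShapeSwitcher : MonoBehaviour
{
    public GameObject[] shapes;        // cube, sphere, cylinder
    public Color[] colors;             // colours to cycle through
    private int shapeIndex = 0;
    private int colorIndex = 0;

    // Hooked to the single "Switch shape" button.
    public void NextShape()
    {
        shapes[shapeIndex].SetActive(false);
        shapeIndex = (shapeIndex + 1) % shapes.Length;
        shapes[shapeIndex].SetActive(true);
    }

    // Hooked to the "Switch colour" button; recolours the active shape.
    public void NextColor()
    {
        colorIndex = (colorIndex + 1) % colors.Length;
        shapes[shapeIndex].GetComponent<Renderer>().material.color = colors[colorIndex];
    }
}
```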

Week 12 - 10/12/2024
In class consultation. (Feedback below)

INSTRUCTION
What we have to do:
- Exploration: Make sure the features in your app work.
- Do the main features of the app first.

AR Feature 1: Terminology
I first did the feature where you scan a card and a 3D animal pops up with an animation. I managed to make the model appear; however, I really struggled to make the animation trigger. I thought I had to code it or do something technical, but after quite some time of research and trial and error, it turned out I just had to convert the file to FBX format and add an Animator to the object under the image target.

After that, I made the sound toggle: when you click the sound button, the definition of the term is read aloud. I also made a "Next" button that leads you to scan another card, and a "Done" button that leads back to the main menu.
Fig 2.1 Scan Terminology Card for animal animation + Sound Toggle + Buttons
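The sound toggle and the "Done" button boil down to very little code. A minimal sketch, assuming the narration is a pre-recorded AudioClip on an AudioSource and the main menu scene is called "MainMenu" (both are assumptions):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to the feature's UI. The spoken definition is assumed to be a
// pre-recorded clip on an AudioSource; "MainMenu" is a placeholder scene name.
public class TerminologyUI : MonoBehaviour
{
    public AudioSource definitionAudio;   // plays the term's spoken definition

    // Hooked to the sound toggle button.
    public void ToggleSound()
    {
        if (definitionAudio.isPlaying)
            definitionAudio.Stop();
        else
            definitionAudio.Play();
    }

    // Hooked to the "Done" button to return to the main menu.
    public void GoToMainMenu()
    {
        SceneManager.LoadScene("MainMenu");
    }
}
```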

After that, I created another scene because I wanted the scan page (first image of Fig 2.2) to go to the 3D-model-detected page (third image of Fig 2.2). I wasn't sure how to do it, so I asked Mr. Razif for help. He taught me how, and it was much easier than I thought: I didn't need a new scene at all, I just had to play around with the events.
Fig 2.2 Flow of AR Feature 1 in Figma
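In other words, the scan page and the detected page can live in the same scene as two panels, and the image target's events simply swap them. A simplified sketch of that idea (panel names are placeholders); the methods would be hooked to the observer's "On Target Found"/"On Target Lost" events or to the buttons in the Inspector.

```csharp
using UnityEngine;

// Instead of a second scene, two UI panels are swapped when the card is detected.
public class PanelSwitcher : MonoBehaviour
{
    public GameObject scanPanel;       // "point your camera at a card" UI
    public GameObject detectedPanel;   // definition, sound toggle and buttons

    public void ShowDetectedPanel()
    {
        scanPanel.SetActive(false);
        detectedPanel.SetActive(true);
    }

    public void ShowScanPanel()
    {
        detectedPanel.SetActive(false);
        scanPanel.SetActive(true);
    }
}
```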

Fig 2.3 AR Feature 1 in Unity

AR Feature 2: Food Chain
I added all the image targets and attached the animals to them. I also changed the settings so that the AR camera can detect four image targets at once, as the food chain exercise requires linking four cards together.

Fig 2.4 Flow of AR Feature 2

However, I wasn't sure how to make the system check whether the sequence is correct or wrong, so I asked Mr. Razif for help and he pointed me to a video tutorial. Halfway through following the tutorial, an issue came up with my app: when I scanned the sequence, only one image target could be found at a time.

Fig 2.5 Only 1 Image target found at once

After some time trying things here and there, nothing worked, so I just continued following Mr. Razif's video tutorial and also added the pop-up that appears after the user clicks Confirm (Fig 2.6).
Fig 2.6 Pop up after clicking confirm (If answer is correct)
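I won't reproduce the tutorial's script here, but the general idea of the check is: each image target registers its animal when it is found, and the Confirm button compares the scanned order against the correct food chain. A simplified sketch of that idea, with placeholder animal names and pop-ups (not the tutorial's exact code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Each image target reports its animal when found; the Confirm button then
// compares the scanned order with the expected chain. All names are placeholders.
public class FoodChainChecker : MonoBehaviour
{
    public GameObject correctPopup;    // pop-up shown when the chain is right
    public GameObject wrongPopup;      // pop-up shown when the chain is wrong

    private readonly List<string> scanned = new List<string>();
    private readonly string[] expected = { "Grass", "Grasshopper", "Frog", "Snake" };

    // Hook each image target's "On Target Found" event to this, passing its animal name.
    public void RegisterAnimal(string animalName)
    {
        if (!scanned.Contains(animalName))
            scanned.Add(animalName);
    }

    // Hooked to the Confirm button.
    public void CheckSequence()
    {
        bool correct = scanned.Count == expected.Length;
        for (int i = 0; correct && i < expected.Length; i++)
            correct = scanned[i] == expected[i];

        correctPopup.SetActive(correct);
        wrongPopup.SetActive(!correct);
    }
}
```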

Later on, I thought the cards I had printed might be way too big, so I re-printed smaller ones and tried to scan them. But again the sequence couldn't be scanned (Fig 2.7); at most two targets could be detected at a time.
Fig 2.7 Unable to scan sequence

I really tried everything: first I used the Iriun webcam to scan wirelessly from my phone, but that didn't work. Then I even tried the computer's webcam, holding up all four cards, but that didn't work either. Since time was really tight, I decided to fix this issue later in my final assignment. Nevertheless, the logic needed to make this feature work has already been set up by following the video tutorial; I just need to fix the scanning issue.

AR Feature 3: Quiz
First, I placed everything in its position and created the multiple-choice buttons. Then I added the cow to the question; it was showing up in my Scene view but not in the Game view (Fig 2.8).
Fig 2.8 Cow animation not showing up

I asked Mr. Razif about it and found out that the cow's position wasn't correct within the AR camera's frame, since what the Scene view and the Game view show is not the same. After shifting and adjusting it, it worked.

Later on, I wanted to show some visual feedback when the user chooses their answer, so I added a selected colour when the user picks an option: green for the correct answer and red for a wrong answer (example in Fig 2.9).
Fig 2.9 AR Feature 3 in Unity
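The selected-colour feedback itself is a tiny script per answer button: mark the button green if it holds the correct answer, red otherwise. A minimal sketch (the isCorrect flag would be set per button in the Inspector; names are placeholders):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Attach to each answer button and hook OnAnswerSelected() to its OnClick event.
public class AnswerButton : MonoBehaviour
{
    public bool isCorrect;             // set in the Inspector per button
    public Image buttonImage;          // the button's background image

    public void OnAnswerSelected()
    {
        buttonImage.color = isCorrect ? Color.green : Color.red;
    }
}
```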

What is left to be done:
So far, I've already done the main features of my app. Most of them work well but may just need some slight adjustments/improvements.

List of what to continue working on:
  1. Main Menu
  2. AR Feature 1:
    1. Add the rest of the terms.
  3. AR Feature 2:
    1. Instruction card.
    2. Fix the sequence scanning issue.
    3. When the sequence is wrong: the animation that plays & the pop-up text.
  4. AR Feature 3:
    1. Add the rest of the questions.
    2. Quiz results page.

FINAL TASK 3 SUBMISSION

Video Presentation of AR App Prototype - Eco Loop

Asset building or collection progress:

FEEDBACK

Week 11 (4/12/2024):
AR Feature 1:
- Next button is usually on the right side.
- Can use a show-and-hide function; there is no need for two scenes for Feature 1. Can create two panels instead.
- Add a scan button to re-enable the image target, because the Next button disables it.
- Make a video of your app's issue and send it to me along with the ZIP folder of your app.

Week 12 (10/12/2024):
- Can add selected colour to the quiz buttons when clicked. (AR feature 3)
- Need to disable the image target at the start if you want to enable it with a button. (AR feature 1; see the sketch below)
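Based on this feedback, the image target just needs to start disabled and be toggled by the Scan and Next buttons. A minimal sketch of that, assuming the image target GameObject is assigned in the Inspector (names are placeholders):

```csharp
using UnityEngine;

// The image target starts disabled; the "Scan" button re-enables it after the
// "Next" button has switched it off.
public class ScanController : MonoBehaviour
{
    public GameObject imageTarget;     // the ImageTarget GameObject in the scene

    private void Start()
    {
        imageTarget.SetActive(false);  // disabled at the start, as suggested
    }

    // Hooked to the "Scan" button.
    public void EnableScanning()
    {
        imageTarget.SetActive(true);
    }

    // Hooked to the "Next" button before moving on to the next card.
    public void DisableScanning()
    {
        imageTarget.SetActive(false);
    }
}
```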
 
REFLECTION

Experience
The AR world really is a whole other world; it is so different from other apps and is very interactive and interesting. The overall journey of creating my AR app prototype was bumpy. I had many issues here and there, but once most of them were solved, it felt like such a reward. I got frustrated many times, as it takes a long time to figure things out since this platform is foreign to me. Nevertheless, seeing my app come to life made me happy. Though I struggled with some parts of creating my prototype, the in-class tutorials helped me a lot, as I could apply those methods to my app. Furthermore, Mr. Razif's guidance aided me in creating my prototype as well. However, I was a bit upset with myself for not being able to fully finish Feature 2, and I really hope to complete it successfully in my final assignment. I sure hope that when I continue to create the full AR app in the final project, everything will go smoothly and successfully.

Observation
I observed that only through practice and exploration do you slowly manage to work things out. Understanding the Unity editor is also important, as there are a lot of both technical and logical things involved. I also observed that organisation, like naming the scenes and files and separating different assets into different folders, is so important, as it can make your journey smoother and more seamless.

Findings
I found that solving the challenges in creating my AR app, like figuring out animation triggers and positioning the AR models, really helped me understand how everything works and helped me familiarise myself with the software. The more I worked through these problems, the more confident I became in using Unity and other tools. In addition, I found Mr. Razif's tutorials very helpful and informative, as he teaches us all the basics first so that we understand them, and even streams the sessions for us on YouTube so that we can apply them later on in our app.
