Experiential Design - Final Project (Task 4)
16/12/2024 - 12/1/2025 (Week 13 - Week 15)
Katelyn Tan Kye Ling (0354148)
Bachelor Of Design In Creative Media (Experiential Design)
Task 4: Final Project
LECTURES
WEEK 13 - 17/12/2024
This week I decided to focus on AR Feature 1 by adding the rest of the terminologies that a user can scan. However, some of the 3D models I imported from Blender as FBX files didn't have any colour. It was strange because it worked for most files but not for certain ones. I also wasn't sure how to stop other target images from being scanned in certain panels.
Example of current issue: User is in the producer terminology panel, but is able to scan other cards like herbivore, omnivore and carnivore.
To end the week, I also finalised AR Feature 1 by making every button lead to where it should and using SetActive to control what should or should not show when a button is clicked or an image target is scanned. Lastly, I fixed the issue where some of my 3D models did not have colour. I did some research and found that I had to change a few settings when exporting the FBX from Blender, and I also had to extract the materials in Unity.
Next, I added the note asking users to turn their phones, the "good job" pop-up, the "try again" pop-up, and the SetActive logic for showing and hiding each component.
Experiential Design Final Project Recap by Katelyn Tan
There wasn't any class today as Sir said we could stay home to continue with our assignments, since many people may have stayed up late rushing the Task 3 submission. So I stayed home to complete my assignments.
HOLIDAY WEEK
Consultation with Mr. Razif; the issues and solutions are in the Feedback section.
WEEK 14 - 31/12/2024
There was a consultation session, but I didn't have much to show since I had already consulted during the holiday week, so I did not go.
WEEK 15 - 7/1/2025
No class.
INSTRUCTIONS
PROGRESS
Week 13:
This week I started off by working on the main menu first.
Fig 1.1 Main Menu Scene
Then I worked on connecting all the buttons to the respective scenes that each button should lead to.
Fig 1.2 Script to control the buttons
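A simplified sketch of what a scene-switching button script can look like (the scene names are placeholders, not my actual scene names):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch only: attach to a GameObject in the main menu scene and hook each
// button's OnClick event to LoadSceneByName, typing in the scene it should
// open (e.g. a placeholder name like "ARFeature1").
public class MenuNavigation : MonoBehaviour
{
    public void LoadSceneByName(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```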
Example of current issue: User is in the producer terminology panel, but is able to scan other cards like herbivore, omnivore and carnivore.
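The fix Mr. Razif later suggested (see the Feedback section) was to simply deactivate the other terminology cards once one image target is found. A minimal sketch of that idea, assuming the image target exposes an "On Target Found" event in the Inspector the way Vuforia's DefaultObserverEventHandler does:

```csharp
using UnityEngine;

// Sketch only: wire DeactivateOtherTargets() to the producer image target's
// "On Target Found" event in the Inspector, and drag the other terminology
// image targets (herbivore, omnivore, carnivore) into the array so they can
// no longer be scanned while the producer panel is open.
public class RestrictScanning : MonoBehaviour
{
    public GameObject[] otherImageTargets;

    public void DeactivateOtherTargets()
    {
        foreach (GameObject target in otherImageTargets)
        {
            target.SetActive(false);
        }
    }
}
```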
Holiday Week:
I started this week by working on AR Feature 3. In my prototype I had only made question 3 to make sure it works, so for the final project I made the rest of the questions. After making all of the questions, I used the OnClick SetActive bool for each button choice so that when the button is clicked, that question panel deactivates and the next question activates. Shown in Fig 1.3 - right side of the image.
Fig. 1.3 Switching from One Question to Next Question
After that I found an issue where the question switches immediately and doesn't show the pressed colour of the answer the user has chosen, which is the visual feedback that I want. I asked ChatGPT for a little help with the script, and it was pretty simple to do. I attached the script to each button and assigned the current and next question. I no longer needed the SetActive bool in the OnClick event, so I removed it, but I still had to add an OnClick event and assign the game object to the button. I set the transition delay to 0.4f.
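A rough sketch of that delayed transition (the field names are my own placeholders, but the idea is the same):

```csharp
using System.Collections;
using UnityEngine;

// Sketch only: attach to each answer button, assign the current and next
// question panels, and call SwitchQuestion() from the button's OnClick event.
public class QuestionTransition : MonoBehaviour
{
    public GameObject currentQuestion;   // panel this button sits on
    public GameObject nextQuestion;      // panel to show next
    public float transitionDelay = 0.4f; // long enough for the pressed colour to show

    public void SwitchQuestion()
    {
        StartCoroutine(SwitchAfterDelay());
    }

    private IEnumerator SwitchAfterDelay()
    {
        yield return new WaitForSeconds(transitionDelay);
        // Activate the next panel before deactivating the current one, because
        // deactivating this button's parent panel also stops this coroutine.
        nextQuestion.SetActive(true);
        currentQuestion.SetActive(false);
    }
}
```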
After that I made the results page, but I wasn't sure how to calculate the results of the quiz. I asked Mr. Razif for help and he gave me a YouTube playlist tutorial. Since the tutorial's exercise was slightly different from mine, I had to watch through the videos first to understand what each term means and how I had to change the code to fit my app. It took me a little while to figure out, but I ended up succeeding, which I was super happy about.
Fig 1.6 Quiz Result Page
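My actual scoring script follows that tutorial, so the details differ, but the core idea is a simple counter like this sketch (the field names and total are placeholders):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch only: correct answer buttons call AddCorrectAnswer() from their
// OnClick event, and the results panel calls ShowResult() when it is shown.
public class QuizScore : MonoBehaviour
{
    public Text resultText;         // Text element on the results page
    public int totalQuestions = 5;  // placeholder total

    private int correctAnswers;

    public void AddCorrectAnswer()
    {
        correctAnswers++;
    }

    public void ShowResult()
    {
        resultText.text = "You got " + correctAnswers + " / " + totalQuestions + " correct!";
    }
}
```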
Fig 1.7 Changing settings before saving the file in Blender
Week 14:
In the previous task, there was an issue with my second feature, so I started off by trying to fix that this week. In that task, I had struggled to get higher star ratings for some image targets, so I fixed that by having less white space around the image and making the text bigger.
After that, I tried to scan the 4 images again, but it was still really difficult, so I printed the images at a smaller size. After some trying and struggling it worked; I just had to have good lighting and a good angle.
Fig 1.9 4 Image Targets can be scanned
And when I tried to play it, I noticed that even when my sequence is correctly arranged, Unity says otherwise. I believed it was either the Iriun camera or the sequence layout of my image targets, so I tried to fix that first and asked Mr. Razif for help. Mr. Razif asked me to test in the built app first, as it may work there but not on my computer, since the Iriun webcam only works in portrait mode.
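For reference, the order check behind this feature can be sketched roughly like this (the target names and the screen-position approach are placeholders; my own script is based on the class exercises):

```csharp
using System.Linq;
using UnityEngine;

// Sketch only: sort the four scanned image targets by their horizontal
// position on screen and compare the resulting names with the expected
// food chain order.
public class SequenceChecker : MonoBehaviour
{
    public Transform[] imageTargets; // the 4 food chain image targets
    public string[] expectedOrder = { "Producer", "Herbivore", "Omnivore", "Carnivore" };

    public bool IsSequenceCorrect()
    {
        Camera cam = Camera.main;

        string[] leftToRight = imageTargets
            .OrderBy(t => cam.WorldToScreenPoint(t.position).x)
            .Select(t => t.name)
            .ToArray();

        return leftToRight.SequenceEqual(expectedOrder);
    }
}
```

A check like this depends on the camera feed's orientation, which could explain why the result differs between the portrait-only Iriun webcam on my computer and the actual phone.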
Week 15:
I first tried to build the app from my Windows laptop to my iPhone. However, I was facing many issues at the time and the tutorial that I followed to export the app wasn't working for me. So I decided to work on finalising my design first.
Finalising the App's Designs:
I replaced my buttons and any element that's supposed to have rounded corners with images. Then I also wanted a video to play as visual feedback when the user gets the food chain sequence right or wrong after arranging it. I couldn't add the video directly, so I split the GIF into sprites and made animations in Unity.
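A small sketch of how that feedback can be triggered (the names are placeholders): each feedback object holds the sprite animation built from the GIF frames and plays when it is activated, so the script only has to switch the right one on after the food chain check.

```csharp
using UnityEngine;

// Sketch only: assign the "good job" and "try again" animation objects in
// the Inspector and call ShowFeedback() with the result of the sequence check.
public class SequenceFeedback : MonoBehaviour
{
    public GameObject correctAnimation; // "good job" animation object
    public GameObject wrongAnimation;   // "try again" animation object

    public void ShowFeedback(bool isCorrect)
    {
        correctAnimation.SetActive(isCorrect);
        wrongAnimation.SetActive(!isCorrect);
    }
}
```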
Since it was really tough for me to export from Windows to iOS, I decided to transfer all my files to my mum's MacBook and even downloaded Unity and Visual Studio Code there. That whole process was quite a hassle, and even exporting the app to my iPhone from the MacBook had quite a few issues. However, after trying a few times, it exported!
After that, I still had to export the app a few more times as there were a few issues with the placement of my Unity components. I noticed that my RectTransform anchors were not set to centre, which caused my panels to look different from what I see in Unity. The app's scaling was also really weird, so I had to change the canvas size to match my iPhone's screen size. After exporting a few times, I was satisfied with my app.
Fig 1.13 Issue on iPhone view (1)
Fig 1.14 Issue on iPhone view (2)
FINAL SUBMISSION
FINAL GOOGLE DRIVE LINK:
PRESENTATION + WALKTHROUGH VIDEO:
UPDATED APP PROPOSAL:
PRESENTATION SLIDES:
FEEDBACK
HOLIDAY WEEK (24/12/2024):
I had a consultation with Mr. Razif and asked him to help me with a few issues I had.
Issues:
AR Feature 1:
1. 3D models have no colour: need to remap the materials and import the files into Unity.
2. Other terminology cards can still be scanned in other panels: the simple solution is to just deactivate the other terminology cards when one image target is found.
AR Feature 3:
1. Results page (calculating the number of correct questions): Mr. Razif gave me a YouTube playlist tutorial on my issue. (Playlist).
2. Rounded box design: need to manually edit it in Photoshop and import it into Unity.
WEEK 15 (8/1/2025):
AR Feature 2:
- Set the app to either all landscape mode or all portrait mode.
- Either that, or try exporting the app first; it may not work from the computer as the Iriun webcam doesn't have landscape mode.
REFLECTION
Experience
Throughout this project, I experienced a mix of excitement and challenges. At the start, I felt excited and optimistic, but as the challenges started piling up, I also felt moments of frustration and doubt. However, Mr. Razif's in-class exercises, video tutorials and consultation sessions guided me through this final project journey, which I am very grateful for. This assignment really taught me problem solving: when something goes wrong, you have to slowly look into every detail to find the issue and do trial and error. There were times when I questioned if I could even finish everything on time, but seeing my app slowly build up bit by bit was satisfying and rewarding.
Observation
I observed that understanding the tools I used, like Unity and Blender, required patience and trial and error. Working with scripts, animations, and layout adjustments taught me how everything in an app is interconnected. I also learned that testing every single time you add something is vital, so that you know right away what went wrong and don't have to keep hunting for the issue once you've added a lot of things. Through this final project, and the module as a whole, I panicked so many times when something didn't work. So I also realized how important it is to stay calm and keep experimenting when things don't work out the way I imagined.
Findings
I discovered that materials in 3D models need proper mapping to display colours correctly in Unity. I also realized that breaking down problems into smaller steps, like watching tutorials or asking for help, makes solving issues easier. I learned to trust myself more, even when I was navigating unfamiliar territory. It also reminded me how satisfying it is to overcome challenges and see an idea turn into reality. I now understand how tough it is to build an app, and I appreciate app developers so much more, especially the ones who create these types of AR apps that let users interact more; it really requires a lot of time and effort. Looking back, I feel proud of the journey and grateful for the growth it brought me.