Week 1: Introduction to the Module
In our first week, we were introduced to the module and to the main types of AR:
- Marker-based AR (e.g., using images to trigger content)
- Markerless AR (e.g., placing objects on flat surfaces)
- Location-based AR (e.g., AR that reacts to GPS coordinates)
Week 2: Finalizing My AR Direction & Group Journey Mapping
For my own AR direction, I narrowed down ideas for how the portfolio experience could work:
- Creating AR navigation through artwork sections
- Using multiple image targets to showcase different parts of my work
- Adding interactive buttons or voiceover introductions

For the group journey-mapping exercise, we broke the user experience down into:
- Gain Points (positive experiences)
- Pain Points (user frustrations)
- Proposed AR Solutions

We applied several UX methods along the way:
- Empathy Maps: to visualize what users think, feel, see, and do
- User Journey Maps: to break down the entire user flow
- Pain Point Identification: to define where users struggle, and why
Week 3: Understanding XR + Group Ideation for AR in the Gym
Date: 7 May 2025
Module: MMD60204 – Experiential Design
Topic: XR Ecosystem, AR Types, and Practical Tools
XR Overview: AR, VR, and MR Explained
In this week’s lecture, Mr. Razif introduced us to the world of XR (Extended Reality) — a broad term that includes:
- AR (Augmented Reality) – overlays digital content onto the real world
- VR (Virtual Reality) – a fully immersive digital environment
- MR (Mixed Reality) – blends physical and digital objects that interact in real time
This breakdown helped us see where our project ideas fit and how different levels of immersion affect user experience design.
We also did a group tutorial session where we were asked to brainstorm AR ideas for a gym environment. After short team discussions, each group shared their ideas and received feedback.
Some concepts discussed included:
- AR form-correction overlays using mirrors or cameras
- Scanning gym equipment for tutorial videos
- Virtual coaches appearing next to you during workouts
- Gamified rep tracking using AR scoreboards
Slide For Our Presentation:
After the idea presentations, we had a technical tutorial session to begin experimenting with Vuforia in Unity. We were guided through basic steps such as:
- Setting up a Unity project with the Vuforia Engine
- Importing and assigning Image Targets
- Linking simple 3D objects to appear when a marker is scanned (a script sketch follows below)
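To make sense of that last step for myself, here is a rough sketch of a script that shows or hides a target's content when tracking is found or lost. It assumes Vuforia Engine 10+, where tracking events come from ObserverBehaviour; Vuforia's own DefaultObserverEventHandler normally handles this for you, and all the names here are my own placeholders.

```csharp
using UnityEngine;
using Vuforia;

// Attach to an Image Target: shows/hides the child content
// whenever the target is found or lost.
public class TargetContentToggle : MonoBehaviour
{
    [SerializeField] GameObject content;   // the 3D object to reveal

    ObserverBehaviour observer;

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += HandleStatusChanged;
        content.SetActive(false);          // hidden until the marker is scanned
    }

    void HandleStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        content.SetActive(tracked);
    }

    void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= HandleStatusChanged;
    }
}
```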
This hands-on tutorial was especially helpful in preparing us for our prototype phase in Task 2. It also gave me more confidence in how I’ll implement the technical side of my AR portfolio project.
Next Steps
- Continue testing Vuforia's Image Target and Ground Plane modes
- Sketch a visual layout of my AR portfolio interactions
- Explore possible mini-interactions like button pop-ups or animated transitions
- Review the gym AR ideas as possible inspiration for gesture-based UI
Week 4: Adding Animation & Button Interaction in Vuforia
Date: 14 May 2025
Module: MMD60204 – Experiential Design
Focus: Animation Triggering with UI Buttons (Unity + Vuforia)
This week, we continued our hands-on work using Vuforia Engine inside Unity, building on what we learned last week about image tracking.
Previously, we set up a simple AR experience where a 3D object appears when an image is scanned using the camera. For this session, we took it a step further by adding interactivity and animation.
Learning Focus: Animation + Buttons
We used the same 3D object from last week, but this time we added an animation clip to make it move (e.g., rotate, bounce, or wave). Then we added two UI buttons to the AR canvas:
- Start Button – plays the animation
- Stop Button – pauses or stops the animation

The behaviour we built works like this (a script sketch follows below):
- The animation plays only when Start is pressed
- It pauses when Stop is pressed
- While the animation is running, the Stop button is shown and the Start button is hidden, and vice versa
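Here is a minimal sketch of how that button-and-animation logic can be scripted. It assumes the 3D object has an Animator with a single clip; the component and field names are my own placeholders, not the exact script from class.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Wires two UI buttons to play/pause an Animator and swaps
// their visibility so only the relevant button is shown.
public class AnimationToggle : MonoBehaviour
{
    [SerializeField] Animator animator;   // Animator on the 3D object
    [SerializeField] Button startButton;
    [SerializeField] Button stopButton;

    void Start()
    {
        startButton.onClick.AddListener(PlayAnimation);
        stopButton.onClick.AddListener(StopAnimation);
        StopAnimation();                  // begin in the stopped state
    }

    void PlayAnimation()
    {
        animator.speed = 1f;              // resume playback
        startButton.gameObject.SetActive(false);
        stopButton.gameObject.SetActive(true);
    }

    void StopAnimation()
    {
        animator.speed = 0f;              // pause the current clip
        startButton.gameObject.SetActive(true);
        stopButton.gameObject.SetActive(false);
    }
}
```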
This helped me understand how logic and UI can be combined to control user experience in a much more dynamic way, rather than just letting things run automatically.
Why This Matters for My Project
Since my Task 2 concept (“Meet Me in AR”) includes interactive avatars and environments, I will definitely need similar logic for:
- Starting animations only when a specific avatar is selected
- Switching between environments or hiding UI panels
- Managing clean transitions between different scenes or states
This week’s tutorial gave me a solid foundation for those future steps.
Week 6: Scene Transitions & Panel Control with Buttons
Date: 28 May 2025
Module: MMD60204 – Experiential Design
Focus: Scene Management, Menu UI, and Interactive Panels in Unity
This week’s session continued our technical development in Unity, especially focusing on how to build interactive flows using buttons and scene transitions — something that will be very useful for my multi-scene AR portfolio concept.
Key Things We Learned
1. Switching Between Scenes
We learned how to create multiple scenes in Unity and connect them using UI buttons. By adding all the scenes into Unity's Build Settings, we were able to:
- Load one scene from another using a script (see the sketch below)
- Navigate between a main menu, content scenes, or AR scenes
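Here is a minimal sketch of the kind of scene-loading script we used, assuming each target scene has been added to File > Build Settings; the class name is my placeholder.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attach to any GameObject and hook LoadSceneByName up to a
// UI Button's OnClick event in the Inspector.
public class SceneLoader : MonoBehaviour
{
    // The scene must be listed in Build Settings.
    public void LoadSceneByName(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```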
This is an important skill for building layered user experiences — like in my project where users switch between “Artistic Me,” “Sporty Me,” and “Professional Me.”
2. Creating a Menu Page
We also learned how to:
- Design a menu interface
- Position UI buttons exactly where we want them on screen (using anchors and canvas tools)
- Apply scripts so each button loads a specific scene when clicked
It helped me understand how to plan the user journey visually — giving users a clear starting point or hub to navigate from.
3. Controlling Panels with a Button Press
We also learned how to show or hide UI panels from code. We added a panel that:
- Is hidden by default
- Appears only after a button is pressed (see the sketch below)
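A minimal sketch of that panel logic, assuming the public methods are hooked to Button OnClick events in the Inspector; the field name is my placeholder.

```csharp
using UnityEngine;

// Keeps a UI panel hidden until a button shows it.
// Hook ShowPanel/HidePanel to Button OnClick events.
public class PanelController : MonoBehaviour
{
    [SerializeField] GameObject infoPanel;        // the panel to control

    void Start() => infoPanel.SetActive(false);   // hidden by default

    public void ShowPanel() => infoPanel.SetActive(true);
    public void HidePanel() => infoPanel.SetActive(false);
}
```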
This gives more control over when certain information or UI elements are visible. It’s something I’ll apply to my portfolio when showing project info or artwork explanations — only when the user clicks.
Why This Matters for My AR Project
These skills are directly aligned with what I need for my multi-avatar scene structure:
- I'll use scene switching to jump between different environments (Art, Sport, Career)
- Menus and back buttons will help control navigation
- Panel visibility scripting will let me keep the screen clean until needed
Week 7: Testing AR on iPhone with MacBook (iOS Build Setup)
Date: 4 June 2025
Module: MMD60204 – Experiential Design
Focus: Unity Build for iOS + Real Device Testing
This week marked an important step for me — we finally started setting up our AR project to run on our mobile phones. Since I'm using an iPhone 14 and iOS builds can't be made from Windows, this week we switched to the MacBooks provided by the university.
Setting Up Unity for iOS
In this session, we learned how to:
- Configure Unity's build settings for iOS (sketched below)
- Set the correct bundle ID and provisioning profile
- Use Xcode to deploy the AR project to our phones
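We did all of this through the Unity editor UI, but for my own notes, the same settings can also be applied from an editor script. A rough sketch; the bundle ID, team ID, scene path, and output path are placeholders, and the file must live in an Editor folder.

```csharp
using UnityEditor;
using UnityEditor.Build.Reporting;

// Editor-only helper: applies iOS identity settings and builds
// an Xcode project that can then be deployed from a Mac.
public static class IOSBuilder
{
    [MenuItem("Build/Build iOS Xcode Project")]
    public static void BuildIOS()
    {
        // Placeholder bundle ID and Apple developer team ID.
        PlayerSettings.SetApplicationIdentifier(
            BuildTargetGroup.iOS, "com.example.arportfolio");
        PlayerSettings.iOS.appleDeveloperTeamID = "TEAMID123";

        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" },
            locationPathName = "Builds/iOS",   // Xcode project output folder
            target = BuildTarget.iOS,
            options = BuildOptions.None
        };

        BuildReport report = BuildPipeline.BuildPlayer(options);
        UnityEngine.Debug.Log($"iOS build finished: {report.summary.result}");
    }
}
```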
This was my first time doing this process, so it took some time to understand how everything links together — but once set up, it worked!
Running AR on My iPhone
After building the scene and connecting the iPhone, I was able to:
- Launch the Unity project directly onto my iPhone 14
- Use the camera to scan an Image Target
- See the 3D object appear in real-time AR on my device
It was really satisfying to see my scene come to life on an actual phone. It gave me a better sense of how users will experience the project in the real world — not just in Unity's preview window.
Why This Matters for My Project
Since my final concept is a mobile AR portfolio, testing on the actual device is essential. I need to make sure:
- The UI and 3D models are the right scale
- Buttons are responsive
- Animations load smoothly on iOS
- Image tracking works in different lighting conditions
This session gave me a head start on all of that.
Week 8: Adding UI Navigation to AR in Unity (Mac/iOS Build)
Date: 11 June 2025
Module: MMD60204 – Experiential Design
Focus: UI → AR Scene Transitions for iOS Devices
This week, we continued working on the MacBook Unity setup for iOS, building on what we started last week — which was mainly focused on getting AR tracking to run on iPhone.
Now that the basic scanning was working, we moved forward by adding UI pages and buttons to create a more complete user experience.
What We Implemented
- UI Interface Page: We created a simple main menu interface using Unity UI tools. This included a start screen with buttons that guide the user into the AR content.
- Button Navigation: We coded buttons so that pressing one triggers a scene transition — for example, from the welcome screen to the AR scanner scene. This is similar to what I'll need in my final project, where users move between the 3D avatar menu and themed portfolio spaces. (A sketch of this wiring follows below.)
- Build & Test on iPhone: After integrating the UI, I exported the updated build to my iPhone 14 using the MacBook and tested the full flow: open the app, tap the "Start" button, enter the AR scene and begin scanning, and see the AR content appear.

Everything worked smoothly, and the test gave me a clearer idea of how to polish the transitions for real users.
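Back in Week 6 we hooked scene-loading methods to buttons through the Inspector; the same navigation can also be wired up at runtime. A minimal sketch, with the scene and object names as my own placeholders.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.UI;

// Attached to the welcome screen: sends the user from the
// intro UI into the AR scanner scene when Start is tapped.
public class StartScreen : MonoBehaviour
{
    [SerializeField] Button startButton;

    void Awake()
    {
        // "ARScanner" is a placeholder; the scene must be in Build Settings.
        startButton.onClick.AddListener(
            () => SceneManager.LoadScene("ARScanner"));
    }
}
```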
Why This Is Important for My Project
My AR portfolio is designed to feel intentional and guided. I don’t want users to be dropped straight into AR — I want them to start with a clean intro screen that invites them to explore.
This week helped me:
- Understand how to manage scene loading with buttons
- Think about entry points and flow between scenes
- Make my AR experience feel more like an actual app
Week 9: Plane Tracking & GitHub Collaboration
Date: 18 June 2025
Module: MMD60204 – Experiential Design
Focus: Markerless AR (Plane Tracker) + Project Sharing with GitHub
This week’s session introduced two key areas that can take our AR development further:
- Plane tracking for more immersive, markerless experiences
- Using GitHub to manage and collaborate on Unity projects
Using Plane Tracker in Unity
Instead of using an image marker, we learned how to apply Vuforia’s Ground Plane tracking. This allows AR content to be placed directly on real-world surfaces (like a floor or table), making the experience feel more natural.
We practiced by placing a virtual house-like object onto a detected plane. Key things I learned:
- How to enable device tracking and set up the plane finder
- How to position and scale objects so they appear correctly in the environment
- How to tap on a surface to place the object where the user wants it (sketched below)
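Most of this is wired up in the Inspector using Vuforia's Plane Finder and Content Positioning behaviours, but the tap-to-place hookup can also be written in code. A rough sketch, assuming Vuforia Engine 10+ and placeholder names.

```csharp
using UnityEngine;
using Vuforia;

// Places AR content on a detected ground plane wherever the
// user's tap produces a successful hit test.
public class TapToPlace : MonoBehaviour
{
    [SerializeField] PlaneFinderBehaviour planeFinder;
    [SerializeField] ContentPositioningBehaviour contentPositioning;

    void Awake()
    {
        // PlaneFinderBehaviour raises this event on an interactive hit test;
        // ContentPositioningBehaviour moves its anchored content there.
        planeFinder.OnInteractiveHitTest.AddListener(OnHitTest);
    }

    void OnHitTest(HitTestResult result)
    {
        contentPositioning.PositionContentAtPlaneAnchor(result);
    }
}
```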
This technique can create more open, exploratory AR experiences — useful for games, education, and real-world simulations.
Version Control with GitHub
After that, we were introduced to GitHub, a platform for storing and collaborating on code and Unity projects. This is especially helpful when working in a group, since it allows:
- Multiple people to sync their progress
- Versions of the project to be saved and backed up
- File conflicts to be avoided when several people work at the same time
Although I’m working solo for my final project, it’s still useful to know how Git works, especially if I want to:
- Keep my own project organized
- Experiment with different versions safely
- Possibly collaborate with others in future modules
Relevance to My AR Portfolio
For now, I’m sticking with image tracking for my portfolio project (since the name card is a key entry point), but knowing how to use plane tracking gives me options if I decide to turn the 3D avatar menu or environments into more spatially anchored scenes in the future.
And GitHub is something I might start using to back up my work across devices or share it during presentations.
Week 10: UI Button Building for Video
Module: MMD60204 – Experiential Design
Focus: UI Button Building for Video