Task 3: Experience Design Final Prototype
Module: MMD60204 – Experiential Design
Weightage: 40% (Individual)
Timeframe: Week 07 – Week 14
Deadline: End of Week 14
Task Overview
For Task 3, we are required to develop a working prototype of the AR experience proposed in Task 2. This prototype must demonstrate a clear and interactive user journey, showing how users engage with the design and how the AR content responds.
The focus is on building a functional and meaningful AR experience — even if simplified — using Unity and Vuforia. This task showcases our understanding of experience design through actual implementation.
Submission Requirements
Upload to Google Drive and share the link
- Video presentation and walkthrough of your AR project prototype
- Explain your AR app and elaborate on what has been done and what has yet to be done
- Also include your asset-building or collection progress
- Blog link to the project post
Prototype Presentation Video
1. For the testing phase, I’ve started with two 3D models that represent different sides of me (e.g., artistic and sporty). These models are currently in basic form without textures or color, since my priority was to get the interaction and logic working first.
To save time sculpting from scratch, I:
- Sketched out my character from multiple perspectives
- Used an AI tool called hunyuan3d-2 to convert my 2D drawings into a 3D mesh
- Imported the generated model into Blender for some minor adjustments (position, scale, smoothing)

This method helped me focus on design and concept testing rather than getting stuck too early in modeling details.
2. Once the models were ready, I:
- Imported the Vuforia Engine (Windows) into Unity
- Created a Vuforia Image Target by uploading my chosen scan image to the Vuforia Target Manager
- Placed my 3D model as a child object of the Image Target in Unity
- Tested the AR experience — and it worked! When I scanned the image, the 3D model appeared on screen correctly
This confirmed that the basic AR tracking pipeline was functioning properly on my test device.
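Vuforia's default Image Target prefab already shows and hides child content for you, but the same behaviour can be sketched in a small script. This is a hedged sketch assuming the Vuforia 10+ API (`ObserverBehaviour`, `OnTargetStatusChanged`); the `model` field is a hypothetical reference to the avatar child object:

```csharp
using UnityEngine;
using Vuforia;

// Sketch: toggle the child 3D model when the Image Target is found or lost.
public class TargetVisibility : MonoBehaviour
{
    [SerializeField] GameObject model;   // the avatar placed under the Image Target
    ObserverBehaviour observer;

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // Treat both direct and extended tracking as "visible"
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        model.SetActive(tracked);
    }

    void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }
}
```

In practice the built-in `DefaultObserverEventHandler` on the Image Target prefab does this automatically; a custom handler like this is only needed when you want extra behaviour on found/lost events.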
3. Next, I wanted to simulate the transition from character selection to themed portfolio spaces, so I added:
- A UI confirmation panel that appears when the 3D avatar is clicked
- A button in the panel that says something like: “Do you want to enter the artistic world?”
- When the button is clicked, the app transitions into a new Unity scene representing the next stage of the experience (e.g., “artist_scene”)
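The click-to-confirm-to-scene flow above can be sketched as one small Unity C# script. This is a minimal sketch, not my exact implementation: `confirmPanel` is a hypothetical reference to the UI panel, the avatar needs a Collider for `OnMouseDown` to fire, and the target scene must be added to Build Settings:

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch: clicking/tapping the avatar opens a confirmation panel;
// the panel's button then loads the next scene.
public class AvatarSelect : MonoBehaviour
{
    [SerializeField] GameObject confirmPanel;        // panel with the "enter" button
    [SerializeField] string sceneName = "artist_scene";

    void OnMouseDown()          // requires a Collider on the avatar object
    {
        confirmPanel.SetActive(true);
    }

    // Wire this to the button's OnClick event in the Inspector
    public void EnterWorld()
    {
        SceneManager.LoadScene(sceneName);
    }
}
```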
4. Visual Enhancements: Artist World Environment & UI Style
To make the Artist World feel more immersive and personal, I designed a custom skybox in Blender. I created a gradient texture that represents a dreamy, colourful environment — something that reflects my artistic identity. After building the skybox, I exported it and applied it in Unity inside the artist_scene, which is the virtual space the user enters after clicking the UI and scanning a ground plane.
This gives the AR scene a sense of mood and atmosphere, rather than just placing objects in a blank space.
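Applying the skybox is normally a one-time setting in Lighting > Environment, but as a hedged sketch it can also be done at runtime — useful if different worlds later need different skies. `gradientSkybox` is a hypothetical material built from the exported Blender texture:

```csharp
using UnityEngine;

// Sketch: assign the gradient skybox material when the scene starts.
public class ApplySkybox : MonoBehaviour
{
    [SerializeField] Material gradientSkybox;  // material using the Blender gradient texture

    void Start()
    {
        RenderSettings.skybox = gradientSkybox;
        DynamicGI.UpdateEnvironment();  // refresh ambient lighting to match the new sky
    }
}
```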
5. Gradient-Based UI Design
I also experimented with a more dynamic and expressive UI. Inspired by colourful, fluid designs, I created a gradient-based moving UI — where buttons and panels subtly shift colours or glow over time. The goal was to make the interface feel more like a creative space, not just a menu.
This UI will appear after the user enters the Artist World, and will help guide them to view different floating artworks or read information about each piece.
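The colour-shifting effect described above can be sketched with a simple lerp between two colours. This is an assumption about the approach, not the exact shader or animation I used; `colourA`/`colourB` are placeholder gradient endpoints:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: slowly shift a UI element between two gradient colours over time.
public class GradientPulse : MonoBehaviour
{
    [SerializeField] Image target;   // button or panel background
    [SerializeField] Color colourA = new Color(0.95f, 0.55f, 0.75f);
    [SerializeField] Color colourB = new Color(0.55f, 0.70f, 0.95f);
    [SerializeField] float speed = 0.5f;

    void Update()
    {
        // PingPong produces a smooth back-and-forth value in [0, 1]
        float t = Mathf.PingPong(Time.time * speed, 1f);
        target.color = Color.Lerp(colourA, colourB, t);
    }
}
```

A UI shader or animated gradient texture would give richer results, but a per-frame colour lerp like this is the simplest way to prototype the "living interface" feel.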