Experiential Design / Task 3: Project MVP Prototype

2025.06.10 - 2025.07.06 (Week 8 - Week 12)

Teh Ming En/0364908

Experiential Design / Bachelor of Design (Honours) in Creative Media / Taylor's University

Task 3: Project MVP Prototype

Instructions

Once their proposal is approved, the students will then work on the prototype of their project. The prototype will enable the students to discover certain limitations that they might not have learnt about before and they’ll have to creatively think about how to overcome the limitations to materialize their proposed ideas. The objective of this task is for the students to test out the key functionality of their project. 
  • The output may not necessarily be a finished, visually designed project. 
  • The students will be gauged on their prototype functionality and their ability to creatively think of alternatives to achieve the desired outcome.
Requirements
  • Screen Design visual prototype (Unity)
  • Functioning MVP of the App Experience
Submission
  • Video walkthrough and presentation of the prototype
  • Online posts in your E-portfolio as your reflective studies
Progress

This is a group assignment that Chin Ting and I worked on together. Our topic is about creating an AR route guide for freshmen and visitors at Taylor’s University. The idea is that users can scan a building with their device, select their destination, and have the AR route guide appear directly on their screen to help them navigate the campus.

A review of our Figma prototype, which we created in Task 02:


We initially planned to create the route scan using the Mapper app to capture the environment, along with the Immersal SDK engine in Unity for building the AR experience. However, we discovered that the Mapper app can only capture up to 100 photos in one session. This limitation meant that our original route—from the library to the Grand Hall—was too long to capture completely, since it would require far more photos than allowed.

Because of this restriction, we decided to revise our plan by changing both the starting and ending points. We chose a shorter, more manageable route—from E7.14 classroom to D7.03 classroom. This new route is much easier to scan within the photo limit and still allows us to demonstrate the AR navigation concept effectively.

We used Unity's AR Foundation framework for the whole project.


Fig 1.1 AR Foundation Framework

We used the Measure app on the iPhone to record the distances for each specific segment of the route. This helped us accurately scale the 3D route path in Unity. For the AR route itself, we created a simple 3D cube object and adjusted its width and scale to resemble a path on the ground, making it easier for users to follow the guide visually in the AR experience.
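
As a rough illustration, a measured distance maps directly onto a cube's scale, since a default Unity cube is 1 m per side. The sketch below is illustrative; the values and names are not the exact ones from our project:

```csharp
using UnityEngine;

// Stretches a default Unity cube into a flat path segment whose length
// matches the real-world distance recorded with the Measure app (in metres).
public class PathSegment : MonoBehaviour
{
    public float measuredLength = 4.2f; // illustrative value from the Measure app
    public float pathWidth = 0.5f;      // wide enough to read as a walkable path
    public float pathThickness = 0.01f; // thin enough to sit flat on the ground

    void Start()
    {
        // A default cube is 1 m per side, so scale values map directly to metres.
        transform.localScale = new Vector3(pathWidth, pathThickness, measuredLength);
    }
}
```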

Fig 1.2 Route Arrow

Route Arrow 3D Object

We added route arrows along the path to help guide users more clearly. To create the arrow, I used two 3D cube objects in Unity and adjusted their scale and rotation to form an arrow shape that would fit well on the route path. I also wrote a custom script that places these arrows automatically along the entire route and added a floating animation to make them appear more interactive and noticeable in the AR experience.

The script made it much easier for me to fine-tune the behavior of the arrows. It includes two adjustable fields: one for controlling the floating speed of the animation and another for setting the spacing between each arrow along the path. This flexibility allowed me to quickly test and modify the appearance and movement of the arrows to achieve the desired visual effect.
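
A simplified sketch of how such an arrow-placement script can work; the class and field names here are illustrative rather than the exact ones in our project:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Spawns arrow prefabs at regular intervals along a straight route segment
// and bobs them up and down for a floating effect.
public class ArrowSpawner : MonoBehaviour
{
    public GameObject arrowPrefab;   // the two-cube arrow shape
    public Transform routeStart;
    public Transform routeEnd;
    public float spacing = 1.5f;     // distance between arrows along the path
    public float floatSpeed = 2f;    // speed of the floating animation
    public float floatHeight = 0.1f; // amplitude of the bobbing motion

    private readonly List<Transform> arrows = new List<Transform>();
    private readonly List<Vector3> basePositions = new List<Vector3>();

    void Start()
    {
        Vector3 dir = (routeEnd.position - routeStart.position).normalized;
        float length = Vector3.Distance(routeStart.position, routeEnd.position);

        // Place one arrow every `spacing` metres, facing along the route.
        for (float d = 0f; d <= length; d += spacing)
        {
            Vector3 pos = routeStart.position + dir * d;
            Transform arrow = Instantiate(arrowPrefab, pos,
                Quaternion.LookRotation(dir), transform).transform;
            arrows.Add(arrow);
            basePositions.Add(arrow.localPosition);
        }
    }

    void Update()
    {
        // A sine wave gives a gentle, continuous floating motion.
        float offset = Mathf.Sin(Time.time * floatSpeed) * floatHeight;
        for (int i = 0; i < arrows.Count; i++)
            arrows[i].localPosition = basePositions[i] + Vector3.up * offset;
    }
}
```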

In addition to the arrow placement, we enhanced the route's visual appearance by modifying its color. Initially, the route was rendered in a solid white material for clarity. Later, we applied a custom material with a gradient that transitions smoothly from blue to white. The gradient can be implemented either by modifying the shader or by applying a texture with a UV-based gradient, depending on the rendering pipeline in use. This effect made the path more visually dynamic and easier to follow, especially in outdoor lighting conditions where contrast and clarity are important.
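
One way to achieve such a gradient without hand-writing a shader is to generate a small gradient texture at runtime and assign it to the route's material. A minimal sketch, assuming the route mesh's UVs run along the path's length:

```csharp
using UnityEngine;

// Builds a simple blue-to-white gradient texture and applies it to the
// route's material, so the path fades from blue to white along its length.
public class RouteGradient : MonoBehaviour
{
    public Renderer routeRenderer;
    public Color startColor = Color.blue;
    public Color endColor = Color.white;
    public int resolution = 256;

    void Start()
    {
        var tex = new Texture2D(resolution, 1, TextureFormat.RGBA32, false);
        for (int x = 0; x < resolution; x++)
        {
            // Lerp from blue to white across the texture's width;
            // the route mesh's UVs map this along the path.
            float t = x / (float)(resolution - 1);
            tex.SetPixel(x, 0, Color.Lerp(startColor, endColor, t));
        }
        tex.Apply();
        routeRenderer.material.mainTexture = tex;
    }
}
```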

Fig 2.1 Arrow

Fig 2.2 Script

Arrived Destination Label 

I found this 3D object on Google and placed it at the final point of the route to indicate the destination. To make it more dynamic and visually engaging, we added a floating animation through a custom script in Unity. The script lets us easily adjust both the speed and duration of the animation, drawing attention to the endpoint and enhancing the overall interactivity of the AR experience.

Click here for the 3D object link.
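
A minimal sketch of that floating animation, with the speed and duration exposed as adjustable fields (the names are illustrative):

```csharp
using UnityEngine;

// Bobs the destination label up and down for a limited time after it appears,
// with the speed and duration exposed in the Inspector.
public class DestinationLabel : MonoBehaviour
{
    public float speed = 2f;    // bobbing speed
    public float duration = 5f; // how long the animation runs, in seconds
    public float height = 0.2f; // bobbing amplitude

    private Vector3 startPos;
    private float elapsed;

    void Start()
    {
        startPos = transform.localPosition;
    }

    void Update()
    {
        if (elapsed >= duration) return; // stop after the configured duration

        elapsed += Time.deltaTime;
        float offset = Mathf.Sin(elapsed * speed) * height;
        transform.localPosition = startPos + Vector3.up * offset;
    }
}
```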

Fig 3.1 Destination Label

Fig 3.2 Script

We also added a Particle System to our scene under the ARRouteModel Variant object. It creates a glowing, sparkle-like effect that makes the final destination object stand out.

Fig 3.3 Particle System

UI Interface

For the UI interface of our app, we designed and implemented four main pages: Onboarding page, Home page, Location page, and Profile page. We began by creating a new canvas in Unity to serve as the foundational layout for all the UI elements. Then, we designed each interface using panels. 

Fig 4.1 UI Interface

Fig 4.2  Button Design

When designing buttons, we encountered a limitation in Unity: there is no built-in way to directly adjust a button's border radius. To preserve our design, we exported the rounded-rectangle button designs from Figma as images and imported them into Unity, which allowed us to keep our original look.

Additionally, we imported the font into Unity to better align with the overall design and tone of the app. The font choice plays an important role in reinforcing the app's identity and enhancing user experience, so we made sure to match it closely with our original design plan.

Fig 4.3 Imported Font

Fig 4.4 Link Button

To enable smooth navigation between pages, we used scripts to link each button to its target panel. In these scripts, we wrote functions that activate the target panel while deactivating the others, ensuring only one page is visible at a time. We then attached these scripts to the buttons and, through the Unity Inspector, assigned the appropriate methods to each button's OnClick() event. This allowed users to navigate seamlessly across the UI, from onboarding to home, from home to location, and so on, by simply tapping the buttons.
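
A simplified sketch of that panel-switching pattern (the names are illustrative, not the exact ones from our project):

```csharp
using UnityEngine;

// Shows exactly one UI panel at a time: activates the target panel
// and deactivates all the others.
public class PageNavigator : MonoBehaviour
{
    public GameObject[] panels; // e.g. Onboarding, Home, Location, Profile

    // Hooked up to each button's OnClick() event in the Inspector,
    // with the index of the page that button should open.
    public void ShowPage(int index)
    {
        for (int i = 0; i < panels.Length; i++)
            panels[i].SetActive(i == index);
    }
}
```

Each button's OnClick() then calls ShowPage with the index of its target panel, so the Home button passes 1, the Location button passes 2, and so on.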

Background Music

Fig 5.1 BGM

We decided to implement background music that plays during walking or exploration to create a more immersive atmosphere while users navigate the campus in AR.

First, we prepared a looping background music file in .mp3 format that matched the tone and mood of the app: a track with a simple piano melody.

This helps to reduce user stress, enhance focus, and make the walking experience more enjoyable as users navigate the campus. We hope it encourages users to slow down and absorb the environment around them, creating an immersive AR journey.

After that, we imported the audio file into Unity by dragging it into the Assets folder. Next, we created an empty GameObject in the scene and renamed it. We added an Audio Source component to this GameObject and assigned our background music clip to the Audio Clip field. To make sure the music plays continuously, we enabled both "Play On Awake" and "Loop" options in the Audio Source settings.
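
For reference, the same configuration can also be done from a script instead of the Inspector; a minimal sketch:

```csharp
using UnityEngine;

// Sets up looping background music in code, mirroring the
// "Play On Awake" and "Loop" options we enabled in the Inspector.
[RequireComponent(typeof(AudioSource))]
public class BackgroundMusic : MonoBehaviour
{
    public AudioClip musicClip; // the looping piano track imported into Assets

    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();
        source.clip = musicClip;
        source.loop = true; // restart automatically so the music never stops
        source.Play();      // begin playback as soon as the scene starts
    }
}
```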

Testing Phase

Fig 6.1 Testing App

Once we had most of the features implemented, we exported the AR navigation app to a mobile phone and began testing it on-site at our campus. For our test route, we walked from classroom E7.14 to D7.03. This allowed us to experience the app in a real environment and see how well the guidance system worked.

During the first few tests, we noticed that some features weren’t responding as expected. For example, the route and arrows didn’t appear properly.

After several rounds of testing and tweaking, we were finally able to get everything working the way we imagined. The arrows floated smoothly along the path and the gradient color on the route looked clear and attractive. Testing on-site really helped us identify and fix issues that we wouldn’t have noticed in the editor alone.

Submission

Fig 7.1 Walk Through Video

Reflection

Looking back on the development of our MVP prototype for the Experiential Design project, I feel a strong sense of achievement despite the challenges we faced. Initially, our plan was to capture a full route from the library to the Grand Hall using the Mapper app, but because of its limit of 100 photos per session, we had to adjust our approach. We decided to shorten the route from E7.14 to D7.03 instead, which still allowed us to test the core AR navigation functionality effectively. The process of building this prototype in Unity taught me a lot, not just about technical implementation using AR Foundation, but also about problem-solving under constraints. We created a floating arrow path with a gradient material and animated destination markers to guide users through the space. One of the most satisfying parts was integrating interactive elements like particle effects and a bouncing animation at the endpoint to give users a clear sense of arrival. I also learned to work around Unity's UI design limitations by importing button designs and fonts from Figma, which helped us maintain a clean and consistent visual style.

After building the app, we tested it on our actual campus route and encountered issues like unresponsive features and misaligned visuals, but through multiple iterations and adjustments, we eventually reached the result we envisioned. This testing phase was especially eye-opening—it reminded me of the importance of real-world usage and refinement. 
