Experiential Design - Task 1

 Name : Ivy Chung Ai Shin

Student ID : 0358429 

Program : Bachelor Of Computer Science

Week 1 - Week 4 
(24/4/24 - 15/5/24 )


Week 1 : Introduction To Experiential Design

In Week 1, we explored different experiential ideas using Augmented Reality (AR). We were shown some past student examples to gain an understanding of AR as well as some inspiration. Mr Razif also shared his advice and tips on building AR experiences, in particular about visual design and the usefulness of the idea.

We then hopped into a short lecture session that introduced the fundamentals of AR. In class, we experimented with the AR feature provided by Google. It was really interesting to see how a cat-like model could be placed into reality.

Image 1.0 - Google A.R

Imagine a scenario in either of the two places. What would the AR experience be, and what visualisation could be useful? What do you want the user to feel?

I imagine myself in a shopping mall where I would use mobile AR to visualise the directory of shops. It would show a 3D model of each shop as well as directions to the shop itself, helping visitors navigate the mall and locate specific stores more easily. Using mobile AR in this way can provide a dynamic and interactive experience, letting you see virtual representations of the shops overlaid on your physical surroundings through a smartphone or AR glasses. As you move through the mall, the AR display can update in real time, showing where you are in relation to each store, offering directions, and even displaying promotions or information about each shop as you pass by.

Week 2 : - 

Week 2 fell on a public holiday, so there was no class. I took this time to think of some AR project ideas that I could work on, and went on to research ideas around problems that many people are currently facing.

Week 3 : Concept Of Experiential Design

In Week 3, we delved into some concepts of experiential design. Experiential design draws on various disciplines that we should take note of:
  • User experience
  • Brand experience
  • Customer experience
  • Information architecture
  • etc.
Experience design is not driven by a single design discipline but by a cross-discipline perspective. We were then introduced to several mapping techniques: the empathy map, customer journey map, experience map and service blueprint. An empathy map is a tool used to articulate what we know about a particular type of user, while a journey map lets us trace a day-to-day journey and come up with solutions for the problems along it. From there, we had our first group activity: coming up with a journey map together.

Image 2.0 - Journey map

My group decided to make a journey map of a visit to Universal Studios Japan! In the theme park, we visualised the scenario in the order below:
  1. Physical Ticket Purchase
  2. Park Entrance
  3. Map/Navigation
  4. Rides
  5. Live Shows
  6. Eatery
  7. Merchandise Store
  8. Park Exit 
We identified the gain points and pain points for each situation and came up with ideas on how to solve them. Some examples were using special AR effects to improve the audience's experience during a live show, or using interactive AR that lets riders interact with the theme park environment.

Image 2.1 - Group Presentation 

Through the group presentation, we successfully came up with a few solution ideas for improving the theme park experience.

In the second half of the class, we had our first introduction to Unity.

1. We were introduced to the basics of components and parameters in Unity, such as adding a basic cube and the parameters that we can modify.
Image 2.2.0 - 3D Model

2. After downloading Vuforia, we are able to obtain the licence/API key that gives the project access to Vuforia libraries such as the image database.

Image 2.2.1 - Vuforia Licenses


Image 2.2.2 - Vuforia Database

After setting up the database in Vuforia by uploading the image that we want to use as the ImageTarget, we can then access the database in Unity and retrieve the image that we inserted.

Image 2.2.3 - API Credentials

3. The Device Tracker allows Unity to keep tracking the device's pose, so the detected model stays registered as the camera moves.

Image 2.2.4 - Device Tracker

4. After setting up the ImageTarget, we can set any object as a child of the ImageTarget. With that, our first hands-on AR experience was overlaying a cube on the identified image.

Image 2.2.5 - First AR Test

Video 1.0 - Image Overlay

5. Then, we can also play a video instead of placing an object under the ImageTarget. We simply import the video we want to play and set it as a child of the ImageTarget, and the video loops over the scanned image.
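As a rough sketch of what this setup amounts to in code (assuming the video plays on a quad that is a child of the ImageTarget and the clip is assigned in the inspector):

using UnityEngine;
using UnityEngine.Video;

// Minimal sketch: configure a looping video on the quad under the
// ImageTarget so the clip plays over the scanned image.
public class LoopingVideoSetup : MonoBehaviour
{
    [SerializeField] VideoClip clip;   // the imported video clip

    void Awake()
    {
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.isLooping = true;                               // keep the video looping
        player.renderMode = VideoRenderMode.MaterialOverride;  // draw onto this quad's material
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.Play();
    }
}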

Video 1.1 - Video Overlay

Week 4 : Setting Up Device In Unity 

Image 2.0

At the beginning of the class, Mr Razif continued from the previous session and taught us the basics of how to stop the video when the camera is not facing the object. To do so, we just have to (a code sketch of the same behaviour follows the list):
  1. Uncheck Play On Awake on the Video Player
  2. Call .Play() when the target-found event fires
  3. Call .Stop() or .Pause() when the target-lost event fires
  4. Uncheck "Track Device Pose" in the inspector setting under Advanced
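Below is a minimal sketch of the same behaviour as a script, assuming Vuforia Engine 10.x and that the script sits on the ImageTarget with the Video Player assigned in the inspector; event and enum names may differ slightly between Vuforia versions.

using UnityEngine;
using UnityEngine.Video;
using Vuforia;

// Minimal sketch: pause the video when the image target is lost
// and resume it when the target is found again.
public class VideoTargetHandler : MonoBehaviour
{
    [SerializeField] VideoPlayer videoPlayer;   // Play On Awake should be unchecked
    ObserverBehaviour observer;                 // the ImageTarget this script sits on

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnDestroy()
    {
        observer.OnTargetStatusChanged -= OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus targetStatus)
    {
        bool found = targetStatus.Status == Status.TRACKED ||
                     targetStatus.Status == Status.EXTENDED_TRACKED;
        if (found) videoPlayer.Play();    // camera is facing the image again
        else       videoPlayer.Pause();   // camera has moved away from the image
    }
}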
Next, we were taught how to use our mobile phones and add in functionalities. The setup depends on the phone we use; for me it is iOS, so I selected the iOS module and also changed the screen size of the canvas. Moving on to the Canvas Scaler, the resolution is changed to 1080x1920 pixels and set to Scale With Screen Size (a small code sketch of this follows).
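As a small sketch of the Canvas Scaler step in code (assuming the script is attached to the Canvas object):

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: match the in-class Canvas Scaler settings
// (1080x1920 reference resolution, Scale With Screen Size).
public class CanvasSetup : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1080, 1920);
    }
}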

Image 2.1 - Device Setting

In the end , the result looks like this : 

Video 2.2 - End Result

We continued with additional functionalities such as showing and hiding an object. Using the same concept, we now apply the SetActive() function to an OnClick event. When SetActive() is unchecked, it is set to false, and vice versa. We first added two new buttons called Show and Hide, then assigned the function to each (the same wiring is sketched in code below).
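As a rough sketch, the same wiring could also be done from a script instead of the inspector (the object and button names here are just examples):

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: hook the Show and Hide buttons to SetActive(true/false)
// on the target object, mirroring the OnClick events set in the inspector.
public class ShowHideController : MonoBehaviour
{
    [SerializeField] GameObject target;     // e.g. the sphere under the ImageTarget
    [SerializeField] Button showButton;
    [SerializeField] Button hideButton;

    void Awake()
    {
        showButton.onClick.AddListener(() => target.SetActive(true));
        hideButton.onClick.AddListener(() => target.SetActive(false));
    }
}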

Image 2.3 - setActive()

In short, the SetActive boolean lets us set whether or not the object is shown. In the end, the result looks like this:

Video 2.4 - Hide/Show Sphere

Next, Mr Razif demonstrated how we can connect our mobile phones to Unity for both iOS and Android. As for me, I followed the iOS setup.

Image 2.4 - iOS Settings

For iOS users, we have to download Xcode, which allows us to build and run the project on an iOS device. After the download, I also went into Build Settings and ensured that my target iOS version was set to 12.0. I then proceeded to build and run the project through Xcode, which brought me to this setting.

Image 2.5 - Xcode Settings
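The minimum iOS version step above could also be set from a small Editor script instead of the Player Settings window; a minimal sketch (the menu name is just an example, and the script would live in an Editor folder):

#if UNITY_EDITOR
using UnityEditor;

// Minimal editor sketch: set the minimum iOS version to 12.0,
// matching the Build Settings step described above.
public static class IOSBuildSetup
{
    [MenuItem("Build/Set iOS Minimum Version")]
    static void SetMinimumIOSVersion()
    {
        PlayerSettings.iOS.targetOSVersionString = "12.0";
    }
}
#endif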

From there, I plugged my device into my laptop and ran the project. On my device, I had to turn on Developer Mode to allow the project to run. With that, I was able to set up the project on my device.

Video 2.5 - Phone AR 


Project 1 : 
This week, we were also asked to explore and research more ideas that we could work on. I explored a few ideas such as AR translation, AR automotive, and more.

I consulted Mr Razif, and the idea he really liked was the AR automotive one. However, he suggested changing the focus to the engine parts of the car, displaying the name and function of each part. Another suggestion was AR audio playback: he suggested making it an AR album product whereby users can view the contents of the album in detail.


Reflection : 

In these four weeks, I have learnt the fundamentals of the experiences we can provide using AR. We were taught how to apply a user journey map to understand the problems users face and the solutions we can provide with AR, and I can apply the same concept to my project to understand the problems faced by users. We were also introduced to the basics of using Unity to produce our first AR experience. In Unity, there are so many functions we can apply through the ImageTarget, which allows us to show any object, video or image. Mr Razif meticulously demonstrated every detail of the Unity settings we could explore, which I really appreciated. I was able to follow along in class well and showcase my AR results.
