My Portfolio

Friday, December 31, 2021

Quest 2 - Application Space Warp and Optimization Test

The new application space warp announced by Oculus last month is supposed to give applications up to 70 percent additional compute. I wanted to test its effectiveness with my own flocking project, where the goal is to have as many flocking objects on the Quest 2 as possible while maintaining 72 fps! As a quick summary, application space warp did allow me to break past my bottleneck.

The flock simulation is composed of a single Flock Manager that spawns and tracks all of the flock, with each individual flock member using a FlockController. Each member has two main behaviors: moving towards a target point and separating from the others.
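As a rough illustration, here's what those two behaviors look like stripped down to the math, sketched in Python rather than the project's actual Unity C# (the function names and parameters are my own, not from the project):

```python
import math

def seek(position, target, max_speed=1.0):
    """Return a velocity vector pointing from position toward target."""
    dx, dy = target[0] - position[0], target[1] - position[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)
    return (dx / dist * max_speed, dy / dist * max_speed)

def separate(position, neighbours, radius=1.0):
    """Push away from any neighbour closer than radius."""
    push_x = push_y = 0.0
    for nx, ny in neighbours:
        dx, dy = position[0] - nx, position[1] - ny
        dist = math.hypot(dx, dy)
        if 0 < dist < radius:
            # Closer neighbours push harder.
            push_x += dx / dist * (radius - dist)
            push_y += dy / dist * (radius - dist)
    return (push_x, push_y)
```

In the real project these forces would be blended and applied to each member's movement every update.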

For performance tracking I wanted to use the Oculus Developer Hub (ODH) for the first time and ran into issues with that as well. After installing ODH I found that Unity started running into issues when creating a build for the Quest. I found a helpful answer here, where their suggestion of copying the files from Unity's platform-tools folder into ODH's resolved the issue.

To get as many objects on screen as possible while maintaining smooth fps, I saved application space warp as the last optimization to see if it would let me break past the project's limits.

I iterated on several setups for the flock manager to try and push the limits. The first significant iteration was implementing a request manager, which handles the movement needs of the flock and which I showcased in my previous post. The second was implementing the C# Job System along with the Burst compiler. This significantly increased the number of movement updates for the flock and enabled a better simulation without dropping FPS, but did not by itself increase the flock count.

Finally, during the setup of application space warp I ran into issues following the Oculus documentation because it was already out of date. In my case it was because I needed to use the Unity 2022 beta and not Unity 2020.3 as specified in the documentation. Using application space warp requires a specific Oculus branch of Unity's Universal Render Pipeline, which was only compatible with the Unity 2022 beta.

Once I got past these hurdles it was a big success. Application space warp got me up to 1000 objects with no noticeable drop from my target 72 fps. All my other optimizations and setup only got me to 800 flocking objects, where performance starts to dip a bit when it gets really busy.

Thursday, December 2, 2021

December 2nd - Spatial Partitioning and Custom Playables with Timeline

My goal this week was to improve the performance of my flock simulation by improving the main update loop.

Spatial Partitioning

Currently, the main bottleneck is how I implemented the Separation method for the flock controller, which needs to do a distance check against all of the flock members in the scene. This quickly becomes really expensive because each additional member increases the number of calculations every other member needs per update, so the cost grows quadratically with the flock size.

To improve this I wanted to try implementing spatial partitioning. The idea is to split the map into a grid where the distance check for each flock member would only be against the other members in the same cell, ignoring the ones outside of it.
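To make the idea concrete, here's a minimal spatial-hash sketch in Python (the project itself is Unity C#; the names here are my own):

```python
from collections import defaultdict

def build_grid(positions, cell_size):
    """Hash each flock member's index into the grid cell containing it."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell_size), int(y // cell_size))].append(i)
    return grid

def neighbours_in_cell(grid, positions, i, cell_size):
    """Only the members sharing member i's cell are distance-check candidates."""
    x, y = positions[i]
    cell = (int(x // cell_size), int(y // cell_size))
    return [j for j in grid[cell] if j != i]
```

In practice the check would usually also include the eight neighbouring cells, so members sitting near a cell boundary aren't missed.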

Thursday, November 25, 2021

Nov 25th - Procedural Animation and Basic Shaders

I did some animation rigging during my last project with the Dune worm and decided I wanted to do a really deep dive into it. So far I've been having a lot of fun, and I feel like the animation rigging package could really help me create some great boss battles and more easily create cinematic/scripted events. Here are some clips from what I've been experimenting with.


My first mini project was to try and create a walking two-legged robot where the legs automatically move themselves after they exceed a set distance from the main body. It was amazing when I got it to work and could see the legs animating themselves while I manually moved the Walker-Bot around. Here is a GIF where you can see me manually moving the body with the legs procedurally animating themselves.

Saturday, November 20, 2021

November 13th - My Team's VR Mario Showcase Project

During the PXR2021 Conference there was a goal of creating some sort of collaborative virtual reality project throughout the week. I wanted to participate because I thought this could be a fun short-term creative VR project opportunity. 

My team had students across multiple disciplines and backgrounds. Not everyone could meet at the same day/time, and as a result it was only in the last 2-3 days of the conference that we really got going with a VR project.

We had a few students who really wanted us to help them create a Mario-inspired VR experience. Using that as our creative theme and direction, we made a Nintendo Mario themed Altspace world.

Poster art by Dylan Coakes

Friday, November 19, 2021

Nov 14 - Performance and XR 2021 Virtual Reality Conference

I got an opportunity to attend the Performance and XR 2021 Virtual Reality Conference through a director at the Centre for Entertainment and Arts. The whole conference was centered around using Altspace as the VR venue. 

When I first tried VRChat I remember getting motion sickness after using it for a while. Similarly, I'm finding a bit of motion sickness in Altspace when using continuous locomotion, but I think I'm starting to build more resistance to it now. Either way, I do prefer using teleport locomotion since I get far less motion sickness with it. At the moment I find VRChat and Altspace to be pretty similar to one another.

I also built my first Altspace world as well.

Experiencing and Building worlds in Altspace

For my first Altspace world I decided to try and create a scene inspired by the big sci-fi movie Dune, with a world that has a giant desert worm hopping around.

This is footage from the Unity Editor where I first created the VR experience. However, I found that you can't just directly import the Unity scene into Altspace. There is a huge limitation: you can't upload a scene that utilizes any custom C# MonoBehaviour scripts.

After experimenting for an afternoon, I decided to use Unity's Timeline sequencing to create this scene and trigger the sound and particle effects at the right time. Kinda sucks that you can't use scripting at all, especially since I had spent the time rigging the worm using Unity's Animation Rigging, which allowed for real-time animation. Regardless, I was still able to pretty much deliver the full VR experience I intended.

For the modeling I used Probuilder to create the Dune Worm.

If you want to check out this Altspace world, you can use the world code: PKI578.


Monday, November 1, 2021

November 1st 2021 - Autonomous Agents, Basic Flocking and Vector Math


Here is a quick example of a basic flocking system I created. The flock of circles is simulated to follow wherever the mouse is positioned in the game screen. The position of the mouse is tracked through a screen raycast that hits the background plane. The flock is also set up with a behavior to avoid colliding with each other, which gives it some more realistic movement.

One of the key takeaways from this little programming project was the system design aspect. The flock system manager contains a list of all the flock agents and provides the agents with system context. The individual behaviors are managed at the flock agent level.

As an added benefit I feel like I've gotten a lot better at moving objects with forces in 3D space and how to use simple Vector Math.

The most difficult vector math concept was the dot product. It wasn't that the math was difficult, but I had misinterpreted the supporting image diagram, which led me to repeatedly use the formula incorrectly and wonder why it wasn't working. My goal for the dot product in this project was to find the Vector3 location for where the red dot should be along the bottom horizontal line, based on the box above.

I used the dot product to make the yellow arrows follow a predetermined path. If the predicted position of the yellow arrow is too far from the target red dot it would trigger a steering correction for it to move again back towards the path.
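The underlying operation is scalar projection: the dot product tells you how far along the path segment the arrow's position lies. Here's a hedged Python sketch of that idea (my own naming, not the project's code):

```python
def project_on_segment(point, a, b):
    """Closest point on segment a->b to point (the 'red dot' on the line)."""
    ax, ay = a
    bx, by = b
    px, py = point
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    # Scalar projection of AP onto AB, clamped so we stay on the segment.
    t = (apx * abx + apy * aby) / (abx * abx + aby * aby)
    t = max(0.0, min(1.0, t))
    return (ax + abx * t, ay + aby * t)
```

The steering correction then just compares the predicted position against this projected point and pushes back toward the path when the distance is too large.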

The final recent project was a tutorial on how to create an editor script that can read a CSV file and convert it into a scriptable object. Pretty handy for the future in the scenario that I have a lot of data that needs to be managed or changed.

Wednesday, October 27, 2021

October 27th - Programming & Game Design

October has been a mixed month where I've been spending quite a lot of time learning game design. I haven't stopped working on improving my programming, though, and have been continuing to work my way through The Nature of Code, which is a free e-book if you are interested.

The biggest benefit so far in learning from The Nature of Code is that it's helping me get a much better foundational understanding of how to use vectors, linear algebra and applied forces to create better movement.

Here is a quick GIF of a flying-type creature where the wings are moved using a looping rotation script based on cosine and the body utilizes a vertical force to push itself up and is affected by gravity and whether the wings are flapping or not.
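The flapping logic boils down to two small pieces, sketched here in Python with made-up constants rather than the actual Unity script:

```python
import math

def wing_angle(time, amplitude=45.0, frequency=2.0):
    """Wing rotation in degrees, looping over time on a cosine."""
    return amplitude * math.cos(2 * math.pi * frequency * time)

def vertical_velocity(v, flapping, lift=12.0, gravity=9.8, dt=0.02):
    """One integration step: upward lift counters gravity only while flapping."""
    accel = (lift if flapping else 0.0) - gravity
    return v + accel * dt
```

The amplitude, frequency, lift and gravity values here are illustrative; the real creature's numbers were tuned by eye.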

Since I have access to the Advanced Game Design curriculum as a part of the teaching staff at the CEA, I figured I should use this opportunity to learn as much as I can and round out my understanding of creating games. The two primary areas of interest I've been focusing on are Level Design and Storytelling.

Friday, September 17, 2021

September 17th, 2021 - My One-button Game

This is my first 2D project. You can download and play my one-button game here:

One of the classes that I'm helping in as a teacher assistant is Rapid Prototyping. Over the last week, students in this class needed to create a working game revolving around the concept of one-button input only. I figured I could participate as well by creating my own one-button game, which I have now completed.


My concept was a reaction/timing focused game where you need to catch the correct fruit using spacebar as the only input button. The correct fruit is randomly selected by the customer.

Here were some of my key personal goals for the one week project:

  • Playable build
  • Utilize Unity's new input system
  • Scene Manager using Unity Scene Assets/Scriptable objects
  • Implement Factory method for fruit creation
Playable build / Scope Management

I wanted to take an iterative approach where I would make a build as early as possible with extremely basic gameplay and then slowly add features and functionality in new builds. The goal here was to ensure that there was always a playable build that I could submit at any point even if it doesn't fully realize the project vision yet.

Version 1 - Prototype

I'm pretty happy with this approach and it worked well for this project. One immediate benefit was that it allowed me to identify and fix, really early on, an error in my scripts that was preventing me from building a playable game. The issue was with an editor script: the game plays perfectly fine in the Editor but fails when you try to build it. Creating an early playable build let me troubleshoot this without the pressure of needing to submit on the last day.

Utilize Unity's new input system
So far I really like Unity's new input system. However, I feel like there is still a lot to learn when it comes to using this and will need to continue to experiment with the input callbacks. 

Scene Manager and using Unity Scene Assets
I feel that my previous projects did not handle scene loading and management very well at all. It has previously always turned into a mess with the different scenes that need to be loaded/unloaded. I tried creating my own solution, but there were aspects of it that were beyond my ability to program at the moment.

Luckily, I found some online references to Ryan Hipple's scene manager demonstration, which I've been using since.

Implement Factory method/pattern for fruit creation
It was much easier implementing the factory pattern than I thought. However, I don't really see how using the factory pattern for fruit creation is better than instantiating GameObjects using prefabs. Regardless, I'm happy I was able to implement a working version of the factory design pattern.

I'm happy with my project and personally think it's pretty amusing to watch someone play it, as it's pretty hard.

For art, I ended up purchasing Aseprite to make drawing pixel-style art faster. It's been a pretty easy and fun tool to use.

I can probably also use scriptable objects more instead of assigning/injecting dependencies.

From a programming difficulty perspective it hasn't been too hard. I've been able to implement the designs and use my previous experience to quickly build this prototype.

However, the project is heavily dependent on events to communicate between systems. Towards the end this did become a bit of a mess, and I'm going to try and better manage the complexity through better naming in the future.

Saturday, September 11, 2021

September 11th, 2021 - I've got a job now

I've been hired on as a teacher assistant for the Centre of Entertainment Arts in the Advanced Game Design program, and I am now no longer job-less.

Woo !

Just finished my first week as a teacher's assistant (TA) and it hasn't been too bad. I'm helping out mostly in beginner programming classes. So far I'm enjoying it, since I do find some enjoyment in teaching and I get to talk about programming, which most people don't really want to hear the details about or understand.

However, the part that most excites me about my job as a TA is that I gain access to the program's library of different game design and programming courses, and a chance to interact with experts and veterans who have created/shipped games. The game design courses should be pretty interesting to check out, since I've only focused on programming so far. Hopefully, over time this will make me a more versatile game developer and allow me to make better games.

I've seen some of the portfolio pieces for graduates from the advanced game design program and have been pretty impressed with some of them. 

I don't have any game projects to showcase today, but here's a short experimental clip of a VR experience inspired by Inception. Inception is one of my favorite movies, and I remember one of the most visually stunning scenes was when they manipulate the physics of the dream world so that the city bends in on itself. My goal in this VR project was to recreate that effect and see what it might feel like myself.

Monday, August 16, 2021

August 16th - Field of view with gizmos combined with Linear algebra

Here's a short video of an NPC character changing their head direction to face the VR player when they enter their field of view.

One of my main focuses for the last week and a half has been gaining a better understanding of linear algebra and learning how to use Unity's Gizmos/Handles for visualization. This was a short little exercise to put it all together.

Linear Algebra

The best resource I've found that has been the most helpful is a math series for game developers by Freya Holmer. The video series provides a great breakdown of math concepts such as vectors, vector normalization, dot product/cross product, local vs world space, trigonometry and more. I've also been supplementing what I learned in this series with the free linear algebra online courses at Khan Academy.

I've by no means mastered the math, but I now have a basic understanding of the concepts and, most importantly, know what to look up if I need it.

Design Patterns

I've also been learning/reviewing design patterns. The best resource for this has been Refactoring Guru. Its explanations are very clear, with great examples in a variety of different programming languages. I can see myself revisiting this website in the future when I need a refresher on a specific pattern or am looking for information.

Animation Rigging

I really like Unity's new animation rigging package. The ability to override an existing animation creates endless opportunities for unique, emergent custom animation, in my opinion. This is especially the case for VR. I've played around with it quite a bit, and it's not too hard to use either.

I also did an older but still good course on Unity's Animator and animation controller. I'm not planning to specialize in character animation but at the moment I feel like I could do the basic work of setting up animations in Unity.

Tuesday, August 3, 2021

August 3rd 2021 - VR Game Jam 2021


This was my first game jam and it was an intense 3 days of programming. It was announced by two prominent VR YouTube channels, VR with Andrew and Valem, so I scheduled time specifically to be able to participate. I joined their Discord channels to see if there were any people open to collaborating as part of a team, but ultimately ended up doing this jam solo.

This blog post is going to be about my strategy for the game jam, the challenges I faced and what I learned from the experience.

My Strategy - Scope Restriction and MVG

My strategy for tackling this game jam was that I would keep the scope restricted by trying to utilize what I've done in the past and interactive elements I'm familiar with. Basically, it was to make sure that my project scope didn't include too many new things I would have to learn for the first time. 

I didn't jump into coding right away and spent a good half hour to a full hour just brainstorming and trying to nail down a project concept before starting.

My final idea was a VR game where you create small worlds like a cook in a kitchen. You sprinkle the right elements in a specific order to create the completed worlds.

Here are some pics from the brainstorming. I never really meant for others to see it, so it's a bit of a mess.

Tuesday, July 20, 2021

Mobile Game July 20th - Input Manager touch screen and keyboard controls

I decided not to create a VR-focused game and instead prototyped an endless-runner mobile game for Android. I also took the opportunity to further test out Unity's new input system.

I was able to put everything together over three days which I'm pretty happy with. Most of the time was spent watching different video tutorials on the input system and how to configure it. It isn't too difficult to use when setting up the controls for one player. However, I haven't been able to get the local multiplayer to work using the same keyboard. For now I think I'll take a break from it and try a different project before re-visiting.

Thursday, July 15, 2021

VR Food Truck Prototype Game - July 15th

This is an overview of the VR Food Truck Prototype game, which was built for the Oculus Quest using Unity 2021. Recently we had to deal with a killer heat wave in Vancouver. During that time it was so hot I couldn't stand cooking in the kitchen and dealing with more heat, so food was all about take-out and delivery. Thus the concept of a VR food delivery game using a cannon to send you the food.

VR Food Truck Gameplay Trailer

Game Overview
This is a time-based game, so you are simply trying to get as high a score as you can before the timer runs out. You gain points by sending a sandwich with the ingredients the customer wants inside.

I like to think of this as a VR version of Overcooked, a cooking game that I really liked.

Loading Scenes Asynchronously and Object Pooling
One of my learning goals for this project was to better manage loading within the game. To create a more seamless experience for the player I have a screen fade that occurs during game transitions. The screen fades to black and then levels are loaded asynchronously. During the period where the screen is black I have all the needed object pools and other game objects loaded. The goal is to make it so that any screen tearing/skipping or loading goes unnoticed by the player as a result.
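The object-pool half of this can be sketched in a few lines. Here's a generic Python version (the real pools hold Unity GameObjects, and the factory would instantiate a prefab; this naming is my own):

```python
class ObjectPool:
    """Pre-allocate objects up front, then reuse instead of allocating mid-gameplay."""

    def __init__(self, factory, size):
        self._factory = factory
        self._free = [factory() for _ in range(size)]

    def get(self):
        # Fall back to creating a new object if the pool runs dry.
        return self._free.pop() if self._free else self._factory()

    def release(self, obj):
        # Return the object to the pool for reuse.
        self._free.append(obj)
```

Filling the pool while the screen is faded to black is what hides the allocation cost from the player.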

Unity XR Toolkit and Action-based Input
This is my first time implementing Unity's XR Interaction Toolkit with the new action-based input system. I found that it's pretty intuitive and not too difficult to use. I added additional functionality to the scripts to make this VR game work.

Sandwich and Topping System
The sandwich system relies on a GameObject with a sandwich handler and GameObjects with a topping handler. The topping GameObjects are created in an object pool at the start of the scene and retrieved from the pool when the player goes to grab a topping. A topping interacts with the sandwich simply by checking for a sandwich handler when it collides with other objects. If the topping collides with an object that has a sandwich handler, the topping is set to inactive and calls the sandwich's add-topping method, passing a string for which topping needs to be added.

The sandwich keeps track of what toppings have been added to it in a List&lt;string&gt;. A customer can then check this list in the sandwich handler to see if the toppings it currently has match the customer's own personal list of sandwich toppings.
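A tiny Python sketch of that check (the real code is C#, and whether the comparison ignores the order toppings were added in is my assumption):

```python
class Sandwich:
    def __init__(self):
        self.toppings = []          # mirrors the List<string> in the game

    def add_topping(self, name):
        self.toppings.append(name)

def order_satisfied(sandwich, requested):
    # Compare ignoring the order the toppings were added in.
    return sorted(sandwich.toppings) == sorted(requested)
```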

Customer System
Customers are initialized upon loading the game level, stored in a Queue&lt;Customer&gt;, and set to inactive. The current customer is set to active and then sends their list of topping requests to the player's topping-request UI canvas. One unique challenge here was that I didn't want customers to overlap at their spawn positions, so when they are assigned a spawn location they check it against a list of used spawn locations. If the random spawn location they were assigned is taken, the script generates a new spawn location number until it gets one that is unused.
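The re-roll loop for spawn locations might look like this in Python (names are mine, not the project's):

```python
import random

def assign_spawn(num_spawns, used, rng=random):
    """Pick a random unused spawn index; assumes at least one is still free."""
    index = rng.randrange(num_spawns)
    while index in used:
        # Re-roll until we land on a free spawn location.
        index = rng.randrange(num_spawns)
    used.add(index)
    return index
```

Re-rolling is fine for a handful of spawn points; picking randomly from the set of still-free locations instead would avoid unbounded retries as the board fills up.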

VR Cannon
The VR cannon is controlled with a simple lever that uses Unity's configurable joint and locks its position and rotation to mimic a lever. The cannon's aim is adjusted based on how you move the lever, and it is calculated by normalizing the lever's current position between its minimum and maximum range.
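That mapping is essentially an inverse lerp followed by a lerp; here's a Python sketch with made-up angle values (the project's actual ranges and names differ):

```python
def inverse_lerp(min_val, max_val, value):
    """Normalize value into 0..1 between min_val and max_val."""
    return (value - min_val) / (max_val - min_val)

def lever_to_angle(lever_pos, lever_min, lever_max,
                   angle_min=0.0, angle_max=60.0):
    t = inverse_lerp(lever_min, lever_max, lever_pos)
    t = max(0.0, min(1.0, t))               # clamp to the lever's travel
    # Remap the normalized lever position onto the cannon's angle range.
    return angle_min + (angle_max - angle_min) * t
```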

Overall I feel that this project went very smoothly with good separation between systems and that the scripts and classes are well organized. I was able to apply a lot of the lessons I learned from previous VR projects and I can see myself using the asynchronous scene loading with screen fading for future projects as well. 

VR Card Battle Game - July 15th

Game Play Trailer - Vertical Slice

This is an overview of the VR Card Battle Game System prototype, which was built for the Oculus Quest using Unity 2019. This project was inspired by a friend who is a fan of card games. I thought it'd be pretty cool to make a real-time VR card battle game with deck building.

YouTube Video - Game System Overview

The gameplay loop is very simple: a cycle of defeating your enemies to get rewards, which let you build a stronger card deck and beat more enemies. Victory is achieved when the enemy's health is reduced to zero. You accomplish this by using magical weapons, which spawn from a selection of magical cards. Defeated enemies provide several magical cards which you can incorporate into your deck for the next enemy.

The Gameplay Loop

In terms of similar games, I think of it as the projectile slashing from Beat Saber mixed with card games like Slay the Spire.

Card System
The cards form the core system of the game, and I tried to keep the scope limited, so there are only four types of cards.

The primary damage card is the basic sword card, which fires a projectile when charged up. The second card is a shield that lets you block projectiles and recharge mana. Third is a fireball spell card, which deals larger damage at the cost of mana. Finally, the draw card provides a small heal and allows you to draw a new card.

Decks are based on a list of Card Scriptable Objects which contain the basic information for each card. The deck gets loaded for the player at the beginning of a match. When you draw your hand, a generic Card is created and then passed the Card Scriptable Object, which determines the look and effects of the card.
Example Card Scriptable Object
VR Interactions
VR interactions and setup are done using Unity's XR Interaction Toolkit, which is easily extendable; that's how I add my own interactive elements to the game objects, such as making the sword fire a lightning ranged attack when you use the trigger button on the controller.

One of my focuses for this project was to implement SOLID design principles, as well as to try and avoid creating a massive singleton. This led me to learn more about how to use scriptable objects as a bridge, and the power of combining events with scriptable objects, which is incredibly useful.

If you're curious about using events and scriptable objects together, I recommend you check out Unity's Open Project, where they explain how they implemented it.

Enemy AI
The main challenge was creating an enemy AI that behaves differently based on what cards the player chooses. To challenge myself I decided to implement two enemies, one using a finite state machine and the other a simple behavior tree.

My finite state machine enemy was based on a Unity tutorial where they used scriptable objects. I just changed the scripts and behaviors in the scriptable objects to suit my specific purposes. If you're interested in the tutorial, you can learn more about it here.

The enemy basically checks a series of decisions to determine whether it needs to change state, and those decisions are based on which card the player currently has selected. Each card-selection state leads to an attack state where you can plug in what you want to occur. You just add the corresponding attack scriptable object, which specifies how many projectiles to fire and which spawn location you want them to come from.

The second enemy uses a behavior tree that relies on scriptable objects, which I learned from this tutorial. There are probably really good existing solutions, but I wanted to try and program a basic behavior tree for my own personal learning.

The end result isn't visually easy to use but is still fairly straightforward to customize. You have three types of nodes: Selector, Sequence and Task nodes. The tree has a default idle behavior, which is to look at the player and wait for card selection. When the player has selected a card, the behavior tree runs the corresponding attack sequence.
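Stripped of the scriptable-object plumbing, those three node types reduce to something like this (a generic Python sketch for illustration, not the project's C#):

```python
class Task:
    """Leaf node: runs a function returning True (success) or False (failure)."""
    def __init__(self, fn):
        self.fn = fn
    def run(self):
        return self.fn()

class Sequence:
    """Succeeds only if every child succeeds, evaluated in order."""
    def __init__(self, children):
        self.children = children
    def run(self):
        return all(child.run() for child in self.children)

class Selector:
    """Succeeds on the first child that succeeds, trying each in order."""
    def __init__(self, children):
        self.children = children
    def run(self):
        return any(child.run() for child in self.children)
```

An idle-then-attack tree is then just a Selector whose first child is the attack Sequence (gated on a card being selected) and whose last child is the idle Task.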

Both enemies share the same code for projectile management; the behavior tree or state machine just needs to provide the attack data.

All in all, I learned a lot from this project and hope you enjoyed this rough overview. The code is available on my GitHub if you want to take a closer look.

Monday, June 14, 2021

Furniture VR Design Prototype - June 14th

Here is a prototype I made using VR to design a space. 

It was inspired by the Global VR/AR Summit I attended last week, where it was mentioned that 3D models will be used more in online shopping because they give you a better sense of what you are buying than a 2D image.

If providing 3D models becomes the standard and widely available, it would make it easy to use augmented or virtual reality to help consumers decide whether they want an item or not.

VR Interactions and Save/Load System

I created the prototype using Microsoft's MRTK for the VR interactions in Unity. The scrolling menu in particular is a very cool tool that's available through MRTK.

The save/load system was something I implemented for the first time. It works by storing the furniture's rotation, position and scale information in a game save data class. Using Unity's JsonUtility you can then transform the save data class into a JSON file, as long as all the data is serializable.
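The same round-trip idea, sketched with Python's json module standing in for Unity's JsonUtility (the field names here are illustrative, not the project's actual save-data class):

```python
import json

def save_layout(furniture):
    """Serialize a list of furniture transform records to a JSON string."""
    return json.dumps({"furniture": furniture})

def load_layout(text):
    """Restore the furniture records from a saved JSON string."""
    return json.loads(text)["furniture"]
```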

Thursday, June 3, 2021

2021 VR/AR Global Summit Notes - Raising funds for your VR/AR Initiative

This is a collection of take-aways, thoughts, and notes from attending the 2021 VR/AR Global Summit panel on fundraising for your VR/AR initiative. 

XR Job Series - Raising money for your project or company tips

  • Don't forget to look into government and corporate financing programs. This isn't a comprehensive list but I did some research and here are programs that I think local VR/AR businesses might be interested in.
    • For BC, Canada this would be looking into Creative BC Interactive Fund
    • Research and Development Tax credits. For example SR&ED.
    • Innovate has programs that can be beneficial
    • For this fall you might want to pay attention to The Canada Foundation for Innovation. They will be hosting a competition in fall 2021 for the 2023 Innovation Fund.
    • If your AR/VR project has a community focus or social benefit you should also consider corporate grants
      • For example Rogers Wireless provides grants to organizations who have a focus on helping youth.
    • Reach out to organizations who are aligned with your vision and could be strategic partners.
      • They might not have funding they can provide at the moment, but there may be other benefits that you could leverage. Establishing these relationships before going to angel investors or VCs would also be advantageous.
  • Have a 6-12 month plan for what you would do with funding. A panelist mentioned that he expects the businesses he invests in to have a commercially viable product within 9 months. It's important to gauge and get an initial reaction to the VR/AR product or service.
  • What makes your VR/AR initiative unique?
    • Your chances of receiving funding will be low if a quick google search of your idea brings up competitors who are similar and you don't have a differentiation factor.
  • Founder dynamics. If you are not a solo founder, you should be aware that investors will try to evaluate your relationship and dynamics with your co-founders. This was stressed as an important aspect by several panelists.
  • A "No" might not always be a "No". One of the panelist who has raised a lot of money mentioned that for a lot of their deals the reason why they got it was because they didn't give up. Situations change and new developments for your VR/AR initiative may change investors mind and convince them to invest with you.  
  • How COVID has impacted the way they look at investing in VR/AR
    • There is now a lot less resistance to investing in teams who are distributed.
    • If you were operating prior to COVID, they will expect to hear how you utilized the increased interest in VR/AR.

Sunday, February 14, 2021

Desert Climbing VR Project and VR Archery

Recently completed two VR projects since the last update. One was the Desert Climbing VR project. Here is a video along with my key learnings:

Desert Climbing VR Project

Key Learnings

Faster Prototyping - Use free assets

I made the 3D models myself and realized that it took up a lot more time than I thought it would. Ultimately this slowed down the time it took to prototype some of the features. It made me realize it would be better to use free assets from the Unity Asset Store or other websites to add some basic visuals and move through the building process faster.

Dialogue system

I made the dialogue system myself, and it's triggered by listening to button inputs on the Quest 2 controller. The dialogue uses scriptable objects which store or reference the text and images needed, which are inserted into a basic UI canvas. It was only after making it that I realized the system I built makes it difficult to do things like display multiple images. If I were to try it a second time I would use Unity's Timeline instead, which might provide a lot more flexibility and a faster implementation. I enjoyed using scriptable objects and can see how they are a great tool for any game developer.

Biggest Challenge

It was my first time implementing a VR climbing mechanic, and there were challenges dealing with character body collisions against rock protrusions as well as with the hand poses themselves when grabbing the rock. Perhaps I'll find a solution in the future, but I realized that automated hand poses for grabbing objects would be pretty useful for increasing realism. Turning off the hands doesn't really help, since seeing your hands is a big part of climbing.

Adding Story

Adding story was a fun creative challenge. It made me think more deeply about the mechanics of the game and helped with setting up scenes. 

VR Archery - Chinese New Year Inspired

I was thinking about what I should do for my next project and my wife gave me some inspiration. I decided to do a VR archery game inspired by Chinese New Year. I'm pretty happy with it since I applied what I learned previously, which let me spend more time on the game design itself. There's a lot more I could do to improve it as well.

Key Learnings

Decoupling systems 
I used scriptable objects to connect systems and components, but looking back I realize I could have done a better job of it. This was also the first time I set up a spawn system with enemies, using coroutines to drive it. An unexpected challenge was overlapping spawns, which I could solve by giving each spawn point its own spawn manager instead of using just one.
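The per-spawn-point fix can be sketched like this (a hypothetical reconstruction, not the project's actual code):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: each spawn point runs its own coroutine loop,
// so two points never compete over one shared spawn schedule.
public class SpawnPointManager : MonoBehaviour
{
    public GameObject enemyPrefab;
    public float spawnInterval = 3f;

    void Start()
    {
        StartCoroutine(SpawnLoop());
    }

    IEnumerator SpawnLoop()
    {
        while (true)
        {
            Instantiate(enemyPrefab, transform.position, transform.rotation);
            yield return new WaitForSeconds(spawnInterval);
        }
    }
}
```

Attaching this to each spawn point means overlap is impossible by construction, instead of having one central manager juggle every point's timing.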

I was excited that I was able to load levels using scriptable objects. This made creating new levels with different parameters really easy.
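Level loading with scriptable objects can look something like this (hypothetical fields for illustration; the real asset would hold whatever parameters the level needs):

```csharp
using UnityEngine;

// Hypothetical sketch: level parameters stored as a ScriptableObject asset,
// so making a new level is just creating and filling in a new asset.
[CreateAssetMenu(menuName = "Levels/LevelData")]
public class LevelData : ScriptableObject
{
    public int enemyCount;        // how many enemies this level spawns
    public float spawnInterval;   // seconds between spawns
    public float timeLimit;       // seconds before the level ends
}
```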

XR Interaction Toolkit
I feel that I learned a lot about the XR Interaction Toolkit with this project, particularly when it comes to extending its features by overriding the base functions. Really useful, and I can see how I could have avoided some spaghetti code in previous projects by using this.

The stumbling blocks were challenges that I didn't anticipate during the initial planning process. One example was destroying game objects that have XR grab interactables attached. It turns out the current XR Toolkit throws errors when you do this; I found a forum post suggesting clearing the colliders first, which helped. This quirk is specific to the XR Toolkit.
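The overriding pattern mentioned above looks roughly like this (a minimal sketch; the exact method signature varies between XR Interaction Toolkit versions):

```csharp
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical sketch: extending toolkit behavior by subclassing
// XRGrabInteractable and overriding one of its protected hooks.
public class CustomGrabInteractable : XRGrabInteractable
{
    protected override void OnSelectEntered(SelectEnterEventArgs args)
    {
        base.OnSelectEntered(args);  // keep the default grab behavior
        // custom logic goes here, e.g. playing a sound or tracking game state
    }
}
```

Keeping the custom logic inside the interactable subclass, rather than in scattered listener scripts, is what helps avoid the spaghetti code I mentioned.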

I tried object pooling as well, but because I used coroutines, enemies that were turned inactive ended up being respawned. I solved this by removing each enemy from the list after it spawns, so that it doesn't sit in the list as an inactive object waiting to be reused.
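The pooling fix can be sketched like this (hypothetical names; a simplified stand-in for the project's actual pool):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: a pool where a spawned enemy is removed from the
// available list, so a defeated (inactive) enemy can't be handed out again
// until it is explicitly returned.
public class EnemyPool : MonoBehaviour
{
    public GameObject enemyPrefab;
    readonly List<GameObject> available = new List<GameObject>();

    public GameObject Spawn(Vector3 position)
    {
        GameObject enemy;
        if (available.Count > 0)
        {
            // The fix: take the enemy out of the list entirely on spawn.
            enemy = available[available.Count - 1];
            available.RemoveAt(available.Count - 1);
        }
        else
        {
            enemy = Instantiate(enemyPrefab);
        }
        enemy.transform.position = position;
        enemy.SetActive(true);
        return enemy;
    }

    public void Return(GameObject enemy)
    {
        enemy.SetActive(false);
        available.Add(enemy);
    }
}
```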

Tuesday, February 2, 2021

How I approach VR projects - Desert Grandma

This post showcases my current process for approaching VR projects. Right now I want to create a simple, short, story-driven VR experience while implementing a few new systems that I haven't built before.

My first step is to decide on a project concept and then create a VR project scope. 

Project Concept 

For this project my concept is focused on a small adventure experience about a child who wants to help their grandma who is sick. I've developed a small storyboard about how I want the player to experience the story.

I do want to establish limits for the project: I want to finish it in 1-1.5 weeks. I try to keep that in mind as I work on the storyboard.


The setting only has the child and the grandma. It'll be a lonely house in the desert, and the grandma is the child's only family.

I'm choosing a desert because deserts are naturally empty, which reduces the number of art assets I'll need to make. It also puts visual emphasis on loneliness and hopefully strengthens the story's focus on family.

A simple character design inspiration for the grandma character.

The grandma is sick, and that is the motivation for the story. The grandma owns a magic necklace that is supposed to help those who are lost. It reveals that there is a cure high up in the desert mountains.

The child climbs the mountain to reach a cave, which holds a plot twist for the child.

Gaming Systems Needed

As part of the project scope, I then try to think of all the systems I need to facilitate the experience. After making the list with rough time estimates, I divide it into two parts. *The estimates are usually wrong since I don't have a lot of experience lol*

The first half is the core systems needed to create a minimum viable VR experience in the shortest time possible, and the second is the additional nice-to-haves if there is time.

Core System 

  • Scene Loading System (1 day)
  • Dialogue / Story Sequence / Input System (2 days)
    • primary button A to quit game
    • primary button B to progress to next dialogue and story section
    • Story tracker
  • Climbing System (2 days)
  • Basic Art (1 day)
    • Sick Grandma in bed
    • Necklace
    • Mountain
    • Reward
Total Time: 6 days

Nice to Haves

  • Better Art Assets
    • Skybox (2 hours)
    • Hand models
    • House in the Desert Scene - Main Menu (2 hours)
  • Haptic feedback (2 hours)
  • Sandstorm / Wind Effects (3 hours)
  • Medicine Interaction (2 hours)
  • Background music & Voice over (3 hours)

  • Action Scene
    • I really want to try creating a VR action scene. I might attempt this if I can complete everything else quickly. It doesn't add a ton to the story other than making it cool.
Total Additional Time : 2 days

Total Project Time : 8 days

Wednesday, January 27, 2021

January 27th 2021 - Working on Quest 2 Projects

Last month I got an Oculus Quest 2, and I've been working on VR projects since. This is a quick summary of those projects.

VR Flower Explainer 

Updated and recreated the Flower Explainer Project to work with the Quest 2 Input system. I used Unity's XR Input system to make game objects interactable.

VR Gun Mechanics

I wanted to create a VR fire extinguisher training experience and thought that the mechanics for a VR gun could translate over pretty well. There are a ton of tutorials on creating VR guns, so they were easy to find.

VR Fire Extinguisher Training

I built this using skills from previous projects. The main challenge became state tracking for the fire extinguisher, as I wanted it to work only if you are holding it up in your left hand, have unlatched the key holding it, and are holding the nozzle. I also used spatial sound for this one, along with a video that's played in game, which I hadn't done before.
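The three-condition state check can be sketched like this (hypothetical field names; in practice each flag would be set by the corresponding grab or latch interaction):

```csharp
using UnityEngine;

// Hypothetical sketch: the extinguisher only sprays when all three
// conditions are true at the same time.
public class ExtinguisherState : MonoBehaviour
{
    public bool heldInLeftHand;   // set by the body's grab interactable
    public bool keyUnlatched;     // set when the key/latch is removed
    public bool nozzleHeld;       // set by the nozzle's grab interactable

    public bool CanSpray => heldInLeftHand && keyUnlatched && nozzleHeld;
}
```

Centralizing the check in one property keeps the spray logic from being duplicated across the individual interaction scripts.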

Unity Shader Graph - Toon Shading and other effects

Visuals are definitely an area of weakness in my skillset, so I took a week to explore Unity's Shader Graph in URP. I did some tutorials on toon shading, which might be a way to improve the visual appeal of my projects moving forward. I've always loved cel shading and the art style of games like Wind Waker, but I'm not sure I'll use toon shading for all my projects.

Regardless, after doing several tutorials I've found that the outline effect is something I'll keep, as it can highlight which objects are interactable in a VR experience.