Reality Studio

Client Project at The Devhouse Agency, 2021

Reality Studio is a set of two applications and a Unity Editor tool developed solely by me. Its goal is to create an environment where interior design layouts can be quickly and easily created, changed, and viewed in a high-fidelity AR experience by someone with no experience in traditional 3D software.

The first app in the pipeline is called “Content Creator”. This is the editor application and took the vast majority of the development time. Some of the more impressive systems it contains are: Object and Surface Selection and Multi-Selection, Change History (Undo/Redo), Mesh and Submesh Manipulation, Custom Project File Saving/Loading, Material Management, and more.
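
As a rough illustration of how a change-history system like this can be structured, here is a minimal command-style undo/redo sketch. It is a generic pattern rather than the actual Content Creator code, and all names are placeholders:

```csharp
using System.Collections.Generic;

// Hypothetical sketch of a command-based undo/redo stack,
// not the actual Content Creator implementation.
public interface IEditAction
{
    void Apply();   // perform (or re-perform) the change
    void Revert();  // undo the change
}

public class ChangeHistory
{
    private readonly Stack<IEditAction> _undo = new Stack<IEditAction>();
    private readonly Stack<IEditAction> _redo = new Stack<IEditAction>();

    public void Do(IEditAction action)
    {
        action.Apply();
        _undo.Push(action);
        _redo.Clear(); // a new edit invalidates the redo chain
    }

    public void Undo()
    {
        if (_undo.Count == 0) return;
        IEditAction action = _undo.Pop();
        action.Revert();
        _redo.Push(action);
    }

    public void Redo()
    {
        if (_redo.Count == 0) return;
        IEditAction action = _redo.Pop();
        action.Apply();
        _undo.Push(action);
    }
}
```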

The second piece of the pipeline is a Unity Editor tool that checks AWS for project files that need to be lightbaked and bundled for mobile use. It loads the saved project, bakes the lighting, and bundles the Unity scene to be used with Unity Addressables.
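
A heavily simplified sketch of that bake-and-bundle step is below. `FetchPendingProject` is a hypothetical stand-in for the AWS check and project reconstruction, which are specific to the project file format:

```csharp
using UnityEditor;
using UnityEngine;

// Rough sketch of the bake-and-bundle step (not the production tool).
// FetchPendingProject is a hypothetical stand-in for checking AWS and
// rebuilding the saved project as a Unity scene.
public static class BakeAndBundle
{
    [MenuItem("Tools/Reality Studio/Bake Pending Project")]
    public static void Run()
    {
        string scenePath = FetchPendingProject(); // hypothetical AWS step
        UnityEditor.SceneManagement.EditorSceneManager.OpenScene(scenePath);

        // Bake lightmaps synchronously for the open scene.
        if (!Lightmapping.Bake())
        {
            Debug.LogError("Lightmap bake failed.");
            return;
        }

        // Build Addressables content so the mobile app can load the scene bundle.
        UnityEditor.AddressableAssets.Settings.AddressableAssetSettings.BuildPlayerContent();
    }

    private static string FetchPendingProject()
    {
        // Placeholder: download the project file from AWS, convert it into a
        // scene asset, and return the scene's asset path.
        return "Assets/Scenes/PendingProject.unity";
    }
}
```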

The final step in the pipeline is an AR phone/tablet app where the user can load the scene bundle by project name, place the interior space down in the real world, and view it as if they were actually standing in it.
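
Loading a bundled scene by project name can be done through Addressables roughly like this (a minimal sketch, assuming the project name doubles as the Addressables key):

```csharp
using UnityEngine;
using UnityEngine.AddressableAssets;
using UnityEngine.SceneManagement;

// Minimal sketch of loading a bundled project scene by name at runtime.
// Using the project name directly as the Addressables key is an assumption.
public class ProjectSceneLoader : MonoBehaviour
{
    public void LoadProject(string projectName)
    {
        Addressables.LoadSceneAsync(projectName, LoadSceneMode.Additive)
            .Completed += handle =>
            {
                if (handle.Status == UnityEngine.ResourceManagement.AsyncOperations.AsyncOperationStatus.Succeeded)
                    Debug.Log($"Loaded project scene: {projectName}");
                else
                    Debug.LogError($"Failed to load project scene: {projectName}");
            };
    }
}
```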

Additional information can be found on the project page. The linked Git repo only contains individual systems out of context rather than the complete project. I would really recommend checking out the project page; there is too much to this project to cover here!


Simulation Modeling

UTD ATEC Capstone Project, Fall 2020

A simulated volume using Position-Based Dynamics that can be interacted with in real time. It has a dynamically generated and updated mesh that can be exported to a .obj file. This was my senior capstone project at UTD. See the project page for an in-depth look at my development process.

My implementation of Position-Based Dynamics (PBD) consists of three internal constraints, which each node in the simulation must attempt to adhere to: compressive, stretch, and rotational constraints. Additionally, there is an external static constraint that uses information from colliders to prevent the simulation from passing through them.
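
As a generic example of the kind of projection step PBD performs (a standard distance/stretch constraint, not my exact constraint code), the correction for a pair of nodes looks like this:

```csharp
using UnityEngine;

// Generic PBD distance (stretch) constraint projection between two nodes.
// Illustrative sketch only; names are assumptions, not the project's API.
public static class PbdConstraints
{
    public static void ProjectStretch(
        ref Vector3 p1, float invMass1,
        ref Vector3 p2, float invMass2,
        float restLength, float stiffness)
    {
        Vector3 delta = p1 - p2;
        float currentLength = delta.magnitude;
        if (currentLength < 1e-6f) return;

        float wSum = invMass1 + invMass2;
        if (wSum <= 0f) return;

        // How far the pair is from satisfying the constraint.
        float error = currentLength - restLength;
        Vector3 direction = delta / currentLength;
        Vector3 correction = stiffness * (error / wSum) * direction;

        // Move each node proportionally to its inverse mass.
        p1 -= invMass1 * correction;
        p2 += invMass2 * correction;
    }
}
```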

The mesh is dynamically generated at the start, and then its vertices are updated in real time as the nodes move around.
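
Conceptually, the per-frame mesh update looks something like the following sketch (assuming one vertex per simulation node, in matching order):

```csharp
using UnityEngine;

// Sketch of pushing simulation node positions into the dynamically generated
// mesh each frame. Assumes one vertex per node, in the same order.
[RequireComponent(typeof(MeshFilter))]
public class SimulationMeshUpdater : MonoBehaviour
{
    public Vector3[] nodePositions; // filled by the simulation (assumption)

    private Mesh _mesh;
    private Vector3[] _vertices;

    void Start()
    {
        _mesh = GetComponent<MeshFilter>().mesh;
        _vertices = _mesh.vertices;
    }

    void LateUpdate()
    {
        for (int i = 0; i < _vertices.Length && i < nodePositions.Length; i++)
            _vertices[i] = transform.InverseTransformPoint(nodePositions[i]);

        _mesh.vertices = _vertices;
        _mesh.RecalculateNormals();
        _mesh.RecalculateBounds();
    }
}
```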


Portals

Personal Project, 2019

I have created multiple versions of portals/mirrors in Unity, improving with each iteration:
-(Old) A visual-only portal/mirror that renders to a texture on a plane.
-(Current) A visual and functional portal that uses a screen-space shader.
-(WIP) A visual and functional portal that combines the ideas of the previous two in order to make it seamless and extremely efficient, potentially allowing for many recursions.

Old: This version of the portal uses Unity’s physical cameras to position the POV camera on the other side of the portal so that its position matches the main camera’s and its view frustum always lines up exactly with the surface of the portal. It renders this POV camera to a Texture2D and uses it as the material’s texture on the portal plane. The positive of this method is that, depending on the resolution you choose, it has much less of an impact on FPS than the current method. The negative is that if you approach the portal plane closely enough, you will easily be able to see the pixels of the texture.
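
The setup for this version boils down to something like the sketch below (simplified, with illustrative field names):

```csharp
using UnityEngine;

// Simplified sketch of the render-to-texture approach: a POV camera renders
// the view from the linked portal into a RenderTexture that is shown on the
// portal plane's material.
public class RenderTexturePortal : MonoBehaviour
{
    public Camera povCamera;       // camera positioned behind the linked portal
    public Renderer portalPlane;   // the plane the player looks at
    public int textureSize = 1024; // lower = cheaper, but pixels show up close

    private RenderTexture _rt;

    void Start()
    {
        _rt = new RenderTexture(textureSize, textureSize, 24);
        povCamera.targetTexture = _rt;
        portalPlane.material.mainTexture = _rt;
    }
}
```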

Current: Unlike the old version, the POV camera of this one moves and rotates with the main camera, offset by the position and rotation of the linked portal relative to the portal you are looking at. A culling matrix is used to render only what is in front of the linked portal. It is rendered to a Texture2D and given to a shader that applies it in screen space. The positive is that no matter how close you get to the portal plane, you will never see individual pixels; it is almost seamless. The negative is that, for each portal you need to render, you must render the entire screen resolution again, which is extremely expensive.
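
A simplified sketch of the per-frame camera update for this version is shown below. An oblique near-clip plane is one common way to cull everything behind the linked portal; names and details are illustrative rather than the exact project code:

```csharp
using UnityEngine;

// Sketch of the per-frame update for the screen-space portal camera.
// The POV camera mirrors the main camera's pose through the portal pair,
// and an oblique near-clip plane culls everything behind the linked portal.
public class ScreenSpacePortalCamera : MonoBehaviour
{
    public Transform thisPortal;
    public Transform linkedPortal;
    public Camera mainCamera;
    public Camera povCamera;

    void LateUpdate()
    {
        // Express the main camera relative to this portal, then re-apply that
        // offset on the linked portal's side.
        Matrix4x4 portalToPortal = linkedPortal.localToWorldMatrix * thisPortal.worldToLocalMatrix;
        Matrix4x4 povPose = portalToPortal * mainCamera.transform.localToWorldMatrix;
        povCamera.transform.SetPositionAndRotation(povPose.GetColumn(3), povPose.rotation);

        // Clip against the linked portal's plane so nothing behind it is drawn.
        // The normal's sign may need flipping depending on which way the portal faces.
        Plane plane = new Plane(linkedPortal.forward, linkedPortal.position);
        Vector4 clipPlane = new Vector4(plane.normal.x, plane.normal.y, plane.normal.z, plane.distance);
        Vector4 camSpacePlane = Matrix4x4.Transpose(povCamera.worldToCameraMatrix.inverse) * clipPlane;

        // The POV camera shares the main camera's projection, clipped at the portal plane.
        povCamera.projectionMatrix = mainCamera.CalculateObliqueMatrix(camSpacePlane);
    }
}
```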

New WIP: This method uses a screen-space shader, but limits what needs to be rendered. Cameras in Unity have a setting called “rect”, which determines where on the texture the rendering happens and what percentage of the texture it takes up. This is usually used to render mini-maps onto the screen. By calculating the smallest rectangle in screen space that a portal needs to fit into, I can limit the render to that rectangle and place it in the correct spot on the texture so the shader can display it properly. When recursively rendering portals, the number of pixels that need to be rendered can be scaled down based on the size of the previous camera in the chain. Aside from a few edge cases, the number of pixels that need to be rendered (including infinite recursion) should be at most around 2.5x the size of the screen. My hope is that this allows portals to be used in VR applications.
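
One piece of this approach is finding the smallest viewport-space rectangle the portal occupies and restricting the camera to it. A rough sketch is below; in a full implementation the projection would also need adjusting so the sub-rect shows the correct slice of the view, and corners behind the camera would need special handling:

```csharp
using UnityEngine;

// Sketch of limiting the portal render to the smallest screen rectangle that
// contains the portal. Corner points and camera names are illustrative.
public static class PortalRectUtility
{
    // Returns a viewport-space rect (0..1) enclosing the given world-space corners.
    // Corners behind the camera are not handled in this simplified version.
    public static Rect ViewportRect(Camera viewer, Vector3[] portalCorners)
    {
        float minX = 1f, minY = 1f, maxX = 0f, maxY = 0f;
        foreach (Vector3 corner in portalCorners)
        {
            Vector3 vp = viewer.WorldToViewportPoint(corner);
            minX = Mathf.Min(minX, Mathf.Clamp01(vp.x));
            maxX = Mathf.Max(maxX, Mathf.Clamp01(vp.x));
            minY = Mathf.Min(minY, Mathf.Clamp01(vp.y));
            maxY = Mathf.Max(maxY, Mathf.Clamp01(vp.y));
        }
        return new Rect(minX, minY, maxX - minX, maxY - minY);
    }

    // Restrict the POV camera so it only renders (and only pays for) that region.
    public static void ApplyRect(Camera povCamera, Rect viewportRect)
    {
        povCamera.rect = viewportRect;
    }
}
```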


Wisps

UTD Virtual Environments Class, Spring 2019

A basic life simulation made in Unity that uses a Compute Shader to determine the behavior of each wisp.

Each tree contains one wisp and an internal storage of “light”. A wisp also has an internal storage of light; it takes 10% of the tree’s light when leaving the tree. The goal of each individual wisp is to increase the light of its tree. A wisp can only gain light by stealing it from a wisp that has more light than it does, and when it steals, it takes 20% of the targeted wisp’s light. Wisps slightly favor closer candidates over farther ones, even if the farther one has more light. Lastly, the more light a wisp carries, the slower it flies.

Because each wisp must take into account every other wisp (O(n^2)), a compute shader is used to speed up the calculations. The only thing that needs to be determined is the target of each wisp based on the criteria listed above, so the work maps well onto the GPU.
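
The C# side of that dispatch might look roughly like this. The kernel name, struct layout, and thread-group size are assumptions for illustration; the actual targeting criteria live in the compute shader:

```csharp
using UnityEngine;

// Sketch of the C# side of the O(n^2) target search on the GPU.
// One thread per wisp; each thread scans all other wisps and writes the
// index of its chosen target.
public class WispTargetDispatcher : MonoBehaviour
{
    struct WispData
    {
        public Vector3 position;
        public float light;
    }

    public ComputeShader wispShader;
    private ComputeBuffer _wispBuffer;
    private ComputeBuffer _targetBuffer;
    private int _kernel;

    void InitBuffers(WispData[] wisps)
    {
        _kernel = wispShader.FindKernel("FindTargets");       // assumed kernel name
        _wispBuffer = new ComputeBuffer(wisps.Length, sizeof(float) * 4);
        _targetBuffer = new ComputeBuffer(wisps.Length, sizeof(int));
        _wispBuffer.SetData(wisps);
        wispShader.SetBuffer(_kernel, "wisps", _wispBuffer);
        wispShader.SetBuffer(_kernel, "targets", _targetBuffer);
    }

    int[] FindTargets(int wispCount)
    {
        wispShader.Dispatch(_kernel, Mathf.CeilToInt(wispCount / 64f), 1, 1);
        int[] targets = new int[wispCount];
        _targetBuffer.GetData(targets);
        return targets;
    }

    void OnDestroy()
    {
        _wispBuffer?.Release();
        _targetBuffer?.Release();
    }
}
```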


Gravity

Personal Project, 2019

A simple gravity simulation with many particles using Unity’s Compute Shaders.
I am currently using particle-particle physics, but plan to try out a couple of approximation methods later.

The simulation contains 4096 particles.
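
For reference, the particle-particle approach is the direct-summation loop shown below, which the compute shader runs once per particle in parallel (the constants and softening term are illustrative, not the project's actual values):

```csharp
using UnityEngine;

// CPU reference of the particle-particle (direct summation) gravity step;
// the compute shader performs the same per-particle loop in parallel.
public static class NBodyReference
{
    public static void Step(Vector3[] positions, Vector3[] velocities, float[] masses,
                            float gravityConstant, float softening, float dt)
    {
        for (int i = 0; i < positions.Length; i++)
        {
            Vector3 acceleration = Vector3.zero;
            for (int j = 0; j < positions.Length; j++)
            {
                if (i == j) continue;
                Vector3 offset = positions[j] - positions[i];
                // Softening keeps the force finite when particles get very close.
                float distSqr = offset.sqrMagnitude + softening * softening;
                acceleration += gravityConstant * masses[j] * offset /
                                (distSqr * Mathf.Sqrt(distSqr));
            }
            velocities[i] += acceleration * dt;
        }

        for (int i = 0; i < positions.Length; i++)
            positions[i] += velocities[i] * dt;
    }
}
```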


Sprite Animation Controller

Personal Project, Fall 2018

A state-based animator for 2D Unity projects that uses sprite sheets.

Animations are stored in Scriptable Objects. Each of these contains the frames of the animation along with other information, such as the total animation time. The animator class contains the functions used to play the animations, which it does using Coroutines.
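
A minimal sketch of that idea, with illustrative names rather than the actual classes, might look like this (in Unity each class would live in its own file):

```csharp
using System.Collections;
using UnityEngine;

// Sketch: animations live in ScriptableObject assets and a coroutine steps
// through the frames on a SpriteRenderer.
[CreateAssetMenu(menuName = "Animations/Sprite Animation")]
public class SpriteAnimation : ScriptableObject
{
    public Sprite[] frames;
    public float animationTime = 0.5f; // total time for one loop
    public bool loop = true;
}

public class SpriteAnimator : MonoBehaviour
{
    public SpriteRenderer spriteRenderer;
    private Coroutine _current;

    public void Play(SpriteAnimation anim)
    {
        if (_current != null) StopCoroutine(_current);
        _current = StartCoroutine(PlayRoutine(anim));
    }

    private IEnumerator PlayRoutine(SpriteAnimation anim)
    {
        float frameTime = anim.animationTime / anim.frames.Length;
        do
        {
            foreach (Sprite frame in anim.frames)
            {
                spriteRenderer.sprite = frame;
                yield return new WaitForSeconds(frameTime);
            }
        } while (anim.loop);
    }
}
```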


Logic Gate Puzzle System

Personal Project, 2018

A basic logic system to use with puzzles in my WIP game “Geldamin’s Vault”.

The logic system consists of various MonoBehaviour classes that each have different settings. Each of these feeds into another, which can be used to create more advanced logic requirements. Whenever an input is activated, the signal chains upwards through the system using delegates, and if all criteria are satisfied at every point, the output is triggered.
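
A simplified sketch of the delegate-chaining idea (not the exact project classes), using an AND gate as the example:

```csharp
using System;
using UnityEngine;

// Sketch: each node raises an event when its state changes, and gates listen
// to their inputs and propagate the result upward through the chain.
public abstract class LogicNode : MonoBehaviour
{
    public event Action<bool> StateChanged;
    public bool State { get; private set; }

    protected void SetState(bool value)
    {
        if (State == value) return;
        State = value;
        StateChanged?.Invoke(State); // chain upward to whoever listens
    }
}

public class AndGate : LogicNode
{
    public LogicNode[] inputs;

    void OnEnable()
    {
        foreach (LogicNode input in inputs)
            input.StateChanged += _ => Evaluate();
    }

    void Evaluate()
    {
        bool allOn = true;
        foreach (LogicNode input in inputs)
            allOn &= input.State;
        SetState(allOn); // e.g. a Door listening to this gate would now open
    }
}
```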

Currently, there are scripts for: door, pressure plate, lever, poison gas trap, AND gate, OR gate, and NOR gate.

Using what I know now, I may remake the logic gates to be entirely self-contained in one class and easier to use in the editor.