
Video Display

Overview

In the field of virtual reality technology, the importance of haptic feedback cannot be overstated. Haptic perception is a critical component of human sensory perception, capable of providing an immersive experience, enhancing user interaction, and enriching the realism of virtual environments. With this project, we aim to explore the integration of physical devices with virtual reality, allowing users to touch different materials in the physical world and, through visual and auditory effects in the virtual world, evoke a sense of inner calm and a connection to nature.

I would like to express my heartfelt gratitude to my two team members, Lin Wei and Tianyuan. Together, we have successfully collaborated on this interdisciplinary project, and I consider it a great honor to have had the opportunity to be part of it. Before starting my internship, I had a research concept in mind regarding Virtual Reality (VR), aiming to compare multi-modal and single-modal VR experiences with the intention of demonstrating that users can achieve a more immersive experience in a multi-modal environment. Consequently, I planned to create two distinct VR demonstrations—one primarily focused on a single sensory modality (mainly visual with minimal auditory elements), and the other integrating visual, auditory, and haptic feedback.

My Responsibilities

Within the project, my primary responsibility was to provide technical support using Unity. In the early stages, we collaboratively discussed the project's narrative and the specific implementation of each scene. During the execution phase, we allocated different scene functionalities to different team members, and my role involved implementing Unity-specific functionalities and developing certain visual effects.

Technical overview

Here is a summary of the development and implementation work I undertook in this project. Some features were cut due to design changes, but their development still reflects our overall project journey. We will continue to refine and update this project in the future.

One-to-one simulation of the real environment

I began by measuring a portion of my room and then created a virtual room of the same dimensions in Unity. Wearing an HMD, I walked forward until I reached a blue circular marker on the door, which I could touch with my hand. By continuously testing and adjusting the positions of objects in Unity, I achieved synchronization of object positions between the virtual world and the physical world. However, due to changes in the showcase scene and design, this functionality is temporarily absent from the current version.


Portal Effect

After users pass through the portal, they are instantly transported to another area where the environmental effects have been replaced with a high-definition skybox. In this scene, we intend to showcase a vast and awe-inspiring landscape associated with natural elements.
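The transition described above can be sketched as a trigger script, assuming hypothetical names: `destination` is an empty Transform in the target area and `skyboxMaterial` is the high-definition skybox material.

```csharp
using UnityEngine;

// Sketch of the portal transition (illustrative, not the project's exact code).
public class PortalTrigger : MonoBehaviour
{
    public Transform playerRig;       // XR rig root
    public Transform destination;     // where the player reappears
    public Material skyboxMaterial;   // HD skybox for the new area

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        // Move the whole rig so headset tracking offsets are preserved.
        playerRig.SetPositionAndRotation(destination.position, destination.rotation);

        // Swap the environment lighting to the new skybox.
        RenderSettings.skybox = skyboxMaterial;
        DynamicGI.UpdateEnvironment();
    }
}
```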

Hand Tracking

In the project, users primarily interact with the environment using their hands. Users touch various materials, some of which are connected to sensors that provide physical feedback. Quest 2 supports hand gesture tracking, with its front-facing cameras capturing the user's hand movements to perform various actions within the virtual scene. Users can interact with physical devices without the need for controllers, as they can freely use their hands to touch objects.


The process of generating specific effects by touching different objects involves the use of Unity colliders. I set up colliders on objects, and I also added additional colliders to the hand prefab. This way, when objects on the virtual tree are touched, various effects can be triggered.
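The collider setup above can be sketched as follows; the `Touchable` tag and the particle effect field are illustrative assumptions, not the project's exact names.

```csharp
using UnityEngine;

// Sketch of the touch-trigger setup: a small trigger collider sits on the
// hand prefab, and touchable objects on the virtual tree carry a tag.
public class HandTouch : MonoBehaviour
{
    public ParticleSystem touchEffect; // effect spawned at the contact point

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Touchable")) return;

        // Spawn the effect at the nearest point on the touched object.
        Instantiate(touchEffect,
                    other.ClosestPoint(transform.position),
                    Quaternion.identity);
    }
}
```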


VFX Effects and URP Shaders

In our project, we aimed to achieve artistic enhancements within the virtual environment, so we relied extensively on VFX (visual effects) rather than static object models in the virtual scene. I was responsible for implementing some of the special effects, while other team members made modifications based on the project's specific needs.


VFX Flower Animals

We wanted to introduce animals such as rabbits, deer, and squirrels into our natural scenes, but composed of petals. I therefore created VFX effects for these animals, designed to move within the scene. The implementation primarily relied on the Skinned Mesh output in VFX Graph: particles are emitted across the surface of an animated animal model, with each particle's material rendered as a petal. By adjusting parameters such as capacity, lifetime, and color for the particles, we achieved the intended visual effect.
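Exposed properties of a VFX Graph can also be driven from script; this is a minimal sketch, assuming the graph exposes properties named `Lifetime` and `PetalColor` (hypothetical names; capacity itself is fixed inside the graph asset).

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Sketch of tuning the petal-animal VFX Graph at runtime.
public class PetalAnimal : MonoBehaviour
{
    public VisualEffect vfx;

    private void Start()
    {
        vfx.SetFloat("Lifetime", 2.5f);  // particle lifespan in seconds
        // Color converts implicitly to Vector4 (RGBA).
        vfx.SetVector4("PetalColor", new Color(1f, 0.6f, 0.8f, 1f));
    }
}
```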


Energy Ball Effect

The purpose of this effect is to guide the user to look upwards. We placed many illuminated branches at the top of the scene to create the sensation of standing in a forest. When the user touches a physical tree in the real world, an energy ball is generated in the virtual environment and moves upward, directing the user's gaze towards the sky.
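The behaviour of the energy ball can be sketched as a simple component; the rise speed and lifetime below are illustrative values, not the project's tuned settings.

```csharp
using UnityEngine;

// Sketch of the energy ball: spawned when the physical tree is touched,
// then drifts upward to draw the user's gaze to the branches.
public class EnergyBall : MonoBehaviour
{
    public float riseSpeed = 0.5f;  // metres per second
    public float lifetime = 6f;     // seconds until the ball is removed

    private void Start()
    {
        Destroy(gameObject, lifetime);
    }

    private void Update()
    {
        transform.position += Vector3.up * riseSpeed * Time.deltaTime;
    }
}
```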

Tree Branch Shader

The tree branches above the user's head are a crucial component of our project. For this part, I retained the tree's branch model and then added a URP shader that lets the model change color while remaining partially transparent. The model can also exhibit variations in its surface material effects.
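A color-and-transparency shift like this can be driven from script via a MaterialPropertyBlock; this sketch assumes a URP material with a transparent surface type and the standard `_BaseColor` property, with an illustrative gradient and cycle time.

```csharp
using UnityEngine;

// Sketch of animating the branch material's color and alpha over time.
public class BranchColorShift : MonoBehaviour
{
    public Renderer branchRenderer;
    public Gradient colorOverTime;   // authored in the Inspector
    public float cycleSeconds = 10f;

    private MaterialPropertyBlock block;

    private void Awake()
    {
        block = new MaterialPropertyBlock();
    }

    private void Update()
    {
        // Ping-pong between the two ends of the gradient.
        float t = Mathf.PingPong(Time.time / cycleSeconds, 1f);
        Color c = colorOverTime.Evaluate(t);
        c.a = Mathf.Lerp(0.3f, 0.8f, t);   // partial transparency

        branchRenderer.GetPropertyBlock(block);
        block.SetColor("_BaseColor", c);   // URP Lit/Unlit base color
        branchRenderer.SetPropertyBlock(block);
    }
}
```

Using a MaterialPropertyBlock avoids instantiating a new material per branch, which matters when many branches share the same shader.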


Objects Appearing in Front of the Camera


The project includes a segment in which objects appear directly in the user's field of view when they touch a real-world tree. Initially, the design involved cloud-shaped animals appearing, but it now uses flowers. These flowers are placed around the user's area and hidden in advance; they gradually appear once they fall within a configurable angle of the user's forward view (60 degrees by default).
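The view-angle check can be sketched with `Vector3.Angle`; this is a minimal illustration, assuming a 60-degree cone centred on the HMD camera's forward direction (so a 30-degree half-angle) and a hard show/hide toggle in place of the gradual fade.

```csharp
using UnityEngine;

// Sketch of revealing a hidden flower once it enters the user's view cone.
public class RevealInView : MonoBehaviour
{
    public Transform cameraTransform;   // the HMD camera
    public float coneAngle = 60f;       // full cone angle, degrees

    private Renderer rend;

    private void Awake()
    {
        rend = GetComponent<Renderer>();
        rend.enabled = false;           // hidden in advance
    }

    private void Update()
    {
        Vector3 toObject = transform.position - cameraTransform.position;
        float angle = Vector3.Angle(cameraTransform.forward, toObject);

        if (angle <= coneAngle * 0.5f)
            rend.enabled = true;        // a material-alpha fade could soften this
    }
}
```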


Reflection

Once again, we want to express our gratitude to our team members Weilin and Tianyuan, as well as everyone who supported our project. Over the course of three months, our project has received a wealth of valuable feedback and suggestions. We plan to continue refining this project as we aim to explore further possibilities, such as potential business collaborations related to fabrics and ventures in the field of art and healthcare.
