
Animal Vision VR


Video Display

Overview

When users experience our animal vision VR project, they put on a VR headset, step into an animal's perspective, and learn how animals see and feel the world. The latest version runs on Meta Quest 2 devices, offers the visual experiences of four different animals, and lets users move through the virtual world with the HMD controllers and specific gestures.

The project builds on existing scientific research, such as studies of animal visual systems, eye structures, and visual neural circuits, to deepen users' understanding of animal vision. In the future, we also plan to introduce more unusual animal vision types and food-chain systems, such as a simulated snake-and-mouse hunting scene, to make the experience even more immersive.

We also hope this project can be useful to computer vision research. In short, an animal vision VR project can help people better understand and appreciate the animal world, and it can support both animal protection and scientific research.


My Responsibilities

In this project, my primary responsibilities were writing the scripts that let users move in animal-specific ways within the virtual environment using the Quest 2 controllers, and building the UI menu for switching between animal characters. I also implemented interactions for the animals in the virtual world, such as the ability to eat fruit. Finally, I was responsible for integrating the animal vision cameras and scene designs completed by other team members into the Unity project.

User Flow Chart

[Image: user flow chart]

Technical overview

On the technical side, my main tasks were implementing HMD-controller-driven movement for the animals, switching between the animal menu interfaces, and the trigger mechanism for eating fruit.

Movement of animals
[GIF: animal movement demo]

The movement logic for the dog and the mouse is the same. Using Oculus' built-in locomotion plugin together with my own script, the player swings the left and right controllers vertically in opposite directions; the script measures the vertical distance between them and moves the animal in a straight line toward the direction of the user's gaze.

[Images: dog and mouse movement, and the movement code]
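
Conceptually, the script works something like the sketch below. This is illustrative only, not the project's actual code; the class name, fields, and constants are assumptions, and it relies on the Oculus Integration package's OVRInput API.

using UnityEngine;

// Minimal sketch of swing-to-move locomotion: swinging the two Touch
// controllers vertically in opposite directions drives the animal
// forward along the user's gaze. Attach to the player rig.
public class SwingLocomotion : MonoBehaviour
{
    public Transform head;          // the rig's centre-eye camera
    public float gain = 2.0f;       // tuning: how strongly swings translate into speed
    public float deadZone = 0.005f; // ignore small hand jitter (metres per frame)

    private float lastLeftY, lastRightY;

    void Update()
    {
        // Vertical positions of the Touch controllers, local to the rig.
        float leftY  = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch).y;
        float rightY = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch).y;

        float leftDelta  = leftY - lastLeftY;
        float rightDelta = rightY - lastRightY;
        lastLeftY  = leftY;
        lastRightY = rightY;

        // The hands must be moving vertically in opposite directions.
        bool opposite = Mathf.Sign(leftDelta) != Mathf.Sign(rightDelta);
        float swing   = Mathf.Abs(leftDelta) + Mathf.Abs(rightDelta);

        if (opposite && swing > deadZone)
        {
            // Walk along the gaze direction, flattened onto the ground plane,
            // at a speed proportional to how hard the player swings.
            Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            transform.position += forward * swing * gain;
        }
    }
}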

The snake's movement instead reads the horizontal displacement of the left and right controllers and moves the snake in a straight line toward the gaze direction. The code follows the same pattern as the sketch above (using the x component rather than the y component), so I won't show it here.

Birds move differently. Flying still requires swinging the controllers up and down, but while the player is swinging, the bird moves forward and upward along the direction of the user's gaze. If the controllers are not moving, the bird automatically glides diagonally downward. The bird stops when it reaches the ground, which is detected by a ray ("laser") cast down toward the ground.

[Image: bird movement code]
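
In the same illustrative spirit, the bird's logic might be sketched as follows; again, the names and constants are assumptions rather than the project's actual code.

using UnityEngine;

// Sketch of the bird's flight: flapping (vertical controller swings)
// pushes the bird forward and up along the gaze; with no input it
// glides diagonally downward until a downward ray detects the ground.
public class BirdFlight : MonoBehaviour
{
    public Transform head;
    public float flapSpeed = 3.0f;
    public float glideSpeed = 1.5f;
    public float groundCheckDistance = 0.3f;

    private float lastLeftY, lastRightY;

    void Update()
    {
        float leftY  = OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch).y;
        float rightY = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch).y;
        float swing  = Mathf.Abs(leftY - lastLeftY) + Mathf.Abs(rightY - lastRightY);
        lastLeftY  = leftY;
        lastRightY = rightY;

        // The downward ray plays the role of the "laser" ground check.
        bool grounded = Physics.Raycast(transform.position, Vector3.down, groundCheckDistance);

        if (swing > 0.01f)
        {
            // Flapping: fly toward the front-and-above of the gaze.
            Vector3 dir = (head.forward + Vector3.up * 0.5f).normalized;
            transform.position += dir * flapSpeed * Time.deltaTime;
        }
        else if (!grounded)
        {
            // No input: glide diagonally down along the gaze direction.
            Vector3 level = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            transform.position += (level + Vector3.down).normalized * glideSpeed * Time.deltaTime;
        }
    }
}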
Switching menu interfaces for different animals
[GIF and image: switching between animal menus]

This interface serves as a tutorial for the movement controls. Users press the B button on the controller to cycle through the menus, then press the A button to enter the selected animal's scene. While in an animal scene, pressing the B button returns to the menu interface.
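
The button handling looks roughly like the sketch below. The scene names and field names are assumptions; the OVRInput button mappings (Button.One is A, Button.Two is B on the right Touch controller) are real.

using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of the menu logic: B cycles through the animal tutorial
// pages, A loads the currently selected animal's scene.
public class AnimalMenu : MonoBehaviour
{
    public GameObject[] menuPages;    // one tutorial page per animal
    public string[] sceneNames = { "Dog", "Mouse", "Snake", "Bird" };

    private int index;

    void Update()
    {
        // B button: show the next animal's menu page.
        if (OVRInput.GetDown(OVRInput.Button.Two))
        {
            menuPages[index].SetActive(false);
            index = (index + 1) % menuPages.Length;
            menuPages[index].SetActive(true);
        }

        // A button: enter the selected animal's scene.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            SceneManager.LoadScene(sceneNames[index]);
        }
    }
}

A matching check on Button.Two inside each animal scene would load the menu scene again.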

I will not show the scripts for object collision and disappearance, or for making objects glow, here; for details, please refer to the GitHub link.
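
To give a flavour of the collision-and-disappearance behaviour without reproducing the actual scripts, a fruit-eating trigger could be as simple as the following; the "Player" tag is an assumption for this sketch.

using UnityEngine;

// Illustrative only: a fruit that disappears when the animal's
// trigger collider touches it.
public class EatableFruit : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            Destroy(gameObject); // the fruit is "eaten"
        }
    }
}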

Reflection

I am pleased to report that, despite several challenges during development, we successfully achieved our project objectives. Along the way we had to deal with issues such as gameplay lag caused by high-definition materials and errors in the rendering pipeline, which forced us to spend considerable time recreating the scene.

The main challenge in the module I was responsible for was that the test scene was much simpler than the final scene, so I had to modify my script and use the Oculus plugin to get character movement working in the final scene. I also tried to let users switch characters with a button directly from a script inside the scene, but this was unsuccessful: although the character and camera switched correctly, the movement controls broke, and I could not find a solution online. To work around this, we added a new menu page for character selection.
