
Classes and Components Overview

When you open your project based on the VR Sample for the first time and see a lot of different classes, you might get confused a bit.

In this article, we'll take a closer look at how the VR Sample is organized, what each major class does, and how they work together, so you can better understand the system and start building on top of it with confidence.

VRPlayer Class

This is the base class for all players. It declares the baseline controls and implements common player operations, event management, etc.

Notice
You can check out baseline controls for VR Controllers here.

The following component classes are inherited from the VRPlayer:

  • VRPlayerPC — implements a player with standard PC input devices (keyboard + mouse or an Xbox 360 controller) for VR emulation and contains all related settings and methods.
  • VRPlayerVR — a base class that provides shared functionality for all supported VR devices, including motion controllers, input tracking, teleportation, haptics, and object interaction. It defines common parameters and methods used by both controller-based and hand-tracking input systems.

Hand Tracking

Two specialized components extend the VRPlayerVR class to support natural hand tracking. Only one can be active at a time, depending on the system configuration:

  • HandController — inherited from VRPlayerVR; uses OpenXR's built-in hand tracking API when it is initialized and Ultraleap is not in use. It processes joint data to detect pinch and grab gestures, enabling natural interaction without physical controllers.
  • UltraleapHandControllerVarjo — also inherited from VRPlayerVR; available when the Varjo backend is active (-vr_app varjo) and the Ultraleap plugin is loaded. It integrates directly with the Ultraleap SDK to provide detailed skeletal tracking and gesture interaction.
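
The pinch detection mentioned above essentially reduces to a joint-distance check. A minimal sketch, assuming a simple vector type and a threshold value of our own choosing (the type, names, and threshold are illustrative, not the sample's actual API):

```cpp
#include <cmath>

// Minimal 3D vector; stands in for the engine's vector type.
struct Vec3 {
    float x, y, z;
};

static float distance(const Vec3 &a, const Vec3 &b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// A pinch is typically reported when the thumb tip and index tip
// come closer than a small threshold (the value here is an assumption).
bool isPinching(const Vec3 &thumb_tip, const Vec3 &index_tip,
                float threshold = 0.02f /* meters */)
{
    return distance(thumb_tip, index_tip) < threshold;
}
```

The actual component reads the joint positions from the hand tracking API each frame and feeds the fingertip joints into a check of this kind.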

To animate hand models, two corresponding mappers are used:

  • VRMeshSkinnedHandMapper — maps OpenXR joint data to skinned hand meshes.
  • UltraleapMeshSkinnedHandMapper — applies Ultraleap skeletal data to the mesh.

For more details, see the article: Hand Tracking.

Eye Tracking

Eye tracking enables interaction based on the user's gaze direction. It is supported in the VRPlayerVR class and can be used for gaze-based object selection and labeling.

  • EyetrackingPointer — calculates the gaze direction from the player's head position toward the focus point returned by the eye tracking system. It performs a raycast in that direction and identifies the object being looked at.
  • ObjectLabeling — displays the name of the currently gazed object using a text label positioned near the hit point. The label follows the gaze and updates dynamically as the user looks at different objects.
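
The gaze direction computed by the pointer is simply the normalized vector from the head position to the focus point; the raycast is then performed along it. A sketch of that first step, with an illustrative vector type (the actual component uses the engine's math types and intersection API):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Gaze direction: a unit vector from the head position toward the
// focus point reported by the eye tracking system.
Vec3 gazeDirection(const Vec3 &head, const Vec3 &focus)
{
    Vec3 d{focus.x - head.x, focus.y - head.y, focus.z - head.z};
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (len <= 0.0f)
        return {0.0f, 0.0f, 0.0f};  // head and focus coincide
    return {d.x / len, d.y / len, d.z / len};
}
```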

Eye tracking is active only if supported by the hardware and both VREyeTracking::isInitialized() and VREyeTracking::isValid() are true.

VRPlayerSpawner Class

This class is responsible for instantiating and registering all VRPlayer components within the Component System. During initialization, it performs runtime checks to determine the current VR configuration and selects the appropriate player controller accordingly:

  • If VR is initialized and an HMD is connected, it spawns a VRPlayerVR with all the components required for full VR interaction.
  • If the application is running with the Varjo backend and the Ultraleap plugin is available, it adds UltraleapHandControllerVarjo for external hand tracking.
  • If OpenXR hand tracking is available and Ultraleap is not used, it adds HandController.
  • If VR is not initialized, a VRPlayerPC is spawned instead, providing basic desktop controls for testing or fallback usage.
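
The selection order above can be expressed as a simple decision function. This is a sketch of the logic only; the struct, field names, and returned labels are illustrative, while the actual spawner queries the engine and plugin state at runtime:

```cpp
#include <string>

// Inputs describing the runtime VR configuration (illustrative).
struct VRConfig {
    bool vr_initialized;
    bool varjo_backend;        // running with -vr_app varjo
    bool ultraleap_available;  // Ultraleap plugin loaded
    bool openxr_hand_tracking; // OpenXR hand tracking available
};

// Returns the player/controller setup the spawner would choose,
// mirroring the selection order described above.
std::string selectPlayer(const VRConfig &cfg)
{
    if (!cfg.vr_initialized)
        return "VRPlayerPC";
    if (cfg.varjo_backend && cfg.ultraleap_available)
        return "VRPlayerVR + UltraleapHandControllerVarjo";
    if (cfg.openxr_hand_tracking)
        return "VRPlayerVR + HandController";
    return "VRPlayerVR";
}
```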

Interaction-Related Classes

VRInteractable is the base class for all objects that you can interact with. It defines a basic set of interactions; in other words, this is where you define what a user can do with your object. You can also add your own interaction types here.

The following component classes are inherited from the VRInteractable:

  • ObjMovable — used for all objects that can be grabbed, held, and thrown (a ball, a pistol, anything you can take, carry, and drop).
  • ObjHandle — used for all objects that can be turned or moved while being held (various handles, levers, valves, etc.).
  • ObjSwitch — used for all objects that can be switched by grabbing (all sorts of buttons and on-off switches, including rotary ones).
  • ObjLaserPointer — enables the object to cast a laser ray.
Notice
The laser pointer object in the VR Sample world has both the ObjMovable and ObjLaserPointer components attached.
  • NodeSwitchEnableByGrab — toggles the enabled state of specified nodes when the player grabs the object.

The following components also toggle the enabled state of assigned nodes. However, unlike the interaction components above, they do not inherit from VRInteractable.

  • NodeSwitchEnableByGesture — toggles the node state when a specific hand gesture is detected.
  • NodeSwitchEnableByKey — toggles the node state when a specific controller key is pressed.
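
The relationship between VRInteractable and its subclasses can be sketched as a small virtual interface: the base class declares what a user can do, and each component overrides only what it needs. Method names here are illustrative assumptions, not the sample's actual signatures:

```cpp
// Rough sketch of the interaction contract (names are hypothetical;
// see the actual VRInteractable source in the sample for the real API).
struct VRInteractable {
    virtual ~VRInteractable() {}
    virtual void onGrab()  {}  // object taken by a hand/controller
    virtual void onHold()  {}  // called while the object is held
    virtual void onThrow() {}  // object released with velocity
    virtual void onUse()   {}  // "use" action, e.g. trigger press
};

// Example subclass: a grabbable object that counts how often it is used.
struct MovableObject : VRInteractable {
    int use_count = 0;
    void onUse() override { ++use_count; }
};
```

Adding your own interaction type would mean declaring another virtual method in the base class and overriding it in the components that support it.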

Object Attachment

These components attach objects to the VR player's hands or head, ensuring consistent relative positioning.

  • AttachToHand — attaches an object to the player's hand in VR. It can either align the object using a custom position and rotation or preserve its current transform. The node is re-parented to the hand controller node (left or right) and aligned with a predefined basis correction for controller orientation.
  • AttachToHead — places an object in front of the player's head in VR and optionally updates its position every frame. It allows specifying the orientation relative to a chosen axis and controlling whether the position remains fixed or follows the player's gaze direction.
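
Why re-parenting makes the object follow the hand: once the node is a child of the controller node, its world position is the hand's transform applied to a stored local offset. A reduced sketch using yaw-only rotation (a full implementation would use the engine's matrix/quaternion types; everything here is illustrative):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// World position of a node attached to the hand: rotate the stored
// local offset by the hand's yaw, then translate by the hand position.
// Yaw-only rotation keeps the sketch short; real code uses a full
// 3D orientation.
Vec3 attachedWorldPosition(const Vec3 &hand_pos, float hand_yaw,
                           const Vec3 &local_offset)
{
    float c = std::cos(hand_yaw), s = std::sin(hand_yaw);
    Vec3 rotated{c * local_offset.x + s * local_offset.z,
                 local_offset.y,
                 -s * local_offset.x + c * local_offset.z};
    return {hand_pos.x + rotated.x,
            hand_pos.y + rotated.y,
            hand_pos.z + rotated.z};
}
```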

MenuBaseGUI Class

This is the base class for all graphical user interfaces (GUIs).

The following component classes are inherited from the MenuBaseGUI:

Notice
These components can be attached to GUI objects only.
  • HandSampleMenuGui — initializes the widgets for the menu attached to a controller.
  • WorldMenuSampleGui — initializes the widgets for the world menu.
  • MixedRealityMenuGui — initializes the widgets for the menu attached to an HMD.

Framework

The framework includes the Component System, which implements the core functionality of components, as well as a set of utility classes and functions for playing sounds, auxiliary and 3D math, and the callback system implementation.

Triggers Class

Triggers is a framework class used to mark room obstacles for the VR player (e.g., room walls, objects, etc.) and warn the player when there is an obstacle in the way: as the player gets closer to an obstacle, the controllers' vibration becomes more intense.

Notice
Obstacles are not available for VRPlayerPC.
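
The "vibration becomes more intense" behavior can be modeled as an intensity curve over the distance to the nearest obstacle. The linear falloff and the function name below are assumptions for illustration; the sample may use a different curve:

```cpp
// Haptic intensity for obstacle warnings: 0 outside the warning range,
// rising to 1 at contact. Linear falloff is an assumption.
float vibrationIntensity(float distance, float warning_range)
{
    if (warning_range <= 0.0f || distance >= warning_range)
        return 0.0f;  // far from any obstacle: no vibration
    if (distance <= 0.0f)
        return 1.0f;  // touching the obstacle: full vibration
    return 1.0f - distance / warning_range;
}
```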

You can simply create primitives for walls and objects in your room and add them as children to the dummy node named Obstacles, which is a child of the VR dummy node (see the hierarchy in the Editor below).

All children of the Obstacles node will be automatically switched to invisible mode and will be used only to inform the player and prevent collisions with objects in the real room.

Notice
Primitives used to mark obstacles must have their first surface (the one with index 0) named "box", "sphere", "capsule", or "cylinder" in order to be properly converted into the corresponding trigger volumes.
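
The surface-name convention above amounts to a name-to-shape lookup with unknown names rejected. A sketch of that mapping (the enum and function name are illustrative, not the sample's actual code):

```cpp
#include <string>

// Shapes a trigger volume can take; None means the surface name
// does not match the convention and no trigger is created.
enum class TriggerShape { None, Box, Sphere, Capsule, Cylinder };

// Converts the name of the first surface (index 0) of an obstacle
// primitive into the corresponding trigger volume shape.
TriggerShape triggerShapeForSurface(const std::string &surface_name)
{
    if (surface_name == "box")      return TriggerShape::Box;
    if (surface_name == "sphere")   return TriggerShape::Sphere;
    if (surface_name == "capsule")  return TriggerShape::Capsule;
    if (surface_name == "cylinder") return TriggerShape::Cylinder;
    return TriggerShape::None;
}
```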

Utils

This class provides a utility module with a wide range of helper functions commonly used in VR applications for working with 3D math, object transforms, geometry, visualization, and world management.

Sound Manager Class

The SoundManager class provides centralized management of audio playback in the scene. It supports one-shot and looped 3D sounds, grouped sound variants, volume control, and performance optimization through sound reuse.

Key Features:

  • One-shot sounds — use playSound() to play positional, non-looping sound effects (e.g., impact sounds).
  • Looped sounds — use playLoopSound() for continuous sounds (e.g., ambient noises); these automatically stop after a specified duration.
  • Sound grouping — similar sound files can be grouped with addSoundGroup() and randomly selected for variety.
  • Volume control — global volume can be adjusted with setVolume(); separate control is available for the sound and music channels.
  • Sound warming — warmAllSounds() preloads all sound groups into memory to minimize delays during playback.
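
The sound-grouping idea is straightforward: register several files under one name, then pick a random variant at playback time. A minimal sketch of that mechanism (the class, its methods, and the file names are illustrative, approximating what addSoundGroup() does rather than reproducing it):

```cpp
#include <cstdlib>
#include <map>
#include <string>
#include <vector>

// Sketch of sound grouping: similar files are registered under one
// name and a random variant is chosen each time the group is played.
class SoundGroups {
public:
    void addSoundGroup(const std::string &name,
                       const std::vector<std::string> &files)
    {
        groups_[name] = files;
    }

    // Returns a random variant from the group, or "" if the group
    // is unknown or empty.
    std::string pickVariant(const std::string &name) const
    {
        auto it = groups_.find(name);
        if (it == groups_.end() || it->second.empty())
            return "";
        return it->second[std::rand() % it->second.size()];
    }

private:
    std::map<std::string, std::vector<std::string>> groups_;
};
```

Preloading ("warming") would then iterate over all registered groups and load each file into memory before the first playback request.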

The information on this page is valid for UNIGINE 2.20 SDK.

Last update: 2025-06-09