1. Overview
Triggers are the detection mechanisms that power interactive experiences in VRse Studio. They act as the “sensors” of your VR world, continuously monitoring for specific user actions and environmental changes. When conditions are met, triggers fire events that can advance your story, execute actions, or respond to user behavior. This reference covers all available triggers and their implementation in the current VRse Studio codebase.
2. Understanding Triggers
2.1 What Are Triggers?
Triggers are event detection systems that monitor specific conditions within your VR experience. Each trigger watches for discrete events such as:
- Object Interactions: Grabbing, touching, or manipulating objects
- Spatial Relationships: Object placement, collisions, or proximity
- User Behavior: Eye tracking, movement patterns, or responses
- Time-Based Events: Countdown completions or duration thresholds
- System Events: UI interactions, evaluations, or custom conditions
2.2 Trigger Structure
Every trigger follows a consistent JSON structure (see the example sketch after this list):
- Name: The trigger type identifier
- Query: Target object name being monitored
- Option: Specific detection behavior within the trigger type
- Data: JSON string containing configuration parameters
- Type: Legacy field (currently unused, reserved for future functionality)
Note: The Type field is currently unused and has no effect. It is maintained for backward compatibility and may be used for future features. Always set it to 1 for triggers.
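For illustration only, a trigger using this structure might be serialized as sketched below; the field casing shown and the object name “Clipboard” are assumptions, and the Data field carries its parameters as an escaped JSON string.

```json
{
  "Name": "GrabbableTrigger",
  "Query": "Clipboard",
  "Option": "Grab",
  "Data": "{\"handOption\": \"Both\"}",
  "Type": 1
}
```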
2.3 Common Parameters
Most triggers support these universal parameters (see the sketch after this list):
- handOption: Specify hand detection (“None”, “Left”, “Right”, “Both”)
- waitForCompletion: Control trigger completion behavior
- targetObjectName: Reference to monitored object
- duration: Time-based thresholds for detection
- isTrigger: Physics detection mode (trigger vs collision)
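As a sketch only, several of these common parameters might appear together in a trigger’s Data payload (shown unescaped for readability). Which parameters apply depends on the specific trigger type, and the values below, including the object name “ValveHandle”, are placeholders.

```json
{
  "handOption": "Right",
  "waitForCompletion": true,
  "targetObjectName": "ValveHandle",
  "duration": 2.0,
  "isTrigger": true
}
```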
3. Trigger Categories
Triggers are organized into several categories based on their detection focus:
| Category | Purpose | Examples |
|---|---|---|
| Object Interaction | Monitor object manipulation | Grab, touch, placement |
| Spatial Detection | Track spatial relationships | Collisions, eye level, proximity |
| Time-Based | Respond to temporal events | Countdown completion, duration |
| Universal | Catch-all interaction detection | Any trigger, custom conditions |
| Evaluation | Assessment and feedback | MCQ responses, completion tracking |
4. Object Interaction Triggers
4.1 GrabbableTrigger
Detects when objects are grabbed, released, or used (squeezed), with hand-specific detection capabilities.
Options
| Option | Description |
|---|---|
| Grab | Object is grabbed by user |
| Release | Object is released from grip |
| Used | Object is squeezed/activated while held |
Parameters
| Parameter | Type | Description |
|---|---|---|
| handOption | String | Hand specification: “None”, “Left”, “Right”, “Both” |
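A minimal GrabbableTrigger sketch, assuming the structure from section 2.2; the object name “FireExtinguisher” is hypothetical.

```json
{
  "Name": "GrabbableTrigger",
  "Query": "FireExtinguisher",
  "Option": "Grab",
  "Data": "{\"handOption\": \"Right\"}",
  "Type": 1
}
```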
4.2 HandTouchTrigger
Detects hand contact with objects or surfaces for precise touch interactions.
Options
| Option | Description |
|---|---|
| Touch | Hand makes contact with object |
| Untouch | Hand breaks contact with object |
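A possible HandTouchTrigger definition is sketched below; the object name “ControlPanelButton” is hypothetical, and the handOption entry assumes this trigger honors the common parameters listed in section 2.3.

```json
{
  "Name": "HandTouchTrigger",
  "Query": "ControlPanelButton",
  "Option": "Touch",
  "Data": "{\"handOption\": \"Left\"}",
  "Type": 1
}
```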
4.3 PlacePointTrigger
Monitors object placement at specific locations with support for correct/incorrect placement detection.
Options
| Option | Description |
|---|---|
| Place | Any object placed at location |
| Remove | Any object removed from location |
| PlaceCorrect | Correct object placed |
| PlaceWrong | Incorrect object placed |
| RemoveCorrect | Correct object removed |
| RemoveWrong | Incorrect object removed |
Parameters
| Parameter | Type | Description |
|---|---|---|
| grabbableName | String | Target object name (for Place/Remove) |
| disableGrabOnPlace | Boolean | Prevent grabbing after placement |
| createGhostMesh | Boolean | Show placement preview |
| deleteGhostMeshOnPlace | Boolean | Remove preview after placement |
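A sketch of a PlacePointTrigger that fires on correct placement; the place point name “ToolRackSlot” and the grabbable name “Wrench” are hypothetical.

```json
{
  "Name": "PlacePointTrigger",
  "Query": "ToolRackSlot",
  "Option": "PlaceCorrect",
  "Data": "{\"grabbableName\": \"Wrench\", \"disableGrabOnPlace\": true, \"createGhostMesh\": true, \"deleteGhostMeshOnPlace\": true}",
  "Type": 1
}
```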
5. Spatial Detection Triggers
5.1 CollisionTrigger
Detects physical interactions between objects, with support for both trigger-based and solid collision detection.
Options
| Option | Description |
|---|---|
| Enter | Objects begin contact |
| Stay | Objects remain in contact |
| Exit | Objects separate |
Parameters
| Parameter | Type | Description |
|---|---|---|
| targetCollisionGameObject | String | Object to detect collision with |
| isTrigger | Boolean | Use trigger (true) or solid collision (false) |
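A CollisionTrigger sketch, assuming the structure from section 2.2; the object names “SafetyCone” and “ForkliftBody” are hypothetical.

```json
{
  "Name": "CollisionTrigger",
  "Query": "SafetyCone",
  "Option": "Enter",
  "Data": "{\"targetCollisionGameObject\": \"ForkliftBody\", \"isTrigger\": true}",
  "Type": 1
}
```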
5.2 EyeLevelDetectionTrigger
Tracks when objects enter, stay within, or exit the user’s eye level for attention-based interactions.
Options
| Option | Description |
|---|---|
| EnteredEyeLevel | Object enters user’s eye level |
| StayedEyeLevel | Object remains in eye level for duration |
| ExitedEyeLevel | Object exits user’s eye level |
Parameters
| Parameter | Type | Description |
|---|---|---|
| targetObjectName | String | Object to track |
| stayDuration | Float | Required duration for stayed detection |
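A sketch of an EyeLevelDetectionTrigger that requires sustained attention; “WarningSign” is a hypothetical object name, and the three-second stayDuration is a placeholder value.

```json
{
  "Name": "EyeLevelDetectionTrigger",
  "Query": "WarningSign",
  "Option": "StayedEyeLevel",
  "Data": "{\"targetObjectName\": \"WarningSign\", \"stayDuration\": 3.0}",
  "Type": 1
}
```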
6. Time-Based Triggers
6.1 TimerTrigger
Responds to countdown timer completion for time-sensitive training scenarios.
Parameters
| Parameter | Type | Description |
|---|---|---|
| endTime | Float | Timer duration in seconds |
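Because this section does not enumerate TimerTrigger options, only the Data payload is sketched here (before being escaped into the trigger’s Data string); the 60-second value is a placeholder.

```json
{
  "endTime": 60.0
}
```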
7. Universal Triggers
7.1 AnyTrigger
Universal interaction detector that captures any user action, with advanced filtering capabilities.
Parameters
| Parameter | Type | Description |
|---|---|---|
| ignoreQueries | String Array | Object names to ignore |
| ignoreTriggerTypes | String Array | Trigger types to ignore |
| triggerFrom | Integer | Trigger after N interactions (default: 1) |
| triggerInterval | Integer | Trigger every N interactions (default: 0) |
| triggerOnce | Boolean | Only trigger once (default: false) |
| ignoreOnRightTriggers | Boolean | Ignore correct interaction objects |
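A sketch of an AnyTrigger Data payload (shown unescaped for readability); the ignored object and trigger-type names are hypothetical, and since this section does not document AnyTrigger option values, only the parameters are illustrated.

```json
{
  "ignoreQueries": ["Floor", "TableSurface"],
  "ignoreTriggerTypes": ["CollisionTrigger"],
  "triggerFrom": 1,
  "triggerInterval": 0,
  "triggerOnce": true,
  "ignoreOnRightTriggers": false
}
```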
8. Evaluation Triggers
8.1 MCQResponseTrigger
Detects responses to multiple-choice questions with support for correctness evaluation.
Options
| Option | Description |
|---|---|
| AnyResponse | Any answer provided |
| CorrectResponse | Correct answer given |
| WrongResponse | Incorrect answer given |
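A minimal MCQResponseTrigger sketch; “SafetyQuizQuestion1” is a hypothetical question object name, and the empty Data string assumes no additional parameters are required for this trigger.

```json
{
  "Name": "MCQResponseTrigger",
  "Query": "SafetyQuizQuestion1",
  "Option": "CorrectResponse",
  "Data": "{}",
  "Type": 1
}
```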