JMRSDK Development
v4.57

Tracking

Overview

Introduction to Tracking

Positional tracking detects the precise position of head-mounted displays, controllers, other objects, or body parts within Euclidean space. Because the purpose of VR is to emulate perceptions of reality, positional tracking must be both accurate and precise so as not to break the illusion of three-dimensional space. Several methods have been developed to track the position and orientation (pitch, yaw, and roll) of the display and any associated objects or devices. All of these methods rely on sensors that repeatedly record signals from transmitters on or near the tracked object(s) and send that data to the computer to maintain an approximation of the objects' physical locations. These locations are generally expressed in one or more of three coordinate systems: the Cartesian rectilinear system, the spherical polar system, and the cylindrical system. Many interfaces have also been designed to monitor and control movement within, and interaction with, the virtual 3D space; such interfaces must work closely with the positional tracking system to provide a seamless user experience.
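
JMRSDK exposes tracked poses through the TrackerManager API described under Tracking Framework; as with most Unity-based SDKs, those poses are ultimately Cartesian positions and quaternion rotations. Purely as an illustration of how the coordinate systems named above relate, and not as part of the JMRSDK API, the sketch below converts a spherical coordinate into Unity's Y-up Cartesian space:

```csharp
using UnityEngine;

// Illustrative helper, not a JMRSDK type: maps a spherical coordinate
// (radius r, inclination theta measured from the vertical axis, azimuth phi
// around it, both angles in radians) to Unity's Y-up Cartesian space.
public static class CoordinateConversion
{
    public static Vector3 SphericalToCartesian(float r, float theta, float phi)
    {
        float x = r * Mathf.Sin(theta) * Mathf.Cos(phi);
        float y = r * Mathf.Cos(theta);   // height above the origin
        float z = r * Mathf.Sin(theta) * Mathf.Sin(phi);
        return new Vector3(x, y, z);
    }
}
```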

Types

3DoF

Three Degrees of Freedom, or 3DoF, is a virtual reality concept that describes how users interact with a virtual environment when only head rotation is tracked. With 3DoF, users can:

  • Look left and right (yaw)

  • Look up and down (pitch)

  • Tilt the head from side to side (roll)

Because only rotation is tracked, 3DoF users cannot move through the virtual space: physically walking or leaning does not change their viewpoint. Users can still interact with the environment via gaze control or a laser-pointer controller. 3DoF can be useful for practising job duties that are important yet sedentary, such as navigating difficult conversations with coworkers.
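
To make the distinction concrete, the following is a minimal Unity sketch of what rotation-only (3DoF) tracking means for a camera rig: the head's orientation is applied every frame while any positional movement is ignored. The headPose field is a hypothetical placeholder for whichever tracked head transform your rig exposes (for example the JMRRig head camera); it is not a JMRSDK API.

```csharp
using UnityEngine;

// Sketch only: follows the tracked head's rotation but never its position,
// which is the behaviour a 3DoF device exhibits.
public class ThreeDofFollower : MonoBehaviour
{
    [SerializeField] private Transform headPose; // hypothetical tracked head transform

    private void LateUpdate()
    {
        // Copy yaw, pitch and roll...
        transform.rotation = headPose.rotation;
        // ...and leave transform.position untouched, so walking or leaning
        // has no effect on the rendered viewpoint.
    }
}
```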

6DoF

In AR and VR, 6DoF describes the full range of motion that a head-mounted display can track relative to virtual content in a scene. Three of the degrees are rotational, covering the motion of the user's head: turning left and right (yaw), tilting up and down (pitch), and tilting from side to side (roll). The remaining three are translational, covering movement within the space: left and right, forward and backward, and up and down.
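
By contrast, a 6DoF rig consumes both halves of the head pose. The sketch below, again using a hypothetical headPose placeholder rather than an actual JMRSDK call (on device, head pose comes from the TrackerManager methods listed under Tracking Framework), logs the three rotational and three translational degrees every frame:

```csharp
using UnityEngine;

// Sketch only: reads the full 6DoF pose of a tracked head transform and
// logs its rotational (pitch/yaw/roll) and translational (x/y/z) components.
public class SixDofLogger : MonoBehaviour
{
    [SerializeField] private Transform headPose; // hypothetical tracked head transform

    private void Update()
    {
        Vector3 euler = headPose.rotation.eulerAngles; // pitch (x), yaw (y), roll (z) in degrees
        Vector3 pos = headPose.position;               // translation in metres

        Debug.Log($"Rotation: pitch={euler.x:F1} yaw={euler.y:F1} roll={euler.z:F1}");
        Debug.Log($"Position: x={pos.x:F2} y={pos.y:F2} z={pos.z:F2}");
    }
}
```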
