I am a beginner in VR development.

I'm developing for Oculus Quest in Unity using the Oculus Integration asset, but I can't get any Oculus Touch input.

Reviews of Oculus Integration suggest that v1.4.2 is defective, so I also tried v1.3.9 from the archive, but it still doesn't work.

How can I get Oculus Touch input?

Steps to reproduce
  1. Load the DebugUI scene from Oculus Integration's Oculus>SampleFrameworks>Usage folder.
  2. In the Hierarchy, make OVRControllerPrefab a child of PlayerController>OVRCameraRig>TrackingSpace>RightHandAnchor>RightControllerHandAnchor and change its Controller setting to R Tracked Remote.
  3. Place Oculus Integration's UIHelpers prefab in the scene, enable the LineRenderer on its LaserPointer child object, and set the Ray Transform of the EventSystem's OVRInput Module to RightControllerAnchor. (Because I want to show a laser pointer.)
  4. Attach the following script.
// ButtonPush.cs
using UnityEngine;

public class ButtonPush : MonoBehaviour
{
    void Update()
    {
        // Log when the right index trigger is pressed this frame.
        if (OVRInput.GetDown(OVRInput.RawButton.RIndexTrigger))
        {
            Debug.Log("ButtonPush");
        }
    }
}
Even with the script from step 4 attached to an object that seems relevant, nothing from Debug.Log appears in the Console. There is no error, either.
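For debugging cases like this, it can help to log which controller OVRInput currently considers active and to poll the trigger with the controller passed explicitly, both as a button and as a 0..1 axis. This is a sketch of my own, not code from the question; the class name is arbitrary:

// InputProbe.cs — debugging sketch (assumes the Oculus Integration OVRInput API)
using UnityEngine;

public class InputProbe : MonoBehaviour
{
    void Update()
    {
        // Which controller does OVRInput report as active right now?
        OVRInput.Controller active = OVRInput.GetActiveController();

        // Explicitly target the right Touch controller instead of relying
        // on the implicit Controller.Active default.
        bool pressed = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger,
                                        OVRInput.Controller.RTouch);
        float squeeze = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger,
                                     OVRInput.Controller.RTouch);

        if (pressed || squeeze > 0.1f)
        {
            Debug.Log($"active={active} pressed={pressed} trigger={squeeze:F2}");
        }
    }
}

If the active controller logs as a Tracked Remote rather than Touch, the build is being treated as a Go app rather than a Quest app.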

I tried multiple combinations of Unity and Oculus Integration versions:

                              Unity 2019.2.12                 Unity 2017.4.28
    Oculus Integration 1.4.2  Touch itself is not displayed   Touch is displayed, but input cannot be obtained
    Oculus Integration 1.3.9  Same as above                   Same as above

I suspected a version issue, but since lowering the version gives the same result, it might be something more fundamental ...

I'd be glad if you could give me some advice.

  • Answer # 1

    In the case of v1.3.9, if you don't apply the uses-feature fix described in the following article, the app is recognized as an Oculus Go app and the buttons behave differently than expected. Since that is troublesome, v1.4.1 or higher is recommended.

    What to do when your app is messed up after updating Oculus Quest Build 7.0
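    For reference, the fix described there amounts to declaring head tracking as a required feature in the app's Android manifest, so the build is recognized as a Quest (6DoF) app rather than a Go app. The snippet below is a sketch from memory, not quoted from the article; verify the exact values against it:

    <!-- In Assets/Plugins/Android/AndroidManifest.xml, inside <manifest> -->
    <!-- Declares 6DoF head tracking as required, so the OS treats the build
         as an Oculus Quest app instead of an Oculus Go app. -->
    <uses-feature
        android:name="android.hardware.vr.headtracking"
        android:version="1"
        android:required="true" />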

    In addition, I built the same scene with Unity 2019.2.12 and Oculus Integration 1.4.2, and Touch was displayed and a log appeared when I pulled the right controller's trigger. Nothing in particular comes to mind, but since I changed the following items in PlayerSettings after creating the project, one of them may have an effect.

    Other Settings

    Change ColorSpace to Linear

    Change Graphics APIs to OpenGLES3 only

    Change Minimum API Level to Android 6.0 'Marshmallow' (API level 23)

    XR Settings

    Check Virtual Reality Supported

    Add Oculus to Virtual Reality SDKs
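    As a sketch, the PlayerSettings changes above can also be applied from an editor script. This assumes Unity 2019's built-in (legacy) XR API; the menu path and file name are my own, and "Oculus" must still be added to the Virtual Reality SDKs list in XR Settings:

    // Editor/ApplyQuestSettings.cs — minimal sketch, Unity 2019-era API
    using UnityEditor;
    using UnityEngine;
    using UnityEngine.Rendering;

    public static class ApplyQuestSettings
    {
        [MenuItem("Tools/Apply Quest Player Settings")]
        public static void Apply()
        {
            // Other Settings
            PlayerSettings.colorSpace = ColorSpace.Linear;
            PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.Android, false);
            PlayerSettings.SetGraphicsAPIs(BuildTarget.Android,
                new[] { GraphicsDeviceType.OpenGLES3 });
            PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel23;

            // XR Settings (legacy built-in VR, deprecated in later Unity versions)
            PlayerSettings.virtualRealitySupported = true;
            Debug.Log("Applied. Add Oculus to Virtual Reality SDKs in XR Settings.");
        }
    }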

    I use the Android SDK and OpenJDK that can be installed as Unity modules.

    One other thing that may differ: when I installed XR Management from the Package Manager, the controller stopped being displayed and there were other problems, so I uninstalled it.

    ■ Using ALVR
    The mainline development version of ALVR does not support Quest, so use the forked version.

    To run on Unity via ALVR, switch PlayerSettings to the PC, Mac & Linux Standalone settings and configure as follows.

    XR Settings

    Check Virtual Reality Supported

    Add OpenVR to the top of the Virtual Reality SDKs list

    With ALVR, an error occurs when using OVRGrabber.cs (as in the CustomHandLeft/CustomHandRight prefabs), so it needs some modifications as follows.


    ■ Using Oculus Link
    Oculus Link can also be used. Its behavior is closer to the Quest's than ALVR's, so the OVRGrabber.cs modifications above are not needed. If you use it, change the top entry of Virtual Reality SDKs in the PC, Mac & Linux Standalone settings to Oculus.
    Although the official cable has not been released yet, a recommended cable is introduced on the following support page.