Raycast accuracy issue with a RenderTexture touch position in Unity VR

Problem description

I'm running into a problem with a RenderTexture that shows another player's view on a panel in a Unity VR multiplayer project. The feature is implemented so that when the panel is touched with the right hand, a ray is cast into the virtual space corresponding to the touch position. However, the direction of the ray deviates from the touch position, so it does not work as expected.

Environment:

- Device: Meta Quest 3
- No hand tracking; controllers are used
- Hands are displayed in the virtual environment

Goal: I want to accurately hit a target in virtual space by casting a ray that corresponds to the touch position on a panel object displaying a RenderTexture.

Problem details: The touch position on the panel is normalized (bottom-left is (0,0), top-right is (1,1)), and the ray is cast from the camera that renders to the RenderTexture. However, especially when the camera and the panel face each other at a 180-degree angle (around 1:19 in the video), the touch position and the ray are mirrored. For debugging, a red cube is spawned at the touch position, and this confirmed that the cube's position is off as well.
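The normalization described above, and the horizontal mirroring that shows up at 180 degrees, reduce to plain arithmetic; a minimal sketch in Python (panel size and hit offsets are made-up values):

```python
def to_viewport(local_hit, panel_size):
    """Map a panel-local hit point to normalized coordinates with
    (0,0) at the bottom-left and (1,1) at the top-right."""
    return (local_hit[0] / panel_size[0] + 0.5,
            local_hit[1] / panel_size[1] + 0.5)

# Panel of 0.55 x 0.38, touched 0.11 right and 0.095 up from its center.
u, v = to_viewport((0.11, 0.095), (0.55, 0.38))

# When the render camera faces the panel head-on (a 180-degree angle between
# their forward vectors), the displayed image is mirrored horizontally, so the
# matching viewport coordinate is 1 - u rather than u.
mirrored_u = 1 - u
```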

Reference video

Relevant code:

using UnityEngine;
using Photon.Pun;

public class PanelManager : MonoBehaviourPun
{
    public Camera displayRenderCamera; // Camera rendering the image to the RenderTexture
    private GameObject displayGameObject; // GameObject displaying the RenderTexture
    private bool collisionStatus = false;
    private Vector3 colliderPoint = Vector3.zero;

    void Start()
    {
        InitializeCameraAndPanel();
    }

    void Update()
    {
        bool gripHeld = OVRInput.Get(OVRInput.Button.PrimaryHandTrigger, OVRInput.Controller.RTouch);
        bool triggerNotPressed = !OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch);

        if (gripHeld && triggerNotPressed && collisionStatus)
        {
            InteractWithRenderTexture();
        }
        InitializeCameraAndPanel();
    }

    private void InitializeCameraAndPanel()
    {
        PhotonView[] allPhotonViews = FindObjectsOfType<PhotonView>();

        foreach (PhotonView view in allPhotonViews)
        {
            if (view.Owner != null)
            {
                if (view.Owner.ActorNumber != PhotonNetwork.LocalPlayer.ActorNumber)
                {
                    // Find the other player's camera
                    GameObject camera = view.gameObject.transform.Find("Head/ViewCamera")?.gameObject;
                    if (camera != null)
                    {
                        displayRenderCamera = camera.GetComponent<Camera>();
                        Debug.Log(displayRenderCamera);
                    }
                }
                else if (view.Owner.ActorNumber == PhotonNetwork.LocalPlayer.ActorNumber)
                {
                    GameObject panel = view.gameObject.transform.Find("Panel/Panel")?.gameObject;
                    if (panel != null)
                    {
                        displayGameObject = panel;
                    }
                    else
                    {
                        Debug.LogWarning("Panel/Panel not found on my object");
                    }
                }
            }
        }
    }

    private void InteractWithRenderTexture() // Main logic
    {
        Vector3 localHitPoint = getLocalHitPoint();
        // var displayGameObjectSize = displayGameObject.GetComponent<MeshRenderer>().bounds.size;
        Vector3 displayGameObjectSize = new Vector3(0.55f, 0.38f, 0.001f);

        // Calculate the viewport
        var viewportPoint = new Vector3()
        {
            x = (localHitPoint.x / displayGameObjectSize.x) + 0.5f,  // Center at 0.5
            y = (localHitPoint.y / displayGameObjectSize.y) + 0.5f,  // Center at 0.5
        }; // Range: 0-1

        float angleBetween = Vector3.Angle(displayRenderCamera.transform.forward, displayGameObject.transform.forward);

        // Adjust x coordinate based on the angle
        viewportPoint.x = AdjustedCoordinate(angleBetween, viewportPoint.x);

        // Generate a ray from the camera based on the viewport
        Ray ray = displayRenderCamera.ViewportPointToRay(viewportPoint);
        RaycastHit hit;

        // Get the coordinates of the hit point (for debugging)
        // Vector3 point = ray.GetPoint(2.0f);
        // GameObject Cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        // Cube.transform.position = point;
        // Cube.transform.localScale = new Vector3(0.1f, 0.1f, 0.1f);
        // Cube.GetComponent<Renderer>().material.color = Color.red;
        // Destroy(Cube, 0.1f);

        if (Physics.Raycast(ray, out hit, 10.0f))
        {
            // Activate the particle system of the detected object
            var cubeManager = hit.transform.GetComponent<CubeManager>();
            if (cubeManager != null)
            {
                cubeManager.StartParticleSystem();
            }
        }
    }

    private Vector3 getLocalHitPoint() // Get the local coordinates of the touched point on the panel
    {
        Vector3 localHitPoint = colliderPoint;
        if (localHitPoint != Vector3.zero)
        {
            return localHitPoint - displayGameObject.transform.position;
        }
        return Vector3.zero;
    }

    public static float AdjustedCoordinate(float theta, float x)
    {
        if (theta == 180.0f) return 1.0f - x;
        if (theta == 0.0f) return x;

        float adjustmentFactor = Mathf.Cos(theta * Mathf.Deg2Rad);
        float adjustedX = (1 - adjustmentFactor) * (1 - x) + adjustmentFactor * x;
        return adjustedX;
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.gameObject.tag == "rightHand")
        {
            collisionStatus = true;
            colliderPoint = other.ClosestPointOnBounds(transform.position);
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.gameObject.tag == "rightHand")
        {
            collisionStatus = false;
            colliderPoint = Vector3.zero;
        }
    }
}

What I have tried: I reviewed the way the touch position is calculated in the local coordinate system, but the deviation persists. Any suggestions on the cause of this problem and how to solve it would be appreciated. If this implementation is flawed, how should it be fixed?

Additional note: I tried the following edit, but it did not solve the problem.

private Vector3 getLocalHitPoint()
{
    Vector3 localHitPoint = displayGameObject.transform.InverseTransformPoint(colliderPoint);
    return localHitPoint;
}
Tags: c#, unity-game-engine, virtual-reality, oculus, unity3d-editor
1 Answer

So in general what you would do is:

  1. get the correct touch point in 3D
  2. get the local point in the RawImage space
  3. convert that local point into UV (= viewport) coordinates
  4. shoot a ray through the second camera
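For intuition on step 4: Unity's ViewportPointToRay goes through the inverse projection matrix, but for a symmetric perspective camera the ray direction reduces to the tangent math below. A rough Python sketch (the FOV and aspect ratio are made-up values):

```python
import math

def viewport_point_to_ray_dir(u, v, vfov_deg, aspect):
    """Direction (camera space, +Z forward) of the ray through viewport (u, v),
    where (0,0) is the bottom-left and (1,1) the top-right of the view."""
    half_h = math.tan(math.radians(vfov_deg) / 2)  # half-height of the view plane at z=1
    x = (u * 2 - 1) * half_h * aspect
    y = (v * 2 - 1) * half_h
    n = math.sqrt(x * x + y * y + 1)
    return (x / n, y / n, 1 / n)

# The center of the viewport looks straight down the camera's forward axis.
d = viewport_point_to_ray_dir(0.5, 0.5, 60.0, 16 / 9)
print(d)  # (0.0, 0.0, 1.0)
```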

So I think the first problem is

colliderPoint = other.ClosestPointOnBounds(transform.position);

transform.position is always somewhere near the center of the panel, so the closest point on the hand's bounds to it is not the actual touch point. I would rather go the other way round and project the hand's position onto the panel instead. For example, simply use a Plane and do something like:
private Vector3? colliderPoint = null;

void OnTriggerEnter(Collider other)
{
    if (other.CompareTag("rightHand"))
    {
        var plane = new Plane(transform.forward, transform.position);

        colliderPoint = plane.ClosestPointOnPlane(other.bounds.center);
    }
}

void OnTriggerExit(Collider other)
{
    if (other.CompareTag("rightHand"))
    {
        colliderPoint = null;
    }
}

Now this should give a more accurate 3D collision point.
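The plane projection used here is just p' = p - ((p - o)·n) n, which is what Plane.ClosestPointOnPlane computes. A small Python sketch of the same math (vectors as tuples; the positions are made-up values):

```python
def closest_point_on_plane(point, plane_origin, plane_normal):
    """Project a point onto the plane defined by an origin and a unit normal."""
    # Signed distance from the plane along the normal.
    d = sum((p - o) * n for p, o, n in zip(point, plane_origin, plane_normal))
    # Move the point back along the normal by that distance.
    return tuple(p - d * n for p, n in zip(point, plane_normal))

# Panel at the origin facing +Z; hand center slightly in front of it.
hit = closest_point_on_plane((0.1, 0.2, 0.05), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(hit)  # the Z offset is removed: (0.1, 0.2, 0.0)
```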

Then something like:

private RawImage displayGameObject;

private void InteractWithRenderTexture() // Main logic
{
    if(colliderPoint == null) return;

    var worldSpaceHitPoint = colliderPoint.Value;

    // 2. convert to local space
    Vector2 localHitPoint = displayGameObject.transform.InverseTransformPoint(worldSpaceHitPoint);

    // 3. convert to UV coordinates - see https://gamedev.stackexchange.com/questions/198782/how-do-you-get-the-texture-coordinate-hit-by-the-mouse-on-a-ui-raw-image-in-unit
    var rect = displayGameObject.rectTransform.rect;
    Vector2 textureCoord = localHitPoint - rect.min;
    textureCoord.x *= displayGameObject.uvRect.width / rect.width;
    textureCoord.y *= displayGameObject.uvRect.height / rect.height;
    textureCoord += displayGameObject.uvRect.min;


    // 4. raycast through the second camera, using the UV as viewport coordinates
    var ray = displayRenderCamera.ViewportPointToRay(textureCoord);

    if (Physics.Raycast(ray, out var hit, 10.0f))
    {
        // Activate the particle system of the detected object
        if(hit.transform.TryGetComponent<CubeManager>(out var cubeManager))
        {
            cubeManager.StartParticleSystem();
        }
    }
}
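Step 3 above (local point to UV) is the same rect-to-rect remap regardless of engine. A Python sketch of that mapping, assuming the default uvRect of (0, 0, 1, 1) and a rect centered on its pivot (the sizes are made-up values):

```python
def local_to_uv(local, rect_min, rect_size, uv_min=(0.0, 0.0), uv_size=(1.0, 1.0)):
    """Remap a rect-local point into the image's UV rectangle, component-wise:
    shift by rect.min, scale into the UV rect, then shift by uvRect.min."""
    return tuple((l - rmin) * usize / rsize + umin
                 for l, rmin, rsize, umin, usize
                 in zip(local, rect_min, rect_size, uv_min, uv_size))

# A 0.55 x 0.38 rect centered on its pivot has rect.min = (-0.275, -0.19);
# the rect's center maps to the middle of the texture.
uv = local_to_uv((0.0, 0.0), (-0.275, -0.19), (0.55, 0.38))
print(uv)  # (0.5, 0.5)
```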