I would like to be able to export both the mesh and the texture from the iPad Pro LiDAR. There are examples here of how to export the mesh, but I would also like to be able to export the environment texture.
ARKit 3.5 - How to export OBJ from new iPad Pro with LiDAR?
ARMeshGeometry stores the vertices for the mesh; would it be the case that one would need to "record" the textures while scanning the environment, and then apply them manually?
This post seems to show a way to get texture coordinates, but I can't see a way to do that with ARMeshGeometry: Save ARFaceGeometry to OBJ file
Any pointers in the right direction, or things to look at, would be greatly appreciated!

Chris
You need to compute the texture coordinates for each vertex, apply them to the mesh, and supply the texture as a material for the mesh.
let geom = meshAnchor.geometry
let camera = arFrame.camera
let size = camera.imageResolution
// meshAnchor.transform is non-optional, so no force-unwrap is needed
let modelMatrix = meshAnchor.transform

// geom.vertices is an ARGeometrySource, so each vertex is read out of its
// underlying Metal buffer by index
let vertices = geom.vertices
let textureCoordinates = (0..<vertices.count).map { index -> vector_float2 in
    let pointer = vertices.buffer.contents()
        .advanced(by: vertices.offset + vertices.stride * index)
    let vertex = pointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee

    // transform the vertex from anchor-local space into world space
    let vertex4 = vector_float4(vertex.x, vertex.y, vertex.z, 1)
    let world_vertex4 = simd_mul(modelMatrix, vertex4)
    let world_vector3 = simd_float3(x: world_vertex4.x, y: world_vertex4.y, z: world_vertex4.z)

    // project the world-space point into the captured camera image
    let pt = camera.projectPoint(world_vector3,
                                 orientation: .portrait,
                                 viewportSize: CGSize(width: CGFloat(size.height),
                                                      height: CGFloat(size.width)))

    // normalise to [0, 1] texture space; width and height are swapped because the
    // captured image is landscape while the viewport is portrait
    let v = 1.0 - Float(pt.x) / Float(size.height)
    let u = Float(pt.y) / Float(size.width)
    return vector_float2(u, v)
}
// construct your vertices, normals and faces from the source geometry directly,
// supply the computed texture coordinates (wrapped in an SCNGeometrySource) to
// create the new geometry, and then apply the texture
let scnGeometry = SCNGeometry(sources: [verticesSource, textureCoordinatesSource, normalsSource], elements: [facesSource])
// UIImage(pixelBuffer:) is a custom helper that converts the camera's CVPixelBuffer
let texture = UIImage(pixelBuffer: arFrame.capturedImage)
let imageMaterial = SCNMaterial()
imageMaterial.isDoubleSided = false
imageMaterial.diffuse.contents = texture
scnGeometry.materials = [imageMaterial]
let pcNode = SCNNode(geometry: scnGeometry)
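UIImage has no built-in pixelBuffer initializer; the code above assumes a helper along these lines. A minimal sketch using Core Image (not necessarily the author's implementation):

```swift
import CoreImage
import UIKit

extension UIImage {
    // Convert the camera's CVPixelBuffer into a UIImage via Core Image.
    convenience init?(pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        self.init(cgImage: cgImage)
    }
}
```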
pcNode, when added to the scene, will contain the mesh with the texture applied.
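The verticesSource, normalsSource and facesSource used above are not defined in the answer; one possible way to bridge ARKit's geometry buffers into SceneKit (a sketch, assuming ARKit's ARGeometrySource and ARGeometryElement layout, with hypothetical convenience initializers) is:

```swift
import ARKit
import SceneKit

extension SCNGeometrySource {
    // Wrap an ARGeometrySource's Metal buffer directly, without copying.
    convenience init(_ source: ARGeometrySource, semantic: SCNGeometrySource.Semantic) {
        self.init(buffer: source.buffer,
                  vertexFormat: source.format,
                  semantic: semantic,
                  vertexCount: source.count,
                  dataOffset: source.offset,
                  dataStride: source.stride)
    }
}

extension SCNGeometryElement {
    // Copy an ARGeometryElement's index buffer into a SceneKit element.
    convenience init(_ element: ARGeometryElement) {
        let data = Data(bytes: element.buffer.contents(),
                        count: element.buffer.length)
        self.init(data: data,
                  primitiveType: .triangles,
                  primitiveCount: element.count,
                  bytesPerIndex: element.bytesPerIndex)
    }
}
```

With these in place, verticesSource could be SCNGeometrySource(meshAnchor.geometry.vertices, semantic: .vertex), normalsSource could be SCNGeometrySource(meshAnchor.geometry.normals, semantic: .normal), facesSource could be SCNGeometryElement(meshAnchor.geometry.faces), and the computed coordinates can be wrapped with SCNGeometrySource(textureCoordinates:) after converting them to [CGPoint].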
The texture coordinate calculation comes from here.