ARKit randomly places models in the real world

I am experimenting with ARKit and trying to place some models around the user when the app starts, so that the user has to go and find them.

When the user has moved, say, 10 meters, I want to add some random models again. I thought I could do it like this:

// Hypothetical enclosing method; the original snippet omitted the signature.
func randomModelCoordinate() -> Coordinate {
    guard let cameraTransform = self.sceneView.session.currentFrame?.camera.transform else {
        return Coordinate(0, 0, 0)
    }
    let cameraCoordinates = MDLTransform(matrix: cameraTransform)

    let camX = CGFloat(cameraCoordinates.translation.x)
    let camY = CGFloat(cameraCoordinates.translation.y)
    let cameraPosition = CGPoint(x: camX, y: camY)
    let anchors = self.sceneView.hitTest(cameraPosition, types: [.featurePoint, .estimatedHorizontalPlane])

    if let hit = anchors.first {
        let hitTransform = SCNMatrix4(hit.worldTransform)
        let hitPosition = SCNVector3Make(hitTransform.m41, hitTransform.m42, hitTransform.m43)
        self.sceneView.session.add(anchor: ARAnchor(transform: hit.worldTransform))
        return Coordinate(hitPosition.x, hitPosition.y, hitPosition.z)
    }

    return Coordinate(0, 0, 0)
}

      

The problem is that sometimes it doesn't find any anchors, and then I don't know what to do. And when it does find an anchor, the model sometimes ends up behind me instead of in front of me. I don't understand why, because I never rotate the camera, so it shouldn't fail to find anchors in front of it.

Is there a better way to place random models in the real world?


1 answer


To do this, you need the session(_:didUpdate:) delegate method:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // At this point you can be sure the camera is properly oriented
    // in world coordinates, so use the delivered frame directly.
    let cameraTransform = frame.camera.transform
    let cameraPosition = SCNVector3(
        cameraTransform.columns.3.x,
        cameraTransform.columns.3.y,
        cameraTransform.columns.3.z
    )
    // cameraPosition now holds the camera's x, y, z world coordinates,
    // so you can calculate the distance between two such points.

    // Pick a random point (in normalized 0...1 coordinates) for the hit test.
    let randomPoint = CGPoint(
        x: CGFloat(arc4random()) / CGFloat(UInt32.max),
        y: CGFloat(arc4random()) / CGFloat(UInt32.max)
    )
    guard let testResult = frame.hitTest(randomPoint, types: .featurePoint).first else { return }

    // Extract the x, y, z translation from the 4x4 world-transform matrix.
    let objectPoint = SCNVector3(
        testResult.worldTransform.columns.3.x,
        testResult.worldTransform.columns.3.y,
        testResult.worldTransform.columns.3.z
    )
    // Do whatever you need with this object point.
}

      

This lets you position objects as the camera position updates. From the documentation for session(_:didUpdate:):

Implement this method if you provide your own display for the AR experience. The provided ARFrame contains the latest image captured from the device camera, which you can render as a scene background, as well as information about camera parameters and anchor transforms you can use for rendering virtual content on top of the camera image.

The really important thing here is that you choose an arbitrary point for the hitTest method, and that point will always be in front of the camera.

Remember that the CGPoint passed to hitTest uses a normalized 0 to 1.0 coordinate system:



A point in the normalized image coordinate space. (Point (0,0) represents the top-left corner of the image, and point (1,1) represents the bottom-right corner.)
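As an aside, if you ever want to hit-test a point you obtained in view (UIKit) coordinates rather than a random one, you first have to normalize it. A minimal sketch under that assumption (the function name is mine, not ARKit's); note that a plain division ignores any cropping ARSCNView applies when filling the screen, so for an exact mapping you would go through ARFrame.displayTransform(for:viewportSize:):

```swift
import Foundation

// Convert a point in view coordinates into the normalized 0...1
// space that ARFrame.hitTest(_:types:) expects. Hypothetical helper,
// and an approximation that ignores aspect-fill cropping.
func normalizedPoint(_ point: CGPoint, inViewOfSize size: CGSize) -> CGPoint {
    CGPoint(x: point.x / size.width,
            y: point.y / size.height)
}
```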

If you want to place an object every 10 meters, you can save the camera position (in session(_:didUpdate:)) and check whether the x and z coordinates have changed far enough to place a new object.
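The bookkeeping for that can be as simple as remembering the last placement position and comparing horizontal (x/z) distance on each frame. A minimal sketch under those assumptions; PlacementTracker and its names are illustrative, not ARKit API, and you would feed it the cameraPosition computed in session(_:didUpdate:):

```swift
// Tracks the last position where models were placed and reports
// when the camera has moved far enough horizontally to place more.
struct PlacementTracker {
    private var lastPlacement: (x: Float, y: Float, z: Float)?
    let threshold: Float

    init(threshold: Float = 10) {  // meters
        self.threshold = threshold
    }

    // Call once per frame with the camera's world position.
    mutating func shouldPlaceModels(atCameraX x: Float, y: Float, z: Float) -> Bool {
        guard let last = lastPlacement else {
            lastPlacement = (x, y, z)
            return true  // first call: place the initial models
        }
        // Compare horizontal displacement only; ignore height (y).
        let dx = x - last.x
        let dz = z - last.z
        if (dx * dx + dz * dz).squareRoot() >= threshold {
            lastPlacement = (x, y, z)
            return true
        }
        return false
    }
}
```

Returning true resets the reference point, so the next placement happens another 10 meters from wherever the last one occurred.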

Note:

I am assuming you are using a world-tracking session:

let configuration = ARWorldTrackingConfiguration()
session.run(configuration, options: [.resetTracking, .removeExistingAnchors])

      
