How to remove ARReferenceObject Anchor when the object is no longer in camera frame? - swift

I am using RealityKit + SwiftUI + ARSessionDelegate to render 3D content on top of an ARReferenceObject. I want to remove the 3D content once the camera pans away from the object and it is no longer in the frame.
Currently I render the 3D content when the object is detected, which is what I want. But I have multiple identical objects that I want to identify separately using the same ARReferenceObject. So in order to do this I need to remove the original anchoring.
This is my wrapper for SwiftUI:
struct ARViewWrapper: UIViewRepresentable {
    @ObservedObject var arManager: ARManager

    // create an alias for our wrapper
    typealias UIViewType = ARView

    // delegate for the view representable
    func makeCoordinator() -> Coordinator {
        return Coordinator(arManager: self.arManager)
    }

    func makeUIView(context: Context) -> ARView {
        // create the ARView
        let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
        // assign the delegate
        arView.session.delegate = context.coordinator
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        print("Updating View")
        // create an anchor using a reference object and add it to the ARView
        let target = AnchorEntity(.object(group: "AR Resources", name: "bj"))
        target.name = "obj_anchor"
        // add the anchor to the AR world
        if uiView.scene.anchors.count == 0 {
            uiView.scene.anchors.append(target)
        } else {
            uiView.scene.anchors[0] = target
        }
        // add plane and title to the anchor
        addARObjs(anchor: target, arObj: arManager.currARObj)
    }
}
This is my Delegate:
class Coordinator: NSObject, ARSessionDelegate {
    @ObservedObject var arManager: ARManager

    init(arManager: ARManager) {
        self.arManager = arManager
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) { }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) { }

    func session(_ session: ARSession, didUpdate frame: ARFrame) { }
}

SceneKit
You can do it in SceneKit. All you need is the isNode(_:insideFrustumOf:) instance method, which returns a Boolean value indicating whether a node might be visible from a specified point of view. This method is also available when using ARKit (as part of SceneKit).
func isNode(_ node: SCNNode, insideFrustumOf pointOfView: SCNNode) -> Bool
Sample code:
var allYourNodes = [SCNNode]()
allYourNodes.append(node001)
allYourNodes.append(node002)

guard let pointOfView = arSCNView.pointOfView else { return }

for yourNode in allYourNodes {
    if !arSCNView.isNode(yourNode, insideFrustumOf: pointOfView) {
        arSCNView.session.remove(anchor: yourARAnchor)
    }
}
However, I haven't found a similar method in RealityKit 2.0. Hope it'll be added by Cupertino engineers in the near future.
RealityKit
Here's what we have in RealityKit 2.0 at the moment:
Apple's documentation says: During an AR session, RealityKit automatically uses the device's camera to define the perspective from which to render the scene. When rendering a scene outside of an AR session – with the view's cameraMode property set to ARView.CameraMode.nonAR – RealityKit uses a PerspectiveCamera instead. You can add a perspective camera anywhere in your scene to control the point of view. If you don't explicitly provide one, RealityKit creates a default camera for you.
So, the only available parameters of a PerspectiveCameraComponent at the moment are:
init(near: Float, far: Float, fieldOfViewInDegrees: Float)
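In the meantime, a rough substitute for frustum testing in RealityKit is to project an entity's world position into screen space with ARView's project(_:) method and check whether the result lands inside the view's bounds. Here's a minimal sketch of that idea (it only tests the entity's pivot point, not its full bounding box, so treat it as an approximation):

import RealityKit
import UIKit

extension ARView {
    // Rough visibility test: project the entity's world position into
    // screen space and check whether it falls inside the view's bounds.
    func isEntityProbablyVisible(_ entity: Entity) -> Bool {
        guard let screenPoint = self.project(entity.position(relativeTo: nil)) else {
            return false   // projection could not be computed
        }
        return self.bounds.contains(screenPoint)
    }
}

Entities that fail this test could then have their AnchorEntity removed from arView.scene.anchors, freeing you to re-anchor content when the object is detected again.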

Related

how to fix Cannot convert return expression of type 'UIView' to return type 'ARView' issue in swift?

class MyCustomUIViewController: UIViewController {

    let coachingOverly = ARCoachingOverlayView()

    override func viewDidLoad() {
        super.viewDidLoad()
        let arView = ARView(frame: .zero)
        let session = arView.session
        // ARCoachingOverlayView displays standardized onboarding instructions
        // that direct users towards a specific goal.
        // Configure the stored property here; re-declaring it locally would
        // shadow it and leave the one used in viewWillAppear unconfigured.
        // autoresizingMask is an integer bit mask that determines how the receiver resizes itself.
        coachingOverly.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverly.goal = .anyPlane
        coachingOverly.session = session
        arView.addSubview(coachingOverly)
        // Load the "Opject" scene from the "dumu" Reality file
        let anchor = try! Dumu.loadScene()
        // Add the scene's anchor to the ARView
        arView.scene.anchors.append(anchor)
        self.view.addSubview(arView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        coachingOverly.setActive(true, animated: true)
    }
}
From your picture of the ARViewContainer I see the following. In
func makeUIView(context: Context) -> ARView // you specify a return type of ARView
but you then instantiate and return
... = MyCustomUIViewController() // a view controller, not an ARView
You have to return an ARView, which is a subclass of UIView; MyCustomUIViewController is a UIViewController, so a cast such as
let viewController = MyCustomUIViewController() as ARView
will not compile. Either expose and return the ARView that MyCustomUIViewController creates, or wrap the controller in a UIViewControllerRepresentable instead of a UIViewRepresentable. For a more detailed explanation I would need at least the definition of MyCustomUIViewController.
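If you keep MyCustomUIViewController as a view controller, a minimal sketch of the wrapper approach looks like this (using UIViewControllerRepresentable, so no ARView needs to be returned at all):

import SwiftUI

struct ARViewContainer: UIViewControllerRepresentable {
    // Return the view controller itself instead of trying to return an ARView
    func makeUIViewController(context: Context) -> MyCustomUIViewController {
        return MyCustomUIViewController()
    }
    func updateUIViewController(_ uiViewController: MyCustomUIViewController, context: Context) { }
}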

Remove the focus point when the camera sees ARImageAnchor and reveal the focus point when the camera does not see ARImageAnchor

I want to show a focus point entity while the camera does not see the ARImageAnchor, remove it once the camera sees the ARImageAnchor, and show it again when the camera loses the anchor. I used arView.session.delegate, but the delegate method may only be called once; I'm not sure. How can I do this? Have a good day!
final class CameraViewController: UIViewController {

    var focusEntity: FocusEntity!   // (FocusEntity: Entity, HasAnchoring)

    override func viewDidLoad() {
        super.viewDidLoad()
        // ...
        focusEntity = FocusEntity(on: arView, style: .classic(color: .systemGray4))
        arView.session.delegate = self
    }
}

extension CameraViewController: ARSessionDelegate {

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let imageAnchor = anchor as? ARImageAnchor {
                focusEntity.destroy()
                focusEntity = nil
                // ... Obtain entity for the image anchor
            }
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ... ???
    }
}
I would create an AnchorEntity, that uses the ARImageAnchor as its anchor.
Then, subscribe to the AnchoredStateChanged event, turning the FocusEntity on and off depending on the anchored state there.
I wrote about subscribing to events like that here:
https://maxxfrazer.medium.com/realitykit-events-97964fa5b5c7
If you want the focus entity to re-appear by the way, you would probably want to set isEnabled on the entity, rather than destroying and re-creating it each time.
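A minimal sketch of that approach, assuming a focusEntity like the one above and an imageAnchor delivered by the session delegate:

import RealityKit
import Combine

var anchoredStateSubscription: Cancellable?

// Anchor RealityKit content to the detected ARImageAnchor
let imageAnchorEntity = AnchorEntity(anchor: imageAnchor)
arView.scene.addAnchor(imageAnchorEntity)

// Toggle the focus entity whenever the anchored state flips
anchoredStateSubscription = arView.scene.subscribe(
    to: SceneEvents.AnchoredStateChanged.self,
    on: imageAnchorEntity
) { event in
    // Hide the focus entity while the image anchor is tracked, show it otherwise
    focusEntity.isEnabled = !event.isAnchored
}

Keep a reference to the returned Cancellable; otherwise the subscription is deallocated immediately.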

RealityKit Custom ARAnchor not syncing across devices

I'm using a custom subclass of Apple's ARAnchor in a config.isCollaborationEnabled = true environment.
When I call the following on DeviceA:
let boardAnchor = BoardAnchor(transform: last.worldTransform, size: CGSize(width: 10, height: 11))
arView.session.add(anchor: boardAnchor)
I can see the delegate func session(_ session: ARSession, didAdd anchors: [ARAnchor]) get called with the BoardAnchor on DeviceA.
However, DeviceB does not receive such a delegate call.
If however I add a non-subclassed ARAnchor on DeviceA, I can see the delegate called on DeviceB.
let namedAnchor = ARAnchor(name: "test", transform: last.worldTransform)
arView.session.add(anchor: namedAnchor)
So I'm really confused as to why the subclass doesn't work...any ideas?
class BoardAnchor: ARAnchor {
    let size: CGSize

    init(transform: float4x4, size: CGSize) {
        self.size = size
        super.init(name: "Board", transform: transform)
    }

    override class var supportsSecureCoding: Bool {
        return true
    }

    required init?(coder aDecoder: NSCoder) {
        self.size = aDecoder.decodeCGSize(forKey: "size")
        super.init(coder: aDecoder)
    }

    // This is guaranteed to be called with something of the same class
    required init(anchor: ARAnchor) {
        let other = anchor as! BoardAnchor
        self.size = other.size
        super.init(anchor: other)
    }

    override func encode(with aCoder: NSCoder) {
        super.encode(with: aCoder)
        aCoder.encode(size, forKey: "size")
    }
}
Delegate
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        DLog("didAdd anchor: \(anchor)")
        if anchor.name == "test" {
            // Non-subclassed ARAnchor: OK
        }
        if let board = anchor as? BoardAnchor {
            // Never called
        }
    }
}
I believe this quote from the Building Collaborative AR Experiences session (WWDC 2019) might explain the issue you're facing:

Last, only the user created ARAnchors are shared. That excludes all the subclass ARAnchors, including ARImageAnchor, ARPlaneAnchor, and ARObjectAnchor. That also excludes the user subclass ARAnchor which were used to attach user data within Map Save and Load.

The session goes on to indicate that, in lieu of subclassing ARAnchor, you can define your own component, conform your Entity to a protocol exposing that component, and let the component (which synchronizes across the session) carry the data instead, thereby avoiding the ARAnchor subclass entirely. However, if you are not using RealityKit and are instead using SceneKit or SpriteKit, you will likely need a different approach, such as not subclassing ARAnchor and moving the logic from BoardAnchor somewhere else.
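A minimal sketch of that component-based idea in RealityKit (BoardComponent is a hypothetical name; my understanding is that custom components must be Codable and registered before they can synchronize across a collaborative session):

import RealityKit

// A custom component carrying the data BoardAnchor used to hold.
// Codable conformance allows RealityKit to synchronize it across peers.
struct BoardComponent: Component, Codable {
    var width: Float = 10
    var height: Float = 11
}

// Register the component once, e.g. at app launch, before using it.
BoardComponent.registerComponent()

// Attach it to any entity in the synchronized scene.
let boardEntity = ModelEntity()
boardEntity.components[BoardComponent.self] = BoardComponent()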
I think that, beyond simply setting config.isCollaborationEnabled = true, you need to manually handle the data being sent out, the connectivity, and the other peers.
In addition, there is a different delegate method for receiving the collaboration data, if I remember correctly.
Please note the following:
https://developer.apple.com/documentation/arkit/creating_a_collaborative_session
There is a sample project there that will probably answer most of your questions.
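For reference, here is a sketch of the delegate plumbing that sample implements (multipeerSession and its sendToAllPeers(_:) helper come from the sample project, not from ARKit itself):

import ARKit
import MultipeerConnectivity

// ARKit emits collaboration data; you are responsible for shipping it to peers.
func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
    guard let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                          requiringSecureCoding: true) else { return }
    multipeerSession.sendToAllPeers(encoded)
}

// When data arrives from a peer, feed it back into the local session.
func receivedData(_ data: Data, from peer: MCPeerID) {
    if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARSession.CollaborationData.self, from: data) {
        arView.session.update(with: collaborationData)
    }
}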

How do I use a RealityKit ARView with ARImageTrackingConfiguration?

Can I see an example using a RealityKit ARView with ARImageTrackingConfiguration including the ARSessionDelegate delegate methods?
Here is an example of a RealityKit ARView using ARImageTrackingConfiguration and the ARSessionDelegate delegate methods. I didn't see a complete example of exactly this on Stack Overflow so thought I would ask/answer it myself.
import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // There must be a set of reference images in the project's assets
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                                     bundle: nil)
        else { fatalError("Missing expected asset catalog resources.") }

        // Set the ARView delegate so we can define delegate methods in this controller
        arView.session.delegate = self

        // Forgo automatic configuration to do it manually instead
        arView.automaticallyConfigureSession = false

        // Show statistics if desired
        arView.debugOptions = [.showStatistics]

        // Disable any unneeded rendering options
        arView.renderOptions = [.disableCameraGrain, .disableHDR, .disableMotionBlur,
                                .disableDepthOfField, .disableFaceOcclusions,
                                .disablePersonOcclusion, .disableGroundingShadows,
                                .disableAREnvironmentLighting]

        // Instantiate the configuration object
        let configuration = ARImageTrackingConfiguration()

        // Both trackingImages and maximumNumberOfTrackedImages are required
        // This example assumes there is only one reference image named "target"
        configuration.maximumNumberOfTrackedImages = 1
        configuration.trackingImages = referenceImages

        // Note that this config option differs from world tracking, where it is
        // configuration.detectionImages

        // Run the ARView session with the defined configuration object
        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // This example assumes only one reference image of interest
        // A for-in loop could work for more targets

        // Ensure the first anchor in the list of added anchors can be downcast to an ARImageAnchor
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // If the added anchor is named "target", do something with it
        if let imageName = imageAnchor.name, imageName == "target" {
            // An example of something to do: attach a ball marker to the added reference image.
            // Create an AnchorEntity, create a virtual object, add the object to the AnchorEntity
            let refImageAnchor = AnchorEntity(anchor: imageAnchor)
            let refImageMarker = generateBallMarker(radius: 0.02, color: .systemPink)
            refImageMarker.position.y = 0.04
            refImageAnchor.addChild(refImageMarker)

            // Add the new AnchorEntity and its children to the ARView's scene's anchor collection
            arView.scene.addAnchor(refImageAnchor)
            // There is now RealityKit content anchored to the target reference image!
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // Assuming only one reference image. A for-in loop could work for more targets
        if let imageName = imageAnchor.name, imageName == "target" {
            // If anything needs to be done as the ref image anchor updates frame-to-frame, do it here
            // E.g., to check if the reference image is still being tracked:
            // (https://developer.apple.com/documentation/arkit/artrackable/2928210-istracked)
            if imageAnchor.isTracked {
                print("\(imageName) is tracked and has a valid transform")
            } else {
                print("The anchor for \(imageName) is not guaranteed to match the movement of its corresponding real-world feature, even if it remains in the visible scene.")
            }
        }
    }

    // Convenience method to create colored spheres
    func generateBallMarker(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius),
                               materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}

iPad Swift Playgrounds won’t run ARKit code/crashes

I’ve been experimenting with ARKit on Swift Playgrounds. I’ve written the starter code, but when I run it nothing happens. Instead of evaluating the code, it displays the pop-up that shows any issues in the code.
I know the code I’m using works, because I’ve used the same code on an iPad running an older version of Swift Playgrounds and the code works perfectly. It seems to be a problem with either Swift Playgrounds 3 or Swift 5.
Here’s the interesting part. When I remove the line of code that runs the ARWorldTrackingConfiguration initializer, along with the code that makes the view controller the delegate of the session and scene, the code runs just fine. When I put it back, it throws the same error again. I don’t know what’s going wrong.
I’m running Swift Playgrounds 3.0 on an iPad (6th generation). The playground uses ARKit, UIKit, SceneKit, and PlaygroundSupport.
Lastly, here’s some code.
// Code inside modules can be shared between pages and other source files.
import ARKit
import SceneKit
import UIKit

extension ARSCNView {
    public func setup() {
        antialiasingMode = .multisampling4X
        automaticallyUpdatesLighting = false
        preferredFramesPerSecond = 60
        contentScaleFactor = 1.0
        if let camera = pointOfView?.camera {
            camera.wantsHDR = true
            camera.wantsExposureAdaptation = true
            camera.exposureOffset = -1
            camera.minimumExposure = -1
            camera.maximumExposure = 3
        }
    }
}

public class vc: UIViewController, ARSessionDelegate, ARSCNViewDelegate {
    var arscn: ARSCNView!
    var scene: SCNScene!

    public override func loadView() {
        arscn = ARSCNView(frame: CGRect(x: 0, y: 0, width: 768, height: 1024))
        arscn.delegate = self
        arscn.setup()
        scene = SCNScene()
        arscn.scene = scene
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal
        arscn.session.delegate = self
        self.view = arscn
        arscn.session.run(config)
    }

    public func session(_ session: ARSession, didFailWithError error: Error) {
        // Present an error message to the user
    }

    public func sessionWasInterrupted(_ session: ARSession) {
        // Inform the user that the session has been interrupted, for example, by presenting an overlay
    }

    public func sessionInterruptionEnded(_ session: ARSession) {
        // Reset tracking and/or remove existing anchors if consistent tracking is required
    }
}
Lastly, please note that I’m presenting the live view in the main playground page and putting the class in the shared code.
I’ve figured out a way to make this work. All I had to do was assign the view controller to a variable and then present the variable. I’m not exactly sure why this works, I just know it does.
import ARKit
import SceneKit
import UIKit
import PlaygroundSupport

public class LiveVC: UIViewController, ARSessionDelegate, ARSCNViewDelegate {

    let scene = SCNScene()
    public var arscn = ARSCNView(frame: CGRect(x: 0, y: 0, width: 640, height: 360))

    override public func viewDidLoad() {
        super.viewDidLoad()
        arscn.delegate = self
        arscn.session.delegate = self
        arscn.scene = scene
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arscn.session.run(config)
        view.addSubview(arscn)
    }

    public func session(_ session: ARSession, didFailWithError error: Error) {}
    public func sessionWasInterrupted(_ session: ARSession) {}
    public func sessionInterruptionEnded(_ session: ARSession) {}
}

var vc = LiveVC()
PlaygroundPage.current.liveView = vc
PlaygroundPage.current.needsIndefiniteExecution = true
Use UpperCamelCase for class names and add two lines of code at the bottom.
This code is suitable for macOS Xcode Playground and iPad Swift Playgrounds:
import ARKit
import PlaygroundSupport

class LiveVC: UIViewController, ARSessionDelegate, ARSCNViewDelegate {

    let scene = SCNScene()
    var arscn = ARSCNView(frame: CGRect(x: 0, y: 0, width: 640, height: 360))

    override func viewDidLoad() {
        super.viewDidLoad()
        arscn.delegate = self
        arscn.session.delegate = self
        arscn.scene = scene
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        arscn.session.run(config)
    }

    func session(_ session: ARSession, didFailWithError error: Error) {}
    func sessionWasInterrupted(_ session: ARSession) {}
    func sessionInterruptionEnded(_ session: ARSession) {}
}

PlaygroundPage.current.liveView = LiveVC().arscn
PlaygroundPage.current.needsIndefiniteExecution = true
P.S. A tip for Playgrounds on macOS (although it doesn't make much sense when using the ARKit module):
To turn on Live View in Xcode Playground 11.0 and higher use the following shortcut:
Command+Option+Return