RealityKit Custom ARAnchor not syncing across devices - swift

I'm using Apple's custom ARAnchor in a config.isCollaborationEnabled = true environment.
When I call the following on DeviceA:
let boardAnchor = BoardAnchor(transform: last.worldTransform, size: CGSize(width: 10, height: 11))
arView.session.add(anchor: boardAnchor)
I can see the delegate func session(_ session: ARSession, didAdd anchors: [ARAnchor]) get called with the BoardAnchor on DeviceA.
However, DeviceB does not receive such a delegate call.
If however I add a non-subclassed ARAnchor on DeviceA, I can see the delegate called on DeviceB.
let namedAnchor = ARAnchor(name: "test", transform: last.worldTransform)
arView.session.add(anchor: namedAnchor)
So I'm really confused as to why the subclass doesn't sync. Any ideas?
class BoardAnchor: ARAnchor {
    let size: CGSize

    init(transform: float4x4, size: CGSize) {
        self.size = size
        super.init(name: "Board", transform: transform)
    }

    override class var supportsSecureCoding: Bool {
        return true
    }

    required init?(coder aDecoder: NSCoder) {
        self.size = aDecoder.decodeCGSize(forKey: "size")
        super.init(coder: aDecoder)
    }

    // This is guaranteed to be called with an instance of the same class
    required init(anchor: ARAnchor) {
        let other = anchor as! BoardAnchor
        self.size = other.size
        super.init(anchor: other)
    }

    override func encode(with aCoder: NSCoder) {
        super.encode(with: aCoder)
        aCoder.encode(size, forKey: "size")
    }
}
Delegate
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        DLog("didAdd anchor: \(anchor)")
        if anchor.name == "test" {
            // Non-subclassed ARAnchor: OK
        }
        if let board = anchor as? BoardAnchor {
            // Never called
        }
    }
}

I believe this line from the Building Collaborative AR Experiences session (WWDC 2019) might explain the issue you're facing:
Last, only the user created ARAnchors are shared. That excludes all
the subclass ARAnchors, including ARImageAnchor, ARPlaneAnchor, and
ARObjectAnchor. That also excludes the user subclass ARAnchor which
were used to attach user data within Map Save and Load.
The session goes on to suggest that, in lieu of subclassing ARAnchor, you can define your own component type, conform it to the Component protocol, and attach it to your entity. Because the component synchronizes across the collaborative session, it can carry the same user data without a subclassed ARAnchor. However, if you are not using RealityKit and are instead using SceneKit or SpriteKit, you will likely need a different approach, such as avoiding the ARAnchor subclass entirely and moving the logic from BoardAnchor somewhere else.
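As a rough sketch of that component-based approach (the BoardComponent name and its size property are hypothetical stand-ins for the data on BoardAnchor, not anything from Apple's sample):

```swift
import RealityKit

// Hypothetical replacement for the data stored on BoardAnchor.
// Codable components can be synchronized across a collaborative session.
struct BoardComponent: Component, Codable {
    var size: SIMD2<Float> = .zero
}

// Register the component type once, e.g. at app launch.
BoardComponent.registerComponent()

// Attach the data to the entity you anchor, instead of subclassing ARAnchor.
let boardEntity = ModelEntity()
boardEntity.components[BoardComponent.self] = BoardComponent(size: [10, 11])
```

The plain ARAnchor (or AnchorEntity) still syncs as before; only the user data rides along in the component.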

I think that beyond simply setting config.isCollaborationEnabled = true, you need to handle the outgoing collaboration data and the connectivity to other peers yourself.
In addition, if I remember correctly, there is a separate delegate method that delivers the collaboration data.
Please note the following:
https://developer.apple.com/documentation/arkit/creating_a_collaborative_session
There is a sample project there that will probably answer most of your questions.
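A minimal sketch of that wiring, following the pattern in Apple's collaborative-session sample (the sendToAllPeers(_:) helper is hypothetical and stands in for your MultipeerConnectivity layer):

```swift
import ARKit

extension ViewController: ARSessionDelegate {
    // ARKit calls this when there is collaboration data to share with peers.
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        guard let encoded = try? NSKeyedArchiver.archivedData(
            withRootObject: data, requiringSecureCoding: true) else { return }
        sendToAllPeers(encoded) // hypothetical: your networking layer
    }

    // When data arrives from a peer, hand it back to the session.
    func receivedData(_ data: Data) {
        if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: data) {
            arView.session.update(with: collaborationData)
        }
    }
}
```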

Related

Remove the focus point when the camera sees ARImageAnchor and reveal the focus point when the camera does not see ARImageAnchor

I'm trying to show a focus-point entity while the camera does not see an ARImageAnchor, remove it once the camera sees the ARImageAnchor, and show it again when the anchor leaves view. I used arView.session.delegate, but the delegate method seems to be called only once. How can I make this work? Have a good day!
final class CameraViewController: UIViewController {
    var focusEntity: FocusEntity! // (FocusEntity: Entity, HasAnchoring)

    override func viewDidLoad() {
        super.viewDidLoad()
        // ...
        focusEntity = FocusEntity(on: arView, style: .classic(color: .systemGray4))
        arView.session.delegate = self
    }
}

extension CameraViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            if let imageAnchor = anchor as? ARImageAnchor {
                focusEntity.destroy()
                focusEntity = nil
                // ... Obtain entity for the image anchor
            }
        }
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // ... ???
    }
}
I would create an AnchorEntity that uses the ARImageAnchor as its anchor.
Then, subscribe to the AnchoredStateChanged event, turning the FocusEntity on and off depending on the anchored state there.
I wrote about subscribing to events like that here:
https://maxxfrazer.medium.com/realitykit-events-97964fa5b5c7
By the way, if you want the focus entity to re-appear, you would probably want to toggle isEnabled on the entity rather than destroying and re-creating it each time.
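A rough sketch of that subscription (this assumes the focusEntity and arView from the question; the anchoredStateSubscription property name is made up, and you must keep the returned Cancellable alive for the subscription to fire):

```swift
import ARKit
import Combine
import RealityKit

var anchoredStateSubscription: Cancellable?

func watchImageAnchor(_ imageAnchor: ARImageAnchor, in arView: ARView, focusEntity: Entity) {
    // Anchor an entity to the detected image.
    let imageEntity = AnchorEntity(anchor: imageAnchor)
    arView.scene.addAnchor(imageEntity)

    // Toggle the focus entity whenever the image entity's anchored state changes.
    anchoredStateSubscription = arView.scene.subscribe(
        to: SceneEvents.AnchoredStateChanged.self, on: imageEntity
    ) { event in
        // Hide the focus entity while the image is anchored; show it otherwise.
        focusEntity.isEnabled = !event.isAnchored
    }
}
```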

How to remove ARReferenceObject Anchor when the object is no longer in camera frame?

I am using RealityKit + SwiftUI + ARSessionDelegate to render 3D content on top of an ARReferenceObject. I want to remove the 3D content once the camera pans away from the object and it is no longer in the frame.
Currently I render the 3D content when the object is detected, which is what I want. But I have multiple identical objects that I want to identify separately using the same ARReferenceObject. So in order to do this I need to remove the original anchoring.
This is my wrapper for SwiftUI:
struct ARViewWrapper: UIViewRepresentable {
    @ObservedObject var arManager: ARManager

    // Create an alias for our wrapper
    typealias UIViewType = ARView

    // Delegate for the view representable
    func makeCoordinator() -> Coordinator {
        return Coordinator(arManager: self.arManager)
    }

    func makeUIView(context: Context) -> ARView {
        // Create the ARView
        let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
        // Assign the delegate
        arView.session.delegate = context.coordinator
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        print("Updating View")
        // Create an anchor targeting a reference object and add it to the ARView
        let target = AnchorEntity(.object(group: "AR Resources", name: "bj"))
        target.name = "obj_anchor"
        // Add the anchor to the AR world
        if uiView.scene.anchors.count == 0 {
            uiView.scene.anchors.append(target)
        } else {
            uiView.scene.anchors[0] = target
        }
        // Add plane and title to the anchor
        addARObjs(anchor: target, arObj: arManager.currARObj)
    }
}
This is my Delegate:
class Coordinator: NSObject, ARSessionDelegate {
    @ObservedObject var arManager: ARManager

    init(arManager: ARManager) {
        self.arManager = arManager
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        return
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        return
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        return
    }
}
SceneKit
You can do it in SceneKit. All you need is the isNode(_:insideFrustumOf:) instance method, which returns a Boolean value indicating whether a node might be visible from a specified point of view. This method is also available in ARKit apps (as part of SceneKit).
func isNode(_ node: SCNNode, insideFrustumOf pointOfView: SCNNode) -> Bool
Sample code:
var allYourNodes = [SCNNode]()
allYourNodes.append(node001)
allYourNodes.append(node002)

guard let pointOfView = arSCNView.pointOfView else { return }

for yourNode in allYourNodes {
    if !arSCNView.isNode(yourNode, insideFrustumOf: pointOfView) {
        arSCNView.session.remove(anchor: yourARAnchor)
    }
}
However, I haven't found a similar method in RealityKit 2.0. Hope it'll be added by Cupertino engineers in the near future.
RealityKit
Here's what we have in RealityKit 2.0 at the moment:
Apple's documentation says: "During an AR session, RealityKit automatically uses the device's camera to define the perspective from which to render the scene. When rendering a scene outside of an AR session – with the view's cameraMode property set to ARView.CameraMode.nonAR – RealityKit uses a PerspectiveCamera instead. You can add a perspective camera anywhere in your scene to control the point of view. If you don't explicitly provide one, RealityKit creates a default camera for you."
So, the only available parameters of a PerspectiveCameraComponent at the moment are:
init(near: Float, far: Float, fieldOfViewInDegrees: Float)
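For completeness, a minimal sketch of placing such a camera in a nonAR view (the entity names and the camera position are arbitrary choices, not anything prescribed by the API):

```swift
import RealityKit

// A non-AR view renders through a PerspectiveCamera instead of the device camera.
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)

// Create the camera and set its field of view via its component.
let camera = PerspectiveCamera()
camera.camera.fieldOfViewInDegrees = 60

// Place the camera in the scene; without this, RealityKit supplies a default camera.
let cameraAnchor = AnchorEntity(world: [0, 0.5, 2])
cameraAnchor.addChild(camera)
arView.scene.addAnchor(cameraAnchor)
```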

How do I use a RealityKit ARView with ARImageTrackingConfiguration?

Can I see an example using a RealityKit ARView with ARImageTrackingConfiguration including the ARSessionDelegate delegate methods?
Here is an example of a RealityKit ARView using ARImageTrackingConfiguration and the ARSessionDelegate delegate methods. I didn't see a complete example of exactly this on Stack Overflow so thought I would ask/answer it myself.
import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // There must be a set of reference images in the project's assets
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        // Set the ARView's session delegate so we can define delegate methods in this controller
        arView.session.delegate = self
        // Forgo automatic configuration to do it manually instead
        arView.automaticallyConfigureSession = false
        // Show statistics if desired
        arView.debugOptions = [.showStatistics]
        // Disable any unneeded rendering options
        arView.renderOptions = [.disableCameraGrain, .disableHDR, .disableMotionBlur, .disableDepthOfField, .disableFaceOcclusions, .disablePersonOcclusion, .disableGroundingShadows, .disableAREnvironmentLighting]
        // Instantiate the configuration object
        let configuration = ARImageTrackingConfiguration()
        // Both trackingImages and maximumNumberOfTrackedImages are required
        // This example assumes there is only one reference image, named "target"
        configuration.maximumNumberOfTrackedImages = 1
        configuration.trackingImages = referenceImages
        // Note that this config option is different from world tracking, where it is
        // configuration.detectionImages
        // Run the ARView's session with the defined configuration object
        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // This example assumes only one reference image of interest
        // A for-in loop could work for more targets
        // Ensure the first anchor in the list of added anchors can be downcast to an ARImageAnchor
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }
        // If the added anchor is named "target", do something with it
        if let imageName = imageAnchor.name, imageName == "target" {
            // An example of something to do: attach a ball marker to the added reference image.
            // Create an AnchorEntity, create a virtual object, and add the object to the AnchorEntity
            let refImageAnchor = AnchorEntity(anchor: imageAnchor)
            let refImageMarker = generateBallMarker(radius: 0.02, color: .systemPink)
            refImageMarker.position.y = 0.04
            refImageAnchor.addChild(refImageMarker)
            // Add the new AnchorEntity and its children to the ARView's scene's anchor collection
            arView.scene.addAnchor(refImageAnchor)
            // There is now RealityKit content anchored to the target reference image!
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }
        // Assuming only one reference image. A for-in loop could work for more targets
        if let imageName = imageAnchor.name, imageName == "target" {
            // If anything needs to be done as the reference image anchor is updated frame-to-frame, do it here
            // E.g., to check whether the reference image is still being tracked:
            // (https://developer.apple.com/documentation/arkit/artrackable/2928210-istracked)
            if imageAnchor.isTracked {
                print("\(imageName) is tracked and has a valid transform")
            } else {
                print("The anchor for \(imageName) is not guaranteed to match the movement of its corresponding real-world feature, even if it remains in the visible scene.")
            }
        }
    }

    // Convenience method to create colored spheres
    func generateBallMarker(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius), materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}

NSButton turns gray when pushed

I have an NSButton with an image. When pushed, the whole cell turns gray. How can I prevent this?
There are several posts about this topic. But most of them are like 10 years old. The most recent one was here: NSButton background transparent after getting focus
According to this, I tried with this code:
class overviewImageButton: NSButton {
    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    convenience init(appearance: NSAppearance) {
        self.init(appearance: appearance)
        self.appearance = NSAppearance(named: NSAppearanceNameAqua)
    }

    override func draw(_ dirtyRect: NSRect) {
        self.image = NSImage(named: "buttonImage.png")
        super.draw(dirtyRect)
        NotificationCenter.default.addObserver(forName: windowChanged, object: nil, queue: nil) { notification in
            self.image = NSImage(named: "buttonImage_highlighted.png")
        }
    }
}
But it doesn't work. The button cell still turns gray when pushed. Thanks for any help!
A lot of this was already said by both Willeke and I'L'I, so credit goes to them.
What Willeke said:
Never do anything in draw() except drawing. This method can get called as often as the screen refreshes, so currently you are re-adding yourself as a NotificationCenter observer extremely often.
Here is what you could do:
Write a setup() method and call that from each initialiser. Any one of them is called once for a button instance.
required init?(coder aDecoder: NSCoder) {
    super.init(coder: aDecoder)
    self.setupButton()
}

override init(frame frameRect: NSRect) {
    super.init(frame: frameRect)
    self.setupButton()
}

private func setupButton() {
    self.image = NSImage(named: "buttonImage.png")
    NotificationCenter.default.addObserver(forName: .windowChanged, object: nil, queue: nil) { notification in
        self.image = NSImage(named: "buttonImage_highlighted.png")
    }
}
You do not need to add the init(frame:) initialiser here; for storyboards, init(coder:) is sufficient. I added it anyway because you might want to initialise the button programmatically, which usually goes through init(frame:). If you add convenience initialisers, make sure they call the setup method as well.
What I'L'I said:
To the important stuff:
What I did to suppress the grey background on mouseDown was to simply call isHighlighted = false before calling super.draw(rect).
override func draw(_ dirtyRect: NSRect) {
    self.isHighlighted = false
    super.draw(dirtyRect)
}
Bonus:
I see you somewhere defined a Notification.Name. You can define all of them in a project global extension to Notification.Name like this:
extension Notification.Name {
    static let windowChanged = Notification.Name(rawValue: "WindowChangedNotification")
}
After that you can use them everywhere like system notification names.
NSButton state appearances are generally controlled by highlightsBy and showsStateBy. These properties change what happens within the NSButtonCell, which I think is what you're referring to.
↳ https://developer.apple.com/documentation/appkit/nsbuttoncell
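As a rough sketch of that cell-level approach (assuming a button outlet like the one in the question), clearing those style masks might look like this:

```swift
import AppKit

// Prevent the cell from changing its appearance on mouse-down.
// highlightsBy and showsStateBy are NSCell.StyleMask option sets.
if let cell = button.cell as? NSButtonCell {
    cell.highlightsBy = []  // no highlight effect while pressed
    cell.showsStateBy = []  // no appearance change based on state
}
```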

Add SKReferenceNode/SKScene to another SKScene in SpriteKit

I would like to add a SKScene to my main GameScene. SKReferenceNode seems to be a good solution.
I have :
- GameScene.sks (main scene)
- Countdown.sks (scene to add to GameScene)
- Countdown.swift (custom class — how do I init it? As an SKScene? SKReferenceNode? SKNode?)
I don't know how to add programmatically my countdown using my class Countdown.
I tried:
let path = Bundle.main.path(forResource: "Countdown", ofType: "sks")
let cd = SKReferenceNode(url: NSURL(fileURLWithPath: path!) as URL) as! Countdown
cd.name = "countdown"
self.addChild(cd)
But I have the following error :
Could not cast value of type 'SKReferenceNode' (0x10d97ad88) to 'LYT.Countdown' (0x10a5709d0
I also tried something more simple like:
let cd = Countdown(scene: self)
self.addChild(cd)
But I don't know how to init the class using the Countdown.sks file.
I know I also have the possibility to create a SKNode class, and init it 100% programmatically, but it really important for me to use the associated .sks file in order to use the Xcode scene editor.
This is what I do. I don't know if it's the best way, but it works:
I have two files, Dragon.swift and Dragon.sks.
I've added a "main" node, DragonNode, with the other nodes as its children.
DragonNode is a custom class, so set it as the custom class in the .sks file.
DragonNode itself is a normal SKSpriteNode:
class DragonNode: SKSpriteNode, Fly, Fire {
    var head: SKSpriteNode!
    var body: SKSpriteNode!
    var shadow: SKSpriteNode!
    var dragonVelocity: CGFloat = 250

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        // Example: other nodes from the .sks file
        body = self.childNode(withName: "Body") as! SKSpriteNode
        head = body.childNode(withName: "Head") as! SKSpriteNode
        shadow = self.childNode(withName: "Shadow") as! SKSpriteNode
        shadow.name = "shadow"
    }

    // Dragon funcs
    func fireAction() {}
    func flyAction() {}
}
Inside the scene, add a SKReferenceNode:
In the SKScene code:
let dragonReference = self.childNode(withName: "DragonReference") as! SKReferenceNode
let dragonNode = dragonReference.getBasedChildNode() as! DragonNode
print(dragonNode)

// Now you can use the Dragon funcs
dragonNode.flyAction()
getBasedChildNode() is an extension that finds your base node (the first one):
extension SKReferenceNode {
    func getBasedChildNode() -> SKNode? {
        if let child = self.children.first?.children.first {
            return child
        } else {
            return nil
        }
    }
}
I do a similar thing to Simone above, but instead of extending the reference node, I added my extension to SKNode.
extension SKNode {
    func nodeReferenced() -> SKNode? {
        if self.isKind(of: SKReferenceNode.self) {
            return children.first?.children.first
        }
        return nil
    }
}
This way there is no need for a cast when the node is not actually a reference node, and the two-step process becomes a one-liner. My version would change the code above to:
if let dragonNode = childNode(withName: "DragonReference")?.nodeReferenced() as? DragonNode {
    print(dragonNode)
    dragonNode.flyAction()
}
This works for me, but Simone's answer seems more straightforward and perhaps more flexible than mine, so I'd give them the points. I just like clean code, and since we almost never actually need the SKReferenceNode itself, we can ignore it. Also, when enumerating nodes, it's easy to ask for a referenced node and get one or nothing, without first having to check whether a node is actually an SKReferenceNode and then perform the cast.