How do I use a RealityKit ARView with ARImageTrackingConfiguration?

Can I see an example using a RealityKit ARView with ARImageTrackingConfiguration including the ARSessionDelegate delegate methods?

Here is an example of a RealityKit ARView using ARImageTrackingConfiguration and the ARSessionDelegate delegate methods. I didn't see a complete example of exactly this on Stack Overflow, so I thought I would ask and answer it myself.
import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // There must be a set of reference images in the project's assets
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        // Set the ARView's session delegate so we can define delegate methods in this controller
        arView.session.delegate = self
        // Forgo automatic configuration to do it manually instead
        arView.automaticallyConfigureSession = false
        // Show statistics if desired
        arView.debugOptions = [.showStatistics]
        // Disable any unneeded rendering options
        arView.renderOptions = [.disableCameraGrain, .disableHDR, .disableMotionBlur, .disableDepthOfField, .disableFaceOcclusions, .disablePersonOcclusion, .disableGroundingShadows, .disableAREnvironmentLighting]
        // Instantiate the configuration object
        let configuration = ARImageTrackingConfiguration()
        // Both trackingImages and maximumNumberOfTrackedImages are required
        // This example assumes there is only one reference image named "target"
        configuration.maximumNumberOfTrackedImages = 1
        configuration.trackingImages = referenceImages
        // Note that this config option differs from world tracking, where it is
        // configuration.detectionImages
        // Run the ARView's session with the defined configuration object
        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // This example assumes only one reference image of interest
        // A for-in loop could work for more targets
        // Ensure the first added anchor can be downcast to an ARImageAnchor
        guard let imageAnchor = anchors.first as? ARImageAnchor else { return }
        // If the added anchor is named "target", do something with it
        if let imageName = imageAnchor.name, imageName == "target" {
            // An example of something to do: attach a ball marker to the added reference image.
            // Create an AnchorEntity, create a virtual object, add the object to the AnchorEntity
            let refImageAnchor = AnchorEntity(anchor: imageAnchor)
            let refImageMarker = generateBallMarker(radius: 0.02, color: .systemPink)
            refImageMarker.position.y = 0.04
            refImageAnchor.addChild(refImageMarker)
            // Add the new AnchorEntity and its children to the ARView's scene's anchor collection
            arView.scene.addAnchor(refImageAnchor)
            // There is now RealityKit content anchored to the target reference image!
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        // Assuming only one reference image. A for-in loop could work for more targets
        guard let imageAnchor = anchors.first as? ARImageAnchor else { return }
        if let imageName = imageAnchor.name, imageName == "target" {
            // If anything needs to be done as the reference image anchor is updated frame to frame, do it here
            // E.g., to check if the reference image is still being tracked:
            // (https://developer.apple.com/documentation/arkit/artrackable/2928210-istracked)
            if imageAnchor.isTracked {
                print("\(imageName) is tracked and has a valid transform")
            } else {
                print("The anchor for \(imageName) is not guaranteed to match the movement of its corresponding real-world feature, even if it remains in the visible scene.")
            }
        }
    }

    // Convenience method to create colored spheres
    func generateBallMarker(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius), materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}
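For comparison, here is a minimal sketch of the equivalent setup under world tracking, where the same reference images are supplied through detectionImages rather than trackingImages (this assumes the same "AR Resources" group and arView as above):

let worldConfiguration = ARWorldTrackingConfiguration()
// In world tracking the property is detectionImages, not trackingImages
worldConfiguration.detectionImages = referenceImages
// Without this, detected images are not tracked frame to frame
worldConfiguration.maximumNumberOfTrackedImages = 1
arView.session.run(worldConfiguration)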

Related

How to fix "Cannot convert return expression of type 'UIView' to return type 'ARView'" issue in Swift?

class MyCustomUIViewController: UIViewController {

    let coachingOverlay = ARCoachingOverlayView()

    override func viewDidLoad() {
        super.viewDidLoad()
        let arView = ARView(frame: .zero)
        let session = arView.session
        // The coaching overlay is a view that displays standardized onboarding
        // instructions to direct users toward a specific goal.
        // autoresizingMask is an integer bit mask that determines how the receiver resizes itself.
        coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverlay.goal = .anyPlane
        coachingOverlay.session = session
        arView.addSubview(coachingOverlay)
        // Load the "Opject" scene from the "dumu" Reality file
        let anchor = try! Dumu.loadScene()
        // Add the scene's anchor to the ARView's scene
        arView.scene.anchors.append(anchor)
        self.view.addSubview(arView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        coachingOverlay.setActive(true, animated: true)
    }
}
From your picture of the ARViewContainer I see the following. In
func makeUIView(context: Context) -> ARView
you specify a return type of ARView. But then you instantiate and return
... = MyCustomViewController()
which seems to be a UIView. You have to return an ARView, which is a subclass of UIView. Maybe you can downcast the MyCustomViewController with
let viewController = MyCustomUIViewController() as ARView
or you may change the class of MyCustomUIViewController to ARView in its definition. For a more detailed explanation I would need at least the definition of MyCustomUIViewController.
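If MyCustomUIViewController needs to stay a view controller, another option is to wrap it in a UIViewControllerRepresentable instead of a UIViewRepresentable, so nothing has to be cast at all. A minimal sketch, assuming the controller shown above:

import SwiftUI

struct ARViewContainer: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> MyCustomUIViewController {
        // SwiftUI hosts the whole controller, so its root view (and the
        // ARView it adds as a subview) comes along with it
        return MyCustomUIViewController()
    }

    func updateUIViewController(_ uiViewController: MyCustomUIViewController, context: Context) {}
}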

How can I add multipeer connectivity to an ARKit app that doesn't have 3D assets, but uses UITextView for rendering instead? [Swift]

So I am trying to add a multipeer element to the Sticky Note app from Apple's own sample code (link to the Sample Code page). There are several examples of multipeer ARKit apps, but the problem here is that, in the app I am working from, the sticky note is NOT a 3D element:
For the purposes of this sample app, the sticky note entity has no geometry and thus, no appearance. Its anchor provides a 3D location only, and itʼs the sticky noteʼs screen-space annotation that has an appearance. To display it, you define the sticky noteʼs annotation. Following RealityKitʼs entity-component model, design a component that houses the annotation, which in this case is a view. See ScreenSpaceComponent.
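Following that entity-component pattern, such a component can be very small. A hypothetical minimal version (not Apple's actual ScreenSpaceComponent) might look like this:

import UIKit
import RealityKit

// A hypothetical component that houses a screen-space annotation view
struct AnnotationComponent: Component {
    weak var view: UIView?
}

// Custom components must be registered once before use, e.g. at startup:
// AnnotationComponent.registerComponent()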
I have been trying to follow the examples of multipeer AR apps that use ARKit with 3D elements stored either as assets (the "Collaborative Session" example) or as ModelEntity geometry (the "Creating a Multiuser AR Experience" example), but I haven't been successful in translating them to this app, which uses screen space only.
I am able to get the message on the screen that it's connected to a peer, but that is as far as it goes. It will not render the notes on the second phone. I am burned out from all the attempts at making it work :(
One alternative is to forget about the notes being tethered to screen space and recreate this as a regular 3D-space, 2D-geometry thing using SpriteKit.
The system will not render the app's sticky notes on the other phone. I know there is a way around this, but I have been trying for days and haven't been able to do it.
I have been testing this using two phones.
I have:
Added the info to the Info.plist
Added the MultipeerSession file
Added the code in the ViewController file related to multipeer
Added code to the arViewGestureSetup() extension file, which has the rendering info for the sticky notes
What works: I can see the notes on both phones, and I get the messages saying that a peer has joined. What I can't do is view the other user's notes like I would in a regular 3D ARKit app. It will not render.
This is what I have added to the insertNewSticky function
func insertNewSticky(_ sender: UITapGestureRecognizer)
from one of the other examples:
let anchor = ARAnchor(name: "Anchor for object placement", transform: raycastResult.worldTransform)
arView.session.add(anchor: anchor)
Below is the full code for the Gesture Recognizer Setup
import UIKit
import ARKit
extension ViewController {
// MARK: - Gesture recognizer setup
// - Tag: AddViewTapGesture
func arViewGestureSetup() {
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tappedOnARView))
arView.addGestureRecognizer(tapGesture)
let swipeGesture = UISwipeGestureRecognizer(target: self, action: #selector(swipedDownOnARView))
swipeGesture.direction = .down
arView.addGestureRecognizer(swipeGesture)
}
func stickyNoteGestureSetup(_ note: StickyNoteEntity) {
let panGesture = UIPanGestureRecognizer(target: self, action: #selector(panOnStickyView))
note.view?.addGestureRecognizer(panGesture)
let tapOnStickyView = UITapGestureRecognizer(target: self, action: #selector(tappedOnStickyView(_:)))
note.view?.addGestureRecognizer(tapOnStickyView)
}
// MARK: - Gesture recognizer callbacks
/// Tap gesture input handler.
/// - Tag: TapHandler
@objc
func tappedOnARView(_ sender: UITapGestureRecognizer) {
// Ignore the tap if the user is editing a sticky note.
for note in stickyNotes where note.isEditing { return }
// Create a new sticky note at the tap location.
insertNewSticky(sender)
}
/**
Hit test the feature point cloud and use any hit as the position of a new StickyNote. Otherwise, display a tip.
- Tag: ScreenSpaceViewInsertionTag
*/
func insertNewSticky(_ sender: UITapGestureRecognizer) {
// Get the user's tap screen location.
let touchLocation = sender.location(in: arView)
// Cast a ray to check for its intersection with any planes.
guard let raycastResult = arView.raycast(from: touchLocation, allowing: .estimatedPlane, alignment: .any).first
else {
messageLabel.displayMessage("No surface detected, try getting closer.", duration: 2.0)
return
}
// Create a new sticky note positioned at the hit test result's world position.
let frame = CGRect(origin: touchLocation, size: CGSize(width: 200, height: 200))
let note = StickyNoteEntity(frame: frame, worldTransform: raycastResult.worldTransform)
// Center the sticky note's view on the tap's screen location.
note.setPositionCenter(touchLocation)
// Add the sticky note to the scene's entity hierarchy.
arView.scene.addAnchor(note)
// Add the sticky note's view to the view hierarchy.
guard let stickyView = note.view else { return }
arView.insertSubview(stickyView, belowSubview: trashZone)
// Enable gestures on the sticky note.
stickyNoteGestureSetup(note)
// Save a reference to the sticky note.
stickyNotes.append(note)
// Volunteer to handle text view callbacks.
stickyView.textView.delegate = self
let anchor = ARAnchor(name: "Anchor for object placement", transform: raycastResult.worldTransform)
arView.session.add(anchor: anchor)
}
/// Dismisses the keyboard.
@objc
func swipedDownOnARView(_ sender: UISwipeGestureRecognizer) {
dismissKeyboard()
}
fileprivate func dismissKeyboard() {
for note in stickyNotes {
guard let textView = note.view?.textView else { continue }
if textView.isFirstResponder {
textView.resignFirstResponder()
return
}
}
}
@objc
func tappedOnStickyView(_ sender: UITapGestureRecognizer) {
guard let stickyView = sender.view as? StickyNoteView else { return }
stickyView.textView.becomeFirstResponder()
}
//- Tag: PanOnStickyView
fileprivate func panStickyNote(_ sender: UIPanGestureRecognizer, _ stickyView: StickyNoteView, _ panLocation: CGPoint) {
messageLabel.isHidden = true
let feedbackGenerator = UIImpactFeedbackGenerator()
switch sender.state {
case .began:
// Prepare the taptic engine to reduce latency in delivering feedback.
feedbackGenerator.prepare()
// Drag if the gesture is beginning.
stickyView.stickyNote.isDragging = true
// Save offsets to implement smooth panning.
guard let frame = sender.view?.frame else { return }
stickyView.xOffset = panLocation.x - frame.origin.x
stickyView.yOffset = panLocation.y - frame.origin.y
// Fade in the widget that's used to delete sticky notes.
trashZone.fadeIn(duration: 0.4)
case .ended:
// Stop dragging if the gesture is ending.
stickyView.stickyNote.isDragging = false
// Delete the sticky note if the gesture ended on the trash widget.
if stickyView.isInTrashZone {
deleteStickyNote(stickyView.stickyNote)
// ...
} else {
attemptRepositioning(stickyView)
}
// Fades out the widget that's used to delete sticky notes when there are no sticky notes currently being dragged.
if !stickyNotes.contains(where: { $0.isDragging }) {
trashZone.fadeOut(duration: 0.2)
}
default:
// Update the sticky note's screen position based on the pan location, and initial offset.
stickyView.frame.origin.x = panLocation.x - stickyView.xOffset
stickyView.frame.origin.y = panLocation.y - stickyView.yOffset
// Give feedback whenever the pan location is near the widget used to delete sticky notes.
trashZoneThresholdFeedback(sender, feedbackGenerator)
}
}
/// Sticky note pan-gesture handler.
/// - Tag: PanHandler
@objc
func panOnStickyView(_ sender: UIPanGestureRecognizer) {
guard let stickyView = sender.view as? StickyNoteView else { return }
let panLocation = sender.location(in: arView)
// Ignore the pan if any StickyViews are being edited.
for note in stickyNotes where note.isEditing { return }
panStickyNote(sender, stickyView, panLocation)
}
func deleteStickyNote(_ note: StickyNoteEntity) {
guard let index = stickyNotes.firstIndex(of: note) else { return }
note.removeFromParent()
stickyNotes.remove(at: index)
note.view?.removeFromSuperview()
note.view?.isInTrashZone = false
}
/// - Tag: AttemptRepositioning
fileprivate func attemptRepositioning(_ stickyView: StickyNoteView) {
// Conducts a ray-cast for feature points using the panned position of the StickyNoteView
let point = CGPoint(x: stickyView.frame.midX, y: stickyView.frame.midY)
if let result = arView.raycast(from: point, allowing: .estimatedPlane, alignment: .any).first {
stickyView.stickyNote.transform.matrix = result.worldTransform
} else {
messageLabel.displayMessage("No surface detected, unable to reposition note.", duration: 2.0)
stickyView.stickyNote.shouldAnimate = true
}
}
fileprivate func trashZoneThresholdFeedback(_ sender: UIPanGestureRecognizer, _ feedbackGenerator: UIImpactFeedbackGenerator) {
guard let stickyView = sender.view as? StickyNoteView else { return }
let panLocation = sender.location(in: trashZone)
if trashZone.frame.contains(panLocation), !stickyView.isInTrashZone {
stickyView.isInTrashZone = true
feedbackGenerator.impactOccurred()
} else if !trashZone.frame.contains(panLocation), stickyView.isInTrashZone {
stickyView.isInTrashZone = false
feedbackGenerator.impactOccurred()
}
}
@objc
func tappedReset(_ sender: UIButton) {
reset()
}
}
and this is the full code for the ViewController file
/*
See LICENSE folder for this sample’s licensing information.
Abstract:
Main view controller for the AR experience.
*/
import UIKit
import RealityKit
import Combine
import ARKit
import MultipeerConnectivity
class ViewController: UIViewController, ARSessionDelegate {
// MARK: - Class variable declarations
@IBOutlet var arView: ARView!
@IBOutlet weak var messageLabel: MessageLabel!
var trashZone: GradientView!
var shadeView: UIView!
var resetButton: UIButton!
var keyboardHeight: CGFloat!
var stickyNotes = [StickyNoteEntity]()
var subscription: Cancellable!
//added Sat May 28 5:12pm
var multipeerSession: MultipeerSession?
// end of added Sat May 28 5:12pm
//added Sat May 28 5:12pm
// A dictionary to map MultiPeer IDs to ARSession ID's.
// This is useful for keeping track of which peer created which ARAnchors.
var peerSessionIDs = [MCPeerID: String]()
var sessionIDObservation: NSKeyValueObservation?
var configuration: ARWorldTrackingConfiguration?
// end of added Sat May 28 5:12pm
// MARK: - View Controller Life Cycle
override func viewDidLoad() {
super.viewDidLoad()
subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [unowned self] in
self.updateScene(on: $0)
}
arViewGestureSetup()
overlayUISetup()
arView.session.delegate = self
}
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
// Add observer to the keyboardWillShowNotification to get the height of the keyboard every time it is shown
let notificationName = UIResponder.keyboardWillShowNotification
let selector = #selector(keyboardIsPoppingUp(notification:))
NotificationCenter.default.addObserver(self, selector: selector, name: notificationName, object: nil)
}
override func viewDidAppear(_ animated: Bool) {
super.viewDidAppear(animated)
arView.session.delegate = self
// Prevent the screen from being dimmed to avoid interrupting the AR experience.
UIApplication.shared.isIdleTimerDisabled = true
// Turn off ARView's automatically-configured session
// to create and set up your own configuration.
arView.automaticallyConfigureSession = false
configuration = ARWorldTrackingConfiguration()
// Enable a collaborative session.
configuration?.isCollaborationEnabled = true
// Enable realistic reflections.
configuration?.environmentTexturing = .automatic
// Begin the session.
arView.session.run(configuration!)
// Use key-value observation to monitor your ARSession's identifier.
sessionIDObservation = observe(\.arView.session.identifier, options: [.new]) { object, change in
print("SessionID changed to: \(change.newValue!)")
// Tell all other peers about your ARSession's changed ID, so
// that they can keep track of which ARAnchors are yours.
guard let multipeerSession = self.multipeerSession else { return }
self.sendARSessionIDTo(peers: multipeerSession.connectedPeers)
}
// Start looking for other players via MultiPeerConnectivity.
multipeerSession = MultipeerSession(receivedDataHandler: receivedData, peerJoinedHandler:
peerJoined, peerLeftHandler: peerLeft, peerDiscoveredHandler: peerDiscovered)
//arView.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(handleTap(recognizer:))))
messageLabel.displayMessage("Tap the screen to place cubes.\nInvite others to launch this app to join you.", duration: 60.0)
}
//peerDiscovered
func peerDiscovered(_ peer: MCPeerID) -> Bool {
guard let multipeerSession = multipeerSession else { return false }
if multipeerSession.connectedPeers.count > 3 {
// Do not accept more than four users in the experience.
messageLabel.displayMessage("A fifth peer wants to join the experience.\nThis app is limited to four users.", duration: 6.0)
return false
} else {
return true
}
}
// end of added Sat May 28 5:12pm
/// - Tag: PeerJoined
// added Sat May 28 5:12pm
func peerJoined(_ peer: MCPeerID) {
messageLabel.displayMessage("""
A peer has joined the experience.
Hold the phones next to each other.
""", duration: 6.0)
// Provide your session ID to the new user so they can keep track of your anchors.
sendARSessionIDTo(peers: [peer])
}
// end of added Sat May 28 5:12pm
// added Sat May 28 5:12pm
func peerLeft(_ peer: MCPeerID) {
messageLabel.displayMessage("A peer has left the shared experience.")
// Remove all ARAnchors associated with the peer that just left the experience.
if let sessionID = peerSessionIDs[peer] {
removeAllAnchorsOriginatingFromARSessionWithID(sessionID)
peerSessionIDs.removeValue(forKey: peer)
}
}
// end of added Sat May 28 5:12pm
//added Sat May 28 5:12pm
func receivedData(_ data: Data, from peer: MCPeerID) {
if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARSession.CollaborationData.self, from: data) {
arView.session.update(with: collaborationData)
return
}
// ...
let sessionIDCommandString = "SessionID:"
if let commandString = String(data: data, encoding: .utf8), commandString.starts(with: sessionIDCommandString) {
let newSessionID = String(commandString[commandString.index(commandString.startIndex,
offsetBy: sessionIDCommandString.count)...])
// If this peer was using a different session ID before, remove all its associated anchors.
// This will remove the old participant anchor and its geometry from the scene.
if let oldSessionID = peerSessionIDs[peer] {
removeAllAnchorsOriginatingFromARSessionWithID(oldSessionID)
}
peerSessionIDs[peer] = newSessionID
}
}
// end of added Sat May 28 5:12pm
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
messageLabel.displayMessage("Established joint experience with a peer.")
// ...
}
func updateScene(on event: SceneEvents.Update) {
let notesToUpdate = stickyNotes.compactMap { !$0.isEditing && !$0.isDragging ? $0 : nil }
for note in notesToUpdate {
// Gets the 2D screen point of the 3D world point.
guard let projectedPoint = arView.project(note.position) else { return }
// Calculates whether the note can be currently visible by the camera.
let cameraForward = arView.cameraTransform.matrix.columns.2.xyz
let cameraToWorldPointDirection = normalize(note.transform.translation - arView.cameraTransform.translation)
let dotProduct = dot(cameraForward, cameraToWorldPointDirection)
let isVisible = dotProduct < 0
// Updates the screen position of the note based on its visibility
note.projection = Projection(projectedPoint: projectedPoint, isVisible: isVisible)
note.updateScreenPosition()
}
}
func reset() {
guard let configuration = arView.session.configuration else { return }
arView.session.run(configuration, options: .removeExistingAnchors)
for note in stickyNotes {
deleteStickyNote(note)
}
}
func session(_ session: ARSession, didFailWithError error: Error) {
guard error is ARError else { return }
let errorWithInfo = error as NSError
let messages = [
errorWithInfo.localizedDescription,
errorWithInfo.localizedFailureReason,
errorWithInfo.localizedRecoverySuggestion
]
let errorMessage = messages.compactMap({ $0 }).joined(separator: "\n")
DispatchQueue.main.async {
// Present an alert informing about the error that has occurred.
let alertController = UIAlertController(title: "The AR session failed.", message: errorMessage, preferredStyle: .alert)
let restartAction = UIAlertAction(title: "Restart Session", style: .default) { _ in
alertController.dismiss(animated: true, completion: nil)
self.reset()
}
alertController.addAction(restartAction)
self.present(alertController, animated: true, completion: nil)
}
}
override var prefersStatusBarHidden: Bool {
return true
}
override var prefersHomeIndicatorAutoHidden: Bool {
return true
}
private func sendARSessionIDTo(peers: [MCPeerID]) {
guard let multipeerSession = multipeerSession else { return }
let idString = arView.session.identifier.uuidString
let command = "SessionID:" + idString
if let commandData = command.data(using: .utf8) {
multipeerSession.sendToPeers(commandData, reliably: true, peers: peers)
}
}
private func removeAllAnchorsOriginatingFromARSessionWithID(_ identifier: String) {
guard let frame = arView.session.currentFrame else { return }
for anchor in frame.anchors {
guard let anchorSessionID = anchor.sessionIdentifier else { continue }
if anchorSessionID.uuidString == identifier {
arView.session.remove(anchor: anchor)
}
}
}
}
Update: I spoke to a staff engineer on Apple's RealityKit team, who explained to me that what I was trying to accomplish is not feasible, because the note has an embedded subclass that is not codable as per Swift's Codable protocol.
I will have to rebuild the note differently from the example I had been working with, so that it fits within the Codable protocol, which will then ensure the data can travel across the network via the Multipeer Connectivity framework.
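For reference, one way to make a note's essentials codable is to flatten its text and transform into a small payload struct. A rough sketch, with hypothetical type and field names that are not part of Apple's sample:

import Foundation
import simd

// Hypothetical wire format for a sticky note. simd_float4x4 is not Codable,
// so the transform is flattened into 16 floats, column by column.
struct StickyNotePayload: Codable {
    var text: String
    var matrix: [Float]

    init(text: String, transform: simd_float4x4) {
        self.text = text
        self.matrix = (0..<4).flatMap { column -> [Float] in
            let c = transform[column]
            return [c.x, c.y, c.z, c.w]
        }
    }

    var transform: simd_float4x4 {
        simd_float4x4(
            SIMD4<Float>(matrix[0], matrix[1], matrix[2], matrix[3]),
            SIMD4<Float>(matrix[4], matrix[5], matrix[6], matrix[7]),
            SIMD4<Float>(matrix[8], matrix[9], matrix[10], matrix[11]),
            SIMD4<Float>(matrix[12], matrix[13], matrix[14], matrix[15]))
    }
}

// Encoded with JSONEncoder, this could travel through the existing
// multipeerSession.sendToPeers(_:reliably:peers:) call shown above.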

How to change a folder of Reference Images?

I'm trying to code with my iPad and Swift Playgrounds 4.0. I tried to do image tracking with SwiftUI (example 1, example 2). It's possible to create a folder in the iPad app, but you can't put an image in there... So the following code doesn't work:
func makeUIView(context: Context) -> ARView {
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources",
        bundle: nil)
    else {
        fatalError("Missing expected asset catalog resources.")
    }
}
Here is an image from the Swift Playgrounds app.
Is it possible to use reference images from the root/main bundle like you can with .usdz models?
if let usdzModel = try? Entity.load(named: "drummer") {
    anchor.addChild(usdzModel)
}
Here is the complete Code:
import ARKit
import SwiftUI
import RealityKit

struct RealityKitView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let view = ARView()

        // Start the AR session
        let session = view.session
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        session.run(config)

        // Add the coaching overlay
        let coachingOverlay = ARCoachingOverlayView()
        coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverlay.session = session
        coachingOverlay.goal = .horizontalPlane
        view.addSubview(coachingOverlay)

        // Set debug options
        #if DEBUG
        view.debugOptions = [.showFeaturePoints, .showAnchorOrigins, .showAnchorGeometry]
        #endif

        // AnchorEntity for the image:
        // create an image anchor by specifying the group name and image name of the AR resource
        let anchor = AnchorEntity(.image(group: "AR Resources", name: "Test"))
        let box = ModelEntity(mesh: .generateBox(size: simd_make_float3(0.1, 0.03, 0.05)))
        anchor.addChild(box)
        view.scene.anchors.append(anchor)
        // End AnchorEntity

        return view
    }

    func updateUIView(_ view: ARView, context: Context) {
    }
}
It's a code snippet from my Medium story.
This is done differently than in Xcode, but it's no more complicated. Use the + button in the upper-right corner of your Swift Playgrounds app to place your reference images there.
Then use this code.
import ARKit

let sceneView = ARSCNView(frame: .zero)
var trackingImages = Set<ARReferenceImage>()

fileprivate func feedSession() {
    let imageFromWeb = #imageLiteral(resourceName: "Photo.png")
    let image = ARReferenceImage(imageFromWeb.cgImage!, orientation: .up, physicalWidth: 0.5)
    self.trackingImages.insert(image)

    let config = ARImageTrackingConfiguration()
    config.trackingImages = trackingImages
    self.sceneView.session.run(config)
}
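If you want to stay with RealityKit's ARView from the question, the same runtime-created reference image approach should carry over. A minimal sketch, assuming an image file named "Photo.png" was added to the playground with the + button:

import ARKit
import RealityKit
import UIKit

let arView = ARView(frame: .zero)

if let cgImage = UIImage(named: "Photo.png")?.cgImage {
    // Build the reference image in code instead of using an asset catalog group
    let reference = ARReferenceImage(cgImage, orientation: .up, physicalWidth: 0.5)
    reference.name = "Test"
    let config = ARImageTrackingConfiguration()
    config.trackingImages = [reference]
    config.maximumNumberOfTrackedImages = 1
    arView.session.run(config)
}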

How to remove ARReferenceObject Anchor when the object is no longer in camera frame?

I am using RealityKit + SwiftUI + ARSessionDelegate to render 3D content on top of an ARReferenceObject. I want to remove the 3D content once the camera pans away from the object and it is no longer in the frame.
Currently I render the 3D content when the object is detected, which is what I want. But I have multiple identical objects that I want to identify separately using the same ARReferenceObject. So in order to do this I need to remove the original anchoring.
This is my wrapper for SwiftUI:
struct ARViewWrapper: UIViewRepresentable {
    @ObservedObject var arManager: ARManager

    // Create an alias for our wrapper's view type
    typealias UIViewType = ARView

    // Delegate (coordinator) for the view representable
    func makeCoordinator() -> Coordinator {
        return Coordinator(arManager: self.arManager)
    }

    func makeUIView(context: Context) -> ARView {
        // Create the ARView
        let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
        // Assign the delegate
        arView.session.delegate = context.coordinator
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        print("Updating View")
        // Create an anchor using a reference object and add it to the ARView
        let target = AnchorEntity(.object(group: "AR Resources", name: "bj"))
        target.name = "obj_anchor"
        // Add the anchor to the AR world
        if uiView.scene.anchors.count == 0 {
            uiView.scene.anchors.append(target)
        } else {
            uiView.scene.anchors[0] = target
        }
        // Add a plane and title to the anchor
        addARObjs(anchor: target, arObj: arManager.currARObj)
    }
}
This is my Delegate:
class Coordinator: NSObject, ARSessionDelegate {
    @ObservedObject var arManager: ARManager

    init(arManager: ARManager) {
        self.arManager = arManager
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        return
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        return
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        return
    }
}
SceneKit
You can do it in SceneKit. All you need is the isNode(_:insideFrustumOf:) instance method, which returns a Boolean value indicating whether a node might be visible from a specified point of view. This method is also available in ARKit (as part of SceneKit).
func isNode(_ node: SCNNode, insideFrustumOf pointOfView: SCNNode) -> Bool
Sample code:
var allYourNodes = [SCNNode]()

allYourNodes.append(node001)
allYourNodes.append(node002)

guard let pointOfView = arSCNView.pointOfView else { return }

for yourNode in allYourNodes {
    if !arSCNView.isNode(yourNode, insideFrustumOf: pointOfView) {
        arSCNView.session.remove(anchor: yourARAnchor)
    }
}
However, I haven't found a similar method in RealityKit 2.0. Hope it'll be added by Cupertino engineers in the near future.
RealityKit
Here's what we have in RealityKit 2.0 at the moment:
Apple's documentation says: During an AR session, RealityKit automatically uses the device's camera to define the perspective from which to render the scene. When rendering a scene outside of an AR session – with the view's cameraMode property set to ARView.CameraMode.nonAR – RealityKit uses a PerspectiveCamera instead. You can add a perspective camera anywhere in your scene to control the point of view. If you don't explicitly provide one, RealityKit creates a default camera for you.
So, the only available parameters of a PerspectiveCameraComponent at the moment are:
init(near: Float, far: Float, fieldOfViewInDegrees: Float)
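Until a real frustum test appears in RealityKit, one workaround is the camera-facing check used in Apple's sticky-notes sample quoted earlier on this page: compare the camera's forward axis with the direction to each anchor and remove anchors that fall behind the camera. A rough sketch only; this is a visibility heuristic, not a true frustum test:

import RealityKit
import simd

func removeAnchorsBehindCamera(in arView: ARView) {
    let camera = arView.cameraTransform
    // The third column of the rotation matrix is the camera's backward axis
    let forward = camera.matrix.columns.2
    // Collect first so the anchor collection isn't mutated while iterating
    let behind = arView.scene.anchors.filter { anchor in
        let toAnchor = normalize(anchor.position(relativeTo: nil) - camera.translation)
        // A negative dot product means the anchor lies in front of the camera
        return dot(SIMD3<Float>(forward.x, forward.y, forward.z), toAnchor) >= 0
    }
    for anchor in behind {
        arView.scene.removeAnchor(anchor)
    }
}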

Load a spritekit scene from another bundle?

I am making a SpriteKit framework using Swift 4.2 and want to include some .sks files for scenes and actions. I have tried to load a scene from the bundle using the code below:
class func newGameScene() -> GameScene {
    guard let gameScenePath = Bundle(for: self).path(forResource: "GameScene", ofType: "sks") else {
        fatalError("GameScene.sks not found in bundle")
    }
    guard let gameSceneData = FileManager.default.contents(atPath: gameScenePath) else {
        fatalError("Could not read GameScene.sks")
    }
    let gameSceneCoder = NSKeyedUnarchiver(forReadingWith: gameSceneData)
    guard let scene = GameScene(coder: gameSceneCoder) else {
        fatalError("Could not decode GameScene")
    }
    // Set the scale mode to scale to fit the window
    scene.scaleMode = .aspectFill
    return scene
}
I load the scene and present it. (This code is mostly from Apple's template for SpriteKit, as I'm testing this issue.)
guard let view = view else {
    return nil
}

let scene = GameScene.newGameScene()
view.presentScene(scene)

view.ignoresSiblingOrder = true
view.showsFPS = true
view.showsNodeCount = true

return nil
The GameScene.sks file and the code are unchanged from Apple's template in this case. The code and the .sks assets live in the dynamic framework, which is imported into another project.
When I have the framework load the scene into a view I pass it, it shows the FPS and node count but not the "Hello, World!" text.
In the code below, also copied from the template, a breakpoint shows that these methods are not called when mousing down.
#if os(OSX)
// Mouse-based event handling
extension GameScene {

    override func mouseDown(with event: NSEvent) {
        if let label = self.label {
            label.run(SKAction.init(named: "Pulse")!, withKey: "fadeInOut")
        }
        self.makeSpinny(at: event.location(in: self), color: SKColor.green)
    }

    override func mouseDragged(with event: NSEvent) {
        self.makeSpinny(at: event.location(in: self), color: SKColor.blue)
    }

    override func mouseUp(with event: NSEvent) {
        self.makeSpinny(at: event.location(in: self), color: SKColor.red)
    }
}
#endif
I know it must have to do with how SpriteKit loads the scene, but I cannot find a solution. I have to use an NSKeyedUnarchiver because SpriteKit's built-in file initializer:
GameScene(fileNamed: "GameScene")
only loads from the main bundle.
Now, in the above I assumed that the file can be loaded by using a coder, but Tomato made the point that .sks files most likely were not saved using a coder. In that case, it may be impossible to load an .sks file from another bundle in SpriteKit using the API provided by Apple. The answer may not include coders.
I have compiled the above discussion/solution into a single extension function on SKScene. Hope this helps someone!
import SpriteKit

extension SKScene {
    static func fromBundle(fileName: String, bundle: Bundle?) -> SKScene? {
        guard let bundle = bundle else { return nil }
        guard let path = bundle.path(forResource: fileName, ofType: "sks") else { return nil }
        if let data = FileManager.default.contents(atPath: path) {
            return NSKeyedUnarchiver.unarchiveObject(with: data) as? SKScene
        }
        return nil
    }
}
Just as I thought, let gameSceneCoder = NSKeyedUnarchiver(forReadingWith: gameSceneData) was not creating a proper coder for you.
Just do:
guard let scene = NSKeyedUnarchiver.unarchiveObject(with: gameSceneData) as? SKScene else {
    fatalError("Could not unarchive the scene")
}
This will unarchive the file properly for you.
Note: if you want to use GameScene, make sure GameScene is set as the custom class of the .sks file.
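Usage from the host app then looks like this; a small sketch, assuming GameScene is defined in the same framework as the .sks file:

if let scene = SKScene.fromBundle(fileName: "GameScene", bundle: Bundle(for: GameScene.self)) as? GameScene {
    scene.scaleMode = .aspectFill
    view.presentScene(scene)
}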