How to capture node from "renderer(_:nodeFor:)" instance method? - swift

I'm confused about this delegate method that is called when planeDetection is active. The method is successfully called when a plane is detected, but where does the returned SCNNode go? How do I access it?
func renderer(_ renderer: SCNSceneRenderer,
              nodeFor anchor: ARAnchor) -> SCNNode?
For what it's worth, I'm using a custom ARCL library that allows me to place nodes by GPS coordinates. For some reason, with that framework this method does not seem to fire upon detecting a plane. My delegates are set up properly, because the renderer(_:nodeFor:) method does get called; otherwise I would just use this method:
func renderer(_ renderer: SCNSceneRenderer,
              didAdd node: SCNNode,
              for anchor: ARAnchor)

The new SCNNode that the renderer(_:nodeFor:) instance method generates is handed back to ARKit, which tethers it to the corresponding anchor and tracks its position from then on.
The node returned by renderer(_:nodeFor:) is added as a child of the scene's root node. Quite often renderer(_:nodeFor:) is used when tracking ARFaceAnchor objects:
var specialNode: SCNNode?

func renderer(_ renderer: SCNSceneRenderer,
              nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let sceneView = renderer as? ARSCNView, anchor is ARFaceAnchor
    else { return nil }
    let faceGeometry = ARSCNFaceGeometry(device: sceneView.device!)!
    self.specialNode = SCNNode(geometry: faceGeometry)    // keep a reference for later access
    return self.specialNode
}
...
func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode,
              for anchor: ARAnchor) {
    if let faceAnchor = anchor as? ARFaceAnchor,
       let faceGeo = node.geometry as? ARSCNFaceGeometry {
        faceGeo.update(from: faceAnchor.geometry)
    }
}
Nevertheless, you can use the renderer(_:nodeFor:) method with any anchor type you like.
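Applied to the original plane-detection question, capturing the node is just a matter of keeping your own reference before returning it. A minimal sketch, assuming you key the nodes by anchor identifier (the planeNodes dictionary is my own addition, not part of the question's code):

var planeNodes = [UUID: SCNNode]()

func renderer(_ renderer: SCNSceneRenderer,
              nodeFor anchor: ARAnchor) -> SCNNode? {
    guard anchor is ARPlaneAnchor else { return nil }
    let node = SCNNode()
    planeNodes[anchor.identifier] = node    // your own reference to the node
    return node                             // ARKit parents this node under the root node
}

You can then look the node up later, e.g. via planeNodes[anchor.identifier] inside renderer(_:didUpdate:for:).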

Related

RealityKit - How to get rendered frame?

So the session delegate allows me to get a frame that contains the camera image plus some details. But how can I get something similar from RealityKit, i.e. a frame of the rendered objects and shadows without the background?
I'd like to do my own post-frame rendering in Metal: have RealityKit render the nice meshes, then adjust the resulting frame myself and render it to my own surface.
Regards
Dariusz
TLDR
You can add a session delegate to RealityKit's ARView: arView.session.delegate = sessionDelegate
SwiftUI
The following is how you could implement a session delegate with SwiftUI:
import SwiftUI
import ARKit
import RealityKit

class SessionDelegate<ARContainer: UIViewRepresentable>: NSObject, ARSessionDelegate {
    var arVC: ARContainer
    var arGameView: ARView?

    init(_ control: ARContainer) {
        self.arVC = control
    }

    func setARView(_ arView: ARView) {
        arGameView = arView    // keep a reference to the ARView
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {}

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        print("frame", frame)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {}
    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {}
}
Make your delegate available to your UIViewRepresentable through the makeCoordinator function:
struct CameraARContainer: UIViewRepresentable {
    func makeCoordinator() -> SessionDelegate<Self> {
        SessionDelegate<Self>(self)
    }

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.delegate = context.coordinator
        context.coordinator.setARView(arView)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
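Inside session(_:didUpdate:) you can then pull the camera image out of each ARFrame. Note that frame.capturedImage is the raw camera feed, not RealityKit's composited output, so this only covers the "camera image + some details" half of the question; a minimal sketch:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let pixelBuffer = frame.capturedImage    // CVPixelBuffer holding the raw camera image
    // frame.camera carries the intrinsics and transform you'd need for your own renderer.
    print("frame", CVPixelBufferGetWidth(pixelBuffer), "x", CVPixelBufferGetHeight(pixelBuffer))
    // Hand pixelBuffer off to your own Metal pipeline here.
}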

How do I make an SKSpriteNode move to the touch location in Swift SpriteKit

Hi, I am making a game where a node should move to the touch location. I have asked before and it is still not working. Here is my code; hopefully you can help me. I have a file called Player.swift with this code inside it:
import SpriteKit

class Player: SKSpriteNode {
    init() {
        // Build the texture locally; an instance property can't be read before super.init.
        let playerTexture = SKTexture(imageNamed: "head")
        super.init(texture: playerTexture, color: .clear, size: playerTexture.size())
    }

    // Satisfy the NSCoder required init.
    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }
}
and then I have the GameScene.swift file with this inside it:
import SpriteKit

class GameScene: SKScene {
    let player = Player()

    override func didMove(to view: SKView) {
        addChild(player)
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            player.position = touch.location(in: self)
        }
    }
}
Are you sure your touch event is registered? Just put a print("moved") in the touchesMoved function to be sure.
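If the print fires but the movement still looks wrong, a common variation is to animate the node toward the touch rather than teleporting it every frame. A minimal sketch (the 0.2-second duration is an arbitrary choice of mine):

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    // Glide the player toward the initial touch instead of waiting for a drag.
    player.run(SKAction.move(to: touch.location(in: self), duration: 0.2))
}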

How to set up multi-face recognition with ARKit3?

I am trying to develop an ARKit 3 facial recognition application that supports multi-face tracking. I have made the following settings, but it does not work. Have I done something wrong?
override func viewDidLoad() {
    super.viewDidLoad()
    // Set up the delegates
    faceSCNView.delegate = self
    faceSCNView.session.delegate = self
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Face tracking configuration
    let configuration = ARFaceTrackingConfiguration()
    // Maximum number of tracked faces = 2
    configuration.maximumNumberOfTrackedFaces = 2
    // Run the session
    faceSCNView.session.run(configuration)
}
Delegate
extension GameViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        print(anchor.sessionIdentifier, anchor.identifier, anchor.name)
        if anchor is ARFaceAnchor {
            print("renderer didAdd", anchor.identifier, anchor.name ?? "noname")
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        print("renderer didUpdate", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
    }
}
extension GameViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        print(anchors.count)
        for anchor in anchors where anchor is ARFaceAnchor {
            let faceAnchor = anchor as! ARFaceAnchor
            print("Session didAdd", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARFaceAnchor {
            let faceAnchor = anchor as! ARFaceAnchor
            print("Session didUpdate", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
        }
    }
}
No matter how many people are in front of the camera, only one anchor is recognized; its identifier is CA831DB2-E078-45C3-9A1C-44F8459AA04F:
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04505286
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04578292
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04813192
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04813192
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04832877
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0484867
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0484867
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04869337
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04869337
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0489419
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05000613
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05000613
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05070856
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05031016
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05070856
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05118915
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05093153
Sorry, this was my own problem: my phone is an iPhone X with an A11 chip, and it does not support multi-face tracking.
The cause was that I was running iOS 13 Beta 1 at the time; the maximum number of tracked faces reported at runtime was 3, but only one face was actually tracked. After I upgraded to iOS 13 Beta 2, the reported value was correct.
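Since iOS 13 you can query the real limit at runtime instead of trusting a hard-coded value. A minimal sketch of a defensive configuration (the guard is my own addition, not code from the question):

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    guard ARFaceTrackingConfiguration.isSupported else { return }    // needs a TrueDepth camera
    let configuration = ARFaceTrackingConfiguration()
    // A11 devices report 1 here; newer chips can track several faces at once.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    faceSCNView.session.run(configuration)
}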

DispatchQueue background thread update not working on iOS 12

I have the following piece of code that works perfectly on iOS 11.4.1 but fails on iOS 12:
let background = DispatchQueue(label: "task")
var debugMeshNode = SCNNode()
let myKit = MyKit()

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.background.async {
        let node = self.myKit.extractNode(anchor: anchor)
        self.debugMeshNode.addChildNode(node)    // no node added on UI in iOS 12
    }
}

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    self.background.async {
        self.myKit.process(frame: frame)
    }
}
Could anyone point out my mistake here?
UPDATE
The code seems to work if I add a print statement in the block, like so:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.background.async {
        let node = self.myKit.extractNode(anchor: anchor)
        self.debugMeshNode.addChildNode(node)    // no node added on UI in iOS 12
        print("sample")
    }
}
Originally from here, I used this:
func guaranteeMainThreadSynchronousExecution(_ block: () -> ()) {
    if Thread.isMainThread {
        block()
    } else {
        DispatchQueue.main.sync {
            block()
        }
    }
}
and updated my code like so:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.guaranteeMainThreadSynchronousExecution {
        self.background.async {
            let node = self.myKit.extractNode(anchor: anchor)
            self.debugMeshNode.addChildNode(node)
        }
    }
}
Then it works flawlessly. Hope this helps someone.
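For what it's worth, the more conventional fix is to hop back to the main queue for the scene-graph mutation itself, since modifying an SCNNode hierarchy from an arbitrary background queue is not guaranteed to be safe. A minimal sketch of that variant:

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.background.async {
        let node = self.myKit.extractNode(anchor: anchor)    // heavy work stays off the main queue
        DispatchQueue.main.async {
            self.debugMeshNode.addChildNode(node)            // scene-graph change on the main queue
        }
    }
}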

swift/scenekit problems getting touch events from SCNScene and overlaySKScene

Good afternoon. I'm trying to figure out how to get touch notifications from an SCNNode and an SKSpriteNode when an SCNScene is overlaid with an SKScene.
import UIKit
import SceneKit

class GameViewController: UIViewController {
    var scnView: SCNView!
    var scnScene: SCNScene!
    var sprite: spritekitHUD!
    var cameraNode: SCNNode!
    var shape: SCNNode!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupScene()
    }

    func setupScene() {
        scnView = self.view as! SCNView
        scnView.delegate = self
        scnView.allowsCameraControl = true
        scnScene = SCNScene(named: "art.scnassets/scene.scn")
        scnView.scene = scnScene
        sprite = spritekitHUD(size: self.view.bounds.size, game: self)
        scnView.overlaySKScene = sprite
        cameraNode = scnScene.rootNode.childNode(withName: "camera", recursively: true)!
        shape = scnScene.rootNode.childNode(withName: "shape", recursively: true)
        shape.name = "ThreeDShape"
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let touch = touches.first!
        let location = touch.location(in: scnView)
        let hitResults = scnView.hitTest(location, options: nil)
        if let result = hitResults.first {
            handleTouchFor(node: result.node)
        }
    }

    func handleTouchFor(node: SCNNode) {
        if node.name == "ThreeDShape" {
            print("SCNNode Touched")
        }
    }
}
This is my SpriteKit overlay scene:
import Foundation
import SpriteKit

class spritekitHUD: SKScene {
    var game: GameViewController!
    var shapeNode: SKSpriteNode!

    init(size: CGSize, game: GameViewController) {
        super.init(size: size)
        self.backgroundColor = UIColor.white
        let spriteSize = size.width / 12
        self.shapeNode = SKSpriteNode(imageNamed: "shapeNode")
        self.shapeNode.size = CGSize(width: spriteSize, height: spriteSize)
        self.shapeNode.position = CGPoint(x: spriteSize + 8, y: spriteSize + 8)
        self.shapeNode.name = "test"
        self.game = game
        self.addChild(self.shapeNode)    // the node must be in the scene to be hittable
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else {
            return
        }
        let location = touch.location(in: self)
        if self.atPoint(location).name == "test" {
            print("Spritekit node pressed")
        }
    }
}
So with this I can successfully get notifications that my sprite node has been touched in my overlaySKScene, but I can't figure out how to get a notification that my SCNNode has been touched. Since you can't have two touchesBegan functions, does anyone have any ideas how I can handle the 3D events and the 2D events at the same time?
Thanks for your help!!
If you want to use an SKScene overlay of an SCNView for user controls (e.g. a button in the SKScene overlay that "consumes" the touch), but also have touches that don't hit the controls pass through and register in the underlying SCNView, you have to do this: set isUserInteractionEnabled to false on the SKScene overlay itself, but then to true on any individual elements within that overlay that you'd like to act as buttons.
let overlay = SKScene(fileNamed: "overlay")
overlay?.isUserInteractionEnabled = false

// Button is a subclass of SKSpriteNode that overrides touchesEnded
let pauseButton = overlay?.childNode(withName: "pauseButton") as? Button
pauseButton?.isUserInteractionEnabled = true

sceneView.overlaySKScene = overlay
If the user touches a button, the button's touch events (touchesBegan, touchesEnded, etc.) will fire and consume the touch (underlying gesture recognizers will still fire, though). If they touch outside a button, however, the touch will pass through to the underlying SCNView.
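The Button class referenced in the comment above is not part of SpriteKit. A minimal sketch of what such a subclass might look like (the action closure is my own assumption):

class Button: SKSpriteNode {
    var action: (() -> Void)?

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        // With isUserInteractionEnabled = true, this node receives and consumes the touch.
        action?()
    }
}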
This is "lifted" straight out of Xcode's Game template......
Add a gesture recognizer in your viewDidLoad:
// add a tap gesture recognizer
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
scnView.addGestureRecognizer(tapGesture)

@objc func handleTap(_ gestureRecognize: UIGestureRecognizer) {
    // retrieve the SCNView
    let scnView = self.view as! SCNView
    // check what nodes are tapped
    let p = gestureRecognize.location(in: scnView)
    let hitResults = scnView.hitTest(p, options: [:])
    // check that we clicked on at least one object
    if hitResults.count > 0 {
        // retrieve the first clicked object
        let result: AnyObject = hitResults[0]
        // result.node is the node that the user tapped on
        // perform any actions you want on it
    }
}
You can implement this method in spritekitHUD:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    game.touchesBegan(touches, with: event)
}