How to set up multi-face recognition with ARKit3?

I am trying to develop a facial-recognition application with ARKit 3 that supports tracking multiple faces. I have made the following settings, but it does not work. Did I do something wrong?
override func viewDidLoad() {
    super.viewDidLoad()
    /// Set up delegates
    faceSCNView.delegate = self
    faceSCNView.session.delegate = self
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    /// Face tracking configuration
    let configuration = ARFaceTrackingConfiguration()
    /// Track up to 2 faces
    configuration.maximumNumberOfTrackedFaces = 2
    /// Run the session
    faceSCNView.session.run(configuration)
}
Delegate
extension GameViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        print(anchor.sessionIdentifier, anchor.identifier, anchor.name)
        if anchor is ARFaceAnchor {
            print("renderer didAdd", anchor.identifier, anchor.name ?? "noname")
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        print("renderer didUpdate", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
    }
}

extension GameViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        print(anchors.count)
        for anchor in anchors where anchor is ARFaceAnchor {
            let faceAnchor = anchor as! ARFaceAnchor
            print("Session didAdd", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARFaceAnchor {
            let faceAnchor = anchor as! ARFaceAnchor
            print("Session didUpdate", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
        }
    }
}
No matter how many people are in front of the camera, only one anchor is ever recognized; its identifier is CA831DB2-E078-45C3-9A1C-44F8459AA04F:
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04505286
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04578292
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04813192
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04813192
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04832877
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0484867
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0484867
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04869337
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04869337
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0489419
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05000613
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05000613
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05070856
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05031016
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05070856
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05118915
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05093153

Sorry, this turned out to be my own problem: my phone is an iPhone X with an A11 chip, and tracking multiple faces is not supported on A11 (it requires an A12 chip or later).
The confusion came from running iOS 13 beta 1 at the time: it reported a maximum of 3 supported tracked faces, but in practice only one face was ever tracked. After I upgraded to iOS 13 beta 2, the correct value was reported.
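To avoid being surprised by the hardware limit, you can query how many faces the current device can actually track before configuring the session. A minimal sketch using the public ARKit API (iOS 13+):

```swift
import ARKit

func configureFaceTracking(for session: ARSession) {
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // supportedNumberOfTrackedFaces is 1 on A11 devices (e.g. iPhone X)
    // and up to 3 on A12 and later.
    let supported = ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    configuration.maximumNumberOfTrackedFaces = min(2, supported)
    session.run(configuration)
}
```

Clamping against `supportedNumberOfTrackedFaces` means the same code degrades gracefully to single-face tracking on older devices.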

Related

How can I replace a ModelEntity in RealityKit?

I want to switch the ModelEntity from fvBoatAnchor.fvBoatObject to fvBridgeAnchor.fvBridgeObject with the tap of a button, but I'm not sure how to do it while also keeping the gestures, the collisions, and the image tracker after changing the entity.
Here is my code:
import UIKit
import RealityKit
import ARKit

class fvVessel: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!
    @IBOutlet weak var imageView: UIImageView!

    let fvBoatAnchor = try! FV.loadFvBoatScene()
    let fvBridgeAnchor = try! FV.loadFvBridgeScene()
    var imageAnchorToEntity: [ARImageAnchor: AnchorEntity] = [:]

    override func viewDidLoad() {
        super.viewDidLoad()
        let fvBoat = fvBoatAnchor.fvBoatObject as? Entity & HasCollision
        arView.installGestures(for: fvBoat!)
        fvBoatAnchor.generateCollisionShapes(recursive: true)
        arView.scene.addAnchor(fvBoatAnchor)
        arView.session.delegate = self
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        anchors.compactMap { $0 as? ARImageAnchor }.forEach {
            let anchorEntity = AnchorEntity()
            let modelEntity = fvBoatAnchor.fvBoatObject!
            anchorEntity.addChild(modelEntity)
            arView.scene.addAnchor(anchorEntity)
            anchorEntity.transform.matrix = $0.transform
            imageAnchorToEntity[$0] = anchorEntity
            imageView.isHidden = true
        }
    }

    // func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    //     anchors.compactMap { $0 as? ARImageAnchor }.forEach {
    //         let anchorEntity = imageAnchorToEntity[$0]
    //         anchorEntity?.transform.matrix = $0.transform
    //     }
    // }

    func installGestures(on object: ModelEntity) {
        object.generateCollisionShapes(recursive: true)
        arView.installGestures(.all, for: object)
    }

    func leaveScene() {
        arView?.session.pause()
        arView?.session.delegate = nil
        arView?.scene.anchors.removeAll()
        arView?.removeFromSuperview()
        arView?.window?.resignKey()
        arView = nil
    }

    @IBAction func leaveScene(_ sender: UIButton) {
        leaveScene()
    }
}
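One possible approach (an untested sketch, assuming `fvBridgeObject` is the entity name generated by your Reality Composer scene): swap the whole scene anchors on the button tap, then regenerate collision shapes and reinstall gestures on the newly shown entity so interaction keeps working.

```swift
@IBAction func switchToBridge(_ sender: UIButton) {
    // Remove the boat scene and add the bridge scene in its place.
    arView.scene.removeAnchor(fvBoatAnchor)
    arView.scene.addAnchor(fvBridgeAnchor)

    // Reinstall gestures and collisions on the new entity; gesture
    // recognizers are bound per entity, so they must be added again.
    if let fvBridge = fvBridgeAnchor.fvBridgeObject as? Entity & HasCollision {
        fvBridgeAnchor.generateCollisionShapes(recursive: true)
        arView.installGestures(for: fvBridge)
    }
}
```

The image tracker is driven by the ARSession configuration, not by the scene anchors, so swapping anchors should not affect it.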

Why am I unable to add anchor to arView scene after using removeAll()?

I'm trying to add and remove an anchor in my scene, but after removing it I'm unable to add it again. This might be because of the anchors added to the scene in the session function, but I'm not sure.
Do I need to run the session function again to add the anchorEntity to the scene (I haven't managed to do this due to some errors), or is there something else I'm missing?
Here is my code:
import UIKit
import RealityKit
import ARKit

class fvBoat: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    let fvBoatAnchor = try! Vessel.loadFvBoatScene()
    var imageAnchorToEntity: [ARImageAnchor: AnchorEntity] = [:]

    override func viewDidLoad() {
        super.viewDidLoad()
        let fvBoat = fvBoatAnchor.fvBoatObject as? Entity & HasCollision
        arView.installGestures(for: fvBoat!)
        fvBoatAnchor.generateCollisionShapes(recursive: true)
        // arView.scene.addAnchor(fvBoatAnchor)
        arView.session.delegate = self
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        anchors.compactMap { $0 as? ARImageAnchor }.forEach {
            let anchorEntity = AnchorEntity()
            let modelEntity = fvBoatAnchor.fvBoatObject!
            anchorEntity.addChild(modelEntity)
            arView.scene.addAnchor(anchorEntity)
            anchorEntity.transform.matrix = $0.transform
            imageAnchorToEntity[$0] = anchorEntity
        }
    }

    func installGestures(on object: ModelEntity) {
        object.generateCollisionShapes(recursive: true)
        arView.installGestures(.all, for: object)
    }

    func leaveScene() {
        arView?.session.pause()
        arView?.session.delegate = nil
        arView?.scene.anchors.removeAll()
        arView?.removeFromSuperview()
        arView?.window?.resignKey()
        arView = nil
    }

    @IBAction func leaveScene(_ sender: Any) {
        leaveScene()
    }

    @IBAction func addAnchor(_ sender: Any) {
        arView.scene.addAnchor(fvBoatAnchor)
    }

    @IBAction func clearScene(_ sender: Any) {
        arView.scene.anchors.removeAll()
    }
}
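One speculative explanation: `anchorEntity.addChild(modelEntity)` in `session(_:didAdd:)` re-parents `fvBoatObject` out of `fvBoatAnchor`, so re-adding `fvBoatAnchor` later adds an anchor that no longer contains the visible model. A hedged sketch that re-attaches the object to its original anchor before re-adding (assumes the same entity names as the code above):

```swift
@IBAction func addAnchor(_ sender: Any) {
    // If the model was re-parented to an image-anchor entity earlier,
    // move it back under the original scene anchor first.
    if let fvBoat = fvBoatAnchor.fvBoatObject, fvBoat.parent !== fvBoatAnchor {
        fvBoat.removeFromParent()
        fvBoatAnchor.addChild(fvBoat)
    }
    arView.scene.addAnchor(fvBoatAnchor)
}
```

If this is the cause, cloning the entity (`fvBoatObject.clone(recursive: true)`) for the image anchors instead of re-parenting the original would also avoid the problem.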

RealityKit - How to get rendered frame?

The session delegate lets me get a frame that contains the camera image plus some details, but how can I get something similar from RealityKit, i.e. a frame of the rendered objects and shadows without the background?
I'd like to do my own post-frame rendering in Metal: have RealityKit render its nice meshes, then make adjustments to the resulting frame myself and render it to my own surface.
TLDR
You can add a session delegate to RealityKit's ARView: arView.session.delegate = sessionDelegate
SwiftUI
The following shows how you could implement a session delegate with SwiftUI:
class SessionDelegate<ARContainer: UIViewRepresentable>: NSObject, ARSessionDelegate {
    var arVC: ARContainer
    var arGameView: ARView?

    init(_ control: ARContainer) {
        self.arVC = control
    }

    func setARView(_ arView: ARView) {
        arGameView = arView // keep a reference to the ARView
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {}

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        print("frame", frame)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {}
    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {}
}
Make your delegate available to your UIViewRepresentable through the makeCoordinator function:
struct CameraARContainer: UIViewRepresentable {
    func makeCoordinator() -> SessionDelegate<Self> {
        SessionDelegate<Self>(self)
    }

    func makeUIView(context: Context) -> ARView {
        let arView = ARGameView(frame: .zero)
        arView.session.delegate = context.coordinator
        context.coordinator.setARView(arView)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
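Used from SwiftUI, the container above can then be placed like any other view (`ContentView` here is just an illustrative name, and `ARGameView` is assumed to be an `ARView` subclass defined elsewhere):

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        // CameraARContainer wires up the ARSession delegate internally.
        CameraARContainer()
            .edgesIgnoringSafeArea(.all)
    }
}
```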

How to capture node from "renderer(_:nodeFor:)" instance method?

I'm confused about this delegate method that is called when planeDetection is active. The method is successfully called when a plane is detected, but where does the returned SCNNode go? How do I access it?
func renderer(_ renderer: SCNSceneRenderer,
              nodeFor anchor: ARAnchor) -> SCNNode?
For what it's worth, I'm using a custom ARCL library that allows me to place nodes by GPS coordinates. For some reason, with that framework the following method does not seem to fire upon detecting a plane. My delegates are set properly, because renderer(_:nodeFor:) does get called; otherwise I would just use this method.
func renderer(_ renderer: SCNSceneRenderer,
              didAdd node: SCNNode,
              for anchor: ARAnchor)
The SCNNode returned by renderer(_:nodeFor:) is passed to ARKit, which tethers it to the corresponding anchor and tracks its position from then on.
The returned node is added as a child of the scene's root node. Quite often renderer(_:nodeFor:) is used when tracking ARFaceAnchors:
var specialNode: SCNNode?

func renderer(_ renderer: SCNSceneRenderer,
              nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let sceneView = renderer as? ARSCNView, anchor is ARFaceAnchor
    else { return nil }

    let faceGeometry = ARSCNFaceGeometry(device: sceneView.device!)!
    self.specialNode = SCNNode(geometry: faceGeometry)
    return self.specialNode
}
...
func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode,
              for anchor: ARAnchor) {
    if let faceAnchor = anchor as? ARFaceAnchor,
       let faceGeo = node.geometry as? ARSCNFaceGeometry {
        faceGeo.update(from: faceAnchor.geometry)
    }
}
Nevertheless, you can use renderer(_:nodeFor:) method with any desired anchor type.
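For example, here is a hedged sketch of the same pattern applied to plane anchors, returning a translucent box that visualizes the detected plane's initial extent:

```swift
func renderer(_ renderer: SCNSceneRenderer,
              nodeFor anchor: ARAnchor) -> SCNNode? {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return nil }

    // A thin translucent box sized to the plane's initial extent.
    let box = SCNBox(width: CGFloat(planeAnchor.extent.x),
                     height: 0.001,
                     length: CGFloat(planeAnchor.extent.z),
                     chamferRadius: 0)
    box.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.4)
    return SCNNode(geometry: box)
}
```

As with the face example, you would keep the geometry up to date in renderer(_:didUpdate:for:) as the plane estimate grows.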

Dispatchqueue background thread update not working on iOS 12

I have the following piece of code that works perfectly on iOS 11.4.1 but fails on iOS 12:
let background = DispatchQueue(label: "task")
var debugMeshNode = SCNNode()
let myKit = MyKit()

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.background.async {
        let node = self.myKit.extractNode(anchor: anchor)
        self.debugMeshNode.addChildNode(node) // no node added on UI in iOS 12
    }
}

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    self.background.async {
        self.myKit.process(frame: frame)
    }
}
Could anyone point my mistake here?
UPDATE
The code seems to work if I add a print statement in the block, like so:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.background.async {
        let node = self.myKit.extractNode(anchor: anchor)
        self.debugMeshNode.addChildNode(node) // no node added on UI in iOS 12
        print("sample")
    }
}
Originally from here, I used this
func guaranteeMainThreadSynchronousExecution(_ block: () -> ()) {
    if Thread.isMainThread {
        block()
    } else {
        DispatchQueue.main.sync {
            block()
        }
    }
}
and updated my code like so:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    self.guaranteeMainThreadSynchronousExecution {
        self.background.async {
            let node = self.myKit.extractNode(anchor: anchor)
            self.debugMeshNode.addChildNode(node) // no node added on UI in iOS 12
        }
    }
}
Then it works flawlessly. Hope this helps someone.
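An alternative sketch (my suggestion, not part of the original answer): do the expensive extraction on the background queue, but hop to the main queue for the scene-graph mutation itself, since SceneKit hierarchy changes made off the main thread are not guaranteed to be picked up by the renderer. `myKit.extractNode(anchor:)` is the author's helper and is assumed here.

```swift
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    background.async {
        // Heavy work stays off the main thread.
        let extracted = self.myKit.extractNode(anchor: anchor)
        DispatchQueue.main.async {
            // Mutate the scene graph on the main thread only.
            self.debugMeshNode.addChildNode(extracted)
        }
    }
}
```

This avoids wrapping an async dispatch inside a synchronous main-thread helper, which does not actually serialize the scene-graph update.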