RealityKit - How to get rendered frame? - swift

The session delegate lets me get a frame that contains the camera image plus some details. But how can I get something similar from RealityKit, i.e. a frame of the rendered objects and shadows, without the background?
I'd like to do my own post-frame rendering in Metal: have RealityKit render its nice meshes, then make adjustments to the resulting frame myself and render it to my own surface.
Regards
Dariusz

TLDR
You can add a session delegate to RealityKit's ARView: arView.session.delegate = sessionDelegate
SwiftUI
The following shows how you could implement a session delegate with SwiftUI:
import SwiftUI
import RealityKit
import ARKit

class SessionDelegate<ARContainer: UIViewRepresentable>: NSObject, ARSessionDelegate {
    var arVC: ARContainer
    var arGameView: ARView?

    init(_ control: ARContainer) {
        self.arVC = control
    }

    func setARView(_ arView: ARView) {
        arGameView = arView // keep a reference to the ARView
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {}

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        print("frame", frame)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {}

    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {}
}
Make your delegate available to your UIViewRepresentable through the makeCoordinator function:
struct CameraARContainer: UIViewRepresentable {
    func makeCoordinator() -> SessionDelegate<Self> {
        SessionDelegate<Self>(self)
    }

    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.delegate = context.coordinator
        context.coordinator.setARView(arView)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
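
Note that the ARFrame delivered this way carries only the raw camera image, not RealityKit's composited render. As a minimal sketch of what you could pull out of it for your own Metal post-processing (standard ARKit/RealityKit API, but a starting point rather than a full pipeline):

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // capturedImage is the camera feed as a CVPixelBuffer (no rendered entities or shadows);
    // you could wrap it in a Metal texture via CVMetalTextureCache for post-processing.
    let pixelBuffer: CVPixelBuffer = frame.capturedImage
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    print("camera frame \(width)x\(height) at \(frame.timestamp)")

    // For RealityKit's composited output, ARView offers an asynchronous snapshot,
    // though it returns a UIImage rather than a per-frame Metal texture:
    // arGameView?.snapshot(saveToHDR: false) { image in /* post-process */ }
}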

Related

How can I replace a ModelEntity in RealityKit?

I want to switch the ModelEntity from fvBoatAnchor.fvBoatObject to fvBridgeAnchor.fvBridgeObject with the click of a button, but I'm not sure how to do that while also keeping the gestures, collisions, and image tracker after changing the entity.
Here is my code
import UIKit
import RealityKit
import ARKit

class fvVessel: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!
    @IBOutlet weak var imageView: UIImageView!

    let fvBoatAnchor = try! FV.loadFvBoatScene()
    let fvBridgeAnchor = try! FV.loadFvBridgeScene()
    var imageAnchorToEntity: [ARImageAnchor: AnchorEntity] = [:]

    override func viewDidLoad() {
        super.viewDidLoad()
        let fvBoat = fvBoatAnchor.fvBoatObject as? Entity & HasCollision
        arView.installGestures(for: fvBoat!)
        fvBoatAnchor.generateCollisionShapes(recursive: true)
        arView.scene.addAnchor(fvBoatAnchor)
        arView.session.delegate = self
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        anchors.compactMap { $0 as? ARImageAnchor }.forEach {
            let anchorEntity = AnchorEntity()
            let modelEntity = fvBoatAnchor.fvBoatObject!
            anchorEntity.addChild(modelEntity)
            arView.scene.addAnchor(anchorEntity)
            anchorEntity.transform.matrix = $0.transform
            imageAnchorToEntity[$0] = anchorEntity
            imageView.isHidden = true
        }
    }

//    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
//        anchors.compactMap { $0 as? ARImageAnchor }.forEach {
//            let anchorEntity = imageAnchorToEntity[$0]
//            anchorEntity?.transform.matrix = $0.transform
//        }
//    }

    func installGestures(on object: ModelEntity) {
        object.generateCollisionShapes(recursive: true)
        arView.installGestures(.all, for: object)
    }

    func leaveScene() {
        arView?.session.pause()
        arView?.session.delegate = nil
        arView?.scene.anchors.removeAll()
        arView?.removeFromSuperview()
        arView?.window?.resignKey()
        arView = nil
    }

    @IBAction func leaveScene(_ sender: UIButton) {
        leaveScene()
    }
}
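
This question has no answer attached here, but as a sketch of the swap itself, assuming a button wired to a hypothetical switchObject(_:) action: remove the boat scene's anchor, re-install gestures on the bridge entity, and add the bridge scene's anchor. The gesture and collision setup has to be repeated for the new entity, and the image-tracking code in session(_:didAdd:) would need the same treatment.

@IBAction func switchObject(_ sender: UIButton) {
    // Sketch only: swap the boat scene for the bridge scene on tap.
    arView.scene.removeAnchor(fvBoatAnchor)
    if let fvBridge = fvBridgeAnchor.fvBridgeObject as? Entity & HasCollision {
        fvBridgeAnchor.generateCollisionShapes(recursive: true)
        arView.installGestures(for: fvBridge) // re-install gestures on the new entity
    }
    arView.scene.addAnchor(fvBridgeAnchor)
}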

Receiving Xcode notification on Reality Composer animation end

I have the following Reality Composer project that loads properly. As you can see, when the animation completes, it should notify with the keyword "attackComplete".
How do I get this notification?
import RealityKit
import ARKit

class ViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let boxAnchor = try! Experience.loadOrcAttack()
        arView.session.delegate = self
        arView.scene.anchors.append(boxAnchor)
        print("done")
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        print(anchors)
    }
}
With Reality Composer's notifications you can implement two scenarios:
Action listener
This is your case, and it's easy to implement using
public var onAction: ((RealityKit.Entity?) -> Swift.Void)?.
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!
    let scene = try! Experience.loadScene()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.scene.anchors.append(scene)
        scene.actions.attackCompleted.onAction = notificationID // listener
    }

    fileprivate func notificationID(_ entity: Entity?) {
        print(scene.actions.attackCompleted.identifier)
    }
}
Here is one more example of how the .onAction completion handler can be used.
Trigger for action
When you need to notify Reality Composer's scene to play an action, use the following scenario:
import UIKit
import RealityKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!
    let scene = try! Experience.loadScene()

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.scene.anchors.append(scene)
    }

    @IBAction func press(_ sender: UIButton) {
        scene.notifications.spinner.post() // trigger for action
    }
}
or use a subscript for [NAME.NotificationTrigger]:
@IBAction func press(_ sender: NSButton) {
    scene.notifications.allNotifications[0].post()
}
Here's one more example of how the .post() instance method can be used.
P. S.
If you need more info, read this post.

How to share data from iPhone to Apple Watch if watch app is in background

I need some help sending data from the Watch to the iPhone and the other way around.
If I send data from my watch to the iPhone while the iPhone app is not in the foreground, the data is updated once the iPhone app is opened.
But if I send data from my iPhone to the watch, it only gets updated on the watch if the watch app is visible.
I can't get it to work while the app is in the background. Any suggestions?
Thank you very much!
InterfaceController
import WatchKit
import Foundation
import WatchConnectivity

class InterfaceController: WKInterfaceController {
    @IBOutlet weak var dataFromPhoneLabel: WKInterfaceLabel!
    let session = WCSession.default

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // Configure interface objects here.
        session.delegate = self
        session.activate()
    }

    override func willActivate() {
        // This method is called when the watch view controller is about to be visible to the user.
        super.willActivate()
    }

    override func didDeactivate() {
        // This method is called when the watch view controller is no longer visible.
        super.didDeactivate()
    }

    @IBAction func sendDataToPhoneButtonTapped() {
        let dataToPhone: [String: Any] = ["watch": "FromWatch" as Any]
        session.sendMessage(dataToPhone, replyHandler: nil, errorHandler: nil)
    }
}

extension InterfaceController: WCSessionDelegate {
    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {
        //
    }

    func session(_ session: WCSession, didReceiveMessage message: [String: Any]) {
        if let valueFromPhone = message["phone"] as? String {
            self.dataFromPhoneLabel.setText(valueFromPhone)
        }
    }
}
ViewController
import UIKit
import WatchConnectivity

class ViewController: UIViewController {
    @IBOutlet weak var phoneToWatchTextField: UITextField!
    var session: WCSession?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        if WCSession.isSupported() {
            session = WCSession.default
            session?.delegate = self
            session?.activate()
        }
    }

    @IBAction func sendToWatchButtonTapped(_ sender: Any) {
        if let validSession = self.session, validSession.isReachable {
            let dataToWatch: [String: Any] = ["phone": phoneToWatchTextField.text as Any]
            validSession.sendMessage(dataToWatch, replyHandler: nil, errorHandler: nil)
        }
    }
}

extension ViewController: WCSessionDelegate {
    func sessionDidBecomeInactive(_ session: WCSession) {
        //
    }

    func sessionDidDeactivate(_ session: WCSession) {
        //
    }

    func session(_ session: WCSession, activationDidCompleteWith activationState: WCSessionActivationState, error: Error?) {
        //
    }

    func session(_ session: WCSession, didReceiveMessage message: [String: Any]) {
        DispatchQueue.main.async {
            if let valueFromWatch = message["watch"] as? String {
                self.phoneToWatchTextField.text = valueFromWatch
            }
        }
    }
}
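
No answer is attached to this question here, but one relevant fact: sendMessage(_:replyHandler:errorHandler:) requires the counterpart app to be reachable, so it cannot wake an app that is in the background. For background delivery, WCSession's updateApplicationContext(_:) queues the most recent dictionary and hands it over when the counterpart next activates. A minimal sketch (sendToWatchInBackground(_:) is a hypothetical helper):

func sendToWatchInBackground(_ text: String) {
    do {
        // Replaces any previously queued context; delivered when the watch app activates.
        try WCSession.default.updateApplicationContext(["phone": text])
    } catch {
        print("updateApplicationContext failed: \(error)")
    }
}

The receiving side implements the matching delegate callback, session(_:didReceiveApplicationContext:), instead of session(_:didReceiveMessage:).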

SwiftUI UIViewRepresentable and Custom Delegate

When creating a UIViewControllerRepresentable for SwiftUI, how do you create a Coordinator so that it can access the delegate of a third party library?
In this case, I am trying to access BBMetal, a photo-filtering library.
This is a truncated version of the code we are trying to 'bridge' to SwiftUI:
class CameraPhotoFilterVC: UIViewController {
    private var camera: BBMetalCamera!
    private var metalView: BBMetalView!
    private var faceView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        ...
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        camera.start()
    }
    ...
}

extension CameraPhotoFilterVC: BBMetalCameraPhotoDelegate {
    func camera(_ camera: BBMetalCamera, didOutput texture: MTLTexture) {
        // do something with the photo
    }

    func camera(_ camera: BBMetalCamera, didFail error: Error) {
        // In main thread
        print("Fail taking photo. Error: \(error)")
    }
}
Using UIViewRepresentable, everything sets up properly: the CameraPhotoFilterVC works, starts the camera, and so on, but the extension does not respond. We tried to set this up as a Coordinator:
func makeCoordinator() -> Coordinator {
    Coordinator(self)
}

func makeUIViewController(context: UIViewControllerRepresentableContext<CameraPreviewView>) -> CameraViewController {
    let cameraViewController = CameraViewController()
    // Throws an error because what we really want is a BBMetalCameraPhotoDelegate
    // cameraViewController.delegate = context.coordinator
    return cameraViewController
}

class Coordinator: NSObject, BBMetalCameraPhotoDelegate {
    var parent: CameraPreviewView

    init(_ parent: CameraPreviewView) {
        self.parent = parent
    }

    func camera(_ camera: BBMetalCamera, didOutput texture: MTLTexture) {
        print("do something with the photo")
    }

    func camera(_ camera: BBMetalCamera, didFail error: Error) {
        print("Fail taking photo. Error: \(error)")
    }
}
We also tried simply leaving an extension of the ViewController:
final class CameraViewController: UIViewController {
    ...
}

extension CameraViewController: BBMetalCameraPhotoDelegate {
    func camera(_ camera: BBMetalCamera, didOutput texture: MTLTexture) {
        ...
    }
}
However, the delegate methods from BBMetalCameraPhotoDelegate do not 'fire'.
I suppose the question is: in UIViewControllerRepresentable or UIViewRepresentable, how do you add an 'external' delegate in the makeUIViewController method?
Usually, if this were, say, a UIPickerView, the following line would work:
picker.delegate = context.coordinator
But in this case the delegate is 'once removed'.
You need to set the BBMetalCamera's delegate at some point before you use it.
You might do it immediately after creating the camera; you didn't show how you create it, so I can't say whether that would be a good place.
You could probably just do it in viewDidLoad:
override func viewDidLoad() {
    super.viewDidLoad()
    camera.photoDelegate = self
}
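
To route that delegate into the SwiftUI Coordinator instead, the view controller would need to expose its camera. A sketch, assuming CameraViewController makes its BBMetalCamera reachable as a camera property (a hypothetical accessor for illustration; photoDelegate is the BBMetal property shown above):

func makeUIViewController(context: UIViewControllerRepresentableContext<CameraPreviewView>) -> CameraViewController {
    let cameraViewController = CameraViewController()
    cameraViewController.loadViewIfNeeded() // make sure viewDidLoad has created the camera
    cameraViewController.camera.photoDelegate = context.coordinator // the 'once removed' delegate
    return cameraViewController
}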

How to set up multi-face recognition with ARKit3?

I am trying to develop an ARKit 3 facial-recognition application that supports tracking multiple faces. I have made the following settings, but it does not work. Am I doing something wrong?
override func viewDidLoad() {
    super.viewDidLoad()
    /// Set up delegates
    faceSCNView.delegate = self
    faceSCNView.session.delegate = self
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    /// Face tracking configuration
    let configuration = ARFaceTrackingConfiguration()
    /// Max number of tracked faces = 2
    configuration.maximumNumberOfTrackedFaces = 2
    /// Run
    faceSCNView.session.run(configuration)
}
Delegate
extension GameViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        print(anchor.sessionIdentifier, anchor.identifier, anchor.name)
        if anchor is ARFaceAnchor {
            print("renderer didAdd", anchor.identifier, anchor.name ?? "noname")
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        print("renderer didUpdate", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
    }
}

extension GameViewController: ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        print(anchors.count)
        for anchor in anchors where anchor is ARFaceAnchor {
            let faceAnchor = anchor as! ARFaceAnchor
            print("Session didAdd", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARFaceAnchor {
            let faceAnchor = anchor as! ARFaceAnchor
            print("Session didUpdate", faceAnchor.identifier, faceAnchor.blendShapes[.mouthClose] ?? 0)
        }
    }
}
No matter how many people are in front of the camera, only one anchor is recognized; its identifier is CA831DB2-E078-45C3-9A1C-44F8459AA04F:
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04505286
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04578292
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04813192
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04813192
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04832877
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0484867
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0484867
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04869337
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.04869337
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.0489419
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05000613
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05000613
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05070856
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05031016
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05070856
renderer didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05118915
Session didUpdate CA831DB2-E078-45C3-9A1C-44F8459AA04F 0.05093153
Sorry, this turned out to be my own problem: my phone is an iPhone X with an A11 chip, and multi-face tracking is not supported on it.
The confusion arose because I was running iOS 13 Beta 1 at the time, which reported a maximum of 3 tracked faces at runtime even though only one face was actually tracked. After upgrading to iOS 13 Beta 2, the reported value was correct.
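
A way to guard against this class of problem is to query the device's reported capability before configuring the session; ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces is available from iOS 13. A minimal sketch:

func makeFaceConfiguration() -> ARFaceTrackingConfiguration? {
    guard ARFaceTrackingConfiguration.isSupported else { return nil }
    let configuration = ARFaceTrackingConfiguration()
    // Clamp to what the device reports (the author found an A11 iPhone X tracks only one face).
    configuration.maximumNumberOfTrackedFaces =
        min(2, ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces)
    return configuration
}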