Vision and ARKit frameworks in Xcode project - swift

I want to create an ARKit app using Xcode. I want it to recognize a generic rectangle without pressing a button, and have that rectangle subsequently trigger a certain function.
How can I do that?

You do not need ARKit to recognise rectangles – Vision alone can do it.
To recognise generic rectangles, use VNDetectRectanglesRequest.
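For completeness, here's a minimal sketch of such a request (my own illustration, not the answerer's code; it assumes the pixel buffer comes from sceneView.session.currentFrame?.capturedImage):

import Vision

// Run a rectangle-detection request on a captured ARKit frame – no ML model needed.
func detectRectangles(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectRectanglesRequest { request, _ in
        guard let results = request.results as? [VNRectangleObservation] else { return }
        for rectangle in results {
            // Corner points are normalized (0...1, lower-left origin).
            print(rectangle.topLeft, rectangle.topRight,
                  rectangle.bottomLeft, rectangle.bottomRight)
            // Trigger your function here, e.g. place an ARAnchor for the rectangle.
        }
    }
    request.maximumObservations = 1     // stop after the first rectangle
    request.minimumConfidence = 0.8     // drop low-confidence candidates

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}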

As you rightly wrote, you need to use the Vision or CoreML frameworks in your project along with ARKit. You also have to supply a pre-trained machine learning model (.mlmodel file) that classifies input data in order to recognize your generic rectangle.
To create a learning model, use one of the following resources: TensorFlow, Turi Create, Caffe, or Keras.
Using an .mlmodel with classification tags inside it, Vision requests return results as VNClassificationObservation objects (object-detection models return VNRecognizedObjectObservation instead), which identify what was found in the captured scene. So, if an image's corresponding tag is found during the recognition process in ARSKView, an ARAnchor can be created (and an SK/SCN object can be placed onto this ARAnchor).
Here's a code snippet showing how it works:
import UIKit
import ARKit
import Vision
import SpriteKit
.................................................................
// file – ARBridge.swift
class ARBridge {
    static let shared = ARBridge()
    var anchorsToIdentifiers = [ARAnchor: String]()
}
.................................................................
// file – Scene.swift
// This block typically runs in Scene.swift, where `sceneView` is the ARSKView
// and `currentFrame` comes from sceneView.session.currentFrame.
DispatchQueue.global(qos: .background).async {
    do {
        // Wrap the Core ML model (Inceptionv3 here) for use with Vision.
        let model = try VNCoreMLModel(for: Inceptionv3().model)
        let request = VNCoreMLRequest(model: model, completionHandler: { (request, error) in
            DispatchQueue.main.async {
                guard let results = request.results as? [VNClassificationObservation],
                      let result = results.first else {
                    print("No results.")
                    return
                }
                // Place an anchor 0.75 m in front of the camera and map it
                // to the recognized identifier.
                var translation = matrix_identity_float4x4
                translation.columns.3.z = -0.75
                let transform = simd_mul(currentFrame.camera.transform, translation)
                let anchor = ARAnchor(transform: transform)
                ARBridge.shared.anchorsToIdentifiers[anchor] = result.identifier
                sceneView.session.add(anchor: anchor)
            }
        })
        let handler = VNImageRequestHandler(cvPixelBuffer: currentFrame.capturedImage, options: [:])
        try handler.perform([request])
    } catch {
        print(error)
    }
}
.................................................................
// file – ViewController.swift
func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let identifier = ARBridge.shared.anchorsToIdentifiers[anchor] else {
        return nil
    }
    // Show the recognized identifier as a text label at the anchor's position.
    let labelNode = SKLabelNode(text: identifier)
    labelNode.horizontalAlignmentMode = .center
    labelNode.verticalAlignmentMode = .center
    labelNode.fontName = UIFont.boldSystemFont(ofSize: 24).fontName
    return labelNode
}
And you can download two of Apple's sample projects, written by Vision engineers:
Recognizing Objects in Live Capture
Classifying Images with Vision and Core ML
Hope this helps.

Related

ARKit, RealityKit: add AnchorEntities in the world after loading saved ARWorldMap

I am developing a simple AR app, where the user can:
select an object to add to the world from a list of objects
manipulate the object, changing its position, its orientation and its scale
remove the object
save all the objects added and modified as an ARWorldMap
load a previously saved ARWorldMap
I'm having some problems getting the situation back to how it was when the user loads a previously saved ARWorldMap. In particular, when I re-add the objects, their positions and orientations are completely different from before.
Here are some details about the app. I initially add the objects as AnchorEntities. Doing some research I found out that the ARWorldMap doesn't save AnchorEntities but only ARAnchors, so I created a singleton object (AnchorEntitiesContainer in the following code) where I have a list of all the AnchorEntities added by the user that gets restored when the user wants to load a saved ARWorldMap.
Here is the initial insertion of objects in the world:
func addObject(objectName: String) {
    let path = Bundle.main.path(forResource: objectName, ofType: "usdz")!
    let url = URL(fileURLWithPath: path)
    if let modelEntity = try? ModelEntity.loadModel(contentsOf: url) {
        modelEntity.name = objectName
        modelEntity.scale = [3.0, 3.0, 3.0]
        let anchor = AnchorEntity(plane: .vertical, minimumBounds: [0.2, 0.2])
        anchor.name = objectName + "_anchor"
        anchor.addChild(modelEntity)
        arView.scene.addAnchor(anchor)
        modelEntity.generateCollisionShapes(recursive: true)
    }
}
Here is the saving of the ARWorldMap and of the list of entities added:
func saveWorldMap() {
    print("Save world map clicked")
    self.arView.session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap else {
            self.showAlert(title: "Can't get world map!", message: "Can't get current world map. Retry later.")
            return
        }
        for anchorEntity in self.arView.scene.anchors {
            AnchorEntitiesContainer.sharedAnchorEntitiesContainer().addAnchorEntity(anchorEntity: anchorEntity)
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
            try data.write(to: URL(fileURLWithPath: self.worldMapFilePath), options: [.atomic])
        } catch {
            fatalError("Can't save map: \(error.localizedDescription)")
        }
    }
    showAlert(title: "Save world map", message: "AR World Map successfully saved!")
}
Here is the loading of the ARWorldMap and the re-insertion of the AnchorEntities:
func loadWorldMap() {
    print("Load world map clicked")
    guard let mapData = try? Data(contentsOf: URL(fileURLWithPath: self.worldMapFilePath)),
          let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: mapData) else {
        return
    }
    let configuration = self.defaultConfiguration
    configuration.initialWorldMap = worldMap
    self.arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    for anchorEntity in AnchorEntitiesContainer.sharedAnchorEntitiesContainer().getAnchorEntities() {
        self.arView.scene.addAnchor(anchorEntity)
    }
}
The problem is that the positions and orientations of the AnchorEntities are completely different from before, so they are not where they should be.
Here are the things I tried:
I tried to save ModelEntities instead of AnchorEntities and repeat the initial insertion when loading the saved ARWorldMap, but it didn't give the expected results.
I thought that maybe the problem was that the world origin is different when the ARWorldMap gets loaded, so I tried to restore the previous world origin, but I don't know how to get that information or how to work with it.
I noticed that the ARWorldMap has a "center" parameter, so I tried to modify the AnchorEntities' transform matrices with that information, but I never got what I wanted.
So my question is how do I load AnchorEntities into the world in exactly their previous positions and orientations when loading an ARWorldMap?
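Since ARWorldMap persists ARAnchors but not AnchorEntities, one approach worth sketching (my own suggestion, not from the post) is to pin each model to a named ARAnchor before saving, then re-attach fresh AnchorEntities when ARKit re-delivers those anchors after loading. loadModel(named:) below is a hypothetical helper that reloads a USDZ by name; arView is assumed to be a property of the session delegate.

import ARKit
import RealityKit

// When adding a model: pin it to a named ARAnchor so the world map keeps it.
func place(_ model: ModelEntity, at transform: simd_float4x4, in arView: ARView) {
    let arAnchor = ARAnchor(name: model.name, transform: transform)
    arView.session.add(anchor: arAnchor)            // persisted in the ARWorldMap
    let anchorEntity = AnchorEntity(anchor: arAnchor)
    anchorEntity.addChild(model)
    arView.scene.addAnchor(anchorEntity)
}

// ARSessionDelegate: after running the restored configuration, ARKit
// re-delivers the saved anchors at their relocalized positions.
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for arAnchor in anchors {
        guard let name = arAnchor.name, let model = loadModel(named: name) else { continue }
        let anchorEntity = AnchorEntity(anchor: arAnchor)   // tracks the restored ARAnchor
        anchorEntity.addChild(model)
        arView.scene.addAnchor(anchorEntity)
    }
}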

Improving saved depth data image from TrueDepth sensor

I'm trying to save depth data from an iPad Pro's TrueDepth (Face ID) sensor. I took this demo code and added the following code behind a simple button:
@IBAction func exportData(_ sender: Any) {
    let ciimage = CIImage(cvPixelBuffer: realDepthData.depthDataMap)
    let depthUIImage = UIImage(ciImage: ciimage)
    let data = depthUIImage.pngData()
    print("data: \(realDepthData.depthDataMap)")
    do {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let path = directory.appendingPathComponent("FaceIdData.png")
        try data!.write(to: path)
        let activityViewController = UIActivityViewController(activityItems: [path], applicationActivities: nil)
        activityViewController.popoverPresentationController?.sourceView = exportMeshButton
        present(activityViewController, animated: true, completion: nil)
    } catch {
        print("Unable to save image")
    }
}
realDepthData is a class property that I added and update in dataOutputSynchronizer:
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    ...
    let depthData = syncedDepthData.depthData
    let depthPixelBuffer = depthData.depthDataMap
    self.realDepthData = depthData
    ...
}
I'm able to save the image (greyscale), but I'm losing some depth information, notably in the background, where all objects are fully white. You can see this in the image below: the wall and the second person behind are not appearing correctly (all white). If I'm not mistaken, from what I've seen in the app, I should have more information!
Thanks!
Only 32-bit depth makes sense here – you can inspect an image's depth by adjusting its gamma. The .exr and .hdr file formats support 32-bit; .png and .jpg are generally 8-bit, so going through UIImage.pngData() quantizes the float depth map and clips the background to white. You should also consider channel order when converting.
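For instance, here's a sketch of an export path that preserves the float depth (my own illustration, assuming realDepthData as above): convert the map to 32-bit float and write it through Core Image as a TIFF, which keeps float samples, instead of quantizing through UIImage.pngData().

import AVFoundation
import CoreImage

func exportDepth32(_ depthData: AVDepthData, to url: URL) throws {
    // Ensure 32-bit float depth (the sensor may deliver 16-bit).
    let floatDepth = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let ciImage = CIImage(cvPixelBuffer: floatDepth.depthDataMap)
    // .Lf = single-channel 32-bit float; TIFF keeps the float samples intact.
    try CIContext().writeTIFFRepresentation(
        of: ciImage,
        to: url,
        format: .Lf,
        colorSpace: CGColorSpace(name: CGColorSpace.linearGray)!,
        options: [:]
    )
}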

How to programmatically export 3D mesh as USDZ using ModelIO?

Is it possible to programmatically export 3D mesh as .usdz file format using ModelIO and MetalKit frameworks?
Here's the code:
import ARKit
import RealityKit
import MetalKit
import ModelIO

// `allocator`, `mesh` and `sender` are defined elsewhere in the project.
let asset = MDLAsset(bufferAllocator: allocator)
asset.add(mesh)

let filePath = FileManager.default.urls(for: .documentDirectory,
                                        in: .userDomainMask).first!
let usdz: URL = filePath.appendingPathComponent("model.usdz")

do {
    try asset.export(to: usdz)
    let controller = UIActivityViewController(activityItems: [usdz],
                                              applicationActivities: nil)
    controller.popoverPresentationController?.sourceView = sender
    self.present(controller, animated: true, completion: nil)
} catch let error {
    fatalError(error.localizedDescription)
}
When I press a Save button I get an error.
The Andy Jazz answer is correct, but it needs modification in order to work in a sandboxed SwiftUI app:
First, the SCNScene needs to be rendered in order to export correctly. You can't just create a bunch of nodes, stuff them into the scene's root node, call write(), and get a correctly rendered .usdz. The scene must first be put on screen in a SwiftUI SceneView, which causes all the assets to load, etc. I suppose you could instantiate an SCNRenderer and call prepare() on the root node, but that has some extra complications.
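For what it's worth, a sketch of that SCNRenderer alternative (my own illustration; scene is the SCNScene you intend to export):

import SceneKit
import Metal

// Force SceneKit to load the scene's resources without putting it on screen.
let renderer = SCNRenderer(device: MTLCreateSystemDefaultDevice(), options: nil)
renderer.scene = scene
let ready = renderer.prepare(scene.rootNode, shouldAbortBlock: nil)  // blocks until loaded
print("Resources prepared:", ready)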
Second, the Sandbox prevents a direct export to a URL provided by .fileExporter(). This is because SCNScene's write() works in two steps: it first creates a .usdc export, then zips the resulting files into a single .usdz. The intermediate files don't have the write privileges that the URL provided by .fileExporter() does (assuming you've set the Sandbox "User Selected File" privilege to "Read/Write"), so write() fails when the target directory is outside the Sandbox, even if the target URL itself is writable.
My solution was to write a custom FileWrapper, which I return if the WriteConfiguration UTType is .usdz:
public class USDZExportFileWrapper: FileWrapper {
    var exportScene: SCNScene

    public init(scene: SCNScene) {
        exportScene = scene
        super.init(regularFileWithContents: Data())
    }

    required init?(coder inCoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override public func write(to url: URL,
                               options: FileWrapper.WritingOptions = [],
                               originalContentsURL: URL?) throws {
        // Export to a temp file inside the Sandbox first, then move the
        // finished .usdz to the destination chosen by .fileExporter().
        let tempFilePath = NSTemporaryDirectory() + UUID().uuidString + ".usdz"
        let tempURL = URL(fileURLWithPath: tempFilePath)
        exportScene.write(to: tempURL, delegate: nil)
        try FileManager.default.moveItem(at: tempURL, to: url)
    }
}
Usage in a ReferenceFileDocument:
public func fileWrapper(snapshot: Data, configuration: WriteConfiguration) throws -> FileWrapper {
    if configuration.contentType == .usdz {
        return USDZExportFileWrapper(scene: scene)
    }
    return .init(regularFileWithContents: snapshot)
}
8th January 2023
At the moment iOS developers can still export only .usd, .usda and .usdc files; you can check this using the canExportFileExtension(_:) type method:
let usd = MDLAsset.canExportFileExtension("usd")
let usda = MDLAsset.canExportFileExtension("usda")
let usdc = MDLAsset.canExportFileExtension("usdc")
let usdz = MDLAsset.canExportFileExtension("usdz")
print(usd, usda, usdc, usdz)
It prints:
true true true false
However, you can easily export SceneKit scenes as .usdz files using the instance method write(to:options:delegate:progressHandler:).
let path = FileManager.default.urls(for: .documentDirectory,
                                    in: .userDomainMask)[0]
    .appendingPathComponent("file.usdz")

sceneKitScene.write(to: path,
                    options: nil,
                    delegate: nil,
                    progressHandler: nil)
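To tie this back to the ModelIO code in the question, one option (a sketch, assuming asset is the MDLAsset built above) is to bridge the asset into a SceneKit scene and use that same write method:

import SceneKit
import SceneKit.ModelIO   // bridging init SCNScene(mdlAsset:)
import ModelIO

let scene = SCNScene(mdlAsset: asset)   // bridge the ModelIO asset into SceneKit
let usdzURL = FileManager.default.urls(for: .documentDirectory,
                                       in: .userDomainMask)[0]
    .appendingPathComponent("model.usdz")
scene.write(to: usdzURL, options: nil, delegate: nil, progressHandler: nil)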

RealityKit for VR

Trying to use my RealityKit project as the foundation for an on-screen (VR) app instead of projecting onto the real world (AR) through the back camera.
Does anyone know how to load a RealityKit project asynchronously with the .nonAR camera option, so it renders in-app instead of using the rear-facing camera?
Do I create position information in the Swift code or in the Reality Composer project?
Here's how you can asynchronously load a .usdz VR model with the help of RealityKit's .loadModelAsync() instance method and Combine's AnyCancellable type.
import UIKit
import RealityKit
import Combine

class VRViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var anyCancellable: AnyCancellable? = nil
    let anchorEntity = AnchorEntity(world: [0, 0, -2])

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        arView.backgroundColor = .black
        arView.cameraMode = .nonAR

        anyCancellable = ModelEntity.loadModelAsync(named: "horse").sink(
            receiveCompletion: { _ in
                self.anyCancellable?.cancel()
            },
            receiveValue: { [self] (object: Entity) in
                if let model = object as? ModelEntity {
                    self.anchorEntity.addChild(model)
                    self.arView.scene.anchors.append(self.anchorEntity)
                } else {
                    print("Can't load a model due to some issues")
                }
            }
        )
    }
}
However, if you want to move inside the 3D environment, then instead of only setting the .nonAR camera mode, use:
arView.environment.background = .color(.black)
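And if you want to move the viewpoint inside that 3D environment, one common pattern (a sketch of my own, not part of the answer above) is to drive an explicit PerspectiveCamera entity:

import RealityKit

// An explicit camera you can reposition in .nonAR mode,
// instead of the (absent) device camera.
func setUpVirtualCamera(in arView: ARView) {
    let camera = PerspectiveCamera()
    camera.camera.fieldOfViewInDegrees = 60

    let cameraAnchor = AnchorEntity(world: [0, 1, 3])   // 3 m back, 1 m up
    cameraAnchor.addChild(camera)
    arView.scene.addAnchor(cameraAnchor)

    // Fly the camera by animating its anchor's transform.
    cameraAnchor.move(to: Transform(translation: [0, 1, 1]),
                      relativeTo: nil,
                      duration: 2.0)
}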

Load a spritekit scene from another bundle?

I am making a SpriteKit framework using Swift 4.2 and want to include some .sks files for scenes and actions. I have tried to load the scene from the bundle using the code below:
class func newGameScene() -> GameScene {
    guard let gameScenePath = Bundle(for: self).path(forResource: "GameScene", ofType: "sks") else {
        fatalError("GameScene.sks not found in bundle")
    }
    guard let gameSceneData = FileManager.default.contents(atPath: gameScenePath) else {
        fatalError("Could not read GameScene.sks")
    }
    let gameSceneCoder = NSKeyedUnarchiver(forReadingWith: gameSceneData)
    guard let scene = GameScene(coder: gameSceneCoder) else {
        fatalError("Could not decode GameScene")
    }
    // Set the scale mode to scale to fit the window
    scene.scaleMode = .aspectFill
    return scene
}
I load the scene and present it. (This code is mostly from Apple's template for SpriteKit, as I'm testing this issue.)
guard let view = view else {
    return nil
}
let scene = GameScene.newGameScene()
view.presentScene(scene)
view.ignoresSiblingOrder = true
view.showsFPS = true
view.showsNodeCount = true
return nil
The GameScene.sks file and the code are unchanged from Apple's template in this case. The code and the .sks assets live in the dynamic framework, which is imported into another project.
When the framework loads the scene into a view I pass it, the view shows the FPS and node count but not the "Hello, World!" text.
In the code below, also copied from the template, a breakpoint shows that these handlers are not called on mouse-down.
#if os(OSX)
// Mouse-based event handling
extension GameScene {

    override func mouseDown(with event: NSEvent) {
        if let label = self.label {
            label.run(SKAction.init(named: "Pulse")!, withKey: "fadeInOut")
        }
        self.makeSpinny(at: event.location(in: self), color: SKColor.green)
    }

    override func mouseDragged(with event: NSEvent) {
        self.makeSpinny(at: event.location(in: self), color: SKColor.blue)
    }

    override func mouseUp(with event: NSEvent) {
        self.makeSpinny(at: event.location(in: self), color: SKColor.red)
    }
}
#endif
I know it must have to do with how SpriteKit loads the scene, but I cannot find a solution. I have to use an NSKeyedUnarchiver because SpriteKit's built-in file initializer:
GameScene(fileNamed: "GameScene")
only loads from the main bundle.
In the above I assumed that the file could be loaded using a coder, but Tomato made the point that .sks files were most likely not saved using a coder. In that case, it may be impossible to load an .sks file from another bundle in SpriteKit using the API provided by Apple. The answer may not involve coders.
I have compiled the above discussion/solution into a single extension function on SKScene. Hope this helps someone!
import SpriteKit

extension SKScene {

    static func fromBundle(fileName: String, bundle: Bundle?) -> SKScene? {
        guard let bundle = bundle else { return nil }
        guard let path = bundle.path(forResource: fileName, ofType: "sks") else { return nil }
        if let data = FileManager.default.contents(atPath: path) {
            return NSKeyedUnarchiver.unarchiveObject(with: data) as? SKScene
        }
        return nil
    }
}
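Usage from the host app might look like this (MyFrameworkClass is a hypothetical class inside the framework, used only to locate its bundle):

// Locate the framework's bundle via any class it contains.
let bundle = Bundle(for: MyFrameworkClass.self)
if let scene = SKScene.fromBundle(fileName: "GameScene", bundle: bundle) as? GameScene {
    scene.scaleMode = .aspectFill
    view.presentScene(scene)
}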
Just as I thought, let gameSceneCoder = NSKeyedUnarchiver(forReadingWith: gameSceneData) was not creating a proper coder for you.
Just do:
guard let scene = NSKeyedUnarchiver.unarchiveObject(with: gameSceneData) as? SKScene else {
    fatalError()
}
This will unarchive the file properly for you.
Note: if you want to use GameScene, make sure GameScene is set as the custom class of the .sks file.