I only created a new property of RealityKit, and now Xcode can't preview the SwiftUI canvas, although the project still builds successfully.
I created the app in Xcode: I chose the Augmented Reality App template and set User Interface to SwiftUI. The project works normally.
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        return VStack {
            Text("123")
            ARViewContainer().edgesIgnoringSafeArea(.all)
        }
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()
        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}

#if DEBUG
struct ContentView_Previews : PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif
Then I only created a new property of RealityKit in the makeUIView function:

    let test: AnchoringComponent.Target.Alignment = .horizontal

Now the project can't preview the canvas and shows the error:

    'Alignment' is not a member type of 'AnchoringComponent.Target'

The project still compiles successfully. I'm confused about what I'm running into. Has anyone met the same issue?
You have to fix several issues:
You can't use an anchoring component in the iOS Simulator or in the SwiftUI Canvas Preview, because it's only for pinning virtual content to real-world surfaces. So, no simulator for AR apps: RealityKit anchors are useless in iOS Simulator mode and in SwiftUI Canvas Preview mode.
// Use it only for compiled AR app, not simulated...
let _: AnchoringComponent.Target.Alignment = .horizontal
Not only are anchors useless in iOS Simulator mode and SwiftUI Preview mode, but so are other session-oriented properties (including ARView.session), like the ones you can see in the picture:
Change ARView's .backgroundColor to any other desirable one. The default color sometimes doesn't let you see a RealityKit scene; it looks like a bug.
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)
    let boxAnchor = try! Experience.loadBox()
    boxAnchor.steelBox?.scale = SIMD3(5, 5, 5)
    boxAnchor.steelBox?.orientation = simd_quatf(angle: Float.pi/4, axis: SIMD3(1,1,0))
    arView.backgroundColor = .orange
    arView.scene.anchors.append(boxAnchor)
    return arView
}
And here's what you can see in SwiftUI Preview Area now:
And, of course, you have to grant a camera permission before using an AR app. It doesn't matter whether you're using Storyboard or SwiftUI.
You need to add the Camera Usage Description property and the arkit string to the info.plist file.
The XML version looks like this:
/* info.plist

<key>NSCameraUsageDescription</key>
<string>Camera access for AR experience</string>

<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>armv7</string>
    <string>arkit</string>
</array>

*/
After fixing these issues, the app compiles and works as expected (without any errors):
I made a simple cube model with animation in Blender.
I exported it as a .fbx file with the bake animation option turned on.
With Reality Converter I converted the .fbx file to .usdz, which I imported into my Xcode project. But I am not getting the animation back in my project (see the code below).
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    let arView = ARView(frame: .zero)

    func makeUIView(context: Context) -> ARView {
        let anchorEntity = AnchorEntity()
        // get the local .usdz file that is in the Xcode project
        do {
            let cubeModel = try ModelEntity.load(named: "cube")
            print(cubeModel)
            print(cubeModel.availableAnimations)    // here I get an empty array
            anchorEntity.addChild(cubeModel)
            arView.scene.addAnchor(anchorEntity)
        } catch {
            print(error)
        }
        return arView    // the cube is visible on my iPad
    }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
From what I understand, it should be possible to import with animations. Am I missing something?
FBX model with character animation
Make sure you exported the model from Blender with animation enabled. To play an animation in RealityKit, use an AnimationPlaybackController object. To test a character animation in RealityKit, use this fbx model.
To convert a .fbx model to .usdz, use Reality Converter.
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        ARViewContainer()
            .ignoresSafeArea()
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        let model = try! Entity.loadModel(named: "walking.usdz")
        let anchor = AnchorEntity(world: [0, -1, -2])
        model.setParent(anchor)
        arView.scene.anchors.append(anchor)

        let animation: AnimationResource = model.availableAnimations[0]
        let controller = model.playAnimation(animation.repeat())
        controller.blendFactor = 0.5
        controller.speed = 1.7
        return arView
    }
    func updateUIView(_ view: ARView, context: Context) { }
}
I found the answer: in Blender I did the export in USD format with animations on.
Then I converted the file in Reality Converter to the .usdz format, and now the model and the animations are in my Xcode project.
I'm new to SwiftUI, and I've learned some basics of RealityKit and ARKit for my project. I just want to display an arrow that always faces towards the north, or at least get heading information displayed as text when I open the camera (AR experience).
Waiting for someone to solve this fundamental problem.
Thanks in advance!
Use the following code to create an AR experience that depends on the device's compass heading.
Reality Composer
Code
import SwiftUI
import RealityKit
import ARKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.cameraMode = .ar
        arView.automaticallyConfigureSession = false

        let config = ARWorldTrackingConfiguration()
        config.worldAlignment = .gravityAndHeading    // case = 1
        arView.session.run(config)

        let arrowScene = try! Experience.loadNorth()
        arView.scene.anchors.append(arrowScene)
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
struct ContentView : View {
    var body: some View {
        ZStack {
            ARViewContainer().ignoresSafeArea()
        }
    }
}
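If you also want the heading displayed as text, a minimal sketch using CoreLocation might look like this (the HeadingProvider and HeadingView names are illustrative, and it assumes the location permissions described under Settings below):

import SwiftUI
import CoreLocation

// Hypothetical helper that publishes the device's compass heading.
final class HeadingProvider: NSObject, ObservableObject, CLLocationManagerDelegate {
    @Published var heading: Double = 0.0
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingHeading()
    }
    func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
        heading = newHeading.magneticHeading    // degrees, 0 = magnetic north
    }
}

// Overlaying the heading on top of the AR view:
struct HeadingView : View {
    @StateObject private var provider = HeadingProvider()

    var body: some View {
        ZStack(alignment: .top) {
            ARViewContainer().ignoresSafeArea()
            Text(String(format: "%.0f°", provider.heading))
        }
    }
}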
Settings
On the device, go to Settings > Privacy > Location Services > On. After that, in Xcode, append the Privacy – Location Usage Description, Privacy – Location When In Use Usage Description and Privacy – Camera Usage Description keys in the Info tab.
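In the XML version of info.plist, these keys presumably look like this (the description strings are just examples):

/* info.plist

<key>NSLocationWhenInUseUsageDescription</key>
<string>Location access for AR heading</string>

<key>NSCameraUsageDescription</key>
<string>Camera access for AR experience</string>

*/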
In RealityKit I have an image anchor. When the image anchor is detected, I would like to display an object and play the animation it has. I created the animation in Reality Composer. It's a simple "Ease Out" animation which comes built into Reality Composer.
Currently, my code looks like this:
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = CustomARView(frame: .zero)

        // generate an image anchor
        let anchor = AnchorEntity(.image(group: "AR Resources", name: "imageAnchor"))
        // load the 3D model from the Experience.rcproject file
        let box = try! Experience.loadBox()
        // add the 3D model to the anchor
        anchor.addChild(box)
        // add the anchor to the scene
        arView.scene.addAnchor(anchor)
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}
The solution is easy. Choose Reality Composer's image anchor (supply it with a corresponding .jpg or .png image). Then assign a Custom Behavior to your model. As a trigger, use Scene Start; then apply any desired Action.
Your code will be frighteningly simple:
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        let scene = try! Experience.loadCylinder()
        arView.scene.addAnchor(scene)
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
The Action will be played automatically (immediately after the image anchor appears).
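If you'd rather start the Action on demand instead of at scene start, Reality Composer's Notification trigger can be posted from code. A sketch, assuming the trigger in the Cylinder scene is named showAction (Reality Composer generates a matching notifications property):

    let scene = try! Experience.loadCylinder()
    arView.scene.addAnchor(scene)
    // Post the notification whenever the action should run
    // (e.g. from a button) – the property name mirrors the
    // trigger's identifier set in Reality Composer.
    scene.notifications.showAction.post()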
I've created an app using the RealityKit template file. Inside Reality Composer there are multiple scenes; all the scenes use image recognition that activates some animations.
Inside Xcode I have to load all the scenes as anchors and append those anchors to the arView.scene.anchors array. The issue is an obvious one: as I present the physical 2D images one after the other, I get multiple anchors piled on top of each other, which is not desirable. I'm aware of calling arView.scene.anchors.removeAll() prior to loading the new anchor, but my issue is this:
How do I check when a certain image has appeared, in order to remove the existing anchor and load the correct one? I've tried to look for something like ARKit's didUpdate, but I can't see anything similar in RealityKit.
Many thanks
Foreword
RealityKit's AnchorEntity(.image), coming from Reality Composer, matches ARKit's ARImageTrackingConfiguration. When an iOS device recognises a reference image, it creates an image anchor (conforming to the ARTrackable protocol) that tethers a corresponding 3D model. And, as you understand, you must show just one reference image at a time (in your particular case the AR app can't operate normally when you give it two or more images simultaneously).
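For comparison, a minimal pure-ARKit setup of the same kind of tracking might look like this sketch (assuming the reference images live in an "AR Resources" group in the asset catalog):

    let config = ARImageTrackingConfiguration()
    if let refImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                        bundle: nil) {
        config.trackingImages = refImages
        config.maximumNumberOfTrackedImages = 1    // track one image at a time
    }
    arView.session.run(config)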
A code snippet showing how the if-condition logic might look:
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        let id02Scene = try! Experience.loadID2()
        print(id02Scene)    // prints scene hierarchy

        let anchor = id02Scene.children[0]
        print(anchor.components[AnchoringComponent.self] as Any)

        if anchor.components[AnchoringComponent.self] == AnchoringComponent(
                    .image(group: "Experience.reality",
                            name: "assets/MainID_4b51de84.jpeg")) {
            arView.scene.anchors.removeAll()
            print("LOAD SCENE")
            arView.scene.anchors.append(id02Scene)
        }
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) { }
}
ID2 scene hierarchy printed in console:
P.S.
You should implement a SwiftUI Coordinator class (read about it here), and inside the Coordinator use ARSessionDelegate's session(_:didUpdate:) instance method to update the anchors' properties at 60 fps.
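A minimal Coordinator sketch under those assumptions (the class and property names are illustrative):

    class Coordinator: NSObject, ARSessionDelegate {
        var parent: ARViewContainer
        init(_ parent: ARViewContainer) { self.parent = parent }
        // session(_:didUpdate:) is implemented here – see the snippet below
    }

    // Inside ARViewContainer:
    func makeCoordinator() -> Coordinator { Coordinator(self) }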
Also, you may use the following logic: if the anchor of scene 1 or the anchor of scene 3 is active, just delete all the anchors from the collection and load scene 2.
var arView = ARView(frame: .zero)

let id01Scene = try! Experience.loadID1()
let id02Scene = try! Experience.loadID2()
let id03Scene = try! Experience.loadID3()

func makeUIView(context: Context) -> ARView {
    arView.session.delegate = context.coordinator
    arView.scene.anchors.append(id01Scene)
    arView.scene.anchors.append(id02Scene)
    arView.scene.anchors.append(id03Scene)
    return arView
}
...
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    if arView.scene.anchors[0].isActive || arView.scene.anchors[2].isActive {
        arView.scene.anchors.removeAll()
        arView.scene.anchors.append(id02Scene)
        print("Load Scene Two")
    }
}
I've got some trouble getting a preview in Xcode 11.4. My code works when my phone is plugged in, so it's not a code problem, but when it's unplugged, the build always fails. I'd like to be able to work on my project, on the other files not using AR, without this error; when I resume the preview on those other files, I'm blocked because of it.
I've already put some strings in the info.plist file (privacy camera usage and required device capabilities), but it's still not working. Any ideas?
import SwiftUI
import RealityKit

struct ContentView : View {
    var body: some View {
        return ARViewContainer().edgesIgnoringSafeArea(.all)
    }
}

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.enablePlacement()
        return arView
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}

extension ARView {
    func enablePlacement() {
        let tapGestureRecognizer = UITapGestureRecognizer(target: self,
                                   action: #selector(handleTap(recognizer:)))
        self.addGestureRecognizer(tapGestureRecognizer)
    }

    @objc func handleTap(recognizer: UITapGestureRecognizer) {
        let location = recognizer.location(in: self)
        let results = self.raycast(from: location,
                               allowing: .estimatedPlane,
                              alignment: .vertical)
        if let firstResult = results.first {
            let mesh = MeshResource.generateBox(width: 0.5, height: 0.02, depth: 0.2)
            var material = SimpleMaterial()
            material.baseColor = try! MaterialColorParameter.texture(
                                      TextureResource.load(named: "glacier"))
            let modelEntity = ModelEntity(mesh: mesh, materials: [material])
            let anchorEntity = AnchorEntity(world: firstResult.worldTransform)
            anchorEntity.addChild(modelEntity)
            self.backgroundColor = .orange
            self.scene.addAnchor(anchorEntity)
        } else {
            print("No surface detected – move the device around")
        }
    }
}

#if DEBUG
struct ContentView_Previews : PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
#endif
The errors:
Value of type 'ARView' has no member 'raycast'
Cannot infer contextual base in reference to member 'estimatedPlane'
Cannot infer contextual base in reference to member 'vertical'
A lot of RealityKit symbols are not available in the Simulator. I think your only solution is to remove them from Simulator builds by using:

#if !targetEnvironment(simulator)
/* ... */
#endif
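Applied to the tap handler above, that might look like the following sketch (only the raycast-dependent part is wrapped):

    @objc func handleTap(recognizer: UITapGestureRecognizer) {
        #if !targetEnvironment(simulator)
        let location = recognizer.location(in: self)
        let results = self.raycast(from: location,
                               allowing: .estimatedPlane,
                              alignment: .vertical)
        // ...place the model entity as before...
        #endif
    }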
This error occurs because you have a simulator device selected, and the simulator cannot be used with AR apps. Change the dropdown next to the play and stop buttons at the top of Xcode to Any iOS Device (arm64).
You will need to connect a physical device to Xcode, or push your app to App Store Connect and use TestFlight, to test your code instead of using the simulator.