Is there a good way to implement RealityKit into a Swift playground?

I want to be able to add a file created in Reality Composer to a Swift Playground for testing purposes. I also think it would be interesting to use as a supplement to a playground book. How would one go about adding an .rcproject to a playground, like the Experience.rcproject Apple provides with the scene "Box"?

You can load a .reality file in a playground (this example uses Cocoa, i.e. a macOS playground) with the following code:
import Cocoa
import RealityKit
import PlaygroundSupport

// A non-AR view rendered in the playground's live view
let arView = ARView(frame: CGRect(x: 0, y: 0, width: 800, height: 200))
arView.environment.background = .color(.black)

// Load the scene from the playground's Resources folder
let fileURL = Bundle.main.url(forResource: "main", withExtension: "reality")
let bowlingScene = try! Entity.load(contentsOf: fileURL!)

let anchor = AnchorEntity()
anchor.addChild(bowlingScene)
anchor.scale = [4, 4, 4]
anchor.position.y = -0.5
arView.scene.anchors.append(anchor)

PlaygroundPage.current.liveView = arView
So, supply the .playground file with a main.reality scene nested in its Resources folder.
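If the resource might be missing, the same load can be written without force unwraps (a minimal sketch using the same API as above):
if let fileURL = Bundle.main.url(forResource: "main", withExtension: "reality") {
    do {
        let scene = try Entity.load(contentsOf: fileURL)
        anchor.addChild(scene)
    } catch {
        print("Could not load main.reality:", error)
    }
}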

Related

RealityKit – ARView with '.ar' mode + ARView with '.nonAR' mode

I want to create an AR experience with a full-screen video with AR elements, plus a small view at the bottom of the screen to show some relevant 3D objects.
In the past, I implemented such features with an ARSCNView for the AR part and a SCNView (SceneKit view) for the little preview part.
Is there a way I can do that with RealityKit for those 2 views?
I understand that RealityKit needs an ARView to render, which means I would need 2 ARViews on my screen, with different cameraMode settings:
let arView = ARView(frame: self.view.frame,
                    cameraMode: .ar,
                    automaticallyConfigureSession: true)
view.addSubview(arView)

let newAnchor = AnchorEntity(world: [0, 0, -1])
let newBox = ModelEntity(mesh: .generateBox(size: 0.5))
newAnchor.addChild(newBox)
arView.scene.anchors.append(newAnchor)

let arView2 = ARView(frame: CGRect(x: 50, y: 300, width: 200, height: 200),
                     cameraMode: .nonAR,
                     automaticallyConfigureSession: true)
view.addSubview(arView2)

let newAnchor2 = AnchorEntity(world: [0, 0, -1])
let newBox2 = ModelEntity(mesh: .generateBox(size: 0.5))
newAnchor2.addChild(newBox2)
arView2.scene.anchors.append(newAnchor2)
arView shows the video feed with the newBox as expected.
My problem is that arView2 shows the camera feed, although I expected a black view with only newBox2. In addition, this camera feed is distorted (it seems to be the contents of arView resized to fit arView2).
What am I missing?
Unlike ARKit, RealityKit 2.0 cannot run two ARSessions simultaneously (for instance, a face configuration as the primary session and a world configuration as a secondary one); there is just a single session. Therefore, the running session of the first ARView must be shared by both views:
arView2.session = arView.session
arView2.environment.background = .color(.systemIndigo)
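If you create the second view from scratch, the whole fix might look like this consolidated sketch, based on the question's own setup (turning off automaticallyConfigureSession here is my assumption, since the session is assigned manually):
let arView2 = ARView(frame: CGRect(x: 50, y: 300, width: 200, height: 200),
                     cameraMode: .nonAR,
                     automaticallyConfigureSession: false)  // assumed: session is supplied below
arView2.session = arView.session                            // share the single running session
arView2.environment.background = .color(.black)             // opaque background, no camera feed
view.addSubview(arView2)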

Play USDZ animation in RealityKit

I spent two days trying to understand how to properly play an animation in my RealityKit project.
I followed many tips from other Stack Overflow topics, but without success. I know that with RealityKit v2 we can only play the first animation from the usdz file, OK. I'm trying to play the first animation of the "toy_robot_vintage.usdz" delivered by Apple directly with Reality Composer.
Here is my complete code:
func loadModel(named: String, result: ARRaycastResult) {
    var usdzToLoad: String = ""
    switch named {
    case "ROBOT":
        usdzToLoad = "toy_robot_vintage.usdz"
    default:
        break
    }

    DispatchQueue.main.async {
        let modelToLoad = try! ModelEntity.loadModel(named: usdzToLoad)
        switch named {
        case "ROBOT":
            modelToLoad.name = "ROBOT"
        default:
            break
        }

        let anchor = AnchorEntity(plane: .horizontal,
                                  classification: .any,
                                  minimumBounds: [0.1, 0.1])
        anchor.position.y = 0.01
        anchor.addChild(modelToLoad)

        // Create a "Physics" model of the toy in order to add physics mode
        guard let modelEntity = anchor.children.first as? (Entity & HasPhysics)
        else { return }

        self.arView.installGestures([.rotation], for: modelEntity)
        modelEntity.generateCollisionShapes(recursive: true)
        modelEntity.physicsBody = PhysicsBodyComponent(shapes: [.generateBox(size: .one)],
                                                       mass: 1.0,
                                                       material: .default,
                                                       mode: .kinematic)
        self.currentEntity = modelEntity
        self.anchorsEntities.append(anchor)
        self.arView.scene.addAnchor(anchor)

        // self.currentEntity!.availableAnimations.forEach { self.currentEntity!.playAnimation($0.repeat()) }
        let robotAnimationResource = self.currentEntity?.availableAnimations.first
        self.currentEntity!.playAnimation(robotAnimationResource!.repeat(duration: .infinity),
                                          transitionDuration: 1.25,
                                          startsPaused: false)
    }
}
robotAnimationResource is always nil, which of course causes a fatal error when I try to play the animation.
Any idea? Thanks in advance for your help and support.
Change ModelEntity.loadModel to ModelEntity.load and it should now have the animations.
It's very weird and I don't know why, but that has worked for me in the past.
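Using the question's own constant name, that's a one-line change:
// Before: loadModel returns a flattened ModelEntity without the animations
let modelToLoad = try! ModelEntity.loadModel(named: usdzToLoad)
// After: load keeps the full entity hierarchy, including animations
let modelToLoad = try! ModelEntity.load(named: usdzToLoad)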
Also, anything conforming to HasPhysics is already an Entity, so to save yourself digging through the anchor's children, you should be able to replace that guard let modelEntity... line with this:
guard let modelEntity = modelToLoad as? HasPhysics else { return }
EDIT:
I just ran this in a playground and the animation runs fine:
import PlaygroundSupport
import UIKit
import RealityKit

let arview = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)
arview.environment.lighting.intensityExponent = 3

let newAnchor = AnchorEntity(world: .zero)
let newEnt = try! Entity.load(named: "toy_robot_vintage")
newAnchor.addChild(newEnt)
arview.scene.addAnchor(newAnchor)

for anim in newEnt.availableAnimations {
    newEnt.playAnimation(anim.repeat(duration: .infinity),
                         transitionDuration: 1.25,
                         startsPaused: false)
}

PlaygroundSupport.PlaygroundPage.current.liveView = arview
The issue then is that a model imported this way does not conform to HasPhysics (it would be useful if you mentioned whether that's where it's now failing for you).
Apply the ModelComponent to another entity class, or to a ModelEntity, instead.
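A minimal sketch of that idea (the recursive helper is hypothetical, not part of RealityKit):
// Hypothetical helper: depth-first search for the first ModelComponent
// in a hierarchy returned by Entity.load(named:)
func firstModelComponent(in entity: Entity) -> ModelComponent? {
    if let model = entity.components[ModelComponent.self] { return model }
    for child in entity.children {
        if let model = firstModelComponent(in: child) { return model }
    }
    return nil
}

let robot = try! Entity.load(named: "toy_robot_vintage")

// Copy the mesh and materials onto a ModelEntity, which does conform to HasPhysics
if let model = firstModelComponent(in: robot) {
    let physicsEntity = ModelEntity()
    physicsEntity.components.set(model)
    physicsEntity.generateCollisionShapes(recursive: true)
    physicsEntity.physicsBody = PhysicsBodyComponent(shapes: [.generateBox(size: .one)],
                                                     mass: 1.0,
                                                     material: .default,
                                                     mode: .kinematic)
}
Note that the animations stay on the loaded entity, so you would still call playAnimation on robot, not on the physics entity.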

Reality Composer - Custom Collision Between Entities of Different Scenes

I'm pretty new to RealityKit and ARKit. I have two scenes in Reality Composer, one with a book image anchor and one with a horizontal plane anchor. The first scene, with the image anchor, has a cube attached to the top of it, and the second scene, built on a horizontal plane, has two rings. All objects have a fixed collision. I'd like to run an animation when the rings and the cube touch. I couldn't find a way to do this in Reality Composer, so I made two attempts in code, to no avail (I'm printing "collision started" just to test the collision code without the animation). Would appreciate help on this.
Attempt #1:
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)

    // Add the anchors to the scene
    let componentBreakdownAnchor = try! CC.loadComponentBreakdown()
    arView.scene.anchors.append(componentBreakdownAnchor)
    let bookAnchor = try! CC.loadBook()
    arView.scene.anchors.append(bookAnchor)
    let ringsAnchor = try! CC.loadRings()
    arView.scene.anchors.append(ringsAnchor)

    let _ = ringsAnchor.scene?.subscribe(
        to: CollisionEvents.Began.self,
        on: bookAnchor
    ) { event in
        print("collision started")
    }
    return arView
}
Attempt #2:
func makeUIView(context: Context) -> ARView {
    let arView = ARView(frame: .zero)

    // Add the anchors to the scene
    let componentBreakdownAnchor = try! CC.loadComponentBreakdown()
    arView.scene.anchors.append(componentBreakdownAnchor)
    let bookAnchor = try! CC.loadBook()
    arView.scene.anchors.append(bookAnchor)
    let ringsAnchor = try! CC.loadRings()
    arView.scene.anchors.append(ringsAnchor)

    arView.scene.subscribe(
        to: CollisionEvents.Began.self,
        on: bookAnchor
    ) { event in
        print("collision started")
    }
    return arView
}
RealityKit scene
If you want to use collisions between models built from scratch in RealityKit, you first need to adopt the HasCollision protocol.
Let's see what the developer documentation says about it:
HasCollision protocol is an interface used for ray casting and collision detection.
Here's how your implementation should look if you generate models in RealityKit:
import Cocoa
import RealityKit

class CustomCollision: Entity, HasModel, HasCollision {

    let color: NSColor = .gray
    let collider: ShapeResource = .generateSphere(radius: 0.5)
    let sphere: MeshResource = .generateSphere(radius: 0.5)

    required init() {
        super.init()

        let material = SimpleMaterial(color: color, isMetallic: true)

        self.components[ModelComponent.self] = ModelComponent(mesh: sphere,
                                                              materials: [material])

        self.components[CollisionComponent.self] = CollisionComponent(shapes: [collider],
                                                                      mode: .trigger,
                                                                      filter: .default)
    }
}
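A quick usage sketch (assuming a macOS playground with a non-AR ARView, as in the first answer; with .trigger mode the overlap should raise a CollisionEvents.Began event without a physical response):
let collisionView = ARView(frame: CGRect(x: 0, y: 0, width: 800, height: 400))

// Two overlapping spheres so a collision fires right away
let ballA = CustomCollision()
let ballB = CustomCollision()
ballB.position.x = 0.4

let worldAnchor = AnchorEntity()
worldAnchor.addChild(ballA)
worldAnchor.addChild(ballB)
collisionView.scene.anchors.append(worldAnchor)

// Keep the returned Cancellable alive, otherwise the subscription is discarded
let collisionSub = collisionView.scene.subscribe(to: CollisionEvents.Began.self,
                                                 on: ballA) { event in
    print("Spheres collided")
}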
Reality Composer scene
And here's how your code should look if you use models from Reality Composer:
import UIKit
import RealityKit
import Combine

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var subscriptions: [Cancellable] = []

    override func viewDidLoad() {
        super.viewDidLoad()

        let groundSphere = try! Experience.loadStaticSphere()
        let upperSphere = try! Experience.loadDynamicSphere()

        let gsEntity = groundSphere.children[0].children[0].children[0]
        let usEntity = upperSphere.children[0].children[0].children[0]

        // CollisionComponent exists in case you turn on
        // the "Participates" property in the Reality Composer app
        print(gsEntity)

        var gsComp: CollisionComponent = gsEntity.components[CollisionComponent.self]!
        var usComp: CollisionComponent = usEntity.components[CollisionComponent.self]!

        gsComp.shapes = [.generateBox(size: [0.05, 0.07, 0.05])]
        usComp.shapes = [.generateBox(size: [0.05, 0.05, 0.05])]

        gsEntity.components.set(gsComp)
        usEntity.components.set(usComp)

        let subscription = self.arView.scene.subscribe(to: CollisionEvents.Began.self,
                                                       on: gsEntity) { event in
            print("Balls' collision occurred!")
        }
        self.subscriptions.append(subscription)

        arView.scene.anchors.append(upperSphere)
        arView.scene.anchors.append(groundSphere)
    }
}

SceneKit Particle Systems in a Swift Playground

Usually I instantiate SCNParticleSystems by using the file initializer like this:
var stars = SCNParticleSystem(named: "Stars.sncp", inDirectory: nil)
However, this project requires a Swift Playground, and when I try to use that init function with systems stored in the playground's Resources folder, it returns nil (even if I change the specified directory to "Resources" or "/Resources", etc.).
Are Playground resource paths handled differently from normal apps, or am I making a really stupid file-naming mistake?
In Xcode 11 and later there's no preconfigured .scnp Particle System file. Instead, you can use a Particle System object coming directly from the Xcode library.
Or, as always, you can create a particle system in an Xcode playground programmatically.
Here's the code:
import PlaygroundSupport
import SceneKit

let rectangle = CGRect(x: 0, y: 0, width: 1000, height: 200)
var sceneView = SCNView(frame: rectangle)
var scene = SCNScene()
sceneView.scene = scene
sceneView.backgroundColor = .black

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.position.z = 70
sceneView.scene!.rootNode.addChildNode(cameraNode)

let particleSystem = SCNParticleSystem()
particleSystem.birthRate = 500
particleSystem.particleLifeSpan = 0.5
particleSystem.particleColor = .systemIndigo
particleSystem.speedFactor = 7
particleSystem.emittingDirection = SCNVector3(1, 1, 1)
particleSystem.emitterShape = SCNSphere(radius: 15)

let particlesNode = SCNNode()
particlesNode.scale = SCNVector3(2, 2, 2)
particlesNode.addParticleSystem(particleSystem)
sceneView.scene!.rootNode.addChildNode(particlesNode)

PlaygroundPage.current.liveView = sceneView
I think you're making a mistake in the filename extension. It is .scnp, not .sncp.
Either try without any extension:
var stars = SCNParticleSystem(named: "Stars", inDirectory: nil)
or try with the correct extension:
var stars = SCNParticleSystem(named: "Stars.scnp", inDirectory: nil)

Convert Swift 2 video code into Swift 3 video code

The code below displays a video. It works perfectly in Swift 2, but in Swift 3 Xcode states that the MPMoviePlayerController code has been phased out. I just would like my video to be displayed in Swift 3 like it was in Swift 2.
import UIKit
import AVFoundation
import AVKit
import MediaPlayer

class video: UIViewController {

    var moviePlayer: MPMoviePlayerController!

    override func viewDidLoad() {
        super.viewDidLoad()

        let path = Bundle.main.path(forResource: "jxdo", ofType: "mp4")
        let url = URL(fileURLWithPath: path!)
        self.moviePlayer = MPMoviePlayerController(contentURL: url)

        if let player = self.moviePlayer {
            player.view.frame = CGRect(x: 67, y: 75,
                                       width: self.view.frame.size.width / 2,
                                       height: self.view.frame.size.height / 5)
            player.view.sizeToFit()
            player.scalingMode = MPMovieScalingMode.aspectFit
            player.isFullscreen = false
            player.controlStyle = MPMovieControlStyle.default
            player.movieSourceType = MPMovieSourceType.file
            player.repeatMode = MPMovieRepeatMode.none
            self.view.addSubview(player.view)
        }
    }
}
MPMoviePlayerController has been deprecated, and AVPlayerViewController should be used instead. See more on Apple's documentation page: https://developer.apple.com/reference/mediaplayer/mpmovieplayercontroller
Also, this thread might help in starting to convert to the newer API: How to load MPMoviePlayerController contentUrl asynchronous when loading view?
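A minimal Swift 3 sketch of the AVPlayerViewController replacement, reusing the resource name and frame from the question (the class name is hypothetical):
import UIKit
import AVFoundation
import AVKit

class VideoViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        guard let path = Bundle.main.path(forResource: "jxdo", ofType: "mp4") else { return }
        let url = URL(fileURLWithPath: path)

        let playerController = AVPlayerViewController()
        playerController.player = AVPlayer(url: url)

        // Embed the player as a child view controller, sized like the original
        addChildViewController(playerController)
        playerController.view.frame = CGRect(x: 67, y: 75,
                                             width: view.frame.size.width / 2,
                                             height: view.frame.size.height / 5)
        view.addSubview(playerController.view)
        playerController.didMove(toParentViewController: self)

        playerController.player?.play()
    }
}
(In Swift 4.2 and later the embedding calls are named addChild(_:) and didMove(toParent:).)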