Best way to render 3D animations in Swift from Maya

I've got two apps, one in Unity and one in iOS (created using Swift in Xcode). For the Unity game, our animator exports the 3D models from Maya into an fbx file which we are then able to render in the game.
Our challenge now is being able to render those same animations in the iOS game. I've read documentation that iOS supports usdz files. However, I'm having lots of trouble exporting the fbx files into usdz. These are the things I've tried:
I had our animator export the Maya file into Blender and then had them export to usd, which I then converted to usdz through Reality Converter. However, nothing shows up in the SCNScene (just a blank screen).
I downloaded all the appropriate SDKs from Apple and Autodesk to try to convert the fbx file directly to usdz through Reality Converter, but when I run it in Swift, the textures seem to get messed up (we have a second layer of texture for the eyes). Some fbx files turned out better than others. The one below was the best I could get.
Lastly, I created a .scn file and dragged the different dae/usdz files into the scene which I'm able to see in the scene editor, but when I load the scene using SCNScene(named: "filename"), I get an empty scene (with a sky and ground though).
Worth mentioning, I've also tried converting the animations to dae and glTF files, but with no luck.
I've tried two permutations of code to load the animations into the scene:
directly into the scene:
guard let url = Bundle.main.url(forResource: "ferox", withExtension: "usdz") else { fatalError() }
let scene = try! SCNScene(url: url, options: [.checkConsistency: true])
and also using an intermediary MDLAsset:
guard let url = Bundle.main.url(forResource: "ferox", withExtension: "usdz") else { fatalError() }
let asset = MDLAsset(url: url)
let scene = SCNScene(mdlAsset: asset)
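For reference, here's a quick way to sanity-check whether the loaded scene actually contains any nodes or animations (a minimal sketch, not specific to my assets):
import SceneKit

// Walk the node tree and print what was actually loaded, so a
// "blank screen" can be told apart from an empty scene graph.
func dumpNodeTree(_ node: SCNNode, indent: String = "") {
    let name = node.name ?? "<unnamed>"
    print("\(indent)\(name)  geometry: \(node.geometry != nil)  animationKeys: \(node.animationKeys.count)")
    for child in node.childNodes {
        dumpNodeTree(child, indent: indent + "  ")
    }
}

// Usage after loading with either approach:
// dumpNodeTree(scene.rootNode)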
There has to be a way to do this since fbx files are so common. Can someone tell me what I'm doing wrong (or even where, since there are so many different steps)?

Related

RealityKit – Playing multiple animations in USDZ file

Has anyone found a workflow to create multiple animations for a skeletal mesh packaged in a USDZ file and playback the animations using RealityKit?
I have a skeletal mesh with two animations (idle & run). I want to package them into a single USDZ file (or even multiple USDZ files if I have to) to be used in RealityKit.
I have been able to create an FBX export of my skeletal mesh and the animations, and ship it to Sketchfab for a valid USDZ export that RealityKit can understand. I do not know how to package the second animation into a single USDZ file and then use Swift to play back specific animations based on specific events.
There seem to be a lot of posts from about a year ago on the topic with no real answers and little activity since. Any pointers would be appreciated.
Although in SceneKit you can play multiple animations from a .dae model, in RealityKit 2.0 there is still no way to play multiple animations contained in a .usdz model. See this post and this post.
At the moment only one animation is accessible, using the following code:
let robot = try ModelEntity.load(named: "drummer")
let anchor = AnchorEntity()
anchor.children.append(robot)
arView.scene.anchors.append(anchor)

// Only the first animation in the file can be played back
robot.playAnimation(robot.availableAnimations[0].repeat(duration: .infinity),
                    transitionDuration: 0.5,
                    startsPaused: false)
If you choose the second or third element of the collection (if it even exists), your app crashes:
modelWithMultipleAnimations.availableAnimations[1]
modelWithMultipleAnimations.availableAnimations[2]
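One workaround that is often suggested is to export each clip into its own .usdz that shares the same rig, then borrow the clip from the secondary file at runtime. A rough sketch, assuming hypothetical "base" and "run" files whose joint hierarchies match (this is not guaranteed to work for every rig):
import RealityKit

func playRunAnimation(in arView: ARView) throws {
    // "base.usdz" holds the mesh + skeleton, "run.usdz" holds only the run clip.
    let base = try ModelEntity.load(named: "base")
    let runSource = try ModelEntity.load(named: "run")

    let anchor = AnchorEntity()
    anchor.addChild(base)
    arView.scene.anchors.append(anchor)

    // Borrow the animation resource from the second file and play it on the base model.
    if let runClip = runSource.availableAnimations.first {
        base.playAnimation(runClip.repeat(duration: .infinity),
                           transitionDuration: 0.3,
                           startsPaused: false)
    }
}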

RealityKit arview snapshot not capturing 3d object

I currently have an application using RealityKit to add AR content to the view. I have a button that allows the user to take a photo. Based on the documentation, ARView.snapshot() seems to do this. However, the captured image does not include the AR content (3D object).
ARViewContainer().arview.snapshot(saveToHDR: false) { image in
    UIImageWriteToSavedPhotosAlbum(image!, nil, nil, nil)
}
There is no error from this call, but the captured snapshot is only a "camera photo" and does not contain the AR content (3D object) that is in the ARView.
It looks like you're creating a new ARViewContainer every time you create a snapshot, and that ARView will be fresh without any 3D content in it.
You'll want to instead reference the current ARViewContainer to grab the ARView within that.
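In a SwiftUI setup that could look roughly like this (a sketch, assuming an ARViewContainer that accepts an injected ARView):
import SwiftUI
import UIKit
import RealityKit

struct ContentView: View {
    // One shared ARView – the same instance that renders the AR content.
    @State private var arView = ARView(frame: .zero)

    var body: some View {
        ZStack(alignment: .bottom) {
            ARViewContainer(arView: arView)
                .ignoresSafeArea()
            Button("Take Photo") {
                // Snapshot the ARView that is actually on screen,
                // not a freshly created container.
                arView.snapshot(saveToHDR: false) { image in
                    if let image = image {
                        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
                    }
                }
            }
        }
    }
}

struct ARViewContainer: UIViewRepresentable {
    let arView: ARView

    func makeUIView(context: Context) -> ARView { arView }
    func updateUIView(_ uiView: ARView, context: Context) {}
}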

Moving a Reality Composer entity after loading from Xcode?

I have used Reality Composer to build an AR scene, which currently has one object (as I understand, this is an entity). Using Xcode, I am loading this Reality Composer scene, which functions as expected. However, I would like my user to have the ability to scale or move the object, while still retaining all of my animations and Reality Composer setup.
I am using this code to load my object:
override func viewDidLoad() {
    super.viewDidLoad()

    // Load the "Box" scene from the "Experience" Reality File
    let boxAnchor = try! Experience.loadBox()
    boxAnchor.generateCollisionShapes(recursive: true)
    arView.scene.anchors.append(boxAnchor)
}
I have attempted to implement a traditional UIPinchGestureRecognizer and UITapGestureRecognizer to no avail. I do see options such as EntityScaleGestureRecognizer, though I've yet to figure out how to implement them. From some reading, I see that my "entity" needs to conform to HasCollision, but it seems I might be missing something, as I'd imagine Reality Composer must offer some sort of interaction functionality, given how simple it makes building AR experiences.
Thanks!
let boxAnchor = try! Experience.loadBox()
boxAnchor.generateCollisionShapes(recursive: true)

// "group" is the entity's name in the Reality Composer scene
let box = boxAnchor.group as? Entity & HasCollision
arView.installGestures(for: box!)
Also set Physics for the box in Reality Composer;
see https://forums.developer.apple.com/thread/119773
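If you only want the user to move and scale the entity (not rotate it), installGestures also takes a gesture mask. A small sketch reusing the same entity:
// Install only translation and scale gestures; the entity still needs to
// conform to HasCollision (i.e. have collision shapes generated).
if let box = boxAnchor.group as? Entity & HasCollision {
    arView.installGestures([.translation, .scale], for: box)
}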

Where's the particle system file in SceneKit in Xcode 11.3.1

Recently I updated my Xcode to 11.3.1. But while working with SceneKit, I found that I can't create a particle system file.
(Screenshots: Before / After)
How can I create a particle system in a file now?
SceneKit Library
In Xcode 11 and later (12 / 13 / 14) there are no preconfigured .scnp particle system files anymore. Instead, you can use a Particle System object from the Xcode Library (with the same settings in the Attributes Inspector as in Xcode 10).
If you manually place a Particle System from the library into SceneKit's scene graph, you can then retrieve it and set it up programmatically. Here's how that looks:
let particlesNode = sceneView.scene?.rootNode.childNode(withName: "particles",
                                                        recursively: true)
particlesNode?.particleSystems?.first?.isAffectedByGravity = true
particlesNode?.particleSystems?.first?.acceleration.z = 5.0
Creating particles programmatically
Or you can easily create a Particle System from scratch using just code:
let particleSystem = SCNParticleSystem()
particleSystem.birthRate = 1000
particleSystem.particleSize = 1.45
particleSystem.particleLifeSpan = 2
particleSystem.particleColor = .yellow
let particlesNode = SCNNode()
particlesNode.addParticleSystem(particleSystem)
sceneView.scene!.rootNode.addChildNode(particlesNode)
Creating .scnz file containing Particle System
Select a .scn file in Project Navigator (left pane) and choose File – Export...
In the drop-down menu choose Compressed SceneKit Scene Document (.scnz).
Or you can create a .scnp file by renaming a .scn file – the same way ycao proposed.
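Once such a .scnp file is in your bundle, you can also load it from code (a sketch; "fire.scnp" is a placeholder name):
// Load a pre-configured particle system file from the app bundle
// and attach it to a new node in the scene.
if let fire = SCNParticleSystem(named: "fire.scnp", inDirectory: nil) {
    let fireNode = SCNNode()
    fireNode.addParticleSystem(fire)
    sceneView.scene?.rootNode.addChildNode(fireNode)
}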
The Particle System moved to the SceneKit Scene file as a library object:
While creating a new file, select SceneKit Scene File, then change the extension to .scnp, and everything works.
You can also right-click inside the scene graph viewport and choose Create > Particle System, then adjust the settings as usual in the properties inspector. Then do the usual stuff in code, as shown above, to retrieve the system, move it about, change settings, etc.

Sphere not rendering in Unity for Google Cardboard

I was following this blog post on how to implement 360 degree video in Unity. At the end, I used ffmpeg to split the video into individual frames as recommended. I also set the first frame as the texture for each material on each sphere. The end result looks like this
(Screenshot: bad sphere)
The big problem though is that once I build and run it on my phone or just play the scene itself, the sphere simply fails to render. Could this be caused by the texture being the first frame? Or am I making some other sort of error? Many thanks.
Movies in Unity are usually rendered as textures on objects. On mobile the issue becomes that the device only wants to display video in a video player, so the Unity class MovieTexture is not supported.
I've had success circumventing this and rendering 360 video on the inside of a sphere using a plug-in from the Unity Asset Store called Easy Movie Texture.
For working on a Mac, here's what I did:
Download the Easy Movie Texture plug-in from the Unity Asset Store
Open the Demo Sphere demo scene from Assets/EasyMovieTexture/Scene
Create a new (empty) Prefab in your project, and drag the Sphere GameObject from the Demo Sphere scene onto the Prefab.
Reopen your Cardboard scene and drag the new videosphere prefab into your hierarchy.
Open your source 360-video in Quicktime
File -> Export -> 720p
Change file extension from '.mov' to '.mp4'
Drag your new mp4 file into your project's Assets/StreamingAssets directory. Note: don't import it through the menu system, as that will force Unity to convert it to OGG.
On the "Media Player Ctrl" script component of your videosphere GameObject, locate the "Str_File_Name" field and provide the FULL filename of your newly exported video file. Make sure to include the extension as part of the string, e.g. "mymovie.mp4".
Pretty sure that's everything. Hope it helps other folks stuck on this problem.
Final note, the video will only render on the device. In the editor you will only see a white texture on the sphere. You have to publish to the device in order to see your awesome 360-video.