Is it possible to create a game scene that saves an animation file as an asset in Unity 3D?

I want to know whether it's possible to have a scene in your game that allows the player to animate a character.
Let's ignore the cognitive load the player would face while animating a character and assume that, somehow, we manage to present the tools in a game scene. Can we make it so that after the player creates an animation, Unity stores it in a .anim file, and in the next game scene the player can control a character that plays that specific animation?
I think "real-time in-game animation creation" would describe it.
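For what it's worth, the in-memory part of this is possible at runtime; the caveat is that writing an actual .anim asset (AssetDatabase.CreateAsset) only works inside the editor, so in a built game you would keep the generated AnimationClip in memory or serialize its keyframe data yourself. A minimal sketch, assuming the player's edits arrive as position keyframes; the class and field names are made up:

    using UnityEngine;

    // Builds an AnimationClip from player-authored keyframes and carries it into the next scene.
    public class RuntimeAnimationBuilder : MonoBehaviour
    {
        public static AnimationClip PlayerClip;   // survives the scene change together with this object

        void Awake()
        {
            DontDestroyOnLoad(gameObject);
        }

        public void BuildClip(Keyframe[] xKeys)
        {
            var clip = new AnimationClip { legacy = true };   // legacy clips can be played without an AnimatorController
            clip.SetCurve("", typeof(Transform), "localPosition.x", new AnimationCurve(xKeys));
            PlayerClip = clip;
        }
    }

    // In the next scene, attach the stored clip to the character and play it.
    public class PlayRecordedAnimation : MonoBehaviour
    {
        void Start()
        {
            var anim = gameObject.AddComponent<Animation>();
            anim.AddClip(RuntimeAnimationBuilder.PlayerClip, "playerMade");
            anim.Play("playerMade");
        }
    }

The key point is that the clip lives in memory on an object (or static field) that survives the scene load, so the next scene can attach and play it without a .anim file ever being written to disk.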

Related

How can I make the player model invisible to the first-person camera but not to other players' cameras?

I am making an online game and I want the player not to see their own body, but I don't know how to do that without making it invisible to other players too.
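One common approach, sketched here under the assumption that you can tell which player instance is the locally controlled one (how depends on your networking solution): put the local player's body renderers on a dedicated layer and remove that layer from the first-person camera's culling mask. Remote clients never change the layer, so other players still see the body. The "LocalPlayerBody" layer is something you would create yourself in the Tags & Layers settings:

    using UnityEngine;

    // Hides the local player's own body from their first-person camera only.
    public class HideLocalBody : MonoBehaviour
    {
        public Camera firstPersonCamera;   // the camera rendering this player's view
        public GameObject bodyRoot;        // the root of the body mesh hierarchy

        void Start()
        {
            // Run this only on the locally controlled player.
            int layer = LayerMask.NameToLayer("LocalPlayerBody");
            foreach (var r in bodyRoot.GetComponentsInChildren<Renderer>())
                r.gameObject.layer = layer;

            // Remove the layer from this camera's culling mask; other players' cameras are untouched.
            firstPersonCamera.cullingMask &= ~(1 << layer);
        }
    }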

How to make a "DontDestroyOnLoad" object remember objects from the original scene

I have a game where the music GameObject starts in the main menu scene. It has two slider references that I assigned with drag and drop, and the player can interact with those sliders on the settings canvas in that scene. When the player changes scene, the music keeps playing because of DontDestroyOnLoad, and that's great! But if the player returns to the main scene, the DontDestroyOnLoad object that plays the music forgets the sliders, and the sliders don't do anything because they are no longer connected to it. Is there any way I can get around this?
This happens because the sliders your music manager references are destroyed when the original scene is unloaded and reloaded.
Solution 1:
To quickly fix the issue, simply mark those sliders as DontDestroyOnLoad as well. When you don't want the sliders to show, create two GameObject variables, drop the sliders in, and use GameObject.SetActive(true/false) to turn them on or off, as in the sketch below.
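A minimal sketch of that idea, assuming the sliders live under a settings canvas at the scene root; note that DontDestroyOnLoad only persists root GameObjects, so you mark the canvas (or re-parent the sliders) rather than the sliders themselves. The field names are illustrative:

    using UnityEngine;

    // Sketch of Solution 1: keep the sliders alive alongside the music manager
    // and just toggle them on and off between scenes.
    public class MusicManager : MonoBehaviour
    {
        public GameObject settingsCanvas;   // root object that contains both sliders
        public GameObject musicSlider;
        public GameObject sfxSlider;

        void Awake()
        {
            DontDestroyOnLoad(gameObject);
            DontDestroyOnLoad(settingsCanvas);   // the sliders survive the scene change with their canvas
        }

        public void ShowSliders(bool visible)
        {
            musicSlider.SetActive(visible);
            sfxSlider.SetActive(visible);
        }
    }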
Solution 2:
Create a function that triggers on every scene change and, inside it, have the sound manager locate the sliders with GameObject.FindWithTag; in the editor, give the slider GameObjects a tag so they can be found easily. That way, even though the sliders get destroyed, the sound manager can still grab references to the new ones. A sketch of this follows below.
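A sketch of that approach, assuming a "MusicSlider" tag created in the editor, a persistent sound manager with an AudioSource on it, and an illustrative volume handler:

    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.SceneManagement;

    // Sketch of Solution 2: every time a scene loads, the persistent sound manager
    // finds the freshly created slider by tag and hooks it back up.
    public class SoundManager : MonoBehaviour
    {
        void Awake()
        {
            DontDestroyOnLoad(gameObject);
            SceneManager.sceneLoaded += OnSceneLoaded;
        }

        void OnDestroy()
        {
            SceneManager.sceneLoaded -= OnSceneLoaded;
        }

        void OnSceneLoaded(Scene scene, LoadSceneMode mode)
        {
            GameObject sliderObject = GameObject.FindWithTag("MusicSlider");
            if (sliderObject == null)
                return;   // this scene has no settings UI

            Slider musicSlider = sliderObject.GetComponent<Slider>();
            musicSlider.onValueChanged.RemoveAllListeners();
            musicSlider.onValueChanged.AddListener(OnMusicVolumeChanged);
        }

        void OnMusicVolumeChanged(float value)
        {
            GetComponent<AudioSource>().volume = value;   // assumes the music plays from an AudioSource here
        }
    }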

How to load animations made in Blender into ARKit?

I made a 3D object in Blender and added some custom animation to it. I managed to load the object into the scene, but not the animation.
This is what I see in the Entities tab.
I searched for "How to load custom animations in ARKit" but couldn't find anything besides this link:
https://blog.pusher.com/animating-3d-model-ar-arkit-mixamo/
It explains how to download and play animations from Mixamo, but I want to load the animations that I made in Blender.
Just for the record, I'm new to ARKit programming and I'm learning on my own.
Any suggestions? Thanks in advance!
When we're talking about animation, it's better to talk in the context of the SceneKit and Core Animation frameworks.
Here's what Apple says about it:
SceneKit also uses CAAnimation objects for animations created using external 3D authoring tools (3dsMax, Maya, Blender) and saved in scene files. For example, an artist might create a game character with animations for walking, jumping, and other actions. You incorporate these animations into your game by loading animation objects from the scene file using the SCNSceneSource class and attaching them to the SCNNode object that represents the game character.
But remember: you need to export your animations in the .dae (Collada) file format, and your animations must be baked, i.e. all animation and deformations converted into keyframes for every frame on the timeline, regardless of how the animation was authored.

How can I use two Audio Listeners?

The problem is that I have a character controller (the player) with a camera, and that camera has an Audio Listener.
But I also have another camera, the Main Camera, which also has an Audio Listener.
The Main Camera uses a Cinemachine Brain and virtual cameras.
If I disable the Audio Listener on the Main Camera, the character in my cutscene walks to a door, but when the door opens there is no sound of it opening.
And if I disable the player controller camera's Audio Listener, then when I move my player up to a door, there is no sound when the player goes through it.
I need both to work: while the cutscene character is walking through a door and the door is opening, the player can still walk around.
Screenshot of the player controller camera and the audio listener:
And this is the Main Camera Audio Listener screenshot:
Right now, when running the game, the character medea_m_arrebola walks through a door via animation, and there is a sound of the door opening and closing.
This is part of a cutscene that runs in the background; I mean the cutscene camera is not enabled yet, but you can hear the audio.
Later I will switch between the cameras to show parts of the cutscene.
But the FPSController (Player) is also active and the player can move around; when he walks through a door, the door opens but there is no sound of it.
And if I enable both Audio Listeners, I get a warning message in the editor console saying that more than two Audio Listeners are enabled, etc.
This sounds like a design issue to me. Unity can only handle one AudioListener at a time. You basically have to construct your cutscene system to work with what Unity offers, or find some kind of workaround that fits your specific case.
You could try to enable/disable your AudioListeners on the fly, or use AudioSources around your player dedicated to directional audio while in a cutscene (like a surround-sound setup made of empty objects). That way you could simulate two AudioListeners. The best case would be to rework your system to use one AudioListener for both inputs.
Maybe try a workaround first, but if it doesn't work 100% as intended, do the rework. It's worth it in the long run.
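A minimal sketch of the enable/disable-on-the-fly workaround, with both listeners assigned in the Inspector (the names are illustrative). You would call UseCutsceneListener(true) when the cutscene camera takes over and UseCutsceneListener(false) when control goes back to the player:

    using UnityEngine;

    // Keeps exactly one AudioListener enabled at a time and switches between them
    // when a cutscene starts or ends.
    public class AudioListenerSwitcher : MonoBehaviour
    {
        public AudioListener playerListener;     // on the FPS controller camera
        public AudioListener cutsceneListener;   // on the Cinemachine-driven Main Camera

        public void UseCutsceneListener(bool useCutscene)
        {
            cutsceneListener.enabled = useCutscene;
            playerListener.enabled = !useCutscene;
        }
    }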

SmoothFollow camera in VR

In a test scene with a locomotion character, when I attach a SmoothFollow script to the character, it works as it should, but when I use the Oculus Rift to view the scene in VR, it no longer follows the character as it walks.
I am aware that the camera transform is overridden by the head-tracked pose, and that if I want to move the camera I must attach it as a child of another GameObject and move that root GameObject instead, but doing so still does not let me follow the character in VR.
Am I missing something, or is it just not possible on the Oculus Rift to have the character walk and automatically follow it?
It sounds like you are close. Did you attach the SmoothFollow script to the new GameObject (parent of the camera) rather than the camera itself?
Also, you may want to comment out the part of SmoothFollow where it sets the rotation of the camera; that can be very disorienting in VR.
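A rough sketch of that setup, assuming the VR camera sits under a rig object and this script goes on the rig root rather than on the camera itself; the offset, names, and smoothing values are illustrative, and rotation is deliberately left untouched:

    using UnityEngine;

    // Sketch: a stripped-down smooth follow attached to the camera *rig* (the parent
    // of the VR camera), since the HMD overrides the camera's own local pose.
    public class RigSmoothFollow : MonoBehaviour
    {
        public Transform target;                              // the walking character
        public Vector3 offset = new Vector3(0f, 1.6f, -3f);   // where the rig sits relative to the target
        public float smoothTime = 0.3f;

        private Vector3 velocity;

        void LateUpdate()
        {
            Vector3 desired = target.position + offset;
            transform.position = Vector3.SmoothDamp(transform.position, desired, ref velocity, smoothTime);
            // No rotation change here: the headset decides where the player looks.
        }
    }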