Unity Playables graph locking animated properties with values at the last frame

I've been trying to use a mixture of Unity's Animators and Playables in my game, and for the most part it works well, but there are two issues that I've been having for a long time and have at best worked around. Today I bashed my head against them again, and after finding no solution online I decided to get my lazy ass to finally ask for help.
The basic setup is that my characters have:
An Animator with its controller, state machine, etc. that is used mostly for movement, jumping, climbing, and so on. In case this is relevant, each character has an override controller based on a generic one.
A very simple playable graph with just an output (wrapping the animator) and an input (wrapping the specific clip I want to play at the time). This is used for actions and attacks.
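Concretely, the graph is built roughly like this (a simplified sketch, not the exact code; class and field names are placeholders):

```csharp
using UnityEngine;
using UnityEngine.Animations;
using UnityEngine.Playables;

// Minimal sketch of the setup described above; names are placeholders.
public class ActionPlayer : MonoBehaviour
{
    [SerializeField] private Animator animator;
    private PlayableGraph _graph;

    public void PlayAction(AnimationClip actionClip)
    {
        _graph = PlayableGraph.Create("ActionGraph");
        var output = AnimationPlayableOutput.Create(_graph, "Actions", animator);  // output wrapping the Animator
        var clipPlayable = AnimationClipPlayable.Create(_graph, actionClip);       // input wrapping the clip
        output.SetSourcePlayable(clipPlayable);
        _graph.Play();
    }

    private void OnDestroy()
    {
        if (_graph.IsValid())
            _graph.Destroy();
    }
}
```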
The problems I have are:
1- I can't seem to figure out an elegant, clean way to know when the clip fed to the graph (second part above) is finished. Currently I circumvent this by simply taking the clip's length and dividing it by the current animation speed factor, also accounting for when the animation is paused (e.g. hitstop); roughly what the sketch after this list does. This gets the job done but is quite inelegant, and I'm sure there must be a better way.
2- Most importantly, when I'm done with the graph and the standalone animation, the values of all the properties the clip touches become locked at their last value. They stay locked even during any animation played by the regular Animator; even if a later animation changes those values, they snap back to the locked "last frame" values as soon as it ends.
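The workaround for the first problem currently looks roughly like this (a simplified sketch, not the exact code; all names are placeholders):

```csharp
using System.Collections;
using UnityEngine;

// Simplified sketch of the duration-based workaround; names are placeholders.
public class ActionTimer : MonoBehaviour
{
    public bool isPaused;   // set true during hitstop

    public IEnumerator WaitForActionClip(AnimationClip clip, float speed, System.Action onFinished)
    {
        // Wait out the clip's duration scaled by the playback speed,
        // and stall the countdown while the animation is paused.
        float remaining = clip.length / speed;
        while (remaining > 0f)
        {
            if (!isPaused)
                remaining -= Time.deltaTime;
            yield return null;
        }
        onFinished?.Invoke();
    }
}
```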
I've tried several things to solve the second problem:
2.1- Set the default / desired value of the properties in the idle / default animation (to "mark" them as animatable properties in the normal Animator's animations). This only fixes the issue for the specific animation I touch; any other animation played after it instantly reverts to the value locked by the last frame of the clip played through the graph.
2.2- Destroy the playable wrapping the animation (I do this anyway for cleanup since I need to recreate it each time a new animation plays).
2.3- Destroy the graph and recreate it each time (surprisingly, even this keeps the values locked).
2.4- Disable the Animator and enable it again.
I'm frankly starting to lose my mind with the second problem, so any help would be exceedingly appreciated. Thanks in advance for any help!

Although this question is pretty old, I'm adding an answer (along with my related follow-up question) just in case more people end up here from a search engine.
Animations (both "legacy" and non-legacy) can fire off events at a given frame: just pick a point (a frame on the dope sheet, or a place in the graph on the curves view) and click "Add Event"...
There's some difference between legacy and non-legacy in how you specify which object/script and function to call, but in both cases it's basically a callback, so you can know for sure when an animation started or finished (or hit any point in between).
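For the non-legacy case, the event just needs a public method on a script sitting on the same GameObject as the Animator; something along these lines (class and method names are made up):

```csharp
using UnityEngine;

// Lives on the same GameObject as the Animator.
// The method name must match the function selected in the clip's animation event.
public class AnimationEventReceiver : MonoBehaviour
{
    // Called by an animation event placed on the clip's last frame.
    public void OnClipFinished()
    {
        Debug.Log("Clip finished");
    }
}
```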
Instead of trying to change the values of those properties that are "locked by animations" from void Update(), it seems you need to do it from within void LateUpdate().
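As a rough illustration (the overridden property here is an arbitrary, assumed example, not anything from the original question):

```csharp
using UnityEngine;

// Sketch: force a value after the Animator has written its result for this frame.
public class AnimatedValueOverride : MonoBehaviour
{
    [SerializeField] private SpriteRenderer target;            // arbitrary animated property
    [SerializeField] private Color desiredColor = Color.white;

    private void LateUpdate()
    {
        // With the default update mode, LateUpdate runs after the Animator's
        // evaluation, so this write wins for the rendered frame.
        target.color = desiredColor;
    }
}
```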
From my testing, using "legacy" animations (which also means the Animation component instead of an Animator Controller) allows you to use Update(), at least once the animation is finished.
Also worth keeping in mind that an Animator Controller won't accept "legacy" clips for any of its states.
And the Animation component doesn't seem to play (at least not auto-play) non-legacy clips.
As for my own question, well, it's basically the same as the OP's: is it possible to somehow "unlock" these properties (obviously without any states/animations playing) while using the "newer" Animator Controller?
Although, based on things I've read while trying to find out what's going on, those "legacy" animations are not really "legacy" and seem to be here to stay, for reasons like being better for performance.

Related

How to record animation in runtime

I have some animations for the upper body and lower body. I use an avatar mask and set weights for them so they can override each other. I have one button for each animation. You can see it in this video.
https://youtu.be/fYdoFFJCuxk
All I want to know is how I can record the animation that I play at runtime and export it to a file (.anim, .fbx, ...).
Thanks in advance!
Your question is not trivial, meaning you may need to try something out to narrow down the specific problems you'll encounter in your attempt.
I am not 100% sure how I would do it, but I think there are two options.
1.- Runtime animation info serialization + save and load:
You would need a script that keeps track of the animation's state machine and saves that info along a timeline. The script should write the info to a file (.xml, .json or plain text) so that it can be loaded and played back. This info basically needs to hold the time and the state of each animation change along the timeline since playback started.
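A minimal sketch of what such a script could look like (class and field names are made up; this assumes Unity's JsonUtility for the file format):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of option 1: log which animator state is active and when it changed,
// so the list can be written to JSON and replayed later. Names are made up.
public class AnimatorStateRecorder : MonoBehaviour
{
    [System.Serializable]
    public class StateChange { public float time; public int stateHash; }

    [System.Serializable]
    public class Recording { public List<StateChange> changes = new List<StateChange>(); }

    public Animator animator;
    public Recording recording = new Recording();
    private int _lastStateHash;

    private void Update()
    {
        var state = animator.GetCurrentAnimatorStateInfo(0);
        if (state.shortNameHash != _lastStateHash)
        {
            _lastStateHash = state.shortNameHash;
            recording.changes.Add(new StateChange { time = Time.time, stateHash = _lastStateHash });
        }
    }

    // Serialize to JSON so it can be written to a file and loaded back later.
    public string ToJson() => JsonUtility.ToJson(recording, true);
}
```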
2.- Transform recording:
You can put a component on every GameObject you want to track that serializes and stores its Transform along a timeline. This is basically a replay system. While the animation plays, all the transform info (positions and rotations) is saved to a file, and when it is loaded, those positions and rotations are applied back to the respective GameObjects.
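A bare-bones sketch of such a recorder (names are made up; writing the samples to a file is left out):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of option 2: attach to every GameObject you want to track.
// Samples can later be serialized and re-applied for playback. Names are made up.
public class TransformRecorder : MonoBehaviour
{
    [System.Serializable]
    public class Sample { public float time; public Vector3 position; public Quaternion rotation; }

    public List<Sample> samples = new List<Sample>();

    private void Update()
    {
        // Record one sample per frame while the animation plays.
        samples.Add(new Sample
        {
            time = Time.time,
            position = transform.position,
            rotation = transform.rotation
        });
    }

    // Apply a previously recorded sample back to this object during replay.
    public void Apply(Sample s)
    {
        transform.SetPositionAndRotation(s.position, s.rotation);
    }
}
```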
You can take a look at the Easy Replay System asset in the Asset Store. I bought it; it's really simple and works really well. I also tried it with animations and it works.
It does not have the serialization and save/load part, but you can very probably work that out.
Starting from scratch, I think the best option is 1. With an asset (which might be more expensive if it includes the serialization and save/load part), you can probably get it done faster with option 2.

Unity Animator Checking Too Slow

I'm currently just trying to learn to use the Animator within Unity. I'm very inexperienced at animation and don't understand it even in the editor, as I focus on programming/scripting.
I have an animation, and the states for the animations as well as the conditions are all working perfectly; however, the check for the next state is way too slow. I've tried changing the speed of the actual state, but that speeds the animation up and makes it look like my character is walking insanely fast.
I've tried messing around with the frames, spreading them over a longer time period and making the speed of the state faster, but the two seem to counteract each other: with the frames spread out the pace of the animation is slow, and when I make the state's speed quicker it just makes the frames tick faster, speeding the animation up again.
What I believe is happening is that the check for the next state only happens once the full animation has been played. What I need is for the check to happen constantly (frame by frame of the Unity game, not of the animation).
Any advice would be great. I've tried using YouTube to solve this before coming here, but most people are making platformer games with outside libraries, whereas I'm aiming for top-down 2D, all-directional character movement instead of linear x-axis movement.
I deeply apologise for my inability to find a suitable source earlier. I have literally just come across an article online with a simple solution.
Here: https://answers.unity.com/questions/221601/slow-animation-response.html
Basically, if you can't be bothered to click the link and you are having the same problem:
click the transition, find Has Exit Time in the Inspector, and untick it.
Sorry.

Animation not working on an instantiated prefab (weapon)

So I'm working on this project and I'm using it to finally learn how to animate in 3D (taking a small break from coding, hehe).
So here I am, faced with a problem, and I have no idea what I did wrong. First let me explain how everything works.
The Animator is attached to the player, and the player obviously has a structure of legs, arms, etc.
Everything inside the player is animated by this Animator.
I have a weapon (a wooden sword) whose pivot is at its bottom (in case it helps to know).
I'm animating it from that pivot point, which happens to be the parent of the sword model.
When I hit the V key the weapon gets instantiated in the player's hand (which is an empty GameObject), and when I press the F key the player attacks and activates a trigger in the Animator that starts the attack animation.
But the animation is not working properly. More precisely, the keyframes of the weapon are not being played (as you can see in the video, all the rotation axes show coordinates of 0 0 0 throughout the whole animation).
But, and this is where the strange things start, when I manually scrub through the animation second by second to see what's happening, you can see those coordinates start to change and it shows the animation exactly as it's supposed to be. Then, when I switch back to the idle state so that I can move the player around normally and hit the F key to attack, the correct animation is played and no problem happens... Magic? I don't think so... :p
What do you think? What could be causing this problem?
Who's up to solve my riddle :cool:
Seriously guys, what's going on here? I need help. o_O
Thank you all ;)
The Video : HERE
In general: you are using an ALPHA version, 2019.3.0a4... in short, don't.
Like any alpha version it is likely to have some bugs, especially since you are not even using the latest alpha, which afaik would be 2019.3.0a12!
2019.3.0b1 is actually already in BETA state and a lot of the former bugs should be fixed there, but it is still a beta release, meaning it is not ready for production.
So in general don't even use the beta. Rather, stick to the latest stable release version, which would currently be 2019.2.3f1.
There isn't a known issue listed relating to the Animator not finding a certain object at first and then not animating it, but as said, alpha versions are likely to have bugs. Also, since it is an instantiated prefab, the original instance will be gone; the Animator then re-binds it by name, so the main issue might be that you instantiate it in the first place instead of just having it there from the beginning.
You should consider only animating the parent/pivot object of that sword and not animating the sword itself at all. Simply spawn the sword as a child of the animated pivot and you should be fine.
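Roughly something like this (the field names are made up, not taken from your project):

```csharp
using UnityEngine;

// Rough sketch: spawn the sword under the pivot the Animator already animates,
// so the Animator never needs keyframes on the instantiated object itself.
public class WeaponSpawner : MonoBehaviour
{
    [SerializeField] private GameObject swordPrefab;
    [SerializeField] private Transform handPivot;   // the animated pivot in the hand

    public void SpawnSword()
    {
        GameObject sword = Instantiate(swordPrefab, handPivot);
        sword.transform.localPosition = Vector3.zero;
        sword.transform.localRotation = Quaternion.identity;
    }
}
```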

Using Unity animations globally

I am trying to code an end for a level in a simple game. A lot of things need to happen at slightly different times. The character needs to do a celebration. Text needs to pop up on screen. The camera needs to move to show off the win, and finally there needs to be a scene transition.
This all seems like a great thing to solve with an animation. All these things could come in and act on specific keyframes, raising an event at the end and ending the scene.
The problem is that animations appear to have to be attached to specific objects. My camera, player, and the static global GameController are completely unrelated; in fact, the global controller can't be related to anything. Because of that, my animations can't see all the objects and can't control them. I am instead stuck writing synchronized animations and code with a lot of yield return new WaitForSeconds(...);. I find this very difficult to manage, and it seems like a lot of waste. Is there any way I can use animations, or some other frame-based tool, to globally animate my game?
Look into Unity's Timeline system. I believe this is exactly the sort of thing it was made for.
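A minimal sketch of how the hookup could look once a Timeline asset with the camera, character, text and scene-transition tracks exists (component and method names here are assumed for illustration):

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Sketch: the Timeline asset holds the celebration, UI text, camera move and
// end-of-scene step on separate tracks; code only has to start the director.
public class LevelEndSequence : MonoBehaviour
{
    [SerializeField] private PlayableDirector director;   // has the Timeline asset assigned

    public void PlayEnding()
    {
        director.Play();
    }
}
```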

Do AKAudioPlayer nodes apply a 10 ms fade out once they are stopped before reaching the end of the file/buffer?

First off, I just want to say thanks to the team at AudioKit for shedding some light on some difficult problems through their code. I have a few questions.
1: It does not appear that the AKAudioPlayer class applies on-the-spot fades if a player is stopped before reaching the end of the file/buffer. Is there another place in the AudioKit library where this is handled?
2: Does anybody know if the AVAudioMixerNode's volume can be adjusted in real time? E.g. can I make adjustments every 1/441 ms to follow the curve of my fade envelope? There is also AVAudioUnitEQ with its globalGain property.
3: Is it possible to write to an AVAudioPCMBuffer’s floatChannelData after it has been scheduled, and while it is being played?
I’m writing a sampler app with AVFoundation. When it came time to tackle the problem of applying fades to loaded audio files within AVAudioPlayerNodes my first plan was to adjust the volume of the mixer node attached to my player node(s) in real time. This did not seem to have any sort of effect. It is entirely possible that my timing was off when doing this.
When I finally looked at the AKAudioPlayer class, I realized that one could adjust the actual buffer associated with an audio file. After a day or two of debugging, I was able to adapt the code from the AKAudioPlayer class into my PadModel class, with a few minor differences, and it works great.
However, I’m still getting those nasty little clicks whenever I stop one of my Pads from playing before the end of the file because the fades I apply are only in place at the start and the end of the file/buffer.
As far as my first question is concerned, in looking through the AKAudioPlayer class, it appears that the only fades applied to the buffer occur at the beginning and end of the buffer. The stop() method does not appear to apply any sort of on-the-spot fade to the buffer.
In my mind, the only way to have a fade out happen once a stop event happens is to apply it after said stop event, correct?
I have tried doing this, playing a 10 ms long faded-out buffer, consisting of the buffer 10 ms after the stop position, immediately after I call stop on my player node. It does not have the desired effect. I did not have much confidence in this scheme from the onset, but it seemed worth a try.
To be clear, once my stop() method is called, before actually stopping the player node, I allocate the 10 ms fade buffer and read into it at the position the player is currently at, for the number of frames my fade buffer consists of. I then apply the envelope to the recently allocated fade-out buffer, just as it is done in the fadeBuffer() method in the AKAudioPlayer class. At this point I finally call stop() on the playing node, then schedule and play the fade-out buffer.
Obviously there is going to be a discontinuity between stopping the buffer and playing the fade-out buffer; e.g. by the time I apply the fade to the fade-out buffer, the stop frame position I assigned to a local variable will no longer be valid, etc. And indeed, once I let off a pad, the sound that is played can only be described as discontinuous.
The only other solution to the problem I can think of strikes me as a daunting task, which would be to continually apply the fade envelope in realtime to the samples immediately ahead of the current play position as the buffer is being played. I currently do not believe I have the coding chops to pull this off.
Anyway, I looked through all the questions on S.O. concerned with AudioKit, and this particular subject did not seem to come up. So anybody's thoughts on the matter would be greatly appreciated. Thanks in advance!
If anybody wants to look at my code, the PadModel class starts on line 223 of this file:
https://github.com/mike-normal13/pad/blob/master/Pad.swift
AudioKit is lacking a fade-to-stop method. I would suggest requesting the feature, as it is a worthwhile endeavor. If you are using AVAudioUnitSampler, I believe you can set ADSR values to achieve the fading effect, but not in a very straightforward way. You have to create a preset using AU Lab, figure out how to get the release to work, and then import it into your project.