Override AnimatorController at runtime - unity3d

I have an AnimatorController and want to override it at runtime with another clip. The project has to run as a WebGL build, so it is not possible to create a new AnimatorController at runtime: that class is in the UnityEditor namespace, which is not included when building for WebGL (if you have a solution for that, you can post it too).
So I tried to use an AnimatorOverrideController, but I don't know why the animation isn't playing...
Basic_Run_02 is loaded by default before pressing play, and it should be overridden with the clip in SadWalk.fbx.
Does anyone have an idea why the animation is not playing?
public class AnimationHelper : MonoBehaviour
{
    public AnimationClip state;
    public AnimatorOverrideController overrideController;

    void Start()
    {
        state = Resources.Load("SadWalk", typeof(AnimationClip)) as AnimationClip;
        PlayMotion(state.name);
    }

    void PlayMotion(string name)
    {
        RuntimeAnimatorController myController = gameObject.GetComponent<Animator>().runtimeAnimatorController;
        overrideController = new AnimatorOverrideController();
        overrideController.name = "Test";
        overrideController.runtimeAnimatorController = gameObject.GetComponent<Animator>().runtimeAnimatorController;
        overrideController["Basic_Run_02"] = state;
        //Debug.Log(overrideController.clips.ToString);
        gameObject.GetComponent<Animator>().runtimeAnimatorController = overrideController;
        gameObject.GetComponent<Animator>().StartPlayback();
    }
}

The Animator and RuntimeAnimatorController classes are also available in the UnityEngine namespace, so you should make a couple of changes like the ones below:
gameObject.GetComponent<UnityEngine.Animator>().runtimeAnimatorController = overrideController;
And overrideController should be declared with the UnityEngine type, for example:
public AnimatorOverrideController overrideController;
Now if you export to WebGL you will have your code working.
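For reference, here is a minimal sketch of the whole override flow using only UnityEngine types (the names "Test", "SadWalk", and "Basic_Run_02" are taken from the question). Note that Animator.StartPlayback switches the Animator into playback of recorded frames and stops normal state machine updates, so it is omitted here:
using UnityEngine;

public class AnimationHelper : MonoBehaviour
{
    public AnimationClip state;

    void Start()
    {
        // Load the replacement clip from a Resources folder.
        state = Resources.Load("SadWalk", typeof(AnimationClip)) as AnimationClip;

        var animator = GetComponent<Animator>();

        // Wrap the current controller and swap the clip by its original name.
        var overrideController = new AnimatorOverrideController();
        overrideController.name = "Test";
        overrideController.runtimeAnimatorController = animator.runtimeAnimatorController;
        overrideController["Basic_Run_02"] = state;

        // Assign the override controller; no StartPlayback() call is needed.
        animator.runtimeAnimatorController = overrideController;
    }
}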

Scene not properly loaded when switching back and forth

Problem
The Unity project I build targets iOS, Android, and Windows x64. I have two scenes, A and B, where A is the main menu scene of my game and scene B is a level selection scene in which the user can choose a level to play. From scene A I can navigate to scene B and back again. When running the game in the Unity Editor, everything behaves as expected. The problem arises when I run the game on the target platforms (real devices). Then, when navigating like A --> B --> A, I end up in scene A being rendered as a black screen, except for the FPSIndicator game object, which is still rendered and doing its job. The FPSIndicator game object is a small piece of code that draws itself to the scene in the OnGUI callback. Nothing else is displayed.
Setup of Scene A
I have a Unity UI button there ("Drag and Drop"), which, when clicked, loads scene B using this code:
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.EventSystems;

public class GameTypeButtonController : MonoBehaviour, IPointerClickHandler
{
    public ButtonSounds ButtonSounds;
    public string SceneNameToLoad;
    public GameType GameType;

    public void OnPointerClick(PointerEventData eventData)
    {
        StartCoroutine(Do());
    }

    private IEnumerator Do()
    {
        var animator = gameObject.GetComponent<Animator>();
        if (animator != null)
        {
            animator.SetTrigger("Clicked");
        }

        var audioSource = gameObject.GetComponent<AudioSource>();
        if (audioSource != null)
        {
            var clip = GetRandomAudioClip(ButtonSounds);
            audioSource.clip = clip;
            audioSource.Play();
            yield return new WaitWhile(() => audioSource.isPlaying);
        }

        Logger.LogInfo("[GameTypeButtonController.Do] Setting game type " + GameType);
        GameManager.Instance.CurrentGameType = GameType;
        SceneManager.LoadScene(SceneNameToLoad);
    }

    private AudioClip GetRandomAudioClip(ButtonSounds buttonSounds)
    {
        var numberOfAudioClips = buttonSounds.AudioClips.Length;
        var randomIndex = Random.Range(0, numberOfAudioClips);
        return buttonSounds.AudioClips[randomIndex];
    }
}
Setup of Scene B
In scene B, I have a button in the lower left which brings me back to scene A when clicked. This is not a Unity UI button, but a regular sprite with a CircleCollider2D attached. The script on that button looks like this:
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

public class HomeButtonController : MonoBehaviour
{
    public ButtonSounds ButtonSounds;
    public string SceneNameToLoad;

    void OnMouseDown()
    {
        StartCoroutine(Do());
    }

    private IEnumerator Do()
    {
        var animator = gameObject.GetComponent<Animator>();
        if (animator != null)
        {
            animator.SetTrigger("Clicked");
        }

        var audioSource = gameObject.GetComponent<AudioSource>();
        if (audioSource != null)
        {
            var clip = GetRandomAudioClip(ButtonSounds);
            audioSource.clip = clip;
            audioSource.Play();
            yield return new WaitWhile(() => audioSource.isPlaying);
        }

        SceneManager.LoadScene(SceneNameToLoad);
    }

    private AudioClip GetRandomAudioClip(ButtonSounds buttonSounds)
    {
        var numberOfAudioClips = buttonSounds.AudioClips.Length;
        var randomIndex = UnityEngine.Random.Range(0, numberOfAudioClips);
        return buttonSounds.AudioClips[randomIndex];
    }
}
General Notes
Two objects use DontDestroyOnLoad: GameManager and MusicPlayer.
What I have checked so far
Scenes are properly referenced in Build Settings
As I use Unity Cloud Build, I have disabled the Library Caching feature to avoid issues with old build artifacts (so every time I build, I do a proper, clean build)
I can locally build all three platforms (Unity reports it as "Build successful"). So no build errors.
I am using LoadSceneMode.Single (default)
I am using the same Unity version locally and in Unity Cloud Build: 2018.3.0f2
Update 2019-02-19:
When I navigate from a third scene C back to scene A using the same mechanism (a sprite button calling a coroutine), I also end up on the very same black screen. So the issue probably exists within scene A?
Update 2 from 2019-02-19:
Here's my GameManager code:
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using UnityEngine;

public class GameManager : MonoBehaviour
{
    public EventHandler<LevelStartedEventArgs> LevelStarted;
    public EventHandler<LevelFinishedEventArgs> LevelFinished;

    // General
    public GameType CurrentGameType;
    public GameScene CurrentScene;
    public int CurrentLevel;
    public static GameManager Instance;
    public GameLanguage Language;
    public bool IsMusicEnabled;

    private string gameStateFile;

    void Start()
    {
        if (Instance == null)
        {
            gameStateFile = Application.persistentDataPath + "/gamestate.dat";
            Load(gameStateFile);
            DontDestroyOnLoad(gameObject);
            Instance = this;
        }
        else if (Instance != this)
        {
            Destroy(gameObject);
        }
    }

    public void Save()
    {
        Logger.LogInfo("[GameManager.Save] Saving game state to " + gameStateFile);
        var bf = new BinaryFormatter();
        var file = File.Create(gameStateFile);
        var gameState = new GameState();
        gameState.Language = Language;
        gameState.IsMusicEnabled = IsMusicEnabled;
        bf.Serialize(file, gameState);
        file.Close();
        Logger.LogInfo("[GameManager.Save] Successfully saved game state");
    }

    public void Load(string gameStateFile)
    {
        Logger.LogInfo("[GameManager.Load] Loading game state from " + gameStateFile);
        if (File.Exists(gameStateFile))
        {
            var bf = new BinaryFormatter();
            var file = File.Open(gameStateFile, FileMode.Open);
            var gameState = (GameState)bf.Deserialize(file);
            file.Close();
            Language = gameState.Language;
            IsMusicEnabled = gameState.IsMusicEnabled;
        }
        Logger.LogInfo("[GameManager.Load] Successfully loaded game state");
    }

    [Serializable]
    class GameState
    {
        public GameLanguage Language;
        public bool IsMusicEnabled;
    }
}
Thanks for any hints!
So, I finally managed to solve the issue on my own. Here's a list of the steps I went through to eliminate the issue:
I installed the "Game development with Unity" workload in Visual Studio 2017 in order to be able to use the Visual Studio Debugger. That gave me the possibility to properly debug my scripts (setting breakpoints, inspect values, ...) and go through them, step by step. See this link for some details on how to accomplish this. Unfortunately, this did not solve my problem yet, but gave a good basis for troubleshooting my issue.
As the issue described above also arose in Windows x64 builds, I decided to troubleshoot it on a Windows build. So I reconfigured my Windows build in the Player Settings, setting Script Debugging to true, Development Build to true, and Copy PDB files to true. Then I ran another Windows build.
If you now run your game, it will show a development console in the lower left of the screen, giving a lot of useful information (exceptions thrown, etc.). You can also open the log file directly from within the game, which provides a LOT of useful information for tracking down the real issue.
So, I had some PlatformNotSupportedExceptions in the logs, which I was able to fix, along with some other errors. Now it is working.
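The logs aren't shown here, but the BinaryFormatter-based save system above is a plausible source of a PlatformNotSupportedException (this is an assumption, not something the logs confirm). As a hedged sketch, the Save/Load pair could be rewritten with JsonUtility to avoid binary serialization entirely, as drop-in replacements inside the GameManager class:
public void Save()
{
    Logger.LogInfo("[GameManager.Save] Saving game state to " + gameStateFile);
    var gameState = new GameState { Language = Language, IsMusicEnabled = IsMusicEnabled };
    // JsonUtility works on all Unity target platforms and needs no BinaryFormatter.
    File.WriteAllText(gameStateFile, JsonUtility.ToJson(gameState));
}

public void Load(string gameStateFile)
{
    Logger.LogInfo("[GameManager.Load] Loading game state from " + gameStateFile);
    if (File.Exists(gameStateFile))
    {
        var gameState = JsonUtility.FromJson<GameState>(File.ReadAllText(gameStateFile));
        Language = gameState.Language;
        IsMusicEnabled = gameState.IsMusicEnabled;
    }
}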
Although I've answered my own question, I hope this will be useful to some of you.

In VR development, how can I make the handle shoot?

I want to make an example of shooting, so I wrote this in the handle button event:
using UnityEngine;
using System.Collections;

public class fire : MonoBehaviour
{
    public GameObject bullet;
    SteamVR_TrackedObject trackedObj;

    void start()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        var device = SteamVR_Controller.Input((int)trackedObj.index);
        if (device.GetTouchDown(SteamVR_Controller.ButtonMask.Trigger))
        {
            GameObject obj = Instantiate(bullet, transform.position, transform.rotation);
            Vector3 fwd = transform.TransformDirection(Vector3.forward);
            obj.GetComponent<Rigidbody>().AddForce(fwd * 2800);
        }
    }
}
but when debugging, pressing the handle button didn't produce a bullet, and it errored at the line
var device = SteamVR_Controller.Input((int)trackedObj.index);,
the error is:
Object reference not set to an instance of an object.
First, confirm that your fire script is attached to your controller object, and that the controller object also has the SteamVR_TrackedObject script attached (which is provided by the SteamVR plugin).
Then, lastly, make sure this line is actually executing. Note that Unity only calls the method automatically if it is spelled Start() with a capital S; a lowercase start() is never called, which would leave trackedObj null and produce exactly this error:
void Start()
{
    trackedObj = GetComponent<SteamVR_TrackedObject>();
}
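Putting it together, here is a minimal corrected sketch, assuming the SteamVR 1.x API (SteamVR_TrackedObject and SteamVR_Controller); the null check is an extra guard, not part of the original code:
using UnityEngine;

public class fire : MonoBehaviour
{
    public GameObject bullet;
    private SteamVR_TrackedObject trackedObj;

    // Capital S: Unity only invokes the magic methods by exact name.
    void Start()
    {
        trackedObj = GetComponent<SteamVR_TrackedObject>();
    }

    void Update()
    {
        // Guard against a missing SteamVR_TrackedObject component.
        if (trackedObj == null) return;

        var device = SteamVR_Controller.Input((int)trackedObj.index);
        if (device.GetTouchDown(SteamVR_Controller.ButtonMask.Trigger))
        {
            GameObject obj = Instantiate(bullet, transform.position, transform.rotation);
            Vector3 fwd = transform.TransformDirection(Vector3.forward);
            obj.GetComponent<Rigidbody>().AddForce(fwd * 2800f);
        }
    }
}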

How to destroy Unity objects created in a static class?

Please consider the following code. It loads a shader from a certain path and creates a material from it. It's a utility for use in editor extensions.
using UnityEditor;
using UnityEngine;

public static class GpuImageProcessing
{
    private static readonly string matPath = Application.dataPath + "/Uplus/Zcommon/Material/ImageProcessing/";
    private static Shader Gaussian2D5Shader;
    private static Material Gaussian2D5Mat;

    static GpuImageProcessing()
    {
        Gaussian2D5Shader = (Shader)AssetDatabase.LoadAssetAtPath(matPath + "Gaussian2D5.shader", typeof(Shader));
        Gaussian2D5Mat = new Material(Gaussian2D5Shader);
    }
}
Now the problem is: how can I destroy this material before the editor recompiles each time some script is changed? After code changes, the editor recompiles the scripts and creates a new execution context, so a new version of this GpuImageProcessing will be created. I want to destroy the materials created in the previous runtime.
PS: This is included in a DLL file, so I can't make it a ScriptableObject and listen to the event callbacks, and because it's a utility class I really like it being static.
Thanks to @Programmer, I figured out a workaround to ensure the destruction of the materials. For this to work, the class has to become a singleton and inherit from ScriptableObject. Then we can implement OnDisable(), which will be called when the editor is recompiling. Here is the working example:
using UnityEditor;
using UnityEngine;

public class GpuImageProcessing : ScriptableObject
{
    private static readonly string matPath = "Assets/Uplus/Zcommon/Material/ImageProcessing/";
    private Material Gaussian2D5Mat;
    private Material Gaussian1DVariableMat;
    private static GpuImageProcessing _instance;
    private bool destroyedAlready;

    public static GpuImageProcessing Instance
    {
        get
        {
            if (_instance != null) return _instance;
            _instance = CreateInstance<GpuImageProcessing>();
            _instance.Init();
            return _instance;
        }
    }

    private void Init()
    {
        var Gaussian2D5Shader = (Shader)AssetDatabase.LoadAssetAtPath(matPath + "Gaussian2D5.shader", typeof(Shader));
        Gaussian2D5Mat = new Material(Gaussian2D5Shader);
        var gaus1dVarShader = (Shader)AssetDatabase.LoadAssetAtPath(matPath + "Gaussian1DVariable.shader", typeof(Shader));
        Gaussian1DVariableMat = new Material(gaus1dVarShader);
    }

    private void OnDisable()
    {
        Debug.Log("On disable GpuImgProc");
        if (destroyedAlready) return;
        DestroyImmediate(Gaussian2D5Mat);
        DestroyImmediate(Gaussian1DVariableMat);
        destroyedAlready = true;
        DestroyImmediate(this);
    }
}
Important: note the destroyedAlready field. It ensures the materials are destroyed only once, since OnDisable is called twice (once when the Unity editor invokes it and once when we call DestroyImmediate(this)). Be sure to set destroyedAlready before calling DestroyImmediate(this).
While not an exact solution to the original question, and not a pretty one, this does work: it prevents the memory leak while keeping static-style access through the Instance property.
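For completeness, here is an alternative sketch that keeps the class fully static; it assumes Unity 2017.1 or newer, where the UnityEditor.AssemblyReloadEvents API is available, and destroys the materials just before the editor reloads assemblies:
using UnityEditor;
using UnityEngine;

public static class GpuImageProcessingStatic
{
    private static readonly string matPath = "Assets/Uplus/Zcommon/Material/ImageProcessing/";
    private static Material Gaussian2D5Mat;

    static GpuImageProcessingStatic()
    {
        var shader = (Shader)AssetDatabase.LoadAssetAtPath(matPath + "Gaussian2D5.shader", typeof(Shader));
        Gaussian2D5Mat = new Material(shader);

        // Called right before scripts are recompiled and assemblies reloaded.
        AssemblyReloadEvents.beforeAssemblyReload += () =>
        {
            Object.DestroyImmediate(Gaussian2D5Mat);
        };
    }
}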

How can I force an orientation per scene in Unity3D

Unity3D has a Screen class with an orientation property that allows you to force orientation in code, which lets you have different scenes with different orientations (useful in mini-games).
Setting this works fine for Android but crashes on iOS. What is the fix?
The problem is that the file UnityViewControllerBaseiOS.mm, which gets generated during the iOS build, has an assert in it that inadvertently prevents this property from being used. It is possible to create a post-build class that runs after the iOS build files have been generated and alters the generated code before you compile it in Xcode.
Just create a C# script named iOSScreenOrientationFix.cs and paste in the following code - adapted from this Unity3D forum post. Note that this file must be placed in a folder named Editor, or in one of its subfolders.
using UnityEngine;
using UnityEditor;
using UnityEditor.Callbacks;
using System.IO;

namespace Holovis
{
    public class iOSScreenOrientationFix : MonoBehaviour
    {
#if UNITY_CLOUD_BUILD
        // This method is added in the Advanced Features Settings on UCB
        // PostBuildProcessor.OnPostprocessBuildiOS
        public static void OnPostprocessBuildiOS(string exportPath)
        {
            Debug.Log("OnPostprocessBuildiOS");
            ProcessPostBuild(BuildTarget.iPhone, exportPath);
        }
#endif

        [PostProcessBuild]
        public static void OnPostprocessBuild(BuildTarget buildTarget, string path)
        {
#if !UNITY_CLOUD_BUILD
            ProcessPostBuild(buildTarget, path);
#endif
        }

        private static void ProcessPostBuild(BuildTarget buildTarget, string path)
        {
            if (buildTarget == BuildTarget.iOS)
            {
#if !UNITY_CLOUD_BUILD
                Debug.Log("Patching iOS to allow setting orientation");
#endif
                string filePath = Path.Combine(path, "Classes");
                filePath = Path.Combine(filePath, "UI");
                filePath = Path.Combine(filePath, "UnityViewControllerBaseiOS.mm");
                Debug.Log("File Path for View Controller Class: " + filePath);
                string classFile = File.ReadAllText(filePath);
                string newClassFile = classFile.Replace("NSAssert(UnityShouldAutorotate()", "//NSAssert(UnityShouldAutorotate()");
                File.WriteAllText(filePath, newClassFile);
            }
        }
    }
}
You can set it in a scene by attaching the following MonoBehaviour to a game object
using UnityEngine;

namespace Holovis
{
    public class SetDeviceOrientation : MonoBehaviour
    {
        public ScreenOrientation orientation = ScreenOrientation.AutoRotation;

        void Awake()
        {
            Screen.orientation = orientation;
        }
    }
}
NOTE: Setting Screen.orientation has no effect when running on desktop, in the Unity editor, or when testing using Unity Remote.
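If attaching a component in every scene is inconvenient, a hedged alternative is to drive the orientation from a single scene-load callback; the scene names and orientations below are hypothetical examples:
using UnityEngine;
using UnityEngine.SceneManagement;

public static class OrientationBySceneName
{
    [RuntimeInitializeOnLoadMethod]
    private static void Register()
    {
        SceneManager.sceneLoaded += (scene, mode) =>
        {
            // Hypothetical mapping: one landscape mini-game scene, portrait elsewhere.
            Screen.orientation = scene.name == "MiniGame"
                ? ScreenOrientation.LandscapeLeft
                : ScreenOrientation.Portrait;
        };
    }
}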

Unity3D Programmatically Assign EventTrigger Handlers

In the new Unity3D UI (Unity > 4.6), I'm trying to create a simple script I can attach to a UI component (Image, Text, etc.) that will allow me to wedge in a custom tooltip handler. So what I need is to capture PointerEnter and PointerExit events on my component. So far I'm doing the following with no success: the EventTrigger component shows up, but I can't get my delegates to fire to save my life.
Any ideas?
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.EventSystems;

public class TooltipTrigger : MonoBehaviour
{
    public string value;

    void Start()
    {
        EventTrigger et = this.gameObject.GetComponent<EventTrigger>();
        if (et == null)
            et = this.gameObject.AddComponent<EventTrigger>();

        EventTrigger.Entry entry;
        UnityAction<BaseEventData> call;

        entry = new EventTrigger.Entry();
        entry.eventID = EventTriggerType.PointerEnter;
        call = new UnityAction<BaseEventData>(pointerEnter);
        entry.callback = new EventTrigger.TriggerEvent();
        entry.callback.AddListener(call);
        et.delegates.Add(entry); // note: this list was renamed "triggers" in later Unity versions

        entry = new EventTrigger.Entry();
        entry.eventID = EventTriggerType.PointerExit;
        call = new UnityAction<BaseEventData>(pointerExit);
        entry.callback = new EventTrigger.TriggerEvent();
        entry.callback.AddListener(call);
        et.delegates.Add(entry);
    }

    private void pointerEnter(BaseEventData eventData)
    {
        print("pointer enter");
    }

    private void pointerExit(BaseEventData eventData)
    {
        print("pointer exit");
    }
}
Also... the other method I can find when poking around the forums and documentation is to add event handlers via interface implementations, such as:
using UnityEngine;
using UnityEngine.EventSystems;

public class TooltipTrigger : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    public string value;

    public void OnPointerEnter(PointerEventData data)
    {
        Debug.Log("Enter!");
    }

    public void OnPointerExit(PointerEventData data)
    {
        Debug.Log("Exit!");
    }
}
Neither of these methods seems to be working for me.
The second method (implementing the IPointerEnterHandler and IPointerExitHandler interfaces) is what you're looking for. But to get the OnPointerEnter and OnPointerExit methods triggered, your scene must contain a GameObject named "EventSystem" with an EventSystem component (this GameObject is created automatically when you add any UI element to the scene; if it's not there, create it yourself), along with components for the different input methods (such as StandaloneInputModule and TouchInputModule).
Also, the Canvas (your button's root object with the Canvas component) must have a GraphicRaycaster component to be able to detect UI elements by raycasting into them.
I just tested the code from your post and it works just fine.
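If pointer events still do nothing, a small helper like this can create the required EventSystem at runtime (the class and method names here are mine, not a Unity API):
using UnityEngine;
using UnityEngine.EventSystems;

public static class UiEventSetup
{
    public static void EnsureEventSystem()
    {
        // Nothing to do if the scene already has an EventSystem.
        if (Object.FindObjectOfType<EventSystem>() != null) return;

        var go = new GameObject("EventSystem");
        go.AddComponent<EventSystem>();
        go.AddComponent<StandaloneInputModule>(); // keyboard/mouse input
    }
}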