I'm trying to import an existing .anim file onto an existing Unity model.
I know how to do it in Unity and build to an HTML file, but I'm wondering whether there is a way to do it using only UnityScript or JavaScript.
I mean using JavaScript to load an .anim file onto the model in the Unity Web Player. If there is any solution, thanks!
Resources.Load allows you to do just that:
var mdl : GameObject = Resources.Load("Animations/" + animationFolder + "/" + aName);
if (!mdl) {
    Debug.LogError("Missing animation asset: Animations/" + animationFolder + "/" + aName + " could not be found.");
} else {
    var aClip = mdl.animation.clip;
    charAnimation.AddClip(aClip, aName);
    Debug.Log(charAnimation[aName].name + " loaded from resource file " + animationFolder + "/" + aName + ". Length check: " + charAnimation[aName].length);
}
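If you end up doing this from a C# script instead, a minimal sketch of the same idea could look like the following. It assumes the animation prefab lives under a Resources folder and that both the prefab and the character use the legacy Animation component; AnimationLoader and the field names are just placeholders.
using UnityEngine;

public class AnimationLoader : MonoBehaviour
{
    public Animation charAnimation;          // legacy Animation component on the character
    public string animationFolder = "Walk";  // placeholder folder name
    public string aName = "walk01";          // placeholder clip name

    void Start()
    {
        // Load the prefab that carries the clip from a Resources folder
        GameObject mdl = Resources.Load<GameObject>("Animations/" + animationFolder + "/" + aName);
        if (mdl == null)
        {
            Debug.LogError("Missing animation asset: Animations/" + animationFolder + "/" + aName);
            return;
        }

        // Grab the prefab's default clip and add it to the character's Animation component
        AnimationClip aClip = mdl.GetComponent<Animation>().clip;
        charAnimation.AddClip(aClip, aName);
        charAnimation.Play(aName);
    }
}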
I am trying to save a cubemap object in Unity and load it back, to implement a SAVE/LOAD function in my application. I tried this code to save my cubemap and it works fine:
foreach (var gameObj in FindObjectsOfType(typeof(GameObject)) as GameObject[])
{
    if (gameObj.name.StartsWith("Chambre"))
    {
        PlayerPrefs.SetString("Chambre " + (l + 1), gameObj.name);
        var chambre = new Material(Shader.Find("Skybox/Cubemap"));
        chambre = gameObj.GetComponent<MeshRenderer>().material;
        var Texture1 = chambre.GetTexture("_Tex");
        if (!AssetDatabase.Contains(Texture1))
        {
            AssetDatabase.CreateAsset(Texture1, "Assets/Resources/Saved/chambre" + (l + 1) + ".mat");
            AssetDatabase.SaveAssets();
            AssetDatabase.Refresh();
        }
        l++;
    }
}
But the problem is that I use the UnityEditor import, which is not accepted when you try to build the game. So is there another solution to save my cubemap and reload it without using "AssetDatabase.CreateAsset", since that comes from UnityEditor?
I don't completely understand what you want to achieve.
In a build there are no "Assets" anymore, so those methods make no sense in a build.
To use them in the Editor anyway and still be able to build your project, simply wrap the code blocks that should only exist in the Editor in the preprocessor directives #if UNITY_EDITOR ... #endif:
#if UNITY_EDITOR
using UnityEditor;
#endif
...
#if UNITY_EDITOR
foreach (var gameObj in FindObjectsOfType(typeof(GameObject)) as GameObject[])
{
    if (gameObj.name.StartsWith("Chambre"))
    {
        PlayerPrefs.SetString("Chambre " + (l + 1), gameObj.name);
        var chambre = new Material(Shader.Find("Skybox/Cubemap"));
        chambre = gameObj.GetComponent<MeshRenderer>().material;
        var Texture1 = chambre.GetTexture("_Tex");
        if (!AssetDatabase.Contains(Texture1))
        {
            AssetDatabase.CreateAsset(Texture1, "Assets/Resources/Saved/chambre" + (l + 1) + ".mat");
            AssetDatabase.SaveAssets();
            AssetDatabase.Refresh();
        }
        l++;
    }
}
#endif
Those code blocks will then not be included or executed in your build.
For a demo application I need a reliable AR SDK that allows the creation of Image Targets at runtime.
The SDK has to run on a mobile device, and the targets should not be created by some cloud server or during development. In this scenario, users would photograph their own markers (e.g. magazine covers), which get cropped and warped to be used as markers (3D objects have to be assigned to these markers at random). Neither Vuforia nor ARToolKit allows this scenario. Some other SDKs that might
support it: Kudan, EasyAR or MAXST.
If this is not possible at all, is there an AR SDK that allows using the exact same Image Target (or marker of any kind) multiple times for rendering the same 3D object? Again, Vuforia and ARToolKit do not support this.
Kudan does not seem to support this feature in Unity;
I think it's supported in the native SDKs:
Unlike the native SDKs, the Unity Plugin can't just get a raw image file from the assets and load it into the tracker. This is a feature we will be adding to the plugin in the future.
Source: https://kudan.readme.io/docs/markers-1
EasyAR, on the other hand, supports creating an ImageTarget from a .png or .jpg (one image at a time), or from a .json file to add multiple images in one batch,
and this is provided in the example projects in EasyAR_SDK_2.2.0_Basic_Samples_Unity here.
Note: to run the example project you need to
1 - sign up on their site https://www.easyar.com/
2 - create an SDK license key from here.
3 - follow this Guide to place the key and run it in Unity.
4 - Your goal is achieved in the "HelloARTarget" project.
And here is the example project script loading an AR experience from .jpg images:
using UnityEngine;
using System.Linq;
using EasyAR;

namespace Sample
{
    public class HelloARTarget : MonoBehaviour
    {
        private const string title = "Please enter KEY first!";
        private const string boxtitle = "===PLEASE ENTER YOUR KEY HERE===";
        private const string keyMessage = ""
            + "Steps to create the key for this sample:\n"
            + " 1. login www.easyar.com\n"
            + " 2. create app with\n"
            + " Name: HelloARTarget (Unity)\n"
            + " Bundle ID: cn.easyar.samples.unity.helloartarget\n"
            + " 3. find the created item in the list and show key\n"
            + " 4. replace all text in TextArea with your key";

        private void Awake()
        {
            if (FindObjectOfType<EasyARBehaviour>().Key.Contains(boxtitle))
            {
#if UNITY_EDITOR
                UnityEditor.EditorUtility.DisplayDialog(title, keyMessage, "OK");
#endif
                Debug.LogError(title + " " + keyMessage);
            }
        }

        void CreateTarget(string targetName, out SampleImageTargetBehaviour targetBehaviour)
        {
            GameObject Target = new GameObject(targetName);
            Target.transform.localPosition = Vector3.zero;
            targetBehaviour = Target.AddComponent<SampleImageTargetBehaviour>();
        }

        void Start()
        {
            SampleImageTargetBehaviour targetBehaviour;
            ImageTrackerBehaviour tracker = FindObjectOfType<ImageTrackerBehaviour>();

            // dynamically load from image (*.jpg, *.png)
            CreateTarget("argame01", out targetBehaviour);
            targetBehaviour.Bind(tracker);
            targetBehaviour.SetupWithImage("sightplus/argame01.jpg", StorageType.Assets, "argame01", new Vector2());
            GameObject duck02_1 = Instantiate(Resources.Load("duck02")) as GameObject;
            duck02_1.transform.parent = targetBehaviour.gameObject.transform;

            // dynamically load from json file
            CreateTarget("argame00", out targetBehaviour);
            targetBehaviour.Bind(tracker);
            targetBehaviour.SetupWithJsonFile("targets.json", StorageType.Assets, "argame");
            GameObject duck02_2 = Instantiate(Resources.Load("duck02")) as GameObject;
            duck02_2.transform.parent = targetBehaviour.gameObject.transform;

            // dynamically load from json string
            string jsonString = @"
            {
                ""images"" :
                [
                    {
                        ""image"" : ""sightplus/argame02.jpg"",
                        ""name"" : ""argame02""
                    }
                ]
            }
            ";
            CreateTarget("argame02", out targetBehaviour);
            targetBehaviour.Bind(tracker);
            targetBehaviour.SetupWithJsonString(jsonString, StorageType.Assets, "argame02");
            GameObject duck02_3 = Instantiate(Resources.Load("duck02")) as GameObject;
            duck02_3.transform.parent = targetBehaviour.gameObject.transform;

            // dynamically load all targets from json file
            var targetList = ImageTargetBaseBehaviour.LoadListFromJsonFile("targets2.json", StorageType.Assets);
            foreach (var target in targetList.Where(t => t.IsValid).OfType<ImageTarget>())
            {
                CreateTarget("argame03", out targetBehaviour);
                targetBehaviour.Bind(tracker);
                targetBehaviour.SetupWithTarget(target);
                GameObject duck03 = Instantiate(Resources.Load("duck03")) as GameObject;
                duck03.transform.parent = targetBehaviour.gameObject.transform;
            }

            targetBehaviour = null;
        }
    }
}
Edit
Although it's easy to make an ImageTarget from a .png, I wonder how to check that the image contains sufficient features to be detectable in EasyAR.
Google ARCore supports this feature, but it has a limited number of supported devices:
https://developers.google.com/ar/develop/java/augmented-images/guide
Edit 2
It looks like Vuforia supports creating the image target at runtime, and you can also drag and drop the image as a texture in the editor without having to generate a dataset from the portal, although you still need the API key from the portal.
You can definitely do that with Vuforia and UserDefinedTargetBuildingBehaviour
https://library.vuforia.com/articles/Training/User-Defined-Targets-Guide
Kudan and EasyAR seem to offer this option. I will try to integrate them with Google Cardboard.
I have seen an OpenSpace3D video doing that. I believe they integrated ARToolKit5 into OpenSpace3D and made it work somehow. OpenSpace3D seems to be open source, so you might be able to look into their solution.
This is the link to the video:
https://www.youtube.com/watch?v=vSF1ZH1CwQI
Look at around minute 8:50 to 9:50.
I am trying to create an asset bundle with scenes. This is what I did in Unity 4:
[MenuItem("Bundle/Create ios Scene SceneLoader")]
static void iosBuild()
{
    string[] levels = new string[] { "Assets/Scenes/01 SceneLoader.unity", "Assets/Scenes/02 Level1.unity", "Assets/Scenes/02 Level2.unity", "Assets/Scenes/02 Level3.unity" };
    BuildPipeline.BuildStreamedSceneAssetBundle(levels, "Assets/Bundles/bundle-ios.unity3d", BuildTarget.iOS);
}
After that I load my bundle via this code:
using (WWW www = WWW.LoadFromCacheOrDownload(url, 0))
{
    while (!www.isDone)
    {
        status.text = "loading \n" + (www.progress * 100).ToString() + "%";
        yield return null;
    }
    yield return www;

    // check if the server response is an error
    if (www.error != null)
    {
        throw new Exception("WWW download had an error: " + url + " " + www.error);
    }

    // Load the asset bundle
    AssetBundle bundle = www.assetBundle;
    // obsolete: bundle.LoadAll();
    bundle.LoadAllAssets();
    Application.LoadLevel("01 SceneLoader");
}
This code worked in Unity 4, but now, when I load my scene, all the script references are missing. The objects are in the scene but have no scripts attached. Also, Unity tells me that BuildStreamedSceneAssetBundle is obsolete. So my question is: why aren't my script references kept in the scene, so that the scripts aren't missing when I load the asset bundle? My NGUI atlas that I use in the loaded scene is missing as well.
I would be glad if someone has an idea!
EDIT: the scene passed as the first string in "levels" keeps all its script references. How is that possible?
I think this problem is a Unity bug.
My project has the same problem.
I found a solution, but it's very inconvenient.
If you must use scene asset bundles,
make one asset bundle per scene.
string[] level1 = new string []{"Assets/Scenes/01 SceneLoader.unity"};
string[] level2 = new string []{"Assets/Scenes/02 Level1.unity"};
.....
BuildPipeline.BuildStreamedSceneAssetBundle( level1, "Assets/Bundles/bundle-ios1.unity3d", BuildTarget.iOS);
BuildPipeline.BuildStreamedSceneAssetBundle( level2, "Assets/Bundles/bundle-ios2.unity3d", BuildTarget.iOS);
.....
I used the "BuildPipeline.BuildAssetBundles" function in Unity 5.
But I think that "BuildPipeline.BuildStreamedSceneAssetBundle" and "BuildPipeline.BuildAssetBundles" are similar.
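For reference, a rough, untested sketch of how the per-scene bundles could be built with the Unity 5 BuildPipeline.BuildAssetBundles API; the bundle names and scene paths are taken from the snippets above, and the output folder has to exist already:
// Editor script: build each scene into its own streamed scene bundle (Unity 5 API).
using UnityEditor;

public class SceneBundleBuilder
{
    [MenuItem("Bundle/Create iOS Scene Bundles")]
    static void BuildSceneBundles()
    {
        AssetBundleBuild[] builds = new AssetBundleBuild[]
        {
            new AssetBundleBuild
            {
                assetBundleName = "sceneloader.unity3d",
                assetNames = new[] { "Assets/Scenes/01 SceneLoader.unity" }
            },
            new AssetBundleBuild
            {
                assetBundleName = "level1.unity3d",
                assetNames = new[] { "Assets/Scenes/02 Level1.unity" }
            }
        };

        // A scene bundle may only contain scenes, so each build entry above lists scenes only.
        BuildPipeline.BuildAssetBundles("Assets/Bundles", builds,
            BuildAssetBundleOptions.None, BuildTarget.iOS);
    }
}
On the loading side, www.assetBundle together with bundle.GetAllScenePaths() and SceneManager.LoadScene (Unity 5.3+) would replace the obsolete Application.LoadLevel call from the question.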
I have a level in which I have placed many instances of a prefab GameObject (targets). I need to use the coordinates of those targets in a script. Is there a way to obtain the xyz vector coordinates of all those objects and export them to a text file? Right now I need to manually copy-paste each individual target from the Unity inspector into MonoDevelop, which is a PITA...
To get the coordinates of an object, use item.transform.position, where item is a reference to the object you want the coordinates for. The result is a Vector3 from which you can read .x, .y or .z to get the individual coordinates.
Writing to a text file is well-documented.
Alternatively, you may want to look into Serialization
EDIT: To do this for all objects in the scene:
string filePath = "D:/mycoords.txt";
string coordinates = "";
var objects = GameObject.FindObjectsOfType(typeof(Transform)) as Transform[];
foreach (Transform trans in objects)
{
    coordinates += "x: " + trans.position.x.ToString() + ", y: " + trans.position.y.ToString() + ", z: " + trans.position.z.ToString() + System.Environment.NewLine;
}
// Write the coords to a file
System.IO.File.WriteAllText(filePath, coordinates);
NOTE: FindObjectsOfType will not return any assets (meshes, textures, prefabs, ...) or inactive objects.
EDIT3: If you want to get only your Targets, add a script called "SaveMeToFile" to your Target prefab and use this code:
string filePath = "D:/mycoords.txt";
string coordinates = "";
var objects = GameObject.FindObjectsOfType(typeof(SaveMeToFile)) as SaveMeToFile[];
foreach (SaveMeToFile smtf in objects)
{
    Transform trans = smtf.transform;
    coordinates += "x: " + trans.position.x.ToString() + ", y: " + trans.position.y.ToString() + ", z: " + trans.position.z.ToString() + System.Environment.NewLine;
}
// Write the coords to a file
System.IO.File.WriteAllText(filePath, coordinates);
EDIT 4: Or, if you have any component specific to your target, you can use that instead of SaveMeToFile in the code above, saving you from having to create a new, worthless MonoBehaviour.
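If you prefer the serialization route mentioned earlier over building the text by hand, here is a minimal, untested sketch using Unity's JsonUtility (available since Unity 5.3); SaveMeToFile and the output path are just the placeholders used in the snippets above:
using System.Collections.Generic;
using System.IO;
using UnityEngine;

[System.Serializable]
public class TargetPositions
{
    public List<Vector3> positions = new List<Vector3>();
}

public class TargetExporter : MonoBehaviour
{
    void Start()
    {
        var data = new TargetPositions();

        // Collect the position of every object carrying the marker component
        foreach (SaveMeToFile smtf in FindObjectsOfType<SaveMeToFile>())
        {
            data.positions.Add(smtf.transform.position);
        }

        // JsonUtility serializes the wrapper class, including the Vector3 list
        string json = JsonUtility.ToJson(data, true);
        File.WriteAllText("D:/mycoords.json", json);
        Debug.Log("Exported " + data.positions.Count + " target positions");
    }
}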
I am creating a small fun application to play some mp3 files in a PhoneGap app.
I have managed to play the audio in the background, many thanks to the help on StackOverflow.
But now the problem is that when the app runs in the background, or the device is locked, the next song does not play. I have written the code to play the next song:
=======
$(".greyselected").each(function(i) {
    if ($(this).is(":visible")) {
        activetrackclass = $(this).attr("class").split(" ")[0];
        return;
    }
});

$("." + activetrackclass).each(function(i) {
    if ($(this).hasClass("greyselected")) {
        fetchmp3($("." + activetrackclass + ":eq(" + parseInt(i + 1) + ")").find(".partistname").text(),
                 $("." + activetrackclass + ":eq(" + parseInt(i + 1) + ")").find(".ptrackname").text(),
                 $("." + activetrackclass + ":eq(" + parseInt(i + 1) + ")").find(".ptrackurl").text(),
                 $("." + activetrackclass + ":eq(" + parseInt(i + 1) + ")").find(".spanduration").text(),
                 "tracklist");
        $("." + activetrackclass).removeClass("greyselected");
        $("." + activetrackclass + ":eq(" + parseInt(i + 1) + ")").addClass("greyselected");
        $("#divplayer").slideDown("slow");
        return false;
    }
});
=======
greyselected is the song currently playing
activetrackclass is a variable that holds the class of the track item or track div
fetchmp3 is the function to call the next mp3 file in the list
All mp3s are hosted on a server.
When the app is active it plays just fine, but as soon as it goes to the background it plays only the song that was actively playing and then stops until the app is made visible again.
Would appreciate your valuable help as always
A background task only runs for a period of time, then the OS kills it. If your application is not active it won't run forever; iOS will stop it after a while.
http://www.macworld.com/article/1164616/how_ios_multitasking_really_works.html
You can declare support for background playback by specifying the audio mode in UIBackgroundModes in the Info.plist file. Check page 60 of the iPhone programming guide for more detail.
You have a lot of options to configure the behavior of the audio if your application can use native code. Check the "Categories Express Audio Roles" section in the docs below:
http://developer.apple.com/library/ios/#documentation/Audio/Conceptual/AudioSessionProgrammingGuide/Introduction/Introduction.html#//apple_ref/doc/uid/TP40007875
An old thread from the Apple Developer Forums:
https://devforums.apple.com/message/264397#264397