Passing iOS native objects back to Unity3D - iPhone

I implemented an iOS plugin with a couple of simple methods to let my Unity application interact with a native static library.
The problem I faced is passing native UI elements (objects) back to Unity.
For example, the native SDK has a method that creates a badge (a UIView); on the other hand, I have a button in Unity (it could be a GUI element or some 3D object, whatever).
I access this method from Unity through my plugin, something like:
[DllImport("__Internal")]
private static extern void XcodePlugin_GetBadgeView();
and the corresponding native function:
void XcodePlugin_GetBadgeView()
{
    // Generate the native UIView
    UIView *badge = [Badge badge];
    // ???? Return the UIView badge instance to Unity (hm)?!
}
So I need something like:
[someViewOrObject addSubview:badge];
but inside Unity.
I know there is the ability to send a message back to Unity:
UnitySendMessage("UnityCSharpClassName", "UnityMethod", "WhateverValueToSendToUnity");
but WhateverValueToSendToUnity must be a UTF-8 string.
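For reference, the receiving side is just a script on a GameObject whose name matches the first argument; a minimal sketch using the names from the call above:

using UnityEngine;

// Attached to a GameObject named "UnityCSharpClassName".
public class UnityCSharpClassName : MonoBehaviour
{
    // Invoked by UnitySendMessage; the payload always arrives as a string.
    void UnityMethod(string message)
    {
        Debug.Log("Received from iOS: " + message);
    }
}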
In conclusion:
The only idea I have is to pass coordinates from Unity to the native plugin, and then add the badge at those coordinates (not sure this is the best solution):
[DllImport("__Internal")]
private static extern void XcodePlugin_AddBadgeViewToCoordinates(float x, float y);
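A minimal sketch of the calling side, wrapping the declaration above in a hypothetical helper (note Unity's screen origin is bottom-left while UIKit's is top-left, so the y axis is flipped; the native side may still need to divide by the screen scale factor, since Unity reports pixels and UIKit uses points):

using UnityEngine;
using System.Runtime.InteropServices;

public class BadgeOverlay : MonoBehaviour
{
    [DllImport("__Internal")]
    private static extern void XcodePlugin_AddBadgeViewToCoordinates(float x, float y);

    // Place the native badge over a world-space object, e.g. a 3D button.
    public void ShowBadgeOver(Transform target)
    {
        Vector3 screenPos = Camera.main.WorldToScreenPoint(target.position);
        // Flip y: Unity measures from the bottom, UIKit from the top.
        XcodePlugin_AddBadgeViewToCoordinates(screenPos.x, Screen.height - screenPos.y);
    }
}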

If it's just for the badge coordinates, your idea seems fine.
If there are other things you'd like to get from iOS into Unity, I don't think there are many other options. The only other way I can think of is to save the data to a file (e.g., in NSTemporaryDirectory) and then pass the filename back as a UTF-8 string using UnitySendMessage. I've used this technique before to pass images and JSON files to Unity. Files in that directory get cleaned up automatically by iOS.
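A minimal sketch of the Unity side of that technique, assuming the native code sends an absolute file path (the GameObject and method names here are hypothetical):

using System.IO;
using UnityEngine;

// Attached to a GameObject named "NativeBridge"; the iOS side calls
// UnitySendMessage("NativeBridge", "OnFileReady", path).
public class NativeBridge : MonoBehaviour
{
    void OnFileReady(string path)
    {
        byte[] bytes = File.ReadAllBytes(path);
        // Example: treat the payload as PNG/JPG data and load it into a texture.
        var texture = new Texture2D(2, 2);
        texture.LoadImage(bytes);
        Debug.Log("Loaded " + bytes.Length + " bytes from " + path);
    }
}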

Related

How to use Vuforia in Unity with sound

I have a problem using Vuforia with Unity: I have 10 videos in the database, and when I start the app the sound from the videos starts playing immediately, even if there are no cards to read from.
Check that your AudioSource components are on the target object, so they get disabled along with it.
If they are and you still have the problem, add custom code so the audio gets muted in OnTrackingLost and unmuted in OnTrackingFound.
The class you are looking for is DefaultTrackableEventHandler; you can inherit from it and add that code using overrides, or add code directly to your target, e.g. via GetComponent. A sketch follows.
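A minimal sketch of that subclass, assuming a Vuforia version where DefaultTrackableEventHandler exposes OnTrackingFound/OnTrackingLost as protected virtual methods:

using UnityEngine;

// Replace DefaultTrackableEventHandler on the image target with this.
public class MutingTrackableEventHandler : DefaultTrackableEventHandler
{
    protected override void OnTrackingFound()
    {
        base.OnTrackingFound();
        SetMuted(false);
    }

    protected override void OnTrackingLost()
    {
        base.OnTrackingLost();
        SetMuted(true);
    }

    private void SetMuted(bool muted)
    {
        // Mute every AudioSource under this target, including the videos'.
        foreach (var source in GetComponentsInChildren<AudioSource>(true))
            source.mute = muted;
    }
}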

Accessing Web Audio Context in React VR

I'm seeking input from people who have worked with the Web Audio API in react-vr. React VR already has very cool components for placing sounds in your scene; however, I need to go one step lower and access the audio buffer, which is easily done through the AudioContext provided by Web Audio.
In my client.js init() I can find the audio context on the vr instance:
function init(bundle, parent, options) {
  const vr = new VRInstance(bundle, 'WelcomeToVR', parent, {
    ...options,
  });
  audioCtx = vr.rootView.context.AudioModule.audioContext._context; // HERE
  vr.render = function() {};
  vr.start();
  return vr;
}
I am struggling to figure out how to expose the audio context; its scope ends once I exit the init() function. Is there another way to access the audio context in index.vr.js?
I'm having the same issue... I can set the audio context:
audioOsc._setAudioContext(vr.rootView.context.AudioModule.audioContext._context);
And inside client.js it console.logs just fine. BUT...
inside my AudioOsc module, this:
AudioOsc.getAudioContext(this.getAc, this.getAc);
where this.getAc is a callback (as per the React VR docs), logs out an empty object.
HOWEVER...
inside my AudioOsc module I can create an oscillator and connect it to a destination, and it hums along just fine. So it seems to me there is actually no way to pass the context from a native module into React VR and back again... They eat the object somewhere along the way.
If things change I'd love to know! Otherwise, I think we may have to create a crap ton of audio modules on our own.

Can I run ARCore Preview 1 App on Preview 2 release?

I've built an app which runs on the ARCore Preview 1 package in Unity. I know Google has made major changes in Preview 2.
My question is: what changes will I have to make in order to run my ARCore Preview 1 app on Preview 2?
Take a look at the code in the Preview 2 sample app(s) and update your code accordingly. For example, here is the new code for properly instantiating an object into the AR scene:
if (Session.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
{
    var andyObject = Instantiate(AndyAndroidPrefab, hit.Pose.position,
        hit.Pose.rotation);

    // Create an anchor to allow ARCore to track the hitpoint
    // as understanding of the physical world evolves.
    var anchor = hit.Trackable.CreateAnchor(hit.Pose);

    // Andy should look at the camera but still be flush with the plane.
    andyObject.transform.LookAt(FirstPersonCamera.transform);
    andyObject.transform.rotation = Quaternion.Euler(0.0f,
        andyObject.transform.rotation.eulerAngles.y,
        andyObject.transform.rotation.z);

    // Make Andy model a child of the anchor.
    andyObject.transform.parent = anchor.transform;
}
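For context, a hedged sketch of where that snippet typically lives: a touch handler in a MonoBehaviour's Update(), reusing the names from the sample above (AndyAndroidPrefab, FirstPersonCamera, raycastFilter):

void Update()
{
    // Only react to the first frame of a new touch.
    if (Input.touchCount < 1)
        return;
    Touch touch = Input.GetTouch(0);
    if (touch.phase != TouchPhase.Began)
        return;

    TrackableHit hit;
    if (Session.Raycast(touch.position.x, touch.position.y, raycastFilter, out hit))
    {
        // ...instantiate and anchor as in the snippet above...
    }
}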
Common changes:
Preview 1 used the Tango Core service; in Preview 2 it has been replaced by the ARCore service.
Automatic screen rotation is now handled.
Some classes were altered, for reasons such as the following.
For users:
AR Stickers were introduced.
For developers:
A new C API for use with the Android NDK that complements the existing Java, Unity, and Unreal SDKs;
Functionality that lets AR apps pause and resume AR sessions, for example to let a user return to an AR app after taking a phone call;
Improved accuracy and runtime efficiency across the anchor, plane-finding, and point-cloud APIs.
I have updated my app from Preview 1 to Preview 2, and it's not a lot of work. There were minor API changes, like the ones for hit flags, Pose.position, etc. It would probably be pointless to post the whole change log here; I suggest you follow these steps:
Replace the old SDK with the new one in the Unity project.
Then check for errors in your default editor (VS, VS Code, or MonoDevelop).
Just check the relevant APIs in the ARCore developer docs.
It's not such a cumbersome job; it took me some 5-10 minutes to upgrade, that's it.
Cheers!

Cannot Change to a specific scene in Unity after building it to mobile device

Currently I'm using Application.LoadLevel() to change scenes, and it works perfectly in the Unity editor. When I built it for a mobile platform and tested it, there is one scene from which I can't change to a specific scene.
For example: gameplayScene, levelSelectionScene, mainMenu. I'm able to change from mainMenu to levelSelectionScene and then to gameplayScene. For some unknown reason, I'm unable to go back to levelSelectionScene from gameplayScene, while I can change to mainMenu from gameplayScene.
Below is the sample code from the button that goes to levelSelectionScene from gameplayScene:
private void OnClick ()
{
    Debug.Log ("clicked");
    if (PlayerPrefs.GetInt("SanguineDifficultyAchieved") == 1)
    {
        Debug.Log("Entering Difficulty");
        m_Owner.SetActive ();
    }
    else
    {
        Debug.Log("Exiting");
        State.Current = State.NotInGame;
        Application.LoadLevel(scene.name);
    }
    m_Owner.close ();
}
I don't understand why it works in the Unity editor but doesn't work on mobile platforms.
Update 1
I tried to use build-index numbers instead of strings and it worked well. But I still don't understand the reason why.
Finally got an answer. It seems the scenes collided with each other because I use scenes from an Asset Bundle as well as the scenes added in Build Settings. That is why Application.LoadLevel(int index) works: it only accesses the scenes from Build Settings.
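A minimal sketch of loading by build index, which sidesteps the name collision (the index value here is hypothetical; indices follow the scene order in File > Build Settings):

using UnityEngine;

public class SceneSwitcher : MonoBehaviour
{
    // Hypothetical index; match the order in Build Settings.
    private const int LevelSelectionIndex = 1;

    public void GoToLevelSelection()
    {
        // Loads only from Build Settings, so an Asset Bundle scene
        // with the same name cannot shadow it.
        Application.LoadLevel(LevelSelectionIndex);
    }
}

On newer Unity versions, SceneManager.LoadScene(index) from UnityEngine.SceneManagement is the equivalent call.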

UIWebView intercept "setting movie path:" from javascript audio player

In iOS 4, I've got a page loaded in a UIWebView with a JavaScript audio player. I did not create this player; it is owned by a third party, so I can't tinker with it. When I click the play button, I see an NSLog printout like the following:
setting movie path: http://data.myaudio.com/thefile.mp3
My question is: what is getting its movie path set, and how do I intercept it? The audio will continue to play until I create another UIWebView, use the built-in audio controls accessible via an iPhone home-button double tap, or close the app. I can't intercept the path with shouldStartLoadWithRequest:; the JavaScript function audio.play() appears to call some built-in player directly. I'd like to control where and how the audio is played, but short of parsing the HTML for any <audio> tags, I can't figure out how to grab that path and point it somewhere other than the default.
UIWebView is essentially a wrapper around WebKit. Apple does not want you to touch anything more of it than the existing delegate methods provide.
That being said, you can modify the DOM of any loaded document by injecting JavaScript. That way you could also override audio.play to do nothing and instead hand the URL to your own player, which you can control.