I'm trying to use TrueDepth data from the iPhone X (XS, XR, and the new iPad Pro) in a Unity app.
The current ARKit plugin for Unity doesn't seem to expose the TrueDepth data, so I think I need to create a native plugin in Objective-C.
How can I create a plugin that lets Unity access the TrueDepth camera?
Just obtaining RGB and depth (D) frames from iOS is enough.
Basically, I would like to build something similar to Apple's sample app linked below, but in Unity:
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/streaming_depth_data_from_the_truedepth_camera
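For reference, here is roughly how I imagine the Unity (C#) side of such a bridge would look. This is only a sketch: the exported function names and the depth resolution are placeholders I made up, and the actual capture code (an AVCaptureSession with an AVCaptureDepthDataOutput, as in Apple's sample) would have to live in the Objective-C plugin.

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Unity-side bridge to a (hypothetical) native iOS plugin that streams
// TrueDepth frames from AVCaptureDepthDataOutput on the Objective-C side.
public class TrueDepthBridge : MonoBehaviour
{
#if UNITY_IOS && !UNITY_EDITOR
    // These C functions would be exported by the native plugin
    // (e.g. Plugins/iOS/TrueDepthPlugin.mm); the names are placeholders.
    [DllImport("__Internal")] private static extern void TrueDepth_Start();
    [DllImport("__Internal")] private static extern void TrueDepth_Stop();
    // Copies the latest depth frame (width * height floats, in metres) into the
    // supplied buffer; returns 1 if a new frame was copied, 0 otherwise.
    [DllImport("__Internal")] private static extern int TrueDepth_CopyLatestDepth(
        float[] buffer, int width, int height);
#endif

    // Depth resolution is an assumption; the real value comes from the native side.
    private const int Width = 640;
    private const int Height = 480;
    private float[] _depth;

    private void Start()
    {
        _depth = new float[Width * Height];
#if UNITY_IOS && !UNITY_EDITOR
        TrueDepth_Start();
#endif
    }

    private void Update()
    {
#if UNITY_IOS && !UNITY_EDITOR
        if (TrueDepth_CopyLatestDepth(_depth, Width, Height) == 1)
        {
            // _depth now holds the latest depth map; upload it to an RFloat
            // Texture2D or run your own processing on it here.
        }
#endif
    }

    private void OnDestroy()
    {
#if UNITY_IOS && !UNITY_EDITOR
        TrueDepth_Stop();
#endif
    }
}
```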
Related
I need help switching the mobile device camera in the Vuforia Engine Unity SDK v10.13, because Vuforia has changed the entire SDK. I did this in the past, but now I can't find any documentation. Please help.
I couldn't find any documentation regarding camera switching in Vuforia Engine Unity SDK v10.13.
Unfortunately, switching between the back and front camera has been deprecated, but if necessary it can still be achieved via the Driver Framework: https://library.vuforia.com/platform-support/driver-framework.
If you need to switch between the several back cameras that phones have, I don't believe that's necessary: Vuforia usually selects the main camera, and selecting another camera such as the telephoto or ultrawide will result in tracking issues, as the SDK was not designed to work with such images out of the box.
The question I would have is: why do you need to switch cameras? What do you want to achieve?
I have a Xiaomi Redmi Note 8 Pro and it doesn't seem to support ARCore. I found some ways to work around this, but they appear to be quite complicated and not very safe.
My question is:
What other tools would you recommend if I want to create an app in Unity that also needs GPS, maybe an altimeter, and of course the camera (AR stuff)?
I heard that Vuforia might do the trick, and I also read something about AR Foundation from Unity. But it looked to me as if, depending on the chosen deployment platform, they use ARCore or ARKit underneath (even Vuforia).
Could anyone clarify this?
I suggest you don't try messing with your device; it doesn't support ARCore for a good reason. You could try the Android Studio emulator, but alas, when I tried it, the APK generated from Unity couldn't be installed on the emulator, apparently because of the architecture.
If you want to use Unity anyway, I suggest you use Vuforia. It works on most modern devices and doesn't even need a device for testing: just hit Unity Play mode and you can test from your PC (you need a webcam).
Vuforia Engine provides a simulator mode in the Game view that you can activate by pressing the Play button. You can use this feature to evaluate and rapidly prototype your scene(s) without having to deploy to a device. (Source: https://library.vuforia.com/articles/Training/getting-started-with-vuforia-in-unity.html)
For Unity with AR Foundation, you can't test on your PC the way you can with Vuforia; you need an ARCore/ARKit-supported device (see the sketch below for a runtime support check).
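If you do go the AR Foundation route, you can at least detect at runtime whether the device actually supports ARCore/ARKit and fall back gracefully if it doesn't. A minimal sketch using AR Foundation's ARSession.CheckAvailability() (the exact states may differ slightly between AR Foundation versions):

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Checks at runtime whether the device supports ARCore/ARKit before enabling AR content.
public class ARSupportCheck : MonoBehaviour
{
    private IEnumerator Start()
    {
        // Ask the platform whether AR is available on this device.
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.Log("AR is not supported on this device; fall back to a non-AR mode.");
        }
        else
        {
            Debug.Log("AR is supported; the ARSession can be enabled now.");
        }
    }
}
```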
Lastly, if you want AR with GPS (although this is not with Unity), check out AR.js: https://github.com/AR-js-org/AR.js
I have an Android application which I created using Unity and C#; it also uses Vuforia. It has an AR Camera, which shows a black screen right after I install the application and grant it camera permission, but when I close the app and start it again, everything works fine. I searched a lot and tried what I found: switching the graphics API from automatic to OpenGL ES 2, setting minimization to none, etc. Upgrading the Vuforia version will not work for me. Also, the AR Camera works on phones that support ARCore and does not work on phones without ARCore, so I guess the issue is related to the phone's architecture. Any ideas on how to make it work?
It must be related to the camera API. I don't know how Vuforia tries to initialize the camera, but it seems to initialize it before a successful permission result on older Android versions. You can delay Vuforia initialization until after the permission check. There is an option to enable Vuforia's delayed initialization, but it would be better to make a preload scene where you check all permissions (and maybe do other setup) and then load the scene where Vuforia is used.
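A rough sketch of such a preload scene is below. It assumes Vuforia Engine 10.x with "Delayed Initialization" enabled in the Vuforia configuration (so VuforiaApplication.Instance.Initialize() starts the engine); older versions used a different call (something like VuforiaRuntime.Instance.InitVuforia(), if I remember correctly), and the scene name is just a placeholder.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Preload scene: request the camera permission first, then initialize Vuforia
// and load the AR scene. Assumes "Delayed Initialization" is enabled in the
// Vuforia configuration and a Vuforia Engine 10.x style API.
public class VuforiaPreloader : MonoBehaviour
{
    [SerializeField] private string arSceneName = "ARScene"; // placeholder scene name

    private IEnumerator Start()
    {
#if UNITY_ANDROID
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
        {
            Permission.RequestUserPermission(Permission.Camera);
            // Wait until the permission dialog has been answered
            // (a real app should also handle a denied permission).
            while (!Permission.HasUserAuthorizedPermission(Permission.Camera))
                yield return null;
        }
#endif
        // Only start Vuforia once the camera permission has been granted.
        Vuforia.VuforiaApplication.Instance.Initialize();
        yield return null;

        SceneManager.LoadScene(arSceneName);
    }
}
```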
Update: Please try updating the XR plugins for Unity (they are packages now) and update Unity itself to the latest version. Do not use a Unity beta. Check whether Vuforia has recommendations on which Unity version is the latest supported. You could also try creating a new Unity project from scratch, importing the latest Vuforia, and testing the camera on Android; if all is good, import all the resources for your app.
I have to make an app that uses virtual reality, so should I drop the idea of using Flutter?
As far as I have seen, Flutter does support AR. I have been following a Flutter developer on Twitter who posts some cool AR stuff built with Flutter; here's an ARCore plugin he has built for Flutter.
Here are some sample AR videos from the developer himself:
https://twitter.com/i/status/1123893412279791616
https://twitter.com/i/status/1129117305303175168
Can I build 3D (OpenGL) apps with Flutter?
Today we don't support 3D via OpenGL ES or similar. We have long-term plans to expose an optimized 3D API, but right now we're focused on 2D.
https://flutter.dev/docs/resources/faq#can-i-build-3d-opengl-apps-with-flutter
There aren't any OpenGL bindings supported by Flutter; Flutter is a 2D-only framework.
https://flutter.io/faq/#can-i-build-3d-opengl-apps-with-flutter
https://github.com/flutter/flutter/issues/14591
https://github.com/flutter/flutter/issues/7053
https://github.com/flutter/flutter/issues/179
I am not sure how VR would work at all on Flutter.
You can use Google's ARCore with Flutter: check out the arcore_flutter_plugin to work with AR in Flutter.
As of now, there aren't any packages that specifically target VR, but for AR on iOS you can use ARKit via the arkit_flutter_plugin.
NOTE: ARCore only works on Android, and ARKit only works on iOS.
I recently created a Flutter plugin for AR that supports both Android and iOS by wrapping around ARCore and ARKit: https://pub.dev/packages/ar_flutter_plugin
The plugin is a work in progress, but it already supports collaborative AR and sharing content through Google's Cloud Anchor service, along with a lot of other useful features.
I'm creating my first VR app in Unity for Google Cardboard. I noticed the VR Supported setting in the Inspector. I searched a little, and it isn't clear to me whether I should enable it for Google Cardboard. Here is the link to the Unity manual about it. Thanks for your help!
For now, you should not enable VR support, since Unity has no native support for Cardboard yet, although it was announced at Vision Summit 2016 for future versions.
For the time being, you should only use the Unity package with the appropriate prefabs for Google Cardboard, available at https://developers.google.com/vr/unity/download.
You don't need to check Virtual Reality Supported to make an app for Google Cardboard. Make sure you download the Google VR SDK for Unity and bring it into the Assets folder of your project. Then just search for GVR Main in your Assets folder, drag it into your scene, and you are good to go. You may also not need the default camera in the scene. Just create your scene and your build will automatically support Google Cardboard. VR Supported is largely for PC-tethered headsets like the Oculus Rift, not apps that you run on your mobile phone.