I have developed an AR mobile application with object detection using TensorFlow. The app runs perfectly on the iPhone 12 Mini and other iPhones, but when I test it on the iPhone 12 Pro and iPad Pro, the app does not show the 3D model when the camera is far from the detected object. Whenever the app detects a trained object, it is supposed to show the 3D model and place it near that object, but on the iPhone 12 Pro models it only shows the 3D model when the camera is close to the detected object.
I think maybe LiDAR is creating the problem? If so, how do I disable the LiDAR in C# code? I developed the project in Unity using ARFoundation and TensorFlow, and I am using ARFoundation 1.0.
ARFoundation 1.0 was released in 2018, so it doesn't support meshing (the generation of triangle meshes that correspond to the physical space). This can lead to timing problems: a device equipped with a LiDAR scanner must first work out that scene reconstruction isn't supported in the current configuration, and then fall back to the common plane-detection approach instead.
The solution is simple: upgrade to the latest version of ARFoundation, 4.1.5.
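If you do upgrade and want to rule out LiDAR meshing as the culprit, a minimal sketch for ARFoundation 4.x is to disable any `ARMeshManager` in the scene so every device uses the classic plane-detection path (`ARMeshManager` is a real ARFoundation type; attaching a script like this is an assumption about your scene setup):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: if the scene contains an ARMeshManager (scene reconstruction on
// LiDAR devices), disabling it forces the app back to plane detection only.
public class DisableMeshing : MonoBehaviour
{
    void Start()
    {
        var meshManager = FindObjectOfType<ARMeshManager>();
        if (meshManager != null)
            meshManager.enabled = false; // no mesh generation, planes only
    }
}
```

Alternatively, simply remove the `ARMeshManager` component from the scene; without it, ARFoundation never requests scene reconstruction in the first place.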
Related
I need help with a project based on AR + GPS.
I have to place a model at a GPS location at runtime, and the player has to find that model.
I have completed most of the project using the AR + GPS Location asset to place the model at a given lat/long. I also use an asset named TriLib 2 - Model Loading Package to load the model from a URL.
I am facing an issue on some Android devices.
The model does not appear beyond a certain distance (about 20 meters away), but if I am within 20 meters it appears on screen.
I have set the AR camera's far clipping plane to 5000.
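For reference, a minimal sketch of forcing the far clipping plane at runtime, in case a device-specific AR provider resets it (the component name is illustrative; attach it to the AR Camera):

```csharp
using UnityEngine;

// Sketch: keep the AR camera's far clipping plane at a large value,
// in case the AR session overrides the value set in the Inspector.
public class FarClipFix : MonoBehaviour
{
    public float farClip = 5000f;

    void LateUpdate()
    {
        // Re-apply every frame so a provider reset doesn't stick.
        GetComponent<Camera>().farClipPlane = farClip;
    }
}
```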
On iOS devices it works perfectly fine. On some Android devices it also works fine.
What could the issue be?
https://www.youtube.com/watch?v=IDhH3SrzVFg
Please see the video (at 3:08) for reference.
On a Realme XT and an iPhone X it works.
On a Samsung Galaxy S10 and a Xiaomi Redmi Note 9 Pro, the model disappears at a certain distance.
I have been experimenting with Unity and Vuforia. I'm all set up and got it to work with an image that spawns 3D objects. So far I have done everything with the FaceTime camera on my MacBook Pro 2018. Now I want to switch over to my digital camera, a Sony AX6300, but when I connect the camera, Unity won't recognize it; I can still only choose the built-in FaceTime camera on my Mac. Can anyone here help?
You have to edit the XML file named webcamprofiles.xml inside the Vuforia resources, and add or edit your device's camera name.
It works for me.
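I don't have the exact schema at hand, so the safest approach is to duplicate an existing `<webcam>` entry in your own copy of the file and change only its `deviceName` to the name the OS reports for your camera. Purely as an illustration of the idea (element and attribute names here are assumptions, not the verified Vuforia schema):

```xml
<!-- Illustrative only: copy a real entry from your webcamprofiles.xml
     and change the deviceName to match your camera. -->
<webcam deviceName="Sony AX6300">
  <!-- keep the resolution/format settings from the copied entry -->
</webcam>
```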
I'm trying to learn the basics of ARKit, and I created a project in Xcode 9 beta 4 using the template for an AR app with SpriteKit. This project is supposed to already implement a sample app that displays an emoji in an SKLabelNode when you touch the screen.
I'm trying to run this sample on an iPad mini 4 running iOS 11. I had to remove the arkit key from the Info.plist, and then the app runs, but the emojis are not shown. I also tried another example of ARKit with SpriteKit, and its images are not displayed either.
However, I could successfully run Apple's sample app, which renders 3D models, and I can see them. The same goes for another example that uses SceneKit.
How is it possible that I can display 3D SceneKit objects but not 2D SpriteKit objects on my device? What could I be missing?
The iPad mini 4 is not supported: ARKit requires an A9 or better processor.
The devices that use A9 or A10 chips are:
iPhone 6s and 6s Plus
iPhone 7 and 7 Plus
iPhone SE
iPad Pro (9.7, 10.5 or 12.9)
iPad (2017)
Source: http://www.redmondpie.com/ios-11-arkit-compatibility-check-if-your-device-is-compatible-with-apples-new-ar-platform/
I am having an issue with the AR Camera in Tango crashing after a few minutes. I am using the Unity 5.5.0f3 release. I have updated to the latest Tango Core on the Phab 2 Pro, as well as to the most recent SDK as of today (Gankino).
When the AR Camera crashes, it just freezes in the background, but all my AR continues to run just fine and the app functions as it is supposed to, with no problems.
There is no updated AR Camera prefab for Tango in this release; the examples use the deprecated one.
Thank you for the bug report. We would like to understand the cause of the crash, so a logcat would be useful.
We merged the AR camera prefab's functionality into a single Tango Camera prefab that supports AR as well. We marked the old AR camera prefab as deprecated, but it uses some scripts that were modified for the new Tango Camera prefab. It is possible we introduced this bug, but a repro case would be useful.
I want to create a VR game using Unity3D and the Cardboard SDK for PC (Windows), which I'll stream to my phone screen using KinoConsole. I created a simple scene, and when I build it for Android it works fine, showing the dual side-by-side camera view, but a Windows build shows only one normal camera view. Is there a way I can use the Cardboard SDK to show the side-by-side view in a Windows build? If not, is there anything else available to achieve this?
Side by side is easy: just place two cameras where the eyes should be and change their viewport rects to half width. Now you have a side-by-side stereo renderer without any external library. Cardboard also applies lens-distortion correction, but that is not so important in your case.
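The two-camera setup can be sketched in plain Unity C# like this (the script name and eye-separation value are illustrative, not part of the Cardboard SDK):

```csharp
using UnityEngine;

// Sketch: attach to an empty GameObject placed where the head should be.
// Creates two cameras, each rendering to one half of the screen.
public class SideBySideRig : MonoBehaviour
{
    public float eyeSeparation = 0.064f; // typical interpupillary distance in metres (assumption)

    void Start()
    {
        CreateEye("LeftEye", -eyeSeparation / 2f, new Rect(0f, 0f, 0.5f, 1f));
        CreateEye("RightEye", eyeSeparation / 2f, new Rect(0.5f, 0f, 0.5f, 1f));
    }

    void CreateEye(string eyeName, float xOffset, Rect viewport)
    {
        var eye = new GameObject(eyeName).AddComponent<Camera>();
        eye.transform.SetParent(transform, false);
        eye.transform.localPosition = new Vector3(xOffset, 0f, 0f);
        eye.rect = viewport; // render this camera into half of the screen
    }
}
```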
Your second, and much bigger, problem is the gyroscope: you have to somehow communicate the orientation of your headset to the Unity app on your PC. This is not trivial and will probably require finding or building a persistent service on your Android device that sends the orientation data to your desktop app.
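As a rough sketch of that kind of service, the phone side could stream the gyro attitude over UDP. Everything here is an assumption (IP address, port, and message format), and a matching receiver would be needed in the desktop app:

```csharp
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Sketch: runs on the phone and streams the gyroscope attitude
// to a desktop app over UDP, once per frame.
public class GyroSender : MonoBehaviour
{
    public string desktopIp = "192.168.1.100"; // your PC's LAN address (placeholder)
    public int port = 9050;                    // arbitrary choice

    UdpClient client;

    void Start()
    {
        Input.gyro.enabled = true;
        client = new UdpClient();
    }

    void Update()
    {
        Quaternion q = Input.gyro.attitude;
        byte[] data = Encoding.UTF8.GetBytes($"{q.x},{q.y},{q.z},{q.w}");
        client.Send(data, data.Length, desktopIp, port);
    }

    void OnDestroy() => client?.Close();
}
```

On the desktop side, the received quaternion would be parsed and applied to the stereo rig's rotation each frame.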