Add real-world texture to .obj file in RealityKit - LiDAR scan - Swift

I have managed to develop an application that can use both an external hardware scanner and the built-in LiDAR scanner of an Apple device to produce a .obj file. The external scanner comes with an SDK, which has worked well.
Where I am struggling is with the built-in scanner on the most recent iPad. I have managed to get a mesh generated (which looks terrible, by the way; any hints on producing a higher-quality mesh are welcome), but I cannot seem to capture a real-world texture and map it onto the .obj file. Does anyone know how this can be achieved? Nobody online currently seems to have managed it, and it's very frustrating.
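For the geometry half of this, here is a minimal sketch of the approach most people use, assuming the mesh comes from ARKit's scene reconstruction (ARWorldTrackingConfiguration.sceneReconstruction = .mesh): collect the ARMeshAnchor instances from the session and export them through ModelIO, which can write .obj. The helper name and the omission of each anchor's world transform are my own simplifications, not an official recipe. Note that ARMeshGeometry carries no texture coordinates and no color, so this produces untextured geometry only; getting a real-world texture onto it means saving camera frames during the scan, projecting them onto the mesh, and generating UVs yourself, which ARKit/RealityKit does not do out of the box.

```swift
import ARKit
import MetalKit
import ModelIO

// Sketch: export ARKit scene-reconstruction meshes to a .obj file.
// `exportOBJ` is a hypothetical helper; the anchors come from e.g.
// session.currentFrame?.anchors.compactMap { $0 as? ARMeshAnchor }.
func exportOBJ(meshAnchors: [ARMeshAnchor], to url: URL, device: MTLDevice) throws {
    let allocator = MTKMeshBufferAllocator(device: device)
    let asset = MDLAsset()

    for anchor in meshAnchors {
        // Vertices are in anchor-local space; a full solution would
        // bake anchor.transform into the positions before exporting.
        let geometry = anchor.geometry

        // Copy the raw float3 vertex positions out of the Metal buffer.
        let vertices = geometry.vertices
        let vertexData = Data(bytes: vertices.buffer.contents().advanced(by: vertices.offset),
                              count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: vertexData, type: .vertex)

        // Copy the triangle indices (ARKit currently supplies 32-bit indices).
        let faces = geometry.faces
        let indexCount = faces.count * faces.indexCountPerPrimitive
        let indexData = Data(bytes: faces.buffer.contents(),
                             count: faces.bytesPerIndex * indexCount)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: indexCount,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        // Position-only layout: ARMeshGeometry has no UVs to describe.
        let descriptor = MDLVertexDescriptor()
        descriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                      format: .float3,
                                                      offset: 0,
                                                      bufferIndex: 0)
        descriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        asset.add(MDLMesh(vertexBuffer: vertexBuffer,
                          vertexCount: vertices.count,
                          descriptor: descriptor,
                          submeshes: [submesh]))
    }

    try asset.export(to: url)   // ModelIO picks the OBJ writer from the file extension
}
```

As for mesh quality, ARKit exposes very little control: the reconstruction is built for occlusion and physics rather than scanning-grade output, so scanning slowly, at close range, and in good lighting is about all you can do to improve it.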

Related

How do I render over a tracked image on HoloLens?

I'd like the HoloLens to take in the camera feed and project an image over tracked images, and I can't seem to find a concrete explanation of how to do this online. I'd like to avoid using Vuforia etc. for this.
I'm currently using the AR Foundation Tracked Image Manager (https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@2.1/manual/tracked-image-manager.html) to achieve the same functionality on mobile; however, it doesn't seem to work very well on HoloLens.
Any help would be very appreciated, thanks!
AR Foundation is a Unity tool, and AR Foundation's 2D image tracking feature is not supported on the HoloLens platform for now. You can refer to this link to learn more about feature support per platform: Platform Support
Currently, Microsoft does not provide an official library that supports image tracking for HoloLens. That sort of thing is possible with OpenCV, though: you can implement it yourself or use a third-party library.
Besides, if you are using a HoloLens 2 and making a QR code the tracked image is an acceptable option in your project, I recommend using Microsoft.MixedReality.QR to detect a QR code in the environment and get its coordinate system. For more information, see: QR code tracking

Fixing an object in place when the camera opens - Unity AR

I'm trying to create an AR game in Unity for an educational project.
I want to create something like Pokémon GO: when the camera opens, the object is fixed somewhere in the real world and you have to search for it with the camera.
My problem is that ARCore and Vuforia ground detection (I don't want to use targets) are limited to only a few phone models, and I tried the Kudan SDK but it didn't work.
Can anyone suggest a tool or a tutorial on how to do this? I just need ideas, or someone to tell me where to start.
Thanks in advance.
Plane detection is limited to only some phones at this time partly because older, less powerful phones cannot handle the required computation.
If you want to make an app with the largest reach, Vuforia is probably the way to go. Personally, I am not a fan of Vuforia, and I would suggest you use ARCore (and/or ARKit for iOS).
Since this is an educational tool and not a game, are you certain Unity is the right choice? You may well be able to do it in Unity, but choosing the right platform for a project is important; you could build a native app instead.
If you want to work with ARCore and Unity (which is a great choice in general), here is the first in a series of tutorials that can get you started as a total beginner.
Let me know if you have other questions :)
You can use the phone's GPS data: when the user arrives at a specific place, you show the object. Search for "GPS-based augmented reality" on Google, and check this video: https://www.youtube.com/watch?v=X6djed8e4n0
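The question is Unity-specific, but to make the "show the object when the player reaches the place" idea concrete, here is a minimal native sketch in Swift using CoreLocation; in Unity the analogue would be the Input.location service. The target coordinates and the 20 m trigger radius are arbitrary placeholders.

```swift
import CoreLocation

// Sketch: reveal an AR object once the player is near a real-world spot.
// Requires NSLocationWhenInUseUsageDescription in Info.plist.
final class PlaceTrigger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Hypothetical target: where the hidden object "lives".
    private let target = CLLocation(latitude: 48.8584, longitude: 2.2945)

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        if current.distance(from: target) < 20 {   // within ~20 m of the target
            // Spawn or unhide the object in the AR scene here.
            print("Player arrived: show the object")
        }
    }
}
```

Keep in mind that GPS is only accurate to a few meters, so it works for "you are near the spot" triggers rather than precise placement; the fine positioning still comes from the AR tracking once the object is spawned.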

Vuforia + Unity: dynamic 3D models and triggers

I want to build a cross-platform mobile app that can identify QR codes and render a 3D model on them using AR.
I found that Unity in combination with Vuforia will handle the AR part, but is it possible to download and use 3D models dynamically?
Thanks
I guess what you're looking for is called an AssetBundle. Be aware that downloading a large model (plus textures) at run time can be heavy and will depend greatly on the device's internet connection.
Hope this helps.

Is there a workaround for scanning and exporting a real-world object with HoloLens, with all of its texture details?

I'm about to develop something based on scanning real-world objects. However, I have not found a way to scan a single object with its texture. Is there a workaround? I've heard about the Kinect; will it work with HoloLens?

Load 3D objects dynamically in Cocos3D

I am an iOS developer but a newbie in Cocos. I am building an augmented reality app in which I want to load 3D objects at run time and show them when a specific marker is detected. The following questions are about Cocos3D and related to my own task; forgive me if I ask something silly.
Can I load 3D objects at run time in Cocos3D?
Is it possible to fetch those 3D objects from my server via a web service? These objects may be in .fbx format, so does Cocos3D understand that type, or should I use something else?
Does Cocos3D support Objective-C? I only have knowledge of Objective-C.
Will integration with the first part of my app, which is being developed natively for iOS, be easy? I was thinking of doing this task in Unity, but that integration is so hectic that I decided to look elsewhere, and I was advised to go with Cocos3D.
Please help me with this. Thanks in advance.
Yes.
Download from server: yes. You could use NSData (dataWithContentsOfURL:) to download a file; a Swift version is sketched after these answers. But the last time I checked (about 4 months ago), Cocos3D only supported the POD file format, so .fbx would need converting first.
Yes, Cocos3D is written in Objective-C (haven't you downloaded the code yet?).
"Easy" is relative: it might be super easy for a pro and next to impossible for a beginner. It's hard to say without knowing you, your project, your requirements, and your goals.
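Since the download step is the part that usually trips people up, here is a minimal sketch of it in Swift for comparison; the Objective-C equivalent is [NSData dataWithContentsOfURL:] run off the main thread. The server URL and file name are hypothetical placeholders.

```swift
import Foundation

// Sketch: fetch a model file at run time and park it in Documents,
// from where the engine's loader can pick it up.
let modelURL = URL(string: "https://example.com/models/marker1.pod")!  // hypothetical
let task = URLSession.shared.downloadTask(with: modelURL) { tempURL, _, error in
    guard let tempURL = tempURL, error == nil else {
        print("Download failed: \(error?.localizedDescription ?? "unknown error")")
        return
    }
    let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destination = docs.appendingPathComponent("marker1.pod")
    try? FileManager.default.removeItem(at: destination)   // overwrite any stale copy
    do {
        try FileManager.default.moveItem(at: tempURL, to: destination)
        // Hand `destination` to the engine's loader on the main thread here.
    } catch {
        print("Could not save model: \(error)")
    }
}
task.resume()
```

Whatever you download still has to be in a format the engine can parse from disk, which for Cocos3D at the time meant converting your .fbx to POD offline first.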