Measure distance in Flutter with the camera - flutter

In Flutter, I need to measure the distance in cm between two points seen through the camera. I found only one example, using ARCore, but it's for native Android and written in Kotlin.
The app will be for Android devices, so I need ARCore, not ARKit.
I tried working with anchors and nodes in ARCore, but I don't understand how to measure the distance between them.
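The underlying math is simple once you have two anchors: ARCore poses are expressed in meters in a shared world space, so the distance is just the Euclidean distance between the two pose translations, times 100 for centimeters. A minimal sketch in Java (the translations below are made-up example values; in native ARCore you would read them from `anchor.getPose().getTranslation()`, and a Flutter ARCore plugin would expose something equivalent as a vector):

```java
public class AnchorDistance {
    // Euclidean distance between two ARCore anchor translations.
    // ARCore world-space units are meters; multiply by 100 for centimeters.
    public static double distanceCm(float[] a, float[] b) {
        float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return Math.sqrt(dx * dx + dy * dy + dz * dz) * 100.0;
    }

    public static void main(String[] args) {
        float[] p1 = {0.0f, 0.0f, 0.0f};  // first tapped point (meters)
        float[] p2 = {0.3f, 0.0f, 0.4f};  // second tapped point (meters)
        System.out.printf("%.1f cm%n", distanceCm(p1, p2)); // prints 50.0 cm
    }
}
```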

Related

Can ARKit and ARCore use Beacon as an Anchor?

We’re trying to anchor a model using an absolute coordinate system.
I’m using a UWB beacon system to know where devices are.
I’m now trying to tell ARKit (or ARCore) where the origin Anchor is.
I’m trying to use a beacon location as the origin.
Is there an anchor type that will accept a real world beacon as its source?
iBeacon vs UWB
iBeacon is a low-precision technique based on the Received Signal Strength Indicator (RSSI). With iBeacon, the device position can be estimated with an accuracy of 2.5 to 3.5 meters (it's monstrously inaccurate positioning).
A more interesting approach is the "collaboration" of the NearbyInteraction and ARKit frameworks based on U1 (Ultra-Wideband) chips. This trio will give you incomparably more precise positioning.
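To see why RSSI-based ranging is so coarse: iBeacon distance is typically estimated with a log-distance path-loss model, where a few dB of signal noise translates into meters of distance error. A hedged sketch (the `txPower` calibration value and environmental factor `n` below are assumed example numbers, not measurements):

```java
public class RssiRange {
    // Log-distance path-loss model: d = 10^((txPower - rssi) / (10 * n)).
    // txPower is the expected RSSI at 1 m; n is an environmental attenuation
    // factor (roughly 2 in free space, up to ~4 indoors). Both are assumptions.
    public static double distanceMeters(double txPower, double rssi, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }

    public static void main(String[] args) {
        // A mere 4 dB of RSSI noise swings the estimate by several meters:
        System.out.println(distanceMeters(-59, -79, 2.0)); // 10.0 m
        System.out.println(distanceMeters(-59, -75, 2.0)); // ~6.3 m
    }
}
```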

Any package for light estimation for Android in Flutter? How can I find light estimation in ARCore?

I want to check the light (brightness) of the real world, i.e. whether it is dark (night) or light (morning).
I want to find the light estimation in Flutter. We can find it in ARKit, but that's just for iOS; I want it for Android too. So how can I find light estimation for Android in Flutter?
I checked ARCore but found nothing about light estimation for Android.
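For what it's worth, native ARCore does expose light estimation: `Frame.getLightEstimate().getPixelIntensity()` returns the average pixel brightness of the camera image in [0, 1]. Whether a given Flutter plugin surfaces that value is another matter, but once you have the intensity the dark/bright check is trivial. A sketch (the 0.3 threshold is an arbitrary assumption you'd tune yourself):

```java
public class LightCheck {
    // In native ARCore the intensity would come from
    // frame.getLightEstimate().getPixelIntensity(), a float in [0, 1].
    // The 0.3 cutoff below is an assumed threshold, not an ARCore constant.
    public static String classify(double pixelIntensity) {
        return pixelIntensity < 0.3 ? "dark" : "bright";
    }

    public static void main(String[] args) {
        System.out.println(classify(0.1)); // dark  (e.g. night scene)
        System.out.println(classify(0.8)); // bright (e.g. daylight)
    }
}
```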

Implementing Android AR with a UVC / external camera (Unity)

I'm working on an AR project (Unity) and I want to use an external camera instead of my Android device's built-in one. I saw that Vuforia has such a feature, but it claims that by using it, Ground Plane detection won't work at all and Model Targets performance takes a hit.
I also saw that EasyAR has CustomCamera, and there's the Camera2 lib in ARCore.
Question is: what's the best way to approach this? Has anyone experienced using an external camera? And with what AR solution (ARFoundation / Vuforia / EasyAR...)?
Second question: what should I look for when buying said UVC camera? Any examples of one?
Also I'd like to hear about experiences with AR solutions regardless of the external camera thing.
Thanks in advance!
Unfortunately, this is unlikely to work with an external camera.
A key part of AR is having a precise calibration of the camera's optics. Without that, it's not possible to accurately analyze the world to draw new objects in it or other AR effects.
A UVC webcam doesn't come with any such calibration information, so it would have to be calibrated somehow and the calibration data given to Unity's AR engine. I don't know if that's possible with Unity in some way.
Note that not all internal camera devices on Android are calibrated well enough for AR either, but the ARCore team certifies devices that have sufficient calibration in place.
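To make "calibration information" concrete: at minimum it's the pinhole intrinsics, i.e. focal lengths (fx, fy) and principal point (cx, cy) in pixels, plus lens distortion coefficients. A small sketch with made-up numbers showing how a wrong focal length shifts where a 3D point lands on screen, which is exactly why AR overlays drift on an uncalibrated camera:

```java
public class Intrinsics {
    // Pinhole projection: u = fx * x/z + cx, v = fy * y/z + cy.
    // All numeric values used below are invented example intrinsics.
    public static double[] project(double x, double y, double z,
                                   double fx, double fy, double cx, double cy) {
        return new double[]{fx * x / z + cx, fy * y / z + cy};
    }

    public static void main(String[] args) {
        // Same 3D point, two different guesses at the focal length:
        double[] a = project(0.1, 0.0, 1.0, 1000, 1000, 640, 360);
        double[] b = project(0.1, 0.0, 1.0, 1400, 1400, 640, 360);
        // 40 px of error purely from a miscalibrated focal length.
        System.out.println(a[0] + " vs " + b[0]); // 740.0 vs 780.0
    }
}
```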

Unity overlay objects on top of device camera (AR)

I am working with the Vive and a mobile tablet. The tablet has a tracker attached and then there is another tracker in the room.
On the tablet I output the device's camera to the screen and adjust the position and rotation according to the device's position. What I want to do now is render the other tracker's position, AR-like, on top of the camera output.
I tried googling this, but so far I could only find how to make AR with Vuforia, which I don't need.
I really just need some keywords to start searching, because I don't really know how begin.
There are a lot of ways to make AR.
I propose using an API, such as:
Vuforia AR package
ARToolKit
Wikitude
etc. (search for "augmented reality API")
Alternatively (I have tried this a lot),
you can use corner detection and feature extraction methods (everything under the "image processing" umbrella).
If you don't want to work with an image marker or target, you should get data from sensors; in a mobile application this is easy, but otherwise you have to add the sensors (gyroscope and accelerometer) yourself.
I hope I understood what you want.
For image processing, use OpenCV in C++ or Java, or EmguCV in C#.
If your problem is the shader in Unity:
you could add a background layer with an unlit shader,
and put a plane (textured with the camera render) in front of the view of your Unity camera object.
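As a starting keyword: what you describe is a world-to-screen projection. In Unity you would call `Camera.WorldToScreenPoint(trackerWorldPos)` each frame and draw a UI marker at the result. Here is a simplified sketch of the math behind that call (rotation handling omitted and the camera assumed to look straight down +Z; all numbers are examples):

```java
public class TrackerOverlay {
    // Simplified world-to-screen projection for a camera at camPos looking
    // down +Z with vertical field of view fovDeg, viewport w x h pixels.
    // A real Unity project would use cam.WorldToScreenPoint() instead,
    // which also handles camera rotation.
    public static double[] worldToScreen(double[] camPos, double[] world,
                                         double fovDeg, int w, int h) {
        double x = world[0] - camPos[0];
        double y = world[1] - camPos[1];
        double z = world[2] - camPos[2];  // camera-space depth
        double f = (h / 2.0) / Math.tan(Math.toRadians(fovDeg) / 2.0);
        return new double[]{w / 2.0 + f * x / z, h / 2.0 - f * y / z};
    }

    public static void main(String[] args) {
        // A tracker 2 m straight ahead lands at the screen center:
        double[] px = worldToScreen(new double[]{0, 0, 0},
                                    new double[]{0, 0, 2}, 60, 1280, 720);
        System.out.println(px[0] + ", " + px[1]); // 640.0, 360.0
    }
}
```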

Google-VR SDK: Unity gvraudiolistener script and world scale

I'm working on a Frogger-style Google Cardboard VR game where I have a river, and the river plays a "waves" audio clip. Once I scale my game world up 100x, the clip is faint: I can hear the sound play, but only in a very small radius. Even if I turn the Global Gain (dB) in my gvraudiolistener script all the way up, I can only hear the waves in a very small radius. I increased Volume Rolloff: Min Distance to 1000 and Max Distance to 50000, and maxed out Gain (dB) in my gvraudiosource script, but the small radius is still an issue. How do I account for my world scale change with my gvraudiolistener script?
In the tutorial I'm following, the sound issue was fixed by maxing out a World Scale property.
However, it would appear the latest SDK no longer has this World Scale property. Does anyone know how I may get my sound working correctly to account for the world upscaling?
Update: the World Scale property was removed between v0.6 and v1.0 of the Google Cardboard SDK. How do I go about achieving the same effect?
I have an official answer from the GVR SDK team:
Hi, we removed the world scale parameter to be able to integrate Unity's internal distance attenuation calculation into our Unity plugin. To fix this, you'll need to rescale your environment to match 1 unit = 1 meter.
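A simplified model of why rescaling fixes it: with distance-based rolloff, gain beyond the minimum distance falls off roughly as 1/d, so scaling the world 100x without rescaling pushes every source 100x farther away acoustically. This sketch is a hedged simplification of Unity's logarithmic rolloff, not its exact curve:

```java
public class Attenuation {
    // Simplified 1/d rolloff: full volume inside minDistance, then gain
    // decays inversely with distance. Approximates, not reproduces, Unity's
    // logarithmic rolloff mode.
    public static double gain(double distance, double minDistance) {
        return distance <= minDistance ? 1.0 : minDistance / distance;
    }

    public static void main(String[] args) {
        System.out.println(gain(5, 1));   // 0.2   at 5 units in a 1 unit = 1 m world
        System.out.println(gain(500, 1)); // 0.002 for the same source after a 100x world scale
    }
}
```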