How to draw lines from a browser on a remote mobile AR app? - unity3d

I am looking for a solution to share the screen from a mobile AR app (ARKit or Unity AR Foundation).
The screen needs to be shared to a browser on a desktop, and it should be possible to draw lines on that view from the browser using the mouse, with the lines appearing in the AR environment on the mobile app that is sharing the screen.
After some investigation, there does not seem to be a viable solution for truly sharing the same AR instance between a browser and a mobile device the way you can between two mobile devices.
There should, however, be some sort of workaround possible, as it can be done with Vuforia Chalk AR.
Here is a GIF showing how it works: [AR Drawing demo]
Sharing a video stream seems to be possible.
I am specifically trying to figure out how the line is drawn from the browser and then displayed on the mobile AR app.
How can you achieve the same functionality with open-source alternatives, or with Unity and custom code (using Vuforia is not an option)?
I am looking for a tutorial or some directions on how this can be implemented.
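For reference, here is a minimal sketch of the mobile-side receiving logic, assuming the browser sends back normalized (0..1) screen coordinates over some transport such as a WebSocket or a WebRTC data channel; the OnRemotePoint entry point and the transport itself are assumptions, not part of any particular library:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: receives normalized screen points drawn in the browser and anchors
// them onto detected AR planes as a world-space line. The networking layer
// that calls OnRemotePoint (WebSocket, WebRTC data channel, ...) is assumed.
public class RemoteLineDrawer : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // AR Foundation raycaster
    [SerializeField] LineRenderer linePrefab;         // thin world-space line

    LineRenderer currentLine;
    readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    // Call this from your network code for every point the browser sends.
    public void OnRemotePoint(float nx, float ny, bool strokeStart)
    {
        // Normalized browser coordinates -> pixel coordinates on this device.
        var screenPoint = new Vector2(nx * Screen.width, ny * Screen.height);

        // Raycast against detected planes to place the point in world space.
        if (!raycastManager.Raycast(screenPoint, hits, TrackableType.PlaneWithinPolygon))
            return;

        Vector3 worldPos = hits[0].pose.position;

        if (strokeStart || currentLine == null)
            currentLine = Instantiate(linePrefab);

        currentLine.positionCount++;
        currentLine.SetPosition(currentLine.positionCount - 1, worldPos);
    }
}

With this split, only the video travels from phone to browser (e.g. via WebRTC) and only small coordinate messages travel back, which keeps the latency of the drawn line low.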

Related

Flutter Camera plugin zoomed in

I am developing a document scanner app in Flutter. However, all of the Flutter apps I have seen use the camera plugin, which has a major issue that I found out about here:
https://github.com/flutter/flutter/issues/45906#issuecomment-1124244943
I want a full-resolution preview (for taking photos) rather than the camera's video preview (which is also a little zoomed in). Any ideas?

Unity: project works in editor but black screen when built

I'm working on a project using Unity 2019 LTS and some Unity SDKs/packages:
Mapbox SDK
DreamWorld SDK (the SDK of my AR headset)
some other default AR packages (Foundation, Subsystem)
I would like to reuse the Mapbox World-scale AR example in order to move the scene according to my AR headset's position.
To do so, I removed the example's default main camera (in AR Root) and added the camera for my headset instead, as explained in the headset's docs (DW Developer Kit SDK).
Here are some pictures of what I've done:
Now here's my problem: when I run the project in the editor in Play mode, everything works perfectly fine and I see the camera rotation following the position of my AR headset.
However, when I build the project, I cannot see the camera's view. I know the project is running because I can still see the overlay menu provided by the Mapbox World-scale example, but not my camera.
Editor: [screenshot]
Build: [screenshot]
I searched online for a solution to my issue, but I only found answers about building for Android and iPhone, while I am trying to build for my laptop.
The fact that I see a black screen (and the overlay) suggests to me that Unity cannot find a camera to render the scene.
I just started using Unity, so it is possible that I missed something obvious, but I don't know what.
If someone has any idea what my problem is...
In case the in-game logs suggestion from the comments does not work, you can check the external log file.
According to https://docs.unity3d.com/Manual/LogFiles.html
it is found under "C:\Users\YOUR_USERNAME\AppData\LocalLow\CompanyName\ProductName\Player.log".
CompanyName and ProductName are two names you can set somewhere in the Unity Project Settings, but there are default values.
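If you do want the in-game logs route, a minimal sketch of such an overlay (the class name is just a placeholder) is to hook Unity's log callback and draw the most recent messages with OnGUI:

using System.Collections.Generic;
using UnityEngine;

// Minimal in-game log overlay: hooks Unity's log callback and draws the most
// recent messages on screen, so a built player can be debugged without the
// editor console.
public class LogOverlay : MonoBehaviour
{
    readonly Queue<string> lines = new Queue<string>();

    void OnEnable()  { Application.logMessageReceived += HandleLog; }
    void OnDisable() { Application.logMessageReceived -= HandleLog; }

    void HandleLog(string message, string stackTrace, LogType type)
    {
        lines.Enqueue($"[{type}] {message}");
        while (lines.Count > 15) lines.Dequeue(); // keep the overlay short
    }

    void OnGUI()
    {
        GUILayout.BeginArea(new Rect(10, 10, Screen.width - 20, Screen.height - 20));
        foreach (var line in lines) GUILayout.Label(line);
        GUILayout.EndArea();
    }
}

You can also log Application.consoleLogPath at startup to print the exact location of the Player.log file on the current machine.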

How to use the Cardboard SDK for a PC VR game?

I want to create a VR game using Unity3D and the Cardboard SDK for PC (Windows), which I'll stream to my phone screen using KinoConsole. I created a simple scene; when I build it for Android it works fine, meaning it shows the dual side-by-side camera (screen), but a Windows build shows only one normal camera (screen). Is there a way I can use the Cardboard SDK to show the side-by-side camera (screen) in a Windows build? If not, is there anything else available to achieve this?
Side-by-side is easy: just place two cameras where the eyes should be and change their viewport rects to half width. Now you have a side-by-side stereo renderer without any external library. Cardboard also applies some distortion to compensate for the lenses, but that is not that important in your case.
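A minimal sketch of that two-camera rig (the eye separation value is just a placeholder to tune):

using UnityEngine;

// Minimal side-by-side stereo rig: two cameras, each rendering to one half of
// the screen via its viewport rect. Attach to an empty "head" GameObject.
public class SideBySideRig : MonoBehaviour
{
    [SerializeField] float eyeSeparation = 0.064f; // placeholder, roughly average IPD in meters

    void Start()
    {
        CreateEye("LeftEye",  -eyeSeparation / 2f, new Rect(0f,   0f, 0.5f, 1f));
        CreateEye("RightEye",  eyeSeparation / 2f, new Rect(0.5f, 0f, 0.5f, 1f));
    }

    void CreateEye(string name, float xOffset, Rect viewport)
    {
        var eye = new GameObject(name).AddComponent<Camera>();
        eye.transform.SetParent(transform, false);
        eye.transform.localPosition = new Vector3(xOffset, 0f, 0f);
        eye.rect = viewport; // half-width viewport => side-by-side output
    }
}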
Your second, and much bigger, problem is the gyroscope: you have to somehow communicate the orientation of your headset to your Unity app on your PC. This is not trivial and will probably require finding or building a persistent service on your Android device that sends the orientation data to your desktop app.
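As a rough sketch of the phone side, assuming a plain UDP socket and a hard-coded PC address (both placeholders), the orientation could be sent like this:

using System.Net.Sockets;
using UnityEngine;

// Rough sketch of the Android side: reads the device gyroscope each frame and
// sends the attitude quaternion to the PC over UDP. The desktop app would
// listen on the same port and apply the rotation to the stereo rig's head.
public class OrientationSender : MonoBehaviour
{
    [SerializeField] string pcAddress = "192.168.1.100"; // placeholder
    [SerializeField] int port = 9050;                    // placeholder

    UdpClient udp;
    readonly float[] values = new float[4];
    readonly byte[] payload = new byte[16];

    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default
        udp = new UdpClient();
    }

    void Update()
    {
        Quaternion q = Input.gyro.attitude;
        values[0] = q.x; values[1] = q.y; values[2] = q.z; values[3] = q.w;
        System.Buffer.BlockCopy(values, 0, payload, 0, 16); // 4 floats -> 16 bytes
        udp.Send(payload, payload.Length, pcAddress, port);
    }

    void OnDestroy() { udp?.Close(); }
}

UDP is a reasonable choice here because only the newest orientation matters; a dropped packet is immediately superseded by the next one.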

Image rendering on HTML5 canvas on iPhone 5 running iOS 7

On a web page in a mobile web application I am developing, I have a file input control that is used to take a picture with the mobile device's camera. The image from the camera is then drawn onto an HTML5 canvas object on the same page.
The issue I am having is that if the web application is run on an iPhone 5 running iOS 7 (in the Safari web browser), the image appears extremely distorted. Specifically, the image appears to be vertically squished when drawn on the canvas. If the same web application is run on an Android device, no distortion is seen.
In previous versions of iOS and on iOS devices prior to iPhone 5, some vertical squishing was seen (although not as bad as this), and a jquery plugin named megapixel-image.js could be used to correct the vertical squishing. This tool unfortunately is not compatible with iOS 7.
Is this related to image subsampling in Safari or something else? What can be done to correct this? I obviously cannot have my users see this distorted image. Any mobile web app developer who wants to use the camera and HTML5 canvas is going to run into this, so a solution is mandatory.
megapixel-image.js does handle this correctly. I found I was passing some parameters to the plugin incorrectly, causing it not to work. My thanks to Ray Nicholus for his assistance with this issue.

Access iPhone camera stream with a window on top (PhoneGap)

I have been looking around but have not found anything pertaining to PhoneGap. Is there a way to access the iPhone camera and capture the stream while keeping the WebView on top? Some sample code would be sweet.