Why is Unity not recognizing my digital camera?

I have messed around with Unity and Vuforia. I'm all set up and got it to work with an image that spawns 3D objects. So far my project has been done with the FaceTime camera on my MacBook Pro 2018. Now I want to try and move over to my digital camera, a Sony AX6300. But when I connect the camera, Unity won't recognize it; I can still only choose the FaceTime camera built into my Mac. Can anyone in here help out, maybe?

You have to edit the XML file named webcamprofiles.xml inside the Vuforia resources folder and add or edit an entry with your device's camera name...
It works for me.
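If you are not sure which name to enter, a quick way to check is to log every camera device Unity itself can see. The sketch below is only an illustration (the script name is made up; the API is Unity's standard WebCamTexture.devices): it prints the exact device names, and if the Sony camera never shows up in this list, Unity is not receiving it as a webcam at all and the profile edit alone won't help.

    using UnityEngine;

    // Hypothetical helper: logs every camera device Unity can enumerate, so the
    // exact name can be copied into the Vuforia webcam profile entry.
    public class ListCameraDevices : MonoBehaviour
    {
        void Start()
        {
            foreach (WebCamDevice device in WebCamTexture.devices)
            {
                Debug.Log("Camera device found: " + device.name);
            }
        }
    }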

Related

Vuforia Ground Plane Detection doesn't work when using a webcam (IP webcam via Iriun Webcam on an Android phone)

I thought of using Vuforia as it allows testing with a webcam. So I downloaded Iriun Webcam (an IP webcam app for Android) and successfully got the video stream inside the Unity Editor when I press "Play".
I created a minimal AR example, where my app would detect a plane and a tap would place an object.
My issue is that when I build the APK and test it on my phone, it works perfectly as it should, but when I use the Play button inside the editor, the video stream is captured but it won't do any AR stuff (like the plane detection it was supposed to do).
Please help me with any possible reason there could be, as I couldn't find anyone else facing such an issue.
As mentioned here, https://library.vuforia.com/unity-extension/vuforia-play-mode-unity, Ground Plane is not supported when using a webcam in Unity Play Mode. Ground Plane relies on a robust device tracker, and this is not available when using a webcam. However, to help development, it is possible to record a sequence on a device and then use it in Play Mode: https://library.vuforia.com/platform-support/recording-and-playback. The other option is to emulate the Ground Plane behavior using an Image Target, as discussed here: https://library.vuforia.com/ground-plane/introduction-ground-plane-unity.
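If you go the Image Target route, the tap-to-place part itself does not need any Ground Plane specific code; a plain raycast against a collider on (or parented under) the target also works in Play Mode with a webcam. A minimal sketch, assuming the target has a collider and that the fields are assigned in the Inspector (all names here are placeholders):

    using UnityEngine;

    // Hypothetical tap-to-place helper: raycasts from the screen point of a tap
    // (or mouse click in Play Mode) and spawns a prefab where it hits a collider.
    public class TapToPlaceOnTarget : MonoBehaviour
    {
        public GameObject objectToPlace;   // prefab to spawn
        public Camera arCamera;            // the AR camera rendering the scene

        void Update()
        {
            bool tapped = Input.GetMouseButtonDown(0) ||
                          (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began);
            if (!tapped) return;

            Ray ray = arCamera.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                // Align the spawned object's up axis with the surface normal.
                Instantiate(objectToPlace, hit.point, Quaternion.FromToRotation(Vector3.up, hit.normal));
            }
        }
    }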

Unity: project works in the editor but black screen when built

I'm working on a project using Unity 2019 LTS and some Unity SDKs / packages:
Mapbox SDK
DreamWorld SDK (the SDK of my AR headset)
some other default AR packages (Foundation, Subsystem)
I would like to reuse the Mapbox World-scale AR example in order to implement the possibility of moving the scene according to my AR headset's position.
To do so, I removed the default main camera of the example (in AR Root) and added the camera for my headset instead, as explained in the headset's docs (DW Developer Kit SDK).
Here are some pictures of what I've done:
Now here's my problem: when I run the project in the editor in Play Mode, everything works perfectly fine and I see the camera rotation following the position of my AR headset.
However, if I build the project, I cannot see the camera's "view". I know that the project runs because I can still see the overlay menu provided by the Mapbox World-scale example, but not my camera.
I searched online for a solution to my issue, but I only found answers about building for Android and iPhone, while I'm trying to build for my laptop.
The fact that I see a black screen (and the overlay) suggests to me that Unity cannot find a camera to show me the scene.
I just started using Unity, so it is possible that I missed something obvious, but I don't know what.
If someone has any idea of what my problem is...
In case the suggestion from the comments about the in-game logs does not work, you can check the external log file.
According to https://docs.unity3d.com/Manual/LogFiles.html,
it is found under "C:\Users\YOUR_USERNAME\AppData\LocalLow\CompanyName\ProductName\Player.log".
CompanyName and ProductName are two names you can set in the Unity Project Settings, but there are default values.
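If you want the build itself to tell you whether any camera is present, you can also log a few lines at startup and read them back in that Player.log. A small diagnostic sketch (the script name is made up; Application.consoleLogPath and the Camera APIs are standard Unity in recent versions):

    using UnityEngine;

    // Hypothetical diagnostic: attach to any GameObject in the first scene so the
    // build writes camera information into Player.log on startup.
    public class CameraDiagnostics : MonoBehaviour
    {
        void Start()
        {
            Debug.Log("Player log file: " + Application.consoleLogPath);
            Debug.Log("Enabled cameras in scene: " + Camera.allCamerasCount);
            if (Camera.main != null)
                Debug.Log("Main camera: " + Camera.main.name + ", enabled: " + Camera.main.enabled);
            else
                Debug.Log("No camera tagged 'MainCamera' was found.");
        }
    }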

Why is the sample Unity ARCore scene not detecting planes?

I installed the arcore-preview.apk on my Samsung Galaxy S8 (SM-G950U). Then I installed the sample ARCore scene from Unity onto the S8. The app starts up fine, but does not draw any point clouds or blue guidelines displaying the planes. I also cannot place the little Andy bot onto a plane. All I get is the UI saying, "Searching For Surfaces". Any ideas? I'll be continuously tinkering until I find a solution.
The problem seemed to have been the version of Unity (2017.2.0b10) I was using. So I just went back to the Unity Beta Archives, downloaded 2017.2.0b9, and then downloaded the Android Target Support specifically for 2017.2.0b9. All is now working correctly!
You need to get beta version 2017.2.0b9; it won't work on the current stable one.

Recording VR motion video

I want to record a VR motion video of my house using A-Frame so that I can show a demo view of the house. What JS files need to be included? How can I do it?
I assume you just want to record 360º video — if so, a 360º camera is the place to start. See: http://thewirecutter.com/reviews/best-360-degree-camera/
You may want to test things out on your device (Gear VR, Cardboard, Vive, whatever) before investing in a camera. Video playback in A-Frame may have some issues, particularly on mobile devices.
Example: https://aframe.io/examples/showcase/videosphere/
Hi Kumar, yes, you can do this without a 360º camera by using a normal mobile phone, but to place that video you have to use a recent version of Unity that supports not only the 360º video format but also a plane on which you can put the video as a medium.
If you are using Unity 4.6 to 5.3, you have to use a package like Easy Movie Texture, which gives you different options for placing video:
https://github.com/maazirfan/Easy-Movie-Texture-for-Unity
Use it wherever you want: VR, Gear VR, HTC Vive.
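In recent Unity versions (5.6 and later) you can often skip a plugin entirely and use the built-in VideoPlayer component instead. A rough sketch, assuming the script sits on an inward-facing sphere (or any mesh you want the video on) and that the file name and path are just examples:

    using UnityEngine;
    using UnityEngine.Video;

    // Hypothetical sketch: plays a video file onto the material of whatever
    // Renderer this script sits on, e.g. an inward-facing sphere for a simple
    // 360-style viewer.
    [RequireComponent(typeof(VideoPlayer))]
    public class HouseTourVideo : MonoBehaviour
    {
        void Start()
        {
            var player = GetComponent<VideoPlayer>();
            // Example path: a clip placed in StreamingAssets.
            player.url = System.IO.Path.Combine(Application.streamingAssetsPath, "house_tour.mp4");
            player.renderMode = VideoRenderMode.MaterialOverride;  // draw onto this object's material
            player.isLooping = true;
            player.Play();
        }
    }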

How to use the Cardboard SDK for a PC VR game?

I want to create a VR game using Unity3D and the Cardboard SDK for PC (Windows), which I'll stream to my phone screen using kinoConsol. I created a simple scene; when I build it for Android, it works fine, i.e. it shows the dual SBS camera (screen), but a Windows build shows only one normal camera (screen). Is there a way I can use the Cardboard SDK to show the SBS camera (screen) in a Windows build? If not, is there anything else available to achieve this?
Side by side is easy: just place two cameras where the eyes should be and change their viewport rects to half width. Now you have a side-by-side stereo renderer without any external library. Cardboard also adds some distortion for the lenses, but that is not so important in your case.
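A rough sketch of that two-camera setup; the eye separation here is a placeholder value, not Cardboard's calibrated settings:

    using UnityEngine;

    // Hypothetical stereo rig: spawns two cameras, each rendering into half of
    // the screen, giving a side-by-side view without any external SDK.
    public class SimpleStereoRig : MonoBehaviour
    {
        public float eyeSeparation = 0.064f;  // roughly 64 mm interpupillary distance

        void Start()
        {
            CreateEye("LeftEye",  -eyeSeparation * 0.5f, new Rect(0f, 0f, 0.5f, 1f));
            CreateEye("RightEye",  eyeSeparation * 0.5f, new Rect(0.5f, 0f, 0.5f, 1f));
        }

        void CreateEye(string eyeName, float xOffset, Rect viewport)
        {
            Camera eye = new GameObject(eyeName).AddComponent<Camera>();
            eye.transform.SetParent(transform, false);
            eye.transform.localPosition = new Vector3(xOffset, 0f, 0f);
            eye.rect = viewport;  // left or right half of the screen
        }
    }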
Your second, and much bigger, problem is the gyroscope: you have to somehow communicate the orientation of your headset to your Unity app on your PC. This is not trivial and will probably require finding or building a persistent service on your Android device that sends the orientation data to your desktop app.
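One possible (untested, hypothetical) shape for that link is a small script on the phone that streams the gyro attitude over UDP to the desktop build, which would then parse the quaternion and apply it to the stereo rig. The address, port, and message format below are placeholders:

    using System.Net.Sockets;
    using System.Text;
    using UnityEngine;

    // Hypothetical phone-side sender: streams the device orientation to the PC.
    // A matching receiver on the PC would parse the quaternion and rotate the rig.
    public class GyroSender : MonoBehaviour
    {
        public string pcAddress = "192.168.1.50";  // placeholder PC address
        public int port = 9050;                    // placeholder port

        UdpClient client;

        void Start()
        {
            Input.gyro.enabled = true;   // start gyroscope sampling
            client = new UdpClient();
        }

        void Update()
        {
            Quaternion q = Input.gyro.attitude;  // current device orientation
            byte[] data = Encoding.UTF8.GetBytes(q.x + ";" + q.y + ";" + q.z + ";" + q.w);
            client.Send(data, data.Length, pcAddress, port);
        }

        void OnDestroy()
        {
            if (client != null) client.Close();
        }
    }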