How to Determine Metal Device VR Support - swift

I am looking to create a macOS 10.13 application that tests for virtual reality support. What would be the best way to test a Mac for VR support, considering the CPU, GPU, and connectivity requirements?
Also, given a MTLDevice, is there a way to check for VR support using the Metal API?
I have tried to check the default system Metal device for macOS GPUFamily1_v3 support, but that does not completely answer the question of whether a device supports VR on macOS. The code below is what I use to test support for the Metal feature set.
import Metal

let defaultDevice = MTLCreateSystemDefaultDevice()
print(defaultDevice?.supportsFeatureSet(.macOS_GPUFamily1_v3) ?? false)

There is no such thing as "Metal VR support": no special capability or GPU-level feature is required to render for VR. Likewise, there is no single "spec good enough for VR", because the requirements depend entirely on the resolution and frame rate of the particular headset used, as well as on your application.
You can query the IOService layer to get GPU model and specs, but you will have to extrapolate the capabilities for yourself based on your personal requirements.
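As a starting point, you can at least enumerate the GPUs Metal can see and read their basic properties. The sketch below is illustrative only: Metal exposes no "VR capable" flag, so any threshold you apply to these numbers (memory size, discrete vs. integrated) is your own requirement, not an official one.

```swift
import Foundation
#if canImport(Metal)
import Metal
#endif

// One summary line per GPU Metal can see (empty list where Metal is
// unavailable). Judging whether a given GPU is "good enough for VR"
// is still up to you; Metal itself has no such query.
func gpuSummaries() -> [String] {
    #if canImport(Metal)
    return MTLCopyAllDevices().map { device in
        let workingSetGiB = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824.0
        let kind = device.isLowPower ? "integrated" : "discrete"
        return "\(device.name): \(kind), ~\(String(format: "%.1f", workingSetGiB)) GiB working set"
    }
    #else
    return []   // Metal is not available on this platform
    #endif
}

for line in gpuSummaries() {
    print(line)
}
```

On a MacBook Pro with both GPUs, this prints one line for the integrated GPU and one for the discrete GPU, which you can then compare against whatever minimum your headset and renderer demand.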

Related

Not using Metal, but get error "Metal GPU Frame Capture Enabled"

I am creating a simple game that uses SpriteKit, UIKit, and Foundation, and no other frameworks.
However, every time I run the app I get the following two errors:
Metal GPU Frame Capture Enabled
Metal API Validation Enabled
How do I disable the Metal frameworks and make these errors go away?
Don't worry: they are not errors, just informational messages.
If your target links against the Metal framework (or any framework that uses the Metal API, such as SpriteKit), Xcode automatically enables GPU Frame Capture and Metal API Validation. Keep these options enabled in your app's scheme to use the Metal Frame Capture tools Xcode provides and to check for correct API usage with Metal API Validation. Disable them only when you want to test your app's maximum performance.
Enabling/Disabling Frame Capture

Is ARCore supported on Android One?

Does Android One support ARCore? I checked the official ARCore supported-devices list and Android One isn't mentioned there:
https://developers.google.com/ar/discover/supported-devices#android_play
At the moment, the list you reference covers all the supported devices; this is because Google reportedly has to check the performance and do some calibration work per device.
I think, given the performance requirements, support will initially be limited mostly to mid- to high-end devices, which will likely exclude Android One-type phones.
In discussions on the ARCore repository's issues list, the team has explained that adding a device requires (https://github.com/google-ar/arcore-android-sdk/issues/217):
A CPU fast enough to run the map-building algorithms.
Testing to ensure the camera and IMU provide consistent timestamps across exposure, CPU load, etc. This can require OS updates in some cases.
Calibration of the camera and IMU, per device model. For now this can only happen in custom-designed robotic calibration stations using special software.

Is it possible to develop a virtual world that is accessible via both a PC and a virtual reality headset?

For example, if you have a virtual reality headset, you can interact with this virtual world in VR (i.e. via WebVR); but if you don't have a VR headset and/or WebVR compatibility, can you still access and explore this virtual world in a browser (like Runescape) and interact with characters, whether they are in VR or on the web, in the same virtual world?
The A-Frame framework handles that for you automatically; if you're using another framework, you can roll it yourself. Either way, the different control schemes require a fair amount of thought.
You could also take a look at React360 (https://facebook.github.io/react-360/), it is a WebVR based framework from Facebook and can handle most 3D media out of the box. It performs quite well and has the advantage of progressive enhancement, i.e. if you view it from a desktop/tablet you get a 2D experience, if you are on a 3D capable device you get a full VR experience.
It's also cross platform so will run on android/iOS/Windows/MacOS/Oculus/Vive. Samples are included with it which should be enough to give you an idea of its capabilities.
Depending on the complexity of the game you are trying to develop and the degree of graphical control required, A-Frame is another option to look at.

Control Camera desktop application using Gyroscope of Android smartphone

For a project at my university I need to create a Unity3D application on my laptop, in which the camera is stationary and can be rotated in any direction using the gyroscope of my Android smartphone (Nexus 5), wirelessly or through a USB cable.
I've looked at the possibility of OSC or the Unity Remote 5 App, but up till now haven't found a way that works in order to obtain this result.
Any help or advice would be hugely appreciated - I don't have much experience yet with all this.
Thanks!
If I were going to do this, I would use UNET (Unity's built-in multiplayer networking API) and sync the rotation over LAN.
On the camera I would have a Network Transform and a script to control its rotation based on accelerometer input.
The version on the phone would be the authority and sync its rotation over the network to the client on the laptop.
Pros: Wireless, fast (over wifi), very little code required to make it work, lots of documentation available online.
Cons: Relies totally on your network situation, and I think you will have to do a lot of trial and error to get a smooth (not jerky) experience.
As for getting the tilt input on the device, Unity have a great tutorial here: https://unity3d.com/learn/tutorials/topics/mobile-touch/accelerometer-input
It's pretty straight forward.
Sounds like a fun project, good luck!
It's possible to do this via cable, WiFi, Bluetooth 2.0 or Bluetooth 4.0 (BLE). I have implemented what you need in WiFi and Bluetooth 2.0 for my current work.
It's pretty easy to get rotation values and stream them from Android, but I don't think you will need to write anything yourself, because you can just use this app:
https://play.google.com/store/apps/details?id=de.lorenz_fenster.sensorstreamgps&hl=en
So the question is how you receive the data it sends on Unity's side. The answer is the UdpClient class.
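In Unity itself you would use C#'s UdpClient for this; as a language-agnostic sketch of the receiving side, here is the same idea in Swift. Both the port (5555) and the comma-separated payload format are assumptions here; match whatever the streaming app on the phone is actually configured to send.

```swift
import Foundation
#if canImport(Network)
import Network
#endif

// Parse one datagram from the sensor-streaming app. The app sends
// comma-separated sensor values; the exact field layout is an
// assumption, so check what your app actually emits.
func parseSensorLine(_ line: String) -> [Double] {
    line.split(separator: ",")
        .compactMap { Double($0.trimmingCharacters(in: .whitespaces)) }
}

#if canImport(Network)
// Receiving side, sketched with Apple's Network framework.
// Port 5555 is an assumed default; adjust to your setup.
let listener = try! NWListener(using: .udp, on: 5555)
listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    connection.receiveMessage { data, _, _, _ in
        if let data = data, let text = String(data: data, encoding: .utf8) {
            print("sensor values:", parseSensorLine(text))
        }
    }
}
listener.start(queue: .main)
RunLoop.main.run(until: Date().addingTimeInterval(10))  // listen briefly
#endif
```

The UdpClient version on the Unity side follows the same shape: bind to the port, block (or poll) for a datagram, split the string, and feed the values into the camera's rotation each frame.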
If you need more reliability because every second student in your uni library is torrenting Mr. Robot and you're getting huge lag, then I can tell you how to implement the same thing over Bluetooth. That is not entirely trivial, since .NET 2.0 (which Unity uses) doesn't ship Bluetooth libraries, but there are solutions...

Google Cardboard controlling PC

When I saw the Google Cardboard for Unity, I assumed this meant that you would be able to make a Unity PC game and use your phone as a screen/controller. All I can see is that it wants me to make an Android app, which is all well and good, but that doesn't allow for input from the keyboard.
Is there a way to stream the Unity PC project to the device and retrieve input (i.e. Headtracking, NFC magnet)?
The problem with such a solution is latency. In VR, latency is a big deal: the overall latency from input to photons reaching your eyes should be 20 ms or lower, and regular games have 30-60 ms of latency by themselves. Add to that the gyro latency and the phone display latency, and if you then put another 25 ms or more of network ping on top of your VR experience, it is going to be painful and may even make you sick. If you want to read more on why latency is such a big deal in VR, Michael Abrash wrote an excellent blog post about it: post on latency
If you necessarily want to use a keyboard for navigation, consider a Bluetooth keyboard that can pair with Android devices. Also keep in mind that with current technology, especially without a dedicated headset, really dynamic VR experiences probably won't work very well and can make some people uncomfortable or sick. For a good read on designing virtual reality experiences, refer to this guide from Oculus: http://static.oculus.com/sdk-downloads/documents/Oculus_Best_Practices_Guide.pdf
There's nothing in the Cardboard SDK for talking with a PC-hosted Unity game. You could adapt the code from the Unity Remote 4 project:
https://www.assetstore.unity3d.com/en/#!/content/18106
We are developing an app that does what you want, except it uses GearVR instead of Cardboard. Please check the link below.
http://challengepost.com/software/airvr
Streaming from your PC to your phone's Cardboard is possible using third-party apps, such as Trinus VR (the client app on your phone) and Vireio (the streaming app on your computer). The two apps will then communicate via your home network (Wi-Fi or other) to stream the images.