MTLCreateSystemDefaultDevice() returning nil on iPad with iOS 10.3 - swift

I am trying to do some fluid simulation with Google's Liquidfun and Metal, using an iPad running iOS 10.3. However, the initial call to MTLCreateSystemDefaultDevice() is returning nil.
I have the following console log, so I know that Metal is supported on the iPad, but I am not sure how to debug this issue.
2018-07-02 20:28:44.547645-0500 chem-lab-practical[529:464344] [DYMTLInitPlatform] platform initialization successful
2018-07-02 20:28:44.781763-0500 chem-lab-practical[529:464294] Metal GPU Frame Capture Enabled
2018-07-02 20:28:44.783609-0500 chem-lab-practical[529:464294] Metal API Validation Enabled
(lldb)
I have not found similar questions on Stack Overflow, and I am a beginner with Metal, so I am not sure how to begin debugging this issue.

So... I guess I wasn't looking hard enough because I've found an answer here (iOS code to identify metal support in runtime?).
According to the post,
Note that just testing for the presence of a Metal framework class doesn't help — those classes are there on any device running iOS 8 (all the way back to iPhone 4s & iPad 2), regardless of whether that device has a Metal-capable GPU.
For reference, here is a list of Metal-compatible iOS devices (https://developer.apple.com/library/archive/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/HardwareGPUInformation/HardwareGPUInformation.html)
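In practice, then, the only reliable runtime check is whether a Metal device can actually be created. A minimal sketch (the fallback branch is just illustrative):

```swift
import Metal

// The only dependable runtime test: try to create the device itself.
if let device = MTLCreateSystemDefaultDevice() {
    print("Metal is supported on \(device.name)")
} else {
    // Reached on pre-A7 hardware (e.g. iPad 2, iPhone 4s), even though the
    // Metal framework classes are still present there.
    print("Metal is not supported; fall back to a non-Metal renderer")
}
```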

Related

Not using Metal, but get error "Metal GPU Frame Capture Enabled"

I am creating a simple SpriteKit game that uses SpriteKit, UIKit, Foundation, and nothing else in terms of frameworks.
However, every time I run the app I get the following two errors:
Metal GPU Frame Capture Enabled
Metal API Validation Enabled
How do I disable the Metal frameworks and make these errors go away?
Do not worry; they are not errors, just messages.
If your target links to the Metal framework (or any other framework that uses the Metal API, such as SpriteKit), Xcode automatically enables GPU Frame Capture and Metal API Validation. Enable these options in your app’s scheme to use the host of tools Xcode provides with Metal Frame Capture, and to test for correct API usage with Metal API Validation. You only disable these options when you want to test your app’s maximum performance level.
Enabling/Disabling Frame Capture

How to Determine Metal Device VR Support

I am looking to create a macOS 10.13 application that tests for virtual reality support. What would be the best way to test a Mac for VR support, considering the CPU, GPU, and connectivity requirements?
Also, given a MTLDevice, is there a way to check for VR support using the Metal API?
I have tried to check the default system Metal device for macOS GPUFamily1_v3 support, but that does not completely answer the question of whether a device supports VR on macOS. The code below is what I use to test support for the Metal feature set.
import Metal

// Check whether the default device supports the macOS_GPUFamily1_v3 feature set.
let defaultDevice = MTLCreateSystemDefaultDevice()
print(defaultDevice?.supportsFeatureSet(.macOS_GPUFamily1_v3) ?? false)
There is no such thing as "Metal VR support". There is no special capability or GPU-level feature required to render for VR. Furthermore, there is no such thing as a "spec good enough for VR", because that depends entirely on the resolution and frame rate of the particular headset, as well as on your application.
You can query the IOService layer to get the GPU model and specs, but you will have to extrapolate the capabilities yourself based on your own requirements.
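If it helps, some of those specs can be read straight off the MTLDevice rather than going through IOKit. A sketch of that kind of self-defined heuristic (the 4 GB threshold is an arbitrary placeholder, not an Apple requirement):

```swift
import Metal

// Metal exposes no "supports VR" flag, so inspect each device and apply your own thresholds.
for device in MTLCopyAllDevices() {
    let isDiscrete = !device.isLowPower            // integrated GPUs report isLowPower == true
    let isExternal = device.isRemovable            // eGPU enclosures (macOS 10.13+)
    let hasEnoughMemory = device.recommendedMaxWorkingSetSize >= 4 * 1024 * 1024 * 1024
    print("\(device.name): discrete=\(isDiscrete), external=\(isExternal), >=4GB=\(hasEnoughMemory)")
}
```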

ARKit plane detection is problematic on iPhone 6s

I have been working through ARKit tutorials, using my iPhone 6s as a build device; the problem is that, in my experience, plane detection does not seem to be as effective as in the tutorials I watch (which use an iPhone 7).
I have not been able to find anything online about whether plane detection is identical on the iPhone 6s versus newer models. Does anyone have similar experience?
Or is there an online resource clearly stating that plane detection will be identical regardless of the supported device used (iPhone 6s and above, iPad 2017, iPad Pro)?
To be clear, the question is: is the iPhone 6s worse at plane detection than an iPhone 7 and above?
Thanks.
My biz partner and I do a lot of work in the ARKit space. I have a 6S and he has a 7. We're constantly discussing and evaluating performance/accuracy issues with ARKit and have never encountered a scenario where ARKit plane detection (or any other facet of ARKit for that matter) differed between the two devices.
He was lucky enough to get his hands on an iPhone X for a short time, did some AR testing, and reported that accuracy/detection did noticeably improve. That's likely due to the massive improvements in the A11 Bionic, so I'd expect to see those gains on the 8 as well.
That's a totally empirical take, so ymmv...
I also sometimes experience tracking issues with the iPhone 6s, especially when I get really close to an object with the phone, or when I move the camera suddenly.
I think ARKit works better on newer devices.
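For what it's worth, the plane-detection setup itself is identical on every supported device, so any difference comes from the camera, sensors, and chip rather than the configuration. A minimal sketch (assuming an ARSCNView named sceneView):

```swift
import ARKit

// Minimal plane-detection setup; the same API on every ARKit-capable device.
// "sceneView" is an assumed ARSCNView, e.g. an outlet in a view controller.
func startPlaneDetection(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal   // vertical detection requires iOS 11.3+
    sceneView.session.run(configuration)
}
```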

iOS 9.2.1: iBeacon region monitoring bug

I deployed the same iBeacon region monitoring code in 7 devices. One of them, iPhone 6 running iOS 9.2.1, is regularly failing to detect beacon states (inside, outside, etc..).
I tried with and without a SIM card, and when the SIM card is inserted it gives better results. However, compared to an iPhone 6 running iOS 9.2.3 and one running iOS 9.2.1, it gives worse results (the latter two iPhones always detect the region).
Have you experienced the same in this version of iOS (iOS 9.2.1)?
Is this a known official Apple bug?
My theory behind this:
iBeacon region monitoring uses the iPhone's location, and that location can be improved using Wi-Fi and phone signal, and possibly also accelerometer and gyroscope data.
iOS could be using these in combination with the Bluetooth data to "tune" the beacon ranging thread (in other words, if a significant location change happens, the background monitoring frequency is increased). Hence, if this tuning depends on SIM card information, there could be a version of iOS that puts a stronger dependency on the SIM card's presence, leading to what I am observing.
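For context, a typical region-monitoring setup of the kind described looks like this (a sketch; the UUID and identifier are placeholders):

```swift
import CoreLocation

final class BeaconMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID and identifier for illustration only.
    private let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.beacon-region")

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()   // required for background monitoring
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didDetermineState state: CLRegionState,
                         for region: CLRegion) {
        // The inside/outside/unknown callbacks that differ across the test devices.
        print("State for \(region.identifier): \(state.rawValue)")
    }
}
```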

Accessing Bluetooth via Bonjour/GameKit from the iPhone SDK simulator (circa 3.1.3)

I'm in the process of getting hardware for an iPhone prototype, and I'm wondering if it's possible to access Bluetooth (ExternalAccessory, Bonjour, or GameKit) exposed by a development box (e.g. a MacBook) using the SDK's simulator as of the latest version (3.1.3 at the time of writing).
Before I get any answers on the subject, I accept that I will need an iPhone for the actual development - this is simply a prototype.
I'm not sure I understand your question entirely.
I can state that in iOS (as of May 2011) the only access you have to Bonjour is through the GameKit functionality. That functionality is fairly robust, but (for instance) you will not be able to create a BT-based Bonjour service that other non-iOS BT-enabled devices can find.
On the simulator, even if the underlying machine HAS Bluetooth capabilities, you will NOT be able to perform "real" Bluetooth operations. Instead, the simulator will simulate some capabilities and ignore others. You really do need multiple devices to load your code onto and run in order to test out BT code.
Bluetooth connectivity is inaccessible using the Simulator. If Wi-Fi is good enough for you, that works with Bonjour and GameKit on the Simulator as well. Otherwise, just like for the accelerometer and several other components of the device, you will need an actual device.
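For example, plain Bonjour publication over Wi-Fi does work in the Simulator. A small sketch (the service type, name, and port are made up):

```swift
import Foundation

// Publish a Bonjour service over Wi-Fi, which the Simulator supports;
// the domain/type/name/port values here are placeholders.
let service = NetService(domain: "local.", type: "_prototype._tcp.", name: "SimTest", port: 12345)
service.publish()
// In an app, the main run loop keeps the service advertised; no extra work is needed.
```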