Trying to open GameScene in Xcode 9.2 on a VMware High Sierra Mac, but Xcode unexpectedly shuts down - sprite-kit

I am trying to open a GameScene.sks file on a VMware High Sierra Mac, but Xcode unexpectedly closes. I have a solution for Xcode 8. How do I do this for Xcode 9.2?

As AlessandroOrnano pointed out, the problem could be due to the video drivers. Since SpriteKit uses the GPU, and VMware cannot provide GPU acceleration for macOS guests, there will be glitches and stuttering wherever the system relies on GPU-accelerated graphics, such as visual-effect views or rendering 2D and 3D objects in a scene. VMware uses software rendering for all graphics-related activity in macOS, and the fact that SpriteKit needs hardware acceleration can make it not work at all, or crash.

Related

iPhone 12 Pro is not showing 3D model instantly

I have developed an AR mobile application with object detection using TensorFlow. The app runs perfectly on the iPhone 12 Mini and other iPhones. But when I test it on the iPhone 12 Pro and iPad Pro, the app does not show the 3D model when the phone camera is far from the detected object. Whenever the app detects the trained object, it is supposed to show the 3D model and place it near that object, but on the iPhone 12 Pro models it only shows the 3D object when the camera is near the detected object.
I think maybe the LiDAR is creating the problem? If so, how do I stop the LiDAR using C# code, as I have developed the project in Unity using ARFoundation and TensorFlow? I am using ARFoundation 1.0.
ARFoundation 1.0 was released in 2018, so it doesn't support Meshing (generation of triangle meshes that correspond to the physical space). So there may be time-lag problems, because a device equipped with a LiDAR scanner must understand that there's no support for Scene Reconstruction in the current config, and it must be toggled to the common Plane Detection approach instead.
The solution is simple – use the latest version of ARFoundation, 4.1.5.
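In ARFoundation 4.x, scene reconstruction only runs when an ARMeshManager is present, so a minimal way to keep a LiDAR device on the ordinary plane-detection path is to disable (or simply never add) that component. A sketch, assuming an ARFoundation 4.x scene with the usual session-origin setup (the DisableMeshing class name is mine):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach to the AR Session Origin. Turns off meshing (scene reconstruction)
// on LiDAR devices so plane detection behaves the same on all iPhones.
public class DisableMeshing : MonoBehaviour
{
    void Start()
    {
        // ARMeshManager drives scene reconstruction on LiDAR hardware;
        // disabling it stops mesh generation entirely.
        var meshManager = GetComponentInChildren<ARMeshManager>();
        if (meshManager != null)
            meshManager.enabled = false;

        // Plane detection works on every ARKit device, LiDAR or not.
        var planeManager = GetComponent<ARPlaneManager>();
        if (planeManager != null)
            planeManager.enabled = true;
    }
}
```

Leaving the ARMeshManager out of the scene entirely has the same effect; the script form is only useful if you want to toggle the behaviour per device at runtime.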

Why is Unity not recognizing my digital camera?

I have been experimenting with Unity and Vuforia. I'm all set and got it to work with an image that spawns 3D objects. My project has been done with the FaceTime camera on my MacBook Pro 2018. Now I want to try to move over to my digital camera, a Sony AX6300. But when I connect the camera, Unity won't recognize it. I can still only choose the built-in FaceTime camera on my Mac. Can anyone here help out?
You have to edit the XML file named webcamprofiles.xml inside the Vuforia resources, and add or edit an entry for your device's camera name.
It works for me.
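For illustration, a profile entry for the camera might look like the sketch below. The exact schema and file location vary by Vuforia version, and the device name and resolution values here are assumptions; use the name the OS reports for your camera.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<profiles>
  <!-- deviceName must match the camera name the OS reports -->
  <webcam deviceName="Sony AX6300">
    <osx>
      <requestedTextureSize width="1280" height="720"/>
      <resampledTextureSize width="1280" height="720"/>
    </osx>
  </webcam>
</profiles>
```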

Vuforia and Unity : Unable to place mid-air objects or use ground plane

I'm trying to place an object in mid-air and detect ground planes. When I follow the steps in the documentation, it doesn't work and I have the following error when debugging with adb:
PositionalDeviceTracker has not been Initialized correctly, unable to create anchors
I tried on an iPad running iOS 11.2, a Pixel XL running Android 8.1.0, and a OnePlus 3T on Android 8.0.0 (which does not support ground planes but should work with mid-air anchors).
I tried each on Unity 2017.3.0p4, 2017.3.0f3, 2017.3.1f1 and even 2018.1.0b7
None of the above combinations had either of these two features working.
I also use the image target feature, and that one works perfectly.
I managed to detect ground planes a month ago, and I haven't changed my Unity version since then. However, I have updated both my Android and iOS devices at least once since then.
Could you please let me know if I'm doing anything wrong, or if there's a known issue about this?
Thanks

Unity 3d crashing on mac when I press "play" in the editor

I downloaded Unity 3D on my MacBook Air and created a new project with a single 3D object (a plane).
When I press the "play" button (triangle icon), Unity hangs and crashes.
It happens even if I create a new project again and add a different 3D object (a sphere).
The only way my project doesn't crash when I press play is if it's a completely blank project with no 3D objects.
Interestingly I can build a mac build of a project and it runs okay (I can see the 3d objects, the plane or the sphere).
Can anyone offer any advice?
Edit: OS X version 10.9
My version of OS X was a little outdated.
Upgrading from 10.9 to 10.12.x fixed the problem (after a 4 GB download and a 60-minute install).
OK, I think I solved it. My project settings had Metal editor support disabled. Enabling this option solved the crashing problem. I reproduced the issue with a freshly created project by just disabling Metal editor support. I tried on a MacBook Pro 2018 (with a dedicated graphics card) and could not reproduce the issue, so this could be an Intel graphics driver related problem.

GLES 2.0 performance on iPhone Simulator, iPhone, MacBook Pro

I made a wave animation to explore features of the SGX chip, which has a tile-based rendering (TBR) architecture, by comparing performance on an iPhone and a laptop.
An advantage of the TBR architecture is that it allows the GPU to perform hidden surface removal before fragments are processed, so I draw many overlapped layers of animated waves, and only the wave in the top layer is visible.
I ran this program on both an iPhone 3GS (using GLES 2.0) and my laptop, a MacBook Pro (using OpenGL 2.0). I recorded the FPS numbers for different numbers of layers, and I assumed the FPS trends on the iPhone and the laptop would differ. I guessed the iPhone's performance would decrease more slowly than the laptop's as the number of layers increased, but they show very similar trends.
I have two questions:
1. Why doesn't it show the advantage of the TBR architecture, when there are a lot of overlapped triangles?
2. Why is the performance of the iPhone Simulator much, much slower than just running on the laptop (without the simulator)? As the documentation says, the simulator does not enforce the memory limitations of the MBX and SGX and takes advantage of the laptop's CPU, so I would guess its performance should keep up with the laptop.
Can anyone help?
Thanks a lot
The OpenGL ES implementation in the iPhone Simulator is a software rasterizer and does not use the GPU in your MacBook.
What kind of framerate trends are you seeing, and in what way is only the wave on the top layer visible? Your primitives typically need to have framebuffer blending disabled and not issue discards in the fragment shader in order for hidden surface removal to skip fragment processing for what’s underneath.
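For illustration, the draw path that gives the TBR hardware a chance to reject hidden fragments looks roughly like this. This is a sketch only: it assumes a working GLES 2.0 context and an already-compiled program whose fragment shader does not use discard, and the buffer/uniform setup is omitted.

```c
#include <OpenGLES/ES2/gl.h>  /* iOS GLES 2.0 header */

/* Draw the overlapping wave layers as opaque geometry so the TBR GPU
 * can reject hidden fragments before the fragment shader runs. */
static void draw_opaque_layers(GLuint program, int layer_count)
{
    glDisable(GL_BLEND);      /* blending forces every layer to be shaded */
    glEnable(GL_DEPTH_TEST);  /* depth information is what HSR rejects against */
    glDepthFunc(GL_LESS);

    glUseProgram(program);    /* fragment shader must not call discard */
    for (int i = 0; i < layer_count; ++i) {
        /* ... bind per-layer attributes and uniforms here, then: */
        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    }
}
```

With blending enabled (or discard in the shader), the GPU cannot know a fragment is fully hidden until after shading, so every layer pays full fragment cost and the TBR advantage disappears, which would explain the similar trends on both machines.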