Multiple cameras cause poor performance on mobile - unity3d

When I add a new Base Camera, my FPS on mobile drops by roughly 10 frames even though the camera doesn't display anything. Is a new camera in the scene really that expensive on mobile? I am using URP. How can I improve my FPS if I need six Base Cameras at a time, each rendering to a different render texture?
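
Each enabled camera runs its own culling and render pass, so even a camera that displays nothing costs real frame time on mobile. If the six render textures don't all need to update every frame, one common mitigation is a round-robin scheduler that enables only one render-texture camera per frame. A minimal sketch in C# (the class name and the per-frame rotation policy are my assumptions, not something from the question):

using UnityEngine;

// Round-robin: refresh one render-texture camera per frame; the other
// five keep the last contents of their render textures.
public class CameraRoundRobin : MonoBehaviour
{
    [SerializeField] private Camera[] renderTextureCameras; // assign the six Base Cameras

    private int current;

    void Update()
    {
        // Enable exactly one camera this frame; Unity only culls and
        // renders for cameras that are enabled.
        for (int i = 0; i < renderTextureCameras.Length; i++)
            renderTextureCameras[i].enabled = (i == current);

        current = (current + 1) % renderTextureCameras.Length;
    }
}

Each texture then updates at one sixth of the frame rate, which is often acceptable for minimaps, mirrors, and similar uses.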

Related

Is a big amount of objects a bad thing for FPS and performance? (unity 2d)

I'm working on a 2D game for Android in Unity, and I'm curious: does a large number of objects in the scene affect the performance of the game even if they are not in the camera frame (for example, objects belonging to different levels)?
Yes, the number of objects will affect your frame rate.
Assuming you don't have any scripts running on them, their impact on your system will be smaller while they are outside the camera's frame, since off-screen renderers are culled.
You can measure your current performance with the Profiler and apply the usual optimization tips to reduce the load.
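
Off-screen renderers are culled, but objects in the scene still pay for Transform updates, animators, physics and so on. If whole levels sit in the scene at once, deactivating the root of every level that isn't in play removes that residual cost. A minimal sketch, assuming each level is grouped under a single root object (the LevelActivator name is illustrative):

using UnityEngine;

// Keeps only the current level's hierarchy active; inactive objects
// skip rendering, Update() calls, physics and animation entirely.
public class LevelActivator : MonoBehaviour
{
    [SerializeField] private GameObject[] levelRoots; // one root per level

    public void ShowLevel(int index)
    {
        for (int i = 0; i < levelRoots.Length; i++)
            levelRoots[i].SetActive(i == index);
    }
}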

Reducing eye strain when displaying video in VR?

I'm working on a VR project where, at some points, we will display video in front of the user. I'm looking for recommendations on how to keep the video in focus without causing eye strain, and I want to know how others have tackled showing video on VR headsets in their projects/games.
What I have tried so far is increasing/decreasing the size and distance of the video from the player's eyes/head. Since you can't use a Screen Space - Overlay or Screen Space - Camera canvas in VR, you have to resort to general world-space placement. Making the video too small and too close causes a double-image (splitting) effect and blurs the video, while increasing the size has produced no significantly noticeable change.
Headset: HTC Vive (Regular and Pro versions)
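
One thing that may help more than hand-tuning absolute scale is fixing the video's angular size: place the quad at a comfortable distance (around 2-3 m is a common rule of thumb) and derive its width from a target viewing angle, so size and distance always stay in proportion. A sketch in Unity C#, assuming a 1x1 world-space quad showing 16:9 video; the component name and default values are illustrative:

using UnityEngine;

// Keeps a world-space video quad at a fixed angular size in front of the HMD.
public class FixedAngularSize : MonoBehaviour
{
    [SerializeField] private Transform hmd;               // the tracked camera transform
    [SerializeField] private float distance = 2.5f;       // viewing distance in metres
    [SerializeField] private float angularWidthDeg = 40f; // desired horizontal viewing angle

    void LateUpdate()
    {
        // Position the quad straight ahead of the headset.
        transform.position = hmd.position + hmd.forward * distance;
        // Face the viewer (flip 180 degrees if your quad mesh faces the other way).
        transform.rotation = Quaternion.LookRotation(transform.position - hmd.position);

        // width = 2 * d * tan(theta / 2) keeps the apparent size constant.
        float width = 2f * distance * Mathf.Tan(angularWidthDeg * 0.5f * Mathf.Deg2Rad);
        transform.localScale = new Vector3(width, width * 9f / 16f, 1f);
    }
}

In practice you may want the quad to follow the head lazily, or stay fixed in the world, rather than hard-locking it to the head every frame, since strictly head-locked content can itself be uncomfortable.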

Low Framerate in SceneKit

I am writing a game in SceneKit, divided into levels. Each level uses different basic shapes (the defaults that ship with Xcode/SceneKit, found in the object library) as "obstacles" that the player must avoid. Level one uses blocks as obstacles and runs at 60 FPS on an iPhone 6 and below, yet level two, which uses pyramids as obstacles, drops to 10 FPS. There are fewer than 500 nodes in each level, and both run at 60 FPS on an iPhone 8. What is happening and how can it be fixed?
Here is a YouTube link that shows how they are used in the game. On an iPhone 8 Plus there are no FPS issues; however, on an iPhone 5s or lower the frame rate drops very low for the first level, but not the second.
This link shows the same footage with statistics running; WaitDrawable takes up a large portion of the frame time.
Here are images of the wireframes, with and without materials.
The issue was that a physics contact was being reported every frame and then running some logic: the player was contacting the floor node (used to center the blocks) on every frame. I changed the contact bit mask and all the issues went away.

Are there optimizations in the render pipeline for virtual reality?

I'm trying to get my head around the render pipeline on a head-mounted display.
Given that we have a target refresh rate of 90 Hz per screen:
Is there any efficiency built into the pipeline that benefits from the smaller delta between one frame and the next in VR?
I'm wondering whether the fact that fewer pixels change between frame A and frame B at 90 fps, compared to frame A and frame B at 45 fps (given the same movement on screen), can be exploited.
That is, is the per-frame workload reduced in any way by these additional, more similar frames?
http://imgur.com/6rRiWGM
AFAIK, all frames on VR HMDs are rendered from scratch, as in other 3D applications. If there were a good method to magically interpolate rendering, why would it only be used in VR?
There is, however, a trick called timewarp. With a proper asynchronous timewarp implementation, if you don't provide a new frame in time, the last frame is rotated by the delta of your headset rotation.
So when you look around, head movement still looks as if your app were running at a high FPS.
If you are not moving and there is nothing "stuck" to the camera, like a GUI, this is a very good illusion.
Currently timewarp works well on Gear VR and Vive, and possibly on the production-ready Oculus Rift (but not on the DK2 with the 0.8 drivers; I still haven't got my new headset).
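
For intuition only, the rotational part of timewarp amounts to computing the rotation delta between the head pose when the stale frame was rendered and the head pose now, then re-projecting that frame by the delta. A toy sketch of just that delta calculation (real timewarp runs inside the compositor, not in app code):

using UnityEngine;

// Toy illustration: the rotation applied by rotational timewarp.
public static class TimewarpSketch
{
    // poseAtRender: head orientation when the last frame was rendered.
    // poseNow: head orientation at display time.
    public static Quaternion WarpDelta(Quaternion poseAtRender, Quaternion poseNow)
    {
        // The extra rotation accumulated since the frame was rendered;
        // the compositor rotates the stale image by this amount.
        return poseNow * Quaternion.Inverse(poseAtRender);
    }
}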

FPS drops on iPhone 4 and 3GS

I have a problem with FPS when using particles. In the game I have coins that use particle effects.
I have tested my application on the iPhone 3GS, 4, 4S, 5 and on the iPad 3. FPS drops to 30-35 on the 3GS and iPhone 4, but when I stop using particles it goes back up to 50-60.
I also tried CCParticleBatchNode, but it didn't help :(
The code I used with batchNode:
// Create a batch node backed by the shared particle texture.
CCParticleBatchNode *batchNodeParticle = [CCParticleBatchNode batchNodeWithFile:@"image.png"];
// Load the particle system from its .plist definition.
CCParticleSystemQuad *particles = [CCParticleSystemQuad particleWithFile:@"particles.plist"];
// Systems added to the batch node are drawn in a single draw call.
[batchNodeParticle addChild:particles];
[self addChild:batchNodeParticle];
Any suggestions?
Thanks, and sorry for my bad English.
Particle effects are easy-to-use performance killers. Here are a few suggestions:
Reduce the number of particles. Usually one is tempted to start out with far too many particles. Anything above 100 should make you feel uneasy, anything above 250 should cause a mild panic attack.
Multiple particle effects running at the same time multiply the number of particles. 10 particle effects with 100 particles are just as bad performance-wise as a single particle effect with 1000 particles.
Don't use overly large textures. Most particles look fine with a 64x64 texture or even smaller.
There's no real need to provide an -hd version of the particle effect. Particles are scaled up on Retina devices and automatically look the same. The only benefit of -hd particles is higher-resolution textures, which in 99% of cases won't make any visible difference on Retina devices, because most particle effects are somewhat blurry by nature to begin with.
Particle batching only improves performance if you add multiple particle effects (using the same texture) to the same particle batch node.