How can I capture frames in the background while a Unity game is running?
I already know this approach:

cam.Render();
RenderTexture.active = cam.targetTexture;
Texture2D image = new Texture2D(cam.targetTexture.width, cam.targetTexture.height);
image.ReadPixels(new Rect(0, 0, cam.targetTexture.width, cam.targetTexture.height), 0, 0);
image.Apply();

and then converting the texture into an image file using EncodeToPNG/EncodeToJPG.
But I have found that this extra cam.Render() call, plus the PNG encoding, slows the operation down drastically; capturing frames this way takes a huge amount of time.
How can I get the textures or frames directly while the game is running, perhaps from the OpenGL calls the GPU is executing?
Does anyone have any idea how to achieve this?
Have you tried using Unity's VideoCapture API to record gameplay? There are tutorials you can find, but the Unity documentation is a good place to start.
https://docs.unity3d.com/Manual/windowsholographic-videocapture.html
It's flexibly implemented, so you can meet whatever requirements you have for frame rate, resolution, etc., based on your application and how it's set up.
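A rough sketch of the recording flow with that API (treat this as an outline rather than drop-in code: the namespace and exact signatures vary across Unity versions, and the resolution/frame-rate values here are placeholders you'd normally pick from the supported lists):

```csharp
using UnityEngine;
using UnityEngine.Windows.WebCam; // older Unity versions use a different namespace

public class GameplayRecorder : MonoBehaviour
{
    VideoCapture videoCapture;

    void Start()
    {
        // Create the capture resource asynchronously.
        VideoCapture.CreateAsync(false, capture =>
        {
            videoCapture = capture;

            var cameraParameters = new CameraParameters
            {
                frameRate = 30,              // placeholder; query the supported frame rates
                cameraResolutionWidth = 1280, // placeholder resolution
                cameraResolutionHeight = 720,
                pixelFormat = CapturePixelFormat.BGRA32
            };

            // Enter video mode, then start writing frames to a file.
            videoCapture.StartVideoModeAsync(cameraParameters,
                VideoCapture.AudioState.ApplicationAudio,
                result => videoCapture.StartRecordingAsync(
                    Application.persistentDataPath + "/gameplay.mp4",
                    r => Debug.Log("Recording started")));
        });
    }
}
```

The key point is that the encoding happens inside the capture pipeline rather than via per-frame ReadPixels/EncodeToPNG calls on the main thread.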
I am using the VideoPlayer API that Unity provides to play a video on a surface texture. When I change the video clip on each update, the FPS in the Editor drops badly: switching to and loading the new video clip takes a lot of time (500-600 ms).
videoPlayer.clip = videoClips[vindex]; // the line used to change the video clip
I put a timer before and after this line and found it consumes a huge amount of time.
Can anyone tell me how to reduce this time and increase the FPS? Any alternative approach or suggestion would be highly appreciated. (Platform: Unity Editor on Windows)
If the videos are really small, you can consider using multiple VideoPlayers and playing all of the clips at the same time. Set each player's renderMode to RenderTexture, and switch the RenderTexture instead of the videoClip:

surface.GetComponent<MeshRenderer>().material.mainTexture = videoPlayers[vindex].targetTexture;
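A sketch of that idea, assuming a fixed set of small clips that all fit in memory and decode in parallel (the field names, render texture size, and the Show() helper are illustrative, not from the original post):

```csharp
using UnityEngine;
using UnityEngine.Video;

public class ClipSwitcher : MonoBehaviour
{
    public VideoClip[] videoClips;   // assigned in the Inspector
    public MeshRenderer surface;     // the mesh the video is shown on

    VideoPlayer[] videoPlayers;

    void Start()
    {
        videoPlayers = new VideoPlayer[videoClips.Length];
        for (int i = 0; i < videoClips.Length; i++)
        {
            var player = gameObject.AddComponent<VideoPlayer>();
            player.clip = videoClips[i];
            player.renderMode = VideoRenderMode.RenderTexture;
            player.targetTexture = new RenderTexture(1024, 1024, 0); // size is a placeholder
            player.isLooping = true;
            player.Play(); // every clip decodes continuously; only one is displayed
            videoPlayers[i] = player;
        }
    }

    // Switching is now just a texture swap: no clip load, so no 500-600 ms stall.
    public void Show(int vindex)
    {
        surface.material.mainTexture = videoPlayers[vindex].targetTexture;
    }
}
```

The trade-off is memory and decode cost: every player keeps decoding even while hidden, which is why this only makes sense for small videos.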
I have a program that allows the user to record video. I have to provide effect options like Black & White, Crystal, etc., the way the "Viddy" iPhone application applies effects to recorded video. How can I achieve this programmatically?
Please guide me.
Thank you!
Here's one way:
1. Start capturing video frames with AVCaptureSession + AVCaptureVideoDataOutput
2. Convert the frames to OpenGL textures for display
3. Write GLSL shaders for each desired effect and apply them to the textures from step 2
4. Read back the textures with the effect applied and write them to a movie file
5. Optimise until performance is adequate
6. goto 5
Step 5 is probably the most important. You'll need to tune and tweak the algorithm, video quality, texture size, frame rates, shaders, etc.
Enjoy!
I'm currently using Box2D with cocos2d on iPhone. I have quite a complex scene set up, and I want the end user to be able to record it as video from within the app. I have implemented a recorder using AVAssetWriter etc. and have managed to record frames grabbed from the OpenGL pixel data.
However, this video recording seems to (a) slow down the app a bit, but more importantly (b) only record a few frames per second at best.
This led me to the idea of rendering the Box2D scene offline: manually firing ticks and grabbing an image every tick. However, dt could be an issue here.
Just wondering if anyone has already done this, or if anyone has any better ideas?
One option would be to use a screen-recording tool like ScreenFlow or similar...
I think your Box2D idea is a good one... however, you would want to use a fixed time step. If you use dt, the steps in the physics simulation will be too big, and Box2D will become unstable and jittery.
http://gafferongames.com/game-physics/fix-your-timestep/
The frame rate will take a hit, but you'll get every frame. I don't think you'll be able to record every frame and still maintain a steady frame rate - that seems to be asking a lot of the hardware.
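The offline approach above can be sketched as a simple loop (this is illustrative C#; world, Render, and CaptureFrame stand in for your Box2D world, your draw call, and your AVAssetWriter grab): step the simulation with a fixed dt equal to one video frame, capture, repeat. Wall-clock time no longer matters, so every frame gets recorded even when rendering is slow:

```csharp
const float dt = 1f / 30f; // fixed step = one video frame at 30 fps

void RecordSimulation(int totalFrames)
{
    for (int frame = 0; frame < totalFrames; frame++)
    {
        world.Step(dt, 8, 3);   // Box2D step: fixed dt, velocity/position iteration counts
        Render();               // draw the scene for this tick
        CaptureFrame(frame);    // hand the rendered pixels to the movie writer
    }
}
```

Because dt is constant regardless of how long each iteration actually takes, the physics stays deterministic and the resulting video plays back at the intended speed.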
Bad news everyone!
I'm trying to port an old Windows game to iPad and I have a problem. Much of the graphics code was written without hardware acceleration. On Windows it uses the LockRect() call on IDirect3DSurface8 to get the color buffer of the back buffer and writes graphics data into it, then UnlockRect() to send the pixel data back to video memory. OpenGL ES has no such functionality, so I'm trying to emulate it: I keep an array that I draw every game tick using glTexImage2D() followed by glDrawTexfOES(0, 0, 0, 1024, 768).
But creating a texture from the array every tick is too slow. How can I do this much faster? Thanks in advance.
Try rendering a textured quad and using glTexSubImage2D() for the per-frame texture re-uploads: create the texture once with glTexImage2D(), then only update its contents each tick.
I'm working on a relatively simple 2D side-scrolling iPhone game. The controls are tilt-based. I use OpenGL ES 1.1 for the graphics. The game state is updated at a rate of 30 Hz... And the drawing is updated at a rate of 30 fps (via NSTimer). The smoothness of the drawing is ok... But not quite as smooth as a game like iFighter. What can I do to improve the smoothness of the game?
Here are the potential issues I've briefly considered:
I'm varying the opacity of up to 15 "small" (20x20 pixels) textures at a time... Apparently varying the opacity in this manner can degrade drawing performance
I'm rendering at only 30 fps (via NSTimer)... Perhaps 2D games like iFighter are rendered at a higher frame rate?
Perhaps the game state could be updated at a faster rate? Note the acceleration values are updated at 100 Hz... So I could potentially update part of the game state at 100 Hz
All of my textures are PNG24... Perhaps PNG8 would help (due to smaller size etc)
It's really hard to debug graphics problems like this remotely. Try using the OpenGL ES Instruments to find the bottlenecks; they're quite handy. Also, look at the WWDC videos on OpenGL ES; they're really good.
One thing I noticed is that you said "I'm rendering at only 30 fps". Does that mean you're manually setting up a timer? That's not a good approach; instead, you should use CADisplayLink to be notified when the screen wants to update. That could improve your smoothness.
To complete Mo's answer on the 30fps...
The fact that you request updates at 30 fps does not mean you're going to get them. I'm not an iPhone programmer, but I can tell you that if rendering a frame takes 100 ms, you're guaranteed never to update faster than 10 fps. And if you're actually rendering at 10 fps, then smoothness is gone.
So, measure the time it takes to render a frame to get an idea of the actual frame rate you're getting. As for how to optimize rendering specifically for the iPhone, I'll leave that to people more expert on the subject than me.