My iPhone application plays an HTTP Live Stream from a URL, and when I play it on my iPhone, Instruments reports a memory leak.
Here is the leak that is reported:
Leaked Object = GeneralBlock-64 (64 bytes size)
Responsible Library = UIKit
Responsible Frame = GetContextStack
This only happens when I run it on the iPhone; in the simulator there is no leak.
Please help.
Check whether you are running your code on the main thread or a child thread. If you are working with UI elements, you must update them on the main thread only, so check your code once more.
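If any of the stream's UI updates are happening off the main thread, a minimal sketch of pushing that work back onto the main queue looks like the following (the update itself is just a placeholder, not code from the question):
// Sketch: make sure UIKit work runs on the main thread.
dispatch_async(dispatch_get_main_queue(), ^{
    // ... update the player UI here (placeholder) ...
});
// Or, without blocks:
[self performSelectorOnMainThread:@selector(updatePlayerUI)   // updatePlayerUI is a hypothetical method
                       withObject:nil
                    waitUntilDone:NO];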
I am working on an iOS app (iOS 12) that uses both the ARKit and Metal/MetalKit frameworks. I am capturing images, zipping them, and saving them to the phone's Documents directory. The images are captured every 10 centimeters the phone moves. However, after about 10 minutes (after about 300 MB of data has been saved) the app crashes and prints the error:
"Execution of the command buffer was aborted due to an error during execution. Discarded (victim of GPU error/recovery) (IOAF code 5)"
Has anyone seen this, or does anyone understand what the issue might be?
Change the background property of your main scene from a procedural sky to a custom image, and turn off the environment scene.
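For what it's worth, a minimal sketch of that change in SceneKit/ARKit terms; the property names are my reading of the answer (in particular treating "environment" as ARKit's environment texturing), and sceneView is an assumed ARSCNView outlet:
// Assumed sketch, not the answerer's actual code.
self.sceneView.scene.background.contents = [UIImage imageNamed:@"customSky"]; // custom image instead of a procedural sky
ARWorldTrackingConfiguration *config = [ARWorldTrackingConfiguration new];
config.environmentTexturing = AREnvironmentTexturingNone;                     // assumed meaning of "turn off environment"
[self.sceneView.session runWithConfiguration:config];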
I created an application to download files, using ASIHTTPRequest for the downloads. When I start downloading a big file and lock my device, after some time the download stops, Wi-Fi turns off, and I see the EDGE icon instead of the Wi-Fi icon. When I unlock the device, the Wi-Fi icon reappears within 1-2 seconds. My application is not in the background! How can I solve this?
Two things come to mind:
Firstly, enable a persistent Wi-Fi connection for your app: My iPhone app needs a persistent network connection...how to specify UIRequiredDeviceCapabilities?
Secondly, make the app request background time when it goes into the background so the actual download can continue:
Continuing a long running process in the background under iOS4
I'm not sure if 10 minutes after locking the device if the app would count as running in the background or not.
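A rough sketch of asking for background time around the download (where exactly the transfer is kept running is a placeholder):
// Sketch: request extra background execution time so the transfer can finish.
UIApplication *app = [UIApplication sharedApplication];
__block UIBackgroundTaskIdentifier taskId = [app beginBackgroundTaskWithExpirationHandler:^{
    [app endBackgroundTask:taskId];     // time ran out: hand the task back
    taskId = UIBackgroundTaskInvalid;
}];
// ... keep the download running here (placeholder) ...
// When the download completes or fails:
[app endBackgroundTask:taskId];
taskId = UIBackgroundTaskInvalid;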
I'd at least try enabling background downloading in ASIHTTPRequest:
[request setShouldContinueWhenAppEntersBackground:YES];
It might help and you've nothing to lose :)
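For reference, a rough sketch of wiring that flag into a download; the URL, destination path, and delegate are placeholders:
#import "ASIHTTPRequest.h"

// Sketch of an ASIHTTPRequest download allowed to continue in the background.
NSURL *url = [NSURL URLWithString:@"http://example.com/bigfile.zip"];   // placeholder URL
ASIHTTPRequest *request = [ASIHTTPRequest requestWithURL:url];
[request setDownloadDestinationPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"bigfile.zip"]];
[request setShouldContinueWhenAppEntersBackground:YES];
[request setDelegate:self];          // implement requestFinished: / requestFailed:
[request startAsynchronous];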
You can also prevent the iPhone from locking the screen. It will use more battery, but it will solve your problem:
UIApplication *myApp = [UIApplication sharedApplication];
myApp.idleTimerDisabled = YES;
I have a bad performance problem with OpenAL in my iPhone game. My game runs smoothly at 60 fps, but when I initialize OpenAL the game begins to jerk. This is my initialization code:
ALCdevice *device = alcOpenDevice(NULL);
ALCcontext *context;
if (device) {
    context = alcCreateContext(device, NULL);
    alcMakeContextCurrent(context);
}
I don't create any OpenAL sources, nor do I load or play any sounds. The jittering (the game jerks the whole time) is caused purely by initializing OpenAL.
The Xcode Instruments tools say the game runs at a stable 60 fps, yet it is visibly jittering (when I don't run the code above, the game runs smoothly).
This also doesn't happen on an old iPod touch 2G running iOS 3.1.3, yet on all my other devices running iOS 4 the jittering occurs, which seems crazy.
I also tried to put the OpenAL stuff in a separate thread but it doesn't help.
Has anybody noticed a similar behaviour?
Try calling alcGetError() after setting up the context to ensure it succeeded.
Is this failing on the simulator or on an actual phone?
Are you doing anything with sound after the code you posted? In that code, if alcOpenDevice fails you won't have a context, yet you don't return either.
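For illustration, a minimal sketch of the error checking being suggested (this is not the poster's code, just the shape of it, assumed to live inside a sound-setup method):
#import <OpenAL/al.h>
#import <OpenAL/alc.h>

// Sketch: OpenAL setup with the suggested error checks.
ALCdevice *device = alcOpenDevice(NULL);
if (device == NULL) {
    NSLog(@"alcOpenDevice failed");
    return;
}
ALCcontext *context = alcCreateContext(device, NULL);
if (context == NULL || alcGetError(device) != ALC_NO_ERROR) {
    NSLog(@"alcCreateContext failed");
    alcCloseDevice(device);
    return;
}
alcMakeContextCurrent(context);
if (alcGetError(device) != ALC_NO_ERROR) {
    NSLog(@"alcMakeContextCurrent failed");
}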
How can I gather Instruments memory/zombies data after killing the app in the simulator and then restarting it?
What I see in the simulator, after I kill the app process and then restart it by tapping its icon, is that Instruments stops receiving data from the application.
Background
I'm trying to test saving and loading data via NSUserDefaults. After saving, in order to test the loading part I need to simulate the app being removed from memory, so what I have been doing is manually killing the app process in the simulator (double-click the home button, etc.).
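For context, the save/load round-trip under test looks roughly like this (the key and value are placeholders):
// Saving (placeholder key/value):
[[NSUserDefaults standardUserDefaults] setObject:@"some value" forKey:@"MyKey"];
[[NSUserDefaults standardUserDefaults] synchronize];   // force the write so a killed process doesn't lose it

// Loading on the next launch:
NSString *value = [[NSUserDefaults standardUserDefaults] stringForKey:@"MyKey"];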
What I'm finding is that when I start the app up this way there is some problem at startup: I get "Thread 1 - Program received signal SIGKILL" against the "int retVal = UIApplicationMain(argc, argv, nil, nil);" line of code.
Therefore I thought I would try running it under Instruments to track down the issue, hence my question, as Instruments seems to "stop recording" after I kill the app process in the simulator and then restart it.
Not possible (i.e. there are no answers yet, so I'm guessing the correct answer may be that it's not possible).
I'm trying to build a video recorder without jailbreaking my iPhone (I have a Developer license).
I began by using the PhotoLibrary private framework, but I can only reach 2 fps (too slow).
The Cycoder app achieves 15 fps; I think it uses a different approach.
I tried to create a bitmap from the previewView of the CameraController, but it always returns a black bitmap.
I wonder if there's a way to directly access the video buffer, maybe with the IOKit framework.
Thanks
Marco
Here is the code:
// _createCGImageRefRepresentationInFrame: is an undocumented (private) UIWindow method
image = [window _createCGImageRefRepresentationInFrame:rectToCapture];
Marco
That is the big problem. So far I've solved it by using some fixed-size temporary buffers and detaching a thread for each buffer when it is full. The thread saves the buffer's contents to flash memory. Launching several heavy threads (heavy because each one accesses the flash) slows down the device and the refresh of the camera view.
The buffers cannot be big, because you will get memory warnings, and they cannot be small, because you will freeze the device with too many threads and too many simultaneous accesses to flash memory.
The solution lies in balancing the buffer size against the number of threads.
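A rough sketch of the pattern described above; the buffer size, method names, and file naming are placeholders, not the actual code:
#define kBufferSize (512 * 1024)   // assumed size; tune it against memory warnings vs. thread count

// Called whenever a fixed-size temp buffer fills up: detach a thread to flush it to flash.
- (void)bufferDidFill:(NSData *)fullBuffer {
    [NSThread detachNewThreadSelector:@selector(writeBufferToFlash:)
                             toTarget:self
                           withObject:fullBuffer];
}

- (void)writeBufferToFlash:(NSData *)buffer {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];   // each detached thread needs its own pool
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"frames-%.0f.bin",
                       [NSDate timeIntervalSinceReferenceDate] * 1000]];
    [buffer writeToFile:path atomically:NO];   // the flash write is the expensive part
    [pool release];
}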
I haven't yet tried using an sqlite3 database to store the images' binary data, but I don't know whether it would be a better solution.
PS: to speed up repeated method calls, avoid the usual [object method] syntax (because of how Objective-C message dispatch works) and instead fetch and cache the method's implementation address, as below.
From the Apple Objective-C documentation:
"The example below shows how the procedure that implements the setFilled: method might be
called:
void (*setter)(id, SEL, BOOL);
int i;
setter = (void (*)(id, SEL, BOOL))[target methodForSelector:#selector(setFilled:)];
for ( i = 0; i < 1000, i++ )
setter(targetList[i], #selector(setFilled:), YES); "
Marco
If you ever intend to release your app on the App Store, using a private framework will ensure that it gets rejected. Video capture, using the SDK, simply isn't supported.
Capturing the video you see when the camera is active requires fairly sophisticated techniques that are not exposed by any framework or library out of the box.
I used an undocumented UIWindow method to get the currently displayed frame as a CGImageRef.
Now it works successfully!!
If you'd like, and if I'm allowed, I can post the code that does the trick.
Marco