_dyld_start causing leaks in iPhone apps

Using the Allocations instrument on my iPhone device, I notice in my heapshots that all my heap growth is caused by the _dyld_start caller (from the dyld library).
Here is an example:
Snapshot: UIImageView
Heap Growth: 4.83 KB
Still Alive: 103
When I look in the details, all I see is several instances of the following:
Object Add: xxxx
Creation Time: ....
Live: check
Responsible Library: dyld
Responsible Caller: _dyld_start
What does this mean?
How can I change my code to release this memory?

If you load your UIImage with imageNamed:, you can't release this memory yourself, because imageNamed: caches the image until the application closes.
You may try to load your image with imageWithContentsOfFile: or imageWithData: instead.
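For example, a minimal sketch (the file name and the imageView variable are placeholders, not taken from the question):
// Load the image straight from the bundle, bypassing the imageNamed: cache,
// so the memory can actually be released once nothing retains the image.
NSString *path = [[NSBundle mainBundle] pathForResource:@"photo" ofType:@"png"];
UIImage *image = [UIImage imageWithContentsOfFile:path];
[imageView setImage:image];
// Later, when the image is no longer needed:
[imageView setImage:nil];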
Hope that helps

Related

iPhone UIWebView photo upload memory leak

Whenever I upload an image using a UIWebView in iOS, I get some memory leaks:
malloc 16 bytes, JavaScriptCore, only one time
PhotoLibrary 144 bytes, CAMBulleredSnapShotView, multiple times
CALayer 48 bytes, UIKit, multiple times
UIStatusBarHideAnimationParameters 48 bytes, multiple times
malloc 144 & 16 bytes, QuartzCore, multiple times
How do I release these? Any help, thanks in advance.

Memory leakage when Running on iPhone

My iPhone application plays an HTTP Live Stream from a URL, and when I play it on my iPhone it shows a memory leak.
Here is the leak that is reported:
Leaked Object = GeneralBlock-64 (64 bytes size)
Responsible Library = UIKit
Responsible Frame = GetContextStack
This only happens when I run it on the iPhone; in the simulator there is no leak.
Please help.
Check whether you are running your code on the main thread or a child thread; if you are working with UI elements, you must run that code on the main thread only. So check your code once.
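For example, a minimal sketch of hopping back onto the main thread before touching UIKit (statusLabel and the text are placeholders, not from the question):
// UIKit is not thread-safe; do all UI work on the main thread.
dispatch_async(dispatch_get_main_queue(), ^{
    self.statusLabel.text = @"Playback started";   // placeholder UI update
});
On SDKs without GCD, the same hop can be done with performSelectorOnMainThread:withObject:waitUntilDone:.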

confusing memory allocation error on iPhone

Hello, I'm working on an iPhone application which provides information with images and text. For every text there is one image, which can be tapped and zoomed, shown with a UIImageView:
NSString *imgName = [imgPath substringToIndex:[imgPath rangeOfString:@".jpg"].location];
UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:imgName ofType:@"jpg"]];
[imgView setImage:img];
As I go through the images, opening them one by one, the app crashes (debugging on the device) with an error in the console:
: Decompression error
my_app_name(1226,0x3e088868) malloc: * mmap(size=32768) failed (error code=12)
* error: can't allocate region
and then:
CoreAnimation: failed to allocate 2228352 bytes.
I don't have a leak in the code, and if I do not open the images I don't get the error. Does anyone have a clue where this problem could come from?
Oh, I think I finally fixed it. And yes, my images are relatively large, about 700x600 pixels.
The problem seemed to be with [imgView setImage:img]; the image was released but somehow still held in memory, I don't know. One line of code, [imgView setImage:nil];, before releasing imgView in dealloc fixes the problem.
Thanks for the help.
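A minimal sketch of that fix under manual reference counting, using the imgView name from the snippet above:
- (void)dealloc
{
    // Let the image view drop its (decoded, possibly large) image first...
    [imgView setImage:nil];
    // ...then release the view itself as usual.
    [imgView release];
    [super dealloc];
}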
Seems you are using too much memory?
How many images do you open? Start with Instruments attached and watch the memory footprint.
Keep in mind that images take much more memory when loaded than compressed on disk.
Try wrapping your allocations/releases in a local autorelease pool.
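A minimal sketch under manual reference counting, reusing imgName and imgView from the question (where exactly the pool goes depends on where the loading happens, e.g. around the body of a loop):
NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
// Temporary objects created here (the path string, the autoreleased UIImage, ...)
// are released when the pool is drained instead of piling up.
NSString *path = [[NSBundle mainBundle] pathForResource:imgName ofType:@"jpg"];
UIImage *img = [UIImage imageWithContentsOfFile:path];
[imgView setImage:img];   // imgView retains the image it keeps
[pool drain];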

iPhone Dev: big PNG sequences cause crash?

I'm building an app which includes a number of image sequences (5 sequences with about 80 images each). It runs nicely in the iPhone simulator, but causes my iPhone to reboot when I test it. By the way, each PNG image is about 8 KB in size.
Has anyone successfully built a similar app?
Am I using too many resources for the iPhone to handle?
Anyone?
UPDATE:
Thanks to all for your answers! I've modified my code to use [UIImage imageWithContentsOfFile:] instead of [UIImage imageNamed:].
However, I'm still unable to prevent the app from crashing my iPhone.
(Please note that my PNGs are not that big, about 400x400 px / 8 KB.)
Does anyone have any suggestions?
Here's my code:
// code snippet:
myFrames = [[NSMutableArray alloc] initWithCapacity:maxFrames];
NSMutableString *curFrame;
num = 0;
// loop (maxframes = 80)
for(int f = 1; f < maxFrames+1; f++)
{
curFrame = [NSMutableString stringWithString:tName];
if(f < 10) [curFrame appendString:[NSString stringWithFormat:@"00%i",f]];
else if(f>9 && f<100) [curFrame appendString:[NSString stringWithFormat:@"0%i",f]];
else [curFrame appendString:[NSString stringWithFormat:@"%i",f]];
UIImage *img = [UIImage imageWithContentsOfFile: [[NSBundle mainBundle] pathForResource:curFrame ofType:@"png"]];
if(img) [myFrames addObject:img];
[img release];
}
// animate the images!
self.animationImages = myFrames;
self.animationDuration = (maxFrames * .05); // Seconds
[self startAnimating];
The best way to find out is to run the application under Instruments using Leaks or Object Alloc. If you see an upward trend that keeps rising, you might have a leak.
If you're using [UIImage imageNamed:], you should be aware that it pre-caches an optimized version which takes up more memory when compared with [UIImage imageWithContentsOfFile:]. Additionally, until iPhone 3.0, the cache created by [UIImage imageNamed:] doesn't get released when there's a memory warning.
The current-gen iPhone only has 128 MB of RAM, some of which is used by the OS itself. A 320x480 image, fully uncompressed with an alpha channel, can take about 614 KB. If you have 400 unique full-screen images, that's well over 128 MB of RAM, assuming they are all loaded and cached uncompressed.
The number one reason why an app would not crash on the simulator but would crash on the phone is memory.
On the iPhone simulator, AFAIK, the memory is not limited to 128 MB, while on the iPhone it restarts once it reaches 128 MB. So check your memory usage on the simulator. You have to change the way you are loading the images and/or check for leaks. Also check whether you're getting low-memory warnings by implementing the relevant methods (I forgot what they are called :()
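A minimal sketch of the view-controller hook being referred to here, didReceiveMemoryWarning (the NSLog and the myFrames cleanup are placeholders borrowed from the question's snippet; in your app the cleanup belongs wherever the frames are owned):
// Called by UIKit when the system is running low on memory.
- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    NSLog(@"Received a low-memory warning");
    // Release anything that can be rebuilt later, e.g. the cached animation frames.
    [myFrames removeAllObjects];
}
There is also applicationDidReceiveMemoryWarning: on the application delegate and the UIApplicationDidReceiveMemoryWarningNotification notification, if the cache lives outside a view controller.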
I've seen apps run in the simulator and not on the phone because of improper PNG formatting (even a single improperly formatted image can cause this crash). Check to make sure that the format of your images matches those of the PNG files provided by Apple in their example apps.
That being said, 400 full-screen images would easily cause it to run out of memory, as in memory they will occupy far more than 8 KB each. Not sure how big those images are, but if they're all in memory they will need to be very, very small on the iPhone.
The first answer to your question states that while your PNGs may take up only 8 KB on disk, that is the compressed, on-disk form. When an image is loaded into memory, it is decompressed and is much larger than 8 KB: at 32 bits per pixel, a 400x400 image will be about 640 KB.
Even without the alpha channel, you're looking at 480 KB. At 480 KB x 80 frames, that is 38.4 MB, which is definitely creeping toward more memory than the iPhone has available to give your app at once. Here is an article about some of the troubles with obtaining a substantial amount of memory from iPhone OS.

iPhone video buffer

I'm trying to build a video recorder without jailbreaking my iPhone (I have a developer license).
I began using the PhotoLibrary private framework, but I can only reach 2 fps (too slow).
The Cycoder app gets 15 fps; I think it uses a different approach.
I tried to create a bitmap from the previewView of the CameraController, but it always returns a black bitmap.
I wonder if there's a way to directly access the video buffer, maybe with the IOKit framework.
Thanks
Marco
Here is the code:
image = [window _createCGImageRefRepresentationInFrame:rectToCapture];
Marco
That is the big problem. So far I've solved it by using some temporary fixed-size buffers and detaching a thread for every buffer when it is full. The thread then saves the buffer contents to flash memory. Launching several heavy threads (heavy because each thread accesses the flash) slows down the device and the refresh of the camera view.
Buffers cannot be big, because you will get memory warnings, and they cannot be small, because you will freeze the device from too many threads and flash accesses at a time.
The solution lies in balancing buffer size and number of threads; a rough sketch of the idea is below.
I haven't yet tried using an sqlite3 database to store the images' binary data, but I don't know if it would be a better solution.
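A rough sketch of that buffer-and-thread scheme (bufferDidFill:, writeBufferToFlash: and the file path are illustrative names, not the poster's actual code):
// When a fixed-size capture buffer fills up, hand it off to a detached
// thread that writes it to flash, so capturing can continue into a new buffer.
- (void)bufferDidFill:(NSData *)fullBuffer
{
    [NSThread detachNewThreadSelector:@selector(writeBufferToFlash:)
                             toTarget:self
                           withObject:fullBuffer];
}

- (void)writeBufferToFlash:(NSData *)buffer
{
    // Each detached thread needs its own autorelease pool.
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    // Placeholder path; each buffer gets its own file in this sketch.
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:
                      [NSString stringWithFormat:@"chunk-%f.dat",
                       [NSDate timeIntervalSinceReferenceDate]]];
    [buffer writeToFile:path atomically:NO];
    [pool drain];
}
Tuning the buffer size and the number of threads in flight is exactly the balancing act described above.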
PS: To speed up repeated method calls, avoid the common form [object method] (because of how message dispatch works) and instead get and save the method's implementation address, as below.
From the Apple Objective-C documentation:
"The example below shows how the procedure that implements the setFilled: method might be called:
void (*setter)(id, SEL, BOOL);
int i;
setter = (void (*)(id, SEL, BOOL))[target methodForSelector:@selector(setFilled:)];
for ( i = 0; i < 1000; i++ )
    setter(targetList[i], @selector(setFilled:), YES);"
Marco
If you're intending to ever release your app on the App Store, using a private framework will ensure that it will be rejected. Video, using the SDK, simply isn't supported.
Capturing the video you see when the camera is active requires fairly sophisticated techniques, not exposed by any framework/library out of the box.
I used an undocumented UIWindow method to get the currently displayed frame as a CGImageRef.
Now it works successfully!!
If you would like, and if I'm allowed, I can post the code that does the trick.
Marco