iPhone UIWebView photo upload memory leak

Whenever I upload an image using a UIWebView in iOS, I get some memory leaks:
- malloc, 16 bytes, JavaScriptCore, one time
- PhotoLibrary, 144 bytes, CAMBulleredSnapShotView, multiple times
- CALayer, 48 bytes, UIKit, multiple times
- UIStatusBarHideAnimationParameters, 48 bytes, multiple times
- malloc, 144 & 16 bytes, QuartzCore, multiple times
How can I release this memory? Any help is appreciated; thanks in advance.

Related

Memory leakage when running on iPhone

My iPhone application plays an HTTP Live Stream from a URL, and when I play it on my iPhone it shows a memory leak.
Here is the leak:
Leaked Object = GeneralBlock-64 (64 bytes)
Responsible Library = UIKit
Responsible Frame = GetContextStack
This happens only when I run it on the iPhone; in the Simulator there is no leak.
Please help.
Check whether you are running your code on the main thread or on a background thread. If you are working with UI elements, you must run that code on the main thread only, so review your code with that in mind.
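For illustration, a minimal sketch of what "run it on the main thread" looks like with GCD (the statusLabel property here is a hypothetical UI element, not taken from the question):
// From a background thread (e.g. a stream or network callback),
// hop onto the main queue before touching UIKit; UIKit is not thread-safe.
dispatch_async(dispatch_get_main_queue(), ^{
    self.statusLabel.text = @"Buffering...";
});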

Saving file to disk while running AVCaptureVideoPreviewLayer and CMMotionManager

Good day, I hope someone can help me with this situation:
I am working on an iPhone app that takes a series of images, assisted by the gyroscope.
So both the AVCamCaptureManager and CMMotionManager sessions are running at the same time.
After taking a still image, I am:
- processing the image in a background thread (which works fine without affecting anything)
- then saving processed image data to disk
[imageData writeToFile:imagePath atomically:YES];
The issue: both the AVCamCaptureManager and CMMotionManager sessions freeze for less than half a second, right after the writeToFile: call is initiated.
Does anyone have any experience with such scenario?
Thanks for your time! :)
It turned out that saving to disk does not affect the sessions.
I was also setting UIImageView.image to a large image at the end of my routine, and that is what was freezing everything for half a second.
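For anyone who hits the same stall: a hedged sketch of one common mitigation, forcing the expensive image decode onto a background queue before assigning it (imagePath matches the question; imageView and the rest are assumptions, not the asker's actual code):
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    UIImage *image = [UIImage imageWithContentsOfFile:imagePath];
    // Draw once into a throwaway context to force decompression now,
    // instead of lazily at first display (which blocks the main thread).
    UIGraphicsBeginImageContext(CGSizeMake(1, 1));
    [image drawAtPoint:CGPointZero];
    UIGraphicsEndImageContext();
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image; // UIKit work stays on the main thread
    });
});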

iOS Video: More than 4 simultaneous AVAssetReaders possible?

I would like to render multiple H264 mp4 videos on multiple views at the same time. The goal is to read about 8 short videos, each at a size of 100x100 pixels, and display their content at multiple positions on the screen simultaneously.
Imagine 24 squares on the screen, each showing one video out of a pool of 8 videos.
MoviePlayer doesn't work, since it only shows one fullscreen video. An AVPlayer with multiple AVPlayerLayers is limited, because only the most recently created layer will show its content on screen (according to the documentation and my testing).
So, I wrote a short video class and created an instance for every .mp4 file in my bundle, using AVAssetReader to read its content. On update, every video frame is retrieved, converted to a UIImage, and displayed according to the video's framerate. Furthermore, these images are cached for fast access when looping.
- (id)initWithAsset:(AVURLAsset *)asset withTrack:(AVAssetTrack *)track
{
    self = [super init];
    if (self)
    {
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
            (NSString *)kCVPixelBufferPixelFormatTypeKey, nil];
        mOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:track outputSettings:settings];
        mReader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
        [mReader addOutput:mOutput];
        BOOL status = [mReader startReading];
    }
    return self;
}
- (void)update:(double)elapsed
{
    CMSampleBufferRef buffer = [mOutput copyNextSampleBuffer];
    if (buffer)
    {
        UIImage *image = [self imageFromSampleBuffer:buffer];
        // copyNextSampleBuffer follows the Create rule, so release it here.
        CFRelease(buffer);
    }
    [...]
}
Actually this works pretty well, but only for 4 videos. The fifth one never shows up. First I thought of memory issues, but I tested it on the following devices:
iPhone 3GS
iPhone 4
iPad
iPad 2
I had the same behaviour on each device: 4 videos playing in a smooth loop, no differences.
If it were a memory issue, I would have expected either the iPad 2 to show 5 or 6 videos (due to its better hardware), or the 3GS to show only 1, or a crash somewhere.
The simulator shows all videos, though.
Debugging on the device shows that
BOOL status = [mReader startReading];
returns NO for videos 5, 6, 7, and 8.
So, is there some kind of hardware setting (or restriction) that doesn't allow more than 4 simultaneous AVAssetReaders? I can't really explain this behaviour otherwise; I don't think all these devices have the exact same amount of video memory.
Yes, iOS has an upper limit on the number of videos that can be decoded at one time. While your approach is good, I don't know of any way to work around this upper limit as far as having that many H.264 decoders active at once. If you are interested, have a look at my solution to this problem, an Xcode project called Fireworks. Basically, this demo decodes a bunch of alpha-channel videos to disk, then plays each one by mapping a portion of the video file into memory. This approach makes it possible to decode more than 4 movies at the same time without using up all the system memory and without running into the hard limit on the number of H.264 decoder objects.
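A minimal sketch of that memory-mapping idea, assuming frames were pre-decoded to a flat file of raw 32BGRA pixels (framePath, width, height, and frameIndex are hypothetical names, not taken from the Fireworks project):
// Map the pre-decoded frame file; pages are faulted in on demand,
// so the file never has to be fully resident in memory.
NSData *frames = [NSData dataWithContentsOfFile:framePath
                                        options:NSDataReadingMappedIfSafe
                                          error:NULL];
size_t bytesPerFrame = width * height * 4; // raw 32BGRA pixels
const uint8_t *frame = (const uint8_t *)[frames bytes] + frameIndex * bytesPerFrame;
// ...wrap `frame` in a CGImage or texture for display...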
Have you tried creating separate AVPlayerItems based on the same AVAsset for each AVPlayerLayer?
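In case it helps, a sketch of that suggestion; each AVPlayer gets its own AVPlayerItem built from the shared asset (videoURL and tileViews are assumptions for illustration):
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
for (UIView *tile in tileViews) {
    // A fresh AVPlayerItem per player; the underlying AVAsset is shared.
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = tile.bounds;
    [tile.layer addSublayer:layer];
    [player play];
}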
Here's my latest iteration of a perfectly smooth-scrolling collection view with real-time video previews (up to 16 at a time):
https://youtu.be/7QlaO7WxjGg
It even uses a cover flow custom layout and a "reflection" view that mirrors the video preview perfectly. The source code is here:
http://www.mediafire.com/download/ivecygnlhqxwynr/VideoWallCollectionView.zip

_dyld_start causing leaks in iPhone apps

Using the Allocations instrument on my iPhone device, I notice in my heapshots that all my heap growth is attributed to the _dyld_start caller (in the dyld library).
Here is an example:
Snapshot: UIImageView
Heap Growth: 4.83 KB
Still Alive: 103
When I look at the details, all I see is several instances of the following:
Object Add: xxxx
Creation Time: ....
Live: check
Responsible Library: dyld
Responsible Caller: _dyld_start
What does this mean?
How can I change my code to release this memory?
If you load your UIImage with imageNamed:, then you can't release this memory,
because imageNamed: caches the image until the application closes.
You could try loading your image with imageWithContentsOfFile: or imageWithData: instead.
Hope that helps.
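To make the difference concrete, a small sketch (the file name photo.png is just a placeholder):
// Cached for the lifetime of the app; shows up as permanent heap growth:
UIImage *cached = [UIImage imageNamed:@"photo.png"];
// Not cached; deallocated normally once nothing retains it:
NSString *path = [[NSBundle mainBundle] pathForResource:@"photo" ofType:@"png"];
UIImage *uncached = [UIImage imageWithContentsOfFile:path];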

iPhone video buffer

I'm trying to build a video recorder without jailbreaking my iPhone (I have a Developer license).
I began by using the PhotoLibrary private framework, but I can only reach 2 fps (too slow).
The Cycoder app reaches 15 fps; I think it uses a different approach.
I tried to create a bitmap from the previewView of the CameraController, but it always returns a black bitmap.
I wonder if there's a way to directly access the video buffer, maybe with the IOKit framework.
Thanks
Marco
Here is the code:
image = [window _createCGImageRefRepresentationInFrame:rectToCapture];
Marco
That is the big problem. So far I've solved it by using some fixed-size temporary buffers and detaching a thread for each buffer once it is full. Each thread saves its buffer's content to flash memory. Launching several heavy threads (heavy because each one accesses the flash) will slow down the device and the refresh of the camera view.
The buffers cannot be big, because you will get memory warnings, and they cannot be small, because you will freeze the device with too many threads and too many accesses to the flash memory at a time.
The solution resides in balancing buffer size against the number of threads; a rough sketch follows.
I haven't yet tried using a sqlite3 database to store the images' binary data, but I don't know whether it would be a better solution.
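Here is a rough sketch of that buffering scheme, assuming raw frames arrive through some capture callback (frameBuffer, kBufferCapacity, and nextChunkPath are hypothetical names, not my actual code):
// Append each captured frame to the current buffer; when it fills up,
// hand the full buffer to a background thread that writes it to flash.
- (void)captureFrame:(NSData *)frameData
{
    [self.frameBuffer appendData:frameData];
    if ([self.frameBuffer length] >= kBufferCapacity) {
        NSData *fullBuffer = self.frameBuffer;
        self.frameBuffer = [NSMutableData dataWithCapacity:kBufferCapacity];
        [NSThread detachNewThreadSelector:@selector(flushBuffer:)
                                 toTarget:self
                               withObject:fullBuffer];
    }
}

- (void)flushBuffer:(NSData *)buffer
{
    @autoreleasepool {
        // Every flush thread touches the flash; too many at once stalls the device.
        [buffer writeToFile:[self nextChunkPath] atomically:NO];
    }
}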
PS: to speed up repeated method calls, avoid the common [object method] form (because of how dynamic method dispatch works) and instead get and save the method's implementation address, as below.
From the Apple Objective-C documentation:
"The example below shows how the procedure that implements the setFilled: method might be called:
void (*setter)(id, SEL, BOOL);
int i;
setter = (void (*)(id, SEL, BOOL))[target methodForSelector:@selector(setFilled:)];
for ( i = 0; i < 1000; i++ )
    setter(targetList[i], @selector(setFilled:), YES);"
Marco
If you intend to ever release your app on the App Store, using a private framework will ensure that it gets rejected. Video recording, using the SDK, simply isn't supported.
Capturing the video you see while the camera is active requires fairly sophisticated techniques, not exposed by any framework or library out of the box.
I used an undocumented UIWindow method to get the currently displayed frame as a CGImageRef.
Now it works successfully!
If you would like, and if I'm allowed, I can post the code that does the trick.
Marco