Detect when there is no disk space left with the iPhone SDK - iphone

Suppose I need to write many images to the iPhone file system, and I need to find out whether there is enough space to write an image to disk. Is this possible using the iPhone SDK?

Yes, it is possible. See the following tutorial (found using the powerful "google" search engine) ;)
http://iphoneincubator.com/blog/device-information/how-to-obtain-total-and-available-disk-space-on-your-iphone-or-ipod-touch
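For reference, a minimal sketch of the general approach (the helper name is mine; the dictionary key is the standard Foundation constant):

// Returns the free space, in bytes, on the volume holding the Documents directory,
// or 0 if the file-system attributes could not be read.
static unsigned long long freeDiskSpaceInBytes(void)
{
    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                   NSUserDomainMask, YES) lastObject];
    NSError *error = nil;
    NSDictionary *attributes = [[NSFileManager defaultManager]
                                attributesOfFileSystemForPath:documentsPath error:&error];
    if (attributes == nil) {
        NSLog(@"Could not read file system attributes: %@", error);
        return 0;
    }
    return [[attributes objectForKey:NSFileSystemFreeSize] unsignedLongLongValue];
}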
Edit: added response re: writing UIImage to disk, with unknown available disk space:
Try-catch is a bad way of doing things here. Exceptions in Cocoa are reserved for truly exceptional (i.e. crash-worthy) circumstances. If you write your image using
NSData *imageData = UIImageJPEGRepresentation(myImage, quality);
(or UIImagePNGRepresentation(myImage))
and then
NSError *error;
BOOL success = [imageData writeToFile:(NSString *)path options:(NSDataWritingOptions)mask error:&error];
Your BOOL success will tell you if it worked, and error will contain an NSError object with a description of the failure. (Check success rather than testing error != nil; the error pointer is only guaranteed to be set when the call actually fails.)
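Put together, a minimal sketch (path and the compression quality are placeholders you supply; NSDataWritingAtomic is one possible options value):

// Assumes myImage is a UIImage and path points inside the app sandbox.
NSData *imageData = UIImageJPEGRepresentation(myImage, 0.8f);
NSError *error = nil;
BOOL success = [imageData writeToFile:path options:NSDataWritingAtomic error:&error];
if (!success) {
    // The write failed, possibly because the disk is full.
    NSLog(@"Could not write image: %@", [error localizedDescription]);
}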

Related

ASIHTTPRequest - Uploading files from the camera roll

I currently need to upload large files from an iDevice to a server API. For this I'm attempting to use the ASIHTTPRequest library, as it automatically supports uploading large files in queued chunks (before this I simply created an NSData instance with the bytes for the entire file at once and attached it to the POST message, but that caused the application to crash on larger files due to excessive RAM usage).
The problem is that when you need to upload files and add a file to the HTTP Post message, this is the syntax you need to use:
[theUploadRequest setFile:@"" forKey:@"videoupload"];
setFile requires a file path as a string. The problem I'm currently having is that you don't seem to be allowed to simply take the file path of a file which is not in your application's sandbox, and I need to upload a file which is not in my application but outside of it, in the standard camera roll.
I tried to make this quick test to see if I could create an NSData object and fill it with data from a file in the camera roll, providing a path to it like this:
NSData *testData = [NSData dataWithContentsOfURL:theContent.defaultRepresentation.url];
NSLog(#"THE SIZE OF THE TEST DATA: %i", testData.length);
Note that "theContent" is an instance of an ALAsset, and is a file retrieved from the cameraroll. The result of this is simply a length of 0, which I suppose means you can't simply do that.
Is there any way around this? Or would I have to somehow import the video file into the application's sandbox?
So if I understand correctly, you want to upload stuff straight from the camera roll? Based on the docs I'd say the important piece of code you need is:
NSString *filePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
The info dictionary is passed to your media picker's delegate:
- (void) imagePickerController: (UIImagePickerController *) picker
didFinishPickingMediaWithInfo: (NSDictionary *) info;
You should then be able to use that filePath in your call to setFile.
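Putting the pieces together, a rough sketch of the delegate method (the endpoint URL and form key are placeholders; ASIFormDataRequest is the ASIHTTPRequest subclass that provides setFile:forKey:):

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *filePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];

    // Placeholder endpoint; replace with your own API URL.
    NSURL *uploadURL = [NSURL URLWithString:@"http://example.com/upload"];
    ASIFormDataRequest *theUploadRequest = [ASIFormDataRequest requestWithURL:uploadURL];

    // setFile:forKey: streams the file from disk in chunks instead of loading it all into memory.
    [theUploadRequest setFile:filePath forKey:@"videoupload"];
    [theUploadRequest setDelegate:self];
    [theUploadRequest startAsynchronous];

    [picker dismissModalViewControllerAnimated:YES];
}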
Take a look here; probably you'll find something useful.
Hope this helps.
EDIT:
For something simpler you can use the bit of code posted in this answer. It implies that you need to copy the file somewhere before uploading it, especially if it is a big one.

iPhone, how does one read an image from the photo library as NSData?

On the iPhone I saved an image to the photo library with modifications (appending data after the image code), using the assets library to save the NSData directly to the photo library. Now I want to read the image back from the photo library as NSData. If I read it as a UIImage, it changes the data inside the image.
How can I read the photo from the photo library as NSData? I tried looking into the reference URL in 4.1 but had no luck.
Edit: I edited to explain why I'm saving it as NSData. The URL method in the answers works if you just want the pure NSData, but it does not help in the context I am asking about.
Edit 2: The answer with the getBytes was the answer that did it for me. The code I used was:
NSError *error = nil;
uint8_t *bytes = malloc([representation size]);
NSUInteger length = [representation getBytes:bytes fromOffset:0 length:[representation size] error:&error];
This was able to get me everything inside the file which gave me the image code PLUS what I added in NSData form.
Edit 3:
Is there a way to do this now in iOS 10 with PHPhotoLibrary, which replaces the Assets Library? I need the image as NSData, being the original RAW image with no modifications.
You should look into the getBytes method of ALAssetRepresentation.
This gives you the original RAW image data of your image and also makes it possible to use a buffer to process the image instead of reading the image into memory all at once.
And of course you can always generate NSData from the getBytes method.
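For example, a rough sketch of reading the raw bytes into an NSData (assuming you already have the ALAsset; error handling kept minimal):

ALAssetRepresentation *representation = [asset defaultRepresentation];
// Allocate a buffer for the whole asset and copy the raw bytes into it.
long long size = [representation size];
uint8_t *buffer = malloc((size_t)size);
NSError *error = nil;
NSUInteger bytesRead = [representation getBytes:buffer fromOffset:0 length:(NSUInteger)size error:&error];
// Hand ownership of the malloc'd buffer to NSData (freeWhenDone:YES frees it for us).
NSData *rawData = [NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:YES];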
Cheers,
Hendrik
If you already have your ALAsset, you should be able to get an NSData of the default representation using the following:
NSData *data = [NSData dataWithContentsOfURL:[[asset defaultRepresentation] url]];

Persisting an array of images

What's the best way to save and retrieve an array of images across app restarts?
I'm implementing a caching feature for offline viewing of downloaded images and just want to make sure I'm using the right persisting methods.
Thanks!
The quickest & probably best solution would be to persist your images to disk, no question about it.
You could do something like this to save them as JPEG.
NSData *data = UIImageJPEGRepresentation(image, 1.0f);
[data writeToFile:imagePath atomically:YES];
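Reading an image back after a restart is just the reverse; a minimal sketch (imagePath is whatever path you used when saving):

// Returns nil if the file does not exist, so you can fall back to re-downloading.
UIImage *cachedImage = [UIImage imageWithContentsOfFile:imagePath];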

UIImagePNGRepresentation slow or am I doing something wrong?

I'm working on an iPhone app that uses the camera to take pictures, then saves them to the application's Documents directory. I'm using the following code to convert the UIImage to NSData:
NSData *imageData = [NSData dataWithData:UIImagePNGRepresentation(image)];
Then I write the NSData using
[imageData writeToFile:path atomically:NO]
It all works. The problem is that UIImagePNGRepresentation() is really slow. It takes 8-9 secs on my 3G to convert the image to NSData. This seems wrong to me. Does anyone have any experience with this? Is this just slow function or am I doing something terribly wrong?
Thanks
Are you sure you want to save pictures captured with the camera as PNG?
JPEG is a more appropriate format for photographs. Additionally, it's likely much faster!
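For example, the corresponding change (the 0.9 compression quality is just an illustrative value):

// JPEG encoding of a camera photo is typically much faster than PNG and yields a smaller file.
NSData *imageData = UIImageJPEGRepresentation(image, 0.9f);
[imageData writeToFile:path atomically:NO];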

How does the default Camera iPhone app manages to save a photo so fast?

So far I've managed to create an app for iPhone that takes multiple images with about a 3 second interval between each. I'm processing each image in a separate thread asynchronously, and everything is great until it gets to the moment of saving the image to the iPhone disk. Then it takes about 12 seconds to save the image to disk using the JPEG representation.
How does Apple do it? How do they manage to save a single image to disk so fast; is there a trick they are using? I saw that the animations distract the user for a while, but even so, the time they need is clearly below 12 seconds!
Thanks in advance.
Actually, Apple uses its kernel driver AppleJPEGDriver. It is a hardware JPEG encoding API and is much faster than software encoding (UIImageJPEGRepresentation); some people use it in their jailbreak apps (e.g. Cycorder, the video recording application).
Apple should give the same functionality to its users, but they are Apple :)
I haven't tried this but I wouldn't be so sure that Apple isn't using the same methods. A big part of the Apple design philosophy relies on hiding operational interruptions from the user. The Apple code may take as much time as yours but simply be adroit at hiding the entire save time from the perception of the user.
If someone can't tell you how Apple actually does save faster I would suggest looking at ways to disguise the save time.
If you google around a bit, you'll find a whole bunch of people with the same problem.
I didn't find an answer. The general conclusion seems to be that Apple either uses some internal API that bypasses the public API overhead, or a hardware encoder.
Guess you are out of luck for fast image saving.
I was having this problem in my app; saving would hang the UI, so I used Grand Central Dispatch.
Below is the setImage method from my image cache class. If the UIImage is non-nil it saves it, otherwise it deletes the cached file. You can hopefully adapt this to suit your needs; it will only work on iOS 4+. The code is ARC enabled.
- (void)setImage:(UIImage *)image {
    if (image == nil) {
        NSLog(@"Deleting Image");
        // Since we have no image, remove the cached file if it exists.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            [[NSFileManager defaultManager] removeItemAtPath:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] error:nil];
        });
    }
    else {
        NSLog(@"Saving Image");
        // We've got an image; save it to flash memory off the main thread.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                                       NSUserDomainMask, YES) objectAtIndex:0];
            // JPEG data to match the .jpg file name (the original used UIImagePNGRepresentation).
            NSData *dataObj = UIImageJPEGRepresentation(image, 1.0f);
            [dataObj writeToFile:[cachePath
                stringByAppendingPathComponent:@"capturedimage.jpg"] atomically:NO];
        });
    }
    imageCache = image;
}
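For completeness, a sketch of what a matching getter in such a cache class might look like (this is not from the original answer; it assumes the same imageCache ivar and file name used above):

- (UIImage *)image {
    // Return the in-memory copy if we still have one.
    if (imageCache != nil) {
        return imageCache;
    }
    // Otherwise fall back to the copy saved in the Caches directory.
    NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory,
                                                               NSUserDomainMask, YES) objectAtIndex:0];
    imageCache = [UIImage imageWithContentsOfFile:
                     [cachePath stringByAppendingPathComponent:@"capturedimage.jpg"]];
    return imageCache;
}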