iOS PDF with libHaru issue - iPhone

I use libHaru to create PDFs.
I followed this answer, iOS SDK - Programmatically generate a PDF file, to create the PDF.
The library has a method HPDF_LoadPngImageFromFile, but I need to load an image from data (or a UIImage). How can I do that?
Right now the best solution I have is to write the data to a PNG file and then load it, but I think a better solution should exist.

LibHaru is platform independent, so it does not know about UIImage or NSData.
You have 2 options:
1. Save the UIImage/NSData to a file and then load the image from that file using HPDF_LoadPngImageFromFile, or
2. Save the UIImage to an NSData object, get a pointer to the NSData buffer (the [nsdataobject bytes] method) and then use HPDF_LoadPngImageFromMem to load the image from memory (a sketch is below).
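A rough sketch of option 2 (untested; assumes you already have an HPDF_Doc, and the helper name LoadUIImageIntoPDF is just for illustration):

#import <UIKit/UIKit.h>
#include "hpdf.h"

// Assumes `pdf` is a valid HPDF_Doc and `image` is the UIImage to embed.
HPDF_Image LoadUIImageIntoPDF(HPDF_Doc pdf, UIImage *image)
{
    // Encode the UIImage as PNG data in memory.
    NSData *pngData = UIImagePNGRepresentation(image);

    // Hand libHaru the raw bytes instead of a file path.
    return HPDF_LoadPngImageFromMem(pdf,
                                    (const HPDF_BYTE *)[pngData bytes],
                                    (HPDF_UINT)[pngData length]);
}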

Related

ASIHTTPRequest - Uploading files from the camera roll

I currently need to upload large files from an iDevice to a server API. For this I'm attempting to use the ASIHTTPRequest library as it automatically supports uploading large files in queued chunks (before this I simply created an NSData instance with the bytes for the entire file at once and attached this to the POST message, but this causes the application to crash on larger files due to an excessive amount of RAM usage).
The problem is that when you need to upload files and add a file to the HTTP POST message, this is the syntax you need to use:
[theUploadRequest setFile:@"" forKey:@"videoupload"];
setFile requires a file path in a string format. The problem I'm currently having is that it does not seem like you are allowed to simply take the file path of a file that is not in your application's sandbox. I need to upload a file which is not in my application but outside of it, in the standard camera roll.
I tried this quick test to see if I could create an NSData object and fill it with data from a file in the camera roll, providing a path to it like this:
NSData *testData = [NSData dataWithContentsOfURL:theContent.defaultRepresentation.url];
NSLog(@"THE SIZE OF THE TEST DATA: %lu", (unsigned long)testData.length);
Note that "theContent" is an instance of an ALAsset, and is a file retrieved from the cameraroll. The result of this is simply a length of 0, which I suppose means you can't simply do that.
Is there any way around this? Or would I have to somehow import the video file into the application's sandbox?
So if I understand correctly, you want to upload stuff straight from the camera roll? Based on the docs I'd say the important piece of code you need is:
NSString *filePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
The info dictionary is passed to your media picker's delegate:
- (void) imagePickerController: (UIImagePickerController *) picker
didFinishPickingMediaWithInfo: (NSDictionary *) info;
You should then be able to use that filePath in your call to setFile.
Take a look here; you'll probably find something useful.
Hope this helps.
EDIT:
For something simpler, you can use the bit of code posted in this answer, which implies that you need to copy the file somewhere before uploading it, especially if it is a big one.
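Here's an untested sketch of how the pieces could fit together (ASIFormDataRequest is the ASIHTTPRequest subclass that handles multipart form data; the upload URL and the "videoupload" field name are just placeholders):

#import "ASIFormDataRequest.h"

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // UIImagePickerController copies a picked video into the app's tmp
    // directory, so this path is readable from inside the sandbox.
    NSString *filePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];

    // Placeholder endpoint and form field name.
    NSURL *url = [NSURL URLWithString:@"https://example.com/upload"];
    ASIFormDataRequest *theUploadRequest = [ASIFormDataRequest requestWithURL:url];

    // Streams the file from disk instead of loading it all into RAM.
    [theUploadRequest setFile:filePath forKey:@"videoupload"];
    [theUploadRequest startAsynchronous];

    [picker dismissModalViewControllerAnimated:YES];
}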

iPhone, how does one read an image from the photo library as NSData?

On the iPhone I saved an image to the photo library with modifications (appending data after the image code), using the asset library to save the NSData directly to the photo library. Now I want to read the image back from the photo library as NSData. If I read it as a UIImage, it changes the data inside the image.
How can I read the photo from the photo library as NSData? I tried looking into the reference URL in 4.1 but no luck.
Edit: I edited to explain why I'm saving it as NSData. The URL method in the answers works if you just want pure NSData, but it does not help in the context I'm asking about.
Edit 2: The getBytes answer was the one that did it for me. The code I used was:
NSError *error = nil;
uint8_t *bytes = malloc([representation size]);
NSUInteger length = [representation getBytes:bytes fromOffset:0 length:[representation size] error:&error];
This was able to get me everything inside the file which gave me the image code PLUS what I added in NSData form.
Edit 3:
Is there a way to do this now in iOS 10 with PHPhotoLibrary, which replaces AssetsLibrary? I need the image as NSData, the original raw file with no modifications.
You should look into the getBytes method of ALAssetRepresentation.
This gives you the original raw file data of your image and also makes it possible to use a buffer to process the image in chunks instead of reading it into memory all at once.
And of course you can always generate NSData from the getBytes method.
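For example, something along these lines (untested sketch; reads the representation in fixed-size chunks so the whole file never has to sit in one buffer):

#import <AssetsLibrary/AssetsLibrary.h>

// `representation` is the ALAssetRepresentation of your asset.
long long size = [representation size];
NSMutableData *data = [NSMutableData dataWithCapacity:(NSUInteger)size];
uint8_t buffer[64 * 1024];
long long offset = 0;

while (offset < size) {
    NSError *error = nil;
    NSUInteger read = [representation getBytes:buffer
                                    fromOffset:offset
                                        length:sizeof(buffer)
                                         error:&error];
    if (read == 0) break;            // error or end of data
    [data appendBytes:buffer length:read];
    offset += read;
}
// `data` now holds the unmodified file bytes.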
Cheers,
Hendrik
If you already have your ALAsset, you should be able to get an NSData of the default representation using the following:
NSData *data = [NSData dataWithContentsOfURL:[[asset defaultRepresentation] url]];

How to save images to the camera roll on iPhone programmatically?

I am developing an application in which I want to save pictures to the camera roll. How can I save my pictures to the camera roll in code?
The most basic way (and the only way if you're targeting iOS before 4.0) is to use UIImageWriteToSavedPhotosAlbum. This lets you specify a selector to be called on a target object when the save is complete.
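A minimal sketch of that route (the savePhoto: method name is just for illustration; the callback must have exactly this signature):

- (void)savePhoto:(UIImage *)image
{
    UIImageWriteToSavedPhotosAlbum(image, self,
                                   @selector(image:didFinishSavingWithError:contextInfo:),
                                   NULL);
}

// Called when the save finishes.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Save failed: %@", error);
    } else {
        NSLog(@"Image saved to the camera roll");
    }
}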
In 4.0, you can use ALAssetsLibrary's writeImageToSavedPhotosAlbum:orientation:completionBlock: to write the image; in this case, you provide a block to be called when the save is complete.
In 4.1, you can also use ALAssetsLibrary's writeImageDataToSavedPhotosAlbum:metadata:completionBlock: or writeImageToSavedPhotosAlbum:metadata:completionBlock: to write an image along with metadata (e.g. geotagging information). The former is also the only way to write an image from an NSData object without first loading it as a UIImage or CGImageRef.
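For the NSData route, roughly this (untested sketch; imageData is assumed to already hold encoded JPEG or PNG bytes):

#import <AssetsLibrary/AssetsLibrary.h>

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:imageData
                                 metadata:nil   // or a dictionary of EXIF/GPS metadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    if (error) {
        NSLog(@"Save failed: %@", error);
    } else {
        NSLog(@"Saved, asset URL: %@", assetURL);
    }
}];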
See this blog post. UIImageWriteToSavedPhotosAlbum is your friend.

Is there any standard file-open dialog in iPhone programming?

I want to create an application that will process images (mostly photographs from the mobile camera).
So first I need a way to open an image file from the phone's memory.
How can I do that?
Sorry for such a stupid question, but I am a newbie in iPhone programming (yet :)).
P.S. I use Xcode and Cocoa.
You can use UIImagePickerController to get a standard interface for selecting a photo from the user's library or camera roll. Depending on the configuration, it can also be used to capture a new image with the camera. Note, however, that you get a UIImage back, not a file; if you want to upload it somewhere, you will first have to create a file or NSData object from the image, using either UIImagePNGRepresentation() or UIImageJPEGRepresentation().
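Roughly like this (untested sketch; pickImage is just an illustrative method name, and self must adopt UIImagePickerControllerDelegate and UINavigationControllerDelegate):

- (void)pickImage
{
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    picker.delegate = self;
    [self presentModalViewController:picker animated:YES];
}

- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];

    // Turn the UIImage into bytes you can save or upload.
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.9f);

    [picker dismissModalViewControllerAnimated:YES];
    // ... write jpegData to a file or attach it to your upload ...
}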

ALAssetRepresentation as NSData for GIFs

I'm trying to allow users to pull images out of their Photos collections using ALAssetsLibrary. Users can then upload these images. My goal is to let users upload any GIFs they may have in their library without losing the animation.
For PNG and JPEG files I can grab the ALAssetRepresentation, use - (CGImageRef)fullResolutionImage to get a CGImageRef, and then save it to NSData using UIImageJPEGRepresentation or UIImagePNGRepresentation.
However, because no similar function exists for GIF files, all I can do is convert the GIF to either JPEG or PNG, but then I lose the animation.
Is there either
a way to grab the NSData straight from an ALAssetRepresentation object or
a way to go from ALAssetRepresentation -> CGImageRef -> NSData without losing any GIF animation frames?
Thanks in advance!
Yes, there is quite a simple way:
Use the getBytes:fromOffset:length:error: method of ALAssetRepresentation. This gives you the raw file data of the ALAsset, in your case the GIF file.
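For example (untested sketch; asset is your ALAsset):

#import <AssetsLibrary/AssetsLibrary.h>

ALAssetRepresentation *rep = [asset defaultRepresentation];
NSError *error = nil;

long long size = [rep size];
uint8_t *buffer = malloc((size_t)size);
NSUInteger read = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)size error:&error];

// Wrap the buffer; NSData frees it when it is released.
NSData *gifData = (read > 0)
    ? [NSData dataWithBytesNoCopy:buffer length:read freeWhenDone:YES]
    : nil;
if (read == 0) free(buffer);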