I am making an iOS app that saves an image from a given URL. I used the following code:
- (IBAction)save:(id)sender {
    UIImage *image = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:photoUrl]]];
    NSLog(@"%f,%f", image.size.width, image.size.height);
    // Let's save the file into the Documents folder.
    NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSLog(@"saving jpg");
    NSString *jpegFilePath = [NSString stringWithFormat:@"%@/test.jpg", docDir];
    NSData *data2 = [NSData dataWithData:UIImageJPEGRepresentation(image, 1.0f)]; // 1.0f = 100% quality
    [data2 writeToFile:jpegFilePath atomically:YES];
    NSLog(@"saving image done");
}
The image is saved successfully after calling this method.
Since I am using the iPhone simulator, I can see that the image is saved in my local Library folder.
However, when I open Photos (the native iPhone app) in the simulator, it doesn't detect the image I just saved.
When I save an image from Safari and open the Photos app, it detects the image correctly.
So why can't it detect the image I saved through my iOS app? Any solutions?
Why would you assume that an image written to some arbitrary location on disk should be known to the Photos app? It isn't. Use the UIImageWriteToSavedPhotosAlbum() function instead.
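For reference, here is a minimal sketch of that approach (the callback selector name is your own choice, as long as its signature matches; pass nil and NULL if you don't need a callback at all):
// Save the downloaded image into the Camera Roll so Photos can see it.
UIImageWriteToSavedPhotosAlbum(image, self,
                               @selector(image:didFinishSavingWithError:contextInfo:),
                               NULL);

// Completion callback; it must have exactly this signature.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    if (error) {
        NSLog(@"Save to Photos failed: %@", error);
    }
}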
Related
I want to save images that the user selects from the camera roll, so that when the app exits and restarts I can show those images to the user again. I have searched over the internet and not found much useful material. I know I can write the image into the database as a blob, but I don't want to do that because it increases the database size. Is it possible to save the image path and then later use that path to access that image?
Save the image into the Documents directory:
NSString *imageName = @"myImage.png";
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];
NSString *dataPath = [documentsDirectoryPath stringByAppendingPathComponent:imageName];
NSData *settingsData = UIImagePNGRepresentation(image); // `image` is the UIImage you want to persist
[settingsData writeToFile:dataPath atomically:YES];
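To read it back later, a minimal sketch (reusing the same imageName and documentsDirectoryPath as above; myImageView is just a placeholder for wherever you display it):
NSString *savedPath = [documentsDirectoryPath stringByAppendingPathComponent:imageName];
UIImage *savedImage = [UIImage imageWithContentsOfFile:savedPath];
if (savedImage != nil) {
    myImageView.image = savedImage;
}
Note that it is safer to store only the file name and rebuild the Documents path on each launch, since the absolute path to the app's sandbox can change between installs.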
EDIT:
If you want to see your saved images on the iPhone/iPad (or share them) in iTunes -> Apps -> Documents,
just add "Application supports iTunes file sharing" : YES to your Info.plist.
You have to use UIImagePickerController to get the media from the camera roll.
You will get the image in the delegate method
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
Here you can get the image and save it as NSData in a file, then retrieve it next time.
Edit:
To save:
NSData *data = UIImageJPEGRepresentation(theImage, 1.0);
NSString *imagePath = [ContentFolder stringByAppendingPathComponent:[NSString stringWithFormat:@"%d.plist", Counter]];
NSLog(@"image path %@", imagePath);
NSMutableDictionary *objectDictionary = [[NSMutableDictionary alloc] initWithObjectsAndKeys:data, @"data", nil];
[objectDictionary writeToFile:imagePath atomically:YES];
You should be able to change these according to your needs.
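To retrieve the image next time, something like this should work (a sketch reusing the same ContentFolder and Counter names from the snippet above):
NSString *imagePath = [ContentFolder stringByAppendingPathComponent:[NSString stringWithFormat:@"%d.plist", Counter]];
NSDictionary *objectDictionary = [NSDictionary dictionaryWithContentsOfFile:imagePath];
NSData *savedData = [objectDictionary objectForKey:@"data"];
UIImage *savedImage = [[UIImage alloc] initWithData:savedData];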
I am new to this and have searched a lot this week. I am trying to set a background image from a file I have downloaded from a URL. I need to save the image for later, especially so that if I am not connected to the internet I can still show this image.
I have verified, by printing out the contents of the directory, that the file exists. This is one of many pieces of code I have tried to make the image appear in the background:
NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *documentPath = [docDir stringByAppendingFormat:@"/mainback.png"];
// If you go to the folder below, you will find those pictures
NSLog(@"%@", documentPath);
if ([[NSFileManager defaultManager] fileExistsAtPath:documentPath]) {
    NSData *data = [NSData dataWithContentsOfFile:documentPath];
    UIImage *image = [[UIImage alloc] initWithData:data];
    NSLog(@"WHAT - %f,%f", image.size.width, image.size.height);
    UIColor *background = [[UIColor alloc] initWithPatternImage:[UIImage imageNamed:documentPath]];
    self.view.backgroundColor = background;
    [background release];
}
The code finds the image in the directory but I cannot set the image to the background. Please help.
Thanks
Use a UIImageView (Apple documentation linked) instead of doing what you are trying to do with UIColor.
It'll be a lot more straightforward and less problematic.
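As a concrete sketch (note also that imageNamed: only searches the app bundle, so passing it a Documents path returns nil; use the image you already created from data instead):
// Display the saved file with a UIImageView instead of a pattern color.
// `image` is the UIImage created from `data` in the question's code.
UIImageView *backgroundView = [[UIImageView alloc] initWithFrame:self.view.bounds];
backgroundView.image = image;
backgroundView.contentMode = UIViewContentModeScaleAspectFill;
backgroundView.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
[self.view addSubview:backgroundView];
[self.view sendSubviewToBack:backgroundView];
[backgroundView release]; // the surrounding code uses manual retain/release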
Can someone point me in the right direction for saving a webpage as a PDF? I already have a library for an iOS PDF reader, but I need to somehow implement a feature for saving whatever webpage the user is viewing as a PDF. I am assuming I would have to use an in-app browser. Any thoughts?
This is not something that I have done, but it seems you are not the first to want to do something like this.
http://www.iphonedevsdk.com/forum/iphone-sdk-development/21451-save-contents-uiwebview-pdf-file.html
The last post in this link provides code to save a UIWebView as an image, and you should be able to convert that image into a PDF. I don't take credit for the code as I did not write it, just connecting you two :)
Here is the code for simplicity:
CGSize sixzevid = CGSizeMake(1024, 1100);
UIGraphicsBeginImageContext(sixzevid);
[webview.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSData *imageData = UIImagePNGRepresentation(viewImage);
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *pathFolder = @"new.png";
NSString *defaultDBPath = [documentsDirectory stringByAppendingPathComponent:pathFolder];
[imageData writeToFile:defaultDBPath atomically:YES];
Chances are pretty good that you will have to tweak this but, hopefully this helps you move in the right direction.
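Since the end goal is a PDF rather than a PNG, a similar approach can render the web view's layer straight into a PDF context. This is only a sketch under the same assumptions as the snippet above (a webview variable and a single fixed-size page; a long page would need to be split across multiple UIGraphicsBeginPDFPageWithInfo calls):
// Render the UIWebView's layer into a single-page PDF in the Documents directory.
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *pdfPath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"page.pdf"];

UIGraphicsBeginPDFContextToFile(pdfPath, webview.bounds, nil);
UIGraphicsBeginPDFPageWithInfo(webview.bounds, nil);
[webview.layer renderInContext:UIGraphicsGetCurrentContext()];
UIGraphicsEndPDFContext();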
I am attempting to save and load a UIImage to and from the iPhone Documents directory after the image is picked from the iPhone Photo Library. It is able to do so, but for some reason, when I load the image, it is rotated 90 degrees counterclockwise. Here are my saveImage and loadImage methods:
Save Image:
- (void)saveImage:(UIImage *)image {
    if (image != nil)
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                             NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *path = [documentsDirectory stringByAppendingPathComponent:
                          [NSString stringWithString:@"lePhoto.png"]];
        //NSData *data = UIImagePNGRepresentation(image);
        //[data writeToFile:path atomically:YES];
        [UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
    }
}
Load Image:
- (NSData *)loadImage {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *path = [documentsDirectory stringByAppendingPathComponent:
                      [NSString stringWithString:@"lePhoto.png"]];
    NSData *data = [[NSData alloc] initWithContentsOfFile:path];
    //UIImage *image = [[UIImage alloc] initWithData:data]; //my second try
    //UIImage *image = [UIImage imageWithContentsOfFile:path]; //first try
    //return image;
    return data;
    [data release]; // note: never reached, since it comes after the return
}
And I am now loading the image like this, to see if it would, for some crazy reason, solve my problem (it didn't, obviously):
UIImage* theImage = [[UIImage alloc] initWithData:[self loadImage]];
I have done some testing where I have two UIImageViews side by side: one on the left that doesn't save and load the image before display (it just shows the image straight away once a photo is picked from the image picker), and one on the right that displays AFTER the save/load occurs. The one on the right is the only one that rotates 90 degrees CCW; the non-save/load picture displays correctly. This only occurs on an actual iPhone; the simulator shows both images correctly. I have spent many, many hours attempting to figure this out, but to no avail. Any ideas as to what could be causing this?
EDIT: It should also be noted that the image always reports its orientation as up, even when it obviously isn't! It only rotates after ALL of the code has been run. I've tried forcing a redraw in between to see if that changes things, but it doesn't (though it's possible I'm even doing that wrong. Who knows anymore).
Also, it will not rotate photos that were screenshot on the iPhone, only photos that were taken by the camera. [I haven't tried downloaded photos.] It's the weirdest thing...
And if you're wondering exactly how I'm pulling my picture, here's the method that will at least shed light on what I'm doing (it's not every method needed to do this function, obviously, but it gives insight on how I've written my code):
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
theimageView.image = [info objectForKey:#"UIImagePickerControllerOriginalImage"];
[self saveImage:[info objectForKey:#"UIImagePickerControllerOriginalImage"]];
[picker dismissModalViewControllerAnimated:YES];
}
Any advice is greatly appreciated. And I do know this much: if there's a way to do something crazy to get some stupid, never-before-heard-of problem, chances are, I'll find it, apparently :P
Ok. I figured it out. Apparently when you save things with PNG representation, and not JPEG representation, the image orientation information is not saved with it. Because of this, every image loaded will default to showing orientation up. So, the easiest way is to do this:
NSData* data = UIImageJPEGRepresentation(image,0.0);
[data writeToFile:path atomically:YES];
instead of this:
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
And, of course, change all the picture names from .png to .jpg. The second argument (0.0 in the code above) is the JPEG compression quality, where 0.0 means maximum compression (smallest file, lowest quality) and 1.0 means best quality. Hope this helps people in the future!
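If you would rather keep PNG, another option is to redraw the image before saving so that its pixels are physically the right way up and the missing orientation flag no longer matters. A minimal sketch of that standard technique (not from the answer above):
- (UIImage *)normalizedImage:(UIImage *)image {
    if (image.imageOrientation == UIImageOrientationUp) return image;
    // Redraw into a new context; drawInRect: applies the orientation for us.
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}
Then save UIImagePNGRepresentation([self normalizedImage:image]) as before.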
I assume you're using the Assets Library Framework to get the image. If so, you'll want to get the orientation of the image in your results block, and manipulate the UIImage to adjust. I have something like the following:
ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset* asset)
{
ALAssetRepresentation* rep = [asset defaultRepresentation];
_orientation = rep.orientation;
...
}
Aside: What a great type name, huh? Holy moly, does Objective-C need namespaces!
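Building on that, a sketch of how the stored orientation can be applied when rebuilding the UIImage (assuming you already have the ALAsset; ALAssetOrientation values line up with UIImageOrientation, so the cast is safe):
ALAssetRepresentation *rep = [asset defaultRepresentation];
UIImage *oriented = [UIImage imageWithCGImage:[rep fullResolutionImage]
                                        scale:[rep scale]
                                  orientation:(UIImageOrientation)[rep orientation]];
// `oriented` now displays the right way up, even after a PNG round trip.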
Thanks so much...
This code has worked and saved several hours of work for me.
NSData* data = UIImageJPEGRepresentation(image,0.0);
[data writeToFile:path atomically:YES];
in the place of
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];
I'm trying to load pictures from the iPhone Photos app and then save the selected pictures into my app's folder, using ALAssetsLibrary, and I have two issues:
1. The image file saved to disk is much bigger than the original file in the Photos app. For example, a picture is 2.8MB, but after saving to my app's folder it's 6.4MB. Following is the code:
CGImageRef cgImage = [localPhoto thumbnail];
NSString *path = @"/documents/test/1.jpeg"; // the path is just an example
BOOL succ;
UIImage *image1 = [UIImage imageWithCGImage:cgImage];
NSData *data1 = UIImageJPEGRepresentation(image1, 1.0);
succ = [data1 writeToFile:path atomically:YES];
2. The above code (saving the 6.4MB image to a file) takes about 1.6 seconds. Is that normal? Is there any way to make it faster?
Try the PNG representation of the image:
NSData *data1 = UIImagePNGRepresentation(image1);
or else reduce the JPEG compression quality:
NSData *data1 = UIImageJPEGRepresentation(image1, 0.5);
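If the goal is to keep the file close to its original size (and speed things up), another option is to skip re-encoding entirely and copy the asset's original bytes to disk. A sketch, assuming localPhoto is the ALAsset from the question's code and path is the same destination:
ALAssetRepresentation *rep = [localPhoto defaultRepresentation];
long long size = [rep size];
NSMutableData *rawData = [NSMutableData dataWithLength:(NSUInteger)size];
NSError *error = nil;
// Copy the asset's original (already-compressed) bytes straight into the buffer.
NSUInteger bytesRead = [rep getBytes:(uint8_t *)[rawData mutableBytes]
                          fromOffset:0
                              length:(NSUInteger)size
                               error:&error];
if (bytesRead == (NSUInteger)size && error == nil) {
    [rawData writeToFile:path atomically:YES];
}
This avoids the double compression that makes a 2.8MB JPEG balloon to 6.4MB when it is decoded and re-encoded at quality 1.0.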