Image getting saved in simulator gallery but not in iPhone gallery - iPhone

I am trying to save an image to the iPhone gallery with the code below. It works fine in the simulator, but not on the iPhone.
-(void)saveTkt
{
    UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
    CGRect rect = [keyWindow bounds];
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [keyWindow.layer renderInContext:context];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *data = UIImagePNGRepresentation(img);
    [data writeToFile:@"f.png" atomically:YES];
    UIImageWriteToSavedPhotosAlbum(img, self, @selector(thisImage:hasBeenSavedInPhotoAlbumWithError:usingContextInfo:), nil);
}
Edited code:
-(void)saveTkt
{
    UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
    CGRect rect = [keyWindow bounds];
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [keyWindow.layer renderInContext:context];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    NSData *data = UIImagePNGRepresentation(img);
    [data writeToFile:@"f.png" atomically:YES];
    ALAuthorizationStatus status = [ALAssetsLibrary authorizationStatus];
    if (status != ALAuthorizationStatusAuthorized) {
        // show an alert asking the user to grant photo access
    }
    //UIImageWriteToSavedPhotosAlbum(img, self, @selector(thisImage:hasBeenSavedInPhotoAlbumWithError:usingContextInfo:), nil);
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:[img CGImage] orientation:(ALAssetOrientation)[img imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error){
        if (error) {
            // TODO: error handling
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"error" message:[error localizedDescription] delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [alert show];
        } else {
            // TODO: success handling
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"alert" message:@"Saved successfully in photos album." delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
            [alert show];
        }
    }];
    ALAssetsLibraryGroupsEnumerationResultsBlock assetGroupEnumerator =
        ^(ALAssetsGroup *assetGroup, BOOL *stop) {
            if (assetGroup != nil) {
                // do something
            }
        };
    ALAssetsLibraryAccessFailureBlock assetFailureBlock = ^(NSError *error) {
        NSLog(@"Error enumerating photos: %@", [error description]);
    };
    NSUInteger groupTypes = ALAssetsGroupAll;
    [library enumerateGroupsWithTypes:groupTypes usingBlock:assetGroupEnumerator failureBlock:assetFailureBlock];
}

Add the AssetsLibrary framework:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error){
    if (error) {
        // TODO: error handling
    } else {
        // TODO: success handling
    }
}];
[library release];

Apple's documentation says that the completion handler must conform to this:
- (void) image: (UIImage *) image didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo;
And there's an error parameter passed along, which you should print out to see what the actual error is.
For example:
- (void) image: (UIImage *) image didFinishSavingWithError: (NSError *) error contextInfo: (void *) contextInfo
{
    if (error != nil)
    {
        NSLog(@"error while saving image is %@", [error localizedDescription]);
    }
}
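So on the saving side the call should pass a selector with exactly that signature; a minimal sketch, assuming img is the image captured above:
UIImageWriteToSavedPhotosAlbum(img, self,
    @selector(image:didFinishSavingWithError:contextInfo:), NULL);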

I think you should try with NSData first if you need it, or you can simply do it without NSData:
NSData *dataImage = UIImageJPEGRepresentation(file, 1.0);
UIImage *imageFromData = [[UIImage alloc] initWithData:dataImage];
UIImageWriteToSavedPhotosAlbum(imageFromData, nil, nil, nil);
Hope it will work.

Related

How to save a video using code in Objective-C?

I have captured a video using Objective-C; however, I am not able to save it in the iPhone Photos library. I don't want to use the ALAssets library, I want to use the image picker only. I have seen a lot of methods on Stack Overflow and other sites, but they either use a storage location path (which is not specified) or they don't work.
This is my piece of code.
-(IBAction)Onclick:(id)sender
{
    UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];
    imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
    imagePicker.mediaTypes = [NSArray arrayWithObject:(NSString *)kUTTypeMovie];
    imagePicker.delegate = self;
    //UISaveVideoAtPath
    imagePicker.allowsImageEditing = NO;
    [self.view addSubview:imagePicker.view];
    [imagePicker viewWillAppear:YES];
    CGRect overlayFrame = CGRectMake(0, 380, 320, 44);
    //UILabel *lbl=[[UILabel alloc]init];
    UIView *baseView = [[[UIView alloc] init] autorelease];
    baseView.backgroundColor = [UIColor greenColor];
    //lbl.text=@"dfgfdgfd";
    baseView.frame = overlayFrame;
    //view.delegate = self;
    //view.picker = picker;
    //[view customize];
    imagePicker.cameraOverlayView = baseView;
    [self presentModalViewController:imagePicker animated:YES];
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:@"public.image"]) {
        UIImage *picture = [info objectForKey:UIImagePickerControllerOriginalImage];
        UIImageWriteToSavedPhotosAlbum(picture, nil, nil, nil);
    }
    else if ([mediaType isEqualToString:@"public.movie"]) {
        NSURL *url = [[[info objectForKey:UIImagePickerControllerMediaURL] copy] autorelease];
        // ALAssetsLibrary* library = [[[ALAssetsLibrary alloc] init] autorelease];
        // [library writeVideoAtPathToSavedPhotosAlbum:url
        //     completionBlock:^(NSURL *assetURL, NSError *error){ /*notify of completion*/ }];
        UISaveVideoAtPathToSavedPhotosAlbum(url, nil, nil, nil);
    }
    [self dismissModalViewControllerAnimated:YES];
}
Try this:
- (void)imagePickerController:(UIImagePickerController *)picker
        didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    [self dismissViewControllerAnimated:YES completion:nil];
    // Handle a movie capture
    if (CFStringCompare((CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo)
    {
        NSString *moviePath = [[info objectForKey:UIImagePickerControllerMediaURL] path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath))
        {
            UISaveVideoAtPathToSavedPhotosAlbum(moviePath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
        }
    }
}
- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error)
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Photo/Video Saving Failed" delegate:nil cancelButtonTitle:@"Ok" otherButtonTitles:nil];
        [alert show];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Photo/Video Saved" message:@"Saved To Photo Album" delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil];
        [alert show];
    }
}
Hope it helps you.
Try this:
NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeVideoAtPathToSavedPhotosAlbum:videoURL completionBlock:^(NSURL *assetURL, NSError *error){
    /* notify of completion */
    NSLog(@"AssetURL: %@", assetURL);
    NSLog(@"Error: %@", error);
    if (!error) {
        // video saved
    } else {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:error.domain delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
        [alert release];
    }
}];
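If the write still fails on a device, it may also be worth checking that the file is something the Saved Photos album accepts before writing; a small sketch reusing the videoURL and library from above:
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
    // safe to call writeVideoAtPathToSavedPhotosAlbum:completionBlock:
} else {
    NSLog(@"Video at %@ cannot be saved to the Saved Photos album", videoURL);
}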

Raw image data from camera

I've been searching this forum up and down, but I couldn't find what I really need. I want to get raw image data from the camera. So far I have tried to get the data out of the imageDataSampleBuffer from the method captureStillImageAsynchronouslyFromConnection:completionHandler: and write it to an NSData object, but that didn't work.
Maybe I'm on the wrong track or maybe I'm just doing it wrong.
What I don't want is for the image to be compressed in any way.
The easy way is to use jpegStillImageNSDataRepresentation: from AVCaptureStillImageOutput, but like I said I don't want it to be compressed.
Thanks!
This is how I do it:
1: I first open the camera using:
- (void)openCamera
{
    imagePicker = [[UIImagePickerController alloc] init];
    imagePicker.delegate = self;
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        imagePicker.mediaTypes = [NSArray arrayWithObjects:
                                  (NSString *)kUTTypeImage,
                                  (NSString *)kUTTypeMovie, nil];
        imagePicker.allowsEditing = NO;
        [self presentModalViewController:imagePicker animated:YES];
    }
    else {
        lblError.text = NSLocalizedStringFromTable(@"noCameraFound", @"Errors", @"");
    }
}
When the picture is taken this method gets called:
-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Save and get the path of the image
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage])
    {
        // Save the image
        image = [info objectForKey:@"UIImagePickerControllerOriginalImage"];
        [library writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error){
            if (!error) {
                // Save path and location to database
                NSString *pathLocation = [[NSString alloc] initWithFormat:@"%@", assetURL];
            } else {
                NSLog(@"CameraViewController: Error on saving image : %@ {imagePickerController}", error);
            }
        }];
    }
    [imagePicker dismissModalViewControllerAnimated:YES];
}
Then, with that path, I get the picture from the library at full resolution (using JPEG compression quality 1):
-(void)preparePicture:(NSString *)filePathPicture {
    ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *myasset)
    {
        if (myasset != nil) {
            ALAssetRepresentation *assetRep = [myasset defaultRepresentation];
            CGImageRef imageRef = [assetRep fullResolutionImage];
            if (imageRef) {
                NSData *imageData = UIImageJPEGRepresentation([UIImage imageWithCGImage:imageRef], 1);
            }
        } else {
            // error
        }
    };
    ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *error)
    {
        NSString *errorString = [NSString stringWithFormat:@"can't get image, %@", [error localizedDescription]];
        NSLog(@"%@", errorString);
    };
    if (filePathPicture && [filePathPicture length])
    {
        NSURL *assetUrl = [NSURL URLWithString:filePathPicture];
        ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
        [assetslibrary assetForURL:assetUrl
                       resultBlock:resultBlock
                      failureBlock:failureBlock];
    }
}
Hope this helps you a bit further :-).
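For what it's worth, regarding the original question about the imageDataSampleBuffer: one way to avoid JPEG compression entirely is to ask AVCaptureStillImageOutput for 32BGRA output and read the pixel buffer directly. A rough sketch, assuming an already-configured AVCaptureSession with a still image output named stillImageOutput (the name is only illustrative):
// Request uncompressed 32BGRA frames instead of JPEG
NSDictionary *outputSettings = [NSDictionary dictionaryWithObject:
                                   [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                                                           forKey:(id)kCVPixelBufferPixelFormatTypeKey];
stillImageOutput.outputSettings = outputSettings;

AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (!imageDataSampleBuffer) {
            NSLog(@"capture failed: %@", error);
            return;
        }
        // The sample buffer wraps a CVPixelBuffer holding the raw BGRA pixels
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        NSData *rawPixels = [NSData dataWithBytes:CVPixelBufferGetBaseAddress(pixelBuffer)
                                           length:CVPixelBufferGetDataSize(pixelBuffer)];
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
        NSLog(@"got %lu bytes of raw BGRA data", (unsigned long)[rawPixels length]);
    }];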

MFMailComposeViewController: Attaching Images from Photo Gallery

I am having a bit of trouble attaching images from the Photo Gallery to an email.
Basically, one of the features of my application allows the user to take photos. When they snap the shot, I record the URL reference to the image in Core Data. I understand that you have to go through the ALAssetRepresentation to get to the image. I have this up and running within my application for when the user wants to review an image that they have taken.
I am now attempting to allow the user to attach all of the photos taken for an event to an email. While doing this, I iterate through the Core Data entity that stores the URL References, call a method that returns a UIImage from the ALAssetsLibrary and then attaches it using the NSData/UIImageJPEGRepresentation and MFMailComposeViewController/addAttachmentData methods.
The problem is: when the email is presented to the user, there are small blue squares representing the images, and the images are not actually attached.
Here is the code:
- (void)sendReportReport
{
    if ([MFMailComposeViewController canSendMail])
    {
        MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
        mailer.mailComposeDelegate = self;
        [mailer setSubject:@"Log: Report"];
        NSArray *toRecipients = [NSArray arrayWithObjects:@"someone@someco.com", nil];
        [mailer setToRecipients:toRecipients];
        NSError *error;
        NSFetchRequest *fetchPhotos = [[NSFetchRequest alloc] init];
        NSEntityDescription *entity = [NSEntityDescription entityForName:@"Photo" inManagedObjectContext:__managedObjectContext];
        [fetchPhotos setEntity:entity];
        NSArray *fetchedPhotos = [__managedObjectContext executeFetchRequest:fetchPhotos error:&error];
        int counter = 0;
        for (NSManagedObject *managedObject in fetchedPhotos) {
            Photo *photo = (Photo *)managedObject;
            // UIImage *myImage = [UIImage imageNamed:[NSString stringWithFormat:@"%@.png", counter++]];
            NSData *imageData = UIImageJPEGRepresentation([self getImage:photo.referenceURL], 0.5);
            // NSData *imageData = UIImagePNGRepresentation([self getImage:photo.referenceURL]);
            // [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"%i", counter]];
            [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
            counter++;
        }
        NSString *emailBody = [self getEmailBody];
        [mailer setMessageBody:emailBody isHTML:NO];
        [self presentModalViewController:mailer animated:YES];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failure"
                                                        message:@"Your device doesn't support the composer sheet"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }
}
and the method that returns the UIImage:
#pragma mark - Get Photo from Asset Library
+ (ALAssetsLibrary *)defaultAssetsLibrary {
    static dispatch_once_t pred = 0;
    static ALAssetsLibrary *library = nil;
    dispatch_once(&pred, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}
- (UIImage *)getImage:(NSString *)URLReference
{
    __block UIImage *xPhoto = nil;
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        UIImage *xImage;
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            xImage = [UIImage imageWithCGImage:iref];
        }
        xPhoto = xImage;
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:URLReference];
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
    return xPhoto;
}
NOTE: The above code does run, it simply does not attach the image. Also note that I am able to successfully attach images from the gallery within my application, as long as I have set them into a UIImageView.image already (basically, I take the pointer to the image from the UIImageView and pass it to the addAttachmentData method). It is just when I attempt to iterate through Core Data and attach without first setting the image into a UIImageView that I have the trouble.
Any tips would be greatly appreciated!
Thanks!
Jason
Oh, now I see it, sorry. You are using an asynchronous block to get the image from the asset library, but right after starting that operation you return xPhoto. The asynchronous operation only finishes later, so you are returning nil.
You need to change your architecture to something like this:
In your .h file you need two new members:
NSMutableArray* mArrayForImages;
NSInteger mUnfinishedRequests;
In your .m file, do something like this:
- (void)sendReportReport
{
    // get fetchedPhotos
    [...]
    // save image count and prepare the result array
    mUnfinishedRequests = [fetchedPhotos count];
    mArrayForImages = [[NSMutableArray alloc] init];
    // first step: load images
    for (NSManagedObject *managedObject in fetchedPhotos)
    {
        Photo *photo = (Photo *)managedObject;
        [self loadImage:photo.referenceURL];
    }
}
Change your getImage method to loadImage:
- (void)loadImage:(NSString *)URLReference
{
    NSURL *asseturl = [NSURL URLWithString:URLReference];
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            [mArrayForImages addObject:[UIImage imageWithCGImage:iref]];
        } else {
            // handle error
        }
        [self performSelectorOnMainThread:@selector(imageRequestFinished) withObject:nil waitUntilDone:NO];
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
        [self performSelectorOnMainThread:@selector(imageRequestFinished) withObject:nil waitUntilDone:NO];
    };
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
}
Create a new callback method:
- (void)imageRequestFinished
{
    mUnfinishedRequests--;
    if (mUnfinishedRequests <= 0)
    {
        [self sendMail];
    }
}
And an extra method for finally sending the mail, after getting the images:
- (void)sendMail
{
    // create mail composer etc.
    [...]
    // attach images
    for (UIImage *photo in mArrayForImages)
    {
        NSData *imageData = UIImageJPEGRepresentation(photo, 0.5);
        [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:@"a.jpg"];
    }
    // send mail
    [...]
}
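Since mailComposeDelegate is set to self, remember to also implement the MFMailComposeViewControllerDelegate callback so the composer is dismissed when the user finishes; a minimal sketch:
- (void)mailComposeController:(MFMailComposeViewController *)controller
          didFinishWithResult:(MFMailComposeResult)result
                        error:(NSError *)error
{
    if (error) {
        NSLog(@"Mail compose error: %@", [error localizedDescription]);
    }
    [self dismissModalViewControllerAnimated:YES];
}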

writeImageToSavedPhotosAlbum:metadata:completionBlock:

I use the method writeImageToSavedPhotosAlbum:metadata:completionBlock: to save a taken picture to the photo album. The code is:
-(void)savePhotoToAlbum {
    CGImageRef imageRef = [imageView image].CGImage;
    NSDictionary *currentDic = [self getLocation];
    NSDictionary *metadata = [NSDictionary dictionaryWithDictionary:currentDic];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeImageToSavedPhotosAlbum:imageRef metadata:metadata completionBlock:^(NSURL *assetURL, NSError *error){
        if (error == nil)
        {
            UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:nil message:@"Save success!" delegate:nil cancelButtonTitle:@"Okay" otherButtonTitles:nil];
            [alertView show];
            [alertView release];
        }
        else
        {
            UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:nil message:@"Save failure!" delegate:nil cancelButtonTitle:@"Okay" otherButtonTitles:nil];
            [alertView show];
            [alertView release];
        }
    }];
    [library release];
}
The getLocation method gets the user's current location, and the save succeeds. Then I want to pick the taken picture from the photo album using UIImagePickerController. The code is:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    if ([picker sourceType] == UIImagePickerControllerSourceTypeSavedPhotosAlbum) // picker image delegate
    {
        NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
        if ([mediaType isEqualToString:@"public.image"])
        {
            NSDictionary *metadata = [info objectForKey:UIImagePickerControllerMediaMetadata];
            NSLog(@"%@", metadata);
        }
    }
}
But the logged metadata is null. Why is that? And how do I get the metadata info which I saved? Thanks!
The metadata of the image will be available only if the sourceType is UIImagePickerControllerSourceTypeCamera.
See Ref. Look at the last paragraph on that page.
You can log the metadata with the AssetsLibrary framework:
-(void)imagePickerController:(UIImagePickerController *)picker
       didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    ...
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        NSURL *url = [info objectForKey:UIImagePickerControllerReferenceURL];
        if (url) {
            ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset) {
                CLLocation *location = [myasset valueForProperty:ALAssetPropertyLocation];
                NSLog(@"\n\n\n____________________________\n");
                NSLog(@"ORIENTATION: %@\n", [myasset valueForProperty:ALAssetPropertyOrientation]);
                NSLog(@"LOCATION: %@\n", [myasset valueForProperty:ALAssetPropertyLocation]);
                NSLog(@"DATE: %@\n", [myasset valueForProperty:ALAssetPropertyDate]);
                NSLog(@"Duration: %@\n", [myasset valueForProperty:ALAssetPropertyDuration]);
                NSLog(@"TYPE: %@\n", [myasset valueForProperty:ALAssetPropertyType]);
                NSLog(@"\n____________________________\n\n\n");
                // take coordinates only
                CLLocationCoordinate2D coordinate = [location coordinate];
                strCoord = [NSString stringWithFormat:@"lat: %f; long: %f;", coordinate.latitude, coordinate.longitude];
                NSLog(@"%@", strCoord);
                // location contains lat/long, timestamp, etc.
                // extracting the image is more tricky and 5.x beta ALAssetRepresentation has bugs!
            };
            ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror) {
                NSLog(@"cant get image - %@", [myerror localizedDescription]);
            };
            ALAssetsLibrary *assetsLib = [[ALAssetsLibrary alloc] init];
            [assetsLib assetForURL:url resultBlock:resultblock failureBlock:failureblock];
        }
    }
    ...
}
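If you also need the full EXIF/GPS dictionary rather than the individual asset properties, the asset's default representation exposes it as well; a small sketch that could sit inside the same resultblock above:
ALAssetRepresentation *rep = [myasset defaultRepresentation];
NSDictionary *fullMetadata = [rep metadata]; // contains the EXIF, TIFF and GPS sub-dictionaries
NSLog(@"full metadata: %@", fullMetadata);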

iPhone capture image crashes app when saving to Photos Album

In my app I make a screen capture:
UIImage *viewImage = [UIImage imageWithCGImage:UIGetScreenImage()];
CGRect cropRect = CGRectMake(0, 0, 320, 440);
CGImageRef imageRef = CGImageCreateWithImageInRect([viewImage CGImage], cropRect);
viewImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
captureImage = [[UIImage alloc] init];
captureImage = viewImage;
Then I want to save it to the Photos album:
UIImageWriteToSavedPhotosAlbum(anImage, self, @selector(savedPhotoImage:didFinishSavingWithError:contextInfo:), nil);
- (void)savedPhotoImage:(UIImage *)image
    didFinishSavingWithError:(NSError *)error
    contextInfo:(void *)contextInfo
{
    NSString *themessage = @"This image has been saved to your Photos album. Tap to continue.";
    NSString *errorTitle = @"Image saved.";
    if (error) {
        themessage = [error localizedDescription];
        errorTitle = @"error";
    }
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:errorTitle
                                                    message:themessage
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
    [alert release];
}
The app crashes when saving to the Photos album.
The captured image is OK; I can successfully upload and display it.
I also tried loading an image from memory and saving it to the Photos album, and that also worked.
I guess that I'm doing something wrong when I'm processing my image..
Any ideas?
Thanks!
As this question is quite old, I would like to show a more modern way to do what you wanted.
This code can be used to capture the screen:
- (UIImage *)getScreenShot
{
    UIWindow *keyWindow = [[UIApplication sharedApplication] keyWindow];
    CGRect rect = [keyWindow bounds];
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [keyWindow.layer renderInContext:context];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
And this code to save the image:
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^
{
    PHAssetChangeRequest *changeRequest = [PHAssetChangeRequest creationRequestForAssetFromImage:screenShot];
}
completionHandler:^(BOOL success, NSError *error)
{
    if (success)
    {
        NSLog(@"successfully saved");
    }
    else
    {
        NSLog(@"error saving to photos: %@", error);
    }
}];
Can you check the permission? If it is disabled, no alert will be displayed the next time.
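A minimal sketch of such a check with the Photos framework used above; requestAuthorization: shows the system prompt only the first time, after that the user has to change the setting in the Settings app:
PHAuthorizationStatus status = [PHPhotoLibrary authorizationStatus];
if (status == PHAuthorizationStatusNotDetermined) {
    [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus newStatus) {
        if (newStatus == PHAuthorizationStatusAuthorized) {
            // safe to call performChanges:completionHandler: now
        }
    }];
} else if (status != PHAuthorizationStatusAuthorized) {
    // access was denied or restricted; direct the user to Settings
    NSLog(@"Photo library access is not authorized");
}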