I've been searching this forum up and down, but I couldn't find what I really need: I want to get the raw image data from the camera. So far I've tried to get the data out of the imageDataSampleBuffer in captureStillImageAsynchronouslyFromConnection:completionHandler: and write it to an NSData object, but that didn't work.
Maybe I'm on the wrong track or maybe I'm just doing it wrong.
What I don't want is for the image to be compressed in any way.
The easy way is to use jpegStillImageNSDataRepresentation: from AVCaptureStillImageOutput, but like I said I don't want it to be compressed.
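What I'm after is roughly this (a sketch of my own, untested; stillImageOutput and connection are my existing session objects, and it assumes the AVCaptureStillImageOutput is configured for uncompressed BGRA output via kCVPixelFormatType_32BGRA in its outputSettings):
// Sketch: assumes stillImageOutput.outputSettings requests BGRA, e.g.
// [NSDictionary dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
//                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                               completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t length = CVPixelBufferGetBytesPerRow(pixelBuffer) * CVPixelBufferGetHeight(pixelBuffer);

    // raw, uncompressed BGRA pixel bytes
    NSData *rawPixels = [NSData dataWithBytes:base length:length];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    NSLog(@"got %lu raw bytes", (unsigned long)[rawPixels length]);
}];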
Thanks!
This is how I do it:
1: I first open the camera using:
- (void)openCamera
{
    imagePicker = [[UIImagePickerController alloc] init];
    imagePicker.delegate = self;
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        imagePicker.mediaTypes = [NSArray arrayWithObjects:
                                  (NSString *)kUTTypeImage,
                                  (NSString *)kUTTypeMovie, nil];
        imagePicker.allowsEditing = NO;
        [self presentModalViewController:imagePicker animated:YES];
    }
    else {
        lblError.text = NSLocalizedStringFromTable(@"noCameraFound", @"Errors", @"");
    }
}
When the picture is taken this method gets called:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    // Save and get the path of the image
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([mediaType isEqualToString:(NSString *)kUTTypeImage])
    {
        // Save the image
        image = [info objectForKey:UIImagePickerControllerOriginalImage];
        [library writeImageToSavedPhotosAlbum:[image CGImage]
                                  orientation:(ALAssetOrientation)[image imageOrientation]
                              completionBlock:^(NSURL *assetURL, NSError *error) {
            if (!error) {
                // Save path and location to database
                NSString *pathLocation = [[NSString alloc] initWithFormat:@"%@", assetURL];
            } else {
                NSLog(@"CameraViewController: Error on saving image : %@ {imagePickerController}", error);
            }
        }];
    }
    [imagePicker dismissModalViewControllerAnimated:YES];
}
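That pathLocation from the completion block is what I later pass on to the method below, roughly like this (the database round trip in between is left out):
[self preparePicture:pathLocation];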
Then with that path I get the picture from the library in full resolution (fullResolutionImage) and turn it into NSData at maximum JPEG quality (the 1):
- (void)preparePicture:(NSString *)filePathPicture
{
    ALAssetsLibraryAssetForURLResultBlock resultBlock = ^(ALAsset *myasset)
    {
        if (myasset != nil) {
            ALAssetRepresentation *assetRep = [myasset defaultRepresentation];
            CGImageRef imageRef = [assetRep fullResolutionImage];
            if (imageRef) {
                NSData *imageData = UIImageJPEGRepresentation([UIImage imageWithCGImage:imageRef], 1);
            }
        } else {
            // error
        }
    };

    ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *error)
    {
        NSString *errorString = [NSString stringWithFormat:@"can't get image, %@", [error localizedDescription]];
        NSLog(@"%@", errorString);
    };

    if (filePathPicture && [filePathPicture length])
    {
        NSURL *assetUrl = [NSURL URLWithString:filePathPicture];
        ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
        [assetslibrary assetForURL:assetUrl
                       resultBlock:resultBlock
                      failureBlock:failureBlock];
    }
}
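One caveat if the goal really is uncompressed data: UIImageJPEGRepresentation re-encodes the image as JPEG. ALAssetRepresentation can also hand you the asset's original file bytes without any re-encoding; a rough, untested sketch that would go inside the same resultBlock:
long long size = [assetRep size];
uint8_t *buffer = (uint8_t *)malloc((size_t)size);
NSError *readError = nil;
NSUInteger bytesRead = [assetRep getBytes:buffer fromOffset:0 length:(NSUInteger)size error:&readError];
// rawData now holds the bytes of the original file, with no extra compression step
NSData *rawData = [NSData dataWithBytesNoCopy:buffer length:bytesRead freeWhenDone:YES];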
Hope this helps you a bit further :-).
I have a URL for an image picked using ELCImagePickerController, and I stored that URL for future reference.
I get that URL using:
[dict valueForKey:UIImagePickerControllerReferenceURL];
The problem arises when, some time later, the user deletes that particular image from the photo library and I try to access it through the stored URL.
My app does not crash.
I have tried the NSURL method
[imagePath checkResourceIsReachableAndReturnError:&err]
as well as something like:
- (BOOL)findImage:(NSURL *)path
{
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    __block BOOL flag = YES;
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            flag = YES;
            dispatch_group_leave(group);
        }
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"cant get image - %@", [myerror localizedDescription]);
        flag = NO;
    };
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:path resultBlock:resultblock failureBlock:failureblock];
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    dispatch_release(group);
    [assetslibrary release];
    return flag;
}
A sample URL:
assets-library://asset/asset.JPG?id=E862927E-E646-448A-9EB6-A7D48668B3DC&ext=JPG
But no success. How can I tell whether an image is still present at a particular URL?
Any help would be appreciated. Thanks in advance.
For this case you need to check whether ALAssetRepresentation *rep = [myasset defaultRepresentation] is nil:
if (rep != nil) {
    // write your code..
}
Solved the problem with a change in the findImage method:
- (BOOL)findImage:(NSURL *)path
{
    dispatch_group_t group = dispatch_group_create();
    dispatch_group_enter(group);
    __block BOOL flag = YES;
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:path resultBlock:^(ALAsset *asset) {
        if (asset == nil)
        {
            flag = NO;
        }
        else
        {
            flag = YES;
        }
        dispatch_group_leave(group);
    } failureBlock:^(NSError *error) {
        NSLog(@"operation was not successful!");
        dispatch_group_leave(group);
    }];
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    dispatch_release(group);
    [assetslibrary release];
    return flag;
}
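A quick usage sketch (savedReferenceURLString stands for whatever string you stored from UIImagePickerControllerReferenceURL; the name is just for illustration):
NSURL *assetURL = [NSURL URLWithString:savedReferenceURLString];
if ([self findImage:assetURL]) {
    // the asset still exists in the library
} else {
    // the user has deleted it, so fall back to a placeholder
}
One caveat: since dispatch_group_wait blocks the calling thread until the library's blocks fire, it seems safer to call findImage: from a background queue rather than the main thread.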
I am having a bit of trouble attaching images from the Photo Gallery to an email.
Basically, one of the features of my application allows the user to take photos. When they snap the shot, I record the URL reference to the image in Core Data. I understand that you have to go through the ALAssetRepresentation to get to the image, and I have this up and running within my application for when the user wants to review an image they have taken.
I am now attempting to let the user attach all of the photos taken for an event to an email. To do this, I iterate through the Core Data entity that stores the URL references, call a method that returns a UIImage from the ALAssetsLibrary, and then attach it using NSData/UIImageJPEGRepresentation and MFMailComposeViewController's addAttachmentData: method.
Problem is: When the email is presented to the user, there are small blue squares representing the images and the image is not attached.
Here is the code:
- (void)sendReportReport
{
    if ([MFMailComposeViewController canSendMail])
    {
        MFMailComposeViewController *mailer = [[MFMailComposeViewController alloc] init];
        mailer.mailComposeDelegate = self;
        [mailer setSubject:@"Log: Report"];
        NSArray *toRecipients = [NSArray arrayWithObjects:@"someone@someco.com", nil];
        [mailer setToRecipients:toRecipients];

        NSError *error;
        NSFetchRequest *fetchPhotos = [[NSFetchRequest alloc] init];
        NSEntityDescription *entity = [NSEntityDescription
            entityForName:@"Photo" inManagedObjectContext:__managedObjectContext];
        [fetchPhotos setEntity:entity];
        NSArray *fetchedPhotos = [__managedObjectContext executeFetchRequest:fetchPhotos error:&error];

        int counter = 0;
        for (NSManagedObject *managedObject in fetchedPhotos) {
            Photo *photo = (Photo *)managedObject;
            // UIImage *myImage = [UIImage imageNamed:[NSString stringWithFormat:@"%d.png", counter++]];
            NSData *imageData = UIImageJPEGRepresentation([self getImage:photo.referenceURL], 0.5);
            // NSData *imageData = UIImagePNGRepresentation([self getImage:photo.referenceURL]);
            // [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"%i", counter]];
            [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
            counter++;
        }

        NSString *emailBody = [self getEmailBody];
        [mailer setMessageBody:emailBody isHTML:NO];
        [self presentModalViewController:mailer animated:YES];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Failure"
                                                        message:@"Your device doesn't support the composer sheet"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
    }
}
and the method that returns the UIImage:
#pragma mark - Get Photo from Asset Library

+ (ALAssetsLibrary *)defaultAssetsLibrary {
    static dispatch_once_t pred = 0;
    static ALAssetsLibrary *library = nil;
    dispatch_once(&pred, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}

- (UIImage *)getImage:(NSString *)URLReference
{
    __block UIImage *xPhoto = nil;
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        UIImage *xImage;
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            xImage = [UIImage imageWithCGImage:iref];
        }
        xPhoto = xImage;
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
    };
    NSURL *asseturl = [NSURL URLWithString:URLReference];
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
    return xPhoto;
}
NOTE: The above code does run; it simply does not attach the image. Also note that I am able to successfully attach images from the gallery within my application, as long as I have already set them into a UIImageView's image (basically, I take the pointer to the image from the UIImageView and pass it to addAttachmentData:). It is only when I attempt to iterate through Core Data and attach without first setting the image into a UIImageView that I have trouble.
Any tips would be greatly appreciated!
Thanks!
Jason
Oh, now I see it, sorry. You are using an asynchronous block to get the image from the asset library, but right after starting that operation the method returns xPhoto, which the block hasn't filled in yet. The asynchronous operation only finishes later, so you end up returning nil.
You need to change your architecture to something like this:
In your .h file you need two new members:
NSMutableArray* mArrayForImages;
NSInteger mUnfinishedRequests;
In your .m file, do something like this:
- (void)sendReportReport
{
    // get fetchedPhotos
    [...]

    // save image count
    mUnfinishedRequests = [fetchedPhotos count];

    // first step: load images
    for (NSManagedObject *managedObject in fetchedPhotos)
    {
        Photo *photo = (Photo *)managedObject;
        [self loadImage:photo.referenceURL];
    }
}
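One detail I'm leaving to you: mArrayForImages has to be created before the loop starts the requests, e.g. at the top of sendReportReport:
mArrayForImages = [[NSMutableArray alloc] init]; // release it again once the mail has been sent (manual reference counting)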
Change your getImage method to loadImage:
- (void)loadImage:(NSString *)URLReference
{
    NSURL *asseturl = [NSURL URLWithString:URLReference];
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        // get the image
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullScreenImage];
        if (iref) {
            [mArrayForImages addObject:[UIImage imageWithCGImage:iref]];
        } else {
            // handle error
        }
        [self performSelectorOnMainThread:@selector(imageRequestFinished) withObject:nil waitUntilDone:NO];
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error fetching photo: %@", [myerror localizedDescription]);
        [self performSelectorOnMainThread:@selector(imageRequestFinished) withObject:nil waitUntilDone:NO];
    };
    // create library and set callbacks
    ALAssetsLibrary *al = [DetailsViewController defaultAssetsLibrary];
    [al assetForURL:asseturl
        resultBlock:resultblock
       failureBlock:failureblock];
}
Create a new callback method:
- (void)imageRequestFinished
{
    mUnfinishedRequests--;
    if (mUnfinishedRequests <= 0)
    {
        [self sendMail];
    }
}
And an extra method for finally sending the mail, after getting the images:
- (void)sendMail
{
    // create mail composer etc.
    [...]

    // attach images
    for (UIImage *photo in mArrayForImages)
    {
        NSData *imageData = UIImageJPEGRepresentation(photo, 0.5);
        [mailer addAttachmentData:imageData mimeType:@"image/jpeg" fileName:[NSString stringWithFormat:@"a.jpg"]];
    }

    // send mail
    [...]
}
First of all, sorry for the poor formatting; I don't know how to format code here. I am having a problem retrieving an image from the photo library. I have pasted my code below; please help me. I have been having this problem for many days.
Here, I store the image path in the database using the reference URL. This is in:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *imageUrl = [info valueForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        NSLog(@"asset");
        CGImageRef iref = [myasset thumbnail];
        if (iref)
        {
            NSLog(@"ifref");
            UIImage *thethumbnail = [UIImage imageWithCGImage:iref];
            NSLog(@"the thumbnail %@", thethumbnail);
            [[self photo] setImage:thethumbnail];
            lpdel.imageurl = UIImageJPEGRepresentation(thethumbnail, 1);
        }
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"failed");
        NSLog(@"cant get image -- %@", [myerror localizedDescription]);
    };
    if (imageUrl) {
        NSLog(@"if url");
        [assetLibrary assetForURL:imageUrl
                      resultBlock:resultblock
                     failureBlock:failureblock];
        //lpdel.imageurl = imageUrl;
        NSLog(@"the image string %@", lpdel.imageurl);
        NSLog(@"lpdel.image %@", lpdel.imageurl);
        [picker dismissModalViewControllerAnimated:YES];
        [imageUrl release];
    }
    [picker dismissModalViewControllerAnimated:YES];
    [picker release];
}
I am storing the image URL in the database. Then, in another view controller, I try to retrieve the image using the same URL with the code below:
ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    NSLog(@"asset");
    CGImageRef iref = [myasset thumbnail];
    if (iref)
    {
        NSLog(@"ifref");
        thethumbnail = [UIImage imageWithCGImage:iref];
        NSLog(@"the thumbnail to upload %@", thethumbnail);
        [self uploadImage:UIImageJPEGRepresentation(thethumbnail, 1)];
        //[[self photo] setImage:thethumbnail];
    }
};
ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
{
    NSLog(@"failed");
    NSLog(@"cant get image -- %@", [myerror localizedDescription]);
};
if (image)
{
    NSLog(@"if url");
    ALAssetsLibrary *assetLibrary = [[[ALAssetsLibrary alloc] init] autorelease];
    NSLog(@"url");
    [assetLibrary assetForURL:image
                  resultBlock:resultblock
                 failureBlock:failureblock];
}
NSLog(@"the thumbnail %@", thethumbnail);
The application crashes at this point without any error message. Can anyone please help me with this?
I found the solution: it was a problem with the last NSLog. But I need to know why. The variable thethumbnail is declared with class scope; does this have anything to do with blocks?
I'm trying to create thumbnails (288x288) of selected photos from the iPad photo library. I have an array of ALAsset objects presented in a UITableView, and as I select a row, a larger preview (288x288) of that image is displayed. To prevent blocking the main thread, I'm trying to create the thumbnail on a background thread and also cache a copy of the thumbnail to the file system.
In a view controller, when a table view row is selected, I call loadPreviewImage in the background:
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(NSIndexPath *)indexPath
{
    // get the upload object from an array that contains a ALAsset object
    upload = [uploads objectAtIndex:[indexPath row]];
    [self performSelectorInBackground:@selector(loadPreviewImage:)
                           withObject:upload];
}
I pass a custom upload object that contains asseturl property:
- (void)loadPreviewImage:(MyUploadClass *)upload
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    UIImage *preview = [upload previewImage];
    [self performSelectorOnMainThread:@selector(setPreviewImage:)
                           withObject:preview
                        waitUntilDone:YES];
    [pool release];
}
This is called on main thread to display the thumbnail after it's loaded:
- (void)setPreviewImage:(UIImage *)image
{
    self.imageViewPreview.image = image;
    [self layoutSubviews];
}
This is a method of MyUploadClass:
- (UIImage *)previewImage
{
    __block UIImage *previewImage = [[UIImage imageWithContentsOfFile:
                                      [self uploadPreviewFilePath]] retain];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            ALAssetRepresentation *rep = [asset defaultRepresentation];
            previewImage = [UIImage imageWithCGImage:[rep fullScreenImage]];
            previewImage = [[previewImage resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                               bounds:CGSizeMake(288, 288)
                                                 interpolationQuality:kCGInterpolationHigh] retain];
            NSData *previewData = UIImageJPEGRepresentation(previewImage, 1.0);
            [previewData writeToFile:[self uploadPreviewFilePath] atomically:YES];
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    return [previewImage autorelease];
}
The problem is that I always get a nil previewImage the first time; only after the thumbnail has been cached do I get an image object. What am I doing wrong? Is there a better approach to this problem?
I didn't clearly understand how the resultBlock of ALAssetsLibrary operates; my mistake was to think that execution is linear. It turns out that in my case the resultBlock executes on the main thread, while the rest of the code in previewImage executes on a background thread. I was getting nil because previewImage returned before the resultBlock had a chance to finish executing. I solved the problem by replacing previewImage with the following method:
- (void)loadPreviewImage:(CGSize)size withTarget:(id)target andCallback:(SEL)callback
{
    NSString *path = [self uploadPreviewFilePath];
    UIImage *previewImage = [UIImage imageWithContentsOfFile:path];
    if (previewImage == nil && asseturl)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library assetForURL:self.asseturl resultBlock:^(ALAsset *asset)
        {
            if (asset) {
                ALAssetRepresentation *rep = [asset defaultRepresentation];
                UIImage *img = [UIImage imageWithCGImage:[rep fullScreenImage]];
                img = [img resizedImageWithContentMode:UIViewContentModeScaleAspectFit
                                                bounds:size
                                  interpolationQuality:kCGInterpolationHigh];
                NSData *previewData = UIImageJPEGRepresentation(img, 1.0);
                [previewData writeToFile:path atomically:YES];
                [target performSelectorOnMainThread:callback
                                         withObject:img
                                      waitUntilDone:YES];
            }
        }
        failureBlock:^(NSError *error) { }];
        [library release];
    }
    else {
        [target performSelectorOnMainThread:callback withObject:previewImage waitUntilDone:YES];
    }
}
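Called, for example, like this (a sketch; the selector matches the setPreviewImage: method shown in the question):
[upload loadPreviewImage:CGSizeMake(288, 288)
              withTarget:self
             andCallback:@selector(setPreviewImage:)];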
Please, can anyone provide sample code to save a video after recording?
I am able to record video using UIImagePickerController:
if (canShootVideo) {
    UIImagePickerController *videoRecorder = [[UIImagePickerController alloc] init];
    videoRecorder.sourceType = UIImagePickerControllerSourceTypeCamera;
    videoRecorder.delegate = self;
    NSArray *mediaTypes = [UIImagePickerController availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
    NSArray *videoMediaTypesOnly = [mediaTypes filteredArrayUsingPredicate:[NSPredicate predicateWithFormat:@"(SELF contains %@)", @"movie"]];
    BOOL movieOutputPossible = (videoMediaTypesOnly != nil);
    if (movieOutputPossible) {
        videoRecorder.mediaTypes = videoMediaTypesOnly;
        [self presentModalViewController:videoRecorder animated:YES];
    }
    [videoRecorder release];
}
But how do I save it? Can anyone please tell me?
Thanks in advance.
- (void)saveVideo:(NSURL *)videoUrl {
    NSData *videoData = [NSData dataWithContentsOfURL:videoUrl];
    [videoData writeToFile:@"YOUR_PATH_HERE" atomically:YES];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *type = [info objectForKey:UIImagePickerControllerMediaType];
    if ([type isEqualToString:(NSString *)kUTTypeVideo] ||
        [type isEqualToString:(NSString *)kUTTypeMovie]) { // movie != video
        NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
        [self saveVideo:videoURL];
    }
}
This is for the case where you want to save the video to a file path that you control. You can save into the Saved Photos album as well.
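For the Saved Photos route, something along these lines should work (a rough sketch; videoURL is the UIImagePickerControllerMediaURL value from the delegate method above, and the callback selector is one you would add yourself with the documented signature):
NSString *videoPath = [videoURL path];
if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(videoPath)) {
    // The callback must have the signature
    // - (void)video:(NSString *)path didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
    UISaveVideoAtPathToSavedPhotosAlbum(videoPath, self,
        @selector(video:didFinishSavingWithError:contextInfo:), NULL);
}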