- (void)thumbnail:(NSNumber *)index {
    __block NSNumber *number = [NSNumber numberWithInt:[index intValue]];
    ALAssetsLibrary *library = [ALAssetsLibrary sharedALAssetsLibrary];
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        CGImageRef iref = [myasset thumbnail];
        if (iref) {
            [delegate thumbnailDidLoad:[UIImage imageWithCGImage:iref] withIndex:number];
        }
        NSLog(@"RESSSSSSSSSSSSSSSSSSSSSSSSSSSSSULT");
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Error, can't get image - %@", [myerror localizedDescription]);
    };
    NSString *mediaurl = @"assets-library://asset/asset.JPG?id=5AF4118C-947D-4097-910E-47E19553039C&ext=JPG";
    NSURL *asseturl = [NSURL URLWithString:mediaurl];
    [library assetForURL:asseturl resultBlock:resultblock failureBlock:failureblock];
    NSLog(@"asseturl %@", asseturl);
}
Here is my code. I have an issue with my blocks: they work under the iOS 5.0 simulator, but on the device they don't work at all. Execution never stops on breakpoints inside them and the NSLogs never fire. On the simulator everything works correctly. Note: CLAuthorizationStatus == kCLAuthorizationStatusAuthorized.
Make sure that this whole method, - (void)thumbnail:(NSNumber *)index..., is either executed from the main thread, or that you are sure the user has already authorized your app to use location services. If you call it in the background and you don't yet have authorization, the user will never be prompted for approval, and neither the result block nor the failure block will be called.
As of iOS 5, assetForURL: works asynchronously. Make sure you call
[delegate thumbnailDidLoad:[UIImage imageWithCGImage:iref] withIndex:number];
on the main thread. This is most easily accomplished by using dispatch_async on the main queue, as sketched below.
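For example, a minimal sketch of the result block with the dispatch added (delegate, thumbnailDidLoad:withIndex:, and number are the names from the question; the rest is standard GCD):

ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
{
    CGImageRef iref = [myasset thumbnail];
    if (iref) {
        UIImage *thumb = [UIImage imageWithCGImage:iref];
        // Hop back onto the main queue before touching the delegate/UI.
        dispatch_async(dispatch_get_main_queue(), ^{
            [delegate thumbnailDidLoad:thumb withIndex:number];
        });
    }
};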
Cheers,
Hendrik
Related
In my app (which worked under iOS 4) I collect pictures selected via UIImagePickerController. Unfortunately, I have run into a strange problem after upgrading to iOS 5.
In a nutshell, I store ALAssetRepresentation objects in an NSMutableArray. When I add photos from the Library, everything is fine. However, when I capture and save a new picture, all the ALAssetRepresentations (including the new one) become 0-sized: ALAssetRepresentation.size and getBytes:fromOffset:length:error: return 0, and the out-error is nil.
I init the ALAssetsLibrary in my AppDelegate, so the documented condition “The lifetimes of objects you get back from a library instance are tied to the lifetime of the library instance.” is satisfied.
Is there a way to prevent the ALAssetRepresentation objects from zeroing out? Or how can I read the image bytes after this happens?
My code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    if ([picker sourceType] == UIImagePickerControllerSourceTypePhotoLibrary) {
        [self addPhoto:[info valueForKey:UIImagePickerControllerReferenceURL]];
    }
    else if ([picker sourceType] == UIImagePickerControllerSourceTypeCamera) {
        [self savePhoto:[info valueForKey:UIImagePickerControllerOriginalImage]];
    }
    [self dismissModalViewControllerAnimated:YES];
}
- (ALAssetsLibrary *)getLibrary {
    if (!library) {
        testAppDelegate *appDelegate = (testAppDelegate *)[[UIApplication sharedApplication] delegate];
        library = appDelegate.library;
    }
    NSLog(@"getLibrary: %@", library);
    return library;
}
- (void)addPhoto:(NSURL *)url {
    ALAssetsLibraryAssetForURLResultBlock successBlock = ^(ALAsset *asset_){
        ALAssetRepresentation *assetRepresentation = [[asset_ defaultRepresentation] retain];
        [photos addObject:assetRepresentation];
    };
    ALAssetsLibraryAccessFailureBlock failureBlock = ^(NSError *error){
        NSLog(@"Error: Cannot get image. %@", [error localizedDescription]);
    };
    [[self getLibrary] assetForURL:url resultBlock:successBlock failureBlock:failureBlock];
}
- (void)savePhoto:(UIImage *)image {
    [[self getLibrary] writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error: Cannot save image. %@", [error localizedDescription]);
        } else {
            NSLog(@"photo saved");
            [self addPhoto:assetURL];
        }
    }];
}
I solved it!
ALAssetsLibraryChangedNotification
Sent when the contents of the assets library have changed from under the app that is using the data.
When you receive this notification, you should discard any cached information and query the assets library again. You should consider invalid any ALAsset, ALAssetsGroup, or ALAssetRepresentation objects you are referencing after finishing processing the notification.
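For example, a minimal sketch of reacting to that notification (refreshAssets is a hypothetical method of your own that re-runs the library queries; photos is the array from the question):

// Subscribe once, e.g. in init or viewDidLoad:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(assetsLibraryDidChange:)
                                             name:ALAssetsLibraryChangedNotification
                                           object:nil];

- (void)assetsLibraryDidChange:(NSNotification *)notification {
    // Every cached ALAsset / ALAssetRepresentation is now invalid.
    [photos removeAllObjects];
    [self refreshAssets];   // hypothetical: re-query the assets library
}

// Remember to call removeObserver: in dealloc.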
Have you tried retaining the ALAssets instead of the ALAssetRepresentation? That should work; I have used this approach before.
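For instance, a minimal sketch of that suggestion (MRC-era, reusing the photos array and success block from the question):

ALAssetsLibraryAssetForURLResultBlock successBlock = ^(ALAsset *asset_){
    // Keep the ALAsset itself; the array retains it for us.
    [photos addObject:asset_];
};

// Later, derive a representation only at the moment you need the bytes:
ALAsset *asset = [photos objectAtIndex:0];
ALAssetRepresentation *rep = [asset defaultRepresentation];
long long size = [rep size];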
In my app I have used UIImagePickerController for taking video. In the middle of recording, my app may receive a web service response, at which point I have to stop the video capture and save the video.
I have used stopVideoCapture to do this, but it doesn't call the delegate method
`- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info`
How can I force that delegate method to be called? Or how do I handle interruptions in `UIImagePickerController`? Any idea?
The idea with delegate methods is that you don't call those methods - "they call you".
So I would not consider calling the delegate method yourself good practice. However, if you present the UIImagePickerController modally (which I guess is common for such a picker), then you can close it like this outside of your delegate method:
[[yourPicker parentViewController] dismissModalViewControllerAnimated:YES];
Update: You can use the ALAssetsLibrary for accessing the data stored in your iPhone's media library. I recently had a similar project where I had to list all images on the iPhone. The GitHub project ELCImagePickerController.git was very useful, since it shows how the items in your library can be accessed. So you'll do something like this:
#import <AssetsLibrary/AssetsLibrary.h>
// ....
- (void)fetchPhotoAlbums {
    if (!self.assetsGroups) {
        self.assetsGroups = [NSMutableDictionary dictionary];
    }
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    NSMutableArray *returnArray = [[NSMutableArray alloc] init];
    @autoreleasepool {
        void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop){
            if (group == nil) {
                // Enumeration completed
                [self.delegate pictureService:self fetchedAlbums:returnArray];
                return;
            }
            Album *currentAlbum = [self albumForAssetsGroup:group];
            // Store the group for later retrieval of the pictures for the album
            [self.assetsGroups setObject:group forKey:currentAlbum.identifier];
            [returnArray addObject:currentAlbum];
            [self.delegate pictureService:self fetchedAlbums:returnArray];
        };
        void (^assetGroupEnumeratorFailure)(NSError *) = ^(NSError *error) {
            NSLog(@"A problem occurred %@", [error description]);
        };
        [library enumerateGroupsWithTypes:ALAssetsGroupAll
                               usingBlock:assetGroupEnumerator
                             failureBlock:assetGroupEnumeratorFailure];
    }
}
- (void)fetchPhotosForAlbum:(Album *)album {
    ALAssetsGroup *currentGroup = [self.assetsGroups objectForKey:album.identifier];
    NSMutableArray *photos = [NSMutableArray array];
    [currentGroup enumerateAssetsUsingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop){
        if (asset == nil) {
            [self.delegate pictureService:self fetchedPictures:photos forAlbum:album];
            return;
        }
        [photos addObject:[self pictureForAsset:asset]];
    }];
}
Additionally I use two mapper methods to convert the AL-classes into my own model classes.
- (Album *)albumForAssetsGroup:(ALAssetsGroup *)assetsGroup {
    Album *album = [[Album alloc] init];
    album.title = [assetsGroup valueForProperty:ALAssetsGroupPropertyName];
    album.identifier = [assetsGroup valueForProperty:ALAssetsGroupPropertyPersistentID];
    album.assetsCount = assetsGroup.numberOfAssets;
    album.thumbnail = [UIImage imageWithCGImage:assetsGroup.posterImage];
    return album;
}

- (Picture *)pictureForAsset:(ALAsset *)asset {
    Picture *picture = [[Picture alloc] init];
    picture.identifier = [((NSArray *)[asset valueForProperty:ALAssetPropertyRepresentations]) objectAtIndex:0];
    picture.thumbnail = [UIImage imageWithCGImage:asset.thumbnail];
    return picture;
}
See the AssetsLibrary Documentation
I'm currently sending an IPA to friends for testing. Funny thing is, one of my testers was able to view the photos stored on her phone, an iPhone 4 running iOS 5.
The other two testers, one with an iPhone 4 (iOS 4.3.3) and one with an iPhone 3GS (iOS 5.0.1), can't see the photos stored on their phones.
This is the code I have used:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

void (^assetEnumerator)(ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *result, NSUInteger index, BOOL *stop) {
    if (result != NULL) {
        //NSLog(@"See Asset: %@", @"ggg");
        [assets addObject:result];
    }
};

NSLog(@"location = %i length = %i ", range->location, range->length);

void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop) {
    if (group != nil) {
        NSRange *datarange = malloc(sizeof(NSRange));
        range->total = [group numberOfAssets];
        datarange->location = [group numberOfAssets] - range->location - range->length;
        datarange->length = range->length;
        NSLog(@" total = %i", range->total);
        int location = [group numberOfAssets] - range->location - range->length;
        if (location < 0)
        {
            datarange->location = 0;
            datarange->length = [group numberOfAssets] - range->location;
        }
        NSIndexSet *indexset = [[NSIndexSet alloc] initWithIndexesInRange:*datarange];
        [group enumerateAssetsAtIndexes:indexset options:0
                             usingBlock:assetEnumerator];
        [indexset release];
        free(datarange);
        [self loadAssetToScrollView:assets];
    }
};

[assets release];
assets = [[NSMutableArray alloc] init];
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:assetGroupEnumerator
                     failureBlock:^(NSError *error) {
                         NSLog(@"Failure");
                     }];
[library release];
I saw somebody mention an asynchronous issue in some other threads, but I don't know if that is the case here. He said to put dispatch_async in the enumerate-group block.
Does anyone know what is wrong?
Additionally, one tester with iOS 4.3.3 can see his photos after enabling location services under General in Settings. Why do we have to enable it? Can we enable it in code, since it will be quite disturbing to users of our application?
Also, on iOS 5.x you must keep your ALAssetsLibrary instance alive for as long as you need to work with the collected assets. When you release your ALAssetsLibrary instance right after calling [library enumerateGroupsWithTypes:…], as in your code, all the collected assets become invalid.
See also the ALAssetsLibrary documentation overview:
"… The lifetimes of objects you get back from a library instance are tied to the lifetime of the library instance. …"
Yes, it is incredibly frustrating, but that is how it is, and you cannot enable location services in code (that is a good thing though).
I would move the first block, ^assetGroupEnumerator, to the heap via [[<#block#> copy] autorelease]. Why? Because this block may otherwise be released by the run loop before the work finishes, if there are many assets that need to be enumerated through.
One more thing: don't use [self loadAssetToScrollView:assets]; inside the block, but take a weak reference to self before the block, like this:
__block YourExampleClassInstance *weakSelf = self;
and further use this weakSelf instance inside the block:
[weakSelf loadAssetToScrollView:assets];
void (^assetGroupEnumerator)… = ^(ALAssetsGroup *group, BOOL *stop) {
…
};
Why? To avoid retain cycles.
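Putting both suggestions together, a sketch under MRC (the class name is hypothetical):

__block MyGalleryViewController *weakSelf = self;   // no retain of self inside the block

void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = [[^(ALAssetsGroup *group, BOOL *stop) {
    if (group != nil) {
        // ... enumerate the group as before ...
        [weakSelf loadAssetToScrollView:assets];
    }
} copy] autorelease];   // copy moves the stack block to the heap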
I'm using the Assets Library in an app to enumerate the Photos Events of a device.
My code works fine when I test it on my iPad: the Photos Events are enumerated and I can handle them perfectly. When I try the very same code on my iPhone, nothing happens (and I have Photos Events on that device too). It looks as if the enumeration code is never even called (i.e. no log appears in the console, cf. code).
Here is the code:
- (void)loadEvents {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library enumerateGroupsWithTypes:ALAssetsGroupEvent
                           usingBlock:^(ALAssetsGroup *group, BOOL *stop) {
                               if (group) {
                                   [photosEventsArray addObject:group];
                                   NSLog(@"Adding group");
                               } else {
                                   NSLog(@"End of the enumeration");
                               }
                           }
                         failureBlock:^(NSError *error) {
                             NSLog(@"Failure while enumerating assets: %@", error);
                         }];
    [library release];
    NSLog(@"Found %d events", photosEventsFound);
    [self performSelectorOnMainThread:@selector(stopSpinner) withObject:nil waitUntilDone:YES];
    [pool drain];
}
My deployment target is iOS 4.1.
Any idea of what is going wrong here?
After more investigation, it seems that on iOS 4.3.5 the enumerateGroupsWithTypes: method HAS to be called from the main thread.
I've patched my code this way (passing NO on iPhone and iPod touch, and YES on iPad):
if (scanAssetsInBackground) {
    [self performSelectorInBackground:@selector(loadEvents) withObject:nil];
} else {
    [self performSelectorOnMainThread:@selector(loadEvents) withObject:nil waitUntilDone:YES];
}
Works fine with that patch.
There's not much information about this in the Apple docs, and there's no way to know which way (background or main thread) is the right way to scan asset libraries.
Exposure values from the camera can be acquired when you take a picture (without saving it to Saved Photos). A light meter application on the iPhone does this, probably by using some private API.
That application does it on the iPhone 3GS only, so I guess it may somehow be related to the EXIF data which is populated with this information when the image is created.
All of this applies to the 3GS.
Has anything changed with iPhone OS 4.0?
Is there a regular way to get these values now?
Does anyone have a working code example for reading these camera/photo setting values?
Thank you
If you want realtime* exposure information, you can capture video using AVCaptureVideoDataOutput. Each frame's CMSampleBuffer is full of interesting data describing the current state of the camera.
*up to 30 fps
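A sketch of that approach, assuming you have already configured an AVCaptureSession with an AVCaptureVideoDataOutput whose sampleBufferDelegate is self (whether an EXIF dictionary is attached to each frame depends on device and format):

#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Copy the attachments propagated with this frame; they can include an EXIF dictionary.
    CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    if (metadata) {
        NSDictionary *exif = [(NSDictionary *)metadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
        NSNumber *brightness = [exif objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue];
        NSLog(@"EXIF brightness: %@", brightness);
        CFRelease(metadata);
    }
}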
With AVFoundation in iOS 4.0 you can mess with exposure; refer specifically to AVCaptureDevice in the AVFoundation reference. I'm not sure if it's exactly what you want, but you can look around AVFoundation and probably find some useful stuff.
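For instance, a minimal sketch of configuring exposure on the capture device (standard AVFoundation calls, nothing app-specific):

NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device lockForConfiguration:&error]) {
    // Always check support first; not every device offers every mode.
    if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
        device.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
    }
    [device unlockForConfiguration];
} else {
    NSLog(@"Could not lock the device for configuration: %@", error);
}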
I think I finally found the lead to the real EXIF data. It'll be a while before I have actual code to post, but I figured this should be publicized in the meantime.
Google captureStillImageAsynchronouslyFromConnection:. It's a method of AVCaptureStillImageOutput, and the following is an excerpt from the (long sought for) documentation:
imageDataSampleBuffer -
The data that was captured.
The buffer attachments may contain metadata appropriate to the image data format. For example, a buffer containing JPEG data may carry a kCGImagePropertyExifDictionary as an attachment. See ImageIO/CGImageProperties.h for a list of keys and value types.
For an example of working with AVCaptureStillImageOutput see WWDC 2010 sample code, under AVCam.
Peace,
O.
Here is the complete solution. Don't forget to import the appropriate frameworks and headers.
In the exifAttachments variable in the captureNow method you'll find all the data you are looking for.
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/CGImageProperties.h>

AVCaptureStillImageOutput *stillImageOutput;
AVCaptureSession *session;

- (void)viewDidLoad
{
    [super viewDidLoad];
    [self setupCaptureSession];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments)
            {
                // Do something with the attachments.
                NSLog(@"attachments: %@", exifAttachments);
            }
            else
                NSLog(@"no attachments");

            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *image = [[UIImage alloc] initWithData:imageData];
        }];
}
// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPreset352x288;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    [device lockForConfiguration:nil];
    device.whiteBalanceMode = AVCaptureWhiteBalanceModeLocked;
    device.focusMode = AVCaptureFocusModeLocked;
    [device unlockForConfiguration];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    stillImageOutput = [AVCaptureStillImageOutput new];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    if ([session canAddOutput:stillImageOutput])
        [session addOutput:stillImageOutput];

    // Start the session running to start the flow of data
    [session startRunning];
    [self captureNow];
}