Load thumbnail image from a video saved in NSDocumentDirectory - iPhone

I use the following code to get an image from a video at a given path.
- (UIImage *)thumbnailImageForVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);
    AVAssetImageGenerator *assetImageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetImageGenerator.appliesPreferredTrackTransform = YES;
    assetImageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    CGImageRef thumbnailImageRef = NULL;
    CFTimeInterval thumbnailImageTime = time;
    NSError *thumbnailImageGenerationError = nil;
    thumbnailImageRef = [assetImageGenerator copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60) actualTime:NULL error:&thumbnailImageGenerationError];

    if (!thumbnailImageRef)
        NSLog(@"thumbnailImageGenerationError %@", thumbnailImageGenerationError);

    UIImage *thumbnailImage = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;
    previewImage = thumbnailImage;
    return thumbnailImage;
}
However, this is not working for a video saved in NSDocumentDirectory. Can't we access the documents directory using an NSURL? If not, is there an alternative? The main idea is to show the thumbnail of a video saved in the documents directory and later upload it to a server. A temporary directory would work as well.

It worked when using NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:path];
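For completeness, here is a minimal sketch of how the file URL for a video in the documents directory can be built and passed to the method above; the file name myVideo.mp4 is only an example, and the key point is to create a file URL rather than using [NSURL URLWithString:]:

NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString *videoPath = [documentsDir stringByAppendingPathComponent:@"myVideo.mp4"]; // example file name
NSURL *videoURL = [NSURL fileURLWithPath:videoPath];                                // file URL, not a web-style URL
UIImage *thumbnail = [self thumbnailImageForVideo:videoURL atTime:1.0];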

Related

Change program from getting music from iTunes to local file using URL

I am currently trying to make an app that streams raw data from the mic over Multipeer Connectivity.
I have been using this tutorial as a base: https://robots.thoughtbot.com/streaming-audio-to-multiple-listeners-via-ios-multipeer-connectivity
Now, however, I am struggling with changing the URL from the iTunes library to my local file.
I am not an advanced programmer and this is a summer project of sorts.
When the program gets music from the iTunes library it uses this code:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    [self dismissViewControllerAnimated:YES completion:nil];
    if (self.outputStreamer) return;
    self.song = mediaItemCollection.items[0];

    NSMutableDictionary *info = [NSMutableDictionary dictionary];
    info[@"title"] = [self.song valueForProperty:MPMediaItemPropertyTitle] ? [self.song valueForProperty:MPMediaItemPropertyTitle] : @"";
    info[@"artist"] = [self.song valueForProperty:MPMediaItemPropertyArtist] ? [self.song valueForProperty:MPMediaItemPropertyArtist] : @"";

    MPMediaItemArtwork *artwork = [self.song valueForProperty:MPMediaItemPropertyArtwork];
    UIImage *image = [artwork imageWithSize:self.albumImage.frame.size];
    if (image)
        info[@"artwork"] = image;

    if (info[@"artwork"])
        self.albumImage.image = info[@"artwork"];
    else
        self.albumImage.image = nil;
    self.songTitle.text = info[@"title"];
    self.songArtist.text = info[@"artist"];

    [self.session sendData:[NSKeyedArchiver archivedDataWithRootObject:[info copy]]];

    NSArray *peers = [self.session connectedPeers];
    if (peers.count) {
        self.outputStreamer = [[TDAudioOutputStreamer alloc] initWithOutputStream:[self.session outputStreamForPeer:peers[0]]];
        [self.outputStreamer streamAudioFromURL:[self.song valueForProperty:MPMediaItemPropertyAssetURL]];
        [self.outputStreamer start];
    }
}
But I want it to get music from the recorder:
NSArray *pathComponents = [NSArray arrayWithObjects:
                           [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                           @"MyAudioMemo.m4a",
                           nil];
NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:nil];
player = [[AVAudioPlayer alloc] initWithContentsOfURL:recorder.url error:nil];
I have been struggling with this for a while now and would appreciate any kind of help!
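One possible approach, sketched under the assumption that TDAudioOutputStreamer from the tutorial can read a local m4a file URL the same way it reads the MPMediaItem asset URL, is to build a file URL for the recording in the documents directory and hand that to the streamer:

// Sketch only: MyAudioMemo.m4a is the file name used by the recorder above.
NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSURL *recordingURL = [NSURL fileURLWithPath:[documentsDir stringByAppendingPathComponent:@"MyAudioMemo.m4a"]];

NSArray *peers = [self.session connectedPeers];
if (peers.count) {
    self.outputStreamer = [[TDAudioOutputStreamer alloc] initWithOutputStream:[self.session outputStreamForPeer:peers[0]]];
    // Whether the streamer accepts an arbitrary local file URL is an assumption here.
    [self.outputStreamer streamAudioFromURL:recordingURL];
    [self.outputStreamer start];
}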

Exporting a photo to Instagram

I am trying to export a photo from my application to Instagram, using this code:
// Capture Screenshot
UIGraphicsBeginImageContextWithOptions(self.view.frame.size, self.view.opaque, 0.0);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/test.igo"];
NSURL *igImageHookFile = [[NSURL alloc] initWithString:[[NSString alloc] initWithFormat:@"file://%@", jpgPath]];

dic = [UIDocumentInteractionController interactionControllerWithURL:[NSURL fileURLWithPath:igImageHookFile]];
dic.UTI = @"com.instagram.photo";
dic.annotation = [NSDictionary dictionaryWithObject:@"my caption" forKey:@"InstagramCaption"];

NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
if ([[UIApplication sharedApplication] canOpenURL:instagramURL]) {
    [[UIApplication sharedApplication] openURL:instagramURL];
} else {
    NSLog(@"No Instagram Found");
}
But it is not working. It crashes with the following error:
2013-01-19 20:40:41.948 Ayat[12648:c07] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[NSURL initFileURLWithPath:]: nil string parameter'
I am not sure I am properly:
1. Saving jpgPath as the screenshot I took.
2. Passing the "dic" document interaction controller to Instagram.
In my app I also export photos to Instagram (the image must be larger than 612x612 pixels).
Try this code:
- (void)shareImageOnInstagram:(UIImage *)shareImage
{
    // Remember: the image must be larger than 612x612; resize it if it is not.
    NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL])
    {
        NSString *documentDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
        NSString *saveImagePath = [documentDirectory stringByAppendingPathComponent:@"Image.ig"];
        NSData *imageData = UIImagePNGRepresentation(shareImage);
        [imageData writeToFile:saveImagePath atomically:YES];

        NSURL *imageURL = [NSURL fileURLWithPath:saveImagePath];

        UIDocumentInteractionController *docController = [[UIDocumentInteractionController alloc] init];
        docController.delegate = self;
        [docController retain];
        docController.UTI = @"com.instagram.photo";
        docController.annotation = [NSDictionary dictionaryWithObjectsAndKeys:@"Image Taken via @App", @"InstagramCaption", nil];
        [docController setURL:imageURL];
        // Try whichever presentation method suits your UI if this one causes a crash.
        [docController presentOpenInMenuFromBarButtonItem:self.navigationItem.rightBarButtonItem animated:YES];
    }
    else
    {
        NSLog(@"Instagram not found");
    }
}
I hope this helps you.
On this line:
dic = [UIDocumentInteractionController interactionControllerWithURL:[NSURL fileURLWithPath:igImageHookFile]];
you are calling fileURLWithPath:. For a start, this expects an NSString, not an NSURL, and from your error message it looks like igImageHookFile is nil anyway.
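To make that concrete, here is a minimal sketch of the corrected lines, assuming the only missing pieces are writing the JPEG to disk before building the URL and passing the string path (not an NSURL) to fileURLWithPath::

// Sketch: write the screenshot to disk first, then build the file URL from the string path.
NSData *imageData = UIImageJPEGRepresentation(image, 1.0);
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/test.igo"];
[imageData writeToFile:jpgPath atomically:YES];
NSURL *igImageHookFile = [NSURL fileURLWithPath:jpgPath];   // fileURLWithPath: takes an NSString
dic = [UIDocumentInteractionController interactionControllerWithURL:igImageHookFile];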
It took a bunch of research, but I finally found a post on another site that helped. I just trimmed it a bit to leave out the caption, as that was his original problem... but if you're only putting in the picture, here it is:
- (void)shareImageOnInstagram
{
    // Remember: the image must be larger than 612x612; resize it if it is not.
    NSURL *instagramURL = [NSURL URLWithString:@"instagram://app"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL]) {
        NSString *urlString = [[NSString alloc] initWithFormat:@"file://%@", blendedImageURL];
        NSURL *imageUrl = [[NSURL alloc] initWithString:urlString];
        self.docController = [self setupControllerWithURL:imageUrl usingDelegate:self];
        self.docController.UTI = @"com.instagram.exclusivegram";
        // self.docController.annotation = [NSDictionary dictionaryWithObject:@"I want to share this text" forKey:@"InstagramCaption"];
        [self.docController presentOpenInMenuFromRect:self.view.frame inView:self.view animated:YES];
    }
}
I tried the following code and it worked:
- (void)shareImageWithInstagram
{
    NSURL *instagramURL = [NSURL URLWithString:@"instagram://"];
    if ([[UIApplication sharedApplication] canOpenURL:instagramURL])
    {
        UICachedFileMgr *mgr = _gCachedManger;
        UIImage *photoImage = [mgr imageWithUrl:_imageView.image];
        NSData *imageData = UIImagePNGRepresentation(photoImage);
        NSString *captionString = [NSString stringWithFormat:@"%@%@", self.approvedData.approvedCaption, self.approvedData.tag];

        NSString *imagePath = [UIUtils documentDirectoryWithSubpath:@"image.igo"];
        [imageData writeToFile:imagePath atomically:NO];

        NSURL *fileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"file://%@", imagePath]];
        self.docFile = [[self setupControllerWithURL:fileURL usingDelegate:self] retain];
        self.docFile.annotation = [NSDictionary dictionaryWithObject:captionString forKey:@"InstagramCaption"];
        self.docFile.UTI = @"com.instagram.photo";

        // OPEN THE HOOK
        [self.docFile presentOpenInMenuFromRect:self.view.frame inView:self.view animated:YES];
    }
    else
    {
        [UIUtils messageAlert:@"Instagram not installed in this device!\nTo share image please install instagram." title:nil delegate:nil];
    }
}

iPhone: Three20 library's MockPhoto with server images

I am trying to download images from a server and display them in the Three20 library's TTThumbsViewController, but I can't get this working.
The following works for me when I have the images stored locally:
// Create thumbnails
[ImageUtils checkAndSaveThumbnail:imageName andSize:CGSizeMake(100.0, 100.0) andLocation:NO andPrefixof:@"small"];
NSString *fullImageName = [NSString stringWithFormat:@"documents://%@", imageName];
NSString *imageSmallName = [NSString stringWithFormat:@"documents://small-%@", imageName];
NSString *caption = [tempDict objectForKey:@"Comment"];
UIImage *tempImage = [UIImage imageWithContentsOfFile:[appDelegate filePath:imageName]];
MockPhoto *tempPicture = [[MockPhoto alloc] initWithURL:fullImageName smallURL:imageSmallName size:tempImage.size caption:caption photoname:imageName];
[photoArray addObject:tempPicture];
Adding photoArray into MockPhotoSource:
self.photoSource = [[MockPhotoSource alloc]
                    initWithType:MockPhotoSourceNormal
                           title:@"Photos"
                          photos:photoArray
                         photos2:nil];
[photoArray release];
...But when I point it at a download location like the following, it doesn't work: it shows neither the thumbnails nor the large images.
NSString *url = [delegate downloadImageUrl];
MockPhoto *tempPicture = [[MockPhoto alloc] initWithURL:url smallURL:url size:tempImage.size caption:caption photoname:imageName];
[photoArray addObject:tempPicture];
I have been using the same URL with AsyncImage, and it downloads the image and shows it in an image view just fine:
NSURL *url = [NSURL URLWithString:[delegate downloadImageUrl]];
UIImageView *profileimage = [[UIImageView alloc] init];
AsyncImageView *asyncImage = [[[AsyncImageView alloc] initWithFrame:CGRectMake(0, 10, 60, 60)] autorelease];
asyncImage.imageName = [[users objectAtIndex:indexPath.row] objectForKey:@"Key"];
[asyncImage loadImageFromURL:url];
[profileimage addSubview:asyncImage];
Could you please let me know what I am doing wrong with MockPhoto?
Thanks.
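One thing worth checking (an assumption, not something confirmed here): with a remote URL, tempImage is presumably nil, so tempImage.size is CGSizeZero, and a zero-sized photo may never be laid out. Here is a sketch that passes an explicit size instead (the 320x480 value is only a placeholder):

NSString *url = [delegate downloadImageUrl];
// Placeholder size; use the real image dimensions if they are known.
MockPhoto *tempPicture = [[MockPhoto alloc] initWithURL:url smallURL:url size:CGSizeMake(320.0, 480.0) caption:caption photoname:imageName];
[photoArray addObject:tempPicture];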

Thumbnail image of video

I am selecting a video clip from the library and want to create a thumbnail image from it. I have applied the code below, but the image comes out rotated. I want its original orientation.
- (UIImage *)testGenerateThumbNailDataWithVideo {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:appDelegate.videoURL options:nil];
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
    [generate release];
    NSLog(@"err==%@, imageRef==%@", err, imgRef);

    UIImage *currentImg = [[[UIImage alloc] initWithCGImage:imgRef] autorelease];

    static BOOL flag = YES;
    if (flag) {
        NSData *tmpData = UIImageJPEGRepresentation(currentImg, 0.8);
        NSString *path = [NSString stringWithFormat:@"%@thumbNail.png", NSTemporaryDirectory()];
        BOOL ret = [tmpData writeToFile:path atomically:YES];
        NSLog(@"write to path=%@, flag=%d", path, ret);
        flag = NO;
    }
    return currentImg;
}
Try using AVAssetImageGenerator. Apple discusses using AVAssetImageGenerator to create thumbnails here. Here is sample code that grabs a single thumbnail image; you will need to link against the AVFoundation framework and also add the CoreMedia framework.
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:vidPath options:nil];
AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:asset];
gen.appliesPreferredTrackTransform = YES;
CMTime time = CMTimeMakeWithSeconds(0.0, 600);
NSError *error = nil;
CMTime actualTime;
CGImageRef image = [gen copyCGImageAtTime:time actualTime:&actualTime error:&error];
UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
[gen release];
One more solution is:
- (void)generateImage
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:self.url options:nil];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = TRUE;
    [asset release];

    CMTime thumbTime = CMTimeMakeWithSeconds(0, 30);

    AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        if (result != AVAssetImageGeneratorSucceeded) {
            NSLog(@"couldn't generate thumbnail, error: %@", error);
        }
        [button setImage:[UIImage imageWithCGImage:im] forState:UIControlStateNormal];
        thumbImg = [[UIImage imageWithCGImage:im] retain];
        [generator release];
    };

    CGSize maxSize = CGSizeMake(320, 180);
    generator.maximumSize = maxSize;
    [generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:handler];
}
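Note that generateCGImagesAsynchronouslyForTimes:completionHandler: may invoke the handler on a background queue, so touching UIKit there is risky. Here is a sketch of the same handler with the UI work dispatched back to the main queue:

AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
    if (result != AVAssetImageGeneratorSucceeded) {
        NSLog(@"couldn't generate thumbnail, error: %@", error);
        return;
    }
    UIImage *thumb = [UIImage imageWithCGImage:im]; // convert before leaving the handler
    dispatch_async(dispatch_get_main_queue(), ^{
        [button setImage:thumb forState:UIControlStateNormal]; // button as in the code above
    });
};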
Or use ALAsset; see "display image from URL retrieved from ALAsset in iPhone".
Swift 5:
func previewImageForLocalVideo(at url: URL) -> UIImage? {
    let asset = AVAsset(url: url)
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true

    var time = asset.duration
    // If possible, take not the very first frame (it can be completely black or white in camera videos)
    time.value = min(time.value, 2)

    do {
        let imageRef = try imageGenerator.copyCGImage(at: time, actualTime: nil)
        return UIImage(cgImage: imageRef)
    } catch let error as NSError {
        print("Image generation failed with error \(error)")
        return nil
    }
}
This will solve the rotation issue:
generate.appliesPreferredTrackTransform = YES;

Grabbing the first frame of a video from UIImagePickerController?

I'm trying to get the first frame from the selected video in a UIImagePickerController to show in a UIImageView, but I do not know if it's possible. If it is, how would I do it?
You can do this in one of two ways. The first way is to use the MPMoviePlayerController to grab the thumbnail:
MPMoviePlayerController *moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
moviePlayer.shouldAutoplay = NO;
UIImage *thumbnail = [moviePlayer thumbnailImageAtTime:time timeOption:MPMovieTimeOptionNearestKeyFrame];
This works, but MPMoviePlayerController is not a particularly lightweight object and is not particularly fast at grabbing thumbnails.
The preferred way is to use the new AVAssetImageGenerator in AVFoundation. This is fast, lightweight and more flexible than the old way. Here's a helper method that will return an autoreleased image from the video.
+ (UIImage *)thumbnailImageForVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time
{
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);
    AVAssetImageGenerator *assetIG = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetIG.appliesPreferredTrackTransform = YES;
    assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    CGImageRef thumbnailImageRef = NULL;
    CFTimeInterval thumbnailImageTime = time;
    NSError *igError = nil;
    thumbnailImageRef = [assetIG copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60) actualTime:NULL error:&igError];

    if (!thumbnailImageRef)
        NSLog(@"thumbnailImageGenerationError %@", igError);

    UIImage *thumbnailImage = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;
    return thumbnailImage;
}
Asynchronous usage
- (void)thumbnailImageForVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time completion:(void (^)(UIImage *))completion
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), ^{
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
        NSParameterAssert(asset);
        AVAssetImageGenerator *assetIG = [[AVAssetImageGenerator alloc] initWithAsset:asset];
        assetIG.appliesPreferredTrackTransform = YES;
        assetIG.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

        CGImageRef thumbnailImageRef = NULL;
        CFTimeInterval thumbnailImageTime = time;
        NSError *igError = nil;
        thumbnailImageRef = [assetIG copyCGImageAtTime:CMTimeMake(thumbnailImageTime, 60) actualTime:NULL error:&igError];

        if (!thumbnailImageRef)
            NSLog(@"thumbnailImageGenerationError %@", igError);

        UIImage *thumbnailImage = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;

        dispatch_async(dispatch_get_main_queue(), ^{
            completion(thumbnailImage);
        });
    });
}
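For the UIImagePickerController case in the question, the video URL can be read from the picker's info dictionary and fed to the asynchronous helper. A small sketch, where previewImageView is a hypothetical UIImageView outlet and the helper is assumed to live in the same view controller:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSURL *videoURL = info[UIImagePickerControllerMediaURL];
    [self thumbnailImageForVideo:videoURL atTime:0 completion:^(UIImage *thumbnail) {
        self.previewImageView.image = thumbnail; // hypothetical outlet
    }];
    [picker dismissViewControllerAnimated:YES completion:nil];
}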