AVURLAsset and AVAssetImageGenerator return null images - iPhone

I am trying to get a thumbnail from a .mov file that I captured with the iPhone camera. The movie is currently saved in the app's Documents directory. When I call [asset duration] it returns a null object, and when I call copyCGImageAtTime:actualTime:error: it also returns a null object. I've spent countless hours trying to figure this out. I've tried moving the code to another part of my app just to see if I could get it to work, and I've also tried running it on the simulator, with no luck. Here is the code:
NSString *destinationPath = [NSString stringWithFormat:@"%@/aaa/aaa.mov", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:destinationPath] options:nil];
AVAssetImageGenerator *gen = [[AVAssetImageGenerator alloc] initWithAsset:asset];
gen.appliesPreferredTrackTransform = YES;
CMTime time = CMTimeMakeWithSeconds(0.0, 600);
NSError *error2 = nil;
CMTime actualTime;
CGImageRef image = [gen copyCGImageAtTime:time actualTime:&actualTime error:&error2];
UIImage *thumb = [[UIImage alloc] initWithCGImage:image];
CGImageRelease(image);
I have also confirmed that the movie does exist under that folder. Any help would be greatly appreciated. Thanks :)
--Edit--
Forgot to mention that the error from copyCGImageAtTime:actualTime:error: is the AVFoundation "unknown" error (AVErrorUnknown).
--Edit2--
Found out the problem. I didn't include file:// at the beginning of the url. It works now.
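For reference, a minimal sketch of the fix, reusing destinationPath from the code above:
// URLWithString: on a bare filesystem path produces a URL AVFoundation cannot open; build a file:// URL instead.
NSURL *fileURL = [NSURL fileURLWithPath:destinationPath];
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:fileURL options:nil];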


Related

How to grab a single frame from a video in the app's dir?

How does one use a UIImageView to show a .mov file's visible frame, i.e. the preview frame that is visible in, for example, Finder or any YouTube video?
I would like to be able to get the preview frame from .mov files saved to an app's Library directory, so they will not be AVAssets.
AVAssetImageGenerator in AVFoundation can be used to load videos both from the photo albums and from local app directories.
Here's a helper method that will return an image from a video URL (inside or outside the app) at any given time interval:
+ (UIImage *)thumbnailImageForVideo:(NSURL *)videoURL atTime:(NSTimeInterval)time {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
    NSParameterAssert(asset);
    AVAssetImageGenerator *assetImageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    assetImageGenerator.appliesPreferredTrackTransform = YES;
    assetImageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    CGImageRef thumbnailImageRef = NULL;
    CFTimeInterval thumbnailImageTime = time;
    NSError *thumbnailImageGenerationError = nil;
    // Treat 'time' as seconds; CMTimeMake(thumbnailImageTime, 60) would truncate the NSTimeInterval.
    thumbnailImageRef = [assetImageGenerator copyCGImageAtTime:CMTimeMakeWithSeconds(thumbnailImageTime, 600)
                                                    actualTime:NULL
                                                         error:&thumbnailImageGenerationError];
    NSAssert(thumbnailImageRef, @"CGImageRef shall never be nil.");

    UIImage *thumbnailImage = thumbnailImageRef ? [[UIImage alloc] initWithCGImage:thumbnailImageRef] : nil;
    // copyCGImageAtTime: returns a +1 retained CGImage; release it now that the UIImage has been created.
    CGImageRelease(thumbnailImageRef);
    return thumbnailImage;
}
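A quick usage sketch (the class name VideoHelper and the file name are hypothetical placeholders for wherever you put the method above):
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *videoURL = [NSURL fileURLWithPath:[documentsPath stringByAppendingPathComponent:@"movie.mov"]]; // hypothetical file
UIImage *thumbnail = [VideoHelper thumbnailImageForVideo:videoURL atTime:1.0];
self.imageView.image = thumbnail;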

How can I share a sound file to Facebook or Twitter on iPhone

Hi, I am working on an application in which I have to upload a sound file to Facebook.
Please provide me with a solution: is it possible to share a sound file on Facebook or not?
Thanks in advance.
Facebook does not have sound uploading. You could always upload the sound file elsewhere and use Facebook to share the link to it.
If you check the web apps for Twitter/Facebook, they do not provide any means to upload an audio file.
Twitter allows only text posts, while Facebook allows images and videos to be uploaded.
In light of these facts, I do not think it is possible without sharing a URL.
It is not possible to upload audio files to Facebook, only photos and videos are allowed. However, another solution would be to upload the audio file somewhere else and then use the Facebook API to post a link using that reference. One place you may wish to look to upload audio is http://developers.soundcloud.com/
Use AVAssetExportSession, create a movie with the sound file and then upload it to Facebook.
This is possible to do but it is a bit of a pain. To do this you must convert the audio file into a video file and then post it to Facebook as a video.
First we need access to our audio file; you should already have this, and if not there are plenty of Stack Overflow questions devoted to it, so I won't go off track here. We then create an NSURL to a video in our Documents directory; in this case it is a video named video_base.mp4, which was designed to be a nice background for our audio track. Finally, we merge the files before sharing the resulting file to Facebook.
- (IBAction)shareToFacebook:(id)sender {
    // You should already have your audio file saved
    NSString *songFileName = [self getSongFileName];
    NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentPath = [searchPaths objectAtIndex:0];
    NSString *audioFile = [documentPath stringByAppendingPathComponent:songFileName];

    NSURL *audioFileURL = [NSURL fileURLWithPath:audioFile];
    // getFilePath:withFolder: is a custom NSFileManager helper from the original project.
    NSURL *videoFileURL = [NSURL fileURLWithPath:[NSFileManager getFilePath:@"video_base.mp4" withFolder:@""]];

    [self mergeAudio:audioFileURL andVideo:videoFileURL withSuccess:^(NSURL *url) {
        // Now we have the URL of the merged video file
        [self shareVideoToFacebook:url];
    }];
}
Credit to @dineshprasanna for this part of the code, which can be found here. We want to merge our audio and video, save the result to a path, and then return the export URL in the completion block.
- (void)mergeAudio:(NSURL *)audioURL andVideo:(NSURL *)videoURL withSuccess:(void (^)(NSURL *url))successBlock {
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioURL options:nil];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];

    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                        ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                         atTime:kCMTimeZero error:nil];

    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];

    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                         presetName:AVAssetExportPresetHighestQuality];

    NSString *videoName = @"export.mov";
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }

    assetExport.outputFileType = AVFileTypeQuickTimeMovie; // "com.apple.quicktime-movie"
    assetExport.outputURL = exportUrl;
    assetExport.shouldOptimizeForNetworkUse = YES;

    [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        // Only report success when the export actually completed.
        if (assetExport.status == AVAssetExportSessionStatusCompleted && successBlock) {
            successBlock(exportUrl);
        }
    }];
}
Finally, we want to save the returned video URL to the photo library and share it to Facebook. It is worth noting that we need a few framework imports for this functionality to work:
#import <AssetsLibrary/AssetsLibrary.h>
#import <FBSDKCoreKit/FBSDKCoreKit.h>
#import <FBSDKShareKit/FBSDKShareKit.h>
We then share the merged file to Facebook:
- (void)shareVideoToFacebook:(NSURL *)videoURL {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

    ALAssetsLibraryWriteVideoCompletionBlock videoWriteCompletionBlock = ^(NSURL *newURL, NSError *error) {
        if (error) {
            NSLog(@"Error writing video to Photo Library: %@", error);
        } else {
            NSLog(@"Wrote video to Photo Library: %@", newURL.absoluteString);

            FBSDKShareVideo *video = [[FBSDKShareVideo alloc] init];
            video.videoURL = newURL;

            FBSDKShareVideoContent *content = [[FBSDKShareVideoContent alloc] init];
            content.video = video;

            [FBSDKShareDialog showFromViewController:self
                                         withContent:content
                                            delegate:nil];
        }
    };

    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:videoURL]) {
        [library writeVideoAtPathToSavedPhotosAlbum:videoURL
                                    completionBlock:videoWriteCompletionBlock];
    }
}
This should open up the Facebook app and then allow the user to share their audio file on their wall with a background of the video stored in your app.
Obviously everyone's project is different, which means you might not be able to copy-paste this code exactly into yours. I have tried to split the process into steps, so it should be easy to adapt and get audio uploading successfully.

Get album artwork from ID3 tag/Convert function from Java to Objective-C

I've got the following question to ask:
How do you compile taglib with an iOS application?
I'm a bit confused: I added the folder to my project and tried to compile, but it failed with 1640 errors.
How do I make it compile successfully? The reason I ask is that TagLib allows extracting album artwork from a tag.
If anyone knows an Objective-C based album artwork extraction class, it would help. I don't know why Apple doesn't add a way of doing this in Core Foundation, since there are methods for extracting some of the data from an ID3 tag.
I can't see why there isn't some Objective-C way of doing it.
Any help appreciated.
EDIT:
After hours upon hours of trying to do this, I've found a Java function here that seems to do the job fine, but I haven't got a clue how to convert it to Objective-C (or C++ for that matter), as its types seem to be completely different from those in Objective-C/C++. If anyone knows a way to convert it, that would really help; this seems to be my last option, as I've tried so many others.
- (void)loadArtworksForFileAtPath:(NSString *)path completion:(void (^)(NSArray *))completion
{
    NSURL *u = [NSURL fileURLWithPath:path];
    AVURLAsset *a = [AVURLAsset URLAssetWithURL:u options:nil];
    NSArray *k = [NSArray arrayWithObjects:@"commonMetadata", nil];
    [a loadValuesAsynchronouslyForKeys:k completionHandler:^{
        NSArray *artworks = [AVMetadataItem metadataItemsFromArray:a.commonMetadata
                                                           withKey:AVMetadataCommonKeyArtwork
                                                          keySpace:AVMetadataKeySpaceCommon];
        NSMutableArray *artworkImages = [NSMutableArray array];
        for (AVMetadataItem *i in artworks)
        {
            NSString *keySpace = i.keySpace;
            UIImage *im = nil;
            if ([keySpace isEqualToString:AVMetadataKeySpaceID3])
            {
                NSDictionary *d = [i.value copyWithZone:nil];
                im = [UIImage imageWithData:[d objectForKey:@"data"]];
            }
            else if ([keySpace isEqualToString:AVMetadataKeySpaceiTunes])
                im = [UIImage imageWithData:[i.value copyWithZone:nil]];
            if (im)
                [artworkImages addObject:im];
        }
        completion(artworkImages);
    }];
}
This is just the answer from @hatfinch above, but fixed and now working.
- (void)artworksForFileAtPath:(NSString *)path block:(void (^)(NSArray *artworkImages))block
{
    NSURL *url = [NSURL fileURLWithPath:path];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    NSArray *keys = [NSArray arrayWithObjects:@"commonMetadata", nil];
    [asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
        NSArray *artworks = [AVMetadataItem metadataItemsFromArray:asset.commonMetadata
                                                           withKey:AVMetadataCommonKeyArtwork
                                                          keySpace:AVMetadataKeySpaceCommon];
        NSMutableArray *artworkImages = [NSMutableArray array];
        for (AVMetadataItem *item in artworks) {
            NSString *keySpace = item.keySpace;
            if ([keySpace isEqualToString:AVMetadataKeySpaceID3]) {
                NSDictionary *d = [item.value copyWithZone:nil];
                [artworkImages addObject:[UIImage imageWithData:[d objectForKey:@"data"]]];
            } else if ([keySpace isEqualToString:AVMetadataKeySpaceiTunes]) {
                [artworkImages addObject:[UIImage imageWithData:[item.value copyWithZone:nil]]];
            }
        }
        if (block) {
            block(artworkImages);
        }
    }];
}
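A quick usage sketch (the MP3 file name is illustrative, and the method is assumed to live on self):
NSString *mp3Path = [[NSBundle mainBundle] pathForResource:@"song" ofType:@"mp3"]; // hypothetical file
[self artworksForFileAtPath:mp3Path block:^(NSArray *artworkImages) {
    // The completion handler may run on a background queue, so hop to the main queue before touching UI.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = [artworkImages count] ? [artworkImages objectAtIndex:0] : nil;
    });
}];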
If you merely want to access the artwork of music already stored on the device, i.e. from the iPod application, then check out this article, which demonstrates how to do it using the existing frameworks provided by Apple, specifically the MediaPlayer framework.
If not, it may be wise to provide a little more information about what you are trying to achieve and also provide examples of the errors you are getting when trying to compile TagLib.
EDIT:
Another solution could be to use an embedded web view with a JavaScript library such as this one to load, parse and fetch the album artwork.
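For completeness, a minimal sketch of the MediaPlayer route mentioned above, assuming the track lives in the device's iPod library (the query details are illustrative):
#import <MediaPlayer/MediaPlayer.h>

// Grab the artwork of the first song in the iPod library, if any.
MPMediaQuery *query = [MPMediaQuery songsQuery];
NSArray *items = [query items];
MPMediaItem *item = [items count] ? [items objectAtIndex:0] : nil;
MPMediaItemArtwork *artwork = [item valueForProperty:MPMediaItemPropertyArtwork];
UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(200.0, 200.0)];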
You could use this: https://github.com/EliaCereda/TagLib-ObjC

Create thumbnail image from video url!

I'm trying to create a thumbnail image from a video URL.
I'm following the AV Foundation Programming Guide.
My project has a button and an image view. When the button is pressed, the thumbnail image should load into the UIImageView.
My code doesn't work; here it is:
- (IBAction)btnClick:(id)sender
{
    NSURL *url = [NSURL URLWithString:@"http://www.youtube.com/watch?v=bgN62D70VLk"];
    AVURLAsset *myAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:myAsset];

    Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]);
    CMTime midpoint = CMTimeMakeWithSeconds(durationSeconds/2.0, 600);
    NSError *error = nil;
    CMTime actualTime;
    CGImageRef halfWayImage = [imageGenerator copyCGImageAtTime:midpoint actualTime:&actualTime error:&error];

    if (halfWayImage != NULL) {
        NSString *actualTimeString = (NSString *)CMTimeCopyDescription(NULL, actualTime);
        NSString *requestedTimeString = (NSString *)CMTimeCopyDescription(NULL, midpoint);
        NSLog(@"got halfWayImage: Asked for %@, got %@", requestedTimeString, actualTimeString);
        [actualTimeString release];
        [requestedTimeString release];
        // Do something interesting with the image.
        CGImageRelease(halfWayImage);
    }
    UIImage *image = [UIImage imageWithCGImage:halfWayImage];
    [imageView setImage:image];
    [imageGenerator release];
}
Please help me with this problem!
Thanks!
MPMoviePlayerController has some methods to handle this (a quick sketch follows below):
thumbnailImageAtTime:timeOption:
requestThumbnailImagesAtTimes:timeOption:
cancelAllThumbnailImageRequests
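For example, a minimal sketch using thumbnailImageAtTime:timeOption: (the movie path is illustrative; note that these MPMoviePlayerController thumbnail APIs were later deprecated in favor of AVAssetImageGenerator):
#import <MediaPlayer/MediaPlayer.h>

NSURL *movieURL = [NSURL fileURLWithPath:moviePath]; // moviePath: a local .mov/.mp4 file, not a YouTube page URL
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
[player release]; // omit under ARC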
I see a few problems in your code:
http://www.youtube.com/watch?v=bgN62D70VLk is the URL of a web page, but AVAssets must be video or audio files. Note: YouTube does not advertise the URLs of its video files.
Your call to [myAsset duration] will block. You should instead use the AVAsynchronousKeyValueLoading protocol (loadValuesAsynchronouslyForKeys:completionHandler:); see the sketch below.
You are using halfWayImage after releasing it.
I would recommend watching the AVFoundation sessions from WWDC 2010, and looking at the session sample code.
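A minimal sketch of the asynchronous-loading approach mentioned above (the local file path is illustrative; memory management elided for brevity):
// Build the asset from a local file URL; moviePath is a placeholder for a real video file path.
AVURLAsset *myAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:moviePath] options:nil];
NSArray *keys = [NSArray arrayWithObject:@"duration"];
[myAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *error = nil;
    if ([myAsset statusOfValueForKey:@"duration" error:&error] == AVKeyValueStatusLoaded) {
        Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]); // no longer blocks
        CMTime midpoint = CMTimeMakeWithSeconds(durationSeconds / 2.0, 600);
        // ...generate the thumbnail with AVAssetImageGenerator here and update the UI on the main queue.
    }
}];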
This is very late, but it may help others who come to this question.
Look into this answer to the same problem; hopefully it will help.

Getting iPhone video thumbnails

How do I get a thumbnail of a video imported from the camera roll, or the camera itself?
This has been asked before, and has been answered. However, the answers kind of suck for me.
This thread, "iphone sdk > 3.0 . Video Thumbnail?", has some options that boil down to:
Crawl some filesystem directory for a JPG with the latest modification date, which should correspond to the video you just picked. This is extremely messy, and involves rooting around in directories Apple probably wouldn't want me touching.
Use ffmpeg. But this is so general that I cannot seem to figure out the steps that it would take to import ffmpeg into my project and to actually call it to extract images.
Is there really no other way? This seems like a HUGE oversight in the SDK to me. I mean the video picker has thumbnails in it, so Apple must be doing something to generate those, yet does not allow us to?
- (void)testGenerateThumbNailDataWithVideo {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"IMG_0106" ofType:@"MOV"];
    NSURL *url = [NSURL fileURLWithPath:path];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
    [generate release];
    NSLog(@"err==%@, imageRef==%@", err, imgRef);

    UIImage *currentImg = [[UIImage alloc] initWithCGImage:imgRef];
    CGImageRelease(imgRef); // copyCGImageAtTime: returns a +1 retained image

    static BOOL flag = YES;
    if (flag) {
        NSData *tmpData = UIImageJPEGRepresentation(currentImg, 0.8);
        NSString *path = [NSString stringWithFormat:@"%@thumbNail.jpg", NSTemporaryDirectory()];
        BOOL ret = [tmpData writeToFile:path atomically:YES];
        NSLog(@"write to path=%@, flag=%d", path, ret);
        flag = NO;
    }
    [currentImg release];
}
Best method I've found... MPMoviePlayerController thumbnailImageAtTime:timeOption
Never mind this... see the first comment below. That's the answer.
We use ffmpeg; you can explore our site for hints on how to do it, and eventually I want to put up a tutorial.
But right now I'm more focused on getting ffmpeg to play movies.
Understand that once you have that working, the code to generate a thumbnail is just a subset of it.
http://sol3.typepad.com/tagalong_developer_journa/
This tutorial has helped us, and probably the majority of developers using ffmpeg, to get started: dranger.com/ffmpeg/
Finally,
Apple probably wouldn't have a problem with you using the thumbnail generated by the video camera; I don't think it's in a private folder. However, that thumbnail is only created by the camera, not for videos picked from the image picker.