We can crop images. Can we crop videos?
Since a video is just a sequence of frames, you can extract every frame, crop each one, and then assemble a new video from the cropped frames. The AVFoundation Programming Guide describes the relevant tasks:
Putting it all Together: Capturing Video Frames as UIImage Objects
After that, you crop the images and write them back out as a video:
You can use an asset writer to produce a QuickTime movie file or an
MPEG-4 file from media such as sample buffers or still images.
For more details, see the AV Foundation Framework reference.
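As a rough sketch of that frame-extraction step (the video URL, sample time, and crop rectangle here are placeholder assumptions, not from the guide):

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES; // respect the recorded orientation

// Grab the frame at the 1-second mark.
CMTime time = CMTimeMakeWithSeconds(1.0, 600);
NSError *error = nil;
CGImageRef fullFrame = [generator copyCGImageAtTime:time actualTime:NULL error:&error];

// Crop it to an assumed 320x320 region at the origin.
CGRect cropRect = CGRectMake(0, 0, 320, 320);
CGImageRef croppedFrame = CGImageCreateWithImageInRect(fullFrame, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedFrame];

CGImageRelease(croppedFrame);
CGImageRelease(fullFrame);
[generator release];

Feeding each cropped frame to an AVAssetWriter (through an AVAssetWriterInputPixelBufferAdaptor) then produces the new, cropped movie.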
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetLowQuality];
exportSession.outputURL = outputURL;
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTime start = CMTimeMakeWithSeconds(1.0, 600);
CMTime duration = CMTimeMakeWithSeconds(120.0, 600);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
    handler(exportSession);
    [exportSession release];
}];
Here we get a two-minute clip starting one second into the video.
You should be able to do this using AVAssetExportSession, AVVideoComposition, and AVVideoCompositionCoreAnimationTool (and just set up a CALayer hierarchy with the positioning you want). I'm not sure if this is the most efficient way, though.
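For a plain rectangular crop, a lighter-weight alternative is an AVMutableVideoComposition with a reduced renderSize plus a translation on the layer instruction. A sketch, assuming a 320x320 crop starting at (100, 100) and an already-configured AVAssetExportSession named exportSession:

AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(320, 320); // size of the cropped output
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);

AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
// Shift the frame so the region starting at (100, 100) lands at the origin.
[layerInstruction setTransform:CGAffineTransformMakeTranslation(-100, -100)
                        atTime:kCMTimeZero];

instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

exportSession.videoComposition = videoComposition; // attach to the export session

This re-renders every frame, so it is slower than a passthrough export, but it avoids the per-frame UIImage round trip.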
It's not as simple as cropping images, but it could be as easy as specifying the correct video geometry; there isn't enough information here to say for sure.
In the decoding settings you can manipulate the video pixels geometrically (anamorphic, squeezed, stretched), and in the player/browser settings you can control the image window or player window: you can specify a small player window and a magnification level. Allowing or disallowing zoom/magnification will force an off-screen draw or black bars.
I would encode to the correct size and platform for best quality; these kinds of fixes are kludges, but they work in a pinch. I would grab the QuickTime SDK and poke around.
Related
Some videos play fine on their own as simple AVAssets, but when inserted into an AVMutableVideoComposition they make the whole composition fail, even when inserted alone into an empty composition. Has anyone seen similar issues? Sometimes re-encoding the videos before inserting them makes it work, sometimes not. I can't see any mistake in my timing instructions, and using other videos causes no problems at all, regardless of their length. Could there be issues with the frame count, the duration of the asset, or their format? (All are single-track H.264.)
Indeed, it looks like some videos can cause problems when inserted. In the example here, the duration of the track is the issue:
AVAsset * video = ...;
NSArray * videoTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack * videoTrack = videoTracks.firstObject;
CMTime duration = videoTrack.timeRange.duration; // this time causes the error
AVMutableCompositionTrack * track = ...;
CMTime insertCMTime = kCMTimeZero;
CMTimeRange trackRange = CMTimeRangeMake(kCMTimeZero, duration);
[track insertTimeRange:trackRange
               ofTrack:videoTrack
                atTime:insertCMTime
                 error:nil];
Trimming the range to a whole number of seconds solved the problem for every video tested so far:
// Truncate the duration to a whole number of seconds.
NSTimeInterval interval = (int)CMTimeGetSeconds(videoTrack.timeRange.duration);
CMTime roundedDuration = CMTimeMakeWithSeconds(interval, 60000);
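Putting the rounded duration back into the insertion (same variables as above; the error handling is an illustrative addition):

NSError *error = nil;
CMTimeRange roundedRange = CMTimeRangeMake(kCMTimeZero, roundedDuration);
BOOL ok = [track insertTimeRange:roundedRange
                         ofTrack:videoTrack
                          atTime:kCMTimeZero
                           error:&error];
if (!ok) {
    NSLog(@"insertTimeRange failed: %@", error);
}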
I am using an MPMediaPickerController to let the user select videos and songs from the library on the device, initializing the picker with initWithMediaTypes:MPMediaTypeAny. The user can then play the song or video in-app after the export takes place. Here is my movie-exporting code, stripped to its core functionality:
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    AVAssetExportSession *exportSession;
    NSString *filePath;
    NSURL *fileUrl;
    for (MPMediaItem *item in mediaItemCollection.items) {
        NSURL *assetUrl = [item valueForProperty:MPMediaItemPropertyAssetURL];
        AVAsset *currAsset = [AVAsset assetWithURL:assetUrl];
        exportSession = [[AVAssetExportSession alloc] initWithAsset:currAsset presetName:AVAssetExportPresetHighestQuality];
        exportSession.shouldOptimizeForNetworkUse = YES;
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;
        filePath = [title stringByAppendingString:@".mov"];
        fileUrl = [NSURL fileURLWithPath:[[NSFileManager documentDirectory] stringByAppendingPathComponent:filePath]];
        exportSession.outputURL = fileUrl;
        dispatch_group_enter(dispatchGroup);
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            // success
            dispatch_group_leave(dispatchGroup);
        }];
    }
}
This similar code works fine for audio, but for video, the video's audio does not play. Most content from iTunes is protected and non-exportable, so I wanted to test with a quick homemade video I shot with my iPhone. I shot the video, dragged it into iTunes (and marked it as a "music video" so it shows up properly and can be exported to my phone's library), then synced it to my device for testing.
In the app, the video shows up fine in the Media Picker, and I can export it with no errors that I can see. However, when I play it in-app, it only plays the video and not the audio. Other videos that I import from other sources work fine for playing the video's audio, so I don't 'think' it's the player itself.
Is there something I may be missing here on why the audio would not be coming across from this kind of export from the media picker? Thanks in advance for any assistance on this issue!
Not sure if this is the ideal solution, but the only way we found around this issue was to force M4V format with the passthrough preset, i.e.:
exportSession = [[AVAssetExportSession alloc] initWithAsset:[AVAsset assetWithURL:assetUrl] presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeAppleM4V;
filePath = [title stringByAppendingString:@".m4v"];
Audio and video seem to work fine for videos imported locally this way, after making that change.
I am recording and uploading video to a server from my iPhone application. The video needs to have an aspect ratio of 16:9, so I set:
[videoRecorderController setVideoQuality:UIImagePickerControllerQualityTypeIFrame960x540];
but a video of only a few seconds takes several MB of space.
Is there any way to reduce the file size of the video while maintaining the 16:9 aspect ratio?
This helped me:
- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL
                                   outputURL:(NSURL *)outputURL
                                     handler:(void (^)(AVAssetExportSession *))handler
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        handler(exportSession);
    }];
}
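Calling the helper might look like this (the URLs and the status check are illustrative assumptions):

[self convertVideoToLowQuailtyWithInputURL:inputURL
                                 outputURL:outputURL
                                   handler:^(AVAssetExportSession *session) {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Compressed video written to %@", outputURL);
    } else {
        NSLog(@"Export failed: %@", session.error);
    }
}];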
When I export a video asset via AVAssetExportSession, the resulting file is in landscape mode
(file grabbed via iTunes → Apps → File Sharing → my app).
How can I export the video asset in portrait mode (i.e., rotate it)?
Video coming from the iPhone capture device is always landscape-oriented, regardless of the device's orientation when capturing.
If you want to rotate your video, the 'simple' solution is to assign a transform to the video track of the exported session.
Create two mutable tracks in your AVMutableComposition object:
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
Add your media tracks to your composition's tracks:
...
// videoTracks / audioTracks are assumed to come from the source asset via
// [asset tracksWithMediaType:AVMediaTypeVideo] and [asset tracksWithMediaType:AVMediaTypeAudio];
// note the audio insertion must use the source's audio track, not its video track.
BOOL videoResult = [videoTrack insertTimeRange:sourceCMTime
                                       ofTrack:[videoTracks objectAtIndex:0]
                                        atTime:currentTime
                                         error:&error];
BOOL audioResult = [audioTrack insertTimeRange:sourceCMTime
                                       ofTrack:[audioTracks objectAtIndex:0]
                                        atTime:currentTime
                                         error:&error];
...
After you have added all your tracks, apply your transform to the video track of your composition:
CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
// CGAffineTransform rotateTranslate = CGAffineTransformTranslate(rotationTransform,360,0);
videoTrack.preferredTransform = rotationTransform;
(Be careful: the transform uses the upper-left corner as its origin, so the translation used to be needed after the rotation; tested on an iPhone 4S running iOS 5.1, however, the rotation now seems to be made around the center.)
When you transform the track, you should also set the composition's renderSize, since the video may otherwise fall out of frame or appear with black bars:
self.mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height, assetVideoTrack.naturalSize.width);
How do I get a thumbnail of a video imported from the camera roll, or the camera itself?
This has been asked before, and has been answered. However, the answers kind of suck for me.
This thread, "iphone sdk > 3.0. Video Thumbnail?", has some options that boil down to:
Crawl some filesystem directory for a JPG with the latest modification date, which should correspond to the video you just picked. This is extremely messy, and involves rooting around in directories Apple probably doesn't want me touching.
Use ffmpeg. But this is so general that I cannot figure out the steps it would take to import ffmpeg into my project and actually call it to extract images.
Is there really no other way? This seems like a HUGE oversight in the SDK to me. I mean, the video picker has thumbnails in it, so Apple must be doing something to generate those, yet doesn't let us do the same?
- (void)testGenerateThumbNailDataWithVideo {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"IMG_0106" ofType:@"MOV"];
    NSURL *url = [NSURL fileURLWithPath:path];
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil];
    AVAssetImageGenerator *generate = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    NSError *err = NULL;
    CMTime time = CMTimeMake(1, 60);
    CGImageRef imgRef = [generate copyCGImageAtTime:time actualTime:NULL error:&err];
    [generate release];
    [asset release];
    NSLog(@"err==%@, imageRef==%@", err, imgRef);
    UIImage *currentImg = [[UIImage alloc] initWithCGImage:imgRef];
    CGImageRelease(imgRef); // copyCGImageAtTime returns a +1 reference
    static BOOL flag = YES;
    if (flag) {
        NSData *tmpData = UIImageJPEGRepresentation(currentImg, 0.8);
        // The data is JPEG, so use a .jpg extension rather than .png.
        NSString *path = [NSString stringWithFormat:@"%@thumbNail.jpg", NSTemporaryDirectory()];
        BOOL ret = [tmpData writeToFile:path atomically:YES];
        NSLog(@"write to path=%@, flag=%d", path, ret);
        flag = NO;
    }
    [currentImg release];
}
Best method I've found... MPMoviePlayerController thumbnailImageAtTime:timeOption
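In code, that call looks roughly like this (the URL is a placeholder; note that MPMoviePlayerController and this method were later deprecated in favor of AVAssetImageGenerator):

MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0
                                       timeOption:MPMovieTimeOptionNearestKeyFrame];
[player release];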
Nevermind this... see first comment below. That's the answer.
We use ffmpeg; you can explore our site for hints on how to do it, and eventually I want to put up a tutorial.
Right now, though, I'm more focused on getting ffmpeg to play movies.
Understand that once you have that code, the code to generate a thumbnail is just a subset of it.
http://sol3.typepad.com/tagalong_developer_journa/
This tutorial has helped us, and probably the majority of developers using ffmpeg, to get started:
dranger.com/ffmpeg/
Finally, Apple probably would not have any problem with you using the thumbnail generated by the video camera; I don't think it's in a private folder. However, it is only created by the camera, not for videos picked from the image picker.