I want to crop a video given a width, height, and x/y coordinates, and it seems that isn't possible with AVMutableComposition, so I am planning to use AVAssetWriter to crop the video using the aspect-fill scaling mode in its video settings.
But my question is: can we use AVAssetWriter as a replacement for AVAssetExportSession?
If yes, how do I initialise the AVAssetWriterInput with an AVAsset object, the way we do with AVAssetExportSession, like this:
[[AVAssetExportSession alloc] initWithAsset:videoAsset presetName:AVAssetExportPresetHighestQuality];
It's possible that what you are looking for is AVMutableComposition's naturalSize property, which, from the documentation, allows scaling (not cropping) the video to the desired size.
https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVMutableComposition_Class/Reference/Reference.html#//apple_ref/occ/instp/AVMutableComposition/naturalSize
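For the AVAssetWriter route asked about above: as far as I know, AVAssetWriter is not initialised from an AVAsset the way AVAssetExportSession is. Instead you pair it with an AVAssetReader and pass the decoded sample buffers through, and the writer input's output settings (including AVVideoScalingModeResizeAspectFill) control the scaling and cropping. A rough sketch, where sourceURL, outputURL, and the 640x640 target size are placeholders:
#import <AVFoundation/AVFoundation.h>

// sourceURL and outputURL are placeholders for your own file locations.
AVAsset *videoAsset = [AVAsset assetWithURL:sourceURL];
AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:videoAsset error:&error];
AVAssetReaderTrackOutput *readerOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
        outputSettings:@{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
[reader addOutput:readerOutput];

// The writer input, not the reader, carries the target size and scaling mode.
NSDictionary *writerSettings = @{
    AVVideoCodecKey       : AVVideoCodecH264,
    AVVideoWidthKey       : @640,                               // desired output width
    AVVideoHeightKey      : @640,                               // desired output height
    AVVideoScalingModeKey : AVVideoScalingModeResizeAspectFill  // scales and crops to fill
};
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:writerSettings];
writerInput.expectsMediaDataInRealTime = NO;
[writer addInput:writerInput];

[writer startWriting];
[reader startReading];
[writer startSessionAtSourceTime:kCMTimeZero];

// Pump decoded frames from the reader into the writer.
dispatch_queue_t writingQueue = dispatch_queue_create("com.example.videowriting", NULL);
[writerInput requestMediaDataWhenReadyOnQueue:writingQueue usingBlock:^{
    while ([writerInput isReadyForMoreMediaData]) {
        CMSampleBufferRef sampleBuffer = [readerOutput copyNextSampleBuffer];
        if (sampleBuffer) {
            [writerInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        } else {
            [writerInput markAsFinished];
            [writer finishWritingWithCompletionHandler:^{ /* done */ }];
            break;
        }
    }
}];
Note that aspect-fill scales and crops to the target size; it won't give you an arbitrary x/y crop rectangle on its own.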
I am playing video from the web with AVPlayerLayer:
AVAsset *newAsset = [[AVURLAsset alloc]initWithURL:url options:nil];
AVPlayerItem *newPlayerItem = [[AVPlayerItem alloc]initWithAsset:newAsset];
audioPlayer = [[AVPlayer alloc]initWithPlayerItem:newPlayerItem];
avPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:audioPlayer];
And I want to be able to add an audio equalizer to the video, with presets something like:
Pop, Flat, Ballad, Blues, Classical, Dance, Metal
I searched for documentation on this on Google and in Apple's developer resources and found nothing.
I also noticed that some apps have this feature, but only on iOS 6.1.
Any help with this issue? Does Apple provide something built-in, or does it require a third-party library?
This Stack Overflow question has answers offering several different solutions: How to make a simple EQ AudioUnit (bass, mid, treble) with iOS?
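As a rough illustration of the kind of setup those answers describe: Apple ships an N-band EQ audio unit (kAudioUnitSubType_NBandEQ, iOS 5 and later) whose bands you configure yourself; presets such as "Pop" or "Metal" are just particular gain curves across the bands. The band count, frequencies, and gains below are illustrative, and hooking the unit into an AVPlayer streaming from the web (e.g. via an MTAudioProcessingTap or a Core Audio graph) is a separate step not shown here:
#import <AudioUnit/AudioUnit.h>
#import <AudioToolbox/AudioToolbox.h>

// Locate and instantiate Apple's built-in N-band EQ audio unit.
AudioComponentDescription desc = {
    .componentType         = kAudioUnitType_Effect,
    .componentSubType      = kAudioUnitSubType_NBandEQ,
    .componentManufacturer = kAudioUnitManufacturer_Apple
};
AudioComponent comp = AudioComponentFindNext(NULL, &desc);
AudioUnit eqUnit;
AudioComponentInstanceNew(comp, &eqUnit);

// A 4-band example; a named preset would just be a different gain curve.
UInt32 numBands = 4;
AudioUnitSetProperty(eqUnit, kAUNBandEQProperty_NumberOfBands,
                     kAudioUnitScope_Global, 0, &numBands, sizeof(numBands));

Float32 frequencies[4] = { 60.0, 250.0, 2000.0, 8000.0 };  // Hz, illustrative
Float32 gains[4]       = { 4.0, -2.0, 1.0, 5.0 };          // dB, illustrative
for (UInt32 band = 0; band < numBands; band++) {
    // Per-band parameter IDs are the band-0 ID plus the band index.
    AudioUnitSetParameter(eqUnit, kAUNBandEQParam_Frequency + band,
                          kAudioUnitScope_Global, 0, frequencies[band], 0);
    AudioUnitSetParameter(eqUnit, kAUNBandEQParam_Gain + band,
                          kAudioUnitScope_Global, 0, gains[band], 0);
}
AudioUnitInitialize(eqUnit);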
I am working on an iOS project that uses AV-Out to show contents in a 1280x720 window on a second screen.
I have an MPMoviePlayerController's view as the background, and on top of that various other elements like UIImages and UILabels.
The background movie plays in a loop.
Now I want to overlay the whole view including all visible elements with another fullscreen animation that has transparency so that only parts of the underlying view are visible.
I first tried a png animation with UIImageView.
I was surprised to find that this actually works on the iPhone 5, but of course the PNGs are so large that they use far too much RAM, and it crashes on everything below the iPhone 4S.
So I need another way.
I figured out how to play a second movie at the same time using AVFoundation.
So far, so good. Now I can play the overlay video, but of course it is not transparent yet.
I also learned that with the GPUImage library I can use GPUImageChromaKeyBlendFilter to filter a color out of a video to make it transparent and then combine it with another video.
What I don't understand yet is the best way to implement this in my case to get the result I want.
Can I use the whole view hierarchy below the top video as the first input for the GPUImageChromaKeyBlendFilter, and a greenscreen-style video as the second input, and show the result live in 720p? How would I do that?
Or would it be better to use GPUImageChromaKeyFilter to just filter the greenscreen-style video and play it in a view above all the other views? Would the background of that video then be transparent?
Thanks for your help!
You'll need to build a custom player using AVFoundation.framework and then use a video with an alpha channel. The AVFoundation framework allows much more robust handling of video without many of the limitations of the MPMedia framework. Building a custom player isn't as hard as people make it out to be. I've written a tutorial on it here: http://www.sdkboy.com/?p=66
Another way:
The SimpleVideoFileFilter example in the framework shows how to load a movie, filter it, and encode it back to disk. Modifying this to perform chroma keying gives the following:
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[filter setColorToReplaceRed:0.0 green:0.0 blue:1.0];
[filter setThresholdSensitivity:0.4];
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"background.jpg"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture addTarget:filter];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[filter addTarget:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
    [filter removeTarget:movieWriter];
    [movieWriter finishRecording];
}];
The above code will load a movie from the application's resources called sample.m4v, feed it into a chroma key filter that is set to key off of pure blue with a sensitivity of 0.4, attach a background image to use for the chroma keying, and then send all that to a movie encoder which writes Movie.m4v in the application's /Documents directory.
You can adjust the threshold and specific blue tint to match your needs, as well as replace the input image with another movie or other source as needed. This process can also be applied to live video from the iOS device's camera, and you can display the results to the screen if you'd like.
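For the live-display case asked about in the question above (showing the keyed result on screen rather than writing it to disk), a minimal sketch is to add a GPUImageView as another target of the same filter; the 1280x720 frame below is just an assumption matching the AV-Out window:
// Show the keyed output live instead of (or in addition to) encoding it.
// Add the view as a target before calling -startProcessing on the movie.
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 1280.0, 720.0)];
[self.view addSubview:filterView];   // or add it to the external screen's window
[filter addTarget:filterView];       // same filter chain as above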
On an iPhone 4 running iOS 5.0, chroma keying takes 1.8 ms (500+ FPS) for 640x480 frames of video, 65 ms for 720p frames (15 FPS). The newer A5-based devices are 6-10X faster than that for these operations, so they can handle 1080p video without breaking a sweat. I use iOS 5.0's fast texture caches for both frame uploads and retrievals, which accelerates the processing on that OS version over 4.x.
The one caution I have about this is that I haven't quite gotten the audio to record right in my movie encoding, but I'm working on that right now.
Hey I am currently working with the AVAssetWriter and the AVAssetWriterInput.
In my project I would like to use the hardware acceleration of the iPhone.
Is it possible to use AVAssetWriter to create compressed images and change the quality on the fly, i.e. after I have initialized the AVAssetWriterInput instance?
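For reference, a sketch of where the quality settings live in this API, assuming H.264 output; the bit rate and dimensions are illustrative. As far as I can tell, the outputSettings dictionary is fixed once the AVAssetWriterInput is created (the property is read-only afterwards), so changing quality on the fly would mean creating a new writer input rather than mutating the existing one:
// Quality is effectively fixed when the input is created: it lives in the
// compression dictionary of outputSettings, which cannot be changed later.
// The bit rate, key-frame interval, and dimensions are illustrative values.
NSDictionary *compressionProps = @{
    AVVideoAverageBitRateKey      : @2000000,   // ~2 Mbit/s
    AVVideoMaxKeyFrameIntervalKey : @30
};
NSDictionary *outputSettings = @{
    AVVideoCodecKey                 : AVVideoCodecH264,
    AVVideoWidthKey                 : @1280,
    AVVideoHeightKey                : @720,
    AVVideoCompressionPropertiesKey : compressionProps
};
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];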
I have a video which was recorded on blue screen.
Is there a way to make it transparent on iOS devices? I know I can do it with pixel shaders and OpenGL, but I'm afraid the process of decoding a video frame, uploading it as an OpenGL texture, and eliminating fragments with a pixel shader will be too slow.
Any suggestions?
It sounds like you want to do some sort of chroma keying with your video. I just added the capability to do this to my GPUImage framework, which as the name indicates uses GPU-based processing to perform these operations many times faster than CPU-bound filters could.
The SimpleVideoFileFilter example in the framework shows how to load a movie, filter it, and encode it back to disk. Modifying this to perform chroma keying gives the following:
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[filter setColorToReplaceRed:0.0 green:0.0 blue:1.0];
[filter setThresholdSensitivity:0.4];
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"background.jpg"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture addTarget:filter];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[filter addTarget:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
    [filter removeTarget:movieWriter];
    [movieWriter finishRecording];
}];
The above code will load a movie from the application's resources called sample.m4v, feed it into a chroma key filter that is set to key off of pure blue with a sensitivity of 0.4, attach a background image to use for the chroma keying, and then send all that to a movie encoder which writes Movie.m4v in the application's /Documents directory.
You can adjust the threshold and specific blue tint to match your needs, as well as replace the input image with another movie or other source as needed. This process can also be applied to live video from the iOS device's camera, and you can display the results to the screen if you'd like.
On an iPhone 4 running iOS 5.0, chroma keying takes 1.8 ms (500+ FPS) for 640x480 frames of video, 65 ms for 720p frames (15 FPS). The newer A5-based devices are 6-10X faster than that for these operations, so they can handle 1080p video without breaking a sweat. I use iOS 5.0's fast texture caches for both frame uploads and retrievals, which accelerates the processing on that OS version over 4.x.
The one caution I have about this is that I haven't quite gotten the audio to record right in my movie encoding, but I'm working on that right now.
If you mean you want to render the video but set the blue pixels to transparent, then the only efficient way to do this is with OpenGL. This should be easily possible on iOS devices: video decoding is handled in hardware, and I have several projects where I transfer video frames to OpenGL using glTexSubImage2D, which works fine.
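For what it's worth, the per-pixel work is small: a rough GLSL ES fragment shader along these lines (essentially what the chroma-key filters above do internally) zeroes the alpha of near-blue pixels. The varying/uniform names and the 0.4 threshold are illustrative, and the quad must be drawn with blending enabled (GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA):
// A minimal fragment shader that makes near-blue pixels transparent.
static const char *kBlueKeyFragmentShader =
    "varying highp vec2 textureCoordinate;                                        \n"
    "uniform sampler2D videoFrame;                                                \n"
    "void main()                                                                  \n"
    "{                                                                            \n"
    "    mediump vec4 color = texture2D(videoFrame, textureCoordinate);           \n"
    "    // Opaque when far from pure blue, transparent when within the threshold.\n"
    "    mediump float alpha = step(0.4, distance(color.rgb, vec3(0.0, 0.0, 1.0)));\n"
    "    gl_FragColor = vec4(color.rgb, alpha);                                   \n"
    "}                                                                            \n";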
I have successfully composed an AVMutableComposition with multiple video clips and can view and export it, and I would like to be able to transition between clips using a cross-fade, so I want to use AVMutableVideoComposition. I can't find any examples of how to even arrange and play a couple of AVAsset videos in succession. Does anyone have an example of how to add tracks to an AVMutableVideoComposition with the equivalent of AVMutableComposition's insertTimeRange, or how to set up a cross-fade?
[self.composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.avAsset.duration)
                          ofAsset:asset.avAsset
                           atTime:self.composition.frameDuration
                            error:nil];
I found an example called AVEditDemo from Apple's WWDC 2010 Sample Code.
https://developer.apple.com/library/ios/samplecode/AVCustomEdit/Introduction/Intro.html
There is a lot of detail in the sample, but I'll summarize: You need to use both AVMutableComposition and AVMutableVideoComposition. Add tracks individually to AVMutableComposition instead of with the simpler insertTimeRange, as it allows you to set overlapping times on the tracks. The tracks also need to be added to the AVMutableVideoComposition as AVMutableVideoCompositionLayerInstructions with an opacity ramp. Finally, to play back in an AVPlayer, you need to create an AVPlayerItem using both the AVMutableComposition and AVMutableVideoComposition.
It seems like going each level deeper in the API, in this case from MPMoviePlayer with an asset, to AVPlayer with an AVComposition, and finally to an AVVideoComposition, increases the amount of code required exponentially.
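To make that summary concrete, here is a rough sketch of a two-clip cross-fade, assuming two local AVAssets (assetA and assetB), a one-second overlap, and omitting error handling, audio tracks, transforms, and the pass-through instructions for the non-overlapping ranges:
CMTime fadeDuration = CMTimeMake(1, 1);   // 1-second cross-fade (illustrative)
AVAssetTrack *videoTrackA = [[assetA tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *videoTrackB = [[assetB tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// 1. Two composition tracks so the clips can overlap in time.
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *trackA =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *trackB =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

[trackA insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetA.duration)
                ofTrack:videoTrackA atTime:kCMTimeZero error:nil];
CMTime clipBStart = CMTimeSubtract(assetA.duration, fadeDuration);
[trackB insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetB.duration)
                ofTrack:videoTrackB atTime:clipBStart error:nil];

// 2. One instruction for the overlap, ramping clip A's opacity from 1 to 0.
AVMutableVideoCompositionInstruction *crossFade =
    [AVMutableVideoCompositionInstruction videoCompositionInstruction];
crossFade.timeRange = CMTimeRangeMake(clipBStart, fadeDuration);

AVMutableVideoCompositionLayerInstruction *fromLayer =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:trackA];
[fromLayer setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0
                                timeRange:crossFade.timeRange];
AVMutableVideoCompositionLayerInstruction *toLayer =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:trackB];
crossFade.layerInstructions = @[fromLayer, toLayer];

// (Pass-through instructions are also needed for the non-overlapping ranges,
//  one per segment, each pointing at the track that is visible there.)

// 3. Wire the instruction(s) into an AVMutableVideoComposition.
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = videoTrackA.naturalSize;
videoComposition.instructions = @[crossFade];   // plus the pass-through instructions

// 4. Play back with both objects.
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];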