Before I post a lot of code here is the scenario:
Using code based on AVEditDemo from WWDC, I capture a movie using the standard control in portrait.
I post process the video using code identical to that in AVEditDemo which uses Core Animation.
When I play the resultant video using the Camera app, it is rotated 90 degrees, is no longer "portrait" (it is now landscape), and is squashed. (The aspect ratio seems to have been swapped: width -> height and height -> width.)
Have spent many hours on this and am at a loss.
The desired result is a movie identical to the captured original. (With an animated overlay eventually).
To see this in action just download and run the AVEditDemo from Apple, turn Title "ON" and export the movie.
I guess the short answer is this:
When processing the original video, you want to retrieve the 'preferredTransform':
AVAssetTrack *sourceVideo = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
// CGAffineTransform is a struct, not an object, so it is not a pointer
CGAffineTransform preferredTransform = [sourceVideo preferredTransform];
and then when writing the final video you will do something similar to this:
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack setPreferredTransform:preferredTransform];
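To tie the two snippets together, here is a hedged sketch of the insert step; the variable names come from the code above, and the audio-track handling is my assumption about how the source asset is structured:

// Sketch only: insert the source media into the composition tracks created above,
// then carry the capture orientation across via the preferred transform.
NSError *insertError = nil;
AVAssetTrack *sourceAudio = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, [sourceAsset duration]);

[compositionVideoTrack insertTimeRange:fullRange ofTrack:sourceVideo atTime:kCMTimeZero error:&insertError];
if (sourceAudio) {
    [compositionAudioTrack insertTimeRange:fullRange ofTrack:sourceAudio atTime:kCMTimeZero error:&insertError];
}
[compositionVideoTrack setPreferredTransform:preferredTransform];

One caveat, in case you are following the AVEditDemo path with Core Animation: when an AVVideoComposition is attached, the track's preferredTransform is, as far as I can tell, not applied for you, and the rotation has to be expressed through the video composition's layer instructions (with the render size's width and height swapped for portrait). That is beyond this sketch, but it is the first thing I would check if the export still comes out sideways.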
We have an SKVideoNode that we're adding to an SKScene in a SpriteKit game. The mp4 video plays just fine maybe 90% of the time; the other 10% it just renders transparent video while the audio plays fine.
What I mean by transparent is that this video sits on top of our game board, and when it glitches out, the game can be seen in plain sight below, though nothing is responsive because the video node is positioned over everything, blocking user interaction. The audio from the video still plays fine, so I know it's trying to play.
It's totally inconsistent. The video plays fine for the most part, but what seems like 10% of the time it just doesn't render any video content to the node, only the audio.
We are seeing this in all versions of iOS.
Our node code:
NSURL *fileURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"intro" ofType:@"mp4"]];
AVPlayer *player = [AVPlayer playerWithURL:fileURL];
SKVideoNode *introVideoNode = [[SKVideoNode alloc] initWithAVPlayer:player];
introVideoNode.size = CGSizeMake(self.frame.size.width, self.frame.size.height);
introVideoNode.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
introVideoNode.name = @"introVideo";
// this video plays over top of many other SKSpriteNodes
introVideoNode.zPosition = 8000;
[self addChild:introVideoNode];
[introVideoNode play];
Thoughts?
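Not an answer so much as something to rule out: one pattern that is sometimes suggested is to hold off on play until the AVPlayerItem reports that it is ready. A rough sketch, with the KVO wiring being my own addition rather than anything from the original code:

// Sketch: create the player from an explicit AVPlayerItem so its status can be observed,
// and only start playback once it reports AVPlayerItemStatusReadyToPlay.
NSURL *fileURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"intro" ofType:@"mp4"]];
AVPlayerItem *introItem = [AVPlayerItem playerItemWithURL:fileURL];
AVPlayer *player = [AVPlayer playerWithPlayerItem:introItem];

SKVideoNode *introVideoNode = [[SKVideoNode alloc] initWithAVPlayer:player];
introVideoNode.size = self.frame.size;
introVideoNode.position = CGPointMake(CGRectGetMidX(self.frame), CGRectGetMidY(self.frame));
introVideoNode.name = @"introVideo";
introVideoNode.zPosition = 8000;
[self addChild:introVideoNode];

// Remember to remove this observer when the scene goes away.
[introItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:NULL];

// Elsewhere in the scene:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"status"] &&
        [(AVPlayerItem *)object status] == AVPlayerItemStatusReadyToPlay) {
        [(SKVideoNode *)[self childNodeWithName:@"introVideo"] play];
    }
}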
I am currently working on an iPhone app that takes short snippets of video and compiles them together in an AVMutableComposition that is then exported to the user's camera roll. The example code I am using prompts the user to select the video to be merged from their camera roll and works flawlessly. The issue I am running into has to do with the fact that in the production app I am passing the videos to be compiled in the form of an array of their URLs. I am able to successfully populate the AVURLAsset with the following code:
newUrl = [NSURL fileURLWithPath:tempURLholder];
Asset0 = [[AVURLAsset alloc] initWithURL:newUrl options:nil];
However, when I then attempt to pass that AVURLAsset on to the AVMutableCompositionTrack, I get a null value with this code:
track0 = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[track0 insertTimeRange:CMTimeRangeMake(kCMTimeZero, Asset0.duration) ofTrack:[[Asset0 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
And when I attempt to just pass the raw asset, I get an 'incompatible pointer: sending AVURLAsset to AVAssetTrack' error with this code:
track0 = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[track0 insertTimeRange:CMTimeRangeMake(kCMTimeZero, Asset0.duration) ofTrack:Asset0 atTime:kCMTimeZero error:nil];
I know I am probably missing something really simple here. Has anyone had experience with this, or would anyone be willing to pass along some tips to get my resources to show? Thanks so much in advance.
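If it helps, here is a small diagnostic sketch (assuming mainComposition is your AVMutableComposition and tempURLholder is a path to a local file) that passes a real NSError and checks that the asset actually has a video track before indexing into it; a file that is not at the expected path, or an asset whose video tracks never load, tends to surface as exactly this kind of silent failure:

NSURL *newUrl = [NSURL fileURLWithPath:tempURLholder];
AVURLAsset *Asset0 = [[AVURLAsset alloc] initWithURL:newUrl options:nil];

NSArray *videoTracks = [Asset0 tracksWithMediaType:AVMediaTypeVideo];
if ([videoTracks count] == 0) {
    // A missing or unreadable file shows up here, not as an exception.
    NSLog(@"No video track found for %@", newUrl);
    return;
}

AVMutableCompositionTrack *track0 =
    [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

NSError *insertError = nil;
BOOL inserted = [track0 insertTimeRange:CMTimeRangeMake(kCMTimeZero, Asset0.duration)
                                ofTrack:[videoTracks objectAtIndex:0]
                                 atTime:kCMTimeZero
                                  error:&insertError];
if (!inserted) {
    NSLog(@"insertTimeRange failed: %@", insertError);
}

The second error is expected behaviour, for what it's worth: insertTimeRange:ofTrack:atTime:error: takes an AVAssetTrack, not the AVURLAsset itself.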
I have managed, using the YouTube API, to fetch thumbnails for my list of videos; however, they have black bars at the top and bottom of the UIImage I get. How can I fetch a thumbnail without these bars, and even better, a higher quality thumbnail?
Here is the code I use currently:
GDataEntryBase *entry = [[feed entries] objectAtIndex:i];
NSArray *thumbnails = [[(GDataEntryYouTubeVideo *)entry mediaGroup] mediaThumbnails];
NSData *data = [NSData dataWithContentsOfURL:[NSURL URLWithString:[[thumbnails objectAtIndex:0] URLString]]];
UIImage *thumbnail = [UIImage imageWithData:data];
It's also worth noting that all thumbnails are currently in a 4:3 aspect ratio, for historical reasons. If the underlying video is 16:9 and you plan on using a 16:9 player, then it makes sense to position the thumbnail so that the top and bottom black bars are hidden. That's independent of whether you use a lower-resolution or higher-resolution thumbnail.
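A minimal sketch of that "hide the bars" idea on the client side; the 16:9 target and the Core Graphics center crop are my own assumptions, not anything the API gives you:

// Hypothetical helper: crop a letterboxed 4:3 thumbnail to a 16:9 strip through its center.
UIImage *WidescreenThumbnail(UIImage *thumbnail) {
    CGFloat targetHeight = thumbnail.size.width * 9.0f / 16.0f;
    CGRect cropRect = CGRectMake(0.0f,
                                 (thumbnail.size.height - targetHeight) / 2.0f,
                                 thumbnail.size.width,
                                 targetHeight);
    CGImageRef croppedRef = CGImageCreateWithImageInRect(thumbnail.CGImage, cropRect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);
    return cropped;
}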
Well, this answer gave me the hint, so I went ahead and took a guess: How do I get a YouTube video thumbnail from the YouTube API?
Turns out if I just change the index from 0 to 1 of my data object, I get a higher quality thumbnail. Magic, easy.
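In code that is a one-character change, though it is probably worth guarding against entries that only expose a single thumbnail; the bounds check here is my addition:

// Prefer the larger thumbnail when the feed exposes more than one entry.
NSUInteger preferredIndex = ([thumbnails count] > 1) ? 1 : 0;
NSURL *thumbURL = [NSURL URLWithString:[[thumbnails objectAtIndex:preferredIndex] URLString]];
NSData *data = [NSData dataWithContentsOfURL:thumbURL];
UIImage *thumbnail = [UIImage imageWithData:data];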
I am working on an iOS project that uses AV-Out to show contents in a 1280x720 window on a second screen.
I have an MPMoviePlayerController's view as the background and, on top of that, various other elements like UIImages and UILabels.
The background movie plays in a loop.
Now I want to overlay the whole view including all visible elements with another fullscreen animation that has transparency so that only parts of the underlying view are visible.
I first tried a PNG animation with UIImageView.
I was surprised to find that it actually works on an iPhone 5, but of course the PNGs are so large that this uses way too much RAM, and it crashes on everything below the iPhone 4S.
So I need another way.
I figured out how to play a second movie at the same time using AVFoundation.
So far, so good. Now I can play the overlay video, but of course it is not transparent yet.
I also learned that with the GPUImage library I can use GPUImageChromaKeyBlendFilter to filter a color out of a video to make it transparent and then combine it with another video.
What I don't understand yet is the best way to implement it in my case to get the result that I want.
Can I use the whole view hierarchy below the top video as the first input for the GPUImageChromaKeyBlendFilter and a greenscreen-style video as the second input, and show the result live in 720p? How would I do that?
Or would it be better to use GPUImageChromaKeyFilter and just filter the greenscreen-style video, and play it in a view above all other views? Would the background of this video be transparent then?
Thanks for your help!
You'll need to build a custom player using AVFoundation.framework and then use a video with an alpha channel. The AVFoundation framework allows much more robust handling of video without many of the limitations of the MPMedia framework. Building a custom player isn't as hard as people make it out to be. I've written a tutorial on it here: http://www.sdkboy.com/?p=66
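As a rough idea of what "custom player" means here (a sketch only; the file name and the containerView you attach the layer to are placeholders, not code from the tutorial):

// Minimal AVFoundation player: an AVPlayer driving an AVPlayerLayer in your own view.
NSURL *overlayURL = [[NSBundle mainBundle] URLForResource:@"overlay" withExtension:@"mov"];
AVPlayer *overlayPlayer = [AVPlayer playerWithURL:overlayURL];

AVPlayerLayer *overlayLayer = [AVPlayerLayer playerLayerWithPlayer:overlayPlayer];
overlayLayer.frame = containerView.bounds;            // containerView: whatever view sits on the external screen
overlayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[containerView.layer addSublayer:overlayLayer];

[overlayPlayer play];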
Another way:
The SimpleVideoFileFilter example in the framework shows how to load a movie, filter it, and encode it back to disk. Modifying this to perform chroma keying gives the following:
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[filter setColorToReplaceRed:0.0 green:0.0 blue:1.0];
[filter setThresholdSensitivity:0.4];
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"background.jpg"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture addTarget:filter];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[filter addTarget:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
[filter removeTarget:movieWriter];
[movieWriter finishRecording];
}];
The above code will load a movie from the application's resources called sample.m4v, feed it into a chroma key filter that is set to key off of pure blue with a sensitivity of 0.4, attach a background image to use for the chroma keying, and then send all that to a movie encoder which writes Movie.m4v in the application's /Documents directory.
You can adjust the threshold and specific blue tint to match your needs, as well as replace the input image with another movie or other source as needed. This process can also be applied to live video from the iOS device's camera, and you can display the results to the screen if you'd like.
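For the live-camera case just mentioned, a hedged sketch; the session preset, the orientation, and the GPUImageView-backed view are my assumptions:

// Live chroma keying: camera -> chroma key blend filter -> on-screen GPUImageView.
// videoCamera should be an instance variable so it is not deallocated out from under you.
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                  cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

// The camera replaces movieFile as the first input; the background sourcePicture
// wiring stays exactly as in the snippet above.
[videoCamera addTarget:filter];

// To see the result on screen, point the filter at a GPUImageView instead of
// (or in addition to) the movie writer.
GPUImageView *filterView = (GPUImageView *)self.view;   // assumes the controller's view is a GPUImageView
[filter addTarget:filterView];

[videoCamera startCameraCapture];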
On an iPhone 4 running iOS 5.0, chroma keying takes 1.8 ms (500+ FPS) for 640x480 frames of video, 65 ms for 720p frames (15 FPS). The newer A5-based devices are 6-10X faster than that for these operations, so they can handle 1080p video without breaking a sweat. I use iOS 5.0's fast texture caches for both frame uploads and retrievals, which accelerates the processing on that OS version over 4.x.
The one caution I have about this is that I haven't quite gotten the audio to record right in my movie encoding, but I'm working on that right now.
I have a video which was recorded on blue screen.
Is there a way to make it transparent on iOS devices? I know I can do it with pixel shaders and OpenGL, but I'm afraid that the process of decoding video frames, uploading OpenGL textures, and eliminating fragments with a pixel shader will be too slow.
Any suggestions?
It sounds like you want to do some sort of chroma keying with your video. I just added the capability to do this to my GPUImage framework, which as the name indicates uses GPU-based processing to perform these operations many times faster than CPU-bound filters could.
The SimpleVideoFileFilter example in the framework shows how to load a movie, filter it, and encode it back to disk. Modifying this to perform chroma keying gives the following:
NSURL *sampleURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"m4v"];
movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
filter = [[GPUImageChromaKeyBlendFilter alloc] init];
[filter setColorToReplaceRed:0.0 green:0.0 blue:1.0];
[filter setThresholdSensitivity:0.4];
[movieFile addTarget:filter];
UIImage *inputImage = [UIImage imageNamed:@"background.jpg"];
sourcePicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:YES];
[sourcePicture addTarget:filter];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[filter addTarget:movieWriter];
[movieWriter startRecording];
[movieFile startProcessing];
[movieWriter setCompletionBlock:^{
[filter removeTarget:movieWriter];
[movieWriter finishRecording];
}];
The above code will load a movie from the application's resources called sample.m4v, feed it into a chroma key filter that is set to key off of pure blue with a sensitivity of 0.4, attach a background image to use for the chroma keying, and then send all that to a movie encoder which writes Movie.m4v in the application's /Documents directory.
You can adjust the threshold and specific blue tint to match your needs, as well as replace the input image with another movie or other source as needed. This process can also be applied to live video from the iOS device's camera, and you can display the results to the screen if you'd like.
On an iPhone 4 running iOS 5.0, chroma keying takes 1.8 ms (500+ FPS) for 640x480 frames of video, 65 ms for 720p frames (15 FPS). The newer A5-based devices are 6-10X faster than that for these operations, so they can handle 1080p video without breaking a sweat. I use iOS 5.0's fast texture caches for both frame uploads and retrievals, which accelerates the processing on that OS version over 4.x.
The one caution I have about this is that I haven't quite gotten the audio to record right in my movie encoding, but I'm working on that right now.
If you mean you want to render the video but set the blue pixels to transparent, then the only efficient way to do this is with OpenGL. This should be easily possible on iOS devices: video decoding is handled in hardware, and I have several projects where I transfer video frames to OpenGL using glTexSubImage2D; it works fine.
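For reference, a rough sketch of that glTexSubImage2D path, assuming BGRA frames coming out of something like AVPlayerItemVideoOutput or an AVAssetReader, and a texture allocated once up front with glTexImage2D at the video's size; everything here is an assumption about your pipeline, not code lifted from a specific project:

// Upload one decoded BGRA frame into an existing OpenGL ES texture.
// Assumes CVPixelBufferGetBytesPerRow(pixelBuffer) == width * 4; otherwise the
// rows would need to be repacked before uploading.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

glBindTexture(GL_TEXTURE_2D, videoTexture);   // videoTexture: created earlier with glTexImage2D
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                GL_BGRA_EXT,                  // from GL_APPLE_texture_format_BGRA8888
                GL_UNSIGNED_BYTE,
                CVPixelBufferGetBaseAddress(pixelBuffer));

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

The keying itself then lives in the fragment shader: sample the texture, measure how close the texel is to your blue key color, and either discard the fragment or push its alpha toward zero when it is close enough.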