I am currently working on an iPhone app that takes short snippets of video and compiles them into an AVMutableComposition, which is then exported to the user's camera roll. The example code I am using prompts the user to select the video to be merged from their camera roll and works flawlessly. The issue I am running into is that in the production app I am passing the videos to be compiled as an array of their URLs. I am able to successfully populate the AVURLAsset with the following code:
newUrl = [NSURL fileURLWithPath:tempURLholder];
Asset0 = [[AVURLAsset alloc] initWithURL:newUrl options:nil];
However, when I then attempt to pass that AVURLAsset on to the AVMutableCompositionTrack, I get a null value with this code:
track0 = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[track0 insertTimeRange:CMTimeRangeMake(kCMTimeZero, Asset0.duration) ofTrack:[[Asset0 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
And when I attempt to just pass the raw asset, I get an 'incompatible pointer sending AVURLAsset to AVAssetTrack' error with this code:
track0 = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[track0 insertTimeRange:CMTimeRangeMake(kCMTimeZero, Asset0.duration) ofTrack:Asset0 atTime:kCMTimeZero error:nil];
I know I am probably missing something really simple here. Has anyone had experience with this, or would anyone be willing to pass along some tips to get my resources to show? Thanks so much in advance.
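For reference, a minimal sketch of the error-checked version of the same insert, assuming mainComposition has already been created with [AVMutableComposition composition] and using the variable names from the snippets above; the key points are to wait for the asset's tracks to load and to pass a real NSError instead of nil:

[Asset0 loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{
    NSError *loadError = nil;
    if ([Asset0 statusOfValueForKey:@"tracks" error:&loadError] != AVKeyValueStatusLoaded) {
        NSLog(@"Tracks failed to load: %@", loadError);
        return;
    }
    NSArray *videoTracks = [Asset0 tracksWithMediaType:AVMediaTypeVideo];
    if ([videoTracks count] == 0) {
        NSLog(@"No video track in asset at %@", newUrl);
        return;
    }
    track0 = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                          preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *insertError = nil;
    BOOL inserted = [track0 insertTimeRange:CMTimeRangeMake(kCMTimeZero, Asset0.duration)
                                    ofTrack:[videoTracks objectAtIndex:0]
                                     atTime:kCMTimeZero
                                      error:&insertError];
    if (!inserted) {
        NSLog(@"Insert failed: %@", insertError);
    }
}];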
I'm pretty confused by this one and can't think of anything obvious that I am doing wrong.
I can't get iOS to pull an m4a out of my app bundle. For example:
NSURL *clickurl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/fishy2.m4a", [[NSBundle mainBundle] resourcePath]]];
NSError *error;
audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:clickurl
                                                     error:&error];
audioPlayer.delegate = self;
if (error) {
    NSLog(@"Error: %@", [error localizedDescription]);
} else {
    [audioPlayer play];
}
I get error -43 (fnfErr, file not found).
However, if I instead use an identical mp3:
NSURL *clickurl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/fishy.mp3", [[NSBundle mainBundle] resourcePath]]];
everything is happy
I've tried this many, many times across different projects and can't see where I'm going wrong. I've been using NSFileManager to check whether the file is there, and it says no to the m4a but yes to the mp3. I've tried all manner of different methods of importing different audio files in different formats, and I can't get it to find the m4a (and it really does have to be m4a); mp3, wav, caf, etc. all work. Interestingly, the m4as that I run from the user documents directory work just fine.
All I want is to be able to copy the file!
Any ideas at all?
Workaround: right-click and use "Add Files to …" instead of dragging.
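If it helps, a minimal sanity check is to ask the bundle for the file directly; if pathForResource: returns nil, the m4a was never copied into the bundle (check the target's "Copy Bundle Resources" build phase / target membership). A sketch using the file name from the question:

NSString *m4aPath = [[NSBundle mainBundle] pathForResource:@"fishy2" ofType:@"m4a"];
if (m4aPath == nil) {
    // The file is not in the built app bundle at all, so any URL built from
    // resourcePath will fail with a file-not-found error.
    NSLog(@"fishy2.m4a is missing from the app bundle");
} else {
    NSError *error = nil;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:m4aPath]
                                                         error:&error];
    if (error) {
        NSLog(@"Error: %@", [error localizedDescription]);
    } else {
        [audioPlayer play];
    }
}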
I am writing a small POC to play video in an iPhone app. I would like to do so in a UIWebView with the help of HTML5. I don't want to use standard players like AVPlayer or MPMoviePlayerController.
Can someone throw some light on this?
I am new to this HTML5 environment on iPhone, so any help getting this POC started would be appreciated.
I forgot to add that I would like to play an H.264 live stream from a server.
Thanks in advance.
Yes, you can do that:
NSString *bundleDirectory = [[NSBundle mainBundle] pathForResource:@"mov" ofType:nil];
NSString *filePath = [bundleDirectory stringByAppendingPathComponent:@"loopvideo.html"];
NSString *HTMLData = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil];
[self.web loadHTMLString:HTMLData baseURL:[NSURL fileURLWithPath:bundleDirectory]];
In your HTML, you can embed the <video> tag and it should play.
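One caveat: by default UIWebView plays video fullscreen and only after a user tap, so depending on the desired behaviour you may also need to set these properties on the web view before loading the HTML (assuming self.web is the UIWebView from the snippet above):

self.web.allowsInlineMediaPlayback = YES;        // let the <video> element play inline in the page
self.web.mediaPlaybackRequiresUserAction = NO;   // allow playback to start without a tap

For inline playback the <video> element itself also needs the webkit-playsinline attribute.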
I have debugged this code: in url I am getting the "fileName" string, which is what I want, but in _audio I am getting null.
I am using an AVAudioPlayer to play and stream audio files. I play audio files from the resources and the documents directory, but if a file is absent from the documents directory then I want to play that specific audio from its URL, by streaming and buffering it. Here is my code:
NSString *fileName = @"http://ilm.coeus-solutions.de/frontend/images/audios/mp3test.mp3";
fileName = [fileName stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSURL *url = [[NSURL alloc] initWithString:fileName];
AVAudioPlayer *_audio = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
You can try the following code:
NSData *filelocation = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:@"http://ilm.coeus-solutions.de/frontend/images/audios/mp3test.mp3"]];
// Note: this downloads the entire file into memory before playback rather than streaming it.
MyAudioPlayer = [[AVAudioPlayer alloc] initWithData:filelocation error:nil];
Hope this will solve your problem.
Salam, dear! Please read my question again; I asked about streaming the audio rather than downloading it and then playing it.
The link below helped me. By the way, thanks to all who tried to help me.
http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
Thanks!
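For anyone finding this later, a minimal sketch of the streaming approach with AVPlayer, which buffers and plays HTTP audio itself (AVAudioPlayer does not support network streams, which is why _audio came back nil above); self.streamPlayer is a hypothetical retained property so the player isn't deallocated mid-playback:

NSURL *streamURL = [NSURL URLWithString:@"http://ilm.coeus-solutions.de/frontend/images/audios/mp3test.mp3"];
self.streamPlayer = [AVPlayer playerWithURL:streamURL];   // starts buffering the remote file
[self.streamPlayer play];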
I am using the MPMediaItem API to gather assets from the iPod library. There is a strange bug I am running into after the application has been running for a while: I will run the following code and the AVURLAsset will have no associated tracks.
NSURL* url = [iPodSong valueForProperty:MPMediaItemPropertyAssetURL];
mAssetToLoad = [[AVURLAsset alloc] initWithURL:url options:nil];
bool protectedCon = mAssetToLoad.hasProtectedContent;
bool exportable = true;//mAssetToLoad.exportable; //4.3 only
if (!protectedCon && exportable) {
    AVAssetTrack *songTrack = [mAssetToLoad.tracks objectAtIndex:0];
    // CRASH: tracks is of size 0
The asset where the problem occurs seems to change, and if I restart the app and load the same asset again, it loads correctly.
Has anyone seen this before? Any idea what I might be doing wrong?
I believe the problem was that the CMSampleBufferRefs weren't being released, so too many items from the iPod library were open at one time. The same behavior happened on other devices; it just took longer to show up on those with more RAM.
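For anyone hitting the same thing, a minimal sketch of what releasing the buffers looks like in a typical AVAssetReader read loop (the trackOutput name here is hypothetical, not from the code above):

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [trackOutput copyNextSampleBuffer]) != NULL) {
    // ... process the sample data ...

    // copyNextSampleBuffer returns a +1 retained buffer; if it is never
    // released, the iPod library asset stays open and later assets can
    // come back with zero tracks.
    CFRelease(sampleBuffer);
    sampleBuffer = NULL;
}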
Before I post a lot of code, here is the scenario:
Using code based on AVEditDemo from WWDC, I capture a movie using the standard control in portrait.
I post process the video using code identical to that in AVEditDemo which uses Core Animation.
When I play the resultant video using the Camera app, it is rotated 90 degrees, is no longer "portrait" (it is now in landscape), and is squashed. (The aspect ratio seems to have been swapped: width -> height and height -> width.)
Have spent many hours on this and am at a loss.
The desired result is a movie identical to the captured original. (With an animated overlay eventually).
To see this in action just download and run the AVEditDemo from Apple, turn Title "ON" and export the movie.
I guess the short answer is this:
When processing the original video, you want to retrieve the 'preferredTransform':
AVAssetTrack *sourceVideo = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
CGAffineTransform preferredTransform = [sourceVideo preferredTransform];
and then when writing the final video you will do something similar to this:
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack setPreferredTransform:preferredTransform];
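If the export also uses an AVMutableVideoComposition (as AVEditDemo does for the Core Animation title), the composition track's preferredTransform is not applied during rendering, so the rotation and the swapped dimensions may also need to be expressed there. A sketch under that assumption, reusing the variables above:

CGSize naturalSize = sourceVideo.naturalSize;

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
// Swap width and height because the source was captured rotated 90 degrees.
videoComposition.renderSize = CGSizeMake(naturalSize.height, naturalSize.width);
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
// Use the composition's duration after the source tracks have been inserted.
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [composition duration]);

AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
[layerInstruction setTransform:preferredTransform atTime:kCMTimeZero];

instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
videoComposition.instructions = [NSArray arrayWithObject:instruction];

// Finally, assign videoComposition to the export session's videoComposition property before exporting.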