I have done real-time video processing in iOS using the AVFoundation framework, with the help of this link, and I have tested that it works fine. Now I want to use H.264 encoding and decoding (before drawing). I tried to get H.264-encoded data from AVCaptureSession, so I set AVVideoCodecH264 in the AVCaptureSession output's video settings before starting capture:
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               nil];
[captureOutput setVideoSettings:videoSettings];
The code above doesn't produce any change in the output buffer; I get the same buffer format as before. How can I accomplish this? Is it even possible? If so, please help me get started with H.264 in iOS.
There's no direct access to encoded keyframes. You may find some insightful comments about that here: https://stackoverflow.com/questions/4399162/how-to-stream-live-video-from-iphone-to-a-media-server-like-wowza
The AVVideoCodecH264 / AVVideoCodecKey pair may be used as a parameter in the outputSettings of an AVAssetWriterInput:
NSError *error = nil;
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                AVVideoCodecH264, AVVideoCodecKey, nil];
AVAssetWriterInput *assetWriterInput = [[AVAssetWriterInput alloc]
                                        initWithMediaType:AVMediaTypeVideo
                                           outputSettings:outputSettings];
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:fileURL
                                                       fileType:AVFileTypeMPEG4
                                                          error:&error];
if (!error && [assetWriter canAddInput:assetWriterInput])
    [assetWriter addInput:assetWriterInput];
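To actually produce H.264 data from a live capture session with this writer, the captured sample buffers have to be appended to the input. Here is a minimal sketch of that flow; the ivar names (_assetWriter, _assetWriterInput) and the session-start check are assumptions, not part of the answer above:
// Minimal sketch: feed captured frames into the writer (ivar names assumed).
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (_assetWriter.status == AVAssetWriterStatusUnknown) {
        // Start writing at the timestamp of the first captured frame.
        [_assetWriter startWriting];
        [_assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    }
    if (_assetWriter.status == AVAssetWriterStatusWriting &&
        _assetWriterInput.isReadyForMoreMediaData) {
        [_assetWriterInput appendSampleBuffer:sampleBuffer];
    }
}
For live capture, expectsMediaDataInRealTime should also be set to YES on the input.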
I'm porting an app that works with AAC audio files to iOS 6 and I've found some strange behavior: when I try to get the duration of a (valid) AAC audio file, it always returns 0, while in iOS 4 and iOS 5 it works fine.
Is there any bug in the AVAudioPlayer class that affects the duration property? I have read about some troubles with the currentTime property.
Here's the code:
NSURL* urlFichero = [NSURL fileURLWithPath:rutaFichero];
avaPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL: urlFichero error:nil];
segundos = avaPlayer.duration;
NSLog(@"[ControladorFicheros] Fichero: '%@' Duración: '%f'", nombreFichero, segundos);
[avaPlayer stop];
[avaPlayer release];
Thanks ;)
Finally, the problem is that in the newest versions of the API, AVAudioPlayer appears to return the correct duration of a file only once it is ready to play it, which is why my approach was wrong. The correct way to get the duration of a file (in seconds) without playing it is:
AVURLAsset *asset = [[[AVURLAsset alloc] initWithURL:anURI_ToResource
                                             options:[NSDictionary dictionaryWithObjectsAndKeys:
                                                      [NSNumber numberWithBool:YES],
                                                      AVURLAssetPreferPreciseDurationAndTimingKey,
                                                      nil]] autorelease];
NSTimeInterval durationInSeconds = 0.0;
if (asset)
    durationInSeconds = CMTimeGetSeconds(asset.duration);
Swift
let asset = AVURLAsset(url: url, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
let durationInSeconds = CMTimeGetSeconds(asset.duration)
I noticed the same problem. My solution is to use MPMoviePlayerController instead.
MPMoviePlayerController *testPlayer = [[MPMoviePlayerController alloc] initWithContentURL:filePath];
[testPlayer prepareToPlay];
[testPlayer play];
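Note that MPMoviePlayerController's duration property generally becomes meaningful only after preparation has finished. A small sketch of waiting for it via MPMovieDurationAvailableNotification follows; this part is my assumption, not from the original answer:
// Sketch: read the duration once the player has determined it.
[[NSNotificationCenter defaultCenter] addObserverForName:MPMovieDurationAvailableNotification
                                                  object:testPlayer
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSLog(@"Duration: %f seconds", testPlayer.duration);
}];
[testPlayer prepareToPlay];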
I'm using AVAssetReaderTrackOutput to read raw music data from the iPod library. Reading WAV or MP3 files from the iPod library works fine. However, if I read an M4A file I get data that is all 0s or garbage... The M4A file I'm reading is playable and I'm sure it is not encrypted... It seems iOS encrypts the data or outputs 0s when you read an M4A file. Does anyone have a suggestion on how to resolve this issue?
NOTE: If I use AVAssetExportSession to export from an M4A file to an M4A file I get correct raw data, but this means I have to first save it to disk (in the app's Documents folder, for example) and then read it. I'd like to be able to stream it, so I need to find a way to read it directly.
Here is my code; I get correct MP3 or WAV raw data, but not M4A:
NSURL *assetURL = [MediaItem valueForProperty:MPMediaItemPropertyAssetURL];
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
AVAssetTrack* songTrack = [songAsset.tracks objectAtIndex:0];
AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[songAsset.tracks objectAtIndex:0] outputSettings:nil];
NSError *assetError = nil;
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:songAsset
error:&assetError];
[assetReader addOutput: assetReaderOutput];
//Retain
self.mStreamAssetReader = assetReader;
self.mStreamAssetReaderOutput = assetReaderOutput;
Assuming that by "raw music data" you mean raw PCM audio, you should pass an initialized NSDictionary into your AVAssetReaderTrackOutput with the appropriate parameters so that iOS will decode the AAC bitstream inside the .m4a file for you:
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                          [NSNumber numberWithFloat:48000.0], AVSampleRateKey,
                          [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                          [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                          [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                          [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                          nil];
AVAssetReaderOutput *assetReaderOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[songAsset.tracks objectAtIndex:0]
                                               outputSettings:settings];
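With the output configured this way, the read loop itself is unchanged. As a rough sketch of pulling the decoded PCM out (the loop body is my illustration, not part of the original answer):
// Sketch: read decoded PCM sample buffers from the asset reader.
[assetReader startReading];
CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [assetReaderOutput copyNextSampleBuffer])) {
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    NSMutableData *pcmData = [NSMutableData dataWithLength:length];
    // Copy the decoded PCM bytes out of the block buffer.
    CMBlockBufferCopyDataBytes(blockBuffer, 0, length, pcmData.mutableBytes);
    // ... consume pcmData ...
    CFRelease(sampleBuffer);
}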
I am using the GData API to upload video to YouTube from an iOS application.
It successfully uploads the video, but the audio is missing from it.
I am using the .mp4 video format.
Does anyone have a clue?
Thanks
-(BOOL)setupWriter {
    NSError *error = nil;
//  NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
//  NSString *documentsDirectory = [paths objectAtIndex:0];
//  NSURL *url = [NSURL URLWithString:documentsDirectory];
//  url = [url URLByAppendingPathComponent:@"om.mp4"];
//  NSString *path = [documentsDirectory stringByAppendingPathComponent:@"om.mp4"];
//  [data writeToFile:path atomically:YES];
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/movie.mp4"]];
    _videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                             fileType:AVFileTypeQuickTimeMovie
                                                error:&error];
    NSParameterAssert(_videoWriter);

    // Add video input
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithDouble:128.0*1024.0], AVVideoAverageBitRateKey,
                                           nil];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:192], AVVideoWidthKey,
                                   [NSNumber numberWithInt:144], AVVideoHeightKey,
                                   videoCompressionProps, AVVideoCompressionPropertiesKey,
                                   nil];
    _videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                            outputSettings:videoSettings] retain];
    float angle = M_PI/2; // rotate 90°, i.e. π/2 radians
    _videoWriterInput.transform = CGAffineTransformMakeRotation(angle);
    NSParameterAssert(_videoWriterInput);
    _videoWriterInput.expectsMediaDataInRealTime = YES;

    // Add the audio input
    AudioChannelLayout acl;
    bzero(&acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

    NSDictionary *audioOutputSettings = nil;
    // Both types of audio input cause the output video file to be corrupted.
    if (NO) {
        // should work from iPhone 3GS on and from iPod 3rd generation
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
    } else {
        // should work on any device, requires more space
        audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               [NSNumber numberWithInt:kAudioFormatAppleLossless], AVFormatIDKey,
                               [NSNumber numberWithInt:16], AVEncoderBitDepthHintKey,
                               [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                               [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                               [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                               nil];
    }
    _audioWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                            outputSettings:audioOutputSettings] retain];
    _audioWriterInput.expectsMediaDataInRealTime = YES;

    // add inputs
    [_videoWriter addInput:_videoWriterInput];
    [_videoWriter addInput:_audioWriterInput];
    return YES;
}
This is the writer setup I am using to capture audio. Is something wrong with it?
From the CamStudio Support Forum:
.MOV / .MP4 / .3GPP files
MOV /MP4 /3GPP files are index based. Simply put, this means that
there is an index in the file that tells us the specific location of
where in the file the video and audio frames are present. Without the
index, it is almost impossible to know where the data for a specific
video or audio frame is. This index is contained in what is called a
'moov' atom in the file.
Now, if the index is at the beginning of the file, it will enable
processing of the video as and when successive bytes of the file are
uploaded. On the other hand, if the index is at the end, processing
the video cannot begin until the entire upload is complete - since the
index is needed to interpret the file.
Hence, for MOV / MP4 / 3gpp files, we prefer the "moov" atom in the
beginning of the file - also known as a "fast start"MP4 / MOV file.
There are tools available on the web to flatten your MOV file. Usually
the video editing/export software will have options to create your
files with the moov atom in the beginning rather than the end of your
file. If you are using Apple editing tools, then see this article on
how to produce a "fast start" MP4/MOV file.
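On iOS, one way to produce such a fast-start file is AVAssetExportSession's shouldOptimizeForNetworkUse flag, which places the moov atom at the front. A minimal sketch, assuming sourceURL and destinationURL are set up elsewhere:
// Sketch: re-export a movie with the moov atom at the front ("fast start").
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetPassthrough];
export.outputURL = destinationURL;
export.outputFileType = AVFileTypeMPEG4;
export.shouldOptimizeForNetworkUse = YES; // place the moov atom at the beginning
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"fast-start file written");
    }
}];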
Here's a list of some well-known formats that YouTube supports:
WebM files - VP8 video codec and Vorbis audio codec
.MPEG4, 3GPP and MOV files - typically supporting H.264 and MPEG-4 video
codecs, and the AAC audio codec
.AVI - many cameras output this format - typically the video codec is
MJPEG and the audio is PCM
.MPEGPS - typically supporting the MPEG2 video codec and MP2 audio
.WMV
.FLV - Adobe FLV1 video codec, MP3 audio
I have been working with the GData API the last few days and had no similar problems. I would suggest you check out the file types supported by YouTube: http://www.google.com/support/youtube/bin/answer.py?answer=55744. Can you verify the file you are uploading actually has audio? What does QuickTime say about the format of your audio track? YouTube recommends using AAC (as far as I can tell).
I need to hold some video frames from a capture session in memory and write them to a file when 'something' happens.
Similar to this solution, I use this code to put a frame into an NSMutableArray:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
//...
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
uint8 *baseAddress = (uint8*)CVPixelBufferGetBaseAddress(imageBuffer);
NSData *rawFrame = [[NSData alloc] initWithBytes:(void*)baseAddress length:(height * bytesPerRow)];
[m_frameDataArray addObject:rawFrame];
[rawFrame release];
//...
}
And this to write the video file:
-(void)writeFramesToFile
{
//...
NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:480], AVVideoHeightKey,
AVVideoCodecH264, AVVideoCodecKey,
nil ];
AVAssetWriterInput *bufferAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
AVAssetWriter *bufferAssetWriter = [[AVAssetWriter alloc]initWithURL:pathURL fileType:AVFileTypeQuickTimeMovie error:&error];
[bufferAssetWriter addInput:bufferAssetWriterInput];
[bufferAssetWriter startWriting];
[bufferAssetWriter startSessionAtSourceTime:startTime];
for (NSInteger i = 1; i < m_frameDataArray.count; i++){
NSData *rawFrame = [m_frameDataArray objectAtIndex:i];
CVImageBufferRef imgBuf = [rawFrame bytes];
[pixelBufferAdaptor appendPixelBuffer:imgBuf withPresentationTime:CMTimeMake(1,10)]; //<-- EXC_BAD_ACCESS
[rawFrame release];
}
//... (finishing video file)
}
But something is wrong with the imgBuf reference. Any suggestions? Thanks in advance.
You're supposed to lock the base address before accessing the imageBuffer's properties:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
uint8 *baseAddress = (uint8*)CVPixelBufferGetBaseAddress(imageBuffer);
NSData *rawFrame = [[NSData alloc] initWithBytes:(void*)baseAddress length:(height * bytesPerRow)];
[m_frameDataArray addObject:rawFrame];
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
This is pretty old, but to help those who come after, there are a few issues to be fixed:
Lock/unlock the base address when you copy the data out, as suggested in Alex's answer.
CVImageBufferRef is an abstract base type. You want to use CVPixelBufferCreateWithBytes to make an instance, not just typecast the raw pixel bytes, because the system needs to know the size and format of those pixels; see the sketch after this list.
You should create and store the new CVPixelBuffer directly from the original one's data instead of using an intermediary NSData for storage. That way you only have to do one copy instead of two.
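A minimal sketch of the second point, assuming 32-bit BGRA frames; width, height, bytesPerRow, and presentationTime would come from the original capture and are not defined in the answer above:
// Sketch: wrap copied pixel bytes in a real CVPixelBuffer instead of typecasting.
CVPixelBufferRef pixelBuffer = NULL;
CVReturn result = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                               width,
                                               height,
                                               kCVPixelFormatType_32BGRA,
                                               (void *)[rawFrame bytes],
                                               bytesPerRow,
                                               NULL,   // release callback
                                               NULL,   // release context
                                               NULL,   // pixel buffer attributes
                                               &pixelBuffer);
if (result == kCVReturnSuccess) {
    [pixelBufferAdaptor appendPixelBuffer:pixelBuffer
                     withPresentationTime:presentationTime];
    CVPixelBufferRelease(pixelBuffer);
}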
For days I have been trying to parse a YouTube XML feed using the GData API for iOS.
http://code.google.com/intl/de-DE/apis/youtube/2.0/developers_guide_protocol_channel_search.html
NSDictionary *namespaces = [NSDictionary dictionaryWithObjectsAndKeys:
                            @"http://www.w3.org/2005/Atom", @"",
                            @"http://schemas.google.com/g/2005", @"gd",
                            @"http://a9.com/-/spec/opensearch/1.1/", @"opensearch",
                            @"http://gdata.youtube.com/schemas/2007", @"yt",
                            @"W/\"DkYGRH48fCp7ImA9Wx5WFEw.\"", @"gd:etag",
                            nil];
NSError *error = [[NSError alloc] init];
GDataXMLDocument *doc = [[GDataXMLDocument alloc] initWithData:receivedData options:0 error:nil];
NSArray *elements = [doc nodesForXPath:@"//entry" namespaces:namespaces error:&error];
I don't get any results. Does anyone have a solution to this? Thanks in advance!
This is how I use this API:
NSArray *entries = [doc.rootElement elementsForName:@"entry"];
for (GDataXMLElement *e in entries) {
// do something..
}
There is documentation on the GData Objective-C API, and a sample application for using the YouTube GData API is here.
I encountered a similar problem, it doesn't seem to deal well with a prefixless namespace.
Try changing:
#"http://www.w3.org/2005/Atom", #"",
to
#"http://www.w3.org/2005/Atom", #"atom",
and your xpath:
#"//entry"
to
#"//atom:entry"