I am getting raw AAC data from a web stream and trying to wrap it in ADTS frames in order to play it on the iPhone.
It works for about 10 seconds, then the sound stops and restarts, but it seems accelerated or mixed with other audio data.
QuickTime and other audio apps are unable to open my file.
It seems my ADTS header is wrong, but I am unable to find where.
Is there an ADTS guru out there who could help me?
Here is my ADTS file.
Thanks a lot for your help.
Thierry
PS: do you know of a tool to check and report problems in an audio file?
Found it.
The frame size was not valid.
Also, I found afinfo on OS X, which is very useful for getting file info.
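For future readers, here is a minimal sketch of how the 7-byte ADTS header can be assembled around a raw AAC frame, assuming MPEG-4 AAC-LC with no CRC. The key detail, and exactly the bug described above, is that the 13-bit frame-length field must count the 7 header bytes as well as the AAC payload:

```c
#include <stdint.h>
#include <stddef.h>

/* Write a 7-byte ADTS header (MPEG-4, no CRC) in front of a raw AAC frame.
 * profile:  audio object type, e.g. 2 = AAC-LC (stored as profile - 1)
 * sf_index: sampling-frequency index, e.g. 4 = 44100 Hz
 * channels: channel configuration, e.g. 2 = stereo
 * aac_len:  length of the raw AAC payload in bytes
 */
static void make_adts_header(uint8_t hdr[7], int profile, int sf_index,
                             int channels, size_t aac_len)
{
    size_t frame_len = aac_len + 7;          /* MUST include the header itself */

    hdr[0] = 0xFF;                           /* syncword 0xFFF...              */
    hdr[1] = 0xF1;                           /* ...MPEG-4, layer 0, no CRC     */
    hdr[2] = (uint8_t)(((profile - 1) << 6) |
                       (sf_index << 2) |
                       ((channels >> 2) & 0x1));
    hdr[3] = (uint8_t)(((channels & 0x3) << 6) |
                       ((frame_len >> 11) & 0x3));
    hdr[4] = (uint8_t)((frame_len >> 3) & 0xFF);
    hdr[5] = (uint8_t)(((frame_len & 0x7) << 5) | 0x1F);  /* fullness (hi)    */
    hdr[6] = 0xFC;                           /* fullness (lo), 1 AAC frame     */
}
```

Prepend such a header to every raw AAC frame and the result is a playable .aac stream. If the frame-length field omits the 7 header bytes, decoders lose sync after a while, which matches the "works for 10 seconds, then garbage" symptom.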
I need to develop an app that is capable of receiving an RTSP stream.
I have tried to find solutions/tutorials on the internet for the whole day now, but without any success.
I read a lot about using FFmpeg or Live555 (mostly FFmpeg; I also read that Live555 is not necessary when using the newest version of FFmpeg), but nowhere was it described in a form I could understand, and when I found questions on Stack Overflow the answers were really short and I could not figure out what they were trying to explain.
So now I need to ask here myself.
I used Homebrew to download and install FFmpeg, and when I look at my /usr/local/ directory I can see that the installed files are contained in subfolders of "Cellar".
I also tried to have a look at these projects: RTSPPlay by Mooncatventures and kxmovie by kolyvan.
I did not really figure out how to work with these projects; the documentation is vague and murky.
When I tried to compile them, kxmovie fails with errors like "missing avformat.h".
I added the dylibs from /usr/local/Cellar/ffmpeg/1.2.1/lib to the project, but it seems that this is not the right method.
It is nearly the same issue with the RTSPPlay Xcode project: it reports that an "Entitlements.plist" is missing, and after removing all references to that file I get 99+ Apple Mach-O linker errors; honestly, I could not understand why.
I wanted to try Live555 too, but I can't see through all of its obscure and confusing files; again, I could not make sense of the documentation or of how to build the libraries for iphoneos (I read that it is the easiest way to receive an RTSP stream, but it was the same stack of confusing files as the other projects).
Maybe someone who has tried these projects, or has developed such an application, could help me with their source code; or if somebody understands the contents of the FFmpeg/Homebrew directories, they could explain how to use them. That would probably help me and all the other desperate developers searching for a solution.
Just a little edit: I am trying to receive an RTSP H.264 encoded video stream.
Thanks in advance, Maurice Arikoglu.
(If you need any kind of SourceCode, Links, ScreenShots etc. please let me know)
For anyone searching for a working RTSP player library, have a look at Durfu's Mooncat fork, which I found online; it works perfectly fine on iOS 8. Sample code is included, and if you struggle with the implementation I can help you.
I've examined many of the projects you mentioned and have real experience with them.
You can use the ffmpeg libraries to build an RTSP streaming client, as ffmpeg supports the RTSP protocol. But there is an important issue I've seen in my tests: ffmpeg's RTSP implementation has problems when using UDP as the transport layer. Even with the latest version (2.0.1), many RTP packets are dropped during streaming, so the images sometimes become glitchy. If you use the TCP or HTTP transport layer, it works well.
As for live555, that library works well with both UDP and TCP transports when streaming RTSP. But ffmpeg is much more powerful and has many more capabilities than live555.
If you decide to use ffmpeg, then you should basically follow the steps below:
make a connection - avformat_open_input
find audio/video codecs - avcodec_find_decoder
open audio/video codecs - avcodec_open2
read encoded packets from file/network stream in a loop - av_read_frame
A. a. decode audio packets - avcodec_decode_audio4
   b. fill audio buffers with the AudioUnit/AudioQueue framework
B. a. decode video packets - avcodec_decode_video2
   b. convert YUV pictures to RGB pictures - sws_scale or OpenGL ES shaders
good luck...
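The steps above can be sketched in C against the ffmpeg 1.x/2.x API named in the list. This is a minimal outline, not a full player: the URL is a placeholder, error handling is reduced to early returns, and the audio branch is omitted for brevity.

```c
/* Skeleton of an RTSP client using the ffmpeg 1.x/2.x API.
 * "rtsp://example.com/stream" is a placeholder URL.
 */
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

int main(void)
{
    av_register_all();
    avformat_network_init();

    AVFormatContext *fmt = NULL;
    AVDictionary *opts = NULL;
    /* Force TCP transport to avoid the UDP packet-loss issue noted above. */
    av_dict_set(&opts, "rtsp_transport", "tcp", 0);

    if (avformat_open_input(&fmt, "rtsp://example.com/stream",
                            NULL, &opts) < 0)                /* 1. connect   */
        return 1;
    if (avformat_find_stream_info(fmt, NULL) < 0)
        return 1;

    int vidx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    AVCodecContext *vctx = fmt->streams[vidx]->codec;
    AVCodec *vdec = avcodec_find_decoder(vctx->codec_id);    /* 2. find codec */
    avcodec_open2(vctx, vdec, NULL);                         /* 3. open codec */

    AVPacket pkt;
    AVFrame *frame = avcodec_alloc_frame();
    int got_picture = 0;
    while (av_read_frame(fmt, &pkt) >= 0) {                  /* 4. read loop  */
        if (pkt.stream_index == vidx) {
            avcodec_decode_video2(vctx, frame, &got_picture, &pkt);
            /* if (got_picture): sws_scale to RGB, or upload YUV to GLES */
        }
        av_free_packet(&pkt);
    }

    avformat_close_input(&fmt);
    return 0;
}
```

Note that several of these calls (avcodec_decode_video2, avcodec_alloc_frame, the stream's ->codec field) belong to the ffmpeg generation this answer discusses and were deprecated in later releases.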
I'm using Matt Gallagher's AudioStreamer to perform audio streaming in my app. The URL comes from the server; right now it is in the .m3u8 format, and the problem occurs, whereas before it was of the .mp3 type and streamed normally. It shows the error "No audio data found". However, I tried playing the URL in the Safari browser in the simulator and it plays fine, so there's no problem with the URL.
I've been searching Google for a long time, but it ended with these two SO questions: Question-1 and Question-2. Neither has the answer; yes, one has a solution to play it using MPMoviePlayerController, but I want to stream it with what I already have.
So I dug into the AudioStreamer .h and .m files, where I learned that the file-type selection logic is at line 555:
+ (AudioFileTypeID)hintForFileExtension:(NSString *)fileExtension { .... }
An AudioFileTypeID needs to be returned there, and the list defined in AudioFile.h of AudioToolbox.framework doesn't contain an .m3u8 file type, so I can't return one here (I tried patching it with different types).
I tried to find an alternative type that could be used instead, but got no results. Then I went through the Apple documentation and the Issues discussion, but none of it helped me!
P.S. I've checked AudioToolbox.framework in iOS 6.0 for the availability of the file type, but it doesn't exist at all.
Any solution?
You can try MPMoviePlayerController. I am also using it to play streaming audio in one of my apps. For streaming content (like audio/video from web services or from the internet) it is a perfect player, and it also looks like the iPhone's default player. Search for a tutorial on it and implement it; it is easy to implement.
I have a music file with a particular tone (music.mp4). I want to convert an existing sound file (speech.mp4) into the tone specified in music.mp4; it's like converting speech into some particular tone. I do not want to play both files simultaneously. I want to convert the source file with the help of the music file, so the output will be the converted file.
Is this possible? I searched the Audio Unit Hosting and Multimedia guides, but did not get any clue.
Thanks in advance.
The answer is: it sounds (no pun intended) like it would be possible on iOS. You just need to find someone who knows how to program that specific functionality. I do not know why you would expect to find the answer in the Apple docs. I want to know if it's possible to program music that plays backwards. I want to know if I can program a sound that converts my words into something my dog understands. I can't imagine the documentation could possibly cover everything everyone ever wants to program an iPhone to do.
There are no public iOS APIs for frequency analysis of audio files; you would have to write your own DSP code for that. The AVFoundation and Accelerate frameworks have some audio file conversion and math functions that may help, but those are only a small portion of the code needed.
I have a program on the server side that keeps generating a series of JPEG files, and I want to play these files in the client browser as a video stream at a desired frame rate (the video should play while the new JPEG files are being generated). Meanwhile, I have a WAV file at hand, and I want to play it on the client side while the streaming video is playing.
Is there any way to do this? I have done plenty of research but can't find a satisfactory solution; the results are either just for video streaming or just for audio streaming.
I know that mjpg-streamer at http://sourceforge.net/projects/mjpg-streamer/ is capable of serving streaming video in MJPG format from JPEG files, but it doesn't look like it can stream audio.
I am very new to this area, so a more detailed explanation would be greatly appreciated. Thank you so much!
P.S. A solution/library in C++ is preferred, but anything else would help as well. I am working on Linux.
The browser should be able to do this natively, no? Firefox certainly can, if you simply give it the correct URL of the streaming MJPEG source. The MJPEG stream must be properly formatted, though.
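For what it's worth, "properly formatted" here means an HTTP response of Content-Type multipart/x-mixed-replace, in which each JPEG is one part preceded by a boundary line and per-part headers. A minimal sketch of the per-frame header in C (the boundary string "frame" is just an example; any token announced in the response's Content-Type works):

```c
#include <stdio.h>
#include <string.h>

/* Format the headers that precede one JPEG frame in an MJPEG-over-HTTP
 * (multipart/x-mixed-replace) stream. Returns the number of bytes written,
 * or a negative value on error. The JPEG bytes follow immediately after. */
static int format_mjpeg_part_header(char *out, size_t out_size,
                                    const char *boundary, size_t jpeg_len)
{
    return snprintf(out, out_size,
                    "--%s\r\n"
                    "Content-Type: image/jpeg\r\n"
                    "Content-Length: %zu\r\n"
                    "\r\n",
                    boundary, jpeg_len);
}
```

Note that a multipart MJPEG stream carries no audio track at all, which is why the WAV file in the question would still have to be delivered and played separately.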
I figured it out. The proper way to do it is to use ffmpeg, libav, and an RTMP server such as Red5.
I am trying to create an MP3 audio file on the device (iPhone).
I have found that there is no way to record directly in MP3 format;
only to a .caf file, which may contain AIFF, linear PCM, etc.
I have also found here that there is an AudioConverterFillComplexBuffer function that might, theoretically, convert a CAF file to MP3, but I can't find any sample code or explanation of how to use this function.
Any help will be greatly appreciated.
--
Michael Kessler
Try using LAME, the open-source MP3 encoder library (libmp3lame):
http://lame.sourceforge.net/
As far as I know, LAME is LGPL-licensed, which may be an issue for you if you are building an iPhone app.
I believe AudioConverterFillComplexBuffer is indeed the way to go. You might want to take a look at the book "Core Audio Rough Cuts", which has explanations and examples of how to convert audio.
Good luck.