How to convert audio binary on server from iPhone POST request?

I have a working POST request that will send a CAF audio file (recorded on the iPhone with AVAudioRecorder) to a web server -- the request itself is structured similarly to Send an audio FILE and JSON string in iOS.
The server correctly receives the file, but I'm having trouble converting it into a format that will play directly in a browser. In addition to every audio/iPhone/server post I could find on Stack Overflow, I've looked into ffmpeg and HTML5 audio as well, but couldn't find clear instructions on how to convert the received audio binary to a browser-playable format. Essentially, I need to understand how to accomplish Arun's server-side suggestion here.
I know that Audacity, Soundbooth, etc. will allow you to use the raw data and save it as another format, but I need this to be done programmatically on the server. If there are any suggestions they would be very much appreciated! Thank you.
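A minimal sketch of the kind of server-side step the question is asking about, assuming ffmpeg is installed on the server; the paths, binary location, and output format below are placeholders, not anything from the original post:

import Foundation

// Convert an uploaded CAF file to MP3 by shelling out to ffmpeg.
// Assumes ffmpeg lives at /usr/bin/ffmpeg; the paths are placeholders.
func convertToBrowserPlayable(input: String, output: String) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/ffmpeg")
    // -y overwrites an existing output file; ffmpeg infers the codec from the extension.
    process.arguments = ["-y", "-i", input, output]
    try process.run()
    process.waitUntilExit()
    guard process.terminationStatus == 0 else {
        throw NSError(domain: "ffmpeg", code: Int(process.terminationStatus))
    }
}

// Usage: convert the received upload to something an <audio> tag can play.
try convertToBrowserPlayable(input: "/uploads/recording.caf",
                             output: "/uploads/recording.mp3")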

Related

Uploading & Storing audio files

We are in the stage of designing our audio application, and we need to support uploading audio files from desktop applications to a cloud server, and also playing those audio files in the desktop applications.
How should we process the files before uploading? Should we turn them into Base64, thus increasing their size by roughly 30%?
Or should we upload them as raw binary files?
What about the different audio formats? Should we transcode them on the client or the server into MP3 or something similar?
Does anybody know what SoundCloud's approach is in this case?
Thank you!
SoundCloud stores the original file, and it transcodes the original for streaming purposes. The original can be downloaded if the owner makes that option available.
If you use HTTP to communicate with your server, you'll need to encode with Base64. If you choose the right framework, this should be easy and possibly transparent to you.
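If you do go the Base64 route, the client-side step is small. A minimal Swift sketch, with a made-up file path and JSON keys:

import Foundation

// Read the recorded audio and wrap it in a JSON payload as Base64 text.
// The file URL and the "audio"/"filename" keys are placeholders.
let fileURL = URL(fileURLWithPath: "/path/to/recording.m4a")
let audioData = try Data(contentsOf: fileURL)

let payload: [String: Any] = [
    "filename": fileURL.lastPathComponent,
    "audio": audioData.base64EncodedString()   // roughly a third larger than the raw bytes
]
let body = try JSONSerialization.data(withJSONObject: payload)
// `body` can now be sent as the HTTP request body with a URLSession upload task.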

Streaming audio/video to iPhone from something other than an HTTP server

I find that most streaming-audio discussions are about streaming media from an HTTP server, e.g. AudioStreamer from Cocoa with Love or MPMoviePlayerController. They both init with an NSURL. My case is different: I use SMB to access the media files on a Windows shared server. The media content arrives in SMB messages (through a socket) and is accumulated in memory (NSMutableData).
So is there a way to play that data (the NSMutableData) before the download is finished?
Update: for streaming audio, I understand I need Audio Queue Services.
What about streaming video over something other than HTTP? I think it is doable, because there is a free app called TIOD which streams not only audio but also video from an SMB server.
BTW, I don't expect others to do the work for me. I've checked all the documentation I can find and can't find a way to do it (for video). I had thought that might mean it can't be done, but then I found that TIOD can do it. That's why I raised the question in the first place, to see if others have experience with it.
Yes, you can stream that as well; it's the same thing as getting the data from an NSURL. If you look at the audio streaming example by Matt Gallagher here, you'll see that he is getting data from some URL, but ultimately, when he calls the parse function, he is giving it bytes of data. The same should apply to your situation: with the data you get, you should be able to call the parse function and have the audio player stream your audio file.
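In current SDK terms, the "parse function" is AudioFileStreamParseBytes from Audio Toolbox. A rough Swift sketch of feeding it chunks as they arrive from the SMB socket; the callbacks are stubs and the file-type hint is an assumption:

import AudioToolbox
import Foundation

// Called when the parser discovers stream properties; a real player would read
// kAudioFileStreamProperty_DataFormat here and create an audio queue from it.
let propertyListener: AudioFileStream_PropertyListenerProc = { _, _, _, _ in }

// Called with parsed packets; a real player would copy them into audio queue
// buffers and enqueue them for playback.
let packetsListener: AudioFileStream_PacketsProc = { _, _, _, _, _ in }

var streamID: AudioFileStreamID?
AudioFileStreamOpen(nil, propertyListener, packetsListener,
                    kAudioFileMP3Type,   // hint only; pass 0 if the format is unknown
                    &streamID)

// Feed each chunk to the parser as it arrives from the SMB socket,
// instead of waiting for the whole file to download.
func feed(_ chunk: Data) {
    chunk.withUnsafeBytes { raw in
        _ = AudioFileStreamParseBytes(streamID!, UInt32(chunk.count),
                                      raw.baseAddress!, [])
    }
}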

MJPG streaming with audio

I have a program on the server side that keeps generating a series of JPEG files, and I want to play these files in the client browser as a video stream at a desired frame rate (the video should be playing while new JPEG files are still being generated). Meanwhile, I have a WAV file on hand, and I want to play it on the client side while the streaming video is playing.
Is there any way to do it? I have done plenty of research but can't find a satisfactory solution; everything I find is either just for video streaming or just for audio streaming.
I know mjpg-streamer at http://sourceforge.net/projects/mjpg-streamer/ is capable of streaming video in MJPG format from JPEG files, but it doesn't look like it can stream audio.
I am very new to this area, so a more detailed explanation would be extremely appreciated. Thank you so much!
P.S. A solution/library in C++ is preferred, but anything else would help as well. I am working on Linux.
The browser should be able to do this natively, no? Firefox can certainly do this if you simply give it the correct URL of the streaming MJPEG source. The MJPEG stream should be properly formatted.
I figured it out. The proper way of doing it is to use ffmpeg, libav and an RTMP server, such as red5.
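For concreteness, that pipeline looks roughly like the sketch below, driving ffmpeg from code; the flags, paths, frame rate, and RTMP URL are illustrative assumptions rather than a tested configuration:

import Foundation

// Push the generated JPEGs (written as frame_0001.jpg, frame_0002.jpg, ...)
// plus the WAV track to an RTMP server such as red5 by driving ffmpeg.
let ffmpeg = Process()
ffmpeg.executableURL = URL(fileURLWithPath: "/usr/bin/ffmpeg")
ffmpeg.arguments = [
    "-framerate", "15", "-i", "frames/frame_%04d.jpg",  // JPEG sequence as the video input
    "-i", "audio.wav",                                   // the WAV track
    "-c:v", "libx264", "-c:a", "aac",                    // widely playable codecs
    "-f", "flv", "rtmp://localhost/live/stream"          // publish to the RTMP server
]
try ffmpeg.run()
ffmpeg.waitUntilExit()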

iPhone XML attachments

My iPhone app will be receiving an XML feed from a Java web service. The XML is in a SOAP message. I can easily parse data from within the XML; however, there is a JPEG attachment to the SOAP message that I need to display within the iPhone app. Does anyone have example code or a link to documentation on how to work with SOAP attachments on the iPhone?
Thanks
I'm not 100% certain, but I'm reasonably sure that wsdl2objc supports binary attachments. Documentation is sparse, however.
That said, if you are parsing the message by hand, take a look at the Message format from the W3C; it's just MIME attachments, which means the data is going to be Base64 encoded. Unfortunately, the iPhone SDK doesn't have a base64 encoder/decoder, so you'll have to roll your own or use a third-party library. This article should help you roll your own.
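Language aside, a roll-your-own decoder comes down to something like this minimal sketch (not taken from the linked article):

// Minimal base64 decoder: maps each character to its 6-bit value,
// accumulates bits, and emits a byte whenever 8 or more have been collected.
let base64Alphabet = Array("ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/")

func base64Decode(_ input: String) -> [UInt8] {
    var valueFor = [Character: UInt32]()
    for (index, char) in base64Alphabet.enumerated() { valueFor[char] = UInt32(index) }

    var buffer: UInt32 = 0
    var bitCount = 0
    var bytes = [UInt8]()
    for char in input {
        guard let value = valueFor[char] else { continue }  // skips '=', CR/LF, etc.
        buffer = (buffer << 6) | value
        bitCount += 6
        if bitCount >= 8 {
            bitCount -= 8
            bytes.append(UInt8((buffer >> bitCount) & 0xFF))
        }
    }
    return bytes
}

// Example: "SGVsbG8=" decodes to the bytes of "Hello".
let decoded = base64Decode("SGVsbG8=")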

How do I parse the PLS playlist format in an iPhone app?

I'm developing an iPhone application for a radio station. I need to parse playlist.pls and playlist.qtl to get the audio stream URL present in them. I'm stuck on that.
Since PLS uses the INI format, you can parse it the way you would any properties/INI file. QTL is an XML format; you can find a lot of examples of parsing XML.
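For the PLS side, a minimal sketch of pulling the FileN entries out; the playlist contents below are illustrative:

import Foundation

// A PLS playlist is INI-style text: a [playlist] section with File1=, Title1=,
// Length1=, ... entries. The stream URLs live in the FileN keys.
func streamURLs(fromPLS text: String) -> [URL] {
    var urls = [URL]()
    for rawLine in text.components(separatedBy: .newlines) {
        let line = rawLine.trimmingCharacters(in: .whitespaces)
        // Only key=value lines whose key starts with "File" carry a stream URL.
        guard line.lowercased().hasPrefix("file"),
              let equals = line.firstIndex(of: "=") else { continue }
        let value = String(line[line.index(after: equals)...])
            .trimmingCharacters(in: .whitespaces)
        if let url = URL(string: value) { urls.append(url) }
    }
    return urls
}

// Example input in the usual PLS shape:
let pls = """
[playlist]
File1=http://example.com/stream.mp3
Title1=Example Radio
Length1=-1
NumberOfEntries=1
Version=2
"""
let urls = streamURLs(fromPLS: pls)   // [http://example.com/stream.mp3]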