I have downloaded the ffmpeg libraries for iPhone and compiled them. My objective is to create a movie file from a series of images using the ffmpeg libraries. Documentation for ffmpeg on the iPhone is very sparse. I checked an app called iFrameExtractor, but it does the opposite of what I want: it extracts frames from a video.
On the command line there is a command:
ffmpeg -f image2 -i image%d.jpg video.mov
This turns a series of images into a video. I checked it on my Mac and it works fine. What I want to know is how to do the equivalent on the iPhone, or rather which class/API or method to call. There are a couple of apps that do this on the iPhone, though I'm not sure whether they use ffmpeg; for reference, "Time lapser" and "reel moments".
Thanks in advance
Do you need sound?
If you do not need sound, you can try OpenCV.
I think it is easier to use: you just call a function to append a picture to the movie:
http://opencv.willowgarage.com/documentation/c/reading_and_writing_images_and_video.html
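If you go that route, the flow with OpenCV's old C interface would look roughly like the sketch below. This is only a sketch: the file names and the MJPG codec choice are placeholders, and whether that codec is actually available in an iPhone build of OpenCV is a separate question.

/* Sketch: append a numbered series of JPEG images to a movie using the
 * OpenCV 1.x C API described in the linked documentation.
 * "image%d.jpg" / "video.avi" and the codec are placeholder choices. */
#include <opencv/highgui.h>
#include <stdio.h>

void images_to_movie(int num_frames)
{
    CvVideoWriter *writer = NULL;

    for (int i = 1; i <= num_frames; i++) {
        char name[64];
        snprintf(name, sizeof(name), "image%d.jpg", i);

        IplImage *img = cvLoadImage(name, 1 /* load as color */);
        if (!img)
            continue;

        /* Create the writer lazily so the frame size matches the images. */
        if (!writer)
            writer = cvCreateVideoWriter("video.avi",
                                         CV_FOURCC('M', 'J', 'P', 'G'),
                                         25 /* fps */,
                                         cvGetSize(img), 1);

        cvWriteFrame(writer, img);   /* append one picture to the movie */
        cvReleaseImage(&img);
    }

    if (writer)
        cvReleaseVideoWriter(&writer);
}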
Someone please correct me if I'm wrong, but I don't believe iPhone apps can call external executables (which makes sense, since that would effectively be multitasking). You'll have to use the libav* libraries to accomplish what you're after. Compile ffmpeg for the iPhone:
http://github.com/gabriel/ffmpeg-iphone-build
then read the API docs at ffmpeg.org. For each library there is an example file in the ffmpeg source that demonstrates that library's core functions, e.g. in libavcodec there is a file called api-example.c, and in libavformat there is one called output-example.c.
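To give a concrete idea of the shape of that code, here is a rough sketch modeled on the old api-example.c encoding loop. The load_frame_into() helper is hypothetical (it stands in for decoding/scaling each image into a YUV420P frame), and the output is a raw video elementary stream; for a proper .mov/.mp4 container you would add the libavformat muxing calls shown in output-example.c.

/* Sketch of the old (2010-era) libavcodec encoding loop, adapted to the
 * "series of images to video" case.  load_frame_into() is hypothetical. */
#include <libavcodec/avcodec.h>
#include <stdio.h>
#include <stdlib.h>

extern void load_frame_into(AVFrame *picture, int frame_index); /* hypothetical */

void encode_images_to_video(const char *outfile, int num_frames)
{
    avcodec_init();            /* required before using libavcodec (old API) */
    avcodec_register_all();

    AVCodec *codec = avcodec_find_encoder(CODEC_ID_MPEG4);
    AVCodecContext *c = avcodec_alloc_context();
    c->bit_rate  = 400000;
    c->width     = 320;
    c->height    = 240;
    c->time_base = (AVRational){1, 25};   /* 25 fps */
    c->gop_size  = 10;
    c->pix_fmt   = PIX_FMT_YUV420P;
    avcodec_open(c, codec);

    FILE *f = fopen(outfile, "wb");
    int outbuf_size = 100000;
    uint8_t *outbuf = malloc(outbuf_size);
    AVFrame *picture = avcodec_alloc_frame();
    avpicture_alloc((AVPicture *)picture, c->pix_fmt, c->width, c->height);

    int out_size;
    for (int i = 0; i < num_frames; i++) {
        load_frame_into(picture, i);        /* fill the YUV planes for image i */
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, picture);
        fwrite(outbuf, 1, out_size, f);
    }

    /* the encoder may buffer frames internally; flush the delayed ones */
    while ((out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL)) > 0)
        fwrite(outbuf, 1, out_size, f);

    fclose(f);
    free(outbuf);
    avpicture_free((AVPicture *)picture);
    av_free(picture);
    avcodec_close(c);
    av_free(c);
}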
I have a music file with a particular tone (music.mp4). I want to convert an existing sound file (speech.mp4) into the tone specified in music.mp4. It's like converting speech into a particular tone. I do not want to play both files simultaneously; I want to convert the source file with the help of the music file, so the output file will be the converted file.
Is it possible? I searched the Audio Unit Hosting and Multimedia guides, but did not find any clue.
Thanks in advance.
The answer is: it sounds (no pun intended) like it would be possible on iOS. You just need to find someone who knows how to program that specific functionality. I do not know why you would expect to find the answer in the Apple docs. I want to know if it's possible to program music that plays backwards. I want to know if I can program a sound that converts my words into something my dog understands. I can't imagine the documentation could possibly cover everything everyone ever wants to program an iPhone to do.
There are no iOS public APIs for frequency analysis of audio files. You would have to write your own DSP code for that. The AVFoundation and Accelerate frameworks have some audio file conversion and math functions that may help, but that is only a small portion of the code needed.
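As a rough illustration of the kind of DSP code involved, the sketch below computes a magnitude spectrum for one block of samples using the Accelerate framework's vDSP FFT routines. Mapping that spectrum onto a target "tone" and resynthesizing audio is the genuinely hard part and is not shown.

/* Magnitude spectrum of one block of mono float samples via vDSP.
 * n must be a power of two; magnitudes must hold n/2 floats.
 * (vDSP's real FFT output is scaled; see the vDSP docs if absolute
 * values matter.) */
#include <Accelerate/Accelerate.h>
#include <math.h>

void magnitude_spectrum(const float *samples, float *magnitudes, int n)
{
    vDSP_Length log2n = (vDSP_Length)log2f((float)n);
    FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);

    /* Pack the real input into the split-complex layout vDSP expects. */
    float real[n / 2], imag[n / 2];
    DSPSplitComplex split = { real, imag };
    vDSP_ctoz((const DSPComplex *)samples, 2, &split, 1, n / 2);

    /* In-place real FFT, then squared magnitude per frequency bin. */
    vDSP_fft_zrip(setup, &split, 1, log2n, FFT_FORWARD);
    vDSP_zvmags(&split, 1, magnitudes, 1, n / 2);

    vDSP_destroy_fftsetup(setup);
}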
Lately I have been researching the software architecture for iPhone streaming (based on the MMS protocol).
As we know, in order to play back an MMS audio stream, we call libMMS to read the WMA stream data from the remote media server, then call FFmpeg to decode that data from WMA into a PCM buffer, and finally enqueue the PCM buffer into the iPhone's AudioQueue to produce actual sound.
The description above just covers the basic workflow of iPhone streaming. If we only need this simple functionality, it is not difficult: just call libMMS, FFmpeg, and AudioQueue step by step as described, and streaming works. I actually implemented this last week.
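For reference, the playback end of that pipeline (handing decoded PCM to an AudioQueue) looks roughly like the sketch below. The fill_pcm_from_decoder() helper is a placeholder for whatever the libMMS + FFmpeg decode path produces, and the sample format is an assumption you would adjust to the decoder's actual output.

#include <AudioToolbox/AudioToolbox.h>
#include <stdbool.h>

/* Hypothetical: fills buf with up to maxBytes of PCM produced by the
 * libMMS + FFmpeg pipeline and returns the byte count (0 at end). */
extern int fill_pcm_from_decoder(void *decoderState, void *buf, int maxBytes);

/* The queue pulls data through this callback; we refill the buffer with
 * decoded PCM and hand it straight back. */
static void aq_output_callback(void *userData, AudioQueueRef aq,
                               AudioQueueBufferRef buffer)
{
    int bytes = fill_pcm_from_decoder(userData, buffer->mAudioData,
                                      buffer->mAudioDataBytesCapacity);
    if (bytes <= 0) {
        AudioQueueStop(aq, false);          /* end of stream */
        return;
    }
    buffer->mAudioDataByteSize = (UInt32)bytes;
    AudioQueueEnqueueBuffer(aq, buffer, 0, NULL);
}

void start_pcm_playback(void *decoderState)
{
    /* Assumed format: 16-bit signed, stereo, 44.1 kHz interleaved PCM. */
    AudioStreamBasicDescription fmt = {0};
    fmt.mSampleRate       = 44100;
    fmt.mFormatID         = kAudioFormatLinearPCM;
    fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger |
                            kLinearPCMFormatFlagIsPacked;
    fmt.mChannelsPerFrame = 2;
    fmt.mBitsPerChannel   = 16;
    fmt.mBytesPerFrame    = fmt.mChannelsPerFrame * (fmt.mBitsPerChannel / 8);
    fmt.mFramesPerPacket  = 1;
    fmt.mBytesPerPacket   = fmt.mBytesPerFrame;

    AudioQueueRef queue;
    AudioQueueNewOutput(&fmt, aq_output_callback, decoderState,
                        NULL, NULL, 0, &queue);

    /* Prime a few buffers so playback has data before it starts. */
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buf;
        AudioQueueAllocateBuffer(queue, 32 * 1024, &buf);
        aq_output_callback(decoderState, queue, buf);
    }
    AudioQueueStart(queue, NULL);
}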
But what I need is not just a simple streaming function! I need a software architecture that lets FFmpeg access libMMS just as it accesses the local filesystem!
Does anybody know how to hook the libMMS interfaces like mms_read/mms_seek onto FFmpeg filesystem interfaces like av_read_frame/av_seek_frame?
I think I have to answer my own question again this time…
After several weeks of research and debugging, I finally got to the truth.
Actually, we don't need to "hook" libMMS onto FFmpeg at all. Why? Because FFmpeg already has its own native MMS protocol module, "mms_protocol" (see mms_protocol.c in the FFmpeg source).
All we need to do is configure FFmpeg to enable the MMS module, like this (see config.h in FFmpeg):
#define ENABLE_MMS_PROTOCOL 1
#define CONFIG_MMS_PROTOCOL 1
After this configuration, FFmpeg adds the MMS protocol to its protocol list (which already contains the local-filesystem protocol). As a result, FFmpeg can treat an "mms://hostserver/abc" media URL like a local media file, so we can still open and read the MMS stream using:
av_open_input_file();
av_read_frame();
just as we did with local media files before!
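For example, the resulting usage looks roughly like this sketch (the same old libavformat calls as for a local file; only the URL changes):

#include <libavformat/avformat.h>

int read_mms_stream(const char *url)    /* e.g. "mms://hostserver/abc" */
{
    av_register_all();                  /* registers formats and protocols,
                                           including mms once it is enabled */

    AVFormatContext *fmt = NULL;
    if (av_open_input_file(&fmt, url, NULL, 0, NULL) != 0)
        return -1;
    if (av_find_stream_info(fmt) < 0)
        return -1;

    AVPacket pkt;
    while (av_read_frame(fmt, &pkt) >= 0) {
        /* hand pkt.data / pkt.size to the WMA decoder here */
        av_free_packet(&pkt);
    }

    av_close_input_file(fmt);
    return 0;
}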
By the way, in my FFmpeg version there are still many bugs in the libavformat MMS-protocol handling. It took me a week to debug them, but I expect it will take much less time for someone as smart as you :-)
I've been searching a lot for ffmpeg on the iPhone and how to use it to stream audio (WMA etc.), but can't figure out how this is done.
Can someone please help me with what to download and how to get the ffmpeg library into my Xcode project, so I can use it to stream some links I have?
Another thing: I read something about the license somewhere. Is it really true that if I use the ffmpeg library, I need to make my project/code open source?
There is this existing question on SO, but…
You might want to read the Media Player framework docs, as the functionality you mention already exists in the iOS SDK for many non-WMA formats. It is probably going to be less of a headache to convert the files to MP3 or another supported format on your server and go from there using the built-in technology that Apple provides.
I have a web service returning a .flv file that has to be played in an iPhone application. How do I play a .flv (Flash) file on the iPhone?
Has anyone faced this scenario? Is it possible to programmatically convert it to some other format and play it on the iPhone?
Thanks.
The iPhone doesn't support Flash content and, judging by Apple's official statements, never will (or at least not in the foreseeable future).
Converting the content to another format on the server side should be easy to do and would allow content playback on an iDevice.
Since the video inside the FLV container is probably already H.264-encoded, you may want to try FLV Extract on the server to avoid recompression:
http://www.videohelp.com/tools/FLV_Extract
Basically you just need to run it once for each of the videos on the server and keep the results around.
I would recommend setting up your web service to use something like ffmpeg ( http://www.ffmpeg.org/ ) to convert the .flv file to an MP4 file, which can be played directly from the iPhone's web browser.
Pioto and Josaih are on the right track in suggesting that you should convert the video server-side using a tool like FFMpeg. As far as I know there is zero support for flv in any part of iOS, so you'd be unable to transcode it locally. Even if you could, it would make your users angry, since transcoding is a resource-intensive process that would kill their battery life and take a significant amount of time.
So, your solution is to transcode your videos to H.264 server-side. However, I'd caution against transcoding from FLV to H.264 if there are any other options available. If you have the original, uncompressed (or at least less-compressed) source video available, you'll get higher-quality video by transcoding that to H.264. Each time lossy compression (e.g., Squeeze or H.264) is applied to a file, you lose some information and quality. If you've ever seen a 3rd- or 4th-generation copy of a VHS tape, you understand what I'm getting at.
Once you have an H.264-formatted video, you can play it on iOS; I'm not sure about the exact details of that part.
You may be able to use ffmpeg or something on your server to transcode it to H.264. I'm not so sure you would really want to do that transcoding on the phone. Given Apple's current stance on Flash, this is probably your best option.
For FLV files, what I do is upload them to Google Drive and watch them in the Google Drive app.
I need to convert an image sequence (i.e., PNG files) to a video file on the iPhone. How can I convert the images to video?
Regards,
Just ignore bad advice like "use ffmpeg". That would work on the desktop, but the license issue makes including ffmpeg source code in your iPhone app legally questionable. Apple provides a class named AVAssetWriter that you can use in your app to encode a series of images as H.264 stored in a .m4v QuickTime container file. While the Apple-provided logic does work, it is not so easy to use, and you will need to read quite a lot of documentation to get the code working. If you want to skip implementing it yourself (and likely save 3 or 4 days of work), please consider using my AVAnimator library for iOS, as the H.264 encoding logic is already implemented in the class AVAssetWriterConvertFromMaxvid. Once encoded as H.264, the video can be played with the standard player and is small enough to upload to a remote server.
You are likely going to need something like FFmpeg.