I'm using Facebook Messenger to send an audio file to my bot. The webhook call that Facebook sends me includes a URL pointing to the audio file, and when I download it, the file is in .mp4 format (weird, since it's supposed to be audio, not video).
I didn't find any documentation as to why it's .mp4, or whether there is an option to get a different format from Facebook. Does anyone know anything about this?
From Wikipedia (found by searching Google for "mpeg 4 audio"):
MPEG-4 Part 14 or MP4 is a digital multimedia container format most commonly used to store video and audio, but it can also be used to store other data such as subtitles and still images.
I can confirm this is the case as I've pulled one of these files down as well.
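If you need a conventional audio file rather than the .mp4 container, it can be remuxed without re-encoding. A minimal sketch, assuming you handle the downloaded file on an Apple platform (the file paths here are placeholders):

import AVFoundation

// Placeholder paths: the downloaded Messenger attachment and the desired output.
let inputURL = URL(fileURLWithPath: "/tmp/attachment.mp4")
let outputURL = URL(fileURLWithPath: "/tmp/attachment.m4a")

// Passthrough keeps the existing AAC audio track; only the container changes.
let asset = AVAsset(url: inputURL)
if let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetPassthrough) {
    export.outputFileType = .m4a
    export.outputURL = outputURL
    export.exportAsynchronously {
        print("Remux finished with status: \(export.status.rawValue)")
    }
}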
I have an iOS app in which users upload a selfie video to my server. Afterwards, a reviewer sees the video on a back-office website and accepts or rejects it.
The problem is that I upload MP4 video files from my iOS client app. When I load the file URL in the <video> tag of my website, Chrome doesn't load the video (no error is displayed in the console). In Safari, everything works fine.
After some research, I saw that Chrome sometimes can't play an mp4 video. Strangely, I tried playing some other mp4 videos from my laptop in Chrome, and they worked. It's as if there were several "kinds" of MP4 encoding, and the one my client app uses is not supported by Chrome.
I saw that Chrome can handle the .webm format, but it takes my server a lot of time (4 entire minutes at 100% CPU...) to convert an MP4 to a WebM.
My questions are:
(1) Why can some mp4 files be played by Chrome while others can't? How can I make Swift's AVFoundation module encode the recorded video with the "right" mp4 encoding?
(2) If that's impossible, I'd like to encode the two files (WebM and MP4) client-side, on the user's device (I really want to avoid handling these computations on my server, as they look extremely long to perform). The problem is that Apple does not list WebM as a possible output format in the documentation: https://developer.apple.com/documentation/avfoundation/avfiletype. Is there any way to convert an MP4 into a WebM in Swift?
(3) If (2) is impossible, is converting a user video into different formats server-side commonly done as a "best practice" to manage video files and make them available on all platforms? I mean, am I just missing some client-side trick to make mp4 videos playable everywhere, or is it normal when dealing with cross-platform video to budget a whole CPU machine from AWS or whatever to handle conversions server-side?
For whatever reason, Swift was encoding with the wrong codec. To encode with the right one, so the video plays in a <video> tag in Chrome, I used the following in my iOS app:
import AVFoundation

// movieFileOutput is an AVCaptureMovieFileOutput; grab its video connection safely.
if let connection = movieFileOutput.connection(with: .video),
   movieFileOutput.availableVideoCodecTypes.contains(.h264) {
    // Use the H.264 codec to encode the video instead of the device default.
    movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.h264], for: connection)
}
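(For context, and not confirmed in the original answer: newer iPhones default to HEVC/H.265 capture, which Chrome historically could not decode while Safari could, so forcing H.264 is what restores cross-browser playback.)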
What is the best way to upload a ~700MB AVCHD (1080p, 50fps) video file to Facebook to get the best performance?
Uploading it the regular way gets the video and audio out of sync, and the video appears to "stutter"...
Maybe convert it to some other format first?
Thanks!
Use HandBrake (http://handbrake.fr) to convert it to mp4 format (the Normal preset).
Is it possible to access raw PCM data from the iPhone audio output?
I know I can embed an MP3 and use AudioUnit. But if the user is playing music in the background from their iTunes library, is it possible to access that audio data?
This is for an app that shows visual effects, which react to the music.
From what I can tell, it isn't possible, but that's just from lack of finding any information at all, rather than actual confirmation that it can't be done.
If it isn't possible to access the audio stream from the iPod library, is it possible to access raw audio output from the Media Player inside an app, or is it pretty much not permitted to access raw audio data from the iTunes library at all?
EDIT: I found this question: iOS - Access output audio from background program, which says I can't access the audio from a background app. But is it possible to get the audio data from the iTunes library if I play it inside the app?
I am busy coding something similar, and as far as I know an AUGraph is needed, with the hardware pulling audio through it. You will have to get the URL of the MPMediaItem for the track the user selected with Apple's MPMediaPickerController. Then use that URL with Core Audio. Core Audio is a beast.
If your app is playing raw audio PCM samples, then your app has access to those samples. An app does not have access to the audio samples that another app (including the Music player) is playing via any public API.
An app can use AVAssetReader and AVAssetWriter to convert MP3 files from the iTunes library into raw audio (WAV) files.
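As a rough sketch of that approach (the function and variable names here are illustrative, not from the original answer): pick a track with MPMediaPickerController, then read its assetURL with an AVAssetReader configured for linear PCM:

import AVFoundation
import MediaPlayer

// mediaItem comes from an MPMediaPickerController selection.
// Note: assetURL is nil for DRM-protected or cloud-only tracks.
func readPCM(from mediaItem: MPMediaItem) throws {
    guard let url = mediaItem.assetURL else { return }
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .audio).first else { return }

    let reader = try AVAssetReader(asset: asset)
    // Ask the reader to decode whatever the track contains into 16-bit linear PCM.
    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    // Pull raw sample buffers; hand their PCM bytes to a visualizer or an AVAssetWriter.
    while let buffer = output.copyNextSampleBuffer() {
        // CMSampleBufferGetDataBuffer(buffer) exposes the PCM bytes.
        _ = buffer
    }
}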
I have a webservice returning a .flv file that has to be played in an iPhone application. How do I play a .flv (Flash) file on the iPhone?
Has anyone faced this scenario? Is it possible to programmatically convert it to some other format and play it on the iPhone?
Thanks.
The iPhone doesn't support Flash content and, judging by Apple's official statements, won't ever (or at least not in the foreseeable future).
Converting the content to another format on the server side should be easy to do and would allow content playback on an iDevice.
Since the video is probably already H.264-encoded inside the FLV container, you may want to try FLV Extract on the server to avoid recompression:
http://www.videohelp.com/tools/FLV_Extract
Basically you just need to run it once for each of the videos on the server and keep the results around.
I would recommend setting up your webservice to use something like ffmpeg ( http://www.ffmpeg.org/ ) to convert the .flv file to an mp4 file which can be played directly from the iPhone's web browser.
Pioto and Josaih are on the right track in suggesting that you should convert the video server-side using a tool like FFMpeg. As far as I know there is zero support for flv in any part of iOS, so you'd be unable to transcode it locally. Even if you could, it would make your users angry, since transcoding is a resource-intensive process that would kill their battery life and take a significant amount of time.
So, your solution is to transcode your videos to H.264 server-side. However, I'd caution against transcoding from FLV to H.264 if any other options are available. If you have the original, uncompressed (or at least less-compressed) source video available, you'll get higher-quality video by transcoding that to H.264. Each time lossy compression (e.g., Squeeze or H.264) is applied to a file, you lose some information and quality. If you've ever seen a 3rd- or 4th-generation copy of a VHS tape, you understand what I'm getting at.
Once you have an H.264-formatted video, you can play it on iOS; see the sketch below for one way to do this.
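For the playback detail left open above, a minimal sketch using AVPlayerViewController (the URL is a placeholder):

import AVKit
import AVFoundation

// Placeholder URL for the transcoded H.264 mp4 on your server.
let videoURL = URL(string: "https://example.com/video.mp4")!

let player = AVPlayer(url: videoURL)
let playerController = AVPlayerViewController()
playerController.player = player

// Present it from your current view controller, then start playback:
// someViewController.present(playerController, animated: true) { player.play() }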
You may be able to use ffmpeg or something on your server to transcode it to H.264. I'm not so sure you would really want to do that transcoding on the phone. Given Apple's current stance on Flash, this is probably your best option.
For FLV files, what I do is upload them to Google Drive and watch them from the Google Drive app.
I'm building some YouTube search functionality into an iPhone app and want to ensure that I only receive results that will be playable on the device. According to the Searching for videos section in the API reference doc this seems to be relatively straightforward:
The format parameter specifies that videos must be available in a particular video format. Your request can specify any of the following formats:
I've tried setting "format=1" to limit to:
RTSP streaming URL for mobile video playback. H.263 video (up to 176x144) and AMR audio.
This provides a high proportion of playable videos but some are still unplayable and I'm worried that it's not returning others that would be playable.
When I leave the format field blank I receive an even higher proportion of non-streamable URLs.
This does not sound appropriate. My understanding is that the iPhone does not stream RTSP; rather, it supports Apple's HTTP Live Streaming of segmented files for live content, and HTTP streaming of MPEG-4 video files via range requests. I'd also expect the video to be H.264 with AAC audio.
Your setting sounds appropriate for low-end cellphones. In particular, 176x144 is the QCIF resolution commonly used on non-smartphones.
When you look at the XML file returned by a call to
http://gdata.youtube.com/feeds/api/videos/<your video id>
then you will notice that videos which are not playable on the iPhone will have the following tag:
<yt:state name='restricted' reasonCode='limitedSyndication'>Syndication of this video was restricted by its owner.</yt:state>
Just make sure to look for the above tag and ignore the video if the tag is present.
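A minimal sketch of that check using Foundation's XMLParser (the class and function names are illustrative; the element and attribute names come from the feed above):

import Foundation

// Scans a video feed for the yt:state restriction tag described above.
final class RestrictionChecker: NSObject, XMLParserDelegate {
    private(set) var isRestricted = false

    func parser(_ parser: XMLParser,
                didStartElement elementName: String,
                namespaceURI: String?,
                qualifiedName qName: String?,
                attributes attributeDict: [String: String]) {
        if elementName == "yt:state" && attributeDict["name"] == "restricted" {
            isRestricted = true
        }
    }
}

// Usage: feedData is the XML returned by the videos feed URL.
func videoIsRestricted(feedData: Data) -> Bool {
    let checker = RestrictionChecker()
    let parser = XMLParser(data: feedData)
    parser.delegate = checker
    parser.parse()
    return checker.isRestricted
}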