Looking for a way to record video from an iPhone embedded within a web browser and save it to a server

OK, so I'm working on a project that will allow users to record themselves within a browser and have the video saved to the server for later viewing.
Right now I have an implementation using a Red5 server with Red5 Recorder, and that is working fine, but I'm wondering how exactly you could go about this on an iPhone, as that is expected to be a large part of the user base.
As far as my research has shown, there is no universal way to capture this video within the browser: there is no HTML5 solution, and Flash seems to be by far the best way to record a webcam to a server.
So what I'm wondering is: has anybody encountered this issue and found a solution, whether for iPhone only or a universal one that works across all platforms?

At the moment, the only way to accomplish this on the iPhone would be to write a native application. The web browser doesn't give access to the camera, and it doesn't support Flash, Java etc.

Related

Best Recommendation for Capturing Video in a Meteor App on iOS devices

I ran into this problem in Safari, where it appears that WebRTC is not fully supported. So when I call
navigator.webkitGetUserMedia()
I get an 'undefined' error.
So my question to the community is: what is the best way to write a Meteor app that captures video on a mobile device and saves it on that device?
If you have done this, I would appreciate it very much if you could share with me and the community how you went about this.
Specific Answer
The modern API is navigator.mediaDevices.getUserMedia(constraints); see the MDN documentation for it.
In the past, I've been unsuccessful with getUserMedia on iOS, but according to this post it can be done on iOS 11.
As for saving it, you can write to the browser's file system, but that API is only supported in Chrome. If you want to write to the camera roll, you'd need native code in the mix.
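For reference, here is a minimal sketch of that modern flow: capture with getUserMedia, record with MediaRecorder, and upload the resulting Blob. This assumes MediaRecorder is available (feature-detect it; Safari gained it much later than getUserMedia), and the /upload endpoint is hypothetical.

    // Minimal sketch: capture camera + mic, record for a fixed duration,
    // and POST the result. MediaRecorder support is assumed; '/upload' is
    // a hypothetical endpoint.
    async function recordAndUpload(durationMs) {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      const recorder = new MediaRecorder(stream);
      const chunks = [];

      recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
      const stopped = new Promise((resolve) => { recorder.onstop = resolve; });

      recorder.start();
      setTimeout(() => recorder.stop(), durationMs);
      await stopped;

      // Release the camera/mic so the capture indicator turns off.
      stream.getTracks().forEach((t) => t.stop());

      const blob = new Blob(chunks, { type: recorder.mimeType });
      await fetch('/upload', { method: 'POST', body: blob });
      return blob;
    }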
General Advice
I've spent several years of my life dealing with recording, uploading, and processing video using Meteor. If you are doing anything more than trivial web recording, these observations may save you some time:
Chrome (on everything but iOS) has the best API for web recording. If you can require Chrome for recording, that's ideal. Firefox is a close second, falling short only because it doesn't support the file system API.
If you need to record and upload long videos on iOS, build a native app. Don't consider any kind of hybrid - that's a serious trap. The number of corner cases and things you need to check is pretty astounding, and the only way to get over those hurdles is with native code.

Personalized Video / Facebook App - What is the best approach?

I want to build a Facebook app featuring a personalized video which imports content assets from the user's Facebook profile and their extended social graph and integrates these assets within the timeline. I am thinking of using Flash; however, a key stipulation is that the app works on mobile, so I would need to use HTML5. My question is: can I use Flash to build the application and then compile the app as HTML5, or is there an alternative solution in the form of an HTML5 video toolkit with a programming layer that would allow me to build a web app and access the Facebook API?
I have done this a few times over the years, and yes, Flash was the easiest. However, there are a few options available to you that I know of which are purely HTML5-based; personally, I'd stay away from Flash here, as it will end up just getting in the way:
1- The cleanest method is to use a video compositing tool on the server side which can be programmed to accept variables. Personally, I have only ever done this using ffmpeg; however, there are a couple of alternatives out there.
The basic process would be to grab the media from FB, then composite it at certain points on top of/below/around a base video sitting on the server, using a shell script to which you pass the media assets as variables. There are so many options for how you might want this done; it's probably best to have a look at some of these examples:
http://broadcasterproject.wordpress.com/2010/05/18/how-to-layerremix-videos-with-free-command-line-tools/
http://graphcomp.com/ffmpeg/
ffmpeg watermark without vhook?
Note that the last time I did this I used vhooks and custom filters; vhooks are now deprecated in favour of ffmpeg's filter system.
This method will mean a reasonably heavy server load if your app is popular, but it's probably the most robust across devices; a rough sketch of shelling out to ffmpeg follows below.
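To make approach 1 concrete, here is a rough Node.js sketch that shells out to ffmpeg and overlays a user image on a base video for a few seconds. The file names, coordinates, and timings are placeholders, and the exact filter syntax may vary between ffmpeg versions.

    // Sketch: composite a user image over a base video with ffmpeg's overlay filter.
    // File names, coordinates, and timings are placeholders.
    const { execFile } = require('child_process');

    function compositeVideo(baseVideo, userImage, outFile, done) {
      const args = [
        '-y',                    // overwrite the output file if it exists
        '-i', baseVideo,
        '-i', userImage,
        // Show the image at (40,40) between t=2s and t=6s of the base video.
        '-filter_complex', "overlay=40:40:enable='between(t,2,6)'",
        '-c:a', 'copy',          // keep the original audio untouched
        outFile,
      ];
      execFile('ffmpeg', args, (err) => done(err, outFile));
    }

    compositeVideo('base.mp4', 'fb_profile.jpg', 'personalized.mp4', (err) => {
      if (err) throw err;
      console.log('wrote personalized.mp4');
    });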
2- Use Popcorn.js and let the processing be done on the client side. You could hand-code it using CSS/JS/HTML, but Popcorn is pretty stable, although I haven't seen how it runs on devices; in theory it should work (it's all standardized technology). Basically, the process would be to use JavaScript to fire the display of images overlaid on the base video file at preset cue points. Popcorn has all of the methods and means for you to do this already.
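A minimal sketch of that cue-point idea with Popcorn.js; the element ids and times are placeholders, and the overlay positioning itself would be plain CSS:

    // Sketch: show/hide an overlaid image at preset cue points with Popcorn.js.
    // '#baseVideo' and '#fbPhoto' are placeholder element ids.
    var pop = Popcorn('#baseVideo');

    // At 2s, reveal the user's photo layered over the video...
    pop.cue(2, function () {
      document.getElementById('fbPhoto').style.display = 'block';
    });

    // ...and hide it again at 6s.
    pop.cue(6, function () {
      document.getElementById('fbPhoto').style.display = 'none';
    });

    pop.play();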
Hope this helps a bit. Good luck, sounds fun.
We have built some interactive video apps, and one recent project was quite like what your question describes.
We used Adobe Flash to track the motion and published the project via CreateJS. You could have an image sequence from within CreateJS or put a video in a layer behind it. This video would then control the playhead time of the CreateJS motion-tracked sequence via jQuery.
It worked fine; here is a link to a test setup with an image sequence.
Video integration would be the next step.
http://www.jungeroemer.net/projekte/testpersvid/elftest01.html
(German text, sorry, but there's nothing important to read there.
Just click the images and go for it.)
You can download the sources from the link; if you need, I can also upload the Flash file to show you the motion tracking.
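For what it's worth, the playhead-syncing part can be sketched roughly like this, assuming the standard Flash-to-CreateJS publish that exposes an exportRoot movie clip and a stage; the element id, frame rate, and sequence length are placeholders:

    // Sketch: drive a CreateJS (Flash-exported) timeline from a background
    // video's playback position. 'exportRoot' and 'stage' come from the
    // standard CreateJS publish template; the other values are assumptions.
    var video = document.getElementById('bgVideo');
    var FPS = 24;            // frame rate of the published sequence (placeholder)
    var TOTAL_FRAMES = 120;  // length of the tracked sequence (placeholder)

    $(video).on('timeupdate', function () {
      // Map the video's current time onto a frame of the tracked sequence.
      var frame = Math.round(video.currentTime * FPS) % TOTAL_FRAMES;
      exportRoot.gotoAndStop(frame);
      stage.update();
    });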

Mp4 video not working on iPad *in Offline Mode*

I'm getting a weird problem when embedding an mp4 onto a webpage in iOS Safari. I am embedding it with a video tag:
<video src='gizmo.mp4' width=560 height=320></video>
However, on the page I get the 'video not available' placeholder graphic (a play button with a slash through it).
Yet when I go directly to the video on my server (http://www.example.com/gizmo.mp4), it plays perfectly.
I am using the video from here to test this out; I don't have the final video files yet. I have also replaced the gizmo.mp4 file with a gizmo.m4v file that QuickTime generated when I hit "Export for Web." I get the same result.
I am only interested in targeting iOS, so solutions specific to iPhone/iPad are welcome (even if they wouldn't work on the web at large).
Thanks in advance!
-Esa
EDIT: I did a bit more testing. Since this is an offline app I am working on, I was completely offline for this, relying on the manifest. However, the videos worked once I took the manifest out and went back to working completely online. So it looks like something is up with iOS not caching video resources? The video in question is 748 kB, so it's not a cache-size issue (though when I tried with a 13 MB movie online, Safari automatically asked to cache the content).
Videos are regarded by the browser as a streaming resource and are not cached, even when referenced directly in the .appcache manifest file. I think the only way you could get this to work is to package the HTML5 application up as a native app, using one of the many available tools for this (https://trigger.io, Appcelerator, etc.).
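If a native wrapper isn't an option, about the best you can do on the web side is to detect that the (uncached) video can't load and swap in a poster image that is listed in the manifest. A rough sketch, with placeholder element ids:

    // Sketch: fall back to a cached poster image when the (uncached) video
    // fails to load, e.g. while running offline from the appcache.
    // Element ids are placeholders; the poster image is listed in the manifest.
    var video = document.getElementById('gizmoVideo');
    var poster = document.getElementById('gizmoPoster');

    // 'error' fires on the media element (or on its <source> children, which
    // don't bubble, hence the capturing listener).
    video.addEventListener('error', showPoster, true);
    window.addEventListener('offline', showPoster);

    function showPoster() {
      video.style.display = 'none';
      poster.style.display = 'block';
    }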

How do I stream an audio file from the server to an iPhone?

I need to stream an audio file which is saved on my server. Is it possible for me to stream that file in order to play it on my iPhone? Or is there any other way to play an audio file from the server on an iPhone? Please help.
Thanks,
Shibin
This link was useful to me : http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
He's got a project linked from that page http://projectswithlove.com/projects/iPhoneStreamingPlayer.zip
In this project, the interesting lines are in iPhoneStreamingPlayerViewController.m; lines 82-89 start streaming the audio from a URL.
I've managed to get this running on my iPhone and tested it using an mp3 on another server, and it works fine. However, I've not picked through the code, so I can't help you any more than this, sorry!
Sam
NB: To get the project to compile I had to change the SDK to 3.0 - if you right-click on the project name and choose Get Info, then change the option called Base SDK to iPhone Device 3.0, it should work.
There are a couple of ways to get the file playing on the iPhone, but the first problem is that you need to decide how to serve the file from your server.
One great way is to share the file out over HTTP using a web server. If the server is Windows, look into IIS; if it's a Mac or Linux, Apache is your friend.
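If you just want something quick to test with instead of IIS or Apache, a few lines of Node.js will serve a single file with the right Content-Type. The file name and port are placeholders, and a real server should also honour Range requests so the player can seek:

    // Sketch: serve one MP3 over HTTP so the iPhone can stream it.
    // File name and port are placeholders; a production setup should also
    // support Range requests for seeking.
    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
      fs.createReadStream('the-file.mp3').pipe(res);
    }).listen(8080, () => console.log('Serving on http://localhost:8080/'));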
Once you've got the serving going, here are the options on the iPhone:
1) Use iPhone Safari to navigate to http://your-server/your-folder/the-file.ext. If the serving is correct, it'll open the media player and stream it.
2) Write an iPhone application that uses AVFoundation or the MediaPlayer framework to play the file. Non-trivial, but there are plenty of samples.

Streaming Audio FROM iPhone to Browser. Ideas?

I have seen plenty of articles and SO questions about streaming TO an iPhone app, but my question is the reverse, that is, streaming FROM an iPhone app.
I have audio content in an iPhone app that I want to stream to a browser. So the idea is that the browser can connect to a server running on the iPhone, and the server on the iPhone will give the audio to the browser. The browser will play the endless stream.
I already have seamlessly looping content on the phone with AudioQueue. I already know how to set up a server running on the phone with CocoaHTTPServer. Is there a third piece that can make the AudioQueue (or a FileStream) stream to a browser connected to the internal iPhone server?
Anybody have any thoughts on how to implement this?
Well, there are a few good open source projects to dissect, port, or imitate for this. What I would suggest is looking at how Icecast and streamTranscoderv3 operate together. The latter will take an audio source and send it to an Icecast server as a source. Port parts of both and run them locally on the iPhone and you'd have a solution. I imagine that Bonjour could be used so that other systems on the LAN could find and listen to the iPhone.
Or send the streamTranscoder output to an Icecast server elsewhere and make it available for the world.
Neither project is over-engineered; unfortunately the code isn't super modular, but it is comprehensible and modestly cross-platform.
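On the listening end, the browser piece can stay very simple: point an audio element at the phone's HTTP endpoint and retry on errors. The URL below is hypothetical:

    // Sketch: play an endless audio stream served by the phone's embedded
    // HTTP server. The URL is hypothetical; the reconnect logic is minimal.
    var STREAM_URL = 'http://iphone.local:8080/stream.mp3';
    var player = new Audio(STREAM_URL);

    player.addEventListener('error', function () {
      // If the phone drops the connection, try again after a short delay.
      setTimeout(function () {
        player.src = STREAM_URL;
        player.play();
      }, 2000);
    });

    player.play();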