Best Recommendation for Capturing Video in a Meteor App on iOS Devices (iPhone)

I ran into a problem in Safari, where it appears that WebRTC is not fully supported. When I call
navigator.webkitGetUserMedia()
I get an undefined error.
So my question to the community is: what is the best way to write a Meteor app that captures video on a mobile device and saves it on that device?
If you have done this, I would appreciate it very much if you could share with me and the community how you went about this.

Specific Answer
The modern API is: navigator.mediaDevices.getUserMedia(constraints). See the docs here.
In the past, I've been unsuccessful with getUserMedia on iOS, but according to this post it can be done on iOS 11.
As for saving it, you can write to the browser's file system, but that API is only supported in Chrome. If you want to write to the camera roll, you'd need native code in the mix.
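For reference, here is a minimal sketch (not Meteor-specific; the element ID and constraints are placeholders) of feature-detecting the API before calling it, which also sidesteps the undefined error you get from calling navigator.webkitGetUserMedia() directly:

    // Minimal sketch: feature-detect getUserMedia and show a camera preview.
    // Assumes a <video id="preview"> element exists on the page; adjust the
    // constraints to suit your app. The prefixed APIs are only a fallback.
    function startPreview() {
      var constraints = { video: { facingMode: 'user' }, audio: false };

      if (navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
        // Modern, promise-based API (Safari 11+, Chrome, Firefox).
        return navigator.mediaDevices.getUserMedia(constraints).then(attach);
      }

      // Legacy, callback-based prefixed APIs (absent entirely on older iOS Safari).
      var legacy = navigator.getUserMedia ||
                   navigator.webkitGetUserMedia ||
                   navigator.mozGetUserMedia;
      if (!legacy) {
        return Promise.reject(new Error('getUserMedia is not supported in this browser'));
      }
      return new Promise(function (resolve, reject) {
        legacy.call(navigator, constraints, resolve, reject);
      }).then(attach);

      function attach(stream) {
        var video = document.getElementById('preview');
        video.srcObject = stream;              // very old browsers need URL.createObjectURL(stream)
        video.setAttribute('playsinline', ''); // needed for inline playback in iOS Safari
        return video.play();
      }
    }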
General Advice
I've spent several years of my life dealing with recording, uploading, and processing video using meteor. If you are doing anything more than trivial web recording, these observations may save you some time:
Chrome (on everything but iOS) has the best API for web recording. If you can require Chrome for recording, that's ideal. Firefox is a close second, falling short only because it doesn't support the file system API.
If you need to record and upload long videos on iOS, build a native app. Don't consider any kind of hybrid - that's a serious trap. The number of corner cases and things you need to check is pretty astounding, and the only way to get over those hurdles is with native code.

Related

Personalized Video / Facebook App - What is the best approach?

I want to build a Facebook app featuring a personalized video which imports content assets from the user's Facebook profile and their extended social graph and integrates these assets within the timeline. I am thinking of using Flash; however, a key stipulation is that the app works on mobile, so I would need to use HTML5. My question is: can I use Flash to build the application and then compile the app as HTML5, or is there an alternative solution in the form of an HTML5 video toolkit with a programming layer that would allow me to build a web app and access the Facebook API?
I have done this a few times over the years, and yes, Flash was the easiest. However, there are a few purely HTML5-based options available to you that I know of; personally I'd stay away from Flash here, as it will just end up getting in the way:
1- The cleanest method is to use a video compositing tool on the server side that can be programmed to accept variables. Personally I have only ever done this using ffmpeg, but there are a couple of alternatives out there.
The basic process would be to grab the media from Facebook, then composite it at certain points on top of/below/around a base video sitting on the server, using a script to which you pass the media assets as variables (a rough sketch follows after the links below). There are many options as to how you might want this done; it's probably best to have a look at some of these examples:
http://broadcasterproject.wordpress.com/2010/05/18/how-to-layerremix-videos-with-free-command-line-tools/
http://graphcomp.com/ffmpeg/
ffmpeg watermark without vhook?
Note that the last time I did this I used vhooks and custom filters; vhooks are now deprecated.
This method will mean a reasonably heavy server load if your app is popular, but it's probably the most robust approach across devices.
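To make option 1 a bit more concrete, here is a rough sketch of driving ffmpeg from Node with an overlay filter. All file names, coordinates, and timings are placeholders, and the real filter graph would depend on your template video:

    // Overlay a user's Facebook photo on top of a base video with ffmpeg.
    // Everything here (paths, position, timing) is illustrative only.
    var execFile = require('child_process').execFile;

    function compositeVideo(baseVideo, overlayImage, outputFile, callback) {
      var args = [
        '-i', baseVideo,
        '-i', overlayImage,
        // Show the overlay in the top-left corner between seconds 2 and 6.
        '-filter_complex', "[0:v][1:v] overlay=10:10:enable='between(t,2,6)'",
        '-codec:a', 'copy',   // keep the original audio untouched
        outputFile
      ];
      execFile('ffmpeg', args, function (err) {
        callback(err, outputFile);
      });
    }

    // Usage: compositeVideo('base.mp4', 'profile.jpg', 'out.mp4', function (err, file) { ... });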
2- Use Popcorn.js and let the processing be done on the client side. You could hard-code it using CSS/JS/HTML, but Popcorn is pretty stable. I haven't seen how it runs on devices, but in theory it should work, since it's all standardized technology. Basically, the process would be to use JavaScript to fire the display of images overlaid on the base video file at preset cue points; Popcorn already has all the methods and means for you to do this (a minimal sketch follows below).
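A rough illustration of this approach (assuming Popcorn.js is loaded, and with made-up element IDs and cue times):

    // Show and hide an overlay image at preset cue points on the client side.
    var pop = Popcorn('#baseVideo');
    var overlay = document.getElementById('fbPhotoOverlay');

    pop.cue(2, function () {   // 2 seconds in: show the user's photo
      overlay.style.display = 'block';
    });
    pop.cue(6, function () {   // 6 seconds in: hide it again
      overlay.style.display = 'none';
    });

    pop.play();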
Hope this helps a bit. Good luck, sounds fun.
We have built some interactive video apps, and one recent project was quite like what your question describes.
We used Adobe Flash to track the motion and published the project via create.js. You could have an image sequence from within create.js or put a video in a layer behind it. This video would then control the playhead time of the create.js motion-tracked sequence via jQuery.
It worked fine - here is a link to a test setup with an image sequence.
Video Integration would be the next step.
http://www.jungeroemer.net/projekte/testpersvid/elftest01.html
(The text is in German - sorry, but there's nothing important to read there. Just click the images and go for it.)
You can download the sources from the link; if you need it, I can also upload the Flash file to show you the motion tracking.

Looking for a way to record video from an iPhone within a web browser and save it to a server

Ok, so I'm working on a project that will allow users to record themselves within a browser and have the video saved to the server for later viewing.
Right now I have an implementation where I'm using a Red5 server with Red5 Recorder, and that is working fine, but I'm wondering how exactly you could go about this on an iPhone, as that is expected to be a large part of the user base.
As far as my research has shown, there is no universal way to capture this video within the browser, as there is no HTML5 solution, and Flash seems to be by far the best way to record a webcam to a server.
So what I'm wondering is: has anybody encountered this issue and found a solution, whether it be for iPhone only or a universal solution that would work across all platforms?
At the moment, the only way to accomplish this on the iPhone would be to write a native application. The web browser doesn't give access to the camera, and it doesn't support Flash, Java etc.

Turning an iPhone or iPod into a wireless webcam

I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as you will get a CMSampleBufferRef data object back.
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming using FTP.
If you are only building this for yourself, then you can always turn off the Auto-Lock feature in the settings. If, on the other hand, you would like to distribute this to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clocks in the App Store work. Here's a tutorial. =)
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming.
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voila! You've got live streaming video.
The last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend if you'd like to take a look.

Audio recording with HTML5 and Javascript

I'm trying to build a web application for iPhone and Android that deals with audio input.
Is this possible?
Apparently... yes, or at least it should be possible when the spec is finished. It will supposedly become possible using the Device API, which is due to be part of HTML5 when it's finished and released (HTML5 isn't finalised yet, however, and the information is subject to change).
W3C Device API Requirements (camera section)
Sony Ericsson community blog posting, with examples (pre-final API)
While audio recording on its own isn't explicitly mentioned in the W3C spec, audio recording as part of (web)camera interactions is, so it's definitely hopeful. There seems to be a shortage of good information at this stage, though. I'd expect to see more as HTML5 comes closer to being finalised.
As of now, HTML5 can't record audio, but in the future it will be able to by using the device's native features.
HTML5 cannot record audio (at least currently). HTML is basically a markup language and therefore only declares how a browser should display certain content. Although HTML5 introduces new features that make some interaction possible, you can't record audio straight into... HTML (even saying that sounds wrong). When the HTML5 spec is finished, it might become a reality; until then, no way.
Web applications that record audio normally require a plugin like Flash or Silverlight, because those can access system resources like audio hardware. Both are a no-go on iOS; although Flash is theoretically possible on Android, I don't know whether it supports audio input.
I would suggest you write a native app (for iOS and Android) that can access the audio hardware and connects to your web application in the background, so that the recording takes place natively and the recorded audio will be transmitted to your servers (think of Shazam, for example).
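If you go that route, the receiving end can be fairly simple. Here is a rough sketch in plain Node.js (the port, URL path, and file naming are made up, and a real service would add authentication, size limits, and error handling) of an endpoint the native app could POST recordings to:

    // Accept an uploaded recording and stream it straight to disk.
    var http = require('http');
    var fs = require('fs');

    http.createServer(function (req, res) {
      if (req.method === 'POST' && req.url === '/recordings') {
        var fileName = 'recording-' + Date.now() + '.m4a';
        var out = fs.createWriteStream(fileName);
        req.pipe(out);                          // stream the raw request body to disk
        out.on('finish', function () {
          res.writeHead(201, { 'Content-Type': 'text/plain' });
          res.end('Saved as ' + fileName);
        });
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(3000);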
Here are the basic developer guides on recording audio in:
Android
OS X, iOS
A new MediaStream Recording API is being worked on. It is currently available only in the Firefox Nightly build, for demo purposes.
Here's the draft with the latest updates, directly from the W3C site:
https://dvcs.w3.org/hg/dap/raw-file/default/media-stream-capture/MediaRecorder.html
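For what it's worth, usage of the draft API looks roughly like this (a sketch only; check for window.MediaRecorder before relying on it, and the 5-second limit is arbitrary):

    // Record a short audio clip into a Blob with the MediaStream Recording API.
    function recordAudioClip() {
      return navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
        return new Promise(function (resolve) {
          var recorder = new MediaRecorder(stream);
          var chunks = [];

          recorder.ondataavailable = function (event) { chunks.push(event.data); };
          recorder.onstop = function () {
            stream.getTracks().forEach(function (track) { track.stop(); });
            resolve(new Blob(chunks, { type: recorder.mimeType }));
          };

          recorder.start();
          setTimeout(function () { recorder.stop(); }, 5000); // stop after 5 seconds
        });
      });
    }

    // Usage: recordAudioClip().then(function (blob) { /* upload or play it back */ });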
Also, the following article covers other attempts at recording audio and video directly in the browser:
http://hdfvr.com/html5-video-recording

How to convert speech to text on the iPhone?

I want to build an application where, when the user says something into the iPhone, it converts the speech into the corresponding text.
I heard this is possible on the Windows platform.
Is this possible on the iPhone? Is any API available for this?
I used Nuance's Dragon Speech SDK for this purpose.
It's free for developers, and their SDK has a sample project for both STT and TTS.
I tried speech-to-text using this SDK on iOS 9 and it works like a charm.
Here is the link.
https://developer.nuance.com/public/Help/DragonMobileSDKReference_iOS/SpeechKit_Guide/RecognizingSpeech.html
Limitations:
60 seconds recording time limit.
Recorded audio file is not accessible.
Pauses are detected as the end of the recording.
There's an app for that.
Search for "Dragon Speech".
This question has been asked many times here already; it is one of those questions that has received quite a few answers and good ideas.
There is no API for doing speech to text on the iPhone, but you can record the voice on the phone, send the recording to a server that runs the speech recognition software on Windows or whatever OS suits you best, then return the text results back to the phone.
It is possible on the iPhone. Pocketsphinx has been ported; for example, an app called Cactus Dialer uses Pocketsphinx. No API has been published, but it's not hard to get it built. Many people have.
For full-blown dictation it will be hard. You will need to make it server-based, as Nuance's 'Dragon Speech' does, or accept a smaller vocabulary.