Trying to figure out the best way to allow users to enable their webcams to stream live to my server so others can view the streams.
I've read a handful of posts on here but haven't seen a good example of a solution for this.
Really I'm just looking for what needs to be done to have their webcam turn on and stream the feed to my server, and what needs to be done to push that feed out to a page viewable by others.
I believe Adobe Flash Player would be the best way to receive the feed and display it for others, but again I need some guidance here on how to put this all together.
Any direction is much appreciated!
Thanks,
NCoder
You will need JavaScript plus Adobe Flash Player; the user will be prompted to grant permission for their cam to be accessed.
For Adobe:
http://www.adobe.com
For JavaScript, a minimal sketch of asking for the camera and starting capture (this assumes a browser that supports the getUserMedia/MediaRecorder APIs; older browsers need the Flash fallback):
navigator.mediaDevices.getUserMedia({ video: true })      // ask the user for camera access
  .then(function (stream) {
    var recorder = new MediaRecorder(stream);             // capture the stream so it can be pushed to the server
    recorder.start();
  })
  .catch(function (err) { console.error('Camera access was denied:', err); });
Related
I want to build a Facebook app featuring a personalized video which imports content assets from the user's Facebook profile and their extended social graph and integrates these assets within the timeline. I am thinking of using Flash; however, a key stipulation is that the app works on mobile, so I would need to use HTML5. My question is: can I use Flash to build the application and then compile the app as HTML5, or is there an alternative solution in the form of an HTML5 video toolkit with a programming layer that would allow me to build a web app and access the Facebook API?
I have done this a few times over the years, and yes, Flash was the easiest; however, there are a few options available to you that I know of which are purely HTML5 based. Personally I'd stay away from Flash here as it will end up just getting in the way:
1- The cleanest method is to use a video compositing tool on the server side which can be programmed to accept variables. Personally I have only ever done this using ffmpeg, however there are a couple of alternatives out there.
The basic process would be to grab the media from FB and then composite it at certain points on top of/below/around a base video sitting on the server, using a shell script to which you pass the media assets as variables (a minimal sketch of this approach follows below). There are so many options as to how you might want this done; it's probably best to have a look at some of these examples:
http://broadcasterproject.wordpress.com/2010/05/18/how-to-layerremix-videos-with-free-command-line-tools/
http://graphcomp.com/ffmpeg/
ffmpeg watermark without vhook?
Note that the last time I did this I used vhooks and custom filters; vhooks are now deprecated.
This method will mean a reasonably heavy server load if your app is popular but it's probably the most robust across devices etc.
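As a rough illustration of this approach, here is a minimal Node.js sketch that shells out to ffmpeg's overlay filter; the file names, overlay position, and timings are placeholder assumptions, and you would substitute the assets grabbed from FB:
// compose.js - overlay a user photo on a base video between 5s and 10s.
// Assumes ffmpeg is installed and on the PATH; all paths and timings are placeholders.
var execFile = require('child_process').execFile;

function compose(baseVideo, overlayImage, output, done) {
  var args = [
    '-i', baseVideo,
    '-i', overlayImage,
    '-filter_complex', "overlay=10:10:enable='between(t,5,10)'",  // show the image at (10,10) from 5s to 10s
    '-codec:a', 'copy',                                           // keep the base video's audio as-is
    output
  ];
  execFile('ffmpeg', args, done);
}

compose('base.mp4', 'fb_photo.jpg', 'personalized.mp4', function (err) {
  if (err) { console.error('ffmpeg failed:', err); }
});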
2- Use Popcorn.js and let the processing be done on the client side. You could hand-code it using CSS/JS/HTML, but Popcorn is pretty stable; I haven't seen how it runs on devices, but in theory it should work (all standardized technologies). Basically the process would be to use JavaScript to fire the display of images overlaid on the base video file at preset cue points, as in the sketch below. Popcorn already has all of the methods and means for you to do this.
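As a rough sketch of the Popcorn approach (the element IDs and times are placeholders), you can fire overlays at preset cue points like this:
// Assumes a <video id="baseVideo"> element and an absolutely positioned
// <img id="fbPhoto" style="display:none"> sitting on top of it.
var pop = Popcorn('#baseVideo');

pop.cue(5, function () {                        // at 5 seconds, show the Facebook image
  document.getElementById('fbPhoto').style.display = 'block';
});
pop.cue(10, function () {                       // at 10 seconds, hide it again
  document.getElementById('fbPhoto').style.display = 'none';
});

pop.play();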
Hope this helps a bit. Good luck, sounds fun.
We have built some interactive video apps, and one recent project was quite similar to what your question describes.
We used Adobe Flash to track the motion and published the project via create.js. You could have an image sequence from within create.js or put a video in a layer behind it. This video would then control the playhead time of the create.js motion-tracked sequence via jQuery, roughly as sketched below.
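A rough sketch of that wiring (the exportRoot and stage names, and the frame rate, are assumptions about what the create.js publish output exposes):
// Assumes the published create.js/EaselJS animation exposes a MovieClip called exportRoot
// on a stage running at 24 fps, with a <video id="drivingVideo"> in a layer behind it.
$('#drivingVideo').on('timeupdate', function () {
  var frame = Math.floor(this.currentTime * 24);  // map the video playhead to an animation frame
  exportRoot.gotoAndStop(frame);                  // jump the motion-tracked sequence to that frame
  stage.update();                                 // redraw the EaselJS stage
});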
It worked fine - here is a link to a test setup with an image sequence.
Video Integration would be the next step.
http://www.jungeroemer.net/projekte/testpersvid/elftest01.html
(German text, sorry but it's nothing important to read there.
Just click the images and go for it)
You can download the sources from the link; if you need, I can also upload the Flash file to show you the motion tracking.
So generally, I want to make an iPhone app with video chat functionality. But after many searches, I still haven't found any successful results. Is there any public, or for that matter private, API available for doing this on the iPhone? If the answer is yes, please help me.
Basically, what I want is to read the video streams on both of the devices connected for chatting. Thanks a lot in advance, and please help me if you can.
P.S. - I have already checked iDoubs, but it failed and always shows some unknown problem, and for that reason it doesn't let me connect to anyone.
ALSO: The suggested method I have found is HTTP Live Streaming. But there too I have multiple doubts.
1.) How do I upload the video from the iPhone to the HTTP server from which I would be broadcasting?
2.) Can you post something about setting up the server? How do I feed the video to the FFmpeg server?
Mainly, I need to find the upload method. Right now I am simply sending hex code in the form of NSData to the server, and I am stuck there. The main problem is that it is live; how do I handle that?
It would be best if you could help me make iDoubs work properly.
Thank you so much for any kind of support!
Have a look at this: how to implement video chat in iphone. But before starting, you must have an IMS server up and running.
Here is a live video chat framework that does what you are looking for. It is easy and simple to implement for face-to-face video chat; I have already tried it and it works very well. A great thing about this framework is its multi-platform support.
Tokbox : https://tokbox.com/platform
https://tokbox.com/opentok/tutorials/
Sample Code:
https://github.com/opentok/opentok-ios-sdk-samples/
Edit:
Here is an article explaining OpenTok used together with Parse.
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
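The platform also ships a JavaScript SDK, so for reference the basic web-side flow looks roughly like this (API_KEY, SESSION_ID, and TOKEN are placeholders you would generate from your own OpenTok account/server):
// Minimal OpenTok.js sketch: publish your camera and subscribe to the other party's stream.
var session = OT.initSession(API_KEY, SESSION_ID);

session.on('streamCreated', function (event) {
  session.subscribe(event.stream, 'subscriberDiv');  // show the remote participant
});

session.connect(TOKEN, function (error) {
  if (error) { return console.error(error); }
  var publisher = OT.initPublisher('publisherDiv');  // show and publish your own camera
  session.publish(publisher);
});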
HTTP Live Streaming is primarily an approach for adaptive streaming from server to client. For client-to-server, go with traditional streaming instead. There is an open library for streaming; see this question.
While FaceTime shows that two-way chat is possible, it is not certain that you will be able to do it using public iOS APIs. That said, I have implemented one-way live streaming for the iPhone, and the difficult part was not the core streaming itself but the encoding of the payload. You will be able to do H264 in hardware and AAC / iLBC in software.
How you want to feed this to the FFMPEG depends on your transport, possibly changing from 'file' H264 frames to 'streaming' H264. Check out the H264 frame types if you implement frame dropping; reconfiguring the H264 encoder on-the-fly is not possible to my knowledge, but restarting with fresh parameters typically does not take more than a second or so.
Did you attempt to play back a live resource while capturing? That is a good starting point. If you come across an open API for H264 encoding, please post it here ;-)
Actually I am developing an app like Netflix, and in it I need to save the end user's favorite songs. I am playing HTTP live-streamed videos, and I also need to save the played time of a video so that the end user can resume a song from where they left off rather than playing it again from the beginning.
They are sending me the URLs of the streamed videos in the following format:
http://xxxxxxxxxx/vod/definsts/mp4/low/mp4:1975010026_01.mp4/playlist.m3u8
So my question is:
What is the best option for saving a user's favorite songs keyed to the streamed URL - in a Netflix-style app, should favorites live on the client side or on the server side, and which is preferred? I am using MPMoviePlayerController from Apple's MoviePlayer sample app code.
If someone has any idea and wants to know anything more from my side, I will be available. I would be highly obliged for any help.
Sorry, my English is not good and I don't know how to play with words.
Any small help or suggestion would be much appreciated.
Preferably, your best option here looks like the server side.
Saving the data when the user leaves the application is ultimately best dealt with as the app closes.
You have multiple options for making server-side calls; one I have found to be the best is ASIHTTPRequest with its delegate callbacks (ASIHTTPRequestDelegate).
It has worked wonders and is fairly easy to learn.
You may even want to look into a REST method of pulling video feeds and saving playback data; it may be a faster and more secure approach (a hypothetical sketch of the server side follows below).
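If you go the REST route, the server side can stay very small; here is a hypothetical Node.js/Express sketch (endpoint names and fields are made up for illustration) for saving and returning a user's favorites and last playback position:
// Hypothetical endpoints; a real app would persist to a database rather than memory.
var express = require('express');
var app = express();
app.use(express.json());

var progress = {};  // { userId: { url: ..., position: ..., favorite: ... } }

// The iOS client POSTs the current playback position (and favorite flag) for a stream URL.
app.post('/users/:id/progress', function (req, res) {
  progress[req.params.id] = req.body;
  res.sendStatus(204);
});

// On relaunch, the client GETs it back and seeks the player to the saved position.
app.get('/users/:id/progress', function (req, res) {
  res.json(progress[req.params.id] || {});
});

app.listen(3000);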
Hope this helped! :)
I want to retrieve the uploaded videos from a certain YouTube channel and display the list in a UITableView. Then, when the user taps a row, the video should play. Is this possible with the YouTube API?
I'm new to iPhone app development and I need to get this done ASAP. Can someone provide some code samples or point me in the right direction? I can't find much useful material on the net.
Please Help
Maybe a good start is
Google Data APIs Objective-C Client Library
or
Google Data APIs Examples
I've been told that some apps stream a video initially and, once it has completely downloaded, store it on the user's device for quick, internet-free subsequent viewing.
Firstly, is this possible? Secondly, could you point me towards resources demonstrating how it could be done, or possibly offer some insight to get me started?
Thanks friends.
A good tutorial on how to stream video:
http://buildmobilesoftware.com/2010/08/09/how-to-stream-videos-on-the-iphone-or-ipad/