I am building my web project with Wicket, and I want to add webcam video capture. I have Recorder.swf, which records video and saves it to an RTMP server, and Player.swf, which plays back the saved video. Both SWFs work fine if I open them directly in any browser, but when they are added to the Wicket application they stop working. Other SWFs do work, though: for example, I can play an FLV with a media-player SWF in my project. I cannot understand what is happening! Can RTMP not be integrated with Wicket? Thank you.
From what you describe, this has little to do with Wicket.
Wicket runs on the server side.
If the SWFs work in one setup but not in another, the problem is most likely either in the Flash content itself or on the client side (the Flash player).
Hi, I am trying to create an iPhone app that streams video from a remote server (which automatically creates the video files). I'm using Flash Builder 4.7 and mobile Flex.
My expertise with Flash Builder / Flex is not great, but I am getting there.
Using StageWebView I have made some progress and can stream a simple MP4 file; however, the files I want to stream have the wrong extension, so the iPhone's internal video player won't play them. I have no control over the remote server, so I can't change the MIME type or the file extensions. The files are legitimate MP4 files: if I copy one and change the extension, it plays fine.
Does anybody have an idea how I can fool the iOS video player into playing them?
Any help gratefully appreciated; I have been working on this for weeks and it's driving me round the bend.
Cheers
Toby
OK, so I'm working on a project that will allow users to record themselves in a browser and have the video saved to the server for later viewing.
Right now I have an implementation using a Red5 server with Red5 Recorder, and that is working fine, but I'm wondering how exactly you could go about this on an iPhone, since that is expected to be a large part of the user base.
As far as my research has shown, there is no universal way to capture this video within the browser: there is no HTML5 solution, and Flash seems to be by far the best way to record from a webcam to a server.
So what I'm wondering is: has anybody encountered this issue and found a solution, whether it be iPhone-only or a universal solution that would work across all platforms?
At the moment, the only way to accomplish this on the iPhone would be to write a native application. The web browser doesn't give access to the camera, and it doesn't support Flash, Java etc.
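If you do take the native route, here is a minimal sketch of recording video from the camera in a native iOS app. It is written against a current iOS SDK rather than the one available when this was asked, and the class name and upload step are illustrative assumptions:

```swift
import UIKit
import MobileCoreServices

// Minimal sketch: present the built-in camera UI from a native app
// and receive the recorded movie file (hypothetical view controller).
class RecorderViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func startRecording() {
        guard UIImagePickerController.isSourceTypeAvailable(.camera) else { return }
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.mediaTypes = [kUTTypeMovie as String]  // video only
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // The recorded clip lands in a temporary file; upload it to the server from here.
        if let movieURL = info[.mediaURL] as? URL {
            print("Recorded video at \(movieURL)")
        }
        picker.dismiss(animated: true)
    }
}
```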
I am working with a group on developing an app that will essentially be a 'radio' app: one view that just plays whatever audio is streaming at the time, and another view or two of archives for listening to past programs. What I am working on right now is how to assemble the playback view. The site in question is on-this-rock.org, and the source for playing is here.
Any suggestions for how I can best go about building the player to stream the audio, without needing the rest of the site graphics?
Thanks
The stream URL is actually:
http://s4.voscast.com:8080/
This is just a SHOUTcast stream. You can build your radio player to connect directly to it. No need for the HTML/Flash on the website itself.
You can find this easily by looking at the network tab in your browser's developer tools, or by using a tool such as Fiddler or Wireshark.
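As a rough sketch of a native player for that stream, assuming the station is still serving plain MP3/AAC over HTTP (which AVPlayer can usually play directly):

```swift
import AVFoundation

// Minimal sketch: connect straight to the SHOUTcast stream found above.
// Note: recent iOS versions block plain-HTTP loads by default (App Transport
// Security), so an exception in Info.plist may be needed for this URL.
let streamURL = URL(string: "http://s4.voscast.com:8080/")!
let radioPlayer = AVPlayer(url: streamURL)
radioPlayer.play()

// Keep a strong reference to `radioPlayer` (e.g. as a property on a view
// controller); if it is deallocated, playback stops.
```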
How can I stream video data from the network and play it on an iPhone?
First, are you developing a web app optimized for iPhone, or a native application?
In the first case, your only option is to transcode your video files to QuickTime H.264 (.m4v or .mp4 extension). You can use QuickTime Pro (via the export menu) or VLC (as a free alternative). Then simply add a hyperlink to the video file on your HTTP server. Make sure the server presents the right Content-Type (see the Safari Web Content Guide for iPhone OS: Configuring Your Server). That will work for both web and native apps (in a native app you would use the MPMoviePlayerController view). So you can "stream" it (technically, this is progressive download of a QuickTime movie file).
If you're talking about streaming live content (i.e., content that you produce live or transcode from a live feed), there is currently no official way of doing it (as of iPhone OS 2.2): iPhone OS does not support RTSP/RTP streaming. A number of native iPhone applications (such as UStream.tv and Orb Live) have created their own custom live-streaming solutions (most of them transfer a delayed stream with many seconds of latency over HTTP and then decode it on the phone using FFmpeg or other libraries).
Are you trying to stream video in your own app, or just stream it on your iPhone? For streaming video through an app, use MPMoviePlayerController and pass it the URL of your video. MPMoviePlayerController will stream the video and play it for you.
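A minimal sketch of that approach (the video URL is a placeholder; MPMoviePlayerController is the class these answers refer to, though Apple has since deprecated it in favour of AVPlayerViewController):

```swift
import UIKit
import MediaPlayer

class PlayerViewController: UIViewController {
    // Placeholder URL: point this at an H.264 .mp4/.m4v file on your HTTP server.
    let moviePlayer = MPMoviePlayerController(
        contentURL: URL(string: "http://example.com/video.m4v")!)

    override func viewDidLoad() {
        super.viewDidLoad()
        moviePlayer.view.frame = view.bounds
        view.addSubview(moviePlayer.view)
        moviePlayer.prepareToPlay()
        moviePlayer.play()   // plays via progressive download
    }
}
```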
If you're looking for a server-based solution (with a very affordable Amazon EC2 option), be sure to check out Wowza at http://www.wowzamedia.com/advanced.php
It streams directly to iPhone/iPod Touch without a custom app.
note: I'm not affiliated with them at all... just a fan/customer.
edit: Just noticed how old this question was. :)
I am planning to write an iPhone app that can display streaming audio/video from the internet. The backend would most probably be Red5 or Wowza, and the video would be streamed over RTMP (although I have the option to change that). Any ideas on implementation?
http://www.youtube.com/watch?v=5-UoLsSSw30 demos something similar to what I have in mind.
--
MI
You should know that the iPhone does not support RTMP, and start looking at HTTP Live Streaming!
Displaying the video is then as simple as using the <video> tag in an HTML5 page, but there should also be a native API for it (I am not an iPhone developer).
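For the native side, a rough sketch, assuming the server (e.g. Wowza) repackages the incoming stream as an HLS playlist (the .m3u8 URL below is a placeholder):

```swift
import AVKit
import AVFoundation
import UIKit

// Minimal sketch: play an HTTP Live Streaming playlist in a native app.
func presentLiveStream(from presenter: UIViewController) {
    let playlistURL = URL(string: "https://example.com/live/stream.m3u8")!
    let playerController = AVPlayerViewController()
    playerController.player = AVPlayer(url: playlistURL)
    presenter.present(playerController, animated: true) {
        playerController.player?.play()
    }
}
```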