I'm trying to post a Flash media player (not Facebook's default MP3 player) as an attachment to a stream post. I can get the flash app to load, but it won't load the media file. Both the media file and .swf are hosted on my server. It works fine if I load the .swf from a test page hosted on my own server instead of in a Facebook feed.
The attachment is defined like this:
{media: [{
  type: "flash",
  imgsrc: "http://mysite.com/images/my-icon.png",
  swfsrc: "http://assets.mysite.com/files/wavplayer.swf?gui=mini&h=20&w=300&sound=test123.wav",
  width: 40, height: 40,
  expanded_width: 40, expanded_height: 40
}]}
I'm not up on cross-domain Flash permissions; is there a recipe for getting this to work? Any way even to get debugging info on the Flash app in Firefox?
Thanks!
To be able to debug a Flash movie in Firefox you need:
Firebug
Debug Flash Player
Flash Bug (not sure if this add-on will work with the latest version of Firefox)
Some errors, like broken links to media assets (audio, fonts, images), can be seen in the Net tab of Firebug. The debug Flash Player lets you see errors from Flash movies, and if the .swf was compiled in debug mode, you will be able to see all traces in the Flash Bug log.
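On the cross-domain part of the question: Facebook may serve or proxy your SWF from its own domain, in which case requests for your media become cross-domain, and Flash Player then looks for a crossdomain.xml policy file at the web root of the host serving the media. A permissive sketch for testing (this is an assumption about your setup; tighten the domain list before production):

```xml
<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM
  "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <!-- Permissive for debugging only; restrict to specific domains in production -->
  <allow-access-from domain="*" />
</cross-domain-policy>
```

Place it at http://assets.mysite.com/crossdomain.xml (and on mysite.com if media is served from there) and check the Firebug Net tab to confirm Flash Player actually requests it.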
I'm getting a weird problem when embedding an mp4 into a webpage in iOS Safari. I am embedding it with a video tag:
<video src='gizmo.mp4' width=560 height=320></video>
However, on the page, I'm getting the 'video not available' placeholder graphic (a play button with a slash through it).
However, when I go to the direct video on my server (http://www.example.com/gizmo.mp4), the video works perfectly.
I am using the video from here to test this out; I don't have the final video files yet. I have also replaced the gizmo.mp4 file with a gizmo.m4v file that QuickTime generated when I hit "Export for Web." I get the same result.
I am only interested in targeting iOS, so specific solutions for iPhone/iPad are welcome (even if they wouldn't work on the web at large).
Thanks in advance!
-Esa
EDIT: Did a bit more testing. Since this is an offline app that I am working on, I was completely offline for this, relying on the manifest. However, the videos worked once I took the manifest out and was working completely online again. So it looks like something is up with iOS not caching video resources? The video in question is 748 kB, so it's not a cache-size issue (though, when I tried with a 13 MB movie online, Safari automatically asked to cache the content).
Videos are regarded by the browser as a streaming resource and are not cached, even when referenced directly in the .appcache manifest file. I think the only way you could get this to work is to package the HTML5 application up as a native app, using one of the many tools available for this (https://trigger.io, Appcelerator, etc.).
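As an aside on the online case described in the edit: once a page has a manifest, resources not listed in it are blocked even while online, unless the manifest whitelists them in a NETWORK section. A sketch (file names other than gizmo.mp4 are illustrative):

```
CACHE MANIFEST
# v1 - hypothetical manifest for the page above

CACHE:
index.html
app.js

NETWORK:
# Allow the video to load while online; it still won't be cached for offline use
gizmo.mp4
```

Using `NETWORK:` followed by `*` would whitelist all unlisted resources instead of naming them individually.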
I am building a website: daretogaincontrol.com. On that site, I am using videolightbox to play videos.
The videos don't play on iPad or iPhone. The site works fine on desktops (Mac and PC).
The nice people at videolightbox put together a little test page to show that their player and one of our videos will play on iPad/iPhone here: http://www.videolightbox.com/3/
I copied that code to a test page here: http://www.daretogaincontrol.com/test_video
On the test_video page there are two play options. The top one plays the video from videolightbox.com with the player from that site, the same as the videolightbox.com page, but served from the daretogaincontrol server. The bottom one attempts to play the same video with the same player, but both the player and the video reside on the daretogaincontrol server.
All the javascript and css on the test_video page are linked to the videolightbox.com site.
Neither video on the test_video page plays on iPad/iPhone, which leads me to believe there might be a server issue, since I can eliminate (?) the video format and the player as having identifiable problems.
You can play the video on iPad/iPhone using a direct link: [which you need to figure out on your own because I am limited to 2 links per post, sorry], but not in the player on the page.
I have no iPad/iPhone to test with. I know pretty much nothing about iPad/iPhone. I must make changes and call the client to have them test. I also have little experience with delivering video content over the web.
By using these instructions you can set up your PC to act as a proxy and capture the web traffic.
When I do this, I see the requests are malformed when coming from the iPhone:
GET http://www.daretogaincontrol.com/test_video
200 OK (text/html)
GET http://www.daretogaincontrol.com/videos/index_videolb/http://www.daretogaincontrol.com/videos/having_fun.mp4
404 Not Found (text/html)
GET http://www.daretogaincontrol.com/videos/index_videolb/http://www.daretogaincontrol.com/videos/having_fun.mp4
404 Not Found (text/html)
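The doubled URL in those 404s suggests the player is prepending its base directory to a src that is already absolute. The helper below is hypothetical (it is not part of the videolightbox code) and just illustrates the bug; the real fix is probably to give the player the path format its configuration expects:

```javascript
// Hypothetical defensive join: an already-absolute URL must not be
// concatenated onto the player's base directory.
function resolveVideoUrl(basePath, src) {
  // Absolute URLs (http:// or https://) are returned unchanged.
  if (/^https?:\/\//i.test(src)) {
    return src;
  }
  // Relative paths are resolved against the player's base directory.
  return basePath.replace(/\/+$/, "") + "/" + src.replace(/^\/+/, "");
}

// Naive concatenation reproduces the malformed request from the proxy log:
const base = "http://www.daretogaincontrol.com/videos/index_videolb/";
const src = "http://www.daretogaincontrol.com/videos/having_fun.mp4";
console.log(base + src);                  // the broken URL the iPhone requested
console.log(resolveVideoUrl(base, src));  // the URL that actually exists
```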
I'm trying to parse streaming video websites in order to play them in an iPad/iPhone app.
For example, on www.veetle.com, opening a channel with an iPad or iPhone lets you see the video, because the page finds a .m3u (or .m3u8) file which can be played. But opening it from a desktop browser, or fetching the channel address in Xcode, doesn't show you this .m3u file; it uses Flash instead.
What I want is to get this .m3u file so I can play the channel. I have been searching all around the web for how to open a website as a mobile device in Xcode, but I haven't found anything. Any ideas?
Thanks.
You need to change the so-called user-agent string in your request.
See here: Changing the userAgent of NSURLConnection
and here: What is the iOS 5.0 user agent string?
The developer tools in Safari also offer the ability to quickly change the user agent, which is nice if you just want to quickly check a site: http://designshack.net/articles/developer-tools-in-safari/
For example, use this to see your current user agent: https://duckduckgo.com/?q=user+agent+
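To sketch the idea (Node-style JavaScript just to illustrate; on iOS you would set the same header on your NSMutableURLRequest as in the linked question): request the page with an iPhone user-agent string, then scan the returned HTML for playlist references. The regex and the extraction approach are assumptions for illustration, not anything Veetle documents:

```javascript
// An iPhone user-agent string (the iOS 5 one from the linked question);
// sending it makes many sites return their mobile markup.
const IPHONE_UA =
  "Mozilla/5.0 (iPhone; CPU iPhone OS 5_0 like Mac OS X) " +
  "AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9A334 Safari/7534.48.3";

// e.g. fetch(channelUrl, { headers: { "User-Agent": IPHONE_UA } })
//        .then(res => res.text())
//        .then(html => console.log(findPlaylistUrls(html)));

// Pull .m3u / .m3u8 playlist references out of the mobile HTML.
function findPlaylistUrls(html) {
  return html.match(/https?:\/\/[^"'\s<>]+\.m3u8?/gi) || [];
}
```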
If I want to use Google Video Chat in my browser, I have to download and install a plugin for it to work. I would like to make a piece of software that creates some interactions with a video displayed in the browser. I assume it might be problematic to do this with one solution for all browsers, so if I need to focus on just one, let's talk about Firefox, although I think the Firefox Add-on SDK would not let me do something as complex as video interaction.
But how does the Google Video Chat plugin work in the browser? It's only an example of one of those plugins that let you do things with your browser (media, in this case) which are normally impossible.
As I understand it, Google Video Chat uses Flash.
I'm looking for something official-looking to back that up now...
Edit: I think this explains it pretty well.
Flash Player exposes certain audio/video functions to the (SWF) application, but it does not give the application access to the raw real-time audio/video data. There are some ActionScript API classes and methods: the Camera class allows you to capture video from your camera, the Microphone class allows you to capture audio from your microphone, the NetConnection/NetStream classes allow you to stream video from Flash Player to a remote server and vice versa, and the Video class allows you to render video either captured by a Camera or received on a NetStream. Given these, to display video in Flash Player it must be either captured by a Camera object or received from a remote server on a NetStream. Luckily, ActionScript allows you to choose which Camera to use for capture.
When the Google plugin is installed, it exposes itself as two Camera devices; actually virtual device drivers. These devices are called 'Google Camera Adaptor 0' and 'Google Camera Adaptor 1', which you can see in the Flash Player settings when you right-click on the video. One of the devices is used to display the local video and the other to display the remote participant's video. The Google plugin also implements the full networking protocol and stack, which I think are based on the GTalk protocol. In particular, it implements XMPP with the (P2P) Jingle extension, and UDP-based media transport for transporting real-time audio/video. The audio path is completely independent of the Flash Player. In the video path: the plugin captures video from the actual camera device installed on your PC and sends it to the Flash Player via one of the virtual camera device drivers. It also encodes and sends the video to the remote user. In the reverse direction, it receives video (over UDP) from the remote user and gives it to the Flash Player via the second of the virtual camera device drivers. The SWF application running in the browser creates two Video objects and attaches them to two Camera objects, one for each of the two virtual video devices, instead of attaching them to your real camera device. This way, the SWF application can display both the local and the remote video in the Flash application.
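The last step can be sketched in ActionScript-style pseudocode (the device names come from the explanation above; treat this as an illustration of the technique, not Google's actual source):

```actionscript
// Pick the plugin's virtual camera by name instead of the physical webcam.
// Camera.getCamera() takes the device's index within Camera.names, as a String.
var idx:int = Camera.names.indexOf("Google Camera Adaptor 0");
var cam:Camera = Camera.getCamera(String(idx));

var video:Video = new Video(320, 240);
video.attachCamera(cam);   // render the stream the plugin feeds into this device
addChild(video);
```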
I am designing my web project with Wicket. I want to add webcam video-capture functionality to it. I have Recorder.swf, which records the video and saves it via RTMP, and Player.swf, which plays back that saved video. Both SWFs work fine if I simply open them in any browser, but if those two SWFs are added to the Wicket application, they don't work. I have tested other SWFs, though; for example, I have played an FLV with a media-player SWF in my project. I can't understand what is happening! Can RTMP not be integrated with Wicket? Thank you.
From what you describe, it doesn't have much to do with Wicket.
Wicket is on the server side.
If it works on one machine but not on another, then it's most probably a problem either in the Flash or on the client side (Flash Player).