Video tag on Chromecast doesn't trigger the 'waiting' event/callback

I don't want to use MPL; I want to use my own HTML5 player on the Chromecast receiver. But when the video tag doesn't have enough buffered data to keep playing, it doesn't trigger the 'waiting' event, which works fine in Chrome.
Does anyone know why Chromecast doesn't trigger the 'waiting' event?
Or is there another method I can use to solve this?
I found the following description at https://developers.google.com/cast/docs/player:
> How can I detect the buffering state?
> When you are using MPL for adaptive streaming, you should check the MPL state for data underflow instead of checking for waiting/stalled events of the media element. MPL pauses playback to buffer enough media data in the source buffer to resume playback without stutter.
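For a custom receiver that does not use MPL, one workaround is to combine the standard media-element events with a small poller that watches whether currentTime keeps advancing. A minimal sketch, not anything from the Cast SDK; the element id, interval, and logging are placeholder assumptions:

    // Fallback buffering ("underflow") detector for a plain HTML5 video element.
    const video = document.getElementById("player") as HTMLVideoElement;

    let lastTime = -1;
    let buffering = false;

    function setBuffering(on: boolean): void {
      if (on !== buffering) {
        buffering = on;
        console.log(on ? "buffering (underflow)" : "playing");
        // show or hide a spinner here
      }
    }

    // Use the standard events where they do fire...
    video.addEventListener("waiting", () => setBuffering(true));
    video.addEventListener("stalled", () => setBuffering(true));
    video.addEventListener("playing", () => setBuffering(false));

    // ...and poll as a fallback for platforms where they don't:
    // if playback should be running but currentTime is stuck, assume underflow.
    setInterval(() => {
      if (!video.paused && !video.ended) {
        setBuffering(video.currentTime === lastTime);
      }
      lastTime = video.currentTime;
    }, 500);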

Related

Flutter lower background volume while TTS plays

I have a Text-to-Speech app and I'm wondering if there is a way to let the user listen to the TTS audio while they are listening to their music, e.g. on Spotify or another audio player.
At the moment the TTS plays over the top of Spotify by default. Spotify doesn't stop when the TTS starts, which is good, but it is too loud.
Does anyone know if it's possible to lower the volume of Spotify or other music that is playing when the user presses play on the TTS?
This requires the use of AudioManager and audio focus. Unfortunately, there isn't yet any published Flutter package that communicates over a platform channel to control volume or request audio focus programmatically.
developer.android.com/guide/topics/media-apps/volume-and-earphones
> Controlling stream volume programmatically
> In rare cases, you can set the volume of an audio stream programmatically. For example, when your app replaces an existing UI. This is not recommended because the Android AudioManager mixes all audio streams of the same type together. These methods change the volume of every app that uses the stream. Avoid using them:
> adjustStreamVolume()
> adjustSuggestedStreamVolume()
> adjustVolume()
> setStreamVolume()
> setStreamSolo()
> setStreamMute()
About Audio Focus -
developer.android.com/guide/topics/media-apps/audio-focus
> Managing audio focus
> Two or more Android apps can play audio to the same output stream simultaneously. The system mixes everything together. While this is technically impressive, it can be very aggravating to a user. To avoid every music app playing at the same time, Android introduces the idea of audio focus. Only one app can hold audio focus at a time.
> When your app needs to output audio, it should request audio focus. When it has focus, it can play sound. However, after you acquire audio focus you may not be able to keep it until you’re done playing. Another app can request focus, which preempts your hold on audio focus. If that happens your app should pause playing or lower its volume to let users hear the new audio source more easily.
> Audio focus is cooperative. Apps are encouraged to comply with the audio focus guidelines, but the system does not enforce the rules. If an app wants to continue to play loudly even after losing audio focus, nothing can prevent that. This is a bad experience and there's a good chance that users will uninstall an app that misbehaves in this way.
#cmd_prompter is right in that you're looking for the AudioManager on Android. (On iOS, the equivalent is the AVAudioSession.)
However, there is now a Flutter package available for this use case: the audio_session package.
The package lets you determine how your app deals with background audio. You can ask other apps to "duck" their audio (i.e. temporarily lower their volume) or to pause playback altogether.

How to mute local audio playback while still receiving raw audio data with Agora API?

I'm trying to play back raw audio data from an Agora remote stream through an AudioSource in Unity. For this, I have to first disable the default playback method used by the Agora SDK. I have tried MuteAllRemoteAudioStreams but this stops the API from receiving remote audio data altogether.
I have also tried AdjustPlaybackSignalVolume and AdjustAudioMixingVolume, which successfully mute the audio playback but this also makes the OnPlaybackAudioFrameHandler callback receive empty audio frames, stopping me from accessing raw audio data.
SetAudioPlaybackDeviceMute simply mutes the playback device, which is not what I am looking for.
Is there a way to only mute the playback of remote streams while also being able to access raw audio data?
Try mRtcEngine.AdjustPlaybackSignalVolume(0);
This will turn the playback volume from the channel down to zero, but your AudioSource volume is not affected.
If you have other issues, please come chat with us and search for answers in our slack group:
https://agoraiodev.slack.com/messages/unity-help-me

Play two audio sources during a live video stream

I'm working with the Azure Media Player (https://ampdemo.azureedge.net/). I need a second audio source (which will be music) to be overlaid on top of the video of the presenter talking. I see there's the ability to select between multiple audio streams when the video is stored in Azure (English/Spanish/etc.). However, in my case the live stream will be video of the presenter talking, and I'd like viewers to be able to select a music stream of their choice during the live stream, with that music stored in Azure.
Is this possible?
Then, taking it a step further, does the player provide the capability of setting the volume on each stream? If not, is the player extensible?
Azure Media Player (AMP) doesn't allow playing multiple audio streams at the same time. You could achieve your desired behavior by including a hidden HTML5 video or audio element and using that to play your additional audio. You'll probably want to use JavaScript to "synchronize" the hidden player with Azure Media Player, i.e. when AMP plays, you play the hidden player; when AMP pauses, you pause the hidden player. A rough sketch of that wiring is shown below.
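A minimal sketch, assuming an existing AMP instance on the page, a hidden <audio id="music"> element for the music stream, and that the AMP player exposes addEventListener with amp.eventName.play / amp.eventName.pause and a volume() method as in the AMP samples; the element ids and helper names are placeholders:

    declare const amp: any; // Azure Media Player global from the AMP <script> tag

    const player = amp("azuremediaplayer"); // look up the existing AMP instance by element id
    const music = document.getElementById("music") as HTMLAudioElement; // hidden music element

    // Mirror AMP's play/pause state onto the hidden music element.
    player.addEventListener(amp.eventName.play, () => { void music.play(); });
    player.addEventListener(amp.eventName.pause, () => { music.pause(); });

    // Independent volume control for each source.
    function setMusicVolume(v: number): void {
      music.volume = Math.min(1, Math.max(0, v));
    }
    function setPresenterVolume(v: number): void {
      player.volume(v); // assumed video.js-style volume() on the AMP player
    }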

Phonegap HTML5 audio does not resume playback after incoming phone call on iPhone

I have created an app with PhoneGap that plays music from an online stream. It continues playing in the background while the iPhone is locked, but it stops when an incoming phone call arrives.
The problem is that when the phone call ends, the music does not play anymore. I use an HTML5 audio element to play the stream, and my app traces show that the audio element's play and playing events are triggered after the phone call ends.
However, it simply does not produce any music or sound at all.
Hook up to the pause and resume events of Cordova in your app. If you're in the paused state, your app is suspended and will not play anything. You should invoke your play function on the resume event. For more info, see Events: http://cordova.apache.org/docs/en/2.6.0/cordova_events_events.md.html#Events
It happens when the play command is executed before the audio device is enabled for your app.
The app is suspended when the call arrives, and after the call ends it takes a few seconds for the audio device to be re-initialized for your application. If your play instruction executes before the audio device is ready, nothing plays.
You can solve this with Cordova's resume event, for example as sketched below.
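A minimal sketch, assuming an <audio id="stream"> element for the live stream; the 2-second delay is an arbitrary guess to give the audio session time to come back after the call:

    const stream = document.getElementById("stream") as HTMLAudioElement;

    document.addEventListener("deviceready", () => {
      // App going to the background or being interrupted (e.g. an incoming call).
      document.addEventListener("pause", () => stream.pause(), false);

      // Back in the foreground: wait a moment for the audio device, then restart playback.
      document.addEventListener("resume", () => {
        setTimeout(() => { void stream.play(); }, 2000);
      }, false);
    }, false);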
As an Android programmer, I've seen similar issues. When dealing with Cordova WebViews running HTML5, an interruption of the activity by something else, such as a phone call, needs to be handled in code. In Android, when there is an interruption, onPause or a similar callback is invoked; inside it, you would need to save state, including the URL of the page in the WebView.
When the user navigates back, you need to restore that state.
I know I'm talking about Android while this question is about iOS, but there is some overlap when doing this kind of thing.
See: Save & restoring WebView in an embedded PhoneGap app

Record HTTP Live Streaming Video To File While Watching?

I am trying to create streaming-video DVR-like functionality in an app I am developing. I have an HTTP Live Stream that I have successfully gotten to play on the iPad. I want the user to be able to push the "Record" button and begin recording the video that is currently playing from that point. This video file will be accessible from the app or from the camera roll. Currently, I am using the MPMoviePlayerController object to play the video stream. I do not see any methods of accessing the data from the object in Apple's documentation. Here are some thoughts I had on ways of going about this.
1) Somehow access the video data from MPMoviePlayerController, and write this to a file. Or use another type of player object that will allow me to play the video and access the currently playing data.
2) Implement some sort of screen capture recording that gets a video capture of the iPad's screen. This would allow me to record the video in a "screenshot" sort of way.
3) Locate the HTTP Live Streaming video segments where they are stored by MPMoviePlayerController. Presumably they need to be stored somewhere on the iPad for playback. Is there a way of accessing these files?
4) Manually download the stream's video segments over HTTP while streaming the file. This seems less than ideal since the stream would have to be downloaded twice.
5) This could work. Periodically download the video segments to the iPhone. Set up a local HTTP server on the iPhone and serve the videos to the MPMoviePlayerController. This way the video segments could be marked for recording and assembled into a video.
6) I do have control of the streaming server. I could write some server side code to record the video on the server end, then send the video to the iPad after the fact. I would rather not do this.
Has anyone done any of these things? Ideally the iPhone would just be able to access the video data somehow and easily record it. I would rather not get into options 4, 5, or 6 (above) if I don't have to.
Thanks in advance.
DVR on the device is generally discouraged, due to the limited space available and other factors like battery life, processing power, cleanup procedures after the user stops the DVR, etc.
If you want to achieve DVR playback on iOS devices (or other devices using HLS), I suggest you keep the video server side. The live stream is already captured and segmented server side; all you would have to do is keep the segments a bit longer instead of deleting them. By using the EXT-X-PLAYLIST-TYPE and EXT-X-MEDIA-SEQUENCE tags, you can signal to the player that it is opening a live stream which has DVR (earlier) video available, as in the playlist sketch below.
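For illustration, a rough example of what such a live playlist could look like (segment names and durations are placeholders): an EVENT playlist keeps every segment since the start, and the absence of EXT-X-ENDLIST tells the player the stream is still live while letting it seek back through everything already listed.

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXT-X-PLAYLIST-TYPE:EVENT
    #EXTINF:10.0,
    segment0.ts
    #EXTINF:10.0,
    segment1.ts
    #EXTINF:10.0,
    segment2.ts
    # ...new segments are appended as the live stream continues; no #EXT-X-ENDLIST yet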
Alternatively, you can use a server that does that out of the box, for example Wowza. Here's an article on how to achieve this with Wowza