Reading videos using cordova-plugin-media-streaming closes the window automatically - ionic-framework

I'm working on an Ionic mobile application where I need to stream videos by providing the URI of an online video, so I used the cordova-plugin-media-streaming plugin offered by Cordova.
My problem is that the window playing the video closes automatically after the video finishes, and the user can't play the video again in that window.
In the official documentation of the plugin [that I found here], there is an attribute called shouldAutoClose that should be set to false to avoid this problem. But this didn't work for me.
Here is the code I used to play a streaming video:
startVideo(item: Multimediasendtrust) {
  let options = {
    successCallback: () => { console.log('Finished Video'); },
    errorCallback: (e) => { console.log('Error: ', e); },
    orientation: 'portrait',
    controls: true,
    shouldAutoClose: false
  };
  console.log('those are option ', options);
  console.log('the link of the video ', item.url_media);
  this.streamingMedia.playVideo(item.url_media, options);
}
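For reference, a typed variant of the same call is sketched below, assuming the Ionic Native streaming-media wrapper (@ionic-native/streaming-media/ngx) is installed and injected; typing the options as StreamingVideoOptions lets the compiler flag a misspelled option name such as shouldAutoClose.

// Sketch only: assumes the Ionic Native streaming-media wrapper is installed
// and provided in the app module; the component shown here is illustrative.
import { Component } from '@angular/core';
import { StreamingMedia, StreamingVideoOptions } from '@ionic-native/streaming-media/ngx';

@Component({ selector: 'app-video', template: '' })
export class VideoPage {
  constructor(private streamingMedia: StreamingMedia) {}

  startVideo(url: string) {
    const options: StreamingVideoOptions = {
      successCallback: () => { console.log('Finished video'); },
      errorCallback: (e) => { console.log('Error: ', e); },
      orientation: 'portrait',
      controls: true,
      shouldAutoClose: false // only honored by plugin versions that support this option
    };
    this.streamingMedia.playVideo(url, options);
  }
}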
Can anyone help, please? Thanks in advance.

Related

How to implement an audio listening stream in Flutter Web?

I'm making a Flutter web app which has to access the microphone and stream the audio data as an array of integers for further processing.
I already succeeded doing this in plain JavaScript.
Things I've tried:
The flutter_sound library, but I couldn't get it to work. I also can't find any working examples for that library.
dart:web_audio seems to be a thing, but apparently you can't even import it yet in normal Flutter Apps.
dart:js is what I'm trying right now. I was able to create an AudioContext with var audioContext = JsObject(context['AudioContext']);. However, after that I don't know what syntax to use to translate the JavaScript code into Dart. Here is what I'm doing in JavaScript:
function initAudio() {
  try {
    audioCtx = new AudioContext();
    const GotAudioStream = function(stream) {
      const audioSource = audioCtx.createMediaStreamSource(stream);
      const audioProcessor = audioCtx.createScriptProcessor(bufSize, 1, 1);
      audioSource.connect(audioProcessor);
      audioProcessor.connect(audioCtx.destination);
      audioStarted = true;
      audioProcessor.onaudioprocess = function(e) {
        checkAudioBuffer(e.inputBuffer);
      };
    };
    navigator.mediaDevices.getUserMedia({ audio: true, video: false }).then(GotAudioStream);
  }
  catch (err) {
    console.log(err);
  }
}
Does anyone have experience with the dart:js library or another idea on how to implement a simple (live!) audio stream in Flutter Web?
Regards,
Kaisky

Google Assistant media player goes away on pause

BRIEF:
I have created a Google Assistant application that plays music using Google Actions Builder. On a specific command, it triggers a webhook. The webhook returns a MediaResponse OR Media from the '@assistant/conversation' library, and the code is as follows:
conv.add(new Media({
  mediaType: 'AUDIO',
  start_offset: `3.000000001s`,
  mediaObjects: [{
    name: music,
    description: 'This is example of code ',
    url: `https://example.com`,
    image: {
      large: {
        url: 'https://example.com'
      },
    }
  }]
}));
It is running well on Android and the emulator.
ISSUE:
When I pause the music (USING PAUSE BUTTON), the media player goes away.
What should I do to keep the media player so that I can resume the music?
Any information regarding this would be appreciated. Thanks in advance.
EDITED: It works well for showing the media player and playing music, but if you click the pause button it goes away on both of the above devices (Android/Test Emulator).
Just adding an acknowledgment to it fixed the issue.
app.handle('media_status', (conv) => {
  const mediaStatus = conv.intent.params.MEDIA_STATUS.resolved;
  switch (mediaStatus) {
    case 'FINISHED':
      conv.add('Media has finished playing.');
      break;
    case 'FAILED':
      conv.add('Media has failed.');
      break;
    // Note: 'PAUSED' || 'STOPPED' only ever matches 'PAUSED'; use fall-through cases instead
    case 'PAUSED':
    case 'STOPPED':
      if (conv.request.context) {
        // Persist the media progress value
        const progress = conv.request.context.media.progress;
      }
      // Acknowledge pause/stop so the media player stays visible
      conv.add(new Media({
        mediaType: 'MEDIA_STATUS_ACK'
      }));
      break;
    default:
      conv.add('Unknown media status received.');
  }
});

Any API documentation for the Cast package?

So I recently got started with the Flutter package cast in order to communicate with Chromecast devices, but I couldn't find any details on how to use it. If you could give me some help in actually playing a media file such as a song or a video, that would be wholesome!
My current code:
CastSession session;

Future<void> _connect(BuildContext context, CastDevice object) async {
  session = await CastSessionManager().startSession(object);
  session.stateStream.listen((state) {
    if (state == CastSessionState.connected) {
      // Close my custom GUI
      Navigator.pop(context);
      _sendMessage(session);
    }
  });
  session.messageStream.listen((message) {
    print('receive message: $message');
  });
}

// My video playing code
session.sendMessage(CastSession.kNamespaceReceiver, {
  'type': 'MEDIA',
  'link': 'http://somegeneratedurl.com',
});
OK, so I found an answer. Unfortunately, there is no command to play a video file. I've looked through the Google Cast protocol reference and there is no command for playing video files. I found this package that can cast videos, and I'm going to use that package instead.

How to get Video Played Duration While Using Streaming Media

I'm using the Streaming Media plugin in my Ionic app.
The video plays, but how can we retrieve the played duration of the video while using this plugin?
playVideo(attachment) {
  var url = this.promotogramPath + attachment;
  let options: StreamingVideoOptions = {
    successCallback: (data) => { console.log('Video played', data); },
    errorCallback: (e) => { console.log('Error streaming'); },
    orientation: 'landscape'
  };
  this.streamingMedia.playVideo(url, options);
}
I used this code, and in the successCallback function I'm getting 'OK'.
How can we get the video played duration?
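One rough workaround is sketched below; it is not a plugin feature, just an assumption that an approximate value is good enough: record the time when playVideo is called and compute the elapsed time in successCallback. It ignores pauses and seeking inside the native player.

// Rough workaround sketch: approximate the watched duration with wall-clock time.
// Not a plugin API; pauses/seeks inside the native player are not accounted for.
playVideo(attachment: string) {
  const url = this.promotogramPath + attachment;
  const startedAt = Date.now();
  const options: StreamingVideoOptions = {
    successCallback: (data) => {
      const watchedSeconds = (Date.now() - startedAt) / 1000;
      console.log('Player closed after ~', watchedSeconds, 'seconds', data);
    },
    errorCallback: (e) => { console.log('Error streaming', e); },
    orientation: 'landscape'
  };
  this.streamingMedia.playVideo(url, options);
}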

Accessing the iPhone compass with JavaScript

Does anyone know if it's possible to access the iPhone compass in Safari using JavaScript? I see how the GPS can be accessed, but I can't figure out the compass.
On iOS, you can retrieve the compass value like this.
window.addEventListener('deviceorientation', function(e) {
  console.log(e.webkitCompassHeading);
}, false);
For more information, read the Apple DeviceOrientationEvent documentation.
Hope this helps.
You cannot access that information via JavaScript, unless you're using something like iPhoneGap.
At the time this was true; as of iOS 5 you can use the compass heading in JS: https://developer.apple.com/documentation/webkitjs/deviceorientationevent/1804777-webkitcompassheading
For Android it works automatically; for iOS it needs a click to start it.
Here's part of the code you can use for that:
startBtn.addEventListener("click", startCompass);

function startCompass() {
  if (isIOS) {
    // On iOS 13+, requestPermission() must be called from a user gesture (the click above)
    DeviceOrientationEvent.requestPermission()
      .then((response) => {
        if (response === "granted") {
          window.addEventListener("deviceorientation", handler, true);
        } else {
          alert("has to be allowed!");
        }
      })
      .catch(() => alert("not supported"));
  } else {
    window.addEventListener("deviceorientationabsolute", handler, true);
  }
}

function handler(e) {
  const degree = e.webkitCompassHeading || Math.abs(e.alpha - 360);
}
The full tutorial is here; try the demo as well:
https://dev.to/orkhanjafarovr/real-compass-on-mobile-browsers-with-javascript-3emi
I advise you to use LeafletJS with this plugin:
https://github.com/stefanocudini/leaflet-compass
It is very simple to use, with events and methods.
You can try a demo here:
https://opengeo.tech/maps/leaflet-compass/
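A minimal usage sketch, assuming the plugin exposes an L.Control.Compass control as shown in its README (the map container id and tile URL below are placeholders):

// Sketch only: assumes leaflet-compass exposes L.Control.Compass (see the plugin README).
var map = L.map('map').setView([51.5, -0.09], 13);
L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png').addTo(map);

// Add the compass control; it reads device orientation events and rotates its needle.
map.addControl(new L.Control.Compass());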