I'm using Ionic 3 to build an Android video chat application.
The video chat works perfectly between two tabs in my browser, but on my Android device only the local video shows (the remote video is blank).
I'm using PeerJS for the peer-to-peer connection in my index.html:
I'm using the stunServer {url: "stun:stun.l.google.com:19302"} for the connection.
I'm using the functions shown on the home page: http://peerjs.com/
My config service:
import {Injectable} from '@angular/core';
@Injectable()
export class WebRTCConfig {
peerServerPort: number = 9000;
key:string = '<my peer id>';
stun: string = 'stun.l.google.com:19302';
stunServer = {
url: 'stun:' + this.stun
};
getPeerJSOption() {
return {
// Set API key for the cloud server (you don't need this if you're running your own).
key: this.key,
// Set highest debug level (log everything!).
debug: 3,
// Set it to false because of:
// > PeerJS: ERROR Error: The cloud server currently does not support HTTPS.
// > Please run your own PeerServer to use HTTPS.
secure: false,
config: {
iceServers: [
this.stunServer/*,
this.turnServer*/
]
}
};
}
/**********************/
audio: boolean = true;
video: boolean = false;
getMediaStreamConstraints(): MediaStreamConstraints {
return <MediaStreamConstraints> {
audio: this.audio,
video: this.video
}
}
}
Snippet of my Peer WebRTC service:
createPeer(userId: string = '') {
// Create the Peer object where we create and receive connections.
this._peer = new Peer(/*userId,*/ this.config.getPeerJSOption());
setTimeout(()=> {
console.log(this._peer.id);
this.myid = this._peer.id;
}, 3000)
}
myCallId() {
return this.myid;
}
answer(call) {
call.answer(this._localStream);
this._step2(call);
}
init(myEl: HTMLMediaElement, otherEl: HTMLMediaElement, onCalling: Function) {
this.myEl = myEl;
this.otherEl = otherEl;
this.onCalling = onCalling;
// Receiving a call
this._peer.on('call', (call) => {
// Answer the call automatically (instead of prompting user) for demo purposes
this.answer(call);
});
this._peer.on('error', (err) => {
console.log(err.message);
// Return to step 2 if error occurs
if (this.onCalling) {
this.onCalling();
}
// this._step2();
});
this._step1();
}
call(otherUserId: string) {
// Initiate a call!
var call = this._peer.call(otherUserId, this._localStream);
this._step2(call);
}
endCall() {
this._existingCall.close();
// this._step2();
if (this.onCalling) {
this.onCalling();
}
}
private _step1() {
// Get audio/video stream
navigator.getUserMedia({ audio: true, video: true }, (stream) => {
// Set your video displays
this.myEl.src = URL.createObjectURL(stream);
this._localStream = stream;
// this._step2();
if (this.onCalling) {
this.onCalling();
}
}, (error) => {
console.log(error);
});
}
private _step2(call) {
// Hang up on an existing call if present
if (this._existingCall) {
this._existingCall.close();
}
// Wait for stream on the call, then set peer video display
call.on('stream', (stream) => {
this.otherEl.src = URL.createObjectURL(stream);
});
// UI stuff
this._existingCall = call;
// $('#their-id').text(call.peer);
call.on('close', () => {
// this._step2();
if (this.onCalling) {
this.onCalling();
}
});
}
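One Android-specific pitfall worth checking here (an aside, not part of the original service): URL.createObjectURL(stream) is deprecated for MediaStreams and rejected by newer WebViews, which can leave the remote video element blank even though the call connects. A minimal sketch of a safer attach helper, where the element and stream arguments are placeholders:

```javascript
// Hypothetical helper (not from the original service): attach a MediaStream
// to a media element. Modern browsers/WebViews want the srcObject property;
// URL.createObjectURL(stream) is a legacy fallback only.
function attachStream(videoEl, stream) {
  if ('srcObject' in videoEl) {
    videoEl.srcObject = stream; // standards-compliant path
  } else {
    videoEl.src = URL.createObjectURL(stream); // legacy fallback
  }
  return videoEl;
}
```

With this, the two assignments in _step1 and _step2 would become attachStream(this.myEl, stream) and attachStream(this.otherEl, stream).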
In my chat.ts, I use this to call the function from the peer webrtc service:
call() {
this.webRTCService.call(this.calleeId);
}
It's likely to be a permission problem. You need to grant the app permission to use the camera.
Camera Permission - Your application must request permission to use a
device camera.
<uses-permission android:name="android.permission.CAMERA" />
See
https://developer.android.com/guide/topics/media/camera.html
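Note that on Android 6+ the manifest entry alone is not enough: dangerous permissions like CAMERA must also be requested at runtime. A sketch of a Promise wrapper, assuming the cordova-plugin-android-permissions plugin is installed (the plugin choice is an assumption, not from the original answer):

```javascript
// Sketch only: assumes cordova-plugin-android-permissions is installed.
// Wraps its callback API in a Promise: resolves true if the permission
// is already granted or the user grants it when prompted.
function ensurePermission(permissions, name) {
  return new Promise((resolve, reject) => {
    permissions.checkPermission(name, (status) => {
      if (status.hasPermission) {
        resolve(true); // already granted
      } else {
        permissions.requestPermission(name, // prompt the user
          (req) => resolve(!!req.hasPermission),
          () => reject(new Error('permission request failed')));
      }
    }, () => reject(new Error('permission check failed')));
  });
}

// Usage sketch (after deviceready):
// const p = cordova.plugins.permissions;
// ensurePermission(p, p.CAMERA).then(granted => { if (granted) startCall(); });
```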
I would like to subscribe to an existing user in a channel.
If a user (the host) published before the audience joined and subscribed, the host's remote tracks do not play in my "user-published" handler:
client.getClient().on("user-published", async (user, mediaType) => {
  // ... (this handler never runs for the host who published before I joined)
});
I found a solution. If you have landed on this post, you can try the code below.
We can use client.remoteUsers to get the remote users that are already in the channel.
Sample code:
if (client.remoteUsers.length > 0) {
const host = client.remoteUsers[0];
setState((s) => {
return {
...s,
statusLive: StatusLive.live,
isPlayed: true,
};
});
if (host.hasVideo) {
await client.subscribe(host, "video");
host.videoTrack?.play(ref.current as HTMLElement);
}
if (host.hasAudio) {
await client.subscribe(host, "audio");
host.audioTrack?.play();
}
}
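The snippet above only handles the first remote user. A small generalization (a sketch, assuming the standard AgoraRTC client API with remoteUsers, subscribe, and videoTrack/audioTrack; the container argument is a host DOM element you supply) loops over everyone already published:

```javascript
// Sketch: subscribe to every user that was already published in the channel
// before we joined. Assumes the standard AgoraRTC Web SDK client API.
async function subscribeExistingUsers(client, container) {
  for (const user of client.remoteUsers) {
    if (user.hasVideo) {
      await client.subscribe(user, "video");
      user.videoTrack?.play(container); // render video into the container
    }
    if (user.hasAudio) {
      await client.subscribe(user, "audio");
      user.audioTrack?.play(); // audio needs no container
    }
  }
}
```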
In my Huawei quick app, when the user switches to another page in the app during music playback, or switches to another song using the status bar controls, music playback pauses. Why does this occur?
Listen to audio events on the app home page, not only on the playback page. That way, when the user leaves the playback page, audio events are still received and the playback logic can still be controlled.
Note: Huawei Quick App Engine does not support calling the audio API in app.ux. Therefore, once the user exits the app, the quick app cannot receive audio event callbacks even though the music is still playing in the background.
The following demo has two pages: Main (home page) and Audio. To avoid repeated code and keep it maintainable, the audio code is separated into a common JavaScript file that each page can call.
public utils.js:
import audio from '@system.audio';
export default {
listenAudio() {
var that=this;
console.info("util.js listenAudio ");
audio.onplay = function () {
console.log('audio onplay')
}
audio.onpause = function () {
console.log('audio onpause')
}
audio.onended = function () {
console.log('audio onended')
}
audio.ondurationchange = function () {
console.log('util.js ondurationchange')
var total = audio.duration
console.log('util.js ondurationchange total=' + total)
}
audio.ontimeupdate = function () {
var time = audio.currentTime
// console.log('util.js ontimeupdate time=' + time)
}
audio.onprevious = function () {
audio.cover = 'https://xx.jpg'
audio.title = "Piano music"
audio.artist = "Mozart"
// Replace with the music resource link.
audio.src = 'https://xx.mp3'
console.log('util.js onprevious event from notification')
}
audio.onnext = function () {
audio.cover = 'xx.jpg'
audio.title = 'Pop';
audio.artist = 'Michael Jackson'
// Replace with the music resource link.
audio.src = 'https://xx.mp3'
console.log(' util.js on next event from notification ')
}
},
getAudioPlayState() {
audio.getPlayState({
success: function (data) {
console.log(`getAudioPlayState success: state: ${data.state},src:${data.src},
currentTime:${data.currentTime},autoplay:${data.autoplay},loop:${data.loop},
volume: ${data.volume},muted:${data.muted},notificationVisible:${data.notificationVisible}`);
},
fail: function (data, code) {
console.log('getAudioPlayState fail, code=' + code);
}
});
},
startPlay() {
audio.play();
},
pausePlay() {
audio.pause();
},
stopPlay() {
audio.stop();
},
seekProgress(len) {
audio.currentTime = len;
},
setVolume(value) {
audio.volume = value;
},
setMute(isMuted) {
audio.muted = isMuted
},
setLoop(isloop) {
audio.loop = isloop
},
setStreamType() {
if (audio.streamType === 'music') {
audio.streamType = 'voicecall'
} else {
audio.streamType = 'music'
}
console.error('audio.streamType =' + audio.streamType);
},
setTitle(title) {
console.info('setTitle=' + title);
audio.title = title;
},
setArtist(artist) {
console.info('setArtist artist=' + artist) ;
audio.artist = artist;
},
setCover(src) {
console.info('setCover src=' + src);
audio.cover = src;
}
}
Add an audio event listener to the lifecycle method onShow of the Main page and call listenAudio in utils.js. Sample code:
<script>
import utils from '../Util/utils.js';
module.exports = {
onShow(options) {
utils.listenAudio();
},
}
</script>
Add an audio event listener to the lifecycle method onShow of the Audio page, and call listenAudio in utils.js. The progress callback event is listened separately because the playback progress needs to be displayed on the playback page. Sample code:
onShow(options) {
var that = this;
utils.listenAudio();
audio.ondurationchange = function () {
console.log('audio ondurationchange')
that.total = audio.duration
console.log('audio ondurationchange total=' + that.total)
}
audio.ontimeupdate = function () {
that.time = audio.currentTime
console.log('ontimeupdate time=' + that.time)
}
},
For more details, please check the following guide:
Quick App Audio Development Guide
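Since audio.currentTime and audio.duration are reported in seconds, a small formatter is handy when binding the progress to the playback page. This is a generic sketch, not part of the Quick App API:

```javascript
// Format a second count as m:ss for a playback progress display.
function formatTime(seconds) {
  const s = Math.max(0, Math.floor(seconds));
  const m = Math.floor(s / 60);
  const r = s % 60;
  return m + ':' + String(r).padStart(2, '0');
}
```

In the ontimeupdate callback you would then bind something like that.time = formatTime(audio.currentTime).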
Following the official vue-socket.io docs, I initialize the socket in store.js:
import VueSocketio from 'vue-socket.io'
import socketio from 'socket.io-client'
Vue.use(VueSocketio, socketio(process.env.SOCKET_PATH), store)
But the socket opens right as soon as the project starts. Can I avoid the line Vue.use(VueSocketio, socketio('ws://somepath'), store)
and instead use something like this.$socket.connect('ws://somepath') in my component? And how can I open two different socket connections from one client?
You can use the HTML5 WebSocket API. You don't need to import or require it; it's already available in the browser, and you can open any number of connections. In your component's script:
...
data() {
return {
ws1: null,
ws2: null,
}
},
mounted() {
this.startStream1()
this.startStream2()
},
methods: {
startStream1 () {
let vm = this
vm.ws1 = new WebSocket("wss://somepath1")
vm.ws1.onmessage = function (event) {
vm.$store.dispatch("handleStream", JSON.parse(event.data))
}
vm.ws1.onerror = function (error) {
console.log(error)
}
},
closeStream1 () {
this.ws1 && this.ws1.close()
},
startStream2() {
let vm = this
vm.ws2 = new WebSocket("wss://somepath2")
...
},
...
}
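One trade-off to be aware of: raw WebSockets do not reconnect on their own (something socket.io handles for you), so if you go this route a small exponential backoff helper is worth adding. The base and cap below are arbitrary choices:

```javascript
// Exponential backoff delay for WebSocket reconnects: 1s, 2s, 4s, ...
// capped at 30s. baseMs and capMs are arbitrary defaults.
function reconnectDelay(attempt, baseMs = 1000, capMs = 30000) {
  return Math.min(capMs, baseMs * Math.pow(2, attempt));
}

// Usage sketch inside a component method:
// vm.ws1.onclose = () => setTimeout(() => vm.startStream1(), reconnectDelay(vm.attempt++));
```

Also remember to call closeStream1/closeStream2 in the component's beforeDestroy hook so connections don't leak across route changes.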
I am making an Ionic 3 app. I want notifications to appear even when the app is in the foreground. I have tried using the FCM plugin, but I get notifications only when the app is in the background.
Home.ts
import { AngularFireDatabase } from 'angularfire2/database';
import { Component } from '@angular/core';
import { NavController } from 'ionic-angular';
import firebase from 'firebase';
declare var FCMPlugin;
@Component({
selector: 'page-home',
templateUrl: 'home.html'
})
export class HomePage {
firestore = firebase.database().ref('/pushtokens');
firemsg = firebase.database().ref('/messages');
constructor(public navCtrl: NavController,public afd:AngularFireDatabase) {
this.tokensetup().then((token)=>{
this.storeToken(token);
})
}
ionViewDidLoad() {
FCMPlugin.onNotification(function (data) {
if (data.wasTapped) {
//Notification was received on device tray and tapped by the user.
alert(JSON.stringify(data));
} else {
//Notification was received in foreground. Maybe the user needs to be notified.
alert(JSON.stringify(data));
}
});
FCMPlugin.onTokenRefresh(function (token) {
alert(token);
});
}
tokensetup(){
var promise = new Promise((resolve,reject)=>{
FCMPlugin.getToken(function(token){
resolve(token);
},(err)=>{
reject(err);
});
})
return promise;
}
storeToken(token){
this.afd.list(this.firestore).push({
uid: firebase.auth().currentUser.uid,
devtoken: token
}).then(()=>{
alert('Token stored')
}).catch(()=>{
alert('Token not stored');
})
// this.afd.list(this.firemsg).push({
// sendername:'adirzoari',
// message: 'hello for checking'
// }).then(()=>{
// alert('Message stored');
// }).catch(()=>{
// alert('message not stored');
// })
}
}
The cloud function for notifications:
var functions = require('firebase-functions');
var admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
var wrotedata;
exports.Pushtrigger = functions.database.ref('/messages/{messageId}').onWrite((event) => {
wrotedata = event.data.val();
return admin.database().ref('/pushtokens').orderByChild('uid').once('value').then((alltokens) => {
var rawtokens = alltokens.val();
var tokens = [];
processtokens(rawtokens).then((processedtokens) => {
for (var token of processedtokens) {
tokens.push(token.devtoken);
}
var payload = {
"notification":{
"title":"From" + wrotedata.sendername,
"body":"Msg" + wrotedata.message,
"sound":"default",
},
"data":{
"sendername":wrotedata.sendername,
"message":wrotedata.message
}
}
return admin.messaging().sendToDevice(tokens, payload).then((response) => {
console.log('Pushed notifications');
}).catch((err) => {
console.log(err);
})
})
})
})
function processtokens(rawtokens) {
var promise = new Promise((resolve, reject) => {
var processedtokens = []
for (var token in rawtokens) {
processedtokens.push(rawtokens[token]);
}
resolve(processedtokens);
})
return promise;
}
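The processtokens helper simply flattens the keyed snapshot into an array; in modern Node it can be reduced to a one-liner. An equivalent sketch:

```javascript
// Equivalent to processtokens above: Object.values flattens the
// { pushKey: { uid, devtoken } } snapshot object into an array of records.
function processtokens(rawtokens) {
  return Promise.resolve(Object.values(rawtokens || {}));
}
```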
It works only when the app is in the background; but when I exit the app entirely, so it is not even in the background, I don't get any notification.
You need to edit the FCM plugin files. I have found a solution for Android only so far.
I use https://github.com/fechanique/cordova-plugin-fcm, an FCM plugin for Android and iOS in Cordova.
You need to edit the file MyFirebaseMessagingService.java around line 53 (the line number may differ).
In this file there is a method onMessageReceived; at the end of the method there is a commented-out line that calls another method, sendNotification(....):
sendNotification(remoteMessage.getNotification().getTitle(), remoteMessage.getNotification().getBody(), data);
You have to uncomment this line and change the last parameter from remoteMessage.getData() to data (the data variable is already there in the code).
Then comment out the line FCMPlugin.sendPushPayload( data );.
Now you are good to go: you will receive banner (heads-up) notifications even when the app is open (foreground).
If you find anything for iOS, please let me know!
I am using the firebase plugin for Ionic 3.
There is a check whether the notification data contains "notification_foreground", and the result is saved in the variable foregroundNotification:
if(data.containsKey("notification_foreground")){
foregroundNotification = true;
}
Then it creates the showNotification variable, which decides whether the notification should be shown, and passes it to sendMessage (the function that shows the notification):
if (!TextUtils.isEmpty(body) || !TextUtils.isEmpty(title) || (data != null && !data.isEmpty())) {
boolean showNotification = (FirebasePlugin.inBackground() || !FirebasePlugin.hasNotificationsCallback() || foregroundNotification) && (!TextUtils.isEmpty(body) || !TextUtils.isEmpty(title));
sendMessage(data, messageType, id, title, body, showNotification, sound, vibrate, light, color, icon, channelId, priority, visibility);
}
Your payload should contain notification_foreground, notification_title and notification_body.
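Putting that together, a data payload that triggers the foreground banner might look like the sketch below. The field values are placeholders; the notification_* keys are the ones the plugin's message service checks, and note that FCM data values must all be strings:

```javascript
// Example data payload for admin.messaging().sendToDevice (values are
// placeholders). The notification_* keys are read by the plugin's
// messaging service to build the foreground notification.
const payload = {
  data: {
    notification_foreground: "true", // FCM data values must be strings
    notification_title: "From adirzoari",
    notification_body: "hello for checking",
  },
};
```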
I am trying to test a real-time data connection between peers using RTCMultiConnection.
Setting up a session/room seems to work, but once it has been made, peers cannot join. If I run this function again from another browser while a session is open, it still says the room does not exist and opens a new one rather than joining in.
The channel and session IDs are identical, so why does the peer not find the session?
function makeOrJoinRoom(id){
channelid = 'channel'+id;
roomid = 'room'+id;
sessionMedia = {audio: false, video: false, data: true};
var connection = new RTCMultiConnection(channelid);
connection.socketURL = 'https://rtcmulticonnection.herokuapp.com:443/';
connection.checkPresence( roomid, function(roomExists, roomid) {
alert('checking presence...');
alert('Room exists='+roomExists);
if(roomExists) {
alert('I am a participant');
connection.join({
sessionid: roomid,
session: sessionMedia
});
} else {
alert('I am the moderator');
connection.session = sessionMedia;
connection.open({
sessionid: roomid
});
}
});
}
Please replace your function with this:
function makeOrJoinRoom(roomid) {
var connection = new RTCMultiConnection();
connection.session = {
data: true
};
connection.socketURL = 'https://rtcmulticonnection.herokuapp.com:443/';
alert('checking presence...');
connection.checkPresence(roomid, function(roomExist, roomid) {
alert('Room exists=' + roomExist);
if (roomExist === true) {
alert('I am a participant');
connection.join(roomid);
} else {
alert('I am the moderator');
connection.open(roomid);
}
});
connection.onopen = function(event) {
alert('WebRTC chat opened!');
};
}
// call above function like this
makeOrJoinRoom('your-unique-room-id');