I'm having trouble with the Flutter SDK example project, specifically the WebRTC portion. I'm unable to get a video call connected.
I have the project deployed to two Android devices. One is physical, the other is an emulator.
I updated credentials.dart to match my application information.
One device/app is logged in as user1 and the other as user2, with the opponents pointed at each other.
The steps I have performed are as follows:
Settings -> init
Auth -> Login
Chat -> Connect
WebRTC -> init
WebRTC -> Subscribe Events WebRTC
WebRTC -> Call WebRTC
I can see debug info in each app after I click Call, so it seems the chat connection is working to deliver the signaling messages.
When I try to accept, I get an error saying 'id' is a required parameter.
Looking at the example source, it seems the 'sessionId' is only ever set when a call is placed. How would the receiving end ever get the sessionId?
Also, anything like 'enable video' or 'start rendering local' results in a null reference error being displayed in a dialog.
The QuickBlox team has resolved this issue in an updated version of the Flutter SDK.
The logic described above now works correctly.
I can confirm the latest SDK (0.2.4-alpha) fixes the issue.
Also, be sure to grant the Camera and Microphone permissions.
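For anyone hitting the same "'id' is a required parameter" error: the receiving side is supposed to read the sessionId out of the incoming-call event it subscribed to, and pass that id to accept. Below is a rough sketch of that flow with the quickblox_sdk package. The import paths, the QBRTCEventTypes.CALL constant, the "payload"/"session"/"id" keys, and the accept method are written from memory of the SDK sample and may differ between SDK versions, so verify them against your version before relying on this.

```dart
import 'package:flutter/services.dart';
import 'package:quickblox_sdk/quickblox_sdk.dart';
import 'package:quickblox_sdk/webrtc/constants.dart';

/// Sketch only: event constant and payload keys are assumed from the
/// SDK sample and may differ between quickblox_sdk versions.
Future<void> listenForIncomingCalls() async {
  try {
    await QB.webrtc.subscribeRTCEvent(QBRTCEventTypes.CALL, (data) async {
      // The callee never creates a session itself; it reads the sessionId
      // out of the CALL event that was delivered over the chat connection.
      final payload = Map<dynamic, dynamic>.from(data["payload"]);
      final session = Map<dynamic, dynamic>.from(payload["session"]);
      final String sessionId = session["id"];

      // Now the receiving end has the 'id' that accept() requires.
      await QB.webrtc.accept(sessionId);
    });
  } on PlatformException catch (e) {
    // e.g. the WebRTC module was not initialized before subscribing.
    print('subscribeRTCEvent failed: $e');
  }
}
```

On the caller side, the session (and therefore its id) comes back from the call method itself, which is why the example only ever sets sessionId when placing a call; the callee only ever sees it through the event above.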
Related
We are developing an app with Angular/Ionic in which we use the Capacitor Jitsi plugin for video calls. What we are now trying to do is receive notifications (via Firebase) like in WhatsApp, with an incoming-call screen and two buttons to accept and decline. Any idea on how to do that?
Thanks
If you have the option of changing the notification service instead of using Firebase directly, you could use OneSignal, which extends Firebase and already has a service called VoIP notifications that should cover your needs. Here is the link:
https://documentation.onesignal.com/docs/voip-notifications
In case you are restricted to Firebase, or want to know how this can be done, below is the way to achieve it.
For Android:
First, the logic part: you need to add some code in the native layer, since hybrid apps usually can't interact from the JavaScript side with the native side when the app is not launched. The native code is what wakes the application on a specific event, such as a notification being received or any other action the phone's system can report.
Second, the technical part: you need to add broadcast receivers; the receivers' role is to interact with the system as native code. There's an example in the link below:
https://www.digitalocean.com/community/tutorials/android-broadcastreceiver-example-tutorial
There is also a video about foreground and background broadcast receiver services in the link below:
https://www.youtube.com/watch?v=rlzfcqDlovg
The code from the video is on GitHub:
https://github.com/borntocoderepos/callrecorder
In the YouTube example, the author launches a toast message on an incoming phone call whether the app is open or closed (foreground or background). In the same way, you can launch your app with an intent that passes data, and capture that data on app start as a deep link with Capacitor (https://capacitorjs.com/docs/guides/deep-links) or Cordova (https://ionicframework.com/docs/native/deeplinks).
And instead of listening for network or phone-call events, you can listen for notifications; you will of course need to research your specific topic and the notification service you choose.
As for the video and the tutorial, I'm not sure about the quality of the code, so make sure to research how this is done in several different places (the code could be outdated, of poor quality, or an incomplete service; more on this point below).
Android has a policy about background and foreground services: once you start a service, you need to stop it when you're done. So after you receive the notification and launch your app, make sure you stop listening, since it costs battery and the Play Store could flag the app as harmful.
For iOS it should be the same concept, so research this topic as well; but for iOS, as I remember the policy, the receivers should not stay awake for more than 15 minutes, so keep this in mind and make sure you stop the receivers right after launching your hybrid app.
The broadcast receiver equivalent for iOS:
http://www.andrewcbancroft.com/2014/10/08/fundamentals-of-nsnotificationcenter-in-swift/
I am building a live-streaming Flutter application. I have implemented live streaming and live comments (using Pusher). When a viewer in the audience sends a request to join the live stream as a co-host, the main host accepts the request and the viewer navigates to the live-streaming page again with the role of Broadcaster and the same channel name as before. But instead of showing two camera frames, one for the main host and one for the newly joined co-host, it shows a black screen and throws an exception: Unhandled Exception: PlatformException(-7, , null, null). I have searched this exception; it says the SDK is not initialized properly, but I have initialized it properly, so I don't know where I am wrong. My question is: how can I implement multiple-host functionality in a Flutter app using Agora? I have also read the Agora documentation but have been unable to fix the issue. Any help will be appreciated. Thanks.
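Not an official answer, but for comparison, here is a rough sketch of the usual co-host join sequence with the agora_rtc_engine package (the 3.x/4.x-style API; newer 6.x releases use a different API). The appId, token, channelName, and uid are placeholders. The -7 error matches the "SDK not initialized" message, which usually means some engine call ran before RtcEngine.create had completed, e.g. when the live page is pushed a second time for the co-host.

```dart
import 'package:agora_rtc_engine/rtc_engine.dart';

/// Sketch only (agora_rtc_engine 3.x/4.x-style API): appId, token,
/// channelName, and uid are placeholders for your own values.
Future<RtcEngine> joinAsCoHost({
  required String appId,
  required String token,
  required String channelName,
  required int uid,
}) async {
  // Create/initialize the engine before ANY other engine call.
  // If the live page is rebuilt for the co-host, make sure this has
  // completed before joinChannel runs, otherwise you get error -7.
  final RtcEngine engine = await RtcEngine.create(appId);

  await engine.enableVideo();
  await engine.setChannelProfile(ChannelProfile.LiveBroadcasting);
  // The co-host must join as a Broadcaster, not as Audience.
  await engine.setClientRole(ClientRole.Broadcaster);

  engine.setEventHandler(RtcEngineEventHandler(
    joinChannelSuccess: (channel, uid, elapsed) {
      // Safe to render the local (co-host) preview now.
    },
    userJoined: (remoteUid, elapsed) {
      // Render the main host's remote video using remoteUid.
    },
  ));

  await engine.joinChannel(token, channelName, null, uid);
  return engine;
}
```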
I want to create an online/offline status feature for users in my app.
I tried onDisconnect in Firebase and it didn't work for me.
You can use socket.io with Flutter. When a user opens the app, the socket tells the server that this specific user is connected, and the user is then able to see the other connected users. Here is an article on how to use socket.io in your project.
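Not from the article, just a minimal sketch of the idea with the socket_io_client package; the server URL and the 'user-online' / 'online-users' event names are placeholders for whatever your backend defines.

```dart
import 'package:socket_io_client/socket_io_client.dart' as io;

/// Hypothetical presence wiring: the server URL, event names, and the
/// userId payload must be replaced with your own backend's contract.
void connectPresence(String userId) {
  final socket = io.io(
    'https://your-server.example.com',
    io.OptionBuilder()
        .setTransports(['websocket']) // required on Flutter/mobile
        .disableAutoConnect()
        .build(),
  );

  socket.onConnect((_) {
    // Tell the server this user is now online.
    socket.emit('user-online', {'userId': userId});
  });

  // The server broadcasts the current list of online users.
  socket.on('online-users', (data) {
    print('online users: $data');
  });

  socket.onDisconnect((_) {
    // The server can mark this user offline on its side.
  });

  socket.connect();
}
```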
You can try pusher presence channels.
It is based on websockets, and it tells you who is connected to the channel (who is present). You can find the use cases for presence channels here.
Click here to find out how to build presence channels.
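For illustration only, here is roughly what a presence-channel subscription looks like with the pusher_channels_flutter package. The key, cluster, channel name, and auth handling are placeholders, and the callback names are written from memory of that package's README, so double-check them against the current docs.

```dart
import 'package:pusher_channels_flutter/pusher_channels_flutter.dart';

/// Sketch only: presence channels also require a server-side auth
/// endpoint; onAuthorizer must return the auth payload from your server.
Future<void> startPresence() async {
  final pusher = PusherChannelsFlutter.getInstance();

  await pusher.init(
    apiKey: 'YOUR_APP_KEY', // placeholder
    cluster: 'eu',          // placeholder
    onAuthorizer: (channelName, socketId, options) async {
      // Call your own auth endpoint here and return its JSON response,
      // e.g. {'auth': '...', 'channel_data': '...'}.
      return {};
    },
    onSubscriptionSucceeded: (channelName, data) {
      // 'data' contains the members currently present on the channel.
      print('subscribed to $channelName: $data');
    },
    onMemberAdded: (channelName, member) {
      print('$channelName: member came online: $member');
    },
    onMemberRemoved: (channelName, member) {
      print('$channelName: member went offline: $member');
    },
  );

  // Presence channel names must start with "presence-".
  await pusher.subscribe(channelName: 'presence-online-users');
  await pusher.connect();
}
```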
I'm currently building an online/offline status bar for a Flutter app based on whether a user is connected to a presence channel. Once I have working code I'll post an update.
I just integrated AdMob into my project and I get a whole bunch of these error messages in the Xcode output.
The app does not communicate with the internet and does not open a WKWebView (all I found on SO were references to WKWebView, like https://stackoverflow.com/a/44623268/14414215, but that doesn't seem relevant to me since I don't use WKWebView). All I did was integrate Google-Mobile-Ads using CocoaPods and follow the AdMob support pages.
Some SO pages talk about ATS, but Google's support pages don't show the same error message (https://developers.google.com/admob/ios/app-transport-security).
App Transport Security has blocked a cleartext HTTP (http://) resource
load since it is insecure. Temporary exceptions can be configured via
your app's Info.plist file.
It's happening on both the simulator and a real device. Is this a real issue or something I can ignore?
Also, there is a ton of messages coming out of the AdMob SDK; frankly it's a bit annoying to filter through them.
The messages remain in the console and do not seem to affect app performance (AFAICT). While they are excessive, I have silenced them somewhat using an environment variable (https://stackoverflow.com/a/64471106/14414215) in my scheme.
Take note, however: if you do have issues with Google-Mobile-Ads, remember to remove this so that you can see the console messages again.
I would like to create an Android app that is basically a chat.
I have read many articles on the web until I found QuickBlox.
(I admit I didn't understand exactly how it works or how to use it, and I'm new to mobile app development.)
I am following this tutorial and now I should download the sample code.
I saw that there are different types:
Simple Android Chat code sample
Simple Android WebRTC (VideoChat) code sample
Simple Android Location code sample
Simple Android Push Notifications (Messages) code sample
Simple Android Custom Objects (key-value data store) code sample
Simple Android Users authentication code sample
Simple Android content storage and update code sample
I would like some advice on which is the best in my case.
My app is a "simple" chat app that allows a registration (for example, using social networks like Google+ and Facebook) and orders the users based on proximity.
So if the user X is located in Paris, registered users will see the app in order of proximity (eg before those who live in Saint-Denis and then those live in Orleans).
For the moment I'm only interested to send/receive text, not multimedia content such as photos and videos.
So I'm undecided whether it is better to use between "Simple Chat Android code sample", "Simple Android Location code sample" or "Simple Android Push Notifications (Messages) code sample".
From what I understand, push notifications can be added at a later time, or am I wrong?
I hope I have said all that is necessary for the selection..
Thanks for your suggestions
You will need all three in order to do what you want:
1. Simple Android Chat code sample - helps with almost everything involved in a normal chat application (sending and receiving messages, emoji support, typing status, etc.). However, it doesn't include multimedia messaging or push notification support.
2. Simple Android Push Notifications (Messages) code sample - basically shows how to send and receive push notifications, which can be added any time you want. The chat will work fine without it.
3. Simple Android Location code sample - shows how the location API works and can be used for what you described.
Hope this helps.