Integrate real-time A/V chat into a Construct 2 game or embed Construct 2 into an app - ionic-framework

I would like to incorporate QuickBlox or Twilio WebRTC chat and A/V calling into an Angular app running on a web page or inside a Cordova/Crosswalk app, alongside a Construct 2 game. I would like to have an audio/video chat running during gameplay.
Can I embed Construct 2 games into an Ionic view or simple DOM element and then render the video chat over it? Or, should I be integrating the WebRTC chat sessions into Construct 2? Or can I simply display both canvases in the same page?
Thanks in advance.
See: https://quickblox.com/developers/Sample-webrtc-cordova

Junior, here's an answer from the Twilio Video team.
We aren’t investing time in Cordova/Crosswalk right now, although some customers have been asking for it on our GitHub project (https://github.com/twilio/twilio-video.js/issues/85).
twilio-video.js can be integrated into an Angular app easily today. We have a minimal framework test in our GitHub project showing how to set it up (https://github.com/twilio/twilio-video.js/tree/master/test/framework/twilio-video-angular). This isn't a full-fledged application; instead, it's meant to ensure we retain compatibility with Angular as we develop twilio-video.js. It might be nice if we had a more full-fledged Angular Quickstart application in the future, but it gets difficult to support and maintain the various front-end frameworks (Angular, React, Ember, Meteor, Vue, etc.).
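Framework aside, the core twilio-video.js flow is only a few lines. Here's a minimal sketch (the token endpoint, room name, and element id are placeholders you'd replace with your own; exact event names vary a little between twilio-video.js versions):

    // Via a bundler; a <script> tag exposes the same API as Twilio.Video.
    const Video = require('twilio-video');

    // The access token must be minted by your own server, never in the browser.
    fetch('/token')
      .then(res => res.json())
      .then(({ token }) => Video.connect(token, { name: 'game-room', audio: true, video: true }))
      .then(room => {
        // Render tracks for participants already in the room and anyone who joins later.
        room.participants.forEach(showParticipant);
        room.on('participantConnected', showParticipant);
      });

    function showParticipant(participant) {
      participant.on('trackSubscribed', track => {
        // attach() returns an <audio> or <video> element ready to insert into the DOM.
        document.getElementById('remote-media').appendChild(track.attach());
      });
    }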
I don’t know much about Construct 2, although it looks like a commercial game engine built on JavaScript/HTML5.
Can I embed Construct 2 games into an Ionic view or simple DOM element and then render the video chat over it?
Yes, this would work.
Or, should I be integrating the WebRTC chat sessions into Construct 2?
This might work, too, assuming Construct 2 allows arbitrary JavaScript inside the game engine.
Or can I simply display both canvases in the same page?
Yes, this would work.
The technique used will depend on how much interaction needs to take place between the game and the video chat. For example, if the lifecycle of the video chat should correspond in some way to in-game elements, then it should be created within Construct 2. If the video chat serves more as commentary on the game, separate from the gameplay mechanics, then overlaying it or placing it alongside in the same page should work.
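For the overlay case, a rough sketch (this assumes the default c2canvasdiv wrapper id from Construct 2's exported index.html, and a track obtained as in the twilio-video.js sketch above; adjust the ids to match your export):

    // Float a small video-chat panel over the game canvas. 'c2canvasdiv' is the
    // wrapper div in Construct 2's default HTML export; if your export differs,
    // target whatever element contains the game canvas.
    const container = document.getElementById('c2canvasdiv');
    container.style.position = 'relative'; // so the overlay positions against it

    const overlay = document.createElement('div');
    overlay.style.cssText = 'position:absolute; top:10px; right:10px; width:240px; z-index:100;';
    overlay.appendChild(track.attach()); // track from the twilio-video.js sketch above
    container.appendChild(overlay);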

Related

Embed Unity game within ReactNative (handling bidirectional communication)

I am creating a multi-mobile (iOS/Android) app with React Native.
The app needs to embed & launch a Unity game.
It will need to send information to the game, and also receive information from the game. A function passing a JSON string would be sufficient.
Six years ago I embedded native iOS code within a Unity app and it was rather a dark art.
What is the state of play in 2018?
Presumably it is going to involve separate iOS and Android codebases and a React Native component to wrap these, providing a single JavaScript interface. At the Unity end, I'm not sure whether it will require separate per-platform coding.
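For reference, the JavaScript interface I'm imagining would be something like this (entirely hypothetical; UnityBridge is a made-up native module name, and the event name is likewise an example):

    import { NativeModules, NativeEventEmitter } from 'react-native';

    // Hypothetical native module wrapping the embedded Unity player.
    const { UnityBridge } = NativeModules;

    // Send a JSON string into the game.
    UnityBridge.postMessage(JSON.stringify({ type: 'SPAWN', x: 3, y: 7 }));

    // Receive JSON strings emitted by the game.
    const emitter = new NativeEventEmitter(UnityBridge);
    emitter.addListener('UnityMessage', (json) => {
      const msg = JSON.parse(json);
      // react to game events here
    });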

Unity3D multiplayer game using GunDB

I usually use Firebase for syncing every player in my multiplayer games, but this time I can't, because I want to create a desktop game and Firebase only supports mobile.
Can I use GunDB as an alternative to store the player position and animation, with every client automatically syncing the data?
@alucard555 Yes, there is a very simple example of a browser-based game (Asteroids in 250 LOC!) that could work in a desktop app via Electron or something:
https://github.com/amark/gun/blob/master/examples/game/space.html
You can play the game (arrow keys to move, space to fire a shockwave, doesn't work on mobile or small screens) here:
http://gunjs.herokuapp.com/game/space.html
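The data sync itself is only a few lines of GUN. A minimal sketch (the 'players' key, the relay URL, and the myPlayerId/updateRemotePlayer names are example placeholders):

    // Load GUN via a <script> tag or `npm install gun`.
    const gun = Gun(['https://your-relay.example.com/gun']); // optional relay peer, example URL

    const players = gun.get('players');

    // Publish this client's position and animation state.
    players.get(myPlayerId).put({ x: 120, y: 80, anim: 'run' });

    // Subscribe to all players; the callback fires on every remote update.
    players.map().on((state, id) => {
      if (id !== myPlayerId) updateRemotePlayer(id, state); // your own render hook
    });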
With regards to Unity3D specifically, you would need a JavaScript bridge. I have not done Unity3D development myself, but I have heard it supports JavaScript, or some variant of it?
GUN by itself is plain vanilla JS; the only porting UnityScript may need is changing the default localStorage and WebSocket adapters (these are modular and can easily be switched out for something Unity supports).
However I do not have enough Unity3D experience to speak on this matter. (I just looked up Firebase's Unity support, and noticed that it is not JS based, it is C++. This may mean JS is incompatible with Unity?)

How to communicate with Unity WebGL app

I want to implement a WebGL model viewer using Unity. This viewer is integrated into an existing website. On this website there is a list of models, and when the user clicks on a model, a window pops out displaying it (like Sketchfab). I integrated the model viewer, which was built with Unity WebGL, using
<iframe src="./path-to-unity-webgl-viewer"></iframe>
How can I communicate with the viewer? Is there a way to create some JavaScript API using Unity which can be accessed from outside the Unity app?
Yes! See: https://docs.unity3d.com/Manual/webgl-interactingwithbrowserscripting.html
I did this for my games site, https://simmer.io, by using SendMessage and JSLib as described in that article. Once you have that communication going, you need an additional layer of cross-frame communication.
If the iframe is on the same domain as your page, it's easy (see: Communication between iFrames?). If the frames are on separate domains, you use cross-domain messaging. See: http://blog.teamtreehouse.com/cross-domain-messaging-with-postmessage
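Putting the two together, the wiring looks roughly like this (a sketch: the GameObject name, method name, and origins are example values, and depending on your Unity version the instance may be exposed as gameInstance rather than unityInstance):

    // Parent page: send a command into the iframe hosting the viewer.
    const frame = document.querySelector('iframe');
    frame.contentWindow.postMessage({ cmd: 'loadModel', id: '42' }, 'https://viewer.example.com');

    // Iframe page: forward messages to Unity via SendMessage.
    window.addEventListener('message', (event) => {
      if (event.origin !== 'https://your-site.example.com') return; // validate the sender
      // 'ModelViewer' is a GameObject in the scene; 'LoadModel' is a method
      // on one of its scripts that takes a single string argument.
      unityInstance.SendMessage('ModelViewer', 'LoadModel', event.data.id);
    });

    // For the Unity -> browser direction, a .jslib plugin in Assets/Plugins
    // can post back out of the frame, e.g.:
    //   mergeInto(LibraryManager.library, {
    //     NotifyLoaded: function (idPtr) {
    //       window.parent.postMessage({ evt: 'loaded', id: UTF8ToString(idPtr) }, '*');
    //     }
    //   });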

Personalized Video / Facebook App - What is the best approach?

I want to build a Facebook app featuring a personalized video which imports content assets from the user's Facebook profile and their extended social graph and integrates these assets within the timeline. I am thinking of using Flash; however, a key stipulation is that the app works on mobile, so I would need to use HTML5. My question is: can I use Flash to build the application and then compile the app as HTML5, or is there an alternative solution in the form of an HTML5 video toolkit with a programming layer that would allow me to build a web app and access the Facebook API?
I have done this a few times over the years, and yes, Flash was the easiest. However, there are a few purely HTML5-based options available to you that I know of; personally I'd stay away from Flash here, as it will end up just getting in the way:
1- The cleanest method is to use a video compositing tool on the server side which can be programmed to accept variables. Personally I have only ever done this using ffmpeg, however there are a couple of alternatives out there.
The basic process would be to grab the media from FB, then composite it at certain points on top of/below/around a base video sitting on the server, using a shell script which you pass the media assets to as variables. There are so many options as to how you might want this done; it's probably best to have a look at some of these examples:
http://broadcasterproject.wordpress.com/2010/05/18/how-to-layerremix-videos-with-free-command-line-tools/
http://graphcomp.com/ffmpeg/
ffmpeg watermark without vhook?
Note that the last time I did this I used vhooks and custom filters; vhooks are now deprecated.
This method will mean a reasonably heavy server load if your app is popular, but it's probably the most robust across devices etc.
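As a concrete sketch of the shell-script approach (file names, coordinates, and timings are example values), calling ffmpeg's overlay filter from Node.js might look like:

    const { execFile } = require('child_process');

    // Composite a user's profile picture over the base video from t=5s to t=10s.
    // base.mp4, avatar.png, and the coordinates are example values.
    execFile('ffmpeg', [
      '-i', 'base.mp4',
      '-i', 'avatar.png',
      '-filter_complex', "overlay=x=50:y=50:enable='between(t,5,10)'",
      '-codec:a', 'copy',   // keep the base video's audio as-is
      'output.mp4',
    ], (err) => {
      if (err) throw err;
      console.log('composite rendered');
    });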
2- Use Popcorn.js, and let the processing be done on the client side. You could hard-code it using CSS/JS/HTML, but Popcorn is pretty stable. I haven't seen how it runs on devices, but in theory it should work (it's all standardized technologies). Basically, the process would be to use JavaScript to fire the display of images overlaid on the base video file at preset cue points. Popcorn has all of the methods and means for you to do this already.
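A rough Popcorn.js sketch of that idea (the element ids and the userProfilePicUrl variable are placeholders; the image would have been fetched from the Facebook Graph API earlier):

    const pop = Popcorn('#base-video');

    // At t=5s, overlay the user's profile picture on top of the video.
    pop.cue(5, function () {
      const img = document.createElement('img');
      img.id = 'fb-avatar';
      img.src = userProfilePicUrl; // fetched earlier from the Graph API
      document.getElementById('overlay-layer').appendChild(img);
    });

    // At t=10s, remove it again.
    pop.cue(10, function () {
      const img = document.getElementById('fb-avatar');
      if (img) img.remove();
    });

    pop.play();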
Hope this helps a bit. Good luck, sounds fun.
We have built some interactive video apps, and one recent project was quite like what your question describes.
We used Adobe Flash to track the motion and published the project via create.js. You could have an image sequence from within create.js or put a video in a layer behind it. This video would then control the playhead time of the create.js motion-tracked sequence via jQuery.
It worked fine; here is a link to a test setup with an image sequence.
Video integration would be the next step.
http://www.jungeroemer.net/projekte/testpersvid/elftest01.html
(German text, sorry, but there's nothing important to read there. Just click the images and go for it.)
You can download the sources from the link; if you need, I can also upload the Flash file to show you the motion tracking.

How to record game in cocos2d iPhone

I am developing a cocos2d app.
It's almost completed but now I want to record the activities of my app as a video file, including sound produced by the app.
How can I implement this?
Can anybody help me?
Please suggest a way to implement this.
Thanks in advance.
The question isn't new, but since it isn't answered I thought I'd pitch in:
We provide an SDK called "Everyplay" that allows you to do exactly what you're looking for. It's free to use, and is lightweight.
We provide out-of-the-box integrations for Unity3D, cocos2d (1.x, 2.x), cocos2d-x, and you can of course integrate to a custom OpenGL-based game engine.
The documentation is available at https://developers.everyplay.com/doc
The documentation contains an example app key to use when developing, but you can of course sign up for your own client key at https://developers.everyplay.com/
There are many options - and the fact that your app is cocos2d doesn't matter much.
iSimulate works well. You can actually play the app on your device and record the gameplay as well as the touch events. This is important if you want to show user interaction in your app. You run the app in the simulator but you control it from your device.
If you just want to record the app interaction without caring about showing users the touch events, you can use Screenflow or Jing or some other recording software. I used to use Jing (free), but Screenflow works better for me, and it also lets you create more advanced videos, like a trailer with effects. Edit: You should be able to capture touch events through the simulator with Screenflow too. You can choose to show them or not, and can use different indicators for those events.
Search Google for Mac or iPhone recording software. There are many options. I had the best experience with Screenflow because I wanted to make a trailer and gameplay video.
I'm developing a similar application which allows the user to record activity within a cocos2d-x app.
I'm using a screen-capture method and then combining the captures using FFmpeg. The performance wasn't too good, but it is the easiest way to achieve this.