How to send a photo over network with Unity? - unity3d

I am trying to show each player's Facebook profile picture to the other player when they are playing a versus game. I tried to send the picture with the [Command] and [ClientRpcCall] attributes, but it does not work. Any ideas?

It's trivial to google an answer to this question.
To get you started, something like:
[RPC]
void SendTextures(byte[] receivedByte)
{
    // Rebuild the texture from the received image bytes (PNG/JPG) and display it.
    Texture2D receivedTexture = new Texture2D(1, 1);
    receivedTexture.LoadImage(receivedByte);
    GetComponent<Renderer>().material.mainTexture = receivedTexture;
}
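Since the question mentions UNet's [Command]/[ClientRpc] attributes, here is also a rough UNet-style sketch (untested; it assumes the encoded picture fits in a single network message, otherwise it would have to be sent in chunks):

using UnityEngine;
using UnityEngine.Networking;

public class ProfilePictureSync : NetworkBehaviour
{
    Texture2D receivedTexture;

    // Called on the owning client; "profileTexture" is a placeholder for the
    // readable Texture2D downloaded from Facebook.
    public void SharePicture(Texture2D profileTexture)
    {
        CmdSendPicture(profileTexture.EncodeToPNG());
    }

    [Command]
    void CmdSendPicture(byte[] png)        // runs on the server
    {
        RpcReceivePicture(png);            // forward to every client
    }

    [ClientRpc]
    void RpcReceivePicture(byte[] png)     // runs on all clients
    {
        receivedTexture = new Texture2D(1, 1);
        receivedTexture.LoadImage(png);
        GetComponent<Renderer>().material.mainTexture = receivedTexture;
    }
}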
Just google for 100s of full examples and discussions.
Here's a huge discussion with many full scripts and so on
http://forum.unity3d.com/threads/sync-texture-over-network.76538/
Try googling,
Unity3D send image over RPC
Unity3D send PNG over RPC
Unity3D texture RPC
and so on

Related

Game Manager that controls two characters

I have a question.
I'm trying to make a runner game with 2 characters, not one. I've done the movement and the camera-related stuff.
Now I'm trying to add the Game Manager.
The problem is that my Game Manager isn't able to access the PlayerMotor of both characters.
I found a tutorial on YouTube that uses a singleton, but it accesses only one player character, which is obvious because it's a singleton. So can you help me out, guys? Unfortunately I'm not a programmer, so I can't figure it out.
How can the Game Manager access both of their PlayerMotor instances to start the game?
From: Unity - Scripting API: Object.FindObjectsOfType
You can retrieve your PlayerMotor scripts by using FindObjectsOfType like this:
var playersMotor = Object.FindObjectsOfType<PlayerMotor>(); // finds all PlayerMotor scripts in your scene and stores them in playersMotor
Or:
var playersMotor = Object.FindObjectsOfType(typeof(PlayerMotor));
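For example, a minimal GameManager sketch along those lines; it assumes PlayerMotor is a MonoBehaviour and that enabling it is what starts the movement (swap that for whatever start method your PlayerMotor actually exposes):

using UnityEngine;

public class GameManager : MonoBehaviour
{
    PlayerMotor[] players;   // both characters' motors

    void Awake()
    {
        // Finds every active PlayerMotor in the scene (two in this case).
        players = Object.FindObjectsOfType<PlayerMotor>();
    }

    public void StartGame()
    {
        foreach (PlayerMotor motor in players)
            motor.enabled = true;   // placeholder: enable movement when the run starts
    }
}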

Agora SDK not working in Windows Build. VideoSurface.cs always gets tmpi = -1 in Update

I am trying to implement screen broadcasting in Unity using the Agora Video Chat SDK for Unity. I used this source, which didn't work initially, but after modifying the code as below I am able to receive my own stream through the server inside the Unity editor (2019.1.2f1).
// Added inside Start()
void Start()
{
    // ... engine setup ...
    mRtcEngine.OnJoinChannelSuccess = Joined;
}

private void Joined(string channelName, uint uid, int elapsed)
{
    var videoSource = FindObjectOfType<VideoSurface>();
    videoSource.SetForUser(uid);
    videoSource.SetEnable(true);
}
But nothing happens in the Windows build. I checked the VideoSurface.cs file, and I am continuously getting tmpi = -1 inside Update. What could be the reason?
P.S. I checked all firewall permissions for the build. Also, the user is able to join the channel; it's just the stream that is not being received. Help appreciated.
You shouldn't need to modify the code like that. Also, in the code above you register the callback for the local user. If you want to show the remote user's video, you should register the callback for OnUserJoined().
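A rough sketch of what that could look like, assuming the same mRtcEngine field as above and the (uint uid, int elapsed) callback signature used in the Agora Unity samples:

void Start()
{
    // ... engine initialization ...
    mRtcEngine.OnUserJoined = OnRemoteUserJoined;   // remote user joined, not the local join
}

private void OnRemoteUserJoined(uint uid, int elapsed)
{
    // Bind the remote user's stream to the VideoSurface in the scene.
    var videoSurface = FindObjectOfType<VideoSurface>();
    videoSurface.SetForUser(uid);
    videoSurface.SetEnable(true);
}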
Have you seen the tutorial about screen sharing? https://www.agora.io/en/blog/how-to-broadcast-your-screen-with-unity3d-and-agora/
Please try that. If you are still confused, you may take a look at this GitHub repo. It shares different content, but the concept and the Agora API usage are pretty much the same.

How do I feed the Unity Web Player external data?

I am new to using Unity3D, and I am supposed to use it in a web application (which is built with Ruby on Rails).
My problem is that I need to feed it data from my database, but I don't know how to pass it external data.
I'd appreciate any kind of help. :)
You can use WWW to make a request to the server and receive data.
http://docs.unity3d.com/ScriptReference/WWWForm.html (see the 2nd example)
Note that the Unity Web Player is no longer updated and has already been removed in Unity 5.4 ( http://blogs.unity3d.com/2015/10/08/unity-web-player-roadmap/ )
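A minimal coroutine sketch along those lines; the URL is a hypothetical endpoint that the Rails application would expose:

using System.Collections;
using UnityEngine;

public class DataLoader : MonoBehaviour
{
    // Hypothetical endpoint served by the Rails application.
    const string url = "http://example.com/api/scores.json";

    IEnumerator Start()
    {
        WWW request = new WWW(url);   // legacy WWW API, as in the answer above
        yield return request;

        if (string.IsNullOrEmpty(request.error))
            Debug.Log("Received: " + request.text);        // raw response body
        else
            Debug.Log("Request failed: " + request.error);
    }
}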

Show Twitch.tv stream within a unity3D application

I would like to make a Unity3D application in which one can watch a current Twitch.tv live stream.
I am not sure if this is possible, for example, with the Twitch API (https://github.com/justintv/twitch-api).
I know about video textures in Unity3D and I know how to use the default Twitch API basics, but I have no idea how to integrate a running Twitch stream into my application.
Can someone please give me a hint as to whether this is possible?
Thanks very much and best regards
Meph
Ultimately the stream coming from Twitch is MPEG-4 (H264/M3U). So if you can render that in Unity, then you can render Twitch streams.
There are a few steps to get the right URLs requested from Twitch, and that can change over time. You'll have to inspect a current Twitch page while a stream is playing to see how the JavaScript builds the request (URL and headers).
In the end, the JavaScript will build an access token and then use it to request a file called index-live.m3u8 from one of the Twitch edge servers. That file contains a list of file names for the last few seconds of the live stream (and some other metadata). It looks something like this:
https://video-edge-c61b44.lax01.hls.ttvnw.net/v0/[some-long-access-token]/index-live.m3u8
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#ID3-EQUIV-TDTG:2017-10-27T16:53:27
#EXT-X-MEDIA-SEQUENCE:8788
#EXT-X-TWITCH-ELAPSED-SECS:17576.000
#EXT-X-TWITCH-TOTAL-SECS:17589.870
#EXTINF:2.000,
index-0000008788-Y6OH.ts
#EXTINF:2.000,
index-0000008789-l0QY.ts
#EXTINF:2.000,
index-0000008790-gCUV.ts
#EXTINF:2.000,
index-0000008791-1ngg.ts
#EXTINF:2.000,
index-0000008792-wpQL.ts
#EXTINF:2.000,
index-0000008793-koO4.ts
You then swap out index-live.m3u8 with the name of a file in the list and request it to get that clip. Something like:
https://video-edge-c61b44.lax01.hls.ttvnw.net/v0/[the-same-long-access-token]/index-0000008793-koO4.ts
It will be an MPEG-4 stream about 1 second long. The list is about 6 files long, so if you request them all you can get about a 6 second buffer.
Every second or two, index-live.m3u8 is updated and older files roll off as new ones are added to the bottom. You need to re-request this file every few seconds as your buffer runs out, and request the new clips in it to keep your playback going.
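A small sketch of that parsing step, assuming the playlist was fetched from baseUrl + "index-live.m3u8" and that it lists relative .ts file names as in the example above:

using System.Collections.Generic;

public static class HlsPlaylist
{
    // Turns the playlist text into absolute segment URLs.
    public static List<string> ParseSegmentUrls(string playlistText, string baseUrl)
    {
        var urls = new List<string>();
        foreach (string line in playlistText.Split('\n'))
        {
            string trimmed = line.Trim();
            if (trimmed.Length == 0 || trimmed.StartsWith("#"))
                continue;                   // skip the #EXT... metadata lines
            urls.Add(baseUrl + trimmed);    // e.g. .../index-0000008788-Y6OH.ts
        }
        return urls;
    }
}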
Using Ted Bigham's answer you can get the m3u8 stream of the video. You can then feed this stream into a plugin like AVPro Video from RenderHeads to play it directly in Unity on any shape or model you want, with a wide variety of customisation and settings.
However, this is not a free plugin, and there may be alternatives. I am in no way affiliated with them, but I have used this plugin in the past with good results.
P.S. I originally wanted to post this as just a comment, since I believe Ted Bigham answered the crux of your question, but I am not allowed to due to reputation.
@Ted Bigham, is there any way to do this using just the default Unity Video Player?
Everything you've said so far has worked great! With just Chrome, I'm able to open the .m3u8 and download each 1-second .ts file. With any video player, I can view each video file just fine.
When I try to pass the URL through the Video Player, though, it says "Cannot read file." I even tried replacing .ts with .mp4, since that is a supported format; the renamed file plays when I test it outside of Unity, but inside Unity it gives me the same error.
Here's my code.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.Video;

public class TwitchStreamer : MonoBehaviour
{
    public string m3u8 = "https://video-weaver.atl01.hls.ttvnw.net/v1/playlist/Cq4EuEYkh5IqurC4gy3nHqGhPVtUwH_" +
"5QcQbCatC5Fhit9qbivMQ2rdh2MS_m_2OuLd3VS2mF0eTrCKrz8YTuDO19mcbIBMJL3BUMz4jnNuU_t-e53V51TOtaN3vcCk9n3Qr" +
"dP0WyLREnVR0n_30d4PlUjjW4_si5Wr2XuePQ0dPtxP6jnsenKnX56YTLohtCN2-FdfSvHQMdfd0aw68FA4h9wawHoIM9-U6YRmPa" +
"fDsfdCiZr_iToSR6lZi81VoYPVjt7Ygf7xKwhjlNwrvA5SnsAnWQGIOVt4UjDkNLw-hmNMAr7RT0iiDghKXZY1VI6Tuc-umB1VXYE" +
"7BH5hHbDfHgB3_IYNb0fjoudtSuaZxISyWoazPrw3AibEO7k1-quhdcjarBTGpIi_dlPEp-yZlQOy98_OZY_tqjk8ZWTBIaAAYEG_" +
"miwqsgH4d6eIfkh3ehyMvPQH1C5dVG9tQcSWPUYU6D6hWhxvJhEr-UC0_BYWIVzX7z_Uf74FJGIEqSQc0d6igiowdMM_lyD8ZV9BE" +
"7wqQs3RegMPqux-AOfF-_Q7Ki2MBv9u7D9ZRXMH_cm20bTx5-ShEDRnWMApSfXK-9bAGNXUcw8YlBbHYeSN5VxEZMC2oGjcivBsGs" +
"RPMTQ_yNBSM1S6GxFRIR4nqA-mbdXg3rXMW3V6MNybBb1lrrQeEqF1tdYE0rfxe3Ki5WWkxeKmSjMGbMl1tHCwMaReTYkQnX5Qhjl" +
"HXXtKLtEIEEhB3cXW3oF05-E_q87s68JQIGgyIEKPiQlTsANR9zRc.m3u8";
    public float updateClip = 1f, updateFile = 6f;
    public VideoPlayer vp;

    Queue<string> urls;

    void OnEnable()
    {
        StartCoroutine(UpdateFile());
        //InvokeRepeating("UpdateFile", 0f, updateFile);
        InvokeRepeating("UpdateClip", updateClip, updateClip);
        urls = new Queue<string>();
        vp.prepareCompleted += Vp_prepareCompleted;
        vp.errorReceived += Vp_errorReceived;
    }

    private void Vp_errorReceived(VideoPlayer source, string message)
    {
        Debug.Log("4567: Playback preparation failed. " + message, source);
    }

    private void Vp_prepareCompleted(VideoPlayer source)
    {
        Debug.Log("4567: Playback preparation complete.");
        source.Play();
    }

    void OnDisable()
    {
        CancelInvoke();
        StopAllCoroutines();
        vp.prepareCompleted -= Vp_prepareCompleted;
        vp.errorReceived -= Vp_errorReceived;
    }

    void UpdateClip()
    {
        //vp.Stop();
        if (urls.Count > 0)
        {
            vp.url = urls.Dequeue();
            vp.Prepare();
            Debug.Log("4567: Prepare playback for " + vp.url);
        }
    }

    IEnumerator UpdateFile()
    {
        UnityWebRequest www = UnityWebRequest.Get(m3u8);
        yield return www.SendWebRequest();
        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Show results as text
            //Debug.Log(www.downloadHandler.text);
            string[] raw = www.downloadHandler.text.Split('\n');
            foreach (string line in raw)
            {
                if (line.Contains("http"))
                {
                    string cnvrt = line.Replace(".ts", ".mp4");
                    urls.Enqueue(cnvrt);
                    Debug.Log(line + " has been added to stream queue.");
                }
            }
            // Or retrieve results as binary data
            //byte[] results = www.downloadHandler.data;
        }
        yield return new WaitForSeconds(updateFile);
    }
}
I believe what this person wants to know is whether or not the Twitch API actually retrieves a live video feed and passes the data through to the application. The short answer is, it doesn't. The Twitch API doesn't allow you to take a live video feed, only get the information about it.
You can get strings telling you the name of the channel, the game and links to take you to the channel. You can also get their channel background or previews as image files. What you cannot do is get video files or stream video data. The best you can do is get the URLs and link to them.
The only method I can think of to get a Live Stream in any game is for it to be the player's. Essentially you would code in a method by which they can stream their game within the game itself, taking the stream data and copying it into some variable or class somewhere, before it is uploaded to Twitch, so you can use the data at the same time it is being broadcast. But if you want to retrieve some random streamer's gameplay and display it on a computer screen in your game, I'm afraid you can't do that. Not in Unity, anyway.
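As a rough illustration of the metadata-only side, here is a hedged sketch that asks the legacy v3 (Kraken) API from the repo linked in the question for a channel's stream info; the channel name and Client-ID are placeholders, and the endpoint may well have changed since:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class TwitchInfo : MonoBehaviour
{
    // Placeholders: the channel you want to look up and your registered Client-ID.
    public string channel = "somechannel";
    public string clientId = "YOUR_CLIENT_ID";

    IEnumerator Start()
    {
        // Legacy v3/Kraken endpoint; it returns JSON metadata about the live
        // stream (channel name, game, preview images), not video data.
        var www = UnityWebRequest.Get("https://api.twitch.tv/kraken/streams/" + channel);
        www.SetRequestHeader("Client-ID", clientId);
        yield return www.SendWebRequest();

        if (www.isNetworkError || www.isHttpError)
            Debug.Log(www.error);
        else
            Debug.Log(www.downloadHandler.text);   // inspect the returned metadata
    }
}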
You can embed a Twitch stream into a website.
There are web browser plugins for Unity (see).
So if you combine these ideas, you can show a Twitch stream in Unity by running an in-app browser that renders a custom web page which embeds the stream you want. I don't know what kind of performance you'll get out of this, but it sounds simple enough to do. It is probably not as simple as directly showing the stream on a texture, but it's at least simpler than learning C, network programming, video processing and so on, as Thomas suggested.
This is a tricky thing to do. You would need an API key from Twitch and a program that can establish not only a connection with your chosen API but also a data stream. I would suggest starting by learning basic network programming in a less specialized language like C, and working from there. After you understand the basics, please come back and ask more specific questions. I can't do much with this question in the state that it's in.
As with everything else, this is definitely possible, just maybe not with C#. For the sake of not destroying your game's performance, I would suggest writing a separate program that feeds the data into the game rather than having the game grab the data directly.
Once you gain an understanding of network programming, I would hop over to the Twitch dev chat on the API, found here.

Audio recording with HTML5 Web Audio Api

Does anyone know if the Web Audio API provides the ability to save audio played using the WebAudioContext?
I actually wrote a small utility called RecorderJS that might help.
There is a startRendering function in Chrome at least (haven't checked Safari). I think it's undergoing some rework and thus isn't included in the spec, but might be added at a later stage (or not). If you want to check out the current implementation, have a look at the answer at Is there a way to use the Web Audio API to sample audio faster than real-time?
There is a W3C specification for a recording API, http://www.w3.org/TR/mediastream-recording/ , but as of now it is implemented only in Firefox.
On the client side, only the ScriptProcessorNode hack is available (which is what Recorder.js is based on).
Alternatively, for some use cases it might make sense to stream the audio to a server using WebRTC and write a server-side recorder using libjingle.
This library works fine and uses the Web Audio API only (meaning no IE users):
https://github.com/higuma/web-audio-recorder-js
But we can fairly use it now:
http://caniuse.com/#feat=audio-api
Anyway, like you said, your sound is already in an AudioContext, so I think you are looking for how to use the AudioDestinationNode, the final node of the Web Audio API. As soon as you can play your audio through a regular HTML audio player, you gain the save/record option on right-click, like playDataUri does. You need to add the "controls" attribute to the player, or you can make a special link with the download attribute.
I made a small enhancement of the MDN script to send the data to a player; it should give you a good idea:
var audioCtx = new AudioContext();

// Create the player element first, then route it through the audio graph.
var myMediaElement = document.createElement("audio");
myMediaElement.setAttribute("autoplay", true);
myMediaElement.setAttribute("src", uri); // uri: the audio source to play
myMediaElement.setAttribute("controls", "controls");
document.getElementById('player').appendChild(myMediaElement);

var source = audioCtx.createMediaElementSource(myMediaElement);
source.connect(audioCtx.destination);
The AudioDestinationNode interface represents the end destination of
an audio graph in a given context — usually the speakers of your
device. It can also be the node that will "record" the audio data when
used with an OfflineAudioContext.
https://developer.mozilla.org/en-US/docs/Web/API/AudioDestinationNode
This is now available in the latest browsers; it's called MediaRecorder, and you can find more information here.
The easiest way is to create a stream like this:
var dest = audioCtx.createMediaStreamDestination();
var options = {
    audioBitsPerSecond: 320000,
    sampleSize: 16,
    channelCount: 2,
    mimeType: 'audio/ogg'
};
var mediaRecorder = new MediaRecorder(dest.stream, options);
var chunks = [];
var isrecording = "Not Recording";

function rec() {
    mediaRecorder.start(); // start recording
    dataavailable = true;
    isrecording = mediaRecorder.state;
}
You can check out an example of my sound-recorder app here, although it's also a full-on multiband compressor and 5-band paragraphic EQ (here).
Oh, and if you check out my link, the most important thing to get your head around is the ondataavailable method. Saving as a Blob is a bit of a head-bash too.
P.S. If anyone wants to help me get this working on Chrome, send me an email. Thanks.
It will only work in Firefox at the moment.