MonoGame on iOS (init texture from internet)

I'm developing a game on iOS with MonoGame. Is it possible to init a Texture2D with images downloaded from the internet? If so, are there any samples I can follow? Thanks a lot.

I'm not sure if this maps 1:1 to iOS, but here is how I did it on Android with MonoGame. Most of this code came from another Stack Overflow post, but unfortunately I can't remember which one!
/// <summary>
/// Downloads a texture based on a URL path, and stores it in this sprite's texture.
/// </summary>
/// <param name="URL">The path to the file.</param>
private void DownloadTexture(String URL)
{
    HttpWebRequest request = WebRequest.Create(new Uri(URL)) as HttpWebRequest;
    request.BeginGetResponse((ar) =>
    {
        HttpWebResponse response = request.EndGetResponse(ar) as HttpWebResponse;
        using (Stream stream = response.GetResponseStream())
        using (MemoryStream ms = new MemoryStream())
        {
            byte[] buf = new byte[1024];
            int count;
            do
            {
                count = stream.Read(buf, 0, buf.Length);
                ms.Write(buf, 0, count);
            } while (stream.CanRead && count > 0);
            ms.Seek(0, SeekOrigin.Begin);
            /*Texture2D*/ mTexture = Texture2D.FromStream(GameObjectManager.pInstance.pGraphicsDevice, ms); //.PreMultiplyAlpha();
        }
    }, null);
}
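On iOS the same Texture2D.FromStream approach should work; here is a minimal sketch of the download side using HttpClient instead of the older HttpWebRequest callbacks (my own addition, untested on device; graphicsDevice and mTexture stand in for your game's GraphicsDevice and texture field):
private async void DownloadTextureAsync(string url)
{
    using (var http = new System.Net.Http.HttpClient())
    {
        // download the encoded image (PNG/JPEG) as raw bytes
        byte[] data = await http.GetByteArrayAsync(url);
        using (var ms = new System.IO.MemoryStream(data))
        {
            // decode into a texture; GPU resources are generally safest to
            // create on the main/UI thread, so marshal there if needed
            mTexture = Texture2D.FromStream(graphicsDevice, ms);
        }
    }
}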

Related

Unity3D: how to play streamed audio via AudioClip?

I am streaming audio over a websocket from IBM Watson TTS. I currently have the audio playing from the stream with NAudio, but I would like to swap NAudio out for the built-in AudioClip (NAudio is great, but I'd like to be compatible with Oculus Lipsync and other plugins that use an AudioClip).
My current NAudio solution is below. The websocket handler appends the received bytes to the MemoryStream:
private MemoryStream ms = new MemoryStream();

websocket.OnMessage += async (bytes) =>
{
    // append at the end without disturbing the current read position
    var pos = ms.Position;
    ms.Position = ms.Length;
    ms.Write(bytes, 0, bytes.Length);
    ms.Position = pos;
    if (!isPlaying)
    {
        StartCoroutine(PlayAudioStream());
    }
};
The playback is currently handled by NAudio; I would like to shift to an AudioClip if possible:
using (var blockAlignedStream = new BlockAlignReductionStream(
    WaveFormatConversionStream.CreatePcmStream(
        new RawSourceWaveStream(ms, new WaveFormat(22050, 16, 1)))))
{
    var aggregator = new SampleAggregator(blockAlignedStream.ToSampleProvider());
    aggregator.NotificationCount = blockAlignedStream.WaveFormat.SampleRate / 50;
    using (var wo = new WaveOutEvent())
    {
        isPlaying = true;
        wo.Init(aggregator);
        wo.Play();
        while (wo.PlaybackState == PlaybackState.Playing) //&& !disconnected
        {
            yield return new WaitForEndOfFrame();
        }
        aggregator.Reset();
        aggregator = null;
    }
}
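For the AudioClip side of the question, here is a minimal sketch of how the same buffered PCM could be fed to Unity's built-in audio via AudioClip.Create with a streaming PCM reader callback. This is my own addition, assuming the same ms stream and the 22050 Hz, 16-bit, mono format from the WaveFormat above; readPos and OnAudioRead are names I introduced:
private long readPos = 0; // read cursor, independent of the websocket's write position

void PlayViaAudioClip()
{
    // stream = true makes Unity poll OnAudioRead for samples as playback advances;
    // lengthSamples here caps the clip at 60 seconds for this sketch
    AudioClip clip = AudioClip.Create("WatsonTTS", 22050 * 60, 1, 22050, true, OnAudioRead);
    AudioSource source = GetComponent<AudioSource>();
    source.clip = clip;
    source.Play();
}

void OnAudioRead(float[] data)
{
    // note: Unity calls this on the audio thread; a real implementation should
    // lock around the MemoryStream shared with the websocket handler
    byte[] buffer = ms.GetBuffer();
    for (int i = 0; i < data.Length; i++)
    {
        if (readPos + 1 < ms.Length)
        {
            // one little-endian 16-bit sample, normalized to -1..1
            data[i] = (short)(buffer[readPos] | (buffer[readPos + 1] << 8)) / 32768f;
            readPos += 2;
        }
        else
        {
            data[i] = 0f; // not enough data buffered yet; output silence
        }
    }
}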

How to access the HoloLens front camera

I'm working with the HoloLens, and I'm trying to get the image from the front camera. The only thing I need is to take each frame from that camera and turn it into a byte array.
Follow the Unity example in the documentation; it is well written:
https://docs.unity3d.com/Manual/windowsholographic-photocapture.html
Copied from the Unity documentation at the link above:
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.XR.WSA.WebCam;

public class PhotoCaptureExample : MonoBehaviour {
    PhotoCapture photoCaptureObject = null;
    Texture2D targetTexture = null;

    // Use this for initialization
    void Start() {
        Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        // Create a PhotoCapture object
        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject) {
            photoCaptureObject = captureObject;

            CameraParameters cameraParameters = new CameraParameters();
            cameraParameters.hologramOpacity = 0.0f;
            cameraParameters.cameraResolutionWidth = cameraResolution.width;
            cameraParameters.cameraResolutionHeight = cameraResolution.height;
            cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

            // Activate the camera
            photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result) {
                // Take a picture
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });
        });
    }

    void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame) {
        // Copy the raw image data into the target texture
        photoCaptureFrame.UploadImageDataToTexture(targetTexture);

        // Create a GameObject to which the texture can be applied
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
        quadRenderer.material = new Material(Shader.Find("Custom/Unlit/UnlitTexture"));

        quad.transform.parent = this.transform;
        quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

        quadRenderer.material.SetTexture("_MainTex", targetTexture);

        // Deactivate the camera
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
    }

    void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result) {
        // Shutdown the photo capture resource
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
    }
}
To get the bytes, replace the following method:
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame) {
    // Copy the raw image data into the target texture
    photoCaptureFrame.UploadImageDataToTexture(targetTexture);

    // Create a GameObject to which the texture can be applied
    GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
    Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
    quadRenderer.material = new Material(Shader.Find("Custom/Unlit/UnlitTexture"));

    quad.transform.parent = this.transform;
    quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

    quadRenderer.material.SetTexture("_MainTex", targetTexture);

    // Deactivate the camera
    photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
}
with the method below:
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
{
    List<byte> imageBufferList = new List<byte>();
    photoCaptureFrame.CopyRawImageDataIntoBuffer(imageBufferList);
    var bytesArray = imageBufferList.ToArray();
}
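If you then want those bytes back in a texture rather than just sending them over the wire, a minimal sketch (my own addition, assuming the BGRA32 pixel format requested in cameraParameters above, and that you keep the capture resolution around):
// width/height must match the resolution the photo was captured at
Texture2D tex = new Texture2D(cameraResolution.width, cameraResolution.height,
                              TextureFormat.BGRA32, false);
tex.LoadRawTextureData(bytesArray);
tex.Apply();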
If you are using Unity 2018.1 or 2018.2 (I think also 2017.4), this will NOT work. Unity has a public tracker for the issue:
https://issuetracker.unity3d.com/issues/windowsmr-failure-to-take-photo-capture-in-hololens
I created a workaround until Unity fixes the bug:
https://github.com/MSAlshair/HoloLensMediaCapture
Below is a basic sample that works around the problem without using Unity's PhotoCapture; more details are in the link above.
You must wrap the code in #if WINDOWS_UWP to be able to use MediaCapture. Ideally you would use Unity's PhotoCapture to avoid this, but until Unity resolves the issue you can use something like the following:
#if WINDOWS_UWP
public async System.Threading.Tasks.Task<byte[]> GetPhotoAsync()
{
    //Get available devices info
    var devices = await Windows.Devices.Enumeration.DeviceInformation.FindAllAsync(
        Windows.Devices.Enumeration.DeviceClass.VideoCapture);
    var numberOfDevices = devices.Count;
    byte[] photoBytes = null;

    //Check if the device has a camera
    if (devices.Count > 0)
    {
        Windows.Media.Capture.MediaCapture mediaCapture = new Windows.Media.Capture.MediaCapture();
        await mediaCapture.InitializeAsync();

        //Get the highest available resolution
        var highestResolution = mediaCapture.VideoDeviceController.GetAvailableMediaStreamProperties(
            Windows.Media.Capture.MediaStreamType.Photo).
            Select(item => item as Windows.Media.MediaProperties.ImageEncodingProperties).
            Where(item => item != null).
            OrderByDescending(Resolution => Resolution.Height * Resolution.Width).
            ToList().First();

        using (var photoRandomAccessStream = new Windows.Storage.Streams.InMemoryRandomAccessStream())
        {
            await mediaCapture.CapturePhotoToStreamAsync(highestResolution, photoRandomAccessStream);
            //Convert the stream to a byte array
            photoBytes = await ConvertFromInMemoryRandomAccessStreamToByteArrayAsync(photoRandomAccessStream);
        }
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("No camera device detected!");
    }

    return photoBytes;
}

public static async System.Threading.Tasks.Task<byte[]> ConvertFromInMemoryRandomAccessStreamToByteArrayAsync(
    Windows.Storage.Streams.InMemoryRandomAccessStream inMemoryRandomAccessStream)
{
    using (var dataReader = new Windows.Storage.Streams.DataReader(inMemoryRandomAccessStream.GetInputStreamAt(0)))
    {
        var bytes = new byte[inMemoryRandomAccessStream.Size];
        await dataReader.LoadAsync((uint)inMemoryRandomAccessStream.Size);
        dataReader.ReadBytes(bytes);
        return bytes;
    }
}
#endif
Do NOT forget to add capabilities to the manifest:
WebCam
Microphone: I am not sure whether it is needed since we are only taking photos, but I added it anyway.
Pictures Library, if you want to save to it
In Player Settings, enable access to the camera, and try again.

Unable to load movie via WWW

I'm trying to load a video via URL, but I keep getting the same error. I'm using Unity 5.3 and the example code from http://docs.unity3d.com/ScriptReference/WWW-movie.html (heavily modified, because the current example doesn't compile).
using UnityEngine;
using System.Collections;

// Make sure we have gui texture and audio source
[RequireComponent (typeof(GUITexture))]
[RequireComponent (typeof(AudioSource))]
public class TestMovie : MonoBehaviour {

    string url = "http://www.unity3d.com/webplayers/Movie/sample.ogg";
    WWW www;

    void Start () {
        // Start download
        www = new WWW(url);
        StartCoroutine(PlayMovie());
    }

    IEnumerator PlayMovie(){
        MovieTexture movieTexture = www.movie;

        // Make sure the movie is ready to start before we start playing
        while (!movieTexture.isReadyToPlay){
            yield return 0;
        }

        GUITexture gt = gameObject.GetComponent<GUITexture>();

        // Initialize gui texture to be 1:1 resolution centered on screen
        gt.texture = movieTexture;
        transform.localScale = Vector3.zero;
        transform.position = new Vector3 (0.5f,0.5f,0f);
        // gt.pixelInset.xMin = -movieTexture.width / 2;
        // gt.pixelInset.xMax = movieTexture.width / 2;
        // gt.pixelInset.yMin = -movieTexture.height / 2;
        // gt.pixelInset.yMax = movieTexture.height / 2;

        // Assign clip to audio source
        // Sync playback with audio
        AudioSource aud = gameObject.GetComponent<AudioSource>();
        aud.clip = movieTexture.audioClip;

        // Play both movie & sound
        movieTexture.Play();
        aud.Play();
    }
}
I added this as a script to the Main Camera in a new scene, and I get this error:
Error: Cannot create FMOD::Sound instance for resource (null), (An invalid parameter was passed to this function. )
UnityEngine.WWW:get_movie()
<PlayMovie>c__Iterator4:MoveNext() (at Assets/TestMovie.cs:20)
UnityEngine.MonoBehaviour:StartCoroutine(IEnumerator)
TestMovie:Start() (at Assets/TestMovie.cs:16)
(Line 20 is MovieTexture movieTexture = www.movie;)
I've been working on this for a while now; it happens with many files and on both of my systems.
I found a solution! I tested the code with Unity 5.2.x and 5.3.x.
I don't know why, but Unity requires you to first wait until the download is done (isDone == true), and only after assigning the texture wait until isReadyToPlay == true.
The movie file must be in OGG Video format; some MP4 files don't work.
Check the code:
using UnityEngine;
using UnityEngine.UI;
using System.Collections;

public class MovieTextureStream : MonoBehaviour {
    public Text progressGUI;
    public MeshRenderer targetRender = null;
    public AudioSource targetAudio = null;
    public string URLString = "http://unity3d.com/files/docs/sample.ogg";

    MovieTexture loadedTexture;

    IEnumerator Start() {
        if (targetRender == null) targetRender = GetComponent<MeshRenderer> ();
        if (targetAudio == null) targetAudio = GetComponent<AudioSource> ();

        // First wait until the download has finished...
        WWW www = new WWW (URLString);
        while (www.isDone == false) {
            if (progressGUI != null) progressGUI.text = "Video progress: " + (int)(100.0f * www.progress) + "%";
            yield return 0;
        }

        // ...then wait until the movie is ready to play
        loadedTexture = www.movie;
        while (loadedTexture.isReadyToPlay == false) {
            yield return 0;
        }

        targetRender.material.mainTexture = loadedTexture;
        targetAudio.clip = loadedTexture.audioClip;
        targetAudio.Play ();
        loadedTexture.Play ();
    }
}
I won't provide a solution using WWW.movie, but an alternative that may help you or anyone else.
On iPhone we couldn't find a way to stream video from a server, so we decided to download the video before playing it. Here's how:
string docPath = Application.persistentDataPath + "/" + id + ".mp4";
if (!System.IO.File.Exists(docPath))
{
    WWW www = new WWW(videoUrl);
    while (!www.isDone)
    {
        yield return new WaitForSeconds(1);
        Loading.Instance.Message = "Downloading video : " + (int)(www.progress * 100) + "%";
        if (!string.IsNullOrEmpty(www.error))
            Debug.Log(www.error);
    }
    byte[] data = www.bytes;
    System.IO.File.WriteAllBytes(docPath, data);
}
// mediaPlayer is the author's video player component (a plugin), not a Unity built-in
mediaPlayer.Load(docPath);
onVideoReady();
mediaPlayer.Play();
This is the coroutine used to download the video and write it to the iPhone's file system. Once it's done, you can load and play it.
Hope it helps.
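For newer Unity versions, note that MovieTexture was deprecated and later removed; the built-in VideoPlayer component (Unity 5.6+) can stream directly from a URL. A minimal sketch (my own addition, untested; the class name PlayFromUrl is hypothetical, and the default render mode assumes the GameObject has a Renderer):
using UnityEngine;
using UnityEngine.Video;

public class PlayFromUrl : MonoBehaviour {
    public string url = "http://unity3d.com/files/docs/sample.ogg";

    void Start() {
        VideoPlayer player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = url;
        player.prepareCompleted += vp => vp.Play(); // start once enough is buffered
        player.Prepare();
    }
}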

Load sprite from a base64 string which came from a websocket

I am trying to turn a base64 string into a Sprite in Unity 3D, but my sprite in the scene remains blank.
public var cardPicture : Image;

function ReceiveData(jsonReply : JSONObject) {
    var pictureBytes : byte[] = System.Convert.FromBase64String(jsonReply.GetString("picture"));
    var cardPictureTexture = new Texture2D(720, 720);
    Debug.Log(cardPictureTexture.LoadImage(pictureBytes));
    var sprite : Sprite = new Sprite();
    sprite = Sprite.Create(cardPictureTexture, new Rect(0, 0, 720, 720), new Vector2(0.5f, 0.5f));
    cardPicture.overrideSprite = sprite;
}
This prints true, but I am not sure whether the image is loading correctly from the bytes or something else is going wrong, and I'm not sure what to check to find out. Assigning some picture to cardPicture in the scene displays correctly.
I logged jsonReply.picture, ran it through an online base64-to-image converter, and it displayed the image correctly.
byte[] pictureBytes = System.Convert.FromBase64String(jsonReply.GetString("picture"));
Texture2D tex = new Texture2D(2, 2); // LoadImage resizes the texture to the image's dimensions
tex.LoadImage(pictureBytes);
Sprite sprite = Sprite.Create(tex, new Rect(0.0f, 0.0f, tex.width, tex.height), new Vector2(0.5f, 0.5f), 100.0f);
cardPicture.overrideSprite = sprite;
I assume you are trying to fetch an image from a remote URL and parse the bytes into a texture. In Unity, WWW facilitates this and does not require user involvement in the conversion.
I believe your response may have header details which could cause issues when converting to a texture. You could use code like the following:
public string Url = @"http://dummyimage.com/300/09f/fff.png";

void Start () {
    // Starting a coroutine to avoid blocking
    StartCoroutine ("LoadImage");
}

IEnumerator LoadImage()
{
    WWW www = new WWW(Url);
    yield return www;
    Debug.Log ("Loaded");
    Texture texture = www.texture;
    this.gameObject.GetComponent<Renderer>().material.SetTexture(0, texture);
}
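Note that WWW is deprecated in recent Unity versions; a minimal sketch of the same fetch with UnityWebRequestTexture (my own addition; the class name ImageLoader is hypothetical):
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class ImageLoader : MonoBehaviour
{
    public string Url = @"http://dummyimage.com/300/09f/fff.png";

    IEnumerator Start()
    {
        using (UnityWebRequest www = UnityWebRequestTexture.GetTexture(Url))
        {
            yield return www.SendWebRequest();
            if (www.isNetworkError || www.isHttpError)
            {
                Debug.Log(www.error);
            }
            else
            {
                // DownloadHandlerTexture decodes the PNG/JPG into a Texture2D
                Texture2D texture = DownloadHandlerTexture.GetContent(www);
                GetComponent<Renderer>().material.mainTexture = texture;
            }
        }
    }
}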
I don't know if this has been solved, but I want to share my solution.
void Start()
{
    StartCoroutine(GetQR());
}

IEnumerator GetQR()
{
    using (UnityWebRequest www = UnityWebRequest.Get(GetQR_URL))
    {
        yield return www.SendWebRequest();
        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Show results as text
            Debug.Log(www.downloadHandler.text);
            QRData qr = JsonUtility.FromJson<QRData>(www.downloadHandler.text);
            // Strip the "data:image/...;base64," prefix before decoding
            string result = Regex.Replace(qr.img, @"^data:image\/[a-zA-Z]+;base64,", string.Empty);
            ConvertBase64ToImage(result);
        }
    }
}

void ConvertBase64ToImage(string img)
{
    byte[] bytes = Convert.FromBase64String(img);
    Texture2D myTexture = new Texture2D(512, 212);
    myTexture.LoadImage(bytes);
    Sprite sprite = Sprite.Create(myTexture, new Rect(0, 0, myTexture.width, myTexture.height), new Vector2(0.5f, 0.5f));
    QRimage.transform.parent.gameObject.SetActive(true);
    QRimage.sprite = sprite;
}
It is working perfectly on Unity version 2019.4.

EmguCV + Unity: error opening webcam

I use EmguCV to open the webcam in Unity, but the FPS is very low.
This is my code:
private Texture2D texture;
private Capture capture;
private Color32[] color = new Color32[640 * 480];

// Use this for initialization
void Start () {
    texture = new Texture2D (640, 480);
    capture = new Capture ();
}

// Update is called once per frame
void Update () {
    Image<Bgr, Byte> currentFrame = capture.QueryFrame();
    Bitmap bitmapCurrentFrame = currentFrame.ToBitmap();
    Image<Bgra, Byte> img = new Image<Bgra, Byte> (bitmapCurrentFrame);
    for (int y = 0; y < 480; y++) {
        for (int x = 0; x < 640; x++) {
            int index = y + x * 480;
            print(index + ";" + x + ";" + y);
            //byte b = img.Data[x,y,0];
            color[index].r = img.Data[x,y,2];
            color[index].g = img.Data[x,y,1];
            color[index].b = img.Data[x,y,0];
            color[index].a = 0xff;
        }
    }
    texture.SetPixels32 (color);
    texture.Apply (false);
    renderer.material.mainTexture = texture;
}
I don't know why the FPS is so low...
And I don't know why my boss likes EmguCV with Unity; why doesn't he just use Unity's WebCamTexture...
Okay, thank you for reading. I hope I can get some answers.
Look at Texture2D.LoadRawTextureData. It has no proper docs, so here's a snippet:
Texture2D tex = new Texture2D(width, height, format, false, true);
tex.LoadRawTextureData(buffer);
tex.Apply(false, true);
The buffer must be in the correct hardware format. For the format variable, look at the list of texture formats that Unity accepts.
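Applied to the EmguCV code above, this might look like the sketch below. This is my own addition, untested; it assumes the Emgu 2.x API from the question (where QueryFrame returns an Image<Bgr, byte>, and Convert and Bytes are standard Image members), and that Emgu's BGRA byte layout matches TextureFormat.BGRA32 (the frame may also come out vertically flipped depending on the source). The point is to replace the per-pixel managed loop, and especially the print call on every pixel, which alone will destroy the frame rate, with one bulk copy:
using UnityEngine;
using Emgu.CV;
using Emgu.CV.Structure;

public class WebcamDisplay : MonoBehaviour {
    private Texture2D texture;
    private Capture capture;

    void Start () {
        // BGRA32 so we can upload Emgu's interleaved BGRA bytes directly
        texture = new Texture2D (640, 480, TextureFormat.BGRA32, false);
        capture = new Capture ();
    }

    void Update () {
        using (Image<Bgr, byte> frame = capture.QueryFrame())
        using (Image<Bgra, byte> bgra = frame.Convert<Bgra, byte>())
        {
            // Bytes is the raw interleaved pixel buffer; no per-pixel loop needed
            texture.LoadRawTextureData(bgra.Bytes);
            texture.Apply (false);
        }
        GetComponent<Renderer>().material.mainTexture = texture;
    }
}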