How to access Hololens front camera - unity3d

I'm working with HoloLens, and I'm trying to get the image from the front camera. The only thing I need is to take the image of each frame of that camera and turn it into a byte array.

Follow the Unity example in the documentation; it is well written:
https://docs.unity3d.com/Manual/windowsholographic-photocapture.html
Copied from the Unity documentation at the link above:
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.XR.WSA.WebCam;

public class PhotoCaptureExample : MonoBehaviour {
    PhotoCapture photoCaptureObject = null;
    Texture2D targetTexture = null;

    // Use this for initialization
    void Start() {
        Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        // Create a PhotoCapture object
        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject) {
            photoCaptureObject = captureObject;

            CameraParameters cameraParameters = new CameraParameters();
            cameraParameters.hologramOpacity = 0.0f;
            cameraParameters.cameraResolutionWidth = cameraResolution.width;
            cameraParameters.cameraResolutionHeight = cameraResolution.height;
            cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

            // Activate the camera
            photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result) {
                // Take a picture
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });
        });
    }

    void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame) {
        // Copy the raw image data into the target texture
        photoCaptureFrame.UploadImageDataToTexture(targetTexture);

        // Create a GameObject to which the texture can be applied
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
        quadRenderer.material = new Material(Shader.Find("Custom/Unlit/UnlitTexture"));

        quad.transform.parent = this.transform;
        quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

        quadRenderer.material.SetTexture("_MainTex", targetTexture);

        // Deactivate the camera
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
    }

    void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result) {
        // Shutdown the photo capture resource
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
    }
}
To get the bytes, replace the following method:
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame) {
    // Copy the raw image data into the target texture
    photoCaptureFrame.UploadImageDataToTexture(targetTexture);

    // Create a GameObject to which the texture can be applied
    GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
    Renderer quadRenderer = quad.GetComponent<Renderer>() as Renderer;
    quadRenderer.material = new Material(Shader.Find("Custom/Unlit/UnlitTexture"));

    quad.transform.parent = this.transform;
    quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

    quadRenderer.material.SetTexture("_MainTex", targetTexture);

    // Deactivate the camera
    photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
}
with the method below:
// Requires: using System.Collections.Generic;
void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
{
    // Copy the raw image data into an empty byte list
    List<byte> imageBufferList = new List<byte>();
    photoCaptureFrame.CopyRawImageDataIntoBuffer(imageBufferList);
    var bytesArray = imageBufferList.ToArray();
}
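If you also need a texture from those bytes, here is a minimal sketch (not part of the original answer) that rebuilds one from the raw buffer; it assumes the BGRA32 pixel format configured above and that targetTexture was created with the capture resolution, as in Start():

void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
{
    if (!result.success) return; // skip failed captures

    List<byte> imageBufferList = new List<byte>();
    photoCaptureFrame.CopyRawImageDataIntoBuffer(imageBufferList);
    byte[] bytesArray = imageBufferList.ToArray();

    // Sketch: load the raw BGRA32 bytes back into a texture of the same dimensions.
    Texture2D rawTexture = new Texture2D(targetTexture.width, targetTexture.height, TextureFormat.BGRA32, false);
    rawTexture.LoadRawTextureData(bytesArray);
    rawTexture.Apply();
}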
If you are using Unity 2018.1 or 2018.2 (and, I think, 2017.4), this will NOT work. Unity has a public issue tracker for the bug:
https://issuetracker.unity3d.com/issues/windowsmr-failure-to-take-photo-capture-in-hololens
I created a workaround until Unity fixes the bug:
https://github.com/MSAlshair/HoloLensMediaCapture
It is a basic sample that works around Unity's PhotoCapture by using MediaCapture instead; more details are in the link above.
You must wrap the code in #if WINDOWS_UWP to be able to use MediaCapture. Ideally you would use Unity's PhotoCapture to avoid this, but until Unity resolves the issue, you can use something like the following.
#if WINDOWS_UWP
public async System.Threading.Tasks.Task<byte[]> GetPhotoAsync()
{
    //Get available devices info
    var devices = await Windows.Devices.Enumeration.DeviceInformation.FindAllAsync(
        Windows.Devices.Enumeration.DeviceClass.VideoCapture);
    var numberOfDevices = devices.Count;
    byte[] photoBytes = null;

    //Check if the device has a camera
    if (devices.Count > 0)
    {
        Windows.Media.Capture.MediaCapture mediaCapture = new Windows.Media.Capture.MediaCapture();
        await mediaCapture.InitializeAsync();

        //Get highest available resolution
        var highestResolution = mediaCapture.VideoDeviceController.GetAvailableMediaStreamProperties(
            Windows.Media.Capture.MediaStreamType.Photo).
            Select(item => item as Windows.Media.MediaProperties.ImageEncodingProperties).
            Where(item => item != null).
            OrderByDescending(resolution => resolution.Height * resolution.Width).
            ToList().First();

        using (var photoRandomAccessStream = new Windows.Storage.Streams.InMemoryRandomAccessStream())
        {
            await mediaCapture.CapturePhotoToStreamAsync(highestResolution, photoRandomAccessStream);
            //Convert stream to byte array
            photoBytes = await ConvertFromInMemoryRandomAccessStreamToByteArrayAsync(photoRandomAccessStream);
        }
    }
    else
    {
        System.Diagnostics.Debug.WriteLine("No camera device detected!");
    }
    return photoBytes;
}

public static async System.Threading.Tasks.Task<byte[]> ConvertFromInMemoryRandomAccessStreamToByteArrayAsync(
    Windows.Storage.Streams.InMemoryRandomAccessStream inMemoryRandomAccessStream)
{
    using (var dataReader = new Windows.Storage.Streams.DataReader(inMemoryRandomAccessStream.GetInputStreamAt(0)))
    {
        var bytes = new byte[inMemoryRandomAccessStream.Size];
        await dataReader.LoadAsync((uint)inMemoryRandomAccessStream.Size);
        dataReader.ReadBytes(bytes);
        return bytes;
    }
}
#endif
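Since GetPhotoAsync is an async UWP method, you also need an async call site. A hypothetical wrapper (not in the original sample) could look like this:

#if WINDOWS_UWP
// Fire-and-forget wrapper so the capture can be kicked off from Unity code.
private async void CapturePhotoFromUnity()
{
    byte[] photoBytes = await GetPhotoAsync();
    if (photoBytes != null)
        UnityEngine.Debug.Log("Captured " + photoBytes.Length + " bytes");
}
#endif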
Do NOT forget to add capabilities to the manifest:
WebCam
Microphone: I am not sure whether it is needed, since we are only taking photos, but I added it anyway.
Pictures Library, if you want to save to it

In Player Settings, enable access to the camera, and try again.

Related

Unity, How To SetPixels32 Properly?

I'm receiving the error SetPixels32 called with invalid number of pixels in the array UnityEngine.Texture2D:SetPixels32(Color32[]) on a line of code that is trying to retrieve a pixel array.
In some instances I receive the error, but in others the webcam streams just fine, and I'm not sure why. This is the line of code that is giving me problems:
streamTexture.SetPixels32(webcamTexture.GetPixels32(pixels));
That isn't much to go on, so the full script is below. Can anyone tell me why this error occurs, given that the stream texture is set to the dimensions of the webcam texture? Any help is much appreciated!
using System.Collections;
using UnityEngine;
using Photon.Pun;
using UnityEngine.UI;

public class WebcamStream : MonoBehaviourPun, IPunObservable
{
    [SerializeField] private LocalPlayerSettings playerSettings;
    [SerializeField] private AvatarHandler parent;
    [SerializeField] private RawImage streamRawimage;

    private WebCamTexture webcamTexture;
    private Texture2D streamTexture;
    private Color32[] pixels;
    private byte[] data;

    private void OnEnable()
    {
        if (!parent.photonView.IsMine)
            return;
        InitWebcam();
    }

    private void OnDisable()
    {
        if (!parent.photonView.IsMine)
            return;
        webcamTexture.Stop();
    }

    private void InitWebcam()
    {
        //cast dimensions of target UI as ints for new webcam texture
        int width = (int)streamRawimage.rectTransform.rect.width;
        int height = (int)streamRawimage.rectTransform.rect.height;

        //set new dimensions and target device
        webcamTexture = new WebCamTexture(width, height)
        {
            deviceName = playerSettings.Webcam
        };

        //display webcam texture on the raw image and start camera
        streamRawimage.material.mainTexture = webcamTexture;
        webcamTexture.Play();

        //set pixels and stream texture to match webcamTexture
        pixels = new Color32[webcamTexture.width * webcamTexture.height];
        streamTexture = new Texture2D(webcamTexture.width, webcamTexture.height, TextureFormat.RGB24, false);

        //Begin Streaming Webcam Data
        StartCoroutine(StreamWebcam());
    }

    private IEnumerator StreamWebcam()
    {
        while (webcamTexture.deviceName == playerSettings.Webcam)
        {
            if (webcamTexture.isPlaying)
            {
                //set the target texture pixels to the webcam texture pixels and apply get/set
                streamTexture.SetPixels32(webcamTexture.GetPixels32(pixels));
                streamTexture.Apply();

                //convert image to byte array
                data = streamTexture.EncodeToJPG();
            }
            yield return null;
        }
        webcamTexture.Stop();
        if (WebCamTexture.devices.Length > 0)
        {
            InitWebcam();
        }
    }

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            //send the byte array through the stream
            stream.SendNext(data);
        }
        else
        {
            //convert object received into byte array via cast
            data = (byte[])stream.ReceiveNext();

            //create new texture to load received data into
            streamTexture = new Texture2D(1, 1, TextureFormat.RGB24, false);
            streamTexture.LoadImage(data);

            //set webcam raw image texture to the newly updated texture
            streamRawimage.texture = streamTexture;
        }
    }
}
Solved! Another script was responsible for enabling this script's game object, and it needed to yield for about one second first.
Also, a quick edit of streamRawimage.material.mainTexture = webcamTexture; to streamRawimage.texture = webcamTexture; fixed another issue surrounding a missing texture.
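For reference, here is a minimal sketch of that kind of deferred initialization (the method name is hypothetical). WebCamTexture can report placeholder dimensions (often 16x16) until the camera delivers its first frame, so allocating the pixel buffer too early produces exactly this size mismatch:

private IEnumerator InitWebcamWhenReady()
{
    webcamTexture.Play();

    // Wait until the webcam reports its real resolution instead of the placeholder.
    while (webcamTexture.width <= 16)
        yield return null;

    // Now the buffer and stream texture match the actual webcam dimensions.
    pixels = new Color32[webcamTexture.width * webcamTexture.height];
    streamTexture = new Texture2D(webcamTexture.width, webcamTexture.height, TextureFormat.RGB24, false);
    StartCoroutine(StreamWebcam());
}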

HoloLens - How to get webcam texture 2D from Vuforia

I'm developing a Vuforia app for HoloLens using Unity.
The app displays a simple 3D object when an image target is detected.
I'm also using the fm Exhibition Tool Pack hololens from the Unity Asset Store in order to stream the app running on HoloLens to a PC.
Everything works fine, but when I stream the app to the PC I see the 3D Unity scene instead of the room.
So I've tried to get the webcam texture and attach it to a cube inside the scene, but the Vuforia ARCamera somehow conflicts with it and I can't see anything on the cube. When I run the app inside the Unity simulator, however, I see myself on the cube.
Is there a way to get the webcam Texture2D from Vuforia and attach it to a GameObject inside the scene? Maybe with the Vuforia.Image class? I don't know how it works.
The scripts below are compatible with FMETP STREAM and were tested on mobile.
Note that the Texture2D format has to match the registered Vuforia pixel format (R8 for GRAYSCALE in the editor, RGB24 for RGB888 on device), which is why the #if UNITY_EDITOR branches differ.
using UnityEngine;
using System.Collections;
using Vuforia;
using UnityEngine.UI;

public class VuforiaCamAccess : MonoBehaviour
{
    private bool mAccessCameraImage = true;
    public RawImage rawImage;
    public GameObject Mesh;
    private Texture2D texture;

#if UNITY_EDITOR
    private Vuforia.PIXEL_FORMAT mPixelFormat = Vuforia.PIXEL_FORMAT.GRAYSCALE;
#else
    private Vuforia.PIXEL_FORMAT mPixelFormat = Vuforia.PIXEL_FORMAT.RGB888;
#endif

    private bool mFormatRegistered = false;

    void Start()
    {
#if UNITY_EDITOR
        texture = new Texture2D(Screen.width, Screen.height, TextureFormat.R8, false);
#else
        texture = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
#endif

        // Register Vuforia life-cycle callbacks:
        Vuforia.VuforiaARController.Instance.RegisterVuforiaStartedCallback(OnVuforiaStarted);
        Vuforia.VuforiaARController.Instance.RegisterOnPauseCallback(OnPause);
        Vuforia.VuforiaARController.Instance.RegisterTrackablesUpdatedCallback(OnTrackablesUpdated);
    }

    private void OnVuforiaStarted()
    {
        // Try register camera image format
        if (CameraDevice.Instance.SetFrameFormat(mPixelFormat, true))
        {
            Debug.Log("Successfully registered pixel format " + mPixelFormat.ToString());
            mFormatRegistered = true;
        }
        else
        {
            Debug.LogError("Failed to register pixel format " + mPixelFormat.ToString() +
                "\n the format may be unsupported by your device;" +
                "\n consider using a different pixel format.");
            mFormatRegistered = false;
        }
    }

    private void OnPause(bool paused)
    {
        if (paused)
        {
            Debug.Log("App was paused");
            UnregisterFormat();
        }
        else
        {
            Debug.Log("App was resumed");
            RegisterFormat();
        }
    }

    private void OnTrackablesUpdated()
    {
        //skip if still loading image to texture2d
        if (LoadingTexture) return;

        if (mFormatRegistered)
        {
            if (mAccessCameraImage)
            {
                Vuforia.Image image = CameraDevice.Instance.GetCameraImage(mPixelFormat);
                //if (image != null && image.IsValid())
                if (image != null)
                {
                    byte[] pixels = image.Pixels;
                    int width = image.Width;
                    int height = image.Height;
                    StartCoroutine(SetTexture(pixels, width, height));
                }
            }
        }
    }

    bool LoadingTexture = false;

    IEnumerator SetTexture(byte[] pixels, int width, int height)
    {
        if (!LoadingTexture)
        {
            LoadingTexture = true;
            if (pixels != null && pixels.Length > 0)
            {
                if (texture.width != width || texture.height != height)
                {
#if UNITY_EDITOR
                    texture = new Texture2D(width, height, TextureFormat.R8, false);
#else
                    texture = new Texture2D(width, height, TextureFormat.RGB24, false);
#endif
                }
                texture.LoadRawTextureData(pixels);
                texture.Apply();

                if (rawImage != null)
                {
                    rawImage.texture = texture;
                    rawImage.material.mainTexture = texture;
                }
                if (Mesh != null) Mesh.GetComponent<Renderer>().material.mainTexture = texture;
            }
            yield return null;
            LoadingTexture = false;
        }
    }

    private void UnregisterFormat()
    {
        Debug.Log("Unregistering camera pixel format " + mPixelFormat.ToString());
        CameraDevice.Instance.SetFrameFormat(mPixelFormat, false);
        mFormatRegistered = false;
    }

    private void RegisterFormat()
    {
        if (CameraDevice.Instance.SetFrameFormat(mPixelFormat, true))
        {
            Debug.Log("Successfully registered camera pixel format " + mPixelFormat.ToString());
            mFormatRegistered = true;
        }
        else
        {
            Debug.LogError("Failed to register camera pixel format " + mPixelFormat.ToString());
            mFormatRegistered = false;
        }
    }
}

How to capture video from web camera using unity?

I am trying to capture video from the web camera using Unity and HoloLens.
I found this example on the Unity page here.
I am pasting the code below. The light on the camera turns on, but it doesn't record.
VideoCapture.CreateAsync never creates a VideoCapture, so the delegate there is never executed.
I saw this thread as well, but it didn't solve the problem. In the Player Settings, the WebCam and Microphone capabilities are on.
What could be the problem?
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.XR.WSA.WebCam;

public class VideoCaptureExample : MonoBehaviour
{
    static readonly float MaxRecordingTime = 5.0f;

    VideoCapture m_VideoCapture = null;
    float m_stopRecordingTimer = float.MaxValue;

    // Use this for initialization
    void Start()
    {
        StartVideoCaptureTest();
        Debug.Log("Start");
    }

    void Update()
    {
        if (m_VideoCapture == null || !m_VideoCapture.IsRecording)
        {
            return;
        }

        if (Time.time > m_stopRecordingTimer)
        {
            m_VideoCapture.StopRecordingAsync(OnStoppedRecordingVideo);
        }
    }

    void StartVideoCaptureTest()
    {
        Resolution cameraResolution = VideoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        Debug.Log(cameraResolution);

        float cameraFramerate = VideoCapture.GetSupportedFrameRatesForResolution(cameraResolution).OrderByDescending((fps) => fps).First();
        Debug.Log(cameraFramerate);

        VideoCapture.CreateAsync(false, delegate (VideoCapture videoCapture)
        {
            Debug.Log("NULL");
            if (videoCapture != null)
            {
                m_VideoCapture = videoCapture;
                Debug.Log("Created VideoCapture Instance!");

                CameraParameters cameraParameters = new CameraParameters();
                cameraParameters.hologramOpacity = 0.0f;
                cameraParameters.frameRate = cameraFramerate;
                cameraParameters.cameraResolutionWidth = cameraResolution.width;
                cameraParameters.cameraResolutionHeight = cameraResolution.height;
                cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

                m_VideoCapture.StartVideoModeAsync(cameraParameters,
                    VideoCapture.AudioState.ApplicationAndMicAudio,
                    OnStartedVideoCaptureMode);
            }
            else
            {
                Debug.LogError("Failed to create VideoCapture Instance!");
            }
        });
    }

    void OnStartedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Started Video Capture Mode!");
        string timeStamp = Time.time.ToString().Replace(".", "").Replace(":", "");
        string filename = string.Format("TestVideo_{0}.mp4", timeStamp);
        string filepath = System.IO.Path.Combine(Application.persistentDataPath, filename);
        filepath = filepath.Replace("/", @"\");
        m_VideoCapture.StartRecordingAsync(filepath, OnStartedRecordingVideo);
    }

    void OnStoppedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Stopped Video Capture Mode!");
    }

    void OnStartedRecordingVideo(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Started Recording Video!");
        m_stopRecordingTimer = Time.time + MaxRecordingTime;
    }

    void OnStoppedRecordingVideo(VideoCapture.VideoCaptureResult result)
    {
        Debug.Log("Stopped Recording Video!");
        m_VideoCapture.StopVideoModeAsync(OnStoppedVideoCaptureMode);
    }
}
EDIT:
The problem was that the API doesn't work on the emulator.
You should take a look at this thread, which goes into detail on how to record a video with HoloLens as well as how to take a photo. Make sure you have the WebCam and Microphone capabilities set, and if you want to save the video, the Videos Library capability as well.
OnVideoCaptureCreated:
void OnVideoCaptureCreated (VideoCapture videoCapture)
{
    if (videoCapture != null)
    {
        m_VideoCapture = videoCapture;

        Resolution cameraResolution = VideoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        float cameraFramerate = VideoCapture.GetSupportedFrameRatesForResolution(cameraResolution).OrderByDescending((fps) => fps).First();

        CameraParameters cameraParameters = new CameraParameters();
        cameraParameters.hologramOpacity = 0.0f;
        cameraParameters.frameRate = cameraFramerate;
        cameraParameters.cameraResolutionWidth = cameraResolution.width;
        cameraParameters.cameraResolutionHeight = cameraResolution.height;
        cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

        m_VideoCapture.StartVideoModeAsync(cameraParameters,
            VideoCapture.AudioState.None,
            OnStartedVideoCaptureMode);
    }
    else
    {
        Debug.LogError("Failed to create VideoCapture Instance!");
    }
}
OnStartVideoCaptureMode:
void OnStartedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
{
    if (result.success)
    {
        string filename = string.Format("MyVideo_{0}.mp4", Time.time);
        string filepath = System.IO.Path.Combine(Application.persistentDataPath, filename);
        m_VideoCapture.StartRecordingAsync(filepath, OnStartedRecordingVideo);
    }
}
OnStartRecordingVideo:
void OnStartedRecordingVideo(VideoCapture.VideoCaptureResult result)
{
    Debug.Log("Started Recording Video!");
    // We will stop the video from recording via other input such as a timer or a tap, etc.
}
StopRecordingVideo:
// The user has indicated to stop recording
void StopRecordingVideo()
{
    m_VideoCapture.StopRecordingAsync(OnStoppedRecordingVideo);
}
OnStopRecordingVideo:
void OnStoppedRecordingVideo(VideoCapture.VideoCaptureResult result)
{
    Debug.Log("Stopped Recording Video!");
    m_VideoCapture.StopVideoModeAsync(OnStoppedVideoCaptureMode);
}

void OnStoppedVideoCaptureMode(VideoCapture.VideoCaptureResult result)
{
    m_VideoCapture.Dispose();
    m_VideoCapture = null;
}

Unable to load movie via WWW

I'm trying to load a video via url, but I keep getting the same error. I'm using Unity 5.3 and the example code from http://docs.unity3d.com/ScriptReference/WWW-movie.html (heavily modified because the current example doesn't compile).
using UnityEngine;
using System.Collections;

// Make sure we have gui texture and audio source
[RequireComponent (typeof(GUITexture))]
[RequireComponent (typeof(AudioSource))]
public class TestMovie : MonoBehaviour {

    string url = "http://www.unity3d.com/webplayers/Movie/sample.ogg";
    WWW www;

    void Start () {
        // Start download
        www = new WWW(url);
        StartCoroutine(PlayMovie());
    }

    IEnumerator PlayMovie(){
        MovieTexture movieTexture = www.movie;

        // Make sure the movie is ready to start before we start playing
        while (!movieTexture.isReadyToPlay){
            yield return 0;
        }

        GUITexture gt = gameObject.GetComponent<GUITexture>();

        // Initialize gui texture to be 1:1 resolution centered on screen
        gt.texture = movieTexture;
        transform.localScale = Vector3.zero;
        transform.position = new Vector3 (0.5f,0.5f,0f);
        // gt.pixelInset.xMin = -movieTexture.width / 2;
        // gt.pixelInset.xMax = movieTexture.width / 2;
        // gt.pixelInset.yMin = -movieTexture.height / 2;
        // gt.pixelInset.yMax = movieTexture.height / 2;

        // Assign clip to audio source
        // Sync playback with audio
        AudioSource aud = gameObject.GetComponent<AudioSource>();
        aud.clip = movieTexture.audioClip;

        // Play both movie & sound
        movieTexture.Play();
        aud.Play();
    }
}
I added this as a script to the Main Camera in a new scene, and I get this error:
Error: Cannot create FMOD::Sound instance for resource (null), (An invalid parameter was passed to this function. )
UnityEngine.WWW:get_movie()
<PlayMovie>c__Iterator4:MoveNext() (at Assets/TestMovie.cs:20)
UnityEngine.MonoBehaviour:StartCoroutine(IEnumerator)
TestMovie:Start() (at Assets/TestMovie.cs:16)
(Line 20 is MovieTexture movieTexture = www.movie;)
I've been working on this for a while now; it happens with many files and on both of my systems.
I found a solution! I tested the code with Unity 5.2.x and 5.3.x.
I don't know why, but Unity requires you to first wait until the download is done (isDone == true) and then, after copying the texture, wait until isReadyToPlay == true.
The movie file must be in OGG Video format; some MP4 files don't work.
Check the code:
using UnityEngine;
using UnityEngine.UI;
using System.Collections;

public class MovieTextureStream : MonoBehaviour {
    public Text progressGUI;
    public MeshRenderer targetRender = null;
    public AudioSource targetAudio = null;
    public string URLString = "http://unity3d.com/files/docs/sample.ogg";

    MovieTexture loadedTexture;

    IEnumerator Start() {
        if (targetRender == null) targetRender = GetComponent<MeshRenderer> ();
        if (targetAudio == null) targetAudio = GetComponent<AudioSource> ();

        WWW www = new WWW (URLString);

        // First wait until the download is done...
        while (www.isDone == false) {
            if (progressGUI != null) progressGUI.text = "Video progress: " + (int)(100.0f * www.progress) + "%";
            yield return 0;
        }

        loadedTexture = www.movie;

        // ...then wait until the movie is ready to play.
        while (loadedTexture.isReadyToPlay == false) {
            yield return 0;
        }

        targetRender.material.mainTexture = loadedTexture;
        targetAudio.clip = loadedTexture.audioClip;
        targetAudio.Play ();
        loadedTexture.Play ();
    }
}
I won't provide a solution with WWW.movie, but an alternative that may help you or anyone else.
On iPhone we didn't find a way to stream video from a server, so we decided to download the video before playing it. Here's how:
string docPath = Application.persistentDataPath + "/" + id + ".mp4";
if (!System.IO.File.Exists(docPath))
{
    WWW www = new WWW(videouUrl);
    while (!www.isDone)
    {
        yield return new WaitForSeconds(1);
        Loading.Instance.Message = "Downloading video : " + (int)(www.progress * 100) + "%";
        if (!string.IsNullOrEmpty(www.error))
            Debug.Log(www.error);
    }
    byte[] data = www.bytes;
    System.IO.File.WriteAllBytes(docPath, data);
}
mediaPlayer.Load(docPath);
onVideoReady();
mediaPlayer.Play();
This is the coroutine used to download the video and write it to the iPhone's file system. Once it's done, you can load and play it.
Hope it helps.
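For readers on newer Unity versions: MovieTexture was deprecated after these answers were written and later removed entirely, and UnityEngine.Video.VideoPlayer can stream from a URL directly. A minimal sketch (the class name is illustrative), reusing the sample URL from above:

using UnityEngine;
using UnityEngine.Video;

public class VideoPlayerStream : MonoBehaviour
{
    void Start()
    {
        // Stream straight from a URL and render onto the main camera's near plane.
        var player = gameObject.AddComponent<VideoPlayer>();
        player.source = VideoSource.Url;
        player.url = "http://unity3d.com/files/docs/sample.ogg";
        player.renderMode = VideoRenderMode.CameraNearPlane;
        player.targetCamera = Camera.main;
        player.Play();
    }
}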

Load sprite from a base64 string which came from a websocket

I am trying to turn a base64 string into a Sprite in Unity 3D, but my sprite in the scene remains blank.
public var cardPicture : Image;

function ReceiveData(jsonReply : JSONObject) {
    var pictureBytes : byte[] = System.Convert.FromBase64String(jsonReply.GetString("picture"));
    var cardPictureTexture = new Texture2D( 720, 720);
    Debug.Log(cardPictureTexture.LoadImage(pictureBytes));
    var sprite : Sprite = new Sprite ();
    sprite = Sprite.Create (cardPictureTexture, new Rect (0,0,720,720), new Vector2 (0.5f, 0.5f));
    cardPicture.overrideSprite = sprite;
}
This prints out true, but I am not sure if it is loading the image correctly from the bytes or if something else is going wrong, and I am not sure what to check to find out. Assigning some picture to cardPicture in the scene displays correctly.
I logged jsonReply.picture and used an online base64-to-image converter, which displayed the image correctly.
byte[] pictureBytes = System.Convert.FromBase64String(jsonReply.GetString("picture"));
Texture2D tex = new Texture2D(2, 2);
tex.LoadImage(pictureBytes);
Sprite sprite = Sprite.Create(tex, new Rect(0.0f, 0.0f, tex.width, tex.height), new Vector2(0.5f, 0.5f), 100.0f);
cardPicture.overrideSprite = sprite;
Note that Texture2D.LoadImage resizes the texture to the decoded image's actual dimensions, so the initial 2x2 size is only a placeholder.
I assume you are trying to fetch an image from a remote URL and parse the bytes into a texture. In Unity, WWW facilitates this and does not require manual conversion.
I believe your response may have header details which could cause issues when converting to a texture. You may use code like the one below:
public string Url = @"http://dummyimage.com/300/09f/fff.png";

void Start () {
    // Starting a coroutine to avoid blocking
    StartCoroutine ("LoadImage");
}

IEnumerator LoadImage()
{
    WWW www = new WWW(Url);
    yield return www;
    Debug.Log ("Loaded");
    Texture texture = www.texture;
    this.gameObject.GetComponent<Renderer>().material.SetTexture( 0, texture );
}
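As a side note (not part of the original answer): WWW has since been deprecated, and UnityWebRequestTexture is the modern replacement. A minimal sketch under that assumption, with an illustrative class name:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class ImageLoader : MonoBehaviour
{
    public string Url = @"http://dummyimage.com/300/09f/fff.png";

    IEnumerator Start()
    {
        // UnityWebRequestTexture downloads and decodes the image in one step.
        using (UnityWebRequest request = UnityWebRequestTexture.GetTexture(Url))
        {
            yield return request.SendWebRequest();
            if (request.isNetworkError || request.isHttpError)
            {
                Debug.Log(request.error);
                yield break;
            }
            Texture2D texture = DownloadHandlerTexture.GetContent(request);
            GetComponent<Renderer>().material.mainTexture = texture;
        }
    }
}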
I don't know if this is solved, but I want to share my solution.
void Start()
{
    StartCoroutine(GetQR());
}

IEnumerator GetQR()
{
    using (UnityWebRequest www = UnityWebRequest.Get(GetQR_URL))
    {
        yield return www.SendWebRequest();
        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Show results as text
            Debug.Log(www.downloadHandler.text);
            QRData qr = JsonUtility.FromJson<QRData>(www.downloadHandler.text);
            string result = Regex.Replace(qr.img, @"^data:image\/[a-zA-Z]+;base64,", string.Empty);
            CovertBase64ToImage(result);
        }
    }
}

void CovertBase64ToImage(string img)
{
    byte[] bytes = Convert.FromBase64String(img);
    Texture2D myTexture = new Texture2D(512, 212);
    myTexture.LoadImage(bytes);
    Sprite sprite = Sprite.Create(myTexture, new Rect(0, 0, myTexture.width, myTexture.height), new Vector2(0.5f, 0.5f));
    QRimage.transform.parent.gameObject.SetActive(true);
    QRimage.sprite = sprite;
}
It is working perfectly on Unity 2019.4.