Converting RenderTexture to Texture2D in Unity 2019 - unity3d

I'm using an Intel RealSense camera device to capture pictures. The capture result is displayed as a RenderTexture. Since I need to send it via UDP, I need to convert it to byte[], but that only works for Texture2D. Is it possible to convert a RenderTexture into a Texture2D in Unity 2019?
Edit:
Right now, I'm using this code to convert RenderTexture to Texture2D:
Texture2D toTexture2D(RenderTexture rTex)
{
    Texture2D tex = new Texture2D(rTex.width, rTex.height, TextureFormat.ARGB32, false);
    RenderTexture.active = rTex;
    tex.ReadPixels(new Rect(0, 0, rTex.width, rTex.height), 0, 0);
    tex.Apply();
    return tex;
}
I got this code from here, but it doesn't work anymore in Unity 2019: if I display the texture, it only gives me a white texture.
Edit 2:
Here is how I call that function:
//sender side
Texture2D WebCam;
public RawImage WebCamSender;
public RenderTexture tex;
Texture2D CurrentTexture;

//receiver side
public RawImage WebCamReceiver;
Texture2D Textur;

IEnumerator InitAndWaitForWebCamTexture()
{
    WebCamSender.texture = tex;
    CurrentTexture = new Texture2D(WebCamSender.texture.width,
        WebCamSender.texture.height, TextureFormat.RGB24, false, false);
    WebCam = toTexture2D(tex);
    while (WebCamSender.texture.width < 100) //WebCam
    {
        yield return null;
    }
    StartCoroutine(SendUdpPacketVideo());
}
Then I send it over the network like this:
IEnumerator SendUdpPacketVideo()
{
    ...
    CurrentTexture.SetPixels(WebCam.GetPixels());
    byte[] PNGBytes = CurrentTexture.EncodeToPNG();
    ...
}
On the receiver side, I decode it and display it on a raw image:
....
Textur.LoadImage(ReceivedVideo);
WebCamReceiver.texture = Textur;
...

The most optimized way to do this is:
public Texture2D toTexture2D(RenderTexture rTex)
{
    Texture2D dest = new Texture2D(rTex.width, rTex.height, TextureFormat.RGBA32, false);
    dest.Apply(false);
    Graphics.CopyTexture(rTex, dest);
    return dest;
}
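Note that Graphics.CopyTexture copies on the GPU, so the CPU-side pixel data that EncodeToPNG reads (which the question needs for UDP) may not be populated, and the two textures' formats have to be compatible. If that turns out to be a problem here, a minimal ReadPixels-based sketch that restores the previously active render target looks like this:
public Texture2D ToTexture2DViaReadPixels(RenderTexture rTex)
{
    Texture2D tex = new Texture2D(rTex.width, rTex.height, TextureFormat.RGBA32, false);

    // Remember the currently active render target so it can be restored afterwards
    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = rTex;

    // ReadPixels copies from the active render target into CPU-readable pixel data
    tex.ReadPixels(new Rect(0, 0, rTex.width, rTex.height), 0, 0);
    tex.Apply();

    RenderTexture.active = previous;
    return tex;
}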

Related

Unity: Reduce size of Render Texture before executing Texture2D.ReadPixels

I'm working on code where I basically have to take a low-quality screenshot about every 30 milliseconds. The script is attached to a camera.
What I want to do is reduce the render texture size. The way the code is right now, changing either W or H basically gets me a section of what the camera sees instead of a reduced-size version. So my question is: how can I resize or downsample what is read into the screenshot (Texture2D) so that it is still a representation of the entire screen?
public class CameraRenderToImage : MonoBehaviour
{
    private RemoteRenderServer rrs;

    void Start()
    {
        TimeStamp.SetStart();
        Camera.onPostRender += OnPostRenderCallback;
    }

    void OnPostRenderCallback(Camera cam)
    {
        if (TimeStamp.HasMoreThanThisEllapsed(30))
        {
            TimeStamp.SetStart();
            int W = Screen.width;
            int H = Screen.height;
            Texture2D screenshot = new Texture2D(W, H, TextureFormat.RGB24, false);
            screenshot.ReadPixels(new Rect(0, 0, W, H), 0, 0);
            byte[] bytes = screenshot.EncodeToPNG();
            System.IO.File.WriteAllBytes("check_me_out.png", bytes);
            TimeStamp.Tok("Encode to PNG and Save");
        }
    }

    // Remove the onPostRender callback
    void OnDestroy()
    {
        Camera.onPostRender -= OnPostRenderCallback;
    }
}
If you need to resize your render texture from script, you can use the following snippet:
void Resize(RenderTexture renderTexture, int width, int height)
{
    if (renderTexture)
    {
        renderTexture.Release();
        renderTexture.width = width;
        renderTexture.height = height;
    }
}
To make it possible to resize the render texture you first need to make sure it is released.
To get a Texture2D:
private Texture2D ToCompressedTexture(ref RenderTexture renderTexture)
{
    var texture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.ARGB32, false);
    var previousTarget = RenderTexture.active;
    RenderTexture.active = renderTexture;
    texture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    RenderTexture.active = previousTarget;
    texture.Compress(false);
    texture.Apply(false, true);
    renderTexture.Release();
    renderTexture = null;
    return texture;
}
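If the goal is a downsampled screenshot of the whole view rather than a crop, another option is to blit into a smaller temporary render texture first and read that back. A minimal sketch, under the assumption that the camera renders into a RenderTexture (cam.targetTexture):
void OnPostRenderCallback(Camera cam)
{
    // Target a quarter-resolution screenshot of the full view
    int w = Screen.width / 4;
    int h = Screen.height / 4;

    // Grab a temporary render texture at the reduced size
    RenderTexture small = RenderTexture.GetTemporary(w, h, 0);

    // Blit scales the camera's render texture down into the smaller one
    Graphics.Blit(cam.targetTexture, small);

    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = small;

    Texture2D screenshot = new Texture2D(w, h, TextureFormat.RGB24, false);
    screenshot.ReadPixels(new Rect(0, 0, w, h), 0, 0);
    screenshot.Apply();

    RenderTexture.active = previous;
    RenderTexture.ReleaseTemporary(small);

    byte[] bytes = screenshot.EncodeToPNG();
    System.IO.File.WriteAllBytes("check_me_out_small.png", bytes);
}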

Unity, How To SetPixels32 Properly?

I'm receiving the error SetPixels32 called with invalid number of pixels in the array UnityEngine.Texture2D:SetPixels32(Color32[]) on a line of code that is trying to retrieve a pixel array.
In some instances I receive the error, but in others the webcam streams just fine. I'm not sure why this is occurring. This is the line of code that is giving me problems:
streamTexture.SetPixels32(webcamTexture.GetPixels32(pixels));
That isn't much to go on, so the full script is below. Can anyone tell me why this error occurs even though the streaming texture is set to the dimensions of the webcam texture? Any help is much appreciated!
using System.Collections;
using UnityEngine;
using Photon.Pun;
using UnityEngine.UI;

public class WebcamStream : MonoBehaviourPun, IPunObservable
{
    [SerializeField] private LocalPlayerSettings playerSettings;
    [SerializeField] private AvatarHandler parent;
    [SerializeField] private RawImage streamRawimage;

    private WebCamTexture webcamTexture;
    private Texture2D streamTexture;
    private Color32[] pixels;
    private byte[] data;

    private void OnEnable()
    {
        if (!parent.photonView.IsMine)
            return;
        InitWebcam();
    }

    private void OnDisable()
    {
        if (!parent.photonView.IsMine)
            return;
        webcamTexture.Stop();
    }

    private void InitWebcam()
    {
        //cast dimensions of target UI as ints for new webcam texture
        int width = (int)streamRawimage.rectTransform.rect.width;
        int height = (int)streamRawimage.rectTransform.rect.height;

        //set new dimensions and target device
        webcamTexture = new WebCamTexture(width, height)
        {
            deviceName = playerSettings.Webcam
        };

        //display webcam texture on the raw image and start camera
        streamRawimage.material.mainTexture = webcamTexture;
        webcamTexture.Play();

        //set pixels and stream texture to match webcamTexture
        pixels = new Color32[webcamTexture.width * webcamTexture.height];
        streamTexture = new Texture2D(webcamTexture.width, webcamTexture.height, TextureFormat.RGB24, false);

        //Begin Streaming Webcam Data
        StartCoroutine(StreamWebcam());
    }

    private IEnumerator StreamWebcam()
    {
        while (webcamTexture.deviceName == playerSettings.Webcam)
        {
            if (webcamTexture.isPlaying)
            {
                //set the target texture pixels to the webcam texture pixels and apply get/set
                streamTexture.SetPixels32(webcamTexture.GetPixels32(pixels));
                streamTexture.Apply();

                //convert image to byte array
                data = streamTexture.EncodeToJPG();
            }
            yield return null;
        }
        webcamTexture.Stop();
        if (WebCamTexture.devices.Length > 0)
        {
            InitWebcam();
        }
    }

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            //send the byte array through the stream
            stream.SendNext(data);
        }
        else
        {
            //convert object received into byte array via cast
            data = (byte[])stream.ReceiveNext();

            //create new texture to load received data into
            streamTexture = new Texture2D(1, 1, TextureFormat.RGB24, false);
            streamTexture.LoadImage(data);

            //set webcam raw image texture to the newly updated texture
            streamRawimage.texture = streamTexture;
        }
    }
}
Solved! There was another script responsible for enabling this script's game object, and it needed to yield for about 1 second.
Also, a quick edit of streamRawimage.material.mainTexture = webcamTexture; to streamRawimage.texture = webcamTexture; fixed another issue surrounding a missing texture.
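A minimal sketch of the enabling-script fix, under the assumption that a separate component (here called WebcamStreamActivator, a hypothetical name) activates the object that carries WebcamStream:
using System.Collections;
using UnityEngine;

// Hypothetical helper: waits about a second before enabling the WebcamStream object,
// so the WebCamTexture has had time to report its real width/height by the time InitWebcam() runs.
public class WebcamStreamActivator : MonoBehaviour
{
    [SerializeField] private GameObject webcamStreamObject;

    private IEnumerator Start()
    {
        yield return new WaitForSeconds(1f);
        webcamStreamObject.SetActive(true);
    }
}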

Unity OpenGL Texture not accessible in build

I am trying to pass the render texture from Unity into a C++ plugin and then use CUDA on the texture. This works fine with DirectX in both the editor and a build, and also with OpenGL in the editor. But when I build with OpenGL, it gives me a blank texture. I've also noticed that if I try to save said texture, it also saves as a blank texture. I call my plugin using GL.IssuePluginEvent(PluginFunction(), 1), and this happens in a coroutine after yield return new WaitForEndOfFrame(). Any ideas on how to fix it?
private IEnumerator CallPlugin()
{
    while (enabled)
    {
        yield return new WaitForEndOfFrame();

        //Attempt to save image to file, only works in editor
        // Texture2D tex = new Texture2D(width, height, TextureFormat.RGBA32, false);
        // RenderTexture.active = _renderTexture;
        // tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        // tex.Apply();
        // byte[] bytes = tex.EncodeToPNG();
        // System.IO.File.WriteAllBytes("takeScreenshot.png", bytes);

        GL.IssuePluginEvent(GetRenderEventFunc(), 1);
    }
}

void OnRenderImage(RenderTexture src, RenderTexture dest)
{
    if (shader != null)
    {
        Graphics.Blit(src, dest, material);
    }
}

Unity code works in the editor but does not work on Build (EXE File) C#

My code loads external images and builds a SkyBox for Unity3D from them.
All the files of the SkyBox are at the right paths (copied manually).
The code loads 6 external images and then builds a SkyBox, and I think there's a problem in the loading when it runs as a build (but I'm not sure).
Or maybe Unity prevents me from doing it?
And I have no errors or warnings. It really drives me crazy!
using System.IO;
using UnityEngine;

public class ChangeSkyBox : MonoBehaviour
{
    public static Texture2D LoadPNG(string filePath)
    {
        Texture2D tex = null;
        byte[] fileData;
        if (File.Exists(filePath))
        {
            fileData = File.ReadAllBytes(filePath);
            tex = new Texture2D(1024, 1024);
            tex.LoadImage(fileData);
        }
        tex.wrapMode = TextureWrapMode.Clamp;
        return tex;
    }

    public static Material CreateSkyboxMaterial(SkyboxManifest manifest)
    {
        Material result = new Material(Shader.Find("RenderFX/Skybox"));
        result.SetTexture("_FrontTex", manifest.textures[0]);
        result.SetTexture("_BackTex", manifest.textures[1]);
        result.SetTexture("_LeftTex", manifest.textures[2]);
        result.SetTexture("_RightTex", manifest.textures[3]);
        result.SetTexture("_UpTex", manifest.textures[4]);
        result.SetTexture("_DownTex", manifest.textures[5]);
        return result;
    }

    private Texture2D[] textures;

    private void Start()
    {
        Texture2D xt1 = LoadPNG(Directory.GetCurrentDirectory() + "\\Assets\\SkyBox\\Front.png");
        Texture2D xt2 = LoadPNG(Directory.GetCurrentDirectory() + "\\Assets\\SkyBox\\Back.png");
        Texture2D xt3 = LoadPNG(Directory.GetCurrentDirectory() + "\\Assets\\SkyBox\\Left.png");
        Texture2D xt4 = LoadPNG(Directory.GetCurrentDirectory() + "\\Assets\\SkyBox\\Right.png");
        Texture2D xt5 = LoadPNG(Directory.GetCurrentDirectory() + "\\Assets\\SkyBox\\Top.png");
        Texture2D xt6 = LoadPNG(Directory.GetCurrentDirectory() + "\\Assets\\SkyBox\\Bottom.png");
        SkyboxManifest manifest = new SkyboxManifest(xt1, xt2, xt3, xt4, xt5, xt6);
        Material newMat = new Material(Shader.Find("RenderFX/Skybox"));
        newMat = CreateSkyboxMaterial(manifest);
        RenderSettings.skybox = newMat;
        DynamicGI.UpdateEnvironment();
    }
}

public struct SkyboxManifest
{
    public Texture2D[] textures;

    public SkyboxManifest(Texture2D front, Texture2D back, Texture2D left, Texture2D right, Texture2D up, Texture2D down)
    {
        textures = new Texture2D[6]
        {
            front,
            back,
            left,
            right,
            up,
            down
        };
    }
}
I believe your problem lies on this line:
Material result = new Material(Shader.Find("RenderFX/Skybox"));
Unity cannot find the shader at runtime, because shaders that are referenced only through Shader.Find in script are not necessarily included in a build. To fix it, make the "base" material by hand in Unity and attach it to your script through the Inspector.
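A minimal sketch of that approach, assuming a field named skyboxBaseMaterial is assigned in the Inspector to a material that already uses the RenderFX/Skybox shader (so the shader gets included in the build):
using UnityEngine;

public class ChangeSkyBox : MonoBehaviour
{
    // Assigned in the Inspector; because an asset references the shader,
    // Unity includes that shader in the build.
    public Material skyboxBaseMaterial;

    public Material CreateSkyboxMaterial(SkyboxManifest manifest)
    {
        // Clone the hand-made material instead of calling Shader.Find at runtime
        Material result = new Material(skyboxBaseMaterial);
        result.SetTexture("_FrontTex", manifest.textures[0]);
        result.SetTexture("_BackTex", manifest.textures[1]);
        result.SetTexture("_LeftTex", manifest.textures[2]);
        result.SetTexture("_RightTex", manifest.textures[3]);
        result.SetTexture("_UpTex", manifest.textures[4]);
        result.SetTexture("_DownTex", manifest.textures[5]);
        return result;
    }
}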

Load sprite from a base64 string which came from a websocket

I am trying to turn a base64 string into a Sprite in Unity 3D, but my sprite in the scene remains blank.
public var cardPicture : Image;

function ReceiveData(jsonReply : JSONObject) {
    var pictureBytes : byte[] = System.Convert.FromBase64String(jsonReply.GetString("picture"));
    var cardPictureTexture = new Texture2D(720, 720);
    Debug.Log(cardPictureTexture.LoadImage(pictureBytes));
    var sprite : Sprite = new Sprite();
    sprite = Sprite.Create(cardPictureTexture, new Rect(0, 0, 720, 720), new Vector2(0.5f, 0.5f));
    cardPicture.overrideSprite = sprite;
}
This prints out true, but I am not sure if it is loading the image properly from the bytes or if something else is going wrong. I am also not sure what to check to determine what is going wrong. Assigning some picture to cardPicture in the scene displays correctly.
I logged jsonReply.picture, used an online base64-to-image converter, and it displayed the image correctly.
byte[] pictureBytes = System.Convert.FromBase64String(jsonReply.GetString("picture"));
Texture2D tex = new Texture2D(2, 2);
tex.LoadImage(pictureBytes);
Sprite sprite = Sprite.Create(tex, new Rect(0.0f, 0.0f, tex.width, tex.height), new Vector2(0.5f, 0.5f), 100.0f);
cardPicture.overrideSprite = sprite;
I assume you are trying to fetch an image from a remote URL and parse the bytes into a texture. In Unity, WWW facilitates this and does not require you to do the conversion yourself.
I believe your response may have header details which might cause issues when converting it to a texture. You can use code like the below:
public string Url = @"http://dummyimage.com/300/09f/fff.png";

void Start()
{
    // Starting a coroutine to avoid blocking
    StartCoroutine("LoadImage");
}

IEnumerator LoadImage()
{
    WWW www = new WWW(Url);
    yield return www;
    Debug.Log("Loaded");
    Texture texture = www.texture;
    this.gameObject.GetComponent<Renderer>().material.SetTexture(0, texture);
}
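WWW has since been deprecated; an equivalent sketch using UnityWebRequestTexture (reusing the same Url field and requiring using UnityEngine.Networking) would be:
IEnumerator LoadImageWithWebRequest()
{
    using (UnityWebRequest request = UnityWebRequestTexture.GetTexture(Url))
    {
        yield return request.SendWebRequest();
        if (request.isNetworkError || request.isHttpError)
        {
            Debug.Log(request.error);
            yield break;
        }

        // The texture download handler parses the response bytes into a Texture2D
        Texture texture = DownloadHandlerTexture.GetContent(request);
        this.gameObject.GetComponent<Renderer>().material.SetTexture(0, texture);
    }
}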
I don't know if this is solved, but I want to share my solution.
void Start()
{
    StartCoroutine(GetQR());
}

IEnumerator GetQR()
{
    using (UnityWebRequest www = UnityWebRequest.Get(GetQR_URL))
    {
        yield return www.SendWebRequest();
        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Show results as text
            Debug.Log(www.downloadHandler.text);
            QRData qr = JsonUtility.FromJson<QRData>(www.downloadHandler.text);
            string result = Regex.Replace(qr.img, @"^data:image\/[a-zA-Z]+;base64,", string.Empty);
            CovertBase64ToImage(result);
        }
    }
}

void CovertBase64ToImage(string img)
{
    byte[] bytes = Convert.FromBase64String(img);
    Texture2D myTexture = new Texture2D(512, 212);
    myTexture.LoadImage(bytes);
    Sprite sprite = Sprite.Create(myTexture, new Rect(0, 0, myTexture.width, myTexture.height), new Vector2(0.5f, 0.5f));
    QRimage.transform.parent.gameObject.SetActive(true);
    QRimage.sprite = sprite;
}
It works perfectly on Unity 2019.4.