How to set Texture2D to skybox Panoramic material?

I'm developing a VR app using the skybox Panoramic material. I want to bind the material to a Texture2D created in a C# script, and then render into that Texture2D from a native plugin. The code is below, but the skybox doesn't show anything. Why?
private int texWidth = 2304;
private int texHeight = 2304;
private Texture2D tex;

// Start is called before the first frame update
void Start()
{
    // Create the texture (assign the field rather than declaring a shadowing local)
    tex = new Texture2D(texWidth, texHeight, TextureFormat.ARGB32, true);
    // Trilinear filtering smooths the panorama when the skybox samples it
    tex.filterMode = FilterMode.Trilinear;
    // Call Apply() so it's actually uploaded to the GPU
    tex.Apply();
    // Hand the native texture pointer to the plugin that renders into it
    init(tex.GetNativeTexturePtr());
    // Set the texture onto our material
    //RenderSettings.skybox.SetTexture("_MainTex", tex);
    RenderSettings.skybox.mainTexture = tex;
}

Related

How to change texture format from Alpha8 to RGBA in Unity3d?

I have been trying to convert the Alpha8 texture that a camera gives me into RGBA, and have been unsuccessful so far.
This is the code I've tried:
public static class TextureHelperClass
{
    public static Texture2D ChangeFormat(this Texture2D oldTexture, TextureFormat newFormat)
    {
        // Create a new empty texture in the target format,
        // matching the source size so SetPixels gets the right pixel count
        Texture2D newTex = new Texture2D(oldTexture.width, oldTexture.height, newFormat, false);
        // Copy the old texture's pixels into the new one
        newTex.SetPixels(oldTexture.GetPixels());
        // Upload to the GPU
        newTex.Apply();
        return newTex;
    }
}
And I'm calling the code like this:
Texture imgTexture = Alpha8Texture.ChangeFormat(TextureFormat.RGBA32);
But the image gets corrupted and isn't visible.
Does anyone know how to change this Alpha8 to RGBA so I can process it like any other image in OpenCV?
A friend provided me with the answer:
Color[] cs = oldTexture.GetPixels();
for (int i = 0; i < cs.Length; i++)
{
    // Copy the alpha value into the r, g and b channels
    cs[i].r = cs[i].a;
    cs[i].g = cs[i].a;
    cs[i].b = cs[i].a;
    cs[i].a = 1.0f;
}
// Set the pixels in the new texture
newTex.SetPixels(cs);
// Upload to the GPU
newTex.Apply();
This takes a lot of resources, but it works for sure.
If you know a better way to make this change, please add an answer to this thread.
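If the per-pixel Color loop is too slow, a Color32-based variant avoids the per-channel float conversions that GetPixels/SetPixels perform. A minimal sketch, assuming the source texture is readable (Read/Write enabled in its import settings); ChangeFormatFast is a hypothetical name:
public static Texture2D ChangeFormatFast(this Texture2D oldTexture)
{
    // Destination in RGBA32, same size as the source
    Texture2D newTex = new Texture2D(oldTexture.width, oldTexture.height, TextureFormat.RGBA32, false);
    // Color32 works on bytes, so there is no float conversion per channel
    Color32[] cs = oldTexture.GetPixels32();
    for (int i = 0; i < cs.Length; i++)
    {
        byte a = cs[i].a;
        // Spread the alpha value into the r, g and b channels
        cs[i] = new Color32(a, a, a, 255);
    }
    newTex.SetPixels32(cs);
    newTex.Apply();
    return newTex;
}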

How to use scene camera with Agora.io in Unity

In Unity I have integrated Agora.io such that, from within my virtual reality app, I can connect a video call to an outside user on a webpage. The VR user can see the website user, but the website user cannot see the VR user because there is no physical camera available to use. Is there a way to use a scene camera for the Agora video feed? This would mean the website user would be able to see into the VR user's world.
Yes. Although I haven't done VR projects before, the concept should carry over. You may use the External Video Source to send any video frames as if they were sent from the physical camera. For scene cameras, you can use a RenderTexture to output the camera feed and extract the raw data from the RenderTexture. So the steps are:
1. Set up your camera to output to a RenderTexture (plus logic to display this RenderTexture somewhere locally, if needed); see the sketch after this list.
2. When you set up the Agora RTC engine, enable the external video source with this call:
   mRtcEngine.SetExternalVideoSource(true, false);
3. At each frame, extract the raw image data from the RenderTexture.
4. Send the raw frame data to the SDK function rtc.pushVideoFrame().
You can find the code for the last step here:
https://gist.github.com/icywind/92053d0983e713515c64d5c532ebee21
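Here is a minimal sketch of steps 1 and 3. SceneCameraFeed, sceneCamera, feedTexture and ExtractFrame are hypothetical names, and the 640x360 size is an assumption; the byte[] returned is what step 4 pushes to the SDK:
using UnityEngine;

public class SceneCameraFeed : MonoBehaviour
{
    public Camera sceneCamera;
    private RenderTexture feedTexture;
    private Texture2D readback;

    void Start()
    {
        // Step 1: render the scene camera into a RenderTexture
        feedTexture = new RenderTexture(640, 360, 24);
        sceneCamera.targetTexture = feedTexture;
        readback = new Texture2D(feedTexture.width, feedTexture.height, TextureFormat.RGBA32, false);
    }

    // Step 3: copy the RenderTexture into a readable Texture2D
    // and return the raw bytes for pushVideoFrame()
    byte[] ExtractFrame()
    {
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = feedTexture;
        readback.ReadPixels(new Rect(0, 0, feedTexture.width, feedTexture.height), 0, 0);
        readback.Apply();
        RenderTexture.active = previous;
        return readback.GetRawTextureData();
    }
}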
I modified the screen-share code from Agora.io to extract a render texture instead. The problem is that I only get a white or black screen on the receiver, while my render texture carries a depth-camera video feed.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using agora_gaming_rtc;
using UnityEngine.UI;
using System.Globalization;
using System.Runtime.InteropServices;
using System;

public class ShareScreen : MonoBehaviour
{
    Texture2D mTexture;
    Rect mRect;
    [SerializeField]
    private string appId = "Your_AppID";
    [SerializeField]
    private string channelName = "agora";
    public IRtcEngine mRtcEngine;
    int i = 100;
    public RenderTexture depthMap;

    void Start()
    {
        Debug.Log("ScreenShare Activated");
        mRtcEngine = IRtcEngine.getEngine(appId);
        mRtcEngine.SetLogFilter(LOG_FILTER.DEBUG | LOG_FILTER.INFO | LOG_FILTER.WARNING | LOG_FILTER.ERROR | LOG_FILTER.CRITICAL);
        mRtcEngine.SetParameters("{\"rtc.log_filter\": 65535}");
        mRtcEngine.SetExternalVideoSource(true, false);
        mRtcEngine.EnableVideo();
        mRtcEngine.EnableVideoObserver();
        mRtcEngine.JoinChannel(channelName, null, 0);
        mRect = new Rect(0, 0, depthMap.width, depthMap.height);
        mTexture = new Texture2D((int)mRect.width, (int)mRect.height, TextureFormat.RGBA32, false);
    }

    void Update()
    {
        // Start the screen-share coroutine each frame
        StartCoroutine(shareScreen());
    }

    // Screen share
    IEnumerator shareScreen()
    {
        yield return new WaitForEndOfFrame();
        // Make the render texture active so ReadPixels can copy from it
        RenderTexture.active = depthMap;
        // Read the pixels inside the rectangle
        mTexture.ReadPixels(mRect, 0, 0);
        // Apply the pixels read from the rectangle to the texture
        mTexture.Apply();
        // Get the raw texture data from the texture as an array of bytes
        byte[] bytes = mTexture.GetRawTextureData();
        // Size of the bytes array
        int size = Marshal.SizeOf(bytes[0]) * bytes.Length;
        // Check to see if there is an engine instance already created
        IRtcEngine rtc = IRtcEngine.QueryEngine();
        // If the engine is present
        if (rtc != null)
        {
            // Create a new external video frame
            ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame();
            // Set the buffer type of the video frame
            externalVideoFrame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
            // Set the video pixel format
            externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA;
            // Apply the raw data pulled from the rectangle to the video frame
            externalVideoFrame.buffer = bytes;
            // Set the width of the video frame (in pixels)
            externalVideoFrame.stride = (int)mRect.width;
            // Set the height of the video frame
            externalVideoFrame.height = (int)mRect.height;
            // Remove pixels from the sides of the frame
            externalVideoFrame.cropLeft = 0;
            externalVideoFrame.cropTop = 0;
            externalVideoFrame.cropRight = 0;
            externalVideoFrame.cropBottom = 0;
            // Rotate the video frame (0, 90, 180, or 270)
            externalVideoFrame.rotation = 180;
            // Use an increasing value as the frame timestamp
            externalVideoFrame.timestamp = i++;
            // Push the external video frame we just created
            int a = rtc.PushVideoFrame(externalVideoFrame);
            Debug.Log(" pushVideoFrame = " + a);
        }
    }
}
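One possible cause, offered only as a guess: if depthMap uses a single-channel or depth render texture format, ReadPixels into an RGBA32 Texture2D can come out all black or all white. Blitting through an ARGB32 temporary first would normalize the format; a sketch of how the coroutine's read-back section could look:
// Blit the (possibly single-channel) depth texture into an RGBA target
RenderTexture rgba = RenderTexture.GetTemporary(depthMap.width, depthMap.height, 0, RenderTextureFormat.ARGB32);
Graphics.Blit(depthMap, rgba);
RenderTexture.active = rgba;
// Read the pixels from the RGBA copy instead of depthMap directly
mTexture.ReadPixels(mRect, 0, 0);
mTexture.Apply();
RenderTexture.active = null;
RenderTexture.ReleaseTemporary(rgba);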

Scale a PNG in Unity5?

Surprisingly, for years the only way in Unity to simply scale an actual PNG has been to use the very awesome library http://wiki.unity3d.com/index.php/TextureScale
Example below.
How do you scale a PNG using Unity5 functions? There must be a way now, with the new UI and so on.
So: scaling actual pixels (such as in a Color[]) or literally a PNG file, perhaps downloaded from the net.
(BTW if you're new to Unity, the Resize call is unrelated. It merely changes the size of an array.)
public WebCamTexture wct;
public RawImage demoImage; // assumed to be a UI RawImage (not declared in the original)

public void UseFamousLibraryToScale()
{
    // Take the photo, scale down to 256,
    // and crop to a central square
    int oldW = wct.width; // NOTE: example code assumes wider than high
    int oldH = wct.height;
    Texture2D photo = new Texture2D(oldW, oldH, TextureFormat.ARGB32, false);
    // Consider WaitForEndOfFrame() before GetPixels
    photo.SetPixels(0, 0, oldW, oldH, wct.GetPixels());
    photo.Apply();
    int newH = 256;
    int newW = Mathf.FloorToInt(((float)newH / (float)oldH) * oldW);
    // Use a famous Unity library to scale
    TextureScale.Bilinear(photo, newW, newH);
    // Crop to a central 256x256 square
    int startAcross = (newW - 256) / 2;
    Color[] pix = photo.GetPixels(startAcross, 0, 256, 256);
    photo = new Texture2D(256, 256, TextureFormat.ARGB32, false);
    photo.SetPixels(pix);
    photo.Apply();
    demoImage.texture = photo;
    // Consider WriteAllBytes(
    //     Application.persistentDataPath + "p.png",
    //     photo.EncodeToPNG()); etc.
}
Just BTW, it occurs to me I'm probably only talking about scaling down here (as you often have to do to post an image, create something on the fly, or whatever). There would rarely be a need to scale an image up; it's pointless quality-wise.
If you're okay with stretch-scaling, there's actually a simpler way: use a temporary RenderTexture and Graphics.Blit. If you need the result as a Texture2D, temporarily swapping RenderTexture.active and reading the pixels back should do the trick. For example:
public Texture2D ScaleTexture(Texture src, int width, int height)
{
    // Blit stretch-scales src into a temporary RenderTexture
    RenderTexture rt = RenderTexture.GetTemporary(width, height);
    Graphics.Blit(src, rt);
    // Read the result back into a Texture2D
    RenderTexture currentActiveRT = RenderTexture.active;
    RenderTexture.active = rt;
    Texture2D tex = new Texture2D(rt.width, rt.height);
    tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
    tex.Apply();
    // Restore the previous active render texture before releasing the temporary
    RenderTexture.active = currentActiveRT;
    RenderTexture.ReleaseTemporary(rt);
    return tex;
}
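Usage is then a one-liner; for example, to shrink the photo from the earlier snippet (names assumed):
// Stretch-scale an existing texture down to 256x256
Texture2D small = ScaleTexture(photo, 256, 256);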

Read image and set texture

I add a cube to a Unity scene. I want to set this cube's texture from an image.
I use the code below to load the image and set the texture:
Texture2D text2D = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
text2D.SetPixels(((Texture2D)Resources.Load("image")).GetPixels());
MeshRenderer renderer = cube.GetComponent<MeshRenderer>();
renderer.material.mainTexture = text2D;
I see only a gray cube, not the image, in the scene.
You can shorten this quite a bit with only:
renderer.material.mainTexture = Resources.Load<Texture2D>("image");
Note that if the image is not found, you get null.
To see changes you make to a Texture2D, call text2D.Apply();
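Applied to the original snippet, a minimal fix creates the copy at the source image's size (SetPixels requires a matching pixel count) and calls Apply() to upload it; this assumes the "image" asset is marked readable:
Texture2D source = Resources.Load<Texture2D>("image");
// Match the source dimensions so SetPixels gets the right pixel count
Texture2D text2D = new Texture2D(source.width, source.height, TextureFormat.RGB24, false);
text2D.SetPixels(source.GetPixels());
// Upload the copied pixels to the GPU
text2D.Apply();
MeshRenderer renderer = cube.GetComponent<MeshRenderer>();
renderer.material.mainTexture = text2D;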
This is even easier to do. Try:
public GameObject _cube;

void Start()
{
    Renderer rend = _cube.GetComponent<Renderer>();
    rend.material.mainTexture = Resources.Load("image") as Texture;
}
The LoadImage method can also be used to do the job. But here, you have to supply the image as a TextAsset, e.g. by giving the file a .bytes extension.
Example:
public TextAsset image;

void Start()
{
    var texture = new Texture2D(100, 100, TextureFormat.ARGB32, false);
    texture.LoadImage(image.bytes);
    GetComponent<Renderer>().material.mainTexture = texture;
    texture.Apply();
}

How to set the correct color for lines?

I'm working on my first game in Unity.
I'm trying to draw lines on my game field.
Code:
private void DrawLine(Vector3 start, Vector3 stop, GameObject template)
{
    GameObject toInstantiateGridLine = template;
    GameObject gridLineInstance = Instantiate(toInstantiateGridLine, start, Quaternion.identity) as GameObject;
    LineRenderer gridLineRenderer = gridLineInstance.GetComponent<LineRenderer>();
    gridLineRenderer.SetVertexCount(2);
    gridLineRenderer.SetWidth(0.01f, 0.01f);
    gridLineRenderer.SetColors(Color.black, Color.black);
    gridLineRenderer.SetPosition(0, start);
    gridLineRenderer.SetPosition(1, stop);
}
It works, with one problem: I get pink lines instead of the black I expect.
Settings of the LineRenderer component created at runtime: (screenshot not included)
You are missing a material. Pink is the standard color Unity shows when a material is missing.
LineRenderer gridLineRenderer = gridLineInstance.GetComponent<LineRenderer>();
Material mat = new Material(Shader.Find("Unlit/Texture"));
gridLineRenderer.material = mat;
Or you can change the material color directly. Note that accessing .material directly like this causes Unity to create an instance of the standard default material:
gridLineRenderer.material.color = Color.white;
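On newer Unity versions, SetVertexCount, SetWidth, and SetColors are deprecated in favor of properties. A sketch of the same DrawLine body using them, assuming a vertex-color-capable shader such as Sprites/Default is available:
LineRenderer gridLineRenderer = gridLineInstance.GetComponent<LineRenderer>();
// A shader that uses vertex colors, so the start/end colors show up
gridLineRenderer.material = new Material(Shader.Find("Sprites/Default"));
gridLineRenderer.positionCount = 2;
gridLineRenderer.startWidth = 0.01f;
gridLineRenderer.endWidth = 0.01f;
gridLineRenderer.startColor = Color.black;
gridLineRenderer.endColor = Color.black;
gridLineRenderer.SetPosition(0, start);
gridLineRenderer.SetPosition(1, stop);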