I am working on an application in which I need to open the device camera in full-screen view, and I am making this app for both iOS and Android. Can anyone tell me how I can open the device camera full screen in Unity on both Android and iPhone?
Any help would be appreciated. Thanks in advance.
After some more digging through Google and the official docs, I found a solution, which I am going to share with you. It may help someone, someday.
1. Create a new project.
2. Select the Main Camera in the Hierarchy and change its Transform via the Inspector:
Position X=-90 Y=785 Z=0, Rotation X=90 Y=90 Z=0, Scale X=1 Y=1 Z=1
3. Now go to GameObject → Create Other → Plane.
4. Select the Plane and:
4.1 change its Transform via the Inspector:
Position X=0 Y=0 Z=0, Rotation X=0 Y=0 Z=0, Scale X=100 Y=100 Z=100
4.2 change its Tag to Player (the script below finds the plane by this tag).
Now create a C# script named "CameraController" and replace its contents with the code below:
using UnityEngine;

public class CameraController : MonoBehaviour
{
    public WebCamTexture mCamera = null;
    public GameObject plane;

    // Use this for initialization
    void Start ()
    {
        Debug.Log ("Script has been started");
        // Find the plane by its tag and stream the device camera onto it.
        plane = GameObject.FindWithTag ("Player");
        mCamera = new WebCamTexture ();
        // Note: plane.renderer was removed in Unity 5; use GetComponent<Renderer>().
        plane.GetComponent<Renderer> ().material.mainTexture = mCamera;
        mCamera.Play ();
    }
}
5. Finally, save and drag this script file onto the "Plane" GameObject.
Note: you may see the preview rotated in the Unity Game view, but on a real device it works well. Tested on an iPhone 5 and on Android (Nexus 5).
Here is a snapshot of how it comes out if you change the rotation angle to 180:
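If the rotated preview bothers you, one option is to compensate each frame using WebCamTexture.videoRotationAngle. This is only a sketch, added to CameraController; the correct axis to rotate around depends on how your plane is oriented in the scene:

// Rotate the plane each frame to undo the camera's reported rotation.
// Which axis to use depends on your plane's orientation; tweak as needed.
void Update ()
{
    transform.localEulerAngles = new Vector3 (0f, mCamera.videoRotationAngle, 0f);
}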
If you mean you want to use the camera to take and save photographs, I'd recommend Prime31's iOS and Android plugins. Unfortunately, the Etcetera plugin is US$65 per platform, but I've used them both and they work great.
http://u3d.as/content/prime31/i-os-etcetera-plugin/2CU
http://u3d.as/content/prime31/android-etcetera-plugin/2CY
If you just want to show the camera's live output inside your app's scene, you can create a plane and use a WebCamTexture to display the camera's live video output.
http://docs.unity3d.com/Documentation/ScriptReference/WebCamTexture.html
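As a variation that fills the screen regardless of plane scaling, here is a minimal sketch that displays the WebCamTexture on a UI RawImage instead; it assumes you have a Canvas with a RawImage stretched to full screen:

using UnityEngine;
using UnityEngine.UI;

public class FullscreenCameraFeed : MonoBehaviour
{
    public RawImage target; // a RawImage stretched to fill its Canvas

    void Start ()
    {
        // Stream the default device camera into the UI element.
        WebCamTexture webcamTexture = new WebCamTexture ();
        target.texture = webcamTexture;
        webcamTexture.Play ();
    }
}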
There is a toolkit available to open the device camera in unity for iOS and Android called CameraCaptureKit - (https://www.assetstore.unity3d.com/en/#!/content/56673)
All the source is available and it has a plain and and simple plug'n'play demo as with Unity UI - it solves some of the obstacles involved with taking still images using the camera on your device.
Since the WebCamTexture is a generic solution to taking still images it doesn't enable sharpening of the images, the quality as well is more low-res on iOS since it uses a configuration for capturing real time video.
If you want to turn on flash / torch there you can do that with the toolkit.
There is a great plugin named Android Native Camera for opening the device camera and saving video or images on the device. Currently, it works only on Android.
I am making an AR game with Unity.
I used the common 'Time.timeScale = 0' approach to implement the pause function, but this turns the AR camera screen into a black screen.
In addition, if you add the AR occlusion manager in this state, the app goes down with the screen bugged out.
When using an AR camera, what is the proper way to implement a temporary pause?
The actual screen you are viewing with the AR camera should be shown as a frozen frame.
Please help.
What I would do is take a camera shot:
ScreenCapture.CaptureScreenshot("TestImage.png");
then display the saved image on a RawImage (a RawImage takes a texture directly):
RawImage frozenFrame;
frozenFrame.texture = screenshotTexture; // a Texture2D loaded from the saved file
You can disable your camera at that point:
camera.enabled = false;
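Putting it together, here is a minimal sketch of this freeze-frame approach. It assumes a Unity version that provides ScreenCapture.CaptureScreenshotAsTexture, a full-screen RawImage, and that arCamera is your AR camera; the field names are mine:

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class PauseFreezeFrame : MonoBehaviour
{
    public Camera arCamera;      // the AR camera to disable while paused
    public RawImage frozenFrame; // full-screen RawImage used as the frozen backdrop

    public void Pause ()
    {
        StartCoroutine (FreezeAndPause ());
    }

    private IEnumerator FreezeAndPause ()
    {
        // The capture has to happen after rendering finishes for this frame.
        yield return new WaitForEndOfFrame ();
        frozenFrame.texture = ScreenCapture.CaptureScreenshotAsTexture ();
        frozenFrame.gameObject.SetActive (true);
        arCamera.enabled = false;
        Time.timeScale = 0f; // pause the simulation
    }
}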
I am building an AR scene for a Galaxy S7 using ARCore and Unity 2018.2.20f, where I use the AugmentedImageVisualizer to track an image and place a 3D GameObject in front of it. In order to track the image, I need autofocus enabled. The issue is that my anchor jitters around and sometimes completely moves away and disappears. From debugging, it seems that the camera autofocus might be causing the tracking issue.
My best workaround so far is to switch the camera config to fixed focus once the image has been detected, but about a third of the time the anchor moves while switching to fixed focus. If the anchor is still in sight once the camera is on fixed focus, the tracking works great.
Has anyone else noticed this issue before? Is there a way to control the camera focus more specifically?
If you use ARCore version 1.9 or above, tracking of Augmented Images is improved over previous versions. You can find the official releases at the link below.
Link: https://github.com/google-ar/arcore-unity-sdk/releases
Having the config in auto-focus mode should be fine as far as your debugging concern goes; however, if you are using both plane tracking and Augmented Images in the config, the jittering can appear.
Also, there is no need to use the AugmentedImageVisualizer to detect the Augmented Image. The following piece of code (wrapped in a MonoBehaviour here so it compiles as-is) tracks the Augmented Image and places a 3D object on it:
using System.Collections.Generic;
using UnityEngine;
using GoogleARCore;

public class AugmentedImagePlacer : MonoBehaviour
{
    private List<AugmentedImage> augmentedImages = new List<AugmentedImage>();
    public GameObject AndyObject; // object to place on the image
    private bool Is_Placed = false;

    void Update()
    {
        // Get all Augmented Images that were updated this frame.
        Session.GetTrackables<AugmentedImage>(augmentedImages,
            TrackableQueryFilter.Updated);
        foreach (var image in augmentedImages)
        {
            if (image.TrackingState == TrackingState.Tracking && Is_Placed == false)
            {
                // Anchor the object to the center of the detected image.
                Anchor anchor = image.CreateAnchor(image.CenterPose);
                var andyref = Instantiate(AndyObject, image.CenterPose.position,
                    image.CenterPose.rotation);
                andyref.transform.parent = anchor.transform;
                Is_Placed = true; // to only place the object once
            }
        }
    }
}
I am failing at playing a simple video under Unity 2017.1.0f3 Personal (free).
I am working on a game, and I'd like to play an introduction video at the start of the app, then move on to the login screen whenever the player clicks.
I have created a Video Player object and dragged the video clip (mp4) into the Video Clip field of the object.
I then attached the object to the camera. In the script attached to the camera, I created a public VideoPlayer field that I populated with the Video Player object.
I then execute:
void Awake ()
{
videoPlayer.Play();
}
But nothing happens.
Perhaps it should be executed within a separate thread (a coroutine)? I tried that, but it did not work either.
Any help, please?
Provided you have filled in the right settings and it still doesn't play, with no error, no freeze, nothing at all: try restarting Unity.
(I struggled for 45 minutes trying to figure out why my video wouldn't play anymore until I restarted Unity and it magically worked again.)
Did you check whether the video can be played back by Unity at all?
Put a quad in front of your camera, add a Video Player component to it, check Loop and Play On Awake, hit Play, and see if it works.
GameObjectWithPlayerComponentAttached.GetComponent<VideoPlayer>().Play();
should work fine.
Did you assign the RenderTexture?
What I do is create a RenderTexture, assign it to the Video Player, then add a Raw Image and give it the RenderTexture in its Texture field.
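Here is a minimal sketch of that wiring done from code; the 1280x720 size and the field names are assumptions, so adjust them to your clip:

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

public class VideoToRawImage : MonoBehaviour
{
    public VideoPlayer videoPlayer; // has its Video Clip assigned
    public RawImage rawImage;       // UI element that will show the video

    void Start()
    {
        // Render the video into a texture and show that texture on the UI.
        var renderTexture = new RenderTexture(1280, 720, 0);
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = renderTexture;
        rawImage.texture = renderTexture;
        videoPlayer.Play();
    }
}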
You should also uncheck Play On Awake if you want to play the video at a certain point. Instead of calling videoPlayer.Play directly, call videoPlayer.Prepare and play the video from the prepareCompleted callback, like this:
private void Start()
{
videoPlayer.prepareCompleted += VideoPlayer_prepareCompleted;
videoPlayer.Prepare();
}
private void VideoPlayer_prepareCompleted(VideoPlayer source)
{
videoPlayer.Play();
}
The Video Player has three render modes: Render Texture, where you need to select a RenderTexture to render to; Material Override, where it needs a mesh whose material it writes to; and Camera, where you need to assign a camera and select whether to render to the near or far plane.
To have the camera mode work automatically, you need to add the VideoPlayer component to the camera object itself.
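For example, a sketch of the camera mode set up from code; it assumes this script sits on the camera object and a clip is assigned in the Inspector:

using UnityEngine;
using UnityEngine.Video;

public class PlayOnCameraPlane : MonoBehaviour
{
    public VideoClip clip; // assign in the Inspector

    void Start()
    {
        // Add the player to the camera object and draw the video behind the scene.
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.CameraFarPlane;
        player.targetCamera = GetComponent<Camera>();
        player.Play();
    }
}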
Follow these steps to play a video using Unity's VideoPlayer component (a scripted version follows the list):
Create a plane, under 3D Objects.
Add a VideoPlayer component to that plane.
Set the Render Mode to Material Override.
Drag the Mesh Renderer component of the plane into the VideoPlayer's Renderer field.
Select a video clip to play and enable Play On Awake.
Press Play in the editor.
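The same setup as a code sketch; it assumes the script sits on the plane and the clip is assigned in the Inspector:

using UnityEngine;
using UnityEngine.Video;

public class PlayOnPlane : MonoBehaviour
{
    public VideoClip clip; // assign in the Inspector

    void Start()
    {
        // Material Override writes the video into this plane's material.
        var player = gameObject.AddComponent<VideoPlayer>();
        player.clip = clip;
        player.renderMode = VideoRenderMode.MaterialOverride;
        player.targetMaterialRenderer = GetComponent<Renderer>();
        player.Play();
    }
}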
I'm creating a program for Project Tango in Unity, and I'm trying to make a class implementing ITangoDepth. Just for testing, I've made this class implement the method OnTangoDepthAvailable so it simply prints a message, so I can see the thing working. I can't. This is what I have:
private TangoApplication m_tangoApplication;

public void Start()
{
    // Register this class with the Tango application so it receives depth callbacks.
    m_tangoApplication = FindObjectOfType<TangoApplication>();
    m_tangoApplication.Register(this);
}

public void OnTangoDepthAvailable(TangoUnityDepth tangoDepth)
{
    // Called whenever new depth data arrives.
    Debug.Log("Depth Available");
}
I've enabled Depth in the TangoManager too.
I've spent a long time studying the code in the Point Cloud example, but I don't see what else I have to set to enable the depth sensor. Can anyone help me make this work?
Thanks so much :)
EDIT: OK, I think I found the problem, but it created another one: in my app I'm using a material that shows what the camera sees on a plane in front of the camera. When I disable this plane, it all works properly. Is it possible that the camera and the depth sensor can't work at the same time?
When depth is enabled, you can only access the camera through the Tango API. That being said, Unity's webcam texture won't work while depth is enabled. The augmented reality example uses depth and the color image together; you can take a look at that.
1) I am a beginner at using Metaio in Unity, so my question is: is it possible to include two cameras in my scene, one as the main camera for the 3D environment, and another (the Metaio camera) to display objects from Unity on the real world, fixed in the top right corner as a UI plane of my game? Something like in the picture below:
2) Also, how do I display the Metaio camera view on the scene instead of displaying all scene objects, as shown in the image below? I only attached one cube object under the Metaio tracker.
Any help or answers gratefully received.
Thanks.
You have to attach a script to that plane and add the code below to it:
void Start() {
    // Stream the device camera onto this object's material.
    // Note: the old renderer shortcut was removed in Unity 5; use GetComponent.
    WebCamTexture webcamTexture = new WebCamTexture();
    GetComponent<Renderer>().material.mainTexture = webcamTexture;
    webcamTexture.Play();
}
If you are working on Android, you have to add the camera permission to your manifest.
I think you will need to work on it further, because it doesn't work that simply; you have to play with camera positions to get it working well.
This link shows you the camera options you have in Unity:
http://docs.unity3d.com/ScriptReference/WebCamTexture.html