Can I share game screens with other Unity programs? - unity3d

I have a question because I got stuck on something while creating a VR program with Unity.
There are two programs, A and B.
A is the game program.
B is a monitoring program.
The screens that A renders (main camera and sub-camera)
must be visible in B.
The solutions I thought of were:
1. Displaying the Unity game screens in Visual Studio.
2. Syncing the game screen itself over the network.
I've tried both.
Option 1 shows the main camera, but I don't think I can see the sub-camera.
Option 2 connects, but it isn't syncing at all.
So I'm wondering if there's another solution. Please help.
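One way to approach the network option, sketched below and assuming B can receive raw image bytes over some transport of your choosing (the transport is only stubbed out here, and the class and method names are illustrative, not from any existing project): render each camera in A into a RenderTexture, read the pixels back once per frame, and encode them as JPG frames to push to B. This works for sub-cameras as well, since any Camera can be pointed at a RenderTexture.

using System.Collections;
using UnityEngine;

// Illustrative sketch: capture one camera's output in program A so its frames
// can be sent to the monitoring program B.
public class CameraFrameCapture : MonoBehaviour
{
    public Camera sourceCamera;   // e.g. the sub-camera in program A
    public int width = 640;
    public int height = 360;

    RenderTexture rt;
    Texture2D readback;

    IEnumerator Start()
    {
        rt = new RenderTexture(width, height, 24);
        readback = new Texture2D(width, height, TextureFormat.RGB24, false);
        // The camera now renders into the RenderTexture instead of the screen;
        // for the main camera you would typically duplicate it or blit the texture back.
        sourceCamera.targetTexture = rt;

        while (true)
        {
            yield return new WaitForEndOfFrame();   // wait until all cameras have rendered

            RenderTexture previous = RenderTexture.active;
            RenderTexture.active = rt;
            readback.ReadPixels(new Rect(0, 0, width, height), 0, 0);
            readback.Apply();
            RenderTexture.active = previous;

            byte[] jpg = readback.EncodeToJPG(50);
            SendToMonitor(jpg);                     // hypothetical transport to program B
        }
    }

    void SendToMonitor(byte[] frame)
    {
        // Placeholder: push the bytes to program B over TCP/UDP/WebSocket, etc.
    }
}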

Related

Oculus Go: How to keep awake while still in app

I'm working on an Oculus Go app, using Unity, where the user is literally expected to sleep while using the app and with the headset still on their head. I seem to be running into the problem where the Oculus Go goes into power-save mode because of inactivity. Presumably, the user isn't moving enough when they are in a deep sleep.
Although I've included instructions for the user on how to disable sleep as a device-wide setting, this is far from ideal. iOS has idleTimerDisabled (Keep iPhone from sleeping), which is a simple one-line stay-awake type of command. I'm looking for the Oculus Go equivalent of iOS's idleTimerDisabled.
Does anyone have any hints on how to keep the Oculus Go from turning off?
Just to be clear, this stay-awake behavior should only happen while the user is wearing the headset and using this app, and only this app. If the user takes off the headset, normal turn-off behavior should be immediately restored.
Thanks in advance,
JJ
Include
Screen.sleepTimeout = SleepTimeout.NeverSleep;
in your MonoBehaviour script if you just want to keep the device awake while the app is running and the headset is worn.
https://docs.unity3d.com/ScriptReference/Screen-sleepTimeout.html
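A slightly fuller sketch of that idea, assuming the Oculus Utilities for Unity (OVRManager) are in the project (if your integration version differs, the HMDMounted/HMDUnmounted event names may too): disable the sleep timeout only while the headset is actually on the user's head, and restore the system default when it comes off.

using UnityEngine;

// Sketch: keep the device awake only while the headset is worn.
// Assumes the Oculus Utilities package (OVRManager) is present.
public class StayAwakeWhileWorn : MonoBehaviour
{
    void OnEnable()
    {
        OVRManager.HMDMounted += OnMounted;
        OVRManager.HMDUnmounted += OnUnmounted;
    }

    void OnDisable()
    {
        OVRManager.HMDMounted -= OnMounted;
        OVRManager.HMDUnmounted -= OnUnmounted;
    }

    void OnMounted()
    {
        // Headset put on: never sleep while this app is running.
        Screen.sleepTimeout = SleepTimeout.NeverSleep;
    }

    void OnUnmounted()
    {
        // Headset removed: restore normal system behaviour.
        Screen.sleepTimeout = SleepTimeout.SystemSetting;
    }
}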
As of May 2019, it is not possible to keep the Oculus Go awake. Oculus wrote to me:
Unfortunately, this doesn't look to be functionality (disabling sleep when there is no movement) that is available at this time.

Hololens Methods of Input and Output

I have planned to start building an application for the HoloLens in a month from now. So right now, I am just in the preliminary design and feasibility check. (For the record, I have built simple applications for the HoloLens using Unity and have also used the camera for some image recognition.)
My main concern is methods of inputting data into my application. In a normal application you have GUI widgets such as spinners or sliders if you want to enter a numeric value.
How can I input numeric values to a Hololens application?
Since you've made a few applications for HoloLens before, I'm guessing you know about the MixedRealityToolkit that Microsoft offers. If you don't know about it yet and want to use it, here is a quick guide for how to set it up (which can also be found on the MixedRealityToolkit GitHub). In this toolkit there are a lot of tools that can help you with building the interactions for the HoloLens.
In this toolkit there are also a few examples on how to go about making sliders and other sorts of input.
If you look under Examples/UX, you'll see a few scenes/prefabs/scripts that show how you could go about making such GUI widgets for HoloLens.
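If you'd rather not depend on the toolkit's example prefabs, a minimal alternative sketch (not MRTK-specific; the range and label fields here are made up for illustration) is to place a standard Unity UI Slider on a world-space Canvas, driven by gaze plus air tap, and map its normalized value to the numeric range you need:

using UnityEngine;
using UnityEngine.UI;

// Sketch: map a world-space UI Slider to a numeric value on HoloLens.
public class NumericSliderInput : MonoBehaviour
{
    public Slider slider;        // a Slider on a world-space Canvas
    public Text valueLabel;      // optional readout
    public float minValue = 0f;
    public float maxValue = 100f;

    public float CurrentValue { get; private set; }

    void Start()
    {
        slider.onValueChanged.AddListener(OnSliderChanged);
        OnSliderChanged(slider.value);
    }

    void OnSliderChanged(float _)
    {
        // Use the normalized position so the slider's own min/max don't matter.
        CurrentValue = Mathf.Lerp(minValue, maxValue, slider.normalizedValue);
        if (valueLabel != null)
            valueLabel.text = CurrentValue.ToString("0.##");
    }
}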

Couldn't find Crashlytics GameObject in Unity 5.6

Struggling with adding Crashlytics to our build.
We've downloaded and added the Fabric UnityPackage, upgraded to the latest version, signed in to our Fabric account from within the Unity Interface, and dragged the GameObject from the final step of the 'Prepare Fabric' modal into our first scene. Finally, we've built the game to Android.
After this, playing the game in the editor produces "Couldn't find Crashlytics GameObject" warnings, even though FabricInit and CrashlyticsInit are in the scene. The message pops up twice when running the game -- there are two consecutive Unity scenes that this startup scene leads to.
There doesn't seem to be any specific documentation on the website, and the Fabric website leads us to the download page, as opposed to the dashboard.
Mike from Fabric here. The goal of this warning is to let you and other developers know when the CrashlyticsInit game object hasn't been dragged into the game, but it is being flagged in other cases due to how we try to detect this.
We're working on a more graceful way to get this point across, but in the meantime, if you comment out line 65 of Fabric/Editor/CommonBuild/FabricCommonBuild.cs, this warning will go away.
I removed the [PostProcessScene(0)] attribute in Fabric/Editor/CommonBuild/FabricCommonBuild.cs,
because I only initialize and place the Crashlytics GameObject in my first scene.
I don't need it to check every scene to confirm whether the object exists or needs to be initialized.
I'm still not sure whether this will have side effects, though.
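For context on what that attribute does (the real Fabric build script differs; the snippet below is only an illustration of the mechanism, with the warning check simplified), [PostProcessScene] marks a static editor method that Unity invokes once per scene during a build and when entering play mode, which is why a per-scene check like this also fires for scenes that intentionally don't contain the init object:

using UnityEditor.Callbacks;
using UnityEngine;

// Editor-only illustration of a PostProcessScene callback like the one Fabric uses.
public static class ExampleCrashlyticsSceneCheck
{
    [PostProcessScene(0)]
    public static void OnPostProcessScene()
    {
        // Runs for every scene in the build (and on entering play mode),
        // so it warns in each scene that lacks the object. Removing the
        // attribute, or the check itself, silences the warning.
        if (GameObject.Find("CrashlyticsInit") == null)
            Debug.LogWarning("Couldn't find Crashlytics GameObject");
    }
}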

Disabling Microsoft PixelSense (aka Surface) Surface Shell Timeouts

I am writing a game for the Microsoft PixelSense written in Unity, communicating with the table through the SurfaceToTUIO Bridge and the unity3d-tuio Unity Plugin.
I am currently trying to get the game to play nicely with the Microsoft PixelSense launcher. I have been able to get my application to appear in the launcher by mimicking the Surface Bing Application - duplicating a link to an XML file in C:\ProgramData\Microsoft\Surface\v2.0\Programs and creating the corresponding XML in the proper format.
When I go into Surface Mode - either through a dedicated Surface User Account, or through the Surface Shell on the Administrator's Profile, the game appears correctly on the launcher bar with the custom icon I set in the XML. The game launches correctly from the bar, and I can play it without any errors for about two minutes.
At that point, the launcher closes my game. With a little research, I learned that it's the application's responsibility to dismiss the launcher.
Being that this is part of the Microsoft PixelSense SDK and not accessible to Unity, I've tried various methods to get around this:
I tried running the game in Single Application Mode. It turns out there is a different timeout that still waits for the SignalApplicationLoadComplete call.
I tried using the CriticalProcessMonitoring Disable and ApplicationProcessMonitoring Disable keys in the Registry.
I tried setting the LoadingScreenTimeout and SingleAppLoadingScreenTimeout Registry Keys to 0 - below their valid threshold.
What I want is to provide my users with the smoothest experience getting into and out of the Unity game. Currently we have to launch our game from the Windows desktop, which can frustrate users because Windows can't differentiate between a finger touching the screen and a palm hovering above the screen.
Does anyone have a workaround, a good understanding of how I could spoof a SignalApplicationLoadComplete call from Unity, or a suggestion of something to try?
Thanks!
Update: I've found some more things that didn't work:
I found the Microsoft.Surface DLL at C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.Surface\v4.0_2.0.0.0__31bf3856ad364e35. I imported it into my Unity project, but received a System.TypeLoadException, which appears to be because the DLL is compiled against .NET 4.0, which Unity does not currently support.
I have been unable to find any COM objects that would allow me to communicate with the launcher without needing the DLL.
I cannot use an XNA wrapper program, as System.Diagnostics.Process.Start doesn't work in Surface Mode according to this post.
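For reference, here is a rough sketch of what spoofing that call via reflection might look like. The type name (Microsoft.Surface.ApplicationServices), the method signature, and the DLL file name appended to the GAC path above are all assumptions rather than verified API, and the .NET 4.0 load problem described in the update would most likely make the assembly load fail under Unity's runtime anyway:

using System;
using System.Reflection;
using UnityEngine;

// Rough, unverified sketch: load the Surface assembly at runtime and invoke
// SignalApplicationLoadComplete via reflection. Type/method/file names are
// assumptions, and the .NET 4.0 vs. Unity runtime mismatch is likely to make
// Assembly.LoadFrom fail regardless.
public class SurfaceLoadCompleteSignal : MonoBehaviour
{
    const string AssemblyPath =
        @"C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.Surface\v4.0_2.0.0.0__31bf3856ad364e35\Microsoft.Surface.dll";

    void Start()
    {
        try
        {
            Assembly surface = Assembly.LoadFrom(AssemblyPath);
            Type services = surface.GetType("Microsoft.Surface.ApplicationServices", true);
            MethodInfo signal = services.GetMethod(
                "SignalApplicationLoadComplete",
                BindingFlags.Public | BindingFlags.Static);
            signal.Invoke(null, null);   // assumes the method takes no arguments
        }
        catch (Exception e)
        {
            Debug.LogWarning("Could not signal the Surface launcher: " + e);
        }
    }
}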

iPad/iPhone Screen Mirror

I'm trying to figure out a way to mirror an iPad screen to other iPads. This doesn't seem to be supported on the platform though.
Basically, a teacher would have an iPad, then the students would have iPads and see everything that is happening on the teachers screen, but on their screens.
Thoughts?
I have been attempting to find a solution to this problem myself. I have not found any apps that can mirror exactly what is happening on another iPad, but some come close.
RabbleBrowser and Ideaflight both had potential. Ideaflight appears to be more for business. RabbleBrowser appears to allow the mirroring, except it only works for browser and file/picture mirroring.
Both iPads are linked to the same wifi, and when you join a session they will mirror the iPad that started the session. It also allows chat (controlled by the session starter).
It does NOT continue to mirror if you move out of the browser and into another app, however. I had dreams of leading a class through a lesson on Google Earth, but no go. :(
Another option is attaching a laptop to a projector. Then you download AirServer on the laptop, go to the menu bar at the bottom of the iPad, and turn on AirPlay. The laptop will mirror the iPad perfectly and project it! It's wireless and works well. I tried the HDMI connector to the laptop, but it gives a poor-quality, shaky image.
Hope they allow mirroring in future updates. The capability is there, don't know why they don't allow it! Guess they're trying to sell more Apple TVs!
A similar question was asked on the Apple forum (https://discussions.apple.com/thread/3118281?start=0&tstart=0), and the following app seemed to help them or answer their question.
Have a look at Replicate Pro on the app store:
http://itunes.apple.com/us/app/replicate-pro-for-ipad/id363286515?mt=8
One feature listed in the notes:
Share files between two iPads/iPhones that are running this app. (Pro version only)
I'm not sure if this will cover multiple devices or simply between two, but it may be worth a look. Sadly, the only way to try would be to spend $5.99.
You'll need to create an application for the student iPads that emulates the screen of the teacher's iPad. I would suggest that, although I don't know if it's possible, the teacher somehow starts up an app that emulates their entire iPad. Meaning, from within the app named "teacher share" (or whatever it is), they can access the music, settings, notes, and other apps found on their iPad. Then that information could be sent over a network to the students.
Nearpod is an app that will allow you to mirror a presentation on several iPads. I have had up to 9 at one time. Through the Nearpod program you can make a presentation similar to PowerPoint, and also incorporate interactive questions, which can be multiple choice, short answer, and even drawings. The only drawback is the full version costs $10/month. The free version is still good, you are just limited on the size of the presentation.
After doing lots of research, I found one app which shares one iPhone's screen to another iPhone. They have applied really clever logic for screen mirroring.
I have no detailed idea of how they implemented it, but after installing and checking the app, I think they use iPhone screen recording, broadcast it to their server, and then sync the other device from the same URL.
OliOli is a free and simple screen sharing app for iOS.
iOS App: https://itunes.apple.com/us/app/olioli-screen-sharing/id1382253993?mt=8
Website: https://olioli.io/