Multidisplay Rendering - unity3d

I'm trying to build a project on 4 screens (2x projectors and 2x monitors all running at 1024x768). I'm using 2xGTX 295s.
Every time I run the build (using -multidisplay), the main screen hangs on black. If I look at the processes I can see three other screens being created, but nothing ever appears on them. The log files say "D3D11:failed to create RenderTexture(1024 x 768 fmt 44 aa 8), error 0x8007000e" at each point where the Display.Activate() code should execute.
Relevant code:
if (Display.displays.Length > 1) {
    Display.displays[1].SetRenderingResolution(1024, 768);
    Display.displays[1].Activate();
}
if (Display.displays.Length > 2) {
    Display.displays[2].SetRenderingResolution(1024, 768);
    Display.displays[2].Activate();
}
if (Display.displays.Length > 3) {
    Display.displays[3].SetRenderingResolution(1024, 768);
    Display.displays[3].Activate();
}
I've tried searching for a solution, but because multi-display is relatively new I can't seem to find anything on it.
Does anyone have a solution or an idea? It would be greatly appreciated. Thanks for your time.

This sounds like a driver issue. Windows Update probably updated your video driver. Update your driver from Device Manager. If that doesn't work, use a video driver from the manufacturer of your PC; you can easily download it from their website. I will provide a link if you update your question with your PC make, model number, and operating system.

Related

Flir Lepton 3.5 Banana pi zero Armbian I2C issue

I have (I believe successfully) enabled SPI and I2C on Armbian to communicate with the Lepton. Using the Lepton FLIR libraries (e.g. https://github.com/Myzhar/Lepton3_Jetson/tree/master/grabber_lib/Lepton_SDK) I can communicate with the camera (setting the output etc. returns LEP_OK). However, when I try to reboot the camera (using LEP_RunOemReboot( &port )) I get an I2C error. I checked the file linux_I2C.c and I know that in the function DEV_I2C_MasterReadData(...) the check if(bbb_result != 0 || bytesActuallyRead != bytesToRead) fails, with the values bbb_result = -1, bytesActuallyRead = 0 and bytesToRead = 2. The other settings otherwise return OK (I have no way to verify that, as the camera seems to get stuck after one frame; that is why I want to reboot it, and how I end up in this code path).
Any idea what is going on?
[EDIT] It does not seem to be this issue, as adding sleeps around the write function did nothing.
(The issue seems to be the write in DEV_I2C_MasterReadData(...) returning -1.)

How to integrate Hand Controller HC1 in my car simulator game

I'm building a car simulator game using Unity. For input I'm using a Logitech G29 steering wheel. Now I need to use a hand controller to accelerate or brake.
This is my hand controller: Hand Controller HC1 (link).
Now how can I interpret its input? The device is recognized by my Windows 10 system, but if I start the game with this device I cannot accelerate or brake the car.
I configured this in my InputController in Unity:
And in my IRDSPlayerControls.cs file I wrote these lines of code:
if (Input.anyKey)
{
    foreach (KeyCode kcode in Enum.GetValues(typeof(KeyCode)))
    {
        // Only log the keycodes that are actually held down.
        if (Input.GetKey(kcode))
            Debug.Log("Joystick pressed " + kcode);
    }
}
Debug.Log("Input debug acc: " + Input.GetAxis("Vertical3"));
Debug.Log("Input debug frenata: " + Input.GetAxis("Vertical4"));
In Console of Unity, I can display this:
Input debug acc: -1
Input debug frenata: -1
You can detect a specific button on a specific joystick ("joystick 1 button 0", "joystick 1 button 1", "joystick 2 button 0", ...),
or a specific button on any joystick ("joystick button 0", "joystick button 1", "joystick button 2", ...).
Check out the Input Manager.
I could explain this step by step here, but it won't be as good as some tutorials online. I recommend this video as a good tutorial.
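For example, here is a minimal sketch of both approaches (assuming the HC1 shows up as a regular joystick; the button indices are placeholders you would have to discover by logging):
using UnityEngine;

public class JoystickButtonProbe : MonoBehaviour
{
    void Update()
    {
        // A specific button on a specific joystick, using the string names from the Input Manager.
        if (Input.GetKeyDown("joystick 1 button 0"))
            Debug.Log("Button 0 on joystick 1 pressed");

        // The same button on any joystick, using the KeyCode enum.
        if (Input.GetKeyDown(KeyCode.JoystickButton0))
            Debug.Log("Button 0 on some joystick pressed");
    }
}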
UPDATE:
I think your hand controller gives analog values: the acceleration/brake controls are not actually buttons but analog axes with a range of values.
To check this, use Input.GetJoystickNames:
using UnityEngine;

public class Example : MonoBehaviour
{
    // Prints a joystick name if movement is detected.
    void Update()
    {
        // Requires you to set up axes "Joy0X" - "Joy3X" and "Joy0Y" - "Joy3Y" in the Input Manager.
        for (int i = 0; i < 4; i++)
        {
            if (Mathf.Abs(Input.GetAxis("Joy" + i + "X")) > 0.2 ||
                Mathf.Abs(Input.GetAxis("Joy" + i + "Y")) > 0.2)
            {
                Debug.Log(Input.GetJoystickNames()[i] + " is moved");
            }
        }
    }
}
I would suggest first checking whether those inputs are being received at all:
if (Input.anyKey)
{
    foreach (KeyCode kcode in Enum.GetValues(typeof(KeyCode)))
    {
        // Only log the keycodes that are actually pressed this frame.
        if (Input.GetKeyDown(kcode))
            Debug.Log(kcode);
    }
}
This way you can find out whether the game recognizes the keycodes of your controller and, if it does, which names are assigned to them.
Once you have that, you only need to check those keycodes as you would for a usual keyboard!
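For instance, a minimal sketch of feeding the discovered codes into the car controls (the two KeyCodes here are placeholders; substitute whatever the loop above actually printed):
using UnityEngine;

public class HandControllerPedals : MonoBehaviour
{
    void Update()
    {
        // Hypothetical mapping: replace these KeyCodes with the ones logged above.
        float throttle = Input.GetKey(KeyCode.JoystickButton0) ? 1f : 0f;
        float brake = Input.GetKey(KeyCode.JoystickButton1) ? 1f : 0f;
        Debug.Log("throttle: " + throttle + ", brake: " + brake);
    }
}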
Not every joystick, steering wheel, etc. maps its inputs to the same axes.
There is a Unity forum thread about that topic (and other related problems), and I found some Unity plugins that could probably solve your problem:
https://github.com/speps/XInputDotNet
https://github.com/JISyed/Unity-XboxCtrlrInput
There are some programs you can use to list all input axes and see which one you are currently affecting. I used one of them but don't remember its name. It might help you to see which axes your brake and throttle are mapped to.
Some of them also allow you to remap them, if that is what you want.
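Inside Unity itself, a small probe script can serve the same purpose. A sketch, assuming the axis names below exist in your Input Manager and each one is bound to a different joystick axis:
using UnityEngine;

public class AxisProbe : MonoBehaviour
{
    // Hypothetical axis names; create them in the Input Manager, one per joystick axis.
    private readonly string[] axisNames = { "Vertical3", "Vertical4", "Horizontal3", "Horizontal4" };

    void Update()
    {
        foreach (string axis in axisNames)
        {
            float value = Input.GetAxis(axis);
            // Log only axes that are clearly being moved, to find the throttle/brake mapping.
            if (Mathf.Abs(value) > 0.2f)
                Debug.Log(axis + " = " + value);
        }
    }
}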
The most probable cause is that Unity3D does not support this device.
Unity3D uses a mix of XInput, GameInput(?), and USB HID processing for its input on Windows.
It is unclear (Unity is closed source) whether GameInput is used on Windows; it is required on the modern Xbox consoles.
I cannot provide a definitive answer, since I do not have this controller to test, and the documentation on the controller is sparse.
The best I can do is point you in the right direction.
Does the device exist in Unity3D:
See if the Input System identifies the device when plugged in while running (make sure the game window has focus):
Adapted from https://docs.unity3d.com/Packages/com.unity.inputsystem@1.4/manual/HowDoI.html
// Requires the Input System package (using UnityEngine.InputSystem;).
InputSystem.onDeviceChange +=
    (device, change) =>
    {
        switch (change)
        {
            case InputDeviceChange.Added:
                // New device.
                Debug.Log("New device added.");
                break;
            case InputDeviceChange.Disconnected:
                // Device got unplugged.
                break;
            case InputDeviceChange.Connected:
                // Plugged back in.
                break;
            case InputDeviceChange.Removed:
                // Removed from the Input System entirely; by default, devices stay in the system once discovered.
                break;
            default:
                // See the InputDeviceChange reference for other event types.
                break;
        }
    };
A lack of log output when the device is plugged in means it was not identified as a potential input device; skip to "All else Fails" below.
Identification at this level does not imply support, as it may flag all HID devices.
Look at all low-level input events while pressing the buttons (also adapted from the documentation linked above):
var trace = new InputEventTrace(); // Can also give a device ID to only
                                   // trace events for a specific device.
trace.Enable();
// ...run stuff
var current = new InputEventPtr();
while (trace.GetNextEvent(ref current))
{
    Debug.Log("Got some event: " + current);
}
// Trace consumes unmanaged resources. Make sure to dispose.
trace.Dispose();
The chances of getting this far with responses (given the edited output) are slim, but if it happens, explore the output for hints about the device associations and fix your mappings accordingly.
All else Fails
Request device support through the Unity3D.com website. Highly recommended.
You can write your own support for the device, either via USB HID (which may be flagged by virus scanners, and documentation is limited) or by implementing a custom GameInput interface. The device's inclusion in Windows Game Controllers makes this the most probable solution.

Blank output for image subtraction when using raspicam library

I am using a Raspberry Pi with raspicam for a project. I have downloaded the raspicam library from http://sourceforge.net/projects/raspicam/files/?
I am trying to run code for image subtraction but am not getting results. Here is my code:
raspicam::RaspiCam_Cv Camera;
cv::Mat frame1, frame2, frame3, d1, d2;
Camera.set(cv::CAP_PROP_FORMAT, CV_8UC1);
if (!Camera.open())
{
    std::cerr << "cannot open camera" << std::endl;
}
Camera.grab();
Camera.retrieve(frame1);
Camera.grab();
Camera.retrieve(frame2);
Camera.grab();
Camera.retrieve(frame3);
while (true)
{
    frame1 = frame2;
    frame2 = frame3;
    Camera.grab();
    Camera.retrieve(frame3);
    absdiff(frame2, frame1, d1);
    imshow("result1", d1);
    absdiff(frame2, frame3, d2);
    imshow("result2", d2);
}
When I run this code I get blank frames for result1 and result2 as output. This is just a part of my code; ignore it if I have missed something.
Well, inside your loop
frame1=frame2;
...
absdiff(frame2,frame1,d1);
that'll be sort of identically zero (remember that cv::Mat assignment is a shallow copy, so after an iteration or two those frames end up sharing the same pixel data)...
Also, have you considered the timing here? You're grabbing images very close together in the time-domain, so naturally they'll be mostly identical (bar noise and fast motion) so the difference will be near-zero.
Cheers,

IMFSourceReader giving error 0x80070491 for some resolutions

I'm trying to capture video from a 5MP UVC camera using an IMFSourceReader from Microsoft Media Foundation (on Windows 7 x64). Everything works just like the documentation, with no errors on any API calls, until the first callback into OnReadSample(), which has "0x80070491 There was no match for the specified key in the index" as its hrStatus parameter.
When I set the resolution down to 1080p it works fine, even though 5MP is the camera's native resolution and 5MP (2592x1944) enumerates as an available format.
I can't find anything in the Microsoft documentation to say that this behaviour is by design, but it seems consistent so far. Has anyone else got IMFSourceReader to work at more than 1080p?
I see the same effects on the Microsoft MFCaptureToFile example when it's forced to select the native resolution:
HRESULT nativeTypeErrorCode = S_OK;
DWORD count = 0;
UINT32 streamIndex = 0;
UINT32 requiredWidth = 2592;
UINT32 requiredheight = 1944;
while ( nativeTypeErrorCode == S_OK )
{
    IMFMediaType * nativeType = NULL;
    nativeTypeErrorCode = m_pReader->GetNativeMediaType( streamIndex, count, &nativeType );
    if ( nativeTypeErrorCode != S_OK ) continue;
    // get the media type
    GUID nativeGuid = { 0 };
    hr = nativeType->GetGUID( MF_MT_SUBTYPE, &nativeGuid );
    if ( FAILED( hr ) ) return hr;
    UINT32 width, height;
    hr = ::MFGetAttributeSize( nativeType, MF_MT_FRAME_SIZE, &width, &height );
    if ( FAILED( hr ) ) return hr;
    if ( nativeGuid == MFVideoFormat_YUY2 && width == requiredWidth && height == requiredheight )
    {
        // found native config, set it
        hr = m_pReader->SetCurrentMediaType( streamIndex, NULL, nativeType );
        if ( FAILED( hr ) ) return hr;
        break;
    }
    SafeRelease( &nativeType );
    count++;
}
Is there some undocumented maximum resolution in Media Foundation?
It turns out that the problem was with the camera I was using, NOT the media streaming framework or UVC cameras generally.
I have switched back to using DirectShow sample grabbing which seems to work ok so far.
I ran into this same problem on Windows 7 with a USB camera module I got from Amazon.com (ELP-USBFHD01M-L21). The default resolution of 1920x1080x30fps (MJPEG) works fine, but when I try to select 1280x720x60fps (also MJPEG, NOT H.264) I get the 0x80070491 error in the ReadSample callback. Various other resolutions work OK, such as 640x480x120fps. 1280x720x9fps (YUY2) also works.
The camera works fine at 1280x720x60fps in DirectShow.
Unfortunately, 1280x720x60fps is the resolution I want to use to do some fairly low latency augmented reality stuff with the Oculus Rift.
Interestingly, 1280x720x60fps works fine with the MFCaptureD3D sample in the Windows 10 Technical Preview. I tried copying the ksthunk.sys and usbvideo.sys drivers from my Windows 10 installation to my Windows 7 machine, but they failed to load even when I booted in "Disable Driver Signing" mode.
After looking around on the web, it seems like various people with various webcams have run into this problem. I'm going to have to use DirectShow to do my video capture, which is annoying since it is a very old API that can't be used with app store applications.
I know this is a fairly obscure problem, but since Microsoft seems to have fixed it in Windows 10, it would be great if they backported the fix to Windows 7. As it is, I can't use their recommended Media Foundation API because it won't work on most of the machines I have to run it on.
In any case, if you are having this problem, and Windows 10 is an option, try that as a fix.
Max Behensky

Formula or API for calculating desktop icon spacing on Windows XP

I've built a simple application that applies grid lines to an image, or just simple colors, for use as desktop wallpaper. The idea is that the desktop icons can be arranged within the grid. The problem is that, depending on more things than I understand, the actual spacing in pixels seems to differ from system to system. I've learned that at least these things play a factor:
Resolution (duh)
Taskbar size and placement
Fonts
There has to be more than this. Maybe there's some API call that I don't know about?
There are 1001 ways to get/set this (but I only know two) :-D
Windows Registry:
HKEY_CURRENT_USER\Control Panel\Desktop\WindowMetrics
The values are IconSpacing and IconVerticalSpacing.
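For the registry approach, here is a minimal sketch of reading those two values from C# (the twip-to-pixel conversion mentioned in the comment is an assumption for a standard 96 DPI display, so verify it on your system):
using Microsoft.Win32;

public static class IconSpacingReader
{
    public static void PrintIconSpacing()
    {
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Control Panel\Desktop\WindowMetrics"))
        {
            if (key == null)
                return;

            // The values are stored as strings holding negative twip counts;
            // at 96 DPI one pixel is reportedly 15 twips, so pixels = -value / 15.
            string horizontal = key.GetValue("IconSpacing") as string;
            string vertical = key.GetValue("IconVerticalSpacing") as string;

            System.Console.WriteLine("IconSpacing (raw): " + horizontal);
            System.Console.WriteLine("IconVerticalSpacing (raw): " + vertical);
        }
    }
}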
by code, via WMI:
using System.Management;

public string GetWinIconSpace()
{
    ManagementObjectSearcher searcher = new ManagementObjectSearcher("root\\CIMV2", "SELECT * FROM Win32_Desktop");
    foreach (ManagementObject wmi in searcher.Get())
    {
        try
        {
            return "Desktop Icon Spacing: " + wmi.GetPropertyValue("IconSpacing").ToString();
        }
        catch { }
    }
    return "Desktop Icon Spacing: Unknown";
}
And there is a third way that I never tried; you can find it here.
There might also be a size problem due to the scaling algorithm if the requested size of the icon is not available.
(An icon file is actually a collection of icons, as explained in the post "Icons and cursors know where they came from" on The Old New Thing.)